Can I just say that society is screwed up? I usually eat a lot of food, but
I hardly gain
any weight (I'm a healthy weight, don't worry), and
I never work out, so
I'm just enjoying it while I can, I guess you could say. I get
told all the time that I have the perfect body (which I don't believe at
all).
Well, yesterday my mom told me, "Haley,
you need to start working out with me. Your legs are
getting fat."
I already think I'm fat because I have a
little gut, but when she
said that, it made me feel so... ugly.
I feel fat. I look at myself in the mirror all the time, not just
my face but my whole body, and I stand there for a good few
minutes thinking of all the things I hate about me and wish I
could change about myself.
I hate this.