But, and this is a big but, I am already SO sick of seeing 'bikini body' tips being shared left, right and centre. It isn't even hot enough to wear shorts yet, and already I am being told how to look good in a bikini.
As a human and as a woman, I understand. It’s okay to want to be attractive (whatever your definition of that may be). If there is something you don’t like about yourself, it is absolutely your choice to change it if you want to, however you want to. It isn’t my business or anyone else’s. However, I don't think it is right that we are constantly being told what to do and what not to do to look good on the beach or beside the pool. What we should be focusing on instead is how to feel happier in our skin and love our bodies, and the tips we see should reflect that. So have no fear ladies, I've got you covered.