How to Foster Body Positivity

Body positivity refers to the “assertion that all people deserve to have a positive body image, regardless of how society and popular culture view ideal shape, size, and appearance.” Although weight loss and appearance can matter, the real victory lies in how you feel about yourself. Fostering a positive body image is important, but it can be difficult for some people. If you are struggling with body image or would like to learn ways to be more body positive, check out the link below.
