Many women spend their pregnancy in fear of what their body might look like after giving birth. Body changes can be scary and uncomfortable, especially with diet culture inundating us with messages about needing to “get your body back.” Truth be told, your body will change after you have your baby, but that doesn’t have to be a bad thing. There are things you can do to keep yourself from spiraling into a negative place.
FOCUS ON EVERYTHING YOUR BODY HAS DONE AND IS DOING FOR YOU AND YOUR BABY
Your body just grew a baby and brought that little baby into the world. This means your body did a lot of shifting, changing, and growing to accommodate that new life. In addition, your body is now tasked with feeding that baby, healing from the birth process, and holding and comforting your little one.