
Does your work make you feel healthy and strong?


I met with a friend the other day, who also happens to be my dental hygienist, and we were talking about work. She said that during the holidays she had developed back pain, but that it really didn't matter because it happened every year - she knew that once she got back to work, her "pain would just go away".

Work can be about more than growing professionally and earning a good salary. We can find work that also helps us grow physically, emotionally, and even spiritually. Isn't that the kind of job worth searching for?👏😊

Can you imagine a job that offers much more than professional growth? Have you ever had a job that made you feel healthier, safer, and happier? I would love to hear your thoughts on this in the comments!
