Does your work make you feel healthy and strong?

January 21, 2019



I met with a friend the other day, who also happens to be my dental hygienist, and we were talking about work. She said that during the holidays she had developed back pain, but that it didn't really worry her because it happened every year, and she knew that when she got back to work her "pain would just go away."


Work can offer more than professional growth and a good salary. We can find work that also helps us grow physically, emotionally, and even spiritually.

Isn't this type of job one worth searching for?👏😊


Is it possible for you to imagine a job that offers much more than professional growth? Have you ever had a job that made you feel healthier, safer, and happier? I would love to hear your thoughts on this in the comments!

