It seems to me Doctors are not Doctors any more.

Posted by DLFG74@reddit | GenX | View on Reddit | 664 comments

I am turning 51 soon, and through the years of my youth I abused the hell out of my body with work, play, and stupid man tricks. Now all my injuries have taken their toll, and every time I go to the doctor for anything, it seems they either don't know what to do and guess with a series of shit that doesn't help, or they have their hands tied by the insurance pukes keeping the doctors from doing what they went to school for. Does anyone else notice the same?