They say that doctors make a ton of money.
And some really do.
But once you factor in the many long hours they put into their education and into refining their skills and knowledge, some would say it's relatively fair compensation for what they go through to be able to fix us when things go horribly wrong physically.
Really, would you want a doctor who wasn't highly compensated taking care of your body?