“Everyone has a right to affordable health care!”
“Everyone has a right to go to college!”
“Everyone has a right to own a home; it’s the American Dream!”
We’ve all heard these. Some of us have even believed them. But do you honestly believe the government can give these things to you? Do you think it’s their job? I’m sure we’re all familiar with the term “reform.” It’s a political term like no other, because its very use admits that every previous government plan was a failure. Just about everything they do is “reforming” their own earlier mistakes, mistakes that have caused massive debt and deeper dependence on them.
If you ran your life the way government is run, where would you be? Do you have a printing press to bail yourself out, as they do? You would be destitute, in the gutter. Yet people are brainwashed into believing that only the government should be in charge of the most “important” things.
To borrow an argument from the great Dave Smith: if government should be responsible for the most important things, why don’t we put them in charge of food? Is there anything more important for survival? How about shelter? What about clothes?
When you realize how much they screw up everything they touch, do you want them involved in anything we need to survive? How about anything at all?