Americans allegedly believe in justice. Our founders deemed justice necessary to the preservation of our republic, which they defined as a free and just society in which preserving liberty and property was the only legitimate role of government. Consider our...
