Originally Posted by ChecklistMonkey
Getting rid of slavery, Western expansion, becoming a global superpower, and women's suffrage all radically changed America. I'm glad we don't still have only the mindset and laws of 1777. Don't be ignorant.
Oh my, and to think you are still making monthly payments on that university indoctrination.