You know that. I know that. Probably (almost) every poster on here knows that. Why, then, are there so many who don't? Do they not teach history anymore in most states? It's like these idiots who want Socialism. Has it ever worked anywhere? And if Capitalism is so bad, why has the United States been the financial leader of the world for a hundred years? When we had the Great Depression, industrial nations all over the world suffered right along with us.
I know, I'm preaching to the choir, but dang, what's going on in our country is unfathomable. It has to be coming from the universities. Private ones can do as they like, but state-run universities need to be scrutinized. I admit I'm not sure how much power our Texas government has over ours, but if it has the power, why not require our universities to teach our young folks the truth rather than made-up liberal s***? If I were Governor, they could stick that tenure where the sun don't shine. Any professor promoting Socialism or Communism could pack up his or her s*** and move to California. Big sore spot with me.
Anyone who knows the power Texas has over its universities, please respond.