Opening scene of the new HBO series The Newsroom, explaining why America is "Not the Greatest Country Any Longer... But It Can Be."
This is not a snide question, but please tell me: which war was fought on moral grounds? This is a serious question, and I really want to know. I was thinking about it last night and, for various reasons, couldn't find an answer. If you are tempted to say WW2, remember Pearl Harbor and Lend-Lease. Maybe WW1? I can't think of anything that was gained by America in WW1.