It just hit me like a pile of bricks. In the last eight years we've had a terrorist
attack on our homeland, wars, and a deadly natural disaster and there's no
obvious signifier that all these horrific events have really changed us as a people.
It seems like it's been crisis after crisis. My generation hasn't learned squat
from the tragedies of this era, has it?
Honestly answer this question: Do you think America is really any different from
10 years ago? We still spend money like crazy. Our news organizations still
do shallow and superficial reporting instead of digging into the real meat of the stories
of our times. We still have a celebrity-obsessed culture. (In fact, it might have gotten
worse since 9/11.)
One of the themes of the presidential election has been "change." Have Americans
"changed" after all these hardships? My gut tells me no. The WWII generation is
often referred to as "The Greatest Generation," and they witnessed a war like no
other. After WWII, Americans then focused domestically and really "changed" things.
The America of the '30s (prior to the war) was no more. Can we say that the excesses of
the 1990s are no more? I guess it is arguable that it's too early to tell if a change
has occurred.
Or maybe I am wrong. After WWII Americans wanted to be a global leader and
a world "super power." After 9/11 we don't seem to like to the "super power" label
as much and all the responsibilities it comes with. We've realized it's hard work.
Maybe my first thought was wrong and things are changing. Perhaps we've changed
from an America that was ambivalent about its world leadership (which perfectly describes
my generation) to one that willingly hands it over and lets others take on that role.
Makes me wonder: if America is the "bully" its critics claim,
why would we so gladly give up our international power?