Just to throw it out there:
You know how Hollywood makes movies criticizing America's supposed
military-industrial complex? Take the movie 'Iron Man,' for example. Also, Hollywood
actors like to tell us how to vote. I have another term that academics like to throw
around: Cultural Imperialism. Funny how Hollywood elites are so eager to criticize
American foreign policy but seem to overlook their own "imperialism." A quote
from this article:
In the contemporary world, Hollywood, CNN and Disneyland are
more influential than the Vatican, the Bible or the public relations
rhetoric of political figures. Cultural penetration is closely linked
to politico-military domination and economic exploitation.
That's right! Hollywood, you are a bunch of imperialists!
Exit question: Do you think we will ever hear George Clooney talk about
cultural imperialism?