They do what they do because we want it that way...That's kinda fucked up. Don't you think?
Not really.
I typically go to the movies to escape the dynamics of modern-day
wars and racism. I am quite interested in understanding what drives
us to do what we do, but I don't go to the movies to find answers or even
to explore that.
We're no longer pushing the envelope and inquiring more about the world, and although I don't blame Hollywood, I definitely think it's reflected in Hollywood...
You say we no longer push the envelope. When did "Hollywood" push
the envelope? So to answer your question from my perspective there
is nothing wrong with "Hollywood". We (the collective "we") do inquire
more about the world. At least among the people I know and read. I
spend a lot of time reading and talking about deeper issues like racism,
the makeup of modern-day wars, the relationship between
humans and machines, and what drives us to do the things we do. I
just don't look to the movie studios (Hollywood) to do it.
I'm not suggesting there have never been movies that explore racism
or wars, just that I can't recall a time when "Hollywood" pushed the
envelope regarding these issues. Independent producers, writers, and
directors have pushed the envelope, but "Hollywood" only wants to
make money. Not kinda fucked up at all. Just two very different entities
- the individual artist and the money-driven machine.
You know, of course, that if people paid to see stories that explored the
issues you mention, then the studio machine would make them. But
on the rare occasions a movie is made about more important issues, people do
not pay to watch it. A bit of a chicken-and-egg problem, isn't it?