If it's crap ... We'll tell you
So, if you asked a film student, or perhaps a great film journalist, to name the films that have had the biggest impact on America, what would their answer be? I'm not talking about impact on American film; I'm asking which films have actually changed us as Americans, our culture and identity. Film has been a large part of American culture for nearly 100 years now, and 100 years is nearly half of our 234-year history as a nation. I think it would be naive to say film has not had an impact, but which films have been most influential? I don't pretend to know, but I want to throw out a few possibilities and arguments, looking backwards. I'd love to hear what other people think; these are just a few to get the ball rolling.
(Oh, and to all you international spillios, I apologize, but I'm barely comfortable looking at the U.S.; I couldn't pretend to have enough knowledge to cover the world.)
Pumping Iron

This 1977 documentary examined the Austrian-Californian bodybuilder's training and competition at the 1975 Mr. Olympia, but it was also a look into bodybuilding, a sport that wasn't widely understood at the time. With a slow, easy approach to the subject, it's my opinion that this documentary started the exercise craze of the '80s, with exercise being picked up by the masses as something everyone should do. It's debatable whether that was a good thing or not, because we have to look back on a decade of spandex and aerobics, but I think it's safe to say that this movie did influence Americans.
Enter the Dragon
It's not by any means the first martial arts action picture, and it's probably not even the best one. However, it was the first Chinese martial arts film to be produced by a major studio, and debatably the most influential in America. When did martial arts, in this case eastern martial arts, come to America? I'm sure they've been here for as long as their practitioners have been immigrating to the U.S., but when did they become part of American culture? We didn't always sign our children up for karate classes or have martial arts studios in strip malls across the country; something inspired the shift. It's my opinion that it was this movie. It's always a combination of factors, but I think you could make a strong case that this was the film that put martial arts in our minds and made America wonder, "How can I learn to do that?"
Gone with the Wind
This film symbolizes a shift in entertainment for America. In fact, it may be most renowned for what Americans did for it, rather than what it did for America. It performed a box office feat that will never be matched: over 202 million tickets sold during its run in theaters. Now, granted, the film stayed in theaters longer than any film would today; its theatrical run covered years. But what makes it even more astounding is that it sold those 202 million tickets at a time when the country's population was 123 million. More than just a record, this film marked a shift: movies had become America's entertainment medium, and they would hold that position for another 20 years, until television made its debut. I don't think any other film will ever be so universal.
These are the three that really sprang to mind, and I want to hear what everyone else thinks. I'm not looking at films that had a huge impact on other films, or films that changed how films were marketed. I really just want to concentrate on people, and the shifts our society has made because of specific films. Granted, some of this is speculation, but I don't think it's too far off.