Hollywood and Sexism

Wonder Woman has generated considerable discussion: about sexism in Hollywood, about heroines and the lack of them, about gender equality, and so on. These are necessary discussions. Good. Any movie that is likely to appeal to younger viewers is going to have an impact on how they feel the world ought to be.

In the process, though, we may be missing a major point: Hollywood has for decades been producing films that glorify violence and anger. From those early cowboy films, through John Wayne and Rocky, to Band of Brothers, violence and anger have been validated over and over. It's so pervasive that it feels as if it's an essential part of life and growing up in America. Are we surprised, then, that anger, violence and violent disagreement are now such a large part of our landscape?

Wonder Woman is really no different from any other violent superhero: her exploits in World War One consist of out-fighting the ordinary mortals she is pitted against. Violence is not naturally who we are. We have to learn it; children have to learn it. Might it be possible, therefore, to learn some other way of being?