Women in Hollywood Making Strides
It’s been a long time coming, but women have stepped up and become major players both in front of and behind the camera. No longer invisible, women in Hollywood have gained more opportunities in recent years, not only as actors but as directors, writers, producers, cinematographers, and editors.