Women have been stereotyped for decades as secondary and men as dominant. I think society will eventually see women as equal to men. America has slowly accepted women becoming leaders in government and at large companies. This growing equality is changing the world, and it scares a lot of people because they don't know what to expect. Men are fighting to be number one, and when they are beaten by a woman, they feel less powerful and less normal. Society has a strong influence on which gender is seen as dominant. It portrays men and women as equal, but when it comes down to choosing, the man usually wins.