Saturday, September 24, 2011

Is This the End of Men or The Beginning of Women? : Discovery News

THE GIST
American women now dominate careers that once went only to men, and more women than men are graduating from U.S. colleges.
Some experts argue that men are finished and women are taking over.
Others say that male and female roles are simply shifting.
