Is This the End of Men or the Beginning of Women? (Discovery News)

THE GIST
American women are moving into careers that once went only to men, and women now earn the majority of degrees from U.S. colleges.
Some experts argue that men's dominance is ending and women are taking over.
Others say that male and female roles are simply shifting.