October 13, 2018

Sports have always been male-dominated. The Olympics didn’t include women until 1900, and some sports still don’t allow women to compete. Neither baseball nor football, two of the United States’ most popular sports, has a women’s league or a modified version of the sport.

Professional female teams rarely receive the same treatment as their male counterparts or garner the same amount of coverage for their achievements. Male sports dominate TV; female sports are in the limelight only on special occasions.

The US women’s national soccer team has won three World Cups, making it more successful than the men’s team, yet the players’ average salary remains far below that of their male counterparts.