The dominant historical narrative either reduces women’s accomplishments and contributions to footnotes and sidebars or ignores them altogether. In reality, women have been key players in U.S. history since colonial times. During the Revolutionary War, women fired cannons alongside male soldiers and debated patriotic duty with their husbands and family members. The image of the woman as ruler of the domestic sphere, while not entirely false, is belied by the words and deeds of women like Abigail Adams, Charity Clarke Moore, and Deborah Sampson Gannett.
Moving forward in history, we see women agitating for a range of social reforms: suffragists marching for the vote, temperance crusaders campaigning against the evils of alcohol, and progressives like Ida B. Wells-Barnett educating the Western world about the horrors of lynching and racially motivated violence. As women gained rights and freedoms, the roles they were allowed and expected to play in society changed as well.