That's a big question, and the answer depends on where you mean, but it's true that the 19th century was a time of great change for many women in many countries, especially (but by no means only) in the West. It was the era of the "New Woman", particularly toward the end of the century, who often wanted education and a career rather than the traditional role of wife and mother. The right to vote and to participate in public life was also a concern for many women. Throughout the 19th century, the idea of equal rights for all men had been gaining ground, and gradually this came to include women as well.
There is quite a detailed overview of this period, from an American perspective, here.