This week it appears I reached my fill of misogynistic societies. I don’t mean in the real world (although I’ve definitely had enough of misogyny there, too), but in fictional worlds.
I’ve read several fae books lately (CN Crawford, I’m looking at you) and whilst the stories have been fabulous and enormously entertaining, I’ve been put off by the unredeemed sexism of fae society. In all the fae books I’ve picked up recently, when our heroine goes from earth into the fae realm she steps back several hundred years in terms of women’s rights. Women become possessions, or mates to be trapped, and status is everything. In the stories I’ve read, the heroine always has a high-status male to protect her from the worst of his society, but that’s kinda not the point. Women don’t generally require protection, and in the twenty-first century they certainly shouldn’t need it.
Again, I’m left wondering whether I’m closing my eyes to unpleasant things, pretending it’s realistic to live in a land of rainbows and unicorns. (There’s nothing to stop me doing that in fiction if I choose; what I read is entirely my choice.) However, I really don’t think so. My view is that I know unpleasant things happen in the world. If I wanted to see them in fiction, too, I’d still be bingeing on crime thrillers like I did in my teens.
I don’t want to be reminded how awful the world can be. I want to know the heights humans can reach, not the depths. In my opinion, what the real world needs from fiction more than a mirror of how bad people can be, is a refreshing picture of positive ways for people to get along.
What I’m looking for now is fictional worlds where men and women from all backgrounds get on as a matter of course, working together to deal with whatever the bigger-picture problem is in their world, without the spectre of sexual violence.
I’ve just tried to think of some “for instances” and come up short. Any recommendations?