Romance novels as social commentary? A few weeks ago I was the guest speaker in a university class studying Women in Fiction. One of the students I met would like to hear what romance writers think about the role their novels may play in critiquing society, and he has given me permission to post his questions here:
Do romance novelists use the genre to make social critiques? We discussed science fiction--how it plays on the realities of "now" to speculate about "tomorrow." I was wondering if romance novels do the same.
Or is it more a play on the desires of the consumers at the time? Some novels in the '80s may have seemed negative in their portrayals, but those books must have sold if that was the marketing approach of the era.
Do writers keep these issues in mind as they write, or is it pure imagination? Then again, even our imagination and desires are shaped by our environment, surroundings, fears, limitations, and so on. Is there something more beyond the covers, or is it just entertainment?
And then there's the whole psychology of entertainment and mass media--the whole thing gets blown out of proportion, like the claims that kids are committing various crimes against society because of the video games they play and the music they listen to. I know some books get a bad reputation because of their content and supposed influence. But if that were true, wouldn't we see a huge population of women running around after hunks because of the books they're reading? Of course not; that would be absurd.
I don't know if I can really compare the two, and I'm definitely not an expert in social psychology. And I'm going off on a tangent from my original question, which was whether romance novels critique society. Who knows? Do you have an opinion on this matter?