When I first shared that I wanted to write a Christian novel, I received both positive and negative responses. I cherished the feedback, but I quickly began to question the negative. You see, the main objection I received was that no one wanted to see Christianity in their media and that I would be limiting myself. This got me thinking: each night when I went home and browsed Hulu, Netflix, or HBO, I consistently encountered ‘Christianity.’
Name a movie, TV show, or video game, and there is a very strong likelihood that at some point Christianity will be mentioned. Other shows feature it heavily, such as Vikings and The Handmaid’s Tale. So why are these images more palatable to the same people who might scoff at a Christian writer writing a Christian thriller?
There is one thing all of these instances have in common: Christianity is not shown in a favorable light. We’ve all seen the tropes: a terrified priest defensively clutching a wooden cross, his brow perspiring and his hand trembling as he attempts to ward off evil spirits. Or maybe the drunk and belligerent preacher, to really hammer home the hypocrisy of it all. All of this made me wonder: is it simply mentioning Christianity that is the problem, or is it Christianity portrayed in a good way that is unacceptable?
We are taught not to take films and TV series too seriously, but I am often puzzled why that good advice isn’t extended to Christianity, or, for that matter, why Christianity appears in so much of our media without any Christians having been consulted in the making of the series or film. The conclusion I have come to is that people of various backgrounds and worldviews have no problem with Christian imagery or even Christian Scripture. They have a problem with Christianity portrayed in a positive or neutral way. I believe accurate representation is important, and I would like to see more Christians get into entertainment.