Origami Elephants walks the tightrope between religion and philosophy, faith and certainty, symbol and science. Hosts Bryne Lewis and JR. Forasteros tackle the elephants in the room, talking about issues that often go ignored for fear of a fight. Bryne, JR. and their guests initiate a conversation about controversial subjects with an invitational tone.
Origami Elephants - June 10, 2016
Gorillas in the Zoo with Tripp York
Meet the Hosts
Born and raised in Scranton, PA, Bryne Lewis is a philosopher, poet, writer and teacher. Bryne is an adjunct instructor teaching classes in philosophy, ethics and religion. Her poems have appeared in Janus Head: A Journal of Philosophy and Art, The Anglican Theological Journal and other publications. In 2010, her poem “Conjoined” won first prize in the “Love at the Mutter” poetry contest, sponsored by the Mutter Museum, Philadelphia, PA. Bryne is the coordinator of the Sit Next to Me! Film Festival, featuring films whose stories illustrate our universal need to belong and our personal responsibility to make others feel welcome. Bryne works out her philosophical demons at brynelewis.com.
JR. lives in Dallas, TX with his wife Amanda. He’s a writer, blogger and pastor (in the Church of the Nazarene). He also loves Batman, spirited debates and smoking meats. Check out his blog for his sermon podcast. In addition to Origami Elephants, JR. also cohosts the StoryMen (history, theology and pop culture), Bible Bites (education and theology) and Don’t Split Up! (horror films). He also blogs on faith and pop culture right here on NorvilleRogers.com. Follow him on Twitter or say Hi on Facebook.
Journalist Lewis Wallace recently lost his position as reporter at Marketplace, a news source distributed by American Public Media (APM), for self-publishing an article entitled “Objectivity is Dead and I am Okay with it.” In the article, Wallace makes the case that reporters from marginalized people groups (Wallace is transgender) do not have the luxury of being “objective” or “neutral” when the issue at hand directly impacts their ability to live and/or questions their right to exist in society. Further, he claims, “…as norms of government shift toward a “post-fact” framework, I’d argue that any journalist invested in factual reporting can no longer remain neutral.”
Now, I am not a journalist. I am a philosophy teacher, but I have found myself facing the same issues as Mr. Wallace. Because philosophy aims at exposing truth through logic, philosophy (done well) is as neutral as it gets. To any given problem, there are as many solutions as make sense, regardless of who posits them. Of course, philosophy doesn’t excuse any office or authority from making sense either. As a philosopher, I cannot turn a blind eye to the rhetoric of “alternative facts” floated by the Drumpf administration. I have to call it out. And this causes some students to question my neutrality.
Under these circumstances, I’m willing to concede that apparent neutrality may be a lost cause. However, I don’t count objectivity a similar casualty. In fact, while I appreciate Wallace’s intent (and object to his loss of employment), I reject his assertion that objectivity is dead. My disagreement on this point is not really with Wallace, but stems from a cultural confusion about the nature of objective and subjective truth about which we can no longer afford to be complacent. Subjective truth is popularly depicted as the triumph of personal perspective over the universally accepted. To turn any given topic on its head, all one has to do is intone the magic objection: “That’s not my experience.” And POOF! truth as we know it goes up in a cloud of smoke. However, to reduce subjectivity to an authoritarian kind of personal experience misses much of the criticism it brings (with good cause) against our assumptions about objectivity.
Let’s begin untangling this mess with elementary school grammar. Maybe you remember diagramming sentences in elementary school. Then you also might remember that a sentence like “I see the apple” has a subject and an object. The subject does the action. The object receives the action. In this case, the subject is “I” and the object is “apple.”
Objective truth is that truth which can be stated about the apple without reference to the I. The identity of I does not come into our account of the apple. Because the I is meant to be left out of the account, objectivity is not best understood as a personal trait. I am not objective. Rather, objectivity is an observational methodology. Objective truth consists of information about the apple that is universally available. Typically, objective truth is consistent with what we think of as scientific truth. It includes facts and data about the apple verified through observation and measurement, under strict controls and in lab conditions. If we understand the I to play a part at all, it is that of a scientist. Objective truth assumes that the human I, or ego, is such that it examines objects (like apples) with a degree of detachment, as if encountering them on a lab table.*
However, clinical precision is not typically how human beings encounter objects in everyday life. Rather, the human mind is located in the world in relationship to objects. These relationships are native and most of the time, completely invisible to us. As a human being, I take my position in the world for granted because I’ve occupied it prior to being aware I was in the world. There are situations, crises really, which can cause my position and relationships to become apparent, but that’s another story for another time. Suffice it to say, subjective truth is that truth which takes into account the relationship “I” have to the “apple.” It recognizes that the subject comes at the object from a point of view.
To American ears, that last statement may indeed sound like the triumph of personal perspective over objectivity. American culture is head-over-heels in love with individual autonomy and we are therefore prone to hearing things that reinforce the independence and power of the “I.” But that misunderstands what subjective truth is trying to say.
If we consider the problem in terms of position, then we can get a better picture of what is going on. The illustration at left is the same as above, just rotated so that all the objects are on the same plane. It demonstrates how my position may result in my knowing less about a given object. I can only see portions of the car from my position; my relationship with the chair and the teacup somewhat obscures my view. My neighbor is more distant from me than my favorite book. If I start to think of my “position” in terms of demographics (socio-economic class, ethnicity, health, gender, etc.), then I can imagine how my social location may inform my “view.” And again, my view is native to me. Therefore, I often don’t realize that I’m coming at a relationship from a particular direction. Maybe I’ve never considered that my relationship to my car is part of my socio-economic class. Maybe I’ve never realized that my relationship to my neighbor has something to do with what we call “race.”
Now here’s the important bit. What’s at stake in the subjective view of truth I’ve just laid out is NOT what it says about MY EXPERIENCE. The point is not to establish the limited viewpoint of the subject as authoritative, but to criticize the assumption which underlies the objective model: that the subject has the capacity to be objective under everyday circumstances. Most of the time, I am not a scientist and the world is not a lab table. Most of the time, I am in relationship to the things around me. Most of the time, those relationships are invisible to me because I am so used to them. So most of the time, even when (maybe especially when) I think I’m being objective, I am not. Therefore, according to the subjective account of truth, my perspective on other beings is less authoritative, not more so. Once I admit my limited capacity to know my car, my cat, my neighbor as they really are, the emphasis shifts away from my authority as ego and towards accounts of other beings as and by themselves. Especially when dealing with other conscious beings, there is a “what it is like” to their existence that is inaccessible to me because it is subjective by its very nature.
Thomas Nagel makes this point in his awesome essay “What Is It Like to Be a Bat?”** He argues that if we attempt to imagine what it is like to be a bat, then the best we can do is imagine what it is like for ME to be a bat. However, this is not how a bat would experience itself at all. There is a quality to the experience of existing as a bat that is forever foreign to me because it is dependent on the bat’s perspective of itself. Which is not to say that there aren’t volumes of objective information available about bats. There are many, many books full of lab-quality facts, data, measurement and observations that I can know about bats. The fact that escapes me is “what it is like” to be a bat. My position excludes me from that information because it relies on the bat’s intimate position to itself.
In conclusion, Lewis Wallace is correct to argue that my social location matters. My “I” inhabits the world in relationship with the objects around me. However, declaring my positions does not disqualify me from asserting (or reporting or teaching) objective truth. I just need to take care when I do so. I need to guarantee my information is as lab-quality accurate as possible and I need to question whether others are upholding a similar standard in their assertions. I also need to be aware that some information might not be available as objective fact and commit to considering more deeply the experiences of others. In the current political climate, I likely will be perceived as less than neutral when I do so. And I am okay with that.
*It needs to be mentioned that historically the clinical detachment of the scientist has often been more about how consistent the “I” is with hegemonic cultural norms (male, upper class, white, etc) and less about laboratory accuracy.
** Nagel, Thomas. “What Is It Like to Be a Bat?” The Philosophical Review LXXXIII, 4 (October 1974): 435-50.
We are currently gearing up for my son’s first Christmas. Because he’s only 8 months old, this Christmas will be more about us celebrating around him than him celebrating with us. But, as I’ve been decorating the house and singing favorite carols, I’ve been thinking about what I want him to grow up to believe about the holiday.
When my older children were little, I was a practicing Christian and made sure that they had an age-appropriate understanding of the doctrinal import of the Christmas story. Now that I think about it, my degree in theology may have skewed my idea of what constituted “age-appropriate” when it came to doctrine. As a family, we didn’t do the Santa thing; I felt it was important not to pantomime belief in a pretend character while at the same time instructing my kids to believe that Jesus, Mary and Joseph were anything but pretend.* My kids became adept at navigating the touchy social landmines of not believing in Santa. Meanwhile, we celebrated Christmas full of wonder at the manger scene, angels and shepherds and all.
So I’ve done Christmas without believing in Santa. But now, as I face crafting new family Christmas traditions with my infant son, I also no longer believe in God.
My loss of belief in God is a long story. It involves my sister in a near-fatal bicycle accident. It involves getting an advanced degree in theology. It involves my father’s death and the break-up of my marriage. It involves the consolation that philosophy provided that doctrine did not. Suffice it to say, at the end of it all, I am both officially agnostic and deeply content.
But I have this relationship to Christianity that is an important part of who I am. For starters, I teach religion at a local university. But aside from my professional interest, I remain convinced that religion is an essential institution. On a more personal level, I still treasure the extensive Bible education my Baptist upbringing bequeathed to me. In fact, I am disappointed when my students demonstrate little or no familiarity with the Bible stories I grew up with. For example, across two classes, I recently had only a few students who knew the phrase “good Samaritan” came from the Bible. Even fewer students had ever read the text. Not only is it sad that they don’t know such a morally powerful story, but Bible illiteracy in a culture where Christianity is dominant is dangerous. It opens the door of the Church to manipulative, power-hungry people who would use the tradition as a convenient incubator to hatch their own agendas and cultivate their own prejudices.
So I want my students to have a familiarity with the Bible. And I’d like my son to know those stories too. I want my son to know the parables of Jesus of Nazareth. I want him to know the stories of the patriarchs that form the shared bedrock of the three Abrahamic faiths. I still believe in Christmas and want him to connect Christmas to the wonder of the manger. I want to be able to share these stories in a way that is respectful to those that believe in them.
With that goal in mind, I have begun returning to the key Bible texts of my youth to see what meaning they might have aside from their theological significance. I realize it’s difficult for me to truly sit outside of that framework (you can take the girl out of the Baptist church, but not the Baptist out of the girl), but the exercise has given me the opportunity to reconnect to stories that I have a great deal of affection for and have informed the person that I am today.
For the record, reading the text in this manner does not necessitate the exclusion of supernatural elements. Instead, my position to the narrative is closer to the suspension of disbelief required by many stories to gain entrance into their world. Again, I don’t want to be misunderstood; I take it very seriously that people — people whom I love and respect — believe the supernatural claims in these texts. Suspension of disbelief is not just about entertainment. Rather, I set my disbelief aside to engage honestly with the symbolic fabric of the story. My students and I routinely read sacred texts from other traditions in this way. My kids and I have read the mythologies of Greece, Rome and Egypt in a similar fashion. If any disrespect is taken, none is meant.
So, what about the Christmas story? I’m not the first person to point out that the story of Jesus’s family at Christmas time is a refugee story. It is the story of a family uprooted and vulnerable. First of all, there is Mary’s unusual pregnancy. Regardless of what you believe about its origins, if we trust the text for the timing, there’s no doubt it looked scandalous to Mary’s community. Then, the family is required to meet the bureaucratic demands of an oppressive foreign power. Traveling to register for taxes, they are unable to find a place to stay and end up squatting in a barn. In these details, there’s definitely a call to listen attentively to the stories of others. Our origin stories are not all of who we are, but they are an essential part of understanding the particulars of our present position.
The Christmas story is certainly not the first time that a baby was born to a teenage girl on the run in less than ideal circumstances –not the last by any means either, remember that this holiday season– but this birth is proclaimed by angels to be the birth of a king. It is odd enough to think of a king worthy of an angelic birth announcement being born in a barn. But adding to the oddity, the announcement is made to shepherds out in the fields. Shepherds occupied the edges of ancient society, literally and figuratively. Their flocks demanded that they seek out and dwell in the uncivilized parts of the country. Hardly the sort that we would expect to receive divine correspondence. Hearing the news, they abandon their flocks, the most important thing to their livelihoods, and go to check out this baby born a king. They find him with the animals and worship him.
If we can imagine that the God who created the universe would be born as a refugee baby, then we must assume that every person, no matter what their circumstances, is worthy of dignity and respect. If a divine king can be born in a feeding trough, then there should be no boundaries as to which people I value. Not only is every person worthy of my respect, but my commitment to basic human dignity cannot be a casual one. I should be willing to put my own resources on the line in extending a hand to others. I should act as if each person I meet might in fact be divine.
I think this reading is not very far off from the accounts of Old Testament and post-resurrection theophanies. In these stories, God (or Jesus) shows up in ways that people don’t recognize and it is only after hospitality has been extended that the divine within is revealed. The human actor is not just found out to be behaving well, but is caught in the act of offering what they have to others. Only that act reveals the divine character of the neighbor, the stranger, the enemy across from us.
The Christmas story invites us to sit outside with others, to listen to their stories, to invite them in, or to accept their hospitality on their terms, especially to the point where my acceptance of others costs me something. That is the story I will tell my son at Christmastime. And maybe he won’t understand it this year, but it is my hope that he will grow to believe it. Even as I learn to better live out its truth, religious belief aside.
*If you want more on my decision not to play Santa with my kids, you can read here.
Checking in with my newsfeed yesterday morning, I was immediately confronted by pictures of two adults, overdosed and unconscious, in the front of their car with a four-year-old boy strapped into the seat behind them. When they recover, they will face a number of criminal charges for endangering the boy. However, the City of East Liverpool, Ohio, along with the local police department, took it upon themselves to release pictures of the incident on Facebook in order to illustrate “the other side of this horrible drug” and hopefully “convince another user to think twice” before engaging in further use.* Not only do I believe that they were wrong to post these pics, but I further believe that their decision to do so points to prejudices that constitute the real “other side” of drug addiction. In turn, these prejudices need to be overcome if we ever hope to be effective in beating the opioid epidemic.
To begin with, the adults in the Ohio photos were unconscious at the time the photos were taken. And being unconscious means being unable to give consent. Generally, consent plays a big role in our day-to-day lives. Whether we are visiting the doctor’s office, upgrading computer software or signing off on a school field trip, we are constantly being asked to indicate that we agree to certain terms. So the instances in which consent is slighted should clue us in to a deeper disregard. For example, it seems we are still having to make the case that consent is an essential part of sexual relations. And, if the Brock Turner sexual assault case is any indication, the lack of consent implicit in incapacitation is not universally acknowledged and protected. Taken against how regularly we give consent in other situations, these omissions reveal an embedded societal stigma around particular classes of behavior such as sexual relations, sexual violence (which are two separate things), alcohol consumption and yes, illegal drug use.
The fact that the Ohio pictures depict illegal drug use and were taken during a police response does not substantially mitigate the violation of privacy committed in posting the photos online. And it seems that both the courts and police departments are beginning to agree. Due to the staying power of social media, posting pictures of suspects to Facebook has been ruled illegal in the past. The same concern is echoed in the comments of Police Chief Trevor Whipple of South Burlington, VT in a New York Times article about his department’s decision to end posting booking photos to Facebook: “Posting on the Internet is kind of like a bell you can’t unring.” While we have banned the pillory from the public square, the internet is being put to the same use and with longer effect. No amount of “bad” behavior, legal or illegal, should condemn someone to public (and potentially perpetual) humiliation. When we are willing to accept otherwise, we should beware.
This extends to instances in which our stated purpose in public punishment is education or prevention. In the famous “trolley dilemma,” when considering what actions are ethically acceptable to stop an oncoming trolley from hurting several people on the tracks, it is generally considered unacceptable to stop the trolley by pushing another person in its path. Likewise, Immanuel Kant prohibits treating another person merely as a means to an end in his ethical formulations. Whether it’s increasing the severity of a sentence to make a statement to others contemplating the same crime or posting a humiliating picture to influence another person’s behavior, it is an injustice to reduce a person to the unfortunate means by which a happier end is written for another person.
Add to this that we know it just doesn’t work. Study after study has disproven the “scared straight” model of intervention. People don’t change their behavior after being informed of the worst case scenario. Usually, people assume that they will always end up the exception rather than the rule. In other cases, we run the risk of normalizing or desensitizing the target audience to the very behavior we are hoping to curtail. In the end, posting pictures of people in desperate states isn’t documenting, it’s dehumanizing. Pictures like those out of Ohio are not going to shock anyone into anything beyond reinforcing their disgust and condemnation.
And disgust and condemnation continue to dominate our perceptions of drug addicts and addiction. Somewhere at the base of our brains we believe that the people involved in cases like that in Ohio have morally disqualified themselves from normal considerations as outlined above. To put it flatly, we believe they deserve “it.” “It” being a stand in for any consequence perceived by the court of common opinion to be associated with the incident. Along these lines, as long as we continue to believe that addiction is a moral failing, we will continue to fail to address the medical realities of drug addiction. And as a result, we will continue to fail drug addicts and their families.
For example, one of the biggest practical indications that we view addiction as a moral rather than medical problem is the continued reliance on 12 step programs for treatment. In general, these programs are based on several principles including that, with the help of community and the strength of a higher power, an addict can effectively adopt a new set of behaviors. Meanwhile, 12 step programs report only a 5% to 10% success rate in preventing relapses of addiction. Further, many 12 step programs not only don’t include medical intervention for chemical dependency, but actively condemn it despite research demonstrating its effectiveness. I would also argue that the general unawareness that there is a distinct difference between drug addiction and chemical dependence is further evidence of prejudice. Predictably, training for doctors in addiction remains underfunded and largely unavailable. Although signs of change are visible, they are slow to emerge given the shame and judgement surrounding addiction.**
If you want to see the “other side” of drug addiction, then look into the faces of the friends and family members who have worn themselves out working to find effective treatment for their loved ones, all the while worrying that help won’t come in time. It’s time to stop reposting pictures, cease the scare tactics and move beyond medieval measures of public shaming. We cannot turn back the opioid epidemic until we are willing to look inward and confront the prejudices that are condemning people to humiliation rather than advocating for their recovery.
*These quotes and the pictures in question can be found on the city’s Facebook page. I am not linking to the page on purpose.
**Radiolab’s podcast “The Fix” highlights some of these changes. It’s a great show for anyone interested in more information.
(Shout out to my summer section of Introduction to Philosophy who graciously evaluated my argument and previewed this essay. Thanks!)
From the convention stage down to my Facebook feed, I feel like the public conversation has grown thick with rhetoric. Don’t get me wrong. Between the most recent repetitions of racial injustice, all too common acts of gun violence and the current dysphoric election cycle, I think there are plenty of legitimate reasons to be worked up, to be critical and to be angry. Persuasive speech is appropriate and inevitable. However, when rhetoric gets ratcheted up, we risk miscommunicating our position in favor of inflaming passion.
Among the oratorical misdeeds common to our current conversation, I want to focus specifically on confusion between moral condemnation and ethical censure. What I’m not suggesting is some feeble retreat to “making nice.” As a feminist, I’m very aware (and terrifically weary) of the way politeness has been used to silence dissent. Rather, I believe at the current moment we have necessary, hard things to say to one another and, just as importantly, we especially need to be clear if we hope the conversation will be productive. The difference between moral and ethical is a nuanced one. While both attempt to distinguish between “good” and “bad,” those conclusions are reached in significantly different ways. I think making a distinction between them can help us more accurately hear one another.
When I teach Introduction to Ethics, one of the first texts I have students read is Plato’s Euthyphro. First of all, Euthyphro introduces students to the basics of philosophical argument. Philosophical argument, in turn, forms the logical foundation of ethics. In ethics, right and wrong are judged by reason. Secondly, Euthyphro draws out how morality, in contrast to ethics, approaches questions of conduct. In the text, the philosopher Socrates questions the title character, Euthyphro, about his motives for prosecuting his father for the murder of a family slave.* This was a strange act in Greek society given both the cultural reverence of the father figure and the widespread existence of slavery. When Socrates presses him, Euthyphro answers that he is prosecuting his father for the sake of “piety” or “holiness.” Throughout the rest of the dialogue, Euthyphro attempts to define “piety” to the satisfaction of Socrates (SPOILER ALERT: he doesn’t do a great job).
Why does Socrates drill down on the word “holy?” Apart from rigorous interrogation being the hallmark of the Socratic method, it’s because holiness indicates something beyond being merely correct or logically acceptable. By using “holy,” Euthyphro seems confident that what he is about to do is not only right, but also better than other courses of behavior. Euthyphro sees himself as occupying a vantage point from which he can better judge the situation. In Euthyphro’s case, this position is a result of his religious worldview; what he is doing is approved by or for the sake of the gods.
But morality should not be understood as being confined solely to religion. Broadly speaking, the signature of morality is that it appeals to authority and tradition first, reason second. Sometimes, religion is that authority, but ethnic or political communities may play that role as well. To say that morality appeals to reason second is not to say that morality can’t be reasonable. It’s more like saying: because of my upbringing, I believe that 3 + 2 is the preferable way to get to 5. 3 + 2 is a perfectly reasonable way of adding to 5, but preferring it over 4 + 1 is just a matter of what I was taught to believe.
I’d also like to avoid the misunderstanding that morality is necessarily arrogant. The distinctive mark of morality isn’t “better than you,” it’s “what’s best for us.” Moral judgments reveal who we think we are as much as what we think to be good or evil. Because it is grounded in authority and tradition, morality comes from within a particular community, a specific perspective. It’s important to note that typically, my identity is not in question during the course of an issues-based conversation. Rather, I speak out of my community of origin. Therefore, moral assessments add a dimension of identity that can’t be ignored.
So in public conversations, which are conversations that span across communities and identities, it’s really helpful to understand whether an objection is being made on moral or ethical grounds; in other words, whether it is more part of someone’s traditionally held beliefs or whether it is more about what makes sense. This helps me to distinguish what’s on the table in the conversation. When we ignore the difference we lose opportunities to agree with one another or we potentially argue over the wrong point.
Let me give a personal example: I am a vegetarian who doesn’t eat meat for ethical reasons. Although there are plenty of vegetarians who abstain from meat for moral reasons, I’ve decided to land my diet lower on the food chain because I think it makes sense as a more fair way to share in the world’s food supply. I don’t think eating meat is necessarily evil. Making that distinction in conversation has allowed me to have discussions with meat eaters about issues we might agree on like conservation and resource distribution. I’ve even learned some things about sustainable meat-eating practices (although I remain untempted).
As another example, I grew up in a faith community that taught homosexuality was wrong. This is a moral judgment drawn from traditional teachings on portions of sacred texts. Growing up, I accepted this position along with many other moral positions shared by my faith community. As I became an adult however, I met members of the LGBTQ community (very patient and gracious members) who made a compelling ethical case for equal treatment under civil law. Even before I began to question my moral assumptions about queer sexuality, I could agree that it made sense that civil rights should be extended to all citizens. This initial agreement opened the doors for me to begin exploring my religion for other voices and teachings on the issue. Eventually, I decided both that the judgement of homosexuality as morally wrong should not be accepted uncritically within my faith tradition and also that I no longer wanted to associate with a faith tradition that held those views.
It is equally helpful to observe cases in which the distinction cannot be made, instances when morality and ethics strongly align. I would argue racial injustice (police profiling and brutality, higher rates of arrest, sentencing inequities) is both unethical and immoral. It does not make sense to apply the law differently on the basis of race; also, doing so attacks the belief that all persons are equally due basic human dignity.
Distinguishing the moral from the ethical does not provide us with a no fail recipe for civil conversation. Far from being an easy fix, it may initially make matters seem more complex. However, if we practice speaking more precisely about the origins of our concerns, we have a better chance of focusing on solutions that we might all agree on.
*As modern readers, our sympathies are certainly with Euthyphro’s intent.
If you’re a regular Origami Elephants listener, then you know that I recently had a baby with my partner, Peter. If you’re a regular Norville Rogers reader, then you know that writers around here often turn to film to discuss themes theological, philosophical and cultural. So why not a look at pregnancy in movies? Turns out, there’s plenty of good reason why not. In taking a look at what’s available, it rapidly emerges that culturally we have a very narrow view of pregnant women.
If you start with the lists — “25 Movies You Need to Watch During Pregnancy” or “10 Movies to Watch While You’re Expecting” — a couple of things become immediately obvious. First of all, there aren’t many films with pregnant protagonists out there. Which is strange if you compare to the abundance of superhero movies, considering how many women actually get pregnant versus how many superheroes actually foil plots for world destruction.* The official lists are full of movies that are ten and twenty years old. Nothing against the 80s and 90s (that’s my era after all); it’s just an indication that the lists have to dig back awfully deep in order to get up to a respectable number of picks. I’m hoping that the relatively small number of films also accounts for the almost universal inclusion of Junior, a movie-long gag in which Arnold Schwarzenegger carries a baby to term and beyond all credibility. On the other hand, I fully support the fact that Rosemary’s Baby also shows up here and there. Not only is it a good film, but the depiction of the isolation of pregnancy is realistic, despite the demonic character of the pregnancy.
Secondly, most of these movies are comedies. Considering how touchy our society is about women’s reproductive rights and female sexuality, it’s kinda weird that Hollywood also seems to insist that pregnancy is yuk-yuk hilarious. Most of the films focus on the absurdity of pregnancy symptoms and the uncomfortable reactions they elicit from male partners. However, slapstick-level comedy is not central to most women’s experience of pregnancy. And while we’re talking about it, what’s with everyone’s water breaking in embarrassing places?! Only 10-15% of women have their water break prior to labor, and only a portion of those cases happen in public. So let’s agree to take a break (ba-dum-bum) from that joke already.
I would also like to point out that many of these movies center around unintended pregnancies. As almost half of all pregnancies in America are unintended, this at least rings true to the statistics. Sometimes the surprise is treated as the opening laugh line, but considering the very real physical and mental health consequences of mistimed and unwanted pregnancies, the subject, like pregnancy in general, seems hardly suitable for comedic treatment. Dramas that develop their stories around an unintended pregnancy often feature teenagers whose main mistake seems to be the decision to have sex in the first place. I remember being subjected to For Keeps? in high school as part of a campaign to scare me and my classmates celibate. What these films don’t address is the same thing that abstinence-only sex education ignores: there are some very safe and effective means of birth control out there. But then again, how often does a cinematic sex scene of any description include the couple either asking about or practicing safe sex?
Another thing the “crazy kids” story line misses is that there are marked and significant disparities in unintended pregnancy rates among disenfranchised demographic groups, such as poor and minority communities. With disparities like these, unintended pregnancy should not be understood or portrayed simply as a lapse in judgment, but as related to restricted access to education and medical care, which needs to be addressed at a policy level.**
Lastly, “pregnancy” movies are about pregnant women, as opposed to women who are pregnant. Let me break that down a little. What I mean to point out is that, almost across the board, the plots of these movies are driven primarily by the pregnancy, while the woman’s talents, career, and passions are sent to the sidelines. The pregnancy isn’t portrayed as a part of her life as much as her life is reshaped around her pregnancy. Now I’m not saying that pregnancy can’t be disruptive or isn’t life changing. But movies, and TV shows and ads, indicate how extreme our cultural expectations are. Consider commercials for a moment. You won’t see an ad for a copier or a weed whacker featuring a woman who is pregnant, although pregnant women definitely use these products. In fact, you won’t even see a pregnant woman selling typical feminine products like shampoo or makeup. Pregnant women are typically depicted only with specific reference to their pregnancy. In this way, “pregnancy” is not just a medical condition, but a cultural condition that indicates very narrow parameters of interest, activity and gender expression. As a result, you’ll almost never see an ad or a movie in which a woman just happens to be pregnant.
Which is why Fargo is my favorite “pregnancy” movie. In light of the fact that I rank The Mothman Prophecies among my favorite Christmas movies, this shouldn’t be too surprising. And yes, I know Fargo was made in the 90s. And Fargo *is* a kind of comedy, albeit a super dark kind of comedy. But here, the jokes have nothing to do with the fact that the movie’s hero, Chief of Police Marge Gunderson, is pregnant. Instead, her pregnancy is treated as color, a facet of her character that brings interest and informs, rather than defines. Within Fargo, there are no tissue-paper-stuffed baby shower scenes, no acrimonious discussions about quirky family names, no panic attacks in front of the mirror about a changing body shape. Marge is depicted going about her normal life: her job, her marriage. As the police chief, Marge is an authority figure. She is a problem solver and a brave officer. The movie only references her pregnancy twice: once when Marge almost gets sick at a crime scene and once when Marge herself mentions her “condition” in order to disarm a suspect. The movie even allows Marge to struggle with an attraction to an old flame, which challenges the idea that her sexuality is nullified by her pregnancy and allows us to see her as a person who is married in the same way we see her as a woman who is pregnant.
When it’s all said and done, I’m not making a case for more pregnancy movies. Rather, I’d be happier seeing more pregnant female characters populating other types of films (of course, that would mean we’d need better, stronger female characters in movies overall HINT HINT). More importantly, I think we need to start pushing back on the underlying cultural commitments that limit our picture of what being pregnant means.