Sorry, activists, another flawed survey doesn't prove rape culture
Yet another sexual assault survey was released this week. Cue the usual media outlets predictably touting it as proof that America is in the midst of a "rape culture" and that college women are victimized at rates similar to those in the Congo.
The alarming statistics included in summaries make for good headlines. Huffington Post wrote: "There's no more denying campus rape is a problem. This study proves it." Media Matters went with: "New study once again debunks right-wing media's favorite myths about campus sexual assault statistics."
These outlets can claim that the survey proves the existence of a campus rape epidemic, but the data prove just the opposite.
Research with an agenda
The lead researcher for the study was Christopher Krebs, and this is not his first sexual assault survey purporting to find that 1 in 5 women in college had been sexually assaulted. Asked by reporter Emily Yoffe, then with Slate and now with the Atlantic, about his original survey, Krebs said: "we don't think one-in-five is a nationally representative statistic." A similar caution is included in this study.
"Importantly, neither this sample of nine schools nor the data collected from the students attending them are intended to be nationally representative of all college students or institutions of higher education," the researchers wrote.
The survey was funded through the Justice Department's Office on Violence Against Women, hardly a neutral agency. The survey was conducted through the Bureau of Justice Statistics, whose data on the issue show that the actual rate of victimization is closer to 6 out of 1,000 female students. The DOJ hired RTI International, a research group that was responsible for a previous campus sexual assault survey claiming 1-in-5 women are victims.
This survey was created by people who knew how to elicit high reported rates of victimization.
This survey, like all that came before it, broadly defines sexual assault and sexual battery so as to include everything from a stolen kiss to rape. For instance, sexual battery was defined as "any unwanted and nonconsensual sexual contact that involved forced touching of a sexual nature, not involving penetration." Researchers said this "could include forced kissing, touching, grabbing or fondling of sexual body parts."
The survey also included a broad definition of coercion, which was open to many interpretations.
"Since the beginning of the current academic year in [FILL: August/September], 2014, has anyone had sexual contact with you by threatening to tell lies, end your relationship, or spread rumors about you; making promises you knew or discovered were untrue; or continually verbally pressuring you after you said you didn't want to?" the survey asked.
Students were also asked if they were ever "unable to provide consent" because of a host of issues, including being drunk. Students were asked numerous times about being "incapacitated," yet the term was never defined, meaning even tipsy students could assume they were incapacitated. And with current campus definitions of sexual assault allowing a tipsy feeling to negate consent, on the theory that a person wouldn't have done the same thing sober, the lack of a definition leaves much to be desired.
Self-reported surveys are laughably unreliable
People, if given an option, will say anything. One need only look at the past few years of political polling to see that self-reporting is wildly inaccurate. In 2012, Gallup polling suggested Mitt Romney would win the presidency. Similar misses occurred during the 2014 midterms, the Scottish independence referendum and the United Kingdom's general election. What people say anonymously doesn't always match reality.
Basing public policy on surveys with manipulatively worded questions and built-in agendas is never a smart move.
This survey attempted to quell the argument that campus sexual assault surveys suffer from nonresponse bias: the criticism that people who have, or believe they have, experienced sexual assault are more likely to take such a survey than those who have not.
In the new BJS study, more women responded than men, with the average response rate across the nine schools surveyed being 54 percent for women and 40 percent for men. Male response rates met the researchers' response targets in just five of the nine schools surveyed.
Beyond that, the researchers used a lot of scientific language to make it appear as though there wasn't a nonresponse bias. They compared the demographics of the students who replied with the demographics of the students who didn't reply and concluded that at worst there was a low nonresponse bias.
What they didn't compare — and indeed what they could not possibly have compared without participation — was whether those who did not participate did so because they had no experience or interest in current sexual assault discussions. Mark Perry, a scholar at the American Enterprise Institute and economics professor at the University of Michigan-Flint, explained to the Washington Examiner what the researchers did in this section of the survey.
"If there is no nonresponse bias, they can conclude that the students who took the survey are representative of the entire student population," Perry said. "So it looks like they made some type of statistical 'nonresponse adjustment' based on the analysis of potential nonresponse bias, even though they claim they didn't find any evidence of nonresponse bias."
Open to interpretation
Without investigations and follow-ups, some of the responses may not mean what the researchers want them to mean. For instance, the researchers found that roughly 68 percent of the alleged incidents of completed rape identified by the survey occurred off campus. And to be clear, that's 68 percent of the 2,380 completed rapes that researchers (and a willing media) believed actually occurred across just nine American schools in one year.
Perhaps some of these off-campus encounters occurred in apartment complexes between two students, and would fall under a school's responsibility. But with so many alleged rapes occurring, perhaps some took place during Spring Break or in locations where the school does not have authority. One can see how easy it is to question data.
Assuming true data, straight women less likely to be victimized than non-heterosexual students
I'd like to question the people who buy into the claims of this survey. Because if the data really are true as they say, then we need to be focusing efforts on the LGBT community. The survey found that "the prevalence of sexual assault was significantly higher for non-heterosexual than heterosexual female students at the nine schools."
So if we aren't supposed to question the survey and to take it at face value, then we should stop pursuing heterosexual males (like fraternity members) and focus efforts on the LGBT community.
I still question the survey, because we don't know who victimized the non-heterosexual students. The perpetrators could have been heterosexual students or other non-heterosexual students. And again, I believe the victimization rates reported in this study are unrealistically high. But if people are touting this survey as proof of a rape culture, then they also have to accept that this rape culture is more prevalent in the LGBT community. That wouldn't fit their narrative of white, straight, male fraternity athletes committing rape.
Assuming true data, students surveyed don't believe they're victims
One data point conveniently left out of reports by people who buy into these surveys is the number one response for why people who answered in the affirmative for the unwanted sexual contact questions chose not to report. The number one response in every survey I have seen — by a landslide — is always that the students didn't think the incident was serious and didn't want to take action.
So again, assuming everyone answered truthfully without exaggeration, upwards of 80 percent of respondents didn't report allegedly unwanted contact because they "did not need assistance, did not think the incident was serious enough to report or did not want any action taken."
That's a pretty broad category, honestly, but it still suggests that the large number of "victims" identified by the survey and the media are not actually victims and do not see themselves as such.
One might think: Well, maybe these respondents gave that answer because they were afraid of what would happen if they reported. But variations of that fear were offered as other options. Respondents were much less likely to say they stayed silent because they feared retaliation, worried they wouldn't be believed or expected to be blamed for the incident.
The second-highest reason given for why students didn't report is that they thought "others might think you were partly at fault," but it received 20 to 30 percentage points fewer responses than the top reason. For sexual battery, the top reason by far for not reporting was that respondents did not believe the incident was serious, with roughly 80 percent of respondents saying so compared to the next highest answer — victim-blaming — which got around 20 percent or less (respondents could choose more than one answer).
This also suggests the actual victim rate is much, much lower than surveys and the media would have you believe, and raises the question: Why do researchers and the media so badly want women to feel like victims? Isn't being the victim of a sexual assault one of the worst things a person can be? Why would anyone wish that on another person?
Unless, of course, the narrative matters more than the actual people.
One last thought: Even if every word of this survey were true (it's not, for the reasons I have stated), it still would be no excuse to eviscerate due process rights. Creating a new culture where accusation equals guilt invites liars and others who want to game the system to make false accusations.
We have rule of law in this country, and no self-reported survey should change that.