In Case You Were Ever Under the Misapprehension That Thin Privilege Doesn't Exist…

by Shaker RachelB

[Trigger warning for fat hatred.]

Google Instant and "common knowledge": Do you want to think what your neighbors are thinking? Do you even want to know what your neighbors are thinking…?

Several bloggers have written about Google Instant since it launched in early September. A feature that suggests a range of search completions while you're still typing, Instant is supposed to cut down on the time it takes to perform a search. But as various bloggers have pointed out, Instant, by "guessing" what you mean based on other people's queries, also presents an interesting tool for analyzing a number of linguistic issues—what is considered prurient, what is considered hateful, and what is considered common knowledge.

A number of people noticed that certain words—including "bisexual" and "lesbian"—returned no Instant results; see, for example, Grace Chu's post at After Ellen, "Google doesn't want to give us lesbians."1

Kelly F., a Google employee, responded to an early question about why Instant did not offer completions for some searches: "The algorithms we use to remove a prediction from autocomplete consider a variety of factors. Among other things, we exclude predictions for queries when the query itself appears to be pornographic, violent or hateful. Our algorithms look not only at specific words, but also at compound queries based on those words, and across all languages."

So what is "pornographic, violent or hateful," anyway, according to Google?

An ongoing discussion [trigger warnings for roughly everything apply] at 2600.com, "Google Blacklist: Words That Google Instant Doesn't Like," gives a range of terms that don't return Instant results. Please note that many of the terms on the list are homophobic, misogynist, racist, or transphobic insults that would be unwelcome at Shakesville. Some terms referring to sexual violence also appear on the list.2

Words and phrases that Instant does not autocomplete offer insight into what Google finds "pornographic, violent or hateful," but they also offer insight into what people are already searching for: Regulations tend to appear in response to problems, not in anticipation of them. The completions Instant does suggest reflect what people are already searching for, too. And here's where we run into a chicken-and-egg problem. As Google's page explaining Instant points out, "Even when you don't know exactly what you're looking for, predictions help guide your search."

Danger, Will Robinson!

Why would I be troubled that Instant guides your search "when you don't know exactly what you're looking for"? First, that kind of guiding is potentially coercive. I teach composition, mostly to first-year college students. When I don't understand what a student is trying to say about a passage we've just read, my first impulse is to paraphrase what I think zie just said, then ask, "Am I understanding you correctly?" In my experience, this has been a dodgy pedagogical impulse. A student who isn't quite sure what zie is trying to say, unless zie is unusually comfortable with hir classmates and me, chooses the path of least resistance and replies, "Yes." If you don't know exactly what you mean, having words put in your mouth or fingers might not just guide your search: It might shape that search, too.

My second problem? The efforts Google has gone through to keep Instant from autocompleting hateful terms are ultimately ineffective. When the people who designed Instant fail to recognize an axis of privilege—say, thin privilege—the algorithm they use to head hatefulness off at the pass is going to fail. What's more, when the same search engine suggests how you might complete your search, it guides you toward a search that many other people have already done—other people living in a society that prizes one body type at the expense of others. And some of the completions it offers reinforce bigotry against fat people in a way that is both appalling and unsurprising.
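
Stepping briefly outside the argument: the safeguard Kelly F. describes rests, at least in part, on a list of terms the designers already recognize as hateful. Here's a minimal, hypothetical sketch in Python of that kind of keyword-blocklist filtering (the terms and the allow_suggestion function are invented for illustration; this is emphatically not Google's actual code, which also examines compound queries). It shows why the approach breaks down whenever an axis of privilege goes unrecognized: a completion built entirely from ordinary, unlisted words never trips the filter, however hateful the whole phrase is.

```python
# Hypothetical illustration only -- not Google's code or data.
BLOCKED_TERMS = {"slur_a", "slur_b"}  # stand-ins for whatever terms the designers thought to list

def allow_suggestion(query: str) -> bool:
    """Allow a suggested completion unless it contains a blocklisted term."""
    words = set(query.lower().split())
    return words.isdisjoint(BLOCKED_TERMS)

# A hateful completion made of "ordinary" words sails through, because no one
# ever put those words on the list; only the terms someone thought of are caught.
print(allow_suggestion("fat people are <some hateful completion>"))  # True
print(allow_suggestion("slur_a jokes"))                              # False
```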

Which brings me to a cluster of posts in the Fatosphere this weekend, all of which I recommend if you have the intestinal fortitude. Sugared Venom incisively examines the thin privilege and fat hate visible in the way Google completes queries like "fat people are" and "fat people should." Hir post and the follow-ups use screen caps of searches to illustrate their commentary, so trigger warnings for fat hate and eliminationism apply.

Sleepy Dumpling at Fat Heffalump extended Sugared Venom's experiment by plugging in "fat people m," "fat people g," and "fat people h" to see how Google would complete them. (As it turns out, badly. Although fat people do apparently have Wisconsin.) Due to a search result reported in one of the comments, this post probably needs an extra trigger warning for sexual violence.

Brian at Red No. 3 responded by investigating what happens when you type "thin people" into a Google search window. Unsurprisingly, he found none of the vitriol that Sugared Venom and Sleepy Dumpling observed in their examples.

As the thin-privileged sister of a fat brother, I would guess that if you are fat, the viciousness explicit in these search completions is not news to you.3 It's disheartening that what Sugared Venom's and Sleepy Dumpling's searches turned up is apparently prevalent enough to pass for bog-standard common knowledge. And it's infuriating that Instant compounds the problem, making it easier for bigots to find the sneering they're looking for, and harder for those of us who just want to find a damn pair of hiking boots.

--------------------------------------

1 As someone has pointed out in every comment thread I've seen discussing Google Instant's mysterious treatment of the word "lesbian," you can find results for the word "lesbian" using Google; you just have to hit Enter as you would have before Instant debuted. As of my last check (October 25, 2010), Google offers search completions for "lesbian" once you type the final "n": "lesbian dating," "lesbian vampire killers," and "lesbian bed death" are chief among them. Woot. There are no suggested completions yet for the search term "bisexual." (Insert your favorite joke about bisexual invisibility here.)

2 TW for reference to sexual violence: One commenter on 2600.com suggested that because the word "rape" is currently on the list of terms that Instant doesn't autocomplete, survivors' resources might be marginally harder to locate. Furthermore, the presence of "rape" among sexual terms that don't connote violence, and that would appear here without a trigger warning, makes me think that Google is not actually differentiating between reporting and treating sexual violence, committing said sexual violence, and having sex with one or more enthusiastically consenting adult partners. Which, you know, fails on many counts.

3 Until reading Sugared Venom's post this weekend, I hadn't done a Google search on fat-related issues since discovering Fat Acceptance blogs. And I had forgotten how much hateful and misinformed stuff there was to sift through in order to find the good and useful FA nuggets the first time. Being able to forget that? Privilege.

