[Image: binary code, from The Matrix, Warner Bros., 1999]
From The New York Review of Books:
A Google search—which Chorost would have us doing in our own technologically modified heads—"curates" the Internet. The algorithm is, in essence, an editor, pulling up what it deems important, based on someone else's understanding of what is important. This has spawned a whole industry of search engine optimization (SEO) consultants who game the system by reconfiguring a website's code, content, and keywords to move it up in the rankings. Companies have also been known to pay for links in order to push themselves higher up in the rankings, a practice that Google prohibits and sometimes cracks down on. Even so, results rise to the top of a search query because an invisible hand is shepherding them there.
It’s not just the large number of search variables, or the intervention of marketers, that shapes the information we’re shown by bringing certain pages to our attention while others fall far enough down in the rankings to be kept out of view. As Eli Pariser documents in his chilling book The Filter Bubble: What the Internet Is Hiding from You, since December 2009, Google has aimed to contour every search to fit the profile of the person making the query. (This contouring applies to all users of Google, though it takes effect only after the user has performed several searches, so that the results can be tailored to the user’s tastes.)
The search process, in other words, has become "personalized," which is to say that instead of being universal, it is idiosyncratic and oddly peremptory. "Most of us assume that when we google a term, we all see the same results—the ones that the company's famous PageRank algorithm suggests are the most authoritative based on other pages' links," Pariser observes. With personalized search, "now you get the result that Google's algorithm suggests is best for you in particular—and someone else may see something entirely different. In other words, there is no standard Google anymore." It's as if we looked up the same topic in an encyclopedia and each found different entries—but of course we would not assume they were different, since we'd be consulting what we took to be a standard reference.
Among the many insidious consequences of this individualization is that by tailoring the information you receive to the algorithm's perception of who you are, a perception that it constructs out of fifty-seven variables, Google directs you to material that is most likely to reinforce your own worldview, ideology, and assumptions. Pariser suggests, for example, that a search for proof about climate change will turn up different results for an environmental activist than it would for an oil company executive and, one assumes, a different result for a person whom the algorithm understands to be a Democrat than for one it supposes to be a Republican. (One need not declare a party affiliation per se—the algorithm will prise this out.) In this way, the Internet, which isn't the press, but often functions like the press by disseminating news and information, begins to cut us off from dissenting opinion and conflicting points of view, all the while seeming to be neutral and objective and unencumbered by the kind of bias inherent in, and embraced by, say, The Weekly Standard or The Nation.
“Mind Control & the Internet”, Sue Halpern, The New York Review of Books