Eli Pariser's new book The Filter Bubble: What the Internet Is Hiding from You is a must-read for pretty much anyone who uses the Internet. Eli breaks down troubling trends emerging on the World Wide Web that threaten not only individual privacy but also the very idea of civic space.
Of key concern to Eli is "web personalization": algorithms that track your individual web use and help you more easily find the things the code "thinks" will pique your interest. There's a daunting amount of information out there, and sometimes it can feel overwhelming to even begin sorting through it. Personalization can help. For instance, I can find music that fits my tastes by using Pandora, or movies I like through Netflix. The services provided by companies like Pandora, Netflix, Amazon, et al. are designed to study us, to get to know us rather intimately, to the point where Netflix can now predict the average customer's rating of a given movie within half a star. Eli paints a picture of your computer monitor as "a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click."
Whatever the benefits, the intent of these services isn't just to benevolently help us find the things we're looking for. They're also designed to help companies find unwitting customers. When you open your web browser to shop for a product—or really for any other reason—you yourself are a product whose personal information is literally being sold. Companies that you know, like Google and Facebook, and companies you've probably never heard of (e.g. Acxiom) are using increasingly sophisticated programs to map your personality.
And it's not just creepiness and individual privacy that are at issue here. Personalization is also feeding a civic crisis. It's one thing for code to help us find music, movies and other consumer products we like. But what about when code also feeds us our preferred news and political opinions, shielding us from alternative viewpoints? Personalization now means that you and your Republican uncle will see dramatically different results when you run the exact same Google News search. You're both likely to see results that come from news sources you prefer — sources that tend to reinforce your existing opinions. Maybe your search will pull articles from NPR and Huffington Post, while his will spotlight stories from Fox News. Both of you will have your biases and worldviews fed back to you — typically without even being aware that your news feed has been personalized.
Web personalization is invisibly creating individually tailored information universes. Each of us is increasingly surrounded by information that affirms — rather than challenges — our existing opinions, biases, worldviews, and identities.
This filter bubble impacts everyone. And it poses big challenges for grassroots activists and organizers in particular.
Values reflected back: the illusion of doing something
If you're an activist, then probably a lot of your Facebook friends are activists too. Your friend Susan has been posting all week about the public workers in Wisconsin. Jacob posted an insightful read about white privilege that's at the top of your newsfeed — 50 of your friends "like" it. Sam is a climate activist, and her Facebook presence reflects it. And you just posted an article about an upcoming protest to end the U.S. occupation in Afghanistan.
When you log in to Facebook as an activist, it might feel like you're part of a mass movement. Social justice issues are front and center — as if that were the main thing people used Facebook for. That's how web personalization works on Facebook. When you click on a lot of posts about gay marriage, you will start seeing more similar posts. When you check out certain people's profiles, they'll show up more often in your newsfeed. If these folks think a lot like you do, you'll see a lot of stuff that reinforces your worldview.
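The feedback loop described above can be sketched in a few lines of Python. This is a toy illustration, not Facebook's actual ranking system: all the names, topics, and scoring here are hypothetical, and real feed algorithms weigh hundreds of signals.

```python
from collections import Counter


def record_click(clicks: Counter, topics: list) -> None:
    """Bump the user's affinity for each topic on a post they clicked."""
    clicks.update(topics)


def rank_feed(posts: list, clicks: Counter) -> list:
    """Order posts so that topics the user has clicked before rise to the top."""
    def score(post):
        return sum(clicks[t] for t in post["topics"])
    return sorted(posts, key=score, reverse=True)


# Simulate a user who mostly clicks on activism-related posts.
clicks = Counter()
record_click(clicks, ["gay marriage"])
record_click(clicks, ["gay marriage"])
record_click(clicks, ["climate"])

feed = [
    {"title": "Stamp collecting tips", "topics": ["hobbies"]},
    {"title": "Marriage equality rally", "topics": ["gay marriage"]},
    {"title": "Climate march this weekend", "topics": ["climate"]},
]

ranked = rank_feed(feed, clicks)
print([p["title"] for p in ranked])
# → ['Marriage equality rally', 'Climate march this weekend', 'Stamp collecting tips']
```

The key point is the loop: every click raises the score of similar posts, which makes them more visible, which makes them more likely to be clicked. Posts outside your existing interests never accumulate a score, so they quietly disappear from view.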
Read the complete article HERE.