User Review
Sometimes, a book is more valuable for the conversations it provokes than for its actual content. This may turn out to be one of those books. Eli Pariser pushes the panic button a bit too frequently for my taste, especially when he starts talking about some of what he terms "odd" viewpoints held by some in Silicon Valley. Of course, my viewpoints are pretty odd, so that may be a personal issue. :)
The author tends to lump together all personalization as a concern. While I am 100% sold on the worry that filtering creates a bubble of sycophants, the criticism of tailored advertising seems to be more that it isn't very good yet. A lot of the criticism seems aimed more at the general characteristics of the modern corporation than at the specific companies cited. Those who are (justifiably) concerned about this should check out "Move to Amend" for much more on the topic.
I was also hoping for more actionable ways to reduce the filter effect beyond the suggestions given. The suggestions were in line with something I already do: try to read at least one thing every day that you disagree with, and be aware that everything you perceive is filtered.
Despite all of this, Pariser did a good job of supporting his key points. The book has sparked a conversation and driven actions by social media companies that probably would not have happened otherwise. It's a good introduction to the topic, but those who have been online as long as I have probably won't get a lot out of it.
User Review
Reviewing an extremely critical book whose views I strongly agree with is harder than I thought it would be. Overall, I agree with almost all of Pariser's notions. He is deeply worried about what will happen if our Internet worlds keep getting filtered and filtered until our computers know exactly what we want to ask them before we even get the chance to. First of all, the "filter bubble" is a concept, coined by Pariser, used to explain what the new generation of the Internet is really doing to us. He says that the "internet filter looks at the things you seem to like - the actual things you've done, or the things people like you like - and tries to extrapolate." This used to be a worry only about Google, but now more than a handful of sites and companies are adopting this type of optimization, especially for marketing. Business-wise, it optimizes everything. Advertisements are as streamlined as they could possibly be, and the user is never pulled in the wrong direction toward something they are uninterested in. It's a win-win situation, right? Pariser points out the flaws in this model that has become so embedded in our technological framework.
This book is for everybody who is curious about where things could go if we are not careful, as well as for media critics wondering what's next. In her book "Personal Connections in the Digital Age," Nancy Baym critically defines technological determinism, and I think Pariser's opinions are closely aligned with that theory. We hope that the public sphere is stronger than whatever happens at the next Apple conference. We hope that the public will shape the technology, not the other way around. But with search personalization now a legitimate thing we are dealing with, I am more fearful of what will happen to society if our individual filter bubbles get smaller and more confined (as is Pariser).
Pariser is good at taking the other side as well, understanding that this is a sticky situation because having optimized results fitting our interests really does help, a lot of the time. He says that "to some extent, we've always consumed media that appealed to our interests and avocations and ignored much of the rest," but the difference now is that there are three new dynamics:
1) You are the only person in your filter bubble.
2) Your filter bubble is invisible: it's hard to believe that the results showing up on your Google or Yahoo page are biased or subjective, since you are never told that this is what is going on.
3) You don’t choose to enter the bubble. Unlike television, where you know, most of the time, what type of view you are getting, the Internet is seemingly democratic and open, leading you to believe that what you stumble upon is really just stumbling.
And Pariser says it is not just stumbling! This is the point he makes again and again that resonates with me. We believe that the Internet is full of free information waiting for us to soak up; what we don't know is how much information and media we are missing simply because of our past search history. What if one day you binged on Justin Bieber videos? You might start seeing more celebrity-focused news and advertisements rather than what's currently going on in Libya or Egypt. This is something to be worried about.
User Review
Eli Pariser's "Filter Bubble" largely restates a thesis developed a decade ago in both Cass Sunstein's "Republic.com" and Andrew L. Shapiro's "The Control Revolution": that increased personalization is breeding a dangerous new creature -- Anti-Democratic Man. "Democracy requires citizens to see things from one another's point of view," Pariser notes, "but instead we're more and more enclosed in our own bubbles."
Pariser worries that personalized digital "filters" like Facebook, Google, Twitter, Pandora, and Netflix are narrowing our horizons about news and culture and leaving "less room for the chance encounters that bring insights and learning." "Technology designed to give us more control over our lives is actually taking control away," he fears.
Pariser joins a growing brigade of Internet pessimists. Almost every year for the past decade a new book has been published warning that the Internet is making us stupid, debasing our culture, or destroying social interaction. Many of these Net pessimists -- whose ranks include Andrew Keen (The Cult of the Amateur), Lee Siegel (Against the Machine), Jaron Lanier (You Are Not a Gadget) and Nicholas Carr (The Shallows) -- lament the rise of "The Daily Me," or the rise of hyper-personalized news, culture, and information. They claim increased information and media customization will lead to close-mindedness, corporate brainwashing, an online echo-chamber, or even the death of deliberative democracy.
Implicitly, criticisms like those set forth by Net pessimists represent a call for a return to a "simpler time" and some mythical "good ol' days" when someone wiser than us was setting the agenda, or when our options were limited to things that were supposedly better for us. But were we really better off back then? It's largely revisionist history. The good ol' days weren't so great. By most measures we're more informed and interactive than ever before. Here's a simple test that works particularly well for anyone over the age of 35: Did you have more serendipitous encounters with alternative viewpoints before or after the rise of the Internet?
Most of us had very limited interactions with people and ideas beyond our communities before the Net. Even as modern technology has allowed increased user-customization, it has also opened our eyes to a world of new ideas, perspectives, and culture. The Digital Age is more personalized but also more participatory. It promotes greater cultural heterogeneity and gives everyone a better chance to be heard.
Pariser doesn't offer much of a blueprint for how he'd like to change things. That's unsurprising, since the logical conclusion to draw from his thesis is that someone should be doing more to de-personalize the Net and force us to consume more of the information that they think is good for us.
The problem with this "eat your greens" approach -- besides being somewhat elitist -- is that it just isn't practical. People will continue to want, and to get, a more personalized web experience. But that doesn't mean deliberative democracy is dying. As the existence of MoveOn.org and countless groups like it proves, vigorous debate and political activism have never been stronger.