Facebook and the perils of a personalized choice architecture

The recent Facebook-Cambridge Analytica chaos has ignited a fire of awareness, bringing the risks of today’s data surveillance culture to the forefront of mainstream conversations.

This episode and the many disturbing prospects it has emphasized have forcefully awakened a sleeping giant: people are seeking information about their privacy settings and updating their app permissions, a “Delete Facebook” movement has taken off, and the FTC has launched an investigation into Facebook, causing Facebook’s stock to drop. A perfect storm.

The Facebook-Cambridge Analytica debacle comprises pretty simple facts: users allowed Facebook to collect personal information, and Facebook facilitated third-party access to that information. Facebook was authorized to do so pursuant to its terms of service, which users formally agreed to but rarely truly understood. The Cambridge Analytica access was clearly outside the scope of what Facebook, and most of its users, authorized. Still, this story has turned into an iconic illustration of the harms generated by massive data collection.

While it is important to discuss safeguards for minimizing the prospects of unauthorized access, the lack of consent is the wrong target. Consent is essential, but its artificial quality has long been established. We already know that our consent is, more often than not, meaningless beyond its formal purpose. Are people really raging over Facebook failing to detect the uninvited guest who crashed our personal information feast, when we’ve never paid attention to the guest list? Yes, it is annoying. Yes, it is wrong. But it is not why we feel that this time things went too far.

In their 2008 book, “Nudge,” Richard Thaler and Cass Sunstein coined the term “choice architecture.” The idea is pretty straightforward: the design of the environments in which people make decisions influences their choices. Kids’ happy encounters with candy in the supermarket are not serendipitous: candy is commonly placed where children can see and reach it.

Tipping options in restaurants usually come in threes because individuals tend to go with the middle choice, and you must exit through the gift shop because you might be tempted to buy something on your way out. But you probably knew that already, because choice architecture has been here since the dawn of humanity and is present in any human interaction, design, and structure. The term choice architecture is ten years old, but choice architecture itself is far older.

The Facebook-Cambridge Analytica mess, together with the many indications that preceded it, heralds a new type of choice architecture: personalized, uniquely tailored to your own individual preferences, and optimized to influence your decisions.

We are no longer in the familiar zone of choice architecture that applies equally to all. It is no longer about general weaknesses in human cognition. It is also not about biases that are endemic to human inference. It is not about what makes humans human. It is about what makes you yourself.

When the information from various sources coalesces, the different segments of our personality come together to present a comprehensive picture of who we are.  Personalized choice architecture is then applied to our datafied curated self to subconsciously nudge us to choose one course of action over another.
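To make that mechanism concrete, here is a minimal, purely hypothetical sketch in Python. It assumes a pipeline of the kind described above: trait estimates inferred from several data sources are merged into a single profile, and the framing of a choice is then selected to maximize its predicted effect on that particular person. Every source, trait, framing, and score below is invented for illustration; this is not any real platform’s code.

```python
# Hypothetical sketch of a personalized choice architecture.
# All traits, framings, and numbers are invented for illustration.

from dataclasses import dataclass

@dataclass
class Profile:
    # The "datafied self": trait estimates merged from several sources.
    anxiety: float      # 0..1, inferred emotional disposition (assumed)
    conformity: float   # 0..1, tendency to follow the crowd (assumed)
    openness: float     # 0..1, receptiveness to novelty (assumed)

# Candidate framings of the *same* choice; only the presentation differs.
# Each framing scores how strongly it is predicted to move this profile.
FRAMINGS = {
    "fear":    lambda p: p.anxiety,     # loss-framed message
    "social":  lambda p: p.conformity,  # "your neighbors chose X"
    "novelty": lambda p: p.openness,    # "be the first to try X"
}

def coalesce(*sources: dict) -> Profile:
    """Merge trait estimates from multiple data sources by averaging."""
    merged = {}
    for trait in ("anxiety", "conformity", "openness"):
        vals = [s[trait] for s in sources if trait in s]
        merged[trait] = sum(vals) / len(vals) if vals else 0.5
    return Profile(**merged)

def pick_framing(profile: Profile) -> str:
    """Select the framing with the highest predicted persuasive effect."""
    return max(FRAMINGS, key=lambda name: FRAMINGS[name](profile))

if __name__ == "__main__":
    browsing = {"anxiety": 0.8, "openness": 0.3}   # inferred from one source
    social   = {"anxiety": 0.7, "conformity": 0.4} # inferred from another
    user = coalesce(browsing, social)
    print(pick_framing(user))  # -> "fear": this user gets the loss-framed pitch
```

The sketch makes the point the article turns on: the set of options never changes, only their presentation, and the presentation is chosen for you alone.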

The soft spot at which personalized choice architecture hits is that of our most intimate self. It plays on the blurring line between legitimate persuasion and coercion disguised as voluntary decision. This is where the Facebook-Cambridge Analytica story catches us: in the realization that the right to make autonomous choices, the basic prerogative of any human being, might soon be gone, and we won’t even notice.

Some people are quick to note that Cambridge Analytica did not use the Facebook data in the Trump campaign, and many others question the effectiveness of the psychological profiling strategy. However, none of this matters. Personalized choice architecture through microtargeting is on the rise, and Cambridge Analytica is neither the first nor the last to make successful use of it.

Jigsaw, for example, a Google-owned think tank, is using similar methods to identify potential ISIS recruits and redirect them to YouTube videos that present a counter-narrative to ISIS propaganda. Facebook itself was accused of targeting at-risk youth in Australia based on their emotional state. The Facebook-Cambridge Analytica story may have been the first high-profile incident to survive numerous news cycles, but many more are sure to come.

We must start thinking about the limits of choice architecture in the age of microtargeting. Like any technology, personalized choice architecture can be used for good and for evil: it may identify individuals at risk and lead them to get help; it could motivate us to read more, exercise more, and develop healthy habits; it could increase voter turnout. But when misused or abused, personalized choice architecture can turn into a destructive, manipulative force.

Personalized choice architecture can frustrate the entire premise behind democratic elections – that it is we, the people, and not a choice architect, who elect our own representatives. But even outside the democratic process, unconstrained personalized choice architecture can turn our personal autonomy into a myth.

Systemic risks such as those induced by personalized choice architecture will not be solved by people quitting Facebook or dismissing Cambridge Analytica’s strategies.

Personalized choice architecture calls for systemic solutions that involve a variety of social, economic, technical, legal, and ethical considerations. We cannot let individual choice die at the hands of microtargeting. Personalized choice architecture must not turn into the nullification of choice.

 
