Imagine owning a robot that’s programmed to follow your orders, no matter what. With full predictability guaranteed, you’d be assured of absolute compliance and would never have to utter the anguished words, “Et tu, Beep Boop? I trusted you!” But you also wouldn’t be trusting your robot in any meaningful sense: where obedience is guaranteed, nothing is at stake.
If someone says, “I don’t trust technology,” she’s probably speaking in overly general terms. If she gets more specific and says, “Well, what I mean is I don’t trust self-driving cars,” she’s still off the mark. This statement really says she’s skeptical of the competence or good faith of self-driving car designers, companies, marketers, safety inspectors, reporters, regulators, or insurers. Her suspicion is that one or more of these parties in the sociotechnical system is putting lives in danger by making unreliable claims about how the cars will perform.
These (and countless other) examples of how we interact with technology show that trust always involves three things: vulnerability, risk, and power. When we offer up our trust, mild and momentary disappointment can follow, as can the tragedy of a betrayal that irrevocably severs ties. Without these risks, trust can’t exist. To be trusting, you have to relinquish control. And tech companies like Facebook—though it’s certainly not the only one—know this.

The Problem with ‘The Circle’
I wasn’t surprised when a spokesperson for a company — let’s call it “The Circle” — recently asked if I’d like to be considered for a spot on its ethics board. (The Circle, by the way, is not Facebook in this particular example.) After all, if consumers genuinely want heightened privacy protections, greater transparency, and more sway over how platforms organize and present information, then The Circle and its competitors can’t bank on surveillance capitalism as a surefire bet.
Perhaps The Circle genuinely wants to make the world a better place and is soliciting help to find the right path and stay on it. Then again, maybe the true agenda is courting good public relations, in which case folks like me will be used to manage optics. I couldn’t tell, and if you were in my shoes, I suspect you’d also be puzzled.
Like the rest of the pack, The Circle protects its intellectual property with nondisclosure agreements (NDAs). These legal documents are designed to safeguard trade secrets. But they can also impose uncomfortable silences that impede trust. Nobody I know has any idea what’s going on behind the scenes, and if I were bound by The Circle’s NDA, my public voice could be muzzled.
It would have been pointless for me to try to gauge The Circle’s trustworthiness by scrutinizing its user agreement. Even if I obsessed over the document to infinity and beyond, I’d still be unsure what, exactly, it permits.
The dilemma I faced speaks to a larger issue. Tech companies have a lot to prove when their economic engines run on absorbing, scrutinizing, and sharing our intimate information. Sadly, the burden of demonstrating good faith is now so high that excellent companies entering the market — ones that really do aspire to add value to our lives — might unfairly pay for the sins of their predecessors. If we’ve learned from the recent past, we’ll insist they go above and beyond to convince us that things might be different this time.
After extensive deliberation, I decided against joining The Circle’s ethics board. Since trust grows and dies in situations filled with uncertainty, I’ll never know if this was the right decision.
Let’s consider the two legal documents that can impose barriers to trust: user agreements and NDAs.
Consumers often trade data for services without grasping how their information can be stored, shared, analyzed, and acted upon, or how much their information is actually worth. The problem of information asymmetries arises over and over again because user agreements have been normalized to be corporation-friendly, not reader-friendly. The agreements are notoriously opaque and present take-it-or-leave-it offers that deprive consumers of bargaining power.
Think about why critics allege that Mark Zuckerberg’s Cambridge Analytica–inspired apologies are crocodile tears and that he isn’t taking consumer trust as seriously as he should. While Facebook recently revised its terms of service, it has been persuasively argued that the limited changes, along with the persistence of limited opt-in and opt-out options, aim only to instill “pseudo-trust — a fake or superficial trust” that’s “purely transactional” and oriented toward “short-term” results.
Then there’s the double-edged sword of nondisclosure agreements: necessary for companies to be viable and protect their growing business, but also a mechanism for creating black-boxed corporate cultures. Remember when the Weinstein Company filed for bankruptcy in March after the accusations of sexual assault and harassment could no longer be ignored? Well, the company tried to distance itself from Harvey Weinstein and admitted that he had “used nondisclosure agreements as a secret weapon to silence his accusers.” (In a similar vein, much debate exists over the guiding intentions that led to Stormy Daniels being given one.)
Furthermore, NDAs, no matter how airtight they are, can’t prevent anonymous criticism from leaking out, and such criticism, even when legitimate, adds to an overall atmosphere of distrust. When competing sides fight over legitimizing and discrediting anonymous claims, the winner can be the dangerous conviction that truth lies solely in the eye of the beholder.

Facebook and flow
Back in 1927, the German philosopher Martin Heidegger published Being and Time, a pre-digital-age book that does a great job of describing part of this concern. One of Heidegger’s great insights was that human beings can use the same technology in two different ways: “present-at-hand” and “ready-to-hand.” Each orientation has profound implications for how the technology and the activities it facilitates are perceived. Scholars have long debated whether these outlooks are mutually exclusive and whether they can be turned on and off like a light switch.
Heidegger captures these shifts in focus in a nuanced way, describing how our involvement with objects transforms how we see and understand them. To use one of his favorite examples: if someone skillfully uses a hammer in a ready-to-hand way, the tool fades into the background and attention is drawn to the task at hand: getting in the zone, becoming one with the tool, and effortlessly driving nails into wood.
By contrast, someone could take a present-at-hand approach to a hammer. She might take a step back from the activity of carpentry and concentrate fully on analytically determining how much pressure needs to be exerted to use a hammer — which feels like an external object — without splitting the wood.
Heidegger suggests that the ready-to-hand approach can’t be sustained in cases where breakdowns occur, such as accidentally striking your thumb with the hammer. In these instances, the hammer surfaces from the background and becomes a focal point while the injured party cries, “Ouch, freakin’ hammer!” (“Hammer don’t hurt ’em,” if you like.)
What The Circle might want to accomplish, therefore, is nudging users to have a ready-to-hand relation with its software so that the tool disappears into the background and the focal point becomes the peers on the network. This way, the experience seems like it’s all about the conversations and not about the commodification of heart-to-hearts. The smooth back and forth of a tête-à-tête can break down through inhibition if strong reminders exist that intermediaries are present and staring with objectifying gazes.
In face-to-face environments, it’s often easier to detect privacy threats. You can speak in hushed tones at a crowded bar to limit the likelihood that anyone but your buddy sitting next to you will listen in. But nothing comparable exists when you’ve clicked “agree” to a user agreement and effectively authorize an online company to perpetually lurk in the shadows.
Social media companies like Facebook leverage expertise in user-experience design to pull off this trick. They instill trust by getting 2.2 billion users to forget about the platform and make trusted “friends” (and, of course, “friendly” brands and organizations) the center of attention.
“Facebook made these design choices because it knows the power of trust,” Ari Waldman, author of Privacy as Trust: Information Privacy for an Information Age, told me. “When we trust, we share, and Facebook’s design is made to trigger all sorts of mechanisms of trust, from signals of our friends’ behavior to intimacy of community.”

Intimacy by design
Facebook — which has now added matchmaking to its many other offerings — designs experiences that are meant to feel more intimate than they are. It constructs a space that encourages users to leave concerns about the corporation behind.
One thing that concerned me when I looked into The Circle is that it encourages customers to use its online service to message one another. On the one hand, we all know how much we value these exchanges with our friends and family. On the other hand, even though The Circle explicitly informs users when they sign up for the service that their conversations are monitored, analyzed, and shared with specified third parties, it was less clear to me how salient this disclosure would remain as time passed, relationships were formed, and trust between peers became established.
If someone asked you what you associate with Facebook as a company, you’d probably conjure images of Mark Zuckerberg, perhaps looking like a college student in jeans and a laid-back gray T-shirt or hoodie. Or maybe in a different uniform — one of his I’m Sorry suits.
You could think of chief operating officer Sheryl Sandberg or perhaps of Facebook’s headquarters in Menlo Park, renowned for offering techies perks like vending machines that dispense free power cords and keyboards.
But when you log on to Facebook as a user, do any of these images stay with you? Your attention is directed elsewhere — to content selected for you that makes the company’s machinations and interests largely invisible.
“Facebook’s conception of community is a mile wide but an inch deep,” Waldman added. “Everyone in our network is the same: They’re all friends. Our mom is a friend, so is that guy we met at yoga that one time. Our bosom buddy gets the same treatment as the person we flirted with last night. The community really isn’t there. It’s a facade, meant to lull you into a false perception that you’re really having a chat on the stoop at your local hangout, not helping to create an increasingly creepy virtual version of yourself with every click, message, or comment.”
Indeed, the iconic boxed-off blue letter F logo in the corner is good branding. It’s aesthetically pleasing and doesn’t invite you to think much about what it stands for, much less ponder if corporate interests diverge from your own. What’s not present is just as significant.
For obvious reasons, there isn’t a prominently displayed box featuring real-time updates of the value of Facebook’s common stock on NASDAQ, or any indication of how much what you type or click on is worth to the company. Forget about getting visceral notice that Facebook stores everything you type and delete before posting.
Much has been said about Facebook recently rolling out a tool called Access Your Information that allows users to download what Facebook knows about them. While many people have expressed shock at how much information the platform has, it remains an open question whether that transparency will motivate much change. After all, if you want to scrutinize the information, you’ll have to click “Download a copy of your Facebook data,” wait for the file to be created and shared, and then examine the dossier. Unlike with user agreements, you’ll actually be able to make sense of what you see. But who goes to all that trouble?
Nevertheless, such inspection occurs in present-at-hand mode, and the emotions you feel while contemplating the aggregated information might diminish considerably when you’re back at it in ready-to-hand mode, scrolling through your newsfeed, clicking and writing up a storm, and caught up in Facebook’s trance.

Breaking the spell
Have you ever experienced the unnerving feeling of viewing someone else’s Facebook account? It’s an odd, almost Black Mirror–ish experience. On one level, it’s peculiar because you get a glimpse into how different people can be online versus in real life. You can think you know someone well but then see online interactions with folks you’ve never heard of.
What makes the experience strange is that it presents the all too familiar in a bizarre light — and lets you see for yourself how contrived Facebook’s entire setup is. Your feed feels intimate to you, but someone else’s feed gives off the vibe of faux-intimacy because the pandering stands out.
It’s a Gestalt switch — another transition from the ready-to-hand to the present-at-hand. It’s like seeing things with a different pair of eyes. Because you’re subversively using the technology in a way that Facebook isn’t recommending, this time you get to decide what’s salient. You become free to break down Facebook’s conditioning mechanisms and turn malfunction into liberation.
Facebook conjures the spell of personalization from many ingredients, ranging from likes and interests to personality quirks and political leanings. This is a central strategy for engineering our trust. Concerns about filter bubbles and the difficulty of distinguishing reliable from unreliable news sources exist in part because our experience of Facebook is designed to make us trust the “personal” space we’re in and therefore feel open to sharing with extended friends.
In the future, it might even give users a “whole virtual ‘you’ museum.” Because, to quote from Westworld, “You know who loves staring at their own reflection? Everybody.”

Written by Evan Selinger
Prof. Philosophy at RIT. New book: “Re-Engineering Humanity” (2018). Bylines everywhere. http://eselinger.org/ & https://www.reengineeringhumanity.com/