Thanks to his rigorous journalistic work and his privileged access to Facebook's leadership, and despite sometimes clumsy prose and often bewildering ellipses, Steven Levy delivers a riveting account of the decline and fall of the Zuckerbergian empire. Having traveled too often with Zuckerberg and dined too often with Sandberg, the journalist lacks some critical distance, and comes across as complacent in certain passages on the Cambridge Analytica affair. Having pored so long over the student's notebooks and analyzed the CEO's strategy, he glosses over the glaring flaws of Zuckerberg the man, and never quite "kills" his subject.
"Facebook may have to change," he writes a few lines before concluding, "but Zuckerberg doesn't believe he has to." Reading this book reinforces my view that Zuckerberg sins out of idealism rather than cynicism. But an honest mistake, when repeated, becomes indistinguishable from malice. Sincere or not, Facebook's CEO ends up causing irreparable harm. Despite its easily forgiven flaws, Facebook: The Inside Story ranks, alongside Hatching Twitter and Super Pumped, among the best monographs I have read in recent years.
The theory of simultaneous invention, applied to social networks:
Tillery stopped his involvement with the facebook program after graduating from Exeter. His next stop was Harvard University. So he was present at the school in February 2004, when an online facebook suddenly appeared and swept through the school like a tornado. He wasn’t surprised to see that it was created by Mark Zuckerberg. Even in his limited contact with Zuckerberg at Exeter, Tillery noticed that the intense young man had “big, big ambition.” Nor was he bothered by what was arguably an appropriation of his idea. In his view, the online facebook was something he’d worked on in prep school, and he was done with it. More power to Mark.
I always forget the Synapse episode:
Overall, though, the Slashdot attention was a boon. Zuckerberg heard from multiple companies interested in the student project, including Microsoft and AOL. Zuckerberg and D’Angelo got an offer approaching a million dollars from one of those suitors. But the payout would be contingent on Zuckerberg and D’Angelo committing to work for that company for three years. They turned it down. Neither was willing to leave school—at least, not for that offer. They both moved on from Synapse. “We knew that we could do something better,” says Zuckerberg.
Facebook as a "social-private" network:
Zuckerberg had an important decision to make as he extended Thefacebook beyond Harvard: would the newcomers to the system be regarded as part of a single contiguous network, or would they be regarded as a discrete unit? Specifically, could you cruise the profiles of students from another school as you would on your home campus? He later explained the trade-off: “Would it be better for people to be able to see everyone and maybe not feel like this was a secure environment in which they can share their interests and what they thought and what they cared about? Or would it be better that more information and more expression was available, but to a smaller audience, which is probably the relevant audience for any person?” After a lot of thought, Zuckerberg decided that he would limit profile-surfing to one’s own school. People would be more likely to share things like their cell-phone numbers if they knew that only others in their community could see it. Privacy would rule. Or, as Zuckerberg later put it, “If people feel like their information isn’t private, then that screws us in the long-term too.”
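The trade-off Zuckerberg settled on amounts to a one-line access rule. A minimal sketch, purely illustrative (the function name and `network` strings are my own, not Facebook's):

```python
def can_view_profile(viewer_network: str, owner_network: str) -> bool:
    """Early Thefacebook rule: full profiles are visible only
    within one's own school network."""
    return viewer_network == owner_network

# A Harvard student can browse Harvard profiles, but not Yale's.
print(can_view_profile("Harvard", "Harvard"))  # True
print(can_view_profile("Harvard", "Yale"))     # False
```

Trivial as the predicate is, it encodes the bet described above: people share more (cell-phone numbers included) when the audience is guaranteed to be their own community.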
Dropbox before Dropbox:
Parker, with his instinctive product genius, had taken a close look at what McCollum and D’Angelo were doing on Wirehog, and understood that the concepts were advanced. It not only let you spread your files across many devices with full access to them, but allowed people to share selected files with their friends in various galleries segregated by content—photos, documents, music. The music gallery had a player built in so you could play music from someone else’s library on your own computer. One night, discussing the product with the two coders, Parker gave them a suggestion that basically foretold what would later be known as cloud computing: “You have to make it really easy for people,” he said. “People should have just one thing to put files in, then everyone could put their stuff in and make it available.” He even suggested a new name for the project: Dropbox.
Zuckerberg's plan for world domination:
In June 2005, Mark Zuckerberg gathered his employees and told them what he had in mind for Facebook’s second summer. A site redesign. A photo application. A personalized newspaper based on users’ social activity. An events feature. A local business product. And a feature that he called I’m Bored, which would give people things to do on Facebook. It was a list that would transform his site from a college directory to the world’s premier social utility.
Why Facebook is blue:
Eschewing that riot of color, Sittig limited himself to a blue palette. This had the advantage of registering most clearly to Zuckerberg, who is color-blind and can’t see reds or greens.
Facebook as a para-governmental interface in a dystopian future:
Using Facebook needs to feel like you’re using a futuristic government-style interface to access a database full of information linked to every person. The user needs to be able to look at information at any depth … The user experience needs to feel “full.” That is, when you click on a person in a governmental database, there is always information about them. This makes it worth going to their page or searching for them. We must make it so every search is worth doing and every link is worth clicking on. Then the experience will be beautiful.
The future of the press:
The Facebook team believed that MySpace wasn’t a technology company, and didn’t have the rigor that came from focusing on products. Zuckerberg didn’t bother to hide his views on this, even to the MySpace founders themselves, much to their annoyance. (DeWolfe disagreed with Zuckerberg: “I think we were both media companies and both technology companies,” he says, though he admits Facebook was driven more by engineering.) At a later NewsCorp retreat, Zuckerberg told Rupert Murdoch that the future of media wouldn’t be people tuning into Fox News or getting The Wall Street Journal on their doorstep, but getting links from their friends online.
You know you're getting old when a "history" book describes a project you took part in:
Dave Morin was a Montana kid who grew up on computers. He’d paid his way through the University of Colorado by running a web-development firm out of his dorm room. After graduating in 2003, he joined his dream company, Apple, working for its higher-education marketing team. His job was getting college kids to use Apple tools, and he was put in charge of its campus-representative program. At the time, there were about 100 reps across the United States, mostly geeks, and their task was providing technical support to their peers. Morin shifted the program to evangelism, expanding to 900 students who pitched Apple to classmates. Morin believed in communities, and he always urged his reps to join social networks—Friendster, LinkedIn, even AIM. One day in early 2005 the rep from Harvard called him. You have to see this thing called Thefacebook.
A few years later, my first "real" job was applying Morin's playbook to France:
Morin wasn’t interested in buying ads. He wanted to start a Facebook group to promote Apple—someplace where people could learn about the products, share videos and other content, and exchange tips about using Macs. Apple would lure them there with giveaways of iPods and iTunes cards. They cut a deal where Apple paid Facebook $25,000 a month. The total contract might have been a million dollars. Parker would boast about the contract when negotiating the Accel deal.
Apple could have bought Facebook:
Morin tried to get his bosses at Apple excited about Facebook. His dream was for Apple to make a social operating system. Instead of organizing your system around files, why not around people? Maybe Apple could buy Facebook, as the basis of this new system. The matter came before CEO Steve Jobs. No go. Jobs was open to buying companies, but why join forces with a college-only site of a few million people when MySpace had fifty million?
Social norms change very quickly:
Ultimately, the safeguards were built on an optimistic view of what developers might do. Facebook’s executives at the time now admit that the protections were relatively weak in part because the data held by Facebook in 2007 wasn’t seen to be as critical as it would later be. The stakes were lower and the norms were different. In that time period, the tech community was urging Facebook not to lock down its information but to be more open. Facebook, said its critics, was a “walled garden.” This was the term used when the owner of an online destination owned all the services and features that people used when they visited. These digital “company towns” ran counter to the democratic ethos of the Internet. They smothered innovation. Tearing down the walls of your garden meant you were being a supporter of the free Internet.
Before the iPhone, Facebook could have been an operating system; since the iPhone, Facebook is nothing more than an app:
Facebook’s original ambitions for Platform—a thriving operating system where developers would write original apps that ran inside Facebook—were over. “Unfortunately, mobile just completely undermined the entire system and basically relegated the platform to irrelevance,” says Facebook’s head of partnerships, Dan Rose.
Bill Gates missed a career in comedy:
The two would eventually become friends, with Gates offering lessons from his experience. Gates acknowledged their similarities—both were Harvard dropouts forming a paradigm-busting software company. But Bill Gates V.2? Not so fast. “Mark never wrote as much code as I did—that’s the most important thing. Put that in your book!” Gates tells me, joking but maybe not joking. Furthermore, “And if Steve Jobs was sitting here he’d say, Hey, Mark never designed a beautiful-looking goddamned thing, so how can you talk about him as any successor of me?” (Joke? Probably joke. Bill is a card.)
The three factors of the algorithm governing the News Feed:
The News Feed algorithm was called EdgeRank. It depended on three main factors: Affinity, Weight, and Time Decay. Affinity was measured by how close you were to the person making a post; something by your brother or your best friend would get a high score. Weight was determined by a formula that predicted how likely you were to engage with a post, based on your interests and previous behavior. Time Decay dealt with how recent the post was—newer ones were prioritized. There was a lot of computer science involved in assigning scores according to those criteria. Where a post would show up on your feed—or whether you would see it at all—depended on how each of those factors was weighted. It was largely a question of Facebook’s turning the knobs that measure how much influence each of those three factors would have in determining the score for each possible post. At any given time, this algorithm might change, reprioritizing the importance of one factor over the others.
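The excerpt names the three factors but not how they combine. EdgeRank was publicly described as multiplying affinity, weight, and time decay for each candidate post; here is a toy sketch under that assumption, with a simple exponential decay and entirely hypothetical numbers:

```python
import math

def edgerank_score(affinity: float, weight: float, age_hours: float,
                   decay_rate: float = 0.1) -> float:
    """Score one candidate post: higher scores surface earlier in the feed."""
    time_decay = math.exp(-decay_rate * age_hours)  # newer posts score higher
    return affinity * weight * time_decay

# Rank a handful of hypothetical posts.
posts = [
    {"author": "best friend",  "affinity": 0.90, "weight": 0.8, "age_hours": 5},
    {"author": "acquaintance", "affinity": 0.20, "weight": 0.6, "age_hours": 1},
    {"author": "brother",      "affinity": 0.95, "weight": 0.5, "age_hours": 24},
]
ranked = sorted(
    posts,
    key=lambda p: edgerank_score(p["affinity"], p["weight"], p["age_hours"]),
    reverse=True,
)
print([p["author"] for p in ranked])
```

The "knob-turning" Levy describes corresponds to changing `decay_rate` or re-weighting the factors: turn decay up and recency dominates affinity, which is exactly how a reprioritization would play out.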
The development of Facebook's first app:
By August, Hewitt was finished. He’d written the app in two months. Though arguably it represented the future of the company, he released it sans fanfare. “I didn’t really have to ask anyone permission because it was kind of the Wild West,” he says. He doesn’t even recall running it by Zuckerberg. “He probably saw it before we went live. But I didn’t have to meet with him and do any design consultations.” Hewitt didn’t get around to even posting a blog item about it until a day later. The press was rhapsodic, with some calling it the best app for the iPhone yet.
Zuckerberg, old at thirty:
A trickier problem was Zuckerberg himself. He hadn’t yet reached thirty, but the technology he grew up on was no longer ascendant, and he had to understand the dynamics of a new one. After all, he would make the ultimate call on the new apps. “I went to him and said, One of the problems is that you don’t understand native development. You make a thousand decisions a day, and they’re wrong for native,” says Ondrejka. So the new mobile team started training Zuckerberg, showing him what was different in design, in product development, and in the economics of the mobile ecosystem. One lesson Zuckerberg had to relearn was the cost of a mistake. “Done is better than perfect” doesn’t work so well when your version 1 keeps crashing and you have to wait for Apple’s approval process to push out your bug fix.
The nightmare of the Facebook phone:
The software, written in Hewitt’s artisanal programming language that Apple had rejected, was designed around communication with your Facebook contacts. The idea was that the Facebook phone would be so tied to one’s social graph and interests that it would be inseparable from the person themselves. As soon as you turned it on, it would present a list of potential activities based on who you were and what your friends were up to. If some random stranger called you, the phone might not ring. But it might shriek at top volume when a friend called or texted with important personal news, like an engagement, a new baby, or a photo of truffle pizza. When you wanted to communicate with a friend, you would just express that, and the phone would figure out the best way to contact that person, maybe even by checking your friend’s calendar and location. If she was in a meeting, for instance, it would text her. When you shopped, it would suggest options based on your Likes. If you went to a friend’s birthday party, the photos you took would instantly be posted on Facebook.
The Onavo "trap":
But Facebook’s motivation wasn’t really providing an app to improve phone performance in developing countries. It maintained Onavo’s business model, which was gathering data from deceptively “free” apps to inform its money-making business intelligence operations. When the mobile performance tool no longer served its purpose, Facebook created a different honey trap for user data, Onavo Protect, which delivered what seemed like a bargain: a free “Virtual Private Network” (VPN) that provided more security than public Wi-Fi networks. It takes a certain amount of chutzpah to present people with a privacy tool whose purpose was to gain their data.
How Facebook used Onavo to identify acquisition targets:
In contrast, the first contact from Facebook came in 2013 from Mark Zuckerberg himself. Like so many things at Facebook, it sprang from the Growth team. Though WhatsApp had been decidedly under the radar, especially inside the United States, Facebook deeply understood how popular it was, because of the private data its subsidiary Onavo had been stealthily gathering for years. In a sense, flagging WhatsApp for attention justified the entire price of the Onavo acquisition.
On the expansive interpretation of the concept of free speech:
Kaplan’s boss, Elliot Schrage, strongly takes issue with the perception that Kaplan was carrying water for the party he identified with. Schrage says that the decisions on DCLeaks—and, for that matter, all the decisions that later were criticized because of Kaplan’s alleged bias—were reached by vigorous debate, with his participation. Schrage has a background as a human-rights activist and describes himself as “a First Amendment advocate in the Brandeis tradition,” which leads him to give the benefit of the doubt to free speech. In a famous dissent in a First Amendment case, Louis Brandeis wrote that “the fitting remedy for evil counsels is good ones,” though it’s not clear what he would have thought of the News Feed. “I am hard-pressed to recall a single one of these debates where Facebook’s conservative public policy head and his liberal boss disagreed on how to proceed,” says Schrage. Of course, Zuckerberg, while not necessarily a student of Justice Brandeis, also tilted toward a free-speech approach.
The distorting lens of statistics:
So in order to avoid interfering with the election, Facebook effectively gave a green light to misleading, sensationalistic posts that themselves arguably interfered with the election. The ultimate justification for this could be attributed to the engineering mentality that Mark Zuckerberg celebrated in his company. It was a matter of metrics. Compared to the number of total posts hosted by Facebook, the disputed content was minuscule. Those on the product side viewed it from a data perspective and noted that fake news comprised a tiny percentage of the billions of stories posted to Facebook every day. The numbers did not indicate the urgency of the problem. […] In short, Zuckerberg’s inner circle had no clue that misinformation was thriving in their system because, well, where was the data? “We do a lot of work to understand what the top twenty-five things are that people are concerned about or the things where people are having bad experiences,” says Chris Cox. “We asked them what are the bad experiences you’re having and then we rate the bad experiences and then we get things like sensationalism, click bait, hoaxes, redundant stories, and stuff like that. But as a practical matter, [misinformation] wasn’t on our radar. We missed it.”
How Trump's team exploited Facebook's tools to target the most vulnerable citizens:
By the end of the campaign, Trump’s team had a database of age, gender, region, and other demographics, and which messages resonated for each one. Facebook’s worry had been that its targeting infrastructure would encourage politicians to deliver different messages to different groups—pro-immigration to one region, anti-immigration to another. That was tempting because Facebook ads, unlike, say, radio or television ads, aren’t generally exposed—they go straight to the News Feed streams of targeted users. But Trump’s campaign didn’t have to do that because it used Facebook to figure out which of its many messages would drive a dagger into the brain stem of each individual. “They were just showing only the right message to the right people,” says the tech executive familiar with the techniques. “To one person it’s immigration, to one person it’s jobs, to one person it’s military strength. And they are building this beautiful audience. It got so crazy by the end that they would run the campaigns in areas where he was about to give a stump speech and find out what was resonating in that area. They would modify the stump speech in real time, based on the marketing.”
Hide rather than delete:
The team came up with a number of approaches to minimizing fake news, like helping people identify the sources of a story, fact-checking questionable stories, and more aggressively weeding out bogus accounts that spread toxic posts. All of these were on the table now that the election was over. But what was not on the table was the idea of outright banning misinformation from the platform. That would be a violation of Zuckerberg’s core belief about granting his users free expression. A platform of censorship would mean the end of his dream. The goal would be minimizing those lies, or burying them in the low-ranking sub-basement of the News Feed scroll.
The fundamental problem with the claim to objectivity:
The news team found that it was getting harder to actually run anything because of Facebook’s fear of alienating the right wing. The seemingly intractable problem was that the media outlets that spent the most on quality were generally perceived as liberal, while a number of popular right-wing outlets thought nothing of twisting the content of an article into partisan fantasies. Determining truth was scary enough for Facebook, but asking it to do so when truth was politicized made it impossible. “You would run an experiment and say, Okay, we’re ready to go, but you might have one hyperconservative publisher who’s doing fairly nefarious things who is going to be penalized and that person has a very loud lobbying body,” says one news team official. “Are you okay waging war with a certain type of constituent in our current government?” That was the debate: Can we actually try to do what is as close as possible to the right thing versus the more politically sensitive thing?
The founding of Cambridge Analytica:
Not long after, Wylie met hard-core conservative warrior Steve Bannon, then editing the notoriously partisan right-wing news site Breitbart. Somehow the gay nerd and the proto–white nationalist bonded. “It felt like we were flirting,” Wylie would later write about their data-wonky intellectual jam sessions. Soon they were hatching a plan for SCL to enter America. Bannon set up a meeting with a wealthy funder of right-wing causes named Robert Mercer. Before making his fortune in hedge funds, Mercer had been a celebrated IBM researcher, so SCL’s promise to change voting behavior resonated with him. He agreed to fund the subsidiary. In December 2013, “Cambridge Analytica” was registered in Delaware. The name came from Bannon, who liked the implication that it was involved with the university.
Facebook's major error in managing its users' data:
Facebook’s new rules included an “App Review,” where developers had to request permission to access certain user data. Kogan went through the review and was turned down—but because he had a preexisting app, Facebook allowed him continued access to user data during the one-year transition. If Facebook had enforced its new rules immediately, the GSR–Cambridge Analytica partnership would have ended. Without the friend information he accessed during the grace period, Kogan would have been able to provide only a tiny fraction of the population he promised, insufficient to target a significant number of voters.
Facebook never verified Cambridge Analytica's claims:
What Facebook did not do for more than a year after learning about the Cambridge Analytica data abuse was get a formal affirmation that Cambridge had deleted the data. (Facebook’s excuse: its outside law firm was negotiating.) While Kogan had not turned in his affirmation until that June, Cambridge did not do so at all during the entire election campaign, even as Nix had been boasting to his clients, current and prospective, about the huge database he had. Meanwhile, Facebook was a partner to Cambridge Analytica, which was a major political advertiser, enjoying support and advice from Facebook’s Advertising team. At any time during the election, Facebook could have threatened to cut off access to its platform if Nix and company did not prove that they had deleted the ill-gotten personal information of 87 million Facebook users. Or Facebook could have demanded an audit. It did not. But it did collect millions of advertising dollars from Cambridge Analytica, without checking whether the money might be the fruit of the unauthorized profile data. In accepting advertising money, it accepted the company’s claims that it wasn’t, even while Cambridge had not yet signed an affirmation.
The tool has no will of its own (but it guides its user's hand):
“There is no morality attached to technology, it’s people that attach morality to technology,” WhatsApp co-founder Brian Acton told me in 2018, looking back on the controversy. “It’s not up to the technologists to be the ones to render judgment. I don’t like being a nanny company. Insofar as people use a product in India or Myanmar or anywhere for hate crimes or terrorism or anything else, let’s stop looking at the technology and start asking questions about the people.”
Zuckerberg and virtual reality:
Quest was not something that people would use on a persistent basis, and it would not fulfill Zuckerberg’s dreams of virtual or augmented reality being the platform for social interaction. That could only be done by ditching those cumbersome headsets and creating the technology that would allow people to become a form of cyborg—part human, part Facebook. That would happen, he hoped, by the efforts of Oculus Research, the lab in Seattle working on long-range projects. It was making progress on its wear-all-the-time Augmented Reality eyeglasses. Beyond that, Facebook was exploring how to get its products literally in people’s heads. It hired a team of neuroscientists to create typing-free interfaces between thought and action. And in 2019, Facebook bought a company called CTRL-Labs, which picked up brain signals from one’s wrist so one could control apps just by thinking. Every time the project got a mention in the press people would joke, Oh, Facebook now wants to get inside your brain. But that was actually true.
Facebook's pivot to encryption:
Actually, Zuckerberg had done a considerable amount of thinking on the subject. He had been outraged in 2014 when Facebook learned, via Edward Snowden’s leaks, that the US government was snatching its communications from Facebook’s data centers. Zuckerberg also had an emotional bias toward encryption. If his own early communications—the IMs and emails regarding ConnectU when he was at Harvard—had been encrypted, he might have been spared embarrassment. When Zuckerberg did express reservations about encryption, his issues were not about addressing the concerns of law enforcement, but about Facebook’s bottom line.
The problem encryption poses for content moderation:
When Zuckerberg made his announcement about a new Facebook, where all the franchises would be integrated into one giant infrastructure, it seemed like a great opportunity for Cox, whose role would be to quarterback the integration. But Cox had no appetite for the job. He disagreed with the whole Privacy-Focused Vision. In particular, he had concerns about Zuckerberg’s insistence that products would be protected by strong encryption. In part, Zuckerberg was doing this as a reaction to his own experience: if some of his early communications had been encrypted—or vanished in the way Stories went away—his early IMs and emails would never have been exposed. And, of course, making privacy a centerpiece of the Next Facebook was a firm answer to critics charging Facebook with being an Orwellian snoop. Cox saw the other side. Besides posing a technical challenge, encrypting the contents of all the messaging services so that even Facebook couldn’t read the posts would hamstring the company’s efforts to fight hate speech and misinformation.
Mea culpa:
He says this even while conceding that some of those mistakes have had terrible consequences. “Some of the bad stuff is very bad and people are understandably very upset about it—if you have nations trying to interfere in elections, if you have the Burmese military trying to spread hate to aid their genocide, how can this be a positive thing? But just as in the previous industrial revolution or other changes in society that were very disruptive, it’s difficult to internalize that, as painful as some of these things are, the positive over the long term can still dramatically outweigh the negative. If you can handle the negative as well as you can.

“Through this whole thing I haven’t lost faith in that. I believe we are one part of the Internet that’s part of a broader arc of history. But we do definitely have a responsibility to make sure we address these negative uses that we probably didn’t focus on enough until recently.”