
Clearly, Facebook Is Very Flawed. What Will We Do About It?
Klonick, Kate. New York Times (Online), New York: New York Times Company. Oct 1, 2021.


FULL TEXT
Two weeks ago, The Wall Street Journal published “The Facebook Files,” a damning series based on a cache of leaked internal documents that revealed how much the company knew about the harms it was causing and how little it did to stop them.

In a hearing on Thursday, senators on the consumer protection subcommittee accused Facebook of hiding vital information on its impact on users. “It has attempted to deceive the public and us in Congress about what it knows, and it has weaponized childhood vulnerabilities against children themselves,” Senator Richard Blumenthal, the chairman of the subcommittee and a Democrat from Connecticut, charged.

I’ve spent the last six years researching how platforms govern speech online, including a year inside Facebook following the development of its Oversight Board. While the “factory floor” of the company is full of well-intentioned people, much of what the series has reported confirmed what I and other Facebook watchers have long suspected.

The Journal’s reporting showed that Facebook regularly gave preferential treatment to elites if their speech was flagged on the platform; that it implemented shoddy solutions to mitigate the harmful mental and emotional health effects of its products on teenagers; and that it underinvested in enforcing its own rules about what is allowed on the site outside of the United States. The series has stirred the now familiar outrage at Facebook for failing to take responsibility for how people use its platform. While these revelations are disturbing, they also point to some opportunities for reform.

One of those opportunities is redefining how Facebook determines what a “good” product is. For much of its history, the company’s key metric has been user engagement — how long users log in, the pages they spend time on, which ads they click. The greater the user engagement, the more valuable Facebook’s ads, and the more profit for shareholders. But the “Facebook Files” stories have put to rest any doubt that this narrow concept of engagement fails to capture the platform’s real impact — both the bad and, yes, the good.

Facebook is perfectly capable of measuring “user experience” beyond the narrow concept of “engagement,” and it is time those measurements were weighted more heavily in company decision-making. That doesn’t mean just weighing harmful effects on users; it could also mean looking at and measuring the good things Facebook offers — how likely you are to attend a protest or give to a charitable cause you hear about on Facebook. However it ends up being calculated, it needs to be transparent and it needs to become a bigger part of the company’s decision-making going forward.
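To make that concrete, here is a minimal, purely hypothetical sketch in Python of what a blended product score could look like; every function name, metric and weight below is invented for illustration and is not drawn from Facebook’s actual systems:

def product_score(engagement: float, wellbeing: float,
                  weights: tuple[float, float] = (0.5, 0.5)) -> float:
    """Blend a normalized engagement metric (0..1) with a normalized
    user-experience metric (0..1), e.g. survey-reported well-being or
    civic participation. The 50/50 weights are illustrative only."""
    w_engage, w_wellbeing = weights
    return w_engage * engagement + w_wellbeing * wellbeing

# A change that boosts engagement but hurts reported well-being can
# score worse than one that trades a little engagement for a better
# user experience.
print(product_score(0.9, 0.3))  # -> 0.6
print(product_score(0.7, 0.8))  # -> 0.75

The two calls show why the weighting itself matters: which product change “wins” depends entirely on how heavily the well-being term counts, which is exactly why that weighting needs to be transparent.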

The series also revealed that Facebook had conducted its own research into the harmful effects of Instagram, the popular photo-sharing platform it acquired in 2012, on the mental health of teenage girls but downplayed the results. For social-media researchers, these revelations confirmed much of what we already knew from multiple third-party studies showing that cellphones and social media are bad for teenage mental health. (And long before smartphones and Instagram, social science placed similar blame on fashion magazines and television.)

While calling out Facebook for its mistakes and omissions may seem like a win, berating the company for its flawed internal and external research projects does not mean this type of work will become more ethical or transparent. The outcome is that it doesn’t get done at all — not by Facebook or anyone else — and if it does, the results stay hidden.

Other popular platforms are also part of the problem. Snapchat supposedly studied the effect of its platform on its users’ mental health, but never released the results. Instead, it announced new intervention tools. After the publication of the “Facebook Files” series, TikTok rolled out “mental health guides” for users.

These moves reveal what companies are trying to avoid. If you look inward and investigate the harms your platform has caused and it turns out to be too expensive or too hard to fix them, it stirs up the exact kind of public relations storm Facebook is now enduring. From these companies’ perspective, the alternative is simpler: If you don’t study it, there’s nothing to reveal.

Between Facebook’s internal research and reports last month on the company’s failed program to share its data with outside social scientists, executives at other companies across Silicon Valley are most likely breathing a sigh of relief: They’ve managed to dodge pressure from outside researchers to interrogate their own practices.

The series’ most damning takeaways were the revelations around how Facebook has handled content issues in Africa, Latin America and Asia. While Facebook applies its community rules globally, those rules can’t possibly adhere to the wide range of cultural norms of Facebook users around the world. Understanding those differences requires more and better people to constantly revise the rules and enforce them.

Last week, Facebook announced that it has spent more than $13 billion on safety and security since 2016 and currently employs 40,000 full- and part-time safety and security workers. For 2020 alone, this puts the costs in this area at between $5 billion and $6 billion — or about one-tenth of the company’s overall costs. To put this all in perspective, in the United States there is roughly one law enforcement officer for every 500 people, or about 200 per 100,000. Facebook has 2.8 billion global monthly active users; at 40,000 workers, that means roughly 1.4 people working in safety and security for every 100,000 users.
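For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python; the inputs are exactly the figures quoted above, and the helper per_100k is ours:

def per_100k(staff: int, population: int) -> float:
    """Return staff per 100,000 people."""
    return staff / population * 100_000

# Figures quoted in the article.
print(round(per_100k(40_000, 2_800_000_000), 1))  # Facebook safety staff -> 1.4
print(round(per_100k(1, 500), 1))                 # U.S. law enforcement -> 200.0

The two prints make the gap explicit: about 200 officers per 100,000 residents versus about 1.4 safety and security workers per 100,000 users.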

There is no quick fix for content moderation. The only way to do it better is to hire more people to do the work of “safety and security,” a term that encompasses everyone who directly and indirectly writes, revises and enforces Facebook’s community standards. According to Facebook’s S.E.C. filings, the average revenue per user in the United States and Canada in the last quarter of 2020 was $53.56. Europe, its next-largest market, accounted for only a fraction of that at $16.87, with Asia-Pacific users at just $4.05. “Rest of World” was just $2.77 per user — meaning a user in the United States or Canada generates roughly 19 times the revenue of a “Rest of World” user.

Those numbers don’t necessarily reflect where Facebook ultimately ends up investing in safety and security. But they do help explain one powerful set of incentives that might motivate the company’s priorities.

The “Facebook Files” series is motivating change. But it will take more than breathless reporting to make sure that reform happens in effective ways. That will require laws demanding transparency from platforms, a new agency to specialize in online issues and more science. Whistle-blowing gets us halfway there. We have to do the rest.

Dr. Kate Klonick (@klonick) is a lawyer and an assistant professor at St. John’s University Law School. She is a fellow at Yale Law School’s Information Society Project and the Brookings Institution, and is currently writing a book on Facebook and Airbnb.

The Times is committed to publishing a diversity of letters to the editor. We’d like to hear what you think about this or any of our articles. Here are some tips. And here’s our email: [email protected]

Follow The New York Times Opinion section on Facebook, Twitter (@NYTopinion) and Instagram.

DETAILS

Subject: Law schools; Mental health; Decision making; Social networks

Business indexing term: Subject: Social networks

Location: United States–US; Asia

Company / organization: Name: Facebook Inc; NAICS: 518210, 519130


Identifier / keyword: Facebook Inc; Instagram Inc; Social Media; Corporate Social Responsibility; Computers and the Internet; Teenagers and Adolescence; Mental Health and Disorders; Wall Street Journal; Regulation and Deregulation of Industry; United States Politics and Government

Publication title: New York Times (Online); New York

Publication year: 2021

Publication date: Oct 1, 2021

Section: opinion

Publisher: New York Times Company

Place of publication: New York

Country of publication: United States, New York

Publication subject: General Interest Periodicals–United States

Source type: Blog, Podcast, or Website

Language of publication: English

Document type: News

ProQuest document ID: 2578051273

Document URL: https://seattlecentral.idm.oclc.org/login?url=https://www.proquest.com/blogs-podcasts-websites/clearly-facebook-is-very-flawed-what-will-we-do/docview/2578051273/se-2?accountid=145

Copyright: Copyright 2021 The New York Times Company

Last updated: 2021-10-11

Database: U.S. Newsstream


What Is Facebook Worth to Us?
Tressie McMillan Cottom. New York Times (Online), New York: New York Times Company. Oct 8, 2021.


FULL TEXT
It is easy to forget how new Facebook is, but remember I do. The first time I logged in to the social networking site was around 2006. I was taking classes in an adult enrichment program at a small Catholic school in Charlotte, N.C. A young literature professor used Facebook to cultivate informal communication with the traditional-age students. I joined Facebook using my university email address, which at the time was required to sign up.

Facebook’s layout and organization of information — what scholars now call “affordances” — were not intuitive to me. It forced me to “friend” the literature professor who encouraged us to sign up, followed by the other students in the class. I did not realize that this digital space was an extension of the university’s institutional life, so I was surprised and dismayed when the professor scolded me for making a joke on my Facebook wall. I dropped that class and deactivated that, my first Facebook account. I would not try again for two more years. By that time, anyone above age 13 with an email address could join the platform. At the time, this expansion felt like a democratization of an elite online platform. It is clear now that this was also the moment that Facebook was set on the course to becoming the political boondoggle it is today.

Opening up Facebook gave it incentives to scale and to make scale its No. 1 priority. When platforms prioritize scale over users’ safety or even the user experience, the people who own the platform have chosen a set of political beliefs that inform their economic decisions.

Tarleton Gillespie is a principal researcher at Microsoft Research New England and an affiliated associate professor at Cornell University. He is also the author of “Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media.” Tarleton has argued that “platforms now function at a scale and under a set of expectations that increasingly demand automation. Yet the kinds of decisions that platforms must make, especially in content moderation, are precisely the kinds of decisions that should not be automated, and perhaps cannot be.” Entrusting decisions to algorithms when they should be made by humans is a political decision; this means scale is politics. That is something that Facebook’s founder is well aware of.

Mark Zuckerberg’s speechwriter from 2009 to 2011, Kate Losse, says that one of his favorite sayings during her time with him was “companies over countries.” The statement could be brushed off as the braggadocio of a young billionaire. It can also be seen as a foundational principle of technology’s pursuit of scale as politics. It is best to think of it as both. The politics of platform scale is similar to the politics of “too big to fail” that made banks impervious to the risks of their own making during the 2008 financial crisis. There is a lot to be said about whether banks should have been bailed out and who paid the long-term cost for doing so. But it is at least within the realm of reason to accept that financial institutions are truly so intertwined with U.S. policy, militarization and geopolitics that defending their scale is a matter of national interest. It’s hard to make a similar case for Facebook. Zuckerberg may well will Facebook’s inevitability into being, but we still have time to determine if we should govern Facebook as if it is inevitable.

The inevitability question is complicated by another dimension of scale: Facebook is not just a U.S. political problem. When Facebook went down this week, so did the company’s other platforms, Instagram and WhatsApp. The outage brought into focus the divide between different groups’ experience of Facebook’s politics. For many Americans, Facebook going down is an inconvenience; there were memes about rediscovering one’s husband, writing deadline or bookshelf during the hourslong Facebook outage. But internationally, WhatsApp is a primary messaging service. It’s critical infrastructure for the federal government in the Philippines and for hospitals in India. Immigrants in the United States worried about contacting their family back home in places like Malaysia, Ghana and Brazil. The fault lines in how people use Facebook were also made visible in other domains, as with disabled people who worried about communicating with their friends, families and caregivers on free-to-use platforms.

My U.N.C. colleague Matt Perault told me this week that tech policy is like all policymaking in that it is cost-benefit analysis. That is to say, good policy accepts the trade-off of insufficient but practical regulations in exchange for some agreed-upon, if incomplete, social benefit. Matt’s insight comes from his former post as a director of public policy at Facebook and his current one as director of a U.N.C. lab on information technology policy. It’s a useful lens through which to view the comments made by the Facebook whistle-blower Frances Haugen in congressional testimony this week. She testified that the company “chooses profit over safety,” and explained that it conducted its own research on platform affordances that encourage dangerous behaviors, such as eating disorders and self-harm. Despite this research, Facebook chooses to develop affordances that generate attention, which in turn generates profit, even when those affordances are dangerous for some users.

Siva Vaidhyanathan is a professor at the University of Virginia and a foremost expert on the social and cultural implications of Facebook’s political dominance. On a recent podcast with Virginia Heffernan, another media scholar, Siva characterized Haugen’s testimony as equivalent to the smoking-gun documents that felled the tobacco industry. In the case of Big Tobacco, we decided that smoking was enjoyable but was also dangerous to public health. We made a cost-benefit analysis of imperfect trade-offs and chose collective well-being. Some people were hurt by that trade-off. People with a physical addiction had to pay more for their vice, for example. But the trade-off was made.

Paying attention to technology policy and debates about Facebook may have seemed niche 10 or even five years ago. With the last week — from outages to congressional testimony — it is clear to me that now is the time for every informed citizen to have a position on regulating Facebook. We should be guided by understanding the trade-offs and whom they affect.

If we decide to regulate Facebook, some people will lose a critical if predatory communication platform. Poor people, disabled people and the global south will likely, as they often do, bear the brunt of rolling back bad policy decisions. And in countries where Facebook’s business dominance has become the national communication and economic infrastructure, marginalizations will be compounded. A Facebook scaled down by meaningful regulation might not have the incentives to surface hate speech, disinformation and controlling images like those that lead to disordered eating. It will almost certainly have less amplification power to compromise democratic elections or target your family members with financial scams or conspiracy theories. The question for us is whether the upsides are worth it, and whether we can build systems to insulate the vulnerable from the downsides.

Tressie McMillan Cottom (@tressiemcphd) is an associate professor at the University of North Carolina at Chapel Hill School of Information and Library Science, the author of “Thick: And Other Essays” and a 2020 MacArthur fellow.

DETAILS

Subject: College professors; Testimony; Automation; Content management; Benefit cost analysis; Disabled people; Communication; Politics; Social networks; Social research

Business indexing term: Subject: Automation; Benefit cost analysis; Social networks

Location: United States–US

Company / organization: Name: WhatsApp Inc; NAICS: 511210; Name: Facebook Inc; NAICS: 518210, 519130


Identifier / keyword: Facebook Inc; Zuckerberg, Mark E; Computer Network Outages; Whistle-Blowers; Regulation and Deregulation of Industry; Instagram Inc; WhatsApp Inc; Haugen, Frances

Publication title: New York Times (Online); New York

Publication year: 2021

Publication date: Oct 8, 2021

Section: opinion

Publisher: New York Times Company

Place of publication: New York

Country of publication: United States, New York

Publication subject: General Interest Periodicals–United States

Source type: Blog, Podcast, or Website

Language of publication: English

Document type: News

ProQuest document ID: 2580092370

Document URL: https://seattlecentral.idm.oclc.org/login?url=https://www.proquest.com/blogs-podcasts-websites/what-is-facebook-worth-us/docview/2580092370/se-2?accountid=145

Copyright: Copyright 2021 The New York Times Company

Last updated: 2021-10-25

Database: U.S. Newsstream
