Facebook doesn’t care about you: Tech giant chooses profits over people
A cardboard cutout of Mark Zuckerberg, CEO of Facebook, dressed up as the QAnon Shaman, along with other cutouts of people involved in the Capitol insurrection, stand on the National Mall ahead of a House hearing on 'Disinformation Nation: Social Media's Role in Promoting Extremism and Misinformation' in Washington on March 25, 2021. | Caroline Brehman / CQ Roll Call via AP

This article won the 2022 Best Analysis – National/International – Honorable Mention, from the Labor Media Awards by the International Labor Communications Association.

When it debuted on the internet in 2004, Facebook seemed like a revolutionary innovation: a way for millions of people to engage with one another on a central virtual platform. It was free communication at your fingertips. But as more bombshells about the tech giant have hit the news in recent days, we now see that nothing was free or revolutionary.

It’s clearer now that we, the users of Facebook, are actually the company’s product, sold to the highest advertising bidder, and placed at the mercy of an algorithm that incites human hate in the name of increasing “engagement.” Judging by recently leaked evidence, Facebook doesn’t care about you, your grandmother, or your cat videos. And on top of that, its growing influence and monopoly power pose a real danger to democracy and our society as a whole.

Earlier this month, data scientist Frances Haugen came forward as the Facebook whistleblower. A former employee of the company, Haugen shared a trove of internal documents and gave several interviews over the past few weeks showing that whenever there was a conflict between the company's interests and the public good, the social media giant chose its own interests.

In an interview on 60 Minutes, Haugen stated, “Facebook, over and over again, has shown it chooses profit over safety.” She said, “I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.”

Some of her damning claims include the company's decision to dissolve a civic integrity unit she was part of and its failure to maintain safeguards against misinformation after President Joe Biden defeated former President Donald Trump in the 2020 election. Haugen believes this, along with the unchecked festering of dangerous groups on Facebook, helped bring about the Jan. 6 insurrection at the U.S. Capitol.

Former Facebook employee Frances Haugen speaks during a hearing of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security, on Capitol Hill, Oct. 5, 2021. | Alex Brandon / AP

In a Senate hearing on children and social media, Haugen testified: “The company intentionally hides vital information from the public, from the U.S. government, and from governments around the world. The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages.”

The data scientist’s words are explosive, but anyone paying attention to Facebook’s timeline knows the record of such behavior has been accumulating for years, right alongside the company’s rise to a nearly trillion-dollar market value.

Follow the money

You may not need to pull out your credit or debit card whenever you log on, but Facebook definitely comes with a price. The most obvious and immediate costs are your attention and privacy, but the way the company makes the lion’s share of its profit is through advertising revenue. This isn’t a new business model, of course; television and newspapers have relied on it for a long time. But Facebook (along with other tech companies like Google) has a means, through its algorithms, to control when and how users encounter and engage with those advertisements.

The Facebook algorithm is a set of calculations the company uses to decide what you see on its platform. The timeline of posts from friends and groups in your feed is neither chronological nor complete; based on its observations and records of your behavior, the company puts in front of you the posts it believes you are most likely to engage with. The longer you stay on the platform, the more time Facebook has to show you advertisements.

Engagement is put above all else, even if what is being engaged with incites violence or hate. Studies have shown, and Facebook’s own internal research confirms if Haugen’s claims hold up, that the posts getting the most engagement are the ones that elicit a negative emotional response. The algorithm is not set up to discern between so-called good or bad engagement. So if a user is engaging with content that traffics in racism, sexism, or other forms of bigotry, the algorithm isn’t designed to dissuade them from it. Quite the opposite: as long as the content keeps the user on the site, the algorithm will continue to peddle more of it to them.
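To make the distinction concrete, here is a minimal, purely illustrative sketch of the difference between a chronological feed and one ranked solely by predicted engagement. This is not Facebook's actual code; the post fields, scores, and ranking functions are hypothetical assumptions used only to show the general idea.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    # Hypothetical score: how likely the user is to react, comment, or share.
    predicted_engagement: float

def chronological_feed(posts: list[Post]) -> list[Post]:
    # A simple timeline: newest posts first, nothing hidden or promoted.
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

def engagement_ranked_feed(posts: list[Post]) -> list[Post]:
    # An engagement-maximizing feed: posts most likely to keep the user on the
    # site rise to the top, regardless of when they were posted or whether the
    # reaction they provoke is positive or negative.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

if __name__ == "__main__":
    posts = [
        Post("Aunt May", "Cat video!", datetime(2021, 10, 5, 9, 0), 0.2),
        Post("NewsPage", "Outrage-bait headline", datetime(2021, 10, 4, 18, 0), 0.9),
        Post("Friend", "Vacation photos", datetime(2021, 10, 5, 8, 0), 0.4),
    ]
    print([p.author for p in chronological_feed(posts)])      # newest first
    print([p.author for p in engagement_ranked_feed(posts)])  # most "engaging" first
```

Note that the second ranking has no concept of whether the engagement it maximizes is healthy or harmful; that is the crux of Haugen's critique.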

This has resulted in a virtual world where many users live in echo chambers of political viewpoints and (mis)information that can be harmful to their own well-being and that of others. In a statement to 60 Minutes, Facebook claimed that divisiveness in the United States existed long before the platform did and that it bears no blame for the growing divide. But the company’s claim to innocence rings hollow.


To drive the point home, imagine a caregiver who is found to be feeding a baby poison. When questioned, the caregiver says they kept feeding it to the infant because the baby seemed to like the taste. The result isn’t their fault, they claim, because it’s what the baby kept coming back for. The caregiver knows the toxin is dangerous but ignores that fact as long as the baby keeps eating. What matters to them is that the baby is eating their food, not what the baby is consuming.

That bleak analogy isn’t far off from Facebook’s dealings. Many users are naïve about the workings of the social platform they rely on so heavily. The metaphorical poison with which millions of users have been radicalized, on top of the dangers to our privacy and security, has created a perfect storm fueled by unchecked capitalism. It affects not only the individuals who use the platform but also the institutions we use to govern our society, like elections.

Facebook’s transgressions

There are some key areas that display just how problematic Facebook’s influence is.

Privacy and Security—This was illustrated by the now infamous Facebook–Cambridge Analytica data scandal.

Whistleblower Christopher Wylie exposed how millions of Facebook users had their personal data collected without their consent by the British consulting firm Cambridge Analytica. This company, which harvested private information from more than 50 million Facebook user profiles, would go on to profile voters for Donald Trump’s 2016 presidential campaign based on their Facebook activity.

Although Facebook eventually suspended Cambridge Analytica from the platform, the company contended that what Analytica did wasn’t a data breach, since users had technically consented, via their privacy settings, to the harvesting of their data. Of course, many users were never told by Facebook that there was a privacy setting they needed to change to prevent such an intrusion.

The $5 billion fine Facebook was ordered to pay by the Federal Trade Commission was pennies in comparison to the company’s market value. The data the company holds, and seemingly refuses to take full responsibility for, could very well have determined the outcome of national elections in the hands of bad actors.

Misinformation—From baseless election fraud claims to outright lies about the COVID-19 pandemic, Facebook has featured prominently in the spread of socially detrimental rhetoric. According to a study by researchers at New York University and the Université Grenoble Alpes in France, from August 2020 to January 2021, news publishers known for putting out misinformation received six times as many shares, likes, and other engagements on the platform as news sources considered more trustworthy.

As reported in the Washington Post, Facebook officials disputed the study, claiming it didn’t give the full picture. Yet Facebook, which is in the position to give the full picture, refuses to do so. The company has increasingly limited the amount of data outsiders can access about what happens on the platform, a stonewalling demonstrated by the frustration the Biden administration showed when it tried to get information from Facebook about anti-vaccine propaganda on the site.

The world is still in the midst of a global pandemic. A recent survey found that people who rely exclusively on Facebook for news and information about the coronavirus are less likely than the average American to have been vaccinated. The platform plays a critical role in public safety, yet it refuses to accept the responsibility that comes with such outsized influence.

Eating up the competition—The major Facebook outage earlier in October might have seemed overly dramatic, as some users complained about not being able to get their cat video fix, but the problem was more severe than that. It was the day the digital world stood still, as Facebook went down along with Instagram and WhatsApp, both of which it owns.

Millions of people in multiple countries were unable to communicate with loved ones or even run their businesses, since WhatsApp and Facebook Messenger serve as the dominant forms of communication in many developing countries. And it’s not a matter of simply leaving the platform, as the popular #DeleteFacebook hashtag would have us believe. The company has strategically undermined or acquired potential competitors, leaving many users with nowhere else to go.

The decimation of independent journalism—A less obvious casualty of Facebook’s business model is the local and independent press. Newspapers and other publications at one time relied heavily on advertising. Now that Facebook and Google command nearly 60% of all digital advertising revenue, there is less money left to keep local newspapers, alternative publications, and community news services afloat.

Earlier in October, all of Facebook’s social media platforms – Facebook, Instagram, and WhatsApp – went down for a day. While this was an inconvenience for some users, for others it totally cut off communications with their families and customers. The simultaneous outage was one small example of the dangers of monopoly. | Richard Drew / AP

According to a 2020 Pew Research study, Facebook is a regular source of news for about a third of Americans. It’s a platform rife with misinformation and possesses an algorithm that controls what every user sees. An engaged and independent free press is vital to the health of a society. Facebook’s practices and unchecked dominance put that in peril.

What is to be done?

Corporate monopolies have been broken up before, as was done to the railroad, oil, and steel trusts. It’s time to break up Big Tech. Sen. Elizabeth Warren, who has been at the forefront of this effort, once pointed out that Facebook has “vast power over our economy and our democracy.” That may sound like hyperbole for a social media platform, but as the information above makes clear, it is not.

Google, Amazon, and Facebook are allowed to dominate their markets in ways that seep into every corner of life as we know it: the information we are presented with, the products we choose to buy, and the narratives that shape the perspectives of millions across the planet.

As long as companies like Facebook are treated like new ventures that can’t be properly defined, they will continue to evade regulation. What Facebook is doing is not new when it comes to its push for profit. It is not some kind of anomaly in history that has never been seen before. It is not some benevolent new media tool. It is a corporate entity whose mounting power has put it at odds with the public good.

This is dangerous. A call for regulation, such as the petition from the organization Color of Change, is a step in the right direction.

Lastly, it is also important to be critical of the capitalist system that allows information to be bought and sold for profit. The perverse notion that we, in the form of our attention and personal information, should accept becoming products sold by companies in exchange for efficient communication and human connection deserves to be rejected.


CONTRIBUTOR

Chauncey K. Robinson

Chauncey K. Robinson is an award-winning journalist and film critic. Born and raised in Newark, New Jersey, she has a strong love for storytelling and history. She believes narrative greatly influences the way we see the world, which is why she's all about dissecting and analyzing stories and culture to help inform and empower the people.
