Americans are getting fed up with social media companies, but they just don’t want the government to do anything about it
The EU continued its streak of tackling Big Tech in a way the U.S. has never been either able or willing to. This time, the bloc took aim at TikTok, the Chinese-owned app that has raised concerns both for its highly addictive algorithm and for being a possible national-security threat.
Brussels will investigate allegations that TikTok failed to properly moderate content shown to minors and that some of the features built into the platform are intended to keep people scrolling. The investigation falls under the Digital Services Act, a landmark law that went into effect last August and holds internet companies accountable for the content posted on their sites. Under the law, the EU has sweeping powers to investigate companies that fail to remove illegal content; the act also limits the data platforms can collect on users and the types of ads users can be targeted with.
The EU’s investigation into TikTok could serve as a blueprint for how governments can not only regulate large online platforms, but possibly punish them should they violate those regulations. In the U.S., lawmakers introduced several new bills meant to rein in Big Tech after public frustration with social-media companies reached a fever pitch over the last year.
“People are increasingly conscious of the fact that these algorithms are being used to shift their behavior in ways that are very insidious and difficult to counter, given the current architecture of the Internet,” says Tomicah Tilleman, president of Project Liberty, an initiative that supports reforming social media.
Social-media platforms have a reputation for being “weapons of mass distraction” that keep teenagers glued to a screen. Instead of studying or paying attention in school, they scroll endlessly on algorithmic feeds designed to keep them doing just that. For some teens, though, addiction to social media has turned what seemed like online fun into real-world dangers. Some young girls developed severe body-image issues. Kids struggling with mental-health problems found themselves served increasingly disturbing content about self-harm. And the algorithms that serve content based on a user’s interests made it easier for individuals who may have wanted to sexually exploit children to find and, at times, contact them.
On more than one occasion, parents have filed lawsuits against social-media companies alleging their algorithms were designed to keep users scrolling ad infinitum, essentially addicting their children to the platform. Some of the lawsuits came after a whistleblower at Meta, parent company of Facebook and Instagram, released internal documents showing the company was aware its products had especially negative effects on young girls. Social media’s critics say these instances were examples of negligence on the part of social-media companies that disregarded warning signs about the negative consequences their products might have on minors.
For others, these issues are endemic to social media. There is “an infrastructure problem when it comes to the internet,” Tilleman says. “The way the models have been optimized for the aggregation of private information and the use of that information to manipulate our behavior has created a bunch of big problems.”
A recent study conducted by Project Liberty found a majority of parents blame the social-media companies themselves. The survey, released last week, found that 59% of U.S. respondents held social-media companies responsible for online safety. More specifically, 69% of U.S. respondents are “very concerned” social media could expose their children to “inappropriate sexual content.” A further 62% said they were “very concerned” social media would serve their kids information about self-harm.
But the Project Liberty survey also uncovered another fact about parents in the U.S.—one that’s uniquely American. People in the U.S. are lukewarm about the role the government should play in limiting the power and influence of social-media companies, especially when compared to other countries. Only 38% of U.S. respondents wanted the government to have a significant role in ensuring online safety. The U.S. ranked dead last among the seven countries surveyed when it came to relying on the government to regulate social-media companies.
Tilleman isn’t dissuaded by the middling levels of support for government intervention in the U.S. He attributed the varying levels of support for government regulation to cultural differences between countries. “The real takeaway from this data is that folks are deeply unhappy with the way things are,” Tilleman says. “Independent of whether they’re calling for the private sector or the public sector to take the lead on building better solutions, they want better solutions than what we have today.”
Others see it as a continuation of the U.S.’s aversion to government regulation. “It is part of the psyche of America,” says Mark Fagan, a professor of public policy at Harvard’s Kennedy School. “The psychology of the country is the self-made man. It may be a myth, but that is the image we have. So the idea of relying on the government for anything has a tougher row to hoe here than in Europe.”
Digital democracy
In the U.S., governments have started to take action at both the state and federal level. In Florida, the state legislature passed a bill that would ban social media for kids under the age of 16. The bill is expected to face legal challenges on First Amendment grounds, but it’s nonetheless notable for a state to consider such a measure.
At the federal level, Congress is considering several new bills that would limit how much data social-media companies can collect on minors. The bills, which have bipartisan support, would also limit platforms’ ability to offer personalized feeds to children and would eliminate some of their most addictive features, like autoplaying videos. Most notably, however, one bill would seek to remove the blanket immunity afforded to social-media platforms for criminal matters that occur on their platforms.
Tilleman sees those ideas as bringing what are obvious rights in the real world to our online lives. “Ideas of popular consent, representative governance and private property have been with us for a very long time, but they haven’t yet made the leap into the digital world,” he says. “We’re suggesting that we take concepts and principles that have served us very well for a long time in the analog space and simply translate those more effectively into the digital realm.”
Project Liberty and Tilleman are staunch supporters of the notion that social-media users should have governance over their own data. For example, Tilleman proposes that instead of users opting into a social-media company’s terms of service, it be users who dictate to platforms how they’d like their data to be used.
“We should flip that,” Tilleman says. “Websites and apps should click on our Terms of Use.”
He also says platforms should alert users when their data is being used to target them with advertisements, especially if they’re political.
The exact nature of how user data gets used in targeting ads remains a black box. Most social-media companies are loath to share it publicly because they consider it a competitive advantage. That means users are often unaware of how exactly their information gets repackaged and sold to advertisers.
“The platform knows a lot more about you, and how the information can be used than you—the consumer—does,” Fagan explains.
Tilleman, too, sees the collection of information as a serious problem. “The challenge right now is that the algorithms are being shaped based on information that, in many cases, we are unknowingly giving up,” he says. “That information is then being aggregated and used to manipulate our behavior for a variety of different purposes that may or may not align with our own free will. So that’s the big issue.”
The fact that consumers don’t know exactly how their data gets used also raises concerns. Whenever an industry possesses such an outsized level of expertise and knowledge over its customers—a condition known as information asymmetry—it is often ripe for regulation, according to Fagan. As an example, he cites medicine, where a heart surgeon knows much more about a patient’s upcoming operation than the patient does. So to protect patients, states set up medical boards to oversee doctors and surgeons.
Another reason calls for regulation have reached a fever pitch is that consumers are realizing the companies rarely bear the full weight of the worst outcomes that happen when things go awry. “The platform that hosts them does not face a cost associated with any negative outcomes, but an individual might,” Fagan says.
Tilleman says these conditions have resulted in what he calls a “neo-feudal system online.”
“Every morning we take out the equivalent of digital-farming implements and we go to work creating very valuable information that’s sent up to manor houses in Silicon Valley, Seattle, and Shanghai, and then we get back some bare necessities of digital survival in return,” Tilleman says, referring to the hubs of the global-tech industry.
And while the government could certainly play a role in shaping the digital future Tilleman hopes to see, he isn’t shy about working with businesses to bring it to fruition. “Historically, the private sector has had a leading role in shaping the evolution of tech in the United States, and it should take a leading role in building out alternatives to what is widely seen as a broken status quo,” he says.
This story was originally featured on Fortune.com