Researchers, policymakers, and users have identified several key issues with the social media ecosystem: the outsized power of a few corporations, which stifles innovation and competition; the spread of false news and disputes over the limits of free speech; threats to privacy, election integrity, and democracy; and the lack of platform oversight and transparency.
The Social Media Summit @ MIT brought together experts to discuss these issues and focus on solutions, which range from new oversight panels to breaking up big companies.
“Social media is rewiring the central nervous system of humanity in real time,” said MIT Sloan professor Sinan Aral, who led the event. “We’re now at a crossroads between its promise and its peril.”
A new report from the summit, now available online, takes a deep look at the range of problems posed by existing social media models, and offers 25 potential solutions to address them.
Here’s a look at seven areas of concern addressed at the summit, and just a few of the potential solutions.
1. The spread of false news and misinformation
False news spreads quickly online, aided by social media algorithms that amplify popular, and often incendiary, content. And social media companies and their advertisers often benefit from it, Aral noted.
One solution is to crack down on the most prolific offenders, said Clint Watts, a research fellow with the Foreign Policy Research Institute. “We know about them, and [enforcement] needs to focus there for maximum impact,” Watts said.
2. The difficult balance between user privacy and platform transparency
Social media poses what Aral calls a “transparency paradox.” Researchers and the public have the right to know how social media platforms are accessing and using consumer data. But there’s also a need to protect user privacy and security.
Algorithmic transparency that lets researchers examine peer-to-peer information sharing without sharing personal information would lead to greater understanding about malicious use and how to prevent it, said Kate Starbird, an associate professor at the University of Washington. Some platforms are already more transparent than others. “We’re able to review data patterns on Twitter because their data is public,” she said. “Facebook and YouTube do not readily share data and we can’t study them very well.”
3. Lack of regulation for social media companies
Nick Clegg, vice president of global affairs at Facebook, said he agreed that independent oversight is a necessity. “We’re way beyond the stale debate of whether we need new rules of the road,” Clegg said during a discussion with Aral. Clegg also noted that if different areas of the world regulate social media differently, it could balkanize the internet. The U.S. and European Union need to work together, he said, and bring India into the fold.
4. Lack of competition
Competition is a strong incentive for companies to change their behavior, Aral noted, but the social media economy is highly concentrated among Facebook, Twitter, and Google.
“We’re dealing with an array of issues, including concentration that is choking off innovation, harming advertisers and small businesses, and leading to less competition for quality and privacy,” said Zephyr Teachout, an associate professor of law at Fordham Law School.
The European Union is considering the Digital Markets Act, which would address anti-competitive practices and hold companies accountable for non-compliance. It could serve as a model for other regions.
5. Algorithms contribute to bias, racism, and polarization
Social media platforms and search engines have become the main way people organize and access information, said Safiya Noble, co-founder of the Center for Critical Internet Inquiry at UCLA. But the companies that run them are guided by profit rather than by democracy or human rights, she noted, and sometimes the most popular, most profitable speech promotes racism, misinformation, and polarization.
Part of the problem is frictionless systems that allow users to easily retweet and share this kind of information, said a principal research scientist at MIT Sloan. Introducing friction by slowing online interactions and giving users the chance to think before sharing information is one solution, she said.
6. Social media business models don’t always serve users
Social media business models are built on the attention economy, in which platforms sell users’ attention for advertising. But what gets attention isn’t always good for users, or society. Revising business models away from the attention economy could help.
Subscription-based models, which aren’t tied to advertising, are an alternative, said Scott Galloway, an adjunct professor of marketing at New York University, though he noted that there is a danger if the best, fact-checked information is available only behind a paywall.
7. The line between free speech and harmful speech is sometimes unclear
Section 230 of the Communications Decency Act shields websites from liability for third-party content. It needs to be reformed to make platforms more liable for the content they publish, said Richard Stengel, a former undersecretary of state for Public Diplomacy and Public Affairs and former managing editor of Time magazine. “Regulations have to incentivize platforms to take responsibility for illegal content just as Time magazine was,” he said, noting that platforms currently operate in a gray area when it comes to regulating content.
Renée DiResta, research manager at the Stanford Internet Observatory, said policy should also differentiate between free speech and free reach. The right to free speech doesn’t extend to a right to have that speech amplified by algorithms.
“There’s always been this division between your right to speak and your right to have a megaphone that reaches hundreds of millions of people,” she said.