Overview of Allegations
Meta, the parent company of Facebook and Instagram, faces lawsuits from 45 states and the District of Columbia that accuse it of failing to protect children on its platforms. The suits allege that Meta, under CEO Mark Zuckerberg, prioritized user engagement over the safety and well-being of young users, and that this choice exposed minors to sexual solicitation, harassment, bullying, body shaming, and compulsive use of the platforms.
Internal Concerns and Ignored Proposals
In April 2019, David Ginsberg, a Meta executive, proposed a project to research and reduce loneliness and compulsive use on Instagram and Facebook. In an email to Zuckerberg, Ginsberg pointed to the scrutiny the company faced over its impact on teens and asked for additional staff for the project. However, Susan Li, who is now Meta’s chief financial officer, and Adam Mosseri, the head of Instagram, declined to fund the project, citing staffing constraints.
Lawsuits and State Attorneys General’s Involvement
The lawsuits, which draw on roughly 1,400 pages of internal company documents, contend that Meta knew of the risks its platforms posed to young users but downplayed those concerns. The attorneys general argue that Zuckerberg and other executives failed to implement necessary safety measures, causing significant harm to minors on the platforms. Raúl Torrez, the attorney general of New Mexico, emphasized that many critical decisions ultimately rested with Zuckerberg and called for him to be held accountable for them.
Public Health Warnings and Legislative Actions
The United States surgeon general, Dr. Vivek H. Murthy, recently called for warning labels on social networks, saying the platforms pose a public health risk to young people. His warning could add momentum in Congress for the Kids Online Safety Act, a bill that would require social media companies to disable features that encourage addictive behavior in minors. Critics argue that the bill could restrict minors’ access to important information, although it includes an exemption for news sites and apps.
Arrests and Real-World Impacts
In May, New Mexico authorities arrested three men accused of soliciting children for sex on Instagram and Facebook. The arrests underscore the states’ argument that Meta’s algorithms enable adult predators to find and target minors, an issue the lawsuits aim to address. Although Meta says it has developed numerous safety tools and features, state officials and parents of affected children argue that these measures are inadequate.
Issues with Underage Users
Internal reports from January 2018 estimated that four million children under 13 were using Instagram despite the platform’s age restrictions. The sign-up process made it easy for children to lie about their age, putting the company at odds with federal law, which requires parental consent before personal data can be collected from children under 13. Frances Haugen, a former Facebook employee, later disclosed thousands of internal documents showing that the company prioritized profit over safety, further intensifying scrutiny of Meta’s practices.
Controversial Beauty Filters
Meta also debated internally over beauty filters on Instagram, which some mental health experts warned could harm teenagers by promoting unrealistic beauty standards. In October 2019, Instagram temporarily banned filters that made users appear to have undergone cosmetic surgery. Zuckerberg ultimately lifted the ban, arguing that there was no data showing the filters caused harm. Critics saw the decision as evidence of the company’s focus on maintaining user engagement, even at the potential cost of users’ mental health.
Development of Instagram Kids
In 2021, Meta planned to launch Instagram Kids, a social app designed specifically for children. The plan drew strong opposition from 44 state attorneys general, who urged Zuckerberg to abandon the project, citing Meta’s poor track record of protecting children on its platforms. Under this pressure, Meta paused its plans for Instagram Kids.
Financial Priorities and Resource Allocation
In late 2021, Nick Clegg, Meta’s head of global affairs, warned Zuckerberg that regulators were increasingly concerned about the company’s impact on teenage mental health. Clegg requested funding for additional staff to address these issues, but the request was not granted. Despite significant revenue growth, Meta kept its focus on new products such as virtual reality rather than investing in youth safety.
Explicit Content and Advertiser Concerns
Last fall, Match Group, the owner of dating apps such as Tinder, discovered that its ads on Meta’s platforms were appearing next to violent and sexualized content, some of it involving children. Meta removed some of the flagged posts, but Match Group deemed the response insufficient. Bernard Kim, Match Group’s chief executive, emailed Zuckerberg directly to raise his concerns but received no response.
Ongoing Legal Battles and Meta’s Defense
Meta has disputed the states’ claims and filed motions to dismiss the lawsuits. A judge recently denied Meta’s motion to dismiss the New Mexico suit, although Zuckerberg was dropped as a defendant in that case. Amid the ongoing legal challenges, Meta continues to assert that it has developed numerous safety tools and is committed to the well-being of young users.
Background Information
What is Meta?
Meta Platforms, Inc., formerly known as Facebook, Inc., is a major American technology company co-founded by Mark Zuckerberg in 2004. It owns several popular social media and communication platforms, including Facebook, Instagram, and WhatsApp, as well as the Oculus (now Meta Quest) virtual reality business. Meta’s stated mission is to build technologies that help people connect, find communities, and grow businesses.
Who is Mark Zuckerberg?
Mark Zuckerberg is the co-founder, chairman, and CEO of Meta Platforms, Inc. He started Facebook while he was a student at Harvard University, and it quickly grew into one of the world’s largest social media networks. Zuckerberg has been a prominent figure in the tech industry and has often faced scrutiny and controversy over the years regarding privacy, data security, and the impact of social media on society.
What are Social Media Platforms?
Social media platforms are online services that allow users to create profiles, share content, and interact with others. Facebook and Instagram are two of the most popular social media platforms worldwide. They enable users to post updates, photos, and videos, follow other users, and engage in various forms of communication.
The Rise of Social Media Use Among Teens
Social media has become an integral part of teenagers’ lives, offering a way to stay connected with friends, share experiences, and express themselves. However, the widespread use of social media among young people has raised concerns about its impact on mental health, privacy, and safety.
Concerns About Child Safety on Social Media
There are several key concerns regarding child safety on social media platforms:
- Addiction and Compulsive Use: Social media can be highly engaging, leading to excessive use that may interfere with daily life, school, and sleep.
- Exposure to Inappropriate Content: Children and teens may encounter content that is violent, sexual, or otherwise inappropriate for their age.
- Cyberbullying and Harassment: Social media can be a platform for bullying and harassment, which can have serious emotional and psychological effects on young users.
- Privacy and Data Security: Children’s personal information may be collected and used in ways that compromise their privacy.
- Online Predators: There is a risk of adults using social media to solicit, groom, or exploit children and teenagers.
Legal and Regulatory Background
There are several laws and regulations aimed at protecting children online:
- Children’s Online Privacy Protection Act (COPPA): This U.S. federal law requires websites and online services to obtain parental consent before collecting personal information from children under 13.
- General Data Protection Regulation (GDPR): This European Union regulation includes provisions for protecting the data and privacy of children.
- State-Level Lawsuits: States in the U.S. can also file lawsuits to enforce consumer protection laws and ensure the safety and well-being of their residents, including minors.
Meta’s Controversies and Legal Challenges
Meta has faced numerous controversies and legal challenges over the years related to privacy, data security, and the impact of its platforms on users. Some key events include:
- Cambridge Analytica Scandal (2018): It was revealed that the data of millions of Facebook users was harvested without consent by the political consulting firm Cambridge Analytica. This led to widespread criticism and regulatory scrutiny of Facebook’s data practices.
- Frances Haugen’s Whistleblower Revelations (2021): Haugen, a former Facebook employee, leaked internal documents showing that the company was aware of the harmful effects of its platforms on users, particularly teenagers, but prioritized profit over safety.
- Current Lawsuits by State Attorneys General: These lawsuits accuse Meta of failing to protect young users from various harms on Instagram and Facebook, such as addiction, harassment, and exposure to inappropriate content.
Efforts to Improve Child Safety
Meta has introduced several measures to enhance child safety on its platforms, including:
- Age Verification and Restrictions: Implementing measures to prevent underage users from creating accounts.
- Content Moderation Tools: Using technology and human moderators to identify and remove inappropriate content.
- Safety Features for Teens: Introducing features like restricting direct messages from unknown users and providing resources for mental health support.
Debate/Essay Questions
- Should Social Media Companies Be Legally Responsible for the Safety of Minors on Their Platforms?
- Is Meta Prioritizing Profit Over the Safety and Well-Being of Its Users?