
CURRENT AFFAIRS DAILY DIGEST – 2026-03-27


Social Media Addiction and Its Impact on Youth

A jury in Los Angeles, USA, found that Meta and YouTube had designed platforms that are addictive in nature and harmful to young users.
The companies were held liable for negligence, malice, and fraud, with total damages of $6 million awarded; Meta was found responsible for 70% of the damages and YouTube for 30%.

The case highlights allegations that Meta (Facebook, Instagram) and YouTube deliberately designed platforms that harm young users.
A 20-year-old plaintiff argued that early exposure led to anxiety, depression, and body dysmorphia.

In this lawsuit, social media was treated as a “product”, and its design was compared to “digital casinos” that exploit dopamine-driven engagement.


Beyond Section 230: A Shift in Social Media Accountability

Earlier, many lawsuits against social media companies failed due to Section 230 of the Communications Decency Act, which protects platforms from liability for user-generated content.

However, in this case, plaintiffs focused not on content but on platform design, such as:

  • News feed algorithms
  • Engagement-enhancing mechanisms

The jury examined:

  • Whether harm resulted from platform design
  • Whether companies met negligence criteria (duty of care, breach, causation, harm)

Using the “substantial factor test,” the jury concluded that platform design was a significant cause of harm.

Internal research also revealed that the companies were aware of the risks but continued harmful design practices, indicating disregard for user safety.


Parallel Verdict: Concerns Over Platform Safety

A jury in New Mexico also found Meta liable under consumer protection laws and awarded $375 million in damages.

Key issues included:

  • Misleading users about platform safety
  • Expanding end-to-end encryption despite warnings

Together, these rulings signal a broader shift: platforms are now being held accountable not just for content, but also for design choices and safety policies.


India’s Regulatory Framework for Children on the Internet

1. Information Technology Act, 2000

  • Prohibits harmful and explicit content involving children
  • Mandates removal of unlawful content within 2–3 hours
  • Requires reporting offences under laws like POCSO Act

2. Digital Personal Data Protection Act, 2023

  • Requires verifiable parental consent for children’s data processing
  • Prohibits tracking, behavioural monitoring, and targeted advertising

3. Information Technology (SPDI) Rules, 2011

  • Ensures data is collected for specific purposes with consent
  • Restricts disclosure of sensitive personal data

Awareness and Capacity Building

  • CERT-In: Cybersecurity advisories and awareness campaigns
  • Information Security Education and Awareness (ISEA): Conducts large-scale training for teachers, police personnel, and volunteers

Technical and Enforcement Measures

  • Blocking of CSAM (Child Sexual Abuse Material)
  • Collaboration with National Center for Missing and Exploited Children
  • Promotion of parental controls and cyber safety awareness

Overall Significance

India has adopted a multi-layered approach combining:

  • Legal provisions
  • Regulatory frameworks
  • Awareness programs
  • Institutional mechanisms

This ensures:

  • Protection of children from digital risks
  • Mitigation of threats arising from AI and social media

👉 The case reflects a global trend where social media companies are increasingly viewed not just as platforms, but as responsible product designers accountable for user safety.
