In-Depth Analysis of the Impact of Stricter Regulation Under the EU Digital Services Act (DSA) on Social Media Platforms

January 16, 2026

1. Overview of the DSA Regulatory Framework

The EU Digital Services Act (DSA) is a major piece of digital legislation passed by the EU in 2022, fully in effect for Very Large Online Platforms (VLOPs) since February 2024. The VLOP tier applies to platforms with over 45 million average monthly active users in the EU, including major social media platforms such as Meta (Facebook, Instagram), TikTok, X (formerly Twitter), YouTube, and Snapchat [1][2].

The core regulatory objectives of the DSA include:

  • Transparency Requirements: Platforms must disclose content moderation standards, advertising information, and recommendation algorithm logic
  • Minor Protection: Prohibit targeted advertising to minors and require the implementation of age verification measures
  • Risk Management Obligations: Platforms must identify and mitigate systemic risks posed by their services
  • User Empowerment: Provide non-personalized content recommendation options and protect users’ right to know and right to choose

As of December 2025, the European Commission has launched investigations into multiple platforms and imposed the first DSA fine, marking the official entry of the Act into the enforcement phase [3][4].


2. Analysis of Compliance Costs
2.1 Direct Regulatory Fees

Pursuant to Article 43 of the DSA, the European Commission collects annual regulatory fees from large tech companies. The total fees for 2025 amount to €58.2 million (approximately $62.8 million), representing a 29.3% increase from the €45 million in 2024 [5]:

| Platform | Covered Services | Nature of Fees |
| --- | --- | --- |
| Meta | Facebook, Instagram | Calculation method challenged in litigation |
| Google | Search, Google Play, Maps, Shopping, YouTube | Calculation method challenged in litigation |
| TikTok | Main TikTok App | Calculation method challenged in litigation |
| Apple | App Store | Involves multiple services |

These fees only cover the supervision costs of regulatory authorities and do not include the technical and operational costs that platforms themselves incur to meet compliance requirements [5].
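As a quick sanity check on the figures above, the year-over-year fee increase can be recomputed (a minimal sketch; the euro amounts are taken from the report):

```python
# Total DSA supervisory fees charged to large platforms (from the report).
fees_2024_eur_m = 45.0   # € million, 2024 total
fees_2025_eur_m = 58.2   # € million, 2025 total

# Year-over-year growth in the Commission's fee collection.
growth = (fees_2025_eur_m - fees_2024_eur_m) / fees_2024_eur_m
print(f"Fee growth 2024→2025: {growth:.1%}")  # → 29.3%
```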

2.2 Risk of Non-Compliance Fines

The DSA establishes a strict fine mechanism, with a maximum fine of 6% of global annual turnover [6]. On December 5, 2025, the European Commission imposed the first DSA non-compliance fine of €120 million on X (formerly Twitter), citing the following violations [3][4][7]:

  • Deceptive Design (Dark Patterns): The paid “Blue Check” verification system misleads users
  • Advertising Transparency Violations: Incomplete and hard-to-use advertising repository
  • Restricted Data Access for Researchers: Violation of data accessibility obligations

For Meta, the investigation launched on April 30, 2024, may result in a maximum fine of 6% of its global turnover. Meta’s global annual revenue is approximately $134 billion, bringing the theoretical maximum fine to $8.04 billion [6][8]. In addition, the Amsterdam District Court ruled on October 2, 2025, that Meta violated Article 38 of the DSA, requiring it to provide a non-algorithmically sorted timeline option or face a fine of €100,000 per day (capped at €5 million) [9].
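The fine exposure quoted above is simple arithmetic; the following sketch reproduces both the 6% turnover ceiling and the Amsterdam periodic-penalty cap (all figures from the report):

```python
# DSA ceiling: fines of up to 6% of global annual turnover.
meta_revenue_usd = 134e9          # Meta's approximate global annual revenue
max_fine_usd = 0.06 * meta_revenue_usd
print(f"Theoretical maximum fine: ${max_fine_usd / 1e9:.2f}B")  # → $8.04B

# Amsterdam District Court periodic penalty: €100,000/day, capped at €5 million.
daily_penalty_eur = 100_000
penalty_cap_eur = 5_000_000
days_until_cap = penalty_cap_eur // daily_penalty_eur
print(f"Penalty cap reached after {days_until_cap} days")  # → 50 days
```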

2.3 Technical Compliance Investment
Cost of Age Verification Systems

TikTok will launch new age detection technology in Europe in the coming weeks. The system has undergone a one-year pilot in Europe, predicting whether an account user is a minor by analyzing user profile information, posted videos, and behavioral signals [10]. Accounts flagged by the system will be manually reviewed by professional moderators instead of being directly banned.

Since July 2025, the European Commission has collaborated with multiple countries to test a unified age verification application blueprint, linking the EU Digital Wallet (eID) for adult content platforms, and promoting it to other platforms to meet the compliance requirements of Article 28 of the DSA [11]. Technical options for age verification include [12]:

  • High-Effectiveness Methods: Credit card verification, open banking verification, facial age estimation
  • Identity Document Verification: Government-issued documents such as passports and national ID cards
  • Behavioral Signal Analysis: Inferring age based on user activity patterns
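The tiered options above suggest an escalation policy: try low-friction signals first and fall back to stronger checks only when confidence is low. A hypothetical sketch (all function names and thresholds are illustrative, not any platform's actual API):

```python
from dataclasses import dataclass

@dataclass
class AgeEstimate:
    age: int           # estimated age in years
    confidence: float  # 0.0–1.0, how sure the method is

def assure_age(behavioral: AgeEstimate, try_facial, try_document) -> AgeEstimate:
    """Escalate through age-assurance tiers until one is confident enough.

    `behavioral` comes from activity-pattern analysis; `try_facial` and
    `try_document` are callables for the higher-friction checks.
    """
    if behavioral.confidence >= 0.9:   # behavioral signals suffice
        return behavioral
    facial = try_facial()              # e.g. facial age estimation
    if facial.confidence >= 0.9:
        return facial
    return try_document()              # government ID: authoritative

# Illustrative run: a weak behavioral signal escalates to facial estimation.
result = assure_age(
    AgeEstimate(age=15, confidence=0.6),
    try_facial=lambda: AgeEstimate(age=16, confidence=0.95),
    try_document=lambda: AgeEstimate(age=16, confidence=1.0),
)
print(result.age, result.confidence)  # → 16 0.95
```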
Cost of Algorithm Transparency and Adjustments

The DSA requires platforms to provide transparency of recommendation algorithms and allow users to opt out of personalized recommendations. TikTok, Facebook, and Instagram currently offer options to turn off personalized feeds [2]. Meta is required to integrate “safety by design” into its product architecture, avoiding design features that may lead to addiction, which requires significant resources to optimize its algorithm recommendation system [6][8].
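The opt-out requirement can be illustrated as a feed that honors a per-user choice by falling back to reverse-chronological ordering, one common non-profiled implementation (a minimal sketch; field names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    timestamp: int    # unix seconds
    relevance: float  # engagement-model score (personalized signal)

def build_feed(posts, personalized_opt_in: bool):
    """Return the feed order a DSA-compliant platform might expose.

    Opted-out users get a non-profiled, reverse-chronological timeline,
    one way to satisfy the Article 38 recommender-system requirement.
    """
    if personalized_opt_in:
        return sorted(posts, key=lambda p: p.relevance, reverse=True)
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

posts = [
    Post("a", timestamp=100, relevance=0.2),
    Post("b", timestamp=50, relevance=0.9),
]
assert [p.post_id for p in build_feed(posts, True)] == ["b", "a"]
assert [p.post_id for p in build_feed(posts, False)] == ["a", "b"]
```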

2.4 Operational Compliance Costs

The DSA requires VLOPs to fulfill the following ongoing compliance obligations, resulting in significant operational costs [6][8]:

| Obligation Category | Specific Requirements | Cost Impact |
| --- | --- | --- |
| Risk Assessment | Conduct annual systemic risk assessments (illegal content, disinformation, minor risks) | Need to establish specialized teams and processes |
| Transparency Reports | Publish semi-annual transparency reports covering moderation operations, user reports, and appeal handling | Increased costs for report preparation and auditing |
| Data Access for Researchers | Open datasets to qualified researchers | Increased costs for data management and security |
| Explanation of Reasons | Provide clear explanations to users for content moderation decisions | Increased costs for customer service and system development |
| Advertising Repository | Maintain a publicly searchable database of advertising campaigns | Investment in technical development and maintenance |

3. Impact on User Growth Strategies
3.1 Restrictions on Minor User Acquisition

Article 28 of the DSA explicitly requires online platforms targeting minors to take appropriate and proportionate measures to ensure a high level of protection of minors’ privacy, safety, and security [2][12]. Major platforms have taken the following measures:

| Platform | Measures | Impact on User Growth |
| --- | --- | --- |
| TikTok | Default private accounts for users under 16; launched age detection technology | Restricted growth of underage users |
| YouTube | Default private accounts for users under 16; restricted targeted advertising | Restricted acquisition of adolescent users |
| Snapchat | Stopped displaying targeted ads to minor users | Reduced precision marketing capabilities |
| Meta (Facebook/Instagram) | Age verification methods under investigation; algorithm system under review | Reduced attractiveness to adolescent users |

On October 24, 2025, the European Commission preliminarily determined that Facebook, Instagram, and TikTok may have violated the DSA’s transparency obligations [13]. The European Commission also launched investigations into Snapchat, YouTube, Apple App Store, and Google Play, reviewing their age verification systems and measures to prevent minors from accessing illegal products [14].

3.2 Adjustments to User Acquisition Strategies
Special Considerations for the European Market

Given the increasingly strict regulatory environment in Europe, the user growth strategies of social media platforms are undergoing structural changes:

  1. Geographic Differentiation Strategy: Platforms may treat the European market as an independent operating unit, differentiating product functions and content moderation standards from other regions to reduce compliance complexity
  2. Prioritizing User Quality Over Quantity: In an environment of high compliance costs, platforms may focus more on improving the activity and retention rate of existing users rather than simply pursuing user quantity growth
  3. Compliance-First Product Design: New product features must consider DSA compliance requirements during the design phase rather than making adjustments after launch, which will affect product iteration speed and innovation capabilities
Restrictions on Adult Content Platforms

The European Commission launched DSA investigations into Pornhub, Stripchat, XNXX, and XVideos, accusing them of inadequate age verification, risk assessment, and safeguard measures [11]. Platforms with insufficient age verification may face user access restrictions, directly affecting their user base.

3.3 Challenges to User Retention
Impact of Algorithm Adjustments on User Engagement

The DSA requires the provision of non-personalized content recommendation options, allowing users to opt out of algorithm-driven feeds. This may lead to:

  • Decreased User Dwell Time: Non-algorithmically sorted feeds typically have lower user engagement
  • Reduced Advertising Display Opportunities: Decreased user time directly impacts advertising revenue
  • Risk of User Churn: Some users may switch to competing platforms that offer more “addictive” experiences
Demonstration Effect of the “SkinnyTok” Incident

The European Commission expanded its investigation into TikTok to include the “SkinnyTok” issue—a harmful algorithm-driven trend promoting extreme weight loss. Following concerns from EU and French regulators, TikTok banned related hashtags globally and redirected searches to mental health support resources [11]. Such incidents may prompt platforms to adopt more conservative algorithm strategies, reducing content attractiveness to avoid risks.


4. Impact on Advertising Business Models
4.1 Restrictions on Targeted Advertising

The core impact of the DSA on the social media advertising business model is reflected in the following bans [2]:

Prohibition of Targeted Advertising to Minors
  • Snapchat, Google/YouTube, and Meta (Facebook/Instagram) have stopped displaying targeted ads to minor users
  • TikTok and YouTube set default private accounts for users under 16
  • This directly weakens the platforms’ ability to monetize advertising targeting the adolescent market
Prohibition of Targeted Advertising Based on Sensitive Personal Data

The DSA prohibits the use of special categories of personal data for user profiling and advertising targeting, including [2][15]:

  • Racial or ethnic origin
  • Political opinions
  • Sexual orientation
  • Religious beliefs
  • Health data

This restricts advertisers’ ability to use refined user profiles, reducing advertising delivery efficiency and pricing power.
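The prohibition amounts to a deny-list applied before any ad-targeting request is accepted; a minimal sketch (category key names are illustrative):

```python
# DSA special categories of personal data barred from ad targeting.
PROHIBITED_CATEGORIES = {
    "racial_or_ethnic_origin",
    "political_opinions",
    "sexual_orientation",
    "religious_beliefs",
    "health_data",
}

def validate_targeting(criteria: dict) -> list:
    """Return the targeting keys that violate the sensitive-data ban."""
    return sorted(k for k in criteria if k in PROHIBITED_CATEGORIES)

campaign = {"age_range": "25-34", "interests": "hiking", "health_data": "diabetes"}
violations = validate_targeting(campaign)
assert violations == ["health_data"]  # this campaign must be rejected
```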

4.2 Advertising Transparency Requirements
Advertising Repository Obligations

VLOPs must maintain a publicly searchable advertising repository containing [2][3]:

  • Detailed information of each paid advertising campaign
  • Advertiser information
  • Reasons for ad display
  • Advertising targeting criteria
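Those four fields map naturally onto a searchable record; a minimal sketch of what a repository entry and a public search could look like (the schema is hypothetical, not any platform's actual format):

```python
from dataclasses import dataclass

@dataclass
class AdRecord:
    campaign_id: str
    advertiser: str              # who paid for the ad
    display_reason: str          # why a user was shown it
    targeting_criteria: list     # parameters used to select audiences

def search_repository(repo, advertiser_query: str):
    """Public, case-insensitive search over the ad repository."""
    q = advertiser_query.lower()
    return [r for r in repo if q in r.advertiser.lower()]

repo = [
    AdRecord("c1", "Acme Shoes", "interest: running", ["age_range:18-34"]),
    AdRecord("c2", "Beta Travel", "location: Paris", ["locale:fr"]),
]
hits = search_repository(repo, "acme")
assert [r.campaign_id for r in hits] == ["c1"]
```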

X (formerly Twitter) was fined €120 million due to its advertising repository being “sparse and hard to use” [3][4][7]. TikTok has committed to providing a searchable and reliable advertising repository to meet DSA requirements [1].

Ad Labeling and Explanation Requirements

All ads must be clearly labeled, and platforms must explain to users why they are seeing a specific ad. This increases the complexity and operational costs of advertising systems.

4.3 Fundamental Challenges to the Advertising-Driven Business Model
Inherent Contradiction of the “Attention Economy” Model

Academic research points out that the advertising-driven business model of social media platforms relies on the “attention economy,” maximizing users’ online time and ad exposure through the following design strategies [16]:

| Design Strategy | Business Purpose | DSA Compliance Conflict |
| --- | --- | --- |
| Infinite Scrolling | Increase page views | May constitute “deceptive design” |
| Variable Reinforcement (Likes, Comments) | Cultivate user habits and addiction | May stimulate “behavioral addiction” |
| Precision Push Notifications | Recover churned users | May be regarded as manipulative design |
| Algorithmic Content Sorting | Maximize user engagement | May create an “echo chamber” effect |

The DSA requires platforms to assess and mitigate the risks of their designs to users’ mental health, which fundamentally conflicts with the core logic of platforms profiting by maximizing user attention [16].

Brand Safety and Advertiser Relationships

Snapchat has recently adopted strategies to strengthen brand safety measures, attracting advertisers amid Meta’s reduced focus on brand safety [17]. Social media is considered the advertising format with the greatest brand safety challenges, and Snapchat’s initiatives may give it a competitive advantage in the compliance environment.

However, overall, DSA regulation may:

  • Reduce Advertising Pricing Power: Restrictions on data usage weaken the value of precision targeting
  • Increase Advertising Review Costs: Content moderation and compliance verification requirements increase costs
  • Change Advertiser Preferences: Some advertisers may shift to platforms or channels with looser regulation
4.4 Trends in Structural Adjustments

Facing the business model pressures brought by the DSA, major platforms are exploring the following adjustment directions:

  1. Exploration of Subscription Models: Meta’s “pay or OK” model faces challenges under the GDPR and DMA, and was deemed in violation of the DMA by the European Commission on April 23, 2025, resulting in a €200 million fine [18]

  2. Reconstruction of Advertising Systems: Maintain advertising value amid reduced targeting capabilities, possibly shifting to alternatives such as contextual advertising

  3. Investment in Compliance Technology: Treat compliance as a differentiated advantage rather than a cost burden, such as Snapchat’s emphasis on brand safety to attract compliance-conscious advertisers

  4. Global Policy Standardization: Due to the global operational nature of social media platforms, platforms may choose to apply DSA standards to global operations to simplify compliance (the “Brussels Effect”) [19]


5. Case Analysis of Major Platforms
5.1 Meta (Facebook/Instagram)
| Dimension | Specific Situation |
| --- | --- |
| Regulatory Investigations | DSA investigation launched on April 30, 2024, focusing on the addictive impact of the algorithm system on children and the “echo chamber” effect; preliminarily determined that it may have violated transparency obligations in October 2025 |
| Litigation | Amsterdam District Court ruled on October 2, 2025, that Meta violated Article 38 of the DSA, requiring it to provide a non-algorithmic timeline option |
| Fine Risk | Up to 6% of global annual turnover (approximately $8.04 billion) |
| Compliance Measures | Stopped targeted advertising to minors; provided options to turn off personalized recommendations; adjusted algorithm recommendation systems |
| Impact on Business Model | Restricted access to adolescent advertising market; brand safety concerns may affect advertising revenue; significant investment in compliance technology required |
5.2 TikTok
| Dimension | Specific Situation |
| --- | --- |
| Regulatory Investigations | Underwent DSA compliance investigation in December 2024; preliminarily determined that it may have violated transparency obligations in October 2025; the “TikTok Lite” reward program was required to be permanently taken offline in the EU |
| Age Verification Measures | Launched new age detection technology (one-year pilot in Europe); collaborated with Yoti on facial age estimation; uses credit card or ID verification when processing ban appeals |
| Compliance Measures | Committed to establishing a searchable advertising repository; set default private accounts for users under 16; banned harmful “SkinnyTok” hashtags globally |
| Impact on User Growth | Age verification restricts acquisition of underage users; regulatory pressures affect expansion in the adolescent market |
| Impact on Business Model | Increased advertising compliance costs in the European market; restricted advertising targeting capabilities |
5.3 X (formerly Twitter)
| Dimension | Specific Situation |
| --- | --- |
| Non-Compliance Penalties | Imposed a €120 million fine by the European Commission on December 5, 2025 (the first DSA fine) |
| Violations | Deceptive design (Blue Check verification); incomplete advertising repository; restricted data access for researchers |
| Response Measures | Suspended the European Commission’s advertising account in response |
| Impact on Business Model | Damaged brand reputation; reduced advertiser trust; continuously increasing compliance costs |
5.4 Snapchat
| Dimension | Specific Situation |
| --- | --- |
| Regulatory Investigations | Underwent DSA investigation on October 10, 2025, reviewing its age verification system and minor protection measures |
| Compliance Measures | Already stopped targeted advertising to minors; strengthened brand safety measures |
| Competitive Strategy | Attracted advertisers amid Meta’s brand safety controversy; treated compliance as a differentiated advantage |
| Impact on Business Model | Restricted access to adolescent advertising market; brand safety positioning may attract specific advertisers |

6. Long-Term Trends and Outlook
6.1 Continued Strengthening of the “Brussels Effect”

The impact of the DSA is extending beyond EU borders. Due to the global operational nature of social media platforms, to simplify compliance processes, platforms may choose to apply DSA standards to global operations [19]. This means:

  • Global content moderation policies align with EU standards
  • Global unification of advertising transparency standards
  • Age verification measures become industry-wide practices
  • Overall improvement in user privacy protection levels
6.2 Possible Expansion of Regulatory Scope

Currently, the DSA primarily applies to VLOPs, but:

  • The European Commission is evaluating the possibility of extending DSA obligations to small and medium-sized platforms
  • The UK Online Safety Act (OSA) provides a parallel regulatory framework, creating dual compliance requirements
  • Australia has introduced the world’s first social media ban for users under 16, forming a global trend
6.3 Pressure for Fundamental Changes to Business Models

The DSA reveals the inherent contradiction of the advertising-driven social media model, which may give rise to [16]:

  • More platforms exploring subscription or hybrid revenue models
  • Transformation from the “attention economy” to a “value economy”
  • Further expansion of transparency and user control over data usage
  • Mental health protection becoming a core consideration in product design
6.4 Law Enforcement Resources and Capacity Building

In 2024, the European Commission sent approximately 100 information requests to VLOPs and launched 9 formal investigation procedures [5]. However, regulatory authorities face:

  • Severe capacity constraints
  • Issues with enforcement failures in some countries
  • Industry lobbying pressures
  • Geopolitical interference

This may lead to uncertainty in law enforcement intensity, and platforms need to continuously monitor regulatory developments.


7. Conclusions and Recommendations
7.1 Key Findings

The EU Digital Services Act (DSA) has exerted far-reaching impacts on the compliance costs, user growth strategies, and advertising business models of social media platforms:

  1. Significant Increase in Compliance Costs: Direct regulatory fees, fine risks, and technical compliance investments constitute multiple cost pressures
  2. Restrictions on User Growth Strategies: Structural constraints on the acquisition and retention of minor users
  3. Pressure on Advertising Business Models: Restrictions on targeted advertising and transparency requirements weaken platforms’ core monetization capabilities
  4. Fundamental Challenges to Business Models: The DSA reveals the inherent conflict between the “attention economy” and user protection
7.2 Impact Assessment on Platforms
| Platform | Short-Term Impact | Medium-Term Impact | Long-Term Impact |
| --- | --- | --- | --- |
| Meta | Risk of huge fines; increased compliance investment | Slow user growth; declining advertising revenue | Forced transformation of business model |
| TikTok | Investment in age verification technology | Pressure on market share in the European market | Need to restructure global compliance strategies |
| X | Damaged brand reputation | Advertiser churn | Possible forced exit or significant contraction |
| Snapchat | Regulatory investigation pressure | Opportunities brought by brand safety positioning | Need to continue investing in compliance capabilities |
7.3 Strategic Recommendations

For social media platforms, the following strategies should be adopted in the DSA regulatory environment:

  1. Front-Load Compliance Investment: Integrate compliance requirements into product design processes to avoid high costs of post-hoc rectification
  2. Differentiated Compliance Advantages: Transform compliance measures into competitive advantages favored by users and advertisers
  3. Diversify Revenue Sources: Explore non-advertising revenue sources such as subscriptions and value-added services to reduce reliance on targeted advertising
  4. Coordinate Global Policies: Establish unified global compliance standards, using EU requirements as a benchmark to simplify operations
  5. Reconstruct User Relationships: Treat user empowerment as an opportunity to enhance user loyalty rather than a mere compliance burden

The DSA represents an important trend in global digital regulation, whose impact will extend far beyond the EU and shape the future development direction of the global social media industry.


References

[1] Global Digital Policy Roundup: December 2025 - TechPolicy.Press

[2] The impact of the Digital Services Act on digital platforms - European Commission

[3] EC Issues First Non-Compliance Fine Under the DSA: X Fined €120 Million - Goodwin

[4] €120 million later: the DSA enters the enforcement phase - MediaLaws

[5] A Drop in the Bucket: EU Charges Big Tech Just €58M Under DSA - PYMNTS

[6] The DSA & OSA Enforcement Playbook - Checkstep

[7] First DSA fine: €120 M for X for deceptive design and transparency failures - Sypher
