Indonesia has issued a stern warning to Meta Platforms Inc. for its inadequate efforts in controlling the spread of harmful content across its social media platforms, marking an escalation in the country’s push for stricter digital content regulation.

The warning followed an unscheduled visit by Indonesia’s Communications and Digital Affairs Minister Meutya Hafid to Meta’s operational headquarters in Jakarta on Wednesday. During the inspection, officials determined that the tech giant had failed to meet compliance standards for moderating problematic content including disinformation, online gambling, defamation, and hate speech on Facebook, Instagram, and WhatsApp.

According to ministry data, Meta had addressed only 28.47% of flagged content related to online gambling and disinformation, a figure Indonesian authorities deemed unacceptably low. This poor compliance rate prompted the formal warning issued Thursday.

“Disinformation, defamation, and hate content threaten lives in Indonesia, yet Meta has allowed them to persist,” Minister Hafid stated firmly. The ministry has demanded that Meta strengthen its content moderation systems and expedite the removal of illegal and harmful material from its platforms.

The warning represents the latest development in Indonesia’s ongoing efforts to regulate digital content. The Southeast Asian nation, home to approximately 270 million people and one of the world’s largest social media markets, has been increasingly assertive in demanding accountability from tech companies operating within its borders.

This action follows a broader initiative begun last year when Indonesian authorities summoned representatives from Meta and other social media platforms, ordering them to improve content moderation practices amid growing concerns about the proliferation of disinformation. The government has been particularly concerned about the potential for harmful content to create social division and unrest in the diverse archipelago nation.

Indonesia’s regulatory approach reflects a growing global trend of governments taking stronger positions against tech giants regarding content moderation. Many countries are developing or implementing digital regulations that require social media companies to take greater responsibility for content published on their platforms.

For Meta, this warning comes at a challenging time as the company navigates varying regulatory requirements across different jurisdictions while trying to maintain its business operations. The company has faced similar pressures in other markets, including Europe, India, and Australia, regarding content moderation and platform responsibility.

Digital policy experts note that Indonesia’s market is particularly significant for Meta, as the country has one of the highest rates of social media usage in the world. Facebook alone has an estimated 130 million Indonesian users, making it one of the platform’s largest markets globally.

Meta did not immediately respond to requests for comment on the warning at the time of reporting. However, the company typically emphasizes its investments in content moderation technology and human reviewers to address harmful content across its platforms.

The Indonesian government has indicated that it will be monitoring Meta’s response closely and may consider additional regulatory actions if compliance does not improve. The communications ministry has not specified what potential penalties Meta might face if it fails to address the government’s concerns, but Indonesia has previously threatened to restrict access to platforms that do not comply with local regulations.

This regulatory tension highlights the complex balance tech companies must strike between operating freely in lucrative international markets and respecting increasingly assertive national regulations on digital content.


11 Comments

  1. While content moderation is challenging, Meta should be able to do better than a 28% compliance rate. I wonder what specific obstacles they are facing in Indonesia that are hampering their efforts.

    • William Lopez on

      Good point. The low compliance rate does seem quite concerning. Meta will likely need to invest more resources and refine its processes to better address the issues in Indonesia.

  2. Mary Jackson on

    This is a complex issue without easy solutions. Social media companies have to balance user privacy, freedom of expression, and content moderation. I hope Meta and Indonesia can find a mutually agreeable path forward.

  3. Elizabeth Jackson on

    This is a concerning development for Meta, but also highlights the broader challenges of content moderation at scale. It will be interesting to see if this leads to any new regulatory approaches or collaborations between social media companies and governments.

  4. Lucas Johnson on

    It’s troubling to see Meta struggling so much with content moderation in Indonesia. As one of the world’s largest social media platforms, they need to be able to effectively address issues like disinformation and hate speech.

    • Liam Hernandez on

      Absolutely. Meta has a responsibility to its users to maintain higher standards, especially in sensitive markets. This warning from Indonesia should serve as a wake-up call for the company to invest more in improving its content moderation systems.

  5. Elijah Hernandez on

    Disinformation and hate speech can have very real consequences, so I understand Indonesia’s concerns. It will be interesting to see if this leads to any broader changes in how Meta approaches content moderation globally.

  6. James Jackson on

    This is certainly a concerning issue for Meta and Indonesia. Maintaining content moderation standards on such a large platform must be an ongoing challenge. I wonder what specific steps Meta could take to better address disinformation and hate speech in the region.

  7. Oliver Thompson on

    It’s good to see Indonesia taking a firm stance on this. Social media platforms have a responsibility to moderate harmful content, especially in sensitive markets. Meta needs to act quickly to improve compliance and protect its users.

  8. This highlights the delicate balance social media companies must strike between free expression and content regulation. I hope Meta can work constructively with Indonesia to find a workable solution that prioritizes public safety.

  9. Oliver Williams on

    Kudos to Indonesia for taking a strong stance on this issue. Social media platforms can’t just pay lip service to content moderation – they need to actually enforce their policies and protect users. I hope this leads to meaningful changes at Meta.


A professional organisation dedicated to combating disinformation through cutting-edge research, advanced monitoring tools, and coordinated response strategies.

Company

Disinformation Commission LLC
30 N Gould ST STE R
Sheridan, WY 82801
USA

© 2026 Disinformation Commission LLC. All rights reserved.