Indonesia and the politics of platform governance

Four tween boys crowd around a mobile phone in Jakarta, Indonesia. Image from Flickr. License CC BY-NC-ND 2.0.

Indonesia’s Minister of Communication and Digital Affairs, Meutya Hafid, along with several officials from government agencies, conducted an inspection of the technology company Meta’s office in South Jakarta in early March. The inspection aimed to compel the company to comply with Indonesian law. Meutya said:

Sore ini kita melakukan giat sidak di kantor Meta. Ini adalah tindak lanjut Pasal 40 Undang-Undang ITE yang menyatakan pemerintah bertugas melindungi keselamatan dan kepentingan umum dari berbagai gangguan akibat misinformasi dan disinformasi.

This afternoon, we conducted an inspection at Meta’s office. This is a follow-up to Article 40 of the Electronic Information and Transactions Law (UU ITE), which states that the government has the duty to protect public safety and the public interest from disruptions caused by misinformation and disinformation.

She explained that the inspection was conducted because Meta was considered not fully compliant with regulations in Indonesia, particularly regarding the spread of disinformation. During the inspection, the minister also asked Meta to be transparent about its algorithms and content moderation practices.

The inspection reflects the Indonesian government’s latest attempt to assert greater control over global digital platforms. Yet experiences in many countries show that the relationship between governments and digital platforms is never entirely one-directional. In Southeast Asia, efforts by states to “tame” platforms often turn into negotiations of power between governments, global technology companies, and broader geopolitical pressures.

Minister of Communication and Digital Affairs of Indonesia, Meutya Hafid, discusses Indonesia’s plan to tighten control over social media companies to protect Indonesia’s youth. Screenshot from CNA YouTube. Fair use.

In Indonesia, debates around disinformation and content moderation rarely move beyond ambiguous phrases such as “attacking honor,” “violating propriety,” or “disturbing the public.” But whose honor is being attacked? What social values are being violated? And what kind of content is considered disturbing to the public?

Is content criticizing the rape cases during the May 1998 riots, or criticism of nickel mining in Raja Ampat, considered “disturbing the public”? In June 2025, several public accounts on platform X received official notifications stating that their posts were “violating the law” according to the Indonesian government. These posts contained criticism of the government. Watchdogs argue that the government, through the Ministry of Communication and Digital Affairs (Komdigi), requested that X take down critical posts from accounts such as @neohistoria_id and @perupadata.

“Regulating platforms is not a problem, but the rules must be clear. Harmful content for whom?” said Masgustian from the Center for Digital Society (CFDS) at Universitas Gadjah Mada.

He argues that platforms are generally willing to cooperate with governments on content moderation. However, the main issue lies in the lack of clarity in regulatory definitions. In ongoing research conducted by CFDS, discussions with government representatives show that even within government institutions, there are different definitions of terms such as “terrorism” or “harmful content.”

For example, there is still no shared understanding between the definitions used by Komdigi and the National Cyber and Crypto Agency (BSSN). According to Masgustian, this lack of clarity may create new problems in the implementation of content moderation.

Government efforts to regulate platforms

Government efforts to regulate platforms have intensified since technology companies were required to register as Electronic System Operators (PSE) in 2020. PSEs that fail to register may face administrative sanctions ranging from warnings and fines to access blocking by internet service providers (ISPs). According to the government, the policy aims to prevent the spread of harmful content and ensure personal data protection.

Within this regulatory framework, the government has greater room to influence platform policies. For example, TikTok temporarily suspended its “live” feature during a national demonstration in September 2025. TikTok stated that the suspension was voluntary after being summoned by Komdigi, while the government denied issuing any direct order.

Komdigi also operates a system called SAMAN. This system allows the ministry to compel social media platforms such as Facebook, Instagram, X, TikTok, and YouTube to remove content within 4–24 hours of a government order. Platforms that fail to comply may be fined up to 500 million Indonesian rupiah (over USD 29,000) per piece of content, or be blocked altogether.

According to Alia Yofira, a researcher at Purplecode who focuses on technology and human rights, platforms must not only comply with takedown requests but also fulfill mandatory registration requirements. If they fail to comply, they risk being blocked.

“The consequence was faced by Wikimedia, their system was blocked because they refused to register,” Alia explained in an interview with Global Voices. “Blocking websites, however, can also be considered a human rights violation because it disrupts public access to information and administrative services.”

Members of Wikimedia Indonesia meet up for a workshop. Image from Wikimedia Commons. License CC BY-SA 4.0.

Alia further explained that the concern is not only about website blocking but also about access to personal data. She described how Komdigi held private meetings with digital platforms following large-scale demonstrations in August 2025. After those meetings, platforms disabled the “live” feature, particularly on TikTok and Instagram, although Instagram limited the restriction to users with fewer followers.

“During protests, demonstrators not only documented police brutality, but also shared information about routes in and out of protest locations, closed roads, and access to first aid,” Alia said. “But at that time, Komdigi instead asked platforms to provide data on users who were livestreaming if those users monetized their livestreams.” According to Alia, this suggests that the government’s intervention is not only about regulating disturbing content but may also limit public access to information.

From media control to platform governance

Government attempts to regulate digital platforms are not entirely new. As people increasingly access information through social media platforms, the state appears to be shifting its mechanisms of control toward these platforms. Previously, such control was directed primarily at traditional media.

State control of this kind predates the Reformasi period (1998), when the media in Indonesia was tightly regulated and controlled by the state, reflecting an authoritarian media system rather than a free public sphere. Under the Suharto regime, press freedom was constrained through political licensing, bureaucratic oversight, and administrative pressure that limited journalistic autonomy and critical reporting. Scholars note that media licensing and permit systems restricted dissent and empowered the state to sanction outlets that challenged official narratives. Regulatory bodies such as the Press Council operated under the Department of Information, reinforcing government influence rather than protecting autonomy.

However, the dynamics of government control over the media changed over time, particularly as media ownership became increasingly dominated by conglomerates whose political interests shaped media coverage.

Work by researcher Merlyna Lim maps the relationship between media and politics after Reformasi. Her report, The League of Thirteen, documents how a relatively small number of conglomerates, thirteen key media groups, dominate the media landscape across television, print, radio, and online outlets within Indonesia. Her work shows how this concentration extends economic and political power into the hands of a few powerbrokers.

Media power became particularly visible during the 2014 presidential election. Media scholar Ross Tapsell describes these media owners as part of a “media oligarchy” that embedded their influence during electoral cycles. For example, Metro TV, owned by Surya Paloh, chairman of the NasDem Party, gave extensive and favorable coverage to then-presidential candidate Joko Widodo (Jokowi, President 2014–2024) and was widely perceived as supporting him. Meanwhile, TVOne, owned by Aburizal Bakrie, former chairman of the Golkar Party, was more sympathetic to Prabowo Subianto during the same contest.

Government-platform relations in Southeast Asia

Similar tensions between governments and digital platforms can also be observed across Southeast Asia, a region characterized by fragile electoral democracies, hybrid regimes, and long histories of state repression, violence, and impunity. In this context, the relationship between governments and digital platforms is not always one-directional.

Two cases that illustrate these dynamics are Cambodia and Myanmar. In Cambodia in 2023, Prime Minister Hun Sen had a dispute with Meta. The case began when Hun Sen livestreamed on Facebook for more than an hour. During the speech, he threatened political opponents and said they could choose between “the legal system or a baton.” He also mentioned the possibility of sending “gangsters” to the homes of political opponents.

Cambodian Prime Minister Hun Sen, who has been embroiled in numerous Meta-related controversies. Images from Pexels (free to use) and Wikimedia Commons (CC BY 4.0).

The video was reported by many users because it contained threats of violence and intimidation ahead of the election. Initially, Meta did not remove the video because it was considered newsworthy. However, the decision drew criticism from human rights activists and democracy researchers.

The case was later reviewed by the Meta Oversight Board, an independent body that evaluates the company’s moderation decisions. In June 2023, the board ruled that the video violated Meta’s violence policies and recommended that Meta remove the video and suspend Hun Sen’s account for six months.

The Cambodian government reacted strongly. Hun Sen deleted his Facebook account and moved to Telegram and TikTok. The government also banned Oversight Board members from entering Cambodia and accused Meta of interfering in domestic affairs.

In the end, Meta removed the video but did not suspend Hun Sen’s account as recommended by the Oversight Board. Many analysts believe Meta sought to avoid escalating conflict with the Cambodian government. Facebook remains highly dominant in the country, while the government also retains the ability to restrict access to the platform. In such circumstances, both sides ultimately depend on one another.

A different dynamic can be seen in the violence against the Rohingya in Myanmar, often considered one of the largest moderation failures in Meta’s history.

According to a 2018 report by the United Nations Independent International Fact-Finding Mission on Myanmar, the Myanmar military and nationalist groups used Facebook to spread anti-Rohingya propaganda, misinformation about Muslims, and hate speech that incited violence. The report concluded that Facebook played a “significant role” in spreading hatred in Myanmar.

One of the main issues was the lack of Burmese-language moderation, combined with algorithms that tend to promote emotionally provocative content. Following international criticism, Meta acknowledged that it had responded too slowly.

In 2021, Rohingya communities filed lawsuits against Meta in the United States and the United Kingdom seeking damages of up to USD 150 billion. The lawsuits argue that Facebook’s algorithms amplified hate speech that contributed to violence against the Rohingya.

However, these cases face major legal barriers under U.S. law, particularly Section 230 of the Communications Decency Act, which provides internet platforms with legal immunity for content posted by users. To date, no court has ordered Meta to pay any damages for its moderation failures.

Nevertheless, legal pressure, reputational risks, and regulatory scrutiny pushed Meta to introduce reforms, including hiring more local-language moderators, dismantling Myanmar military propaganda networks, and banning Myanmar military accounts from Facebook and Instagram.

These cases show that platform decisions are often shaped by three main pressures: legal risk, regulatory risk, and reputational risk. For platforms, the decision to comply with local regulations is often determined by the balance of costs and benefits.

Can the Indonesian government truly tame digital platforms?

Answering this question involves more than examining domestic policy tools such as those historically used to control media before Reformasi. Unlike traditional media, which operate within national jurisdictions, digital platforms are global infrastructures that operate across borders.

As a result, attempts by governments to regulate or “tame” platforms are rarely purely coercive. Instead, they often become negotiations between state authority, the interests of global technology companies, and broader geopolitical dynamics within global internet governance.

The relationship between governments and digital platforms is therefore never simple. Governments can introduce regulations, summon platforms, or even threaten to block access. Yet digital platforms also possess enormous power through the technologies, algorithms, and global information infrastructure they control.
