How Unelected EU Officials Built a Transnational Speech Police

by Thomas Karat | Feb 24, 2026


On December 5, 2025, the European Commission imposed a €120 million fine on X, formerly Twitter—its first enforcement action under the Digital Services Act (DSA). The stated violations had nothing to do with incitement, fraud, or child exploitation. The Commission objected to the design of X’s blue checkmark, the layout of its advertising repository, and its data-sharing arrangements with academic researchers. In the bureaucratic vocabulary of Brussels, these constituted breaches of “transparency obligations.” In plainer language, the world’s most powerful unelected regulatory body had just demonstrated that it could extract nine-figure penalties from an American company for interface design choices that no European citizen was ever asked to vote on.

The fine, for all its size, was a warning shot. The DSA authorizes penalties of up to 6% of a company’s global annual turnover—a figure that, for the largest American platforms, runs into tens of billions of dollars. Ten of the nineteen platforms Brussels has designated as “Very Large Online Platforms” subject to the law’s most invasive requirements are American companies, including Amazon, Meta’s family of apps, Google, and Wikipedia. The architecture is unmistakable: a European regulatory apparatus with global jurisdiction over American-built infrastructure, enforced by officials whom no European voter directly elects, and backed by financial penalties severe enough to function as existential threats. The Commission insists the DSA is a “transparency” law. The machinery it has assembled tells a different story.

The Digital Services Act, which became fully applicable in February 2024, operates through a structure of cascading obligations that grow more coercive at each level. At its foundation lies a definition of “illegal content” so expansive that it functions as an invitation to suppress speech at will. Article 3(h) defines illegal content as anything that is “not in compliance with Union law or the law of any member state”—a formulation that means the most restrictive speech law in any of the European Union’s twenty-seven member states can become the effective continent-wide standard. Germany’s NetzDG hate speech provisions, France’s memory laws criminalizing certain historical interpretations, Poland’s blasphemy statutes—any of these can furnish grounds for content removal across the entire bloc.

Above this baseline, the DSA erects an apparatus of “trusted flaggers”—government-approved entities, including NGOs and law enforcement agencies such as Europol, deputized to flag content for priority removal by platforms. Their notifications must be processed “without undue delay.” The platforms must then report to EU regulators what action they took in response. The incentive structure is not subtle: process the flags quickly and remove the content, or face questions from Brussels about why you failed to act.

For the largest platforms, the obligations escalate further. Articles 34 and 35 impose a duty to “assess and mitigate systemic risks,” a category that includes threats to “civic discourse,” “electoral processes,” and “public security.” The language is so broad that virtually any political speech disfavored by the Commission could qualify as a systemic risk requiring mitigation. The Center for Strategic and International Studies—hardly an institution of the populist fringe—has acknowledged that the DSA’s architecture creates structural incentives for platforms to remove more speech than the law technically requires, simply to reduce regulatory exposure. The Chicago Journal of International Law has warned that these incentives will fall heaviest on “political speech, criticism of political figures, parody, and pro-LGBTQ+ speech.”

There is no meaningful check on this power. Appeals from Commission enforcement decisions must travel through the Court of Justice of the European Union, a process that takes one to two years and costs sums that only the largest corporations can afford. National courts cannot overrule the Commission’s interpretations. The entire edifice—the definitions, the trusted flaggers, the systemic risk obligations, the enforcement mechanism—was enacted by the European Parliament and Council, then placed in the hands of a Commission whose president is proposed by the European Council and confirmed by Parliament. No citizen of any EU member state cast a direct ballot for Ursula von der Leyen or any of the officials who now wield this apparatus.

If the Digital Services Act’s architecture were merely theoretical, its critics could be accused of speculation. Subpoenaed documents obtained by the U.S. House Judiciary Committee have now closed that gap between design and deployment.

In July 2025, the Committee’s Republican majority published an interim staff report titled “The Foreign Censorship Threat,” based on nonpublic documents obtained under subpoena from major American technology companies. The documents include email communications between Commission staff and platform executives, and internal materials from a closed-door “DSA Workshop” the Commission hosted with platforms in May 2025. At that workshop, Commission regulators presented a hypothetical social media post—“we need to take back our country,” a phrase so common in democratic politics that it could have appeared in any campaign in any Western nation—and labeled it “illegal hate speech” that platforms are obligated to censor under the DSA.

The Committee’s findings go further. Commission regulators explicitly told platforms at the workshop that “continuous review of global community guidelines” was a best practice for DSA compliance—a directive that, because major platforms generally maintain a single set of content moderation policies worldwide, means that speech restrictions demanded by Brussels are imposed on users in the United States, Brazil, Japan, and everywhere else. The Committee concluded that the Commission’s censorship efforts have been overwhelmingly one-sided, targeting political conservatives—a characterization the Committee offered as a documented finding, supported by the subpoenaed materials.

The Commission’s reach extended deep into the electoral politics of its own member states. Internal TikTok documents submitted to the Committee revealed that the Commission pressured platforms to increase content takedowns ahead of elections in Slovakia, the Netherlands, France, Moldova, Romania, and Ireland, as well as the 2024 European Parliament elections. TikTok’s own staff expressed concern in internal communications about what they called the Commission’s “very informal approach” to Romanian election authorities, and warned about “unjustified removal of legal content (such as political speech).”

Former EU Commissioner Thierry Breton established the tone before the DSA’s enforcement apparatus was even fully operational. In 2023, he publicly declared, “I am the enforcer,” and threatened to ban platforms from operating in the European Union entirely if they failed to remove content he deemed illegal. Breton’s letter to Elon Musk in August 2024, warning that “spillovers” of American speech into the EU during a U.S. presidential campaign could trigger “interim measures” against X, was regarded as so nakedly political that it contributed to his departure from the Commission under pressure from von der Leyen herself. The episode revealed not a system of neutral regulation but a political apparatus in which individual commissioners claimed personal authority to determine the boundaries of permissible speech across an entire continent.

The most dramatic deployment of this apparatus came in Romania, where the interaction between the DSA, intelligence claims, and judicial intervention produced what may be the first annulment of a democratic election in EU history carried out under the banner of “platform regulation.”

The sequence is worth reconstructing precisely. On November 24, 2024, Călin Georgescu—an independent populist candidate running on a nationalist, anti-establishment platform—won the first round of Romania’s presidential election with 23% of the vote, having polled in single digits weeks earlier. The result stunned Romania’s dominant political parties and alarmed Brussels. On December 4, outgoing President Klaus Iohannis declassified intelligence documents from Romania’s intelligence service, the SRI, alleging that Russia had orchestrated a covert TikTok campaign using 25,000 coordinated accounts to boost Georgescu’s candidacy. Two days later, on December 6—forty-eight hours before the scheduled runoff—Romania’s Constitutional Court unanimously annulled the election.

The European Commission immediately opened a DSA investigation into TikTok, framing the annulment as vindication of the act’s purpose.

Then the official story began to unravel. Internal TikTok documents submitted to the U.S. House Judiciary Committee state that TikTok “did not find” any evidence of a coordinated network of 25,000 accounts associated with Georgescu’s campaign. By late December 2024, Romanian media—citing the national tax authority, ANAF—reported that the alleged TikTok campaign had actually been funded by another Romanian political party, not by Russia. TikTok’s own internal communications expressed alarm at the Commission’s handling of the situation, warning about the platform being pressured into the “unjustified removal of legal content.”

None of this mattered. The election was never reinstated. In March 2025, Georgescu was barred from running in the rescheduled election, and in May 2025 the pro-EU candidate Nicușor Dan won the rerun contest. The Commission’s investigation into TikTok, meanwhile, has still not concluded—more than a year after the election it helped annul. Romanian MEPs have called the pace “unacceptably slow.” The investigation’s function, by this point, is clear enough: it provided procedural cover for the annulment when the annulment was politically useful. Its completion is no longer necessary.

A democratic election in an EU member state was overturned on the basis of intelligence claims that the platform at the center of the allegations says it cannot substantiate, and that the country’s own tax authority traced to domestic political funding. The Commission weaponized unsubstantiated charges of “Russian interference” to nullify a result that defied the Brussels consensus, then launched an investigation it has shown no urgency to conclude. The bureaucratic vocabulary is impeccable. The substance is election theft.

The Digital Services Act’s deployment against domestic political dissent is inseparable from the European Union’s broader foreign policy trajectory under von der Leyen and her High Representative for Foreign Affairs, Kaja Kallas. Kallas—appointed in December 2024 without any direct vote by European citizens—arrived in the role as the first head of government placed on Russia’s wanted list, a credential that observers across the political spectrum described as signaling the EU’s intent to escalate confrontation with Moscow. Under von der Leyen’s Commission, the EU has imposed seventeen sanctions packages against Russia. The most recent of these extends well beyond conventional economic restrictions.

On May 20, 2025, under Council Decision CFSP 2025/966, the EU broadened its “hybrid threats” sanctions regime to target individuals and entities accused of “destabilising activities.” The measures include asset freezes, travel bans, and the suspension of broadcasting licenses for media outlets deemed to operate under Russian influence. Among those sanctioned were Thomas Röper and Alina Lipp—German nationals whose work as bloggers and commentators critical of EU policy on Russia earned them designations under the same framework applied to GRU operatives and companies engaged in undersea cable sabotage. The Council’s official reasoning against Lipp states that she is “engaging in and supporting” actions by the Russian government through “coordinated information manipulation and interference.”

In the months that followed, the regime’s reach extended further. In June 2025, the Council sanctioned Nathalie Yamb, a Swiss-Cameroonian activist and social media figure known across francophone Africa as the “Lady of Sochi” for a speech she delivered at the 2019 Russia-Africa Summit calling for an end to French economic and military dominance on the continent. Yamb’s activism centered on African sovereignty, the abolition of the CFA franc—the colonial-era currency still underwritten by the French Treasury—and the withdrawal of French military forces from West Africa. She had not, in any substantive sense, commented on the war in Ukraine or advocated for Russian strategic interests in Europe. Her offense was campaigning against French neocolonialism in Africa—a cause with roots decades older than the current confrontation with Moscow. The EU sanctioned her anyway, claiming she had “adopted Moscow’s language” targeting the West and France in particular with a view to ousting them from the African continent. The formula is revealing: opposition to French colonial legacies in Africa is reclassified as a Russian hybrid threat, and the activist who voices it is subjected to the same asset freezes and travel bans as intelligence operatives accused of sabotaging undersea cables. The CFSP framework’s stated purpose—countering Russian destabilisation of European security—becomes, in application, an instrument for punishing anyone whose dissent happens to align, however incidentally, with positions Moscow also holds.

Then, in December 2025, the Council added Jacques Baud, a retired Swiss army colonel, former member of the Swiss Federal Intelligence Service, and veteran of NATO and United Nations operations spanning three decades. Baud’s career included designing the first multidimensional UN intelligence unit in Sudan, heading small arms control at NATO in Brussels, and serving in peacekeeping doctrine at the UN’s Department of Peacekeeping Operations. The European Union’s stated rationale? He “acts as a mouthpiece” for pro-Russian propaganda and “makes conspiracy theories.” His actual transgression was publishing books and giving interviews in which he argued—drawing on his professional background in intelligence analysis—that NATO expansion contributed to the conditions for Russia’s invasion of Ukraine. At the time the sanctions were imposed, Baud was living in Brussels. He received no prior warning. He learned of the designation from the press. Switzerland explicitly declined, as his country of citizenship, to adopt the hybrid threats regime under which he was listed. The practical consequences for Röper, Lipp, Yamb, and Baud are identical: bank accounts frozen, freedom of movement curtailed, financial lifelines severed—all by executive decision of the Council, without a criminal charge, a trial, a conviction, or, in Baud’s case, so much as a notification.

The EU now claims authority under this sanctions regime to prohibit transactions involving digital and communications infrastructure linked to Russian “destabilising activities.” The definition of what constitutes destabilization remains, by design, in the hands of the same unelected officials who define “illegal content” under the DSA, who determine what constitutes a “systemic risk” to “civic discourse,” and who opened a platform investigation to provide cover for an annulled election whose evidentiary basis collapsed under scrutiny.

Censoring domestic critics of EU foreign policy, silencing media outlets, freezing the personal assets of journalists and commentators, and accelerating arms deliveries to Ukraine are not separate phenomena occurring in parallel. They compose a single program. War mobilization has always required the suppression of dissent—the manufacturing of consensus through the elimination of visible alternatives. The EU has simply dressed the machinery in the language of “transparency,” “platform governance,” and “hybrid threat mitigation,” terminology so anodyne that it obscures what is being constructed: a continental apparatus for enforcing narrative conformity on four hundred and fifty million people, administered by officials accountable to no electorate, and exported through the “Brussels effect” to billions more.

The European Union has no democratic mandate to police speech. The Digital Services Act is a censorship infrastructure that enforces narrative conformity across an entire continent and—because global platforms maintain uniform content moderation policies—reaches far beyond it. When unelected officials can extract penalties of up to 6% of a company’s worldwide revenue for hosting political expression they disfavor, when they can furnish the procedural basis for annulling election results grounded in intelligence claims their own investigation cannot confirm, when they can freeze the bank accounts and restrict the movement of journalists whose reporting contradicts the approved narrative, the appropriate term is not “regulation.” The appropriate term is authoritarianism—practiced by committee, documented in official journals, and insulated from democratic accountability by the deliberate complexity of its institutional architecture.

European citizens who value their remaining freedoms of expression and assembly should demand the repeal of the Digital Services Act and resist the broader militarization of European Union governance that the act’s censorship apparatus is designed to serve. Americans, for their part, should oppose any domestic legislation modeled on the DSA’s framework and support legislative measures that protect citizens and companies from the reach of extraterritorial censorship regimes. The machinery Brussels has built does not defend democracy. It is engineered to ensure that democracy never again produces results that Brussels cannot control.

Thomas Karat

Thomas Karat has spent a career in multinational technology corporations and is a behavior analyst holding a Master’s in Science and Communication from Manchester Metropolitan University. His work focuses on the psychology of language in power dynamics, and his graduate thesis examined linguistic deception markers in high-stakes business negotiations. He hosts a podcast, Salt Cube Analytics, featuring conversations with thought leaders from diplomacy, academia, and the intelligence community.
