Internet after DSA – discussion about the limits of responsibility and freedom of speech
The government is implementing the DSA: rapid blocking of illegal content and new platform liability rules. Is this order or censorship?
On Wednesday, November 12, a debate “Internet after DSA – between freedom and responsibility” was held at the PAP headquarters, devoted to the draft amendment to the Act on the provision of electronic services prepared by the Ministry of Digitization. The event was organized by the Human Answer Institute’s Tomorrow Dialogue Zone and concerned the practical implementation of the EU Digital Services Act (DSA).
Censorship on the Internet?
The DSA, fully applicable across the EU since February 17, 2024, transfers responsibility for content to online platforms for the first time on such a scale, and its Articles 9 and 10 allow judicial and administrative authorities to order the blocking of material on the Internet. The Polish implementation bill clarifies the national procedure: reports of content considered illegal will be able to be submitted both by public authorities (including the Police, the Prosecutor’s Office, and the National Tax Administration) and by ordinary users. Shorter deadlines are provided for institutions – if the Police indicates a violation, the Office of Electronic Communications has two days to issue a decision.
Deputy Minister of Digitization Dariusz Standerski emphasized that the new regulations are not intended to serve censorship but to enforce the law in the digital environment. He drew an analogy to the offline world: an illegal poster would be taken down from the street, and it should disappear just as easily online. He also referred to the falsification of documents: forging an mID is treated the same as forging a plastic ID card and carries criminal liability, regardless of the perpetrator’s age.
Attorney Tomasz Bukowski added that platforms currently operate “between the rigor of the press and telecommunications” regimes, which causes chaos over who is liable for published content. The DSA implementation is intended to settle this problem once and for all.
The panel emphasized the growing problem of hate speech. Magda Rozenek-Majdan spoke about her own experience: “I’m fed up with reports to which platforms respond: ‘It doesn’t violate our standards,’ while three days later the same content is removed for someone else because the algorithm worked.” She also cited the claim that “10% of Meta’s revenues are scam advertisements” – a figure she called hard to imagine and one that undermines confidence in the effectiveness of moderation.
NASK specialists pointed out that many reports concern content harmful to children, and that a large share of such reports go unanswered by the platforms. Trusted flaggers, whose work is to receive stable funding, are expected to significantly speed up the removal of illegal material.
Implementation regulations
Critics, including PiS MP Dariusz Matecki, warn about the risk of political control of the Internet and arbitrary determination of “what is true.” Standerski replies that the mechanism applies only to illegal content, and the path of decisions and appeals is regulated.
The draft amendment specifies the catalog of materials subject to blocking. The procedure will cover content related to 27 prohibited acts in the Penal Code, including: punishable threats, inciting suicide, praising pedophilic behavior, promoting totalitarian ideologies, inciting hatred and insults based on nationality, ethnicity, race or religion. The orders may also cover: recruitment for human trafficking, recording the image of a naked person or during sexual activity without consent, presenting pornographic content to minors under 15, sending false alarms, computer fraud, as well as illegal online sales of tobacco products, e-cigarettes and nicotine pouches. The list also includes copyright infringements and content related to the illegal sale of goods or provision of services.
According to the assumptions, each service recipient will be able to report questionable content, and the authorities will issue an expedited decision to block the material. The government argues that this is necessary to actually enforce the law on the Internet; opponents warn against abuse. The dispute over the boundary between freedom of speech and platform responsibility is therefore expected to move from debate to practice with the entry into force of new regulations.
