Protecting children’s minds: what the Digital Fairness Act must name

12 May 2026

On 28 February 2024, Sewell Setzer, fourteen years old, took his own life in Orlando after several months of attachment to a conversational companion built by Character Technologies. In May 2025, Judge Anne Conway denied these architectures the protection of the First Amendment and allowed them to be treated as a defective product. Three years earlier, the London coroner Andrew Walker had recognised the role of Meta and Pinterest architectures in the death of Molly Russell, herself fourteen. Two rulings, in two jurisdictions, reach the same conclusion: attentional architectures deployed on minors can kill them. Europe has the cases, the reports, the legal texts. What it lacks is the doctrine that names the fundamental good these architectures impair. The Digital Fairness Act, which the Commission will table in the fourth quarter of 2026, is its last window to formulate it.

The reality of the problem is now established, and we must stop treating it as moral panic or a matter of parenting habits. The report submitted to the President of the French Republic on 30 April 2024 by the commission co-chaired by the psychiatrist Amine Benyamina and the neurologist Servane Mouton, under the telling title À la recherche du temps perdu, synthesised two years of scientific literature and framed screen addiction as a major public health issue. Benyamina’s call to “change the economy of capture” sets the terrain. The PISA 2022 data show that two-thirds of European pupils report being distracted by their devices during class. The OECD report of May 2025 adds a decisive dimension: children from working-class backgrounds are significantly more exposed to retention architectures, with less mediation and fewer alternatives. What we used to call the digital divide is changing in nature. It is no longer a divide of access. It is a divide of attention, and ultimately of citizenship.

The ByteDance asymmetry reveals the political nature of the problem. The same company caps Chinese minors’ time on Douyin at forty minutes per day, imposes a nighttime curfew and an educational filter, and simultaneously exports infinite scrolling to European children, through TikTok, with none of these protections. This is an admission, by the operator itself, that the protective standard it knows to be necessary is precisely the one it refuses to deploy wherever the law does not compel it. Beijing invests in the cognitive capital of its youth while Europe leaves its own to the optimisation strategies of foreign operators. The French sequence opened by the Attal-Rufo joint piece of April 2025, formalised by the Miller law adopted in first reading in January 2026 and underpinned by the work of the Benyamina-Mouton commission, provides the first political response to this asymmetry. But without a European doctrine to carry it, it will remain an isolated initiative, and the single market will fragment into as many regimes as there are member states resolved to protect their children.

The GDPR protects the personal data of the child. The Digital Services Act regulates the content to which the child is exposed. The AI Act prohibits subliminal techniques. Three considerable edifices. None has named what is actually at stake for subjects in formation: the alteration of their cognitive faculties by the design of the architectures themselves, independently of the lawfulness of the data and the content. An algorithm that pushes perfectly lawful content to a ten-year-old, for the sole purpose of extending the session, violates no norm currently in force. The problem is not what the child sees. It is what the architecture does to the child’s capacity to see.

This gap is not technical. It proceeds from a juridical grammar that still treats minors as vulnerable consumers rather than as subjects in formation. The difference is decisive. The vulnerable consumer is to be protected against abuse: content is moderated, targeted advertising is forbidden, transparency is imposed. The subject in formation is to be protected within the very process of his or her constitution: it is guaranteed that the faculties of attention, judgement and deliberation are not formatted by an architecture whose mechanisms and purposes the child neither understands nor controls. It is precisely this passage, from the vulnerable consumer to the subject in formation, that the Digital Fairness Act must accomplish.

This passage has a name. Cognitive sovereignty designates the right of the subject in formation to have his or her faculties of attention, judgement and deliberation kept beyond the reach of the optimisation strategies of operators whose interests diverge from his or her own. It fits without rupture into the European matrix of personal integrity, built in three successive layers: physical, moral, and psychological, this last recognised by the European Court of Human Rights in Bensaid v United Kingdom of 6 February 2001. A fourth layer, attentional, proper to subjects in formation, extends the same logic.

The historical precedent illuminates the gesture. In the nineteenth century, European democracies posited that the body of the child could not be the object of an economic transaction, even with parental consent, because the harms done to the working child could not be repaired by the future consent of the adult he or she would become. The same logic applies today to the cognitive faculties, which form during childhood and which capture architectures alter durably, to the point that the future adult no longer fully possesses the very faculties through which he or she could have consented or sought redress.

Inscribed in the recitals of the Digital Fairness Act and anchored in Articles 1, 24 and 38 of the Charter of Fundamental Rights of the European Union, this category would give the text a scope comparable to that which the GDPR gave to data protection: not one more technical instrument, but a principle that restructures everything that comes after. The Commission’s draft will be tabled in December. The parliamentary phase will run through 2027. Once adopted, the conceptual architecture will be fixed for the decade. Sewell Setzer was fourteen. Molly Russell was fourteen. The law had no word for what was happening to them. The children who are ten today will not wait for one to be found.