When Privacy Becomes a Question of Culture

A human face dissolves into digital patterns — a visual echo of the question: What happens to our identity when we are read as data?

A new surveillance law is being rushed through the Danish Parliament. But what happens to culture when your voice is read as data and your silence becomes a habit?

Imagine a room where you’re always being watched. Not by eyes, but by systems. A gaze without pupils, but with access to your dreams, your searches, your doubts. Perhaps this is no longer fiction. Perhaps it’s our present.


The Danish government is preparing to grant the national intelligence agency unprecedented powers to analyze the digital behavior of its citizens. A new PET law is on the table, and with it: the ability to collect, monitor, and interpret citizens’ data using artificial intelligence and pattern recognition.


Surveillance’s New Face

The think tank Justitia has issued a press release and an open letter—joined by a broad alliance of experts, organizations, and civil society actors—warning against the proposed law. Signatories include the Danish Society of Engineers (IDA), the Danish Union of Journalists, Amnesty International, Djøf, Danish PEN, Danish Media Association, and Transparency International Denmark.

They raise concerns that the proposal could introduce a digital surveillance regime that penetrates deeply into citizens’ private lives—without the necessary democratic scrutiny of what such a system means for freedom, rule of law, and cultural expression.

AI and the Interpretive Gaze

Artificial intelligence is no longer science fiction. It’s embedded in our daily lives—and in this bill, it plays a lead role. Not as a neutral observer, but as interpreter. As an algorithmic reader, attempting to define the indefinable: the contradictory, intuitive, and ever-changing human experience.


Justitia warns that these systems could gain access to people’s most intimate information. Not just to store it—but to assign it meaning. What happens when your actions aren’t just saved, but analyzed? When a system begins to see your decisions as signals—and your silences as suspicion?

We are becoming streams of data. But where is the line for who gets to read us? And what does that mean for our right to change our minds, to make mistakes, and to act without having to justify ourselves?

The Inner Gaze and the Habit of Silence

A 2024 report from the Danish Institute for Human Rights shows that more and more Danes—young and old—are holding back from expressing themselves publicly. Not because they’re forbidden to speak, but because they hesitate. Because they anticipate being watched.

Self-censorship doesn’t require orders. It grows out of expectation. An internal gaze that calibrates itself to what an algorithm might infer. It’s not just a social adaptation—it’s a cultural shift in how we exist as humans.

The Danish Data Ethics Council emphasizes that generative AI challenges our notions of autonomy and dignity. Machines that mimic human language, expressions, and behavior risk redefining what is perceived as “normal.” The danger is that the human form begins to mirror the machine—not the other way around.

The Nerve Fibers of Democracy

In the essay “Artificial Intelligence in the Welfare State” (Kunstig intelligens i velfærdssamfundet), part of the Danish Magtudredningen (Power and Democracy Study) coordinated by the Department of Political Science at Aarhus University, Helene Friis Ratner, associate professor at DPU, and Rikke Frank Jørgensen, senior researcher at the Danish Institute for Human Rights, argue that surveillance changes the citizen from within. Not through force or censorship, but through adaptation. We begin to see ourselves through the eyes of the state before we speak, write, or act.

This is what makes the debate about the PET bill so vital. Not everything that can be surveilled should be. Not everything that can be interpreted ought to be. A free society is not defined by the absence of threat, but by the presence of trust—and trust requires space to be unseen, unpredictable, and unknowable.

Perhaps that is the freedom most under threat: the freedom to be someone, without being something in a system.
