Online households and digital gossip
The battle for privacy is less about secrecy and more about reclaiming power in a world run on data.
“Gossip is no longer the resource of the idle and of the vicious, but has become a trade, which is pursued with industry as well as effrontery,” wrote Samuel Warren and Louis Brandeis in their seminal 1890 essay, “The Right to Privacy”.
Warren, a Boston aristocrat, had grown frustrated with the increasingly invasive journalists who had taken an interest in his private life. Sexual relations, intimate details, and coverage of family funerals and weddings were being laid out for the rest of society to see, feeding a growing appetite for sensational scandal.
Warren and Brandeis pointed to new technologies as the source of their woes – instantaneous photography and expanding newspaper circulation. They spoke of an industry forming around gossip and invasion into individuals' private lives. Existing property rights, they argued, set a precedent for rights over privacy, but were no longer adequate to defend individuals against these novel threats to personal dignity. They called for additional laws, a “right to be let alone”, to provide more coverage.
“In every such case the individual is entitled to decide whether that which is his shall be given to the public. No other has the right to publish his productions in any form, without his consent,” they wrote.
In the decades since, legal protections have been introduced. Standards now regulate how user data is handled, companies are compelled to disclose what kinds of data they collect, and there are limits on what can be collected and in what context. Yet reading Warren and Brandeis’ essay, even now, it feels like we are still fighting the same battle.
The private space
On Thursday, I took part in an X space hosted by Arcanum Ventures. I’ve been on a few of their spaces recently and they always spark interesting discussions. This particular space was focused on privacy – a subject I’ve come to see as an essential component in improving social organisation.
The initial question, “What’s one app you know is violating your privacy but you can’t live without it?” is meant as a fun icebreaker, a way to introduce ourselves beyond our credentials. But as I pause to think about it, and the other guests’ answers come in, I realise, once again, the extent of the digital privacy problem.
In my own mind, I think through the apps I use and their overarching patrons. Google, whose Docs app I’m using right now to write this newsletter draft, feeds into a lot of my online interactions. Meta has insight into my online social life (and possibly my offline one – the targeted ads are a bit too correlated). If I go to certain relatives’ houses, Amazon’s Alexa gets involved … in fact, now that I think about it, any app using AWS funnels data to Amazon. Then there are the new ones, the OpenAIs of the world, with their ever-so-friendly and helpful chatbot apps and plug-ins.
While my inner monologue is running through answers, I tune into those coming up in the space itself – cars are another data collection point, says one guest, through their GPS. And what about banking apps? suggests another. The tidal wave of data collection practices in my mind continues to shoal.
Data collection has become the default for our digital lives. As with gossip in Warren’s time, an industry has formed around it. Companies now know our personal details, our browsing history and our online behaviour. Many of them own what we input on their platforms and can make use of it. Often, the data they harvest from our interactions is worth more than the products they’re trying to sell.
And yes, it comes with convenience. It might be easier for me to access websites, continue a train of thought through my search history and get directions quicker to recently visited addresses.
Sharing data can lead to a more personalised experience of online services. Only through the storage and tracking of my data can Spotify create my “for you” playlists, Airbnb surface the holiday rentals I fall in love with and Amazon give me helpful suggestions for products I didn't know I needed.
Now algorithms and AI can know us better than we know ourselves. As I discussed with Lane Rettig, it opens up some interesting avenues for AI agent development, one being the creation of proxy voting agents that can align with users' revealed preferences and vote for their interests.
But in a world where data collection is a default, instead of privacy and consent, that kind of data sharing could quickly lead to negative outcomes.
One exchange from the space in particular highlights this trade-off. One guest explains that in Russia all cars are, by law, required to have GPS. (I have since checked this; I think he was referring to a mandate announced by the Russian government in June 2025, under which all foreign nationals are required to sign up from September to a geolocation app that shares their location data with the state.) He brings it up as an example of the surveillance of a totalitarian government that, thankfully, much of the global north-west doesn’t face.
“In the US, that kind of totalitarian control comes with a smile,” counters another guest. Even without a legal mandate to share data, it is very inconvenient, and frankly downright unappealing, to do otherwise.
Back doors and age verifications
Living in the UK, the stark reality of digital privacy has increasingly come to light.
In February it was leaked that the UK government had demanded access to Apple users’ data through the creation of a back door in the tech giant’s encryption software. The move was made possible by the Investigatory Powers Act 2016, which allows the UK government to force companies to hand over users’ data. Companies that make such demands public can face criminal charges for unlawful disclosure.
More recently, the Online Safety Act imposed additional rules, including age verification using identity documents and facial recognition for certain websites, and increased social media blocks on “harmful content”. While the stated intention, protecting children, is a good one, some have seen it as an overreach of power and a new vulnerability should the websites be hacked.
In the days following the rollout, adult internet users reported difficulties accessing many forms of content, and some, citing blocks triggered by certain emojis or hashtags, said the broadly applied rules verged on censorship. The measures also proved easy to circumvent: ethical hackers showed Sky News that they could quickly bypass the restrictions by turning on a VPN.
Peter Kyle, the UK Tech Secretary, has said this is just the first step and that “nothing would be ruled out” in terms of additional measures. These could include social media time limits and outright bans for under-16s.
When the news about Apple came to light, I interviewed Danny O’Brien, Senior Fellow at the Filecoin Foundation and a journalist who protested against the Investigatory Powers Act at the time of its introduction. Our conversation went through the security issues a back door into encryption could bring and how governments around the world had, in the past, used events like terrorist attacks and instances of criminal activity to get laws like the Investigatory Powers Act to pass.
“The challenges with laws like this is that they are, on their face, appealing because they appear to solve a problem that law enforcement or states have. But they're incredibly intrusive,” he said.
The interview came to mind because when I asked the space, in the context of the Online Safety Act news, “How do we balance online safety and privacy?”, one of the guests, Alex Linton of Session Foundation, said a similar thing:
“The core to this issue is that anytime that you're introducing a policy that could negatively affect people's privacy, this is going to come with negative safety consequences. Privacy inherently is going to increase people's safety. And so when we're coming up with all of these, some might describe them as harebrained schemes, I think that that is a core consideration. We are stripping away privacy, and we're kind of forgetting the benefits that it holds, the inherent benefits that it holds, and that it should be a core part of any digital safety policy.”
Redefining the public and private
When Aristotle distinguished the household (oikos) from the public sphere (polis), he acknowledged that both were important: the private household forms the foundation for public life, necessary for social flourishing.
As society has evolved, and technology with it, the public sphere has been brought further and further into our private lives. It now lives on our computers and on our phone screens. It has made us more connected than ever, creating an online environment where communities can form, coordinate and flourish.
The private sphere, now centred around the individual rather than a full household, could likewise have its digital replication. But the line between public and private has been blurred further by a lack of protections around personal data, and a lack of individual control over who gets to see what data, and when. Corporations and governments, like the “yellow press” of Warren’s era, collect our data as the digital gossip of our time – poised to use something that is ours for their gain. The “right to privacy” has dissolved into the premise of “why do you need privacy if you have nothing to hide?” And there isn’t any real incentive for them to stop.
“We're already wildly out of balance, skewed towards surveillance,” said “Seth for Privacy,” vice president of Cake Wallet, a privacy-focused wallet, during the space. “The Internet is already broken. The surveillance paradigm already exists. Personal privacy is very rarely achieved.”
Some internet users have opted for complete pseudonymity. Even on the space a number of participants had dialled in under their online personas.
Luckily, the individuals who don’t want to live their online lives under an alias do have options – increasing numbers of them as research into cryptographic technology evolves. When I spoke to David Chaum, the “godfather of cryptocurrency,” and an early cryptographer, he said “The only way to create enforceable rules or structure in the informational world is with cryptography [...] Cryptography could be used to empower people with respect to their informational lives in a way that protects society.”
Encrypted tools allow users to conduct their online lives with privacy. Guests on the space suggested using browsers like DuckDuckGo, email providers like Proton Mail and tokens like Monero (although even Monero is facing threats to its continued privacy focus). Signal is being used, even in totalitarian regimes, to allow for private communications.
Cryptographic tools like zero-knowledge (ZK) proofs allow an individual to prove things about their identity, such as their age, without giving up the underlying sensitive information. Smart contracts built on such tools can also give individuals control over what specific data they share, and for how long.
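To make the "prove without revealing" idea concrete, here is a minimal sketch of the Schnorr identification protocol, one of the classic zero-knowledge proofs (not the specific scheme any company mentioned in this piece uses). The prover convinces a verifier that she knows the secret behind a public value without ever transmitting the secret itself. The group parameters are toy-sized for readability; real deployments use roughly 256-bit groups.

```python
import secrets

# Toy parameters: g = 2 generates a subgroup of prime order q = 11 mod p = 23.
# Real systems use large standardised groups; these values are for illustration.
P, Q, G = 23, 11, 2

def keygen():
    x = secrets.randbelow(Q)      # the secret (never leaves the prover)
    y = pow(G, x, P)              # the public value y = g^x mod p
    return x, y

def prove_commit():
    r = secrets.randbelow(Q)      # fresh one-time nonce
    t = pow(G, r, P)              # commitment sent to the verifier
    return r, t

def prove_respond(r, x, c):
    # Response blends the nonce and the secret; on its own it reveals neither.
    return (r + c * x) % Q

def verify(y, t, c, s):
    # Accept iff g^s == t * y^c (mod p), i.e. g^(r + c*x) == g^r * g^(x*c).
    return pow(G, s, P) == (t * pow(y, c, P)) % P

# One round of the protocol
x, y = keygen()
r, t = prove_commit()
c = secrets.randbelow(Q)          # verifier's random challenge
s = prove_respond(r, x, c)
print(verify(y, t, c, s))         # True: knowledge proven, x never sent
```

The verifier learns nothing about `x` beyond the fact that the prover knows it; age-verification schemes apply the same principle to statements like "over 18" rather than to a raw secret key.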
“We're dealing with these very rotten and broken down rails that are super hard [to work with] even from a safety perspective and it's even harder to embed privacy on it,” said Lasha Antadze, co-founder of Rarimo, a ZK-based digital identity company. “We need to first switch to complete empowerment of the user from a privacy tooling perspective and then we can reverse engineer all the safety aspects.”
Private futures
So why do I bring up privacy on a newsletter that focuses on building new societies?
In an environment where privacy isn’t the default, there is no space to think outside conformity. Algorithms show us the news and products that align with corporations’ interests or specific political agendas, narrowing our perspectives and stifling dissent.
Trust may erode, leaking into social ties and potentially fragmenting friendships, professional connections and community bonds. Individuals may self-censor, or slip into social apathy, withdrawing from political participation and avoiding holding those in power accountable, weakening the processes that sustain democratic governance.
Without clear boundaries between what is public and private, we risk recreating environments where power is skewed away from the individual and surveillance becomes normalised.
If our societies are going to be built for the true internet era – a world of increased connection, knowledge and empowerment for all – individual privacy should form a cornerstone. Not because we have something to hide, but because true participation demands space to think and connect without the glare of constant monitoring.
Food for thought
I’m going to do a really shameless plug here and direct you towards a previous article I wrote on privacy when my newsletter was under the Digital Frontier umbrella. I explore the idea of privacy within the data economy, establishing the trade-off between convenience and privacy. They recently took down the paywall so you can browse the full article at your leisure.
If you want something a bit longer to read on the subject, I really recommend Winn Schwartau’s book, Metawar. Within it he explores the role of data collection and algorithms in distorting reality and belief systems, proposing a framework that preserves human individuality and free thought.
Another good book on the subject is Privacy is Power, by Carissa Véliz. She was recently on a podcast with Roger McNamee and Carole Cadwalladr, talking about Big Tech’s data collection, surveillance and power in the digital age, which is well worth the watch.
The AI Ethics Brief also had a great newsletter on data collection and human agency the other week, exploring how digital privacy may be affected as the world shifts towards AI agentic systems.