Why Does Privacy Matter for (Digital) Democracy?

Reading time: 17 minutes

Privacy is a physiological need for social animals. It is more than a human right. This is part of what we discussed in our interview with Dr. Milan Stankovic, who holds a PhD in Artificial Intelligence (AI) from Sorbonne University in France and is a co-founder of Blindnet, a B2B (business-to-business) startup that seeks to bring privacy back to the internet using open-source technology.

This article contains the key points from our interview, conducted on June 24, 2022, which you can find on YouTube or your favorite podcast app (search for Episode #3).

Quick Summary

Personal frustration can be the motivation necessary to start a business that solves a problem, like privacy in the digital world.
The survival of the internet as we know it will depend on building trust by restoring and rebuilding privacy on the internet.
Artificial intelligence will not replace humans but will be a tool that augments the capabilities and enhances the performance of humans.
The rise of technology has impacted and accelerated the decline of democracy, but technology can be a tool that helps negotiate better social contracts.  

Personal frustration can be a powerful motivator to start a business that tackles digital privacy.

Frustration. It can be a very powerful motivator to start a business. Dr. Milan Stankovic was frustrated that the French government decided to hire a company to develop an artificial intelligence tool that would estimate citizens’ spending based on insights inferred from social media and the internet, and then match those estimates against their actual tax records. If there was a discrepancy, the taxpayer would be asked to pay the difference.

An example of such a situation would be a taxpayer being “caught” with a picture of a Ferrari on their social media. The government could infer that the person is a high spender and that their taxes were perhaps under-reported, when in reality the Ferrari most likely belongs to someone else and the person simply posed with an expensive car.

Dr. Milan Stankovic was frustrated with the government’s decision to use AI to invade people’s privacy. He, along with two other co-founders who happen to be his classmates from his undergraduate studies in Computer Science, joined forces to establish Blindnet in 2021, with financial backing from a US-French venture capital (VC) fund.

The first iteration of the internet was about connectedness.

Dr. Stankovic reminded the listeners that when the internet was invented over 50 years ago, and the web over 30 years ago, the focus was on connectedness, without regard for what would happen with the data that was shared. Now that we have collectively felt the consequences of the lack of privacy on the internet, new regulations have been passed to govern how companies handle data online: in the European Union (EU), the famous General Data Protection Regulation (GDPR); in the United States of America (USA), laws such as the California Consumer Privacy Act (CCPA); and similar measures in other parts of the world.

These initial steps are important. The rise of regulation has given rise to new businesses, such as Blindnet and Plausible Analytics (you can check out the interview we did with them here), both open-source companies that seek to address privacy issues in their respective domains.

The future of the internet is a privacy-first and privacy-by-design internet.

So far, the two companies we have interviewed at Lifestyle Democracy are open-source. When I asked Dr. Milan Stankovic why they chose to be open-source and whether they fear competitors would steal their code and start their own companies, he responded that going open-source was a strategic decision, also agreed upon with their investors. The reason was to build trust with customers. Without an open-source approach, customers would be suspicious.

With regards to competition, software companies always have the next update in mind, and this is how successful ones ensure their competitiveness. This does not mean purposefully withholding innovation and development. Rather, while developing the existing software, the developers learn what they can “do better” in the next iteration.

An additional advantage for the companies writing the open-source code is internal expertise. Even if the code is open source, and in theory other developers could use it to start their own companies, they do not know the full timeline and roadmap of the planned features. They do not know the “soul” of the code. Expertise is a major advantage of open-source companies.

When it comes to the future of the internet, for both Blindnet and Plausible Analytics, privacy is something that should come first and be part of the software’s design. It should not be an afterthought.  

Blindnet provides software components that aim to give users control over the full spectrum of the data they share, while also providing an audit of the data that is shared, information about who manages it, and proof of compliance. Many websites we frequently use rely on numerous third-party providers.

For example, the simple act of shopping online involves several third-party providers: the website interface where the products are displayed (e.g., powered by WordPress or Wix), payment processors (e.g., PayPal, Stripe), shipping companies (e.g., FedEx, UPS, DHL), and so forth. All of these third parties take pieces of the data that the user has provided.

Today, it is difficult for a user to know which data is stored where and how it is managed. Regulations such as the GDPR mandate how companies are to handle user data, but the challenge with the GDPR is enforcement and compliance.

Complying with GDPR regulations is complicated and expensive without sophisticated tools and methods, such as those Blindnet has started developing. Users have no standardized way of knowing whether a website is GDPR- or privacy-compliant. Nowadays, most websites have cookie banners: intrusive notices informing users that the website uses “cookies” in order to “function” and “collect user data.” Under the GDPR, users have the right to give “informed consent,” but some websites, like Google or YouTube, are too big to be avoided, and they practically “force” users to accept the terms of their privacy and cookie policies.

Dr. Stankovic stated, “A crucial part of connection is privacy. When you connect to someone, you also want to regulate the connection. You need to have control over when the connection ends.”

As Dr. Stankovic explained, having control of the connection means being able to remove oneself from a discussion, or to initiate or end a friendship.

An internet without privacy will be the end of the internet as we know it.  

Dr. Stankovic made a bold statement: the internet may not be around if the question of privacy is not resolved. Lack of privacy will breed further distrust, and “if people distrust something they will find a way around it.”


He argued that given the natural need for privacy, for the internet to “survive,” it must have privacy built in.

Even though it is hard for us to imagine a world without the internet, dependent as many of us are on it for communication, work, and entertainment, Dr. Stankovic pointed out that the internet has been around for only about 50 years (while humans have been around for hundreds of thousands of years), and that we could create a new, better internet, with privacy built in, should the current one become dysfunctional.

Dr. Stankovic cautioned that the Covid-19 pandemic is a reminder that the “unthinkable can happen.” This reminds me of the concept of black swans, coined by Nassim Nicholas Taleb in his book The Black Swan: The Impact of the Highly Improbable, which you can get here from Bookshop.org or Amazon (these are affiliate links; we will get a small commission that helps support our work, at no cost to you).

Black swans are highly improbable, very rare events that have a disproportionate impact on society. Financial markets, for example, are highly susceptible to black swan events: the occurrence of an unlikely event can make stock values swing significantly, as happened during the Dot-Com Bubble in the early 2000s, during the Great Recession of 2007/2008 caused by the subprime mortgage lending crisis, or, more recently, with GameStop stock, which rose in price by over 1,900% and then collapsed. A few made a lot of money on the GameStop craze, even though GameStop’s future as a company remains uncertain in the face of rapid digital and technological change.

Trust can help build a better internet; distrust can make it dysfunctional.

A study Blindnet conducted found that when users feel safe and trust a service provider, they are willing to share data that helps the company do a better job. When the opposite is the case, users give wrong information that may lead the companies in question to draw wrong conclusions or insights.

Sometimes users use a particular service not because they trust it, but because it is better, cheaper, or easier to use, or because there is no alternative. In such cases, users try to figure out ways around it.

In the words of Dr. Milan Stankovic, one of the biggest challenges with the internet today is how to restore and build trust, to avoid having a mass of people trying to destroy it or make it dysfunctional.

In the pursuit of digital privacy, we also coined some new terms…

In the spirit of making the internet more privacy-first and privacy-by-design, we discussed the challenges faced by the current and future generations.

For the current generations, it is too late to ensure that data already shared is returned or safely deleted, or kept out of reach of criminals and fraudsters.

During the interview, we coined the following new terms regarding how people use the internet.

PRIVACY AWARE: The younger generations who grew up with the internet and are aware of the importance of using privacy-first or privacy-by-design tools on the internet. They are also aware of the dangers of a lack of privacy on the internet, but they may not have easy access, the will, the financial means, or the technical know-how to master or develop the tools required for a private experience on the internet.
PRIVACY AGNOSTIC OR PRIVACY INDIFFERENT: Typically the current older generations, who did not grow up with the internet and have limited or no understanding of the importance of privacy on the internet. They are not aware of tools that could help them protect their privacy online, nor would they know how to use them effectively without some training and guidance.
PRIVACY SAVVY OR PRIVACY RESPONSIBLE: These are typically the younger generations or even those more senior, who are motivated and willing to find, invest, learn, and use tools that help them protect their privacy on the internet. These are the “doers” and “self-learners.” They are not afraid to search for answers online. They are curious and proactive.
PRIVACY ENABLERS: These are the few pioneers like Dr. Milan Stankovic who embarked on the journey to build businesses and organizations that would solve the issue of privacy on the internet. This would also include those who create or shape policies that make it possible for companies like Blindnet or Plausible Analytics to become relevant and important in helping companies be compliant with regulations, but more importantly protect privacy and build trust.

“Digital Democracy is Digitally Negotiated Harmony”

Dr. Milan Stankovic understands digital democracy as “digitally negotiated harmony.” His belief is that at the core of democracy lie collectively negotiated social contracts. Core to healthy (digital) democracies is the process of negotiation among different communities on the internet.

Although the rise of technology (especially social media) has amplified the crisis of democracy, technology can be used to solve it. But technology alone is not sufficient. Just as with the traditional institutions associated with democracy, such as parliaments, congresses, senates, and presidencies, on the internet there must be rules and processes, set by humans, for managing the negotiation of the social contracts central to democracy. If certain groups feel that other groups exercise more power and influence while their own interests and needs are undermined, this will cause tensions between the “haves” and “have-nots” (of power on the internet).

Technology and communication/negotiation processes should help regulate the power balance in creating and managing social contracts in democratic societies. We also discussed that although the internet and social media technologies seem like “great equalizers” in providing access to power and to different media outlets, not everyone’s voice from the “digital soapbox” travels equally far.

The challenge for developers is to work with social scientists to ensure that new and existing technologies help reverse the decline of democracies by building and maintaining trust in the process of negotiating social contracts.

A marble jar, the standard of trust and verification of privacy by design.

Dr. Stankovic referred to the idea of the marble jar, a concept he heard about in one of Brené Brown’s talks. Teachers used the marble jar in classrooms to encourage positive behavior among students: good behavior was rewarded by placing marbles in the jar, and bad behavior was punished by taking marbles out. The jar was a visual indicator of the students’ behavior.

I suggested that the idea of the marble jar could be used to measure the level of trust in a website. This is similar to the rating and review systems on many websites, but the difference is that the marble jar could combine objective and subjective indicators to measure the trustworthiness of a system or service on the internet. It could become a universal symbol and a standard measure of trust, verifying the degree of privacy protection by design.

The quantitative measure could be the level of compliance with governmental regulations concerning privacy and user data protection. For example, if a particular website involves 10 processes or works with 10 different third parties, and 7 of them comply with privacy regulations, then the website’s quantitative score would be 7 out of 10. Privacy compliance could be measured in real time with the support of AI technologies.

The qualitative score could come from the ratings given by the users of the system itself, similar to the review systems used by companies such as Amazon, Booking.com, or Airbnb.

The “marble jar” would combine the scores from the quantitative and qualitative assessment to come up with a combined score. The marble jar would be a visual and real-time indicator of the degree of governmental compliance and user trust. It would function like a credit rating system for sovereign debt, ranging from “AAA” (highest level of compliance and user trust) to “junk” (lowest level of compliance and user trust).
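To make the arithmetic concrete, here is a minimal sketch of the scoring scheme described above. Everything in it beyond the basic idea is an assumption for illustration: the function names, the 50/50 weighting between the two scores, the 1-to-5-star rating scale, and the cutoffs for the rating bands were not specified in the interview.

```python
def marble_jar_score(compliant_parties, total_parties, user_ratings, weight=0.5):
    """Combine a quantitative compliance score with a qualitative
    user-rating score into a single 0-10 trust score.

    compliant_parties / total_parties: how many of the site's processes
    or third parties comply with privacy regulations (e.g., 7 of 10).
    user_ratings: a list of 1-5 star ratings from users.
    weight: share of the combined score given to the compliance side
    (an assumed 50/50 split by default).
    """
    if total_parties == 0 or not user_ratings:
        raise ValueError("need at least one third party and one rating")
    quantitative = 10 * compliant_parties / total_parties      # 7 of 10 -> 7.0
    qualitative = 2 * sum(user_ratings) / len(user_ratings)    # 1-5 stars scaled to 0-10
    return weight * quantitative + (1 - weight) * qualitative

def rating_band(score):
    """Map a 0-10 combined score to a credit-rating-style band
    (cutoffs are illustrative assumptions)."""
    if score >= 9:
        return "AAA"
    if score >= 7:
        return "AA"
    if score >= 5:
        return "BBB"
    return "junk"
```

For instance, a site with 7 of 10 compliant third parties and user ratings of 4, 5, and 3 stars would score 0.5 × 7.0 + 0.5 × 8.0 = 7.5, landing in the “AA” band under these assumed cutoffs.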

We will see if this idea will be implemented in reality by Dr. Milan Stankovic and his team.

A world of efficient consumers or informed citizenry?

The world of the internet is a distracted place. Stefan asked Dr. Stankovic whether the giant tech companies that rely heavily on advertising revenues, such as Google and Facebook, are turning people into efficient consumers or informed citizens.

Dr. Stankovic recognizes merit in targeted and personalized ads. On one hand, they can simplify citizens’ lives and reduce the time spent “shopping” and “browsing.” On the other hand, trust in companies is essential; otherwise, as the Blindnet-commissioned research has shown, people may consciously provide false information. When there is trust in a company, users will share true information that helps it develop.

Artificial Intelligence will not replace humans, but it may create great power differentials.

Dr. Stankovic believes artificial intelligence will not replace humans. It will be a “tool” that augments the capabilities and performance of humans. We are now in the second wave of artificial intelligence: more research, more work, but also “more realism” about the limits of AI.

AI is a useful tool, but it is not a panacea that will solve all the world’s problems, as some futurists and computer scientists may believe.

The existence of social contracts is core to healthy and sustainable democracies.

One of the biggest threats of AI is the power differential it can create. The power differential can become threatening and, in the words of Dr. Stankovic, “[AI] may degrade relationships between individuals and larger society.”

He compared AI to the power of nuclear weapons. Nuclear weapons serve as a deterrent for nuclear-capable powers, but this requires appropriate protocols, processes, restrictions, and regulations to avoid a nuclear war that would destroy the world. The same applies to AI: the risks can be mitigated by putting the right protocols, processes, and regulations in place.

In summary, we should be optimistic about the impact of technology on democracy.

Dr. Stankovic is optimistic, but at the same time realistic about the limitations and opportunities of AI and the impact it will have on the state of democracies. When asked about the ownership of AI protocols and algorithms being concentrated in the hands of a few corporations or governments, he countered that “knowledge is [key] to fight these differences.”

In his mind, there seems to be very little room for a fatalistic scenario in which AI controlled by powerful governments or corporations overshadows entire societies.

The great equalizer in acquiring the knowledge [to create, manage, and control AI] is access to the internet, which gives people from all over the world a more level playing field in knowledge and skills acquisition than at any other point in human civilization.

Compared to the celebratory tone Peter Diamandis takes on the advancement and convergence of multiple technologies, such as autonomous vehicles, augmented reality, virtual reality, and AI, as illustrated in his book The Future is Faster Than You Think, available through Bookshop.org or Amazon (affiliate links; we will get a small commission that helps support our work, at no cost to you), Dr. Stankovic takes a more pragmatic view, aware of the limitations of these technologies and of their ability to replace human cognition.

Book Recommendations from Dr. Milan Stankovic

While discussing privacy with Dr. Stankovic, we recognized that privacy is important in building a connection. Strangers bond by sharing something very private, something that makes them vulnerable. However, that does not mean that private information shared publicly or widely on the internet will create trust among strangers. Uncontrolled, unregulated, and untrustworthy sharing of private information can make the internet dysfunctional.

On the topic of vulnerability, Dr. Milan Stankovic recommended reading Brené Brown’s books, such as The Power of Vulnerability. You can get some of the recommended books from our shop here (these are affiliate links; we will get a small commission that helps support our work, at no cost to you).

To learn more about Lifestyle Democracy and Blindnet

To learn more about the work we do, please feel free to explore our website. For the latest articles, check out the blog section; for practical recommendations on how to democratize your life, go to our resources section; and to support the work we do, click on the support tab or buy eco-friendly merchandise from our shop.

You can also check out the interview we did with one of the co-founders of Plausible Analytics, an open-source alternative to Google Analytics.

To learn more about Blindnet, visit their website, and to learn more about Dr. Milan Stankovic, check out his personal site.

For the full interview, please check out our Podcast or YouTube Channel.

Action Items

EDUCATE YOURSELF ABOUT DIGITAL PRIVACY: Continue following and supporting the work we do, and follow other great blogs and websites, like Restore Privacy, that provide actionable recommendations for taking back your privacy. It is not easy, but it is important to start learning why privacy matters and how it can be reclaimed. Stay on top of new regulations passed in your country.
USE OPEN-SOURCE TOOLS: Open-source software is shared publicly for other developers to review. This helps the overall state of software development, but it also helps build trust in the company distributing its source code. In this sense, by being “vulnerable” and sharing something “private,” companies can build trust. Dr. Stankovic uses several open-source tools, but he underscored two during the interview: Atom, an open-source text editor, and Plausible Analytics, for web analytics (just like we use here at Lifestyle Democracy).
SUPPORT PRIVACY-FIRST AND PRIVACY-BY-DESIGN SOFTWARE: When users adopt privacy-first or privacy-by-design software, its founders, as well as competitors, will feel compelled to continue investing as they see their proof of concept validated.
Hi! I am Stefan Ivanovski, founder of Lifestyle Democracy, a knowledge platform that empowers individuals and communities through sharing and teaching how to apply actionable democratic principles and practices, one day at a time. I am currently a PhD student at the School of Industrial and Labor Relations at Cornell University studying the democratization of ownership and management of companies that are shaping the future of work, especially those that rely on remote work and cutting-edge technologies such as artificial intelligence.

Want to Read More Articles Like This? Become A Member

Writing and researching articles takes considerable time and resources. Consider supporting the work we do so we can continue sharing lessons from around the world on how to apply democratic principles on a daily basis.

