MillenniumPost
In Retrospect

The data dilemma

Courtesy Edward Snowden, we learned that every email we sent, every chat message, phone call, and document shared in the 'cloud' is accessible to intelligence agencies of the United States and allied countries. With the world recently celebrating Data Privacy Day, it has become all the more important to understand how data powers innovation, shapes user experiences and fuels economies, while also raising critical questions about control, consent and security


The landscape of data, privacy and how personal information is collected and used by organisations is constantly shifting as technology, computational power and societal expectations evolve rapidly. Every click, transaction, and interaction contributes to the vast reservoir of information that fuels our interconnected world.

Today, we stand on the cusp of a new wave — where the exponential increase in data utilised by Artificial Intelligence (AI) models is set to transform our digital experiences and lead to entirely new digital products and services.

As governments worldwide strive to keep pace with rapid advances in technology and the innovative ways data is both generated and utilised in order to protect citizens’ rights, so too are businesses compelled to swiftly adapt their strategies and operational frameworks to adhere to the evolving rules of the game.

Privacy, in the context of digital interactions, is broadly defined as the ability of an individual to understand and control what personal information is being shared, with whom, and for what purposes.

However terse and strict this definition may be, in our present digital age it almost certainly does not do justice to the interplay at work, nor to the ever-changing conditions that lend the term its nuance.

Almost 12 years ago, we learned that every email we sent, every chat message, phone call, and document shared in the cloud was accessible to intelligence agencies of the United States and allied countries. This became known after Edward Snowden leaked thousands of documents from the US National Security Agency (NSA) published in some of the world’s most important media outlets.

Snowden, an intelligence analyst who worked for the CIA and the NSA, began his career believing his work was a patriotic duty to defend his country. Eventually, he realised he was part of a secret global surveillance system that was used to spy even within the country itself. He felt that the population of a democratic country should at least be aware that something like this was taking place, and so he leaked thousands of documents evidencing it to journalists, who published several reports in major global media.

These documents revealed the various ways the NSA collected, analysed, and then used this information for espionage operations around the world. The agency and its allies built up a global collection apparatus: tapping fibre-optic cables, operating out of embassies, spying on satellite communications, collaborating with the agencies of other countries, and mounting computer attacks.

All of this collected information was stored in data centres for later access through the XKeyScore system, which works much like an internet search engine, except that the searches are run on private information. The documents show that analysts could look up a particular person's emails, find out who uses encrypted email in a given country, retrieve passwords for online accounts, and much more: essentially a search engine over the private lives of billions of people, operating without any oversight.

PRISM is one of the exposed programmes that attracted the most attention, as it involves large internet companies such as Google, Facebook, Apple, YouTube, Microsoft, and Yahoo, among others. These companies have platforms that function as software as a service, also known as “the cloud.” When you share a document using Google Drive, you share it not only with your colleagues but also with Google. When you send an email using Outlook, Microsoft accesses that content. The same happens if you save your photos in Apple’s or Google’s cloud.

It is then reasonable to expect that these companies would have access to our information to provide us with the service. Some of us feared that the companies might take advantage of our information. What most of us didn’t realise was that our private communications were also under surveillance by intelligence agencies such as the NSA. What we learned at the time was that if you are not a US citizen and do not reside in the United States, the NSA can access data from these companies’ services to learn about you. We are talking about voice and video calls, emails, chats, documents, photos, locations and the like.

The collection of this type of information, combined with the NSA's analytical capacity, enabled spying operations on world leaders such as Angela Merkel, Enrique Peña Nieto, and Dilma Rousseff. Media operations to manipulate public opinion were also conducted, such as operation QUITO, which promoted across Latin America a view favourable to England's position on the Malvinas Islands.

Having said this, it is essential to reflect on what has changed since then. Have these programmes been eliminated, and is our privacy now more secure? It appears some things have improved while others have got worse.

A significant advance since the Snowden revelations is the adoption of end-to-end encryption. Unlike network traffic encryption, this allows us to protect the content of the information even from the company providing the service. If we encrypt a Gmail email, even Google can't read it.
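The principle can be sketched with a toy one-time pad in Python. This is purely illustrative and not the protocol any real messenger or email service uses; the point is only that when the key is held by the two endpoints alone, the provider relaying or storing the ciphertext cannot read the content.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each byte of the message with the corresponding key byte.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
# The key is shared only between the two endpoints, never with the provider.
key = secrets.token_bytes(len(message))

ciphertext = xor_bytes(message, key)    # all the service provider ever sees
recovered = xor_bytes(ciphertext, key)  # XOR with the same key inverts it
assert recovered == message
```

Real end-to-end systems replace the shared pad with public-key exchange and authenticated ciphers, but the guarantee is the same: the intermediary stores bytes it cannot decrypt.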

The development of free software applications that allow us to control our information is another great advance since then. There are applications for chat, email, collaboration, document editing, and more.

Recently, India, along with the world, celebrated Data Privacy Day. The theme for 2025 was ‘Take Control of Your Data’ and it couldn’t have been more apt for the times we live in.

The increasing use of digital services in day-to-day activities has also led to a rise in data security threats and the sharing of data with multiple platforms and organisations.

Data privacy means the personal and sensitive information of an individual is secured, and they are in control of its sharing, storage and usage.

As the fierce debate rages on about privacy, security and the ethical implications of data exploitation, India’s very own Digital Personal Data Protection Act (DPDP Act) attempts to navigate this complex terrain.

It introduces a comprehensive framework for safeguarding digital personal data. While the Act aims to balance innovation and privacy, it presents a formidable challenge for businesses, especially those operating in the consumer sector.

The DPDP Act identifies two key stakeholders. The first is the Data Fiduciary: any entity or individual that, independently or jointly, determines the purpose and method of processing personal data, and is subject to specific compliance requirements and penalties. The second is the Data Principal: the individual whose personal data is being processed.

In the context of mergers and acquisitions (M&A), the Data Principal refers to the individuals whose personal data is involved in the transaction, such as employees, customers, vendors, contractors, suppliers, and business partners of the target company. During the M&A transaction, the seller company is likely to be the Data Fiduciary, as it has control over the personal data of its stakeholders that is handed over to the acquirer company during the due diligence process.

M&A transactions typically involve managing substantial volumes of personal data belonging to the target company. During due diligence, the buyer often examines information related to data subjects, such as employees, customers, vendors, contractors, suppliers, and business partners (Data Principals).

If the seller or target company discloses this personal data during the process, it must comply with the consent requirements outlined in Rule 3 of the DPDP Rules. Rule 3 provides that the Data Fiduciary must give the Data Principal a clear notice stating the specific purpose of the processing, and obtain consent on that basis. This requirement can complicate the transaction for several reasons.

First, the sheer volume of data subjects involved may make the process logistically cumbersome and time-consuming. Second, issuing notices and securing consent might alert stakeholders to the potential transaction prematurely, potentially causing unrest among employees or uncertainty among customers and business partners.

Finally, there is the risk that some data subjects may withhold consent, limiting the seller’s ability to share critical information with the buyer, and thereby affecting the thoroughness of the due diligence process.

To mitigate risks associated with having to obtain consent from Data Principals and to provide the buyer with confidence, the seller can offer assurances through detailed representations and warranties, affirming that the target company complies with applicable data protection laws and has implemented adequate safeguards for personal data. Another option for the seller or target company is to provide the buyer with redacted or anonymised data sets during due diligence.

By removing or masking identifying information, the personal data becomes unrecognisable, thus falling outside the scope of the DPDP Act. This approach allows the buyer to assess the data's value and relevance without triggering the need for consent from the data subjects. However, this method may not be suitable for all types of data or transactions, particularly where detailed personal information is essential for evaluating the target company's assets or liabilities.
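A minimal Python sketch of the masking idea follows. The field names and salt are hypothetical, and it is worth noting that salted hashing of this kind is strictly pseudonymisation, not full anonymisation; whether a masked data set truly falls outside a data protection law depends on the residual risk of re-identification.

```python
import hashlib

def mask_record(record: dict, identifying_fields: set, salt: str) -> dict:
    """Replace identifying fields with short salted hashes; keep the rest."""
    masked = {}
    for field, value in record.items():
        if field in identifying_fields:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            masked[field] = digest[:12]  # opaque token instead of the raw value
        else:
            masked[field] = value
    return masked

# Hypothetical employee record shared in a due diligence data room.
employee = {"name": "A. Sharma", "email": "a@example.com", "salary_band": "L4"}
masked = mask_record(employee, {"name", "email"}, salt="deal-room")
assert masked["salary_band"] == "L4"       # business data survives for analysis
assert masked["name"] != employee["name"]  # identity is no longer visible
```

Because the same input always maps to the same token under one salt, the buyer can still count, join and deduplicate records without ever seeing the underlying identities.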

While AI-driven data analysis offers immense potential for innovation and improved customer experiences, it also raises significant concerns about privacy and security. The Supreme Court’s recognition of the fundamental right to privacy in the 2017 Puttaswamy judgment and the subsequent enactment of the DPDP Act underscores the need for a delicate balance between technological advancement and individual rights.

A report published by PwC last year was an eye-opener. Surveying over 3,000 consumers across the country, it found that 56% were completely unaware of their rights related to personal data.

Meanwhile, the survey of 186 respondents representing organisations found that only 9% of organisations have a comprehensive understanding of the DPDP Act.

Robust data security is essential and integral to protect personal data from unauthorised access, loss or damage. Businesses must implement technical and organisational measures to safeguard personal data.

From unencrypted tyre pressure sensors that continuously reveal our vehicle’s location to our cellphones leaking data even when we turn off all sharing and geolocation features, no personal detail is hidden from data brokers amassing information for advertising and other commercial purposes. Technology embedded in our phones, our computers, our cars, and our homes is part of a vast ecosystem of data collection and analysis primarily aimed at understanding and in some cases manipulating our consumer behaviour.

In hindsight, this flood of information is transforming the government's relationship with its citizens, in some cases obviously for the greater good: big data is driving improvements in public health, city planning, transportation, medicine and energy efficiency. But it is the invasion of privacy and the potential for misuse that are troublesome, and the threat is quite real.

The public remains in the dark about the new world of surveillance their government and their business communities have been building in tandem. It is thus important to understand the amount of data and control ordinary consumers are handing over every day to the people in power and in the process, raise critical questions about control, consent and security.
