
This article is written by Srilekha Raman of Tamil Nadu National Law University, an intern under Legal Vidhiya.
ABSTRACT
Personal data has become a core economic asset powering digital industries. This article explores the trend of data monetisation, where user information is harvested, analysed, and sold – often without informed consent. It critiques the illusion of choice in digital privacy, particularly affecting users with low digital literacy. Despite legal protections like India’s Digital Personal Data Protection Act, 2023 and the EU’s GDPR, gaps remain in regulating post-collection use. The article advocates for transparent consent frameworks, stronger enforcement, and ethical alternatives such as data trusts and privacy-by-design models to ensure user dignity, autonomy, and real control in a data-driven economy.
KEYWORDS
Data Monetisation, Privacy, Informed Consent, User Consent, Consumer Autonomy, Digital Rights, Personal Data, DPDP Act of 2023, EU's GDPR.
INTRODUCTION
Data monetisation refers to the process of generating revenue by leveraging user data – for example, by selling it, analysing it, or using it to target users. You might have come across apps that ask permission to share your data with third parties; this, too, is data monetisation. Google and Meta using behavioural data for advertisement targeting is another example.
But the real dilemma arises when user data is accessed and sold to third parties without the users' consent, in the name of enhanced customer experience, efficiency and innovation, thereby violating their right to privacy. Consumers usually have no say in whom their data is shared with or how it is accessed and used.
This violates the right to privacy guaranteed by the Constitution of India and by legislations like the Digital Personal Data Protection Act, 2023. This article discusses how consumer choice is undermined by the process of data monetisation and the legal framework governing this grey area.
WHAT IS DATA MONETISATION?
Data monetisation is the technique of using data to produce quantifiable financial gains. It entails turning unprocessed data into insightful knowledge that may be sold directly or utilised internally to boost company performance. There are two types of data monetisation: direct and indirect. [1]
Direct data monetisation means selling data or data-driven products to third parties. Indirect data monetisation, on the other hand, means utilising data internally to improve customer experiences, decision-making and operational efficiency. [2] In this type, data is used to improve customer satisfaction and thereby reduce costs.
PROCESS OF DATA MONETISATION
In today's digital economy, data is constantly being generated and extracted, often without the user's informed consent, and companies are monetising it in sophisticated and often opaque ways. First, companies collect the data, which falls into several categories:
- Personally Identifiable Information – Data such as names, email IDs, phone numbers, IP addresses, etc.
- Behavioural data – Details of how the user interacts with a website or app, such as what the user clicks on, scrolling habits, advertisements interacted with, etc.
- Transactional data – Purchase history, subscription patterns, payment modes used, etc.
- Location data – GPS, Wi-Fi and Bluetooth signals are tracked and used for geographically targeted advertisements.
- Inferred data – Collected data is sometimes analysed to predict political views, income level or even mental health status, which are then used to target and personalise advertisements.
Secondly, this collected data is monetised for multiple purposes. One is targeted advertising, where platforms offer advertisers hyper-targeted ad placements based on user behaviour and demographics; the more precise the targeting, the more they can charge. Platforms like Google and social media platforms like Instagram and Facebook operate this way.
Some apps and websites sell data directly to third-party brokers, who combine massive data sets to create and sell consumer segments. These segments are then used for everything from credit scoring to political campaigns. Other companies monetise data indirectly, using customer feedback drawn from usage and interaction to improve their products and services, which in turn drives higher profits.
In this way, every interaction of the user is turned into economic profit by companies. Whether it’s direct revenue through ads or indirect gains from product improvement, the value companies extract from personal data is immense. Yet, the average consumer remains largely unaware of just how much their digital life is being capitalised on.
CONSUMER CHOICE IN DATA MONETISATION
Customer data monetisation has grown in popularity as a business strategy for organisations across a variety of sectors. By analysing and selling data on consumer behaviour, interests and demographics, businesses can generate significant revenue and gain deep insight into their customers. However, using consumer data for financial gain raises moral questions about consent and privacy.
On paper, most digital platforms claim to give users control over their data. But in practice, the consumer’s choice is often limited, coerced, or misinformed. What looks like consent is frequently a product of design manipulation, information overload, or sheer necessity.
While the monetisation of data has created huge business opportunities, it also opens the door to serious ethical concerns – one of the biggest being how people themselves are turned into products. When personal data is treated like a commodity – bought, sold, and analysed purely for profit – it reduces individuals to just another resource for economic gain. What's worse is that the so-called consent given is often not truly informed. Most users don't read the long, complex privacy policies filled with legal jargon – they just click “I agree” and move on. Companies are aware of this and often design their platforms with subtle tricks (known as dark patterns) to nudge users into sharing more than they might be comfortable with.
This creates a clear imbalance of power. Big technology platforms know exactly what they're collecting and how it benefits them. Most users, on the other hand, don't have the same awareness or control. People who are less tech-savvy – like the elderly, children, or those without digital literacy – are especially vulnerable. Sometimes, they don't even have a choice: they can either give up their data or lose access to apps and services they need. This is not real consent – it's a trade-off many people are forced to make.
LEGAL FRAMEWORK AND GAPS
The legal community has been rushing to keep up with the growing concerns about consumer choice and data privacy. Although laws have been introduced in several nations to control how businesses gather and profit from user data, the extent, rigor, and enforcement of these frameworks vary greatly. Particularly when it comes to safeguarding users against exploitative monetization practices, the laws in many areas are either unclear, poorly enforced, or leave a lot of room for interpretation.
The Digital Personal Data Protection Act, 2023 (DPDP Act) is a long-awaited step in the development of a data protection legal framework in India. It establishes user rights like consent, correction, and data erasure, defines important terms like “personal data,” and establishes the Data Protection Board of India to ensure compliance. In an effort to bring businesses into line with international best practices, the Act also mandates that consent be obtained in a precise and unambiguous way prior to processing personal data.
Nonetheless, several concerns have arisen regarding its application and reach. For instance, the Act’s protective power is diminished because it permits the government to exempt specific organizations, such as state agencies, from its application under the pretext of national security. [3]
Additionally, there are no explicit regulations governing how businesses can profit from data in indirect ways (like behavioural analytics), and consumers are not given the legal right to data portability or to object to profiling, two rights that are thought to be crucial for gaining actual control.
On the other hand, the General Data Protection Regulation (GDPR) of the European Union, which has been in effect since 2018, is generally considered the gold standard for data protection. It grants users complete control over their personal information, including the ability to view, amend, remove, and limit its use. GDPR also mandates purpose limitation, i.e. data collected for one reason can’t be repurposed without consent, and data minimisation, which means companies can only collect data that is necessary. Due to the GDPR’s significant fines for non-compliance and requirement for explicit, affirmative consent before any data can be processed, many multinational corporations have been forced to restructure their data practices.
Similarly, although the United States has no single federal data protection law, states like California have enacted laws such as the California Consumer Privacy Act (CCPA) and its amended version, the California Privacy Rights Act (CPRA). Under these laws, customers have the right to know what data is being collected, request its deletion, and opt out of data sales. The now familiar “Do Not Sell My Personal Information” option was introduced by the CCPA and specifically targets the monetisation model that many digital platforms employ. In other regions of the United States, and in much of the developing world where data protection is non-existent or inadequate, customers remain exposed to unrestricted data mining.
A fundamental tension can be seen in the global legal landscape: although some laws are robust in theory, they are inconsistently enforced in practice. For instance, the DPDP Act’s enforcement mechanisms are still being operationalized in India, and the Data Protection Board’s ability has not yet been put to the test. Furthermore, a lot of current frameworks concentrate on the process of gathering data but give less consideration to how that data is later transferred, profiled, or monetized. Because of this regulatory blind spot, businesses can make money off user data in ways that are morally dubious but technically legal. [4]
Essentially, the existing legal systems are a patchwork – some strong, others antiquated or inadequate – and most laws are still catching up to rapid advancements in technology. Until there is greater clarity regarding the post-collection use of data and more robust protections against exploitative monetisation, the legal system will provide only a limited level of protection. What is required are frameworks that go beyond consent, prioritise accountability, guarantee user empowerment, and plug the gaps that let businesses treat data like an unbounded, unrestricted resource.
CONCLUSION
As data becomes the cornerstone of the modern digital economy, its monetisation continues to raise serious questions about privacy, fairness, and consumer autonomy. While businesses have unlocked immense value by leveraging personal data, this has often come at the cost of user awareness and control. Consent, in many cases, remains superficial – granted without full understanding or genuine alternatives. Legal frameworks like India's Digital Personal Data Protection Act, 2023, and the EU's GDPR represent important steps toward empowering users, but enforcement gaps, vague language, and limited accountability continue to undermine their impact.
The ethical dilemmas around data monetisation are no longer hypothetical. From manipulative advertising to algorithmic profiling and behavioural nudging, individuals are increasingly being treated as products rather than as rights-bearing participants in the digital ecosystem. Vulnerable groups, including children and digitally excluded communities, face the greatest risks in this imbalance of power.
Going forward, a fair and responsible approach to data monetisation must combine strong legal safeguards with ethical design principles. Transparent consent, data minimisation, and user-centric governance models like data trusts and cooperatives offer viable pathways. Ultimately, protecting consumer choice in the data economy isn’t just about regulation – it’s about restoring trust, dignity, and agency to the people behind the data.
REFERENCES
[1] Hexaware, What is Data Monetization? Examples, Strategies and Benefits, accessed July 19, 2025.
[2] Ibid.
[3] Express Computer, Enforcement Gaps in India's DPDP Act and the Case for Decentralized Data Protection Boards, accessed July 21, 2025.
[4] Bindu Janarthanan & Scott Warren, The Impact of India's New Digital Personal Data Protection Rules, Privacy World, accessed July 21, 2025.
Disclaimer: The materials provided herein are intended solely for informational purposes. Accessing or using the site or the materials does not establish an attorney-client relationship. The information presented on this site is not to be construed as legal or professional advice, and it should not be relied upon for such purposes or used as a substitute for advice from a licensed attorney in your state. Additionally, the viewpoint presented by the author is personal.
