This Article is written by Pritam Chandra Ashutosh, Narayan School of Law/Gopal Narayan Singh University (GNSU), an intern under Legal Vidhiya.
ABSTRACT
This article examines the legal framework governing the collection and storage of customer (personal) data. It surveys the evolution of privacy and data protection law, defining key concepts and the legal significance of customer data. We compare India’s new Digital Personal Data Protection Act, 2023 (“DPDP Act”) with the EU’s General Data Protection Regulation (GDPR) and California’s Consumer Privacy Act (CCPA/CPRA). Core principles – including consent and lawful basis for processing, data minimization, security and accountability, and cross-border transfers – are discussed in detail. The analysis incorporates examples of major companies (e.g. Facebook/Instagram, Google, Amazon) that collect large volumes of customer data, and highlights relevant enforcement actions and case law. By providing a comprehensive legal overview, this article shows how data-protection regimes converge on protecting individual privacy while balancing business and government interests.
KEYWORDS
Customer Data, Data Privacy, Data Protection Law, GDPR, CCPA, CPRA, DPDP Act 2023, Puttaswamy Judgement, Meta GDPR Fine.
INTRODUCTION
“Customer data” generally means information collected about individuals in a commercial context – often personal data identifying or describing a customer. In practice, customer data includes anything from names, contact details and purchase history to online identifiers, profile attributes, location data, and even sensitive details like health or financial information. Because customer data often includes personal data, businesses must comply with privacy and data-protection laws. Protecting this data is legally significant: it safeguards fundamental privacy rights and consumer trust, mitigates identity-theft risks, and ensures accountability for data breaches. A landmark Indian Supreme Court decision, K.S. Puttaswamy v. Union of India (2017), held that informational privacy is a fundamental right. Globally, nations have adopted data protection laws in response to concerns over surveillance, hacking, and misuse by large corporations. At the same time, modern businesses rely on customer data for targeted marketing and personalized services. Thus, regulatory frameworks aim to balance individuals’ privacy rights against legitimate business uses of data.[1]
Early privacy concerns trace back to Warren and Brandeis’s 1890 “right to be let alone.” After World War II, Article 12 of the Universal Declaration of Human Rights (1948) recognized privacy as a fundamental right. By 1980 the OECD Guidelines on the Protection of Privacy set baseline principles for fair data processing. Europe led formal regulation: the Council of Europe’s Data Protection Convention (1981) and the EU’s 1995 Data Protection Directive. In the U.S., privacy law developed sectorally (e.g. HIPAA for health, GLBA for financial data). The rise of the Internet and digital tracking prompted stricter laws. The EU adopted the General Data Protection Regulation (GDPR) in 2016 (effective 2018), setting a high global standard. In 2018 the U.S. state of California enacted the CCPA (effective 2020), soon strengthened by the California Privacy Rights Act (CPRA), approved in 2020 and effective in 2023.[2] In India, the 2017 Puttaswamy case compelled enactment of comprehensive data protection, culminating in the Digital Personal Data Protection Act, 2023. This historical evolution reflects a growing international consensus on core privacy principles.
GLOBAL DATA PROTECTION FRAMEWORKS: INDIA, EU, AND CALIFORNIA
In August 2023, the Indian Parliament passed the Digital Personal Data Protection Act, 2023 — the country’s first comprehensive privacy legislation applicable across all sectors. It has broad extraterritorial effect: it applies to any entity located outside India if it processes the personal data of individuals in India in connection with offering goods or services within the country.[3]
The Act’s scope is narrower than GDPR’s: it applies only to digital data (information collected in digital form), and excludes data that has been made public by law. Notably, unlike GDPR the DPDP Act does not recognize non-consensual lawful bases such as contractual necessity or “legitimate interests.” Instead, private data fiduciaries may process personal data only if the data principal (the individual) has given free, specific, informed consent, or the processing qualifies as a narrow “legitimate use” defined in the Act. Examples of legitimate uses include voluntarily shared data (with no objection), legal compliance, employment purposes, or emergency situations. Under the DPDP Act, consent must be “free, specific, informed, unconditional, and unambiguous,” aligning closely with the standards set by the GDPR. Even where consent was obtained, the collection of a data element later found unnecessary for the stated purpose is not validated by that consent.
The Act grants individuals rights akin to those in GDPR: rights of access, correction and erasure (including upon withdrawal of consent). It also introduces new rights: for example, a right to grievance redressal and the right to nominate someone to exercise one’s rights after death. In the event of a data breach, fiduciaries must notify the Data Protection Board and affected individuals, without requiring any specific harm threshold. The government can designate certain firms as “Significant Data Fiduciaries” (based on scale, sensitivity, or impact); such firms face extra obligations, including appointment of an independent data auditor and conducting periodic impact assessments.
The DPDP Act prescribes monetary penalties that can reach ₹250 crore (around €28 million) per instance for the most serious violations, such as failure to maintain reasonable security safeguards. Unlike GDPR’s turnover-linked ceiling (4% of global annual turnover or €20 million, whichever is higher), India’s fines are capped in absolute terms, but are still substantial. A Data Protection Board (a new regulator) will oversee enforcement and adjudicate disputes. The Board will have powers to investigate breaches, levy fines, and require remedial actions, though it will not have broad rulemaking authority. Importantly, the DPDP Act does not create a private right of action (in contrast to some state laws like the CCPA).[4]
EU GENERAL DATA PROTECTION REGULATION (GDPR)
The GDPR is an EU regulation (2016/679) that took effect in May 2018. It applies to all organizations processing the personal data of individuals in the EU, regardless of where the organization is based. GDPR defines personal data broadly (any information relating to an identified or identifiable natural person), and special rules apply to “sensitive” categories (health, race, etc.). The Regulation grants robust rights to data subjects—such as access, rectification, erasure, data portability, restriction of processing, and objection—while also establishing clear principles for data processing. It outlines key responsibilities, including the need for a lawful basis for processing (such as consent, contractual necessity, legal obligation, vital interests, public interest, or legitimate interests), transparency through privacy notices, implementation of data protection by design and by default, appointment of Data Protection Officers where necessary, and mandatory breach notification within 72 hours.[5]
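The 72-hour notification window runs from the moment the controller becomes aware of the breach. Purely as an illustration of the deadline arithmetic (the date and time below are hypothetical), a minimal sketch:

```python
from datetime import datetime, timedelta, timezone

def gdpr_notification_deadline(became_aware: datetime) -> datetime:
    """GDPR Art. 33: notify the supervisory authority without undue delay
    and, where feasible, no later than 72 hours after becoming aware of
    the breach. Returns the end of that 72-hour window."""
    return became_aware + timedelta(hours=72)

# A controller that discovers a breach on 1 March at 09:30 UTC must
# notify by 4 March at 09:30 UTC (or explain the delay to the regulator).
aware = datetime(2024, 3, 1, 9, 30, tzinfo=timezone.utc)
print(gdpr_notification_deadline(aware))  # 2024-03-04 09:30:00+00:00
```

Note that the window is measured in hours, not business days, so weekends and holidays do not pause it.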
GDPR places strict requirements on consent: it must be freely given, specific, informed, and unambiguous. Recital 32 emphasizes consent via an affirmative act (no pre-ticked boxes). The regulation also enshrines data minimization and purpose limitation: personal data must be “adequate, relevant and limited to what is necessary” for the stated purpose. Controllers must implement appropriate security measures and are accountable for demonstrating compliance.[6]
The GDPR enforces strict penalties for non-compliance—up to €20 million or 4% of global annual turnover, whichever is higher—and grants national Data Protection Authorities (DPAs) the authority to enforce these rules. It also severely restricts transfers of personal data outside the EU: transfers are allowed only if the destination country ensures an “adequate” level of protection or via approved safeguards (e.g. Standard Contractual Clauses), and as of Schrems II (CJEU 2020) extra scrutiny is required for U.S. transfers. The GDPR set a global benchmark, and many jurisdictions (including India) drew inspiration from it.[7]
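The “whichever is higher” formula means the effective ceiling scales with company size. A minimal sketch of the arithmetic (turnover figures are hypothetical, chosen only to show each prong dominating):

```python
def gdpr_max_fine(global_annual_turnover_eur: float) -> float:
    """Ceiling for the most serious GDPR infringements (Art. 83(5)):
    the higher of EUR 20 million or 4% of worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

# Smaller firm (EUR 100M turnover): 4% is only EUR 4M, so the
# flat EUR 20M floor applies.
print(gdpr_max_fine(100e6))  # 20000000.0

# Large firm (EUR 50B turnover): the 4% prong dominates, giving EUR 2B.
print(gdpr_max_fine(50e9))   # 2000000000.0
```

This is why headline fines against large platforms (such as the €1.2B Meta penalty discussed later) can far exceed the €20 million figure quoted in the Regulation’s text.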
CALIFORNIA CONSUMER PRIVACY ACT (CCPA) AND CALIFORNIA PRIVACY RIGHTS ACT (CPRA)
California enacted the CCPA in 2018 (effective 2020) and later the CPRA (approved by ballot initiative in 2020, effective 2023). These laws apply to businesses that collect data on California residents and meet specific thresholds, such as annual gross revenues exceeding $25 million or handling the personal data of more than 50,000 consumers. The CCPA/CPRA focus on transparency and consumer control. They grant California residents the right to know what personal data is collected about them, the purposes for which it is used, and with whom it is shared. Consumers are entitled to request the deletion of their personal data and to opt out of the “sale” of their personal information. The CPRA added rights to correct inaccuracies and to limit the use of sensitive personal information (e.g. precise geo-location, race, health).[8]
CCPA/CPRA do not impose affirmative consent requirements on most processing (unlike GDPR); instead, businesses must provide notices and an opt-out for sales. Notably, the CPRA bans many uses of sensitive data without additional consent. California law also forbids discrimination (e.g. service denial) against consumers exercising these rights. Enforcement is by the California Attorney General (and a new California Privacy Protection Agency under CPRA); there is a limited private right of action for certain data breaches (not general infringements). Statutory penalties range up to $7,500 per intentional violation. Despite its state scope, the California law influences many U.S. and international companies, given California’s market size.[9]
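Because CCPA coverage turns on threshold tests rather than on the nature of processing, a first-pass applicability check is essentially arithmetic. A minimal sketch using only the two thresholds cited above (the statute contains further prongs, such as revenue share from selling personal information, not modeled here; all figures in the examples are hypothetical):

```python
def ccpa_covered(annual_gross_revenue_usd: float, consumers_handled: int) -> bool:
    """Rough first-pass CCPA applicability check for a business that
    collects Californians' data: annual gross revenue over $25 million,
    OR personal data of more than 50,000 consumers. Meeting either
    threshold is enough."""
    return annual_gross_revenue_usd > 25_000_000 or consumers_handled > 50_000

# A small business handling many consumers' records is still covered.
print(ccpa_covered(10_000_000, 60_000))  # True: consumer-count prong met
print(ccpa_covered(10_000_000, 5_000))   # False: neither threshold met
```

In practice any real determination requires counsel, since definitions of “consumer,” “sale,” and “doing business in California” all affect the outcome.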
COMPARATIVE HIGHLIGHTS
Although DPDP, GDPR, and CCPA/CPRA all aim to protect personal data, they differ in approach:
- Scope: The GDPR applies to any processing of personal data belonging to EU residents, while the DPDP Act covers “digital personal data” of Indian residents, including processing activities conducted outside India. CCPA/CPRA apply to data of California residents held by covered businesses. In practice, all three have extraterritorial reach (GDPR by its terms; DPDP by covering foreign entities serving India; CCPA if a company does business with Californians).
- Lawful Basis: GDPR allows processing under multiple bases: consent, contractual necessity, legal obligation, etc. By contrast, the DPDP Act permits only two bases for private entities: the individual’s consent or a narrow set of “legitimate uses”. Notably, DPDP does not recognize “legitimate interests” or implied consent as in GDPR. CCPA/CPRA do not use a lawful-basis framework; instead, they require businesses to provide notice and generally allow processing unless a consumer opts out of sales or limits use of sensitive data.
- Consent: The GDPR mandates that consent must be given through a clear, informed, and voluntary opt-in action. In contrast, the CCPA primarily follows an opt-out approach, allowing consumers to instruct businesses not to sell their personal data, rather than requiring opt-in for most data processing activities.
- Data Minimization and Purpose: GDPR’s principles require collecting only necessary data. Indian law likewise emphasizes necessity: even with consent, data collection must be necessary for the purpose, else consent is invalid. The CCPA/CPRA implicitly encourage minimal collection by giving consumers deletion rights and limiting use of sensitive data, but impose no explicit minimization rule.
- Security and Breach Notification: All regimes impose security obligations. GDPR requires “appropriate technical and organizational measures” (Art.32) and mandatory breach notification within 72 hours. The DPDP Act requires similar safeguards and mandates breach notifications to the Board and affected individuals, with no waiting period or risk threshold. California law requires “reasonable security” and mandates consumer notice for data breaches of certain personal information, but the timeline is generally governed by state breach laws.[10]
- Accountability: GDPR explicitly embeds an accountability principle: controllers must both comply with the law and demonstrate compliance. This includes record-keeping, data protection by design (Art.25), and appointing a Data Protection Officer when required. Under the DPDP Act, data fiduciaries also have obligations to implement measures (including DPIAs for significant fiduciaries), and they must establish grievance mechanisms (e.g. a grievance officer). California law does not have an overarching accountability framework, focusing instead on specific compliance requirements and the new regulatory agency’s enforcement.
- Cross-Border Transfers: GDPR strictly limits transfers outside the EU absent an “adequacy” decision or safeguards (and since Schrems II even SCCs require additional scrutiny). The DPDP Act’s approach is more permissive: it allows transfers to any country except those placed on a (yet-to-be-notified) negative list. No special mechanism (like SCCs) is prescribed beyond general security measures. CCPA/CPRA include no cross-border restrictions; transfer issues in the U.S. rely on other frameworks (e.g. Privacy Shield – now invalidated, or contracts).[11]
KEY PRINCIPLES OF DATA PROTECTION
Though jurisdictions vary, they share core principles governing customer data:
- Consent and Lawful Basis: Consent must be informed and freely given. According to Recital 32 of the GDPR, valid consent requires a “clear affirmative act” and cannot be implied or obtained through pre-ticked boxes. The DPDP Act reflects similar principles, requiring consent to be “free, specific, informed, unconditional, and unambiguous.” Once consent is granted, data processing must remain strictly within the agreed-upon purpose—any unrelated or excessive use may render the consent invalid. If consent is later withdrawn, processing for that purpose must stop, although processing carried out before withdrawal remains lawful. In contrast, California’s CCPA generally does not require opt-in consent for most data uses but instead provides consumers with the right to opt out of the sale of their personal information.
- Data Minimization and Purpose Limitation: A fundamental GDPR principle is that data collected should be adequate, relevant and limited to the purpose. In practice, this means companies should collect only the personal data they actually need. India’s DPDP Act similarly emphasizes necessity: it requires that data collection be for a specific lawful purpose, and it even provides that consent alone does not justify collecting unnecessary data. For example, a telemedicine app may not, under DPDP, collect a patient’s full contact list without a separate consent or justification, because that information is not necessary to provide telemedicine. Purpose limitation means that data should not be repurposed in a way incompatible with the original notice to the user. GDPR explicitly forbids further processing beyond the stated purposes (except certain archival research exceptions). The DPDP Act also binds fiduciaries to the purposes disclosed in the notice. California’s law does not explicitly codify minimization, but the deletion and correction rights, and the requirement of notice upon collection, function to constrain excessive data gathering.
- Security and Breach Notification: Organizations are responsible for safeguarding personal data against risks such as unauthorized access, loss, or misuse. Under the GDPR, controllers must implement appropriate technical and organizational measures, notify the supervisory authority without undue delay (within 72 hours, per Article 33), and inform affected individuals if the breach poses a high risk to them (Article 34). The DPDP Act imposes similar obligations, requiring data fiduciaries to adopt adequate security safeguards—significant fiduciaries must also undergo independent audits—and treats breaches as critical incidents: all breaches must be reported to the Data Protection Board and the individuals affected, regardless of severity. Meanwhile, California law requires breach notifications only when specific categories of sensitive information—like Social Security numbers or financial data—are exposed.
- Accountability and Governance: Controllers and fiduciaries must be accountable for compliance. GDPR’s accountability principle means organizations not only follow the rules but must prove compliance (through documentation, impact assessments, DPOs, codes of conduct, etc.). The DPDP Act similarly holds data fiduciaries responsible: for example, “significant” fiduciaries must conduct periodic data-protection impact assessments and undergo independent audits. Both regimes expect privacy by design/default, meaning systems should embed privacy safeguards from the outset (GDPR Art.25). In India, the concept of “Data Protection by Default” is embodied in notice requirements that closely mirror GDPR’s (though the Act stops short of prescribing technical design measures). The CCPA/CPRA establish a California Privacy Protection Agency which can oversee compliance, but otherwise rely on business-led accountability (e.g. privacy policies and opt-out mechanisms) rather than detailed mandates.
- Cross-Border Data Transfers: Transferring personal data across international borders involves complex issues related to national sovereignty and individual privacy rights. The GDPR, under Chapter V, strictly regulates such transfers from the EU. Data can only be sent to non-EU countries if those countries have received an “adequacy decision” from the European Commission, confirming that their data protection standards are essentially equivalent. Alternatively, businesses can use contractual tools like Standard Contractual Clauses (SCCs), but after the Schrems II judgment in 2020, additional safeguards must be in place to address concerns about foreign government surveillance. In contrast, India’s DPDP Act takes a more relaxed approach. It generally permits cross-border data transfers unless the receiving country is specifically restricted by the Indian government—often referred to as a “blacklist.” There is no requirement for an EU-style adequacy decision, though the government retains the power to designate certain jurisdictions as restricted or impose conditions on data transfers for national security or privacy reasons. CCPA, meanwhile, does not contain explicit rules on cross-border data transfers. U.S. companies that handle data from EU citizens often rely on contractual agreements or, in the past, frameworks like the EU-U.S. Privacy Shield—though the latter was invalidated by the Schrems II decision due to concerns over U.S. intelligence surveillance. That ruling also tightened the rules for using SCCs, forcing companies to conduct detailed assessments of how data is handled when transferred to countries like the U.S. This has had major implications for global tech giants such as Meta and Google, which routinely move EU user data to U.S. servers.
INDUSTRY EXAMPLES AND ENFORCEMENT
- Meta (Facebook/Instagram): Meta’s platforms—including Facebook and Instagram—gather a wide range of personal information, including users’ age, gender, interests, social networks, messages, photos, and behavioural data such as likes, clicks, and location history. The 2018 Cambridge Analytica scandal revealed that personal data from roughly 270,000 Facebook users who installed a quiz app—and that of up to 87 million of their friends—was harvested without informed consent for political profiling. In 2023, Meta was again penalized in the EU when the Irish Data Protection Commission, backed by the European Data Protection Board (EDPB), fined its EU arm €1.2 billion for unlawfully transferring personal data to the United States in reliance on standard contractual clauses.[12] Instagram, as part of Meta, shares similar data practices and has come under increasing scrutiny, particularly for its handling of minors’ data and concerns about its impact on teen mental health.
- Google: Google, dominant in search, advertising, and mobile services, collects extensive user data including search history, email content, browsing behaviour, location tracking through Maps and Android devices, and more. This immense data pool fuels Google’s personalized advertising business, which has faced several legal challenges. In 2019, France’s data protection authority CNIL fined Google €50 million for failing to provide clear privacy notices and for using pre-checked boxes to obtain consent for ad personalization—violating GDPR transparency and consent requirements. Google has also been criticized for bundling services into account creation, raising questions about whether consent is truly “freely given.” With the invalidation of the EU-U.S. Privacy Shield following Schrems II, Google must now rely on stringent safeguards for transferring EU user data to the United States, increasing the complexity of maintaining GDPR compliance.[13]
- Amazon: As both a global e-commerce leader and a major cloud services provider, Amazon collects data on users’ purchase histories, browsing activity, voice commands through Alexa, and even biometric information such as facial recognition data used in physical stores. These data streams power Amazon’s recommendation systems and targeted advertising. In 2021, the Luxembourg Data Protection Authority (CNPD) fined Amazon Europe a record €746 million for violating GDPR principles—a penalty upheld in court in 2025. The case serves as a stark warning about the risks of mishandling personal data, especially regarding profiling and potential discriminatory pricing based on consumer behaviour.[14]
- Other Sectors: Beyond tech giants, traditional industries such as banking, retail, insurance, and telecommunications also handle large volumes of personal data, including financial records, call logs, loyalty program activity, and behavioural analytics. These sectors are subject to strict data protection rules, particularly when it comes to sensitive data used by credit bureaus or insurance firms. Even non-digital businesses that store customer information must comply with evolving data privacy regulations. These examples show how deeply customer data has become embedded in modern business models, and that aggressive collection without proper legal safeguards invites regulatory action.
CASE LAW AND ENFORCEMENT ACTIONS
Data protection laws are enforced through regulatory actions and court decisions worldwide. Notable examples include:
- Schrems II (CJEU 2020): In this landmark ruling, the Court of Justice of the European Union (CJEU) invalidated the EU–US Privacy Shield framework due to inadequate safeguards against U.S. surveillance laws. The court emphasized that data transfers relying on Standard Contractual Clauses (SCCs) are permissible only if they offer a level of protection “essentially equivalent” to that guaranteed by the GDPR. Where such protection cannot be assured, data transfers from the EU must be suspended. This ruling has far-reaching implications for U.S. tech giants such as Meta and Google, which routinely handle personal data of EU residents.
- Google Spain (CJEU 2014): Established the “right to be forgotten.” The Court held that search engines must remove search results for a person’s name when the subject’s privacy outweighs public interest. This landmark case expanded individuals’ control over online data.
- Meta (GDPR enforcement): In 2023 Ireland’s DPC (with EDPB agreement) fined Meta €1.2B for GDPR breaches. Earlier, the UK’s ICO had fined Facebook £500,000 in 2018 (under old law) for the Cambridge Analytica data misuse (since superseded by FTC action). These actions underline regulators’ willingness to impose severe penalties for privacy violations.
- Google (GDPR enforcement): France’s CNIL fined Google €50M (2019) for consent and transparency failures, and in late 2021 imposed a further €150M penalty (under the ePrivacy rules) over cookie-consent design that made refusing cookies harder than accepting them. National DPAs across Europe continue audits of Google’s consent practices.
- FTC v. Wyndham (3d Cir. 2015): The U.S. Third Circuit Court of Appeals confirmed that the Federal Trade Commission has the authority to penalize companies for insufficient data security, classifying it as an unfair trade practice. This ruling reinforced the FTC’s jurisdiction over data protection failures and established a key precedent for federal enforcement of privacy and security standards.[15]
- FTC v. Facebook/Cambridge Analytica (2019): In 2019, the Federal Trade Commission (FTC) levied a $5 billion fine against Facebook for serious breaches of user privacy and mandated significant reforms to its privacy management practices. In a separate action the same year, the FTC pursued enforcement against Cambridge Analytica and related individuals for misleading consumers about how their data was collected and used. The case concluded with consent orders requiring the deletion of unlawfully acquired data. These enforcement efforts underscore the FTC’s firm commitment to addressing privacy violations and ensuring accountability for the misuse of personal data in the United States.
- Carpenter v. United States (2018): The U.S. Supreme Court held that accessing historical cell-site location information is a search under the Fourth Amendment, requiring a warrant. Although a criminal case, Carpenter has been cited in privacy discussions as recognizing that digital data (here, location) carries privacy expectations.
- Riley v. California (2014): The Supreme Court ruled that police generally need a warrant to search a person’s cell phone, implying strong privacy protection for personal digital data.
- Puttaswamy (India 2017): India’s Supreme Court unanimously declared privacy (including informational privacy) a fundamental right under the Constitution. This case laid the foundation for India’s DPDP Act, underscoring that collection/storage of personal data is subject to constitutional protection.[16]
These examples show that courts and regulators are actively shaping data privacy law. Penalties for violations are steep: GDPR fines have reached €1.2B (Meta) and €746M (Amazon), and the FTC imposed a $5B penalty (Facebook), indicating that enforcement has “teeth”. Regulators worldwide (EU DPAs, UK ICO, US FTC, Indian authorities, etc.) monitor companies’ data-handling practices and can impose sanctions, require changes, or even ban certain processing.
CONCLUSION
While the collection and storage of customer data offer significant business benefits, they also come with critical legal responsibilities. Across various legal systems, data protection laws share core principles: individuals’ rights and consent must be respected, data must be collected minimally and secured properly, and organizations must be held accountable for their handling of personal information. At the same time, frameworks like the GDPR and CCPA/CPRA continue to evolve in response to emerging technologies like artificial intelligence and the Internet of Things, as well as ongoing enforcement challenges.
For legal professionals and students alike, mastering the fundamentals of data protection is essential. Any organization handling personal data must implement well-structured policies and systems to ensure compliance with requirements around consent, data minimization, and security. They must also remain alert to restrictions on cross-border data transfers and jurisdiction-specific nuances, such as the CCPA’s right to opt out of data sales. High-profile enforcement cases involving companies like Meta and Google illustrate that even the largest corporations are not immune from penalties or reputational harm. As data becomes the backbone of the digital economy, strong legal compliance is no longer optional—it is central to building and maintaining customer trust.
[1] K.S. Puttaswamy & Anr. v. Union of India & Ors., (2017) 10 SCC 1 (India).
[2] Carpenter v. United States, 138 S. Ct. 2206 (2018); Riley v. California, 573 U.S. 373 (2014).
[3] Digital Personal Data Protection Act, No. 22, Acts of Parliament (2023) (India).
[4] Bhatia, Sunir K., Understanding India’s New Data Protection Law, Carnegie Endowment (Oct. 13, 2023).
[5] General Data Protection Regulation, Reg. (EU) 2016/679, art. 5, 6, 2016 O.J. (L 119) 1 (EU).
[6] Google Spain SL v. AEPD & M. Costeja González, Case C-131/12, 2014 E.C.R. I-0000 (CJEU 2014).
[7] Maximillian Schrems v. Data Prot. Comm’r, Case C-362/14, 2015 E.C.R. I-0000 (CJEU 2015); id., Case C-311/18, 2020 E.C.R. I-0000 (CJEU 2020).
[8] California Consumer Privacy Act of 2018, Cal. Civ. Code §§ 1798.100–.199 (West 2020).
[9] California Privacy Rights Act, Prop. 24 (Cal. 2020) (amending Cal. Civ. Code §§ 1798.100–.199).
[10] Jaffer, Russell S., & Vijayaraghavan, Vikram, India’s Digital Personal Data Protection Act 2023 vs. GDPR: A Comparison, Latham & Watkins (Dec. 2023).
[11] Prabhu, Arun et al., Comparing Global Privacy Regimes Under GDPR, DPDPA and US Data Protection Laws, India Corp. L. Blog (Jan. 9, 2024).
[12] EDPB Press Release: €1.2B Fine on Facebook (Meta), European Data Protection Board (May 22, 2023).
[13] CNIL Imposes €50 Million Sanction on Google, Commission Nationale de l’Informatique et des Libertés (Jan. 21, 2019).
[14] Data Protection Commission Issues €746M Fine on Amazon, Data Protection Commission (Lux.) (Mar. 19, 2025).
[15] FTC v. Wyndham Worldwide Corp., 799 F.3d 236 (3d Cir. 2015).
[16] K.S. Puttaswamy & Anr. v. Union of India & Ors., (2017) 10 SCC 1 (India).
Disclaimer: The materials provided herein are intended solely for informational purposes. Accessing or using the site or the materials does not establish an attorney-client relationship. The information presented on this site is not to be construed as legal or professional advice, and it should not be relied upon for such purposes or used as a substitute for advice from a licensed attorney in your state. Additionally, the viewpoint presented by the author is personal.