1 The Collection and Use of Data and Information

Technologies today facilitate widespread dissemination of information, including visual images, and rapid communication to billions of people across the globe. The digital economy highlights the increasing economic importance and value of data, information, and other intangibles for many companies. This in turn underscores a shift in dominant business production and operation models toward approaches that rely significantly on intangibles. Intangibles have thus contributed to a marked yet understudied transformation in business practices and sources of economic value for numerous firms. These trends are only likely to accelerate in an era of big data solutions.

Digital economy firms increasingly shape how we create, disseminate, and access data and information. The activities of such firms have contributed to the growing value of information in recent years. Prominent digital economy firms have strongly embraced Facebook founder Mark Zuckerberg’s motto of “move fast and break things”.Footnote 1 However, this ethos of disruption has posed significant challenges for existing policies, laws, and regulations, especially those relating to the collection of consumer data and the usage and dissemination of data and information.

Data is the raw material that has come to be an essential and valuable feature of a broad range of digital economy activities. As Julie Cohen notes, “data extracted from individuals plays an increasingly important role as raw material in the political economy of informational capitalism”.Footnote 2 Information is often described as data that has in some way been organized, structured, or otherwise processed. The processing of data is thought to render it relevant for a “specific purpose or context, and thereby makes it meaningful, valuable, useful and relevant”.Footnote 3 In the digital economy era, data has become a significant driver of economic and business growth.Footnote 4 This is reflected, for example, in the market valuations of technology companies in the data technology business, including Alphabet, Amazon, Apple, Facebook, and Microsoft. In August 2020, the combined market capitalization of these five companies constituted more than 20% of the S&P 500, one of the most influential equity indices. That same month, Apple reached a market valuation of $2 trillion, and the market capitalization of the United States technology sector was said to be worth more than the entire European stock market, a market that in 2007 had been four times the size of the United States technology sector.Footnote 5

Several data technology companies use business models that derive significant value from targeted advertising driven by such companies’ uses of user data. These models involve the collection, aggregation, management, and organization of data in ways that enable these companies to sell targeted advertisements that may look like organic content.Footnote 6 For example, in 2019, Alphabet, Google’s parent company, derived more than 83% of its total revenues from online ads,Footnote 7 while Facebook received nearly all of its revenue from third parties advertising on Facebook and Instagram.Footnote 8 Data technology companies’ collection of data has an impact on the ecology of digital spaces.Footnote 9

Data has been described as the new oil, which has contributed to the widespread implementation of data-driven business models: “[c]apitalizing on this data explosion is increasingly becoming a necessity in order for a business to remain competitive”.Footnote 10 The potential recognized by companies embracing data technology business models has become a beacon call to others who seek to harness value from the collection and exploitation of data and information.

Data-driven business models raise significant issues about the effectiveness of consent for all users, and particularly for children. Collectors of data may not be transparent about how the data will be used, and users may not be fully aware of the implications of consenting. In some jurisdictions with less stringent data protection requirements, including many states in the United States, consumers may not be given an effective choice about the collection and use of their data. The proliferation of mobile devices may give those collecting user data extensive information about users’ locations and activities.

Apple’s 2021 release of iOS 14.5 gives a strong indication of the importance of effective consumer consent. iOS 14.5 includes App Tracking Transparency (ATT), which gives users of Apple devices greater awareness of which applications are tracking them and more control over such tracking.Footnote 11 More specifically, ATT requires that applications ask permission to track activity across other applications and websites.Footnote 12 Apple’s ATT was strongly opposed by Facebook, a leading company embracing an advertising data technology model. Facebook undertook a campaign opposing Apple’s ATT, publishing full-page advertisements in leading newspapers in the United States: “titled ‘We are standing up to Apple for small businesses everywhere’, and ‘Apple vs. Free internet.’ Both the ads call the new iOS 14 privacy update harmful for small businesses because, without targeted ads, their sales will drop by 60%”.Footnote 13 Initial data about consumer consent suggests that many consumers will not consent to tracking if they are given the opportunity to block it. According to Flurry Analytics statistics, only 15% of iOS 14.5 users in the United States opted in to ad tracking four months after the software was released, noticeably lower than the 21% of users who opted in worldwide.Footnote 14

Consumer consent in contexts of extensive data collection may also be ineffective because consumers, and even the companies themselves, may not actually know the value of the information being given. Consumers may also not be aware of the risks associated with the data they give such companies, particularly because the aggregation of such data and the creation of profiles are part of what renders such data and information valuable. The harm imposed by breaches of such data is thus difficult to determine, even after breaches have occurred, because companies creating such models are not the only actors creating profiles of consumers. Hackers, for example, are actively scraping data from these collections of consumer data.

Once collected, data and information may rest in the hands of private companies that build business models around its exploitation. The monetization of data has become pervasive among businesses. A 2017 McKinsey survey found that, although data monetization as a means of growth was in its early stages of development, “an increasing share of companies is using data and analytics to generate growth... [and] are adding new services to existing offerings, developing new business models, and even directly selling data based products or utilities”.Footnote 15

The growth of data technology companies has led to a proliferation of business models that seek to monetize data, which has given many companies incentive to collect as much data as they possibly can. Edward Snowden, a former U.S. National Security Agency (NSA) contractor who in 2013 became a whistleblower by leaking classified documents that detailed NSA global surveillance programs, has persistently drawn attention to digital era intelligence collection and monitoring. In 2018, Snowden discussed data collection and data privacy, noting that the collection of data entrenches “a system that makes the population vulnerable for the benefit of the privileged”.Footnote 16 He also noted that the NSA and Facebook have similar data models.

The proliferation of data technology companies has contributed to changing cultural practices with respect to expectations of privacy and views about surveillance.Footnote 17 This has significant implications for a broad range of legal frameworks, including copyright law and privacy law. Broad data collection also raises questions about the accountability of those who gather, process, or use data. The rapid growth of business models that focus on the collection and use of information has contributed to the widespread retention of personal data and personally identifiable information by companies, governments, and others in varied contexts.

The collection and use of data have a particular impact on children. As the Office of the Children’s Commissioner for England (CCO) noted in a 2018 report, more data about children is collected today than ever before; further, the availability of such data may have significant consequences for children.Footnote 18 This data may stem from varied activities and sources: information that parents and children share on social media; smart toys, speakers, and other connected Internet of Things (IoT) devices, which are increasingly present in homes; monitoring equipment like pedometers and location-tracking watches; and information given away when children use essential public services. As the CCO report states, “[c]hildren are being ‘datafied’ – not just via social media, but in many aspects of their lives”.Footnote 19 Access to information about children may pose both short-term and long-term threats.

In addition, children, unlike adults, may not have control over decisions related to their data, which must be taken into account in any attempt to regulate data and information about children. The proliferation of data is not just an issue for children. In a world of connected devices, many create data in the course of everyday activities and transactions. Many who use IoT devices, including smart speakers like Amazon’s Alexa and smart cameras like Amazon’s Ring camera, may not be aware of who might have access to their devices or how data generated by their devices is handled.Footnote 20

Companies and governments are not alone in accessing and collecting personal information. Once private data has been collected, other parties may seek to access and exploit it for their own benefit. A number of business models are being built around the theft and capture of private data and information, which is now so widely collected and retained.Footnote 21 Private data and information may not be well secured by those collecting and retaining it. The rise of data technology business models has been a factor in increased legal and regulatory attention to questions of data privacy, data protection, and data security. This scrutiny is necessary because the data being collected has a potentially long lifespan and is not really disposable:

“The data we are all collecting around people has this same really odd property [as nuclear waste]… information about people retains its power as long as those people are alive, and sometimes as long as their children are alive. No one knows what will become of sites like Twitter in five years or ten. But the data those sites own will retain the power to hurt for decades.”Footnote 22

Data collected from children has potential to be long-lived, and it is prone to be “dangerous long after it has been created and forgotten because the massive amounts of data collected about people are not disposable; they could be useful at some point, particularly when consumer data are used in national security intelligence”.Footnote 23

Personal data is one of the most important targets for participants in markets for stolen data. In 2017, some 17 million Americans had personal data stolen and became victims of identity theft.Footnote 24 Data leaks have become pervasive. Concerns about data privacy and security have led to the adoption of data protection legal frameworks throughout the world, including the European Union’s General Data Protection Regulation (“GDPR”),Footnote 25 which has become a worldwide model for data protection. In his 2018 remarks, Snowden suggested that attention to data privacy and data protection “misplaces the problem” by not sufficiently focusing on the business of data collection itself: “[t]he problem isn’t data protection, the problem is data collection... [r]egulation and protection of data presumes that the collection of data in the first place was proper, that it is appropriate, that it doesn’t represent a threat or a danger”.Footnote 26

The extent to which businesses are effectively penalized may play a significant role in determining incentives. GDPR fines may give some indication of incentives. Enforcement of the GDPR by European Union data protection authorities began in 2018. By August 2020, one calculation suggested that GDPR fines collectively surpassed €500 million.Footnote 27 The effectiveness of these fines will depend on several factors, including whether those penalized are actually the ones who pay them and what behavioral incentives the fines induce. These penalties should also likely be evaluated in light of the enormous benefits that flow from uses of data in a wide range of business models today. In October 2020, in one of the largest GDPR fines to date, the Hamburg Commissioner for Data Protection and Freedom of Information fined H&M €35.3 million for excessive monitoring of employees at an H&M German subsidiary.Footnote 28 As of mid-September 2021, Google and Facebook led the list of those fined. Google received a penalty of €50 million from the French data regulator CNIL for “lack of transparency, inadequate information and lack of valid consent regarding ads personalization”.Footnote 29 WhatsApp, owned by Facebook, received a fine of €225 million in September 2021 for failure to fully disclose how it collected and shared user data.Footnote 30 Notably, although €50 million is a substantial amount of money by most measures, it is likely not a substantial amount for Google, which had almost $162 billion in revenue and earned $34.3 billion in net income in 2019.Footnote 31 Similarly, the €225 million fine imposed on Facebook is a fraction of Facebook’s close to $86 billion in revenue and $29 billion in net income in 2020.Footnote 32 Google’s €50 million GDPR fine is also substantially less than the fines imposed on Google by the European Commission’s Directorate General for Competition, which totaled close to €10 billion by 2019.Footnote 33
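A rough back-of-the-envelope calculation makes the disproportion concrete. The sketch below uses only the revenue figures cited above and an assumed exchange rate of roughly $1.10 per euro (an assumption for illustration only, not a figure from the sources cited):

```python
# Back-of-the-envelope comparison of GDPR fines to annual revenue,
# using the figures cited in the text. The EUR/USD rate is an
# assumption for illustration only.
EUR_TO_USD = 1.10  # assumed exchange rate

fines = {
    # label: (fine in EUR, annual revenue in USD)
    "Google (2019 CNIL fine vs. 2019 revenue)": (50_000_000, 162_000_000_000),
    "WhatsApp/Facebook (2021 fine vs. 2020 revenue)": (225_000_000, 86_000_000_000),
}

for label, (fine_eur, revenue_usd) in fines.items():
    share = (fine_eur * EUR_TO_USD) / revenue_usd * 100
    print(f"{label}: fine is roughly {share:.3f}% of annual revenue")

# Approximate output:
# Google (2019 CNIL fine vs. 2019 revenue): fine is roughly 0.034% of annual revenue
# WhatsApp/Facebook (2021 fine vs. 2020 revenue): fine is roughly 0.288% of annual revenue
```

On these assumptions, each fine amounts to a small fraction of one percent of a single year's revenue, which underscores the question of whether such penalties meaningfully alter incentives.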

2 Changing Business Models, Cultural Practices, and Regulation

Data technology business models pose constant challenges for regulation in part because they are varied, relatively new, and are often continuously under development. Data technology models enable companies to derive significant value from the collection, aggregation, control, and use of information. Companies and governments today collect a significant amount of data:

“‘These companies take enormous, enormous amounts of data about us’ [Senator Mark] Warner told Axios. ‘If you’re an avid Facebook user, chances are Facebook knows more about you than the US government knows about you. People don’t realize one, how much data is being collected; and two, they don’t realize how much that data is worth.’”Footnote 34

Google, for example, records every search performed and every YouTube video watched.Footnote 35 For users with smartphones, including iPhones and Android phones, Google Maps “logs everywhere you go, the route you use to get there and how long you stay – even if you never open the app”.Footnote 36 Google will now automatically delete private data after 18 months by default, but only for new users. For the 1.5 billion people on Gmail and 2.5 billion people already using Android, default account settings permit Google to retain private data forever unless the user changes this setting.Footnote 37 Users of IoT devices may themselves authorize devices that place them under surveillance. Although many of these devices are purchased for personal security, the data they generate may be available to others, including employees of the companies that sell the devices and hackers.

The collection and retention of vast amounts of data, which consumers have in many cases given away for free, has led to problems. Data made available to Facebook has in turn been made available to others, often without the user’s consent. For example, Cambridge Analytica, a political data firm hired by the 2016 Trump campaign, “gained access to private information on more than 50 million Facebook users. The firm offered tools that could identify the personalities of American voters and influence their behavior”.Footnote 38 Cambridge Analytica was also involved in other campaigns, including those of Kenyan President Uhuru Kenyatta in 2013 and 2017 and the Brexit referendum.Footnote 39

Data breaches have become commonplace today and may not be reported in a timely fashion in the United States. The 2017 Equifax data breach revealed personal information about approximately 148 million people in the United States, 8,000 people in Canada, and almost 700,000 citizens of the United Kingdom.Footnote 40 This data breach occurred between March and late July 2017. Equifax became aware of suspicious network activity in late July 2017 but did not make a public announcement about the breach until early September 2017. Notably, GDPR requires disclosure of certain types of data breaches.Footnote 41 Breaches at Equifax and other companies highlight ways in which data may not be properly secured. Companies in possession of such data may underinvest in security, for example, by not encrypting data. IoT and other devices may be compromised or lack basic security features. Companies may engage in improper or illegal data collection, including from children. In 2019, for example, Google and YouTube paid $170 million to the United States Federal Trade Commission and the State of New York for violating the Children’s Online Privacy Protection Act (COPPA) of 1998.Footnote 42

Changing digital economy business models that focus on data as raw material underscore fundamental changes in cultural practices. Many people generate content on YouTube, Facebook, Instagram, and more recently TikTok, as well as numerous other websites and applications that have become essential digital era tools. Content on these platforms has become an important vehicle for expressing creativity, conveying knowledge, and forming and maintaining relationships. It has become commonplace for people to carry devices, including smartphones, that enable others to track them, at times with incomplete or inaccurate knowledge about the capabilities of such devices. People may install other applications that enable others to surveil them. In 2019, for example, news reports emerged from Tennessee, where a stranger hacked a family’s Ring camera home security device and was able to watch and speak to young girls whose parents had purchased the device and placed it in their daughters’ bedroom, ostensibly to provide better security.Footnote 43

Many people voluntarily provide data to sources that they authorize without a full understanding of the potential uses of their data or how data has become a core raw material for varied digital economy business models, including in black-market ecosystems.Footnote 44 These and a myriad of other activities and behaviors underscore significant changes in cultural practices that have seemingly shifted views about surveillance and privacy for many, perhaps at times unknowingly because people may not always fully apprehend what their devices actually do or what happens to data they provide.

The applications and devices that have become a part of daily life are key conduits through which many access “the digital”. These applications and devices highlight the importance of networks in the digital era, the complexity and opacity that may come with using things we insufficiently understand, and the asymmetries of power and information that have become pervasive features of the digital era landscape.

This ongoing shift in business models and cultural practices poses significant regulatory challenges because we may not yet have an adequate understanding of the incentives driving business practices or the activities and motivations of users of apps and devices. As Professor Lawrence Lessig noted in his seminal discussion of cyberspace, digital economy spaces demand:

“a new understanding of how regulation works. It compels us to look beyond the traditional lawyer’s scope—beyond laws, or even norms. It requires a broader account of ‘regulation,’ and most importantly, the recognition of a newly salient regulator”.

This likely means more flexible approaches to regulation that can adjust with changing technologies, business models, and cultural practices.

3 Privacy and Children’s Data

Children’s data, just like adults’, is subject to the risks of unauthorized disclosure. Data breaches at Equifax and other companies thus affect children. Children, however, potentially face different and perhaps even greater risks because, unlike adults, their relationships with digital spaces may be mediated by family relationships and the activities of other people who may have the authority to disclose their data. Child identity theft is a growing problem, and it may occur within the family context when persons related to children or authorized to disclose children’s data engage in identity theft. Estimates suggest that in 2017 more than one million children were victims of identity theft in the United States.Footnote 45 Two-thirds of the victims were under seven years of age, and 60% of child victims knew the perpetrator. In contrast, only seven percent of adult victims knew the perpetrator.

Although older children may be more technologically competent than adult family members, young children may not be able to adequately monitor data disclosures about them. Approaches to privacy in the United States at the federal level do not sufficiently reflect the current topography of risks relating to data collection and to the protection of data once collected. The Children’s Online Privacy Protection Act (COPPA), as well as regulatory rules adopted by the Federal Trade Commission (FTC) pursuant to COPPA, creates a framework of fair information practices governing the collection, access, and use of personal information by websites directed at children under 13 years of age, certain general audience websites, and online services whose operators have actual knowledge that they are collecting personal information online from children under 13 years of age.

FTC COPPA rules require operators to provide notice of what information is collected from children, uses of such information, and disclosure practices for such information.Footnote 46 Operators are required to obtain verifiable parental consent prior to collection, use, and/or disclosure of personal information from children and to provide a reasonable means for a parent to review personal information collected from a child and refuse to permit further use or maintenance of such data.Footnote 47 Operators may not condition a child’s participation in activities on the collection of more personal information than is reasonably necessary to participate in such activities.Footnote 48 Operators must also establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children.Footnote 49 FTC COPPA rules also contain a “safe harbor” provision that enables industry groups or others to submit self-regulatory guidelines that would implement COPPA rule protections to the FTC for approval.Footnote 50

The collection and use of data may also be regulated at the state level in the United States. The California Consumer Privacy Act of 2018 (“CCPA”),Footnote 51 which became effective on January 1, 2020, gives California consumers greater control over personal information collected by businesses.Footnote 52 The CCPA has been amended and expanded by the California Privacy Rights Act (“CPRA”),Footnote 53 a ballot measure (Proposition 24) approved by California voters on Nov. 3, 2020, which will become fully effective on January 1, 2023. The CPRA establishes the California Privacy Protection Agency (“CPPA”), to which it grants investigative, enforcement, and rulemaking powers.Footnote 54

Effective enforcement of the CCPA was delayed pending the finalization of CCPA regulations by the California Attorney General’s Office.Footnote 55 The CCPA applies to businesses that satisfy at least one of three conditions: (1) gross annual revenue in excess of $25 million;Footnote 56 (2) annually buying, receiving for commercial purposes, selling, or sharing for commercial purposes, alone or in combination, the personal information of 50,000 or more consumers, households, or devices;Footnote 57 or (3) deriving 50% or more of annual revenue from selling consumers’ personal information.Footnote 58 Personal information subject to CCPA includes a broad range of information that identifies, relates to, describes, is reasonably capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household.Footnote 59 Under CCPA, personal information encompasses information that is not publicly available, such as (1) identifiers like names, aliases, addresses, and IP addresses; (2) characteristics of protected classifications under California or federal law; (3) commercial information, including records of personal property, products or services purchased, or consuming histories or tendencies; (4) biometric information; (5) internet or other electronic network activity information, such as browsing history; (6) geolocation data; (7) audio, electronic, visual, thermal, olfactory, or similar information; (8) professional or employment-related information; (9) education information; and (10) inferences drawn from any other information identified in the CCPA to create a profile about a consumer.Footnote 60
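Read together, the three applicability thresholds form a simple disjunction: a business is covered if any one of them is met. The following sketch encodes that test; the function and parameter names are hypothetical, and the code is an illustration of the statutory structure, not a compliance tool:

```python
def ccpa_applies(
    gross_annual_revenue_usd: float,
    pi_records_handled_annually: int,
    revenue_share_from_selling_pi: float,
) -> bool:
    """Illustrative encoding of the CCPA's three alternative thresholds.

    A business is covered if it meets ANY one of the conditions.
    Names are hypothetical; this is a sketch, not legal advice.
    """
    return (
        gross_annual_revenue_usd > 25_000_000        # condition (1)
        or pi_records_handled_annually >= 50_000     # condition (2)
        or revenue_share_from_selling_pi >= 0.50     # condition (3)
    )

# Example: a small data broker falls below the revenue and volume
# thresholds but is covered because most of its revenue comes from
# selling personal information.
print(ccpa_applies(5_000_000, 10_000, 0.60))  # True
```

The disjunctive structure matters in practice: even a very small firm can fall within the CCPA if its business centers on selling personal information.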

The CCPA gives California consumers five categories of data privacy rights in their personal information: the right to know, the right of access, the right to deletion, the right to opt out, and the right to equal service. The right to know requires businesses subject to CCPA to make affirmative disclosures to all consumers and to respond to verifiable consumer requests with individualized disclosures about the business’s collection, sale, or disclosure of that particular consumer’s personal information.Footnote 61 The right of access gives consumers the right to obtain a copy of the “specific pieces of personal information” that a business has collected about them, delivered by mail or electronically.Footnote 62 The right of deletion enables consumers to request that a business delete any personal information about the consumer that the business has collected from the consumer.Footnote 63 Consumers have a right to opt out of the sale of their personal information to third parties under the CCPA,Footnote 64 which also grants a right of equal service that prohibits discrimination against consumers who exercise their rights under the CCPA.Footnote 65 The CCPA prohibits selling the personal information of consumers under age 16 without consent, establishing an “opt-in” system for minors. Children aged 13–15 can provide such consent themselves, but consumers under age 13 require parental consent.Footnote 66

Although the CCPA bears certain similarities to the GDPR, the core principles of the CCPA differ significantly from those of the GDPR. The CCPA does not reflect the GDPR’s fundamental requirement of a “legal basis” for all processing of personal data.Footnote 67 The CCPA requires businesses to allow consumers to “opt out” of having their information sold (other than in the case of minors, who must opt in), unlike the GDPR, which requires businesses to implement an “opt-in” system that obtains consumer consent before their data is processed.Footnote 68 Notably, the CPRA adopts concepts of data minimization, purpose limitation, and storage limitation found in the GDPR.Footnote 69 These principles and the creation of the CPPA bring data privacy law in California closer to the GDPR.Footnote 70

The GDPR includes specific provisions that protect children,Footnote 71 reflecting the underlying belief that

“children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data . . . specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child.”Footnote 72

Article 8 of the GDPR imposes conditions applicable to a child’s consent in relation to information society services. The CCPA and the GDPR both provide for monetary penalties for non-compliance, with different approaches to determinations of liability. The CCPA provides for a limited private cause of action that permits statutory damages for data breaches in the amount of $100–$750 per violation per consumer, or actual damages.Footnote 73 All other causes of action not involving data breaches must be enforced by the California Attorney General. Penalties for violations subject to enforcement actions by the California Attorney General are up to $2,500 for each violation and $7,500 for each intentional violation.Footnote 74 Under the GDPR, administrative fines may be imposed up to (1) the higher of two percent of global annual turnover or €10 million, or (2) the higher of four percent of global annual turnover or €20 million, depending on the nature of the GDPR violation.Footnote 75
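The GDPR cap thus reduces to a simple formula: the maximum fine is the greater of a percentage of global annual turnover and a fixed floor, with the percentage and floor depending on the violation’s tier. A minimal sketch, assuming a boolean flag to select the tier:

```python
def gdpr_max_fine_eur(global_annual_turnover_eur: float, upper_tier: bool) -> float:
    """Maximum administrative fine under the GDPR's two-tier structure.

    Lower tier: the higher of 2% of global annual turnover or EUR 10 million.
    Upper tier: the higher of 4% of global annual turnover or EUR 20 million.
    Illustrative sketch only.
    """
    if upper_tier:
        return max(0.04 * global_annual_turnover_eur, 20_000_000)
    return max(0.02 * global_annual_turnover_eur, 10_000_000)

# For a firm with EUR 150 billion in global turnover, the upper-tier
# ceiling is 4% of turnover, i.e. EUR 6 billion, far above the
# largest fines actually imposed to date.
print(gdpr_max_fine_eur(150_000_000_000, upper_tier=True))  # 6000000000.0
```

Because the floors bind only for small firms, the turnover-based percentage is what gives the GDPR its theoretical bite against the largest data technology companies, which makes the modest fines actually imposed all the more notable.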

The CCPA excludes certain categories of data such as medical data, which is covered by other legal frameworksFootnote 76 like the Health Insurance Portability and Accountability Act of 1996 (HIPAA),Footnote 77 and personal information processed by credit reporting agencies.Footnote 78 The legal landscape for privacy laws is fragmented to a greater extent in the United States than in the European Union.

4 Conclusion

Failure to secure personal data and information has a particular impact on children. It implicates the role of parents as decision makers about disclosures of children’s data, including images, and reflects the realities of family dynamics and relationships that may affect the use and protection of children’s data. Children may be unable to monitor the disclosure or use of their data for varied reasons: their data may be controlled by their parents, or they may lack access to it altogether. Given that a significant percentage of children’s identity theft is committed by persons known to them, such as family members, regulation of children’s data implicates family relationships in ways that are potentially difficult and complex. Existing legal and regulatory approaches may focus on data privacy after data has been collected, with insufficient attention to the effectiveness of consent in light of widespread data technology business models, as well as to extensive data collection and aggregation. While this may harm both adults and children, it places a particular burden on children.