CMOs own the customer experience, but too often they overlook the role privacy plays in building trust and brand loyalty. As a privacy professional, have you ever considered the role marketing stakeholders should play in your privacy program?
Research shows that when marketing gets involved in privacy strategy, the business sees positive outcomes: stronger alignment with customer values, clearer brand differentiation, and even revenue growth. Rather than sit on the sidelines, CMOs belong at the table when privacy and compliance are discussed.
Let’s look at three reasons why the marketing perspective is important to consider as part of a successful privacy program.
Interested in a deep dive into this topic with marketing and privacy experts? Register for our February 3rd IAPP-hosted webinar: Marketing and Consumer Experience Perspectives to Enhance Your Privacy Program.
Today’s consumers are more enlightened than ever before. They consider the impact that manufacturing processes, source ingredients, hiring practices, and carbon footprints have on the world, and they favor brands that align with their values.
According to the Ipsos Global Trends Report 2021, 70% of consumers in 25 countries say they prefer to purchase from brands that align with their values and priorities.
Research shows us that respect for privacy is one of those values. According to a 2021 KPMG survey of consumers:
A brand’s commitment to privacy should do more than meet the letter of the law; it should demonstrate to the consumer that the company sees privacy as a fundamental human right, and will treat all data shared accordingly.
Many ethical brands are committed to privacy, in principle. When it comes to real-world implementation, complexity is the challenge.
The average brand has 30+ point solutions in its marketing and consumer experience tech stack. When a consumer updates their privacy preferences, the data in every one of those point solutions must be updated. If your brand does this manually, it takes a massive amount of effort, and the inevitable mistakes and delays send a message to the customer that your brand doesn’t value privacy.
As a privacy professional, you know how critical data control is to privacy. Without a thorough understanding of how your marketing department stores, transfers, and uses customer data, you have a gap in your privacy program. Forging a relationship with your marketing counterparts will get you closer to having a full understanding of your data systems.
Still thinking about the manual data subject request process across all of those point systems? This is where Ketch’s Programmatic Privacy approach shines. When consumers express their privacy preferences, Ketch matches that user to the unique IDs assigned to them within each system. Then, using an API, Ketch updates each system automatically and in real time.
From that point forward, each system will create a customer experience that fully complies with the individual user’s jurisdiction and preferences. That’s how brands can walk the privacy talk.
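The fan-out described above can be sketched in code. Everything here, including the interfaces, the endpoint URL, and the function names, is a hypothetical illustration of the pattern rather than Ketch’s actual API:

```typescript
// Hypothetical sketch of programmatic preference propagation: one consent
// change fans out to every downstream system holding a copy of the data.
interface PreferenceUpdate {
  userId: string;
  purpose: string;   // e.g. "analytics", "advertising"
  allowed: boolean;
}

interface SystemRecord {
  system: string;    // e.g. "crm", "email-platform"
  localId: string;   // the ID this system uses for the same person
}

// Identity map: one consumer, many per-system IDs.
type IdentityMap = Record<string, SystemRecord[]>;

// Build the per-system API calls needed to honor a consent change.
function planUpdates(map: IdentityMap, update: PreferenceUpdate) {
  const records = map[update.userId] ?? [];
  return records.map((r) => ({
    endpoint: `https://api.example.com/${r.system}/preferences`, // placeholder URL
    body: { id: r.localId, purpose: update.purpose, allowed: update.allowed },
  }));
}
```

The point of the sketch is the shape of the problem: one preference change becomes N synchronized updates, which is why doing it manually across 30+ systems breaks down.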
Privacy has a real, beneficial effect on a brand’s bottom line. According to Gartner, “Rapidly maturing regulations and consumer desire to deal only with trustworthy businesses mean privacy materially affects financial results.”
Gartner estimates that by 2023, brands that earn and maintain consumer trust will earn 30% more profits from their digital commerce revenue initiatives than their competitors. A commitment to privacy enables organizations to participate in “50% more ecosystems to expand revenue generation opportunities.”
Given the revenue that’s at stake, every brand should view having a strong, public privacy stance as an opportunity to win consumer respect. Marketers can provide thought leadership input on how to translate your privacy program from an internal compliance necessity to a cornerstone of your brand’s consumer experience.
I’m excited to moderate an IAPP-hosted webinar, Marketing and Consumer Experience Perspectives to Enhance Your Privacy Program, where we’ll discuss these topics further. I will be joined by Arielle Garcia, Chief Privacy Officer at UM Worldwide, and Stephanie Liu, Privacy and Marketing Analyst at Forrester. Join us to hear more about:
Registration is free, and I hope to see you there.
Marketing & Consumer Experience Perspectives to Enhance Your Privacy Program
February 3, 2022
1:00pm ET/11:00am PT
A first-party cookie is a tiny packet of text that is created and stored by a website which a user is visiting. The data it collects from the visitor is used to track the visitor’s activity in order to collect analytics data, remember user input and preferences (such as log-in details), and perform other functions to improve the browsing experience.
First-party cookies can’t move from one website to another; they can only track user activity on the website they are placed on. By default, websites allow first-party cookies. Otherwise, they won’t be able to identify returning users.
For example, if you are adding items to a shopping cart on an e-commerce site, first-party cookies make it possible to keep those items in your cart even as you navigate between pages, as you would when browsing multiple items. If first-party cookies were disabled, you would have to sign in and manually search for and re-add items to your cart every time you jumped from one page to another.
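To make the shopping-cart example concrete, here is a minimal sketch of how a site might serialize a cart into a first-party cookie and read it back. The cookie name and attributes are illustrative:

```typescript
// Serialize a cart into a first-party cookie string: the cookie is set on
// the site's own domain, so only that site can read it back.
function cartCookie(items: string[]): string {
  return `cart=${encodeURIComponent(items.join(","))}; Path=/; SameSite=Lax`;
}

// Parse the cart back out of a Cookie header or document.cookie string.
function readCart(cookies: string): string[] {
  const match = cookies.match(/(?:^|;\s*)cart=([^;]*)/);
  return match ? decodeURIComponent(match[1]).split(",").filter(Boolean) : [];
}
```

Because the cookie lives on the site’s own domain, it survives page-to-page navigation but is invisible to other websites, which is exactly the scoping behavior described above.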
For more information on managing cookies and how a consent management system can help, contact the privacy experts at Ketch.
Both first-party cookies and third-party cookies track user activity and collect data from consumers. But there are differences in their creation, use, and purpose.
While first-party cookies are created by the website a person is visiting, third-party cookies are installed by other programs that are separate or distinct from the site, which explains the term third-party. These usually come from scripts or tags in online advertisements placed on the site; the ads are neither owned nor controlled by the owner of the website.
Third-party cookies are found on any website that loads a third-party server’s code. This means they can track user activity across multiple websites (even emails and social media platforms) over a long period, whereas first-party cookies live only on one website or domain.
Insights collected from third-party cookies are often random and general, so businesses might find it difficult, and sometimes pointless, to draw conclusions about their audiences from them. First-party cookies are the opposite: they base their information on direct and intentional interactions with users of a business’s website.
Second-party cookies are basically first-party cookies that are used like third-party cookies. Websites that use first-party cookies exchange, sell, or transfer collected information to another business or website through data partnerships.
This data now falls under the category of second-party cookies.
Under the GDPR, businesses must provide users with information on cookies and obtain opt-in consent before activating cookies on their site. Meanwhile, the CCPA mandates that businesses give consumers the option to opt out of the sale of their personal information, which can be collected by cookies enabled on their websites.
In both laws, the definition of “personal information” doesn’t clearly identify first-party cookies as a type of data that must be protected. In some interpretations, first-party cookies fall under the category of session cookies, which websites simply need in order to function and which, therefore, don’t pose a risk to data privacy.
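The GDPR’s opt-in model described above can be sketched as a simple gate, with category names assumed for illustration: nothing beyond strictly necessary cookies runs until the user grants consent.

```typescript
// Consent-gated cookie activation: under an opt-in model, no non-essential
// category may run until the user grants it.
type Category = "necessary" | "analytics" | "advertising";

interface ConsentState {
  granted: Set<Category>;
}

// Strictly necessary (session-style) cookies are generally exempt from
// consent requirements; everything else waits for an explicit grant.
function mayActivate(state: ConsentState, category: Category): boolean {
  return category === "necessary" || state.granted.has(category);
}
```

The design choice worth noting is the default: a category absent from the granted set is blocked, mirroring the opt-in posture the GDPR requires.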
Data privacy laws have begun to impose stricter measures on the use of third-party cookies, leaving businesses to rely on first-party cookies for consumer insight. And this isn’t at all a bad thing.
First-party cookies are actually more valuable to businesses since they draw insights directly from consumers (who are typically the target market already) intentionally engaging with their websites.
The information, then, is much more accurate and relevant. When used correctly, it can provide businesses with information that can help improve the site experience and differentiate the brand from others for a competitive advantage.
The key takeaways are the three types of cookies and how each is regulated by data privacy laws.
First-party cookies are like atoms of information collected from your device, whether that be a laptop or phone, by a website you visit. When that cookie is exchanged or sold to another website, it becomes a second-party cookie. These two types of cookies carry personal information about an individual. Third-party cookies are imported into a website whenever that site loads code from another party’s server; these cookies pick up more general information from visitors.
Both the GDPR and the CCPA require businesses to take certain actions to protect the privacy of the personal information collected from visitors to their websites or apps. This protection includes providing information about website cookies and their purpose, and giving website visitors some control over cookies and/or the use of their personal information.
Every business needs to be familiar with these two pieces of legislation since required compliance extends far beyond the state and region in which they became law.
Cookies are one of the first things that come to mind when discussing the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). Cookies collect information from people, and, under data privacy laws, businesses must inform users about these trackers and obtain their consent before setting them into action.
Since the GDPR is a law originating in the European Union (EU), you may wonder: does GDPR apply to non-EU citizens? Follow the link to see the answer.
After listing all active cookies, you must create a policy that details the purpose of each one, what data they collect, store, use, or sell, and how users can opt-in or opt-out of them. You can find templates for these online. But it’s good to review the regulations set by the GDPR and the CCPA to make sure that everything’s done by the book.
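One way to structure that inventory, with field names assumed purely for illustration, is a record per cookie that the policy text can be generated from:

```typescript
// A per-cookie inventory record that a cookie policy can be built from.
interface CookieEntry {
  name: string;
  purpose: string;
  dataCollected: string[];
  party: "first" | "second" | "third";
  optInRequired: boolean;  // GDPR-style opt-in consent
  optOutOffered: boolean;  // CCPA-style opt-out of sale
}

// Render one human-readable policy line from an inventory entry.
function policyLine(c: CookieEntry): string {
  return `${c.name} (${c.party}-party): ${c.purpose}; collects ${c.dataCollected.join(", ")}`;
}
```

Keeping the inventory structured this way makes it easier to review against both laws: the opt-in flag maps to GDPR consent requirements, the opt-out flag to CCPA.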
Consequently, businesses must inform their consumers about these changes and, if necessary, obtain their consent to allow current data practices to continue.
Data privacy laws generally include guidelines on how privacy policies should be written and delivered, including what they must contain. Businesses must follow these rules, and make any updates the laws mandate, to stay compliant and reduce the risk of legal damages.
Privacy is a team sport requiring all hands -- marketing, legal, IT and HR -- on deck. It is not hard to see why. Adapting to the new privacy landscape -- with its complex new (and ever-changing) laws and consumers’ conflicting desires for both increased privacy and personalization -- requires a company-wide push. But successful collaboration to support a comprehensive privacy compliance program requires stakeholders to coordinate as a team.
It is not productive when stakeholders do not share a common understanding of purpose and the tools to achieve that purpose. This misalignment can result in endless meetings, with compliance achieved slowly, at great cost, and easily undone by legal or policy changes. Ensuring that stakeholders clearly understand the privacy objectives, and the business and technical support necessary to achieve them, removes friction and fosters high-level collaboration. The result is not only legal compliance but a competitive advantage through greater insights derived from responsibly leveraged data. In this article, we’ll explain how to form a collaborative, value-driven privacy program and the best practices that avoid the frustrating technical challenges too many companies struggle with today.
First, realize that while diligent and highly aware legal policy owners are vital, successful engagements involve multiple stakeholders across the organization. Each department brings particular expertise to support a proactive privacy posture.
Responsibilities and contributions of each department include:
The marketing department is a translator between legal and the consumer. Privacy notices, disclosures and preference centers impact user experience and typically occur early in the buyer journey -- upon first visit to a website, for example. Their language, style and timing affect brand perception -- this is especially true where trust and transparency are core brand values. Marketing tunes these messages and builds them into a company’s branding to convey to consumers, with minimal interruption, that it respects their right to privacy.
Privacy programs and policies aren’t documents that just sit on a shelf. Their purpose is to ensure consumer consent and rights are respected, and this requires orchestration across internal and external third-party data systems. Some of IT’s responsibilities include implementing technology that honors the promises made in privacy notices and consumer consent disclosures, as well as adapting website and mobile infrastructure to collect and process data in a compliant manner. Data monetization and data privacy are increasingly necessitating IT input as part of the overall collaborative effort with legal, marketing and business departments. The result: alignment between compliance and growth.
IT contributions typically include:
With the passage of the California Privacy Rights Act (CPRA), starting January 1, 2023, the CCPA employer exemption expires, granting employees in California the same rights that consumers have enjoyed since CCPA passed. This means businesses will need to have systems in place to:
In addition, CPRA provides new rights to both consumers and employees, namely rights to correct personal information and to data minimization and retention limitations. California has been at the forefront of data privacy legislation in the US; others (Virginia, Colorado) have followed suit, and more will undoubtedly follow.
True operationalization of privacy, not just the Hollywood facade, requires buy-in from all departments. Stakeholder collaboration, however, can become stymied without a clear understanding of the necessary legal, compliance, and technical requirements to fulfill the desired objectives.
Using first-generation technologies for privacy compliance, which rely largely on manual and process-driven efforts, and which lack interoperability, triggers a repetitive cycle of small tech fixes to broad enterprise needs with every small business or legal change. Sophisticated, productive collaboration depends on unified technology that adapts easily to change, and is easy to understand, use and deploy by all relevant stakeholders. Programmatic privacy compliance that accounts for these needs is vital to competing in today’s market.
Under the California Consumer Privacy Act (CCPA), which went into effect on January 1, 2020, and the recently approved California Privacy Rights Act (CPRA), which will supersede it come 2023, California residents have the right to opt out of a business selling or sharing any of their personal information.
That means that if you are a for-profit entity with annual gross revenue in excess of $25 million, or one handling the personal information of more than 100,000 California consumers or households, you are required by law to provide a clear and conspicuous way for your customers to opt out. But what exactly does the right to opt out mean, how is it implemented, and how can you ensure your business complies?
What Does It Mean?
When you give customers the option to opt out, it limits the extent to which your company can sell or share a customer’s personal information. Under CCPA/CPRA, personal information is considered any information that identifies, relates to, or could be linked to an individual or household. This includes information like name, social security number, email or IP address, Internet browsing history, product purchases, geolocation data, and professional or employment-related information—essentially any information that is not publicly available via federal, state or local government records. According to Section 1798.140 of the CCPA, personal information also includes any information used to create a customer profile that reflects preferences, characteristics, behavior, or attitude.
The opt-out requirement doesn’t preclude you from collecting personal information in the normal course of doing business. After all, your business needs personal data to fulfill purchases and enable transactions. Opting out just means that you can’t sell or share this information with any other entity—unless it is a service provider that is necessary to perform a business function.
It’s important to note that any disclosure of personal information in exchange for monetary or other valuable consideration is considered a “sale” under CCPA. While often disputed, this broad definition includes the use of third-party advertising and analytics cookies that track a user’s browsing behavior. It does not apply to first-party cookies required to perform essential functions on your website, like remembering which products a customer has placed into an online shopping cart.
How Is It Implemented?
Under CCPA/CPRA, businesses needing to comply must provide two or more methods for submitting requests to opt-out, including an interactive form accessible via a clear and conspicuous “Do Not Sell or Share My Personal Information” link on the business’ homepage. Other acceptable methods include a toll-free phone number, designated email address, forms submitted in person or by mail, and user-enabled privacy controls such as a browser plugins or settings.
One way of providing an opt-out method is an interactive cookie banner on a website that allows users to decline or accept any non-essential cookies that collect personal information. Some banners get a bit more specific and allow users to select only the necessary cookies that enable core functionality, helping improve the customer experience while preventing the sale or sharing of data for marketing analytics or targeted advertising.
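As a rough sketch of what honoring that opt-out means downstream (the purpose labels here are illustrative, not CCPA’s own taxonomy):

```typescript
// Honor a "Do Not Sell or Share" opt-out when deciding whether a given
// disclosure of personal information may proceed.
interface Disclosure {
  recipient: string;
  purpose: "service-provider" | "advertising" | "analytics-sale";
}

function disclosureAllowed(optedOut: boolean, d: Disclosure): boolean {
  // Disclosures to service providers performing a necessary business
  // function remain permitted even after an opt-out.
  if (d.purpose === "service-provider") return true;
  // Anything constituting a sale or share is blocked once the user opts out.
  return !optedOut;
}
```

The sketch captures the key distinction from the sections above: opting out blocks sales and shares, but not the ordinary disclosures your business needs to operate.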
CCPA/CPRA also has more restrictive “opt-in” requirements for children. This means that businesses cannot sell or share personal information for consumers less than 16 years of age without specific affirmative consent, with parental consent required for anyone under the age of 13. Unlike the opt-out option, opting in means that consumers are opted out by default and must take action to opt in. While this is contingent upon the business having knowledge of the age of the consumer, CCPA/CPRA does not allow a business to deliberately disregard a consumer’s age. Any business that targets children would therefore be wise to only use the “opt-in” option or implement a means to identify age to turn off any default selling or sharing of information for anyone under 16.
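The age thresholds above translate into a simple gate. This sketch assumes the business knows the consumer’s age and has no separate opt-out on file for adults:

```typescript
// Age-gated defaults under CCPA/CPRA as described above: under 16 requires
// affirmative opt-in; under 13 additionally requires parental consent.
function maySellOrShare(age: number, userOptIn: boolean, parentalConsent: boolean): boolean {
  if (age >= 16) return true;           // opt-out model applies instead (assumed: no opt-out on file)
  if (age >= 13) return userOptIn;      // 13-15: the minor's own affirmative consent
  return userOptIn && parentalConsent;  // under 13: a parent or guardian must also consent
}
```

Note the defaults flip at 16: above it, selling is allowed until the consumer acts; below it, selling is blocked until someone affirmatively consents.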
How Can You Ensure Compliance?
It is recommended to conduct a thorough data mapping exercise to identify all the ways your business and its systems handle personal information. This can help you determine whether any third-party cookies are enabled on your website or whether any of your data handling constitutes selling or sharing personal information. Even if you think you aren’t selling or sharing personal information, it’s not always as obvious as disclosing data to third-party advertisers: think credit checking, identity verification services, and other cloud-based services. And if you are unknowingly selling or sharing personal information, you’re still liable.
To see just how compliant (or not) your business is with CCPA/CPRA opt-out rights, start with a free assessment of your website at www.privacygrader.com.
Media and entertainment companies pay close attention to cybersecurity; after all, you can’t run a video-streaming service like Netflix or Disney+ without putting top-of-the-line safeguards in place to protect your media assets and ensure continuity of service. But how many media and entertainment companies can say, hand on heart, that they put the same effort into protecting their users’ data and privacy rights?
The irony is that, as the success of M&E-adjacent companies such as Facebook has shown, data itself is rapidly becoming a core asset for media companies. So far, though, few firms are investing sufficiently in the technologies needed to handle an expanding data universe, or to manage compliance in an increasingly demanding regulatory landscape.
At Ketch, we understand the relationship between media and tech. We, the founders, previously built Krux, working with media giants including Time Warner and Meredith to serve over 20 billion page views per month. So I speak with some authority when I say: if media companies don’t fix their tech stacks, and put effective systems in place to manage privacy and data governance, they will face serious problems in the months and years to come.
The golden age of M&E
Ironically, this challenge stems from the fact that media companies are currently enjoying something of a golden age. Increasingly, M&E brands operate across a wide range of platforms — mobile, TV, game consoles, the Web — and an even wider range of jurisdictions. Look at Netflix: whether I’m in Caracas, Caen, or Cincinnati, I can fire up my laptop (or my phone, iPad, Roku box, or just about any other gadget) and instantly access a wealth of streamable content.
People aren’t just accessing media at home: they’re logging on at the gym, in schools and workplaces, during commutes, and even on long-haul flights. But precisely because M&E brands now touch so many parts of people’s lives, they face increasing scrutiny from both consumers and regulators.
People pay real attention to media companies, and have a direct and intimate relationship with them, so data breaches can do real brand damage. After all, if an anonymous ad company exposes my personal data, I’d be annoyed — but if my favorite media company messes up, I’d feel utterly betrayed. When Sony’s PlayStation Network got hacked, exposing 77 million users’ data, it took years for Sony to regain gamers’ trust.
Game on for regulators
Media companies are more likely to be punished by consumers for data breaches, and they’re also first in line to feel the impact of new privacy regulations, because they interact directly with both content and consumers. The entire downstream universe of ads and marketing relies on data and consent information collected by M&E companies, and regulators are paying close attention to the way that media companies meet their obligations to consumers.
Further complicating matters, media companies are seeing regulations such as the GDPR used as an offensive weapon. Data protection laws (and especially data subject rights such as the right to be forgotten) are being used alongside defamation law to raise objections to content, making data management a high-stakes area of special interest for M&E companies.
Media and entertainment brands also face additional regulations regarding their handling of users’ content-viewing and web-browsing histories, their decisions when using interactive content, and so forth. The stakes are especially high for companies that cater to children: parents care deeply about how their children’s information is handled, and many data laws have stringent provisions governing minors’ data.
Media companies must also monitor regulations that govern access to services across multiple jurisdictions. The European Union, for instance, now requires that media subscriptions purchased in one member-state remain valid in other member states, so an Italian Netflix subscriber must also be able to log on to local services when traveling in France or Finland. That requires companies with multinational presences to seamlessly coordinate privacy and consent across their national sites and services.
The fragmentation of data tech
Inevitably, as M&E organizations spread across multiple platforms and jurisdictions, their data-tech solutions grow fragmented. Dealing with a complex world, companies implement complex fixes — and in doing so, they create a hodgepodge of tech solutions incapable of delivering the streamlined, joined-up data infrastructure they need to succeed.
Unlike finance or health companies, where data security has always been a priority, M&E organizations haven’t historically had to worry much about these issues. It’s only now, with the explosion of data applications and the sprawl of global regulatory frameworks, that media companies have found themselves forced to try to evolve new capabilities, and rapidly add in-house data expertise.
To cope, media companies urgently need a new approach: a data-security solution that can sit alongside their existing tech stack, and serve as a single point of truth amidst the complexities of their tech architecture, their vendor ecosystem, and their own corporate structure. After all, media companies are made up of sub-organizations within sub-organizations: companies like Warner, Sony, or Disney aren’t single entities, but rather amalgams of different tech and media operators. Now more than ever, these organizations need a way of ensuring all the different parts of their M&E empires speak the same language when it comes to privacy and consent.
Ketch is the solution
The good news is that help is at hand. Ketch serves as the connective tissue enabling media and entertainment companies to manage data privacy and consent across their sprawling, interconnected business functions.
Here’s how it works: using a simple dashboard, your legal specialists determine how different regulations affect your business, and create policies that define how data can and can’t be used. Ketch then turns those policies into a single unified reference system that developers from across your organization can use to check how any given piece of data can be used.
Your legal team never has to weigh in on specific data-usage questions — they just set the overarching policies, and let Ketch do the rest. And your developers and engineers don’t have to fret about interpreting regulations — they just use APIs and other standard coding call-outs to check whether a given operation is permitted.
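The division of labor described above can be sketched as a policy lookup. All names here are hypothetical illustrations of the pattern, not Ketch’s actual API:

```typescript
// Legal defines rules once; engineers query them before processing data.
interface PolicyRule {
  purpose: string;           // e.g. "email-marketing"
  jurisdictions: string[];   // where this rule applies, e.g. ["EU", "CA"]
  requiresConsent: boolean;
}

function isPermitted(rules: PolicyRule[], purpose: string, jurisdiction: string, hasConsent: boolean): boolean {
  const rule = rules.find((r) => r.purpose === purpose && r.jurisdictions.includes(jurisdiction));
  if (!rule) return false;                     // no matching rule: default deny
  return !rule.requiresConsent || hasConsent;  // consent gate where required
}
```

The important design choice is default deny: if legal hasn’t written a rule covering an operation, the code refuses it rather than guessing.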
That means your legal team can stay focused on interpreting regulations, while your tech team can keep driving innovation forward. Ketch provides only the guide-rails that are truly needed, and stays out of your way the rest of the time — and that leaves you free to do whatever you want with your data, as long as it’s consistent with the regulatory interpretations you’ve implemented.
A richer experience
The benefits aren’t just internal: data managed by Ketch stays protected even when you work with outside parties. Because we extend through your whole vendor ecosystem, integrating both upstream and downstream, your users’ data is kept safe and handled appropriately even when you hand it across to third parties for ad-tech purposes or other permitted applications. That adds up to a seamless, flexible, and robust solution that allows media companies to provide exceptional experiences without compromising on data security.
Because Ketch lets you deliver just-in-time solutions, handling privacy and consent doesn’t detract from the user experience. With Ketch, you can wait until the exact moment that consent is needed (such as the instant of clicking a button or completing a form) before you make the request, rather than interrupting users before they’ve had the chance to start enjoying an experience or interacting with content.
An asset, not a liability
When it comes to privacy and data security, media and entertainment companies are on the front lines, and the stakes have never been higher. Regulators are raising the bar, and there’s simply no room for companies to make mistakes.
Even so, many M&E companies are woefully underprepared. Data is emerging as a vital strategic asset, but also a potential liability. To succeed, media brands urgently need a tech stack that’s able to ensure regulatory compliance without inhibiting innovation and growth. Ketch is the only solution that can cope with the complexity of the media environment, and help companies to manage data while delivering the seamless, engaging experiences their customers demand.
So if you’re trying to figure out how to manage data privacy in the M&E space, get in touch — and find out how Ketch can help you take privacy and consent to the next level.
The Switchbit team is driven by a belief in two key principles. First, privacy is an essential human right that all businesses should have the ability to respect and enforce. Second, data is property. Like land and other physical property, data must be protected and controlled according to the time, terms, and conditions of its owner’s choosing.
We don't see a zero-sum world where consumer privacy is protected and businesses lose. We believe that both consumers and businesses can prosper together. We're determined to help businesses honor the data dignity of their customers, while also giving them the privacy and security tools that let them preserve and unlock the power of data for core operations and AI-enabled business processes.
Since our inception, we’ve been working hard to achieve the radical simplification of data privacy, which we believe is among the most critical imperatives facing our economy and our society. Undeniably, we are in the midst of the Data Rights Revolution.
As tends to be the case with revolutions, optimism and commitment are all mixed up with complexity and confusion. In our experience, most businesses want to embrace and implement a consumer-first privacy paradigm--the question is How to get there?
We are committed to building powerful-but-simple infrastructure that guides our customers through the maze of laws and regulations, while at the same time recognizing and capitalizing on the opportunities along the way--opportunities hiding in plain sight.
Of course you need to achieve compliance and get the details right. But the companies that win this revolution will be those that go beyond, by creating privacy experiences that inspire customer satisfaction and trust. We help you imagine, design, and offer those experiences.
One of the many hurdles here is that the maze isn’t static. It changes as laws are born and evolve. All the energy you spent getting prepared for CCPA and GDPR? Congratulations! Your prize is… CPRA and LGPD! Data privacy is a dynamic challenge. That’s why we’re always focused on giving our customers a dynamic, deploy-once-comply-everywhere solution.
In this relentless pursuit of simplicity in the face of change, we’ll be with you every step of the way. And today, we’re practicing what we preach: we’ve decided our name added more where less would do better. Today, we’re saying goodbye to “Switchbit” and introducing you to “Ketch.” Strong and simple, just like our product and our mission.
Ketch is blazing a path in the Data Rights Revolution. Join us in fighting for privacy as an essential human right, and data as property to be preserved and protected.
And if you don’t know, now you know.
As I previously discussed in Part 1 and Part 2, to defeat COVID-19 we’ll need effective contact tracing — but in order to win widespread buy-in for digital tracing, we must also commit to zealously defending users’ privacy rights.
In Part 3, I’ll map out the path to privacy-preserving and effective contact tracing at scale. Watch the third video explainer, then see below for more.
The good news is that this is a solvable problem, and there’s a way to bring Americans on board as we deploy new contact-tracing apps. The bad news is that when it comes to winning the trust of the public, the teams building tracing apps have a lot of room for improvement.
According to a recent survey, 56% of Americans say they don’t trust tech companies to manage and protect tracing data. That’s 13 percentage points less than those who say they’d trust government health agencies or universities with their data.
Winning hearts and minds is going to be an uphill struggle. That’s why Ketch is calling for a new initiative to develop industry-wide privacy standards — a joint collaboration between tech firms, privacy advocates, health workers, and universities. This would be a foundational step toward genuinely trustworthy contact tracing.
In practice, what does that mean? In addition to developing contact-tracing apps, we must also build digital infrastructure capable of fusing tracing and health data and delivering the actionable insights we need to curb and preempt outbreaks. Crucially, we need to do so in a way that is unimpeachably secure, and that transparently safeguards users’ privacy and data rights.
To achieve this, we need three big things:
#1 Threat Exposure Notification Protocol (TENP)
#2 A commitment to citizen control
#3 A robust privacy standard
The Internet couldn’t exist without the HyperText Transfer Protocol (HTTP), which sets the standard for formatting and transmitting messages online. To collect the data needed to derail coronavirus transmission, we need a similar standard for contact tracing: a Threat Exposure Notification Protocol (TENP) that articulates how data can be gathered, stored, communicated, and shared between authorized stakeholders. Google and Apple’s new software kits empower individual applications, but we need a unified TENP to prevent the fragmentation of data across multiple tracing apps, and to allow analysts and health workers to leverage a unified data-stream as they work to thwart future pandemics.
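To make the analogy concrete, here is a minimal sketch of what a TENP exposure-notification record might look like on the wire. The field names and version string are illustrative assumptions for this post, not part of any published specification:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ExposureNotification:
    """Hypothetical TENP record: every field name here is an assumption."""
    rolling_id: str        # ephemeral, rotating identifier -- never a real identity
    timestamp_utc: int     # when the contact event occurred (Unix seconds)
    duration_s: int        # how long the two devices were in proximity
    signal_strength: int   # Bluetooth RSSI, a rough proxy for distance
    app_id: str            # which tracing app produced the record

    def to_wire(self) -> str:
        # Serialize to a shared format that any compliant app or health
        # agency could parse, regardless of which app produced the record.
        return json.dumps({"tenp_version": "0.1", **asdict(self)})

# Two different apps emitting the same schema means their data can be pooled.
msg = ExposureNotification(
    rolling_id="a3f91c0d",
    timestamp_utc=1589000000,
    duration_s=300,
    signal_strength=-60,
    app_id="example-tracer",
)
wire = msg.to_wire()
```

The point of a shared schema like this is interoperability: records from competing apps land in one analyzable stream instead of fragmenting across incompatible formats.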
In developing a TENP, we need to put end-users in the driver’s seat. Only by empowering citizens can we secure the buy-in that’s needed to trace contacts at scale. That means giving users the power to seek out information on their own terms, and to decide precisely how much of their data to hand over along the way. To be effective, any contact-sharing standard will need to have the user’s Right to Be Forgotten baked into its fundamental structure. We’ll also need to ensure that users can give, withhold, or withdraw consent for the use of their data at a granular level, and also veto the use of their data by authoritarian states such as Russia and China. These are foundational issues that need to be addressed as such, and not reinvented from scratch each time a developer sits down to code a tracing app.
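What granular, revocable consent might look like in practice is sketched below. The purpose names and the `ConsentLedger` interface are assumptions invented for illustration, not a real Ketch or TENP API:

```python
class ConsentLedger:
    """Illustrative sketch: per-user, per-purpose consent with full erasure."""

    def __init__(self):
        self._grants = {}  # user_id -> {purpose: bool}

    def grant(self, user_id: str, purpose: str) -> None:
        self._grants.setdefault(user_id, {})[purpose] = True

    def withdraw(self, user_id: str, purpose: str) -> None:
        # Consent can be withdrawn for one purpose without touching others.
        self._grants.setdefault(user_id, {})[purpose] = False

    def is_allowed(self, user_id: str, purpose: str) -> bool:
        # Default-deny: data may be used only under an explicit, current grant.
        return self._grants.get(user_id, {}).get(purpose, False)

    def forget(self, user_id: str) -> None:
        # Right to Be Forgotten: erase the user's entire consent record.
        self._grants.pop(user_id, None)

ledger = ConsentLedger()
ledger.grant("user-42", "proximity_alerts")   # share proximity data only
# "research_sharing" was never granted, so it stays denied by default.
```

The design choice that matters is default-deny with per-purpose grants: a user can allow exposure alerts while refusing research sharing, and `forget` removes everything at once.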
Finally, we need to codify a privacy standard that is flexible enough to allow researchers and public health actors to fuse potentially chaotic health, location, and proximity data at scale, but also robust enough to give users reliable control over their data. The more we can ensure privacy, the more willing users will be to share their information, enabling not just basic contact tracing but perhaps also the use of more advanced health data, such as biometric data or medical results, to spur the development of new treatments for COVID-19. The potential is enormous, but only if we can give users an ironclad guarantee that nobody’s taking liberties with their data.
At Ketch, we’ve taken the first step towards these interconnected goals by launching the Threat Exposure Notification Protocol (TENP), a new standard for sharing data across contact tracing applications, data providers, public health institutions, and policymakers. Essentially, TENP is a framework that lets tech companies incorporate meaningful and verifiable privacy and data security into their tracing systems, and also gives users, regulators, and health professionals reassurance that tracing is being carried out in an effective and secure way.
Of course, TENP is only the first step. By establishing a clear standard for data sharing, we’re enabling the kind of collaboration and innovation that’s needed — but others, from policymakers to tech companies, will now need to step up and make use of these tools. What’s needed is a groundswell of support for these ideas, driven both by users and by tech companies themselves.
These are early days, and we don’t claim to have definitively solved the privacy challenges inherent in contact tracing. But with TENP, we’ve created a connective framework that the tech industry, the healthcare sector, and policymakers can build on to develop more robust tracing tools. Now, we need collaboration and creative thinking to move the ball forward — and to reassure users, in the United States and all over the world, that it’s safe to share their personal data.
There’s still plenty of work left to do, so if you’re interested in helping us figure out how to realize this vision, please get in touch. Defending privacy while building contact tracing systems at scale is a challenge that’s bigger than any one company or organization — but together, it’s a challenge we can overcome. Our ability to defeat COVID-19, and to prevent similar crises in the future, depends on it.
In Part 1, I explained that for contact tracing to achieve the requisite adoption levels in America, tracing technologies must be accompanied by robust privacy protections.
So what does it take for Americans to lay their personal data on the line in the name of public health? Watch the second video in our series below. Read on for more.
Despite the runaway success of digital contact tracing in places such as Taiwan and South Korea, about 60% of Americans don’t believe digital tracing will help us beat COVID-19. Yet 50% of Americans also say they’d use a contact tracing app if one were available.
Clearly, Americans are skeptical but persuadable. To overcome skepticism and win large-scale buy-in for digital tracing, we’ll need to address three big challenges:
The first challenge lies in gathering data in useful volumes. According to Covid-Watch, tracing technologies must be used by over 50% of a given population in order to be effective. Paradoxically, the current proliferation of tracing apps and technologies makes that goal harder to achieve — unless apps share their data, each new platform further fragments the total data pool.
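A back-of-the-envelope calculation shows why fragmentation is so costly. A contact is only detected if both people in an encounter run the same (or an interoperable) app, so detection probability scales with the square of each app’s adoption share:

```python
def coverage(shares, interoperable):
    """Probability a random contact pair is detected, given each app's
    adoption share (as fractions of the population).

    Illustrative model only: it assumes contacts pair people uniformly
    at random, which real social networks do not."""
    if interoperable:
        # Pooled data: any app-user meeting any other app-user counts.
        total = sum(shares)
        return total * total
    # Siloed data: both people must be on the *same* app.
    return sum(s * s for s in shares)

# One app (or an interoperable standard) covering 60% of the population
# detects roughly 36% of contacts (0.6 * 0.6).
unified = coverage([0.6], interoperable=True)

# Split those same users across two siloed 30% apps and coverage
# falls by half, to roughly 18% (0.3**2 + 0.3**2).
siloed = coverage([0.3, 0.3], interoperable=False)
```

Same total adoption, half the useful data: that is the quantitative case for a shared standard rather than a patchwork of siloed apps.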
Clearly, gathering sufficient data will require careful coordination and data-aggregation between platforms. That’s especially important in a sprawling, geographically and demographically diverse country like the United States. The same network effects that lead kids to use TikTok and old-timers to use Facebook, or prompt Twitter users to coalesce into echo-chambers, could drive different groups to preferentially adopt different tracing apps. Without the ability for those apps to talk to one another, our ability to curb this pandemic is severely limited.
The bottom line is that the coronavirus doesn’t respect our app preferences, our social groupings, or our demographic and geographic divisions. As new genetic testing shows, the virus simply rolls across state lines and national borders, rippling inexorably from one hot spot to the next. To counter that, we’ll need tools that can share data effectively, both with each other and with researchers and health workers.
We need to gather as much data as possible, but to achieve that goal we’ll have to give users the right to opt out of contact tracing, and to delete any data they’ve previously shared. That’s because unless we put users firmly in control of their data, we’ll never achieve a critical mass of registered app users.
At a minimum, anyone who’s sharing personal data through a tracing app should be able to quickly and easily opt out of tracing and delete any data they’ve already shared.
Many contact-tracing solutions ignore these requirements, claiming they’re unnecessary for tools based on Bluetooth-powered proximity detection rather than GPS location tracking. But while the Bluetooth solutions touted by MIT, Apple, and Google are promising, citizens seeking to manage their personal health risk need more than a scary message telling them they’ve been in the proximity of someone who tested positive.
In the wrong hands, information about the people you’ve met can be just as sensitive as data about the places you’ve visited, and users have every right to demand control over how that data is stored and shared. Control works both ways, too: some app users might be happy to freely share both location and proximity data, at least in some circumstances. Rather than forcing users to wait for alarming messages to pop up on their cellphones, we should put them in control, and let them seek guidance on their own terms.
A lot of this boils down to giving people control of their data, and using their information only in ways to which they’ve explicitly consented. That might sound like a no-brainer, but it rubs up against some of the defining challenges of our modern world.
From digital staples such as web search and email to innovations such as contact tracing, we’re utterly reliant on big tech firms such as Apple and Google to build and maintain our digital infrastructure. Necessarily, and discomfitingly, that means trusting those companies to build a neutral infrastructure that serves our collective needs rather than their own corporate goals.
Don’t get me wrong: when it comes to COVID-19, we’re enormously lucky to have Apple and Google fighting in our corner. But there are real privacy concerns that come with the tech titans’ market dominance. The rise of privacy regulations such as the GDPR and the CCPA reflect legitimate concerns about the rise of a largely unregulated data oligarchy.
Tackling these concerns head-on, and building a system that handles privacy properly, should be a shared priority for regulators, users, health workers, and tech companies. We’ll need effective tools if we’re to solve this crisis and future ones. But we’ll also need apps that we can deploy on an enormous scale without sparking a privacy backlash. If we don’t get this right, we could be dealing with the consequences for years to come.
I’m hopeful about our ability to solve these problems and build a contact tracing network that respects people’s rights while delivering the data we need to defeat the coronavirus. In the next post, I’ll outline how we can unite to create the new data and privacy standards needed to win the fight.