Three takeaways from 2020 that give us reason to be optimistic about the future of data privacy and security

2020 has been unprecedented in so many ways: a global pandemic, wildfires, tectonic shifts in the world of politics and social justice, to name just a few.

It was also a big year for data privacy and security. 

We saw three significant trends that have the team at Gazelle Consulting feeling optimistic about the future of data privacy and security:

1. The coronavirus pandemic forced many organizations to shift operations almost entirely online, accelerating the growth of the telehealth industry.

We’re seeing entire industries fundamentally change how they do business. New industries are opening up and much-needed safeguards and accountability are being put into place to protect consumer privacy and security:

As our lives moved online, the need for reliable video conferencing tools became painfully apparent. Technology companies responded to this demand, delivering an array of products in that space, the most notable of which was Zoom.

  • The Office for Civil Rights (OCR) put a hold on HIPAA penalties related to good faith use of telehealth until the end of the COVID-19 state of emergency. This allowed healthcare providers to continue to have face-to-face interactions with patients on non-HIPAA-compliant platforms like Zoom, Skype, and Google Meet while new telehealth technologies are developed.
  • Zoom took a tremendous amount of heat over its lax privacy and security practices and has committed itself to focusing on security for the next nine months.

2. There’s been a marked increase in concern about privacy and security issues — and a growing recognition that privacy is a civil rights issue.

For many Americans, privacy can be an abstract concept.

As the pandemic forced us to move our lives online, people became increasingly concerned about what was being recorded about them. There is growing recognition that deeply personal details are at stake.

Netflix’s “The Social Dilemma” amplified this conversation. The documentary, which featured interviews with former employees of big tech companies, highlighted the degree to which we can be manipulated with information that any individual might think is benign.

We saw a public backlash over surveillance around protests and a heightened awareness that geolocation data and facial recognition technology have tremendous potential for misuse.

In the U.S. we have some rights that govern our data, but they apply only to certain subsets of regulated data, like health information and credit card numbers. That exposes all of us, especially those in our community who are most vulnerable, to warrantless gathering of our unprotected information.

“On a fundamental level people know that this is wrong. Because this is all so unprecedented we have the opportunity to do something about it — but as soon as this is normalized, will we ever have this much motivation to solve it again? I don’t know.” — Christina Glabas, Principal, Gazelle Consulting

3. The laws are catching up to technology. 

Technology and data proliferation have quickly outpaced the existing legal frameworks aimed at protecting consumer privacy — but the laws are catching up.

Regulators are learning how to understand artificial intelligence (AI), curb some of the negative impacts of it, and hold businesses accountable. 

The California Privacy Rights Act (CPRA), for example, created the nation’s first regulatory agency to focus exclusively on consumer privacy. The California Privacy Protection Agency will begin enforcing the CPRA on July 1, 2023.

The CPRA also extended protections to categories of data that were not explicitly covered by existing laws, such as biometric data processed by AI and the inferences AI can draw about individuals from data sets that wouldn’t otherwise be considered sensitive, like an individual’s browsing history. The CPRA also regulates dark patterns, cross-context behavioral advertising, and profiling.

Policy is also taking shape at the federal level, driven by legislators like Senators Cory Booker (D-NJ) and Ron Wyden (D-OR) and Representative Yvette Clarke (D-NY), who introduced the Algorithmic Accountability Act in 2019.

How will a Biden Administration respond to issues of data privacy and security? That remains to be seen, but we have reason to be optimistic.
So what next?

If 2020 has taught us anything about data privacy, it’s just how vulnerable we are — and that there has never been a more critical time for the U.S. to adopt a national privacy law. 

A national law — similar to California’s CPRA — would enable us to continue doing business on a global stage.

When it comes to safeguarding consumer information, the U.S. is currently falling far behind global expectations. A national privacy law would be a critical first step toward correcting that.
