COVID-19: Triggering Event for Reassessing Risk and Adequacy of Data Privacy and Security Controls

This post was co-authored by Christopher J. Bender, Northcross Group.  Mr. Bender will be joining me on a panel at the Practising Law Institute’s December 2020 virtual seminar, “Fundamentals of Privacy Law 2020.”

COVID-19 measures have driven workplace operations beyond what most businesses ever planned for.  The constructs of a remote workforce and virtual interactions between teams and customers will be major components of how business is conducted for the foreseeable future.  Many businesses are even seeing potential benefits from these arrangements, such as savings on physical office space and productivity no longer lost to travel, and are looking to make them permanent.

With most people working from home networks and using a combination of company and personal devices, the ecosystem of data and processing has significantly expanded and changed.  The effectiveness of tools and methods to monitor and identify threats is different within this new operating model.  As a result, organizations must reevaluate their cybersecurity and privacy risks and controls.

Maintaining Public Trust in State Courts: Why Privacy Matters

This post is part four in a series examining privacy and transparency issues in the context of public access to digital court records, building on my essay “Digital Court Records Access, Social Justice and Judicial Balancing:  What Judge Coffin Can Teach Us.”

Trust is a precious commodity.  Our social interactions, as well as our relationships with businesses and other organizations, including government agencies, are dependent upon it.

Although trust can be defined in various ways, at its core, according to scholars, “[t]rust is a state of mind that enables its possessor to be willing to make herself vulnerable to another – that is, to rely on another despite a positive risk that the other will act in a way that can harm the truster.”

People disclose more when they trust.  When they believe the other party is trustworthy, they are more likely to share information about themselves.


Rolling the Dice: How Not to Protect Privacy

This post is part three in a series examining privacy and transparency issues in the context of public access to digital court records, building on my essay “Digital Court Records Access, Social Justice and Judicial Balancing:  What Judge Coffin Can Teach Us.”

Maine state courts plan to start rolling out a new electronic filing and case management system by year’s end.  When it’s fully operational, approximately 85% of the electronic documents submitted by filers will be immediately accessible for public inspection upon acceptance by the court.  With a few exceptions (e.g., certain adoption, child protection, juvenile, and mental health civil commitment records), all criminal, civil, and traffic infraction records will be publicly available. Some records will be accessible only at the courthouse, while others will be accessible to the public both remotely (via the internet) and at the courthouse.

Foolhardy as it seems, the SJC plans to leave it to filers to serve as

The Data De-Identification Spectrum

This post is part two in a series examining privacy and transparency issues in the context of public access to digital court records, building on my essay “Digital Court Records Access, Social Justice and Judicial Balancing:  What Judge Coffin Can Teach Us.”

Given the significant risk of harm to individuals stemming from data re-identification, it is imperative that the SJC account for data identifiability in determining which information in court records will be made accessible to the public through its soon-to-be-launched digital system.  Data identifiability is a factor wholly separate and distinct from the sensitivity of data.  It exists in varying degrees along the de-identification spectrum.

In its 2012 report, “Protecting Consumer Privacy in an Era of Rapid Change,” the FTC concluded that the process of removing personally identifiable information (PII) from data is not a silver bullet, acknowledging the broad consensus that “the traditional distinction between PII and non-PII

Redaction and Re-identification Risk

This post is part one in a series examining privacy and transparency issues in the context of public access to digital court records, building on my essay “Digital Court Records Access, Social Justice and Judicial Balancing:  What Judge Coffin Can Teach Us.”

In its proposed electronic court records access rules, the Maine Supreme Judicial Court (SJC) imposes on litigants new and extensive filing obligations, including requiring litigants to redact certain categories of sensitive personal information.

Regardless of what one might think about the wisdom of placing this burden on litigants, it is important to ask what the SJC hopes to achieve by this requirement.  Even assuming full compliance, which is doubtful, redaction as a de-identification technique, without more, would be wholly inadequate to protect the privacy of Maine citizens.
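The linkage risk that makes redaction alone inadequate can be sketched in a few lines of Python.  This is a toy illustration only: the field names, records, and datasets below are hypothetical and are not drawn from any actual court filing or rule.  Even after direct identifiers (name, Social Security number) are redacted, the quasi-identifiers that remain can be matched against a separate public dataset to recover an identity:

```python
# Toy illustration: redacting direct identifiers (name, SSN) does not
# prevent re-identification when quasi-identifiers remain.  All data and
# field names here are hypothetical.

# A "redacted" court record: name and SSN removed, but date of birth,
# ZIP code, and gender -- classic quasi-identifiers -- are left intact.
redacted_record = {"dob": "1975-03-14", "zip": "04101", "gender": "F"}

# A separate, publicly available dataset (e.g., a voter list) that
# includes names alongside the same quasi-identifiers.
public_dataset = [
    {"name": "Jane Doe", "dob": "1975-03-14", "zip": "04101", "gender": "F"},
    {"name": "John Roe", "dob": "1982-07-02", "zip": "04102", "gender": "M"},
]

def reidentify(record, dataset, keys=("dob", "zip", "gender")):
    """Return every dataset entry that matches the record on all quasi-identifier keys."""
    return [row for row in dataset
            if all(row[k] == record[k] for k in keys)]

matches = reidentify(redacted_record, public_dataset)
if len(matches) == 1:
    # A single match means the "anonymous" record has been re-identified.
    print("Unique match found:", matches[0]["name"])
```

When the combination of quasi-identifiers is unique within the outside dataset, as it is here, the redacted record resolves to exactly one named individual, which is precisely the re-identification scenario the research literature describes.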

In today’s big data world, given the sophistication of data handlers, it is well-recognized that de-identification alone is not enough to prevent re-identification of

EU-U.S. Privacy Shield – Its Origins and the High Bar It Must Meet

To predict the Privacy Shield’s future, it’s helpful to recall its origins and to understand the high bar it must meet – namely, ensuring “an adequate level of protection” under the Data Protection Directive.

As to its origins, because the Commission had not recognized the United States as having adequate protection, in 2000 the EU and the U.S. were forced to come up with mechanisms to enable companies to continue to transfer personal data from the EU to the U.S. The Safe Harbor framework, blessed by the Commission in an adequacy decision (the “Safe Harbor Decision”), was one such mechanism.

Under the Safe Harbor framework, U.S. companies were able to self-certify through the DOC that they adhered to the privacy principles set forth in the Safe Harbor Decision. Before the framework was invalidated in 2015, more than 4,000 U.S. businesses, including Facebook, had self-certified under it.  Significantly, as

EU-U.S. Privacy Shield – What’s Its Future?

It’s been almost one year since the EU-U.S. Privacy Shield (Privacy Shield) came into existence.  Its upcoming annual review in September by the European Commission (Commission) and the U.S. Department of Commerce (DOC) – its first such review – is being viewed by many as a pivotal test for the framework.  Success will boost confidence in the Privacy Shield’s durability, a point of vulnerability often cited by its critics. Even if it passes, however, the Privacy Shield is likely to continue to face challenges going forward.

Thus, for U.S. companies presently considering self-certification, the timing is right to ask whether the Privacy Shield is here to stay and, if so, how it might change going forward. To answer these questions, I think we need to recall the Privacy Shield’s origins and the context in which it arose, as well as fully understand its requirements and what compliance entails.

At the same time, it also is important for U.S. companies to consider the

Internet Privacy – ISP Snooping and U.S. Surveillance Laws

It’s hard to imagine a world in which the U.S. Postal Service is permitted to peer inside our personal mail, or gather and track the address and other data we place on our mail, and then use and sell what it learns about us.

Yet, when it comes to our web browsing activities and electronic communications, isn’t that what Internet Service Providers (ISPs) are now lawfully able to do as a result of the U.S. government’s recent action overturning the FCC’s privacy rules?

The Electronic Communications Privacy Act (ECPA) puts some privacy limits on what ISPs can do. But the question is, are they sufficient based on what we know today?  Let’s look at some of those privacy limits, and you be the judge.

The ECPA, enacted in 1986, long before the Internet, e-mail and the vast array of other technologies we use today entered everyday life, is the primary federal surveillance law applicable to

Internet Privacy – What the U.S. Can Learn from the European Union

With respect to Internet privacy, as a result of recent U.S. government action, Americans now have less protection and are more at risk of government surveillance and potential misuse of their personal information, as compared with citizens of the European Union (EU).

By overturning the FCC’s privacy regulations and stripping the FCC’s authority to implement similar privacy regulations in the future, the U.S. government has created an enormous Internet privacy regulatory void. As a result, there now appear to be no federal regulatory limitations on the types of personal information Internet Service Providers (ISPs) can collect, use and disclose regarding their subscribers’ Internet activities, and no obligations imposed on ISPs with respect to data retention, data protection or breach notification.

This regulatory void in the U.S. contrasts sharply with EU law, which generally prohibits ISPs from using or disclosing any personal information without the opt-in consent of their subscribers. Under the EU’s new General

Filling the Void in Internet Privacy: Time to Turn to the Courts (Again)

Now that the U.S. government has overturned the FCC’s privacy regulations, are courts more likely to step in to protect the Internet privacy rights of individuals?

More specifically, how will courts respond when an Internet Service Provider (ISP) divulges to law enforcement the content and details of a subscriber’s Internet activity without obtaining a search warrant, despite law enforcement having complied with the judicial process set forth in the Electronic Communications Privacy Act (ECPA), in particular, the Stored Communications Act (SCA)? Will courts require a search warrant even though the SCA does not require one?

If the past is any indication, I anticipate that an increasing number of federal and state courts, when faced with this question, will find that individuals have a “reasonable expectation of privacy” in the content and details of their Internet activity and that they will prohibit the government from obtaining warrantless access to such information under applicable constitutional law. The constitutional law could