Privacy Laws and International Data Exchange: Comparing EU and US Standards


In the data world, millions of records are transferred every second, carrying personally identifiable information, passwords, profile contents, and authorization credentials. This data is collectively protected by a variety of laws that interlock to form a network of often confusing and contradictory limitations and protections.

This confusion is especially apparent in international data transfers, where the policies governing data originating in the United States diverge sharply from those governing data originating in the European Union. It’s important for API developers to understand these differences so that they know both their protections and their responsibilities. These laws apply universally, whether a service is hardcoded into a web server or hosted on the futuristic “cloud”.

Today, we will briefly inspect the fundamental differences in data privacy laws between the two zones, as well as their various legal, economic, and ethical implications for API and application developers transferring data across international lines.

Data Protection in the European Union

Much of the data protection in the European Union arises from the fact that member states of the EU are also signatories of the European Convention on Human Rights. Article 8 of this convention specifically provides for the privacy protections of one’s “private and family life, his home and his correspondence.”

There are two main legal instruments that govern the protection of data within the European Union — the Data Protection Directive 95/46/EC and the e-Privacy Directive 2002/58/EC. Within these, personal data is governed by three separate principles — Purpose, Proportionality, and Transparency.

Purpose

According to EU law, personal data can only be processed for a legitimate purpose, and that data cannot be further processed in any way that is incompatible with the purpose for which it was originally collected.

This means that personal data can only be collected and processed when the situation warrants it. For example, an API developer may request registration information and transfer medical records as long as their API or application handles medical records and uses registration information for authentication purposes — otherwise, there is no specific legitimate purpose for processing the data.

Article 7 of the directive states that personal data may be processed only if:

(a) the data subject has unambiguously given his consent, or
(b) processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract, or
(c) processing is necessary for compliance with a legal obligation to which the controller is subject, or
(d) processing is necessary in order to protect the vital interests of the data subject, or
(e) processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller or in a third party to whom the data are disclosed, or
(f) processing is necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection under Article 1(1).
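
To make this concrete for API developers, here is a minimal sketch, in TypeScript, of gating any processing of personal data on one of the Article 7 grounds listed above. The type names and the helper function are hypothetical illustrations of the idea, not part of the directive or of any standard library.

```typescript
// A minimal sketch: refuse to process personal data unless one of the
// Article 7 grounds has been declared. All names here are illustrative
// assumptions, not a mandated or standard API.

type LawfulBasis =
  | "consent"              // (a) unambiguous consent from the data subject
  | "contract"             // (b) necessary to perform a contract
  | "legal-obligation"     // (c) required to comply with a legal obligation
  | "vital-interests"      // (d) protects the data subject's vital interests
  | "public-interest"      // (e) task carried out in the public interest
  | "legitimate-interest"; // (f) legitimate interests of the controller

interface ProcessingRequest {
  subjectId: string;
  purpose: string;     // the declared, legitimate purpose
  basis?: LawfulBasis; // which Article 7 ground is being relied on
}

function assertLawfulBasis(req: ProcessingRequest): void {
  if (!req.basis) {
    // No ground declared: the data must not be processed at all.
    throw new Error(
      `Refusing to process data for subject ${req.subjectId}: ` +
        `no lawful basis declared for purpose "${req.purpose}".`
    );
  }
}

// Usage: every processing path declares its purpose and legal ground up front.
assertLawfulBasis({
  subjectId: "user-42",
  purpose: "authentication",
  basis: "contract",
});
```

The point of the sketch is simply that the purpose and its legal ground are recorded before any data touches the pipeline, rather than being assumed after the fact.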

Proportionality

According to this principle, data can only be transferred when it is proportional to the purposes for which it was collected and for which it needs to be processed. For instance, an application that processes requests for apartment hunters looking for a new home can request information about income, employment status, and other non-protected items of data, but cannot ask for religious affiliation, medical status, or race, amongst other protected items.

In one of the most striking differences from US privacy law, data collected for marketing purposes in the EU must come with an opt-out that consumers can exercise at any time. That is, a consumer must be able to demand that data collection cease whenever advertising is involved, much like the standard unsubscribe link at the end of modern email campaigns.
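
As a rough illustration of proportionality in practice, the sketch below shows a hypothetical apartment-hunting API that accepts only the fields needed for its purpose, silently drops protected categories, and honors a standing marketing opt-out. The field names, the whitelist, and the helper functions are all assumptions made for illustration.

```typescript
// Only collect fields that are needed for the declared purpose, and never
// the special/protected categories. Everything named here is illustrative.

const ALLOWED_FIELDS_FOR_TENANT_SCREENING = new Set([
  "fullName",
  "income",
  "employmentStatus",
  "desiredMoveInDate",
]);

const PROTECTED_FIELDS = new Set(["religion", "ethnicity", "medicalStatus"]);

function filterProportionate(payload: Record<string, unknown>) {
  const accepted: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(payload)) {
    if (PROTECTED_FIELDS.has(key)) {
      // Protected categories are dropped outright, even if the client sends them.
      continue;
    }
    if (ALLOWED_FIELDS_FOR_TENANT_SCREENING.has(key)) {
      accepted[key] = value;
    }
  }
  return accepted;
}

console.log(filterProportionate({ income: 52000, religion: "n/a" }));
// -> { income: 52000 }  (the protected field is never stored)

// Marketing data additionally needs a standing opt-out that can be exercised
// at any time, e.g. a flag the consumer can flip via an unsubscribe endpoint.
interface MarketingProfile {
  email: string;
  optedOut: boolean;
}

function canSendCampaign(profile: MarketingProfile): boolean {
  return !profile.optedOut;
}
```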

Transparency

The entity from whom data is collected, known in the law as the “data subject”, must be informed when their data is collected and how that data is going to be processed. This could mean trouble for a US-based app with recognition APIs that, unbeknownst to the user, collects facial gestures or eye movements on screen to augment its advertising efforts.

Additionally, the collector of the data, known in the law as the “data controller”, must provide their name, address, processing and collection purpose, recipients of data, and any further data that is required to establish that the processing of the data is both proportional and legitimate.
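
One way an API might surface this information is to publish the controller details alongside each collection endpoint, so the data subject can be informed at the point of collection. The sketch below is a hypothetical example; the structure and field names are assumptions, not a mandated format.

```typescript
// A minimal sketch of the transparency requirement: the controller's identity,
// purpose, and recipients are declared next to the collection endpoint.
// All values below are fictional placeholders.

interface ControllerNotice {
  controllerName: string;
  controllerAddress: string;
  purpose: string;
  recipients: string[];  // who the data will be disclosed to
  retentionDays: number; // extra detail that also supports proportionality
}

const registrationNotice: ControllerNotice = {
  controllerName: "Example Health App Ltd.",
  controllerAddress: "1 Example Street, Dublin, Ireland",
  purpose:
    "Account authentication and transfer of the user's own medical records",
  recipients: ["The user's chosen clinic"],
  retentionDays: 365,
};

// Serve the notice from a documented endpoint and reference it in the consent
// dialog shown before any data is actually collected.
console.log(JSON.stringify(registrationNotice, null, 2));
```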

Data Protection in the United States

When data privacy was first established in the European Union, it began as a hodgepodge of industry-specific protections. Unfortunately, the United States never moved past this stage; its current data privacy climate remains a patchwork of laws passed as specific industries required them.

In fact, the goal of US privacy law seems to be more focused on the efficiency of data flow rather than the sum-total protection of private data from unauthorized use and access:

In general terms, in the U.S., whoever can be troubled to key in the data, is deemed to own the right to store and use it, even if the data were collected without permission. For instance the Health Insurance Portability and Accountability Act of 1996 (HIPAA), the Children’s Online Privacy Protection Act of 1998 (COPPA), and the Fair and Accurate Credit Transactions Act of 2003 (FACTA), are all examples of U.S. federal laws with provisions which tend to promote information flow efficiencies.

Though several significant attempts have been made to establish the privacy of an individual’s data, these attempts are largely industry-specific. The aforementioned Health Insurance Portability and Accountability Act (HIPAA), for example, specifically establishes procedures for health information collection, disclosure, use, and the inherent rights therein.

Likewise, the Electronic Communications Privacy Act (ECPA) attempted to establish certain protections of private data over electronic communication systems; unfortunately, the large number of loopholes therein renders these rights effectively null, as consent can be given either directly, as in the case of a data request, or implicitly, as when an employee uses email that is then monitored by their company.


International Data Exchange

With such disparate systems, much of international data exchange is confusing and undefined. The rules for how the United States handles European Union client data, for example, are very different from how the European Union handles data originating in the United States. Much of the protection that developers and users assume they enjoy abroad may not be as robust as they first suspected.

According to Advocate General Yves Bot of the Court of Justice of the European Union, the way US companies handle data does not meet the EU requirement that third countries provide adequate, matching privacy protections for transferred data. This stands in contrast to July 2000, when the European Commission ruled that the Safe Harbor Privacy Principles agreement provided adequate and equal protection.

After Edward Snowden’s revelations that the US, through the NSA, was spying on data held by companies such as Facebook, international data law was quickly brought to the forefront, with Austrian citizen Maximilian Schrems bringing a complaint that reached the High Court of Ireland.

The case accused Facebook of failing to follow Irish and European privacy law, which its international headquarters in Ireland was required to observe. It challenged international data privacy rights in a way that pitted the relatively lax US regulations on security against the comparatively strict rules of the European Union.

In response to these allegations, a recent ruling from the European Court of Justice (ECJ) has far-reaching consequences for Facebook and other international digital entities, essentially decreeing that a country’s data privacy laws have jurisdiction over digital exchange in its territory, regardless of where a company is headquartered. As The Guardian describes:

“The ECJ ruled Thursday that if a company operates a service in the native language of a country, and has representatives in that country, then it can be held accountable by the country’s national data protection agency despite not being headquartered in the country.”

What This Means for APIs

Now that we understand a little about data privacy history and the current climate, what does all of this mean for API developers? It means that the burden of understanding the law, the requirements placed on data holders, and the specific data collection activities allowed under each law falls squarely on the developer.

Rights are implicit, and generally don’t have to be claimed in order to be exercised. This means that API developers in the EU who wish to develop for both US and EU consumers must be aware that their data may not be protected at the same level as if the service were limited entirely to the EU. Likewise, US developers must know that their relatively lax security standards may not be acceptable for customers in the EU, and may even be illegal.

Federal law in the United States may also mandate the sharing of information when national security is involved, through the Patriot Act, legislation passed after the September 11th attacks. Consequently, data that may be considered private or protected in the EU may at any time be accessed by federal agencies such as the NSA when national security is perceived to be threatened by such data.

For developers, much of this privacy can be controlled at the development level. When designing a platform or service, developers must always secure their most vulnerable layers. Likewise, different levels of security throughout an environment may have to be scaled for certain industries (for instance, HIPAA requires a far higher standard of privacy than e-commerce). Even the types of internal security within an organization may have to comply with privacy laws in the US and EU before the data ever leaves your servers.

The levels of data privacy for each region must be considered early in development. Even something as simple as a variation in the types of data metrics and analysis employed may result in an API that is entirely legal in the United States but entirely illegal in the EU.
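
One way to keep those regional differences visible from day one is to drive data handling from a per-region policy object rather than hard-coding a single behavior. The sketch below is a hypothetical illustration; the regions, the policy fields, and the suggestion that any particular metric is lawful in a given region are assumptions, not legal guidance.

```typescript
// A minimal sketch of region-aware data handling: the metrics pipeline
// consults a policy object before recording anything. Values are illustrative.

type Region = "EU" | "US";

interface PrivacyPolicyConfig {
  allowBehaviouralAnalytics: boolean; // e.g. gesture or eye-movement capture
  requireExplicitConsent: boolean;
  marketingOptOutRequired: boolean;
}

const POLICIES: Record<Region, PrivacyPolicyConfig> = {
  EU: {
    allowBehaviouralAnalytics: false,
    requireExplicitConsent: true,
    marketingOptOutRequired: true,
  },
  US: {
    allowBehaviouralAnalytics: true,
    requireExplicitConsent: false,
    marketingOptOutRequired: false,
  },
};

function analyticsEnabled(region: Region): boolean {
  return POLICIES[region].allowBehaviouralAnalytics;
}

// The same API can then serve both regions without baking one region's
// assumptions into the code path.
const currentRegion: Region = "EU";
if (analyticsEnabled(currentRegion)) {
  console.log("Behavioural metrics may be recorded for this region.");
} else {
  console.log("Behavioural metrics are disabled for this region.");
}
```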

Example Scenario

In 2013, the US government demanded that Google hand over its private SSL encryption keys. In coverage of the unprecedented request, Jennifer Granick, director of civil liberties at Stanford University’s Center for Internet and Society, said:

“One of the biggest problems with compelling the private key is it gives you access to not just the target’s communications, but all communications flowing through the system, which is exceedingly dangerous.”

Unfortunately for Google, it operates internationally, meaning it is subject to both US and EU law. EU law requires that encryption standards be enforced for private data, especially when that data is transferred between member states and non-member states.

Accordingly, the clash of EU and US law creates a huge headache for Google, which is being pressured to simultaneously release and protect its security keys; if those keys are released, all traffic, including EU traffic, will be open to US eyes.
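
For developers caught between such regimes, one modest safeguard is to refuse to export personal data over an unencrypted channel in the first place. The sketch below is a hypothetical illustration; the helper names and the simplistic notion of a “destination region” are assumptions, not a statement of what EU law actually requires.

```typescript
// A minimal sketch: block any export of personal data that would leave the EU
// over an unencrypted connection. All names and region codes are illustrative.

const EU_REGIONS = new Set(["DE", "FR", "IE", "SE"]); // illustrative subset

interface TransferRequest {
  destinationRegion: string; // e.g. "US", "DE"
  url: string;
  containsPersonalData: boolean;
}

function assertTransferAllowed(req: TransferRequest): void {
  const leavesEu = !EU_REGIONS.has(req.destinationRegion);
  const encrypted = req.url.startsWith("https://");
  if (req.containsPersonalData && leavesEu && !encrypted) {
    throw new Error(
      `Blocked transfer to ${req.url}: personal data leaving the EU ` +
        `must be encrypted in transit.`
    );
  }
}

// Usage: the check runs before any outbound call carrying personal data.
assertTransferAllowed({
  destinationRegion: "US",
  url: "https://api.example.com/records",
  containsPersonalData: true,
});
```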

To be clear, these rules are actively enforced, yet they remain under heavy consideration and debate. EU courts are currently weighing whether data transfers to US interests are protected by the safe harbor exception, and their decision could impact international API development and interoperation. Much of what the safe harbor doctrine has shielded, including the infamous activities of both the NSA and GCHQ, is technically illegal under EU law according to the EU Parliament.

Conclusion

Any service extending to the EU must adhere to both US and EU laws, and vice versa: a tricky and complex situation to find oneself in. The issue is far too complex to summarize in a single piece, and the responsibility for understanding privacy requirements across the globe ultimately falls on developers themselves. Understanding the local rights and laws governing the sharing of personal data, especially internationally, can matter even more than the choice of programming language or architecture ever will.

Speak up!

As we have a distributed international audience, we thought this would be an interesting topic to explore. Has your company been impacted by issues of conflicting international data privacy laws? Please share below!