Lock It Up (Encrypt)

In honor of National Cybersecurity Awareness Month, we’re sharing our top practical tips for small businesses to keep their data secure.  Tip #1 is encryption.  The National Institute of Standards and Technology (NIST) defines encryption as “the process of transforming plaintext into ciphertext using a cryptographic algorithm and key.”  In plain terms, encryption is the process of securing data by using a digital lock and key. 

The premise behind encryption is pretty simple.  If you want to keep private papers from prying eyes, how would you do it?  You could put the papers in a safe.  Only someone who knows the combination to the safe can open it and access the papers inside.  Encryption does the same thing to data, except using digital methods.  It essentially “locks” data by scrambling it so that it becomes unintelligible to anyone who doesn’t have the “key” needed to unscramble it.  Scrambled data is useless to anyone who can’t unscramble it, so it doesn’t matter if the encrypted data falls into the hands of a hacker or is released to the public in a data security breach.  Data that looks like gibberish isn’t very useful.
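The lock-and-key idea can be shown in a few lines of code.  The sketch below is a toy one-time-pad cipher meant only to illustrate the concept of scrambling data with a key; real-world systems should use vetted algorithms (such as AES) through an established cryptography library, never homemade ciphers like this one.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR each plaintext byte with the corresponding key byte.
    # Without the key, the output looks like random gibberish.
    assert len(key) >= len(plaintext), "key must be at least as long as the message"
    return bytes(p ^ k for p, k in zip(plaintext, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse: applying the same key unscrambles the data.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"Confidential payroll records"
key = secrets.token_bytes(len(message))   # the digital "combination to the safe"

locked = encrypt(message, key)            # unintelligible without the key
unlocked = decrypt(locked, key)           # original data restored
```

Anyone who intercepts `locked` but not `key` sees only noise, which is exactly why breach statutes treat encrypted data differently.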

Understanding this principle is the key to minimizing legal liability under data privacy laws.  Take Hawaii’s data breach notification law, for example.  The breach notification requirements of Hawaii Revised Statutes section 487N-2 apply when a “security breach” has occurred.  The term “security breach” refers to “an incident of unauthorized access to and acquisition of unencrypted or unredacted records or data containing personal information where illegal use of the personal information has occurred, or is reasonably likely to occur and that creates a risk of harm to a person.”  Did you catch the reference to “unencrypted” records?  If the data accessed and acquired in a breach incident is encrypted, then a “security breach” did not happen for purposes of HRS § 487N-2, and compliance with the breach notification requirements of the statute is unnecessary.

The California Consumer Privacy Act (CCPA) that will take effect on January 1, 2020 is another example.  A business can be sued by a consumer whose “nonencrypted or nonredacted personal information” is subject to unauthorized access and is copied, transferred, stolen, or disclosed due to the business’s failure to use reasonable security procedures.   Want to reduce exposure to private lawsuits under the CCPA?  Encrypt consumer data.

The General Data Protection Regulation (GDPR) isn’t quite as black-and-white in carving out liability for encrypted data, but the law certainly incentivizes encryption.  For example, Article 34 of the GDPR provides a safe harbor from the data breach notification requirements where “the controller has implemented appropriate technical and organizational protection measures, and those measures were applied to the personal data affected by the personal data breach, in particular those that render the personal data unintelligible to any person who is not authorized to access it, such as encryption.”  (Emphasis added.)  While encryption won’t guarantee exemption from the GDPR’s data breach notification requirements, failure to encrypt data almost certainly would trigger them.

It should be fairly obvious by now that encrypting sensitive data is a highly recommended, if not mandatory, cybersecurity measure.  How encryption fits into your cybersecurity program depends on your organization’s IT system, the type of data at issue, operational needs, and cost, among other factors.  Encryption can be deployed at different stages of the data lifecycle.  Encryption can also be paired with other data security practices such as pseudonymization and anonymization.  Consult a cybersecurity expert and privacy lawyer to determine how best to use encryption to secure your data and minimize legal liability.
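To make the pseudonymization idea concrete, here is a minimal sketch using a keyed hash (HMAC-SHA256) to replace a direct identifier with a stable pseudonym.  The key name and record fields are hypothetical; the point is that records stay linkable for business use while the real identity can’t be recovered without the secret key, which should live in a proper key management system.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice, store and manage this securely.
PSEUDONYM_KEY = b"replace-with-a-securely-stored-secret"

def pseudonymize(identifier: str) -> str:
    # A keyed hash maps the same identifier to the same pseudonym,
    # so records remain linkable, but the mapping cannot be reversed
    # without the secret key.
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "purchase": "laptop"}
safe_record = {
    "customer_id": pseudonymize(record["email"]),  # pseudonym, not the email
    "purchase": record["purchase"],
}
```

Note that under the GDPR, pseudonymized data is still personal data; it reduces risk but does not remove the data from the law’s reach the way true anonymization can.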

A sea change in data protection law in the European Union (EU) is about to take place, and your organization doesn’t have to be based in the EU to feel its impact.  The General Data Protection Regulation (GDPR) will take effect on May 25, 2018.  The GDPR applies not just to EU Member States, but also to U.S. organizations with EU-based employees.  Any U.S. organization that has a branch, office, affiliate, franchise, or agent based in the EU should check whether it must comply with the GDPR.  Failure to comply with the GDPR can lead to fines of up to 20 million euros or 4% of annual global turnover (revenue), whichever is higher.

The GDPR regulates how “personal data” of EU citizens is collected, stored, processed, and destroyed.  The GDPR definition of “personal data” has a broader meaning than how U.S. laws usually define the term.  In addition to typical identifying information (e.g., name, address, driver’s license number, date of birth, phone number, or email address), “personal data” under the GDPR includes more expansive categories of data such as salary information, health records, and online identifiers (dynamic IP addresses, cookie identifiers, mobile device IDs, etc.).  The GDPR also provides heightened levels of protection for special categories of employee data, including racial and ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, data concerning an employee’s health, sex life, or sexual orientation, and biometric and genetic data.

The GDPR has wide-ranging effects on data collection, use, and retention.  Some of the data practices regulated by the GDPR include:

  • Data processing – Consent is one legitimate basis for processing personal data of employees, but the GDPR requires that consent be freely given, specific, informed, and revocable. This means most blanket consent provisions typically found in employment contracts are not valid.  If obtaining consent according to GDPR requirements isn’t practical, an employer might need to rely on other legal bases for processing employee data.  Processing employee data is lawful if it is necessary for the performance of the employment contract, required by law, or in the employer’s legitimate interests, which must outweigh the general privacy rights of employees.
  • Employee monitoring – The GDPR limits what employers may do with data obtained through employee monitoring.
  • Notification – The GDPR specifies what information employers must include in notices informing employees about the kind of personal data that will be collected from them.
  • Right to be forgotten – Under certain circumstances, data subjects have the right to require data controllers to erase their personal data.
  • Data portability – A person is entitled to transfer their personal data from one electronic processing system to another without being prevented from doing so by the data controller.
  • Data breach – The GDPR governs the procedures and substantive requirements for giving notification of a personal data breach.

Now is the time to revisit your employment contracts and policies with privacy counsel to ensure compliance with the GDPR.

The New York Times recently reported that Hillary Rodham Clinton used a personal email address for work and personal matters while she served as Secretary of State. Many employees could probably appreciate why Ms. Clinton chose to use a private email address for work purposes. She enjoyed the convenience of carrying one mobile device instead of two. That’s the same reason the Bring Your Own Device movement has been rapidly gaining momentum.

The convenience of commingling professional and personal online accounts comes at a price. One danger is unauthorized disclosure of confidential information.  Work-related information stored in an employee’s personal online account is not subject to security measures like firewalls, anti-virus software, and metadata scrubbing programs. Private online accounts may be vulnerable to cyberattacks, putting the confidentiality of their contents at risk. While such records might not concern national security matters as in the Clinton controversy, they could contain personnel information, medical history, or trade secrets, the disclosure of which could violate laws like HIPAA or recordkeeping obligations under the Sarbanes-Oxley Act, not to mention hurting a company’s competitive edge or creating a public relations debacle.

Another risk is noncompliance with recordkeeping policies. Work rules dictating how long work files are kept before they’re disposed of help organizations manage the task of responding to information inquiries like discovery requests in litigation. In some jurisdictions, an organization’s failure to produce a document in discovery because it was destroyed in compliance with the organization’s document retention policy generally is not considered unlawful destruction of evidence. (Note: Hawaii’s court rules were amended this year to recognize such a defense.) But spotty enforcement of a document retention policy could destroy that defense. Popular ways of transferring work files include forwarding them to a personal email address or uploading them to a personal cloud storage account. Such practices could result in work files being kept beyond their authorized retention period, thus casting doubt on whether an organization actually follows its document retention policy.

Managing these risks begins with adopting a formal policy on use of personal accounts for work purposes and training employees to follow the policy. Without a policy in place, employees might have few qualms about using their personal accounts for work.  Consult with a lawyer with data privacy experience to ensure that your policy manages legal risks.

If your company decides to prohibit the transfer of work data to external locations, enforce that policy diligently. Work with your IT department or outside vendors to implement physical and software safeguards against unauthorized transfers. Conduct audits to ensure compliance with the policy.

Another strategy is to offer solutions that allow employees to work outside of the office conveniently without having to use their personal accounts. Consider hosting a private cloud storage site where employees can share files in a secured environment under your control. Also popular is virtual desktop software that allows employees to access their workstation remotely in a controlled environment.

Don’t wait until your employees’ data handling practices make the headlines before taking action to protect the confidentiality of your work files.

The FTC released two guides on the privacy and security issues related to the Internet of Things (IoT).  The first is a staff report based on discussions in an FTC-hosted workshop on the subject held on November 19, 2013.  In addition to summarizing the workshop discussions, the report contains the staff’s recommendations in the IoT space.  This prompted an FTC Commissioner (Joshua Wright) to dissent from the decision to issue the report.  In Commissioner Wright’s view, it is premature to publish staff recommendations in this area without further research, data, and analysis.  The dissenting statement can be found here.

The report discusses the benefits of IoT as well as three risks:

  1. enabling unauthorized access and misuse of personal information;
  2. facilitating attacks on other systems; and
  3. creating risks to personal safety.

The report also discusses Fair Information Practice Principles including security, data minimization, notice, and choice.  Click here to read the full report.

Along with the staff report, the FTC issued a guide called “Careful Connections” that provides recommendations on building security into IoT applications.  Download the guide here.

The Federal Trade Commission (FTC) just announced that Snapchat agreed to settle charges that it deceived consumers about how its popular mobile message app worked and what personal user data it collected.  (Read the FTC’s press release here). Part of Snapchat’s appeal was a feature enabling users to control how long a message could be seen by the recipient. After the designated time limit expires, the message is destroyed, much like the mission briefings in Mission Impossible. At least that’s what Snapchat told users. According to the FTC, Snapchat misled consumers because the app didn’t exactly work the way it said it did. The FTC’s complaint against Snapchat (read it here) included these allegations:

  • Recipients of a “snap” (a Snapchat message) could save the snap using tools outside of the app. Snapchat apparently stored video snaps in a location on the recipient’s mobile device outside of the app’s secure “sandbox.” This enabled recipients to find and save video snaps by connecting their mobile device to a computer and using simple file browsing tools. Another way to bypass the deletion feature was to use apps that connected to Snapchat’s API to download and save snaps.
  • Snapchat told users that if a message recipient took a screenshot of the snap, the sender would be notified. In fact, the screenshot detection mechanism could be bypassed.
  • Snapchat collected geolocation data of users when it said it would not.
  • Snapchat told users to enter their mobile number to find friends who also use the app, implying that the user’s mobile phone number was the only information it collected. Without the user’s knowledge, Snapchat also collected the names and phone numbers of all contacts in the address book on the user’s phone.

So what’s the significance of the settlement? Here are a few quick takeaways.

  • Descriptions of mobile apps in an app marketplace like iTunes App Store or Google Play are product descriptions that could be the basis for false advertising claims.
  • Including boilerplate language in an app description, terms of use, or privacy policy is a bad idea if you don’t know what it means or can’t verify its accuracy. Snapchat’s privacy policy told users that it “did not ask for, track, or access any location-specific information.” A lot of apps say that. The problem was that Snapchat integrated an analytics tracking service in the Android version of the app that did collect location information.
  • Take into account exploits and workarounds when drafting privacy policies and product descriptions. This includes software that uses the app’s API.
  • The FTC is getting more active in pursuing false advertising claims against mobile app makers. In December of last year, the FTC settled charges that the developer of the “Brightest Flashlight Free” app deceived consumers about how their geolocation information would be shared with advertising networks and other third parties. The FTC’s interest in suing companies that allow a data breach to occur is also a growing concern, especially after the New Jersey federal district court’s decision in FTC v. Wyndham Worldwide Corp., recognizing the FTC’s authority to prosecute cases where a company is alleged to have failed to maintain “reasonable and appropriate data security for consumers’ sensitive personal information.”
  • Information transmitted over the Internet is rarely, if ever, gone forever. Somehow, somewhere, electronic data can be retrieved.