Data Destruction Contract Clauses Are a Hot Topic

Morgan Stanley’s recent payment of $60M to settle a civil proceeding for failing to properly dispose of customer data is a reminder of the importance of knowing applicable data disposal laws and drafting appropriate data destruction clauses in technology agreements.

Sources of Obligations

The sources of obligations to destroy or dispose of personal data are myriad. Direct and indirect federal requirements include the Gramm-Leach-Bliley (“GLB”) Interagency Guidelines Establishing Information Security Standards, the GLB Safeguards Rule, the Health Insurance Portability and Accountability Act (“HIPAA”) Privacy Rule, the HIPAA Security Rule, and the Fair and Accurate Credit Transactions Act Disposal Rule. Unfair and deceptive acts and practices laws, both federal and state, may also apply. In addition, at least 35 states have unique data disposal laws.

Common law negligence, invasion of privacy, and unjust enrichment are just a few other claims that may be brought against companies failing to properly destroy personal information. And, apart from these requirements, technology agreements typically include provisions requiring deletion or return of confidential information.

The data disposal requirements are not simple or easy to navigate, either. Numerous companies besides Morgan Stanley have suffered lapses, including, for example, American United Mortgage Company, Cornell Prescription Pharmacy, FileFax, CVS Pharmacy, Searchtec, Home Depot, and RadioShack.

Data Destruction Tips for Technology Customers

That said, for customers contracting for technology services or products that require the use or availability of personal data, several steps are available to reduce data disposal risks.

  • Know what personal information destruction and disposal laws apply. Must the destruction efforts be reasonable or must the data be rendered unreadable or undecipherable? Must the data also be unusable?
  • Include agreement provisions requiring the vendor to destroy (or return) the data upon request and, in all cases, upon termination or expiration of the agreement. Require that, upon request, the vendor certify or acknowledge the destruction. Follow through on this requirement.
  • Contractually require the vendor to qualitatively destroy the information so that it is permanently irretrievable, unreadable, inaccessible, and indecipherable. Mandate that paper media be shredded, disintegrated, incinerated, pulverized, or pulped.
  • Include in the contract a right to audit the vendor’s data disposal or destruction and ensure that the vendor’s obligation to establish and implement reasonable security measures aligns with the data destruction requirements.

Data Disposal Tips for Technology Vendors

For technology vendors, which may also have legal obligations to destroy or dispose of data, contractual and operational mitigations also exist.

  • Before contracting, know the personal information destruction and disposal laws that apply. For example, is the vendor a business associate under HIPAA?
  • Proactively include language in the agreement permitting vendor destruction or disposal of the data.
  • Utilize best industry practices to destroy or erase the data – even if the technology agreement does not require it.
  • Segregate each customer’s personal data from other customers’ data, to facilitate discrete and expedient destruction or disposal.

These mitigations for technology customers and vendors are even more important, given the volume and dynamic nature of data destruction and disposal requirements and corresponding challenges for these companies.

Cyber Insurance: No Lifeline for Enterprise Technology Customers

Recent major cyber attacks have kickstarted a cyber insurance buying frenzy. However, because cyber insurance coverage is unpredictable on many levels, it is critical that technology customers take meaningful steps to address insurance risks and to contract appropriately with their technology vendors.

Cyber Insurance Challenges

Cyber insurance sounds great on paper but is difficult to implement effectively. Policies notably are not uniform or standard in providing coverage for particular occurrences, parties, or losses. Even within a particular insurance provision, contract language is unpredictable and varies widely across insurers. For example, cyber attacks initiated by state actors may or may not be covered, depending on whether the attack is considered terrorism, an act of war, or a warlike action.

Moreover, insureds and insurers routinely disagree as to the coverage and intent of cyber insurance policies. Litigation involving Mondelez, Payless Shoesource, Alorica, National Bank of Blacksburg, Sony, Target, and SS&C Technologies is just the tip of the iceberg. As for pace, let’s just say that two months ago, Home Depot filed suit against three insurers seeking coverage under its policies in connection with the massive data breach it suffered seven years ago.

Decision-Making Concerns

Of concern, then, enterprise technology customers frequently base their decision to accept cyber-related contractual indemnities and limitations of liability from their vendors on the mere fact that the vendors – or the customers – have cyber liability insurance. The customers often accept the risks without evaluating the vendors’ purported policies and without revisiting their own coverages based on the particular technology transaction. Even obligating the vendor to implement reasonable security measures is not enough.

Contractual and Operational Mitigations

The following contractual and operational tips may help enterprise customers identify and mitigate cyber liability insurance related risks under their technology agreements.

  • Read your policies. Technology customers should carefully review and evaluate their insurance policies, including their cyber liability policy, to determine the extent of coverage for the cyber risks for the particular technology transaction and vendor. In some cases, standard business policies (such as property insurance, crime insurance, or commercial general liability coverage) may include cyberattack losses.
  • Summarize your policies for internal stakeholders. Your technology contract negotiation team will be much better able to assess applicable cyber risk for a particular technology transaction if they know the specific scope and extent of your own cyber and other insurance policies.
  • Monitor policy changes. The technology agreement should require the vendor to provide prompt notice of changes in the vendor’s insurance coverages. The agreement should establish that vendor breaches of insurance provisions specifically give rise to customer termination rights.
  • Increase insurance coverages. When the customer’s business team insists that the particular technology vendor is the best resource for the deal, but the vendor does not have adequate cyber insurance, the customer should consider obligating the vendor to procure sufficient coverage, even if only for the particular transaction. Be aware, however, that the vendor may seek to burden the customer with the cost of the additional coverage.

And, do keep in mind that businesses commonly underestimate the cyber coverage they need to mitigate cyber risks.

Contracting Conundrum: “Reasonable Security Measures”

In technology contracts between customers and vendors, it is common to obligate one or both parties to implement “reasonable security measures” to protect applicable data and information. Typically, the obligation is a function of risk allocation or legal requirements. The recently enacted (and more recently amended) California Consumer Privacy Act’s authorization of a private right of action against businesses that fail to implement reasonable security procedures and practices highlights the issue. But, what are “reasonable security measures”? And, which contracting party decides?

The Market Speaks

Often, technology contracts merely reference, but do not explain, reasonable security measures. A contract may require a party simply to “implement reasonable security measures” to safeguard applicable information. Alternatively, a contract may obligate the party to “implement reasonable security measures as required by applicable law” or to “comply with applicable data privacy and security laws, including those regarding security measures.” Both customers and vendors can find these examples appealing.

Pushing the Envelope

Less often, but frequently when the technology transaction involves financial services companies, the contract may impose more stringent requirements based on statute or regulation. For example, the vendor may be obligated to “implement administrative, technical, and physical safeguards to insure the security and confidentiality of customer records and information, to protect against any anticipated threats or hazards to the security or integrity of such records, and to protect against unauthorized access to or use of such records or information which could result in substantial harm or inconvenience to any customer.”

Similarly, technology contracts involving healthcare information can mirror applicable federal regulations and obligate a party to “implement administrative, physical, and technical safeguards that reasonably and appropriately protect the confidentiality, integrity, and availability of the information.” For EU personal data, the Standard Contract Clauses (which will likely soon change) may be invoked.

Although usually advocated by technology customers, these more specifically stated obligations are often acceptable to the customers’ vendors because they track legal requirements.

Breaking the Envelope

In a few cases, customers or vendors may choose to sidestep the vagueness of the above options. For example, agreements with ties to California may explicitly reference the 2016 California Data Breach Report, which specifically states that an organization’s failure to implement all twenty controls in the Center for Internet Security’s Critical Security Controls constitutes a lack of reasonable security. When payment card information is in scope, the contracting vendor may be directed to comply with the PCI Data Security Standards.

Increasingly often, a technology customer – or vendor – may expressly set out detailed, bespoke security measures. The contractual statement of these measures can range from one, to three, to five or more pages.

Clearly, there are many ways for contracting parties to reach agreement on applicable security measures to be implemented under a technology contract. Be sure that what you sign up for works best for your company – all costs, risks, and consequences considered.

Privileged Cybersecurity Investigations – A Checklist for Contracting with Consultants

Your company may suffer a cybersecurity incident that warrants bringing in third-party forensics or other consultants to investigate and report on the cause or consequences of the cyber event or compromise. To seek to protect the third parties’ reports with the work product privilege (and, thus, to avoid having to disclose the reports in litigation) – and to try to side-step the unexpected failure to establish such protection that Capital One recently experienced (In re: Capital One Consumer Data Security Breach Litigation) – do (and don’t do) the following with respect to your contracts with these third parties:

Do have outside counsel be the entity contracting directly with the third party. Have outside counsel pay the third party’s fees, directly. Then, have outside counsel bill you for reimbursement of the fees paid.

Do contract under a specific statement of work or services description that is exclusive to the particular cyber incident.

Do state and expressly limit the purpose of the third party’s services and reports to anticipating litigation arising from the cyber incident. The purpose should not explicitly or implicitly include, for example, financial controls or reporting.

Do require that the third party’s report be in a form and of substance specific to the purpose of anticipating litigation. The report should not mirror what would be provided for reports for other purposes.

Do require the third party to issue formal and informal reports and updates only to the contracting outside counsel. Outside counsel, then, as necessary or appropriate, can distribute further the reports or updates, for example, to select internal stakeholders.

Don’t allow those who receive reports and updates from outside counsel to further distribute the reports or updates, whether internally or externally. Require recipients to explicitly agree to limited use and handling terms, before receiving reports or updates.

Don’t allocate the costs and fees for the third party’s services to any internal billing or cost center other than Legal’s. The costs and fees should be assigned to Legal’s budget. Categorize the costs and fees as “legal” costs and fees, not, for example, cybersecurity or business costs or fees.

And, in the contract with the third-party forensics firm or consultant, do include requirements that the third party conform to all of the applicable above do’s and don’t’s.

Importantly, these are only a few do’s and don’t’s that may help guide many companies to attempt to structure and implement contracts with third-party consultants so as to establish the work product privilege applicable to the third party’s reports. Each company, each cybersecurity incident, and applicable law can vary and be unique, so it is perhaps even more critical for the company to immediately involve inside (or outside) counsel to navigate these thorny issues.

Background – In re: Capital One Consumer Data Security Breach Litigation

The above do’s and don’t’s follow from the recent decision of the U.S. District Court for the Eastern District of Virginia in the above-referenced litigation. Capital One sought to avoid having to disclose the report issued by the cybersecurity forensics firm that it retained in the wake of the March 2019 data security breach suffered by the financial company.

In affirming a magistrate judge’s order to compel Capital One to disclose the forensics report, the Virginia federal district court made several observations. Well before the breach (and not specific to the March breach), Capital One had retained the forensics firm under a general SOW, on a retainer basis, to provide a set number of service hours for any one of a broad range of incident response services that might be needed. After the security breach, the bank’s outside counsel signed a letter agreement with the forensics firm for services with respect to the breach. The terms of the letter agreement, however, provided for the same scope and kind of services, on the same terms and conditions, as the general SOW (except that the forensics firm would work at the direction of the outside counsel and provide the forensics report to the outside counsel).

For performing under the letter agreement, the consultant was first paid from the retainer already provided under the general SOW. Then, Capital One directly paid the balance of the consultant’s fees due under the letter agreement – with funds from Capital One’s internal general cybersecurity budget. Capital One (at least at first) internally identified the fees paid to the consultant as a “business critical” expense – not as a “legal” expense.

During the forensics firm’s investigation, it communicated directly with the bank’s external financial auditors, so that the auditors could assess whether the breach impacted the bank’s accounting controls. Many internal and external parties received a copy of the forensics report, but Capital One provided no explanation as to why these recipients received a copy of the report, as to whether the report was provided for business purposes, regulatory reasons, or specifically in anticipation of litigation, or as to any restrictions placed on the recipients’ use, reproduction, or further distribution of the report.

Both the magistrate judge and, on appeal, the district court judge who opined on the matter saw these above facts, among others, as support for finding that the forensic firm’s investigation report was not protected from disclosure by the work product privilege.

RETURN TO SENDER: Aetna to Pay $17M to Settle Claims Related to Vendor Mailer Data Breach

Aetna has agreed to pay $17.2 million and to implement a “best practices” policy regarding sensitive policyholder data, in order to settle class action litigation brought against it arising from a mass mailing sent by one of its mailing vendors. As discussed in a blog post last year, federal class action litigation was brought against Aetna and its mailing vendor in 2017 based on the vendor’s use of glassine envelopes to communicate HIV medication information to Aetna insureds. The envelopes revealed that the named addressee was contacted about options for filling HIV medication prescriptions. The litigation alleged violations by Aetna and its vendor of several laws and legal duties related to security and privacy.

The contract lessons for customers and vendors that arise from the events in question, which were identified in the earlier post, remain the same. Do your contracts for non-IT and non-healthcare services fully consider the risk of privacy and security litigation? Do your contract’s indemnification and limitation of liability clauses contemplate the possibility of class action litigation? Before entering into a contract, have you considered whether the specific vendor services being provided to the particular customer in question implicate laws you hadn’t considered? And, have you considered which specific aspects of vendor services may directly impact potential legal liability, and have you adequately identified and addressed them in the contract?

Importantly, the newly announced settlement, itself, provides three bonus lessons.

Published data breach cost statistics are helpful, to a point. 

In its 2017 Cost of Data Breach Study, Ponemon Institute reports that the average per capita cost of data breach in the U.S. for the period of the study was $225. It also reports that, for the same period, the average total organizational cost in the U.S. for a data breach was $7.35 million. Somewhat remarkably, as part of its settlement Aetna agreed to pay $17.2 million in connection with the breach in question – a figure that is about $10 million over the average reported by Ponemon Institute. But, Aetna’s payment is not out of the ballpark, as averages are averages, after all. Much more remarkable, however, is the per capita settlement amount. Aetna’s settlement figure represents a per capita amount of $1,272 – more than five times the reported average. (For reference, that per capita cost would put Equifax’s settlement number for its recent breach at $185 billion.) Bottom line, when considering or counseling clients as to the financial impacts of data breaches, the average cost figures are only as useful as the qualification that they are just that – averages – with any number of data security breaches costing more, or less.
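The comparison above can be sketched as back-of-the-envelope arithmetic. Note that the class size is not stated in the post; it is merely implied by dividing the settlement amount by the per capita figure, so treat it as an estimate.

```python
# Rough comparison of the Aetna settlement figures against the Ponemon
# Institute's reported 2017 averages. The constants are the figures quoted
# above; the implied class size is our own arithmetic, not a stated fact.

PONEMON_PER_CAPITA_AVG = 225     # average U.S. per capita breach cost ($)
AETNA_SETTLEMENT = 17_200_000    # Aetna settlement payment ($)
AETNA_PER_CAPITA = 1_272         # per capita settlement amount ($)

implied_class_size = AETNA_SETTLEMENT / AETNA_PER_CAPITA
multiple_of_average = AETNA_PER_CAPITA / PONEMON_PER_CAPITA_AVG

print(f"Implied class size: ~{implied_class_size:,.0f} members")
print(f"Per capita settlement vs. Ponemon average: {multiple_of_average:.1f}x")
```

Run against the published figures, this yields a class of roughly 13,500 members and a per capita amount more than five times the Ponemon average – which is the gap the paragraph above describes.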

Data breach cost statistics often do not compare well with litigation settlement amounts. 

Yes, Aetna agreed to pay $17.2 million as part of the settlement, as compared to Ponemon Institute’s reported $7.35 million average U.S. data breach cost. While the $7.35 million figure includes forensics costs, customer churn, post data breach costs, and other direct and indirect expenses, the $17.2 million figure is not as comprehensive. It does not include, for example, Aetna’s legal fees incurred to defend and settle the class action litigation, nor does it include other pre-settlement costs and expenses incurred by Aetna. As efficient or helpful as it may be to compare published per capita or per breach data statistics with litigation settlement amounts, it’s also important to identify the full scope of costs and expenses that the published statistics include, as well as what costs and expenses are not included in the settlement amounts.

Data breach cost statistics and litigation settlement amounts don’t include non-monetary settlement obligations. 

Cost-per-record, cost-per-breach, and litigation settlement figures can be particularly meaningful and relatable, especially when considering or counseling clients as to the potential financial impacts of data security breaches. Notably, however, the material obligations of defendants settling data breach litigation matters typically are not limited to monetary payments. Aetna, for example, as part of its settlement, also agreed to develop and implement a “best practices” policy for use of certain personally identifiable information, to provide policy updates for five years, to provide policy training for certain Aetna personnel for five years, and to require outside litigation counsel to sign business associate agreements, among other commitments. These activities will require Aetna to incur additional costs and expenses, including costs and expenses for internal and, possibly, external resources in connection with the performance of these activities.

Supplementing the earlier post on this Aetna class action litigation and lessons learned, the recent Aetna settlement and the new lessons cited above provide an even fuller picture of data and security breach and related contract considerations. Not only is it invaluable to consider data privacy and security issues in contracts and the roles of vendors and service providers, it also is important to consider and counsel clients as to the full potential impacts of data breaches, including potential litigation settlement amounts, costs and expenses in addition to settlement amounts, and non-monetary settlement-related obligations.

Lessons Learned: Vendor Sued in Class Action Suit for Security Misses

You’re thinking that something about the title of this post sounds familiar, right? Information technology (IT) vendors and third party service providers have been in the spotlight for security breaches for some time (see, for example, vendor-based security lapses affecting Target, CVS, and Concentra, as just a few), and it doesn’t sound surprising that an IT vendor has been sued related to a security incident. After all, whether you’re an IT vendor or an IT customer, if you draft or negotiate contracts for a living, these situations are what you try to contract for, right?

Right…but…the recent federal class action suit filed in Pennsylvania against Aetna and its vendor surfaces several new privacy and security considerations for vendors and their customers. The vendor in question was not an IT vendor or service provider. Instead, the plaintiff’s allegations relate to Aetna’s use of a mailing vendor to send notification letters to Aetna insureds about ordering HIV medications by mail. According to the complaint, the vendor used envelopes with large transparent glassine windows – windows that did not hide the first several lines of the enclosed notification letters. The plaintiff asserts that anyone looking at any of the sealed envelopes could see the addressee’s name and mailing address – and that the addressee was being notified of options for filling HIV medications. As a result, the vendor and Aetna are alleged to have violated numerous laws and legal duties related to security and privacy.

For all vendors and service providers, but especially those that don’t focus primarily on privacy and security issues, the Aetna complaint is enlightening. To these vendors and service providers, and to their customers: Do your customer-vendor contracts and contract negotiations contemplate what Aetna and its mailing vendor may not have?

Do your contracts for non-IT and non-healthcare services fully consider the risk of privacy and security litigation? A noteworthy facet of the Aetna case is that the mailing vendor was sued for privacy and security violations that were not exclusively due to the customer’s acts or omissions. That is, while the contents of the mailer certainly were key, the vendor’s own conduct as a mailing services provider (not an IT or healthcare provider) was instrumental in the suit being filed against the vendor (and Aetna). Vendor services that previously didn’t, or ordinarily don’t, warrant privacy or security scrutiny, may, after all, need to be looked at in a new light.

Do your contract’s indemnification and limitation of liability clauses contemplate the possibility of class action litigation? Class action litigation creates a path for plaintiffs to bring litigation for claims that otherwise could not and would not be brought. Class action litigation against data custodians and owners for security breaches is the norm, and the possibility and expense of class action litigation is frequently on the minds of their attorneys and contract managers who negotiate contracts with privacy and security implications. But, for vendors and service providers providing arguably non-IT services to these customers – the idea of being subject to class action litigation is often not top-of-mind.

Before entering into a contract, have you considered whether the specific vendor services being provided to the particular customer in question implicate laws you hadn’t considered? Vendors that operate in the information technology space – and their customers – generally are well aware of the myriad privacy and security laws and issues that may impact the vendors’ business, including, as a very limited illustration, the EU General Data Protection Regulation, HIPAA, and the New York Cybersecurity Requirements. Vendors that aren’t “IT” vendors (and their customers), on the other hand, may not be. For example, the Aetna mailing vendor may not have contemplated that, as alleged by the Aetna plaintiff, the vendor’s provision of its services to Aetna would be subject to the state’s Confidentiality of HIV-Related Information Act and Unfair Trade Practices and Consumer Protection Law.

Have you considered which specific aspects of vendor services may directly impact potential legal liability, and have you adequately identified and addressed them in the contract? No, this is not a novel concept, but it nonetheless bears mention. A key fact to be discovered in the Aetna litigation is whether it was Aetna, or the vendor, that made the decision to use the large-window envelopes that, in effect, allegedly disclosed the sensitive and personally identifiable information. Given the current break-neck pace at which many Legal and Contract professionals must draft and negotiate contracts, however, unequivocally stating in a contract the details and descriptions of every single aspect of the services to be provided is often impractical (if not impossible). But, some contract details are still important.

Whether this class action suit proves an outlier or is dismissed at some point, consider data security and other privacy and security issues in contracts and how vendor or service provider conduct may give rise to a security breach or security incident.

What Does Ransomware Cost Companies?

In its 10-Q filing for the quarter ended September 30, 2017, Merck & Co., Inc. stated the following:

On June 27, 2017, the Company experienced a network cyber-attack that led to a disruption of its worldwide operations, including manufacturing, research and sales operations. … [T]he Company was unable to fulfill orders for certain other products in certain markets, which had an unfavorable effect on sales for the third quarter and first nine months of 2017 of approximately $135 million. … In addition, the Company recorded manufacturing-related expenses, … as well as expenses related to remediation efforts … , which aggregated $175 million for the third quarter and first nine months of 2017.

Worth noting, this $310 million amount likely does not include all legal fees, forensic costs, and all other costs, expenses, and losses related to the cyber-attack. Nor does it appear to include other costs, expenses, and losses that may be indirectly revealed elsewhere in Merck’s business or operations. The attack in question is the NotPetya ransomware attack, which impacted countless companies worldwide on June 27 of this year.
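The tally behind that $310 million figure is simple addition of the two components Merck disclosed. As a trivial sketch (the component amounts are quoted from the 10-Q; the total is our arithmetic):

```python
# Tally of the NotPetya-related impacts Merck disclosed in its Q3 2017 10-Q.
# As noted above, this disclosed total excludes legal fees, forensic costs,
# and other indirect losses, so it is a floor, not a ceiling.

lost_sales = 135_000_000   # unfavorable effect on sales ($)
expenses = 175_000_000     # manufacturing-related and remediation expenses ($)

disclosed_impact = lost_sales + expenses
print(f"Disclosed impact: ${disclosed_impact / 1_000_000:,.0f} million")  # → $310 million
```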

Lost Business Resulting from Ransomware

Merck’s announcement is remarkable for several reasons, especially for those who negotiate technology contracts and agreements with data privacy and security implications. First, it’s noteworthy in its relatively clear quantification of lost business resulting from the ransomware attack. That is, often it is difficult to quantify lost business, lost sales, and consequential damages when negotiating liability provisions related to data security and information security in technology agreements and other commercial contracts. This is not to say that Merck’s recitation of these amounts is a new rule-of-thumb or benchmark, but it may start a conversation.

Quantifiable Losses

Second, the loss numbers reported by Merck are not small ones. It is common to discount publicly announced forecasts of ransomware impacts that are viewed as extreme – $75 billion per year, according to one recently cited resource. But the concreteness of Merck’s number and the specificity of the ransomware attack merit attention.

Ransomware is Fact-Specific

Third, the Merck announcement implicitly underscores the criticality of the precise facts surrounding the NotPetya ransomware attack and the unique business and situation of Merck. Not all ransomware or malware attacks can cause the same sort or amount of losses reported by Merck, nor does the same ransomware or other malware give rise to the same quality or quantity of losses for every corporate victim. When negotiating data privacy and data security provisions in commercial technology contracts and similar agreements, it is important for all sides to consider the specific circumstances and risks related to the transaction and parties in question.

Ransomware Impacts Are Not Necessarily Per-Record

And, fourth, the Merck report sheds light on the financial repercussions of ransomware, as opposed to other malware and hacking activities. That is, there are a number of industry and other reports and surveys that speak to the financial and other impacts of data breaches and security breaches on a per-record basis (for example, cost per record, records per breach, etc.). The 2017 Ponemon Institute Cost of a Data Breach Study, Verizon’s 2017 Data Breach Investigations Report, and Gemalto’s Breach Level Index Findings for the First Half of 2017 are just a few. However, in many cases the particular per-record numbers reported do not provide a clear picture of the financial effects of ransomware, which often is not the kind or scope of cyber-attack that can be assessed on a per-record basis.

Merck’s 10-Q for the third quarter of 2017 is definitely not a quick-fix answer to the question of how much a ransomware attack would or could financially impact a company. However, for attorneys, contract professionals, and others who draft and negotiate technology agreements and contracts and, specifically, information and data security and privacy provisions, the Merck quarterly report is potentially meaningful.