All posts by Seth Jaffe, CBCP, JD

Seth is our official rocket scientist in residence. Hailing from NASA’s Mission Control Center, Seth brings a unique perspective to incident response, applying aspects of one of the world’s preeminent emergency operations platforms to cyber response. In addition to twenty-plus years of technical experience, Seth was previously a member of the data protection task force at a large law firm and served as the lead Legal team member of an incident response team at a major U.S. airline. Seth is a certified business continuity professional, and he holds a juris doctorate, which is why he also wears the General Counsel hat at LEO.

Canada’s Breach Notification Regulation Goes into Effect Today

Back in April, Canada adopted additional regulations under its federal privacy law, the Personal Information Protection and Electronic Documents Act (“PIPEDA”). The new regulations dictate the requirements for reporting a data breach, and they go into effect November 1, 2018. Specifically, a report to Canada’s Office of the Privacy Commissioner must contain:

  • a description of the circumstances of the breach and, if known, the cause;
  • the day on which, or the period during which, the breach occurred or, if neither is known, the approximate period;
  • a description of the personal information that is the subject of the breach to the extent that the information is known;
  • the number of individuals affected by the breach or, if unknown, the approximate number;
  • a description of the steps that the organization has taken to reduce the risk of harm to affected individuals that could result from the breach or to mitigate that harm;
  • a description of the steps that the organization has taken or intends to take to notify affected individuals of the breach in accordance with subsection 10.1(3) of the Act; and
  • the name and contact information of a person who can answer, on behalf of the organization, the Commissioner’s questions about the breach.

A notification to an individual affected by the data breach must contain:

  • a description of the circumstances of the breach;
  • the day on which, or period during which, the breach occurred or, if neither is known, the approximate period;
  • a description of the personal information that is the subject of the breach to the extent that the information is known;
  • a description of the steps that the organization has taken to reduce the risk of harm that could result from the breach;
  • a description of the steps that affected individuals could take to reduce the risk of harm that could result from the breach or to mitigate that harm; and
  • contact information that the affected individual can use to obtain further information about the breach.
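
For teams that track notification content in incident tooling, the two lists above map naturally onto a simple data structure. The sketch below is illustrative only; the field names are my own shorthand, assuming Python dataclasses, and it is not an official schema.

from dataclasses import dataclass, field
from typing import Optional, List

@dataclass
class CommissionerReport:
    """Fields the PIPEDA regulations require in a report to the Privacy Commissioner."""
    circumstances: str                      # description of the breach and, if known, its cause
    occurrence_period: str                  # day or period of the breach, or the approximate period
    personal_info_involved: str             # personal information affected, to the extent known
    affected_count: Optional[int]           # number of individuals affected, or the approximate number
    mitigation_steps: List[str] = field(default_factory=list)    # steps taken to reduce the risk of harm
    notification_steps: List[str] = field(default_factory=list)  # steps taken or planned under s. 10.1(3)
    contact_name: str = ""                  # person who can answer the Commissioner's questions
    contact_info: str = ""

@dataclass
class IndividualNotification:
    """Fields required in the notification sent to affected individuals."""
    circumstances: str
    occurrence_period: str
    personal_info_involved: str
    org_mitigation_steps: List[str] = field(default_factory=list)         # what the organization has done
    individual_mitigation_steps: List[str] = field(default_factory=list)  # what individuals can do themselves
    contact_info: str = ""                  # where individuals can obtain further information

Populating both objects from a single incident record helps ensure that nothing required by the regulation is dropped from either document.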

The text of the regulation can be found here, along with its accompanying Regulatory Impact Analysis Statement, which clarifies a number of issues, including the meaning of “significant harm.” BakerHostetler’s Melinda McLellan posted additional analysis on the Data Privacy Monitor blog, available here.

Companies maintaining personal data of Canadian residents should consider reviewing their incident response plans in light of this new law.

Speed Warp – The Data Breach Notification Hustle

By Seth Jaffe.

Companies are starting to feel the squeeze of compressed data breach notification time frames. Facebook is a prime example.

The loose timelines for notifying agencies or data subjects are going by the wayside, replaced by concrete notification windows. At present, just under 20 states have written specific time frame requirements into their notification laws.[1] While many of these laws still hover between 30 and 90 days, proposed bills are starting to creep into the teens. Then there are other notification laws, such as California’s medical information notification statute, which mandates notification to the Department of Health Services within 15 days. And finally, there is GDPR’s blindingly short 72-hour notification requirement.
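
To make the squeeze concrete, here is a minimal deadline calculation sketched in Python. Only the 72-hour GDPR figure and the 15-day California medical example come from the discussion above; the state-law entry is a hypothetical placeholder, and none of this is legal advice.

from datetime import datetime, timedelta

# Notification windows keyed by regime. The GDPR and California medical figures
# reflect the numbers discussed above; the state-law entry is a placeholder.
NOTIFICATION_WINDOWS = {
    "GDPR (supervisory authority)": timedelta(hours=72),
    "CA medical information statute": timedelta(days=15),
    "Typical state statute (example)": timedelta(days=45),
}

def notify_by(discovered_at: datetime) -> dict:
    """Return the notify-by deadline for each regime, given the discovery time."""
    return {regime: discovered_at + window for regime, window in NOTIFICATION_WINDOWS.items()}

if __name__ == "__main__":
    discovered = datetime(2018, 10, 1, 9, 0)
    for regime, deadline in notify_by(discovered).items():
        print(f"{regime}: notify by {deadline:%Y-%m-%d %H:%M}")

Even this toy calculation makes the point: a 72-hour window leaves no time to begin organizing a response after discovery.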

Cyber security professionals are quick to list the following recommendations for companies concerned about notification:

  1. Implement a cyber incident response (crisis management) program
  2. Construct and adopt an incident response plan
  3. Designate incident response team members from relevant disciplines, such as information security, legal, communications, human resources, and corporate security
  4. Train incident response team members on the plan
  5. Train non-incident response team members on protocols to convey relevant information to the incident response team
  6. Pre-select data breach vendors and negotiate terms[2]
  7. Establish relationships with law enforcement
  8. Foster a culture of cyber security

But there is one additional recommendation that is often left off the list: converting your cyber incident response plan into an executable document.

“Executable,” as used herein, refers to plans inclusive of concrete, step-by-step procedures. I’ve written about the need for procedures before, here and again here. Benefits include concrete direction for team members, faster references to ancillary documents, easier communication of complicated concepts, simpler maintenance of the plan, and better training, all resulting in fewer mistakes, reduced workload, and less stress.

Let’s take a look at an example executable incident response plan procedure. Below is an excerpt from a LEO Cyber Security procedure template.

The left column indicates the discipline responsible for carrying out the action. The middle column contains the step itself, with directions and a step number for easy reference. References to ancillary documents are embedded within the step description, as are links to related steps. The right-hand column rounds out the procedure with references to the rules and directives that give color to the reasons behind an action, as well as a rationale.
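
In data form, a single row of such a procedure might look like the following sketch. The field names are my own shorthand, not LEO’s template, and the example step is hypothetical; the point is simply that discipline, step number, directions, references, and rationale travel together.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcedureStep:
    step_number: str                 # e.g. "4.1", for easy reference and cross-linking
    discipline: str                  # who carries out the action (Legal, InfoSec, Comms, ...)
    directions: str                  # the concrete action to take
    ancillary_documents: List[str] = field(default_factory=list)  # embedded references
    related_steps: List[str] = field(default_factory=list)        # links to related steps
    rules_directives: List[str] = field(default_factory=list)     # rules that give color to the action
    rationale: str = ""              # why the step exists

# A hypothetical step, loosely modeled on a notification-related action:
example = ProcedureStep(
    step_number="4.1",
    discipline="Legal",
    directions="Determine whether the incident triggers a statutory notification window.",
    ancillary_documents=["Breach Notification Matrix"],
    related_steps=["4.2"],
    rules_directives=["Rule 7: engage outside counsel before any external notification"],
    rationale="Notification clocks start at discovery; the determination cannot wait.",
)

Because each step carries its own references and rationale, team members do not have to leave the procedure to understand why they are doing something.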

By transforming its incident response plan into an executable document, an incident response team can get a jump on the ticking clock of data breach notification. LEO can help you make this transition. Our Gemini Cyber Crisis Management program builds execution right into the incident response plan, along with much more.

[1] AL, AZ, CO, CT, DE, FL, MA, MD, ME, NM, OH, OR, RI, SD, TN, VA, VT, WA, WI.

[2] Examples include outside counsel, forensics, public relations, call centers, ID theft protection, and notification letter printing.

Is Your Cybersecurity Program Protecting Against Hardware Threats?

By Seth Jaffe.

Last week, Bloomberg exposed a hardware backdoor surreptitiously placed on circuit boards by operatives from a unit of the Chinese People’s Liberation Army. This tactic is not new. Indeed, the article claimed that U.S. officials had caught China attempting this in the past. Edward Snowden, back in 2014, famously accused the NSA of covertly implanting interception tools in hardware headed overseas.

Most companies (other than Department of Defense contractors) probably dismissed the nation state threat, assuming they possessed nothing of value to foreign militaries. But that has changed in view of China’s targeting of U.S. intellectual property and North Korea’s policy of funding its military through cyber bank theft.

And now we have a new threat, one that may not be sufficiently managed in a conventional cyber security program. I’ll leave technical controls to LEO’s CISOs (look for a future post on the subject), but I spent last night thinking about the legal issue. Companies installing vendor hardware in their networks may want to demand a representation and warranty that the hardware will be free of vulnerabilities. This is a big ask in light of how difficult it was to locate the Super Micro chip. But if a vendor isn’t even obligated to look, the risk increases.

To that end, the following is a draft clause requiring a vendor, at the very least, to spot check its hardware and to represent and warrant that it is free of vulnerabilities. Obviously, it can be freely negotiated to meet the scope of the deal. I invite my transactional law colleagues to play around with the language and suggest improvements.

For any hardware constructed by, or at the direction of, VENDOR, VENDOR acknowledges that it has conducted a security inspection of a sample of said hardware, the inspection team including, inter alia, the original design team. VENDOR represents and warrants that [there are no/it has no knowledge of any] vulnerabilities existing therein. Furthermore, VENDOR agrees to conduct a spot security audit on the hardware at least annually and to report any security anomalies to COMPANY within 48 hours of discovery.

Cyber Security ROI: It may happen sooner than you think

By Seth Jaffe.

You’ve heard it before. Companies are slow to invest in cyber security because they see few returns.[1] But that is likely to change, and it may occur sooner than we expect.

Let’s first set the context. An executive recently commented to me that “cyber security is just another cost of doing business in the modern environment,” and that’s exactly how many institutions see it. A decade or so ago, they did not have to worry about cyber theft, ransomware, or nation state attacks. But now, boards of directors list cyber security as the risk most likely to keep them awake at night. Moreover, even setting cost aside, companies are finding it difficult to secure experienced information security personnel.

So where does that leave us? Stuck with ever-increasing overhead costs? Not likely. Consumers are awakening to the importance of data protection and privacy, and they are starting to demand safeguards. That presents quite a marketing opportunity for those at the forefront of the cyber curve. Now is the time to start cashing in.

The trend is beginning among those industries hit the hardest, like financial services. Bank of America proudly displays its Javelin award for Best Overall Identity Safety in Banking. JPMorgan Chase’s security center aims to demonstrate its cyber chops. Granted, neither leads with cyber security as the cornerstone of a marketing campaign…yet. (Then again, it took car companies decades to put cup holders in cars.[2]) Companies will soon realize that security is a market differentiator.

And that’s the takeaway. Consumers want their data protected, and they are willing to back that demand with their wallets. But don’t take my word for it. Cyber security scorecards and certifications are popping up all over the place. FICO, the entity responsible for your credit score, offers a security rating service. Both the Pentagon and the EU utilize cyber scorecards in making contract decisions.

And don’t forget that Javelin award. The writing is on the wall. Consumers are paying attention to cyber. Those with something to say on information security will have an edge. But don’t forget to involve your legal counsel in any marketing campaign. Blindly promising security of your customers’ personally identifiable information is likely to land you in hot water in the face of an incident.

We are entering a new phase of cyber security, where implementation of well-designed programs by experienced information security professionals will provide a direct return on investment. This is something the board of directors can sink its teeth into, and maybe free up some budget for your information security program.

[1] Exceptions include entities under a consent decree from a regulatory agency, those trying to maintain a certification requiring security controls, and companies vying for a government contract.

[2] In the ’80s, they were flying off the shelves at AutoZone. Seems car manufacturers are equally slow in provisioning cell phone holders.

Alabama Requires Entities to Safeguard Sensitive Information

By Seth Jaffe.

Alabama recently became the 50th state to pass a data breach notification law, but in doing so, the state upped the ante by including security obligations generally found in industry-specific cyber security laws. I’ve written about the Eight Principles of Cyber Security Laws in a prior blog post. Alabama adopted seven of them.

For covered entities—essentially any company that “acquires or uses sensitive personally identifying information”—Alabama requires that the entity:

  1. Conduct a risk assessment
  2. Implement appropriate safeguards to address those risks
  3. Involve the Board of Directors regarding cyber security
  4. Designate an individual to coordinate the entity’s security measures
  5. Respond to a breach (maintain an incident response program)
  6. Manage third-party providers regarding security
  7. Evaluate and adjust the program as necessary

The only prong from my list of eight not included is training.
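
One way to visualize the overlap is to map the eight principles against what Alabama’s statute requires. The snippet below is just a reading aid reflecting the list above, not a compliance tool.

# The eight principles from my prior post, marked True where Alabama's statute
# imposes a corresponding obligation.
EIGHT_PRINCIPLES_IN_ALABAMA_LAW = {
    "Conduct a risk assessment": True,
    "Implement an information security program": True,
    "Involve the board of directors": True,
    "Designate an individual in charge of security": True,
    "Maintain an incident response program": True,
    "Manage third-party providers": True,
    "Regularly evaluate and adjust the program": True,
    "Conduct routine security training": False,  # the one prong Alabama left out
}

missing = [p for p, covered in EIGHT_PRINCIPLES_IN_ALABAMA_LAW.items() if not covered]
print("Not required by Alabama's law:", ", ".join(missing))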

The implications of this law run deeper than just state prosecution. Attorney Fredric Bellamy recently wrote about the case of Community Bank of Trenton v. Schnuck Markets, Inc., in which a federal court dismissed a claim brought by banks against a supermarket that suffered a data breach resulting in the compromise of hundreds of thousands of credit cards. The banks wanted compensation for losses incurred from fraud and from replacing the cards. But the court dismissed the negligence per se claim because no statute or ordinance had been broken. Under Alabama’s law, and similar statutes imposing security obligations, courts may come to a different conclusion.

This means that companies failing to meet the security obligations imposed by Alabama’s law are more likely to find themselves ensnared in litigation due to a data breach.

The Alabama Data Breach Notification Act went into effect last Friday (June 1, 2018). Any company that resides in Alabama or holds sensitive personally identifying information of Alabama residents may want to reexamine its security program to ensure it meets the above principles.

The 8 Principles of Cyber Security Laws

By Seth Jaffe.

The United States has yet to promulgate a comprehensive federal cyber security law aimed at improving the cyber hygiene of companies serving its citizens. But a collation of industry-specific laws (both federal and state), proposed bills, guidance documents, and cyber strategies yields a fair indication of where our nation is headed. This article attempts to distill the aforementioned into a list of eight principles that will likely find their way into forthcoming federal or state cyber security law.

Principle 1: Conduct a Risk Assessment

As far back as the Gramm-Leach-Bliley Act in 1999, authorities recognized the difficulty in designing a comprehensive cyber program without first identifying assets, understanding vulnerabilities, and forecasting attack vectors. For this reason, cyber laws will undoubtedly require a company to conduct a comprehensive risk assessment at periodic intervals.

For example, the Gramm-Leach-Bliley Safeguards Rule requires a covered entity to “identify and assess the risks to customer information in each relevant area of the company’s operation, and evaluate the effectiveness of the current safeguards for controlling these risks.”

Additional legislation and regulations that include this principle can be found here.

Principle 2: Implement an Information Security Program

Upon completion of the risk assessment, companies will have to fashion an information security program designed to mitigate those risks. This includes authorship and maintenance of a Written Information Security Plan (“WISP”).

As an example, the Colorado Securities Act 3 CCR 704-1 states “A broker-dealer must establish and maintain written procedures reasonably designed to ensure cybersecurity.”

Additional legislation and regulations that include this principle can be found here.

Principle 3: Involve the Board of Directors in Cyber Security Management

Without buy-in from senior management, companies may find themselves culturally constrained when it comes to cyber security. Board of Directors involvement can usually be satisfied through implementation of a process to percolate relevant cyber security information up to the Board, as well as push decisions down to the company. The Board should have the ability to digest the information, which can be difficult if no members are conversant in cyber security technology; many Boards form a cyber security committee for this purpose.

The proposed Federal Cyber Regulation for Financial Institutions provides a good example: “The board of directors, or an appropriate board committee, of a covered entity must be responsible for approving the entity’s cyber risk management strategy and holding senior management accountable for establishing and implementing appropriate policies consistent with the strategy.”

Additional legislation and regulations that include this principle can be found here.

Principle 4: Designate an Individual in Charge of Cyber Security

Often referred to as a Chief Information Security Officer (“CISO”), this is an individual the company must designate with the authority to oversee the security program and accountability should incidents occur. The CISO need not be an employee of the company, but can be contracted from a third-party provider, such as LEO Cyber Security.

The New York Department of Financial Services Part 500.04 is instructive: “Chief Information Security Officer. Each Covered Entity shall designate a qualified individual responsible for overseeing and implementing the Covered Entity’s cybersecurity program and enforcing its cybersecurity policy (for purposes of this Part, “Chief Information Security Officer” or “CISO”).”

Additional legislation and regulations that include this principle can be found here.

Principle 5: Maintain an Incident Response Program

Organizations in the midst of an incident are notoriously terrible at improvising. Without a comprehensive enterprise cyber crisis management plan, mistakes will be made. Cyber laws will dictate that a company maintain an incident response plan, periodically update the plan, and test against the plan at least annually.

South Carolina’s Insurance Data Security Act (following the NAIC model law) states: “As part of its information security program, a licensee must establish a written incident response plan designed to promptly respond to, and recover from, a cybersecurity event.”

Additional legislation and regulations that include this principle can be found here.

Principle 6: Manage Cyber Security of Third-Party Vendors

A recent Soho study concluded that 63% of all data breaches involved the supply chain. Even if that number seems a bit high, regulators are taking note, as is evident from two recent settlements related to data breaches caused by a third party.

Massachusetts 201 CMR 17.03(2)(f) requires companies to “Tak[e] reasonable steps to select and retain third-party service providers that are capable of maintaining appropriate security measures to protect such personal information consistent with these regulations and any applicable federal regulations.”

A number of laws are requiring companies to push cyber obligations on third parties. You can find a list of them here.

Principle 7: Conduct Routine Security Training

A company’s cyber program is oftentimes only as robust as the employees implementing it. A number of statistics put insider threats as a leading cause of data breaches. Whether it is because employees invariably click on suspicious links in emails, use easily defeatable passwords, fail to report malicious or accidental cyber issues, or simply do not practice good cyber hygiene, poorly trained employees are often the weakest link in a cyber security program.

The HIPAA security rule (see page 8377) requires covered entities to “implement a security awareness and training program for all members of its workforce (including management).”

Additional legislation and regulations that include this principle can be found here.

Principle 8: Regularly Update the Program

Authorities recognize that cyber security is a living program, requiring continuous modifications as new threats arise, infrastructure changes, and reorganizations occur. Companies are instructed to modify the program accordingly, but at the very least, it should be reviewed and updated annually.

PCI-DSS 12.1.1 requires entities to “review the security policy at least annually and update the policy when the environment changes.”

Additional legislation and regulations that include this principle can be found here.

As expected, laws and regs change all the time (as an example, 244 state cyber bills were introduced in 2017, and already 233 as of August 2018), so check the Trello board often and follow my LinkedIn page, where I will post notice of updates.

When It Comes to Cyber Security, Lack of Vendor Oversight Can Lead to Legal Trouble

By Seth Jaffe.

Third-party cyber security programs got a shot in the arm this week in the form of two legal actions.

The first, well summarized by Sue Ross over at Norton Rose Fulbright, is a proposed consent agreement by the Federal Trade Commission against mobile phone manufacturer BLU Products, Inc., alleging that BLU’s failure to oversee its vendor’s security practices amounts to a violation of Section 5 of the FTC Act. FTC consent orders are generally 20 years in length and require adherence to a strict “never-let-this-happen-again” program. Indeed, BLU would have to implement a comprehensive data security program with a biennial assessment and all sorts of compliance obligations. In short, consent decrees come with an operational and monetary sting, and violation of one can find the company staring down the barrel of steep fines (see, e.g., FTC Commissioner Chopra’s memo calling for more serious penalties for violations of consent orders).

The second, as described by Kevin LaCroix on the D&O Diary, is a settlement in the shareholder derivative suit against Wendy’s for a 2016 data breach caused by the compromise of third-party credentials. We’ve seen a number of these derivative suits before, such as against Wyndham, Target, and Home Depot, where a shareholder steps into the shoes of the company and sues the directors. Those earlier suits were unsuccessful; the Wendy’s case turned out differently. If the settlement is adopted, Wendy’s would agree to implement remedial and prophylactic cyber security measures, form a cyber executive steering committee, and push cyber obligations down to franchisees. Oh, and pay the plaintiff’s attorneys’ fees of nearly $1M.

Because both cases are settlements, we don’t know what pressure was being applied to the defendants. Perhaps this is the beginning of a shift toward holding companies more accountable for the cyber missteps of their vendors. In the meantime, there are a number of steps companies can take, such as beefing up contractual security provisions, conducting security audits of vendors, isolating networks, controlling vendor access, and managing logs, to name a few. For a more comprehensive list, feel free to reach out to one of LEO’s experienced CISOs.

Securing Financial Institution Core Migration

By Seth Jaffe.

The Credit Union Information Security Professionals Association held its yearly meeting last week in San Antonio. One of the topics that came up often was core migration, a security issue that just got a booster shot from Tuesday’s article by Brian Krebs on that very subject. One of Krebs’ colleagues received an email notification requiring a password reset due to migration to a new e-banking platform. To successfully log in to the new platform, a customer needs only her username and the last four digits of her Social Security number, two items that are likely for sale on the dark web for just about all Americans (though individuals are getting better at setting unique passwords, usernames often remain the same across accounts). Armed with just these two pieces of information, a cyber criminal could set a new password, new security questions, and a new phone number, thereby bypassing any two-factor authentication that the customer originally relied upon.
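
To see why that reset flow is so dangerous, consider this simplified sketch of the account-takeover path described above. The function and field names are hypothetical, and the logic is deliberately naive to mirror the weakness rather than any particular vendor’s implementation.

# Simplified model of the weak reset flow: knowledge of a username and the last
# four SSN digits is enough to take over the account, because the attacker then
# controls the password, the security questions, and the 2FA phone number.

ACCOUNTS = {
    "jane.doe": {"ssn_last4": "1234", "password": None, "questions": None, "mfa_phone": None}
}

def reset_account(username: str, ssn_last4: str,
                  new_password: str, new_questions: dict, new_phone: str) -> bool:
    acct = ACCOUNTS.get(username)
    if acct is None or acct["ssn_last4"] != ssn_last4:
        return False
    # Both "secrets" are commonly available to criminals, yet they gate full control:
    acct.update(password=new_password, questions=new_questions, mfa_phone=new_phone)
    return True

# An attacker armed only with a username and last-4 SSN now owns the account,
# including the phone number used for two-factor authentication.
assert reset_account("jane.doe", "1234", "attacker-pw",
                     {"pet": "anything"}, "+1-555-0100")

The customer’s original two-factor setup never enters the picture, because the attacker now controls the enrolled phone number.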

Though most banking institutions utilize these core platforms (such as FISERV, FIS, Jack Henry, Corelation, CSI, D+H, and COCC), smaller entities like local banks and credit unions may feel they don’t have sufficient bargaining power when it comes to managing migration or even core upgrades. But in many cases, they have more leverage than they suspect. For example, the NCUA Part 748 Appendix A requires a credit union to “ensure the security and confidentiality of member information” and “protect against unauthorized access to or use of such information that could result in substantial harm or inconvenience to any member.” Appendix A goes on to impose obligations to oversee service provider arrangements, including “requir[ing] its service providers by contract to implement appropriate measures designed to meet the objectives of these guidelines.”

The Code of Federal Regulations, therefore, arms a credit union with significant bargaining power in matters such as core migration and upgrades. A service provider would be hard-pressed to maintain an inadequate migration process in view of such requirements. Moreover, the core platform company likely has similar information security obligations itself, whether it be through Gramm-Leach-Bliley, a state law, or something else. Take a look at the FFIEC’s information security website, or talk to your attorney.

What the Hawaii Missile Scare Can Teach Incident Response Teams

By Seth Jaffe.

Heads finally rolled at Hawaii’s Emergency Management Agency. What can the incident response community take away from this latest real-life example? Procedures, Rules, and Communication Protocols, which are the underlying principles of a modern incident response program. I’ve written about all three in prior incident response posts, but let’s apply them to the missile scare.

We should consider procedures first, in view of a couple of takeaways from the Fox News article. Apparently, the employee who sent out the incorrect warning message “froze” and “seemed confused.” I’ve been there—in fact, several times during training at NASA. “Deer-in-the-headlights” is a rite of passage, and good instructors purposefully attempt to box in a trainee for this type of experience.

How do you get operators moving again? Two ways. The first is a cultural issue. Operators need to have experienced this phenomenon before, to understand that mistakes do happen, and that they are expected to focus on the present workarounds rather than dwell on the past mistake. Only training and an above-board culture can instill this mindset. The second motivation comes from solid procedures. I’ve written about deer-in-the-headlights moments before. Comprehensive and mature procedures combat this phenomenon by providing clarity of direction in a time of need. Pull out the procedure and start at step one. Now you are moving again. When the step is complete, the operator moves to the next step and, surprisingly, the shock wears off and the operator finds herself back in the game. Most incident response teams do not have on-site backup team members, so getting this operator back on her feet is crucial to the mission.
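
The mechanics are almost trivially simple, which is the point. A toy runner like the one below (hypothetical, not Mission Control’s or LEO’s tooling) shows how a written procedure keeps an operator moving: present a step, confirm it, advance to the next.

# Toy procedure runner: present one step at a time and record completion.
# Stepping through a concrete list is what pulls an operator out of the
# deer-in-the-headlights moment described above.

def run_procedure(steps):
    completed = []
    for number, direction in enumerate(steps, start=1):
        print(f"Step {number}: {direction}")
        # In a real exercise the operator confirms completion; here we just log it.
        completed.append(number)
    return completed

steps = [
    "Acknowledge the alert and note the time.",
    "Verify whether the event is a drill using the drill log.",
    "Notify the incident commander via the primary comm channel.",
    "Begin the corrective-message checklist if the alert was sent in error.",
]
run_procedure(steps)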

We also learned from the article that the language of the alert message to the operator strayed from typical scripts and included the phrase “THIS IS NOT A DRILL.” That raises the issue of Rules, about which I’ve also written. Rules capture decisions made by steering committees charged with dictating policy, and by response teams learning lessons from training exercises and real-life events. Rules are isolated, numbered documents that clarify “shalts” and “shall nots.” For example, thou shalt not use the phrase “THIS IS NOT A DRILL” during live drills.

Finally, the FCC cited a miscommunication during a shift change as contributing to the mistake. That’s the kind of thing that can cripple a response team. If your incident response team does not employ periodic communication exercises, focusing on uniform vernacular, concise comm tactics that combat tautology, and methods to ensure the correct parties are involved, then you may want to consider budgeting some time for them. Effective communication can make or break an incident response team. More on that soon. Until then, keep your finger off the red missile button.

——————

Seth Jaffe is the head of the Incident Response Division at LEO Cyber Security. He spent nearly 14 years as a NASA flight controller in Mission Control, where he was certified on the Space Shuttle and the International Space Station. As a controller, evaluator, and instructor, Seth trained candidates to react to time-sensitive emergency situations and to effectively communicate in the Mission Control environment.  He took part in over 100 simulations and logged over 3000 hours flying the ISS, experience he draws upon in his incident response practice. 

Security Provisions Negotiation in the Wake of the OCC Risk Report

By Seth Jaffe.

The “severity of cyber threats is increasing.” It’s something most of us inherently understand, but now we have the Department of the Treasury’s Office of the Comptroller of the Currency (“OCC”) weighing in with its Fall Risk Report for banks and savings associations. The OCC has been sounding the alarm for years now. Even back in 2000, the OCC opined that “[s]enior management and the board of directors are responsible for overseeing the development and implementation of their bank’s security strategy and plan.” Last week’s Semiannual Risk Report highlights cybersecurity trends facing our nation’s financial institutions. A couple of conclusions are worth noting.

The OCC warned that the number and complexity of third-party relationships is expanding, which in turn increases the risk management challenges facing banks. The OCC recommends a heightened supervisory focus. But what does that mean? In view of the report, banks may want to start by revisiting their contract security provisions, which are often attached as an addendum to master agreements with vendors. Security provisions include obligations for a third party to implement protocols at least as rigorous as a given standard, adhere to the bank’s information security policy, provide SOC 2 reports, notify the bank within a certain timeframe in the event of a breach, warrant that any software is free of vulnerabilities, perform periodic penetration testing, and allow the bank to audit the vendor’s compliance with these provisions. As banks outsource more and more of their operations (according to the OCC), it is even more important that they unify their third-party security requirements under a common policy.
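
In practice, a security addendum boils down to a handful of negotiable parameters. The sketch below captures the provisions listed above as a structure a bank could standardize across vendors; the field names and example values are hypothetical, not OCC or model language.

from dataclasses import dataclass, field
from typing import List

@dataclass
class SecurityAddendum:
    """Hypothetical representation of the third-party security provisions discussed above."""
    baseline_standard: str               # protocols at least as rigorous as a named standard
    follow_bank_infosec_policy: bool     # adhere to the bank's information security policy
    soc2_reports_required: bool          # provide SOC 2 reports
    breach_notice_hours: int             # notify the bank within this window after a breach
    software_free_of_vulns_warranty: bool
    penetration_testing: str             # cadence of periodic penetration testing
    audit_rights: bool                   # bank may audit compliance with these provisions
    notes: List[str] = field(default_factory=list)

# Example values to negotiate from; every number here is a placeholder, not a recommendation.
template = SecurityAddendum(
    baseline_standard="NIST CSF or equivalent",
    follow_bank_infosec_policy=True,
    soc2_reports_required=True,
    breach_notice_hours=72,
    software_free_of_vulns_warranty=True,
    penetration_testing="annual",
    audit_rights=True,
)

Standardizing on one such structure makes it easier to compare vendors and to spot the provision a service provider quietly struck from the addendum.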

The OCC lists, as an additional risk, the concentration of outsourced services in the hands of a few large service providers. This effectively reduces the bargaining power of banks against those third parties. Regulatory authorities and trade associations can help by releasing model security provisions for adoption by financial institutions (like the Association of Corporate Counsel did). But that’s rare this early in the game. In the meantime, banks may want to consider sharing best practices with each other to collectively improve their position against the service providers.

The issue of contractual security provisions will only grow in importance. When I design cyber law events and lunch-and-learns, or submit proposals for speaking engagements, I almost always include the subject as a dedicated session. If you find the opportunity to attend a session on security provisions negotiation, by all means, go!
