The FDA's latest cybersecurity workshop was inevitably topical, given the rapidly shifting range of threats to device security. That the workshop took place as the WannaCry ransomware crippled hospitals around the world made it a uniquely relevant event for the agency's efforts to ensure device cybersecurity and, by implication, patient safety.
Suzanne Schwartz, the FDA's associate director for science and strategic partnerships, said the term "regulatory science" has a slightly different meaning where device cybersecurity is concerned. "It's about the design ... before they go onto market," Schwartz said, but she noted that the agency is interested in anticipating cybersecurity hazards "so we can also be responsive to emerging issues."
The WannaCry ransomware attack hit thousands of hospitals in a number of nations, targeting Windows operating systems, with much of the impact felt by systems running the Windows 7 version. Schwartz said the CDRH science council had pinpointed device cybersecurity as among the center's top 10 concerns, but she pointed to the WannaCry episode as an event that "brings that message home to us today."
Schwartz noted also that investment in IT infrastructure is a priority at CDRH, which should bolster the center's ability to anticipate emerging threats, but the center is also focused on "developing evaluative tools" that will help ODE staff deal with long-standing issues that crop up frequently.
Off-the-shelf software presents cyber-issues
Pat Baird, director of global software standards at Philips of Andover, Mass., gave attendees at the workshop a look at some of the cybersecurity issues associated with commercial, off-the-shelf (COTS) software, but he started his talk by stating, "when I think about risk management, I think about Murphy's Law."
Baird said the consequences of mitigations have to be considered as a cybersecurity development process moves along, noting that unintended consequences routinely surface. He gave the example of anti-lock brakes, pointing out that the rate of automobile accidents fell in the first few years after they were introduced, but that the rate of accidents ticked back up as drivers became accustomed to the new technology. Baird speculated that drivers are "not hitting the brakes until later" in an impending crash situation.
Thus, when a new security feature is introduced to mitigate a risk, "people manage to bring [the risk] back to where it was before," Baird shrugged.
One of the issues with COTS software products is the prospect that the vendor will go out of business before the software's utility has expired, and Baird said proprietary source code remains a potential snag as well. Combine several COTS programs in one installation, and the security weaknesses introduced by their interactions can multiply rapidly.
Baird briefly revisited a presentation he made to health care providers, explaining, "there was venom in the room" from nurses regarding security features. One of the principal complaints was the frequency of log-in requirements, part of a larger issue revolving around how security features affect clinical workflow. "We have to keep in mind how our efforts are being perceived" by providers and other users, he said, adding, "all of us need to do a better job at communicating why we're doing some of the things" that draw fire from users.
Baird said he has spent some time shadowing caregivers in a hospital to get a feel for how security software interacts with workflow, remarking that nurses are highly creative "in their sort of off-label-y" means of dealing with nettlesome security features. He said safety and security features are often designed separately, but risk acceptability criteria are not always easy to construct in a manner that will satisfy all users. An example of a predicament that might give a software design team fits is a security feature that incurs a risk of minor injury, but which is needed to ward off a serious security breach, a scenario in which a company will find itself with a difficult decision to make.
Small hospitals face different problems
Ken Hoyme, director of product and engineering systems security for Boston Scientific Corp., of Marlborough, Mass., said a rather conspicuous problem in the device cybersecurity arena is that big health care providers and big device makers talk to each other routinely about cybersecurity, but smaller hospitals are more numerous, and they don't always have a full complement of staff for IT support.
Some of these smaller hospitals will not be up to speed on cybersecurity concerns, in some instances because they lack "knowledge of what's in there," Hoyme said. A component-by-component approach to cybersecurity might not secure a system, either, because, "I can take a bunch of individually safe components and build an unsafe system."
User authentication has to account for the different clinical environments in which a device is used, Hoyme said, but he pointed out that a security protocol using biometrics, such as an eye scan, might be more problematic in an operating room than in a nursing home, particularly if the operating surgeon has to use eyewear in the OR. Some may resort to a "break-glass" option for accessing a secured system under urgent or emergency circumstances, an activity that is typically checked in post-event audits, but this option carries its own set of hazards that must be taken into account.
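The break-glass pattern Hoyme describes can be illustrated with a minimal sketch (hypothetical code, not from the workshop; the function and field names are assumptions): emergency access is granted without the normal authentication checks, but every use is written to an audit trail so it can be examined in the post-event reviews mentioned above.

```python
import datetime

# Hypothetical sketch of a "break-glass" override: emergency access
# bypasses normal authentication, but every use is recorded so it can
# be reviewed in a post-event audit.
audit_log = []

def break_glass_access(user_id: str, reason: str) -> bool:
    """Grant emergency access unconditionally and record the event."""
    audit_log.append({
        "event": "BREAK_GLASS",
        "user": user_id,
        "reason": reason,
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    # Access is granted without further checks: in an emergency,
    # patient safety outranks the security control being bypassed.
    return True

def break_glass_events():
    """Post-event audit: return all break-glass uses for review."""
    return [e for e in audit_log if e["event"] == "BREAK_GLASS"]
```

The hazard Hoyme alludes to is visible in the sketch: the override grants access unconditionally, so the audit trail is the only safeguard against abuse.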
Machine-to-machine authentication is another sticking point for many clinical operations, in part because authentication on a vendor-by-vendor basis might not ward off all potential problems in all environments, Hoyme said. Hospitals may also want to implement their own authentication programs, and leased equipment multiplies the authentication headaches further. A separate, potentially labor-intensive demand is ensuring that a device's essential performance is not compromised when it is electronically separated from other devices or from a system.
Hoyme said the allure of COTS is somewhat diminished by the differing life cycles of the numerous software components found in a system. A low-cost operating system favored by the accounting department, for instance, may reach the end of its expected useful life while a host of cybersecurity applications running on it are still serviceable for that clinical operation. Compounding the problem, an operating system update may create compatibility issues with those cybersecurity programs.
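The life-cycle mismatch above can be checked mechanically. Below is a minimal, hypothetical sketch (the function and component names are illustrative assumptions, not anything presented at the workshop) that flags COTS components whose vendor support ends before the clinical system they belong to is due to be retired.

```python
from datetime import date

# Hypothetical sketch: flag COTS components whose vendor support ends
# before the clinical system they are part of is scheduled to retire.
def lifecycle_gaps(components: dict, system_retirement: date) -> list:
    """Return names of components whose end-of-support date falls
    before the system's planned retirement date."""
    return [name for name, end_of_support in sorted(components.items())
            if end_of_support < system_retirement]
```

Run against an inventory, a result like `["legacy_os"]` would indicate that the operating system will lose vendor support while the rest of the installation, including its security applications, is still in service.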
Verification and validation of cybersecurity changes and patches remain essential functions, but Hoyme said developers should not ignore near-miss events that surface during those processes, because under the wrong conditions a near miss can lead to a major security failure.