
Security Risks

Jan. 1, 2012
How secure are the industrial control systems within today's facilities from cyber threats?

Until a few years ago, breaches in cybersecurity, largely consisting of data theft and invasions of privacy for financial gain, were strictly the domain of information technology (IT). However, in March 2007, researchers at the Idaho National Laboratory, Idaho Falls, Idaho, at the behest of the U.S. Department of Homeland Security (DHS), demonstrated the vulnerability of a diesel generator by hacking in from a remote access point and causing it to self-destruct. Although the official report on the experiment, named Project Aurora, is classified, a video showing the generator malfunctioning with thick smoke coming out of it captivated viewers on the Internet. Project Aurora demonstrated the potential of malicious cyber attacks to compromise critical infrastructure through manipulation of industrial control systems (ICSs). However, at the time, this type of attack seemed only a remote possibility.

The Stuxnet virus brought awareness to security issues surrounding control systems at facilities such as this nuclear power plant.

Then, in July 2010, it was revealed that a virus had targeted the programmable logic controllers (PLCs) controlling the centrifuges at Iran's Natanz nuclear facility in central Iran. After inspectors with the International Atomic Energy Agency noticed an unusual number of damaged centrifuges being decommissioned at the plant, it was eventually discovered that in June 2009 the malicious software, or malware, had made its way — most likely through infected USB flash drives — into computers at the plant.

The virus, named Stuxnet, used a number of “zero-day exploits,” meaning it was created to exploit software vulnerabilities for which no patches yet existed. Although the malware first exploited a flaw in the way Windows Explorer handles LNK shortcut files to covertly place an encrypted file from the USB stick onto computers, German ICS security expert Ralph Langner eventually figured out that its true target was Simatic WinCC Step7, ICS software from German electronics and electrical engineering manufacturer Siemens that is installed to program and run the PLCs controlling motors, valves, and switches in industries worldwide.

Langner’s conclusion was a surprising revelation. Previously, ICSs and automation equipment were proprietary and ran on isolated, stand-alone networks. Because of this, they were believed to be off the radar of cyber attackers, or at least unprofitable to hackers, who were mostly seeking financial gain. Therefore, an alarming lesson learned from Stuxnet was its ability to infect machines that weren’t connected to a computer network or the Internet. Furthermore, in recent years, ICSs have become more interoperable and are more often connected to the Internet. Creating more reliable and usable systems through automation and remote access can actually make them more vulnerable.

“Arguably, Stuxnet was the first case of a sophisticated attack that was meant to damage equipment,” says Joe Weiss, managing partner of the San Francisco Bay Area-based Applied Control Solutions, which provides consulting services relating to the optimization and security of ICSs. Weiss, in addition to Langner, is recognized as one of the few experts in ICS security. A 35-year veteran in the field of industrial instrumentation controls and automation, with more than 10 years in ICS cybersecurity, he serves on numerous ICS security standards organizations and is author of the book Protecting Industrial Control Systems from Electronic Threats.

“Before, this was pretty simple,” Weiss continues. “If you had a problem, it was because a valve had a bad design, and that was easy enough to fix. Or you had a problem with boiler control software. When you fixed the boiler control software, it was done. Now this is ongoing. Every time you make a change, the question is: Have you done something to create a cyber vulnerability that wasn’t there before?”

Law of Opposite

Unlike Stuxnet, however, not all cyber incidents involving control systems are intentional. For example, the natural gas pipeline rupture in San Bruno, Calif., which killed eight people in September 2010, was most likely triggered by work being done on an uninterruptible power supply (UPS) system located about 39.33 miles southeast of the accident site, according to the National Transportation Safety Board’s report. During the course of this work, the power supply from the UPS system to the supervisory control and data acquisition (SCADA) system malfunctioned: instead of supplying its predetermined output of 24V of direct current (DC), the UPS system delivered approximately 7VDC or less to the SCADA system. Because of this anomaly, the electronic signal to the regulating valve for Line 132 was lost, and the valve moved from partially open to fully open, as designed.
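To make that failure mode concrete, here is a minimal sketch of how a supply voltage that drops below its valid range can register as a lost signal and drive a valve to its designed fail position. The thresholds and function names are hypothetical illustrations, not the actual SCADA logic described in the NTSB report:

    # Simplified model of fail-to-open behavior on loss of control signal.
    # Thresholds and names are illustrative, not from the NTSB report.
    NOMINAL_SUPPLY_VDC = 24.0
    MIN_VALID_SUPPLY_VDC = 20.0   # below this, treat the signal as lost

    def regulating_valve_position(supply_vdc, commanded_pct):
        """Return the valve opening in percent (0 = closed, 100 = open)."""
        if supply_vdc < MIN_VALID_SUPPLY_VDC:
            # Electronic signal lost: the valve moves to its designed
            # fail-safe position -- here, fully open.
            return 100.0
        return commanded_pct

    # The UPS sagged to roughly 7VDC, so a partially open valve went full open:
    print(regulating_valve_position(7.0, commanded_pct=35.0))   # -> 100.0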

In 2007, a DHS demo showed that emergency generators can be vulnerable to cyber attacks.

Whether an incident is malicious or unintentional, Weiss applies the NIST definition of a cyber incident, which is electronic communication between systems that affects confidentiality, integrity, and/or availability — the CIA triad. For control systems, availability has the biggest impact. “We’re talking about the availability of a system or a process,” says Weiss. “That’s where this gets to be very different from IT, which is about the loss of communications to an IT computer.”

As an example of loss of communication in control systems, Weiss uses a broadcast storm (too much data on the network) at the Browns Ferry Nuclear Plant, which resulted in the loss of two 10,000-horsepower main coolant pumps.
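A toy simulation shows why a broadcast storm is an availability problem: a device's network interface can process only so many frames per cycle, and flood traffic crowds out the control messages it depends on. The frame counts below are invented for illustration and have nothing to do with the Browns Ferry event specifics:

    # Toy broadcast-storm model. A device handles a fixed number of frames
    # per cycle; flood traffic starves out legitimate control messages.
    import random

    FRAMES_PER_CYCLE = 100
    random.seed(1)

    def control_msgs_handled(frames):
        random.shuffle(frames)                # arrival order is arbitrary
        handled = frames[:FRAMES_PER_CYCLE]   # capacity limit per cycle
        return handled.count("control")

    normal = ["control"] * 5 + ["broadcast"] * 50
    storm  = ["control"] * 5 + ["broadcast"] * 100_000

    print(control_msgs_handled(normal))   # 5 -- all control traffic served
    print(control_msgs_handled(storm))    # almost certainly 0 -- starved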

Until recently, the priorities for design requirements for control systems consisted of performance, reliability, and safety — not security. “Security is not only a new constraint, but also often goes in the opposite direction of reliability and safety,” says Weiss, who explains that securing a system requires retrofitting new security requirements where none previously existed. “You’re not building a new system from scratch. You’re putting requirements into a system that go exactly the opposite of why the system works in the first place. You’re making it more complicated.”

According to Weiss, Stuxnet is a great example of the differences between security for control systems and IT. “Many people focused on the Windows zero-days, but they were simply a delivery vehicle,” Weiss explains. “The warhead affected the controller by changing the controller logic. This was an unexpected ICS attack for which no IT security solution applied then or applies now.”

Therefore, security for control systems should not be approached in the context of IT. Yet, says Weiss, today's security requirements are based not on what it takes to secure a control system against control system threats, but on what it takes to secure the IT systems (Windows servers and PCs) used in control system applications against IT threats.

Unfairly then, ICS devices, which have proven safe and reliable for years, have been accused of failures in security. “The devices are really good, well-designed, reliable, and safe systems,” says Weiss. “They were never meant to be security systems, but now they’re being accused of not doing what they were not designed to do. How’s that for a double negative?”

100 Days

Because of its stealthy nature, the Stuxnet worm was not discovered for more than a year as it did not directly affect the performance and safety of its targeted nuclear plant. Moreover, because the vulnerability it exploited was a design flaw and not patchable, DHS didn’t even call it a vulnerability, says Weiss. “It wasn’t a design deficiency for reliability or safety; only for security,” he says.

However, since the discovery of Stuxnet and the vulnerabilities it exploited in the Siemens controllers, there’s been a move toward finding similar vulnerabilities in other controllers, particularly with the hard-coded default passwords. “That’s common in controllers from many of our vendors,” says Weiss. “They’ve always been there, but now people are starting to find them.”
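The anti-pattern is easy to state in code. The sketch below is a hypothetical illustration, not drawn from any specific vendor's firmware: because the credential is compiled into every shipped unit and cannot be changed, anyone who extracts it from one device, or reads it in a public advisory, can authenticate to all of them:

    # Hypothetical hard-coded credential check of the kind found in many
    # controllers. The values are invented for illustration.
    HARDCODED_USER = "admin"
    HARDCODED_PASS = "0000"    # identical in every unit; not user-changeable

    def authenticate(user, password):
        # One leaked firmware image compromises the entire installed base.
        return user == HARDCODED_USER and password == HARDCODED_PASS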

In the past, availability has trumped security for facilities such as water treatment plants.

For the past six months, for instance, Billy (BK) Rios, a security expert at Google and former security engineer for Microsoft, and Terry McCorkle, an information security red team member for Boeing, have undertaken an independent ICS security project. The two set out to find 100 security flaws in 100 days so that they could present their findings at a conference. “We wanted to take a look at the state of security for other control systems in general,” says Rios. “We wanted to approach it from the perspective of what can two regular guys do without a charter and without a lot of money when it comes to control systems and their security.”

The pair began requesting free trial software and auditing it in their spare time on nights and weekends. Instead of 100 bugs in 100 days, they ended up finding more than 600. “When I look at what modern, secure machines and robust software look like, control systems software does not fall into that category,” says Rios. “The software robustness and security of Apple’s iTunes is probably better than 99% of the control systems I’ve seen out there, both at the hardware and the software level — that’s pretty sad.”

Rios and McCorkle reported their findings to the Industrial Control Systems Cyber Emergency Response Team (ICS-CERT), overseen by DHS, which has since issued some public advisories based on their research. Recently, ICS-CERT published an alert regarding Internet-facing control systems. The ICS-CERT warning lists five reports so far in 2011 of SCADA and ICS systems exposed using scanners, and warns asset holders to secure their systems. “The thing we found to be the most disappointing piece is that implementers usually don’t know about this kind of stuff,” says Rios. “They buy the software from whatever vendor, and they implement it into their own environments, on their own networks, and they’re just completely vulnerable to all sorts of crazy stuff.”
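The exposure those scanners find is typically nothing more exotic than an ICS protocol answering on a public address. As a rough illustration of the kind of check the scanning services automate, intended for asset owners probing their own hosts, a banner grab takes only a few lines (the host shown is a placeholder):

    # Minimal banner grab against one host/port. Scan only systems you own.
    import socket

    def grab_banner(host, port, timeout=3.0):
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.settimeout(timeout)
            try:
                return s.recv(256)
            except socket.timeout:
                return b""

    # Example: port 502 is Modbus/TCP. Any answer at a public address tells
    # a scanner that a control system protocol is Internet-facing.
    # grab_banner("203.0.113.10", 502)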

In addition, Rios and McCorkle work with ICS-CERT to notify software vendors of their vulnerabilities. “We’ve been working with those guys to basically contact vendors and have them fix their software,” says Rios. “It’s a very sensitive subject. They don’t want to go to their customers and say, ‘Yeah, our software quality from a security standpoint isn’t very good.’ They don’t want to say that to people, for obvious reasons.”

Currently, Rios and McCorkle only share their research with ICS-CERT. However, they have plans to go public with a database of their findings. “At some point, we’re going to make these vulnerabilities public,” says Rios. “That way, people implementing the software will have some idea that they have these exposures, and they can take other mitigating actions to prevent exploitation of these vulnerabilities.”

In the meantime, Rios discusses some of the research on his blog at http://xs-sniper.com, where the announcement of the final project will also be published.

Until the vendors take notice, however, both they and the integrators are in a bind, according to Weiss. “The vendors aren’t going to build a new secure system that costs more money if the end-users aren’t willing to pay more money for a secure system,” he says. “And the end-users aren’t going to specify a more secure system that costs more money and may or may not be as reliable as an older one without security if they’re not forced to do so. It’s a catch-22.”

The Official Octopus

Eventually, Langner came to realize that Stuxnet wasn’t just targeting the Siemens controller; its code contained information about the specific technical configuration of the nuclear facility. It was only targeting that particular facility. As such, Stuxnet was not treated as a widespread threat to critical infrastructure that relies on automated systems, such as the electrical grid, transit systems, and sewage treatment plants and dams, or even non-critical systems such as automation for manufacturing facilities and electrical and mechanical systems for offices, schools, and hospitals. Also, because the attack slowed down the Iranian nuclear program, it was reported in a positive light. “Because it happened to Iran, and it was done to a centrifuge, people looked at it as a good thing,” says Weiss. “People are going, ‘Well, I don’t have a centrifuge; therefore, it can’t affect me.’ So, unfortunately, the security guys went nuts with it, but most of the operations people basically went, ‘Well, that’s interesting, but it doesn’t affect me.’ It hasn’t had nearly as much of an impact as we thought or we hoped it would in terms of getting people to do the right thing and secure their systems.”

Still, there have been some efforts made to improve ICS security. In the last year, the Obama administration launched a “cyber command” at the U.S. Department of Defense, which has improved coordination between the Pentagon’s efforts and the DHS’ initiative on the civilian response to cyber threats. Yet, although some regulatory agencies have laid out best practices for guarding against exploits, compliance is voluntary, and, according to Weiss, is often missing input from ICS experts. “Arguably, there are only a limited number of people who are actually control system cybersecurity experts,” says Weiss. “However, those people are generally not consulted when the subject of control system security is raised. The Enduring Security Framework (ESF) Operations Group not only has no control system experts, but it also hasn’t even included control system suppliers in the mix.”

In addition, Weiss finds that the recent U.S. Department of Energy (DOE) and DHS road maps are vague and do not address the control system cybersecurity issues actually being faced. According to Weiss, DOE’s draft “Electricity Sector Cybersecurity Risk Management Process Guideline” does not distinguish between IT and control systems. NIST’s National Initiative for Cybersecurity Education (NICE) does not address control systems. The recent MIT report on the future of the electric grid does not adequately address cybersecurity of control systems. Furthermore, Weiss points to the omission of Stuxnet in the critical infrastructure protection standards (CIPs) put out by the North American Electric Reliability Corp. (NERC), which upholds the mission of ensuring the reliability of the North American bulk power system. NERC Standards CIP-002 through CIP-009 provide a cybersecurity framework for the identification and protection of Critical Cyber Assets to support reliable operation of the bulk electric system. “Not only do the NERC CIPs essentially exclude Stuxnet, but they also exclude almost 70% of the power generation in North America from being examined for cybersecurity threats,” Weiss says. “That doesn’t make sense, does it?”

Despite the potentially enormous impact the revelations about Stuxnet could have on the industry, very little has been done to improve security. “So even though this was a big thing, its impact on changing people’s behavior and taking security more seriously has been limited,” says Weiss. “We expected a lot more. It’s not clear to me that we have made much progress since 2000.”

Steps can be taken to beef up security without sacrificing reliability or safety. Traditional security strategies, such as adding a firewall or changing default passwords, don’t necessarily work in a control system environment. In fact, changing the default passwords in a PLC could effectively shut down the PLC. “What is important is to learn how to secure them while allowing them to continue to do their jobs,” says Weiss, who advises that because many control systems (especially field devices) have no security and may not be patchable, it is critical that they be secured by policies and procedures meant for control systems.

But very few facilities have written ICS cybersecurity policies. “Everybody has IT policies, but IT policies are not adequate for control systems,” continues Weiss. “This is why the International Society of Automation (ISA) initiated S99, ‘Security for Industrial Automation and Control Systems.’ You can’t just give this to IT and walk away.”

That’s not to say operators shouldn’t take advantage of the security measures that come with the Windows human machine interface (HMI) component of the ICS. “That not only can but also should have all kinds of available security,” says Weiss. “It’s a Windows system. If you’re not using the security, then shame on you.”

However, security is more difficult as you move into the hardware component of control systems, such as PLCs, chemical analyzers, variable-frequency drives (VFDs), and smart transmitters. “If you start trying to change them, with all sorts of crazy modifications, you can hurt reliability,” Weiss says.

Asset management plays an important part in security also. It’s important to know what equipment is installed in the field and how it’s connected. “Very few organizations know what they have installed in the field that can be affected by cyber threats,” says Weiss.
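A starting point can be as simple as a structured inventory record per field device, capturing what each device is and how it communicates. The record below is a minimal sketch; the field names are illustrative, not drawn from any standard:

    # Minimal asset-inventory record for field devices. Fields are invented.
    from dataclasses import dataclass

    @dataclass
    class FieldAsset:
        tag: str                  # plant tag, e.g. "FCV-132"
        device_type: str          # PLC, VFD, smart transmitter, relay, ...
        vendor_model: str
        firmware_version: str
        network: str              # "serial", "Modbus/TCP", "dial-up modem", ...
        remotely_accessible: bool

    inventory = [
        FieldAsset("FCV-132", "valve controller", "ExampleCo X1", "2.3.1",
                   "Modbus/TCP", remotely_accessible=True),
    ]

    # Devices that are reachable remotely and cannot be patched are the ones
    # that most need compensating policies and procedures.
    exposed = [a for a in inventory if a.remotely_accessible]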

Furthermore, Weiss says much can be learned from past and recent incidents. He keeps a database with more than 200 actual control system cyber incidents.

Security and compliance are not the same. “You can be secure but not compliant, which is a paperwork problem, or you can be compliant but not secure, which is a real problem,” says Weiss.

In at least one case, two major electric utilities mandated the use of anti-malware for protective relays even though the anti-malware could have prevented the relays from operating.

The question of responsibility for the security of the systems controlling infrastructure is complex. Vendors, owners, and operators, along with researchers and regulators, all bear a portion. “It’s everybody’s and nobody’s,” says Weiss. “It’s the octopus with tentacles everywhere. It’s a genuinely big problem, because it is communication — and it affects everyone.

“The point of advanced automation technologies is to improve reliability and safety by having remote access to substitute having experts at all of these locations. The irony is we are making these systems more vulnerable by trying to make them more reliable and usable.”

About the Author

Beck Ireland | Staff Writer
