In Perfect World, Sensors Prevent Oil Spills; In Real World, Not So Much

By Mary Catherine O'Connor

Data collected by the Pipeline and Hazardous Materials Safety Administration shows that remote-sensor systems are only marginally more effective at detecting leaks than passersby are.

As President Obama continues to mull whether to allow construction of the Keystone XL pipeline, which would cross the U.S.-Canada border and carry a heavy crude oil called bitumen from Alberta to the pipeline's southern stretches, the Pulitzer Prize-winning InsideClimate News reports that remote-sensing technologies, which pipeline companies tout as robust safeguards against damaging spills, do not have a great track record as a first line of defense.

Sensor networks designed to monitor pipelines for corrosion, leaks or fire (by tracking such factors as temperature, flow rates and pressure, or by using ultrasonic technology to gauge pipe thickness) are touted as one of the most promising Internet of Things applications. But the technology appears to have a long path to maturity.

The majority, 76 percent, of the 1,763 pipeline leaks that the Pipeline and Hazardous Materials Safety Administration (PHMSA) recorded between 2002 and 2010 involved fewer than 30 gallons of material, and such small spills are extremely difficult to detect, according to Randy Allen, a staff consultant at UTSI International, a systems engineering, integration and consulting firm that serves oil and gas pipeline companies, who spoke with InsideClimate News. Yet even when the publication filtered the PHMSA database to look only at spills of 1,000 barrels or more, it found that remote leak-detection systems were the first to detect only 20 percent of the incidents. "The general public and other third parties" were the first to detect 17 percent of the spills, astute employees in pipeline operators' control rooms found 18 percent, and company employees at the scene reported more than 40 percent of these large spills.
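To see why small spills evade remote monitoring, consider a bare-bones mass-balance check, one common idea behind flow-based leak detection. This is only an illustrative sketch, not any vendor's actual algorithm; the function name, readings and tolerance below are all hypothetical:

```python
# Hypothetical mass-balance leak check: flag a possible leak when
# measurably more oil enters a pipeline segment than leaves it.
# All values are invented for illustration.

def leak_suspected(inlet_flow_bph: float, outlet_flow_bph: float,
                   tolerance_bph: float = 50.0) -> bool:
    """Return True when the in/out imbalance (barrels per hour)
    exceeds a tolerance set above normal meter noise."""
    imbalance = inlet_flow_bph - outlet_flow_bph
    return imbalance > tolerance_bph

# A 30-gallon-scale seep (well under one barrel) is lost in meter noise ...
print(leak_suspected(10_000.0, 9_999.3))  # small imbalance -> False
# ... while a large rupture stands out clearly.
print(leak_suspected(10_000.0, 9_200.0))  # 800 bph imbalance -> True
```

The tolerance exists because flow meters are noisy; set it too low and the system cries wolf, set it high enough to suppress false alarms and the slow seeps that dominate the PHMSA data slip through undetected.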

The findings are not all that new. In January of this year, the Wall Street Journal ran a story about an investigation that yielded similar data. It also reported that in 2013, a rupture in a pipeline carrying crude oil through North Dakota spilled 20,000 barrels (840,000 gallons) across an area the size of six football fields. A farmer discovered the leak while harvesting his wheat fields. Tesoro Logistics, which operates the pipe, told the WSJ that it "installed more monitoring and analysis equipment in the wake of that spill."

But that's not likely to squelch the concerns of residents in Nebraska, where a section of the proposed Keystone XL pipeline would pass over the shallow Ogallala Aquifer, which provides around 80 percent of the water used in the High Plains and supplies a third of the country's irrigated farmland.

TransCanada claims it will install 13,500 sensors along the Keystone XL pipeline, and link them to hundreds of remote-controlled shut-off valves that can quickly stop a spill. Here's hoping that's the case, because past spill data indicates that remote monitoring systems sometimes work better on paper than in the field.

Fortunately, IoT companies are always working to improve sensors and monitoring systems so that they not only collect more reliable data but also make better sense of it. Toward that end, Wired recently reported on a Canadian company called C-Fer Technologies, which provides a test lab—actually, a dumpster containing dirt and a 24-foot length of pipe—where companies that are marketing sensor systems to pipeline firms can test their wares in real-world-ish environments. That's a step toward pipelines that are really smart—and, let's hope, faster than passersby when it comes to detecting leaks.

Mary Catherine O'Connor is the editor of Internet of Things Journal and a former staff reporter for RFID Journal. She also writes about technology, as it relates to business and the environment, for a range of consumer magazines and newspapers.