Like much of the nation’s infrastructure, the U.S. oil and gas pipeline network is aging: more than half of its 2.5 million miles were built in the 1950s and 1960s. This aging network has become more difficult to maintain as corrosion and time cause pipes and related equipment to deteriorate.
Though pipelines are inarguably the safest way to move oil and gas, keeping on top of pipeline conditions is critical for any operator. It’s an area where automation technologies have a role to play in helping to detect leaks, corrosion, failing equipment and other problems early enough to prevent serious harm to people and the environment.
Pipeline operators are required by federal regulations to conduct a visual inspection for leaks at least 26 times a year by walking, driving or flying over their pipeline rights of way. Not only is this often labor-intensive and time-consuming, but human eyes are probably the least effective way to find a leak unless it’s a major one.
That’s why operators are turning to sensors, smart instrumentation, analytical software and other digital technologies to gauge the real-time conditions in their pipeline networks, as well as the equipment that manages the flow of products within them. This increasing digitization of the industry, however, has brought with it new risks, and cybersecurity is a topic of growing concern for operators, regulators and public interest groups.
Managing complexity
“The pipeline industry relies on instrumentation for measurement and devices such as pumps and valves to control actions,” explains Paul Dickerson, technical and engineering product manager, R&D/engineering, for Emerson Automation Solutions. Key indicators for determining operating safety include flow, pressure, temperature, levels of CO2 and other gases, and product density and viscosity.
Operating requirements vary according to the product being transported, whether it’s crude oil, natural gas, refined products or liquefied gases such as propane or butane. The strictest standards are applied to highly refined products such as jet fuel to avoid contamination.
“A pipeline is a very complex system—the volume of fluid can be vast and conditions are not constant,” Dickerson says. “You have to manage a massive amount of information and separate out what’s important and what can be ignored in order to interpret what’s really going on inside the pipe.”
Pipeline operators are beginning to use predictive analytics to provide a more complete view of conditions within their infrastructures. Smarter instrumentation and wireless sensors are allowing them to sample data at more places more frequently. Cameras, acoustic meters and fiber-optic cable are among the many technologies being deployed to identify corrosion and leaks. Cathodic protection systems, which apply a small electrical current to the pipeline to inhibit rust and corrosion, are also becoming more common.
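To make Dickerson’s point about separating signal from noise more concrete, the sketch below shows one simple way a monitoring system can flag readings that deviate sharply from a recent baseline. It is an illustrative example only, not any vendor’s product; the window size, threshold and sample pressure values are assumptions.

```python
from collections import deque
from statistics import mean, stdev

def make_anomaly_detector(window=60, z_threshold=4.0):
    """Flag readings that deviate sharply from the recent rolling baseline.

    window       -- number of recent samples used as the baseline
    z_threshold  -- how many standard deviations counts as 'important'
    Both values are illustrative and would be tuned per pipeline segment.
    """
    history = deque(maxlen=window)

    def check(reading):
        is_anomaly = False
        if len(history) >= window:
            mu, sigma = mean(history), stdev(history)
            # Guard against a flat baseline (sigma == 0) before dividing.
            if sigma > 0 and abs(reading - mu) / sigma > z_threshold:
                is_anomaly = True
        history.append(reading)
        return is_anomaly

    return check

# Example: a sudden pressure drop stands out against a noisy but stable baseline.
check_pressure = make_anomaly_detector(window=30, z_threshold=4.0)
readings = [52.0 + 0.1 * (i % 5) for i in range(40)] + [45.0]  # hypothetical values
flags = [check_pressure(r) for r in readings]
print(flags[-1])  # True -- the final drop is flagged for operator attention
```

In practice the interesting work is in tuning such filters per measurement point and operating mode, which is exactly the expertise Dickerson describes.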
Simulation software is another tool being used to determine whether pressure limits are being exceeded within a pipeline. This can be an important indicator of operational safety, particularly on downslopes, where pressures tend to be higher. More powerful programmable logic controllers (PLCs) now enable smoother, ramped shutdowns, minimizing the short, sharp pressure waves created by sudden shutdowns that can weaken pipes.
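Those pressure waves are the classic water-hammer effect, and a back-of-the-envelope calculation shows why a ramped shutdown matters. The sketch below uses the standard Joukowsky estimate; the fluid properties, line length and closure time are illustrative assumptions, not figures from the article.

```python
def joukowsky_surge(density_kg_m3, wave_speed_m_s, delta_velocity_m_s):
    """Classic Joukowsky estimate of the surge pressure from an instantaneous
    velocity change: delta_P = rho * a * delta_v (result in pascals)."""
    return density_kg_m3 * wave_speed_m_s * delta_velocity_m_s

def critical_closure_time(pipe_length_m, wave_speed_m_s):
    """A closure slower than 2L/a lets the reflected wave relieve part of the
    surge, which is why PLC-controlled ramped shutdowns reduce peak pressure."""
    return 2.0 * pipe_length_m / wave_speed_m_s

# Illustrative numbers only: a crude-oil line section, not a real installation.
rho = 850.0      # crude oil density, kg/m^3
a = 1100.0       # pressure-wave propagation speed in the line, m/s
dv = 2.0         # flow velocity stopped by a sudden valve closure, m/s
L = 20_000.0     # distance to the upstream reflection point, m

surge_bar = joukowsky_surge(rho, a, dv) / 1e5
print(f"Worst-case surge from an instant stop: ~{surge_bar:.0f} bar")
print(f"Closures slower than ~{critical_closure_time(L, a):.0f} s avoid the full surge")
```

Even these rough numbers show how a few extra seconds of ramping can keep a shutdown well inside a pipe’s pressure rating.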
“Aerial systems, such as drones, helicopters, planes and even satellites, can detect leaks by equipping them with different types of cameras to gain high-resolution information on vegetation, soil moisture and temperature,” explains Germán Fernández, global vertical marketing manager for Belden. “Automatic image processing makes it easier for companies to develop predictive maintenance programs.”
Despite the headlines that occur whenever there’s a spill, Fernández says, “there is actually more gasoline spilled on the forecourts of filling stations in the U.S. through carelessness than what emanates from pipelines.”
Modeling leaks
Automating more pipeline operations comes with its own risks and doing it properly requires a great deal of industry experience, according to Dickerson. “It requires fine-tuning the sensitivity of control systems, accurately calculating product volumes and locations, alarms that are reliable enough for an operator to trust, and achieving accuracy under many different operating and environmental conditions,” he says.
A case in point is Centurion Pipeline, whose continually changing operations were causing severe transients and frequent false leak alarms. Centurion is a wholly owned subsidiary of Occidental Petroleum, the largest operator and oil producer in the Permian Basin in West Texas.
Most of Centurion’s gathering pipelines have many injection and delivery points that are constantly being turned on and off depending on market prices, demand and well conditions. Changes in the elevation profile and the frequent startups and shutdowns were causing slack-line conditions, in which sections of the pipe do not run completely full. Drag reducing agents, which increase a pipeline’s transportation capacity, also had to be tracked.
Emerson deployed its PipelineManager application, which is able to distinguish between real leaks and disturbances caused by transients, using dynamic thresholds to avoid false alarms. The software, which has significantly reduced nuisance alarms, employs a real-time transient model (RTTM) that constantly models and simulates actual pipeline dynamics based on field instrumentation data delivered through the supervisory control and data acquisition (SCADA) system.
The system was installed in a redundant fashion (local hot/standby and site to site) on a virtual environment and implemented on five oil pipelines and one CO2 pipeline, totaling 1,200 miles. Several filters based on the rate of change in pressure make it possible to detect leaks of different sizes.
Additional volume balance sections were configured and tuned to respond only to rupture leaks. The system reduces the number of alarm situations and standardizes them so that each requires an operator response, and rupture alarms are assigned the highest priority so that emergency teams can be quickly deployed.
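For readers unfamiliar with the volume-balance approach, the sketch below illustrates the basic idea of comparing metered inflow and outflow against a modeled line-pack change, and widening the alarm threshold during transients. It is a simplified, generic illustration, not Emerson’s PipelineManager algorithm; all flow values and thresholds are hypothetical.

```python
def volume_balance_alarm(inflow_m3h, outflow_m3h, packing_rate_m3h,
                         transient_activity, base_threshold_m3h=5.0,
                         transient_factor=3.0):
    """Generic volume-balance leak check with a dynamic threshold.

    inflow/outflow     -- metered flow into and out of the monitored section
    packing_rate       -- change in line pack estimated by the transient model
    transient_activity -- 0.0 (steady state) .. 1.0 (heavy transients)
    The threshold is widened during transients so pump starts, valve moves and
    batch changes do not trigger nuisance alarms; all values are illustrative.
    """
    imbalance = inflow_m3h - outflow_m3h - packing_rate_m3h
    threshold = base_threshold_m3h * (1.0 + transient_factor * transient_activity)
    if imbalance > 10 * threshold:
        return "RUPTURE"        # highest priority: dispatch emergency response
    if imbalance > threshold:
        return "POSSIBLE LEAK"  # requires operator investigation
    return "NORMAL"

# Steady state: a 12 m3/h imbalance exceeds the tight 5 m3/h threshold.
print(volume_balance_alarm(500.0, 485.0, 3.0, transient_activity=0.0))
# The same imbalance during a pump start is tolerated by the widened threshold.
print(volume_balance_alarm(500.0, 485.0, 3.0, transient_activity=1.0))
```

A real-time transient model does the hard part of this calculation, estimating line pack and expected flows under constantly changing conditions so the thresholds can stay tight without generating nuisance alarms.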
Staying within the envelope
“Operators want to push the maximum amount of product through their pipelines, but to do that safely means they have to know if they’re staying within the operating envelope of what is possible with their infrastructure,” explains Lars Larsson, senior product manager at Schneider Electric.
Though automation technologies can assist and inform operators, people ultimately make the decisions. “An emergency shutdown is a big decision,” Larsson says. “Although some companies allow their control systems to do an automated shutdown based on certain conditions, like exceeding pipeline pressure maximums, most want their operators to be in full control at all times.”
The downside is that operator errors, such as ignoring alarms or overruling safety systems, have contributed to many major pipeline accidents over the years. That’s why the National Transportation Safety Board (NTSB), an independent federal agency, has put such emphasis on control room operator training over the years, according to Robert Hall, director of NTSB’s Office of Railroad, Pipeline and Hazardous Materials Investigations.
Automation suppliers have also stepped up with improved technologies to help pipeline operators better manage their operations, says Emerson’s Marc Buttler, marketing director for the oil and gas industry. “Flowmeters, which can fail over time, are being replaced by extremely accurate Coriolis meters,” he says. “Wireless transmitters that consume little power can be placed at more locations to push data to control systems in seconds rather than minutes. Instrumentation that monitors pumps, motors and other rotating equipment now comes with predictive maintenance features.”
Inline inspection technology for detecting potential leaks is also changing rapidly, according to the NTSB’s Hall. “The industry has gotten significantly better at locating defects in pipelines with second-generation technologies such as ultrasonic sensors and magnetic flux detection systems, which are used extensively in gas pipelines,” he says. “Smaller and more accurate sensors with improved data capabilities also allow finer resolution of information.” His office continues to encourage the development of such technologies for pipeline integrity management, he adds.
Detecting corrosion
Corrosion detection is a priority for many regulators looking to limit future leaks. Kjell Wold, global business development manager for Emerson, says that 25-50 percent of pipeline incidents in North America are related to corrosion, and half of those are linked to internal corrosion. The company offers a non-intrusive monitoring technology, called Field Signature Method (FSM), to detect corrosion earlier, more accurately and at a lower total cost.
The causes of internal corrosion vary from pipeline to pipeline, depending on the product being transported, but it’s often linked to the presence of water. Because water is denser than oil and gas, it settles at the bottom of pipes, which is where corrosion is most often found and where access is particularly challenging for most monitoring technologies. Changing process conditions, such as lower flow rates, also increase the probability of water holdup.
“Since pigging, the traditional method for cleaning and inspecting pipe interiors, is an expensive process, with limited resolution and sensitivity, it is done only periodically and is not a tool for proactive corrosion management,” Wold says.
FSM works by feeding an electrical current through the pipe wall and measuring the voltage drop between pairs of sensing pins installed on the external pipe surface. The first set of readings establishes the “signature,” or benchmark, against which later readings are compared. From changes in the voltage-drop readings and their distribution, uniform corrosion can be quantified and localized corrosion can be tracked by pin location.
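The arithmetic behind that comparison can be illustrated simply: for a fixed current, metal loss raises the resistance of the pipe wall, so the voltage drop across a pin pair increases. The sketch below is a toy illustration of comparing readings against the initial signature, not Emerson’s FSM algorithm, and the pin readings are invented.

```python
def corrosion_indicators(baseline_mv, current_mv):
    """Compare voltage drops across pin pairs against the initial 'signature'.

    For a fixed current, a thinner wall has higher resistance, so the voltage
    drop across a pin pair rises as metal is lost. The relative change per pair
    separates general (uniform) wall loss from localized attack such as pitting.
    All numbers are hypothetical and the arithmetic is a simplification.
    """
    changes = [(cur - base) / base for base, cur in zip(baseline_mv, current_mv)]
    uniform = sum(changes) / len(changes)   # average change: general wall loss
    localized = [(i, c) for i, c in enumerate(changes) if c > uniform + 0.02]
    return uniform, localized

# Signature taken at installation vs. a later reading (millivolts per pin pair).
baseline = [1.00, 1.02, 0.99, 1.01, 1.00, 0.98]
later    = [1.01, 1.03, 1.00, 1.09, 1.01, 0.99]

uniform, hotspots = corrosion_indicators(baseline, later)
print(f"Average (uniform) change: {uniform:.1%}")
print("Pin pairs showing localized corrosion:", hotspots)
```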
“The great benefit of FSM is that it can be installed directly on the pipe, even in areas that are difficult to access, and can normally be retrofitted to pipelines already in operation,” Wold says. “After installation, the soil can be backfilled so that only the FSMLog instrument, along with power and communication devices, will be visible on top of the ground, making it possible to receive online data from very remote locations.”
Giving humans a helping hand
“There are limits to the number of things human operators can deal with at any given time,” says Ross Otto, engineering and program manager, Oil and Gas Canada, for Rockwell Automation. That’s why automation suppliers are focused on faceplate-driven control, creating templates within the human-machine interface (HMI) to standardize control system displays. “Automation is more reliable than humans, and by removing multiple routine activities that might overwhelm operators, we can give them the ability to focus on the variations that actually indicate leaks or other problems,” Otto adds.
To protect industrial networks like those operated by pipeline companies from cyber threats, Rockwell Automation has partnered with Claroty, whose software detects anomalies by monitoring and analyzing traffic between assets. Anomalies are reported to plant and security personnel along with actionable insights to help them investigate, respond and recover systems.
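Conceptually, tools of this kind learn a baseline of normal conversations between assets and report anything outside it. The sketch below is a deliberately generic illustration of that learn-then-monitor pattern, not Claroty’s software or API; the asset names, protocols and flows are invented.

```python
def learn_baseline(observed_flows):
    """Learning phase: record which (source, destination, protocol) conversations
    are normal on the control network. Flows here are simplified tuples."""
    return set(observed_flows)

def detect_anomalies(baseline, new_flows):
    """Monitoring phase: anything outside the learned baseline is reported to
    plant and security personnel for investigation."""
    return [flow for flow in new_flows if flow not in baseline]

# Hypothetical asset names and protocols, purely for illustration.
normal_traffic = [
    ("scada-server", "plc-pump-01", "modbus"),
    ("scada-server", "plc-valve-07", "modbus"),
    ("historian", "scada-server", "opc-ua"),
]
baseline = learn_baseline(normal_traffic)

todays_traffic = normal_traffic + [("laptop-unknown", "plc-pump-01", "modbus")]
for src, dst, proto in detect_anomalies(baseline, todays_traffic):
    print(f"ALERT: unexpected {proto} traffic from {src} to {dst}")
```

Production tools go far beyond this, inspecting industrial protocols in depth and correlating events across assets, but the baseline-and-deviation pattern is the common starting point.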
Improved cybersecurity is just one aspect of the overall drive by operators to make pipeline management safer and more efficient. Like everything else involved in pipeline operations, it relies on accurate data. To improve data accuracy, automation suppliers are making products like self-diagnosing sensors and meters that will report when they need to be recalibrated. This enables operators to perform maintenance work only as needed, rather than on a time-based schedule, which improves accuracy and reduces costs.
“By using software to look for trends in the data, we can also predict breakdowns before they happen, often many months before if historical data is being analyzed,” Otto says. “Analytics are still in their infancy and it’s an important area where algorithms need to be improved. While analytics have not yet been applied to leak detection, they have the potential to help us identify the patterns that could predict future leaks.”
Instrumentation for pipelines is focused on production equipment and the flow of product because they impact revenue and top-line growth, according to Michael O’Connell, chief analytics officer at Tibco. “Analytics are being used primarily for condition-based maintenance, using things like failure models, time in service and time to failure,” he says. “Rules and models are being applied to sensor data for up-to-the-minute maintenance applications. This contributes to safer operations and there’s also great potential in applying sensor-based analytics for condition-based leak detection.”
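As a simple illustration of the time-to-failure idea O’Connell describes, the sketch below extrapolates a linear trend in pump vibration to estimate when a maintenance threshold will be crossed. The data, threshold and linear model are assumptions chosen for clarity; real condition-based maintenance relies on far richer failure models.

```python
def estimate_hours_to_threshold(vibration_readings_mm_s, hours_between_readings,
                                failure_threshold_mm_s):
    """Extrapolate a linear trend in pump vibration to estimate remaining hours
    before the failure threshold is reached. A least-squares slope stands in
    for real failure models; all data are hypothetical."""
    n = len(vibration_readings_mm_s)
    xs = [i * hours_between_readings for i in range(n)]
    mean_x = sum(xs) / n
    mean_y = sum(vibration_readings_mm_s) / n
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(xs, vibration_readings_mm_s)) / sum((x - mean_x) ** 2 for x in xs)
    if slope <= 0:
        return None  # no degradation trend detected
    latest = vibration_readings_mm_s[-1]
    return max(0.0, (failure_threshold_mm_s - latest) / slope)

# Weekly vibration readings from a pipeline pump, trending slowly upward.
readings = [2.1, 2.2, 2.2, 2.4, 2.5, 2.7, 2.9]
hours = estimate_hours_to_threshold(readings, hours_between_readings=168,
                                    failure_threshold_mm_s=4.5)
print(f"Schedule maintenance within ~{hours:.0f} operating hours")
```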
Companies are also using automation technologies to improve business decision-making and provide customers with the right products and services in a timely fashion. Istrabenz Plini, an integrated gas company in the Slovenian market, turned to Tibco’s Spotfire software to automate its enterprise reporting processes across its supply chain.
“We were doing a lot of manual reporting work in ERP [enterprise resource planning] systems, spreadsheets and in custom-made legacy systems. It was a very inefficient process and mistakes were being made,” says Črtomir Ješelnik, CIO for Istrabenz Plini. “The software gives us right-time reporting and operations information, which is really important for reliability. It has helped transform our business from a distributor to a multi-utility company.”
Turning to AI
“Technology will remain front and center in the effort to maintain safe pipelines,” says Gregory Tink, managing partner of Streamline Control Solutions. “Smart devices are pushing out more information, but this massive amount of data is often too much for control systems or humans to handle. That’s why you’re going to see artificial intelligence (AI) being embedded in equipment in the near future. Machine learning and cloud-based systems will take a little bit of the responsibility off humans, but it will also change what they do.”
Tink says the adoption of AI will happen in three phases: training the AI engines and creating the algorithms, teaching the machines what to do with the information, and finally tiering decisions so the systems can proactively look at information and tell operators, “You need to do this now.”
“Software that enables predictive maintenance and proactive monitoring for safety will become the priorities, while SCADA software will become less important,” Tink says. “It’s one of the reasons pipeline managers are beginning to question why they need to spend millions of dollars every few years to upgrade their SCADA systems.”
Though the amount of data being generated by smart instruments and control systems has already reached a point where it’s hard for humans to make sense of it, it would be a mistake to shut the data off too soon, according to Tink. “We don’t know yet what’s really important to know about operating pipelines more safely,” he says. “By going through that exercise over a period of time, we’ll be able to let the machines analyze, learn, identify and point out to us which trends are really important.”
This will be even more critical as retiring Baby Boomers take their knowledge base with them, Tink adds. “Companies need to start managing this transition and using their automation systems to feed knowledge to younger operators,” he says. “Safety will be a big driver of these changes and it will accelerate the adoption of more intelligent systems.”