
Data can drive oil and gas pipeline safety and integrity–as long as there’s commitment to data management excellence

Blog | Jul 31st, 2017


Along with petroleum products and natural gas, pipeline operators these days are moving extraordinary quantities of data–billions of bytes every second, from operational and equipment-monitoring data to video and Internet of Things feeds from sensors, mobile devices, and field surveys. It’s no exaggeration to say that a pipeline company may now produce as much data in a day as it did in a year back in the 1990s.

In an earlier installment in this blog series, we discussed the Pipeline Open Data Standard (PODS) movement and how it is enhancing safety, reliability, data transfers, and regulatory compliance–while also reducing the time, cost, and risk of implementing the geographic information systems (GIS) that document pipeline locations and designs. Now, we’d like to look at some of the organizational and managerial challenges involved in managing and using the abundance of data available. Consider, at the outset, that for all the terabytes of data pipeline operators now compile:

  • Rarely can technical consultants like TRC collect data knowing with certainty how it can and will ultimately be used by the operators.
  • Rarely do we collect data for reports, schematics, and alignment sheets knowing with certainty what questions we will need to answer from it in the future.
  • Rarely do we know at the beginning the complete story of what information we will need to submit to regulators such as the U.S. Pipeline and Hazardous Materials Safety Administration (PHMSA) or state-level pipeline regulators. 

Along with these intellectual blind spots about why we’re gathering data and how it will be used, our industry still faces some digital blind spots about the networks whose integrity and safety we are charged with protecting. Across North America, many pipelines were first installed between 20 and 50 years ago, and no remotely accessible digital records have yet been created documenting where those pipes were put in the ground, at what depth, and how they were routed. Hand-drawn schematics and 1980s-vintage Computer Aided Design and Drafting (CADD) records do allow inspectors and technicians to determine where pipelines sit and verify their safe operation. Often, however, operators haven’t yet been persuaded that it’s worth the time and capital to digitize these earlier generations of data for a cloud-based era. And rendering a long, thin corridor through rough terrain as spatial data in a consistently effective way remains challenging.

Meanwhile, dozens of pipeline safety experts who carry volumes of mission-critical data around in their memory reach retirement age every month. Capturing what they know and storing it in the Internet “cloud” is critically important–all the more so because that same cloud now automatically collects, retains, and backs up more and more transactions and information, cheaply, and in formats that make actionable analysis from anywhere faster and easier than ever.
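
To make that corridor problem concrete: pipeline data models such as PODS typically locate features by a measure, or station, along a route centerline rather than by raw coordinates–a technique known as linear referencing. The Python sketch below illustrates the core idea with a hypothetical, simplified centerline; the vertex values and the locate helper are purely illustrative, not taken from any production system.

```python
import bisect

# Hypothetical centerline: (cumulative measure in feet, x, y) vertices,
# as if digitized from an alignment sheet. Real systems (e.g., PODS)
# store measures on a route geometry; this just shows the core idea.
CENTERLINE = [
    (0.0,    100.0, 200.0),
    (1500.0, 1450.0, 900.0),
    (3200.0, 2900.0, 1700.0),
]

def locate(measure: float) -> tuple[float, float]:
    """Interpolate the (x, y) position of a station along the centerline."""
    measures = [m for m, _, _ in CENTERLINE]
    i = bisect.bisect_right(measures, measure)
    if i == 0:
        return CENTERLINE[0][1], CENTERLINE[0][2]
    if i == len(CENTERLINE):
        return CENTERLINE[-1][1], CENTERLINE[-1][2]
    m0, x0, y0 = CENTERLINE[i - 1]
    m1, x1, y1 = CENTERLINE[i]
    t = (measure - m0) / (m1 - m0)  # fraction of the way along this segment
    return x0 + t * (x1 - x0), y0 + t * (y1 - y0)

# A weld documented at station 2000 ft resolves to map coordinates:
print(locate(2000.0))
```

The payoff of this approach is that a record like “valve at station 2000” from a decades-old paper schematic can be resolved to a map position once the centerline itself has been digitized.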

Digitizing and cloud-enabling this historical information takes time and money. But even more important, it takes institutional will and an understanding of the critical value data delivers in keeping pipelines incident-free and publicity-free by promoting better-informed decision-making, risk calculation, and planning for remediation and mitigation of incidents.

Also required are policies and training to collect the right data, the right way, at the right times, at every single step. That’s especially important when you can’t know for sure which data may wind up being used, and how, in the future. Everyone in your organization who touches the creation of data must understand that data analytics are no good unless the data is good. Bad or missing data, or data squirreled away in inaccessible silos and repositories, represents a lost chance to get an early warning about a potential operational or safety issue.
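
In practice, “collecting the right data the right way” often means validating records at the point of capture, so bad or incomplete entries are flagged before they disappear into a silo. Here is a minimal sketch of that idea; the field names and the depth-of-cover range are assumptions for illustration, not drawn from any particular standard.

```python
# A minimal sketch of validating a field-collected record at ingestion.
# Field names and ranges here are illustrative assumptions only.
REQUIRED = {"pipeline_id", "station_ft", "inspector", "inspected_on"}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is usable."""
    problems = [f"missing field: {f}" for f in REQUIRED - record.keys()]
    depth = record.get("depth_of_cover_in")
    if depth is not None and not (0 <= depth <= 120):
        problems.append(f"depth of cover out of range: {depth} in")
    return problems

record = {"pipeline_id": "ML-7", "station_ft": 2000.0,
          "inspector": "J. Ortiz", "depth_of_cover_in": -4}
for issue in validate(record):
    print(issue)  # flags the missing date and the negative depth of cover
```

The design choice that matters is rejecting or flagging the record while the crew is still in the field and can fix it, rather than discovering the gap years later during an analysis or a regulatory submission.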

Complete data, well analyzed, are vital to pipeline safety. For example, one operator TRC works with maintained a thorough, reliable record of every location along its pipeline visited by inspectors over several years. That data could be rendered graphically, like a weather contour map, revealing at a glance the frequency and intensity of inspections. What we quickly saw was that large stretches of pipe from the gathering wells feeding the pipeline had gone under-inspected for years. Solid data, analyzed smartly, revealed a safety risk that was quickly rectified.
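
As a rough illustration of that kind of analysis, the sketch below bins a hypothetical inspection log into one-mile segments and flags thinly covered stretches. The visit data, segment size, and threshold are all invented for the example; a real analysis would run against the operator’s own records and render the result on a map rather than as text.

```python
from collections import Counter

# Hypothetical inspection log: station (in feet) of each documented visit.
visits_ft = [150, 400, 5300, 5350, 5600, 5800, 11000, 11200, 16000]

# Bin visits into one-mile segments and count them.
counts = Counter(int(s // 5280) for s in visits_ft)
total_miles = 4  # assumed length of this hypothetical line

for mile in range(total_miles):
    n = counts.get(mile, 0)
    flag = "  <- under-inspected" if n < 2 else ""
    print(f"mile {mile}: {'#' * n} ({n} visits){flag}")
```

Even this toy version makes the point: the under-inspected stretch only becomes visible because every visit was recorded consistently, with a location, in the first place.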

The good news today is that this data can be stored in the cloud at operating-expense prices, not with capital-expense investments. And when it comes to keeping the data secure, I often note that Amazon and Google have more data-security people on staff than most pipeline companies have people on staff, period.

Technically, none of this work is especially difficult for pipeline operators. You just have to do it. You must collect digital data at every step of the planning, routing, design, installation, operation, and maintenance of a pipeline. You need to establish and enforce rules for how that data is collected, stewarded, shared, and transferred, both among partners and within the operator’s own organization. And you need to make sure everyone collecting data, doing work in the field, or studying and analyzing the pipeline system understands they have signed on to become agents for reducing risk.
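
One way to picture such rules is to bake them into the records themselves, so every entry carries its lifecycle stage, steward, and source system. The schema below is a sketch of that idea under assumed names; it is not a PODS structure, and the stage list and fields are illustrative only.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative only: one way to make stewardship rules enforceable in
# the data itself. Stage names and fields are assumptions, not a standard.
LIFECYCLE_STAGES = {"planning", "routing", "design", "installation",
                    "operation", "maintenance"}

@dataclass(frozen=True)
class PipelineRecord:
    pipeline_id: str
    stage: str          # where in the asset lifecycle this was captured
    captured_on: date
    captured_by: str    # an accountable steward, not just "field crew"
    source_system: str  # e.g., a survey tablet app or an ILI vendor export

    def __post_init__(self):
        if self.stage not in LIFECYCLE_STAGES:
            raise ValueError(f"unknown lifecycle stage: {self.stage!r}")

rec = PipelineRecord("ML-7", "installation", date(2017, 7, 31),
                     "J. Ortiz", "survey-tablet")
```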

Data and technology are transformative for pipeline safety and integrity–but only when everyone involved is committed to the process and the gathering of accurate, reliable digital data.

Blog Author

Peter Veenstra

Peter Veenstra has over 24 years of GIS and technical experience, including extensive work rolling out enterprise pipeline GIS solutions for domestic and international clients and roles as a consultant, director of software development, software architect, programmer, and analyst. As a Principal GIS Technologist at TRC, he is responsible for defining and managing project scope and methodologies and for choosing and implementing appropriate technologies to meet client needs, with a focus on GIS database design and implementation to support pipeline operations, integrity management, construction, and maintenance. His skills include system integration strategies, data management concepts, pipeline regulatory requirements, enterprise systems architecture, GIS data structures, and desktop, web, and cloud-based software solutions. Peter actively participates in industry and data model committees; he is an original author of the APDM and serves on the PODS Technical Committee Governance Team and the PODS Board of Directors.
