
In this age of big data, someone coined the phrase 'data is the new oil'.

By James Potts, Director, OpenLink

It’s clever enough, but the problem with the expression is that it implies the two are very separate things. If you ask the operator of a major oil pipeline or storage facility, they’ll quickly disabuse you of that notion. The two go hand in hand and more oil invariably means more data. The trouble is getting that data under control so you can extract its value.

Despite a low oil-price environment and decarbonization drives, the global population and economy continue to grow, spurring ever-increasing demand for hydrocarbon products – including in advanced economies like the US. For example, last August was the highest month on record for US gasoline consumption, with an average of 9.7 million barrels used per day.

In fact, worldwide, the market is saturated with oil. OPEC production cuts aren’t making a dent in global stockpiles of crude or refined products and, in the US, the desire for energy independence means the shale boom continues apace.

Now, though, we’re also starting to see that oil moves around a lot more. In the States, long-planned pipeline projects have been un-paused by the new Republican administration, at the same time that Canada continues to pump oil across the border to its southern neighbor.

Then you have the fact that US shale is distributed right across the country, but most of its refining capacity is clustered around the Gulf of Mexico. Moving crude from Bakken and Marcellus to the refineries means a lot of transportation work.

Outside the States, midstream and downstream asset owners are seeing less of a transportation boom, but are certainly seeing an increase in storage needs – be it at the terminal or on a vessel. Companies in places such as Rotterdam, Heathrow and Singapore continue to deal with near-record levels of storage.

The upshot is that the companies that own the storage and transportation assets have never been busier.

The problem with increased data flow

Just as increased oil flow means engineering challenges, increased data flow means increased IT challenges – specifically around pipeline and storage management systems.

These asset owners need to constantly provide their customers with transaction reports, maintain operational efficiency and minimize settlement risks. To do so, they need to know what product they’re handling, who it belongs to and where it’s going at all times. This insight needs to be matched with real-time knowledge of how much they are charging in tariffs and fees at any given moment.

The problem, though, is that most don’t have the technology to do so in a timely manner. They rely on homegrown spreadsheets and manual client interaction to track product, use email as their primary method of distributing information to customers, and lack transparency into their manually handled invoice and nomination processes.

This opens the door for avoidable errors, processing delays and sub-optimal decision-making.

By contrast, modern asset transaction management (ATM) software can pull all the required data automatically, feeding it back to both the pipeline/storage asset owner and the client, allowing each to make decisions in real time. Automation also cuts out human error and makes reporting instantaneous. Customers can opt to increase or decrease the volumes they send through based on capacity constraints or production fluctuations as they happen.
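To make the idea concrete, the core bookkeeping such a system automates – accepting nominations only when capacity allows, and computing fees from a tariff – can be sketched in a few lines. This is a minimal illustration, not OpenLink's product; the asset name, capacity figure and tariff below are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Nomination:
    """A customer's request to move product through an asset."""
    customer: str
    product: str
    volume_bbl: float  # barrels

@dataclass
class Asset:
    """A pipeline or storage asset with a capacity limit and a per-barrel tariff."""
    name: str
    capacity_bbl: float
    tariff_per_bbl: float
    nominations: list = field(default_factory=list)

    def available_capacity(self) -> float:
        """Capacity left after all accepted nominations."""
        return self.capacity_bbl - sum(n.volume_bbl for n in self.nominations)

    def nominate(self, nomination: Nomination) -> bool:
        """Accept a nomination only if capacity allows; report success."""
        if nomination.volume_bbl <= self.available_capacity():
            self.nominations.append(nomination)
            return True
        return False

    def invoice(self, customer: str) -> float:
        """Total fees owed by one customer across their nominations."""
        return sum(n.volume_bbl * self.tariff_per_bbl
                   for n in self.nominations if n.customer == customer)

# Hypothetical asset and customers, for illustration only.
pipeline = Asset("Bakken-Gulf line", capacity_bbl=100_000, tariff_per_bbl=1.50)
pipeline.nominate(Nomination("RefinerA", "crude", 60_000))
accepted = pipeline.nominate(Nomination("RefinerB", "crude", 50_000))  # exceeds remaining capacity
```

Because the capacity check and the invoice calculation run off the same nomination records, the rejection of RefinerB's over-sized request and RefinerA's fee total are both instant and consistent – exactly the kind of error-free, real-time feedback the article describes replacing spreadsheets and email with.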

As flow and storage volumes continue to increase, these capabilities will only become more important.

The (black) golden ticket

All that explains why midstream and refining companies would want to take greater control of their asset transactions, but there’s another crucial enabling trend that allows them to do so: independence.

The lower-for-longer oil price environment has hit a lot of oil companies hard – especially on the upstream side. However, the economics of transporting and storing the stuff don’t change much depending on the price of crude. As such, these operations have stayed largely profitable compared to their upstream equivalents.

In order to protect that value and avoid profitable operations being devalued by struggling ones, in many cases larger oil companies have spun out their pipeline and storage operations into separate entities.

Not only does that protect them from the vicissitudes of the market, it frees them up to make more independent decisions. As a result, companies can look at more specialized, built-for-purpose IT solutions rather than have to muddle along with a broader software package implemented across a wider group.

For midstream and downstream players struggling with the increased dataflow that comes with greater oil flow, a strategic IT investment could be the key to staying on top.

About the Author:
James Potts is Director, Americas Energy & Commodities within OpenLink’s Houston office. As a specialist in Commodity Trade Risk Management (CTRM), James helps companies identify complete solutions for planning, producing, processing, managing, moving and trading liquid hydrocarbons. His previous experience includes time spent at SolArc as well as Allegro.
