Subsurface data comes out on top

Andrew McBarnet reviews the implications of a new report on the value proposition of E&P data and how it is managed.
To get the most mileage out of any personal achievement, it is always better if the recognition comes from a third party. This is why the geoscience and petroleum engineering community should get a real kick out of the new study The Business Value Case for Data Management published by Common Data Access (CDA), the not-for-profit subsidiary of Oil & Gas UK, the main representative body for the UK offshore oil and gas industry.

The study provides a glowing tribute from none other than senior managers in the industry, and indeed mildly chastises oil companies for not paying more attention to this aspect of their business. The report, carried out by Schlumberger on behalf of CDA, concludes that 70% of the value generated for oil companies relies on their understanding of the subsurface, in essence reducing geological uncertainty. Such information typically includes exploration data (eg seismic surveys), production data (eg hourly flow readings) and interpreted data, such as processed seismic data and dynamic reservoir models.

That finding alone should be worth pay rises all round for the teams of geoscientists and petroleum engineers involved. But in a sense it gets better. The report attempts to classify the crucial elements in creating subsurface understanding. The senior executives interviewed agreed that a simplified model would break down into people, tools, data, and the process they employ. It was a toss-up between data and people as to which was the most significant, with data winning out 38.5% to 32.7%, and tools and process trailing behind at 15.1% and 13.7% respectively. Although the survey was confined to managers responsible for offshore UK and Norway E&P operations, the report suggests this finding would be typical of the industry worldwide.

The survey recognises that not all oil companies have the same business goals, and that this might affect their view. Predictably, those that listed reserves replacement as their priority suggested that 90% of the value came from understanding the subsurface. The figure was 70% for asset development, and still more than half for contributing to improvements in production.

The report also points to unexpected value that can be attributed to subsurface data. For example, some now depleted southern North Sea gas fields are emerging as possible locations for sequestration of CO2 or as storage units for natural gas. The E&P data acquired over the life of a field will clearly be valuable in assessing potential candidates for these objectives.

More obviously, a number of enterprising operators in mature provinces such as the North Sea are realising that the more sophisticated technology and data analysis developed over four decades mean that many fields could be regenerated. Reviewing all the old data is one of the key steps in that process, which points to the lasting value of subsurface data.

Any serious evaluation of value needs to put some numbers on the cost of the data, and that is somewhat problematic. It is a simple exercise to quantify the cost of direct measurements such as seismic surveys, well log curves and production rates. The cost of acquiring data is straightforward to calculate in an oil company's budgeting process, but the real value of data lies in the interpreted result, and that is much more difficult, if not impossible, to figure out with any accuracy. As the report says, the data with the most direct impact on business decisions, such as static geological models and dynamic reservoir models, are the result of combining a wide range of evidence using the skill and judgement of experienced staff.

A different index of value is to estimate the longevity of the data. Seismic data was seen as being of value for anything between four and 20 years. A complicating factor here is that surveys are often re-shot to take advantage of continuing advances in seismic acquisition technology. Well measurements, by contrast, retained their value throughout the lifetime of the field. The picture is much less clear regarding the perceived lifetime of the static and dynamic field models. The report observes that some respondents were aware that interpretations influencing major decisions should be available for future scrutiny, but acknowledged this was not currently happening.

Actually, this is where the rubber hits the road, so to speak. Having established the worth of subsurface data to oil company E&P operations, the report reveals a somewhat cavalier attitude towards its management. A key passage reads: ‘When interviewing senior staff there was a surprising lack of interest in exactly who performed data management. However, based on extensive prior experience our suspicion is that the vast majority of data managers qualified in other domains and have little formal training in data management. It was noticeable that none of the senior executives had a data management background.'

Let's face it, who doesn't stifle a yawn when the subject of data management comes up? The report's authors refer to its ‘Cinderella' status in the industry; they state that CDA and others are working on a set of data management competencies which will become a framework for independent certification, career development and recruitment; one can only wish them luck. One is reminded of the look of bewilderment on the face of Dustin Hoffman in the film The Graduate when a smug businessman tells him that a good job with a future can be summarised in just one word: ‘plastics'.

In the unlikely event someone actually said ‘when I grow up I want to be a data manager', even people in the industry would be hard put to explain what this would, or, more importantly according to the report, should entail. The senior managers who responded to the survey perceived that the main goal of data management was to hold and make available the raw or unprocessed data in a form that can be used by geoscientists.

So we're talking here about seismic and well log data which has been acquired on the company's behalf. This function is in fact probably quite well catered for in most companies. For a start, there is a regulatory requirement for operators in many countries to lodge certain seismic, well log, and other data with the government in some form of national repository. The Diskos databank established by the Norwegian Petroleum Directorate (NPD) with Norway's offshore oil and gas operators was the first such repository, established in the early 1990s. The Diskos concept has been followed by a number of important hydrocarbon-producing countries, including the UK, where Common Data Access plays a pivotal role.

The Diskos data repository, currently managed by Landmark in Stavanger, Norway, is a data management system which stores corporate and national data from the Norwegian continental shelf. Access to authorised data in the database – seismic and navigational data, well data and production data – is available online (or as tape media) to members of the Diskos consortium – the 50 or so operators and licensees on the Norwegian Continental Shelf.

A major benefit of Diskos and Common Data Access is that a huge amount of unnecessary duplication in the storage of large volumes of data is avoided. For example, in the old days, operators and licensees on the same block would be filing effectively the same information to meet their regulatory obligations. Through use of systems like Diskos, data may be transferred directly to workstations at high speed and low cost, and data may be traded by simply changing owner rights to data in the data store. It is also the practical way to make data public after the required confidentiality period. Another important aspect is that all the data entering the database have to meet an agreed level of quality, delivered in agreed data formats. Diskos is also loaded with official quality approved cultural data (coastlines, licence information, installations, etc) from the NPD.
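To make the owner-rights point concrete, here is a minimal sketch of a data store in which a trade reassigns entitlements while the stored bytes never move. It is purely illustrative and assumes nothing about how Diskos is actually built; every class, method and identifier is hypothetical.

```python
# Illustrative sketch only: trading data by reassigning access rights
# instead of copying files. Not based on Diskos's actual design; all
# names here are hypothetical.

from dataclasses import dataclass, field

@dataclass
class DataItem:
    item_id: str                 # e.g. a survey or well identifier
    storage_uri: str             # where the one physical copy lives
    owners: set = field(default_factory=set)

class DataStore:
    def __init__(self):
        self._items = {}

    def register(self, item):
        self._items[item.item_id] = item

    def trade(self, item_id, seller, buyer):
        """Transfer entitlement: the bytes stay put, only rights change."""
        item = self._items[item_id]
        if seller not in item.owners:
            raise PermissionError(f"{seller} does not own {item_id}")
        item.owners.remove(seller)
        item.owners.add(buyer)

    def accessible_to(self, member):
        return [i.item_id for i in self._items.values() if member in i.owners]

# One copy of the survey is stored; a trade is a metadata update.
store = DataStore()
store.register(DataItem("NS-3D-0042", "tape://vault/NS-3D-0042", {"OperatorA"}))
store.trade("NS-3D-0042", seller="OperatorA", buyer="OperatorB")
print(store.accessible_to("OperatorB"))  # ['NS-3D-0042']
```

The design point mirrors the article's claim: because there is one authoritative copy, a trade is a change of rights rather than a shipment of data.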

From what the report calls a data governance perspective, the feeling is that companies do organise some data with a long term view. Such data may come under a heading of Master, Corporate or Approved data. Ensuring that key data categories are clearly defined, widely used and well managed is vital to data management, but the report finds that oil companies fall short in maintaining a high standard for all their data categories. Over the past few decades there has been an ongoing discussion about the challenge of ever-increasing volumes of E&P data being generated from techniques such as 3D and multi-component seismic acquisition, and how such data should be stored, managed and accessed. Even the mechanics of storage are complicated by the development of different formats, all of which have their pluses and minuses.
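As a small illustration of why the mechanics of storage are not trivial, the sketch below peeks at the binary header of a SEG-Y file, one of the industry's standard seismic formats, using only the Python standard library. The byte offsets follow the published SEG-Y rev 1 layout; a real workflow would more likely use a dedicated reader (segyio, for instance) that copes with revisions, extended headers and vendor quirks.

```python
# Minimal sketch: reading a few fields from a SEG-Y binary header with
# the standard library alone. Offsets follow the SEG-Y rev 1 standard;
# this is a toy, not a production reader.

import struct

def read_segy_binary_header(path):
    with open(path, "rb") as f:
        f.seek(3200)           # skip the 3200-byte EBCDIC textual header
        header = f.read(400)   # the 400-byte binary header follows
    # All values are big-endian ('>H' = unsigned 16-bit) per the standard.
    sample_interval_us, = struct.unpack(">H", header[16:18])
    samples_per_trace, = struct.unpack(">H", header[20:22])
    format_code, = struct.unpack(">H", header[24:26])
    formats = {1: "4-byte IBM float", 5: "4-byte IEEE float"}
    return {
        "sample_interval_us": sample_interval_us,
        "samples_per_trace": samples_per_trace,
        "sample_format": formats.get(format_code, f"code {format_code}"),
    }

# Usage (the file path is hypothetical):
# print(read_segy_binary_header("survey.sgy"))
```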

In a perfect world, geoscientists at their workstations would be able to call up any data they are entitled to access at a stroke of the keyboard. Over several decades, companies from Schlumberger and Halliburton to smaller specialist outfits have grappled with software solutions to ensure companies can keep a handle on their data.

Where the system really breaks down is when it comes to keeping track of interpreted information. This is not a commodity; it is the data uniquely refined by a company's professional geoscience and engineering staff, and it constitutes the basis of business decisions. As already implied, it is extremely difficult to put a monetary value on this source of information except to acknowledge its crucial function. Logically, therefore, it would seem especially important to keep a record. Yet within the companies surveyed, which are pretty representative of the industry, the report found that: ‘It was uncommon to find any evidence of the systematic definition of data architecture, ownership roles, data strategy discussion or coordination of investments in improving data handling. There was not even awareness of formal groups tasked with agreeing on these topics.'

Now we should not be shocked by this observation. Most data is acquired with a very specific purpose, most probably commissioned by an asset team tasked to explore and potentially develop a prospect. The team's interest is completely target-oriented and short-term; once the project is up and running, its members move on to the next assignment, so no one really takes care of the history for future reference. Any residual value the data might have tends to get lost in the system.

It is actually even more complicated, because interpreted data is to some degree only as good as the interpreter. Geoscience is subjective and in many respects no different to historical writing. Serious historians cannot take for granted what previous researchers have concluded: they have to go back to the original sources to draw their own inferences. They may even have doubts about the original sources! So someone coming upon previously interpreted data wants to know the processes involved in reaching a particular conclusion. In some if not many circumstances, something that has been done before gets done again, which is clearly a waste of resources. This is one of many points touched upon in a fascinating and admirably frank discussion about data management between six oil company executives which is included as a document in the report. It should be required reading for anyone with the slightest interest in this topic.

In the conversation there is early general agreement that ‘there is no consideration to the cost of managing data or to the value or benefit of managing it beyond the point where you have returned your investment in that data'. Even if there were agreement on keeping data visible and accessible after the end of a project, there is the issue of who would look after the data and how. This is where we come to what seems to be a serious disconnect between geoscientist interpreter types and the data management team. In effect they inhabit different universes, well expressed in this tidbit from the discussion: ‘One thing we are still struggling with is capturing the "added value" created by the interpreters: horizons, markers, faults, geological models, reservoir models, analysis, reports and so on. A data management team cannot decide which model, horizon or report is the correct one to store. A problem could arise if business teams expect this from data management.' In a small company possibly focused on one or two assets the issue may not be so acute, because the same person may be doing the interpreting and the archiving.

The bigger the organisation, the more difficult it becomes.
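One modest way to picture what keeping the history might involve is a provenance record attached to each interpreted product, so that a future user can see what went into it and how it was arrived at. The sketch below is purely hypothetical; neither the report nor any named system prescribes these fields.

```python
# Hypothetical provenance record for an interpreted product. All field
# names are illustrative assumptions, not taken from the report or any
# real data management system.

from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ProvenanceRecord:
    product: str                 # e.g. "Top reservoir horizon, v3"
    interpreter: str             # who made the call
    created: date
    source_data: list            # raw inputs: survey and well identifiers
    processing_steps: list       # how the inputs were conditioned
    assumptions: list            # judgement calls a future reviewer should know
    superseded_by: Optional[str] = None  # forward link if redone later

# A record like this travels with the horizon or model it describes,
# so the reasoning behind it is not lost when the asset team moves on.
record = ProvenanceRecord(
    product="Top reservoir horizon, v3",
    interpreter="J. Smith",
    created=date(2011, 6, 1),
    source_data=["3D survey NS-1992 (reprocessed 2009)", "well 21/4-7 sonic log"],
    processing_steps=["pre-stack depth migration", "well tie at 21/4-7"],
    assumptions=["constant overburden velocity north of fault F2"],
)
```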

There does not appear to be an easy solution to this conundrum. The closing remarks of one participant suggest that data management happens well when it is HSE-critical or where regulatory compliance of some kind is involved. The debacle over Shell's reserves estimates a few years ago probably prompted all the oil majors to review data quality control for that particular part of their business. After that, as someone says, it's going to sway with the wind and the whims of the budget cycle.

That said, we need to return to the starting point, which is that E&P geoscientists have just earned some significant recognition for the value they create for oil companies through their endeavours to understand the subsurface. OE

 
