Taking responsibility for research

Andrew McBarnet
Wednesday, February 2, 2011

Andrew McBarnet weighs up the value of research in the E&P marine geoscience sector and draws some surprising conclusions.

We hear a great deal about the hard-to-find oil that needs to be discovered if the world's supply needs are to be met in the coming decades before, in an optimistic scenario, viable alternative energy strategies, as yet undefined, can emerge. The somewhat vexing perennial question is whether the E&P business can keep producing the goods to bridge the perceived gap between now and a world weaned off oil and gas dependency.

Peak Oil doomsters say that world reserves and time to develop alternatives are running out too rapidly for comfort. Without dismissing the finite nature of hydrocarbon resources, the industry on the whole prefers to believe in its ability to keep finding enough oil to meet demand. That faith is based to quite an extent on the expectation of technological advances emerging to meet the challenges ahead.

Taking such thinking a step further would imply that research and development of new technologies should be a high priority for oil companies. In terms of investment, that has certainly not been the case for at least a decade. Many will recall that the business priority of oil companies in the 1990s was consolidation. This accelerated in 1998/99 as a result of a sudden, albeit temporary, collapse in the price of oil. Mega-mergers, downsizing, outsourcing and cost cutting were the order of the day in response to changing market conditions, notably the diminishing role of international oil companies and the rise of national oil companies (NOCs) with control over much of the world's oil and gas reserves, now and in the future. It was the expedient thing to do at the time to keep shareholders happy. Operating margins did improve, and balance sheets looked much healthier for those companies that swallowed some of their competitors, in the process acquiring proven reserves without the hassle of costly, risky investment in exploration with no certain outcome.

Arguably the biggest casualty of this period was the research department, which is now a fond memory in all but the few oil companies either enlightened enough or sufficiently well resourced to keep some R&D capability. As a generalisation you could say that, with the exception of the super majors, the listed oil company of today is more or less entirely project oriented in its approach to E&P. Strategic spending on research offers no upside for asset teams dedicated to maximising the return on specific acreage. Choice of technology to complete a job will be based upon what is cost effective, a conservative mindset which errs toward the proven solution and is wary of innovation.

The implication of this trend is clear. Just when there would seem to be a pressing need for innovative technology to meet the challenge of finding and producing more oil, the majority of oil companies, most of which are profitable, contribute close to zip in the way of research funding. For different reasons this lack of research investment applies to all but the very big NOCs such as Statoil, Saudi Aramco and Petrobras. NOCs typically work according to an agenda set by their governments, focused on meeting the needs of the country's economy. More often than not there is a dearth of indigenous technical expertise available, which means that NOCs buy in almost all their E&P services. In such circumstances, technology research doesn't figure as an option.

When you get down to weighing how this state of affairs affects the marine geophysical services business working the oil patch, of which seismic is the largest component, it is obvious that the technology is still a work in progress. Otherwise there would be a lot more confidence about possible reserves in frontier areas such as the Arctic and less discussion about potential uses of seismic monitoring and other methods to optimise reservoir production from mature hydrocarbon provinces. Geoscientists haven't figured out all the answers yet.

Pros and cons

What is clear is that the burden of research investment in the E&P geoscience community has shifted to the service sector. This has its pros and cons for oil companies. Outsourcing research has the advantage of reducing costs; but it does carry the potential loss of competitive advantage because contractors can offer their new products on a non-exclusive basis. It also means some loss of control over the direction of new research, although obviously contractors are unlikely to develop technology which does not meet a perceived demand. A risk that oil companies sometimes do not factor in is that they may inadvertently stifle research. For instance, if they squeeze contractors too much on the price of their services, there comes a point at which they have to make budget cuts to remain in business, and those may easily include research and product development.

If you were to ask whether we should be worried about possible stagnation in the future pace of technology advances in the marine geophysical E&P sector, the answer is an unequivocal yes and no. In other words, it's complicated. For a start, there is the intriguing debate in the philosophy of science about how new ideas evolve. Genuine technology breakthroughs are rare because research necessarily operates in a world of known unknowns: the Eureka moment comes from some connection that no one anticipated. This is why simply pouring money into research does not guarantee results; otherwise the cure for cancer would have been discovered ages ago. It is no criticism of research sponsors that they can only sanction science or industry research money for projects that they can already understand.

In practice this means most technological change in an industry context is incremental. We cannot logically expect marine seismic operations to be revolutionised overnight by some amazing new technology because a) we don't know what that might be, and therefore b) no one can be working towards such a breakthrough. That does not preclude the possibility that it might happen. According to legend, Archimedes was asked how to distinguish a crown made of pure gold from one of the same weight made of a gold and silver alloy. When he famously got into the bath tub, observed the water displaced by his body and hit on measuring density as the solution, his Eureka moment was the fruit of time on task by a renowned scientist of the day. So the Newtonian apple may yet drop for a geoscientist already working on some current technology issue.
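Archimedes' insight reduces to simple arithmetic: equal weights of pure gold and of a gold-silver alloy displace different volumes of water, because silver is less dense. A minimal sketch, with illustrative figures (the densities are standard values; the tolerance and crown weight are assumptions for the example):

```python
# Archimedes' density test: a crown of pure gold displaces less water
# than an equal-weight crown alloyed with (less dense) silver.
GOLD = 19.3    # g/cm^3, density of pure gold
SILVER = 10.5  # g/cm^3, density of silver (for comparison)

def density(mass_g, displaced_water_cm3):
    """Density inferred from weight and the volume of water displaced."""
    return mass_g / displaced_water_cm3

def is_pure_gold(mass_g, displaced_water_cm3, tol=0.1):
    """True if the inferred density matches pure gold within a tolerance."""
    return abs(density(mass_g, displaced_water_cm3) - GOLD) < tol

# A 1000 g crown of pure gold displaces about 51.8 cm^3 of water;
# the same weight of a gold-silver alloy displaces noticeably more.
print(is_pure_gold(1000, 1000 / GOLD))  # True
print(is_pure_gold(1000, 60.0))         # False: too much water displaced
```

The point of the anecdote stands: the measurement is trivial once the connection between weight, displacement and density has been made.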

In the last decade of E&P geoscience, the best documented flash of insight closest to a Eureka moment probably belongs to the founders of the Norwegian company Electromagnetic GeoServices (EMGS) – Terje Eidesmo, Svein Ellingsrud and Ståle Johansen. While working at Statoil they are credited with realising, in 2000-02, the commercial potential of marine controlled source electromagnetic (CSEM) surveys for the detection of hydrocarbons. The method had previously been mainly a subject of academic interest for nearly 30 years. A citation from the Norwegian Petroleum Society in 2008 stated that the EMGS founders had ‘broadened geophysics beyond seismic with the birth of electromagnetic technology, and delivered an outstanding contribution to geosciences and to the petroleum industry in Norway and globally'. It is true that after the initial hype, adoption of CSEM stalled for a time before finding its more established place in the market today. Nonetheless the Norwegians' achievement still stands.

Reviewing the marine seismic technology changes introduced in the past 20-25 years, the record is impressive by any measure. There may not have been many recognisable Eureka moments, but the industry has often been acknowledged by oil companies for the advances it has made in both the finding and production of hydrocarbons. For example, this year is the first time in a decade that an annual poll conducted by Barclays Capital found that 3D and 4D seismic were no longer regarded as the most important technologies in E&P, the title being ceded to hydraulic fracturing (largely as a result of shale gas operations).

There is no doubt that the development of 3D seismic transformed the exploration strategies of oil companies thanks to the improvements in imaging of subsurface geology. It was not an overnight revolution; rather, it is a good example of R&D targeting an industry problem and coming up with a result. It was Shell in the 1980s that really pushed for 3D seismic development, but since the 1990s its evolution has been in the hands of contractors. Early 3D involved three or four streamers. Contractor Geco-Prakla sometimes used two vessels in tandem towing a total of four streamers; today Petroleum Geo-Services (PGS), which led the race to more productive spreads, has vessels designed to tow 16-20 streamers, allowing the collection of huge volumes of seismic data. Originally, 3D seismic data were captured on tape and physically transferred to shore for processing, together with navigation and positioning data, a process which could take a year or more. Massive computing advances and satellite communications have completely changed that scenario. Nowadays clients can ask for alterations to marine seismic surveys in progress based on early processing of the data.

Time-lapse 3D seismic (4D) was really put on the map by a collaboration between BP and Schlumberger in the mid-1990s. That was when the first attempts to monitor a reservoir with repeated 3D surveys were carried out over the Foinaven field, off the west coast of Shetland, using an early form of ocean bottom seismic. Fifteen years on, contractors can offer high resolution 4D seismic deploying either towed streamers or ocean bottom recorders in the form of cables or nodes. Meanwhile BP has been pioneering life-of-field seismic based on recording cables permanently buried over a reservoir. Responsibility for developing the different recording systems based on conventional ocean bottom cable, nodes or fibre optic options has been left to service providers.
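The principle behind 4D monitoring can be stated very simply: repeat a survey over the same acreage, subtract the baseline from the monitor, and whatever does not cancel reflects changes in the reservoir (fluid movement, pressure, saturation). A minimal sketch of that differencing step, with entirely made-up amplitude values for one trace:

```python
# Illustrative sketch of the 4D (time-lapse) principle: subtract a
# baseline survey from a repeat "monitor" survey so that only
# production-related amplitude changes remain. Values are invented.
baseline = [0.12, 0.80, 0.35, 0.05]  # amplitudes along one trace, year 0
monitor  = [0.12, 0.62, 0.35, 0.05]  # same trace after years of production

# Sample-by-sample difference; rounding just keeps the output tidy.
difference = [round(m - b, 3) for m, b in zip(monitor, baseline)]
print(difference)  # non-zero only where the reservoir response has changed
```

In practice the hard part is not the subtraction but the repeatability of acquisition and processing between the two surveys, which is precisely why permanently installed (life-of-field) systems are attractive.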

Despite these achievements, the rate of progress is often said to be slow compared with some other industries, a conclusion usually formed on the basis of the oil industry's oft-cited tortoise-speed adoption of new technology. This is certainly a bone of contention among service providers. They often feel that too much onus is put on them to prove a solution before they can expect any take-up, i.e., oil companies act like your average high street shopper waiting for the latest gizmo to appear in the stores. New developments in ocean bottom seismic and CSEM are two good examples of how the ‘wait and see' fear of commitment among oil company customers has put potentially valuable innovation at risk. WesternGeco certainly had a hard time establishing the value of its Q-Marine acquisition system with oil company clients five or six years ago, although persistence eventually paid off. The steerable streamer feature of Q-Marine was a step change, so much so that other contractors now offer their own versions.

It would be hard to argue, however, that oil companies are dragging the anchor on technology change in marine seismic. The larger contractors such as Schlumberger (WesternGeco) and CGGVeritas have always tried to maintain a research budget of around 5% of turnover. That means hundreds of millions of dollars a year. This spending is driven by an unwavering belief in technology as a significant differentiator in the marketplace.

Shared memories

It is a moot point whether this belief can always be justified. Differentiation, even when successful, never has a very long shelf life. The rapid emergence of wide-azimuth surveys is a case in point. Service companies all talk to the same customers, so everyone basically agrees on the wish list for improved technology. Monitoring patent applications is de rigueur. Oil companies and individual service providers routinely agree to collaborate on specific technologies: this provides the oil company with some exclusivity, and the service company with additional resources plus a perspective on client needs. Last month, for example, CGGVeritas and BG entered into quite a wide-ranging technology R&D agreement. Not so long ago EMGS announced a collaboration with Shell on next-generation CSEM. No amount of confidentiality in such cooperative deals can prevent the glass walls being broken. It is not so much cynical as realistic to suggest that geoscientists cannot scrub from the memory bank everything experienced during one research project, and will inevitably let it inform the next.

Besides, there are numerous conferences, workshops, exhibitions and publications where advances are logged and discussed in minute detail by oil companies, service providers and academia. Public research presentations at meetings pose an exquisite dilemma for authors, who must balance the need to promote new ideas and technology (for personal or commercial gain), the opportunity to gain some feedback on a new concept, and the fear of giving so much away that competitors can easily replicate the work.

We should not overlook that the academic community contributes to the whole R&D mix and as a result provides an additional forum for discussion. Over the years some university initiatives, such as the Stanford Exploration Project (SEP), the Delphi Consortium and the Edinburgh Anisotropy Project, have impacted the whole industry. The Society of Exploration Geophysicists (SEG) website provides a page, compiled by Joe Dellinger of SEP, which lists over 100 groups worldwide doing non-proprietary exploration geophysics research: academic consortia, university research groups, laboratories, institutes, government labs and so on. Academics and their PhD and post-grad students don't normally have good access to seismic operations, so most work concentrates on computer-based study. That said, there have been some notable commercial spinouts from universities; in the UK, for example, MTEM (University of Edinburgh), Offshore Hydrocarbons Mapping (University of Southampton) and GETECH (University of Leeds) all made it into the commercial world.

Marine seismic has a long history of being technology driven, so it is unlikely that the trend will change in the future. Successes are chalked up when one company finds the answer to the next challenge first. Lately PGS, with its GeoStreamer technology, was ahead of the field in tackling the de-ghosting of marine seismic data, a problem for which WesternGeco and CGGVeritas are now presenting their own solutions. Currently the race is on to improve bandwidth, among other things. On the data processing side, it is sometimes remarked that the geoscience community is often certain of its direction but has to await the next step change in computer power to achieve its objectives. Even so, methods to extract the optimum information from available data continue unabated, reverse time migration being the latest craze. Integrating data from all sources to produce a single earth model is a more distant goal.

So, the best that a company can realistically expect from research is to gain some lead time on the opposition. This in itself is important enough to guarantee that there will be no let-up in the pace of technology change in marine seismic any time soon. OE
