
Trending upward

Written by Audrey Leon, Friday, 25 July 2014 09:18
Launched in 2010, CGG’s Oceanic Vega is a purpose-built high-capacity seismic vessel optimized for reduced noise and fuel consumption. It carries DNV-CLEAN notation for the lowest environmental impact. Photo from CGG.

Innovation is a necessity when it comes to exploring and producing oil and gas. Audrey Leon discussed recent technological trends occurring in the geophysical sector with several leading companies.

Improvements in technology, specifically within the seismic industry, are leading oil and gas explorers to obtain better information—and not only about possible finds.

Several trends have emerged throughout the last couple of years, including improvements to marine-towed streamers and ocean bottom seismic, as well as moves toward acquiring broadband, and wide and full azimuth surveys.

However, improved acquisition methods aren’t the only way innovation is occurring. On the software side, compression technology, as well as cloud computing, could help revolutionize the industry. All of these improvements are helping to make the big picture a lot clearer for the industry as a whole.

“Over the past several years, we’ve seen the need to increase azimuth on geophysical surveys, particularly in areas where you have complex geology—subsalt in the Gulf of Mexico, for example,” says TGS CEO Robert Hobbs. “The big push since 2004 is to try and figure out ways to most efficiently increase the azimuthal content of data. You’re trying to get more directionality in the subsurface with your geophysical signal.”

Hobbs continues, “What you’ve seen in the deepwater Gulf of Mexico is that the subsalt has been covered by wide azimuth. Lately, geophysical companies are returning to these areas and acquiring the next generation of data, which in the subsalt area is full azimuth.”

The geophysical industry, Hobbs says, is focusing on acquiring wider bandwidths in order to gain more frequencies in the subsurface. “In the Gulf of Mexico, what you’re seeing is a combination of those two trends. You’re starting to see companies offer full azimuth technologies, but also broadband,” he says.

Hobbs notes that there are different ways to extract more bandwidth out of data and that several companies offer their own technology to do it.

“You can either process more bandwidths out of conventionally acquired seismic streamer data, and that’s what we do at TGS; we have our own technology to take conventionally acquired seismic data and extract broader bandwidth out of that data.

“You have a number of companies that are offering technologies that are acquisition-based. The first one to come out was Petroleum Geo-Services’ (PGS) Geostreamer. Next, CGG came out with BroadSeis. Then WesternGeco came out with ObliQ and IsoMetrix. Each one of those technologies is an acquisition technique to acquire more bandwidth in the subsurface. And TGS has used many of those technologies as well.”

Craig Beasley, Chief Geophysicist and Schlumberger Fellow at WesternGeco comments on the difference between IsoMetrix technology and other broadband techniques saying, “IsoMetrix offers broadband data finely sampled in all directions–vertically, inline to shooting, and crossline between the towed streamers. This is designed to improve the resolution of complex geological details as well as offering flexibility for acquisition efficiency.”

The Gulf of Mexico, Hobbs says, is the best laboratory for testing and deploying geophysical technology, for a number of important reasons.

“It’s a very prolific basin. A lot of companies are exploring there, and therefore, they are willing to pay for the data,” he says. “You also have a very good multi-client environment, which encourages the geophysical companies to invest in new technologies and new data, because they can earn a good rate of return on that investment.

“You also have geologic complexity; you have a broad range—from simple seismic amplitude plays, to deep subsalt plays. That has enabled the geophysical industry to test a lot of technologies in the Gulf of Mexico that they are not implementing elsewhere.

Modern 3D seismic vessels typically tow spreads of 10 to 16 streamers. Photo from CGG 

“We’ve seen those technologies maturing in the Gulf, and I suspect over the next several years you will see these utilized outside of the northern Gulf of Mexico,” Hobbs says.

Marine broadband

For CGG, the big trend of the last few years is marine broadband seismic. The company’s Technical Marketing Manager Roger Taylor says CGG has seen the market for broadband seismic take off over the last several years.

“Last year, 50% of the work awarded was for marine broadband seismic in the tenders that we saw,” he says. “Broadband is very much what is being asked for at the moment.”

Taylor says all the major marine seismic companies now offer some form of broadband solution, including CGG, which offers BroadSeis.

“We’ve acquired over 200,000sq km of BroadSeis data since we introduced it,” he says.

“We have a huge amount of ongoing work, for example offshore Brazil where there is 40,000sq km of acquisition in progress with BroadSeis.”

Ocean bottom seismic

Many seismic companies agree that there are advantages to deploying ocean bottom seismic. Hobbs believes that it is likely to usher in a renaissance in mature basins such as the Gulf of Mexico and North Sea. “Unlike streamer-acquired data, you have your sensor on the seabed. There are a number of ways to do that. To date, all of those ways have been more expensive than streamer,” he says.

“The industry is working on ways to get a lot more efficient in deployment and retrieval to get the cost of ocean bottom seismic down where it approaches the cost of streamer seismic.”

Hobbs notes that ocean bottom seismic yields better quality data and enables full azimuth, since the sensors are on the seabed and the source is on a boat above that network of sensors.

“That boat can go anywhere you want that boat to go,” he says. “You’re able to record multiple azimuths.”

Colin Hulme, CEO for OceanGeo, an ION Geophysical company, agrees, saying that ocean bottom technology has improved greatly over the years.

“The current technologies for ocean bottom acquisition systems are much more reliable, and are allowing companies to deploy much larger spreads, than previous generations of equipment,” Hulme says.

“This is driving a sub-trend that is making ocean bottom seismic more and more economic. At the same time, ocean bottom seismic provides a naturally full azimuth capability, quieter data and the promise of multi-component data to unlock more information about the reservoir geology.”

Hulme says in terms of E&P companies’ marine seismic budgets, he is seeing an increase in the percentage allocated for ocean bottom seismic projects.

“It’s been growing consistently since 2006, from 6% of all marine seismic dollars spent to 13% last year,” he notes.

“This doubling of ocean bottom seismic market share is driven by the need for high quality seismic for managing production and development projects, improved economics of ocean bottom seismic acquisition systems, and the need to acquire high quality seismic in highly congested fields.”

While marine towed seismic continues to dominate the exploration and multi-client segment, Hulme believes, in time, the technology trends and associated economics will see ocean bottom seismic take a slice of these segments away from towed streamers.

“We will have to see how big that slice becomes,” Hulme says.

 

CGG’s research and survey vessel Oceanic Vega is an SX120 hull form designed by Ulstein. Photo from CGG. 

Watch this space

CGG’s Taylor says there are two areas still to watch in marine seismic: streamer and source technology.

“On the streamer technology side, we saw the introduction of dual-component streamers containing hydrophones and vertical geophones by PGS,” Taylor says, noting that WesternGeco and CGG’s Sercel have now introduced multi-component streamer technologies containing hydrophone, vertical and crossline geophone components.

“The streamers are recording the same kind of wave field you would record with a regular streamer, but you’re gaining directional information as well,” he says.

“It’s early days for these techniques; they’re not necessarily going to do things like improve bandwidth or improve low-frequency recording. But this extra crossline information could have some interesting applications for data interpolation: in principle you can tow streamers further apart and complete surveys more efficiently. It could also enable some new processing techniques which could improve subsurface imaging.”

Of course, the technology could be limited, he says; he doesn’t believe these types of multi-component streamers will enhance subsalt imaging, for example, on their own.

Beasley offered more detail on how WesternGeco’s IsoMetrix technology is used in the field. “Not only does the IsoMetrix system employ traditional hydrophone and vertical pressure measurements, it incorporates a densely sampled crossline gradient measurement – a totally new measurement for marine acquisition.

“The crossline gradient allows the streamers to be placed farther apart than conventional systems and, at the same time, allows a 3D reconstruction of the seismic wavefield,” Beasley says. “As a result, we get the best of both worlds—efficiency due to wider streamer spacing and a fully 3D broadband solution that goes beyond the 2.5D measurements achieved by other systems.

“The broadband problem is a 3D problem, an issue that has been ignored in the past. With the new measurements now available, a 3D solution is now available commercially.”
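Beasley’s point about reconstructing the wavefield between widely spaced streamers can be sketched in miniature. The code below is not WesternGeco’s method; it is a generic 1D cubic-Hermite scheme, with an invented sine “wavefield” and 100m sensor spacing, showing how a gradient measured at each sensor sharpens reconstruction between them:

```python
import numpy as np

def hermite_reconstruct(x_coarse, f, df, x_fine):
    """Piecewise-cubic Hermite interpolation from samples plus gradients."""
    out = np.empty_like(x_fine)
    for j, x in enumerate(x_fine):
        # Locate the interval [x_i, x_{i+1}] containing x.
        i = min(np.searchsorted(x_coarse, x, side="right") - 1, len(x_coarse) - 2)
        h = x_coarse[i + 1] - x_coarse[i]
        t = (x - x_coarse[i]) / h
        # Standard cubic Hermite basis functions.
        h00 = 2 * t**3 - 3 * t**2 + 1
        h10 = t**3 - 2 * t**2 + t
        h01 = -2 * t**3 + 3 * t**2
        h11 = t**3 - t**2
        out[j] = (h00 * f[i] + h10 * h * df[i]
                  + h01 * f[i + 1] + h11 * h * df[i + 1])
    return out

def wave(x):   # illustrative "wavefield" across the streamer spread
    return np.sin(2 * np.pi * x / 300.0)

def grad(x):   # the extra crossline-gradient measurement
    return (2 * np.pi / 300.0) * np.cos(2 * np.pi * x / 300.0)

# Coarse "streamer" positions 100m apart; reconstruct on a fine grid.
x_coarse = np.linspace(0.0, 400.0, 5)
x_fine = np.linspace(0.0, 400.0, 81)
rec = hermite_reconstruct(x_coarse, wave(x_coarse), grad(x_coarse), x_fine)

err_hermite = np.max(np.abs(rec - wave(x_fine)))
err_linear = np.max(np.abs(np.interp(x_fine, x_coarse, wave(x_coarse)) - wave(x_fine)))
print(err_hermite, err_linear)   # gradient-aided reconstruction is far more accurate
```

With value-plus-gradient samples the reconstruction error between sensors drops by roughly an order of magnitude compared with values alone, which is the efficiency argument for wider streamer spacing.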

On the source side, Taylor says marine vibrators have been widely discussed. “This is getting away from our traditional air gun sources, which can be contentious because of the noise they create and the potential impact on marine mammals,” he says. “Marine vibrators have potential practical and geophysical benefits, including access to environmentally sensitive areas where air guns may not be appropriate, their suitability for all water depths, improved bandwidth and signal encoding.”

Currently, Shell, ExxonMobil and Total are participating in a joint industry research group to develop marine vibrators.

“It’s something we are interested in as well,” Taylor says. “They could provide access to areas that are restricted by marine environmental regulations.” Marine vibrators are not as “loud” as air guns; they generate a continuous signal that has a lower peak intensity. “According to a recent study made for the joint industry research group, this would allow a smaller marine mammal exclusion zone,” Taylor says.

Non-seismic technologies

Hobbs says there are a number of non-seismic technologies that TGS is monitoring, and one of those is electromagnetic (EM) technology.

“We’ve been active investors in the Barents Sea off Norway, along with EMGS—the EM specialist in the industry,” Hobbs says. “I think it is a valid tool, in certain types of geology. We’re focused on identifying the areas where it works and where it is synergistic with our seismic activities.

“You can interpret seismic without EM, but it is almost impossible to interpret EM without seismic.”

Hobbs says TGS will continue to invest in EM where the company has a very strong seismic database that can be used in conjunction with interpreting and processing EM data.

A compressed future

Beasley is excited about what the future holds for the geophysical industry. He says compressive sensing, which caused a big stir in the medical imaging field, could be revolutionary for seismic acquisition.

He says that even where it might seem advantageous to put out more sensors for an acquisition, more may not be required; with compressive technology, you have to be smarter about the complexity of the thing you are trying to measure. He relates the technology to how compression revolutionized the music industry: music already compressed onto compact discs (CDs) was compressed further into MP3 files, which allow for more storage without noticeable loss in quality.

“A more fundamental question would be, did I need to make all those elaborate measurements to start with, if I could simply compress by a factor of ten?” Beasley asks. “Couldn’t I have sampled only 1/10 of the data to start with? That way you would save a lot in the recording phase. It may not make that much sense for digital media, but for the seismic industry, this could be groundbreaking.”
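Beasley’s “sample 1/10 to start with” idea is the core of compressed sensing: if the signal is sparse in some basis, far fewer measurements than conventional sampling requires can suffice. A minimal numpy-only sketch, using an invented 5-spike signal, a random measurement matrix, and textbook orthogonal matching pursuit (none of this is WesternGeco’s implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

n, k, m = 256, 5, 100           # signal length, sparsity, number of measurements
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.uniform(1.0, 2.0, size=k) * rng.choice([-1.0, 1.0], size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)   # random measurement matrix
y = A @ x                                   # only m << n "samples" are recorded

def omp(A, y, k):
    """Orthogonal matching pursuit: greedy recovery of a k-sparse signal."""
    residual, picked = y.copy(), []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        picked.append(int(np.argmax(np.abs(A.T @ residual))))
        # Re-fit the signal on the columns picked so far.
        coef, *_ = np.linalg.lstsq(A[:, picked], y, rcond=None)
        residual = y - A[:, picked] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[picked] = coef
    return x_hat

x_hat = omp(A, y, k)
print(np.max(np.abs(x_hat - x)))   # near zero: the sparse signal is recovered
```

Here 100 random measurements recover all 256 samples exactly because only five of them are nonzero, which is the sense in which “sampling 1/10 of the data” can be enough.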

Beasley says WesternGeco has implemented a type of compressive sensing on the source side, called SimSource, a simultaneous source acquisition and processing technique. This technology, Beasley says, allows two or more seismic sources to be active in the field at one time, with the data from each still recoverable.

“This breaks all the rules for geophysicists,” he says. “The commercial introduction has been some time coming, because it is totally contrary to all the principles of geophysical data acquisition. We don’t like to have more than one source active because one air gun source looks like another, so when you record two that are active at the same time on the same record, how can you tell which information came from which source? Different techniques allow us to do this separation.”

Beasley says having two sources active allows for recording more data in the same time frame. “In the geophysical industry, time is money,” he says. “By compressing the data at the acquisition phase, this allows us to gain factors of efficiency that you wouldn’t with just a single source active.
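The separation Beasley describes can be illustrated with a toy model of blended acquisition. This is not SimSource; the pulse shape, random firing dithers, and median-stack separation below are illustrative assumptions. The principle: aligned to source A’s firing time, A’s energy is coherent from shot to shot while B’s, randomly dithered, is not, so a median across shots keeps A and rejects B:

```python
import numpy as np

rng = np.random.default_rng(1)

nt, nshots = 400, 31
t = np.arange(nt)

def wavelet(t0):
    """A Ricker-like pulse centred at sample t0 (illustrative source signature)."""
    s = (t - t0) / 6.0
    return (1 - 2 * s**2) * np.exp(-s**2)

# Source A fires at a fixed time in every shot record; source B fires with a
# random dither, so both arrivals are blended into each record.
records = np.empty((nshots, nt))
dithers = rng.integers(40, 300, size=nshots)
for i in range(nshots):
    records[i] = wavelet(100) + wavelet(dithers[i])

# Deblending by coherence: in A's time frame a median across shots keeps A's
# coherent arrival and rejects B's randomly shifted interference.
deblended_A = np.median(records, axis=0)

err = np.max(np.abs(deblended_A - wavelet(100)))
print(err)   # small residual: source A recovered from the blended records
```

Each blended record contains both pulses, yet the median stack returns source A’s record almost untouched, which is why random dithering of simultaneous sources makes them separable.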

“We believe that compressive sensing will revolutionize the data acquisition world in general, not just seismic,” Beasley says. “The need for compressive acquisition will be extremely important.”

 

Deploying an air gun umbilical. Photo from OceanGeo. 

Handling big data

Big data affects all aspects of the oil and gas industry, and the software development business is no exception.

“Data size and data diversity is a huge challenge in our industry,” says Duane Dopkin, executive vice president of geoscience at subsurface software developer Paradigm. “The volume of metadata that has to be managed and referenced, the diversity of data that has to be consumed in our software platforms, and the volumes of data that need to be visualized, interpreted, and modeled, put a lot of pressure on software developers to accommodate that.”

Dopkin notes the recurring need for compression, and the need for software to keep pace with it.

“In the 1990s, data compression became very topical and popular because we were trying to do fast-track seismic processing and send that information via satellite back to the office, so it could be worked on in more detail,” Dopkin notes. “The disc costs and technology were not able to keep up.

“That changed in the 2000s with affordable storage and improved technology, but now the industry has a full appreciation for high density and high resolution seismic data,” Dopkin says. “Improved computer assisted technologies are needed to interpret, model, and validate these vast volumes of seismic data. New ways to manage, reference, and query all of the petrotechnical metadata associated with active conventional and unconventional fields have to be addressed by our software. That’s just one general topic, but it has a huge impact.”

Halliburton’s Landmark is dealing with the same big data issues, and is attempting to tackle the need for a common platform.

“The oil and gas industry is a huge user of data,” says Michael Dunn, senior director of geophysics, geology and reservoir engineering for Landmark. “How well you move data across applications is important. You want to have a common platform for E&P data, an enterprise platform, and we’re working very hard on it.”

Dunn says cloud computing, which offers essentially infinite compute and data storage, could provide a solution.

“The benefits are pretty obvious,” he says. “For oil and gas companies, in addition to increased capacity, the Cloud also provides much better software version control than these companies can manage today.

“If you go to any large company and look at what they’re running, they are usually behind with regards to the latest release.

“For example, it is not unusual for a large company to be two versions behind on Microsoft Office. The reason is often that they have a lot of programs that are dependent on a particular version and they can’t bring in the new version until all of those dependencies are tested.”

Dunn says version control is easier to manage on the Cloud because the local computing environment dependencies are removed and the software is simply accessed through a browser.

With essentially infinite compute and data storage, he says, if there is a large project that a user wants to run, and it requires a large computer infrastructure, the Cloud is available 24/7. Today the cost of on-demand compute is high, but it is rapidly decreasing as capacity continues to grow, Dunn says.

“If I had a steady state usage, and I need peak demand, the Cloud could give me that as long as I have the dollars to pay for it,” Dunn says. “Rather than build my own infrastructures to handle those peaks, I could do that through the Cloud.”

Incorporating automation

Automation is a big topic, especially as companies seek to bring more workers onshore, and reduce human error.

Landmark’s Dunn takes a more conservative stance on automating the geophysical industry, especially where interpretation is concerned.

“While automation will play a role for individual tasks, a lot of interpretation steps aren’t easily automated,” he says. “For that reason and the significant variation in seismic data quality, a completely automated interpretation system will most likely not happen in the next 10 years.”

“I don’t see a fully automated interpretation system,” he continues. “Other areas can be automated: deep sea systems, robots that go out and place sensors on the seafloor in deepwater. However, it’s hard to automate those areas where human judgment is required.”

The bright side

Hobbs says the exciting aspect of the geophysical industry is being able to watch technology change and adapt.

“The geophysical industry is the one industry in the oil and gas field that has advanced the fastest,” Hobbs says. “I’m biased because I’m a part of it, but it delivers more value than any other technology in the exploration and production field, in terms of being able to predict what your drill bit will encounter.”

 