In oil and gas, ‘technology is the key’

Special to the Rice News

The future of the oil and gas industry, and of its continued prosperity, will increasingly rely on high-performance computing and improved visualization, said Dirk Smit, vice president of exploration technology and chief scientist for geophysics at Shell.

Smit was a keynote speaker at the 2013 Oil and Gas High-Performance Computing Workshop Feb. 28 at Rice University. Hosted by Rice’s Ken Kennedy Institute for Information Technology (K2I), the workshop drew more than 350 leaders from the oil and gas industry, the high-performance computing and information technology industries, and academia.


Rice's Ken Kennedy Institute for Information Technology hosted the 2013 Oil and Gas High-Performance Computing Workshop Feb. 28.

In his talk, “Compute Challenges to Meet Future Energy Demand,” Smit noted that by 2050, the human population will have grown to 9 billion and the number of vehicles in the world will have doubled to 2 billion.

“At the same time,” he said, “millions of people will rise out of energy poverty and demand will double, but CO2 emissions must be half of what they are today to avoid serious climate change.”

Smit described what he called the water-energy-food nexus, with demand for each by 2030 expected to increase by 30, 40 and 50 percent, respectively.

“A lot more hydrocarbons will be needed,” he said. “Many of them are buried several kilometers in the ground under complex geological formations. We have to be able to recognize subtle geological conditions. Technology here is the key enabler.”

Smit outlined three technology challenges for his industry: making reliable models and accurate predictions, computing with huge data sets and increasing the efficiency of data interpretation. He said “interconnectivity” will characterize the 21st century, and people in his industry will devote more time to solving large-scale optimization problems.

“We need to turn data into information, and we need more information faster and with fewer people,” he said. “We need fast, interactive imaging tools. When it comes to ‘big data,’ we need scalable seismic imaging and fiber-optic data transport speeds.”

Smit cited his industry’s increasing reliance on wireless seismic acquisition systems, physics-based mass flow modeling and physics-based stratigraphy forward modeling. He envisioned “something like an Internet for each reservoir.”

“Technology is the key,” he said. “We won’t be able to solve the problems we face with today’s technology.” Depth imaging and visualization will increasingly become a “cluster activity,” he said. Depth imaging, visualization and interpretation of 10 petabyte data sets are necessary but not yet possible. “This can’t be done on a laptop computer.”

Shell and Hewlett-Packard are already collaborating on development of a wireless sensing network for oil and gas exploration that uses HP’s ultra-high-sensitivity microelectromechanical systems (MEMS) accelerometer. The sensor provides low noise at frequencies below the bandwidth of existing MEMS devices. Its small size and low energy consumption reduce the cost of large-scale deployments in a wireless sensor network and enable data from more channels to be collected.

“Sensors are the bottom of the food chain in oil and gas,” Smit said. “Everything else depends on them. In an area of so much uncertainty, we want to achieve robust predictability.”

When an audience member asked Smit what sort of students his industry needs, he replied, “It doesn’t really matter as long as they’re the best,” adding that desirable qualities are flexibility — “knowing when to let go of an idea that isn’t working” — and the ability to “look at the wider context.”

The afternoon keynote speaker was John Kuzan, manager of computational sciences for ExxonMobil Upstream Research, whose talk was titled “High-Performance Computing: Solving Oil and Gas Problems Today – Poised for the Future.” He began by repeating an old saying in his industry: “Every geologic model is wrong, and a couple of them are useful.”

Kuzan reaffirmed his industry’s reliance on high-performance computing, particularly at the upstream end of production. “We need to find solutions for finding the harder oil, where seismic data is more critical than ever. Each reservoir requires hundreds of reservoir simulations,” he said.

Asked about students seeking employment in the oil and gas industry, Kuzan replied, “The world is your oyster in the computational sciences.”

The workshop, which included dozens of panels and plenary sessions, concluded with poster sessions in which students discussed their research with industry leaders and academics and explored how their computational science and engineering skills could lead to careers in the oil and gas industry.

“Our workshop is unique in the way it balances three communities — oil and gas, IT and academia,” said Jan Odegard, executive director of K2I. “The program is not too theoretical or flooded with IT vendors, but rather it’s focused on what happens at the intersection of the three groups. It’s focused on sharing best practices, educating each other about challenges and opportunities, and building a community of high-performance computing practitioners.”

In addition to Odegard, the workshop organizers included Henri Calandra, Total; Simanti Das, ExxonMobil; Keith Gray, BP; Olav Lindtjorn, Schlumberger; Bill Menger, Global Geophysical; Scott Morton, Hess Corporation; Amik St-Cyr, Shell; and Chap Wong, Chevron.

– Patrick Kurp is a science writer in the George R. Brown School of Engineering.
