In HPC for oil and gas, ‘this is the decade of sensing’

By Patrick Kurp
Special to Rice News

“In our field, the last decade was devoted to Big Data. This is the decade of sensing.”

So predicted Peter Breunig, general manager of technology management and architecture at Chevron IT, who delivered the opening keynote address at the seventh annual Rice Oil and Gas High-Performance Computing (HPC) Workshop March 6 at Rice.

Hosted by the Ken Kennedy Institute for Information Technology (K2I), the event drew more than 500 leaders from the oil and gas industry, the high-performance computing and information technology industry, national laboratories and academia.

“Our goal has always been to bring together three communities: oil and gas, IT and academia,” said Jan Odegard, executive director of K2I. “Obviously, this formula is working. This year we see a record number of registrations and record numbers of abstracts, talks and posters. The HPC Workshop has become a major annual event for many people.”

In his talk, “The Never-Ending Story of IT/HPC Evolution and Its Effect on Our Business,” Breunig said, “Finding and extracting oil and gas from the subsurface has always been a data-driven exercise. With advances in hardware and software technologies and new sensing technologies, improving resolution within the reservoir is critical. Deep-water wells cost a lot of money. We have to exploit all of our existing assets.”

Breunig, who was trained as a seismic processing geophysicist, said the goal in seismic sensing and modeling is to reduce approximations and refine measurements. “That’s the reality. You’re going to be processing more and more data. Sensor data is increasing exponentially. We need sensors capable of producing terabytes of information and the appropriate ways to process it.”

The industry faces a “people bottleneck” as demand grows for competent IT professionals, physicists and geophysicists, Breunig said.

The afternoon keynote speaker was Bill Dally, chief scientist and senior vice president of research at Nvidia Corp., who spoke on “Efficiency and Parallelism: The Challenges of Future Computing.” He noted three trends in the field: the end of Dennard scaling, pervasive parallelism and growing cloud and mobile dependence.

“High-performance computing is our principal seismic instrument, akin to the position held by the particle accelerator in physics. Using this instrument now is like using a microscope in the 17th century,” he said.

Dally returned to the end of Dennard scaling, the law in electronics positing that as transistors become smaller, their power density remains constant, so a chip’s power use stays proportional to its area.
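A back-of-envelope sketch (added here for illustration; it is not from the talk) shows why the law held and why its end matters. The dynamic power of CMOS logic, and its behavior when linear dimensions are scaled down by a factor $1/k$, is

\[
P = C V^{2} f, \qquad
C \to \frac{C}{k}, \quad V \to \frac{V}{k}, \quad f \to k f
\quad\Longrightarrow\quad
P' = \frac{C}{k} \cdot \frac{V^{2}}{k^{2}} \cdot k f = \frac{P}{k^{2}}.
\]

Because transistor area also shrinks by $1/k^{2}$, power density $P/A$ stays constant. Once the supply voltage $V$ can no longer be reduced, per-transistor power stops falling ($P' = \frac{C}{k} V^{2} \, k f = P$) while area keeps shrinking, so power density grows as $k^{2}$.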

“The challenges we face in high-performance computing are power, programmability and scalability,” he said. “The end of Dennard scaling means we can no longer gain computing power by increasing processor clock frequency, so now performance is determined by software and algorithms. With improvements in process technology yielding only small increases in efficiency, we need more innovations in architecture and circuits if we want to maintain the expected performance scaling.

“Our future systems will be energy-limited. The process will matter less and the architecture will matter more. We’ll have to develop energy efficiency through architecture and clusters,” he said.

Among the plenary speakers was David Bernholdt, leader of the Computer Science Research Group at Oak Ridge National Laboratory, home of Titan, a Cray XK7 that is the second-most-powerful computer in the world.

“Our biggest bottleneck is data movement,” Bernholdt said. “We’re dealing with programming at an extreme scale, using massive parallelism. The facts of life are that faults will no longer be rare events. We have to be willing to pay when the faults occur.”

Randal Rheinheimer, deputy division leader for the HPC division at Los Alamos National Laboratory, spoke on “Letting the Data and Water Flow: National and Local DOE Preparations for Exascale.”

“Moving to exascale is not just about computers but is now increasingly about building and operating very large and complex infrastructures,” he said. “Not only do we have to build the next computer that can operate within a highly constrained power budget, but we also have to build the facilities that can support these future systems. There is a lot of learning that needs to happen, and much of it has nothing to do with the actual computer.”

Earl Joseph, research vice president of the high-performance computing group at International Data Corp. (IDC), also spoke. IDC provides market intelligence and consulting services for the information technology, telecommunications and consumer technology markets.

“While we see strong growth in many segments, it is clear that power, cooling, real estate and system management are critical challenges moving forward,” Joseph said. “Programming for scale, heterogeneity and fault tolerance while delivering application performance will be a challenge moving forward.”

The workshop also included a session of rapid-fire lightning talks and four parallel sessions of technical talks. It concluded with a poster session in which students met with industry leaders and academics to discuss their research and explore how their computational science and engineering skills could translate into careers in the oil and gas industry.

This year, for the first time, the HPC Workshop offered two preworkshop mini-tutorials, “OCCA: Portability Layer for Many-core Thread Programming” and “Hybrid Programming Challenges for CPU+GPU+Cluster Parallelism,” as well as an opening networking reception the day before the formal program.

In addition to Odegard, the workshop organizers include Henri Calandra, Total; Simanti Das, ExxonMobil; Keith Gray, BP; Scott Morton, Hess Corp.; Amik St-Cyr, Shell; and Chap Wong, Chevron.

—Patrick Kurp is a science writer in the George R. Brown School of Engineering.
 
