We are living in the most data-rich period of human existence. Data is everywhere in our daily lives, both at work and at home. This month I discuss the use of subsurface data in geoscience, and why now is the time to ensure that all historic seismic data is archived correctly and made available for future generations.
The quality and volume of subsurface data to which we have access has increased exponentially over the last five decades with the development of the petroleum industry. Now, with the combined effects of maturing hydrocarbon basins and the continuing move to greener energies, are we going to see a reduction in seismic and drilling activity? In other words, are we approaching 'peak data' for the subsurface sciences?
Subsurface Data
Geoscientists use a range of methods to image and understand the ground beneath our feet. Some are relatively cheap and accessible: field mapping, or geophysical surveys such as gravity, magnetic or electrical resistivity methods. However, the best way to map the subsurface is reflection seismic data, ground-truthed to well data.
Reflection seismology uses a man-made energy source (dynamite, an airgun or a seismic vibrator) to send sound waves down into the subsurface. At boundaries between rocks of differing properties, a portion of the sound wave is reflected back to receivers at the surface. By understanding these changes in rock properties, a geophysicist can build an image of the rock layers in the subsurface.
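The physics behind this can be illustrated with the normal-incidence reflection coefficient: the fraction of wave amplitude reflected at a boundary depends on the contrast in acoustic impedance (density times seismic velocity) between the two layers. A minimal sketch, using illustrative (not measured) rock properties:

```python
# Normal-incidence reflection coefficient at a boundary between two layers.
# Acoustic impedance Z = density * P-wave velocity, and the reflected
# amplitude fraction is R = (Z2 - Z1) / (Z2 + Z1).
# The density/velocity values below are illustrative only.

def acoustic_impedance(density_kg_m3, velocity_m_s):
    """Acoustic impedance Z of a rock layer (kg m^-2 s^-1)."""
    return density_kg_m3 * velocity_m_s

def reflection_coefficient(z_upper, z_lower):
    """Fraction of incident amplitude reflected at the interface."""
    return (z_lower - z_upper) / (z_lower + z_upper)

# Example: a shale layer overlying a softer sand
shale = acoustic_impedance(2400, 2700)  # Z = 6.48e6
sand = acoustic_impedance(2100, 2500)   # Z = 5.25e6

r = reflection_coefficient(shale, sand)
print(f"Reflection coefficient: {r:.3f}")  # -0.105
```

A negative coefficient simply means the reflected wave is polarity-reversed; it is these small amplitude contrasts, layer after layer, that a seismic image is built from.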
The first seismic surveys were completed in the early 1900s, when onshore oil exploration in Texas and Mexico was booming. It took some years for the technique to move offshore, but by the 1950s the first regional seismic surveys were being completed, and the North Sea was among the first petroleum basins to utilise this technology. The earliest seismic images were of extremely poor quality, but they led the way to a vibrant petroleum industry and a hunger for more, and better, data.
Some 50 years later, the oil industry can look into the subsurface in 2D, 3D and even 4D. Seismic surveys are increasing in extent, quality and resolution; wells are being drilled deeper, with better tools and sampling capabilities; and our ability to map the subsurface is better than at any other time in history. This understanding allows oil and gas companies to make sound business decisions about where to drill for resources, giving the best chance of discovering a commercially viable accumulation.
Peak Oil, and Peak Data?
The term 'Peak Oil' was introduced in the 1950s by M. King Hubbert, who proposed that the maximum rate of worldwide oil extraction would occur around 2000, followed by a decline in new discoveries and, therefore, in the oil available to produce. Some hypothesised that this decline would lead to economic and societal collapse.
Thankfully, the initial peak oil predictions were not accurate. Advances in technology, and the success of unconventional oil accumulations, have extended the trend; current peak oil predictions generally range across the 2020s–2030s. More importantly, the impact of reaching peak oil has lessened, with huge developments in alternative energy sources over recent decades. Whilst there will still be a need for hydrocarbon products for many years to come, our reliance on them will soon begin to decrease.
This leads me to the concept of peak subsurface data. Over the last 50+ years, the petroleum industry has been the catalyst for the acquisition of colossal volumes of subsurface data. Industry and scientific research into geological understanding of reservoir rocks has meant that we now have an ever-improving understanding of earth processes, both past and present.
With reduced oil exploration will, inevitably, come a reduced rate of data acquisition, and a subsequent reduction in funding for studies of petroleum systems and their components. No other current activity demands so much subsurface focus, or can provide the resources and funding to look so deep into the earth. Low-carbon energy sources either require only near-surface geohazard surveys (e.g. for wind farm or tidal installations, or geothermal projects), or rely on pre-existing and depleted oil and gas fields over which seismic data already exists. Are we, therefore, approaching the pinnacle of our ability, and our need, to interrogate the subsurface?
The Future
What might life after peak subsurface data look like? The most recent downturn in the oil industry may give some clues. It saw heavy casualties in the marine seismic acquisition business, with a number of companies declaring bankruptcy or announcing their departure from the market. The rate at which new seismic surveys are collected has decreased, with a new focus on multi-client data. Despite this, there are still flurries of activity where exploration remains active, for example in Mexico following the announcement of the first international licensing round (see video below).
Offshore seismic data acquisition the year following the announcement of the first international licensing round in over 50 years.
It is not all doom and gloom for geoscientists, though: there will be increasing opportunities to access data as the petroleum industry winds down. As basins mature and the demand for hydrocarbons decreases, countries will compete for investment, so it is likely that increasing volumes of data will become 'open access' in order to attract explorers. This growing body of open-access data offers a huge opportunity for scientists to investigate the subsurface, and the reprocessing industry will help to continue improving existing datasets (at roughly a tenth of the cost of acquisition). However, even as more data becomes available to scientists, investment in the subsurface sciences will continue to decrease as the need to understand the petroleum systems deep below our feet diminishes. Cheap, accessible data visualisation software will need to be developed to avoid stagnation in the earth sciences.
The challenge ahead will be maintaining the data we have for future use. As technology continues to advance, how can we access old data and make use of it? Will the data we are collecting now become redundant in the future? If so, how can we adapt? Perhaps the industry can adopt the principles that the environmental sciences are implementing to make digital datasets 'FAIR'. These aim to ensure that all newly acquired data is:
Findable, Accessible, Interoperable and Reusable.
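In practice, FAIR compliance starts with complete metadata on every dataset. As a toy illustration (the field names below are hypothetical; real repositories define their own metadata schemas), a completeness check for a seismic survey record might look like this:

```python
# Toy sketch of a FAIR-style metadata completeness check for a survey record.
# Field names are illustrative assumptions, not a real repository schema.

REQUIRED_FIELDS = {
    "identifier",  # Findable: a persistent, unique ID (e.g. a DOI)
    "access_url",  # Accessible: where and how the data can be retrieved
    "format",      # Interoperable: an open, documented file format
    "licence",     # Reusable: terms under which the data may be reused
}

def missing_fair_fields(record):
    """Return the required fields that are absent or empty in a record."""
    return sorted(f for f in REQUIRED_FIELDS if not record.get(f))

survey = {
    "identifier": "doi:10.9999/example-survey",   # hypothetical DOI
    "access_url": "https://example.org/surveys/1995-north-sea",
    "format": "SEG-Y",
    # "licence" is missing, so the record is not yet FAIR-complete
}
print(missing_fair_fields(survey))  # ['licence']
```

A check like this could be run when data is deposited, flagging incomplete records before they are archived rather than decades later.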
Governments are already stepping in successfully as providers of data repositories, and data scientists are beginning to use new technology to make vintage datasets accessible. There is a big challenge ahead for each and every one of us to ensure that this information is not lost. We all have a responsibility as geoscientists to contribute to the cause by ensuring data is archived correctly for future generations to enjoy!
-- Keep Exploring --