11th Jul, 2024

A deep dive into land seismic data processing: insights from leading geophysicists


Processing seismic data acquired on land presents unique challenges and key considerations. In our discussions with five of the top land seismic processing geophysicists in the world, we explore the key elements of land seismic processing, shedding light on the challenges, methodologies, and expertise essential for success in this dynamic field.

Why is the near-surface so important to solve in land processing?

Andrew Stone, Seismic Processing Geophysicist


"Near-surface anomalies, such as infilled channels, dunes, groundwater, and weathering, etc can degrade the quality and usefulness of the final seismic image and therefore need to be corrected to obtain a more reliable, clearer subsurface image, avoiding potential deterioration of reflection events and loss of any finer details.
“A significant part of producing high-quality imaging is modelling the near-surface velocity; the accuracy and resolution of this velocity model largely determine the success of the final seismic image. The derived near-surface velocity model can then be used to calculate accurate statics corrections that compensate for near-surface anomalies.
“This is critical for producing high-quality onshore seismic images and for interpreting the subsurface geology. The velocity model can be used for depth velocity model building to obtain an accurate depth migration and is also key for deriving detailed depth conversions. A near-surface model can be achieved using a variety of techniques, including delay time analysis, near-surface refraction tomography, surface wave inversion, and full waveform inversion.
"Improvements to the near-surface model can be achieved with denser sampling. Another point to note is that finer details within the near-surface weathering zone can often be beyond the seismic resolution. Therefore, necessitating a balance between near-surface modelling and the heterogeneities in the processing or compensating for them. Ultimately, a detailed understanding of the near-surface is crucial for achieving the best imaging results and interpretations."

Why is processing single point source and receiver data different from array surveys?

Celina Giersz, Senior Processing Geophysicist


“Unlike in array surveys, where data acquired by several sensors is stacked together to produce one record, in single sensor/single point surveys, data acquired by each sensor is recorded separately. The result is an unfiltered representation of the wavefield, with all the signal and the noise represented in their raw forms. By choosing the correct spatial sampling, most of the noise and signal become coherent, and therefore much easier to separate in processing.
“High-density single point source/single receiver data significantly benefits the pre-processing stage. We can often see this when we apply pre-stack time migration to lightly pre-processed data and already see an image quality that would usually take many stages to achieve on conventional data. In fact, most processing algorithms perform better on high trace density datasets, often leading to a much simpler and faster sequence.
“There are still geological contexts where challenging near-surface conditions would benefit from the use of arrays. In these regions, replacing physical arrays with digital array forming (DAF, also called digital group forming, DGF) would deliver similar or better results while still giving the user access to the single-sensor records, which can be handy later in processing once the near-surface challenges have been overcome. Ultimately, high-density single point source and point receiver surveys deliver higher-resolution images and better-quality seismic attributes.”
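The digital group forming described above can be sketched very simply: adjacent single-sensor traces are combined in software instead of being hard-wired in the field. This toy version just averages neighbouring traces; real DGF would apply per-sensor statics and weights before summation, and the array and sample counts here are made up for illustration.

```python
import numpy as np

def digital_group_form(traces: np.ndarray, group_size: int) -> np.ndarray:
    """Form digital groups by averaging adjacent single-sensor traces.

    traces: (n_sensors, n_samples) array of single-sensor records.
    Returns a (n_sensors // group_size, n_samples) array of group traces.
    """
    n_groups = traces.shape[0] // group_size
    trimmed = traces[:n_groups * group_size]          # drop any leftover sensors
    # Reshape to (groups, sensors-per-group, samples) and average within groups
    return trimmed.reshape(n_groups, group_size, -1).mean(axis=1)

# 12 single sensors, 4 time samples each, formed into groups of 3
data = np.arange(48, dtype=float).reshape(12, 4)
groups = digital_group_form(data, 3)   # shape (4, 4)
```

The key advantage the interview highlights is that, unlike a physical array, the input single-sensor traces remain available for later processing stages.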

What are the challenges of having advanced imaging algorithms work effectively on land data?


Zhongmin Song, Senior Research Geophysicist

“An important characteristic of land data, compared to most marine data, is its irregular spatial sampling which can be challenging to correct and may become a source of artefacts or high uncertainty.

“Land data is usually contaminated with various types of noise that are not accounted for in advanced imaging algorithms. This makes pre-processing a tedious but essential step for these algorithms to run correctly on land data.
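As a toy illustration of the kind of pre-processing step mentioned above, the sketch below applies a zero-phase frequency-domain band-pass to suppress low-frequency, ground-roll-like noise. This is a deliberately simplified stand-in for the cascaded noise-attenuation steps used in real land processing; the frequencies and sample rate are assumed values.

```python
import numpy as np

def bandpass(trace: np.ndarray, dt: float, f_lo: float, f_hi: float) -> np.ndarray:
    """Zero-phase band-pass via a boxcar mask in the frequency domain.

    trace: 1-D seismic trace sampled at dt seconds.
    Keeps only spectral components between f_lo and f_hi (Hz).
    """
    spectrum = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(trace.size, d=dt)
    keep = (freqs >= f_lo) & (freqs <= f_hi)   # boxcar pass band
    return np.fft.irfft(spectrum * keep, n=trace.size)

# 1 s trace at 2 ms sampling: a 30 Hz reflection-like signal buried in
# strong 5 Hz ground-roll-like noise
dt = 0.002
t = np.arange(0, 1.0, dt)
trace = np.sin(2 * np.pi * 30 * t) + 3.0 * np.sin(2 * np.pi * 5 * t)
clean = bandpass(trace, dt, 10.0, 60.0)   # 5 Hz component removed
```

Production workflows use far more selective tools (f-k, tau-p, model-based ground-roll removal), but the principle of separating coherent noise from signal before imaging is the same.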

“Unlike in marine acquisition, both the source and receiver in land acquisition are coupled to the ground, making the elastic effect of wave propagation much more prominent in land data.

“Unfortunately, these effects were long ignored in advanced algorithms, which were initially designed to work on marine data. This made their early transfer to land cases not very successful.

“With the increase in trace density, pre-processing has made the attenuation of most noise much more efficient. Combined with the increase in computer power, we now see examples of a new generation of advanced imaging algorithms based on elastic propagation successfully applied to land cases."

What are the main skills required to be a successful land processor?


Eamonn Murray, Senior Processing Geophysicist

"1. Persistence: strong and relentless determination to chase down problems
"2. A good visual memory to firstly recognise if data has a problem and then to have the recall to remember what data had a similar problem in the past.
"3. An understanding of the bones behind the processing steps helps to short-circuit the path to solutions by quickly rejecting the non-viable options. This is a critical attribute that all processers share, whether they work in land or marine.
"4. Having detailed awareness and understanding of the field acquisition. It's very useful to understand the problems that are encountered on the ground and how they translate into the recorded data.
"5. Certainly not to be satisfied with running by rule conventional flows. To be always willing to look outside and develop your workflows to suit different basins and different datasets.
"6. A fair understanding of the geology of the rocks and the geological processes that create them can be very useful in deciding what is real and what is false in the image you're looking at; what structures are realistic in particular geologic settings.

What is the difference between processing seismic for renewables vs the O&G industry?


George Barns, Seismic Processing Geophysicist

“The most common differences include survey size, number of targets, target depth, turnaround time, and environmental context. Typically, O&G projects are much larger, often exploring a wider area that may contain several areas of interest.
"Non-O&G projects are typically smaller in scope with a narrow pre-existing focus/target. An example of this would be a recent project we worked on for targeting groundwater in the Middle East, with just 8km from two 2D lines. The target is often shallow in these contexts. Whilst this is not exclusive to non-O&G projects, it is more common given their nature. Focusing on shallow imaging can introduce additional challenges in processing due to increased noise contamination of near offsets - where a strong focus on random and linear noise attenuation is required. Also, the need for a highly accurate & dense near-surface velocity model is critical for solving statics & optimising imaging, this is supported by dense sampling of the data.

“A clear operational pattern we have observed is the quick turnaround necessary for these projects. Whilst larger O&G projects may stretch over several months due to the volume of data being handled, most of the smaller, more focused projects are requested much sooner, with images delivered in just a few days or weeks after acquisition.”



Learn more about land seismic data processing

Discover how STRYDE’s team of geophysical experts can process land seismic data in the most efficient way possible