Computer Systems and Digital
Filtering Concepts in Seismic Data
Processing
 The general impact of the digital computer upon seismic
prospecting has been in the processing of seismic data.
 Such processing had its beginnings when field systems
were introduced in the early 1950s for reproducible
recording on analog magnetic tape.
 But it was not until modern high-speed digital computers
became available that the full potential of processing
techniques could begin to be realized for improving the
quality and usefulness of seismic field data.
 The basic objective of all seismic processing is to convert
the information recorded in the field into a form that can be
used for geological interpretation.
 The data initially recorded on magnetic tape (digital or
analog) are transformed in the processing center into a
record section comparable in some ways to a geological
structure section.
 One object of the processing is to eliminate or at least
suppress all noise (defined here as signals not associated
with primary reflections).
 Actually we are not particularly interested in reducing the
absolute level of noise but in increasing the ratio of signal
level to noise level, the signal-to-noise ratio.
 Another is to present the reflections on the record sections
with the greatest possible resolution and clarity and in the
proper geometrical relationship to each other.
 Reproducibly recorded data and playback facilities have
made it economically feasible to implement new field
recording systems designed to facilitate the suppression of
noise relative to reflection signals.
 Many of these systems require processing of the data as
an integral part of the data acquisition.
 Among these are common-depth-point recording and
vibroseis, as well as many marine systems.
 Most non-dynamite sources require summing of signals,
which can be carried out either on special digital
compositing units in the field or, at a later stage, in a
playback center.
 Data processing is a lengthy affair, comprising five major
types of corrections or adjustments: time, amplitude,
frequency-phase content, data compression, and data
repositioning.
 Seismic data are recorded in the field on magnetic
tapes
 About three to six weeks later the information is
transmitted to the interpreter as a seismic section
 All of the intervening steps comprise the data
processing phase of seismic exploration
 Processing of seismic data consists of applying a
sequence of computer programs, each designed to
achieve one step along the path from field tape to
record section
 From ten to twenty programs are usually used in a
processing sequence
 In each company’s library there may be several
programs designed to produce the same effect by
different approaches
 The selection of the processing sequence for a given set
of field data depends upon:
 Geologic environment
 Processing philosophy of the company
 Personal preference of the user
 Cost
 Processing programs fall into five categories:
 Data reduction
 Geometric corrections
 Data analysis and Parameter optimization
 Data refinement
 Data presentation and storage
 Some procedures are common to every processing
sequence.
 Others are used only occasionally or rarely.
 Some processors always come at the same stage of the
sequence.
 Others can be used in any of several stages. Some may
be used more than once
 There is no standard processing recipe for all types of
data
DATA ASSEMBLY AND PREPROCESSING
 Field tapes are often delivered to the processing centre
in multiplexed format.
 They are accompanied by hand written notes prepared
by members of the field party.
 These notes contain the information necessary to
identify and process the data, including the following:
 Line number & shot numbers
 Location of source and receiver stations
 Elevation of source & Receiver stations
 Station interval
 Shot hole depths and charge sizes
 Date and time of recording
 Uphole times may be read and recorded by hand, or
they may be read automatically by the computer.
 All of this information is recorded on scratch tape.
 When the field tape is demultiplexed, the archival data
are merged with the field data.
 These data are recorded in the reel header and trace
header blocks on an output tape, which later becomes
the input to the main processing run stream
ROUTINE PROCESSES AND CORRECTIONS
 Making CDP gathers
 Restoring true amplitudes
 Applying static corrections
 Analyzing and applying filters
 Running velocity analysis and applying NMO
corrections
 Stacking
PRELIMINARY EVALUATION
 The brute stack is usually regarded as a trial run for
quality control.
 It is played out directly on paper and is not recorded
on film for permanent retention.
 Processors examine the brute stack for evidence of
residual problems that may need special processing.
 At this stage it often helps to examine the preliminary
displays.
 Corrected CDP gathers may be especially useful, since
they provide the last chance to see the field traces
before they are stacked.
DEMULTIPLEXING
 The output of a geophone group during field recording
is a continuously varying electrical voltage
 A graph of this output signal showing voltage as a
continuous function of time is one form of a seismic
trace, an analog display
 Since seismic data are processed in digital computers,
the continuous geophone output voltage must be
converted to digital format by sampling
 Sampling occurs in the field or in the processing centre
in an analog to digital converter (A/D)
 Consider four geophone arrays, whose outputs are
brought to four terminals arranged in a circular pattern
 The instantaneous voltage at each terminal is recorded
each time the rotating arm sweeps over it, yielding an
array of samples
 If we identify each sample by its geophone group
source (A, B, C, D) and by its chronological sequence in
that group (1, 2, 3, …)
 Then output of A/D converter is
A1,B1,C1,D1,A2,B2,C2,D2,A3,B3,C3,D3,……
 This scrambled sequence is referred to as a
“multiplexed” array
 However, it is more convenient to process seismic data
in trace sequential array
A1,A2,A3,…B1,B2,B3…C1,C2,C3,….,D1,D2,D3,…..
 Unscrambling a multiplexed array into a trace
sequential array is called “Demultiplexing”
 It is accomplished by a simple computer sorting
program and is the first step in any data processing
sequence
DATA REDUCTION
Data Reduction
Introduction
 Data reduction is the first and most important step in the data
processing flow.
 It includes the following four categories of software programs:
i. Demultiplex
ii. Display
iii. Edit
iv. Amplitude adjustment
Demultiplex
 Demultiplexing in the geophysical sense is the
unscrambling of multiplexed field data to trace sequential
form.
 To the data processor, demultiplex has come to mean a
whole family of processing functions that affect or are
affected by the specific process of demultiplexing.
 Demultiplexing will be explained in the following six major
steps.
a. Tape Dump and Format Determination
b. Preliminary checks
c. Binary gain recovery
d. Vibroseis correlation
e. Summing
f. Diversity stacking
Tape Dump and Format Determination
 A tape dump is a printed listing of the data on tape.
 A tape dump could be printed in binary but this would
generate a large quantity of paper.
 It is much more convenient to convert binary numbers to
“hexadecimal” for listing purposes.
 Hexadecimal is a base 16 number system using the
characters 0,1,2,3,4,5,6,7,8,9,A,B,C,D,E, and F. Four bits
are required to express one hexadecimal digit.
 Hexadecimal numbers are often written preceded by an “X”
to identify them e.g. ‘X16B8C1AE.DC’.
 To convert X6AC1B to binary:
Hex 6 A C 1 B
Binary 0110 1010 1100 0001 1011
i.e. X6AC1B: 01101010110000011011
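The digit-by-digit conversion above can be reproduced in one line of Python (an illustrative check, expanding each hex digit to its four-bit binary form):

```python
# Convert the slide's example X6AC1B to binary, four bits per hex digit
hex_str = "6AC1B"
binary = "".join(format(int(digit, 16), "04b") for digit in hex_str)
print(binary)  # -> 01101010110000011011
```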
Preliminary Checks
 The initial tests should be as simple as possible.
 Demultiplexing 5-10 records should be adequate for the
purpose.
 These runs should not contain any options except resample
and / or binary gain removal.
 In this manner one can see basic data undisturbed by gain
curve application or correlation.
 Output of auxiliary channels (time break, uphole etc.) also
are helpful at this stage.
 At this time, the processor has a tape format sheet, a hex
dump, a run-time printout from the computer and a display of
demultiplexed data.
 With the help of these, the following field tape errors can be
diagnosed and removed.
Rate Errors
 A rate error occurs when the multiplexed data is coming
too fast from the magnetic tape to the disk.
 This normally occurs when the wrong format has been
specified or there are too many zeros ahead of the first
sync frame.
Binary Gain Recovery
 Modern digital seismic instruments use various gain
ranging techniques to increase their dynamic range.
 This is necessary because of the very rapid decay of
seismic signal strength with time.
 Gain ranging instruments apply a gain to the signal before
recording it and also keep a record of what gains were
applied. Since the gains applied to the signal are recorded
(usually on auxiliary channels) along with the signal, the
processing centre can “de-gain” the data, that is, remove
the gain applied by the field instruments.
 To correct for the rapid decay of signal energy, a gain
function may be applied.
 A gain function is a smoothly time varying gain applied to
the data to bring the overall amplitude up to a fairly
constant level.
 This is referred to as gain recovery.
Vibroseis Correlation
 Most seismic methods use an energy source that
generates a very short pulse.
 When these pulses are reflected and recorded they can be
used directly to examine subsurface structure.
 The signal generated in the vibroseis method is not a short
pulse, but rather a “sweep” lasting some seven seconds or
longer.
 The sweep is sent into the earth by a pad that is pressed
against the ground surface and vibrated.
 The sweep is transmitted through the earth and reflected
as is any seismic signal, but on vibroseis record, each
reflection is as long as the input sweep rather than being a
pulse or wavelet as with an impulsive source.
 Each reflection is a near duplicate of the sweep itself and
so the reflections in the vibroseis record overlap and are
generally indistinguishable.
 To make the vibroseis record useable, we must
“compress” the reflections into short wavelets.
 This is done by cross correlating the data with the original
input sweep.
 If we were to correlate a sweep with itself, the result would
be a wavelet at time zero containing all of the frequencies
in the sweep.
 This is the autocorrelogram of the sweep and is called a
“Klauder wavelet”
 Similarly, when a vibroseis trace is correlated with a
sweep, the result is that a wavelet is output every time a
reflected sweep is encountered.
 On the correlated output record, each reflection has been
“compressed” to a wavelet.
 Thus correlated vibroseis data looks similar to impulsive
source data and can be treated in much the same way.
 There are some important differences, however, which will
be discussed later in this course.
Header Generation
 After all of the samples from a given field trace are
assembled into an array, a large amount of archival
information is placed in a reserved block called a trace
header, which is located on the tape just ahead of the data
samples.
 Trace header information may include location and
elevation of source and receiver, field record number, trace
number, etc.
 A reel header block is also placed at the head of each reel,
for recording line number, reel number etc.
Display
 At the end of the processing sequence, and sometimes at
intermediate stages, it is necessary to reconvert the
digitized data to analog format so that they can be
displayed and then examined visually.
 There are several methods of plotting digital data in analog
form.
 The most common method is to pass the digital trace
through a sample and hold device, which produces an
output voltage proportional to each sample and which
holds the voltage constant until the next sample arrives.
 This procedure produces a “Stair step” type of analog
output which can then be smoothed into a continuous trace
by passage through a low pass filter.
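The stair-step output and its smoothing can be mimicked digitally; in this sketch a moving average stands in for the analog low-pass stage (an illustrative simplification):

```python
import numpy as np

# A few digital samples to "play back"
samples = np.array([0.0, 1.0, 0.5, -0.5, 0.0])
hold = 8                                   # output points held per input sample

# Sample-and-hold: repeat each value until the next sample arrives
stairs = np.repeat(samples, hold)

# Smooth the stair steps with a simple moving-average low-pass filter
kernel = np.ones(hold) / hold
smooth = np.convolve(stairs, kernel, mode="same")
```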
Editing
 Raw seismic data inevitably contains some unwanted
noise and perhaps some dead traces.
 If obviously useless information is to be removed from the
processing stream, it must first be identified and then
blanked or muted by assigning zero values to all samples
in the affected time interval.
 Unwanted data are usually identified by visual examination
of raw field traces, although obvious cases (dead traces,
strong noise bursts etc.) can be detected and edited
automatically.
 The raw field trace used in this editing procedure can be
obtained from field monitor records or from one of the trace
gathers previously described.
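Automatic detection of dead traces and strong noise bursts can be sketched with simple RMS thresholds; the gather, thresholds, and factors below are invented for illustration:

```python
import numpy as np

# Hypothetical gather: 6 traces (rows) of 500 samples
rng = np.random.default_rng(0)
gather = rng.normal(0, 1.0, size=(6, 500))
gather[2] = 0.0                      # a dead trace
gather[4, 100:120] += 50.0           # a strong noise burst

# RMS amplitude per trace, compared with the median across the gather
rms = np.sqrt((gather**2).mean(axis=1))
median_rms = np.median(rms)

for i, r in enumerate(rms):
    if r < 0.05 * median_rms:        # dead or nearly dead: zero it
        gather[i, :] = 0.0
    elif r > 5.0 * median_rms:       # anomalously strong: mute it
        gather[i, :] = 0.0
```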
Amplitude Adjustment
 In a seismic section the variations in the amplitudes of
reflection can be important factors in the interpretation
 Lateral amplitude variations from trace to trace, within a
reflection event (bright spot) may be direct indications of
the presence of hydrocarbons.
 Vertical amplitude variations, from event to event, may be
helpful in identifying and correlating reflecting horizons.
 To preserve these important amplitude variations, the
seismic analyst must exercise care in applying gain
recovery and scaling (trace equalization).
 In some cases, however, the amplitude variations are so
great that low level events become difficult to follow or
even invisible
 In order to raise the level of these weak events relative to
the strong ones so that geologic structure can be made
visible, the analyst can apply a digital balance or “AGC”
(automatic gain control).
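One simple sliding-window form of AGC (there are several variants; this one normalizes each sample by the mean absolute amplitude around it) can be sketched as:

```python
import numpy as np

def agc(trace, window=50):
    """Automatic gain control: divide each sample by the mean absolute
    amplitude in a sliding window centred on it."""
    n = len(trace)
    out = np.empty(n)
    half = window // 2
    for i in range(n):
        seg = trace[max(0, i - half):min(n, i + half + 1)]
        mean_amp = np.mean(np.abs(seg))
        out[i] = trace[i] / mean_amp if mean_amp > 0 else 0.0
    return out

# A decaying synthetic trace: the late part is almost invisible
t = np.arange(500)
trace = np.sin(2 * np.pi * t / 25) * np.exp(-t / 100.0)
balanced = agc(trace)
```

After AGC the late, weak oscillations are brought up to roughly the same level as the early ones, at the cost of the true relative amplitudes the preceding bullets warn about preserving.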
Geometric Corrections
 A seismic trace on a field monitor shows reflected energy
bursts from subsurface rock layer interfaces.
 We will later measure the travel times from source down to
reflector and back to geophone and use them together with
average velocity information (if we have it) to compute
depths to the various reflectors.
 However, before we use these reflected energy bursts and
their travel times, we must apply several corrections to
compensate for geometric effects.
 The corrections include static correction and normal move
out corrections applied on the trace gathered data.
Trace Gathering
 Traces are routinely gathered into groups having some
common elements.
 The types of gathers usually made are:
- Common source point
- Common depth point
- Common receiver
- Common offset
 The information needed to generate these various gathers
may be contained in the trace headers already inserted by
some geometry definition programs.
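Gathering from trace headers reduces to grouping by a key. A sketch for common-midpoint gathers, with hypothetical header fields and coordinates:

```python
# Hypothetical trace headers carrying source and receiver coordinates
headers = [
    {"trace": 1, "src": 0,   "rcv": 200},
    {"trace": 2, "src": 100, "rcv": 100},
    {"trace": 3, "src": 0,   "rcv": 400},
    {"trace": 4, "src": 200, "rcv": 200},
]

# Group traces by their midpoint; other gathers (common source, common
# receiver, common offset) just use a different key
gathers = {}
for h in headers:
    midpoint = (h["src"] + h["rcv"]) / 2
    gathers.setdefault(midpoint, []).append(h["trace"])

print(gathers)  # traces 1 and 2 share midpoint 100; traces 3 and 4 share 200
```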
Static Corrections
 Static corrections are constant for an entire trace. They
consist of weathering corrections and elevation
corrections.
 There is usually a thin layer – from several feet to several
hundred feet thick - of low velocity material (1000 to 2500
ft/sec) at the earth’s land surface, underlain by rock of
much higher velocity (more than 5000 ft/sec).
 Geophysicists call this low velocity zone the weathered
layer.
 Geologists use the same term to describe a generally
thinner surface zone in which the elements tend to
decompose rock into soil.
 The thickness and velocity of the weathered layer can
change along a seismic line, thereby causing changes in
raw reflection time which are unrelated to subsurface
structure.
 This effect must be removed by applying a weathering
correction.
 Since seismic sources and receivers are usually at or near
the earth’s surface, raw reflection times are influenced by
topographic effects, which are also independent of
subsurface structure.
 These effects are removed by elevation corrections.
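A minimal elevation-static computation, assuming a flat datum below both stations and a known replacement (sub-weathering) velocity; all numbers are invented:

```python
# Reference datum and replacement velocity (illustrative values)
datum = 500.0            # datum elevation, ft
v_replacement = 6000.0   # sub-weathering velocity, ft/s

src_elev, rcv_elev = 650.0, 710.0   # station elevations, ft

# Time spent travelling between each station and the datum;
# subtracting it refers the raw reflection time to the datum
static_shift = ((src_elev - datum) + (rcv_elev - datum)) / v_replacement
print(static_shift)  # seconds to subtract from the raw reflection time
```

A full static correction would also strip the weathered-layer delay using uphole times or refraction analysis; this sketch shows only the elevation part.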
Dynamic Correction (NMO)
 In general, shot points and their associated geophone
stations are separated by distances ranging up to three
kms or even more.
 The slant, or non-vertical component of travel time can be
large and must be removed before we can attempt to
relate reflection time to reflection depth.
 For a flat reflector, the difference between travel time to a
remote geophone station and travel time to a coincident
(zero offset) station is called the normal move out (NMO).
 The purpose of this process is to derive normal moveout
functions from the velocity-time functions and apply normal
moveout corrections to each trace in each CDP gather.
 Normal moveout is a function of reflection-time and trace
offset (source to detector distance) and, thus, a normal
moveout time function must be derived for each trace
offset.
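For a flat reflector the moveout relation is t(x) = sqrt(t0² + x²/v²), so the correction to remove at each offset is t(x) - t0. A sketch with illustrative values:

```python
import numpy as np

def nmo_correction(t0, offset, velocity):
    """Normal moveout for a flat reflector: t(x) = sqrt(t0**2 + (x/v)**2),
    so the time shift to remove is t(x) - t0."""
    tx = np.sqrt(t0**2 + (offset / velocity) ** 2)
    return tx - t0

t0 = 1.0                                       # zero-offset time, s
v = 2500.0                                     # stacking velocity, m/s
offsets = np.array([0.0, 1000.0, 2000.0, 3000.0])  # m
dt_nmo = nmo_correction(t0, offsets, v)
```

The correction is zero at zero offset and grows with offset, which is why a separate moveout function is needed for each trace offset, as the bullet above notes.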
Data analysis and parameter optimization
Filtering
 A filter is a system that discriminates against some of its
input.
 Seismic data always contain some signal information,
which we want to preserve.
 Everything else is called noise, and we want to remove or
reduce it.
 In a sense, almost all of the processes we use are filters of
a sort, because they are designed to preserve signals and
attenuate noise.
 Those systems which are generally called filters work
either by convolution in the time domain or by spectral
shaping in the frequency domain.
 The most common types of filters are the following:
- Low pass frequency filters
- High pass frequency filters
- Band pass frequency filters
- Notch frequency filters (attenuate a narrow frequency
band)
- Deconvolution filters (shorten pulses and/or suppress
reverberations)
- Velocity filters (multi-channel filters which attenuate
seismic events having specified range of apparent
velocities)
Deconvolution
 Deconvolution is a filtering process designed to improve
resolution and suppress multiple reflections.
 Deconvolution can be considered either in the time domain
or in the frequency domain.
 In the time domain the object is to convert each wavelet,
with its reverberations and multiples, into a single spike.
 If we know the shape of the wavelet, we can design an
operator which, when convolved with the seismic trace, will
convert each wavelet into a single spike.
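One standard way to design such an operator, assuming the wavelet shape is known, is least squares: choose filter coefficients so that filter convolved with wavelet is as close as possible to a spike. A sketch with an invented wavelet:

```python
import numpy as np

# A known (invented) decaying wavelet and a chosen operator length
wavelet = np.array([1.0, 0.5, 0.25, 0.125])
n_op = 8

# Convolution matrix C such that C @ f == np.convolve(wavelet, f)
n_out = len(wavelet) + n_op - 1
C = np.zeros((n_out, n_op))
for j in range(n_op):
    C[j:j + len(wavelet), j] = wavelet

# Desired output: a single spike at time zero
desired = np.zeros(n_out)
desired[0] = 1.0

# Least-squares operator, then check what it does to the wavelet
f, *_ = np.linalg.lstsq(C, desired, rcond=None)
spiked = np.convolve(wavelet, f)
```

Convolving the operator with the wavelet yields (very nearly) a unit spike at time zero, so applying it to a whole trace compresses every embedded wavelet the same way.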
Seismic Velocities
 In this process velocity-time functions are derived from
reflections by a scanning technique; velocity “spectra” are
produced and plotted for interpretation by the geophysicist.
 Supplemental data, usually from well velocity surveys, also
may be used to increase accuracy of the functions.
 This analysis is performed continuously such that velocity
time functions are derived at intervals of, generally, one
kilometer along the seismic line.
Processing Sequence/ Flow Chart
 Successful processing requires selecting the appropriate
programs and parameters for a given set of data.
 Several diagnostic programs, e.g. velocity analysis, DCON
analysis and spectral analysis can be used to reveal
details which help in the choice of data enhancement and
refinement techniques.
 The finalization of this set of programs to process a special
project or group of seismic profiles are managed in a flow
chart or processing sequence.
Data Refinement
 The processes described thus far are used to reformat,
correct and diagnose data characteristics.
 This category includes various procedures intended for
data refinement.
Stacking
 After static and NMO corrections have been applied,
primary reflections should appear as horizontal events
across the gather, and a normalized sum of the traces
should then form one composite trace in which reflections
are preserved at full strength, while all other energy is
reduced.
 This procedure is known as stacking or compositing
 Stacked data are also referred to as CDP or CRP
(common depth point or common reflection point) data.
 Some geophysicists prefer the term CMP (common
midpoint), in recognition of the fact that the reflection points
of the gathered traces do not coincide unless all reflectors
are flat.
 Stacking came into general use in the early 1960s.
 Its original purpose was to attenuate multiple reflections,
which are echoes from the marine water bottom or from
strong subsurface reflecting layers.
 It has also proved useful in suppressing other kinds of
noises both organized and random.
Residual Statics
 Routine static corrections mentioned earlier, are often not
adequate to correct satisfactorily for near surface effects.
Migration.
 A trace on a record section can be thought of as an
idealized field trace recorded by one geophone station
from a shot at the same station.
 Note that from one trace it is generally not possible to
distinguish signal from noise.
 When a trace is plotted vertically beneath its common shot/
receiver station, a reflected event is identified by its
alignment across an array of traces, and the reflection time
is measured under each station.
 If the reflector is flat, the reflection point will be located
directly beneath the shot/ receiver station, and the record
section displays the event in its true position plotted in time
rather than depth.
 The process of re-plotting seismic events in their true spatial
positions is called migration, and there are several
methods of implementing this process.
Data Presentation And Storage
 It involves the labeling of processing sequence, field
information, velocity data and other useful parameters on
the final section.
 Moreover, the storage of essential seismic data for future
use is also an important subject of data processing
concern.
More Related Content

PPTX
Seismic data processing
PPT
Seismic Data Processing, Ahmed Osama
PDF
Principles of seismic data processing m.m.badawy
PDF
Simple seismic processing workflow
PDF
2 d and 3d land seismic data acquisition and seismic data processing
PPTX
New introduction to seismic method
PPTX
Lecture 23 april29 static correction
PDF
Basics of seismic interpretation
Seismic data processing
Seismic Data Processing, Ahmed Osama
Principles of seismic data processing m.m.badawy
Simple seismic processing workflow
2 d and 3d land seismic data acquisition and seismic data processing
New introduction to seismic method
Lecture 23 april29 static correction
Basics of seismic interpretation

What's hot (20)

PDF
Principles of seismic data interpretation m.m.badawy
PDF
Introduction to seismic interpretation
PPTX
Land Seismic Sources - Explosives Vs. Vibroseis
PDF
Reservoir Geophysics
PPTX
Seismic interpretation work flow final ppt
PDF
Bp sesmic interpretation
PDF
Using 3-D Seismic Attributes in Reservoir Characterization
PPTX
Seismic data processing 13 stacking&migration
PPT
Seismic survey
PPT
Seismic geometric corrections
PPTX
Role of Seismic Attributes in Petroleum Exploration_30May22.pptx
PPTX
Seismic Attributes .pptx
PPTX
Well logging
PPTX
SEISMIC METHOD
PPTX
Sonic log
PPT
Survey design
PPTX
Introduction to Seismic Method
PPT
Seismic acquisition
PDF
Introduction to velocity model building
PDF
Induced polarization method (electrical survey)
Principles of seismic data interpretation m.m.badawy
Introduction to seismic interpretation
Land Seismic Sources - Explosives Vs. Vibroseis
Reservoir Geophysics
Seismic interpretation work flow final ppt
Bp sesmic interpretation
Using 3-D Seismic Attributes in Reservoir Characterization
Seismic data processing 13 stacking&migration
Seismic survey
Seismic geometric corrections
Role of Seismic Attributes in Petroleum Exploration_30May22.pptx
Seismic Attributes .pptx
Well logging
SEISMIC METHOD
Sonic log
Survey design
Introduction to Seismic Method
Seismic acquisition
Introduction to velocity model building
Induced polarization method (electrical survey)
Ad

Similar to Seismic data processing (20)

PDF
Data analysis and Seismogram Interpretation
PDF
What do I do
PPTX
At seismic survey
PDF
Seismogram Analysis.pdf
PDF
Seismic_Processing.pdf
PDF
2 d and_3d_land_seismic_data_acquisition
PDF
Field Acqusition Paramater Design
PDF
Seniour Project: Integration of Surface Seismic with Geo-electric Data
PPT
2-D SURVEY DESIGN_Final.ppt
PPTX
Extended seismic processing sequence lecture 24
PDF
lesson 2 digital data acquisition and data processing
PDF
Managing Subsurface Data In The Oil And Gas Sector Seismic Maidinsar
PDF
Seismic Refraction Test
PPTX
Digital recording system (Geo Physics)
PPTX
Interpretation 23.12.13
PPTX
proposal presentation by Faisal engineer finialized 2024 (4) (1) (1) (2).pptx
PPTX
Sequence of Calculations for course participants.pptx
PPTX
( KEVIN SONI )DATA ACQUISITION SYSTEM
PDF
Reflection Seismology Overview
PDF
WesternGeco presentation - Seismic Data Processing
Data analysis and Seismogram Interpretation
What do I do
At seismic survey
Seismogram Analysis.pdf
Seismic_Processing.pdf
2 d and_3d_land_seismic_data_acquisition
Field Acqusition Paramater Design
Seniour Project: Integration of Surface Seismic with Geo-electric Data
2-D SURVEY DESIGN_Final.ppt
Extended seismic processing sequence lecture 24
lesson 2 digital data acquisition and data processing
Managing Subsurface Data In The Oil And Gas Sector Seismic Maidinsar
Seismic Refraction Test
Digital recording system (Geo Physics)
Interpretation 23.12.13
proposal presentation by Faisal engineer finialized 2024 (4) (1) (1) (2).pptx
Sequence of Calculations for course participants.pptx
( KEVIN SONI )DATA ACQUISITION SYSTEM
Reflection Seismology Overview
WesternGeco presentation - Seismic Data Processing
Ad

More from Shah Naseer (20)

PDF
Velocity Models Difference
PPT
Seismic acquisition survey
PPT
Structur Alanalysis
PPT
2 D 3D_ seismic survey
PPTX
Seismic waves
PPT
The geological time scale
PPT
fossils
PPT
Pompeii
PPT
Pompeii
PPT
What is a fossil?
PPT
Planktonic foraminifera
PPT
Mollusca
PPT
Mollusc
PPT
Fossils
PPT
Classification
PPT
Brachiopods
PPT
Benthic foraminifera
PPTX
SYNTHETIC GEMSTONE
PPTX
Seismicity
PPTX
Graphite
Velocity Models Difference
Seismic acquisition survey
Structur Alanalysis
2 D 3D_ seismic survey
Seismic waves
The geological time scale
fossils
Pompeii
Pompeii
What is a fossil?
Planktonic foraminifera
Mollusca
Mollusc
Fossils
Classification
Brachiopods
Benthic foraminifera
SYNTHETIC GEMSTONE
Seismicity
Graphite

Recently uploaded (20)

PPTX
G5Q1W8 PPT SCIENCE.pptx 2025-2026 GRADE 5
PPTX
7. General Toxicologyfor clinical phrmacy.pptx
PPTX
ECG_Course_Presentation د.محمد صقران ppt
PDF
. Radiology Case Scenariosssssssssssssss
PDF
diccionario toefl examen de ingles para principiante
PDF
Mastering Bioreactors and Media Sterilization: A Complete Guide to Sterile Fe...
PPTX
EPIDURAL ANESTHESIA ANATOMY AND PHYSIOLOGY.pptx
PDF
Phytochemical Investigation of Miliusa longipes.pdf
PPT
The World of Physical Science, • Labs: Safety Simulation, Measurement Practice
PPTX
microscope-Lecturecjchchchchcuvuvhc.pptx
PPTX
Cell Membrane: Structure, Composition & Functions
PPTX
Taita Taveta Laboratory Technician Workshop Presentation.pptx
PDF
HPLC-PPT.docx high performance liquid chromatography
PPTX
Vitamins & Minerals: Complete Guide to Functions, Food Sources, Deficiency Si...
PPT
protein biochemistry.ppt for university classes
PDF
SEHH2274 Organic Chemistry Notes 1 Structure and Bonding.pdf
PPTX
famous lake in india and its disturibution and importance
PDF
bbec55_b34400a7914c42429908233dbd381773.pdf
PPTX
Introduction to Fisheries Biotechnology_Lesson 1.pptx
PDF
Sciences of Europe No 170 (2025)
G5Q1W8 PPT SCIENCE.pptx 2025-2026 GRADE 5
7. General Toxicologyfor clinical phrmacy.pptx
ECG_Course_Presentation د.محمد صقران ppt
. Radiology Case Scenariosssssssssssssss
diccionario toefl examen de ingles para principiante
Mastering Bioreactors and Media Sterilization: A Complete Guide to Sterile Fe...
EPIDURAL ANESTHESIA ANATOMY AND PHYSIOLOGY.pptx
Phytochemical Investigation of Miliusa longipes.pdf
The World of Physical Science, • Labs: Safety Simulation, Measurement Practice
microscope-Lecturecjchchchchcuvuvhc.pptx
Cell Membrane: Structure, Composition & Functions
Taita Taveta Laboratory Technician Workshop Presentation.pptx
HPLC-PPT.docx high performance liquid chromatography
Vitamins & Minerals: Complete Guide to Functions, Food Sources, Deficiency Si...
protein biochemistry.ppt for university classes
SEHH2274 Organic Chemistry Notes 1 Structure and Bonding.pdf
famous lake in india and its disturibution and importance
bbec55_b34400a7914c42429908233dbd381773.pdf
Introduction to Fisheries Biotechnology_Lesson 1.pptx
Sciences of Europe No 170 (2025)

Seismic data processing

  • 1. Computer Systems and Digital Filtering Concepts in Seismic Data Processing
  • 2.  The general impact of the digital computer upon seismic prospecting has been in the processing of seismic data.  Such processing had its beginnings when field systems were introduced in the early 1950s for reproducible recording on analog magnetic tape.  But it was not until modern high-speed digital computers became available that the full potential of processing techniques could begin to be realized for improving the quality and usefulness of seismic field data.  The basic objective of all-seismic processing is to convert the information recorded in the field into a form that can be used for geological interpretation.  The data initially recorded on magnetic tape (digital or analog) are transformed in the processing center into a record section comparable in some ways to a geological structure section.
  • 3.  One object of the processing is to eliminate or at least suppress all noise (defined here as signals not associated with primary reflections.  Actually we are not particularly interested in reducing the absolute level of noise but in increasing the ratio of signal level to noise level, the signal-to-noise ratio.  Another is to present the reflections on the record sections with the greatest possible resolution and clarity and in the proper geometrical relationship to each other.  Reproducibly recorded data and playback facilities have made it economically feasible to implement new field recording systems designed to facilitate the suppression of noise relative to reflection signals.  Many of these systems require processing of the data as an integral part of the data acquisition.
  • 4.  Among these are common-depth-point recording and vibroseis, as well as many marine systems.  Most non-dynamite sources require summing of signals, which can be carried out either on special digital compositing units in the field or, at a later stage in a playback center.  Data processing is a lengthy affair, comprising five major types of corrections or adjustments; time, amplitude, frequency-phase content, data compression, and data repositioning.
  • 5.  Seismic data are recorded in the field on magnetic tapes  About three to six weeks later the information is transmitted to the interpreter as a seismic section  All of the intervening steps comprise the data processing phase of seismic exploration  Processing of seismic data consists of applying a sequence of computer programs, each designed to achieve one step along the path from field tape to record section
  • 6.  From ten to twenty programs are usually used in a processing sequence  In each company’s library there may be several programs designed to produce the same effect by different approaches
  • 7.  The selection of the processing sequence for a given set of field data depends upon:  Geologic environment  Processing philosophy of the company  Personal preference of the user  Cost
  • 8.  Processing programs fall into four categories:  Data reduction  Geometric corrections  Data analysis and Parameter optimization  Data refinement  Data presentation and storage
  • 9.  Some procedures are common to every processing sequence.  Others are used only occasionally or rarely.  Some processors always come at the same stage of the sequence.  Others can be used in any of several stages. Some may be used more than once  There is no standard processing recipe for all types of data
  • 10. DATA ASSEMBLY AND PREPROCESSING  Field tapes are often delivered to the processing centre in multiplexed format.  They are accompanied by hand written notes prepared by members of the field party.  These notes contain the information necessary to identify and process the data, including the following:
  • 11.  Line number & shot numbers  Location of source and receiver stations  Elevation of source & Receiver stations  Station interval  Shot hole depths and charge sizes  Date and time of recording  Uphole times may be read and recorded by hand, or they may be read automatically by the computer.  All of this information is recorded on scratch tape.
  • 12.  When the field tape is demultiplexed, the archival data are merged with the seismic data.  These data are recorded in the reel header and trace header blocks on an output tape, which later becomes the input to the main processing run stream
  • 13. ROUTINE PROCESSES AND CORRECTIONS  Making CDP gathers  Restoring true amplitudes  Applying static corrections  Analyzing and applying filters  Running velocity analysis and applying NMO corrections  Stacking
  • 14. PRELIMINARY EVALUATION  The brute stack is usually regarded as a trial run for quality control.  It is played out directly on paper and is not recorded on film for permanent retention.  Processors examine the brute stack for evidence of residual problems that may need special processing.
  • 15.  At this stage it often helps to examine the preliminary displays.  Corrected CDP gathers may be especially useful, since they provide the last chance to see the field traces before they are stacked.
  • 16. DEMULTIPLEXING  The output of a geophone group during field recording is a continuously varying electrical voltage  A graph of this output signal showing voltage as a continuous function of time is one form of a seismic trace, an analog display  Since seismic data are processed in digital computers, the continuous geophone output voltage must be converted to digital format by sampling
  • 17.  Sampling occurs in the field or in the processing centre in an analog-to-digital converter (A/D)  Consider four geophone groups, whose outputs are brought to four terminals arranged in a circular pattern  The instantaneous voltage at each terminal is recorded each time the rotating arm sweeps over it, yielding an array of samples
  • 18.  If we identify each sample by its geophone group source (A, B, C, D) and by its chronological sequence in that group (1, 2, 3, …)  Then output of A/D converter is A1,B1,C1,D1,A2,B2,C2,D2,A3,B3,C3,D3,……  This scrambled sequence is referred to as a “multiplexed” array
  • 19.  However, it is more convenient to process seismic data in trace sequential array A1,A2,A3,…B1,B2,B3…C1,C2,C3,….,D1,D2,D3,…..  Unscrambling a multiplexed array into a trace sequential array is called “Demultiplexing”  It is accomplished by a simple computer sorting program and is the first step in any data processing sequence
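The unscrambling described above can be sketched in a few lines of NumPy. This is a minimal illustration, not production demultiplexing code: the channel labels, four-group geometry, and three-scan length are all assumed for clarity.

```python
import numpy as np

# Hypothetical multiplexed stream from four geophone groups A-D:
# samples arrive interleaved as A1, B1, C1, D1, A2, B2, ...
multiplexed = np.array(["A1", "B1", "C1", "D1",
                        "A2", "B2", "C2", "D2",
                        "A3", "B3", "C3", "D3"])
n_channels, n_scans = 4, 3

# Demultiplexing is a simple sort: reshape so each row holds one scan
# across the channels, then read down the columns to get each trace.
trace_sequential = multiplexed.reshape(n_scans, n_channels).T
```

Row 0 of `trace_sequential` is then trace A in chronological order, row 1 is trace B, and so on.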
  • 21. Data Reduction Introduction  Data reduction is the first and most important step in the data processing flow.  It includes the following four categories of software programs: i. Demultiplex ii. Display iii. Edit iv. Amplitude adjustment
  • 22. Demultiplex  Demultiplexing in the geophysical sense is the unscrambling of multiplexed field data to trace sequential form.  To the data processor, demultiplex has come to mean a whole family of processing functions that affect or are affected by the specific process of demultiplexing.  Demultiplexing will be explained in the following six major steps. a. Tape Dump and Format Determination b. Preliminary checks c. Binary gain recovery d. Vibroseis correlation e. Summing f. Diversity stacking
  • 23. Tape Dump and Format Determination  A tape dump is a printed listing of the data on tape.  A tape dump could be printed in binary but this would generate a large quantity of paper.  It is much more convenient to convert binary numbers to “hexadecimal” for listing purposes.  Hexadecimal is a base 16 number system using the characters 0,1,2,3,4,5,6,7,8,9, A,B,C,D,E, and F. Four bits are required to express one hexadecimal digit.  Hexadecimal numbers are often written preceded by an “X” to identify them e.g. ‘X16B8C1AE.DC’.  To convert X6AC1B to binary: Hex 6 A C 1 B Binary 0110 1010 1100 0001 1011 i.e. X6AC1B: 01101010110000011011
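The digit-by-digit conversion shown on the slide is easy to reproduce in code; a short sketch (the helper name is ours, not part of any tape-dump utility):

```python
# Each hexadecimal digit encodes exactly four bits, so converting a
# hex dump back to binary is a per-digit table lookup.
def hex_to_binary(hex_digits):
    # format(value, "04b") gives the 4-bit zero-padded binary string
    return "".join(format(int(d, 16), "04b") for d in hex_digits)

binary = hex_to_binary("6AC1B")  # the slide's X6AC1B example
```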
  • 24. Preliminary Checks  The initial tests should be as simple as possible.  Demultiplexing 5-10 records should be adequate for the purpose.  These runs should not contain any option except resample and / or binary gain removal.  In this manner one can see basic data undisturbed by gain curve application or correlation.  Output of auxiliary channels (time break, uphole etc) also are helpful at this stage.  At this time, the processor has a tape format sheet, a hex-dump, a run time printout from the computer and a display of demultiplexed data.  With the help of these data the following field tape errors can be diagnosed and removed.
  • 25. Rate Errors.  A rate error occurs when the multiplexed data is coming too fast from the magnetic tape to the disk.  This normally occurs when the wrong format has been specified or there are too many zeros ahead of the first sync frame. Binary Gain Recovery  Modern digital seismic instruments use various gain ranging techniques to increase their dynamic range.  This is necessary because of the very rapid decay of seismic signal strength with time.
  • 26.  Gain ranging instruments apply a gain to the signal before recording it and also keep a record of what gains were applied. Since the gains applied to the signal are recorded (usually on auxiliary channels) along with the signal, the processing centre can “de-gain” the data, that is, remove the gain applied by the field instruments.  To correct for the rapid decay of signal energy, a gain function may be applied.  A gain function is a smoothly time varying gain applied to the data to bring the overall amplitude up to a fairly constant level.  This is referred to as gain recovery.
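The two steps above — de-gaining and gain recovery — can be sketched as follows. The sample values, gain exponents, and the exponential form of the gain function are all assumed for illustration; real instruments and gain curves differ.

```python
import numpy as np

# Hypothetical gain-ranged samples: the instrument recorded each value
# after applying a binary gain of 2**g, and stored g with the data.
recorded = np.array([0.8, 0.9, 0.6, 0.4])
gain_exponents = np.array([0, 1, 3, 4])       # gains of 1, 2, 8, 16

# "De-gain": divide out the recorded instrument gains to restore the
# true (rapidly decaying) amplitudes.
true_amplitude = recorded / (2.0 ** gain_exponents)

# A smooth, time-varying gain function (an assumed exponential here)
# then lifts the decaying signal back toward a constant level.
t = np.arange(true_amplitude.size)
recovered = true_amplitude * np.exp(0.9 * t)
```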
  • 27. Vibroseis Correlation  Most seismic methods use an energy source that generates a very short pulse.  When these pulses are reflected and recorded they can be used directly to examine subsurface structure.  The signal generated in the vibroseis method is not a short pulse, but rather a “sweep” lasting some seven seconds or longer.  The sweep is sent into the earth by a pad that is pressed against the ground surface and vibrated.  The sweep is transmitted through the earth and reflected as is any seismic signal, but on a vibroseis record, each reflection is as long as the input sweep rather than being a pulse or wavelet as with an impulsive source.
  • 28.  Each reflection is a near duplicate of the sweep itself, and so the reflections in the vibroseis record overlap and are generally indistinguishable.  To make the vibroseis record useable, we must “compress” the reflections into short wavelets.  This is done by cross correlating the data with the original input sweep.  If we were to correlate a sweep with itself, the result would be a wavelet at time zero containing all of the frequencies in the sweep.  This is the autocorrelogram of the sweep and is called a “Klauder wavelet”
  • 29.  Similarly, when a vibroseis trace is correlated with a sweep, the result is that a wavelet is output every time a reflected sweep is encountered.  On the correlated output record, each reflection has been “compressed” to a wavelet.  Thus correlated vibroseis data looks similar to impulsive source data and can be treated in much the same way.  There are some important differences, however, which will be discussed later in this course.
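The correlation step can be demonstrated on a synthetic trace. The sweep parameters (a linear 10-50 Hz sweep over 2 s) and the two reflection times and amplitudes are hypothetical choices, not field values:

```python
import numpy as np

# Assumed linear sweep: 10-50 Hz over 2 s, 2 ms sampling.
dt, T, f0, f1 = 0.002, 2.0, 10.0, 50.0
t = np.arange(0.0, T, dt)
sweep = np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * T)))

# Synthetic uncorrelated record: two overlapping reflected sweeps
# starting at samples 100 and 300, with assumed amplitudes.
trace = np.zeros(1500)
trace[100:100 + sweep.size] += 1.0 * sweep
trace[300:300 + sweep.size] += 0.5 * sweep

# Cross-correlating the trace with the sweep compresses each reflected
# sweep into a Klauder wavelet at its reflection time.
correlated = np.correlate(trace, sweep, mode="valid")
```

On the correlated output the strongest peak sits at sample 100, the first reflection time, with a second peak at sample 300 — exactly the compression described above.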
  • 30. Header Generation  After all of the samples from a given field trace are assembled into an array, a large amount of archival information is placed in a reserved block called a trace header, which is located on the tape just ahead of the data samples.  Trace header information may include location and elevation of source and receiver, field record number, trace number, etc.  A reel header block is also placed at the head of each reel, for recording line number, reel number etc.
  • 31. Display  At the end of the processing sequence, and sometimes at intermediate stages, it is necessary to reconvert the digitized data to analog format so that they can be displayed and then examined visually.  There are several methods of plotting digital data in analog form.  The most common method is to pass the digital trace through a sample and hold device, which produces an output voltage proportional to each sample and which holds the voltage constant until the next sample arrives.  This procedure produces a “Stair step” type of analog output which can then be smoothed into a continuous trace by passage through a low pass filter.
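The sample-and-hold plus low-pass smoothing described above can be imitated numerically. The sample values, hold factor, and moving-average "low-pass filter" are all simplifying assumptions for illustration:

```python
import numpy as np

# Digital trace samples (values hypothetical).
samples = np.array([0.0, 1.0, 0.5, -0.5])

# A sample-and-hold device keeps each voltage constant until the next
# sample arrives; np.repeat mimics the resulting stair-step output.
hold = 4                                  # output points per sample
stair_step = np.repeat(samples, hold)

# A low-pass filter (here a crude moving average) then smooths the
# steps into a continuous-looking analog trace.
smoothed = np.convolve(stair_step, np.ones(hold) / hold, mode="same")
```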
  • 32. Editing  Raw seismic data inevitably contain some unwanted noise and perhaps some dead traces.  If obviously useless information is to be removed from the processing stream, it must first be identified and then blanked or muted by assigning zero values to all samples in the affected time interval.  Unwanted data are usually identified by visual examination of raw field traces, although obvious cases (dead traces, strong noise bursts etc.) can be detected and edited automatically.  The raw field traces used in this editing procedure can be obtained from field monitor records or from one of the trace gathers previously described.
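An automatic edit of the obvious cases — dead traces and strong noise bursts — might look like the sketch below. The gather values and the RMS thresholds are assumed; production editors use more careful criteria.

```python
import numpy as np

# A tiny gather: one normal trace, one dead trace, one noise burst
# (all values hypothetical).
gather = np.array([
    [0.1, -0.2, 0.15, -0.1],   # normal trace
    [0.0,  0.0, 0.0,   0.0],   # dead trace
    [0.1,  9.0, -8.5,  0.2],   # trace with a strong noise burst
])

# Kill traces whose RMS amplitude is zero (dead) or far above the
# gather median (noise burst). The factor of 10 is an assumption.
rms = np.sqrt((gather ** 2).mean(axis=1))
bad = (rms == 0) | (rms > 10 * np.median(rms))
gather[bad] = 0.0              # mute by zeroing all samples
```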
  • 33. Amplitude Adjustment  In a seismic section the variations in the amplitudes of reflections can be important factors in the interpretation  Lateral amplitude variations from trace to trace within a reflection event (bright spot) may be direct indications of the presence of hydrocarbons.  Vertical amplitude variations, from event to event, may be helpful in identifying and correlating reflecting horizons.  To preserve these important amplitude variations, the seismic analyst must exercise care in applying gain recovery and scaling (trace equalization).  In some cases, however, the amplitude variations are so great that low level events become difficult to follow or even invisible  In order to raise the level of these weak events relative to the strong ones so that geologic structure can be made visible, the analyst can apply a digital balance or “AGC” (automatic gain control).
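A minimal AGC sketch, assuming a sliding-window mean-absolute-amplitude scaler (window length and trace values are illustrative; real AGC operators vary in window shape and normalization):

```python
import numpy as np

def agc(trace, window):
    """Automatic gain control: scale each sample by the inverse of the
    mean absolute amplitude in a sliding window around it."""
    out = np.empty_like(trace, dtype=float)
    half = window // 2
    for i in range(trace.size):
        seg = trace[max(0, i - half): i + half + 1]
        level = np.abs(seg).mean()
        out[i] = trace[i] / level if level > 0 else 0.0
    return out

# A strong early event followed by a weak late one (amplitudes assumed).
trace = np.array([0.0, 4.0, -4.0, 0.0, 0.0, 0.1, -0.1, 0.0])
balanced = agc(trace, window=3)
```

After AGC the weak late event is raised to the same level as the strong early one, which is exactly why AGC must be used with care when true amplitudes matter.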
  • 34. Geometric Corrections  A seismic trace on a field monitor shows reflected energy bursts from subsurface rock layer interfaces.  We will later measure the travel times from source down to reflector and back to geophone and use them together with average velocity information (if we have it) to compute depths to the various reflectors.  However, before we use these reflected energy bursts and their travel times, we must apply several corrections to compensate for geometric effects.  The corrections include static corrections and normal moveout corrections applied to the trace gathered data.
  • 35. Trace Gathering  Traces are routinely gathered into groups having some common elements.  The types of gathers usually made are: - Common source point - Common depth point - Common receiver - Common offset  The information needed to generate these various gathers may be contained in the trace headers already inserted by some geometry definition programs.
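Since the midpoint between source and receiver determines the common-depth-point bin, gathering reduces to a sort on a header-derived key. A sketch with invented header values (the station numbers are hypothetical):

```python
from collections import defaultdict

# Hypothetical trace headers: (source station, receiver station).
headers = [(0, 4), (0, 6), (1, 5), (1, 7), (2, 6)]

# Group trace indices by source-receiver midpoint to form CDP gathers;
# the same pattern with other keys gives common-source, common-receiver,
# or common-offset gathers.
cdp_gathers = defaultdict(list)
for trace_index, (src, rcv) in enumerate(headers):
    cdp_gathers[(src + rcv) / 2.0].append(trace_index)
```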
  • 36. Static Corrections  Static corrections are constant for an entire trace. They consist of weathering corrections and elevation corrections.  There is usually a thin layer – from several feet to several hundred feet thick – of low velocity material (1000 to 2500 ft/sec) at the earth’s land surface, underlain by rock of much higher velocity (more than 5000 ft/sec).  Geophysicists call this low velocity zone the weathered layer.  Geologists use the same term to describe a generally thinner surface zone in which the elements tend to decompose rock into soil.  The thickness and velocity of the weathered layer can change along a seismic line, thereby causing changes in raw reflection time which are unrelated to subsurface structure.
  • 37.  This effect must be removed by applying a weathering correction.  Since seismic sources and receivers are usually at or near the earth’s surface, raw reflection times are influenced by topographic effects, which are also independent of subsurface structure.  These effects are removed by elevation corrections.
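A simple elevation-static computation for one trace might look like this. The datum, elevations, and replacement velocity are all hypothetical values chosen for the arithmetic:

```python
# Elevation static: remove the travel time between the actual station
# elevations and a flat datum, using an assumed replacement velocity.
v_replacement = 2000.0        # m/s (assumed near-surface velocity)
datum = 100.0                 # m, the chosen reference elevation
elev_source, elev_receiver = 130.0, 142.0   # m (hypothetical)

# One-way time from each station down to the datum, summed for the
# two-way raypath; this constant is subtracted from the whole trace.
static_shift = ((elev_source - datum) + (elev_receiver - datum)) / v_replacement
```

Here the shift works out to 72 m / 2000 m/s = 0.036 s, applied identically to every sample — which is why such corrections are called static.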
  • 38. Dynamic Correction (NMO)  In general, shot points and their associated geophone stations are separated by distances ranging up to three kilometers or even more.  The slant, or non-vertical, component of travel time can be large and must be removed before we can attempt to relate reflection time to reflection depth.  For a flat reflector, the difference between travel time to a remote geophone station and travel time to a coincident (zero offset) station is called the normal moveout (NMO).  The purpose of this process is to derive normal moveout functions from the velocity-time functions and apply normal moveout corrections to each trace in each CDP gather.  Normal moveout is a function of reflection time and trace offset (source to detector distance) and, thus, a normal moveout time function must be derived for each trace offset.
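For a flat reflector the moveout follows the standard hyperbolic travel-time relation t(x) = sqrt(t0² + x²/v²). A numerical sketch with assumed velocity, zero-offset time, and offsets:

```python
import numpy as np

# Hypothetical values: 2500 m/s velocity, 1.0 s zero-offset time.
v = 2500.0
t0 = 1.0
offsets = np.array([0.0, 1000.0, 2000.0, 3000.0])   # metres

# Hyperbolic travel time for a flat reflector, t(x) = sqrt(t0^2 + x^2/v^2);
# the NMO correction is the extra slant-path time at each offset.
t_x = np.sqrt(t0 ** 2 + (offsets / v) ** 2)
nmo = t_x - t0      # subtracted from each trace to flatten the event
```

The correction is zero at zero offset and grows with offset; it also shrinks for deeper (larger t0) reflections at the same offset, which is why it is a dynamic, time-varying correction.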
  • 39. Data analysis and parameter optimization Filtering  A filter is a system which discriminates against some of its input.  Seismic data always contain some signal information, which we want to preserve.  Everything else is called noise, and we want to remove or reduce it.  In a sense, almost all of the processes we use are filters of a sort, because they are designed to preserve signals and attenuate noise.  Those systems which are generally called filters work either by convolution in the time domain or by spectral shaping in the frequency domain.
  • 40.  The most common types of filters are the following: - Low pass frequency filters - High pass frequency filters - Band pass frequency filters - Notch frequency filters (attenuate a narrow frequency band) - Deconvolution filters (shorten pulses and/or suppress reverberations) - Velocity filters (multi-channel filters which attenuate seismic events having a specified range of apparent velocities)
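A band-pass filter by spectral shaping can be sketched with NumPy's FFT routines. This brick-wall version (zeroing components outside the band) is a deliberate simplification; practical filters taper the band edges to avoid ringing. The signal frequencies and corner frequencies are assumed:

```python
import numpy as np

def band_pass(trace, dt, low, high):
    """Zero-phase band-pass by spectral shaping: zero every Fourier
    component outside [low, high] Hz (a brick-wall sketch)."""
    spectrum = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(trace.size, d=dt)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=trace.size)

# A 10 Hz "signal" plus 60 Hz "noise" (frequencies chosen for the demo).
dt = 0.002
t = np.arange(0.0, 1.0, dt)
trace = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)
filtered = band_pass(trace, dt, low=5.0, high=20.0)   # keeps only 10 Hz
```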
  • 41. Deconvolution  Deconvolution is a filtering process designed to improve resolution and suppress multiple reflections.  Deconvolution can be considered either in the time domain or in the frequency domain.  In the time domain the object is to convert each wavelet, with its reverberations and multiples, into a single spike.  If we know the shape of the wavelet, we can design an operator which, when convolved with the seismic trace, will convert each wavelet into a single spike.
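Designing such a spiking operator for a known wavelet can be posed as a least-squares problem. The wavelet and operator length below are assumed for illustration; production deconvolution typically estimates the wavelet statistically rather than knowing it:

```python
import numpy as np

# Assumed known minimum-phase wavelet with a reverberating tail.
wavelet = np.array([1.0, -0.5, 0.25, -0.125])

n = 8   # operator length (assumed)
# Convolution matrix C such that C @ f equals wavelet convolved with f.
C = np.zeros((wavelet.size + n - 1, n))
for j in range(n):
    C[j:j + wavelet.size, j] = wavelet

# Desired output: a single spike at time zero.
desired = np.zeros(wavelet.size + n - 1)
desired[0] = 1.0

# Least-squares inverse operator, then check it spikes the wavelet.
f, *_ = np.linalg.lstsq(C, desired, rcond=None)
spiked = np.convolve(wavelet, f)
```

Convolving `f` with a seismic trace would then compress every occurrence of this wavelet toward a spike, which is the time-domain goal stated above.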
  • 42. Seismic Velocities  In this process velocity-time functions are derived from reflections by a scanning technique; velocity “spectra” are produced and plotted for interpretation by the geophysicist.  Supplemental data, usually from well velocity surveys, also may be used to increase the accuracy of the functions.  This analysis is performed continuously such that velocity-time functions are derived at intervals of, generally, one kilometer along the seismic line.
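The scanning idea can be demonstrated with a toy constant-velocity scan: apply NMO at each trial velocity and measure stacked energy; the velocity that best flattens the event maximizes the stack power. All parameters (sampling, offsets, true velocity, event time) are hypothetical, and real velocity analysis uses semblance rather than raw stack power:

```python
import numpy as np

dt, nt = 0.004, 250
t = np.arange(nt) * dt
offsets = np.array([0.0, 500.0, 1000.0, 1500.0])   # metres
v_true, t0 = 2000.0, 0.4                           # assumed model

# Synthetic gather: a spike on each trace at the hyperbolic time.
gather = np.zeros((offsets.size, nt))
for i, x in enumerate(offsets):
    gather[i, int(round(np.sqrt(t0**2 + (x / v_true)**2) / dt))] = 1.0

def stack_power(v):
    stack = np.zeros(nt)
    for i, x in enumerate(offsets):
        # NMO at trial velocity v: read each trace at t(x) = sqrt(t^2 + (x/v)^2)
        stack += np.interp(np.sqrt(t**2 + (x / v)**2), t, gather[i], right=0.0)
    return np.sum(stack**2)

velocities = np.arange(1500.0, 2600.0, 100.0)
best_v = velocities[int(np.argmax([stack_power(v) for v in velocities]))]
```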
  • 43. Processing Sequence/ Flow Chart  Successful processing requires selecting the appropriate programs and parameters for a given set of data.  Several diagnostic programs, e.g. velocity analysis, DCON analysis and spectral analysis, can be used to reveal details which help in the choice of data enhancement and refinement techniques.  The finalization of this set of programs to process a specific project or group of seismic profiles is managed in a flow chart or processing sequence.
  • 44. Data Refinement  The processes described thus far are used to reformat, correct and diagnose data characteristics.  This category includes various procedures intended for data refinement. Stacking  After static and NMO corrections have been applied, primary reflections should appear as horizontal events across the gather, and a normalized sum of the traces should then form one composite trace in which reflections are preserved at full strength, while all other energy is reduced.  This procedure is known as stacking or compositing  Stacked data are also referred to as CDP or CRP (common depth point or common reflection point) data.
  • 45.  Some geophysicists prefer the term CMP (common midpoint), in recognition of the fact that the reflection points of the gathered traces do not coincide unless all reflectors are flat.  Stacking came into general use in the early 1960s.  Its original purpose was to attenuate multiple reflections, which are echoes of marine water bottom reflections or of strong reflections from subsurface layers.  It has also proved useful in suppressing other kinds of noise, both organized and random.
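The normalized sum itself is the simplest step in the whole sequence. A sketch with an invented four-fold gather (the noise values are contrived so the cancellation is exact; in practice random noise is only reduced by roughly the square root of the fold):

```python
import numpy as np

# NMO-corrected CDP gather: the reflection is aligned across all four
# traces; the additive noise values are hypothetical.
signal = np.array([0.0, 1.0, -1.0, 0.0])
noise = np.array([
    [ 0.3, -0.2,  0.1, -0.4],
    [-0.1,  0.4, -0.3,  0.2],
    [ 0.2, -0.3,  0.4, -0.1],
    [-0.4,  0.1, -0.2,  0.3],
])
gather = signal + noise

# Stacking: a normalized sum (mean) across the gather. The aligned
# reflection adds coherently while the misaligned noise averages down.
stacked = gather.mean(axis=0)
```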
  • 46. Residual Statics  Routine static corrections, mentioned earlier, are often not adequate to correct satisfactorily for near surface effects. Migration.  A trace on a record section can be thought of as an idealized field trace recorded by one geophone station from a shot at the same station.  Note that from one trace it is generally not possible to distinguish signal from noise.  When a trace is plotted vertically beneath its common shot/receiver station, a reflected event is identified by its alignment across an array of traces, and the reflection time is measured under each station.  If the reflector is flat, the reflection point will be located directly beneath the shot/receiver station, and the record section displays the event in its true position plotted in time rather than depth.
  • 47.  The process of re-plotting seismic events in their true spatial positions is called migration, and there are several methods of implementing this process. Data Presentation And Storage  It involves the labeling of the processing sequence, field information, velocity data and other useful parameters on the final section.  Moreover, the storage of essential seismic data for future use is also an important data processing concern.