Some of these post-stack processing steps can be applied as preconditioning to the near-, mid- and far-angle stacks used in simultaneous impedance inversion. While coherent noise is usually handled during processing of seismic data, mean and median filters are commonly used for random-noise suppression on poststack seismic data, but they tend to smear the discontinuities in the data. In figures 4 and 5 we show a similar comparison of P-impedance and VP/VS sections using the proposed workflow and the conventional one. There are three primary steps in processing seismic data: deconvolution, stacking, and migration, in their usual order of application. This basic sequence is now described to gain an overall understanding of each step. The preprocessing steps are demultiplexing, data loading, preparation and use of the single-trace and brute-stack sections, definition of the survey geometry, band-pass and time-varying filtering, different types of gain recovery, editing of bad traces, top and surgical muting, and f-k dip filtering. The result of stacking is a stacked section. Figure 1.5-1 represents the seismic data volume in processing coordinates: midpoint, offset, and time. Quite often it is observed that the P-reflectivity or S-reflectivity data extracted from AVO analysis appear noisier than the final migrated data obtained with the conventional processing stream, which might consist of processes that are not all amplitude-friendly. Attribute computation on such preconditioned seismic data is seen to yield promising results, and thus a more reliable interpretation.
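The smearing behavior mentioned above is easy to demonstrate. Below is a minimal numpy sketch (not from the paper; `sliding_filter` is a hypothetical helper) showing that a moving average blurs a sharp discontinuity while a moving median leaves it intact:

```python
import numpy as np

def sliding_filter(trace, half_width, reducer):
    """Apply a centered sliding-window statistic (edges use a shrunken window)."""
    out = np.empty_like(trace, dtype=float)
    for i in range(len(trace)):
        lo, hi = max(0, i - half_width), min(len(trace), i + half_width + 1)
        out[i] = reducer(trace[lo:hi])
    return out

# Synthetic amplitude profile with a sharp discontinuity (e.g., across a fault).
edge = np.where(np.arange(64) < 32, 1.0, -1.0)

mean_out = sliding_filter(edge, 2, np.mean)      # smears the edge over the window
median_out = sliding_filter(edge, 2, np.median)  # preserves the step exactly
```

This is why edge-preserving (median-based or structure-oriented) filters are preferred near faults and channel edges.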
For prestack data analysis, such as extraction of amplitude-versus-offset (AVO) attributes (intercept/gradient analysis) or simultaneous impedance inversion, the input seismic data must be preconditioned in an amplitude-preserving manner. This paper reports only the basic processing aspects of reflection seismic methods; the advanced processing aspects will be discussed separately in another paper. Processing steps typically include analysis of velocities and frequencies, static corrections, deconvolution, normal moveout, dip moveout, stacking, and migration, some of which can be performed before or after stacking. Deconvolution acts along the time axis: it removes the basic seismic wavelet (the source time function modified by various effects of the earth and recording system) from the recorded seismic trace and thereby increases temporal resolution. Usually, event focusing and reduced background noise after structure-oriented filtering are clearly evident. Migration is a process that collapses diffractions and maps dipping events on a stacked section to their supposedly true subsurface locations. Four angle stacks were created for a seismic data volume from the Delaware Basin by dividing the complete angle-of-incidence range from 0 to 32 degrees into a near-angle stack (0-8 degrees), a mid1-angle stack (8-16 degrees), a mid2-angle stack (16-24 degrees), and a far-angle stack (24-32 degrees). [Figure: a seismic trace with its phase and amplitude spectra before (in red, Q-compensated data) and after (in blue, Q-compensated data plus zero-phase deconvolution) zero-phase deconvolution.]
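As a toy illustration of the wavelet removal described above, here is a least-squares (Wiener) spiking-deconvolution sketch in numpy, assuming a known minimum-phase wavelet. `spiking_decon_filter`, the filter length, and the prewhitening value are illustrative choices, not the paper's implementation:

```python
import numpy as np

def spiking_decon_filter(wavelet, n, prewhiten=0.001):
    """Least-squares inverse (spiking) filter: solve R a = g, where R is the
    Toeplitz matrix of the wavelet's autocorrelation and g is the
    cross-correlation of the desired zero-lag spike with the wavelet."""
    ac = np.correlate(wavelet, wavelet, mode="full")[len(wavelet) - 1:]
    r = np.zeros(n)
    r[:min(n, len(ac))] = ac[:n]
    R = r[np.abs(np.subtract.outer(np.arange(n), np.arange(n)))]
    R[np.arange(n), np.arange(n)] *= (1.0 + prewhiten)  # stabilizing prewhitening
    g = np.zeros(n)
    g[0] = wavelet[0]  # desired output: spike at zero lag
    return np.linalg.solve(R, g)

wavelet = np.array([1.0, -0.5, 0.25])   # toy minimum-phase wavelet (assumed known)
f = spiking_decon_filter(wavelet, 32)
spiked = np.convolve(wavelet, f)        # should approximate a zero-lag spike
```

In practice the wavelet is unknown, so the autocorrelation is estimated from the data under the white-reflectivity assumption.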
Then we will discuss the main basic steps of a processing sequence, commonly used to obtain a seismic image and common to seismic data gathered on land (onshore) as well as at sea (offshore): CMP sorting, velocity analysis and NMO correction, stacking, (zero-offset) migration and time-to-depth conversion. Stacking assumes hyperbolic moveout, while migration is based on a zero-offset (primaries only) wavefield assumption. Prestack seismic data denoising is an important step in seismic processing, particularly with the development of prestack time migration. This observation suggests exploring whether one or more poststack processing steps could be used for preconditioning of prestack seismic data prior to putting it through, for example, simultaneous impedance inversion. The steps can be grouped by function so that the basic processing flow can be illustrated simply. We shall use a 2-D seismic line from the Caspian Sea to demonstrate the basic processing sequence. Proper quality checks need to be run at each individual step to ensure that no amplitude distortions take place at any stage of the preconditioning sequence. Application of multiband CDP-consistent scaling tends to balance the frequency content and amplitudes laterally: scalars are computed on the individual frequency bands, and the scaled bands are summed back to give the final data. Notice that the near- and far-angle stacks are subjected to many of the processing steps mentioned above, and a comparison is shown with the conventional processing application. In conclusion, the post-stack processing steps applied to prestack migrated stacked data yield volumes that exhibit better quality in terms of reflection strength, signal-to-noise ratio and frequency content, compared with data passed through true-amplitude processing alone.
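The band-splitting-and-scaling idea can be sketched as follows. This is a toy, single-trace version (the actual method computes CDP-consistent scalars across a volume); `multiband_balance` and the band edges are hypothetical:

```python
import numpy as np

def multiband_balance(trace, dt, band_edges, target_rms=1.0):
    """Split a trace into frequency bands with FFT masks, scale each band to a
    common RMS, and sum the scaled bands back (toy multiband scaling)."""
    n = len(trace)
    freqs = np.fft.rfftfreq(n, dt)
    spec = np.fft.rfft(trace)
    out = np.zeros(n)
    for f_lo, f_hi in band_edges:
        mask = ((freqs >= f_lo) & (freqs < f_hi)).astype(float)
        band = np.fft.irfft(spec * mask, n)
        rms = np.sqrt(np.mean(band ** 2))
        if rms > 0:
            out += band * (target_rms / rms)  # scalar from the band's RMS amplitude
    return out
```

A production implementation would derive the scalars consistently across CDPs rather than per trace, so that lateral amplitude variations are balanced, not erased.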
Sometimes, due to near-surface conditions, spatial variations in amplitude and frequency are seen in different parts of the same inline, or from one inline to another in the same 3-D seismic volume. Wide band-pass filtering also may be needed to remove very low- and high-frequency noise. Seismic data processing is a series of steps that produce images of the Earth's interior in terms of variations in seismic velocity and density. To ensure that these processing steps have preserved true-amplitude information, gradient analysis was carried out on various reflection events selected at random from the near-, mid1-, mid2- and far-angle stack traces, and one such comparison is shown in figure 3. Similarly, seismic attributes generated on noise-contaminated data are compromised in quality, and hence so is their interpretation. In the Delaware Basin, above the Bone Spring Formation (which is very prolific and the most-drilled zone these days) is a thick column of siliciclastics comprising the Brushy Canyon, Cherry Canyon and Bell Canyon formations. These members are in turn overlain by evaporites and thin red beds comprising the Castile (anhydrite), Salado (halite), Rustler (dolomite) and Dewey Lake (continental red bed) formations. Deconvolution achieves its goal by compressing the wavelet. The water depth at one end of the Caspian Sea line is approximately 750 m and decreases along the line traverse to approximately 200 m at the other end. Seismic data processing steps are naturally useful for separating signal from noise, so they offer familiar, exploitable organizations of data.
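Gradient analysis of the kind described above can be sketched with a two-term Shuey-type fit, R(theta) = A + B*sin^2(theta), across amplitude picks from the angle stacks. A minimal least-squares version (the angle values and amplitudes below are synthetic stand-ins, not the paper's data):

```python
import numpy as np

def avo_intercept_gradient(amplitudes, angles_deg):
    """Fit the two-term approximation R(theta) = A + B*sin^2(theta)
    by least squares to amplitudes picked on angle stacks."""
    s2 = np.sin(np.radians(angles_deg)) ** 2
    G = np.column_stack([np.ones_like(s2), s2])
    (A, B), *_ = np.linalg.lstsq(G, np.asarray(amplitudes, float), rcond=None)
    return A, B

# Hypothetical center angles of four angle stacks like those described (0-32 deg).
angles = np.array([4.0, 12.0, 20.0, 28.0])
true_A, true_B = 0.08, -0.20
picks = true_A + true_B * np.sin(np.radians(angles)) ** 2  # noise-free synthetic picks
A, B = avo_intercept_gradient(picks, angles)
```

Comparing A and B estimated before and after preconditioning on the same events is one way to check that relative amplitudes have been preserved.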
Only minimal processing would be required if we had a perfect acquisition system. The objective of reflection seismic processing is to transform redundant reflection seismic records in the time domain into an interpretable depth image (Simple seismic processing workflow, by Ali Ismael AlBaklishy, Geophysics Department, School of Sciences, Cairo University). The next three sections are devoted to the three principal processes: deconvolution, CMP stacking, and migration. Q-compensation is a process adopted to correct for the inelastic attenuation of the seismic wavefield in the subsurface; an amplitude-only Q-compensation is usually applied. Subtle features might not be seen clearly in the presence of noise. The processing sequence designed to achieve the interpretable image will likely consist of several individual steps. Deconvolution often improves temporal resolution by collapsing the seismic wavelet to approximately a spike and suppressing reverberations on some field data (Figure I-7). SEISGAMA's development is divided into several sections: basic data processing, intermediate data processing, and advanced processing. Seismic data processing involves the compilation, organization, and conversion of wave signals into a visual map of the areas below the surface of the earth. Of the many processes applied to seismic data, seismic migration is the one most directly associated with the notion of imaging.
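An amplitude-only Q-compensation for a single analysis time can be sketched in the frequency domain: each frequency is boosted by exp(pi*f*t/Q), with a gain cap so that high-frequency noise is not amplified without limit. A toy numpy version (real implementations apply this time-variantly, e.g., in overlapping windows; the function name and cap value are illustrative):

```python
import numpy as np

def q_compensate(trace, dt, t_ref, q, max_gain_db=40.0):
    """Amplitude-only Q compensation for a window centered at traveltime t_ref:
    boost each frequency by exp(pi*f*t_ref/Q), capped to limit noise blowup."""
    spec = np.fft.rfft(trace)
    f = np.fft.rfftfreq(len(trace), dt)
    gain = np.exp(np.pi * f * t_ref / q)
    gain = np.minimum(gain, 10 ** (max_gain_db / 20.0))  # cap the gain
    return np.fft.irfft(spec * gain, len(trace))
```

Because only amplitudes are corrected, the phase distortion caused by attenuation is left for a separate (e.g., zero-phasing) step.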
Application-specific seismic data conditioning and processing support confident imaging: from the field to the final volume, seismic data go through many processes and workflows. A reversible transform for seismic data processing therefore offers a useful set of quantitatively valid domains in which to work. Finally, migration commonly is applied to stacked data. Deconvolution, stacking, and migration are applied in that order because these three processes are robust and their performance is not very sensitive to the underlying assumptions in their theoretical development. Such a workflow can be more effective than a singular FX deconvolution process. Common procedures to streamline seismic data processing include working with data files, such as SEG-Y files, that are too large to fit in system memory. Having an ergonomic and reliable package of seismic processing tools available is a technical plus point, either in the field with QC tools or back at the office with the full variety of processing steps. The ground movements recorded by seismic sensors (such as geophones and seismometers onshore, or hydrophones and ocean-bottom seismometers offshore) contain information on the media's response to the seismic source. Beginning with attenuation of random noise using FX deconvolution, the seismic signals in the frequency-offset domain are represented as complex sinusoids in the X-direction and are predictable. (The terms stacked section, CMP stack, and stack often are used synonymously.)
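The predictability argument behind FX deconvolution can be shown on a single frequency slice: a linear event is a complex sinusoid along the trace axis and is perfectly predicted by a one-lag filter, while random noise is not. A minimal sketch (production FX decon uses longer prediction filters and overlapping spatial windows; the function name is hypothetical):

```python
import numpy as np

def fx_onestep_prediction(d):
    """One-lag complex prediction filter for a single frequency slice d(x).
    Fits a so that d[x+1] ~ a * d[x] in the least-squares sense."""
    num = np.vdot(d[:-1], d[1:])           # sum of conj(d[x]) * d[x+1]
    den = np.vdot(d[:-1], d[:-1]).real
    a = num / den
    predicted = np.empty_like(d)
    predicted[0] = d[0]
    predicted[1:] = a * d[:-1]
    return predicted

# Frequency slice of a dipping (linear) event across 30 traces, plus random noise:
x = np.arange(30)
signal = np.exp(1j * 0.4 * x)
rng = np.random.default_rng(0)
noisy = signal + 0.3 * (rng.standard_normal(30) + 1j * rng.standard_normal(30))
pred = fx_onestep_prediction(noisy)   # prediction retains the coherent part
```

For the pure sinusoid the fitted coefficient is exactly exp(0.4j), so the prediction reproduces the event; the unpredictable noise component is attenuated in the predicted output.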
We have illustrated the application of such a workflow by way of data examples from the Delaware Basin, and the results look very convincing in terms of the value addition seen on the P-impedance and VP/VS data. The values of the inelastic attenuation are quantified in terms of the quality factor, Q, which can be determined from the seismic data or VSP data. Before deconvolution, correction for geometric spreading is necessary to compensate for the loss of amplitude caused by wavefront divergence. The third step is a 90-degree phase rotation. In such cases, newer and fresher ideas need to be implemented to enhance the signal-to-noise ratio of the prestack seismic data before they are put through the subsequent attribute analysis. In such a process, the stacked seismic data are decomposed into two or more frequency bands, and the scalars are computed from the RMS amplitudes of each of the individual frequency bands of the stacked data. Since the introduction of digital recording, a routine sequence in seismic data processing has evolved. Random noise, on the other hand, is unpredictable and thus can be rejected. Seismic data processing can be characterized by the application of a sequence of processes, where for each of these processes there are a number of different approaches.
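The divergence correction mentioned above can be sketched as a simple time-variant gain proportional to t*v_rms(t)^2, following the common approximation that point-source amplitudes decay roughly as 1/(t*v_rms^2) in layered media. A toy numpy version (the normalization choice and function name are illustrative):

```python
import numpy as np

def spreading_correction(trace, dt, v_rms):
    """Geometric-spreading (divergence) correction: multiply each sample by
    t * v_rms(t)^2, normalized to unity at the first sample."""
    t = dt * (1 + np.arange(len(trace)))   # start at dt to avoid t = 0
    gain = t * v_rms ** 2
    gain /= gain[0]
    return trace * gain
```

Applying this gain before deconvolution restores the amplitude decay so that the decon operator sees a roughly stationary trace.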
This step is usually followed by bandpass filtering, applied to remove unwanted frequencies that might have been generated in the deconvolution application. More recently, however, it has been found that such procedures might not be enough for data acquired over unconventional resource plays or subsalt reservoirs. Adaptive deghosting at the start of a processing workflow results in a simpler deghosted wavelet that improves spatial resolution. Other processes may be considered secondary in that they help improve the effectiveness of the primary ones. When applied to field data, these techniques do provide results that are close to the true subsurface image.
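The post-deconvolution bandpass step can be sketched with a zero-phase FFT mask. This is a brick-wall toy (production filters taper the passband corners to avoid ringing; the function name is hypothetical):

```python
import numpy as np

def bandpass(trace, dt, f_lo, f_hi):
    """Zero-phase band-pass via a brick-wall FFT mask: keep only the
    frequency bins between f_lo and f_hi (inclusive)."""
    n = len(trace)
    f = np.fft.rfftfreq(n, dt)
    mask = ((f >= f_lo) & (f <= f_hi)).astype(float)
    return np.fft.irfft(np.fft.rfft(trace) * mask, n)
```

Because the mask is applied symmetrically in the frequency domain, the filter introduces no phase shift, which matters when the data are to feed amplitude-sensitive inversion.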
Seismic processing facilitates better interpretation because subsurface structures and reflection geometries are more apparent on processed data. The success of AVO attribute extraction or simultaneous impedance inversion depends on how well the preconditioning of the input seismic data has been done. Poststack seismic data are usually contaminated with two common types of noise, coherent and random, and this noise, if not tackled appropriately, prevents accurate imaging.
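The benefit of stacking against random noise is easy to demonstrate: averaging N traces carrying identical signal and independent noise improves the signal-to-noise ratio by roughly sqrt(N). A minimal sketch with synthetic traces (the trace counts and noise level are arbitrary):

```python
import numpy as np

def stack(gather):
    """CMP stack: average the (NMO-corrected) traces of a gather. Coherent
    signal adds linearly while independent noise grows only as sqrt(N)."""
    return gather.mean(axis=0)

rng = np.random.default_rng(42)
signal = np.sin(2 * np.pi * np.arange(200) / 50.0)          # common signal
gather = signal + 1.0 * rng.standard_normal((24, 200))      # 24 noisy copies
stacked = stack(gather)   # residual noise std ~ 1/sqrt(24) of a single trace
```

This sqrt(N) improvement is the main reason stacking is one of the three robust primary processes.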