Sunday, June 2, 2019
The Seismic Exploration Survey Information Technology Essay
Seismic surveys aim at measuring the earth's geological properties employing various physics principles of electric, gravitational, thermal and elastic theories. The method was first employed successfully in Texas and Mexico by a company named Seismos in 1924. Since then, many oil companies have applied the services of seismology to forecast the presence of hydrocarbons. Major oil companies have actively researched seismic technology, and it has also found applications in various other research by scientists around the world.

Seismic exploration surveys are a method employed in exploration geophysics that uses the principles of reflection seismology to estimate subsurface properties. The method requires a controlled source of energy that can generate seismic waves, and highly sensitive receivers that can sense the reflected seismic waves. The time delay between sending and receiving signals can then be used to calculate the depth of the formation. Since different formation layers have different densities, they reflect seismic waves back at different velocities. This can be used to estimate the depth of the target formation, usually shale or other rock formations that can form a cap rock or contain oil. Seismic surveys form a part of the preliminary exploration surveys and form the basis for further study of the area under consideration.

Seismic waves are a form of elastic waves. When these waves travel through a medium, they encounter impedance. The impedance of two layers will differ due to their density contrast, and thus at boundaries some waves are reflected while others travel on through the formation. For this reason, seismic exploration surveys require optimum energy waves which can penetrate kilometers deep inside the earth to gather information.
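The travel-time idea above can be sketched in a few lines. This is a minimal illustration, assuming a single flat reflector and a constant velocity (both simplifications; real surveys deal with many layers and varying velocities):

```python
# Sketch, not field code: assumes one flat reflector and constant velocity.
def reflector_depth(velocity_m_per_s, two_way_time_s):
    """The wave travels down to the reflector and back up, so the
    depth is the velocity times half the two-way travel time."""
    return velocity_m_per_s * two_way_time_s / 2.0

# Example: a reflection arriving 2 s after the shot, through 3000 m/s rock
depth = reflector_depth(3000.0, 2.0)  # 3000.0 m
```

The factor of one half is the essential point: the recorded delay covers the round trip, not the one-way path.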
Hundreds of channels of information are recorded using multiple transmitters and receivers spread over thousands of meters. Each seismic survey uses a specific type of wave and its arrival pattern in a multichannel record. Seismic waves are categorized as:

Body waves: P-waves, S-waves
Surface waves: Rayleigh waves, Love waves

For a seismic survey, the S-wave or shear wave is the main concern.

Seismic waves can be generated by Vibroseis, which employs heavy vibrating weights on the surface to generate seismic waves in the subsurface. Alternatively, explosives can be used, buried a few meters below the surface; the explosion generates the seismic waves. In marine acquisition, streamers are used to gather data, often employing coil shooting.

Seismic acquisition has evolved over time, and with better technologies in place the reliability of seismic surveys has been increasing. The 4-D seismic technology, the newest addition, is based upon time-varying solutions to the data gathered. The better the acquisition, the better the resulting interpretation.

The seismic acquisition technique applied depends on where the survey is being carried out. Surveys have effectively been carried out on land, at sea and in transition zones. The various techniques applied are:

2-D Seismic Survey: employs seismic maps based on time and depth. Various groups of seismic lines are acquired with significant gaps between adjacent lines.
3-D Seismic Survey: a cubical arrangement of different slices that is assembled using computer algorithms and can be viewed in software. For a 3-D survey, different surveys are carried out at closely spaced line locations over the area, which can be combined to form a cube.
4-D Seismic Survey: a relatively new technology, which is an extension of the 3-D survey. It takes into account the changes happening in the subsurface strata over the production years.
Thus it takes time into account as the fourth dimension. This can be very useful when determining well locations in field development.

Processing of seismic data is the most important aspect since it determines the potential of the interpretation work. Processing has primarily been done through various analyses, mostly mathematical functions fed into computers. A major part of processing is done simultaneously with acquisition. The data collected can be demultiplexed, convolved or deconvolved. This is dealt with further in the project.

Seismic data processing uses the concepts of geometrical analysis and the powerful techniques of Fourier analysis. Digital filtering theory, and the practical application of digital techniques to enhance images of subsurface geology, can be applied to virtually any information sampled in time. The basic aspects of processing are to recognize and remove noise from the signal, correct the Normal Move Out (NMO), and stack the data to form a seismic image that can be used for further study.

Interpretation follows acquisition and processing of data. The structural interpretation of seismic images determines all decisions in hydrocarbon exploration and production. Since drilling an exploration well proves costly, maximum information is derived from the seismic data to establish an opinion about the probability of finding petroleum in the structures. However, drilling is required to verify whether the structures are petroleum bearing or not. Thus the main challenge is to establish a model which includes geologically reasonable solutions.

Computer-aided seismic interpretation has attracted much interest in recent years. The use of specialized and highly sophisticated software has been recommended by various petroleum organizations, as it can provide high reliability.
However, automating the whole seismic process is a nearly impossible task due to high heterogeneity and varying contrasts between data sources in different parts of the world. Horizon tracking and autopicking are gaining interest among various researchers and developers, but this has not yet been achieved successfully.

This project aims to study the various problems faced in horizon tracking while trying to execute an automated seismic interpretation process. Horizon tracking is basically carried out through autotrackers which are either feature based or correlation based. Feature-based tracking looks for similar configurations, while the correlation method is more robust and less sensitive to noise. However, tracking across discontinuities is a difficult job. Thus the project is aimed at finding a way to track horizons across fault lines.

CHAPTER 2 LITERATURE REVIEW

SEISMIC EXPLORATION SURVEY

Seismic exploration surveys in the field of oil and gas are an application of reflection seismology. It is a method to estimate the properties of the earth's subsurface from reflected seismic waves. When a seismic wave travels through the rock it encounters impedance. A wave travels through materials under the influence of pressure. Because the molecules of the rock material are bound elastically to one another, the excess pressure results in a wave propagating through the solid.

A seismic survey can reveal pockets of lower density material and their location.
However, this cannot guarantee that oil will be found in these pockets, since the presence of water is also possible.

Acoustic impedance is given by

Z = pV,

where p is the density of the material and V is the acoustic velocity of the wave.

Acoustic impedance is important in:
- determining acoustic transmission and reflection at the boundary of two materials having different acoustic impedances;
- the design of ultrasonic transducers;
- assessing absorption of sound in a medium.

Thus the acoustic impedance of each rock formation in the subsurface will be different due to their different densities. This density contrast is helpful in tracking the waves in the subsurface, and an acoustic impedance chart is obtained which is known as a seismic chart. However, the impedances recorded by the instruments on the surface are not exact due to noise and other factors that change the impedance of the wave.

When a seismic wave is reflected off a boundary between two materials with different impedances, some energy is reflected while some continues through the boundary. The amplitude of the reflected wave can be predicted by multiplying the amplitude of the incoming wave by the seismic reflection coefficient,

R = (Z1 - Z0) / (Z1 + Z0),

where Z1 and Z0 are the impedances of the two rock formations. Similarly, the amplitude of the wave travelling on through the formation can be determined using the transmission coefficient,

T = 2 Z1 / (Z1 + Z0),

where Z1 and Z0 are again the impedances of the two rock formations.

By noting the changes in strength of the wave, we can infer the change in acoustic impedances and thus deduce the change in density and elastic modulus. This change can be used to predict the structural changes in the subsurface and thus infer the formation based upon impedances.

It might also happen that when the seismic wave hits the boundary between two surfaces it will be reflected or bent.
This is given by Snell's Law.

The reflection and transmission coefficients are found by applying the appropriate boundary conditions and using the Zoeppritz equations. These are a set of equations which determine the partitioning of energy in a wavefield at a boundary across which the properties of the rock or fluid change. They relate the amplitudes of P-waves and S-waves on each side of the interface.

The Zoeppritz equations have been useful in deriving workable approximations in Amplitude Versus Offset (AVO) analysis. These studies attempt, with some success, to predict the fluid content of rock formations.

The parameters to be used for each seismic survey depend on various variables, including whether the survey is being carried out on land or in a marine environment. Other geophysical issues such as sea depth and terrain also play a big role, as do safety considerations.

A Seismic Exploration Survey is broadly divided into three steps:

Seismic Data Acquisition
Seismic Data Processing
Seismic Data Interpretation

Each step in the survey needs high reliability and sophisticated equipment that can deliver the best results. The drilling of exploration wells is often based on these results, and since drilling can prove costly, capital investment is one of the major concerns of every company.

SEISMIC DATA ACQUISITION

Seismic data acquisition refers to the collection of seismic data. The acquired data is then sent to a computer network where processing takes place. With better technologies, the prospect of better acquisition surveys has come into place.
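Before moving on to acquisition, the relations above (normal-incidence reflection and transmission coefficients, and Snell's law) can be sketched together. This is a hypothetical illustration; the layer densities and velocities below are made up for the example:

```python
import math

def reflection_coefficient(z0, z1):
    """Normal-incidence reflection coefficient R = (Z1 - Z0) / (Z1 + Z0),
    where z0 is the impedance of the layer the wave arrives from."""
    return (z1 - z0) / (z1 + z0)

def transmission_coefficient(z0, z1):
    """Normal-incidence (pressure) transmission coefficient; note T = 1 + R."""
    return 2.0 * z1 / (z1 + z0)

def refraction_angle_deg(incidence_deg, v_upper, v_lower):
    """Snell's law: sin(theta2) = sin(theta1) * v2 / v1.
    Returns None beyond the critical angle (total internal reflection)."""
    s = math.sin(math.radians(incidence_deg)) * v_lower / v_upper
    if abs(s) > 1.0:
        return None
    return math.degrees(math.asin(s))

# Made-up layer properties: impedance = density (kg/m^3) * velocity (m/s)
z_shale = 2400.0 * 2700.0
z_sand = 2200.0 * 2100.0
r = reflection_coefficient(z_shale, z_sand)   # negative: impedance drops
t = transmission_coefficient(z_shale, z_sand)
theta2 = refraction_angle_deg(30.0, 2100.0, 2700.0)  # bends away from normal
```

The sign of R is itself informative: a negative coefficient flags a drop in impedance across the boundary, the kind of contrast AVO analysis later exploits.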
The generation and recording of seismic data requires:
- Receiver configuration: geophones, or hydrophones in the case of marine acquisition.
- Transmitter configuration: laying out transmitters according to the predecided survey configuration.
- Orientation of streamers in the case of marine surveys.
- A proper computer network to carry the information from the receivers to the processing network.

When a survey is conducted, seismic waves generated by dynamite or vibrators travel through the subsurface strata and are in turn reflected or refracted. These reflected waves and their travel times are noted by the receivers. The receiver configuration has to be well determined so that maximum data can be collected over an area.

ACQUISITION ON LAND

In a typical land seismic acquisition process, the survey is planned in an attempt to minimize terrain constraints. It basically includes the sensor layout scheme and the source development scheme.

The source development scheme is used to configure the number of transmitters being used to send the signal down into the surface. One or more transmitters can be used based on the scheme employed. Similarly, one or many receivers can be employed to collect the reflected wave data.

The receiver configuration is an important aspect. The configuration can be such that the closest receiver gathers only the high amplitude wave on the first line of receivers, or it can differ based on the signal strength and the seismic line survey.

The data collected through the receivers or geophones is converted to binary data that is then handed over to the computer network for processing.

MARINE ACQUISITION

Marine acquisition involves processes such as:

Wide-azimuth Marine Acquisition: wide-azimuth surveys provide a step-change improvement in the imaging of seismic data. These surveys provide illumination in complex geology and natural attenuation of some multiples.
Azimuth shooting denotes the acquisition of data in all directions. This acquisition technique can help in generating 3-D models.

Coil Shooting: this technique acquires marine seismic data while following a circular path, improving upon multi- and wide-azimuth techniques. It steers the vessel, streamers and sources in a configuration which delivers a greater range of azimuths. Sometimes single-sensor recording while steering the vessel in different directions has proved more beneficial for noise attenuation and signal fidelity.

Different seismic surveys can be classified as:

Two-dimensional Survey
Three-dimensional Survey
Four-dimensional Survey

TWO DIMENSIONAL SURVEYS

In such a survey, seismic data is acquired simultaneously along a group of seismic lines which are separated by some gaps, usually 1 km or more. A 2-D survey contains many lines acquired orthogonally to the strike of the geological structures, with a minimum number of lines acquired parallel to the structures to allow line-to-line tying of the seismic data and interpretation and mapping of structures. This technique generates a 2-D cross-section of the deep seabed and is used primarily when initially prospecting for the presence of oil and gas sources.

THREE DIMENSIONAL SURVEYS

Multiple streamers shoot on closely spaced lines. With the seismic data gathered at close spacing, a 3-D seismic cube can be formed. This innovation requires the use of high performance computers and advanced data processing techniques. The computer generated model can be analyzed in greater detail by viewing it in vertical and horizontal time slices, or even as an attached section. In a standard 3-D seismic survey, the streamers are placed about 50-150 meters apart, each streamer being 6-8 kilometers long. Airguns are fired every 10-20 seconds.
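The fire interval above translates directly into shot spacing along the sail line. A toy calculation (the vessel speed here is an illustrative assumption, not a figure from the text):

```python
def shot_spacing_m(vessel_speed_knots, fire_interval_s):
    """Distance between successive airgun shots along the sail line:
    vessel speed times fire interval. 1 knot = 0.514444 m/s."""
    return vessel_speed_knots * 0.514444 * fire_interval_s

# Assuming a tow speed of about 4.5 knots and a 10 s fire interval,
# successive shots land roughly 23 m apart along the line
spacing = shot_spacing_m(4.5, 10.0)
```

Slowing the vessel or shortening the interval tightens the shot spacing, which is one of the trade-offs behind the "other objectives and economic constraints" mentioned next.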
However, many other objectives and economic constraints determine the specific acquisition parameters.

FOUR DIMENSIONAL SURVEYS

The 4-D survey is also called the time-lapse survey. It involves processing of repeated seismic surveys over an area of a reservoir under production. The changes occurring in the reservoir due to production and injection can be determined over time, which further helps in field development of the reservoir. One important aspect of a 4-D survey is that there should be minimum difference in the position of the seismic lines when a repeated survey is done after some time. Significant cost savings can be achieved through 4-D surveys due to better planning and understanding of reservoir characteristics.

DIFFERENT SHOT METHODS

The common shot gather uses one transmitter source (vibroseis or explosives) and many receivers (geophones) placed at some distance from the source. The geophones are placed at equal spacings from each other.

The common midpoint gather is the most widely used survey technique. It uses one transmitter placed at the midpoint exactly above the formation area to be surveyed. Receivers are set in all directions surrounding the transmitter.

The common offset gather uses a multiple shot and receiver technique.

The common receiver gather, as the name states, has only one receiver. While many shots are employed, the various seismic waves reflecting back to the receiver have different amplitudes and frequencies, and thus can be separated and collected differently.

COMMON MIDPOINT METHOD

It was discovered that reflection seismic sections can be improved by repeatedly sampling the subsurface formations using different travel paths of the seismic waves.
This can easily be achieved using the common midpoint method, which increases the spacing between source and receiver about a common midpoint, generating redundant data over the same subsurface coverage. The processing of a common midpoint gather requires sorting the data from a common shot gather into a common midpoint gather.

In this method, an inclination of the data occurs since the wavefronts reaching the far receivers arrive at an inclined angle; this results in a much longer raypath than for a receiver placed close to the shot point. In order to refer the recordings to a common depth point, one needs to correct the data for the differences in travel time. This is known as the Normal Moveout correction (NMO). After NMO, the summation of the various wavepaths gives us a horizontal section at a travel time equal to zero; this is known as the stacking procedure.

SEISMIC DATA PROCESSING

A reference seismic processing sequence is applied to input raw gathers to obtain reference seismic output data. A series of test seismic processing sequences are applied to the input raw gathers to obtain test seismic output data. The RMS value of the test seismic output data is normalized to that of the reference seismic output data on a trace by trace basis. The normalized difference between the test and the reference seismic output data is calculated on a sample by sample basis in the time domain and displayed on color coded plots in the time scale format over the CDP range. Linear regression is performed for each CMP gather to obtain the stack and the zero offset calculated for each time index, and the difference is recorded. The normalized differences between the error for the test and the reference sequences are calculated and displayed on color coded plots. The order of sensitivity for each processing step in the reference processing sequence is determined.
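The NMO correction described above follows the hyperbolic moveout relation t(x)^2 = t0^2 + (x/v)^2. A minimal sketch, assuming a single flat layer and a constant velocity:

```python
import math

def moveout_time(t0_s, offset_m, velocity_m_per_s):
    """Two-way time recorded at a given offset for a flat reflector:
    t(x) = sqrt(t0^2 + (x / v)^2)."""
    return math.sqrt(t0_s ** 2 + (offset_m / velocity_m_per_s) ** 2)

def nmo_correct(t_x_s, offset_m, velocity_m_per_s):
    """Removes the offset-dependent delay, recovering the zero-offset
    time t0 so traces in a CMP gather line up for stacking."""
    return math.sqrt(t_x_s ** 2 - (offset_m / velocity_m_per_s) ** 2)

# A trace at 1000 m offset arrives later than the zero-offset trace...
t_far = moveout_time(1.0, 1000.0, 2000.0)
# ...but NMO correction flattens it back to t0 before stacking
t0_recovered = nmo_correct(t_far, 1000.0, 2000.0)
```

Once every trace in the gather has been flattened this way, summing them (stacking) reinforces the reflection and averages down random noise.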
If necessary, any processing step is rejected and the reference processing sequence is revised.

WELL-DRIVEN SEISMIC

Integrating well data throughout the seismic workflow for superior imaging and inversion: Well-Driven Seismic (WDS) is the integration of borehole information throughout the surface-seismic workflow to provide better seismic images, more reliable stratigraphic interpretation, and greater confidence in overall reservoir characterization.

Wireline logs (compressional, shear, and density), VSPs, and surface-seismic data represent the elastic response of the earth at various resolution scales. A principle of the Well-Driven Seismic concept is that these data should be processed with respect to their mutual consistency, i.e., that the seismic data must tie with logs and VSPs in time and depth. The aim of the Well-Driven Seismic method is to use all the available borehole information to optimize the entire seismic workflow, delivering seismic images of superior resolution (in time or depth) and calibrated prestack seismic amplitudes that are suitable for inversion and detailed seismic reservoir description.

Earth properties from logs, VSPs, and surface-seismic data: the Well-Driven Seismic workflow invokes proprietary software and analysis techniques from WesternGeco and Schlumberger to derive an earth property model from the integrated analysis of wireline logs, VSPs, and surface-seismic data. The property model includes compressional and shear velocities, attenuation (Q) factors, VTI anisotropy parameters, and interbed multiple mechanisms, and is derived at the well location (or locations) and extended across the survey area in 3D.
The 3D model is applied in the seismic processing sequence for true amplitude and phase recovery, deconvolution, multiple attenuation, anisotropic prestack time and depth imaging (including converted-wave data), AVO analysis, and 4D processing.

WELL DATA FOR HIGH RESOLUTION SEISMIC IMAGING

Well information can improve many key stages of the conventional seismic processing sequence. VSP data provide excellent discrimination of primary and multiple events, and are used to guide surface-seismic multiple attenuation processes. Furthermore, interbed multiple mechanisms identified in separated VSP wavefields are used as input to data-driven multiple attenuation processes, such as the WesternGeco Interbed Multiple Prediction (IMP). Inverse-Q operators derived from VSP data (and new methods for walkaway VSP data) can significantly improve seismic resolution. WesternGeco employs a proprietary deconvolution process that is constrained by the signal-to-noise level in the seismic data and by the well reflectivity to further enhance seismic resolution. The calibrated anisotropic velocity model is vital for prestack time and depth migration (including converted waves) to improve steep-dip imaging, lateral positioning of reflectors, signal-to-noise ratios, and seismic resolution.

OPTIMIZED WELL TIES

The Well-Driven Seismic method optimizes the processing sequence and the processing parameters within that sequence to tie the seismic data to the wells. Attributes based on the well tie and on the quality of the extracted wavelets are used for deterministic seismic processing decisions. Space-adaptive wavelet processing corrects 3D seismic data to true zero phase between well locations, and stabilizes residual spatial wavelet variations.

BOREHOLE CALIBRATED SEISMIC INVERSION

The Well-Driven Seismic approach provides greater sensitivity to seismically derived reservoir attributes through calibrated AVO or acoustic impedance inversion.
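The inverse-Q compensation mentioned above rests on the standard anelastic attenuation model, in which amplitude decays exponentially with frequency and travel time. A sketch under that assumption (the frequencies and Q value below are illustrative):

```python
import math

def q_decay(amplitude, frequency_hz, traveltime_s, q):
    """Anelastic attenuation: a plane wave of frequency f loses amplitude
    by a factor exp(-pi * f * t / Q) after travelling for time t."""
    return amplitude * math.exp(-math.pi * frequency_hz * traveltime_s / q)

# Higher frequencies decay faster, which is why attenuation blurs resolution;
# an inverse-Q operator applies (a stabilized form of) the reciprocal gain.
a_low = q_decay(1.0, 20.0, 2.0, 100.0)   # modest loss at 20 Hz
a_high = q_decay(1.0, 60.0, 2.0, 100.0)  # much stronger loss at 60 Hz
```

Because the loss grows with frequency, compensating for it sharpens the wavelet and recovers bandwidth, at the cost of also amplifying high-frequency noise if applied without constraint.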
The well data are particularly important for successful processing of seismic data for inversion. Compensation for the offset-dependent effects of Q, geometric spreading, transmission losses, and anisotropy is essential for processing data over very long offsets (where the strongest AVO expression of the reservoir may be visible). The method calibrates the AVO signatures in the prestack seismic data with the offset-dependent amplitude response synthesized from well logs and/or the response expressed in the walkaway VSP to provide confidence in the seismic processing sequence. With the seismic processing sequence optimized for resolution and consistency with the well data, Well-Driven Seismic processing is a vital prerequisite for acoustic impedance or AVO inversion and subsequent reservoir characterization.

AVO AND INVERSION

Amplitude variation with offset (AVO) has been used extensively in hydrocarbon exploration over the past two decades. Traditional AVO analysis involves computation of the AVO intercept, gradient, and higher-order AVO term from a fit of P-wave reflection amplitude to the square of the sine of the angle of incidence. This fit is based on the approximate P-wave reflection coefficient formulation in intercept-gradient form, given by Bortfeld (1961) and Shuey (1985) among others. Under the assumption of a background P-to-S velocity ratio, the AVO intercept and gradient values can also be combined to obtain additional AVO attributes such as pseudo-S-wave data, Poisson's ratio contrast, and others. AVO intercept and pseudo-S-wave data are also used in conjunction with prestack waveform inversion (PSWI) in a hybrid inversion scheme. Hybrid inversion is a combination of prestack and poststack inversion methodologies.
Such a combination allows efficient inversion of large data volumes in the absence of well information.

Amplitude Variation with Offset (AVO) inversion is a prestack technique that is readily applied to seismic gathers but which is still largely under-utilised in the exploration community despite its ability to effectively discriminate between fluid and lithology effects.

AVO inversion is equally applicable to both 2D and 3D seismic data in time or depth, provided that sufficient care has been taken to preserve amplitudes during processing. A reliable velocity model is also a critical component of the AVO process, as accurate angle information is a prerequisite for AVO inversion. The more accurate the angles, the better the partitioning of amplitudes into P-wave and S-wave reflectivities. In addition, both angle and ray path information can be incorporated in a variety of model-based amplitude corrections that are preferable, and often more accurate, than scalars derived from empirical equations.

The inversion process is then performed, completing in about the same time as a conventional stack.
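The intercept-gradient fit described earlier, amplitude regressed against the square of the sine of the incidence angle, reduces to an ordinary least-squares line fit. A toy example with synthetic, noise-free amplitudes (real gathers are of course noisy, and production software fits many more angles):

```python
import math

def fit_intercept_gradient(angles_deg, amplitudes):
    """Least-squares fit of the two-term Shuey form R(theta) = A + B*sin^2(theta).
    Returns (A, B): the AVO intercept and gradient."""
    xs = [math.sin(math.radians(a)) ** 2 for a in angles_deg]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(amplitudes) / n
    gradient = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, amplitudes))
                / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - gradient * mean_x
    return intercept, gradient

# Synthetic gather built with A = 0.1, B = -0.3; the fit should recover both
angles = [0.0, 10.0, 20.0, 30.0, 40.0]
amps = [0.1 - 0.3 * math.sin(math.radians(a)) ** 2 for a in angles]
A, B = fit_intercept_gradient(angles, amps)
```

The intercept A approximates the normal-incidence reflection coefficient, while the gradient B carries the offset-dependent information that fluid-sensitive attributes are built from.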
The resulting outputs are a series of AVO reflectivity sections or volumes that are determined by the Zoeppritz approximation used. Fluid Factor is one of the most useful attributes derived from AVO inversion, due to its ability to make such distinctions and directly identify hydrocarbons.

Multi-Measurement Reservoir Definition workflows include the following components:

Reservoir Synthetic Modeling: forward modeling to generate pre-stack synthetics from geological models; Anivec (prestack elastic modeling).
Prestack Waveform Inversion (PSWI): full waveform prestack inversion, a non-linear inversion process that estimates an elastic model (Vp, Vs, and density) from prestack seismic data using a genetic algorithm.
AVO Modeling and Analysis.
AVO Conditioning: conditions angle band stacks prior to performing AVO analysis.
AVO Inversion: elastic impedance modeling and inversion from angle band cubes.
Space-adaptive Inversion: space-adaptive wavelet processing and inversion to relative seismic impedance.
Elastic Impedance Inversion: combining low frequency trends with relative inverted seismic impedance cubes to generate absolute impedance.
Integrated Rock Physics Modeling: fluid and rock property analysis, modeling and substitution.
Rock Property Calibration: generating rock properties from seismic using transforms derived from petrophysical analysis of well data.

The outputs are high-resolution absolute acoustic and shear impedance and density volumes consistent with the seismic data and the well-log data. The inverted elastic parameter volumes are used for detailed interpretation of lithofacies and pore-fluid content in the subsurface. Combined with rock physics modeling and rock property mapping through lithology classification and joint porosity-saturation inversion, the method provides a powerful tool for quantitative reservoir description and characterization.
The results are the most-probable litho-class, porosity, and saturation with uncertainties of prediction at every sample point in the 3-D volume.

SIGNAL PROCESSING

Some elements of the seismic data processing sequence are virtually universal, regardless of whether the intention is to perform time imaging, depth imaging, multicomponent imaging, or reservoir studies. Data conditioning and signal processing form the foundation of the seismic processing workflow. Signal processing encompasses a wide variety of technologies designed to address numerous challenges in the processing sequence, from data calibration and regularization through to noise attenuation, demultiple, and signal enhancement techniques. It includes:

Multiple Attenuation
Signal Enhancement
Data Calibration and Regularization
Noise Attenuation

TIME PROCESSING

Prestack time migration (PSTM) may not be the most sophisticated imaging method available, but it remains the most commonly used migration algorithm today. Kirchhoff PSTM combines improved structural imaging with amplitude preservation of prestack data in readiness for AVO, inversion, and subsequent reservoir characterization. Advances in this field also mean that time imaging, more than ever before, is an ideal first step in a depth imaging workflow, reducing the number of velocity model building iterations and decreasing overall turnaround time. It includes:

Imaging: regularization, migration and datuming techniques
Statics portfolio
Velocities and moveout
Enhanced Migration Amplitude Normalization

DEPTH PROCESSING

Depth imaging is the preferred seismic imaging tool for today's most challenging exploration and reservoir-delineation projects. In areas of structural or seismic velocity model complexity, many of the assumptions underpinning traditional time-domain processing are invalid and can produce misleading results. Typical situations might be heavily faulted sequences or salt intrusions.
In these cases, only the careful application of 3D prestack depth imaging can be relied on to accurately delineate geological structure, aiding risk assessment and helping operators to improve drilling success rates.

TECHNOLOGY

From a technology perspective, high quality depth imaging has two main aspects: the ability to build detailed and accurate velocity models, coupled with a superior imaging algorithm.

VELOCITY MODEL BUILDING

Velocity model building is a critical element in imaging the earth. Tomography provides high resolution calibrated velocity and anisotropic earth models, and powerful refraction tomographies detect shallow velocity anomalies. All these algorithms work with any acquisition configuration and can be applied to any geological setting. These computer intensive algorithms are also integrated with an interactive graphics environment for rapid and accurate quality control of interim and final results.

VECTOR PROCESSING

Conventional seismic recording uses a single scalar measurement of pressure or vertical displacement throughout the 2D or 3D survey to derive images and models of the subsurface. Subsequent processing and inversion steps can be linked to the relative shear-wave contrasts in the subsurface using rock property relationships. However, sometimes it is impossible to meet a survey's seismic imaging or reservoir definition objectives using compressional (P) waves alone.

SEISMIC DATA INTERPRETATION

Computer-aided interpretation is the mainstay of 3D seismic interpretation, as the amount of data involved is voluminous. The important services are:

IIWS (Integrated Intelligence Workstation) based interpretation of 2D and 3D data
Structural mapping
Integrating seismic attributes with wireline, core and reservoir data for reservoir characterisation
Seismic modeling
3D visualisation and animation
Palinspastic restoration

Structural restoration is an established method by which to validate seismic interpretations.
In addition, palinspastic reconstruction can help identify potential reservoir depocentres, enable the measurement of catchment areas at the time of hydrocarbon migration, and lead to an improved understanding of complex hydrocarbon systems such as those in deepwater settings. Restoration is achieved by the sequential backstripping of the present day depth model. Upon removal of each layer, the remaining surfaces within the model are adjusted to account for the removal.