Interpretive processing is a user-involved approach to generating seismic data products that meet the user's specific requirements. This is often achieved by bringing processing and interpretation geoscientists together, or by giving the interpreter-geophysicist interpretive processing capabilities so that he or she can create cubes that deliver the required, optimum seismic quality exactly where it is needed and for a specific purpose.
How is this done?
- GPU acceleration with smart data streaming and smart compute to deliver real-time feedback
- Native handling of prestack data and other multi-dimensional data (e.g., multi-azimuth)
- Efficient data I/O
- Compute chains that support velocity model building, data conditioning, and much more
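The combination of data streaming and compute chains can be illustrated with a minimal sketch: a seismic cube is streamed through a pipeline of processing steps one chunk at a time, so results are available incrementally rather than after a full-volume pass. This is a hypothetical illustration, not the vendor's actual API; the step functions (`demean`, `agc`) are simple stand-ins for real conditioning operators, and NumPy is used here as a CPU stand-in for a GPU array library.

```python
from functools import reduce
import numpy as np

def demean(chunk):
    # Hypothetical conditioning step: remove the mean of each trace
    # (last axis = time samples).
    return chunk - chunk.mean(axis=-1, keepdims=True)

def agc(chunk, eps=1e-12):
    # Hypothetical gain step: normalize each trace by its RMS amplitude.
    rms = np.sqrt((chunk ** 2).mean(axis=-1, keepdims=True))
    return chunk / (rms + eps)

def stream_chunks(volume, chunk_size):
    # Yield the cube one inline slab at a time, mimicking smart streaming:
    # only a slab ever needs to sit in (GPU) memory at once.
    for start in range(0, volume.shape[0], chunk_size):
        yield volume[start:start + chunk_size]

def run_chain(volume, chain, chunk_size=8):
    # Apply the compute chain to each streamed chunk, then reassemble.
    processed = [
        reduce(lambda data, step: step(data), chain, chunk)
        for chunk in stream_chunks(volume, chunk_size)
    ]
    return np.concatenate(processed, axis=0)

# Usage: a synthetic 3-D cube (inlines, crosslines, time samples).
cube = np.random.default_rng(0).normal(size=(32, 16, 128))
result = run_chain(cube, [demean, agc], chunk_size=8)
```

In a real-time system each chunk would be pushed to the display as soon as its chain finishes, which is what gives the interpreter immediate feedback while the rest of the volume is still streaming.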