Storseismic: An Approach to Pre-Train a Neural Network to Store Seismic Data Features
KAUST Department: Earth Science and Engineering Program
King Abdullah University of Science and Technology
Physical Science and Engineering (PSE) Division
Seismic Wave Analysis Group
Permanent link to this record: http://hdl.handle.net/10754/678307
Abstract: Machine learning (ML) has recently proven helpful for many seismic processing and imaging tasks. However, these tasks are often handled separately, each with its own neural network model and training. We propose StorSeismic, a unified framework that stores the features of seismic data and reuses them later for various seismic processing tasks. With the help of the self-attention mechanism embedded in Bidirectional Encoder Representations from Transformers (BERT), a Transformer-based network architecture, we capture and store the local and global features of seismic data in a pre-training stage, then utilize them for various seismic processing tasks in a fine-tuning stage. This framework enables a more efficient and flexible training process than existing approaches. Two applications, denoising and velocity estimation, demonstrate the flexibility and potential of the proposed framework in adapting to various seismic processing tasks.
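The abstract's key ingredient is the self-attention mechanism, which lets every trace (token) weigh information from all other traces, capturing both local and global features of a seismic section. As a rough illustration only, the sketch below computes scaled dot-product self-attention over a toy "section" with NumPy; the random projection matrices stand in for the learned weights of a BERT layer and are not part of the StorSeismic implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "seismic section": 8 traces, each embedded into 16 features
# (standing in for the tokenized traces fed into a BERT-style encoder).
n_tokens, d_model = 8, 16
x = rng.normal(size=(n_tokens, d_model))

# Hypothetical projection weights; in a trained network these are learned.
Wq = rng.normal(size=(d_model, d_model))
Wk = rng.normal(size=(d_model, d_model))
Wv = rng.normal(size=(d_model, d_model))

def self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(x.shape[1])          # pairwise trace similarity
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # softmax: rows sum to 1
    return weights @ v, weights                     # mix of all traces per token

out, attn = self_attention(x, Wq, Wk, Wv)
# Each output token is a weighted combination of every input trace,
# which is how the network gathers global context across the section.
print(out.shape, attn.shape)
```

In the StorSeismic workflow, layers of this kind are pre-trained once to store data features and then reused unchanged, with only a small task head fine-tuned per application.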
Citation: Harsuko, M. R. C., & Alkhalifah, T. A. (2022). Storseismic: An Approach to Pre-Train a Neural Network to Store Seismic Data Features. 83rd EAGE Annual Conference & Exhibition. https://doi.org/10.3997/2214-4609.202210282
Sponsors: We thank Bingbing Sun for his initial work on this concept. We thank SWAG, especially Claire Birnie, for fruitful discussions and KAUST for its continuous support.
Conference/Event name: 83rd EAGE Annual Conference & Exhibition