Prediction of Sonic Well Logs Using Deep Neural Network: Application to Petroleum Reservoir Characterization in Mexico
Abstract
1. Introduction
2. Background/Related Work
2.1. Sonic-Log Fundamentals
2.2. Classical Empirical Relations for Sonic Synthesis
2.3. Learning-Based Approaches for Sonic-Log Prediction
- It applies a unified deep learning framework under a consistent preprocessing and training protocol across multiple models (CNN, GRU, and dense layers) to quantify the influence of different input-log combinations (5R, 4R, and 2R).
- It enforces strict well-level holdouts to prevent data leakage, which is often overlooked in previous work.
- It demonstrates field deployment of the trained model to predict sonic logs in wells where measurements were never acquired, providing a realistic test of operational applicability in a Mexican hydrocarbon field, a regional context not represented in previous literature.
3. Data and Study Area
4. Methods
4.1. Preprocessing
- Outliers and flags: Obvious tool-failure values were masked, and suspect intervals (e.g., washouts) were visually inspected across all logs. Values were removed only when the anomaly was inconsistent with the multi-log context.
- Depth alignment: Each log was resampled onto a common 0.15 m depth grid per well. Minor misalignments between tools—typically on the order of a few centimeters—were corrected by linear interpolation so that corresponding measurements (e.g., GR, NPHI, Density) referred to the same depth levels. This process eliminates artificial phase shifts that could otherwise distort the multilog relationships.
- Resistivity stabilization: Because resistivity values (R) span several orders of magnitude and occasionally contain zero or negative artifacts (from tool noise or vendor flagging), a small positive constant ε (in Ω·m) was added before applying the natural logarithmic transform, i.e., R′ = ln(R + ε). This prevents undefined values during the logarithmic scaling while preserving the overall distribution. The transformation compresses extreme high values and reduces skewness, facilitating more stable optimization during training.
- Windowing: To provide local depth context and stabilize training, each well was divided into overlapping windows of 200 samples (≈30 m) with 4-sample overlap. Windows containing >10% masked samples were discarded. Windows were shuffled only within the training set.
- Leakage control: All preprocessing steps involving fitted statistics (means, SDs, imputers) were computed on training wells and then applied to validation/test wells.
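The preprocessing steps above can be sketched in NumPy. The window geometry (200 samples, 4-sample overlap, >10% mask rejection) follows the text; the value of the stabilization constant `eps` is an assumption, since the exact constant is not reproduced here:

```python
import numpy as np

def stabilize_resistivity(r, eps=1e-3):
    """Clip non-physical resistivity artifacts to zero, then apply ln(R + eps).
    eps is an assumed small constant in ohm-m."""
    r = np.where(r <= 0, 0.0, r)  # mask zero/negative tool artifacts
    return np.log(r + eps)

def make_windows(curves, length=200, overlap=4, max_masked=0.10):
    """Slice a (n_samples, n_logs) array into overlapping depth windows,
    discarding windows with more than max_masked NaN fraction."""
    step = length - overlap
    windows = []
    for start in range(0, curves.shape[0] - length + 1, step):
        w = curves[start:start + length]
        if np.isnan(w).mean() <= max_masked:
            windows.append(w)
    if not windows:
        return np.empty((0, length, curves.shape[1]))
    return np.stack(windows)
```

Normalization statistics (means, standard deviations) would then be fitted on training-well windows only, in line with the leakage-control bullet.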
4.2. Neural Network Architecture
- Convolutional front-end (feature extractor): Three 1-D convolutional layers along depth with 32, 64, and 64 filters and kernel sizes of 3, 7, and 14 samples, respectively. Stride = 1, same padding, ReLU activations, no pooling. Input logs are stacked as channels.
- Recurrent backbone (context): Two bidirectional gated recurrent unit (GRU) layers with 32 and 64 units, using tanh activations to capture sequential dependence along depth.
- Fully connected head (regression): Dense layers of 32 → 64 → 1 units; hidden tanh activations and linear activation at the output to predict continuous sonic values.
- Regularization: Dropout (rate = 0.2) after each block and weight decay on dense layers.
- Implementation: TensorFlow/Keras.
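A minimal Keras sketch of this architecture follows. Filter counts, kernel sizes, GRU widths, and the dense head match the bullets above; the exact dropout placement and the per-depth (sequence-to-sequence) output are interpretive assumptions, since the full layer graph is not listed:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(window_len=200, n_inputs=4):
    """CNN front-end -> bidirectional GRU backbone -> dense regression head."""
    inp = layers.Input(shape=(window_len, n_inputs))  # input logs stacked as channels
    x = inp
    # Three 1-D convolutions along depth: 32/64/64 filters, kernels 3/7/14,
    # stride 1, same padding, ReLU, no pooling
    for filters, kernel in [(32, 3), (64, 7), (64, 14)]:
        x = layers.Conv1D(filters, kernel, padding="same", activation="relu")(x)
    x = layers.Dropout(0.2)(x)
    # Two bidirectional GRUs (32 and 64 units, tanh) for depth context
    x = layers.Bidirectional(layers.GRU(32, return_sequences=True))(x)
    x = layers.Bidirectional(layers.GRU(64, return_sequences=True))(x)
    x = layers.Dropout(0.2)(x)
    # Dense head 32 -> 64 -> 1; linear output for continuous sonic values
    x = layers.Dense(32, activation="tanh")(x)
    x = layers.Dense(64, activation="tanh")(x)
    out = layers.Dense(1, activation="linear")(x)
    return models.Model(inp, out)
```

With `n_inputs` set to the number of predictor curves, the same builder serves the 5R, 4R, and 2R configurations by simply changing the channel count.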
- 5R model includes all five logs (Gamma Ray, Resistivity, Neutron Porosity, Bulk Density, and Sonic);
- 4R model excludes one log (uses Gamma Ray, Resistivity, Neutron Porosity, and Bulk Density);
- 2R model uses only Gamma Ray and Resistivity as predictors.
4.3. Baselines
- Empirical relations: Faust-type velocity–resistivity–depth relation; Gardner and Lindseth density–velocity relations. Constants (a, b, m) were fitted by least squares on training wells.
- Linear/Multi-linear regression: Ordinary least squares (OLS) mapping from predictor logs to sonic. The same input sets as above were tested.
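As one concrete baseline, the Gardner relation ρ = a·V^b can be fitted by ordinary least squares in log–log space. A sketch (the constants recovered below are from synthetic data using Gardner's classic values; the paper's fitted constants are field-specific):

```python
import numpy as np

def fit_gardner(velocity, density):
    """Fit rho = a * V**b by OLS on ln(rho) = ln(a) + b*ln(V)."""
    X = np.column_stack([np.ones_like(velocity), np.log(velocity)])
    coef, *_ = np.linalg.lstsq(X, np.log(density), rcond=None)
    return np.exp(coef[0]), coef[1]  # a, b

# Synthetic check: data generated from rho = 0.31 * V**0.25 (Gardner 1974 form)
v = np.linspace(1500.0, 5000.0, 50)
rho = 0.31 * v**0.25
a, b = fit_gardner(v, rho)
```

The Faust- and Lindseth-type relations can be fitted the same way after the appropriate logarithmic linearization.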
4.4. Training and Evaluation
- Loss and optimizer: Huber loss with threshold δ (in normalized units) to reduce sensitivity to outliers. Optimizer: Adam with cosine decay of the initial learning rate, plus weight decay.
- Batching: Mini-batches of 64 windows.
- Early stopping: Validation loss monitored with patience = 20 epochs; maximum 200 epochs; best weights restored.
- Metrics: Root mean square error (RMSE, in μs/ft), mean absolute percentage error (MAPE, in %), mean absolute error (MAE, in μs/ft), and coefficient of determination (R²). Metrics are reported per well and as aggregated mean ± standard deviation.
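The reported metrics can be computed per well as follows (a straightforward NumPy sketch; `dt_true`/`dt_pred` are placeholder names for the measured and predicted slowness arrays):

```python
import numpy as np

def evaluate(dt_true, dt_pred):
    """Per-well regression metrics for sonic prediction (slowness in us/ft)."""
    err = dt_pred - dt_true
    rmse = float(np.sqrt(np.mean(err**2)))
    mae = float(np.mean(np.abs(err)))
    mape = float(np.mean(np.abs(err / dt_true)) * 100.0)
    ss_res = float(np.sum(err**2))
    ss_tot = float(np.sum((dt_true - dt_true.mean())**2))
    return {"RMSE": rmse, "MAE": mae, "MAPE": mape, "R2": 1.0 - ss_res / ss_tot}
```

Aggregated mean ± standard deviation then follows from applying this per test well and pooling the results.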
4.5. Ablations and Sensitivity
- Input importance: Compare the full (5R) and reduced (4R, 2R) input configurations.
- Window length: 100, 200, and 300 samples to test receptive-field sensitivity.
- Normalization: Per-well vs. global z-score (expect per-well to generalize better).
- Loss function: Huber vs. MAE vs. MSE.
- Model variants: MLP-only (no CNN/RNN) and CNN-only (no RNN) to isolate contributions of sequence modeling.
4.6. Statistical Testing
5. Results
5.1. Convergence
5.2. Held-Out Performance (Within-Well Intervals)
- Well A: best RMSE/MAPE from 4R (1.421, 2.02%); 5R is essentially tied (1.427, 2.03%).
- Well B: 5R performs best (1.348, 1.79%).
- Well C: 5R performs best (1.667, 2.98%).
- Well D: 4R performs best (1.211, 1.62%).
| Well Name | 5R RMSE [μs/ft] | 5R MAPE | 4R RMSE [μs/ft] | 4R MAPE | 2R RMSE [μs/ft] | 2R MAPE |
|---|---|---|---|---|---|---|
| A-Test | 1.4267 | 2.03% | 1.4212 | 2.02% | 1.6251 | 2.61% |
| B-Test | 1.3483 | 1.79% | 1.4744 | 2.14% | 1.4693 | 2.18% |
| C-Test | 1.6672 | 2.98% | 1.7134 | 3.12% | 1.7995 | 3.44% |
| D-Test | 1.2911 | 1.84% | 1.2108 | 1.62% | 1.7004 | 3.02% |
5.3. Statistical Comparison of Models
5.4. Depth-Track Case Studies (A–D)
5.5. Deployment to Wells Without Sonic
6. Discussion
6.1. Data Treatment and Sampling Strategy
6.2. Convergence Behavior and Model Selection
6.3. What the Metrics Really Say
6.4. Comparing Input Configurations (5R vs. 4R vs. 2R)
Comparison with Empirical Equations
6.5. Operational Value and Limits
6.6. Recommendations for Future Work
- Cross-well validation: Implement leave-one-well-out testing to eliminate depth-adjacent data leakage and evaluate transferability.
- Uncertainty quantification: Use Monte Carlo dropout or lightweight ensembles to compute per-depth confidence intervals.
- Lithology-aware inputs: Integrate shale-volume or facies indicators so that similar lithologies cluster in feature space.
- Physics-guided regularization: Introduce soft constraints linking predictions to Gardner- or Lindseth-consistent density–velocity trends.
- Field-transfer calibration: Apply per-well normalization and light fine-tuning when transferring models across basins or logging vendors.
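The uncertainty-quantification recommendation can be prototyped with Monte Carlo dropout: run the trained network with dropout left active (`training=True`) and summarize the spread of repeated stochastic forward passes. A sketch; the 95% interval and 50 passes are illustrative choices, not settings from the paper:

```python
import numpy as np
import tensorflow as tf

def mc_dropout_interval(model, x, n_passes=50, q=(2.5, 97.5)):
    """Per-depth predictive mean and percentile interval from MC dropout."""
    # training=True keeps dropout active, so each pass is a stochastic sample
    preds = np.stack([model(x, training=True).numpy() for _ in range(n_passes)])
    lo, hi = np.percentile(preds, q, axis=0)
    return preds.mean(axis=0), lo, hi
```

The same summary applies unchanged to a lightweight ensemble by stacking the members' predictions instead of repeated dropout passes.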
7. Conclusions
- A field-tested pipeline that couples geoscience-aware preprocessing with a compact deep model to predict DT from routinely acquired logs.
- Strict leakage control at both depth and well levels, with performance quantified by RMSE, MAPE, MAE, and R² (see Table 7), and validated through qualitative depth-track comparisons in Section 5.4 that confirm geological plausibility.
- Operational deployment to wells lacking DT, enabling downstream tasks such as impedance estimation, synthetic seismograms, and rock-mechanics screening where sonic acquisition was skipped.
- Resilience to missing inputs: reduced-input models deliver usable first-order trends, supporting legacy wells and cost-constrained campaigns.
- The workflow reduces the time and cost associated with running sonic tools in every well and offers a recovery path when logging fails, with immediate benefits for well-to-seismic ties and overpressure screening.
- Synthetic DT fills critical gaps not only in petroleum applications but also in geotechnical and geothermal projects.
- Generalization is currently field-specific: tool vintages, vendor corrections, and lithologic mixes can shift input distributions.
- Errors increase in shale-rich intervals and washouts, where borehole effects and thin beds challenge any method; uncertainty should be explicitly communicated to users.
- Lithology/shale awareness: Incorporate a lithology column or shale volume (e.g., learned embeddings) to stabilize predictions across facies boundaries and improve accuracy when some logs are missing.
- Cross-well validation and transfer: Extend evaluation with leave-one-well-out protocols and adopt fine-tuning/domain normalization to transfer models across fields and tool vintages.
- Physics-guided learning: Regularize predictions toward rock-physics relations (e.g., Gardner, Lindseth) or use multi-task objectives (joint DT and RHOB) to encode structure without over-constraining.
- Uncertainty and calibration: Deliver per-depth prediction intervals (e.g., MC-dropout or ensembles) with coverage checks, so end-users can act on quantified risk.
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A



References
- Lai, J.; Su, Y.; Xiao, L.; Zhao, F.; Bai, T.; Li, Y.; Li, H.; Huang, Y.; Wang, G.; Qin, Z. Application of geophysical well logs in solving geologic issues: Past, present and future prospect. Geosci. Front. 2024, 15, 101779. [Google Scholar] [CrossRef]
- Brie, A.; Endo, T.; Hoyle, D.; Codazzi, D.; Esmersoy, C.; Hsu, K. New Directions in Sonic Logging. Oilfield Rev. 1998, 10, 40–55. [Google Scholar]
- Mirhashemi, M.; Khojasteh, E.R.; Manaman, N.S.; Makarian, E. Efficient sonic log estimations by geostatistics, empirical petrophysical relations, and their combination: Two case studies from Iranian hydrocarbon reservoirs. J. Pet. Sci. Eng. 2004, 45, 123–134. [Google Scholar] [CrossRef]
- Li, Z.; Xia, J.; Liu, Z.; Lei, G.; Lee, K.; Ning, F. Missing sonic logs generation for gas hydrate-bearing sediments via hybrid networks combining deep learning with rock physics modeling. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5921915. [Google Scholar] [CrossRef]
- Ojala, I. Using rock physics for constructing synthetic sonic logs. In Proceedings of the 3rd Canada-US Rock Mechanics Symposium, Toronto, ON, Canada, 9–15 May 2009; Available online: https://geogroup.utoronto.ca/wp-content/uploads/RockEng09/PDF/Session6/4016%20PAPER.pdf (accessed on 13 September 2025).
- Makarian, E.; Mirhashemi, M.; Elyasi, A.; Mansourian, D.; Falahat, R.; Radwan, A.E.; El-Aal, A.; Fan, C.; Li, H. A novel directional-oriented method for predicting shear wave velocity through empirical rock physics relationship using geostatistics analysis. Sci. Rep. 2023, 13, 47016. [Google Scholar] [CrossRef]
- Guntoro, T.; Putri, I.; Bahri, A.S. Petrophysical relationship to predict synthetic porosity log. Search Discov. Artic. 2013, 41124. Available online: https://www.searchanddiscovery.com/documents/2013/41124guntoro/ndx_guntoro.pdf (accessed on 5 September 2025).
- Bader, S.; Wu, X.; Fomel, S. Missing log data interpolation and semiautomatic seismic well ties using data matching techniques. Interpretation 2019, 7, T347–T361. [Google Scholar] [CrossRef]
- Bader, S.; Wu, X.; Fomel, S. Missing well log estimation by multiple well-log correlation. In Proceedings of the 80th EAGE Conference & Exhibition, Copenhagen, Denmark, 11–14 June 2018. [Google Scholar] [CrossRef]
- Lines, L.R.; Alam, M. Synthetic Seismograms, Synthetic Sonic Logs, and Synthetic Core. CREWES. 2012. Available online: https://www.crewes.org/Documents/ResearchReports/2012/CRR201259.pdf (accessed on 20 September 2025).
- Maalouf, E.; Torres-Verdín, C. Inversion-based method to mitigate noise in borehole sonic logs. Geophysics 2018, 83, D61–D71. [Google Scholar] [CrossRef]
- Zhang, D.; Chen, Y.; Meng, J. Synthetic well logs generation via Recurrent Neural Networks. Pet. Explor. Dev. 2018, 45, 629–639. [Google Scholar] [CrossRef]
- Wang, J.; Cao, J.; Fu, J.; Xu, H. Missing well logs prediction using deep learning integrated neural network with the self-attention mechanism. Energy 2022, 261, 125270. [Google Scholar] [CrossRef]
- Saleh, K.; Mabrouk, W.M.; Metwally, A. Machine learning model optimization for compressional sonic log prediction using well logs in Shahd SE field, Western Desert, Egypt. Sci. Rep. 2025, 15, 14957. [Google Scholar] [CrossRef]
- Pham, T.; Tran, T.; Nguyen, H. Missing well log prediction using convolutional long short-term memory network. Geophysics 2020, 85, WA159–WA170. [Google Scholar] [CrossRef]
- Cabello-Solorzano, K.; Ortigosa de Araujo, I.; Peña, M.; Correia, L.; Tallón-Ballesteros, A.J. The impact of data normalization on the accuracy of machine learning algorithms: A comparative analysis. In Proceedings of the 18th International Conference on Soft Computing Models in Industrial and Environmental Applications (SOCO 2023); García Bringas, P., Pérez García, H., Martínez de Pisón, F.J., Martínez-Álvarez, F., Troncoso Lora, A., Herrero, Á., Calvo-Rolle, J.L., Quintián, H., Corchado, E., Eds.; Springer: Berlin/Heidelberg, Germany, 2023; pp. 344–353. [Google Scholar] [CrossRef]
- Kirkham, T. Lasio: Log ASCII Standard (LAS) File Reader for Python. 2023. Available online: https://lasio.readthedocs.io (accessed on 10 January 2025).
- Abadi, M.; Agarwal, A.; Barham, P.; Brevdo, E.; Chen, Z.; Citro, C.; Corrado, G.S.; Davis, A.; Dean, J.; Devin, M.; et al. TensorFlow: Large-Scale Machine Learning on Heterogeneous Systems. 2016. Available online: https://www.tensorflow.org/ (accessed on 24 June 2025).
- Tixier, M.P.; Alger, R.P.; Doh, C.A. Sonic logging. Pet. Trans. AIME 1959, 216, 106–114. [Google Scholar] [CrossRef]
- Glover, P. Sonic Log. University of Leeds. 2016. Available online: https://homepages.see.leeds.ac.uk/~earpwjg/PG_EN/CD%20Contents/GGL-66565%20Petrophysics%20English/Chapter%2016.PDF (accessed on 10 September 2025).
- Serra, O. Fundamentals of Well Logging; Elsevier: Amsterdam, The Netherlands, 1984; Available online: https://books.google.com.mx/books?id=VfXElAEACAAJ (accessed on 27 September 2025).
- Liu, H. Principles and Applications of Well Logging; Springer: Berlin/Heidelberg, Germany, 2017. [Google Scholar] [CrossRef]
- Hacikoylu, P.; Dvorkin, J.; Mavko, G. Resistivity-velocity transforms revisited. Lead. Edge 2006, 25, 1006–1009. [Google Scholar] [CrossRef]
- Halliburton. Pseudo Sonic Log, ProMAX. n.d. Available online: https://www.halliburton.com/en/products/geosciences-suite/geophysics-workflow (accessed on 13 September 2025).
- Potter, C.C.; Stewart, R.R. Density predictions using Vp and Vs sonic logs. CREWES Res. Rep. 1998, 10, 1–10. Available online: https://www.crewes.org/Documents/ResearchReports/1998/1998-10.pdf (accessed on 13 September 2025).
- Quijada, M.F.; Stewart, R.R. Density Estimations Using Density-Velocity Relations and Seismic Inversion. CREWES. 2007. Available online: https://www.crewes.org/Documents/ResearchReports/2007/2007-01.pdf (accessed on 13 September 2025).
- Atat, J.G.; Uko, E.D.; Tamunobereton-ari, I.; Eze, C.L. The Constants of Density-Velocity Relation for Density Estimation in Tau Field, Niger Delta Basin. IOSR J. Appl. Phys. 2020, 12, 19–26. [Google Scholar]
- Wyllie, M.R.J.; Gregory, A.R.; Gardner, G.H.F. Elastic wave velocities in heterogeneous and porous media. Geophysics 1956, 21, 41–70. [Google Scholar] [CrossRef]
- Gardner, G.H.F.; Gardner, L.W.; Gregory, A.R. Formation velocity and density—The diagnostic basics for stratigraphic traps. Geophysics 1974, 39, 770–780. [Google Scholar] [CrossRef]
- Mitchell, T.M. Machine Learning; McGraw-Hill: New York, NY, USA, 1997. [Google Scholar]
- Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016. [Google Scholar]
- Nie, F.; Hu, Z.; Li, X. An investigation for loss functions widely used in machine learning. Commun. Inf. Syst. 2018, 18, 37–52. [Google Scholar] [CrossRef]
- Ding, B.; Qian, H.; Zhou, J. Activation functions and their characteristics in deep neural networks. In Proceedings of the 2018 Chinese Control and Decision Conference (CCDC), Shenyang, China, 9–11 June 2018; pp. 1836–1841. [Google Scholar] [CrossRef]
- Lau, M.M.; Lim, K.H. Review of adaptive activation function in deep neural network. In Proceedings of the IEEE-EMBS Conference on Biomedical Engineering and Sciences (IECBES), Sarawak, Malaysia, 3–6 December 2018; pp. 686–690. [Google Scholar] [CrossRef]
- Ketkar, N. Deep Learning with Python: A Hands-On Introduction; Apress: New York, NY, USA, 2017. [Google Scholar]
- Zhang, A.; Lipton, Z.C.; Li, M.; Smola, A.J. Dive into Deep Learning; Cambridge University Press: Cambridge, UK, 2023; Available online: https://d2l.ai/ (accessed on 13 September 2025).
- Pascanu, R.; Gulcehre, C.; Cho, K.; Bengio, Y. How to construct deep recurrent neural networks. In Proceedings of the 2nd International Conference on Learning Representations (ICLR), Banff, AB, Canada, 14–16 April 2014. [Google Scholar] [CrossRef]
- Chung, J.; Gulcehre, C.; Cho, K.; Bengio, Y. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling. arXiv 2014, arXiv:1412.3555. [Google Scholar] [CrossRef]
- Shen, G.; Tan, Q.; Zhang, H.; Zeng, P.; Xu, J. Deep learning with gated recurrent unit networks for financial sequence predictions. Procedia Comput. Sci. 2018, 131, 895–903. [Google Scholar] [CrossRef]
- Ruder, S. An overview of gradient descent optimization algorithms. arXiv 2016, arXiv:1609.04747. [Google Scholar] [CrossRef]
- Kingma, D.P.; Ba, J.L. Adam: A method for stochastic optimization. In Proceedings of the 3rd International Conference for Learning Representations (ICLR), San Diego, CA, USA, 7–9 May 2015. [Google Scholar] [CrossRef]
- Duchi, J.; Hazan, E.; Singer, Y. Adaptive subgradient methods for online learning and stochastic optimization. J. Mach. Learn. Res. 2011, 12, 2121–2159. Available online: https://dl.acm.org/doi/10.5555/1953048.2021068 (accessed on 13 September 2025).
- Hinton, G.; Srivastava, N.; Swersky, K. Overview of mini-batch gradient descent. In Neural Networks for Machine Learning (Lecture 6a); University of Toronto: Toronto, ON, Canada, 2012; Available online: http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf (accessed on 13 September 2025).
- Subiatmono, P.; Buntoro, A.; Lukmana, A.H.; David, M.; Kristanto, D. Brittleness prediction using sonic and density logs to determine sweet spot of Brown Shale reservoir. J. Multidiscip. Eng. Sci. Technol. 2022, 9, 15078–15084. [Google Scholar]
| Approach | Representative Model/Source | Inputs | Key Idea/Strengths | Main Limitations |
|---|---|---|---|---|
| Velocity–resistivity empirical transform | Faust relation (and formation factor variants) [7] | Deep resistivity, depth (or formation factor) | Simple field-calibrated law linking velocity/slowness to resistivity and burial; fast to apply. | Requires local calibration; sensitive to salinity, clay content, and anisotropy; struggles in thin-bedded or heterogeneous intervals. |
| Shale-corrected mixing formulas | “Mixing-style” with shale volume [28] | GR (for V_sh), resistivity/porosity proxies | Corrects clean-sand relations for shale effects; interpretable end-member framework. | Depends on end-member choices and shale-volume estimates; limited in complex mineralogies and laminated sands–shales. |
| Density–velocity empirical relations | Gardner; Lindseth; Birch-style trends [29] | RHOB (and sometimes V_p) | Rock-physics style links between density and velocity; easy to tune per field. | Break down when fluids/lithology vary strongly; needs per-lithology constants; limited in shaly or fractured zones. |
| Geostatistical synthesis | Mirhashemi et al. (case studies) [3] | Multi-log set + spatial statistics | Combines empirical/rock-physics with kriging/multiwell statistics; can reduce gaps. | Assumptions on stationarity and variograms; may smooth sharp facies transitions; requires dense multiwell control. |
| Multiwell correlation/data matching | Bader, Wu & Fomel [8,9] | Neighboring wells, correlation operators | Leverages cross-well similarity to interpolate and tie logs; useful when nearby control exists. | Performance degrades with stratigraphic variability or sparse offset wells; sensitive to depth mismatches. |
| Seismic-integrated workflows | Lines & Alam (synthetic seismograms, inversion) [10] | Seismic + limited logs | Joint seismic–log workflows to derive synthetic sonic or constrain it via inversion. | Quality hinges on seismic bandwidth/wavelet and well tie; cycle skipping and non-uniqueness are common. |
| Processing-based pseudo-sonic | Halliburton ProMAX “Pseudo Sonic Log” [24] | Seismic + logs (varies) | Vendor workflow to approximate sonic where missing; operationally accessible. | Proprietary assumptions; accuracy varies with data quality; limited transparency/transferability. |
| Noise-mitigation/ inversion on measured sonic | Maalouf & Torres-Verdín (denoising) [11] | Raw sonic + inversion | Improves measured sonic in noisy or washed-out intervals; complementary to synthesis. | Does not “create” sonic where entirely absent; dependent on inversion settings and borehole conditions. |
| Architecture | Key Characteristics | Typical Advantages/Limitations |
|---|---|---|
| ANN (Fully Connected) | Treats each input independently; no spatial or sequential context. | Simple and fast; may overlook depth continuity in well logs. |
| CNN (Convolutional) | Learns local patterns and spatial dependencies via kernels. | Captures neighboring relationships in depth; sensitive to kernel size. |
| RNN/GRU/LSTM | Processes sequences recursively; retains temporal (depth) context. | Effective for sequential data but computationally heavier; risk of vanishing gradients. |
| Well ID | GR | Resistivity | Density | NPHI | DT (Sonic) | Usage |
|---|---|---|---|---|---|---|
| Well A | ✓ | ✓ | ✓ | ✓ | ✓ | Training |
| Well B | ✓ | ✓ | ✓ | ✓ | ✓ | Validation |
| Well C | ✓ | ✓ | ✓ | ✓ | ✕ | Test/Prediction |
| Well D | ✓ | ✓ | ✕ | ✕ | ✕ | Test/Prediction |
| Comparison | ΔRMSE Mean [μs/ft] | ΔRMSE Median [μs/ft] | ΔMAPE Mean [%] | ΔMAPE Median [%] | Wilcoxon Stat | p-value |
|---|---|---|---|---|---|---|
| 5R vs. 4R | −0.022 | −0.020 | −0.07 | −0.07 | 4.0 | 0.875 |
| 5R vs. 2R | −0.215 | −0.165 | −0.65 | −0.52 | 0.0 | 0.125 |
| 4R vs. 2R | −0.194 | −0.145 | −0.59 | −0.46 | 1.0 | 0.250 |
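As a check, the 5R vs. 4R row can be reproduced with SciPy's Wilcoxon signed-rank test applied to the four held-out per-well RMSE values reported earlier:

```python
from scipy.stats import wilcoxon

# Held-out RMSE per well (A-D) for the 5R and 4R models
rmse_5r = [1.4267, 1.3483, 1.6672, 1.2911]
rmse_4r = [1.4212, 1.4744, 1.7134, 1.2108]

# Paired two-sided test; with n = 4 and no ties or zero differences,
# SciPy uses the exact null distribution
stat, p = wilcoxon(rmse_5r, rmse_4r)
# stat = 4.0, p = 0.875, matching the 5R vs. 4R comparison
```

With only four paired wells the smallest attainable two-sided p-value is 0.125, which is why none of the comparisons reach conventional significance despite consistent mean differences.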
| Well Name | 5R RMSE [μs/ft] | 5R MAPE | 4R RMSE [μs/ft] | 4R MAPE | 2R RMSE [μs/ft] | 2R MAPE |
|---|---|---|---|---|---|---|
| A | 1.2949 | 1.68% | 1.3632 | 1.86% | 1.3577 | 1.84% |
| B | 1.2170 | 1.55% | 1.3071 | 1.79% | 1.3757 | 1.97% |
| C | 1.4304 | 2.18% | 1.4834 | 2.34% | 1.6289 | 2.81% |
| D | 1.1862 | 1.44% | 1.1970 | 1.47% | 1.3701 | 1.93% |
| Model | RMSE [μs/ft] | MAPE (%) | MAE [μs/ft] | R² |
|---|---|---|---|---|
| Faust (Empirical) | 3.8 ± 0.4 | 6.1 ± 0.5 | 3.2 ± 0.3 | 0.61 |
| Linear Regression | 2.9 ± 0.3 | 4.7 ± 0.4 | 2.4 ± 0.2 | 0.75 |
| 2R (ML) | 1.65 ± 0.09 | 2.6 ± 0.2 | 1.31 ± 0.10 | 0.91 |
| 4R (ML) | 1.46 ± 0.07 | 2.3 ± 0.2 | 1.31 ± 0.10 | 0.93 |
| 5R (ML) | 1.43 ± 0.06 | 2.1 ± 0.2 | 1.12 ± 0.07 | 0.94 |
Vázquez-Ayala, J.A.; Ortiz-Alemán, J.C.; López-Juárez, S.; Couder-Castañeda, C.; Trujillo-Alcántara, A. Prediction of Sonic Well Logs Using Deep Neural Network: Application to Petroleum Reservoir Characterization in Mexico. Geosciences 2025, 15, 424. https://doi.org/10.3390/geosciences15110424

