Transformer Architecture and Attention Mechanisms in Genome Data Analysis: A Comprehensive Review
Simple Summary
Abstract
1. Introduction
2. Deep Learning with Transformers and Attention Mechanism
2.1. Conventional Architectures of Deep Learning
2.2. Transformers and Attention Mechanism
3. Methods
3.1. Publication Selection Process
3.2. Journals of Published Papers
3.3. Year-Wise Analysis of Publications
3.4. Analysis of Citation Distribution
4. Overview of Recent Studies in Transformer Architectures and Attention Mechanisms for Genome Data
4.1. Sequence and Site Prediction
4.2. Gene Expression and Phenotype Prediction
4.3. ncRNA and circRNA Studies
4.4. Transcription Process Insights
4.5. Multi-Omics/Modal Tasks
4.6. CRISPR Efficacy and Outcome Prediction
4.7. Gene Regulatory Network Inference
4.8. Disease Prognosis Estimation
4.9. Gene Expression-Based Classification
4.10. Proteomics
4.11. Cell-Type Identification
4.12. Predicting Drug–Drug Interactions
4.13. Other Topics
5. Discussion
5.1. Challenges
5.2. Future Work
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Auslander, N.; Gussow, A.B.; Koonin, E.V. Incorporating Machine Learning into Established Bioinformatics Frameworks. Int. J. Mol. Sci. 2021, 22, 2903. [Google Scholar] [CrossRef]
- Lee, M. Deep Learning Techniques with Genomic Data in Cancer Prognosis: A Comprehensive Review of the 2021–2023 Literature. Biology 2023, 12, 893. [Google Scholar] [CrossRef]
- Gomes, R.; Paul, N.; He, N.; Huber, A.F.; Jansen, R.J. Application of Feature Selection and Deep Learning for Cancer Prediction Using DNA Methylation Markers. Genes 2022, 13, 1557. [Google Scholar] [CrossRef]
- Sadad, T.; Aurangzeb, R.A.; Safran, M.; Imran; Alfarhood, S.; Kim, J. Classification of Highly Divergent Viruses from DNA/RNA Sequence Using Transformer-Based Models. Biomedicines 2023, 11, 1323. [Google Scholar] [CrossRef]
- Lee, M. Recent Advances in Deep Learning for Protein-Protein Interaction Analysis: A Comprehensive Review. Molecules 2023, 28, 5169. [Google Scholar] [CrossRef]
- Kim, Y.; Lee, M. Deep Learning Approaches for lncRNA-Mediated Mechanisms: A Comprehensive Review of Recent Developments. Int. J. Mol. Sci. 2023, 24, 10299. [Google Scholar] [CrossRef]
- Brown, T.; Mann, B.; Ryder, N.; Subbiah, M.; Kaplan, J.D.; Dhariwal, P.; Neelakantan, A.; Shyam, P.; Sastry, G.; Askell, A.; et al. Language models are few-shot learners. Adv. Neural Inf. Process. Syst. 2020, 33, 1877–1901. [Google Scholar]
- Khan, S.; Naseer, M.; Hayat, M.; Zamir, S.W.; Khan, F.S.; Shah, M. Transformers in vision: A survey. ACM Comput. Surv. (CSUR) 2022, 54, 1–41. [Google Scholar] [CrossRef]
- Liu, Z.; Lin, Y.; Cao, Y.; Hu, H.; Wei, Y.; Zhang, Z.; Lin, S.; Guo, B. Swin transformer: Hierarchical vision transformer using shifted windows. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, QC, Canada, 11–17 October 2021; pp. 10012–10022. [Google Scholar]
- Han, K.; Wang, Y.; Chen, H.; Chen, X.; Guo, J.; Liu, Z.; Tang, Y.; Xiao, A.; Xu, C.; Xu, Y.; et al. A survey on vision transformer. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 45, 87–110. [Google Scholar] [CrossRef]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. Adv. Neural Inf. Process. Syst. 2017, 30, 6000–6010. [Google Scholar]
- Wei, Z.; Yan, Q.; Lu, X.; Zheng, Y.; Sun, S.; Lin, J. Compression Reconstruction Network with Coordinated Self-Attention and Adaptive Gaussian Filtering Module. Mathematics 2023, 11, 847. [Google Scholar] [CrossRef]
- Jin, A.; Zeng, X. A Novel Deep Learning Method for Underwater Target Recognition Based on Res-Dense Convolutional Neural Network with Attention Mechanism. J. Mar. Sci. Eng. 2023, 11, 69. [Google Scholar] [CrossRef]
- Gao, L.; Wu, Y.; Yang, T.; Zhang, X.; Zeng, Z.; Chan, C.K.D.; Chen, W. Research on Image Classification and Retrieval Using Deep Learning with Attention Mechanism on Diaspora Chinese Architectural Heritage in Jiangmen, China. Buildings 2023, 13, 275. [Google Scholar] [CrossRef]
- Lu, J.; Ren, H.; Shi, M.; Cui, C.; Zhang, S.; Emam, M.; Li, L. A Novel Hybridoma Cell Segmentation Method Based on Multi-Scale Feature Fusion and Dual Attention Network. Electronics 2023, 12, 979. [Google Scholar] [CrossRef]
- Cheng, S.; Liu, Y. Research on Transportation Mode Recognition Based on Multi-Head Attention Temporal Convolutional Network. Sensors 2023, 23, 3585. [Google Scholar] [CrossRef]
- Kasgari, A.B.; Safavi, S.; Nouri, M.; Hou, J.; Sarshar, N.T.; Ranjbarzadeh, R. Point-of-Interest Preference Model Using an Attention Mechanism in a Convolutional Neural Network. Bioengineering 2023, 10, 495. [Google Scholar] [CrossRef]
- Raimundo, A.; Pavia, J.P.; Sebastião, P.; Postolache, O. YOLOX-Ray: An Efficient Attention-Based Single-Staged Object Detector Tailored for Industrial Inspections. Sensors 2023, 23, 4681. [Google Scholar] [CrossRef]
- Kim, T.; Pak, W. Deep Learning-Based Network Intrusion Detection Using Multiple Image Transformers. Appl. Sci. 2023, 13, 2754. [Google Scholar] [CrossRef]
- Feng, S.; Zhu, X.; Ma, S.; Lan, Q. GIT: A Transformer-Based Deep Learning Model for Geoacoustic Inversion. J. Mar. Sci. Eng. 2023, 11, 1108. [Google Scholar] [CrossRef]
- Jiang, D.; Shi, G.; Li, N.; Ma, L.; Li, W.; Shi, J. TRFM-LS: Transformer-Based Deep Learning Method for Vessel Trajectory Prediction. J. Mar. Sci. Eng. 2023, 11, 880. [Google Scholar] [CrossRef]
- Cao, L.; Wang, Q.; Hong, J.; Han, Y.; Zhang, W.; Zhong, X.; Che, Y.; Ma, Y.; Du, K.; Wu, D.; et al. MVI-TR: A Transformer-Based Deep Learning Model with Contrast-Enhanced CT for Preoperative Prediction of Microvascular Invasion in Hepatocellular Carcinoma. Cancers 2023, 15, 1538. [Google Scholar] [CrossRef] [PubMed]
- Shrestha, A.; Mahmood, A. Review of deep learning algorithms and architectures. IEEE Access 2019, 7, 53040–53065. [Google Scholar] [CrossRef]
- Bhatt, D.; Patel, C.; Talsania, H.; Patel, J.; Vaghela, R.; Pandya, S.; Modi, K.; Ghayvat, H. CNN variants for computer vision: History, architecture, application, challenges and future scope. Electronics 2021, 10, 2470. [Google Scholar] [CrossRef]
- Yu, Y.; Si, X.; Hu, C.; Zhang, J. A review of recurrent neural networks: LSTM cells and network architectures. Neural Comput. 2019, 31, 1235–1270. [Google Scholar] [CrossRef]
- Lee, M.; Seok, J. Controllable generative adversarial network. IEEE Access 2019, 7, 28158–28169. [Google Scholar] [CrossRef]
- Kim, J.; Lee, M. Portfolio optimization using predictive auxiliary classifier generative adversarial networks. Eng. Appl. Artif. Intell. 2023, 125, 106739. [Google Scholar] [CrossRef]
- Lee, M.; Seok, J. Score-guided generative adversarial networks. Axioms 2022, 11, 701. [Google Scholar] [CrossRef]
- Lee, M.; Seok, J. Estimation with uncertainty via conditional generative adversarial networks. Sensors 2021, 21, 6194. [Google Scholar] [CrossRef] [PubMed]
- Yeom, T.; Lee, M. DuDGAN: Improving Class-Conditional GANs via Dual-Diffusion. arXiv 2023, arXiv:2305.14849. [Google Scholar]
- Ko, K.; Lee, M. ZIGNeRF: Zero-shot 3D Scene Representation with Invertible Generative Neural Radiance Fields. arXiv 2023, arXiv:2306.02741. [Google Scholar]
- Lee, M. Recent Advances in Generative Adversarial Networks for Gene Expression Data: A Comprehensive Review. Mathematics 2023, 11, 3055. [Google Scholar] [CrossRef]
- Niu, Z.; Zhong, G.; Yu, H. A review on the attention mechanism of deep learning. Neurocomputing 2021, 452, 48–62. [Google Scholar] [CrossRef]
- Raad, J.; Bugnon, L.A.; Milone, D.H.; Stegmayer, G. miRe2e: A full end-to-end deep model based on transformers for prediction of pre-miRNAs. Bioinformatics 2022, 38, 1191–1197. [Google Scholar] [CrossRef] [PubMed]
- Shen, Z.; Zhang, Q.; Han, K.; Huang, D.S. A Deep Learning Model for RNA-Protein Binding Preference Prediction Based on Hierarchical LSTM and Attention Network. IEEE-ACM Trans. Comput. Biol. Bioinform. 2022, 19, 753–762. [Google Scholar] [CrossRef]
- Li, Q.; Cheng, X.; Song, C.; Liu, T. M6A-BERT-Stacking: A Tissue-Specific Predictor for Identifying RNA N6-Methyladenosine Sites Based on BERT and Stacking Strategy. Symmetry 2023, 15, 731. [Google Scholar] [CrossRef]
- Ma, Z.W.; Zhao, J.P.; Tian, J.; Zheng, C.H. DeeProPre: A promoter predictor based on deep learning. Comput. Biol. Chem. 2022, 101, 107770. [Google Scholar] [CrossRef]
- Zeng, R.; Cheng, S.; Liao, M. 4mCPred-MTL: Accurate Identification of DNA 4mC Sites in Multiple Species Using Multi-Task Deep Learning Based on Multi-Head Attention Mechanism. Front. Cell Dev. Biol. 2021, 9, 664669. [Google Scholar] [CrossRef]
- Mai, D.H.A.; Nguyen, L.T.; Lee, E.Y. TSSNote-CyaPromBERT: Development of an integrated platform for highly accurate promoter prediction and visualization of Synechococcus sp. and Synechocystis sp. through a state-of-the-art natural language processing model BERT. Front. Genet. 2022, 13, 1067562. [Google Scholar] [CrossRef]
- Song, J.; Tian, S.; Yu, L.; Yang, Q.; Xing, Y.; Zhang, C.; Dai, Q.; Duan, X. MD-MLI: Prediction of miRNA-lncRNA Interaction by Using Multiple Features and Hierarchical Deep Learning. IEEE-ACM Trans. Comput. Biol. Bioinform. 2022, 19, 1724–1733. [Google Scholar] [CrossRef] [PubMed]
- Tang, X.; Zheng, P.; Li, X.; Wu, H.; Wei, D.Q.; Liu, Y.; Huang, G. Deep6mAPred: A CNN and Bi-LSTM-based deep learning method for predicting DNA N6-methyladenosine sites across plant species. Methods 2022, 204, 142–150. [Google Scholar] [CrossRef] [PubMed]
- Du, B.; Liu, Z.; Luo, F. Deep multi-scale attention network for RNA-binding proteins prediction. Inf. Sci. 2022, 582, 287–301. [Google Scholar] [CrossRef]
- Pan, Z.; Zhou, S.; Zou, H.; Liu, C.; Zang, M.; Liu, T.; Wang, Q. CRMSNet: A deep learning model that uses convolution and residual multi-head self-attention block to predict RBPs for RNA sequence. Proteins-Struct. Funct. Bioinform. 2023, 91, 1032–1041. [Google Scholar] [CrossRef] [PubMed]
- Zhuang, J.; Liu, D.; Lin, M.; Qiu, W.; Liu, J.; Chen, S. PseUdeep: RNA Pseudouridine Site Identification with Deep Learning Algorithm. Front. Genet. 2021, 12, 773882. [Google Scholar] [CrossRef]
- Huang, Y.; Luo, J.; Jing, R.; Li, M. Multi-model predictive analysis of RNA solvent accessibility based on modified residual attention mechanism. Brief. Bioinform. 2022, 23, bbac470. [Google Scholar] [CrossRef] [PubMed]
- Guan, X.; Wang, Y.; Shao, W.; Li, Z.; Huang, S.; Zhang, D. S2Snet: Deep learning for low molecular weight RNA identification with nanopore. Brief. Bioinform. 2022, 23, bbac098. [Google Scholar] [CrossRef]
- Li, X.; Zhang, S.; Shi, H. An improved residual network using deep fusion for identifying RNA 5-methylcytosine sites. Bioinformatics 2022, 38, 4271–4277. [Google Scholar] [CrossRef]
- Fei, Y.; Zhang, H.; Wang, Y.; Liu, Z.; Liu, Y. LTPConstraint: A transfer learning based end-to-end method for RNA secondary structure prediction. BMC Bioinform. 2022, 23, 1–26. [Google Scholar] [CrossRef]
- Du, Z.; Xiao, X.; Uversky, V.N. DeepA-RBPBS: A hybrid convolution and recurrent neural network combined with attention mechanism for predicting RBP binding site. J. Biomol. Struct. Dyn. 2022, 40, 4250–4258. [Google Scholar] [CrossRef]
- Wenjing, Y.; Baoyu, Z.; Min, Z.; Qingchuan, Z.; Hong, W.; Da, M. AttentionSplice: An Interpretable Multi-Head Self-Attention Based Hybrid Deep Learning Model in Splice Site Prediction. Chin. J. Electron. 2022, 31, 870–887. [Google Scholar]
- Cao, L.; Liu, P.; Chen, J.; Deng, L. Prediction of Transcription Factor Binding Sites Using a Combined Deep Learning Approach. Front. Oncol. 2022, 12, 893520. [Google Scholar] [CrossRef]
- He, S.; Gao, B.; Sabnis, R.; Sun, Q. RNAdegformer: Accurate prediction of mRNA degradation at nucleotide resolution with deep learning. Brief. Bioinform. 2023, 24, bbac581. [Google Scholar] [CrossRef]
- Shen, L.C.; Liu, Y.; Song, J.; Yu, D.J. SAResNet: Self-attention residual network for predicting DNA-protein binding. Brief. Bioinform. 2021, 22, bbab101. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Y.; Wang, Z.; Zeng, Y.; Zhou, J.; Zou, Q. High-resolution transcription factor binding sites prediction improved performance and interpretability by deep learning method. Brief. Bioinform. 2021, 22, bbab273. [Google Scholar] [CrossRef]
- Jiang, J.Y.; Ju, C.J.T.; Hao, J.; Chen, M.; Wang, W. JEDI: Circular RNA prediction based on junction encoders and deep interaction among splice sites. Bioinformatics 2021, 37, I289–I298. [Google Scholar] [CrossRef]
- Bhukya, R.; Kumari, A.; Dasari, C.M.; Amilpur, S. An attention-based hybrid deep neural networks for accurate identification of transcription factor binding sites. Neural Comput. Appl. 2022, 34, 19051–19060. [Google Scholar] [CrossRef]
- Muneer, A.; Fati, S.M.; Akbar, N.A.; Agustriawan, D.; Wahyudi, S.T. iVaccine-Deep: Prediction of COVID-19 mRNA vaccine degradation using deep learning. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 7419–7432. [Google Scholar] [CrossRef]
- Wekesa, J.S.; Meng, J.; Luan, Y. Multi-feature fusion for deep learning to predict plant lncRNA-protein interaction. Genomics 2020, 112, 2928–2936. [Google Scholar] [CrossRef]
- Liang, J.; Cui, Z.; Wu, C.; Yu, Y.; Tian, R.; Xie, H.; Jin, Z.; Fan, W.; Xie, W.; Huang, Z.; et al. DeepEBV: A deep learning model to predict Epstein-Barr virus (EBV) integration sites. Bioinformatics 2021, 37, 3405–3411. [Google Scholar] [CrossRef]
- Zhang, H.; Fang, J.; Sun, Y.; Xie, G.; Lin, Z.; Gu, G. Predicting miRNA-Disease Associations via Node-Level Attention Graph Auto-Encoder. IEEE-ACM Trans. Comput. Biol. Bioinform. 2023, 20, 1308–1318. [Google Scholar] [CrossRef] [PubMed]
- Xie, X.; Wang, Y.; He, K.; Sheng, N. Predicting miRNA-disease associations based on PPMI and attention network. BMC Bioinform. 2023, 24, 1–19. [Google Scholar] [CrossRef] [PubMed]
- Fan, X.Q.; Hu, J.; Tang, Y.X.; Jia, N.X.; Yu, D.J.; Zhang, G.J. Predicting RNA solvent accessibility from multi-scale context feature via multi-shot neural network. Anal. Biochem. 2022, 654, 114802. [Google Scholar] [CrossRef] [PubMed]
- Tsukiyama, S.; Hasan, M.M.; Deng, H.W.; Kurata, H. BERT6mA: Prediction of DNA N6-methyladenine site using deep learning-based approaches. Brief. Bioinform. 2022, 23, bbac053. [Google Scholar] [CrossRef] [PubMed]
- Gao, Y.; Chen, Y.; Feng, H.; Zhang, Y.; Yue, Z. RicENN: Prediction of Rice Enhancers with Neural Network Based on DNA Sequences. Interdiscip.-Sci.-Comput. Life Sci. 2022, 14, 555–565. [Google Scholar] [CrossRef]
- Ullah, A.; Malik, K.M.; Saudagar, A.K.J.; Khan, M.B.; Abul Hasanat, M.H.; AlTameem, A.; AlKhathami, M.; Sajjad, M. COVID-19 Genome Sequence Analysis for New Variant Prediction and Generation. Mathematics 2022, 10, 4267. [Google Scholar] [CrossRef]
- Guo, Y.; Zhou, D.; Li, W.; Cao, J.; Nie, R.; Xiong, L.; Ruan, X. Identifying polyadenylation signals with biological embedding via self-attentive gated convolutional highway networks. Appl. Soft Comput. 2021, 103, 107133. [Google Scholar] [CrossRef]
- Wang, Y.; Hou, Z.; Yang, Y.; Wong, K.c.; Li, X. Genome-wide identification and characterization of DNA enhancers with a stacked multivariate fusion framework. PLoS Comput. Biol. 2022, 18, e1010779. [Google Scholar] [CrossRef]
- Sun, L.; Xu, K.; Huang, W.; Yang, Y.T.; Li, P.; Tang, L.; Xiong, T.; Zhang, Q.C. Predicting dynamic cellular protein-RNA interactions by deep learning using in vivo RNA structures. Cell Res. 2021, 31, 495–516. [Google Scholar] [CrossRef]
- Zhang, T.H.; Hasib, M.M.; Chiu, Y.C.; Han, Z.F.; Jin, Y.F.; Flores, M.; Chen, Y.; Huang, Y. Transformer for Gene Expression Modeling (T-GEM): An Interpretable Deep Learning Model for Gene Expression-Based Phenotype Predictions. Cancers 2022, 14, 4763. [Google Scholar] [CrossRef] [PubMed]
- Lee, D.; Yang, J.; Kim, S. Learning the histone codes with large genomic windows and three-dimensional chromatin interactions using transformer. Nat. Commun. 2022, 13, 6678. [Google Scholar] [CrossRef]
- Chen, Y.; Xie, M.; Wen, J. Predicting gene expression from histone modifications with self-attention based neural networks and transfer learning. Front. Genet. 2022, 13, 1081842. [Google Scholar] [CrossRef]
- Kang, M.; Lee, S.; Lee, D.; Kim, S. Learning Cell-Type-Specific Gene Regulation Mechanisms by Multi-Attention Based Deep Learning With Regulatory Latent Space. Front. Genet. 2020, 11, 869. [Google Scholar] [CrossRef]
- Liao, Y.; Guo, H.; Jing, R.; Luo, J.; Li, M.; Li, Y. Predicting gene expression levels from histone modification profiles by a hybrid deep learning network. Chemom. Intell. Lab. Syst. 2021, 219, 104456. [Google Scholar] [CrossRef]
- Angenent-Mari, N.M.; Garruss, A.S.; Soenksen, L.R.; Church, G.; Collins, J.J. A deep learning approach to programmable RNA switches. Nat. Commun. 2020, 11, 5057. [Google Scholar] [CrossRef] [PubMed]
- Zuo, Z.; Wang, P.; Chen, X.; Tian, L.; Ge, H.; Qian, D. SWnet: A deep learning model for drug response prediction from cancer genomic signatures and compound chemical structures. BMC Bioinform. 2021, 22, 1–16. [Google Scholar] [CrossRef] [PubMed]
- Karbalayghareh, A.; Sahin, M.; Leslie, C.S. Chromatin interaction-aware gene regulatory modeling with graph attention networks. Genome Res. 2022, 32, 930–944. [Google Scholar] [CrossRef] [PubMed]
- Pham, T.H.; Qiu, Y.; Zeng, J.; Xie, L.; Zhang, P. A deep learning framework for high-throughput mechanism-driven phenotype compound screening and its application to COVID-19 drug repurposing. Nat. Mach. Intell. 2021, 3, 247–257. [Google Scholar] [CrossRef]
- Dominic, N.; Cenggoro, T.W.; Budiarto, A.; Pardamean, B. Deep polygenic neural network for predicting and identifying yield-associated genes in Indonesian rice accessions. Sci. Rep. 2022, 12, 13823. [Google Scholar] [CrossRef]
- Lee, H.; Yeom, S.; Kim, S. BP-GAN: Interpretable Human Branchpoint Prediction Using Attentive Generative Adversarial Networks. IEEE Access 2020, 8, 97851–97862. [Google Scholar] [CrossRef]
- Li, Z.; Zhong, T.; Huang, D.; You, Z.H.; Nie, R. Hierarchical graph attention network for miRNA-disease association prediction. Mol. Ther. 2022, 30, 1775–1786. [Google Scholar] [CrossRef]
- Bu, Y.; Jia, C.; Guo, X.; Li, F.; Song, J. COPPER: An ensemble deep-learning approach for identifying exclusive virus-derived small interfering RNAs in plants. Briefings Funct. Genom. 2023, 22, 274–280. [Google Scholar] [CrossRef]
- Schapke, J.; Tavares, A.; Recamonde-Mendoza, M. EPGAT: Gene Essentiality Prediction With Graph Attention Networks. IEEE-ACM Trans. Comput. Biol. Bioinform. 2022, 19, 1615–1626. [Google Scholar] [CrossRef] [PubMed]
- Liu, Y.; Yu, Y.; Zhao, S. Dual Attention Mechanisms and Feature Fusion Networks Based Method for Predicting LncRNA-Disease Associations. Interdiscip.-Sci.-Comput. Life Sci. 2022, 14, 358–371. [Google Scholar] [CrossRef] [PubMed]
- Song, J.; Tian, S.; Yu, L.; Yang, Q.; Dai, Q.; Wang, Y.; Wu, W.; Duan, X. RLF-LPI: An ensemble learning framework using sequence information for predicting lncRNA-protein interaction based on AE-ResLSTM and fuzzy decision. Math. Biosci. Eng. 2022, 19, 4749–4764. [Google Scholar] [CrossRef]
- Wekesa, J.S.; Meng, J.; Luan, Y. A deep learning model for plant lncRNA-protein interaction prediction with graph attention. Mol. Genet. Genom. 2020, 295, 1091–1102. [Google Scholar] [CrossRef]
- Wu, H.; Pan, X.; Yang, Y.; Shen, H.B. Recognizing binding sites of poorly characterized RNA-binding proteins on circular RNAs using attention Siamese network. Brief. Bioinform. 2021, 22, bbab279. [Google Scholar] [CrossRef] [PubMed]
- Yang, T.H.; Shiue, S.C.; Chen, K.Y.; Tseng, Y.Y.; Wu, W.S. Identifying piRNA targets on mRNAs in C. elegans using a deep multi-head attention network. BMC Bioinform. 2021, 22, 1–23. [Google Scholar] [CrossRef]
- Liu, T.; Zou, B.; He, M.; Hu, Y.; Dou, Y.; Cui, T.; Tan, P.; Li, S.; Rao, S.; Huang, Y.; et al. LncReader: Identification of dual functional long noncoding RNAs using a multi-head self-attention mechanism. Brief. Bioinform. 2022, 24, bbac579. [Google Scholar] [CrossRef]
- Gao, M.; Shang, X. Identification of associations between lncRNA and drug resistance based on deep learning and attention mechanism. Front. Microbiol. 2023, 14, 1147778. [Google Scholar] [CrossRef] [PubMed]
- Yuan, L.; Yang, Y. DeCban: Prediction of circRNA-RBP Interaction Sites by Using Double Embeddings and Cross-Branch Attention Networks. Front. Genet. 2021, 11, 632861. [Google Scholar] [CrossRef]
- Song, Z.; Huang, D.; Song, B.; Chen, K.; Song, Y.; Liu, G.; Su, J.; de Magalhaes, J.P.; Rigden, D.J.; Meng, J. Attention-based multi-label neural networks for integrated prediction and interpretation of twelve widely occurring RNA modifications. Nat. Commun. 2021, 12, 4011. [Google Scholar] [CrossRef]
- Chen, K.; Zhu, X.; Wang, J.; Hao, L.; Liu, Z.; Liu, Y. ncDENSE: A novel computational method based on a deep learning framework for non-coding RNAs family prediction. BMC Bioinform. 2023, 24, 1–20. [Google Scholar] [CrossRef] [PubMed]
- Yang, Y.; Hou, Z.; Ma, Z.; Li, X.; Wong, K.C. iCircRBP-DHN: Identification of circRNA-RBP interaction sites using deep hierarchical network. Brief. Bioinform. 2021, 22, bbaa274. [Google Scholar] [CrossRef]
- Li, G.; Wang, D.; Zhang, Y.; Liang, C.; Xiao, Q.; Luo, J. Using Graph Attention Network and Graph Convolutional Network to Explore Human CircRNA-Disease Associations Based on Multi-Source Data. Front. Genet. 2022, 13, 829937. [Google Scholar] [CrossRef] [PubMed]
- Wang, H.; Han, J.; Li, H.; Duan, L.; Liu, Z.; Cheng, H. CDA-SKAG: Predicting circRNA-disease associations using similarity kernel fusion and an attention-enhancing graph autoencoder. Math. Biosci. Eng. 2023, 20, 7957–7980. [Google Scholar] [CrossRef] [PubMed]
- Li, G.; Lin, Y.; Luo, J.; Xiao, Q.; Liang, C. GGAECDA: Predicting circRNA-disease associations using graph autoencoder based on graph representation learning. Comput. Biol. Chem. 2022, 99, 107722. [Google Scholar] [CrossRef]
- Fan, Y.; Chen, M.; Pan, X. GCRFLDA: Scoring lncRNA-disease associations using graph convolution matrix completion with conditional random field. Brief. Bioinform. 2022, 23, bbab361. [Google Scholar] [CrossRef]
- Sheng, N.; Cui, H.; Zhang, T.; Xuan, P. Attentional multi-level representation encoding based on convolutional and variance autoencoders for lncRNA-disease association prediction. Brief. Bioinform. 2021, 22, bbaa067. [Google Scholar] [CrossRef]
- Niu, M.; Zou, Q.; Lin, C. CRBPDL: Identification of circRNA-RBP interaction sites using an ensemble neural network approach. PLoS Comput. Biol. 2022, 18, e1009798. [Google Scholar] [CrossRef]
- Zhang, Y.; Ye, F.; Gao, X. MCA-Net: Multi-Feature Coding and Attention Convolutional Neural Network for Predicting lncRNA-Disease Association. IEEE-ACM Trans. Comput. Biol. Bioinform. 2022, 19, 2907–2919. [Google Scholar] [CrossRef]
- Liu, Y.; Fu, Q.; Peng, X.; Zhu, C.; Liu, G.; Liu, L. Attention-Based Deep Multiple-Instance Learning for Classifying Circular RNA and Other Long Non-Coding RNA. Genes 2021, 12, 2018. [Google Scholar] [CrossRef]
- Guo, Y.; Lei, X.; Liu, L.; Pan, Y. circ2CBA: Prediction of circRNA-RBP binding sites combining deep learning and attention mechanism. Front. Comput. Sci. 2023, 17, 175904. [Google Scholar] [CrossRef]
- Clauwaert, J.; Menschaert, G.; Waegeman, W. Explainability in transformer models for functional genomics. Brief. Bioinform. 2021, 22, bbab060. [Google Scholar] [CrossRef] [PubMed]
- Feng, P.; Xiao, A.; Fang, M.; Wan, F.; Li, S.; Lang, P.; Zhao, D.; Zeng, J. A machine learning-based framework for modeling transcription elongation. Proc. Natl. Acad. Sci. USA 2021, 118, e2007450118. [Google Scholar] [CrossRef]
- Han, K.; Shen, L.C.; Zhu, Y.H.; Xu, J.; Song, J.; Yu, D.J. MAResNet: Predicting transcription factor binding sites by combining multi-scale bottom-up and top-down attention and residual network. Brief. Bioinform. 2022, 23, bbab445. [Google Scholar] [CrossRef] [PubMed]
- Tao, Y.; Ma, X.; Palmer, D.; Schwartz, R.; Lu, X.; Osmanbeyoglu, H.U. Interpretable deep learning for chromatin-informed inference of transcriptional programs driven by somatic alterations across cancers. Nucleic Acids Res. 2022, 50, 10869–10881. [Google Scholar] [CrossRef] [PubMed]
- Asim, M.N.; Ibrahim, M.A.; Malik, M.I.; Zehe, C.; Cloarec, O.; Trygg, J.; Dengel, A.; Ahmed, S. EL-RMLocNet: An explainable LSTM network for RNA-associated multi-compartment localization prediction. Comput. Struct. Biotechnol. J. 2022, 20, 3986–4002. [Google Scholar] [CrossRef]
- Park, S.; Koh, Y.; Jeon, H.; Kim, H.; Yeo, Y.; Kang, J. Enhancing the interpretability of transcription factor binding site prediction using attention mechanism. Sci. Rep. 2020, 10, 13413. [Google Scholar] [CrossRef]
- Yan, Z.; Lecuyer, E.; Blanchette, M. Prediction of mRNA subcellular localization using deep recurrent neural networks. Bioinformatics 2019, 35, I333–I342. [Google Scholar] [CrossRef]
- Song, J.; Tian, S.; Yu, L.; Xing, Y.; Yang, Q.; Duan, X.; Dai, Q. AC-Caps: Attention Based Capsule Network for Predicting RBP Binding Sites of LncRNA. Interdiscip.-Sci.-Comput. Life Sci. 2020, 12, 414–423. [Google Scholar] [CrossRef]
- Gong, P.; Cheng, L.; Zhang, Z.; Meng, A.; Li, E.; Chen, J.; Zhang, L. Multi-omics integration method based on attention deep learning network for biomedical data classification. Comput. Methods Prog. Biomed. 2023, 231, 107377. [Google Scholar] [CrossRef]
- Kayikci, S.; Khoshgoftaar, T.M. Breast cancer prediction using gated attentive multimodal deep learning. J. Big Data 2023, 10, 1–11. [Google Scholar] [CrossRef]
- Ye, L.; Zhang, Y.; Yang, X.; Shen, F.; Xu, B. An Ovarian Cancer Susceptible Gene Prediction Method Based on Deep Learning Methods. Front. Cell Dev. Biol. 2021, 9, 730475. [Google Scholar] [CrossRef] [PubMed]
- Kang, Q.; Meng, J.; Shi, W.; Luan, Y. Ensemble Deep Learning Based on Multi-level Information Enhancement and Greedy Fuzzy Decision for Plant miRNA-lncRNA Interaction Prediction. Interdiscip.-Sci.-Comput. Life Sci. 2021, 13, 603–614. [Google Scholar] [CrossRef] [PubMed]
- Wang, C.; Lye, X.; Kaalia, R.; Kumar, P.; Rajapakse, J.C. Deep learning and multi-omics approach to predict drug responses in cancer. BMC Bioinform. 2022, 22, 1–15. [Google Scholar] [CrossRef] [PubMed]
- Chan, Y.H.; Wang, C.; Soh, W.K.; Rajapakse, J.C. Combining Neuroimaging and Omics Datasets for Disease Classification Using Graph Neural Networks. Front. Neurosci. 2022, 16, 866666. [Google Scholar] [CrossRef]
- Liu, Q.; He, D.; Xie, L. Prediction of off-target specificity and cell-specific fitness of CRISPR-Cas System using attention boosted deep learning and network-based gene feature. PLoS Comput. Biol. 2019, 15, e1007480. [Google Scholar] [CrossRef]
- Liu, X.; Wang, S.; Ai, D. Predicting CRISPR/Cas9 Repair Outcomes by Attention-Based Deep Learning Framework. Cells 2022, 11, 1847. [Google Scholar] [CrossRef]
- Wan, Y.; Jiang, Z. TransCrispr: Transformer Based Hybrid Model for Predicting CRISPR/Cas9 Single Guide RNA Cleavage Efficiency. IEEE-ACM Trans. Comput. Biol. Bioinform. 2023, 20, 1518–1528. [Google Scholar] [CrossRef]
- Xiao, L.M.; Wan, Y.Q.; Jiang, Z.R. AttCRISPR: A spacetime interpretable model for prediction of sgRNA on-target activity. BMC Bioinform. 2021, 22, 1–17. [Google Scholar] [CrossRef]
- Mathis, N.; Allam, A.; Kissling, L.; Marquart, K.F.; Schmidheini, L.; Solari, C.; Balazs, Z.; Krauthammer, M.; Schwank, G. Predicting prime editing efficiency and product purity by deep learning. Nat. Biotechnol. 2023. [Google Scholar] [CrossRef]
- Zhang, Z.R.; Jiang, Z.R. Effective use of sequence information to predict CRISPR-Cas9 off-target. Comput. Struct. Biotechnol. J. 2022, 20, 650–661. [Google Scholar] [CrossRef] [PubMed]
- Zhang, G.; Zeng, T.; Dai, Z.; Dai, X. Prediction of CRISPR/Cas9 single guide RNA cleavage efficiency and specificity by attention-based convolutional neural networks. Comput. Struct. Biotechnol. J. 2021, 19, 1445–1457. [Google Scholar] [CrossRef] [PubMed]
- Lin, Z.; Ou-Yang, L. Inferring gene regulatory networks from single-cell gene expression data via deep multi-view contrastive learning. Brief. Bioinform. 2022, 24, bbac586. [Google Scholar] [CrossRef] [PubMed]
- Xu, J.; Zhang, A.; Liu, F.; Zhang, X. STGRNS: An interpretable transformer-based method for inferring gene regulatory networks from single-cell transcriptomic data. Bioinformatics 2023, 39, btad165. [Google Scholar] [CrossRef] [PubMed]
- Feng, X.; Fang, F.; Long, H.; Zeng, R.; Yao, Y. Single-cell RNA-seq data analysis using graph autoencoders and graph attention networks. Front. Genet. 2022, 13, 1003711. [Google Scholar] [CrossRef]
- Ullah, F.; Ben-Hur, A. A self-attention model for inferring cooperativity between regulatory features. Nucleic Acids Res. 2021, 49, e77. [Google Scholar] [CrossRef]
- Xie, X.; Wang, Y.; Sheng, N.; Zhang, S.; Cao, Y.; Fu, Y. Predicting miRNA-disease associations based on multi-view information fusion. Front. Genet. 2022, 13, 979815. [Google Scholar] [CrossRef]
- Lee, M. An Ensemble Deep Learning Model with a Gene Attention Mechanism for Estimating the Prognosis of Low-Grade Glioma. Biology 2022, 11, 586. [Google Scholar] [CrossRef]
- Choi, S.R.; Lee, M. Estimating the Prognosis of Low-Grade Glioma with Gene Attention Using Multi-Omics and Multi-Modal Schemes. Biology 2022, 11, 1462. [Google Scholar] [CrossRef]
- Dutta, P.; Patra, A.P.; Saha, S. DeePROG: Deep Attention-Based Model for Diseased Gene Prognosis by Fusing Multi-Omics Data. IEEE-ACM Trans. Comput. Biol. Bioinform. 2022, 19, 2770–2781. [Google Scholar] [CrossRef]
- Xing, X.; Yang, F.; Li, H.; Zhang, J.; Zhao, Y.; Gao, M.; Huang, J.; Yao, J. Multi-level attention graph neural network based on co-expression gene modules for disease diagnosis and prognosis. Bioinformatics 2022, 38, 2178–2186. [Google Scholar] [CrossRef] [PubMed]
- Meng, X.; Wang, X.; Zhang, X.; Zhang, C.; Zhang, Z.; Zhang, K.; Wang, S. A Novel Attention-Mechanism Based Cox Survival Model by Exploiting Pan-Cancer Empirical Genomic Information. Cells 2022, 11, 1421. [Google Scholar] [CrossRef] [PubMed]
- Feng, C.; Xiang, T.; Yi, Z.; Meng, X.; Chu, X.; Huang, G.; Zhao, X.; Chen, F.; Xiong, B.; Feng, J. A Deep-Learning Model With the Attention Mechanism Could Rigorously Predict Survivals in Neuroblastoma. Front. Oncol. 2021, 11, 653863. [Google Scholar] [CrossRef]
- Gokhale, M.; Mohanty, S.K.; Ojha, A. GeneViT: Gene Vision Transformer with Improved DeepInsight for cancer classification. Comput. Biol. Med. 2023, 155, 106643. [Google Scholar] [CrossRef]
- Beykikhoshk, A.; Quinn, T.P.; Lee, S.C.; Tran, T.; Venkatesh, S. DeepTRIAGE: Interpretable and individualised biomarker scores using attention mechanism for the classification of breast cancer sub-types. BMC Med. Genom. 2020, 13, 1–10. [Google Scholar] [CrossRef]
- Manica, M.; Oskooei, A.; Born, J.; Subramanian, V.; Saez-Rodriguez, J.; Martinez, M.R. Toward Explainable Anticancer Compound Sensitivity Prediction via Multimodal Attention-Based Convolutional Encoders. Mol. Pharm. 2019, 16, 4797–4806. [Google Scholar] [CrossRef] [PubMed]
- Lee, S.; Lim, S.; Lee, T.; Sung, I.; Kim, S. Cancer subtype classification and modeling by pathway attention and propagation. Bioinformatics 2020, 36, 3818–3824. [Google Scholar] [CrossRef] [PubMed]
- Hou, Z.; Yang, Y.; Li, H.; Wong, K.C.; Li, X. iDeepSubMito: Identification of protein submitochondrial localization with deep learning. Brief. Bioinform. 2021, 22, bbab288. [Google Scholar] [CrossRef]
- Gong, H.; Wen, J.; Luo, R.; Feng, Y.; Guo, J.; Fu, H.; Zhou, X. Integrated mRNA sequence optimization using deep learning. Brief. Bioinform. 2023, 24, bbad001. [Google Scholar] [CrossRef]
- Armenteros, J.J.A.; Salvatore, M.; Emanuelsson, O.; Winther, O.; von Heijne, G.; Elofsson, A.; Nielsen, H. Detecting sequence signals in targeting peptides using deep learning. Life Sci. Alliance 2019, 2, e201900429. [Google Scholar] [CrossRef]
- Littmann, M.; Heinzinger, M.; Dallago, C.; Weissenow, K.; Rost, B. Protein embeddings and deep learning predict binding residues for various ligand classes. Sci. Rep. 2021, 11, 23916. [Google Scholar] [CrossRef] [PubMed]
- Song, T.; Dai, H.; Wang, S.; Wang, G.; Zhang, X.; Zhang, Y.; Jiao, L. TransCluster: A Cell-Type Identification Method for single-cell RNA-Seq data using deep learning based on transformer. Front. Genet. 2022, 13, 1038919. [Google Scholar] [CrossRef] [PubMed]
- Feng, X.; Zhang, H.; Lin, H.; Long, H. Single-cell RNA-seq data analysis based on directed graph neural network. Methods 2023, 211, 48–60. [Google Scholar] [CrossRef] [PubMed]
- Buterez, D.; Bica, I.; Tariq, I.; Andres-Terre, H.; Lio, P. CellVGAE: An unsupervised scRNA-seq analysis workflow with graph attention networks. Bioinformatics 2022, 38, 1277–1286. [Google Scholar] [CrossRef]
- Zhang, Y.; Blanchette, M. Reference panel guided topological structure annotation of Hi-C data. Nat. Commun. 2022, 13, 7426. [Google Scholar] [CrossRef] [PubMed]
- Schwarz, K.; Allam, A.; Gonzalez, N.A.P.; Krauthammer, M. AttentionDDI: Siamese attention-based deep learning method for drug-drug interaction predictions. BMC Bioinform. 2021, 22, 1–19. [Google Scholar] [CrossRef]
- Kim, E.; Nam, H. DeSIDE-DDI: Interpretable prediction of drug-drug interactions using drug-induced gene expressions. J. Cheminform. 2022, 14, 1–12. [Google Scholar] [CrossRef]
- Liu, Q.; Xie, L. TranSynergy: Mechanism-driven interpretable deep neural network for the synergistic prediction and pathway deconvolution of drug combinations. PLoS Comput. Biol. 2021, 17, e1008653. [Google Scholar] [CrossRef] [PubMed]
- Wang, J.; Liu, X.; Shen, S.; Deng, L.; Liu, H. DeepDDS: Deep graph neural network with attention mechanism to predict synergistic drug combinations. Brief. Bioinform. 2022, 23, bbab390. [Google Scholar] [CrossRef]
- Yu, G.; Zeng, J.; Wang, J.; Zhang, H.; Zhang, X.; Guo, M. Imbalance deep multi-instance learning for predicting isoform-isoform interactions. Int. J. Intell. Syst. 2021, 36, 2797–2824. [Google Scholar] [CrossRef]
- Yamaguchi, H.; Saito, Y. Evotuning protocols for Transformer-based variant effect prediction on multi-domain proteins. Brief. Bioinform. 2021, 22, bbab234. [Google Scholar] [CrossRef]
- Zhou, J.; Chen, Q.; Braun, P.R.; Mandell, K.A.P.; Jaffe, A.E.; Tan, H.Y.; Hyde, T.M.; Kleinman, J.E.; Potash, J.B.; Shinozaki, G.; et al. Deep learning predicts DNA methylation regulatory variants in the human brain and elucidates the genetics of psychiatric disorders. Proc. Natl. Acad. Sci. USA 2022, 119, e2206069119. [Google Scholar] [CrossRef]
- Cao, L.; Zhang, Q.; Song, H.; Lin, K.; Pang, E. DeepASmRNA: Reference-free prediction of alternative splicing events with a scalable and interpretable deep learning model. iScience 2022, 25, 105345. [Google Scholar] [CrossRef] [PubMed]
- Gupta, S.; Shankar, R. miWords: Transformer-based composite deep learning for highly accurate discovery of pre-miRNA regions across plant genomes. Brief. Bioinform. 2023, 24, bbad088. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Z.Y.; Ning, L.; Ye, X.; Yang, Y.H.; Futamura, Y.; Sakurai, T.; Lin, H. iLoc-miRNA: Extracellular/intracellular miRNA prediction using deep BiLSTM with attention mechanism. Brief. Bioinform. 2022, 23, bbac395. [Google Scholar] [CrossRef] [PubMed]
- Choi, J.M.; Chae, H. moBRCA-net: A breast cancer subtype classification framework based on multi-omics attention neural networks. BMC Bioinform. 2023, 24, 1–15. [Google Scholar] [CrossRef]
- Yin, C.; Chen, Z. Developing Sustainable Classification of Diseases via Deep Learning and Semi-Supervised Learning. Healthcare 2020, 8, 291. [Google Scholar] [CrossRef]
- Song, H.; Yin, C.; Li, Z.; Feng, K.; Cao, Y.; Gu, Y.; Sun, H. Identification of Cancer Driver Genes by Integrating Multiomics Data with Graph Neural Networks. Metabolites 2023, 13, 339. [Google Scholar] [CrossRef]
- Song, J.T.; Woo, D.U.; Lee, Y.; Choi, S.H.; Kang, Y.J. The Semi-Supervised Strategy of Machine Learning on the Gene Family Diversity to Unravel Resveratrol Synthesis. Plants 2021, 10, 2058. [Google Scholar] [CrossRef]
- Munoz, S.A.; Park, J.; Stewart, C.M.; Martin, A.M.; Hedengren, J.D. Deep Transfer Learning for Approximate Model Predictive Control. Processes 2023, 11, 197. [Google Scholar] [CrossRef]
- Dastour, H.; Hassan, Q.K. A Comparison of Deep Transfer Learning Methods for Land Use and Land Cover Classification. Sustainability 2023, 15, 7854. [Google Scholar] [CrossRef]
- Yang, L.; Huang, R.; Zhang, J.; Huang, J.; Wang, L.; Dong, J.; Shao, J. Inter-Continental Transfer of Pre-Trained Deep Learning Rice Mapping Model and Its Generalization Ability. Remote Sens. 2023, 15, 2443. [Google Scholar] [CrossRef]
Research Topic | Studies |
---|---|
Sequence and Site Prediction | Raad et al. [34], Shen et al. [35], Li et al. [36], Ma et al. [37], Zeng et al. [38], Mai et al. [39], Song et al. [40], Tang et al. [41], Du et al. [42], Pan et al. [43], Zhuang et al. [44], Huang et al. [45], Guan et al. [46], Li et al. [47], Liu et al. [48], Du et al. [49], Wenjing et al. [50], Cao et al. [51], He et al. [52], Shen et al. [53], Zhang et al. [54], Jiang et al. [55], Bhukya et al. [56], Muneer et al. [57], Wekesa et al. [58], Liang et al. [59], Zhang et al. [60], Xie et al. [61], Fan et al. [62], Tsukiyama et al. [63], Gao et al. [64], Ullah et al. [65], Guo et al. [66], Wang et al. [67], Sun et al. [68] |
Gene Expression and Phenotype Prediction | Zhang et al. [69], Lee et al. [70], Chen et al. [71], Kang et al. [72], Liao et al. [73], Angenent-Mari et al. [74], Zuo et al. [75], Karbalayghareh et al. [76], Pham et al. [77], Dominic et al. [78], Lee et al. [79], Li et al. [80], Bu et al. [81], Schapke et al. [82] |
ncRNA and circRNA Studies | Liu et al. [83], Song et al. [84], Wekesa et al. [85], Wu et al. [86], Yang et al. [87], Liu et al. [88], Gao and Shang [89], Yuan and Yang [90], Song et al. [91], Chen et al. [92], Yang et al. [93], Li et al. [94], Wang et al. [95], Li et al. [96], Fan et al. [97], Sheng et al. [98], Niu et al. [99], Zhang et al. [100], Liu et al. [101], Guo et al. [102] |
Transcription Process Insights | Clauwaert et al. [103], Feng et al. [104], Han et al. [105], Tao et al. [106], Asim et al. [107], Park et al. [108], Yan et al. [109], Song et al. [110] |
Multi-Omics/Modal Tasks | Gong et al. [111], Kayikci and Khoshgoftaar [112], Ye et al. [113], Kang et al. [114], Wang et al. [115], Chan et al. [116] |
CRISPR Efficacy and Outcome Prediction | Liu et al. [117], Liu et al. [118], Wan and Jiang [119], Xiao et al. [120], Mathis et al. [121], Zhang et al. [122], Zhang et al. [123] |
Gene Regulatory Network Inference | Lin and Ou-Yang [124], Xu et al. [125], Feng et al. [126], Ullah and Ben-Hur [127], Xie et al. [128] |
Disease Prognosis Estimation | Lee [129], Choi and Lee [130], Dutta et al. [131], Xing et al. [132], Meng et al. [133], Feng et al. [134] |
Gene Expression-Based Classification | Gokhale et al. [135], Beykikhoshk et al. [136], Manica et al. [137], Lee et al. [138] |
Proteomics | Hou et al. [139], Gong et al. [140], Armenteros et al. [141], Littmann et al. [142] |
Cell-Type Identification | Song et al. [143], Feng et al. [144], Buterez et al. [145], Zhang et al. [146] |
Predicting Drug–Drug Interactions | Schwarz et al. [147], Kim et al. [148], Liu and Xie [149], Wang et al. [150] |
Other Topics | Yu et al. [151], Yamaguchi and Saito [152], Zhou et al. [153], Cao et al. [154], Gupta and Shankar [155], Zhang et al. [156], Choi and Chae [157] |
Journal | Counts | Percentage (%) |
---|---|---|
Briefings in Bioinformatics | 20 | 16.1 |
Bioinformatics | 9 | 7.3 |
BMC Bioinformatics | 9 | 7.3 |
Frontiers in Genetics | 9 | 7.3 |
IEEE/ACM Transactions on Computational Biology and Bioinformatics | 7 | 5.6 |
PLOS Computational Biology | 4 | 3.2 |
Nature Communications | 4 | 3.2 |
Interdisciplinary Sciences: Computational Life Sciences | 4 | 3.2 |
Computational and Structural Biotechnology Journal | 3 | 2.4 |
Scientific Reports | 3 | 2.4 |
Biology-Basel | 2 | 1.6 |
Mathematical Biosciences and Engineering | 2 | 1.6 |
Frontiers in Oncology | 2 | 1.6 |
Computational Biology and Chemistry | 2 | 1.6 |
Proceedings of the National Academy of Sciences of the United States of America | 2 | 1.6 |
Methods | 2 | 1.6 |
Nucleic Acids Research | 2 | 1.6 |
Cells | 2 | 1.6 |
Frontiers in Cell and Developmental Biology | 2 | 1.6 |
Others (<2 Publications) | 34 | 27.4 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Citation: Choi, S.R.; Lee, M. Transformer Architecture and Attention Mechanisms in Genome Data Analysis: A Comprehensive Review. Biology 2023, 12, 1033. https://doi.org/10.3390/biology12071033