Search Results (86)

Search Parameters:
Keywords = forensic process validation

18 pages, 874 KB  
Review
Advances in Age Estimation Using Facial Sutures: Current Status, Challenges, and Future Perspectives
by Siriwat Thunyacharoen, Phruksachat Singsuwan, Chirapat Inchai and Pasuk Mahakkanukrauh
Appl. Sci. 2026, 16(8), 3698; https://doi.org/10.3390/app16083698 - 9 Apr 2026
Viewed by 470
Abstract
Forensic age estimation is a fundamental component of biological profiling for unidentified skeletal remains, particularly in mass casualty incidents where specimens are frequently fragmented or incomplete. This review evaluates the diagnostic utility of craniofacial suture closure—specifically across four facial regions—as a non-invasive methodology for age determination in adults. By analyzing the predictable fusion patterns of ectocranial and endocranial sutures, forensic practitioners can derive approximate age ranges when postcranial indicators are absent or unreliable. Despite its utility, the reliability of suture-based estimation remains a subject of academic debate. The rate of closure is influenced by a complex interplay of environmental and biological factors, including nutritional status, hormonal influences, and mechanical loading. Historically, the method has faced criticism due to significant inter-individual variability and limited sample sizes in cadaveric studies. To improve precision and capture novel detail, this review explores the integration of emerging technologies such as artificial intelligence (AI) and machine learning (ML). These tools can process extensive cranial datasets to identify subtle morphological patterns that may elude human observation. While craniofacial suture analysis remains an essential resource in the forensic toolkit, its accuracy is contingent upon accounting for multiple interacting biological and environmental factors. The authors emphasize the necessity for further external validation across diverse global populations to ensure the generalizability and refinement of the technique in forensic medicine and osteology. Full article

25 pages, 1873 KB  
Article
An Empirical Assessment of Digital Forensic Process Reliability Using Integrated ISO/IEC 27037 and 27041 Standards
by Zlatan Morić, Vedran Dakić and Ivana Ogrizek Biškupić
J. Cybersecur. Priv. 2026, 6(2), 57; https://doi.org/10.3390/jcp6020057 - 30 Mar 2026
Viewed by 1322
Abstract
The escalating scale and complexity of cybercrime necessitate standardized digital forensic protocols to ensure the integrity and admissibility of digital evidence. This study empirically assesses the use of ISO/IEC 27037 and ISO/IEC 27041 through three real-world digital forensic case studies conducted in organizational settings. A multi-case methodology was employed, encompassing a multinational corporate criminal investigation, an internal employee misbehaviour probe, and an examination into mobile- and cloud-based data leaks. The effect of synchronized standard implementation was evaluated using audit-based and quantitative indicators that measure forensic process quality as a system attribute. The findings demonstrate that the systematic implementation of ISO/IEC 27037 and ISO/IEC 27041 improves investigative traceability, documentation quality, and evidentiary robustness. In the multinational corporate case study, documentation completeness increased by 18%, and all digital evidence was deemed admissible in judicial proceedings, surpassing the institutional baseline admissibility rate of 82%. In other instances, evidence gathered within the same framework was acknowledged in organizational or disciplinary review processes, resulting in similar enhancements in documentation quality and procedural consistency, notwithstanding technological and organizational limitations. The paper develops and empirically substantiates an integrated procedural validation model that connects evidence-handling practices with method and instrument validation. The results indicate that the synchronized implementation of ISO/IEC forensic standards improves the transparency, dependability, and auditability of digital forensic investigations. Full article
(This article belongs to the Section Security Engineering & Applications)
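The audit-based "documentation completeness" indicator reported above can be pictured as a simple field-coverage ratio. The sketch below is an illustrative assumption: the field names and the metric's exact definition are mine, not the study's actual instrument.

```python
def documentation_completeness(required_fields, case_records):
    """Fraction of required chain-of-custody fields that are filled in
    across all evidence records of a case (a hypothetical audit
    indicator; the paper's actual metrics are not specified here)."""
    total = len(required_fields) * len(case_records)
    if total == 0:
        return 0.0
    present = sum(
        1
        for rec in case_records
        for f in required_fields
        if rec.get(f) not in (None, "")
    )
    return present / total

# Hypothetical example: two evidence records audited against four
# ISO/IEC 27037-style fields.
fields = ["identified_by", "collected_at", "acquisition_hash", "preservation_log"]
records = [
    {"identified_by": "A", "collected_at": "2024-01-02",
     "acquisition_hash": "ab12", "preservation_log": "sealed"},
    {"identified_by": "B", "collected_at": "2024-01-02",
     "acquisition_hash": "", "preservation_log": "sealed"},
]
print(round(documentation_completeness(fields, records), 3))  # 7 of 8 fields -> 0.875
```

A completeness score tracked per case before and after standard adoption would yield the kind of percentage improvement the study reports.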

18 pages, 536 KB  
Review
Molecular Age Estimation: Current Perspectives and Future Considerations
by Muriel Tahtouh Zaatar, Rashed Alghafri, Rima Othman, Amira Ahmed, Mounir Alfahel, Mohammed Alhashimi, Mahmod Alsabagh, Aryaman Dayal, Shamma Kamal, Hiba Khamis, Talal Mansour, Lali Rhayem and Khaled Zeidan
Int. J. Mol. Sci. 2026, 27(7), 3104; https://doi.org/10.3390/ijms27073104 - 29 Mar 2026
Viewed by 1145
Abstract
Age estimation is an important component of forensic investigation, with applications in criminal casework, immigration assessments, and disaster victim identification. Determining whether an individual is a minor or an adult, or estimating the age at death of unidentified remains, can have significant legal and humanitarian implications. Traditional forensic age estimation methods rely primarily on anthropological and radiological assessment of skeletal development and degeneration; however, these approaches may be limited by subjectivity, population-specific reference standards, and reduced precision in adult age estimation. In recent years, molecular biomarkers have emerged as promising complementary tools for age prediction. Molecular approaches, including DNA methylation profiling, Y-chromosome-associated markers, RNA-based biomarkers, mitochondrial DNA alterations, proteomic signatures, and telomere length analysis, reflect biological processes associated with aging and may provide objective indicators that can be measured from biological samples. Among these methods, DNA methylation-based models currently demonstrate the strongest predictive performance and represent the most extensively studied molecular strategy for forensic age estimation. Nevertheless, several challenges remain before widespread forensic implementation can be achieved, including tissue specificity, environmental influences on biomarker stability, population variability, and the need for robust validation across laboratories and forensic sample types. This review summarises the current molecular approaches investigated for forensic age estimation, evaluates their biological basis and methodological limitations, and discusses their potential integration into forensic workflows. 
While molecular techniques offer promising avenues for improving age estimation, further standardisation, validation, and careful interpretation are required before they can be routinely applied in forensic practice. Full article

18 pages, 1239 KB  
Article
Bone Marrow as a Source of DNA in Forensic Genetics: An Optimized Nucleic Acids Extraction Protocol
by Mattia Porcu, Noemi Argirò, Venusia Cortellini, Antonio De Luca, Camilla Tettamanti, Lorenzo Franceschetti, Francesco Ventura and Andrea Verzeletti
Genes 2026, 17(3), 332; https://doi.org/10.3390/genes17030332 - 18 Mar 2026
Viewed by 643
Abstract
Background: low-quantity or degraded samples are often studied in forensic genetics. Therefore, it is important to efficiently obtain all the available DNA from the biological sample analyzed to provide the most reliable results. This is particularly challenging in bone marrow processing due to its hydrophobic molecular structure, as for other lipid-rich tissues, especially if rancid. In fact, during adipose tissue decomposition, the putrefaction of fatty acids can in some instances give a compact cerous consistency to the lipidic tissue, hardly susceptible to the nucleic acid extraction mechanisms. According to environmental circumstances, this condition is notably observable in submerged bodies or in putrefied bone marrow. Thus, this study is focused on developing an optimized nucleic acids extraction protocol for putrefied bone marrow. Methods: genetic analyses were performed on putrefied yellow bone marrow collected from 20 human femora recovered from bodies in different decomposition stages. The optimized method was developed by integrating additional steps, reagents and time intervals on a silica-based column commercial kit. This strategy was compared in DNA yield to a standard extraction protocol, represented by the same commercial kit, but following the manufacturer’s directions. Both these strategies were tested in nucleic acid isolation efficiency by performing DNA typing, including real-time PCR quantification, Short Tandem Repeats (STR) amplification and fragments analysis steps. The analytical parameters evaluated were allele count, DNA concentration (ng/µL) and Degradation Index (DI). Results: for allele count and DNA concentration parameters, the optimized protocol showed clear and significant qualitative and quantitative improvements compared with the standard protocol, supporting its potential applicability in forensic casework and laying the foundation for future studies. 
Conclusions: subject to appropriate internal laboratory validation, the optimized protocol can be used for processing tough lipid-rich tissues without the need to purchase a dedicated system, using the same commercial kit routinely adopted for other forensic genetics matrices. Full article
(This article belongs to the Special Issue Advances and Challenges in Forensic Genetics)
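The Degradation Index (DI) evaluated above is conventionally reported by forensic qPCR kits as the ratio of a small-amplicon to a large-amplicon autosomal target concentration. A minimal sketch, with illustrative values and thresholds that are not the study's data:

```python
def degradation_index(short_target_ng_ul, long_target_ng_ul):
    """Degradation Index as commonly defined in forensic qPCR:
    small-amplicon / large-amplicon autosomal concentration.
    DI near 1 suggests intact DNA; values well above 1 indicate
    degradation (illustrative interpretation, not the paper's)."""
    if long_target_ng_ul <= 0:
        raise ValueError("long-target concentration must be positive")
    return short_target_ng_ul / long_target_ng_ul

# Putrefied bone marrow typically yields elevated DI values:
intact = degradation_index(0.50, 0.48)    # ~1.04 -> largely intact
degraded = degradation_index(0.50, 0.05)  # 10.0  -> heavily degraded
print(round(intact, 2), degraded)
```

Comparing DI alongside allele count and concentration, as the authors do, separates extraction yield from template quality.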

18 pages, 444 KB  
Review
Autosomal STR Markers for Forensic Genetics: Applications, Challenges, and Future Directions
by Irena Zupanič Pajnič
Genes 2026, 17(3), 285; https://doi.org/10.3390/genes17030285 - 27 Feb 2026
Viewed by 1383
Abstract
Autosomal short tandem repeat (STR) markers remain the cornerstone of modern forensic genetics, providing exceptional power for individualization, kinship verification, and reconstruction of complex investigative cases. Over the last decade, the field has undergone a major technological transition from length-based capillary electrophoresis (CE) toward sequence-level characterization using massively parallel sequencing (MPS), enabling detection of internal sequence variants (isoalleles) and flanking-region polymorphisms that substantially increase discriminatory power in many forensic contexts. Although MPS is increasingly adopted in forensic laboratories, implementation remains dependent on infrastructure, cost considerations, validation requirements, and jurisdiction-specific legal frameworks. This review synthesizes the molecular mechanisms underlying STR variability, including replication slippage and mutation processes, and critically evaluates the transition to sequencing-based analysis. Particular attention is given to analytical challenges such as stochastic effects in ultra-low-template DNA and PCR inhibition in degraded samples. Special emphasis is placed on identification of skeletal remains from mass graves and historical contexts, where hierarchical analytical strategies—from mini-STR approaches to MPS-based workflows—enable recovery of highly fragmented DNA. The review also examines the evolution of probabilistic genotyping (PG), highlighting the importance of algorithmic transparency and reproducible analytical frameworks for judicial applications. By integrating technological advances with practical forensic challenges, this review outlines a comprehensive framework for implementing high-resolution STR analysis in contemporary genomic casework. As a narrative synthesis, the conclusions reflect currently available published evidence and acknowledge variability in validation status, implementation practices, and regional forensic infrastructures. 
Full article
(This article belongs to the Special Issue Forensic DNA Profiling: PCR Techniques and Innovations)
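The CE-to-MPS transition described above hinges on isoalleles: alleles of identical length that differ in internal sequence, which length-based capillary electrophoresis cannot separate. A toy illustration with a hypothetical tetranucleotide locus (the sequences are invented for the example):

```python
def ce_allele(seq, repeat_unit="TCTA"):
    """Length-based (CE-style) allele call: the repeat count is inferred
    from fragment length alone, blind to internal sequence."""
    return len(seq) // len(repeat_unit)

# Two hypothetical alleles: identical length, different internal
# sequence (an isoallele) -- same CE call, distinguishable only by MPS.
allele_a = "TCTA" * 11
allele_b = "TCTA" * 5 + "TCTG" + "TCTA" * 5

print(ce_allele(allele_a), ce_allele(allele_b))  # same length-based call
print(allele_a == allele_b)                      # sequence-level difference
```

Counting such sequence variants as distinct alleles is what raises the discriminatory power of MPS-based STR typing.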

47 pages, 6821 KB  
Article
Prediction and Validation of Phase II Glucuronide Conjugates in Urine Using Combined Non-Targeted and Targeted LC–HRMS/MS Workflows and Their Validation for over 200 Drugs
by Camila Bardy, Luis Manuel Menéndez-Quintanal, Gemma Montalvo, Carmen García-Ruiz, Begoña Bravo Serrano and Jose Manuel Matey
Analytica 2026, 7(1), 18; https://doi.org/10.3390/analytica7010018 - 26 Feb 2026
Viewed by 1312
Abstract
High-resolution mass spectrometry (HRMS) enables non-targeted detection of drugs and metabolites in complex matrices. Phase II metabolites—especially glucuronides—are often the only detectable biomarkers in late or postmortem samples but are underrepresented in commercial libraries. This work pursued the prediction of phase II glucuronide conjugates in diluted urine samples by non-targeted/targeted LC-HRMS workflows. A simple “dilute-and-shoot” qualitative UHPLC-HRMS/MS method (Q Exactive HF, ddMS2) was integrated with Compound Discoverer® software for data processing. The workflow incorporated predictive strategies such as exact mass suspect lists, Structured Query Language (SQL)-based filters, compound-class and diagnostic neutral-loss rules (including the characteristic loss of 176.0321 Da for glucuronides) and MS/MS confirmation using both in-house and public spectral libraries. An additional part of the application’s performance assessment involved its validation for diluted urine samples. A qualitative method for more than two hundred drugs in urine samples was validated, covering the method’s selectivity/specificity, limit of identification, matrix effects, and potential carryover. Most analytes fulfilled the qualitative acceptance criteria, with more than 60% successfully identified at a concentration of at least 2.5 ng/mL. Matrix effects were within acceptable limits for most compounds, and no severe ion suppression was observed. A non-targeted workflow was applied to real forensic samples (n = 16), allowing a reduction of approximately 66,800 detected features to 225 glucuronide candidates, while a targeted workflow based on exact mass lists yielded 31 high-confidence identifications. Characteristic neutral losses and diagnostic fragment ions led to the tentative identification of some glucuronide phase II metabolites such as mirtazapine–glucuronide, morphine-6–glucuronide, and glucuronide conjugates of benzodiazepines and synthetic opioids. 
In conclusion, the integration of biotransformation knowledge with HRMS-based predictive filtering allows for the efficient and hydrolysis-free detection of glucuronide metabolites, thereby extending detection windows and enhancing toxicological interpretation in complex forensic scenarios. This adaptable and library-independent workflow also facilitates retrospective data mining, making it suitable for the identification of emerging substances and newly characterized metabolites. Full article
(This article belongs to the Special Issue New Analytical Techniques and Methods in Pharmaceutical Science)
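The diagnostic neutral-loss rule cited above (loss of 176.0321 Da, the glucuronic acid moiety) can be sketched as a precursor/fragment mass-difference check. The example m/z values approximate a morphine-glucuronide pair and are illustrative, not taken from the paper's data; singly charged ions are assumed.

```python
GLUCURONIDE_LOSS_DA = 176.0321  # C6H8O6, the characteristic glucuronide loss

def is_glucuronide_pair(precursor_mz, fragment_mz, tol_ppm=10.0):
    """Flag a precursor/fragment pair whose mass difference matches the
    glucuronide neutral loss within a ppm tolerance (a simplified sketch
    of the diagnostic neutral-loss filter, assuming charge 1)."""
    observed_loss = precursor_mz - fragment_mz
    tol_da = tol_ppm * 1e-6 * precursor_mz
    return abs(observed_loss - GLUCURONIDE_LOSS_DA) <= tol_da

# Approximate morphine-glucuronide [M+H]+ -> morphine aglycone fragment:
print(is_glucuronide_pair(462.1759, 286.1438))
# An unrelated pair fails the filter:
print(is_glucuronide_pair(300.2000, 150.1000))
```

Applied across all ddMS2 precursor/fragment pairs, a rule like this is what narrows tens of thousands of features down to a short glucuronide candidate list.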

8 pages, 474 KB  
Article
Selection and Validation of Endogenous Reference microRNAs for Post-Mortem Interval Estimation in Vitreous Humor: A Preliminary Study
by Julia Lazzari, Andrea Scatena, Marco Di Paolo and Anna Rocchi
Int. J. Mol. Sci. 2026, 27(5), 2102; https://doi.org/10.3390/ijms27052102 - 24 Feb 2026
Viewed by 415
Abstract
Estimating the post-mortem interval (PMI) using microRNAs (miRNAs) in vitreous humor (VH) is a promising technique in forensic pathology. However, the reliability of quantitative Real-Time PCR (qPCR) data in this matrix is currently constrained by a critical methodological challenge: the lack of a rigorously validated endogenous reference gene (normalizer) capable of correcting for non-biological variations without being influenced by decomposition. This study aimed to identify a robust reference gene for VH analysis by performing a comparative validation of two candidates proposed in the literature: miR-222-3p and miR-96-5p. VH samples were collected from 47 forensic autopsy cases with estimated PMIs ranging from 3 to 24 h. The validation process assessed three key parameters: amplification detectability, expression stability (Coefficient of Variation, CV), and statistical independence from both the PMI and the pre-analytical freezing interval using regression models. MiR-222-3p was rejected as a normalizer due to poor detectability, failing to reach the detection threshold (Cq < 35) in 61.7% of cases (29/47). Conversely, hsa-miR-96-5p was validated as a stable reference gene. It demonstrated high detectability and expression stability (CV = 9.07%) among valid samples. Crucially, linear regression analysis showed no significant correlation between hsa-miR-96-5p levels and either the PMI (p = 0.69) or the pre-freezing time (p = 0.70). This study demonstrates that miR-222-3p is unsuitable for forensic casework in VH due to instability. We identified and validated hsa-miR-96-5p as a robust endogenous reference gene. Its adoption is recommended to standardize future molecular thanatochronology studies and improve the accuracy of PMI estimation models. Full article
(This article belongs to the Section Molecular Biology)
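Two of the validation parameters above, detectability against the Cq < 35 threshold and expression stability as a coefficient of variation, are simple computations over the Cq series. The Cq values below are invented for illustration, not the study's measurements.

```python
import math

def cv_percent(cq_values):
    """Coefficient of variation (%) of Cq values (population SD / mean)."""
    mean = sum(cq_values) / len(cq_values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in cq_values) / len(cq_values))
    return 100.0 * sd / mean

def detectable_fraction(cq_values, threshold=35.0):
    """Share of samples amplifying below the detection threshold."""
    return sum(cq < threshold for cq in cq_values) / len(cq_values)

# Illustrative data: a stable candidate amplifies in every sample with
# low dispersion; an unstable one frequently misses the threshold.
stable = [24.1, 24.8, 25.0, 24.4, 24.7, 25.3]
unstable = [33.9, 36.5, 37.2, 34.8, 38.0, 36.1]
print(round(cv_percent(stable), 2), detectable_fraction(unstable))
```

The remaining criterion, independence from PMI and freezing time, is the regression step: a valid normalizer should show no significant slope against either covariate.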

33 pages, 745 KB  
Article
XAI-Driven Malware Detection from Memory Artifacts: An Alert-Driven AI Framework with TabNet and Ensemble Classification
by Aristeidis Mystakidis, Grigorios Kalogiannnis, Nikolaos Vakakis, Nikolaos Altanis, Konstantina Milousi, Iason Somarakis, Gabriela Mihalachi, Mariana S. Mazi, Dimitris Sotos, Antonis Voulgaridis, Christos Tjortjis, Konstantinos Votis and Dimitrios Tzovaras
AI 2026, 7(2), 66; https://doi.org/10.3390/ai7020066 - 10 Feb 2026
Viewed by 1882
Abstract
Modern malware presents significant challenges to traditional detection methods, often leveraging fileless techniques, in-memory execution, and process injection to evade antivirus and signature-based systems. To address these challenges, alert-driven memory forensics has emerged as a critical capability for uncovering stealthy, persistent, and zero-day threats. This study presents a two-stage host-based malware detection framework that integrates memory forensics, explainable machine learning, and ensemble classification, designed as a post-alert asynchronous SOC workflow balancing forensic depth and operational efficiency. Utilizing the MemMal-D2024 dataset—comprising rich memory forensic artifacts from Windows systems infected with malware samples whose creation metadata spans 2006–2021—the system performs malware detection using features extracted from volatile memory. In the first stage, an Attentive and Interpretable Learning for structured Tabular data (TabNet) model is used for binary classification (benign vs. malware), leveraging its sequential attention mechanism and built-in explainability. In the second stage, a Voting Classifier ensemble, composed of Light Gradient Boosting Machine (LGBM), eXtreme Gradient Boosting (XGB), and Histogram Gradient Boosting (HGB) models, is used to identify the specific malware family (Trojan, Ransomware, Spyware). To reduce memory dump extraction and analysis time without compromising detection performance, only a curated subset of 24 memory features—operationally selected to reduce acquisition/extraction time and validated via redundancy inspection, model explainability (SHAP/TabNet), and training data correlation analysis—was used during training and runtime, identifying the best trade-off between memory analysis and detection accuracy. 
The pipeline, which is triggered by host-based Wazuh Security Information and Event Management (SIEM) alerts, achieved 99.97% accuracy in binary detection and 70.17% multiclass accuracy, for an overall performance of 87.02%, while providing both global and local explainability to ensure operational transparency and forensic interpretability. This approach provides an efficient and interpretable detection solution, used in combination with conventional security tools as an extra layer of defense, suitable for modern threat landscapes. Full article
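The two-stage control flow described above (a binary screen, then a majority vote over family classifiers) can be sketched as follows. The stub models and the feature names are placeholders standing in for the trained TabNet and LGBM/XGB/HGB models, not the paper's actual pipeline.

```python
def two_stage_detect(features, binary_model, family_models):
    """Post-alert two-stage sketch: gate on a binary benign/malware
    screen, then take a hard majority vote over family classifiers."""
    if not binary_model(features):
        return "benign"
    votes = [model(features) for model in family_models]
    return max(set(votes), key=votes.count)

# Stub models keyed on a hypothetical injected-thread-count feature.
binary = lambda f: f["injected_threads"] > 0
family_ensemble = [
    lambda f: "Trojan" if f["injected_threads"] > 2 else "Spyware",
    lambda f: "Trojan",
    lambda f: "Ransomware" if f.get("files_encrypted", 0) > 100 else "Trojan",
]
print(two_stage_detect({"injected_threads": 3}, binary, family_ensemble))
print(two_stage_detect({"injected_threads": 0}, binary, family_ensemble))
```

Gating the expensive family-level vote behind the cheap binary screen mirrors the paper's stated aim of balancing forensic depth against SOC response time.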

27 pages, 2292 KB  
Article
Source Camera Identification via Explicit Content–Fingerprint Decoupling with a Dual-Branch Deep Learning Framework
by Zijuan Han, Yang Yang, Jiaxuan Lu, Jian Sun, Yunxia Liu and Ngai-Fong Bonnie Law
Appl. Sci. 2026, 16(3), 1245; https://doi.org/10.3390/app16031245 - 26 Jan 2026
Viewed by 534
Abstract
In this paper, we propose a source camera identification method based on disentangled feature modeling, aiming to achieve robust extraction of camera fingerprint features under complex imaging and post-processing conditions. To address the severe coupling between image content and camera fingerprint features in existing methods, which makes content interference difficult to suppress, we develop a dual-branch deep learning framework guided by imaging physics. By introducing physical consistency constraints, the proposed framework explicitly separates image content representations from device-related fingerprint features in the feature space, thereby enhancing the stability and robustness of source camera identification. The proposed method adopts two parallel branches: a content modeling branch and a fingerprint feature extraction branch. The content branch is built upon an improved U-Net architecture to reconstruct scene and color information, and further incorporates texture refinement and multi-scale feature fusion to reduce residual content interference in fingerprint modeling. The fingerprint branch employs ResNet-50 as the backbone network to learn discriminative global features associated with the camera imaging pipeline. Based on these branches, fingerprint information dominated by sensor noise is explicitly extracted by computing the residual between the input image and the reconstructed content, and is further encoded through noise analysis and feature fusion for joint camera model classification. Experimental results on multiple public-source camera forensics datasets demonstrate that the proposed method achieves stable and competitive identification performance in same-brand camera discrimination, complex imaging conditions, and post-processing scenarios, validating the effectiveness of the proposed disentangled modeling and physical consistency constraint strategy for source camera identification. Full article
(This article belongs to the Special Issue New Development in Machine Learning in Image and Video Forensics)
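The residual step described above, subtracting the reconstructed content from the input to isolate sensor-noise fingerprint information, reduces to an elementwise difference followed by a similarity measure. This is a minimal sketch of that matching idea on flattened pixel lists; the data are invented, and the paper's actual classifier is a learned network, not a correlation test.

```python
import math

def residual(image, reconstructed):
    """Sensor-noise residual: input minus content reconstruction."""
    return [a - b for a, b in zip(image, reconstructed)]

def ncc(x, y):
    """Normalized cross-correlation between a residual and a reference
    camera fingerprint (both flattened to 1-D lists)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den if den else 0.0

# Toy data: an image formed as content plus this camera's fingerprint
# yields a residual that correlates strongly with that fingerprint.
fingerprint = [0.2, -0.1, 0.05, -0.15, 0.1, -0.1]
content = [100, 120, 130, 125, 110, 105]
image = [c + f for c, f in zip(content, fingerprint)]
print(round(ncc(residual(image, content), fingerprint), 3))
```

The better the content branch reconstructs the scene, the less scene texture leaks into the residual, which is exactly why the paper decouples the two branches.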

15 pages, 1607 KB  
Article
Using Steganography and Artificial Neural Network for Data Forensic Validation and Counter Image Deepfakes
by Matimu Caswell Nkuna, Ebenezer Esenogho and Ahmed Ali
Computers 2026, 15(1), 61; https://doi.org/10.3390/computers15010061 - 15 Jan 2026
Cited by 1 | Viewed by 869
Abstract
The merging of the Internet of Things (IoT) and Artificial Intelligence (AI) advances has intensified challenges related to data authenticity and security. These advancements necessitate a multi-layered security approach to ensure the security, reliability, and integrity of critical infrastructure and intelligent surveillance systems. This paper proposes a two-layered security approach that combines a discrete cosine transform least significant bit 2 (DCT-LSB-2) with artificial neural networks (ANNs) for data forensic validation and mitigating deepfakes. The proposed model encodes validation codes within the LSBs of cover images captured by an IoT camera on the sender side, leveraging the DCT approach to enhance the resilience against steganalysis. On the receiver side, a reverse DCT-LSB-2 process decodes the embedded validation code, which is subjected to authenticity verification by a pre-trained ANN model. The ANN validates the integrity of the decoded code and ensures that only device-originated, untampered images are accepted. The proposed framework achieved an average SSIM of 0.9927 across the entire investigated embedding capacity, ranging from 0 to 1.988 bpp. DCT-LSB-2 showed a stable Peak Signal-to-Noise Ratio (average 42.44 dB) under various evaluated payloads ranging from 0 to 100 kB. The proposed model achieved a resilient and robust multi-layered data forensic validation system. Full article
(This article belongs to the Special Issue Multimedia Data and Network Security)
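The DCT-LSB-2 embedding above writes the validation code into the two least significant bits of transform coefficients, and the reported PSNR quantifies how little the cover changes. The sketch below illustrates both operations on a list of integer coefficients; it is a simplified assumption of the scheme (a real implementation would run the DCT and skip zero/DC coefficients), not the authors' code.

```python
import math

def embed_lsb2(coeffs, bits):
    """Write payload bits into the two LSBs of integer coefficients,
    two bits per coefficient (simplified DCT-LSB-2 sketch)."""
    out = list(coeffs)
    for i in range(0, len(bits), 2):
        pair = bits[i] * 2 + (bits[i + 1] if i + 1 < len(bits) else 0)
        out[i // 2] = (out[i // 2] & ~0b11) | pair
    return out

def extract_lsb2(coeffs, nbits):
    """Recover the first nbits embedded by embed_lsb2."""
    bits = []
    for c in coeffs:
        bits += [(c >> 1) & 1, c & 1]
    return bits[:nbits]

def psnr(orig, stego, peak=255):
    """Peak Signal-to-Noise Ratio in dB between cover and stego."""
    mse = sum((a - b) ** 2 for a, b in zip(orig, stego)) / len(orig)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

cover = [44, 37, 52, 61, 40, 33, 58, 49]
payload = [1, 0, 1, 1, 0, 0, 0, 1]  # hypothetical validation code bits
stego = embed_lsb2(cover, payload)
print(extract_lsb2(stego, len(payload)) == payload, round(psnr(cover, stego), 2))
```

Because each coefficient changes by at most 3, the MSE stays small and the PSNR high, which is the property behind the stable ~42 dB figures reported.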

17 pages, 480 KB  
Review
MicroRNAs in Cardiovascular Diseases and Forensic Applications: A Systematic Review of Diagnostic and Post-Mortem Implications
by Matteo Antonio Sacco, Saverio Gualtieri, Maria Cristina Verrina, Fabrizio Cordasco, Maria Daniela Monterossi, Gioele Grimaldi, Helenia Mastrangelo, Giuseppe Mazza and Isabella Aquila
Int. J. Mol. Sci. 2026, 27(2), 825; https://doi.org/10.3390/ijms27020825 - 14 Jan 2026
Cited by 1 | Viewed by 833
Abstract
MicroRNAs (miRNAs) are small non-coding RNA molecules approximately 20–22 nucleotides in length that regulate gene expression at the post-transcriptional level. By binding to target messenger RNAs (mRNAs), miRNAs inhibit translation or induce degradation, thus influencing a wide array of biological processes including development, inflammation, apoptosis, and tissue remodeling. Owing to their remarkable stability and tissue specificity, miRNAs have emerged as promising biomarkers in both clinical and forensic settings. In recent years, increasing evidence has demonstrated their utility in cardiovascular diseases, where they may serve as diagnostic, prognostic, and therapeutic tools. This systematic review aims to comprehensively summarize the role of miRNAs in cardiovascular pathology, focusing on their diagnostic potential in myocardial infarction, sudden cardiac death (SCD), and cardiomyopathies, and their applicability in post-mortem investigations. Following PRISMA guidelines, we screened PubMed, Scopus, and Web of Science databases for studies up to December 2024. The results highlight several miRNAs—including miR-1, miR-133a, miR-208b, miR-499a, and miR-486-5p—as robust markers for ischemic injury and sudden death, even in degraded or formalin-fixed autopsy samples. The high stability of miRNAs under extreme post-mortem conditions reinforces their potential as molecular tools in forensic pathology. Nevertheless, methodological heterogeneity and limited standardization currently hinder their routine application. Future studies should aim to harmonize analytical protocols and validate diagnostic thresholds across larger, well-characterized cohorts to fully exploit miRNAs as reliable molecular biomarkers in both clinical cardiology and forensic medicine. Full article
(This article belongs to the Section Molecular Genetics and Genomics)

14 pages, 847 KB  
Article
Molecular Tools for qPCR Identification and STR-Based Individual Identification of Panthera pardus (Linnaeus, 1758)
by Karolina Mahlerová, Lenka Vaňková and Daniel Vaněk
Genes 2026, 17(1), 45; https://doi.org/10.3390/genes17010045 - 31 Dec 2025
Viewed by 664
Abstract
Background/Objectives: The leopard (Panthera pardus), an apex predator listed in CITES Appendix I and classified as Vulnerable by the IUCN, is undergoing severe population declines driven by habitat loss, human–wildlife conflict, and illegal trade. Rapid and reliable species and individual identification is critical for conservation and forensic applications, particularly when analyzing highly processed or degraded seized wildlife products, where morphological identification is often impossible. We aimed to develop and validate a robust multiplex quantitative real-time PCR (qPCR) assay combined with a short tandem repeat (STR) system for the species-specific detection and individual identification of P. pardus. Methods: The qPCR assay (Ppar Qplex) was designed to target a mitochondrial Cytochrome b (Cyt b) fragment for species confirmation, a nuclear marker (PLP) for general Feliformia detection and quantification, and an artificial internal positive control (IPC) to monitor PCR inhibition. The assay’s performance was validated for robustness, specificity, sensitivity, repeatability, and reproducibility, utilizing DNA extracted from 30 P. pardus individuals (hair and feces) and tested against 18 related Feliformia species and two outgroups. Individual identification was achieved using a set of 18 STR loci and a sex determination system adapted from previously published Panthera panels. Results: Validation demonstrated high specificity for the Ppar Qplex: mitochondrial amplification occurred exclusively in P. pardus samples. The nuclear marker consistently amplified across all 18 tested Feliformia species but not the outgroups. The assay showed high analytical sensitivity, successfully detecting DNA at concentrations as low as 1 pg/µL, with consistent results confirmed across different sample types, replicates, and independent users. 
Furthermore, the STR multiplex successfully generated 30 unique individual profiles using the 18 polymorphic loci and the sex determination system. Conclusions: The combined qPCR assay and STR system provide a fast, sensitive, and highly specific molecular framework for rapid leopard detection, quantification, and individual identification from a wide range of sample types. These tools strengthen forensic capacity to combat wildlife crime and provide critical data to support evidence-based conservation management of P. pardus. Full article
(This article belongs to the Special Issue Advances in Forensic Genetics and DNA)
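The individual-identification step described in this abstract rests on the combined probability of identity (PID) across STR loci being vanishingly small, so that 30 unique profiles can be resolved with confidence. A minimal sketch of the standard PID calculation follows; the allele frequencies are hypothetical placeholders, not values from the paper's Panthera panels:

```python
from functools import reduce

def locus_pid(freqs):
    """Probability of identity at one locus:
    sum(p_i^4) + sum over i<j of (2 * p_i * p_j)^2."""
    homo = sum(p ** 4 for p in freqs)
    hetero = sum((2 * freqs[i] * freqs[j]) ** 2
                 for i in range(len(freqs))
                 for j in range(i + 1, len(freqs)))
    return homo + hetero

def combined_pid(loci):
    """Multiply per-locus PIDs, assuming independently inherited loci."""
    return reduce(lambda acc, f: acc * locus_pid(f), loci, 1.0)

# Hypothetical allele frequencies for three illustrative loci
loci = [[0.5, 0.3, 0.2], [0.4, 0.4, 0.2], [0.25, 0.25, 0.25, 0.25]]
print(combined_pid(loci))  # shrinks rapidly as more loci (e.g. 18) are multiplied in
```

Each additional polymorphic locus multiplies the PID by a factor well below one, which is why an 18-locus panel can discriminate individuals even in species with modest allelic diversity.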
25 pages, 1817 KB  
Review
Animal Species and Identity Testing: Developments, Challenges, and Applications to Non-Human Forensics
by Bruce Budowle, Antti Sajantila and Daniel Vanek
Genes 2025, 16(12), 1503; https://doi.org/10.3390/genes16121503 - 16 Dec 2025
Cited by 1 | Viewed by 2381
Abstract
Biological samples of non-human origin, commonly encountered in wildlife crime investigations, present distinct challenges regarding forensic DNA analysis efforts. Although the types of samples encountered in human identity testing can vary to some degree, analyzing DNA from one species is facilitated by unified [...] Read more.
Biological samples of non-human origin, commonly encountered in wildlife crime investigations, present distinct challenges for forensic DNA analysis. Although the types of samples encountered in human identity testing can vary to some degree, analyzing DNA from one species is facilitated by unified processes, common genetic marker systems, and national DNA databases. In contrast, non-human animal species identification is confounded by a diverse range of target species and a variety of sampling materials, such as feathers, processed animal parts in traditional medicine, and taxidermy specimens, which often contain degraded DNA in low quantities, are contaminated with chemical inhibitors, and may be commingled with other species. These complexities require specialized analytical approaches. Compounding these issues is a lack of validated non-human species forensic sampling and typing kits, and the risk of human DNA contamination during evidence collection. Markers residing on the mitochondrial genome (mtDNA) are routinely sought because of the large datasets available for comparison and their greater sensitivity of detection. However, barcoding results can be complicated by incomplete species-level resolution, by the presence of nuclear inserts of mitochondrial DNA (NUMTs), and by the inability of mtDNA analysis alone to detect hybrids. Species-specific genetic markers for identification have been developed for a few high-profile species; however, many CITES (Convention on International Trade in Endangered Species of Wild Fauna and Flora)-listed organisms lack specific, validated forensic analytical tools, creating a significant gap in investigative enforcement capabilities.
This deficiency stems in part from the limited commercial market for wildlife forensics, a largely government-driven research field; the difficulty of obtaining sufficient reference samples from wild populations; limited training and education infrastructure; and inadequate funding support. Full article
(This article belongs to the Special Issue Research Updates in Forensic Genetics)
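The mtDNA barcoding workflow this review discusses amounts to comparing a query sequence against curated reference barcodes and assigning the species of the closest match above an identity threshold. A minimal sketch under simplifying assumptions (pre-aligned, equal-length toy sequences; real workflows use tools such as BLAST against curated databases, and the reference strings below are not real Cyt b sequences):

```python
def percent_identity(a, b):
    """Fraction of matching bases between two aligned, equal-length sequences."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to equal length")
    return sum(x == y for x, y in zip(a, b)) / len(a)

def best_match(query, references, threshold=0.98):
    """Return (species, identity) of the closest reference,
    or None if no reference clears the identity threshold."""
    species, ident = max(((sp, percent_identity(query, seq))
                          for sp, seq in references.items()),
                         key=lambda t: t[1])
    return (species, ident) if ident >= threshold else None

# Toy reference barcodes (hypothetical, for illustration only)
refs = {"Panthera pardus": "ACGTACGTAC",
        "Panthera tigris": "ACGTACGTTT"}
print(best_match("ACGTACGTAC", refs))  # ('Panthera pardus', 1.0)
```

The review's caveats map directly onto this sketch: NUMTs can inflate apparent identity to the wrong reference, and a maternal-lineage match alone cannot reveal a hybrid, which is why nuclear markers are needed alongside mtDNA.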
26 pages, 709 KB  
Article
A Tabular Data Imputation Technique Using Transformer and Convolutional Neural Networks
by Charlène Béatrice Bridge-Nduwimana, Salah Eddine El Harrauss, Aziza El Ouaazizi and Majid Benyakhlef
Big Data Cogn. Comput. 2025, 9(12), 321; https://doi.org/10.3390/bdcc9120321 - 13 Dec 2025
Cited by 2 | Viewed by 1249
Abstract
Upstream processes strongly influence downstream analysis in sequential data-processing workflows, particularly in machine learning, where data quality directly affects model performance. Conventional statistical imputations often fail to capture nonlinear dependencies, while deep learning approaches typically lack uncertainty quantification. We introduce a hybrid imputation [...] Read more.
Upstream processes strongly influence downstream analysis in sequential data-processing workflows, particularly in machine learning, where data quality directly affects model performance. Conventional statistical imputations often fail to capture nonlinear dependencies, while deep learning approaches typically lack uncertainty quantification. We introduce a hybrid imputation model that integrates a deep learning autoencoder with Convolutional Neural Network (CNN) layers and a Transformer-based contextual modeling architecture to address systematic variation across heterogeneous data sources. Performing multiple imputations in the autoencoder–transformer latent space and averaging representations provides implicit batch correction that suppresses context-specific variation without requiring explicit batch identifiers. We performed experiments on datasets in which 10% missingness was artificially introduced under missing-completely-at-random (MCAR) and missing-not-at-random (MNAR) mechanisms. The proposed model demonstrated strong practical performance, jointly ranking first among the imputation methods evaluated. It reduced the root mean square error (RMSE) by 50% compared to denoising autoencoders (DAE) and by 46% compared to iterative imputation (MICE). Performance was comparable to that of adversarial models (GAIN) and attention-based models (MIDA), while also providing interpretable uncertainty estimates (CV = 0.08–0.15). Validation on datasets from multiple sources confirmed the robustness of the technique: notably, on a forensic dataset from multiple laboratories, our imputation technique achieved a practical improvement over GAIN (0.146 vs. 0.189 RMSE), highlighting its effectiveness in mitigating batch effects. Full article
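The evaluation protocol described in this abstract (mask 10% of cells completely at random, impute, then score RMSE on the masked cells only) can be sketched as follows. The column-mean imputer here is only an illustrative baseline standing in for the paper's autoencoder–transformer model:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))            # complete "ground truth" table

# MCAR masking: hide 10% of cells uniformly at random
mask = rng.random(X.shape) < 0.10
X_missing = X.copy()
X_missing[mask] = np.nan

# Baseline imputer: replace each NaN with its column mean
col_means = np.nanmean(X_missing, axis=0)
X_imputed = np.where(np.isnan(X_missing), col_means, X_missing)

# RMSE is computed only on the cells that were actually masked
rmse = np.sqrt(np.mean((X_imputed[mask] - X[mask]) ** 2))
print(round(rmse, 3))
```

Scoring only the masked cells is what makes the comparison fair across imputers: observed entries are identical for every method, so RMSE differences reflect imputation quality alone.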
19 pages, 1366 KB  
Article
Assessing the Feasibility of In Vitro Assays in Combination with Biological Matrices to Screen for Endogenous CYP450 Phenotype Biomarkers Using an Untargeted Metabolomics Approach—A Proof of Concept Study
by Yannick Wartmann, Lana Brockbals, Thomas Kraemer and Andrea E. Steuer
Metabolites 2025, 15(12), 791; https://doi.org/10.3390/metabo15120791 - 12 Dec 2025
Viewed by 800
Abstract
Background/Objectives: Cytochrome P450 (CYP) enzymes are crucial for drug metabolism, yet inter-individual variability in their activity remains a significant clinical challenge. Current phenotyping methods are often impractical or even impossible, particularly in forensic toxicology and vulnerable populations. This proof-of-concept study investigated the feasibility [...] Read more.
Background/Objectives: Cytochrome P450 (CYP) enzymes are crucial for drug metabolism, yet inter-individual variability in their activity remains a significant clinical challenge. Current phenotyping methods are often impractical or even impossible, particularly in forensic toxicology and vulnerable populations. This proof-of-concept study investigated the feasibility of using in vitro assays with human liver microsomes (HLM) and recombinant CYP enzymes (isoenzymes), combined with untargeted metabolomics, to identify potential endogenous biomarker candidates indicative of CYP phenotype. Methods: In vitro incubations of HLM and isoenzymes were combined with targeted and untargeted LC-(HR)MS metabolomics techniques and statistical processing. Results: We demonstrate that HLM and isoenzymes maintain activity in the presence of complex biological matrices (blood/plasma), enabling metabolomic profiling. Untargeted analysis of assays in plasma revealed numerous potential biomarkers, with several showing significant correlations to enzyme activity. Conclusions: While identification remains the major challenge, this approach offers a promising avenue for developing accessible and efficient methods for indirect CYP phenotyping, potentially facilitating investigations in scenarios where traditional approaches are limited. This work provides a foundation for future studies focused on further developing in vitro assays and validating the proposed biomarkers, as well as establishing their utility in clinical and forensic settings. Full article
(This article belongs to the Section Pharmacology and Drug Metabolism)
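The biomarker-screening step this abstract describes (scanning untargeted metabolomic features for those correlating with measured enzyme activity) can be sketched as a simple Pearson-correlation filter. The data below are synthetic, with one planted biomarker feature, and the 0.8 cutoff is an illustrative threshold, not one from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_features = 30, 500

activity = rng.normal(size=n_samples)                 # CYP activity per sample
features = rng.normal(size=(n_samples, n_features))   # untargeted feature intensities
# Plant one feature tightly coupled to activity (the "true" biomarker)
features[:, 0] = activity * 2 + rng.normal(scale=0.1, size=n_samples)

# Pearson correlation of every feature against enzyme activity
a = (activity - activity.mean()) / activity.std()
f = (features - features.mean(axis=0)) / features.std(axis=0)
r = (f * a[:, None]).mean(axis=0)

candidates = np.flatnonzero(np.abs(r) > 0.8)          # illustrative cutoff
print(candidates)  # the planted feature at index 0 should be recovered
```

In practice such a screen would be followed by multiple-testing correction and, as the abstract notes, by the harder step of structurally identifying the correlated features.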