Review

Deep Learning-Driven Intelligent Fluorescent Probes: Advancements in Molecular Design for Accurate Food Safety Detection

1 Agricultural Product Processing and Storage Lab, School of Food and Biological Engineering, Jiangsu University, Zhenjiang 212013, China
2 School of Grain Science and Technology, Jiangsu University of Science and Technology, Zhenjiang 212003, China
* Author to whom correspondence should be addressed.
Foods 2025, 14(17), 3114; https://doi.org/10.3390/foods14173114
Submission received: 23 July 2025 / Revised: 25 August 2025 / Accepted: 3 September 2025 / Published: 5 September 2025

Abstract

The complexity of global food supply chains challenges public health, requiring advanced detection technologies beyond traditional laboratory methods. Fluorescence sensing, known for its sensitivity and rapid response, is promising for food safety but is hindered by inefficient probe design and the difficulty of analyzing complex signals in food matrices. Deep learning (DL) offers solutions through its nonlinear modeling and pattern recognition capabilities. This review surveys recent advances in DL applications for fluorescence sensing. We examine deep learning methods for predicting fluorescent probe properties and generating fluorescent molecular structures, highlighting their role in accelerating the development of high-performance probes. We then discuss in detail the pivotal technologies of deep learning in the intelligent analysis of complex fluorescence signals. On this basis, we reflect on the core challenges currently confronting the field and offer a forward-looking perspective on the future development of fluorescence sensing technology, providing a roadmap for future research in this interdisciplinary domain.

1. Introduction

Ensuring food safety is the cornerstone of safeguarding global public health, promoting sustainable economic development, and maintaining social stability [1]. With the deepening of globalized trade and the wide application of modern agricultural technologies, the potential risk factors in the food supply chain have become increasingly diversified and hidden, covering every link from farm to table and including persistent organic pollutants (POPs), pesticide and veterinary drug residues, heavy metal ions, fungal toxins, illegal additives, and food-borne pathogenic microorganisms [2,3,4,5]. Traditional standard testing methods, such as gas chromatography–mass spectrometry (GC-MS) and liquid chromatography–tandem mass spectrometry (LC-MS/MS), are regarded as the “gold standard” for their high sensitivity and accuracy [6,7], but they require expensive instrumentation and centralized laboratories and therefore cannot meet the urgent needs of modern society for large-scale, high-throughput, on-site, and even in-line real-time monitoring [8,9].
To address the limitations of traditional methods, various rapid detection techniques have been developed [10,11,12], among which fluorescence sensing analysis has stood out as one of the most dynamic research areas [13,14,15]. The technique achieves detection by exploiting measurable changes in a probe's photophysical properties (e.g., fluorescence intensity, wavelength, lifetime, polarization state) induced by the interaction of a target analyte with the fluorescent probe molecule [16,17]. Its core strengths are unparalleled sensitivity (theoretically down to the single-molecule level), a nanosecond response time, non-invasiveness, and visualization capabilities through fluorescence imaging [18,19]. Material systems for fluorescent probes have also undergone significant development (Figure 1), from the initial organic small-molecule dyes [20,21] to conjugated polymers [22,23,24,25], semiconductor quantum dots (SQDs) [26,27,28], carbon quantum dots (CQDs) [29,30,31,32], metal nanoclusters (NCs) [33,34,35], rare-earth-element-based upconversion nanoparticles (UCNPs) [36,37,38,39], and metal–organic frameworks (MOFs) [40,41], among others. In addition, common fluorescence sensing mechanisms include Förster resonance energy transfer (FRET) [42,43,44,45], the inner filter effect (IFE) [46,47], photoinduced electron transfer (PET) [48,49], and aggregation-induced emission (AIE) [50], which provide a rich toolbox for different application scenarios [51,52].
However, despite these many achievements, fluorescence sensing technology has been held back by two systematic bottlenecks on its way to truly widespread and reliable practical application. First, traditional probe development is highly dependent on the “chemical intuition” and experience of chemists [53,54], proceeding through an iterative, trial-and-error cycle of “hypothesis–synthesis–test–correction” [55,56]. In the face of thousands of food safety hazards and increasingly stringent requirements on probe performance (e.g., near-infrared emission to penetrate biological tissues, high selectivity for specific ions, high quantum yield in aqueous phases), moving away from serendipitous discovery toward the functionally oriented, predictable, rational design of molecules remains a great challenge for synthetic chemistry and materials science [57]. Secondly, food samples are notoriously “dirty” matrices containing endogenous substances such as proteins, fats, pigments, and vitamins, which cause background fluorescence, absorption, or scattering interference [58] that is difficult to avoid [59,60], even though ratiometric fluorescent probes have been designed to reduce matrix interference effects [61,62,63]. In addition, when multiple structurally similar analytes coexist, probes tend to exhibit cross-responses, making the analytes difficult to distinguish from a single signal dimension [64,65]. These issues can lead to compromised accuracy, context-dependent reproducibility, and elevated false-positive rates under demanding field conditions in practical applications [66].
The rise of deep learning (DL) brings a historic opportunity to break through both of these dilemmas. As a branch of machine learning, deep neural networks can construct deep, complex nonlinear models and automatically learn and extract abstract features from massive, high-dimensional data [67,68], and their success in fields such as image recognition and natural language processing has demonstrated their powerful cognitive and creative abilities [69,70,71]. For the mathematical framework of deep learning, readers may refer to the article by Shlezinger et al. [72]. In recent years, DL has begun to penetrate the basic sciences, showing great potential to reshape the paradigm of scientific discovery [73,74,75]. Although previous reviews have explored applications of fluorescent probes in food testing [76,77,78] or outlined the role of machine learning in analytical chemistry [79,80,81], they tend to adopt a limited perspective. Most of the literature views deep learning only as a more powerful chemometrics tool applied at the end of the analytical process for data classification or regression [82,83,84], overlooking its revolutionary potential upstream in scientific discovery, i.e., at the molecule creation stage.
The purpose of this review is to systematically elaborate the new paradigm of “Intelligent Design (e.g., DL-enabled molecular design and optimization) → Intelligent Sensing (e.g., AI-powered signal acquisition and decoding) → Intelligent Analysis (e.g., deep learning-based feature extraction and pattern recognition)”, a closed-loop, deep learning-driven analytical pipeline. We focus on how deep learning can fundamentally change the discovery mode of fluorescent probes through “forward prediction” and “reverse generation”, and how it can achieve the accurate decoding of complex and dynamic fluorescent signals through deep feature extraction and spatio-temporal pattern recognition (Figure 2). Finally, through a systematic consideration of current challenges and future development directions, this paper aims to provide researchers in the interdisciplinary fields of chemistry, materials, analytics, food, and computer science with a systematic framework for the future development of smart sensing technology with comprehensive analytical capabilities.

2. Deep Learning for Fluorescent Probe Design: From Screening to Generation

The performance of fluorescent probes is rooted in their molecular structure. Deep learning is transforming probe design from an “art” to a computational “science” by constructing high-precision mappings between structures and properties, making the leap from mass screening to on-demand creation.

2.1. Property-Driven Design: DL-Based QSAR and Virtual Screening

The core task of “forward design” is to rapidly and accurately predict a set of key properties of a molecule given its structure. For fluorescent probes, these critically include absorption/emission wavelengths (λabs/λem), fluorescence quantum yields, Stokes shifts, and photostability, properties that directly govern sensing performance. The accurate prediction of these parameters enables the rational design of probes with enhanced brightness, target specificity, and environmental stability, and is a prerequisite for high-throughput virtual screening (HTVS). Traditional QSPR (Quantitative Structure–Property Relationship) methods rely on manually designed molecular descriptors (e.g., topological indices, physicochemical parameters), which often encode incomplete and biased information [85,86]. Deep learning, particularly graph neural networks (GNNs), is changing this paradigm [83,87,88]. GNNs naturally represent molecules as graphs (atoms as nodes and bonds as edges) and iteratively aggregate information from neighboring atoms and bonds across the graph through a mechanism known as “message passing”, so that each atom acquires a rich vector representation encoding its local chemical environment and even global topology [89,90]. Ultimately, a molecule-level representation is obtained by pooling all atom representations over the molecular graph, which is then fed into fully connected layers for property prediction.
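To make the message-passing idea concrete, the following minimal sketch (in PyTorch, with purely hypothetical atom features, a random bond adjacency matrix, and an untrained property head) shows how neighbor aggregation and pooling turn a molecular graph into a property prediction; production models such as MPNN, SchNet, or DimeNet use far richer edge features, trained weights, and curated datasets.

```python
# Minimal message-passing sketch for molecular property prediction.
# Atom features, bonds, and the two predicted properties are hypothetical placeholders.
import torch
import torch.nn as nn

class SimpleMPNN(nn.Module):
    def __init__(self, atom_dim: int = 16, hidden: int = 64, n_props: int = 2):
        super().__init__()
        self.embed = nn.Linear(atom_dim, hidden)
        # One weight matrix per message-passing round (3 rounds here).
        self.msg_layers = nn.ModuleList([nn.Linear(hidden, hidden) for _ in range(3)])
        self.readout = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                     nn.Linear(hidden, n_props))  # e.g., lambda_em, QY

    def forward(self, atom_feats: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        # atom_feats: (n_atoms, atom_dim); adjacency: (n_atoms, n_atoms), 1 where bonded.
        h = torch.relu(self.embed(atom_feats))
        for layer in self.msg_layers:
            # Aggregate messages from bonded neighbours, then update atom states.
            messages = adjacency @ layer(h)
            h = torch.relu(h + messages)
        # Mean pooling over atoms gives a molecule-level vector for the property head.
        return self.readout(h.mean(dim=0))

# Toy usage: a hypothetical 12-atom molecule with random features and bonds.
atoms = torch.randn(12, 16)
adj = (torch.rand(12, 12) > 0.8).float()
adj = ((adj + adj.T) > 0).float()               # make the bond matrix symmetric
model = SimpleMPNN()
print(model(atoms, adj))  # predicted [emission wavelength, quantum yield] (untrained)
```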
A variety of state-of-the-art GNN architectures, such as message-passing neural networks (MPNNs) [91], SchNet [92], and DimeNet/DimeNet++ [93] (which explicitly encode interatomic distance and angle information and are therefore more physically grounded), have proved highly successful in predicting molecular properties. Using publicly available databases containing hundreds of thousands of molecules (e.g., QM9 [94], ZINC [95]) or self-constructed specialized databases, researchers have trained GNN models to predict the core photophysical parameters of fluorescent probes. For example, by training on large fluorescent dye databases (e.g., FluoDB or privately constructed databases), GNN models can predict a molecule’s maximum absorption/emission wavelengths (λabs/λem), molar absorption coefficients, fluorescence quantum yields, Stokes shifts, and even two-photon absorption cross-sections, with accuracies comparable to or, for certain properties, exceeding those achieved by conventional density functional theory (DFT) [96,97], while DFT retains unique advantages in providing fundamental physical insights.
Park et al. introduced a GNN-based deep learning model that accounts for chromophore–solvent interactions (Figure 3A) and accurately and reliably predicted the optical and photophysical properties of organic compounds (Figure 3B) [98]. Ju et al. established a database containing over 4300 solvated organic fluorescent dyes and developed a new (non-GNN) machine learning method to predict emission wavelengths and photoluminescence quantum yields (PLQY) (Figure 3C); evaluation on unseen molecules showed that the model is comparable to time-dependent density functional theory (TD-DFT) calculations [99]. Voznyy et al. used a machine-learning-in-the-loop approach (non-GNN) to learn from available experimental data [100], propose experimental parameters to try, and ultimately pinpoint regions of the synthetic parameter space that enable record-monodispersity PbS quantum dots (Figure 3D). Moreover, the prediction speed of such DL models is five to seven orders of magnitude higher than that of DFT calculations, reducing computation from hours or days to milliseconds [101]. This efficiency makes comprehensive screening of chemical spaces containing hundreds of millions or even billions of virtual compounds a reality [102]. In addition, DL models can predict the binding affinity of molecules to specific targets (e.g., protein active sites, nucleic acid sequences) [103,104], enabling truly functionally oriented design [105].
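The screening workflow itself can be sketched as a simple filter over a virtual library; in the sketch below the predictor is a random stub standing in for a trained QSPR/GNN model, and the placeholder SMILES strings and thresholds (emission > 650 nm, quantum yield > 0.4) are illustrative assumptions only.

```python
# Hedged sketch of high-throughput virtual screening with a stub property predictor.
import random

def predict_properties(smiles: str) -> dict:
    """Stand-in for a trained GNN; returns hypothetical (random) predictions."""
    random.seed(hash(smiles) % (2**32))
    return {"emission_nm": random.uniform(400, 800),
            "quantum_yield": random.uniform(0.0, 1.0)}

virtual_library = ["c1ccc2cc3ccccc3cc2c1",      # placeholder SMILES strings
                   "O=C1OC2=CC=CC=C2C1",
                   "c1ccc(-c2ccccc2)cc1"]

# Keep only candidates that pass the NIR-emission and brightness criteria.
hits = [s for s in virtual_library
        if (p := predict_properties(s))["emission_nm"] > 650 and p["quantum_yield"] > 0.4]
print(f"{len(hits)} candidate probes pass the screening filters")
```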

2.2. Structure-Driven Design: Generative Models for On-Demand Probes

If forward design is searching for a “needle in a haystack”, then inverse design is the ultimate goal of molecular discovery: generating completely new molecular structures that meet the requirements directly from a set of predefined target properties [106,107]. Deep generative modelling is the core engine for achieving this goal.
Variational Autoencoders (VAEs) are one of the key classes of techniques. A typical molecular VAE consists of an encoder and a decoder [108]. The encoder compresses the discrete molecular graph (or SMILES string) into a continuous, low-dimensional “latent space” that follows a specific probability distribution (usually Gaussian); the decoder then reconstructs the molecular structure from a point (vector) in this latent space. The defining characteristics of this latent space are its continuity and smoothness: molecules with similar structures and properties occupy neighboring positions [109]. By training a property predictor jointly with the VAE, certain dimensions of the latent space can be made to correlate with specific molecular properties (e.g., emission wavelength). Gradient-based optimization can then be performed in the latent space along the direction that improves the target property (e.g., maximizing the quantum yield while fixing the emission wavelength at 700 nm), and the optimized latent vectors are fed into the decoder to generate completely new molecules with the desired properties [110,111]. Critically for fluorescent probes, this requires embedding photophysical constraints such as Stokes shift thresholds and aqueous stability during optimization. Shi et al. employed an autoencoder-based generative adversarial network (AGAN) to generate molecular SMILES; AGAN is trained in a unified manner by simultaneously optimizing the encoder/decoder and the generative adversarial network (Figure 4A). The results show that the excited-state property distribution of the generated molecules closely matches that of the original samples and that the generated molecules are synthetically feasible [112].
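A minimal sketch of this latent-space optimization loop is shown below, assuming a trained decoder and a differentiable property head (both replaced by untrained stub networks here); the 700 nm emission target and the loss weighting are illustrative choices, not values from the cited studies.

```python
# Hedged sketch of gradient-based optimization in a VAE latent space.
import torch
import torch.nn as nn

latent_dim = 32
decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, 64))  # stub
property_head = nn.Linear(64, 1 + 1)   # stub: decoded representation -> [emission_nm, QY]

z = torch.randn(latent_dim, requires_grad=True)    # start from a random latent point
optimizer = torch.optim.Adam([z], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    mol_repr = decoder(z)                          # stand-in for a decoded molecule
    emission, qy = property_head(mol_repr)
    # Maximize quantum yield while pinning the emission near a 700 nm target
    # (a real model would work in normalized property space).
    loss = -qy + 0.1 * (emission - 700.0) ** 2
    loss.backward()
    optimizer.step()

# The optimized z would then be decoded into a SMILES/graph by the full VAE decoder.
print("optimized latent vector norm:", z.detach().norm().item())
```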
Another powerful class of generative models is Generative Adversarial Networks (GANs) [115], which contain a Generator and a Discriminator. The Generator attempts to create chemically valid and authentic-looking molecules from random noise, while the Discriminator endeavors to distinguish real molecules (from the training set) from molecules faked by the Generator. The two co-evolve through adversarial training, eventually forcing the Generator to master the basic rules of chemical structure and to create novel and diverse molecules [116]. When generating fluorescent probes, GANs must additionally satisfy spectral criteria (e.g., excitation/emission gap > 100 nm) and avoid known quenching motifs. Johansson et al. proposed a new deep learning architecture, LatentGAN, which combines autoencoders and generative adversarial networks for de novo molecular design (Figure 4B); the compounds sampled by the trained model largely occupy the same chemical space as the training set while including a large number of novel compounds [113]. Rinke et al. trained and evaluated three neural network architectures, namely the multilayer perceptron (MLP), CNN, and deep tensor neural network (DTNN), on the electronic energy-level densities of 132,000 organic molecules to predict molecular excitation spectra [114] (Figure 4C). For fluorescent probe design, such predictions must account for solvent-dependent spectral shifts and aggregation effects. These studies provide references for the generation of functional fluorescent molecules.
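The adversarial setup can be sketched as follows; for brevity, the generator and discriminator here operate on synthetic molecular descriptor vectors rather than SMILES strings or graphs, so this is only a schematic of the training dynamics and not a molecule generator.

```python
# Hedged GAN sketch on placeholder molecular descriptor vectors.
import torch
import torch.nn as nn

noise_dim, desc_dim = 16, 8
G = nn.Sequential(nn.Linear(noise_dim, 64), nn.ReLU(), nn.Linear(64, desc_dim))
D = nn.Sequential(nn.Linear(desc_dim, 64), nn.ReLU(), nn.Linear(64, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

real_data = torch.randn(512, desc_dim) * 0.5 + 1.0   # placeholder "training set"

for epoch in range(100):
    real = real_data[torch.randint(0, 512, (64,))]
    fake = G(torch.randn(64, noise_dim))

    # Discriminator step: label real molecules 1, generated molecules 0.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to fool the discriminator into scoring fakes as real.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(f"final d_loss={d_loss.item():.3f}, g_loss={g_loss.item():.3f}")
```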
To achieve goal-directed generation, Reinforcement Learning (RL) can be combined with GANs [117]. In this framework, the generator is treated as an “agent” whose “actions” build the molecule step by step (e.g., adding atoms and bonds one by one). After each step, the agent receives a “reward” if the resulting intermediate structure or final molecule is closer to the target property, as evaluated by the QSPR model. By maximizing the cumulative reward, the generator learns how to strategically construct structures that meet complex performance requirements [118,119]. In fluorescent probe RL, reward functions typically penalize structures prone to PET quenching or with poor photostability. In addition, Flow-based models [120] and Diffusion models [121] are emerging as newer generative techniques for molecular design because of their ability to generate high-quality, diverse molecules and their more stable training processes. These “on-demand” methodologies imply a shift in chemical discovery from a passive “screening” mode to an active “creation” mode, providing a potential rapid response to novel and unknown food safety threats. A comparison of the advantages and challenges of these model architectures is provided in Table 1.
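A hedged sketch of such a reward function is given below; the property thresholds, the substring check standing in for a real SMARTS-based structural alert, and the field names of the predictor output are all illustrative assumptions rather than validated design rules.

```python
# Hedged sketch of a composite reward for RL-driven fluorescent probe generation.
def probe_reward(predicted: dict, smiles: str) -> float:
    """Score a candidate molecule from hypothetical QSPR predictions and its SMILES."""
    reward = 0.0
    # Photophysical targets: NIR emission, large Stokes shift, high brightness.
    if predicted["emission_nm"] >= 650:
        reward += 1.0
    if predicted["stokes_shift_nm"] >= 80:
        reward += 0.5
    reward += predicted["quantum_yield"]

    # Penalties for motifs assumed (for illustration only) to promote PET quenching
    # or photobleaching; a real workflow would use SMARTS matching and trained models.
    if "N(C)C" in smiles:
        reward -= 0.5
    if predicted.get("photostability", 1.0) < 0.5:
        reward -= 1.0
    return reward

example = {"emission_nm": 705, "stokes_shift_nm": 120,
           "quantum_yield": 0.42, "photostability": 0.8}
print(probe_reward(example, "CCN(C)c1ccc2cc(C=O)ccc2c1"))
```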

3. Intelligent Fluorescent Signal Processing and Feature Extraction

Designing the ideal probe molecule is the first step in building a high-performance sensing system. The second, and equally critical, step is to extract the weak but valuable analytical signals from the real, complex sample environment. Deep learning plays the role of “smart decoder” in this process, with capabilities far beyond those of traditional chemometrics.

3.1. Matrix Interference Correction: Signal Enhancement and Quantification

The complexity of food matrices is a major impediment to the practical application of fluorescence analysis. Casein and fat globules in milk, pigments and phenolics in fruit juices, and myoglobin in meat extracts produce strong autofluorescence, Rayleigh/Raman scattering, and a severe inner filter effect (IFE), creating interferences that are often several orders of magnitude stronger than the target signals and risking the failure of quantitative methods based on peak height or peak area [123,124]. For fluorescent probes, this interference disproportionately affects probes with small Stokes shifts or short lifetimes, demanding specialized correction approaches. Deep learning models, especially convolutional neural networks (CNNs), provide powerful solutions to this challenge. One-dimensional fluorescence spectra can be treated as one-dimensional signals and processed using 1D-CNNs [125]. Through its stacked convolutional kernels, a 1D-CNN can automatically learn and recognize multi-scale features in the spectrum [126,127], such as the characteristic peak shapes of the target, the statistical patterns of the noise, and the complex morphology of baseline drift, without manual pre-processing [128]. By end-to-end training on a dataset containing a large number of “interfering spectrum + target concentration” pairs, a CNN can regress the exact concentration of the target directly from the original, heavily contaminated spectra, and its intrinsic nonlinear modelling capability allows it to implicitly learn and correct for a variety of complex matrix effects [129,130].
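A minimal sketch of this end-to-end strategy is shown below: a small 1D-CNN is trained to regress concentration directly from synthetic spectra that contain a target peak, a matrix background, and noise; the architecture and the synthetic data are illustrative assumptions, and real applications would use measured spectra and validated models.

```python
# Minimal 1D-CNN regression sketch: raw (interference-laden) spectrum -> concentration.
import torch
import torch.nn as nn

class Spectrum1DCNN(nn.Module):
    def __init__(self, n_channels: int = 1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                  # length-independent pooling
        )
        self.regressor = nn.Linear(32, 1)             # -> concentration

    def forward(self, x):                             # x: (batch, 1, n_wavelengths)
        return self.regressor(self.features(x).squeeze(-1))

# Synthetic training pairs: 256-point spectra with a concentration-dependent target
# peak, a fixed matrix background, and random noise.
wl = torch.linspace(0, 1, 256)
conc = torch.rand(128, 1)
spectra = (conc * torch.exp(-((wl - 0.6) ** 2) / 0.002)   # target peak
           + 0.3 * torch.exp(-((wl - 0.3) ** 2) / 0.01)   # matrix background
           + 0.02 * torch.randn(128, 256)).unsqueeze(1)

model, loss_fn = Spectrum1DCNN(), nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(spectra), conc)
    loss.backward()
    opt.step()
print(f"training MSE: {loss.item():.4f}")
```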
For the more informative three-dimensional fluorescence spectra, i.e., the excitation–emission matrix (EEM), 2D-CNNs can exploit their proven strengths in image processing [131]. An EEM can be treated directly as a two-dimensional image, in which the target fluorescence peaks, Rayleigh/Raman scattering ridges, and background fluorescence regions present different shapes and texture features. A 2D-CNN can efficiently capture these spatial features for the accurate segmentation and identification of different signal sources [132,133]. For example, by using semantic segmentation networks such as U-Net, researchers can accurately “key out” the pure target fluorescence regions from EEMs and then quantify them, eliminating scattering and the contributions of other fluorescent substances [134]. In addition, autoencoders and their variants (e.g., denoising autoencoders) have been widely used for spectral denoising and feature extraction [135]; by compressing the spectrum into a low-dimensional “bottleneck” layer and then reconstructing it, the model is forced to learn the intrinsic features of the signal, efficiently filtering out random noise and irrelevant information [136].
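The denoising idea can be sketched with a small convolutional autoencoder operating on synthetic 64 × 64 EEM “images” containing a target peak and a diagonal scattering ridge; the data and architecture below are illustrative assumptions, and a real pipeline would use measured EEMs and, for segmentation, a U-Net-style network.

```python
# Hedged sketch of a convolutional denoising autoencoder for EEM maps.
import torch
import torch.nn as nn

class EEMDenoiser(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(16, 8, 2, stride=2), nn.ReLU(),    # 16x16 -> 32x32
            nn.ConvTranspose2d(8, 1, 2, stride=2),                # 32x32 -> 64x64
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Synthetic EEMs: a Gaussian "target peak" plus a diagonal scattering ridge and noise.
ex, em = torch.meshgrid(torch.linspace(0, 1, 64), torch.linspace(0, 1, 64), indexing="ij")
clean = torch.exp(-((ex - 0.4) ** 2 + (em - 0.6) ** 2) / 0.01).unsqueeze(0).unsqueeze(0)
clean = clean.repeat(32, 1, 1, 1)
noisy = clean + 0.2 * torch.exp(-((ex - em) ** 2) / 0.002) + 0.05 * torch.randn_like(clean)

model, loss_fn = EEMDenoiser(), nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(model(noisy), clean)   # reconstruct clean maps from corrupted input
    loss.backward()
    opt.step()
print(f"reconstruction MSE: {loss.item():.4f}")
```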

3.2. Fluorescent Probe Arrays: Chemical Fingerprinting with Deep Learning

In the face of multi-component mixtures with highly similar chemical structures (e.g., multiple carbamate pesticides, or honey from different origins), a single selective probe often falls short of the task. A smarter strategy is to construct an array of multiple semi-selective (or cross-reactive) fluorescent probes, a fluorescent “chemical nose” or “chemical tongue” borrowing from the biological olfactory and gustatory systems [137]. Each probe in the array produces a different level of fluorescence response when interacting with the analyte system, and the combined responses of all the probes form a unique high-dimensional “chemical fingerprint” of the sample [138,139].
Deep learning models, with their powerful nonlinear classification capabilities, are ideal tools for parsing these complex chemical fingerprints. The response vectors of the probe arrays can be fed directly into a densely connected convolutional network (DenseNet), whose dense inter-layer connections encourage feature reuse and improve gradient flow, enhancing its capacity to learn complex hierarchical representations from high-dimensional data and to find, through multilayered nonlinear transformations, decision boundaries that separate different sample classes [138]. If the response signals of a probe array are arranged into a two-dimensional matrix (e.g., rows representing different probes and columns representing responses at different excitation wavelengths), they can be processed using CNNs [140]. The ability of CNN convolutional kernels to automatically learn local patterns of synergistic responses between probes, as well as global response compositions, is essential for identifying the subtle “fingerprint” differences arising from multicomponent synergistic interactions [141].
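A minimal sketch of fingerprint classification is given below; for clarity it uses a small fully connected network (not a DenseNet) and synthetic response vectors from a hypothetical eight-probe array, so the class patterns and network size are illustrative assumptions only.

```python
# Minimal sketch: classify probe-array "chemical fingerprints" into sample classes.
import torch
import torch.nn as nn

n_probes, n_classes = 8, 4
classifier = nn.Sequential(
    nn.Linear(n_probes, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, n_classes),
)

# Synthetic fingerprints: each class has a characteristic mean response pattern.
torch.manual_seed(0)
class_patterns = torch.rand(n_classes, n_probes)
labels = torch.randint(0, n_classes, (256,))
fingerprints = class_patterns[labels] + 0.05 * torch.randn(256, n_probes)

loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.Adam(classifier.parameters(), lr=1e-2)
for _ in range(300):
    opt.zero_grad()
    loss = loss_fn(classifier(fingerprints), labels)
    loss.backward()
    opt.step()

accuracy = (classifier(fingerprints).argmax(dim=1) == labels).float().mean()
print(f"training accuracy: {accuracy:.2f}")
```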
As shown in Figure 5, researchers have successfully used probe arrays based on different fluorescent nanomaterials in combination with deep learning models to identify a wide range of heavy metal ions [142,143,144,145], pesticide residues [146,147], amino acids [148], illegal additives [149,150,151], food freshness [130,152,153,154,155], food adulteration [156,157], and foodborne pathogens and their toxins [158,159]. Table 2 lists typical cases of fluorescent probes combined with deep learning or machine learning techniques for food analysis. The choice of specific deep learning architectures in these applications is driven by task requirements; for instance, ResNet-101 is often favored for its ability to extract deep hierarchical features from complex spectral or image data, while YOLO’s real-time object detection capability makes it highly suitable for tasks like dynamic freshness assessment. This “low-selectivity probe, high-intelligence algorithm” strategy couples computational intelligence with probe chemistry: deep learning enhances spectral interpretation for multi-component systems while chemical design principles continue to govern probe development, thereby relaxing the demanding selectivity requirements on probes and opening new paths to solve thorny, multi-component food safety problems.

3.3. Kinetic Process Modeling: Nonlinear Quantification with RNNs

Many fluorescence sensing processes are inherently dynamic. Examples include organophosphorus pesticide detection based on enzyme inhibition, where the fluorescence signal changes with the gradual loss of acetylcholinesterase (AChE) activity [164]; sensing based on conformational changes of a nucleic acid aptamer upon target binding [165], where the fluorescence resonance energy transfer (FRET) efficiency evolves over time [166]; and adsorption or catalytic processes between nanomaterials and analytes [167,168]. The kinetic profiles of these fluorescence signals over time contain far richer information than single endpoint readings, such as reaction rates and delay times, and these dynamic features are crucial for distinguishing between different types or concentrations of analytes [169]. RNNs (especially LSTMs and GRUs) are uniquely suited to such tasks because they learn temporal patterns directly from sequence data, bypassing the need for predefined kinetic equations, which often fail under complex nonlinearities or noise.
Recurrent Neural Networks (RNNs), particularly their advanced variants that overcome the vanishing gradient problem, Long Short-Term Memory networks (LSTMs) [170,171,172] and Gated Recurrent Units (GRUs) [173], are specifically designed to process and analyze sequence data. Taking fluorescence kinetic features as an input time series, an LSTM can capture long-term dependencies and subtle trends of the signal in the time dimension using its internal “memory cells” [174]. Shi et al. [175] built LSTM and radial basis function neural network (RBFNN) models based on optimized excitation–emission matrices (EEMs) of fish eye fluid to accurately predict changes in rainbow trout freshness under non-isothermal storage conditions. Wei et al. utilized an edge-deployed multimodal nano-sensor array combined with a CNN-LSTM deep learning model to achieve the real-time monitoring of multi-pollutant water quality [176]. Separately, a key finding is that an LSTM can accurately predict the final result from the initial trend of the curve, at an early stage when the reaction is still far from equilibrium, thereby reducing detection time from hours to tens of minutes [177]. While powerful, these models demand significant computational resources for training and may lack interpretability compared to simpler kinetic models, posing challenges for real-time edge deployment or mechanistic studies. In addition, for sensing systems with highly nonlinear dose–response relationships, deep neural networks can fit these complex functional relationships with great accuracy, ensuring accurate quantification over a wide dynamic range [178].
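The early-prediction idea can be sketched as follows: an LSTM reads only the first 30% of synthetic first-order quenching curves and regresses the endpoint signal. The kinetic model, noise level, and truncation point are illustrative assumptions, not those of the cited studies.

```python
# Hedged sketch of endpoint prediction from partial fluorescence kinetic curves.
import torch
import torch.nn as nn

class KineticLSTM(nn.Module):
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, time_steps, 1)
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])         # predict endpoint from the last hidden state

# Synthetic quenching curves F(t) = F_inf + (1 - F_inf) * exp(-k t) with noise;
# only the first 30 of 100 time points are shown to the model.
t = torch.linspace(0, 1, 100)
k = 2.0 + 3.0 * torch.rand(256, 1)
f_inf = 0.2 + 0.5 * torch.rand(256, 1)
curves = f_inf + (1 - f_inf) * torch.exp(-k * t) + 0.01 * torch.randn(256, 100)
early, endpoint = curves[:, :30].unsqueeze(-1), curves[:, -1:]

model, loss_fn = KineticLSTM(), nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(300):
    opt.zero_grad()
    loss = loss_fn(model(early), endpoint)
    loss.backward()
    opt.step()
print(f"endpoint-prediction MSE from 30% of the curve: {loss.item():.5f}")
```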

4. Core Challenges, Deep Reflections and Future Scenarios

4.1. Current Core Challenges

1. The data gap and quality dilemma: The power of deep learning is rooted in large amounts of high-quality, well-labelled data. However, in the fields of chemistry and sensing, access to such data is an extremely expensive “luxury”. Publicly available, standardised databases of fluorescent molecular properties are still limited in size and mostly restricted to basic photophysical parameters, lacking data directly related to sensing performance [179]. On the sensing application side, fluorescence response data involving real food matrices are even more scarce, heterogeneous, and difficult to reproduce. Constructing large-scale, open, shared databases that follow the FAIR principles (findable, accessible, interoperable, and reusable) is the cornerstone for advancing the field. At the same time, it is also crucial to develop few-shot learning (FSL) to reduce the reliance on massive labelled data [180]. However, FSL performance remains sensitive to data distribution shifts and may struggle with highly nonlinear chemosensing relationships. In fluorescent probe design, this scarcity critically impedes the prediction of analyte-binding affinities and the optimization of Stokes shifts, key parameters for ratiometric sensing.
2. The contradiction between model interpretability (XAI) and scientific discovery: Deep learning models, especially deep networks, are opaque in their decision-making, which is in tension with the scientific spirit of pursuing cause, effect, and mechanism. In probe design, we want to know not only that an AI-designed molecule “works” but also why it works, i.e., what its key structural motifs or mechanisms of action are. In signal analysis, understanding which spectral features or probe response patterns a model relies on can help us discover new sensing mechanisms or optimise array design. Therefore, the development of interpretable AI techniques, such as using attention mechanisms [181] to visualise the focus of a model’s attention, is essential to make the leap from “black-box” prediction to “white-box” insight and thus truly guide scientific innovation. Current XAI methods still struggle to provide chemically meaningful interpretations for complex nonlinear sensor responses. For fluorescence signal interpretation, this opacity hinders the identification of the critical spectral bands or quenching pathways that govern target recognition specificity.
3. Generalisability and domain adaptation challenges: A model trained in one laboratory, on one instrument, for one food matrix (e.g., plain milk) may suffer a sharp drop in performance when applied in another laboratory, on another instrument, or to another matrix (e.g., yoghurt, milk powder, juice). This is a typical domain shift problem. Developing more robust and generalisable model architectures, together with the effective use of techniques such as transfer learning [182,183], which allows models to be rapidly fine-tuned and adapted with only a small amount of data from new scenarios, is key to the transition from “lab toys” to “industrial-grade tools”. Transfer learning efficacy is constrained when source-target domain gaps are large, particularly across different instrumentation or food matrices with varying interferents. In real-world probe deployment, matrix-induced fluorescence quenching or scattering effects frequently trigger catastrophic domain shifts beyond the reach of standard transfer learning.
4. Integration of physical and chemical laws: Purely data-driven deep learning models are “agnostic” in the sense that they carry no a priori knowledge of physics or chemistry and may sometimes make predictions that defy basic scientific principles (e.g., predicting negative quantum yields or generating chemically unstable molecules). Embedding known laws of physics and chemistry (e.g., Kasha’s rule, molecular orbital theory, reaction kinetics equations) as strong constraints or regularisation terms in the architecture or loss function of neural networks is an important direction for improving the prediction accuracy, data efficiency, and extrapolation capability of the models [184,185]. However, mathematically formalizing complex chemical principles as model constraints remains challenging, and overly rigid physical embeddings may limit the discovery of novel sensing mechanisms. Specifically for fluorescent probes, formalizing rules such as Kasha–Vavilov behavior or FRET distance constraints remains intractable for supramolecular systems, limiting rational design.

4.2. Future Development Landscape and Outlook

1. Application of Physics-Informed Neural Networks (PINNs): As a direct solution to the fourth challenge above, Physics-Informed Neural Networks (PINNs) [186] encode partial differential equations (e.g., equations describing mass-transfer or reaction kinetics) directly into the loss function of the neural network, forcing the model output to fit the data while also obeying the laws of physics. For fluorescence sensing, this means intelligent analytical models can be constructed that simultaneously fit experimental data and obey fluorescence quenching theory (e.g., the Stern–Volmer equation) or enzyme kinetic models (e.g., the Michaelis–Menten equation), yielding more reliable and physically interpretable results; a minimal sketch of such a physics-constrained loss is given after this list. Specifically for fluorescent probes, PINNs could enforce, for example, a quantum yield ≥ 0.4 in aqueous media, Förster distance constraints in FRET pairs, and photobleaching kinetics ≤ 5%/min. However, practical implementation faces challenges such as the complexity of designing effective loss functions that balance data fidelity with physical constraints for intricate fluorescence phenomena, and the requirement for sufficient high-quality experimental data to train robust models.
2. Multimodal sensing information fusion and the rise of the Transformer architecture: By fusing information from different sensing modalities (e.g., fluorescence spectra, Raman spectra, mass spectrometry, electrochemical signals, hyperspectral images), a more comprehensive and robust characterisation of the sample can be obtained, overcoming the limitations of any single modality. The Transformer architecture, initially designed for natural language processing, has shown the potential to outperform CNNs in areas such as computer vision thanks to its powerful long-range dependency capture and parallel processing advantages [187,188]. Applying it to multimodal sensing data is expected to extract deeper, cross-modal features, thereby significantly improving the accuracy and reliability of complex food system analysis. For fluorescence-centric fusion, promising directions include: first, using time-resolved fluorescence decay (ns–μs) as the core modality; second, applying cross-attention between fluorescence lifetime and Raman peaks; and third, embedding solvent polarity effects in the positional encoding. A key caveat is the current scarcity of large, well-annotated multimodal datasets specific to food fluorescence analysis needed for effective Transformer training, alongside the significant computational resources these large models often require.
3. Edge computing and lightweight model deployment: To achieve true on-site, real-time food safety monitoring, it is crucial to deploy complex deep learning models to resource-constrained portable devices or sensor nodes. This requires the vigorous development of model compression techniques (e.g., knowledge distillation [189]), the design of lightweight network architectures, and the incorporation of edge computing frameworks [190], which allow data processing and intelligent decision making to be performed close to the data source, thus lowering latency, preserving data privacy, and reducing reliance on cloud computing resources. While edge deployment of intelligent sensing systems offers potential advantages for real-time monitoring, several technical challenges remain. Current hardware constraints (e.g., computational capacity and energy efficiency) and the demand for model robustness in dynamic environments represent active research areas. Although advances in energy-efficient chipsets and adaptive learning algorithms may help mitigate these limitations, further validation of their performance in real-world scenarios is still required. Requirements critical for fluorescence field deployment include: first, real-time lifetime fitting on ≤100 mW processors; second, on-device correction of temperature-dependent Stokes shifts; and third, energy-efficient inference under probe photobleaching drift. Real-world deployment on edge devices necessitates careful trade-offs between model complexity/accuracy and the strict limitations of power consumption, memory, and processing capability inherent in portable hardware, which can limit the sophistication of the algorithms that can be reliably run.
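As the sketch referenced in item 1 of this list, the following minimal example shows how a Stern–Volmer residual can be added to a data-fitting loss so that a network's predictions respect quenching physics; the synthetic calibration data, the learnable Ksv, and the loss weighting are illustrative assumptions rather than a validated PINN implementation.

```python
# Hedged sketch of a physics-informed loss: fit quenching data while enforcing
# the Stern-Volmer relation F0/F = 1 + Ksv*[Q] at unlabeled collocation points.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))   # [Q] -> F0/F
log_ksv = torch.zeros(1, requires_grad=True)                          # learnable Ksv > 0

# Synthetic calibration points obeying Stern-Volmer with Ksv = 4.0 plus noise.
q = torch.linspace(0, 1, 20).unsqueeze(1)
ratio_obs = 1 + 4.0 * q + 0.05 * torch.randn_like(q)

# Collocation points where only the physics residual (no measurement) is enforced.
q_phys = torch.rand(200, 1)

opt = torch.optim.Adam(list(net.parameters()) + [log_ksv], lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    data_loss = ((net(q) - ratio_obs) ** 2).mean()
    physics_loss = ((net(q_phys) - (1 + log_ksv.exp() * q_phys)) ** 2).mean()
    loss = data_loss + 1.0 * physics_loss          # weight balances fit vs. physics
    loss.backward()
    opt.step()
print(f"recovered Ksv = {log_ksv.exp().item():.2f} (synthetic ground truth 4.0)")
```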

5. Conclusions

Deep learning is systematically reshaping every aspect of fluorescence sensing technology, from molecular design to signal analysis to practical application, with unprecedented depth and breadth. At the source, it raises the performance ceiling and R&D efficiency of sensing systems by accelerating and empowering the rational design and ab initio creation of probes; on the application side, its powerful pattern recognition, nonlinear modelling, and multi-dimensional information mining capabilities significantly enhance the accuracy, robustness, and multitasking capability of fluorescence analysis in complex food matrices. Despite the challenges of data, interpretability, generalisation, and physical consistency, with continuing breakthroughs in and the integration of cutting-edge technologies such as physics-informed neural networks, multimodal fusion, and edge AI, it is reasonable to believe that an AI-driven, high-throughput, high-precision intelligent food safety monitoring network will be realised, building an impenetrable technological line of defence to protect people’s “safety on the tip of the tongue”.

Author Contributions

Y.S.: Writing—original draft, Formal analysis, Funding acquisition; S.Y.: Writing—original draft, Editing and revisions; W.L. (Wenting Li): Investigation and editing; Y.W.: Formal analysis and revisions; W.L. (Weiran Luo): Formal analysis and revisions. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key Research and Development Program of China (grant numbers 2023YFE0105500 and 2024YFE0117000), the National Natural Science Foundation of China (grant number 32502343), and the Open Project of the Key Laboratory of Modern Agricultural Equipment and Technology, Ministry of Education, China (grant number MAET202332).

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this review.

References

  1. World Health Organization. Draft WHO Global Strategy for Food Safety 2022–2030. In Towards Stronger Food Safety Systems and Global Cooperation; World Health Organization: Geneva, Switzerland, 2022; Available online: https://www.who.int/publications/i/item/9789240057685 (accessed on 1 June 2025).
  2. Guo, W.; Pan, B.; Sakkiah, S.; Yavas, G.; Ge, W.; Zou, W.; Tong, W.; Hong, H. Persistent Organic Pollutants in Food: Contamination Sources, Health Effects and Detection Methods. Int. J. Environ. Res. Public Health 2019, 16, 4361. [Google Scholar] [CrossRef]
  3. Focker, M.; van Asselt, E.; Berendsen, B.; van de Schans, M.; van Leeuwen, S.; Visser, S.; van der Fels-Klerx, H. Review of food safety hazards in circular food systems in Europe. Food Res. Int. 2022, 158, 111505. [Google Scholar] [CrossRef] [PubMed]
  4. Long, L.; Han, Y.; Yuan, X.; Cao, S.; Liu, W.; Chen, Q.; Wang, K.; Han, Z. A novel ratiometric near-infrared fluorescent probe for monitoring cyanide in food samples. Food Chem. 2020, 331, 127359. [Google Scholar] [CrossRef] [PubMed]
  5. Chen, X.; Zhao, C.; Zhao, Q.; Yang, Y.; Yang, S.; Zhang, R.; Wang, Y.; Wang, K.; Qian, J.; Long, L. Construction of a Colorimetric and Near-Infrared Ratiometric Fluorescent Sensor and Portable Sensing System for On-Site Quantitative Measurement of Sulfite in Food. Foods 2024, 13, 1758. [Google Scholar] [CrossRef]
  6. Peris-Vicente, J.; Peris-García, E.; Albiol-Chiva, J.; Durgbanshi, A.; Ochoa-Aranda, E.; Carda-Broch, S.; Bose, D.; Esteve-Romero, J. Liquid chromatography, a valuable tool in the determination of antibiotics in biological, food and environmental samples. Microchem. J. 2022, 177, 107309. [Google Scholar] [CrossRef]
  7. Hu, M.; Ben, Y.; Wong, M.H.; Zheng, C. Trace Analysis of Multiclass Antibiotics in Food Products by Liquid Chromatography-Tandem Mass Spectrometry: Method Development. J. Agric. Food Chem. 2021, 69, 1656–1666. [Google Scholar] [CrossRef]
  8. Sun, Q.; Dong, Y.; Wen, X.; Zhang, X.; Hou, S.; Zhao, W.; Yin, D. A review on recent advances in mass spectrometry analysis of harmful contaminants in food. Front. Nutr. 2023, 10, 1244459. [Google Scholar] [CrossRef]
  9. Shan, Y.; Lu, Y.-N.; Yi, W.; Wang, B.; Li, J.; Guo, J.; Li, W.; Yin, Y.; Wang, S.; Liu, F. On-site food safety detection: Opportunities, advancements, and prospects. Biosens. Bioelectron. X 2023, 14, 100350. [Google Scholar] [CrossRef]
  10. Umapathi, R.; Park, B.; Sonwal, S.; Rani, G.M.; Cho, Y.; Huh, Y.S. Advances in optical-sensing strategies for the on-site detection of pesticides in agricultural foods. Trends Food Sci. Technol. 2022, 119, 69–89. [Google Scholar] [CrossRef]
  11. Zhang, X.; Yang, J.; Lin, T.; Ying, Y. Food and agro-product quality evaluation based on spectroscopy and deep learning: A review. Trends Food Sci. Technol. 2021, 112, 431–441. [Google Scholar] [CrossRef]
  12. Gupta, R.; Raza, N.; Bhardwaj, S.K.; Vikrant, K.; Kim, K.-H.; Bhardwaj, N. Advances in nanomaterial-based electrochemical biosensors for the detection of microbial toxins, pathogenic bacteria in food matrices. J. Hazard. Mater. 2021, 401, 123379. [Google Scholar] [CrossRef] [PubMed]
  13. He, H.; Sun, D.-W.; Wu, Z.; Pu, H.; Wei, Q. On-off-on fluorescent nanosensing: Materials, detection strategies and recent food applications. Trends Food Sci. Technol. 2022, 119, 243–256. [Google Scholar] [CrossRef]
  14. Shen, Y.; Wei, Y.; Zhu, C.; Cao, J.; Han, D.-M. Ratiometric fluorescent signals-driven smartphone-based portable sensors for onsite visual detection of food contaminants. Coord. Chem. Rev. 2022, 458, 214442. [Google Scholar] [CrossRef]
  15. Ouyang, Q.; Liu, Y.; Chen, Q.; Guo, Z.; Zhao, J.; Li, H.; Hu, W. Rapid and specific sensing of tetracycline in food using a novel upconversion aptasensor. Food Control 2017, 81, 156–163. [Google Scholar] [CrossRef]
  16. Zhang, B.; Li, H.; Pan, W.; Chen, Q.; Ouyang, Q.; Zhao, J. Dual-Color Upconversion Nanoparticles (UCNPs)-Based Fluorescent Immunoassay Probes for Sensitive Sensing Foodborne Pathogens. Food Anal. Methods 2017, 10, 2036–2045. [Google Scholar] [CrossRef]
  17. Gu, H.; Huang, X.; Chen, Q.; Sun, Y. Rapid Assessment of Total Polar Material in Used Frying Oils Using Manganese Tetraphenylporphyrin Fluorescent Sensor with Enhanced Sensitivity. Food Anal. Methods 2020, 13, 2080–2086. [Google Scholar] [CrossRef]
  18. Zhang, W.; Zhong, H.; Zhao, P.; Shen, A.; Li, H.; Liu, X. Carbon quantum dot fluorescent probes for food safety detection: Progress, opportunities and challenges. Food Control 2022, 133, 108591. [Google Scholar] [CrossRef]
  19. Niu, C.; Yao, Z.; Jiang, S. Synthesis and application of quantum dots in detection of environmental contaminants in food: A comprehensive review. Sci. Total Environ. 2023, 882, 163565. [Google Scholar] [CrossRef]
  20. Yang, X.; Wang, J.; Zhang, Z.; Zhang, B.; Du, X.; Zhang, J.; Wang, J. BODIPY-based fluorescent probe for cysteine detection and its applications in food analysis, test strips and biological imaging. Food Chem. 2023, 416, 135730. [Google Scholar] [CrossRef]
  21. Chen, Q.; Li, J.; Wang, Y.; Tian, M.; Liang, T.; Zhong, K.; Yan, X.; Tang, L. A quinolinium-based colorimetric and NIR fluorescent dual-channel sensing platform for specific detection of bisulfite in food, traditional Chinese medicine and living cells. Dye. Pigment. 2025, 239, 112767. [Google Scholar] [CrossRef]
  22. Chen, Z.; Ma, J.; Sun, D.W. Aggregates-based fluorescence sensing technology for food hazard detection: Principles, improvement strategies, and applications. Compr. Rev. Food Sci. Food Saf. 2023, 22, 2977–3010. [Google Scholar] [CrossRef]
  23. Skorjanc, T.; Shetty, D.; Valant, M. Covalent organic polymers and frameworks for fluorescence-based sensors. ACS Sens. 2021, 6, 1461–1481. [Google Scholar] [CrossRef] [PubMed]
  24. Muthusamy, S.; Rajalakshmi, K.; Ahn, D.-H.; Kannan, P.; Zhu, D.; Nam, Y.-S.; Choi, K.Y.; Luo, Z.; Song, J.-W.; Xu, Y. Spontaneous detection of F and viscosity using a multifunctional tetraphenylethene-lepidine probe: Exploring environmental applications. Food Chem. 2025, 466, 142147. [Google Scholar] [CrossRef] [PubMed]
  25. Qiu, H.; Gao, L.; Wang, J.; Pan, J.; Yan, Y.; Zhang, X. A precise and efficient detection of Beta-Cyfluthrin via fluorescent molecularly imprinted polymers with ally fluorescein as functional monomer in agricultural products. Food Chem. 2017, 217, 620–627. [Google Scholar] [CrossRef] [PubMed]
  26. Linghu, X.; Qiu, J.; Wang, S.; Lu, Y. Fluorescence immunoassay based on magnetic separation and ZnCdSe/ZnS quantum dots as a signal marker for intelligent detection of sesame allergen in foods. Talanta 2023, 256, 124323. [Google Scholar] [CrossRef]
  27. Zhou, H.; Wang, H.; Li, X.; Wang, L.; Huang, H.; Qiu, H.; Cong, W.; Wang, M.; Zhang, J. Synthesis of group I–III–VI semiconductor quantum dots and its application in food safety testing. Rev. Anal. Chem. 2022, 41, 324–336. [Google Scholar] [CrossRef]
  28. Sun, Y.; Zhai, X.; Zou, X.; Shi, J.; Huang, X.; Li, Z. A Ratiometric Fluorescent Sensor Based on Silicon Quantum Dots and Silver Nanoclusters for Beef Freshness Monitoring. Foods 2023, 12, 1464. [Google Scholar] [CrossRef]
  29. Liu, L.; Mi, Z.; Huo, X.; Yuan, L.; Bao, Y.; Liu, Z.; Feng, F. A label-free fluorescence nanosensor based on nitrogen and phosphorus co-doped carbon quantum dots for ultra-sensitive detection of new coccine in food samples. Food Chem. 2022, 368, 130829. [Google Scholar] [CrossRef]
  30. Liu, Y.; Zan, M.; Cao, L.; Peng, J.; Wang, P.; Pang, X.; Zhang, Y.; Li, L.; Mei, Q.; Dong, W.-F. F-doped silicon quantum dots as a novel fluorescence nanosensor for quantitative detection of new coccine and application in food samples. Microchem. J. 2022, 179, 107453. [Google Scholar] [CrossRef]
  31. Hu, X.; Li, Y.; Xu, Y.; Gan, Z.; Zou, X.; Shi, J.; Huang, X.; Li, Z.; Li, Y. Green one-step synthesis of carbon quantum dots from orange peel for fluorescent detection of Escherichia coli in milk. Food Chem. 2021, 339, 127775. [Google Scholar] [CrossRef]
  32. Yin, M.; Wang, W.; Wei, J.; Chen, X.; Chen, Q.; Chen, X.; Oyama, M. Novel dual-emissive fluorescent immunoassay for synchronous monitoring of okadaic acid and saxitoxin in shellfish. Food Chem. 2022, 368, 130856. [Google Scholar] [CrossRef] [PubMed]
  33. Luo, L.; Li, J.; Bi, X.; Jiang, P.; Li, L.; Qiao, G.; You, T. Engineering “three-in-one” fluorescent nanozyme of Ce-Au NCs for on-site visual detection of Hg2+. J. Hazard. Mater. 2024, 476, 134967. [Google Scholar] [CrossRef] [PubMed]
  34. Tan, K.; Ma, H.; Mu, X.; Wang, Z.; Wang, Q.; Wang, H.; Zhang, X.-D. Application of gold nanoclusters in fluorescence sensing and biological detection. Anal. Bioanal. Chem. 2024, 416, 5871–5891. [Google Scholar] [CrossRef] [PubMed]
  35. Huang, X.; Sun, W.; Li, Z.; Shi, J.; Zhang, N.; Zhang, Y.; Zhai, X.; Hu, X.; Zou, X. Hydrogen sulfide gas sensing toward on-site monitoring of chilled meat spoilage based on ratio-type fluorescent probe. Food Chem. 2022, 396, 133654. [Google Scholar] [CrossRef]
  36. Li, H.; Wu, Y.; Shoaib, M.; Bei, Q.; Chen, Q. A dual-recognition UCNPs sensor for sensitive detection of tetracycline in food using computer-designed silica-grafted paper microfluidic strategy. Sens. Actuators B Chem. 2025, 438, 137799. [Google Scholar] [CrossRef]
  37. Bahari, H.R.; Mousavi Khaneghah, A.; Eş, I. Upconversion nanoparticles-modified aptasensors for highly sensitive mycotoxin detection for food quality and safety. Compr. Rev. Food Sci. Food Saf. 2024, 23, e13369. [Google Scholar] [CrossRef]
  38. Li, Y.; Li, Y.; Zhang, D.; Tan, W.; Shi, J.; Li, Z.; Liu, H.; Yu, Y.; Yang, L.; Wang, X.; et al. A fluorescence resonance energy transfer probe based on functionalized graphene oxide and upconversion nanoparticles for sensitive and rapid detection of zearalenone. LWT 2021, 147, 111541. [Google Scholar] [CrossRef]
  39. Liu, Y.; Ouyang, Q.; Li, H.; Chen, M.; Zhang, Z.-Z.; Chen, Q. Turn-On Fluoresence Sensor for Hg2+ in Food Based on FRET between Aptamers-Functionalized Upconversion Nanoparticles and Gold Nanoparticles. J. Agric. Food Chem. 2018, 66, 6188–6195. [Google Scholar] [CrossRef]
  40. Zhu, A.; Ali, S.; Jiao, T.; Wang, Z.; Xu, Y.; Ouyang, Q.; Chen, Q. Facile synthesis of fluorescence-SERS dual-probe nanocomposites for ultrasensitive detection of sulfur-containing gases in water and beer samples. Food Chem. 2023, 420, 136095. [Google Scholar] [CrossRef]
  41. Marimuthu, M.; Arumugam, S.S.; Sabarinathan, D.; Li, H.; Chen, Q. Metal organic framework based fluorescence sensor for detection of antibiotics. Trends Food Sci. Technol. 2021, 116, 1002–1028. [Google Scholar] [CrossRef]
  42. Liu, R.; Ali, S.; Haruna, S.A.; Ouyang, Q.; Li, H.; Chen, Q. Development of a fluorescence sensing platform for specific and sensitive detection of pathogenic bacteria in food samples. Food Control 2022, 131, 108419. [Google Scholar] [CrossRef]
  43. Xu, Y.; Kutsanedzie, F.Y.H.; Ali, S.; Wang, P.; Li, C.; Ouyang, Q.; Li, H.; Chen, Q. Cysteamine-mediated upconversion sensor for lead ion detection in food. J. Food Meas. Charact. 2021, 15, 4849–4857. [Google Scholar] [CrossRef]
  44. Wang, L.; Haruna, S.A.; Ahmad, W.; Wu, J.; Chen, Q.; Ouyang, Q. Tunable multiplexed fluorescence biosensing platform for simultaneous and selective detection of paraquat and carbendazim pesticides. Food Chem. 2022, 388, 132950. [Google Scholar] [CrossRef] [PubMed]
  45. Wang, P.; Li, H.; Hassan, M.; Guo, Z.; Zhang, Z.-Z.; Chen, Q. Fabricating an Acetylcholinesterase Modulated UCNPs-Cu2+ Fluorescence Biosensor for Ultrasensitive Detection of Organophosphorus Pesticides-Diazinon in Food. J. Agric. Food Chem. 2019, 67, 4071–4079. [Google Scholar] [CrossRef] [PubMed]
  46. Chen, Z.-J.; Huang, A.-J.; Dong, X.-X.; Zhang, Y.-F.; Zhu, L.; Luo, L.; Xu, Z.-L.; Wang, H. A simple and sensitive fluoroimmunoassay based on the nanobody-alkaline phosphatase fusion protein for the rapid detection of fenitrothion. Front. Sustain. Food Syst. 2023, 7, 1320931. [Google Scholar] [CrossRef]
  47. Wu, W.; Ahmad, W.; Hassan, M.; Wu, J.; Ouyang, Q.; Chen, Q. An upconversion biosensor based on inner filter effect for dual-role recognition of sulfadimethoxine in aquatic samples. Food Chem. 2024, 437, 137832. [Google Scholar] [CrossRef] [PubMed]
  48. Bi, X.; Li, L.; Luo, L.; Liu, X.; Li, J.; You, T. A ratiometric fluorescence aptasensor based on photoinduced electron transfer from CdTe QDs to WS2 NTs for the sensitive detection of zearalenone in cereal crops. Food Chem. 2022, 385, 132657. [Google Scholar] [CrossRef]
  49. Ding, X.; Ahmad, W.; Wu, J.; Rong, Y.; Ouyang, Q.; Chen, Q. Bipyridine-mediated fluorescence charge transfer process based on copper ion grafted upconversion nanoparticle platform for ciprofloxacin sensing in aquatic products. Food Chem. 2023, 404, 134761. [Google Scholar] [CrossRef]
  50. Fang, X.; Liu, T.; Xue, C.; Xue, G.; Wu, M.; Liu, P.; Hammock, B.D.; Lai, W.; Peng, J.; Zhang, C. Competitive ratiometric fluorescent lateral flow immunoassay based on dual emission signal for sensitive detection of chlorothalonil. Food Chem. 2024, 433, 137200. [Google Scholar] [CrossRef]
  51. Marimuthu, M.; Xu, K.; Song, W.; Chen, Q.; Wen, H. Safeguarding food safety: Nanomaterials-based fluorescent sensors for pesticide tracing. Food Chem. 2025, 463, 141288. [Google Scholar] [CrossRef]
  52. Wang, Y.; Xu, J.; Qiu, Y.; Li, P.; Liu, B.; Yang, L.; Barnych, B.; Hammock, B.D.; Zhang, C. Highly Specific Monoclonal Antibody and Sensitive Quantum Dot Beads-Based Fluorescence Immunochromatographic Test Strip for Tebuconazole Assay in Agricultural Products. J. Agric. Food Chem. 2019, 67, 9096–9103. [Google Scholar] [CrossRef]
  53. Xu, Y.; Hassan, M.; Sharma, A.S.; Li, H.; Chen, Q. Recent advancement in nano-optical strategies for detection of pathogenic bacteria and their metabolites in food safety. Crit. Rev. Food Sci. Nutr. 2023, 63, 486–504. [Google Scholar] [CrossRef]
  54. Gan, Z.; Hu, X.; Xu, X.; Zhang, W.; Zou, X.; Shi, J.; Zheng, K.; Arslan, M. A portable test strip based on fluorescent europium-based metal–organic framework for rapid and visual detection of tetracycline in food samples. Food Chem. 2021, 354, 129501. [Google Scholar] [CrossRef]
  55. Xu, W.; Zhang, W.; Shen, Z.; Xu, W.; Zhao, J.; Li, H.; He, Q.; Fu, Y.; Cheng, J. Tailoring Super-Performed Chemo-Sensor via Simulation-Modeling and MEMS-Screening. Adv. Sci. 2025, 12, e2412937. [Google Scholar] [CrossRef] [PubMed]
  56. Sharma, A.S.; Marimuthu, M.; Varghese, A.W.; Wu, J.; Xu, J.; Xiaofeng, L.; Devaraj, S.; Lan, Y.; Li, H.; Chen, Q. A review of biomolecules conjugated lanthanide up-conversion nanoparticles-based fluorescence probes in food safety and quality monitoring applications. Crit. Rev. Food Sci. Nutr. 2024, 64, 6129–6159. [Google Scholar] [CrossRef] [PubMed]
  57. Li, L.; Peng, Z.; Zeng, Y.; Liu, G. Recent application of near-infrared fluorescence probes in food safety detection. J. Innov. Opt. Heal Sci. 2024, 18, 25300034. [Google Scholar] [CrossRef]
  58. Cai, Y.; Cao, L.; Cai, H.; Yang, W.; Lu, H.; Adila, A.; Zhang, B.; Cao, Y.; Huang, W.; Xu, W.; et al. A rapid microfluidic paper-based chip sensor using ratiometric fluorescence and molecularly imprinted polymers for visual detection of sulfadiazine in actual samples. J. Food Compos. Anal. 2024, 139, 107108. [Google Scholar] [CrossRef]
  59. Radotić, K.; Stanković, M.; Bartolić, D.; Natić, M. Intrinsic Fluorescence Markers for Food Characteristics, Shelf Life, and Safety Estimation: Advanced Analytical Approach. Foods 2023, 12, 3023. [Google Scholar] [CrossRef]
  60. Xu, Y.; Zheng, H.; Sui, J.; Lin, H.; Cao, L. Rapid and Sensitive Fluorescence Detection of Staphylococcus aureus Based on Polyethyleneimine-Enhanced Boronate Affinity Isolation. Foods 2023, 12, 1366. [Google Scholar] [CrossRef]
  61. Wang, Y.; Li, W.; Hu, X.; Zhang, X.; Huang, X.; Li, Z.; Li, M.; Zou, X.; Shi, J. Efficient preparation of dual-emission ratiometric fluorescence sensor system based on aptamer-composite and detection of bis(2-ethylhexyl) phthalate in pork. Food Chem. 2021, 352, 129352. [Google Scholar] [CrossRef]
  62. Shi, B.; Zhang, X.; Li, W.; Liang, N.; Hu, X.; Xiao, J.; Wang, D.; Zou, X.; Shi, J. An intrinsic dual-emitting fluorescence sensing toward tetracycline with self-calibration model based on luminescent lanthanide-functionalized metal-organic frameworks. Food Chem. 2023, 400, 133995. [Google Scholar] [CrossRef]
  63. Chen, X.; Xu, J.; Li, Y.; Zhang, L.; Bi, N.; Gou, J.; Zhu, T.; Jia, L. A novel intelligently integrated MOF-based ratio fluorescence sensor for ultra-sensitive monitoring of TC in water and food samples. Food Chem. 2023, 405, 134899. [Google Scholar] [CrossRef]
  64. Lai, L.; Yan, F.; Chen, G.; Huang, Y.; Huang, L.; Li, D. Recent Progress on Fluorescent Probes in Heavy Metal Determinations for Food Safety: A Review. Molecules 2023, 28, 5689. [Google Scholar] [CrossRef]
  65. Tian, X.; Murfin, L.C.; Wu, L.; Lewis, S.E.; James, T.D. Fluorescent small organic probes for biosensing. Chem. Sci. 2021, 12, 3406–3426. [Google Scholar] [CrossRef] [PubMed]
  66. Kakkar, S.; Gupta, P.; Kumar, N.; Kant, K. Progress in Fluorescence Biosensing and Food Safety towards Point-of-Detection (PoD) System. Biosensors 2023, 13, 249. [Google Scholar] [CrossRef] [PubMed]
  67. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef] [PubMed]
  68. Zhou, X.; Zhao, C.; Sun, J.; Cao, Y.; Yao, K.; Xu, M. A deep learning method for predicting lead content in oilseed rape leaves using fluorescence hyperspectral imaging. Food Chem. 2023, 409, 135251. [Google Scholar] [CrossRef]
  69. You, J.; Li, D.; Wang, Z.; Chen, Q.; Ouyang, Q. Prediction and visualization of moisture content in Tencha drying processes by computer vision and deep learning. J. Sci. Food Agric. 2024, 104, 5486–5494. [Google Scholar] [CrossRef]
  70. Guo, J.; Zhang, K.; Adade, S.Y.S.; Lin, J.; Lin, H.; Chen, Q. Tea grading, blending, and matching based on computer vision and deep learning. J. Sci. Food Agric. 2025, 105, 3239–3251. [Google Scholar] [CrossRef]
  71. Ji, W.; Gao, X.; Xu, B.; Pan, Y.; Zhang, Z.; Zhao, D. Apple target recognition method in complex environment based on improved YOLOv4. J. Food Process. Eng. 2021, 44, e13866. [Google Scholar] [CrossRef]
  72. Shlezinger, N.; Eldar, Y.C. Model-Based Deep Learning. In Foundations and Trends® in Signal Processing; Stanford University: Stanford, CA, USA, 2023; Volume 17, pp. 291–416. [Google Scholar] [CrossRef]
  73. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016. [Google Scholar]
  74. Kadam, G.V.; Dubey, G.P. Survey on One-Shot Learning for Image Recognition using Machine Learning and Deep Learning Techniques. In Proceedings of the 2024 5th International Conference on Smart Electronics and Communication (ICOSEC), Trichy, India, 18–20 September 2024; pp. 1055–1062. [Google Scholar]
  75. Tian, Y.; Sun, J.; Zhou, X.; Yao, K.; Tang, N. Detection of soluble solid content in apples based on hyperspectral technology combined with deep learning algorithm. J. Food Process. Preserv. 2022, 46, e16414. [Google Scholar] [CrossRef]
  76. Wang, N.; Zhang, L.; Li, J.; Zhou, Q.; Yang, H.; Shan, Y.; Chen, Y.; Li, K.; Yu, X. Recent advances in reactive small-molecule fluorescent probes for food safety. Coord. Chem. Rev. 2025, 530, 216480. [Google Scholar] [CrossRef]
  77. Duan, N.; Wang, H.; Li, Y.; Yang, S.; Tian, H.; Sun, B. The research progress of organic fluorescent probe applied in food and drinking water detection. Coord. Chem. Rev. 2021, 427, 213557. [Google Scholar] [CrossRef]
  78. Sharma, A.S.; Ali, S.; Sabarinathan, D.; Murugavelu, M.; Li, H.; Chen, Q. Recent progress on graphene quantum dots-based fluorescence sensors for food safety and quality assessment applications. Compr. Rev. Food Sci. Food Saf. 2021, 20, 5765–5801. [Google Scholar] [CrossRef] [PubMed]
  79. Yuan, Y.; Ji, Z.; Fan, Y.; Xu, Q.; Shi, C.; Lyu, J.; Ertbjerg, P. Deep learning-assisted fluorescence spectroscopy for food quality and safety analysis. Trends Food Sci. Technol. 2025, 156, 104821. [Google Scholar] [CrossRef]
  80. Deshmukh, M.T.; Wankhede, P.; Chakole, N.; Kale, P.D.; Jadhav, M.R.; Kulkarni, M.B.; Bhaiyya, M. Towards intelligent food safety: Machine learning approaches for aflatoxin detection and risk prediction. Trends Food Sci. Technol. 2025, 161, 105055. [Google Scholar] [CrossRef]
  81. Li, Y.; Zhang, W.; Cui, Z.; Shi, L.; Shang, Y.; Ji, Y.; Wang, J. Machine learning-assisted nanosensor arrays: An efficiently high-throughput food detection analysis. Trends Food Sci. Technol. 2024, 149, 104564. [Google Scholar] [CrossRef]
  82. Saini, N.; Kriti; Thakur, A.; Saini, S.; Kaur, N.; Singh, N. Synergizing Machine Learning and fluorescent biomolecules: A new era in sensing platforms. TrAC Trends Anal. Chem. 2025, 187, 118196. [Google Scholar] [CrossRef]
  83. Wang, Y.; Feng, Y.; Zhang, B.; Upadhyay, A.; Xiao, Z.; Luo, Y. Machine learning-supported sensor array for multiplexed foodborne pathogenic bacteria detection and identification. Trends Food Sci. Technol. 2024, 154, 104787. [Google Scholar] [CrossRef]
  84. Liang, Y.; Lin, H.; Kang, W.; Shao, X.; Cai, J.; Li, H.; Chen, Q. Application of colorimetric sensor array coupled with machine-learning approaches for the discrimination of grains based on freshness. J. Sci. Food Agric. 2023, 103, 6790–6799. [Google Scholar] [CrossRef]
  85. Shi, J.; Luan, F.; Zhang, H.; Liu, M.; Guo, Q.; Hu, Z.; Fan, B. QSPR Study of Fluorescence Wavelengths (λex/λem) Based on the Heuristic Method and Radial Basis Function Neural Networks. QSAR Comb. Sci. 2006, 25, 147–155. [Google Scholar] [CrossRef]
  86. Boczar, D.; Michalska, K. A Review of Machine Learning and QSAR/QSPR Predictions for Complexes of Organic Molecules with Cyclodextrins. Molecules 2024, 29, 3159. [Google Scholar] [CrossRef] [PubMed]
  87. Maji, D.; Ghosh, A.; Barman, D.; Sarkar, P. Accelerating Molecular Dynamics with a Graph Neural Network: A Scalable Approach through E(q)C-GNN. J. Phys. Chem. Lett. 2025, 16, 2254–2264. [Google Scholar] [CrossRef]
  88. Tang, R.; Yang, J.; Shao, C.; Shen, N.; Chen, B.; Gu, Y.; Li, C.; Xu, D.; Guo, C. Two-dimensional nanomaterials-based optical biosensors empowered by machine learning for intelligent diagnosis. TrAC Trends Anal. Chem. 2025, 185, 118162. [Google Scholar] [CrossRef]
  89. Tang, B.; Kramer, S.T.; Fang, M.; Qiu, Y.; Wu, Z.; Xu, D. A self-attention based message passing neural network for predicting molecular lipophilicity and aqueous solubility. J. Cheminformatics 2020, 12, 15. [Google Scholar] [CrossRef]
  90. Singh, K.; Münchmeyer, J.; Weber, L.; Leser, U.; Bande, A. Graph Neural Networks for Learning Molecular Excitation Spectra. J. Chem. Theory Comput. 2022, 18, 4408–4417. [Google Scholar] [CrossRef]
  91. Jo, J.; Kwak, B.; Choi, H.-S.; Yoon, S. The message passing neural networks for chemical property prediction on SMILES. Methods 2020, 179, 65–72. [Google Scholar] [CrossRef]
  92. Schütt, K.T.; Sauceda, H.E.; Kindermans, P.-J.; Tkatchenko, A.; Müller, K.-R. A deep learning architecture for molecules and materials. J. Chem. Phys. 2018, 148, 241722. [Google Scholar] [CrossRef]
93. Zhu, F.W.; Futrega, M.; Bao, H.; Eryilmaz, S.B.; Kong, F.; Jouanneaux, M.; Stadler, M.; Marcinkiewicz, M.; Duan, K.F.; Zheng, X.N.A.; et al. FastDimeNet++: Training DimeNet++ in 22 minutes. In Proceedings of the 52nd International Conference on Parallel Processing (ICPP), Salt Lake City, UT, USA, 7–10 August 2023; pp. 274–284. [Google Scholar]
  94. Nandi, S.; Vegge, T.; Bhowmik, A. MultiXC-QM9: Large dataset of molecular and reaction energies from multi-level quantum chemical methods. Sci. Data 2023, 10, 783. [Google Scholar] [CrossRef]
  95. Bobrowski, T.M.; Korn, D.R.; Muratov, E.N.; Tropsha, A. ZINC Express: A Virtual Assistant for Purchasing Compounds Annotated in the ZINC Database. J. Chem. Inf. Model. 2021, 61, 1033–1036. [Google Scholar] [CrossRef]
  96. Zhu, Y.; Fang, J.; Ahmed, S.A.H.; Zhang, T.; Zeng, S.; Liao, J.-Y.; Ma, Z.; Qian, L. A modular artificial intelligence framework to facilitate fluorophore design. Nat. Commun. 2025, 16, 3598. [Google Scholar] [CrossRef]
  97. Ibrahim, A.; Ataca, C. Prediction of Frequency-Dependent Optical Spectrum for Solid Materials: A Multi-Output & Multi-Fidelity Machine Learning Approach. ACS Appl. Mater. Interfaces 2024, 16, 41145–41156. [Google Scholar]
  98. Joung, J.F.; Han, M.; Hwang, J.; Jeong, M.; Choi, D.H.; Park, S. Deep Learning Optical Spectroscopy Based on Experimental Database: Potential Applications to Molecular Design. JACS Au 2021, 1, 427–438. [Google Scholar] [CrossRef] [PubMed]
  99. Ju, C.-W.; Bai, H.; Li, B.; Liu, R. Machine Learning Enables Highly Accurate Predictions of Photophysical Properties of Organic Fluorescent Materials: Emission Wavelengths and Quantum Yields. J. Chem. Inf. Model. 2021, 61, 1053–1065. [Google Scholar] [CrossRef] [PubMed]
  100. Voznyy, O.; Levina, L.; Fan, J.Z.; Askerka, M.; Jain, A.; Choi, M.-J.; Ouellette, O.; Todorović, P.; Sagar, L.K.; Sargent, E.H. Machine Learning Accelerates Discovery of Optimal Colloidal Quantum Dot Synthesis. ACS Nano 2019, 13, 11122–11128. [Google Scholar] [CrossRef] [PubMed]
  101. Li, C.-N.; Liang, H.-P.; Zhang, X.; Lin, Z.; Wei, S.-H. Graph deep learning accelerated efficient crystal structure search and feature extraction. npj Comput. Mater. 2023, 9, 176. [Google Scholar] [CrossRef]
  102. Bühlmann, S.; Reymond, J.-L. ChEMBL-Likeness Score and Database GDBChEMBL. Front. Chem. 2020, 8, 46. [Google Scholar] [CrossRef]
  103. Zheng, L.; Fan, J.; Mu, Y. OnionNet: A Multiple-Layer Intermolecular-Contact-Based Convolutional Neural Network for Protein–Ligand Binding Affinity Prediction. ACS Omega 2019, 4, 15956–15965. [Google Scholar] [CrossRef]
  104. Zheng, L.; Meng, J.; Jiang, K.; Lan, H.; Wang, Z.; Lin, M.; Li, W.; Guo, H.; Wei, Y.; Mu, Y. Improving protein-ligand docking and screening accuracies by incorporating a scoring function correction term. Briefings Bioinform. 2022, 23, bbac051. [Google Scholar] [CrossRef]
  105. Dong, J.; Qian, J.; Yu, K.; Huang, S.; Cheng, X.; Chen, F.; Jiang, H.; Zeng, W. Rational Design of Organelle-Targeted Fluorescent Probes: Insights from Artificial Intelligence. Research 2023, 6, 0075. [Google Scholar] [CrossRef]
  106. Bang, K.; Kim, J.; Hong, D.; Kim, D.; Han, S.S. Inverse design for materials discovery from the multidimensional electronic density of states. J. Mater. Chem. A 2024, 12, 6004–6013. [Google Scholar] [CrossRef]
  107. Cheng, M.; Fu, C.-L.; Okabe, R.; Chotrattanapituk, A.; Boonkird, A.; Hung, N.T.; Li, M. AI-driven materials design: A mini-review. arXiv 2025, arXiv:2502.02905. [Google Scholar]
  108. Ma, T.; Chen, J.; Xiao, C. Constrained generation of semantically valid graphs via regularizing variational autoencoders. Adv. Neural Inf. Process. Syst. 2018, 31, 23–28. [Google Scholar]
  109. Bresson, X.; Laurent, T. A two-step graph convolutional decoder for molecule generation. arXiv 2019, arXiv:1906.03412. [Google Scholar]
  110. Wang, Y.; Li, Z.; Barati Farimani, A. Graph Neural Networks for Molecules. In Machine Learning in Molecular Sciences; Qu, C., Liu, H., Eds.; Springer International Publishing: Cham, Switzerland, 2023; pp. 21–66. [Google Scholar]
  111. Gómez-Bombarelli, R.; Wei, J.N.; Duvenaud, D.; Hernández-Lobato, J.M.; Sánchez-Lengeling, B.; Sheberla, D.; Aguilera-Iparraguirre, J.; Hirzel, T.D.; Adams, R.P.; Aspuru-Guzik, A. Automatic Chemical Design Using a Data-Driven Continuous Representation of Molecules. ACS Central Sci. 2018, 4, 268–276. [Google Scholar] [CrossRef]
  112. Tan, Z.; Li, Y.; Wu, X.; Zhang, Z.; Shi, W.; Yang, S.; Zhang, W. De novo creation of fluorescent molecules via adversarial generative modeling. RSC Adv. 2023, 13, 1031–1040. [Google Scholar] [CrossRef]
113. Prykhodko, O.; Johansson, S.V.; Kotsias, P.-C.; Arús-Pous, J.; Bjerrum, E.J.; Engkvist, O.; Chen, H. A de novo molecular generation method using latent vector based generative adversarial network. J. Cheminformatics 2019, 11, 74. [Google Scholar] [CrossRef]
  114. Ghosh, K.; Stuke, A.; Todorović, M.; Jørgensen, P.B.; Schmidt, M.N.; Vehtari, A.; Rinke, P. Deep Learning Spectroscopy: Neural Networks for Molecular Excitation Spectra. Adv. Sci. 2019, 6, 1801367. [Google Scholar] [CrossRef]
  115. Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial networks. Commun. ACM 2020, 63, 139–144. [Google Scholar] [CrossRef]
  116. De Cao, N.; Kipf, T. MolGAN: An implicit generative model for small molecular graphs. arXiv 2018, arXiv:1805.11973. [Google Scholar]
  117. Popova, M.; Isayev, O.; Tropsha, A. Deep reinforcement learning for de novo drug design. Sci. Adv. 2018, 4, eaap7885. [Google Scholar] [CrossRef]
  118. Li, C.; Tang, H.; Zhu, Y.; Yamanishi, Y. A Reinforcement Learning-Driven Transformer GAN for Molecular Generation. arXiv 2025, arXiv:2503.12796. [Google Scholar]
  119. Goel, M.; Raghunathan, S.; Laghuvarapu, S.; Priyakumar, U.D. MoleGuLAR: Molecule Generation Using Reinforcement Learning with Alternating Rewards. J. Chem. Inf. Model. 2021, 61, 5815–5826. [Google Scholar] [CrossRef] [PubMed]
  120. Zang, C.; Wang, F. Moflow: An invertible flow model for generating molecular graphs. In Proceedings of the KDD ‘20: The 26th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Virtual Event, CA, USA, 6–10 July 2020; pp. 617–626. [Google Scholar]
  121. Luo, Y.; Fang, J.; Li, S.; Liu, Z.; Wu, J.; Zhang, A.; Du, W.; Wang, X. Text-guided small molecule generation via diffusion model. iScience 2024, 27, 110992. [Google Scholar] [CrossRef] [PubMed]
  122. Zeng, X.; Wang, F.; Luo, Y.; Kang, S.-G.; Tang, J.; Lightstone, F.C.; Fang, E.F.; Cornell, W.; Nussinov, R.; Cheng, F. Deep generative molecular design reshapes drug discovery. Cell Rep. Med. 2022, 3, 100794. [Google Scholar] [CrossRef] [PubMed]
  123. Aernouts, B.; Van Beers, R.; Watté, R.; Huybrechts, T.; Lammertyn, J.; Saeys, W. Visible and near-infrared bulk optical properties of raw milk. J. Dairy Sci. 2015, 98, 6727–6738. [Google Scholar] [CrossRef]
  124. Russell, J.D.; Scalf, M.; Book, A.J.; Ladror, D.T.; Vierstra, R.D.; Smith, L.M.; Coon, J.J.; Hess, S. Characterization and quantification of intact 26S proteasome proteins by real-time measurement of intrinsic fluorescence prior to top-down mass spectrometry. PLoS ONE 2013, 8, e58157. [Google Scholar] [CrossRef]
  125. Venturini, F.; Sperti, M.; Michelucci, U.; Gucciardi, A.; Martos, V.M.; Deriu, M.A. Extraction of physicochemical properties from the fluorescence spectrum with 1D convolutional neural networks: Application to olive oil. J. Food Eng. 2023, 336, 111198. [Google Scholar] [CrossRef]
  126. Ouyang, Q.; Fan, Z.; Chang, H.; Shoaib, M.; Chen, Q. Analyzing TVB-N in snakehead by Bayesian-optimized 1D-CNN using molecular vibrational spectroscopic techniques: Near-infrared and Raman spectroscopy. Food Chem. 2025, 464, 141701. [Google Scholar] [CrossRef]
  127. Wang, B.; Deng, J.; Jiang, H. Markov Transition Field Combined with Convolutional Neural Network Improved the Predictive Performance of Near-Infrared Spectroscopy Models for Determination of Aflatoxin B1 in Maize. Foods 2022, 11, 2210. [Google Scholar] [CrossRef]
  128. Venturini, F.; Michelucci, U.; Sperti, M.; Gucciardi, A.; Deriu, M.A.; Berghmans, F.; Zergioti, I. One-dimensional convolutional neural networks design for fluorescence spectroscopy with prior knowledge: Explainability techniques applied to olive oil fluorescence spectra. In Proceedings of the Optical Sensing and Detection VII, Strasbourg, France, 3 April–23 May 2022; Volume 12139, pp. 323–333. [Google Scholar] [CrossRef]
  129. Wu, X.; Zhao, Z.; Tian, R.; Gao, S.; Niu, Y.; Liu, H. Exploration of total synchronous fluorescence spectroscopy combined with pre-trained convolutional neural network in the identification and quantification of vegetable oil. Food Chem. 2021, 335, 127640. [Google Scholar] [CrossRef]
  130. Lin, Y.; Ma, J.; Sun, D.-W.; Cheng, J.-H.; Zhou, C. Fast real-time monitoring of meat freshness based on fluorescent sensing array and deep learning: From development to deployment. Food Chem. 2024, 448, 139078. [Google Scholar] [CrossRef] [PubMed]
  131. Shen, F.; Feng, X.; Li, Y.; Lin, X.; Cai, F. Compact three-dimensional fluorescence spectroscopy and its application in food safety. LWT 2024, 202, 116324. [Google Scholar] [CrossRef]
132. Wang, Y.; Gu, H.-W.; Yin, X.-L.; Geng, T.; Long, W.; Fu, H.; She, Y. Deep learning in food safety and authenticity detection: An integrative review and future prospects. Trends Food Sci. Technol. 2024, 146, 104396. [Google Scholar] [CrossRef]
  133. Nazir, A.; Hussain, A.; Assad, A. CNN in Food Industry: Current Practices and Future Trends. In Artificial Intelligence in the Food Industry; CRC Press: Boca Raton, FL, USA, 2025; pp. 329–354. [Google Scholar] [CrossRef]
  134. Seltmann, A.; Carravilla, P.; Reglinski, K.; Eggeling, C.; Waithe, D. Neural network informed photon filtering reduces fluorescence correlation spectroscopy artifacts. Biophys. J. 2024, 123, 745–755. [Google Scholar] [CrossRef]
  135. Deng, J.; Chen, Z.; Jiang, H.; Chen, Q. High-precision detection of dibutyl hydroxytoluene in edible oil via convolutional autoencoder compressed Fourier-transform near-infrared spectroscopy. Food Control 2025, 167, 110808. [Google Scholar] [CrossRef]
  136. Ren, P.; Zhou, R.-G.; Li, Y. A Self-supervised Learning Method for Raman Spectroscopy based on Masked Autoencoders. Expert Syst. Appl. 2025, 292, 128576. [Google Scholar] [CrossRef]
  137. Pode, Z.; Peri-Naor, R.; Georgeson, J.M.; Ilani, T.; Kiss, V.; Unger, T.; Markus, B.; Barr, H.M.; Motiei, L.; Margulies, D. Protein recognition by a pattern-generating fluorescent molecular probe. Nat. Nanotechnol. 2017, 12, 1161–1168. [Google Scholar] [CrossRef]
  138. Han, X.; Che, L.; Zhao, Y.; Chen, Y.; Zhou, S.; Wang, J.; Yin, M.; Wang, S.; Deng, Q. Fluorescence sensor array of a multiplexing probe with three/four excitations/emissions for rapid and highly sensitive discrimination of foodborne pathogenic bacteria. Sens. Actuators B Chem. 2023, 388, 133847. [Google Scholar] [CrossRef]
  139. Cheng, H.; Liu, T.; Tian, J.; An, R.; Shen, Y.; Liu, M.; Yao, Z. A General Strategy for Food Traceability and Authentication Based on Assembly-Tunable Fluorescence Sensor Arrays. Adv. Sci. 2024, 11, e2309259. [Google Scholar] [CrossRef]
  140. Xu, X.; Wang, X.; Ding, Y.; Zhou, X.; Ding, Y. Integration of lanthanide MOFs/methylcellulose-based fluorescent sensor arrays and deep learning for fish freshness monitoring. Int. J. Biol. Macromol. 2024, 265, 131011. [Google Scholar] [CrossRef] [PubMed]
  141. Noreldeen, H.A.A.; Huang, K.-Y.; Wu, G.-W.; Peng, H.-P.; Deng, H.-H.; Chen, W. Deep Learning-Based Sensor Array: 3D Fluorescence Spectra of Gold Nanoclusters for Qualitative and Quantitative Analysis of Vitamin B6 Derivatives. Anal. Chem. 2022, 94, 9287–9296. [Google Scholar] [CrossRef] [PubMed]
  142. Mandal, S.; Paul, D.; Saha, S.; Das, P. Deep learning assisted detection of toxic heavy metal ions based on visual fluorescence responses from a carbon nanoparticle array. Environ. Sci. Nano 2022, 9, 2596–2606. [Google Scholar] [CrossRef]
  143. Wang, X.; Lin, W.; Chen, C.; Kong, L.; Huang, Z.; Kirsanov, D.; Legin, A.; Wan, H.; Wang, P. Neural networks based fluorescence and electrochemistry dual-modal sensor for sensitive and precise detection of cadmium and lead simultaneously. Sens. Actuators B Chem. 2022, 366, 131922. [Google Scholar] [CrossRef]
  144. Tian, C.; Lee, Y.; Song, Y.; Elmasry, M.R.; Yoon, M.; Kim, D.-H.; Cho, S.-Y. Machine-Learning-Enhanced Fluorescent Nanosensor Based on Carbon Quantum Dots for Heavy Metal Detection. ACS Appl. Nano Mater. 2024, 7, 5576–5586. [Google Scholar] [CrossRef]
  145. Sarmanova, O.; Laptinskiy, K.; Burikov, S.; Chugreeva, G.; Dolenko, T. Implementing neural network approach to create carbon-based optical nanosensor of heavy metal ions in liquid media. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2023, 286, 122003. [Google Scholar] [CrossRef]
  146. Li, M.; Pan, Q.; Wang, J.; Wang, Z.; Peng, C. Machine learning-assisted fluorescence sensor array for qualitative and quantitative analysis of pyrethroid pesticides. Food Chem. 2024, 433, 137368. [Google Scholar] [CrossRef]
  147. Yun, P.; Jinorose, M.; Devahastin, S. Rapid smartphone-based assays for pesticides inspection in foods: Current status, limitations, and future directions. Crit. Rev. Food Sci. Nutr. 2024, 64, 6251–6271. [Google Scholar] [CrossRef]
  148. Tan, X.; Liang, Y.; Ye, Y.; Liu, Z.; Meng, J.; Li, F. Explainable Deep Learning-Assisted Fluorescence Discrimination for Aminoglycoside Antibiotic Identification. Anal. Chem. 2022, 94, 829–836. [Google Scholar] [CrossRef]
  149. Zhang, Y.; Wang, M.; Shao, C.; Liu, T.; Sun, M.; Wu, C.; Su, G.; Wang, Y.; Ye, J.; Hu, H.; et al. Nanozyme-induced deep learning-assisted smartphone integrated colorimetric and fluorometric dual-mode for detection of tetracycline analogs. Anal. Chim. Acta 2024, 1297, 342373. [Google Scholar] [CrossRef]
  150. Wu, L.; Li, Y.; Xiang, X.; Qin, H.; Zhao, J.; Zhai, X.; Li, P.; Li, Z. Machine learning-enabled flexible luminescent sensor for non-destructive mapping antibiotics distribution on seafood. Chem. Eng. J. 2025, 510, 161376. [Google Scholar] [CrossRef]
  151. Li, Z.; Jin, K.; Chen, H.; Zhang, L.; Zhang, G.; Jiang, Y.; Zou, H.; Wang, W.; Qi, G.; Qu, X. A machine learning approach-based array sensor for rapidly predicting the mechanisms of action of antibacterial compounds. Nanoscale 2022, 14, 3087–3096. [Google Scholar] [CrossRef]
  152. Li, M.; Xu, J.; Peng, C.; Wang, Z. Deep learning-assisted flavonoid-based fluorescent sensor array for the nondestructive detection of meat freshness. Food Chem. 2024, 447, 138931. [Google Scholar] [CrossRef] [PubMed]
  153. Wang, D.; Zhang, M.; Zhu, Q.; Adhikari, B. Intelligent vegetable freshness monitoring system developed by integrating eco-friendly fluorescent sensor arrays with deep convolutional neural networks. Chem. Eng. J. 2024, 488, 150739. [Google Scholar] [CrossRef]
  154. Tan, X.; Ye, Y.; Liu, H.; Meng, J.; Yang, L.; Li, F. Deep Learning-Assisted Visualized Fluorometric Sensor Array for Biogenic Amines Detection. Chin. J. Chem. 2022, 40, 609–616. [Google Scholar] [CrossRef]
  155. Fan, L.; Chen, Y.; Zeng, Y.; Yu, Z.; Dong, Y.; Li, D.; Zhang, C.; Ye, C. Application of visual intelligent labels in the assessment of meat freshness. Food Chem. 2024, 460, 140562. [Google Scholar] [CrossRef]
  156. Zhang, Z.; Yan, B. Convolution Neural Network-Assisted Smart Fluorescent-Tongue Based on Lanthanide Ion-Induced Forming MOF/HOF Composite for Differentiation of Flavor Compounds and Wine Identification. ACS Sens. 2023, 8, 3585–3594. [Google Scholar] [CrossRef]
  157. Wu, X.; Zhao, Z.; Tian, R.; Shang, Z.; Liu, H. Identification and quantification of counterfeit sesame oil by 3D fluorescence spectroscopy and convolutional neural network. Food Chem. 2020, 311, 125882. [Google Scholar] [CrossRef]
  158. Laliwala, A.; Svechkarev, D.; Sadykov, M.R.; Endres, J.; Bayles, K.W.; Mohs, A.M. Simpler Procedure and Improved Performance for Pathogenic Bacteria Analysis with a Paper-Based Ratiometric Fluorescent Sensor Array. Anal. Chem. 2022, 94, 2615–2624. [Google Scholar] [CrossRef]
  159. Aggarwal, M.; Sahoo, P.; Saha, S.; Das, P. Machine Learning-Mediated Ultrasensitive Detection of Citrinin and Associated Mycotoxins in Real Food Samples Discerned from a Photoluminescent Carbon Dot Barcode Array. J. Agric. Food Chem. 2023, 71, 12849–12858. [Google Scholar] [CrossRef]
  160. Chen, Z.; Li, Z.; He, H.; Liu, J.; Deng, J.; Jiang, L.; Liu, X. Ratiometric fluorescence sensor based on deep learning for rapid and user-friendly detection of tetracycline antibiotics. Food Chem. 2024, 450, 138961. [Google Scholar] [CrossRef]
  161. Lu, Z.; Li, J.; Ruan, K.; Sun, M.; Zhang, S.; Liu, T.; Yin, J.; Wang, X.; Chen, H.; Wang, Y.; et al. Deep learning-assisted smartphone-based ratio fluorescence for “on–off-on” sensing of Hg2+ and thiram. Chem. Eng. J. 2022, 435, 134979. [Google Scholar] [CrossRef]
  162. Wu, C.; Chang, H.; Chen, X.; Yang, S.; Dai, Y.; Tan, P.; Chen, Y.; Shen, C.; Lu, Z.; Sun, M.; et al. Deep Learning-Assisted Rapid Assessment of Food Freshness Using an Anti-interfering Triple-Emission Ratiometric Fluorescent Sensor. ACS Sustain. Chem. Eng. 2024, 12, 2465–2475. [Google Scholar] [CrossRef]
  163. Xu, Z.; Wang, K.; Zhang, M.; Wang, T.; Du, X.; Gao, Z.; Hu, S.; Ren, X.; Feng, H. Machine learning assisted dual-emission fluorescence/colorimetric sensor array detection of multiple antibiotics under stepwise prediction strategy. Sens. Actuators B Chem. 2022, 359, 131590. [Google Scholar] [CrossRef]
  164. Upadhyay, L.S.B.; Verma, N. Enzyme Inhibition Based Biosensors: A Review. Anal. Lett. 2013, 46, 225–241. [Google Scholar] [CrossRef]
  165. Wang, C.; Gu, C.; Zhao, X.; Yu, S.; Zhang, X.; Xu, F.; Ding, L.; Huang, X.; Qian, J. Self-designed portable dual-mode fluorescence device with custom python-based analysis software for rapid detection via dual-color FRET aptasensor with IoT capabilities. Food Chem. 2024, 457, 140190. [Google Scholar] [CrossRef] [PubMed]
  166. Perez-Gonzalez, C.; Lafontaine, D.A.; Penedo, J.C. Fluorescence-Based Strategies to Investigate the Structure and Dynamics of Aptamer-Ligand Complexes. Front. Chem. 2016, 4, 33. [Google Scholar] [CrossRef] [PubMed]
  167. Liu, Z.; Wang, X.; Ren, X.; Li, W.; Sun, J.; Wang, X.; Huang, Y.; Guo, Y.; Zeng, H. Novel fluorescence immunoassay for the detection of zearalenone using HRP-mediated fluorescence quenching of gold-silver bimetallic nanoclusters. Food Chem. 2021, 355, 129633. [Google Scholar] [CrossRef]
  168. Ouyang, Q.; Wang, L.; Ahmad, W.; Yang, Y.; Chen, Q. Upconversion Nanoprobes Based on a Horseradish Peroxidase-Regulated Dual-Mode Strategy for the Ultrasensitive Detection of Staphylococcus aureus in Meat. J. Agric. Food Chem. 2021, 69, 9947–9956. [Google Scholar] [CrossRef]
  169. McCann, B.; Tipper, B.; Shahbeigi, S.; Soleimani, M.; Jabbari, M.; Esfahani, M.N. A Review on Perception of Binding Kinetics in Affinity Biosensors: Challenges and Opportunities. ACS Omega 2025, 10, 4197–4216. [Google Scholar] [CrossRef]
  170. Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
  171. Nunekpeku, X.; Zhang, W.; Gao, J.; Adade, S.Y.-S.S.; Li, H.; Chen, Q. Gel strength prediction in ultrasonicated chicken mince: Fusing near-infrared and Raman spectroscopy coupled with deep learning LSTM algorithm. Food Control 2025, 168, 110916. [Google Scholar] [CrossRef]
  172. Wang, Y.; Li, T.; Chen, T.; Zhang, X.; Taha, M.F.; Yang, N.; Mao, H.; Shi, Q. Cucumber Downy Mildew Disease Prediction Using a CNN-LSTM Approach. Agriculture 2024, 14, 1155. [Google Scholar] [CrossRef]
  173. Gao, Y.; Glowacka, D. Deep gate recurrent neural network. In Proceedings of the Asian Conference on Machine Learning, PMLR, Hamilton, New Zealand, 16–18 November 2016; pp. 350–365. [Google Scholar]
  174. Gouzou, D.; Taimori, A.; Haloubi, T.; Finlayson, N.; Wang, Q.; Hopgood, J.R.; Vallejo, M. Applications of machine learning in time-domain fluorescence lifetime imaging: A Review. Methods Appl. Fluoresc. 2024, 12, 022001. [Google Scholar] [CrossRef]
  175. Fan, Y.; Dong, R.; Luo, Y.; Tan, Y.; Hong, H.; Ji, Z.; Shi, C. Deep learning models with optimized fluorescence spectroscopy to advance freshness of rainbow trout predicting under nonisothermal storage conditions. Food Chem. 2024, 454, 139774. [Google Scholar] [CrossRef] [PubMed]
  176. Xi, Z.; Nicolas, R.; Wei, J. An Edge-Deployable Multi-Modal Nano-Sensor Array Coupled with Deep Learning for Real-Time, Multi-Pollutant Water-Quality Monitoring. Water 2025, 17, 2065. [Google Scholar] [CrossRef]
  177. Saltepe, B.; Bozkurt, E.U.; Güngen, M.A.; Çiçek, A.E.; Şeker, U.Ö.Ş. Genetic circuits combined with machine learning provides fast responding living sensors. Biosens. Bioelectron. 2021, 178, 113028. [Google Scholar] [CrossRef] [PubMed]
  178. Xie, J.; Chen, W.; Chen, S.; Wu, P.; Lv, Z.; Wu, J.; Chen, Z.; Li, Z.; Luo, F.; Liu, X. Research on Malodor Component Identification Based on Sensor Array. Sensors 2025, 25, 3857. [Google Scholar] [CrossRef]
  179. Li, H.; Xu, H.; Li, Y.; Li, X. Application of artificial intelligence (AI)-enhanced biochemical sensing in molecular diagnosis and imaging analysis: Advancing and challenges. TrAC Trends Anal. Chem. 2024, 174, 117700. [Google Scholar] [CrossRef]
180. Zeng, L.; Fu, Y.; Guo, J.; Guo, J. Combination of fluorescence sensor and artificial intelligence—A new method of quantitative ketamine detection. Meas. Sci. Technol. 2023, 34, 125701. [Google Scholar] [CrossRef]
  181. Dehimi, N.E.H.; Tolba, Z. Attention mechanisms in deep learning: Towards explainable artificial intelligence. In Proceedings of the 2024 6th International Conference on Pattern Analysis and Intelligent Systems (PAIS), El Oued, Algeria, 24–25 April 2024; pp. 1–7. [Google Scholar]
  182. Wang, W.; Chen, K.; Ma, X.; Guo, J. Artificial intelligence reinforced upconversion nanoparticle-based lateral flow assay via transfer learning. Fundam. Res. 2023, 3, 544–556. [Google Scholar] [CrossRef] [PubMed]
  183. Yang, F.; Sun, J.; Cheng, J.; Fu, L.; Wang, S.; Xu, M. Detection of starch in minced chicken meat based on hyperspectral imaging technique and transfer learning. J. Food Process. Eng. 2023, 46, e14304. [Google Scholar] [CrossRef]
  184. Karniadakis, G.E.; Kevrekidis, I.G.; Lu, L.; Perdikaris, P.; Wang, S.; Yang, L. Physics-informed machine learning. Nat. Rev. Phys. 2021, 3, 422–440. [Google Scholar] [CrossRef]
  185. Willard, J.; Jia, X.; Xu, S.; Steinbach, M.; Kumar, V. Integrating Scientific Knowledge with Machine Learning for Engineering and Environmental Systems. ACM Comput. Surv. 2022, 55, 66. [Google Scholar] [CrossRef]
  186. Zhang, Y.; Zhu, J.; Xie, H.; He, Y. Physics-informed deep learning for stochastic particle dynamics estimation. Proc. Natl. Acad. Sci. USA 2025, 122, e2418643122. [Google Scholar] [CrossRef]
  187. Han, K.; Wang, Y.; Chen, H.; Chen, X.; Guo, J.; Liu, Z.; Tang, Y.; Xiao, A.; Xu, C.; Xu, Y.; et al. A survey on vision transformer. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 45, 87–110. [Google Scholar] [CrossRef]
  188. Ji, W.; Zhai, K.; Xu, B.; Wu, J. Green Apple Detection Method Based on Multidimensional Feature Extraction Network Model and Transformer Module. J. Food Prot. 2025, 88, 100397. [Google Scholar] [CrossRef]
  189. Gou, J.; Yu, B.; Maybank, S.J.; Tao, D. Knowledge distillation: A survey. Int. J. Comput. Vis. 2021, 129, 1789–1819. [Google Scholar] [CrossRef]
  190. Shi, W.; Cao, J.; Zhang, Q.; Li, Y.; Xu, L. Edge computing: Vision and challenges. IEEE Internet Things J. 2016, 3, 637–646. [Google Scholar] [CrossRef]
Figure 1. Fluorescent sensing materials (yellow), mechanisms (pink), and detection of typical food hazards (light red).
Figure 2. Deep learning enables molecular design for precision food safety detection through “forward prediction” and “reverse generation”.
Figure 3. (A) A graph neural network (GNN) model incorporating ligand–solvent interactions, used to predict the optical and photophysical properties of ligands. The figure was reproduced from [98], with permission from the American Chemical Society, 2021; (B) Experimentally measured and DL-predicted absorption (black) and emission (red) spectra. (a) Coumarin 153 in ethanol. (b) BPCP-2CPC (molecule) in C-2PC (host). The bandwidths (FWHM) of the calculated spectra are set to 5370 cm−1. (c) Photograph of (E,E,E)-2-(4-diphenylaminostyryl)-4,6-bis(4-methoxystyryl)pyrimidine in several solvents; the colors below the photograph are those predicted using the GNN model from (A). (d) Photographs of solid-state emission; the colors below the photograph are those predicted using the DL model from (A). The figure was reproduced from [98], with permission from the American Chemical Society, 2021; (C) A different (non-GNN) machine learning model used to predict the emission wavelengths and quantum yields of organic fluorescent materials. The figure was reproduced from [99], with permission from the American Chemical Society, 2021; (D) Machine learning (non-GNN) used to optimize the synthesis parameters of colloidal quantum dots. The figure was reproduced from [100], with permission from the American Chemical Society, 2019.
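The calculated spectra in panel (B) are produced by broadening discrete predicted transitions with a fixed FWHM. As a minimal illustration of that post-processing step, the Python sketch below broadens two hypothetical transition energies with the 5370 cm−1 bandwidth quoted in the caption; the transition positions and intensities are placeholders, not values taken from [98].

```python
import numpy as np

# Hypothetical predicted vertical transitions: (wavenumber in cm^-1, relative intensity)
transitions = [(26300.0, 1.00), (31500.0, 0.35)]   # placeholders, not values from [98]
fwhm = 5370.0                                      # bandwidth quoted in the Figure 3 caption
sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))  # convert FWHM to Gaussian sigma

axis = np.linspace(15000.0, 40000.0, 2000)         # spectral axis in cm^-1
spectrum = np.zeros_like(axis)
for center, weight in transitions:
    spectrum += weight * np.exp(-0.5 * ((axis - center) / sigma) ** 2)

spectrum /= spectrum.max()  # normalize to unit maximum for comparison between molecules
print(f"Band maximum at {axis[spectrum.argmax()]:.0f} cm^-1")
```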
Figure 4. (A) De novo creation of fluorescent molecules via adversarial generative modeling. The figure was reproduced from [112], with permission from the Royal Society of Chemistry, 2023; (B) LatentGAN, which combines an autoencoder and a generative adversarial neural network for de novo molecular design. The figure was reproduced from [113], with permission from BMC, 2019; (C) Predicting molecular excitation spectra based on machine learning and deep learning models. (a) Atomic structure of the N-methyl-N-(2,2,2-trifluoroethyl)formamide molecule and (b) its corresponding Coulomb matrix representation. Canonical illustration of the three neural network types: (c) the multilayer perceptron (MLP); (d) the convolutional neural network (CNN); and (e) the deep tensor neural network (DTNN). Green circles to the left represent the molecular input and yellow circles to the right the output (here 16 excitation energies or the molecular excitation spectrum). The gray blocks are schematics for fully connected hidden layers, convolutional blocks, pooling layers, and state vectors. Nodes corresponding to atom types in the DTNN are represented as blue squares and the distance matrix between atoms as pink squares. Parameter tensors (red squares) project the vectors encoding atom types and the interatomic distance matrix into a vector with the same dimensions as the atom type encodings. The DTNN is evaluated iteratively, building up more complex interactions between atoms with each iteration. (f) Comparison of CNN and DTNN spectrum predictions: the first column depicts RSE histograms for 13,000 test molecules from the 132k dataset. The following three columns show the spectra of the best, an average, and one of the worst predictions compared to the corresponding reference spectrum. The colored circles mark the histogram positions of the selected molecules. The figure was reproduced from [114], with permission from Wiley, 2019.
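Panel (C)(b) uses the Coulomb matrix as the molecular input representation for the MLP and CNN models. Below is a minimal sketch of that descriptor for a toy geometry: the standard definition places 0.5·Z_i^2.4 on the diagonal and Z_iZ_j/|R_i − R_j| off the diagonal. The water-like coordinates used here are illustrative only and are not the molecule shown in the figure.

```python
import numpy as np

def coulomb_matrix(charges, coords):
    """Coulomb matrix: 0.5*Z_i^2.4 on the diagonal, Z_i*Z_j/|R_i - R_j| off-diagonal."""
    charges = np.asarray(charges, dtype=float)
    coords = np.asarray(coords, dtype=float)
    n = len(charges)
    cm = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                cm[i, j] = 0.5 * charges[i] ** 2.4
            else:
                cm[i, j] = charges[i] * charges[j] / np.linalg.norm(coords[i] - coords[j])
    return cm

# Toy water-like geometry (atomic numbers; coordinates in angstroms, illustrative only --
# the canonical formulation uses atomic units, which only changes the off-diagonal scale).
Z = [8, 1, 1]
R = [[0.000, 0.000, 0.117], [0.000, 0.757, -0.469], [0.000, -0.757, -0.469]]
print(coulomb_matrix(Z, R).round(2))
```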
Figure 5. (A) Deep learning-assisted flavonoid-based fluorescent sensor array for the nondestructive detection of meat freshness. The figure was reproduced from [152], with permission from Elsevier, 2024; (B) Fluorescent sensor array combined with a DCNN model for freshness prediction of three vegetables. (a) Flowchart of the DCNN for freshness prediction. (b) Training loss decreases and training accuracy increases for the four DCNN models as the number of training epochs grows. The figure was reproduced from [153], with permission from Elsevier, 2024; (C) Nanozyme-induced, deep learning-assisted, smartphone-integrated colorimetric and fluorometric dual-mode detection of tetracycline analogs. The figure was reproduced from [149], with permission from Elsevier, 2024; (D) Generative Adversarial Network (GAN)-assisted detection of toxic heavy metal ions based on visual fluorescence responses from a carbon nanoparticle array. The figure was reproduced from [142], with permission from the Royal Society of Chemistry, 2022.
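The workflow in Figure 5B feeds photographs of the fluorescent sensor array to a deep convolutional neural network. The sketch below is a generic, untrained PyTorch image classifier of the same flavor, not the architecture reported in [153]; the image size, class count, and random tensors are placeholders.

```python
import torch
import torch.nn as nn

class TinySensorArrayCNN(nn.Module):
    """Minimal CNN mapping an RGB photo of a sensor array to freshness classes (illustrative only)."""
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Placeholder batch: 8 RGB images, 64x64 pixels, with random "fresh/sub-fresh/spoiled" labels
images = torch.rand(8, 3, 64, 64)
labels = torch.randint(0, 3, (8,))

model = TinySensorArrayCNN(num_classes=3)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):  # a few steps only; real training would iterate over a labeled dataset
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss = {loss.item():.3f}")
```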
Table 1. Comparison of typical model architectures’ advantages and challenges.
Model Architecture | Key Advantages | Key Challenges | Controllability/Goal Orientation | Seminal Paper
Variational Autoencoder (VAE) | Smooth latent space, facilitating gradient-based property optimization and interpolation. | Sometimes low validity of reconstructed molecules; tends to generate molecules similar to the training set. | Good, via joint training with property predictors or latent-space optimization (see the sketch after this table). | [111]
Generative Adversarial Network (GAN) | Generates high-quality, novel, and diverse molecules. | Unstable training; prone to mode collapse. | Moderate, typically combined with RL or a conditional GAN. | [115]
Reinforcement Learning (RL)-Guided Model | Directly optimizes complex, non-differentiable rewards (e.g., synthetic accessibility); strong goal orientation. | Difficult reward function design; hard to balance exploration and exploitation. | Very high, enabling multi-objective optimization via well-designed rewards. | [117]
Diffusion Model | Generates extremely high-quality samples with good diversity and stable training dynamics. | Slow sampling (iterative denoising); large model size; high computational cost per sample. | Good, achievable via guidance or conditional input. | [121]
Flow-based Model | Exact and efficient likelihood calculation, enabling precise probability density estimation; invertible transformations; stable training. | Strong architectural constraints (e.g., requiring bijective transformations); potentially high computational cost during training/inference. | Moderate, achievable via conditional flow models. | [122]
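As a concrete illustration of the latent-space optimization route noted for the VAE row above, the sketch below performs gradient ascent on a latent code against a differentiable property predictor. Both networks are untrained toy stand-ins for a trained VAE decoder and a property regressor, so the printed values only demonstrate the mechanics, not a real fluorophore design run.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
latent_dim, descriptor_dim = 16, 32

# Toy stand-ins: in practice these would be a trained VAE decoder and a property regressor
decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, descriptor_dim))
property_head = nn.Sequential(nn.Linear(descriptor_dim, 32), nn.ReLU(), nn.Linear(32, 1))
for p in list(decoder.parameters()) + list(property_head.parameters()):
    p.requires_grad_(False)  # frozen models; only the latent code is optimized

z = torch.zeros(1, latent_dim, requires_grad=True)  # start from the latent-space origin
optimizer = torch.optim.Adam([z], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    predicted_property = property_head(decoder(z))  # surrogate for, e.g., emission wavelength
    loss = -predicted_property.sum()                # gradient ascent on the predicted property
    loss.backward()
    optimizer.step()
    if step % 50 == 0:
        print(f"step {step}: predicted property = {predicted_property.item():.3f}")

# The optimized z would then be decoded back into a candidate molecule by the trained decoder.
```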
Table 2. Performance comparison of fluorescent sensor arrays combined with deep learning or machine learning techniques in food analysis.
Sensor Array Composition/Probe Material | Food Matrix | Target Analyte(s) | Machine Learning/Deep Learning Model | Key Performance Metrics | Reference
Copper nanoclusters (CuNCs) & fluorescent dyes | Pork | Meat freshness (ammonia, dimethylamine, trimethylamine) | SqueezeNet (CNN), Grad-CAM, UMAP | Limit of detection (LOD): 131.56 ppb; accuracy: 98.17%. | Lin et al. [130]
EuMOF-FITC | Fish products | Fish freshness | ResNeXt-101 | LOD: 3.94 ppm (NH3); accuracy: 98.97%. | Xu et al. [140]
Cys/NAC–AuNC & 3D fluorescence spectra | Foods | Vitamin B6 derivatives | DNN, CNN | Accuracy: 97.77–100%; R2 = 97.01%. | Noreldeen et al. [141]
Flavonoid-based fluorescent sensor array | Packaged meat | Meat freshness | DCNN | Accuracy: 97.1%. | Li et al. [152]
Rhodamine B-CD@Au, rhodamine 6G-CD@Au, & coumarin 6-CD@Au | Vegetables and fruits | Pyrethroid pesticides (PPs) | HCA, SVM, BPNN | LOD: ppm level; recovery: 94.7–105%. | Li et al. [146]
Carbon dots & europium-doped calcium fluoride | Milk, egg | Tetracycline antibiotics (TCs) | ResNet18 | LOD: 0.05 μM; accuracy: 99.0%. | Chen et al. [160]
Au NCs@Fe-MIL-88NH2 | Water samples | Hg2+ and thiram | YOLOv3 | LOD: 7 nM (Hg2+); precision: 97.1%. | Lu et al. [161]
Carbon dots (CDs) & Ru-MOFs | Shrimp and pork | Food freshness | YOLO | Recovery: 98.63–106.64%; RSD < 1.56%. | Wu et al. [162]
Carbon dots & CdTe quantum dots | Foods | Nine antibiotics | SX-model | Accuracy: 95%; average concentration error for unknown samples: 4.93%. | Xu et al. [163]
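Several LOD values in Table 2 follow the common calibration-based convention of three times the blank standard deviation divided by the calibration slope. The sketch below applies that convention to made-up calibration data; the concentrations, intensities, and blank replicates are placeholders, not data from the cited studies.

```python
import numpy as np

# Hypothetical calibration: fluorescence intensity ratio vs. analyte concentration (µM)
concentrations = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
intensities = np.array([0.021, 0.118, 0.215, 0.402, 0.795, 1.570])

slope, intercept = np.polyfit(concentrations, intensities, 1)  # linear calibration curve

# Standard deviation of repeated blank measurements (placeholder values)
blank_replicates = np.array([0.019, 0.023, 0.020, 0.022, 0.021])
sigma_blank = blank_replicates.std(ddof=1)

lod = 3 * sigma_blank / slope   # 3*sigma/slope convention for the detection limit
loq = 10 * sigma_blank / slope  # 10*sigma/slope convention for the quantification limit
print(f"slope = {slope:.3f} per µM, LOD ≈ {lod:.3f} µM, LOQ ≈ {loq:.3f} µM")
```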
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
