Article

Image Processing Technology Applied to Fluorescent Rapid Tests for Influenza A and B Viruses

Department of Engineering Science, National Cheng Kung University, 1 University Road, Tainan 70101, Taiwan
* Authors to whom correspondence should be addressed.
Appl. Sci. 2025, 15(21), 11523; https://doi.org/10.3390/app152111523
Submission received: 25 September 2025 / Revised: 25 October 2025 / Accepted: 27 October 2025 / Published: 28 October 2025

Abstract

This study establishes a detection method based on image recognition to interpret and quantitatively analyze fluorescent rapid test kits for influenza. The method operates in a dark chamber equipped with a UV-LED, where the fluorescence of the test kit is excited by the UV-LED and subsequently captured using a camera module. The captured images are processed by segmenting the regions of interest (ROI), converting them to grayscale images, and analyzing the grayscale value distributions to identify the control (C) and test (T) line regions. By comparing the values of the C and T lines, the concentration is determined to achieve quantitative analysis. In the linearity validation experiments, influenza A (H1N1) specimens were tested at concentrations of 2, 4, 6, 8, and 10 ng/mL, achieving a coefficient of determination (R²) of 0.9923. For influenza B (Yamagata) specimens, concentrations of 6, 8, 10, 12.5, and 25 ng/mL resulted in an R² of 0.9878. The established method enables the detection of both influenza A (H1N1) and influenza B (Yamagata), replacing visual qualitative interpretation with quantitative analysis. Currently, the detection method developed in this paper is designed for use exclusively in a dark chamber and is specifically applied to fluorescent rapid tests. It cannot be directly used with conventional colloidal gold-based rapid test reagents. In the future, the proposed detection approach could be integrated with neural networks to enable its application to non-fluorescent rapid test interpretation and to operate beyond the dark chamber environment, for example, by utilizing smartphone imaging for result interpretation under normal lighting conditions.

1. Introduction

Influenza is an acute respiratory disease caused by viruses classified into types A, B, C, and D [1,2,3,4]. Among them, influenza A and B viruses are the primary pathogens responsible for human influenza, with influenza A having the potential to cause global pandemics. Common symptoms of influenza include fever, headache, muscle pain, and fatigue, with severe cases potentially leading to fatal outcomes. High-risk groups include the elderly, individuals with cardiovascular or pulmonary diseases, and patients with renal dysfunction, anemia, or immunodeficiency [5,6]. Therefore, rapid and accurate influenza diagnosis is crucial for timely treatment.
Current detection methods for influenza viruses include nucleic acid-based assays, antigen tests, serological analysis, and high-throughput sequencing techniques [7]. Among these, antigen tests are widely used in clinical practice due to their simplicity, short turnaround time, and minimal requirements for specialized equipment and expertise [8]. The commonly used influenza rapid antigen test (RAT) employs the sandwich immunoassay (SIA) principle, where the viral antigen in the sample binds to a labeled antibody, which subsequently interacts with a secondary antibody immobilized on the test zone. The two antibodies bound to the antigen form a sandwich-like structure. This structure induces a color change upon detection of the influenza virus, allowing qualitative assessment within 10–30 min [9]. The fluorescent sandwich immunoassay (FSIA) utilizes fluorescently labeled microspheres conjugated with antibodies to capture the viral antigen, subsequently binding to a secondary antibody immobilized in the test zone. Fluorescence emission is detected upon UV excitation to determine the test result [10]. Compared to SIA, FSIA provides improved sensitivity and specificity, allowing for more reliable detection even at low viral loads [11,12]. The increased accuracy of FSIA helps minimize misinterpretation risks and enhances diagnostic reliability so that patients can receive proper treatment in time.
However, RATs may yield false-negative or false-positive results due to operator error, low viral load, test sensitivity and specificity variations, or inadequate sample collection [13]. Misinterpretation of results can cause delays in seeking medical treatment. Machine vision-assisted interpretation of RAT results offers improved reliability and enables quantitative concentration analysis, reducing errors caused by human vision and environmental factors. In 2021, Lin et al. applied image interpretation techniques in the field of biomedicine [14]. They constructed an image acquisition system using a USB camera to capture the colorimetric results of fecal occult blood colloidal gold rapid test strips at various reagent concentrations, aiming to determine the corresponding concentration levels. In the same year, Turbé et al. utilized a tablet to capture images and applied deep learning-based image analysis algorithms to interpret HIV rapid diagnostic tests [15]. In 2022, Schary et al. developed a rapid testing detection system using a smartphone in combination with a 3D-printed dark chamber environment [16]. To mitigate variations among different test kits that could affect concentration estimation, most studies have adopted the control line (C line) as a reference, using the ratio of the test line (T line) to the C line as a standardized metric to improve result accuracy [17,18].
Quick Response Codes (QR codes) are two-dimensional barcodes invented by Japan’s Denso Wave company in 1994. Compared with traditional one-dimensional barcodes, QR codes have faster reading speeds and larger data capacity. QR codes have been widely used in IoT research and can also be applied in biomedicine. For example, in 2021, Wu et al. applied visible-light LEDs and QR codes to develop indoor positioning technology [19]. Shukran et al. proposed the use of mobile device-based QR code labels to enhance laboratory chemical inventory management and introduced a QR tag inventory system implemented in the chemical laboratory of the National Defense University of Malaysia [20]. In this paper, the reagent casings employed were all printed with QR codes. In addition to identifying the test item, specimen type, and batch number, the endpoints of the QR code can also serve as reference coordinates for image processing.
Perspective transformation is a technique that projects an image onto a new visual plane, also known as projection mapping. This technology is widely used in three-dimensional image processing, including correcting image perspective distortion, performing image scaling, and achieving three-dimensional reconstruction. For example, in a 2023 study, Jin et al. addressed the problem of single-image camera calibration by applying perspective field mapping. By predicting the direction and angle at each pixel, their method significantly improved the accuracy of image cropping and calibration. Furthermore, it was applied to tasks such as image composition and perspective consistency verification [21]. The method proposed by Abu Raddaha et al. in 2024 employs automatic ROI identification and perspective transformation techniques to geometrically rectify road images captured by vehicle-mounted cameras, enhancing the ability to detect potholes [22]. In 2025, Zhang et al. proposed a multimodal fusion projection technique that enhances cross-modal perception by applying perspective transformation to project the 3D coordinates detected by LiDAR onto the image plane of a camera. This approach improves the system’s flexibility and real-time performance [23]. To enhance the stability of detection, this study applied a perspective transformation based on the reference coordinates to transpose the region of interest to the center of the image.
The method established in this paper utilizes machine vision technology to provide a convenient and precise approach for interpreting fluorescent rapid tests, enabling quantitative analysis of fluorescence signals through image processing. To enhance reliability, QR codes were applied both for reading test data and as reference coordinates for image processing. A perspective transformation was applied to position the rapid test at the center of the image, ensuring stability and allowing consistent detection of weak fluorescence reactions on the rapid test.

2. Materials and Methods

2.1. Detection Target

The target of detection in this paper is a fluorescent rapid test, which utilizes the high specificity of a fluorescent sandwich immunoassay to simultaneously detect influenza A (Flu A) and influenza B (Flu B). When irradiated with a 365 nm ultraviolet (UV) LED, the fluorescent microspheres are excited to emit a 520 nm fluorescence signal that is visible to the naked eye. The structure of the rapid test used in this paper is shown in Figure 1. The nitrocellulose (NC) membrane of the test strip contains two test lines (T1 and T2 lines) and one control line (C line). The T1 line is coated with antibodies specific to influenza A virus antigens, enabling the formation of a sandwich complex with labeled antibodies that have bound to Flu A antigens. Similarly, the T2 line is coated with antibodies specific to influenza B virus antigens, allowing the formation of a sandwich complex with labeled antibodies that have bound to Flu B antigens. The control line (C line) is coated with IgG antibodies that bind to the labeled antibodies, serving as an internal control to verify the validity of the test results.
The results of the rapid test are shown in Figure 2a. The C line will appear regardless of whether the sample contains the viral antigen. If the C line does not appear, it indicates that the rapid test kit or the sample is invalid. When only the C line appears, the result is considered negative. When both the T1 line and the C line appear, the result indicates a positive reaction for Flu A. When both the T2 line and the C line appear, the result indicates a positive reaction for Flu B. The actual rapid test result is shown in Figure 2b.

2.2. Experimental Environment

The experimental setup in this paper was conducted inside a dark chamber specifically designed for optical detection, as shown in Figure 3. The chamber integrates a USB camera and a 365 nm ultraviolet (UV) LED. Upon UV excitation, the rapid test emits fluorescence, which is then captured by the camera for analysis. To minimize misjudgment caused by variations in the imaging environment, both the camera and UV LED are fixed at designated positions within the chamber. The inner walls of the chamber are coated with light-absorbing material to prevent internal reflections from interfering with image interpretation.
At the bottom of the chamber, a replaceable rapid test slot is installed for the rapid test. During operation, the rapid test is inserted into the slot and aligned with the bottom edge, ensuring that it remains within the focal range of the camera. The replaceable design of the slot allows for future adaptation to different types of rapid tests by simply fabricating a new slot tailored to the dimensions of the desired test, thereby enhancing the system’s compatibility and scalability.

2.3. Established Method

The detection procedure established in this paper is illustrated in Figure 4. First, the rapid test was imaged using a USB camera, and the captured RGB color image was converted into a grayscale image for subsequent analysis. The detection procedure began with locating the QR code, followed by reading the recorded information regarding the test type and batch number. The endpoints of the QR code were then extracted as reference coordinates to perform perspective transformation, thereby rotating the image and aligning the rapid test at the center of the frame. Next, the display region of the rapid test was segmented and defined as the region of interest (ROI), which was subsequently subjected to result interpretation and quantitative analysis. During the analysis, the presence of the control line (C line) was first verified. If the C line was absent, the reagent was immediately classified as invalid, and the test line (T line) was not analyzed further. If the C line was present, its signal intensity was normalized, after which the existence of the T line was detected, and its corresponding intensity was also normalized.

2.4. Region of Interest (ROI) Cropping

In this paper, each rapid test used includes a fixed-position QR code that encodes relevant information such as the test item, specimen type, and batch number. In addition to serving as an identifier for the rapid test, the endpoints of the QR code also provide reference markers for image processing. After reading the QR code data, its corner coordinates are used to perform perspective transformation that centers the rapid test in the image through rotation and alignment.
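As an illustration of this step, the following Python/OpenCV sketch decodes the QR code with cv2.QRCodeDetector and warps the image with cv2.getPerspectiveTransform and cv2.warpPerspective. The target corner coordinates (qr_dst) and the output size are hypothetical values introduced for illustration only, not the calibration actually used in this paper.

```python
import cv2
import numpy as np

def read_qr_and_align(gray, out_size=(480, 800), qr_dst=None):
    """Decode the QR code and warp the image so that the rapid test sits at a
    fixed, centred position. The target corner coordinates are illustrative."""
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(gray)
    if points is None or not data:
        raise ValueError("QR code not found")

    src = points.reshape(4, 2).astype(np.float32)      # detected QR corner coordinates
    if qr_dst is None:
        # Hypothetical reference corners: where the QR code should land after warping.
        qr_dst = np.float32([[170, 600], [310, 600], [310, 740], [170, 740]])

    H = cv2.getPerspectiveTransform(src, qr_dst)        # projective mapping
    aligned = cv2.warpPerspective(gray, H, out_size)    # rotate/align the strip
    return data, aligned
```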
Once the rapid test is centered, the portion of the image above the QR code is cropped to prevent interference during ROI selection caused by the presence of the code. ROI extraction is performed through contour detection. The image is first binarized to enhance edges, and contours approximating a rectangular shape are identified. Among these, the largest contour whose center is horizontally aligned with the QR code is selected. This region is then cropped and used as the ROI for interpretation and quantitative analysis.
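A minimal sketch of this contour-based ROI selection is given below, assuming the warped QR code position (qr_center_x, qr_top_y) is known from the previous step; the Otsu binarization and the 10% horizontal-alignment tolerance are illustrative assumptions rather than the parameters used in this paper.

```python
import cv2

def crop_roi(aligned, qr_center_x, qr_top_y):
    """Illustrative ROI extraction: keep only the area above the QR code,
    binarize it, and select the largest roughly rectangular contour whose
    centre is horizontally aligned with the QR code."""
    search = aligned[:qr_top_y, :]                              # portion above the QR code
    _, binary = cv2.threshold(search, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best, best_area = None, 0
    for cnt in contours:
        approx = cv2.approxPolyDP(cnt, 0.02 * cv2.arcLength(cnt, True), True)
        if len(approx) != 4:                                    # keep rectangle-like shapes
            continue
        x, y, w, h = cv2.boundingRect(cnt)
        if abs((x + w / 2) - qr_center_x) > 0.1 * aligned.shape[1]:
            continue                                            # must be centred on the QR code
        if w * h > best_area:
            best, best_area = (x, y, w, h), w * h
    if best is None:
        raise ValueError("ROI not found")
    x, y, w, h = best
    return search[y:y + h, x:x + w]
```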

2.5. Result Detection

When the rapid test result is positive for Flu A, both the C line and T1 line appear simultaneously within the cropped ROI image. The grayscale values along the X-direction of the rapid test exhibit a waveform-like distribution, as shown in Figure 5a. This paper utilizes the characteristics of this distribution to perform numerical normalization for the C line, T1 line, and T2 line. The grayscale values of the ROI image are summed and averaged along the Y-direction to obtain the average grayscale distribution along the X-direction, as illustrated in Figure 5b.
Automatic Multiscale-based Peak Detection (AMPD) is an algorithm for detecting peaks in noisy signals. It was proposed by Scholkmann et al. in 2012 [24]. The method is based on the calculation and analysis of the Local Maxima Scalogram and uses multiscale technology to identify the true peak in the signal. The AMPD algorithm does not require any parameters to be set before analysis and is quite robust to high-frequency and low-frequency noise. Since the mean grayscale distribution along the X-direction exhibited a waveform-like pattern, the AMPD algorithm is then applied to identify the peak positions in the distribution. Based on the relative positions along the X-axis, the presence of the C line and T line is determined, followed by numerical normalization.
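Because AMPD requires no tuning parameters, it can be reproduced compactly. The following NumPy sketch implements a simplified, deterministic variant of the algorithm of Scholkmann et al. (non-maxima are marked with a constant instead of a random value), together with the Y-direction averaging that produces the X-direction profile; it is provided for illustration and is not necessarily the exact code used in this paper.

```python
import numpy as np

def line_profile(roi):
    """Average the ROI grayscale values along the Y-direction to obtain the
    mean intensity distribution along the X-direction."""
    return np.asarray(roi, dtype=float).mean(axis=0)

def ampd_peaks(signal):
    """Simplified Automatic Multiscale-based Peak Detection (AMPD), after
    Scholkmann et al. (2012)."""
    x = np.asarray(signal, dtype=float)
    n = len(x)
    t = np.arange(n)
    x = x - np.polyval(np.polyfit(t, x, 1), t)   # remove the linear trend

    max_scale = n // 2 - 1
    lms = np.ones((max_scale, n))                # local maxima scalogram
    for k in range(1, max_scale + 1):
        i = np.arange(k, n - k)
        is_max = (x[i] > x[i - k]) & (x[i] > x[i + k])
        lms[k - 1, i[is_max]] = 0                # 0 marks a local maximum at scale k

    gamma = lms.sum(axis=1)                      # row-wise sums
    lam = int(np.argmin(gamma)) + 1              # scale with the most local maxima
    peaks = np.where(np.all(lms[:lam, :] == 0, axis=0))[0]
    return peaks
```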
The normalization methods for the C line, T1 line, and T2 line were identical. Taking the C line as an example, the normalization process was performed as follows. First, let I(x) denote the grayscale intensity at position x along the C line region. The background value Ibg was defined as the median grayscale intensity of the entire image, representing the baseline level of non-line regions. The peak grayscale intensity of the C line was denoted as Ipeak. The difference between the C line peak and the background was first calculated as follows:
$\Delta I = I_{\text{peak}} - I_{\text{bg}}$
To obtain a more stable measurement, this paper defines a reference threshold at half the difference between the peak and the background value. The calculation equation is expressed as follows:
$I_{\text{ref}} = I_{\text{bg}} + \tfrac{1}{2}\,\Delta I$
Subsequently, all grayscale values $I(x)$ within the region where $I(x) \geq I_{\text{ref}}$ were extracted and averaged to obtain the representative C line intensity:
$\bar{I}_C = \frac{1}{N}\sum_{x \in R_C} I(x), \quad \text{where } R_C = \{\, x \mid I(x) \geq I_{\text{ref}} \,\}$
and $N$ is the number of positions in $R_C$.
Finally, the normalized intensity of the C line is calculated as
$I_C^{\text{normal}} = \bar{I}_C - I_{\text{bg}}$
This normalization method was consistently applied to the T1 line and T2 line using the same computational procedure, with the corresponding symbols replaced by $I_{T1}$ and $I_{T2}$, respectively.
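The normalization can be sketched in NumPy as follows. The half_width window around each detected peak is a hypothetical parameter introduced for illustration, the background is taken as the median of the 1-D profile rather than of the whole image as a simplification, and the final step follows the background-subtraction form of the normalization described above.

```python
import numpy as np

def normalize_line(profile, peak_x, half_width=20):
    """Normalize one line (C, T1 or T2) following the steps described above:
    median background, reference threshold halfway between background and peak,
    mean of the values at or above the threshold, minus the background."""
    profile = np.asarray(profile, dtype=float)
    i_bg = np.median(profile)                                    # background level I_bg
    region = profile[max(0, peak_x - half_width): peak_x + half_width + 1]
    i_peak = region.max()                                        # peak intensity I_peak
    i_ref = i_bg + 0.5 * (i_peak - i_bg)                         # reference threshold I_ref
    above = region[region >= i_ref]                              # values forming R_C
    return above.mean() - i_bg                                   # normalized intensity
```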

3. Results

Commercially available standard samples of Flu A (H1N1) and Flu B (Yamagata) viruses were used in this paper. The original concentration of the Flu A (H1N1) samples was 500 ng/mL and that of the Flu B (Yamagata) samples was 1000 ng/mL. Both samples were serially diluted before testing. To verify the linearity and detection limit of the proposed method, experiments were conducted using samples with different concentrations in the low-concentration range. As the analytical sensitivity of rapid tests differs between Flu A and B viruses [25], and the initial sample concentrations also vary, distinct concentration ranges were set for each virus. Five concentrations were tested for Flu A and five concentrations for Flu B. For each experiment, 130 μL of sample was applied, and the reaction time was set to 15 min. Each concentration was tested five times to ensure repeatability. All experiments were performed multiple times, and one representative dataset was selected for presentation and discussion.

3.1. Results of Flu A

This paper conducted experiments on Flu A (H1N1) samples using five selected concentrations: 2, 4, 6, 8, and 10 ng/mL. The rapid tests with samples of varying concentrations were placed inside the dark chamber for detection. The average grayscale value distributions along the X-axis are shown in Figure 6. As observed in the figure, the peak of the T1 line decreases as the sample concentration decreases, and the decline levels off at lower concentrations. Even at low concentrations (e.g., 2 ng/mL), although changes are difficult to distinguish with the naked eye, slight variations can still be detected under the controlled environment of the dark chamber used in this paper. Overall, the grayscale values of the T1 line decrease in step with the sample concentration. Moreover, the grayscale values demonstrate a strong correlation with the concentrations, indicating that this method is feasible for quantitative analysis. Through linear regression analysis, a relationship between grayscale intensity and concentration can be established, enabling the quantitative estimation of virus concentration based on the T1 line.
Figure 7 illustrates the relationship between the signal intensity ratio of the T1 line to the C line and the concentration of Flu A (H1N1) at five different levels. The linear regression analysis demonstrates excellent linearity within the concentration range of 2–10 ng/mL, with a coefficient of determination (R²) as high as 0.9923. The results show that the method established in this paper can effectively reflect changes in H1N1 concentration. However, the variability of the T/C ratio is significant at a concentration of 2 ng/mL, which indicates that the detection limit for the established method is 4 ng/mL.
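The calibration and quantitation step can be expressed in a few lines of NumPy, as sketched below; the T1/C ratio values are placeholders for illustration and are not the measured data behind Figure 7.

```python
import numpy as np

# Illustrative calibration: fit a line to T1/C ratios measured at known
# concentrations, then invert it to estimate an unknown concentration.
concentrations = np.array([2.0, 4.0, 6.0, 8.0, 10.0])     # ng/mL (Flu A, H1N1)
tc_ratios = np.array([0.12, 0.21, 0.33, 0.41, 0.52])      # hypothetical T1/C values

slope, intercept = np.polyfit(concentrations, tc_ratios, 1)

# Coefficient of determination R^2 of the linear fit.
pred = slope * concentrations + intercept
ss_res = np.sum((tc_ratios - pred) ** 2)
ss_tot = np.sum((tc_ratios - tc_ratios.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

def estimate_concentration(tc_ratio):
    """Invert the calibration line to estimate concentration from a T/C ratio."""
    return (tc_ratio - intercept) / slope
```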

3.2. Results of Flu B

For Flu B (Yamagata) samples, five concentrations were selected for experimentation: 6, 8, 10, 12.5, and 25 ng/mL. Following UV-LED excitation, the average grayscale distribution along the X-axis for each concentration is presented in Figure 8. As shown in the figure, the peak intensity of the T2 line progressively decreases with decreasing sample concentration, and this trend gradually levels off. At higher concentrations (12.5 ng/mL and 25 ng/mL), the T2 line exhibits a clear contrast against the background. However, as the concentration is reduced to 6 ng/mL and below, the T2 line signal intensity approaches the background level, indicating a marked decline in detection sensitivity under low-concentration conditions. Therefore, the detection limit of the rapid test method in this paper was determined to be 6 ng/mL.
Figure 9 shows the ratio of the T2 line to the C line and its relationship with concentration for five different concentrations of influenza B (Yamagata) samples. The linearity is slightly lower than that obtained for Flu A, with an R² value of 0.9878. When applied to these five concentrations, the T/C ratio at 6 ng/mL showed significant variability, making it difficult to accurately determine the concentration. This indicates that the detection limit of the method established in this paper is 8 ng/mL.

4. Discussion

In this paper, a dark chamber equipped with a UV-LED light source was used. This configuration not only effectively excites the fluorescent or chromogenic reactions of the reagents but also simplifies the system design and reduces the chamber size, thereby facilitating overall device miniaturization. Notably, under high-concentration conditions (25 ng/mL) of Flu B (Yamagata) samples, the peak intensity of the C line exhibited a smaller difference from the background signal compared to the T line. This phenomenon is likely due to antibody saturation or competitive binding effects in the reagent system, resulting in a reduction in the C line signal. To mitigate the influence of such effects on result interpretation, this paper employed the T/C intensity ratio as the output metric to improve the stability and accuracy of quantitative analysis.
In addition, the background threshold was determined based on the median grayscale value within the ROI, rather than the overall mean. This choice was made because mean values are more susceptible to distortion from high-intensity signals, particularly from the T and C lines at higher sample concentrations. Such distortion can lead to an overestimation of the background level, subsequently causing underestimation of the actual analyte concentration and compromising quantification accuracy. Using the median as the reference background value thus enhances the robustness and reliability of the data.

4.1. Limitations of the Method

Although the established method demonstrated high linearity under specific conditions (e.g., an R² of 0.9923 for Flu A (H1N1)), several key limitations must be considered when evaluating its broader applicability: variability at low concentrations, validation on only two specific influenza strains, and environmental and reagent-related restrictions.
The method faces challenges in the detection of low-concentration samples. Experimental results show that as the detection limit is approached, signal variability increases significantly. For instance, at a concentration of 2 ng/mL for Flu A (H1N1), the variability of the T/C ratio was significant; thus, the stable detection limit for this method was determined to be 4 ng/mL. Similarly, for Flu B (Yamagata), the T/C ratio showed significant variability at 6 ng/mL, where the T2 line signal intensity was also close to the background level; the stable detection limit was therefore determined to be 8 ng/mL.
The validation experiments in this paper were conducted using two specific standard test samples: Flu A (H1N1) and Flu B (Yamagata). This indicates that the method’s efficacy and performance have been validated only for these two specific strains. Its performance on other influenza strains such as Flu A (H3N2) or Flu B (Victoria) remains to be further confirmed in future studies [9,26,27].
The currently developed detection method is specifically designed for use in a dark chamber equipped with a UV-LED light source, which limits its immediate application in general clinics or home settings. Moreover, this image processing method is designed for fluorescent rapid tests and cannot be directly used to interpret conventional colloidal gold (non-fluorescent) rapid tests.

4.2. Future Outlook

To overcome the aforementioned limitations and expand the applicability of the established method, future research will proceed in two main directions. First, the test strips and concentration ranges should be adjusted to broaden the range of potential applications. Second, the established method could be integrated with neural networks to enable more sophisticated image detection, such as using smartphone imaging under normal lighting conditions and applying the approach to non-fluorescent rapid test reagents.
However, when developing such deep learning-based systems, it is essential to consider the cautions raised by Dell’Olmo et al. in their study on CNN-based forgery detection [28]. Their work highlights the major challenge of dataset dependency, emphasizing that the performance of CNN architectures is strongly influenced by the intrinsic characteristics of the training datasets. Their analysis demonstrated that factors such as sample size, class imbalance, and the intrinsic complexity of manipulations, which in our case refers to the subtlety of the signal, are critical determinants of a model’s generalization capability.

5. Conclusions

This paper introduces a method for interpreting and quantitatively analyzing the results of rapid fluorescence tests for influenza A and influenza B in a UV-LED environment. When applied to Flu A (H1N1), the method showed excellent linearity with an R² of 0.9923. It was effective even at low concentrations, with a detection limit of 4 ng/mL. For Flu B (Yamagata), good linearity was also observed, with an R² of 0.9878. However, the method was unable to reliably interpret results at concentrations below the detection limit of 8 ng/mL. Currently, the detection method developed in this paper is designed for use exclusively in a dark chamber and is specifically applied to fluorescent rapid tests. It cannot be directly used with conventional colloidal gold-based rapid test reagents. In the future, the proposed detection approach could be integrated with neural networks to enable its application to non-fluorescent rapid test interpretation and to operate beyond the dark chamber environment, for example, by utilizing smartphone imaging for result interpretation under normal lighting conditions.

Author Contributions

Conceptualization and methodology, W.-C.W. and Y.-L.W.; investigation and validation, W.-C.W. and Y.-L.W.; writing—original draft preparation, Y.-L.W.; supervision, Y.-C.L.; writing—review and editing, Y.-L.W., W.-F.P. and Y.-C.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available in the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Iuliano, A.D.; Roguski, K.M.; Chang, H.H.; Muscatello, D.J.; Palekar, R.; Tempia, S.; Cohen, C.; Gran, J.M.; Schanzer, D.; Cowling, B.J.; et al. Estimates of global seasonal influenza-associated respiratory mortality: A modelling study. Lancet 2017, 391, 1285–1300. [Google Scholar] [CrossRef]
  2. Weng, Y.S. The Development of a Community-Based Health Promotion System. J. Med. Health Sci. 2014, 24, 365–375. [Google Scholar] [CrossRef]
  3. Hung, C.Y.; Hu, H.C.; Huang, C.C.; Hsieh, M.J.; Yang, C.T.; Kao, K.C. Severe Acute Respiratory Distress Syndrome Caused by Influenza B Virus in a Healthy Adult. Thorac. Med. 2011, 26, 147–152. [Google Scholar] [CrossRef]
  4. Sederdahl, B.K.; Williams, J.V. Epidemiology and clinical characteristics of influenza C virus. Viruses 2020, 12, 89. [Google Scholar] [CrossRef]
  5. Schughart, K.; Smith, A.M.; Tsalik, E.L.; Threlkeld, S.C.; Sellers, S.; Fischer, W.A., II; Schreiber, J.; Lücke, E.; Cornberg, M.; Debarry, J.; et al. Host response to influenza infections in human blood: Association of influenza severity with host genetics and transcriptomic response. Front. Immunol. 2024, 15, 1385362. [Google Scholar] [CrossRef] [PubMed]
  6. Uyeki, T.M.; Bernstein, H.H.; Bradley, J.S.; Englund, J.A.; File, T.M.; Fry, A.M.; Gravenstein, S.; Hayden, F.G.; Harper, S.A.; Hirshon, J.M.; et al. Clinical practice guidelines by the Infectious Diseases Society of America: 2018 update on diagnosis, treatment, chemoprophylaxis, and institutional outbreak management of seasonal influenza. Clin. Infect. Dis. 2019, 68, e1–e47. [Google Scholar] [CrossRef] [PubMed]
  7. Yin, H.; Wu, W.; Lv, Y.; Kou, H.; Sun, Y. Comparative evaluation of three rapid influenza diagnostic tests for detection of influenza A and B viruses using RT-PCR as reference method. J. Med. Virol. 2025, 97, e70162. [Google Scholar] [CrossRef] [PubMed]
  8. Morehouse, Z.; Chance, N.; Ryan, G.L.; Proctor, C.M.; Nash, R. A narrative review of nine commercial point of care influenza tests: An overview of methods, benefits, and drawbacks to rapid influenza diagnostic testing. J. Osteopath. Med. 2023, 123, 39–47. [Google Scholar] [CrossRef]
  9. Chartrand, C.; Leeflang, M.M.G.; Minion, J.; Brewer, T.; Pai, M. Accuracy of rapid influenza diagnostic tests: A meta-analysis. Ann. Intern. Med. 2012, 156, 500–511. [Google Scholar] [CrossRef]
  10. Radha, R.; Shahzadi, S.K.; Al-Sayah, M.H. Fluorescent Immunoassays for Detection and Quantification of Cardiac Troponin I: A Short Review. Molecules 2021, 26, 4812. [Google Scholar] [CrossRef]
  11. Babamiri, B.; Hallaj, R.; Salimi, A. Solid surface fluorescence immunosensor for ultrasensitive detection of hepatitis B virus surface antigen using PAMAM/CdTe@CdS QDs nanoclusters. Methods Appl. Fluoresc. 2018, 6, 035013. [Google Scholar] [CrossRef]
  12. Jin, Z.; Wang, Y.; Zhang, Y.; Wang, Y. A new method for rapid screening of hybridoma cell clones secreting paired antibodies using sandwich cell surface fluorescence immunosorbent assay. Anal. Chim. Acta 2021, 1163, 338493. [Google Scholar] [CrossRef] [PubMed]
  13. Lee, J.Y.; Baek, S.H.; Ahn, J.G.; Yoon, S.H.; Kim, M.K.; Kim, S.Y.; Kim, K.W.; Sohn, M.H.; Kang, J.M. Delayed Influenza Treatment in Children With False-Negative Rapid Antigen Test: A Retrospective Single-Center Study in Korea 2016–2019. J. Korean Med. Sci. 2021, 37, e3. [Google Scholar] [CrossRef] [PubMed]
  14. Lin, K.-W.; Chang, Y.-C. Embedded Immunodetection System for Fecal Occult Blood. Biosensors 2021, 11, 106. [Google Scholar] [CrossRef] [PubMed]
  15. Turbé, V.; Herbst, C.; Mngomezulu, T.; Meshkinfamfard, S.; Dlamini, N.; Mhlongo, T.; Smit, T.; Cherepanova, V.; Shimada, K.; Budd, J.; et al. Deep learning of HIV field-based rapid tests. Nat. Med. 2021, 27, 1165–1170. [Google Scholar] [CrossRef]
  16. Schary, W.; Paskali, F.; Rentschler, S.; Ruppert, C.; Wagner, G.E.; Steinmetz, I.; Deigner, H.-P.; Kohl, M. Open-Source, Adaptable, All-in-One Smartphone-Based System for Quantitative Analysis of Point-of-Care Diagnostics. Diagnostics 2022, 12, 589. [Google Scholar] [CrossRef]
  17. Tuong, H.T.; Jeong, J.H.; Choi, Y.K.; Park, H.; Baek, Y.H.; Yeo, S.-J. Development of a Rapid Fluorescent Diagnostic System to Detect Subtype H9 Influenza A Virus in Chicken Feces. Int. J. Mol. Sci. 2021, 22, 8823. [Google Scholar] [CrossRef]
  18. Goux, H.J.; Vu, B.V.; Wasden, K.; Alpadi, K.; Kumar, A.; Kalra, B.; Savjani, G.; Brosamer, K.; Kourentzi, K.; Willson, R.C. Development of a quantitative fluorescence lateral flow immunoassay (LFIA) prototype for point-of-need detection of anti-Müllerian hormone. Pract. Lab. Med. 2023, 35, e00314. [Google Scholar] [CrossRef]
  19. Wu, Q.; He, Y. Indoor location technology based on LED visible light and QR code. Appl. Opt. 2021, 60, 4606–4612. [Google Scholar] [CrossRef]
  20. Shukran, M.A.M.; Ishak, M.S.; Abdullah, M.N. Enhancing chemical inventory management in laboratory through a mobile-based QR code tag. IOP Conf. Ser. Mater. Sci. Eng. 2017, 226, 012093. [Google Scholar] [CrossRef]
  21. Jin, L.; Zhang, J.; Hold-Geoffroy, Y.; Wang, O.; Matzen, K.; Sticha, M.; Fouhey, D.F. Perspective Fields for Single Image Camera Calibration. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 17–22 June 2023; pp. 1234–1244. [Google Scholar] [CrossRef]
  22. Abu-raddaha, A.; El-Shair, Z.A.; Rawashdeh, S. Leveraging Perspective Transformation for Enhanced Pothole Detection in Autonomous Vehicles. J. Imaging 2024, 10, 227. [Google Scholar] [CrossRef]
  23. Zhang, Y.; Yang, B.; Lei, W.; Pei, X. Research on Multimodal Fusion Perception Technology for Autonomous Sweeping Vehicle. IEEE Sens. J. 2025, 25, 27743–27753. [Google Scholar] [CrossRef]
  24. Scholkmann, F.; Boss, J.; Wolf, M. An Efficient Algorithm for Automatic Peak Detection in Noisy Periodic and Quasi-Periodic Signals. Algorithms 2012, 5, 588–603. [Google Scholar] [CrossRef]
  25. Wang, K.; Lin, C.; Fang, Y.; Kao, M.; Shih, D.Y.-C.; Lo, C.-F.; Wang, D.-Y. Sensitivity and specificity of in vitro diagnostic device used for influenza rapid test in Taiwan. J. Food Drug Anal. 2014, 22, 279–284. [Google Scholar] [CrossRef]
  26. Chon, I.; Saito, R.; Kyaw, Y.; Aye, M.M.; Setk, S.; Phyu, W.W.; Wagatsuma, K.; Li, J.; Sun, Y.; Otoguro, T.; et al. Whole-Genome Analysis of Influenza A(H3N2) and B/Victoria Viruses Detected in Myanmar during the COVID-19 Pandemic in 2021. Viruses 2023, 15, 583. [Google Scholar] [CrossRef]
  27. Sakai-Tagawa, Y.; Yamayoshi, S.; Kawaoka, Y. Sensitivity of Commercially Available Influenza Rapid Diagnostic Tests in the 2018–2019 Influenza Season. Front. Microbiol. 2019, 10, 2342. [Google Scholar] [CrossRef]
  28. Dell’Olmo, P.V.; Kuznetsov, O.; Frontoni, E.; Arnesano, M.; Napoli, C.; Randieri, C. Dataset Dependency in CNN-Based Copy-Move Forgery Detection: A Multi-Dataset Comparative Analysis. Mach. Learn. Knowl. Extr. 2025, 7, 54. [Google Scholar] [CrossRef]
Figure 1. Configuration of the FSIA rapid test: (a) configuration of the FSIA rapid test strip; (b) photograph of the actual rapid test.
Figure 2. Colorimetric results of the rapid test: (a) different result types and their corresponding interpretations; (b) actual appearance of the rapid test after detection.
Figure 3. Structure of the dark chamber.
Figure 4. Flowchart of the established detection method.
Figure 5. Grayscale value distribution diagram of a Flu A-positive test: (a) original image; (b) distribution of the average grayscale values along the X-axis.
Figure 6. Five different concentrations of Flu A (H1N1) samples and their average grayscale value distributions along the X-axis.
Figure 7. Distribution graph of Flu A (H1N1) concentration and the T1 line/C line ratio.
Figure 8. Flu B (Yamagata) samples at five concentrations and their average grayscale value distributions along the X-axis.
Figure 9. Distribution graph of Flu B (Yamagata) concentration and the T2 line/C line ratio.