Nanomaterials
  • Article
  • Open Access

23 October 2025

A Wavelet-Based Bilateral Segmentation Study for Nanowires

School of Computer Science and Technology, Changchun Normal University, Changchun 130032, China
* Authors to whom correspondence should be addressed.
This article belongs to the Special Issue Low Dimensional Materials Fabrication, Characterization and Applications

Abstract

One-dimensional (1D) nanowires represent a critical class of nanomaterials with extensive applications in biosensing, biomedicine, bioelectronics, and energy harvesting. In materials science, accurately extracting their morphological and structural features is essential for effective image segmentation. However, 1D nanowires frequently appear in dispersed or entangled configurations, often with blurred backgrounds and indistinct boundaries, which significantly complicates the segmentation process. Traditional threshold-based methods struggle to segment these structurally complex nanowires with high precision. To address this challenge, we propose a wavelet-based Bilateral Segmentation Network, WaveBiSeNet, which incorporates a Dual Wavelet Convolution Module (DWCM) and a Flexible Upsampling Module (FUM) to enhance feature representation and improve segmentation accuracy. In this study, we benchmarked WaveBiSeNet against ten segmentation models on a peptide nanowire image dataset. Experimental results demonstrate that WaveBiSeNet achieves an mIoU of 77.59%, an accuracy of 89.95%, an F1 score of 87.22%, and a Kappa coefficient of 74.13%. Compared with other advanced models, the proposed model achieves better segmentation performance. These findings show that WaveBiSeNet is an end-to-end deep segmentation network capable of accurately analyzing complex 1D nanowire structures.
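For readers unfamiliar with the reported evaluation metrics, the sketch below shows one conventional way to compute mIoU, pixel accuracy, F1 score, and Cohen's kappa from a binary (nanowire vs. background) segmentation mask. The function name `segmentation_metrics` and the two-class averaging convention for mIoU are illustrative assumptions; the paper's exact evaluation protocol is not described in the abstract.

```python
import numpy as np

def segmentation_metrics(pred, target):
    """Compute common binary-segmentation metrics (illustrative sketch).

    pred, target: arrays of the same shape with values in {0, 1},
    where 1 marks nanowire pixels and 0 marks background.
    Assumes both classes occur in the image (no zero denominators).
    """
    pred = pred.astype(bool)
    target = target.astype(bool)

    # Pixel-level confusion counts.
    tp = np.logical_and(pred, target).sum()
    tn = np.logical_and(~pred, ~target).sum()
    fp = np.logical_and(pred, ~target).sum()
    fn = np.logical_and(~pred, target).sum()
    n = tp + tn + fp + fn

    # Mean IoU: average of foreground and background IoU.
    iou_fg = tp / (tp + fp + fn)
    iou_bg = tn / (tn + fp + fn)
    miou = (iou_fg + iou_bg) / 2

    # Overall pixel accuracy.
    accuracy = (tp + tn) / n

    # F1 score (equivalently, Dice) for the nanowire class.
    f1 = 2 * tp / (2 * tp + fp + fn)

    # Cohen's kappa: observed agreement corrected for chance agreement.
    p_o = accuracy
    p_e = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)
    kappa = (p_o - p_e) / (1 - p_e)

    return {"mIoU": miou, "accuracy": accuracy, "F1": f1, "kappa": kappa}
```

Applied to a predicted mask and its ground-truth annotation, this returns fractions in [0, 1] that correspond to the percentages quoted above.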
