Search Results (467)

Search Parameters:
Keywords = automated behavior analysis

22 pages, 2930 KB  
Article
Developing and Assessing the Performance of a Machine Learning Model for Analyzing Drinking Behaviors in Minipigs for Experimental Research
by Frederik Deutch, Lars Schmidt Hansen, Firas Omar Saleh, Marc Gjern Weiss, Constanca Figueiredo, Cyril Moers, Anna Krarup Keller and Stefan Rahr Wagner
Sensors 2026, 26(2), 402; https://doi.org/10.3390/s26020402 - 8 Jan 2026
Abstract
Monitoring experimental animals is essential for ethical, scientific, and financial reasons. Conventional observation methods are limited by subjectivity and time constraints. Camera-based monitoring combined with machine learning offers a promising solution for automating the monitoring process. This study aimed to validate and assess the performance of a machine learning model for analyzing drinking behavior in minipigs. A novel, vision-based monitoring system was developed and tested to detect drinking behavior in minipigs. The system, based on low-cost Raspberry Pi units, enabled on-site video analysis. A dataset of 5297 images was used to train a YOLOv11n object detection model to identify key features such as pig heads and water faucets. Drinking events were defined by the spatial proximity of these features within video frames. The multi-class object detection model achieved an accuracy above 97%. Manual validation using human-annotated ground truth on 72 h of video yielded an overall accuracy of 99.7%, with a precision of 99.7%, recall of 99.2%, and F1-score of 99.5%. Drinking patterns for three pigs were analyzed using 216 h of video. The results revealed a bimodal drinking pattern and substantial inter-pig variability. A limitation of the study was that the chosen methods could not distinguish between individual pigs and did not quantify water intake. This study demonstrates the feasibility of a low-cost, computer vision-based system for monitoring drinking behavior in individually housed experimental pigs, supporting earlier detection of illness.
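The proximity rule described in the abstract — a drinking event is flagged when a detected pig head lies close to a detected water faucet in the same frame — can be sketched as follows. The box format, class labels, and the 60-pixel threshold are illustrative assumptions, not the paper's actual values:

```python
# Hedged sketch of a proximity-based drinking-event rule over object
# detections. Labels ("pig_head", "faucet"), the (x1, y1, x2, y2) box
# format, and max_dist are hypothetical choices for illustration.

def center(box):
    """Center (x, y) of an (x1, y1, x2, y2) bounding box."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def is_drinking(detections, max_dist=60.0):
    """detections: list of (label, (x1, y1, x2, y2)) from the detector.

    Returns True if any pig-head center lies within max_dist pixels
    of any faucet center in the same frame.
    """
    heads = [center(b) for lbl, b in detections if lbl == "pig_head"]
    faucets = [center(b) for lbl, b in detections if lbl == "faucet"]
    for hx, hy in heads:
        for fx, fy in faucets:
            if ((hx - fx) ** 2 + (hy - fy) ** 2) ** 0.5 <= max_dist:
                return True
    return False

frame = [("pig_head", (100, 100, 160, 150)), ("faucet", (150, 90, 170, 130))]
print(is_drinking(frame))  # → True (centers are ~34 px apart)
```

Aggregating this per-frame flag over time would yield the drinking-pattern time series the study analyzes.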

20 pages, 945 KB  
Article
A Pilot Study on Multilingual Detection of Irregular Migration Discourse on X and Telegram Using Transformer-Based Models
by Dimitrios Taranis, Gerasimos Razis and Ioannis Anagnostopoulos
Electronics 2026, 15(2), 281; https://doi.org/10.3390/electronics15020281 - 8 Jan 2026
Abstract
The rise of Online Social Networks has reshaped global discourse, enabling real-time conversations on complex issues such as irregular migration. Yet the informal, multilingual, and often noisy nature of content on platforms like X (formerly Twitter) and Telegram presents significant challenges for reliable automated analysis. This study presents an exploratory multilingual natural language processing (NLP) framework for detecting irregular migration discourse across five languages. Conceived as a pilot study addressing extreme data scarcity in sensitive migration contexts, this work evaluates transformer-based models on a curated multilingual corpus. It provides an initial baseline for monitoring informal migration narratives on X and Telegram. We evaluate a broad range of approaches, including traditional machine learning classifiers, SetFit sentence-embedding models, fine-tuned multilingual BERT (mBERT) transformers, and a Large Language Model (GPT-4o). The results show that GPT-4o achieves the highest performance overall (F1-score: 0.84), with scores reaching 0.89 in French and 0.88 in Greek. While mBERT excels in English, SetFit outperforms mBERT in low-resource settings, specifically in Arabic (0.79 vs. 0.70) and Greek (0.88 vs. 0.81). The findings highlight the effectiveness of transformer-based and large-language-model approaches, particularly in low-resource or linguistically heterogeneous environments. Overall, the proposed framework provides an initial, compact benchmark for multilingual detection of irregular migration discourse under extreme, low-resource conditions. The results should be viewed as exploratory indicators of model behavior on this synthetic, small-scale corpus, not as statistically generalizable evidence or deployment-ready tools. 
In this context, “multilingual” refers to robustness across different linguistic realizations of identical migration narratives under translation, rather than coverage of organically diverse multilingual public discourse. Full article
(This article belongs to the Special Issue Artificial Intelligence-Driven Emerging Applications)

19 pages, 539 KB  
Article
Actuator-Aware Evaluation of MPC and Classical Controllers for Automated Insulin Delivery
by Adeel Iqbal, Pratik Goswami and Hamid Naseem
Actuators 2026, 15(1), 35; https://doi.org/10.3390/act15010035 - 5 Jan 2026
Abstract
Automated insulin delivery (AID) systems depend on their actuators’ behavior since saturation limits, rate constraints, and hardware degradation directly affect the stability and safety of glycemic regulation. In this paper, we conducted an actuator-centric evaluation of five control strategies: Nonlinear Model Predictive Control (NMPC), Linear MPC (LMPC), Adaptive MPC (AMPC), Proportional-Integral-Derivative (PID), and Linear Quadratic Regulator (LQR) in three physiologically realistic scenarios: the first combines exercise and sensor noise to test for stress robustness; the second tightens the actuation constraints to provoke saturation; and the third models partial degradation of an insulin actuator in order to quantify fault tolerance. We have simulated a full virtual cohort under the two-actuator configurations, DG3.2 and DG4.0, in an effort to investigate generation-to-generation consistency. The results detail differences in the way controllers distribute insulin and glucagon effort, manage rate limits, and handle saturation: NMPC shows persistently tighter control with fewer rate-limit violations in both DG3.2 and DG4.0, whereas the classical controllers are prone to sustained saturation episodes and delayed settling under hard disturbances. In response to actuator degradation, NMPC suffers smaller losses in insulin effort with limited TIR losses, whereas both PID and LQR show increased variability and overshoot. This comparative analysis yields fundamental insights into important trade-offs between robustness, efficiency, and hardware stress and demonstrates that actuator-aware control design is essential for next-generation AID systems. Such findings position MPC-based algorithms as leading candidates for future development of actuator-limited medical devices and deliver important actionable insights into actuator modeling, calibration, and controller tuning during clinical development. Full article
(This article belongs to the Section Actuators for Medical Instruments)

22 pages, 14360 KB  
Article
Kinematic Characterization of a Novel 4-DoF Parallel Mechanism with Modular Actuation
by Zoltán Forgó and Ferenc Tolvaly-Roșca
Robotics 2026, 15(1), 13; https://doi.org/10.3390/robotics15010013 - 1 Jan 2026
Abstract
The accelerating industrial demand for high-speed manipulation has necessitated the development of robotic architectures that effectively balance dynamic performance with workspace size. While serial SCARA robots offer large workspaces and parallel Delta robots provide high acceleration, existing architectures fail to combine these benefits effectively for specific four-degree-of-freedom (4-DoF) Schoenflies motion tasks. This study introduces and characterizes a novel 4-DoF parallel topology, having a symmetrical build-up, which is distinguished by its use of modular 2-DoF linear drive units. The research methodology entails the structural synthesis of the kinematic chain followed by kinematic analysis using vector algebra to derive closed-form inverse geometric models. Additionally, the Jacobian matrix is formulated to evaluate velocity transmission and systematically classify singular configurations, while the dexterity index is defined to assess the rotational capabilities of the mechanism. Numerical simulations of pick-and-place trajectory were also conducted, varying trajectory curvature to analyze kinematic behavior. The results demonstrate that the proposed modular architecture yields a highly symmetric and homogeneous workspace that can be scaled simply by adjusting the drive module lengths. Furthermore, the singularity and dexterity analyses reveal a substantial, singularity-free operational workspace, although tighter trajectory curvatures were found to impose higher velocity demands on the joints. In conclusion, the proposed mechanism successfully achieves the targeted Schoenflies motion, offering a solution for automated industrial tasks. Full article
(This article belongs to the Special Issue Advanced Control and Optimization for Robotic Systems)

20 pages, 1440 KB  
Article
Robust Optimization and Workspace Enhancement of a Reconfigurable Delta Robot via a Singularity-Sensitive Index
by Arturo Franco-López, Mauro Maya, Alejandro González, Liliana Félix-Ávila, César-Fernando Méndez-Barrios and Antonio Cardenas
Robotics 2026, 15(1), 11; https://doi.org/10.3390/robotics15010011 - 30 Dec 2025
Abstract
This study investigates the kinematic behavior of a reconfigurable Delta parallel robot aiming to enhance its performance in real industrial applications such as high-speed packaging, precision pick-and-place operations, automated inspection, and lightweight assembly tasks. While Delta robots are widely recognized for their speed and accuracy, their practical use is often limited by workspace constraints and singularities that compromise motion stability and control safety. Through a detailed analysis, it is shown that classical Jacobian-based performance indices are unsuitable for resolving the redundancy introduced by geometric reconfiguration, as they may lead the system toward singular or ill-conditioned configurations. To overcome these limitations, this work introduces an adjustable singularity-sensitive performance index designed to penalize extreme velocity and force singular values and enables tuning between velocity and force performance. Simulation results demonstrate that optimizing the reconfiguration parameter using this index increases the usable workspace by approximately 82% and improves the uniformity of manipulability across the workspace. These findings suggest that the proposed approach provides a robust framework for enhancing the operational range and kinematic safety of reconfigurable Delta robots, while remaining adaptable to different design priorities. Full article
(This article belongs to the Topic New Trends in Robotics: Automation and Autonomous Systems)

24 pages, 1035 KB  
Article
XT-Hypergraph-Based Decomposition and Implementation of Concurrent Control Systems Modeled by Petri Nets
by Łukasz Stefanowicz, Paweł Majdzik and Marcin Witczak
Appl. Sci. 2026, 16(1), 340; https://doi.org/10.3390/app16010340 - 29 Dec 2025
Abstract
This paper presents an integrated approach to the structural decomposition of concurrent control systems using exact transversal hypergraphs (XT-hypergraphs). The proposed method combines formal properties of XT-hypergraphs with invariant-based Petri net analysis to enable automatic partitioning of complex, concurrent specifications into deterministic and independent components. The approach focuses on preserving behavioral correctness while minimizing inter-component dependencies and computational complexity. By exploiting the uniqueness of minimal transversal covers, reducibility, and structural stability of XT-hypergraphs, the method achieves a deterministic decomposition process with polynomial-delay generation of exact transversals. The research provides practical insights into the construction, reduction, and classification of XT structures, together with quality metrics evaluating decomposition efficiency and structural compactness. The developed methodology is validated on representative real-world control and embedded systems, showing its applicability in deterministic modeling, analysis, and implementation of concurrent architectures. Future work includes the integration of XT-hypergraph algorithms with adaptive decomposition and verification frameworks to enhance scalability and automation in modern system design and integration with currently popular AI and machine learning methods. Full article

26 pages, 5836 KB  
Article
Soil Classification from Cone Penetration Test Profiles Based on XGBoost
by Jinzhang Zhang, Jiaze Ni, Feiyang Wang, Hongwei Huang and Dongming Zhang
Appl. Sci. 2026, 16(1), 280; https://doi.org/10.3390/app16010280 - 26 Dec 2025
Abstract
This study develops a machine-learning-based framework for multiclass soil classification using Cone Penetration Test (CPT) data, aiming to overcome the limitations of traditional empirical Soil Behavior Type (SBT) charts and improve the automation, continuity, robustness, and reliability of stratigraphic interpretation. A dataset of 340 CPT soundings from 26 sites in Shanghai is compiled, and a sliding-window feature engineering strategy is introduced to transform point measurements into local pattern descriptors. An XGBoost-based multiclass classifier is then constructed using fifteen engineered features, integrating second-order optimization, regularized tree structures, and probability-based decision functions. Results demonstrate that the proposed method achieves strong classification performance across nine soil categories, with an overall classification accuracy of approximately 92.6%, an average F1-score exceeding 0.905, and a mean Average Precision (mAP) of 0.954. The confusion matrix, P–R curves, and prediction probabilities show that soil types with distinctive CPT signatures are classified with near-perfect confidence, whereas transitional clay–silt facies exhibit moderate but geologically consistent misclassification. To evaluate depth-wise prediction reliability, an Accuracy Coverage Rate (ACR) metric is proposed. Analysis of all CPTs reveals a mean ACR of 0.924, and the ACR follows a Weibull distribution. Feature importance analysis indicates that depth-dependent variables and smoothed ps statistics are the dominant predictors governing soil behavior differentiation. The proposed XGBoost-based framework effectively captures nonlinear CPT–soil relationships, offering a practical and interpretable tool for high-resolution soil classification in subsurface investigations. Full article
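The sliding-window feature engineering step — replacing each CPT depth point with local-window statistics so the classifier sees patterns rather than isolated measurements — might look like this minimal sketch. The window size and the four statistics shown are assumptions for illustration, not the paper's fifteen engineered features:

```python
# Hedged sketch of sliding-window feature engineering for a CPT
# profile: each depth index gets summary statistics of the cone
# resistance in a local window. half_width and the statistic set are
# hypothetical; real features would also include depth and other
# channels (e.g. sleeve friction).
from statistics import mean, pstdev

def window_features(values, half_width=2):
    """Return one feature dict per depth point, clipped at the ends."""
    feats = []
    for i in range(len(values)):
        lo, hi = max(0, i - half_width), min(len(values), i + half_width + 1)
        w = values[lo:hi]
        feats.append({"mean": mean(w), "std": pstdev(w),
                      "min": min(w), "max": max(w)})
    return feats

qc = [1.2, 1.3, 5.0, 5.1, 5.2, 1.1]  # toy cone-resistance profile, MPa
f = window_features(qc)
print(f[0]["mean"])  # mean of the first (clipped) window [1.2, 1.3, 5.0]
```

Feature rows like these, stacked per depth point, would then feed the XGBoost multiclass classifier.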

33 pages, 4543 KB  
Review
A One-Dimensional Model Used for the Analysis of Seismic Site Response and Soil Instabilities: A Review of SCOSSA 1.0 Computer Code
by Giuseppe Tropeano and Anna Chiaradonna
Geotechnics 2026, 6(1), 2; https://doi.org/10.3390/geotechnics6010002 - 25 Dec 2025
Abstract
This review aims to provide a comprehensive state-of-the-art account of the SCOSSA computer code, a one-dimensional nonlinear code used for the analysis of seismic site response and soil instability. Indeed, among the effects of earthquakes, the activation of landslides and liquefaction constitute two of the predominant causes of vulnerability in the physical and built environment. The SCOSSA computer code (Seismic Code for Stick–Slip Analysis) was initially developed to evaluate the permanent displacements of simplified slopes using a coupled model, and introduced several improvements over earlier approaches: the formulation for solving the dynamic equilibrium equations incorporates automated detection of the critical sliding surface; an up-to-date constitutive model represents hysteretic material behavior; and a stable iterative algorithm supports the solution of the system in terms of kinematic variables. To address liquefaction-induced failure, a simplified pore water pressure generation model was subsequently developed and integrated into the code, coupled with one-dimensional consolidation theory. This review retraces the main features, developments, and applications of the computer code from its origin to the present version.

23 pages, 7378 KB  
Article
A Longitudinal 3D Live-Cell Imaging Platform to Uncover AAV Vector–Host Dynamics at Single-Cell Resolution
by Marlies Leysen, Nicolas Peredo, Benjamin Pavie, Benjamien Moeyaert and Els Henckaerts
Int. J. Mol. Sci. 2026, 27(1), 236; https://doi.org/10.3390/ijms27010236 - 25 Dec 2025
Abstract
Recombinant adeno-associated viral vectors (rAAVs) are the leading gene delivery vehicles in clinical development, yet efficient nuclear delivery remains a major barrier to effective transduction. This limitation is partly due to the incomplete understanding of rAAV’s complex subcellular trafficking dynamics. Here, we establish a longitudinal confocal live-cell imaging workflow that tracks rAAV2 from 4 to 12 h post-transduction, paired with an automated 3D analysis pipeline that quantifies spatiotemporal vector distribution, cytoplasmic trafficking, nuclear accumulation, and transgene expression at single-cell resolution. We use this platform to evaluate the effects of vector dose, cell cycle progression, and the behavior of empty particles. We identify previously undescribed trafficking features associated with high transgene expression. Higher rAAV2 doses enhanced cytoplasmic trafficking and nuclear delivery, while cell cycle progression facilitated both trafficking efficiency and transgene expression. We also characterize empty rAAV2 particles, revealing distinct trafficking patterns and markedly reduced nuclear accumulation compared to genome-containing vectors. By uncovering new bottlenecks in rAAV transduction, this platform provides mechanistic insights and potential strategies to improve AAV-based gene therapy. Its generalizable design further supports broad applicability to other non-enveloped viruses. Full article
(This article belongs to the Special Issue Molecular Advances in Parvovirus)

23 pages, 3108 KB  
Article
Transformer-Based Memory Reverse Engineering for Malware Behavior Reconstruction
by Khaled Alrawashdeh
Computers 2026, 15(1), 8; https://doi.org/10.3390/computers15010008 - 24 Dec 2025
Abstract
Volatile memory provides the most direct and clear view into a system’s runtime behavior. Yet, traditional forensics methods are prone to errors and remain fragile against modern obfuscation and injection techniques. This paper introduces a textual-attention transformer framework that treats raw memory bytes as linguistic tokens, allowing the model to read memory as text and infer contextual relationships across disjoint regions. The proposed model aligns positional encodings with memory addresses and learns to associate scattered structures—such as injected stubs, PE headers, and decryption routines—within a unified semantic space. Experiments on two publicly verifiable datasets, CIC-MalMem-2022 (multi-class) and NIST CFReDS Basic Memory Images (binary), demonstrate that this approach reconstructs malware behavior with ≈97% accuracy, outperforming CNN and LSTM baselines. Attention heatmaps reveal interpretable forensic cues that identify malicious regions, bridging AI and digital forensics. The proposed concept of textual self-attention for memory opens a new paradigm in automated memory analysis—transforming volatile memory into a readable, interpretable sequence for malware behavior reconstruction. Full article
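The core idea of treating raw memory bytes as linguistic tokens whose positions align with memory addresses can be illustrated with a minimal sketch. The one-byte token granularity and the base address used here are assumptions for illustration, not the paper's actual tokenization:

```python
# Hedged sketch of "memory as text": each byte becomes a hex token
# paired with its absolute address, so a transformer's positional
# encoding can be tied to memory layout. Per-byte tokens and the
# 0x400000 base address are hypothetical choices.

def tokenize_memory(data: bytes, base_addr: int = 0):
    """Return (token, address) pairs for a memory region."""
    return [(f"{b:02x}", base_addr + i) for i, b in enumerate(data)]

region = bytes.fromhex("4d5a9000")  # starts with 'MZ', a PE-header magic
tokens = tokenize_memory(region, base_addr=0x400000)
print(tokens[0])  # → ('4d', 4194304)
```

Sequences like this, rather than fixed struct offsets, are what would let attention relate scattered structures such as injected stubs and PE headers.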
(This article belongs to the Special Issue Cyber Security and Privacy in IoT Era)

14 pages, 2669 KB  
Article
Laser Biospeckles Analysis for Rapid Evaluation of Organic Pollutants in Water
by Arti Devi, Hirofumi Kadono and Uma Maheswari Rajagopalan
AppliedPhys 2026, 2(1), 1; https://doi.org/10.3390/appliedphys2010001 - 21 Dec 2025
Abstract
Rapid evaluation of water toxicity requires biological methods capable of detecting sub-lethal physiological changes without depending on chemical identification. Conventional microscopy-based bioassays are limited by low throughput and difficulties in observing small, transparent and fast-moving microorganisms. This study applies a laser-biospeckle, non-imaging microbioassay to assess the motility responses of Paramecium caudatum and Euglena gracilis exposed to two organic pollutants, trichloroacetic acid (TCAA) and acephate. Dynamic speckle patterns were recorded using a 638 nm laser diode (Thorlabs Inc., Tokyo, Japan) and a CCD camera (Gazo Co., Ltd., Tokyo, Japan) at 60 fps for 120 s. Correlation time, derived from temporal cross-correlation analysis, served as a quantitative indicator of motility. Exposure to TCAA (0.1–50 mg/L) produced strong concentration-dependent inhibition, with correlation time increasing up to 16-fold at 500× PL in P. caudatum (p < 0.01), whereas E. gracilis showed a delayed response, with significant inhibition only above 250× PL. In contrast, acephate exposure (0.036–3.6 mg/L) induced motility enhancement in both species, reflected by decreases in correlation time of up to 57% in P. caudatum and 40% in E. gracilis at 100× PL. Acute trends diminished after 24–48 h, indicating time-dependent physiological adaptation. These results demonstrate that biospeckle-derived correlation time sensitively captures both inhibitory and stimulatory behavioral responses, enabling real-time, high-throughput water toxicity screening without microscopic imaging. The method shows strong potential for integration into automated water-quality monitoring systems.
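One common way to derive a correlation time from a speckle intensity series — compute the normalized temporal autocorrelation and take the first lag where it falls below 1/e — can be sketched as follows. The 1/e decay criterion and the single-pixel synthetic signal are simplifying assumptions, not necessarily the paper's exact procedure:

```python
# Hedged sketch: correlation time of an intensity time series, taken
# as the first lag where the normalized autocorrelation drops below
# 1/e. A real setup would average over many speckle pixels; here a
# synthetic AR(1) signal stands in for one pixel's intensity.
import math
import random

def autocorr(x, lag):
    """Normalized autocorrelation of series x at integer lag."""
    n = len(x) - lag
    mx = sum(x) / len(x)
    num = sum((x[i] - mx) * (x[i + lag] - mx) for i in range(n))
    den = sum((v - mx) ** 2 for v in x)
    return num / den

def correlation_time(x, max_lag=None):
    """First lag at which autocorrelation falls below 1/e."""
    max_lag = max_lag or len(x) // 2
    for lag in range(1, max_lag):
        if autocorr(x, lag) < 1 / math.e:
            return lag
    return max_lag

# Synthetic decaying-correlation signal (exponentially smoothed noise):
random.seed(0)
x, s = [], 0.0
for _ in range(2000):
    s = 0.9 * s + random.gauss(0, 1)
    x.append(s)
print(correlation_time(x))
```

Slower motility means slower speckle decorrelation, so inhibition shows up as a longer correlation time, matching the trends reported in the abstract.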
(This article belongs to the Special Issue Advancements in Optical Measurements and Sensing Technology)

33 pages, 894 KB  
Review
Impacts of Connected and Automated Driving: From Personal Acceptance to the Effects in Society: A Multi-Factor Review
by Nuria Herrero García, Nicoletta Matera, Michela Longo and Felipe Jiménez
Electronics 2026, 15(1), 27; https://doi.org/10.3390/electronics15010027 - 21 Dec 2025
Abstract
This systematic literature review explores the impacts of autonomous and connected mobility systems on sustainable road transportation. The evaluation process involves a multifaceted analysis, encompassing the assessment of their capacity to mitigate accidents, energy consumption, emissions, and urban traffic congestion. As a novel approach, this paper analyses the parameters of user acceptance of technology and how these are reflected in the overall impacts of automated and connected driving. Thus, based on a behavioral intention to use the new technology model, we aim to analyze the state of the art of the overall impacts that may be correlated with individual interests. To this end, a multi-factor approach is applied and potential interactions between factors that may arise are studied in a holistic and quantitative assessment of their combined effects on transportation systems. This impact assessment is a significant challenge, as numerous factors come into play, leading to conflicting effects. Since there is no significant penetration of vehicles with medium or high levels of automation, conclusions are often obtained through simulations or estimates based on hypotheses that must be considered when analyzing the results and can lead to significant dispersion. The results confirm that these technologies can substantially improve road safety, traffic efficiency, and environmental performance. However, their large-scale deployment will critically depend on the establishment of coherent regulatory frameworks, infrastructural readiness, and societal acceptance. Comprehensive stakeholder collaboration, incorporating industry, regulatory authorities, and society, is essential to successfully address existing concerns, facilitate technological integration, and maximize the societal benefits of these transformative mobility systems. Full article
(This article belongs to the Section Electrical and Autonomous Vehicles)

21 pages, 3674 KB  
Article
scSelector: A Flexible Single-Cell Data Analysis Assistant for Biomedical Researchers
by Xiang Gao, Peiqi Wu, Jiani Yu, Xueying Zhu, Shengyao Zhang, Hongxiang Shao, Dan Lu, Xiaojing Hou and Yunqing Liu
Genes 2026, 17(1), 2; https://doi.org/10.3390/genes17010002 - 19 Dec 2025
Abstract
Background: Standard single-cell RNA sequencing (scRNA-seq) analysis workflows face significant limitations, particularly the rigidity of clustering-dependent methods that can obscure subtle cellular heterogeneity and the potential loss of biologically meaningful cells during stringent quality control (QC) filtering. This study aims to develop scSelector (v1.0), an interactive software toolkit designed to empower researchers to flexibly select and analyze cell populations directly from low-dimensional embeddings, guided by their expert biological knowledge. Methods: scSelector was developed using Python, relying on core dependencies such as Scanpy (v1.9.0), Matplotlib (v3.4.0), and NumPy (v1.20.0). It integrates an intuitive lasso selection tool with backend analytical modules for differential expression and functional enrichment analysis. Furthermore, it incorporates Large Language Model (LLM) assistance via API integration (DeepSeek/Gemini) to provide automated, contextually informed cell-type and state prediction reports. Results: Validation across multiple public datasets demonstrated that scSelector effectively resolves functional heterogeneity within broader cell types, such as identifying distinct alpha-cell subpopulations with unique remodeling capabilities in pancreatic tissue. It successfully characterized rare populations, including platelets in PBMCs and extremely low-abundance endothelial cells in liver tissue (as few as 53 cells). Additionally, scSelector revealed that cells discarded by standard QC can represent biologically functional subpopulations, and it accurately dissected the states of outlier cells, such as proliferative NK cells. Conclusions: scSelector provides a flexible, researcher-centric platform that moves beyond the constraints of automated pipelines. 
By combining interactive selection with AI-assisted interpretation, it enhances the precision of scRNA-seq analysis and facilitates the discovery of novel cell types and complex cellular behaviors. Full article
(This article belongs to the Section Bioinformatics)
28 pages, 6434 KB  
Article
Mapping Cyber Bot Behaviors: Understanding Payload Patterns in Honeypot Traffic
by Shiyu Wang, Cheng Tu, Yunyi Zhang, Min Zhang and Pengfei Xue
Sensors 2026, 26(1), 11; https://doi.org/10.3390/s26010011 - 19 Dec 2025
Viewed by 498
Abstract
Cyber bots have become prevalent across the Internet ecosystem, making behavioral understanding essential for threat intelligence. Since bot behaviors are encoded in traffic payloads that blend with normal traffic, honeypot sensors are widely adopted for capture and analysis. However, previous works face adaptation challenges when analyzing large-scale, diverse payloads from evolving bot techniques. In this paper, we conduct an 11-month measurement study to map cyber bot behaviors through payload pattern analysis in honeypot traffic. We propose TrafficPrint, a pattern extraction framework to enable adaptable analysis of diverse honeypot payloads. TrafficPrint combines representation learning with clustering to automatically extract human-understandable payload patterns without requiring protocol-specific expertise. Our globally distributed honeypot sensors collected 21.5 M application-layer payloads. Starting from only 168 K labeled payloads (0.8% of data), TrafficPrint extracted 296 patterns that automatically labeled 83.57% of previously unknown payloads. Our pattern-based analysis reveals actionable threat intelligence: 82% of patterns employ semi-customized structures balancing automation with targeted modifications; 13% contain distinctive identity markers enabling threat actor attribution, including CENSYS’s unique signature; and bots exploit techniques like masquerading as crawlers, embedding commands in brute-force attacks, and using base64 encoding for detection evasion. Full article
(This article belongs to the Special Issue Privacy and Security in Sensor Networks)
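One evasion technique the abstract calls out, bots hiding commands behind base64 encoding, can be illustrated with a small heuristic. This is a sketch, not TrafficPrint's method: it simply scans a payload for plausible base64 runs and attempts to decode them, using only the Python standard library.

```python
import base64
import binascii
import re

def find_base64_commands(payload: str, min_len: int = 12):
    """Decode plausible base64 runs in a payload string.

    A heuristic sketch: any run of base64-alphabet characters of at least
    min_len that decodes cleanly to UTF-8 is reported. Real bot payloads
    would need stricter validation to limit false positives.
    """
    decoded = []
    for token in re.findall(r"[A-Za-z0-9+/=]{%d,}" % min_len, payload):
        try:
            text = base64.b64decode(token, validate=True).decode("utf-8")
        except (binascii.Error, UnicodeDecodeError):
            continue
        decoded.append(text)
    return decoded
```

For example, a shell one-liner such as `echo <b64> | base64 -d | sh` embedded in captured traffic would surface the hidden command, while ordinary HTTP request lines yield nothing.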
21 pages, 1304 KB  
Article
An Automated Machine Learning Framework for Interpretable Customer Segmentation in Financial Services
by Iveta Grigorova, Aleksandar Efremov and Aleksandar Karamfilov
Int. J. Financial Stud. 2025, 13(4), 243; https://doi.org/10.3390/ijfs13040243 - 17 Dec 2025
Viewed by 647
Abstract
Customer segmentation is essential in financial services for designing targeted interventions, managing dormant portfolios, and supporting marketing re-engagement strategies. Traditional approaches such as Recency–Frequency–Monetary (RFM) analysis offer interpretability but often lack the flexibility needed to capture heterogeneous behavioral patterns. This study presents an automated segmentation framework that integrates machine learning-based clustering with RFM-based interpretability benchmarks. KMeans and Hierarchical clustering are evaluated across multiple values of k using internal validity metrics (Silhouette Coefficient, Davies–Bouldin Index) and interpretability alignment measures (Adjusted Rand Index, Normalized Mutual Information, Homogeneity, Completeness, and V-Measure). The Hungarian algorithm is used to align machine-learned clusters with RFM segments for comparability. The framework reveals behavioral subgroups not captured by RFM alone, demonstrating that machine learning can expose hidden heterogeneity within dormant customer populations. While outcome-based financial validation is not yet feasible due to the cold-start nature of the deployment environment, the study provides a reproducible, scalable pipeline for segmentation that balances analytical rigor with business interpretability. The findings highlight how data-driven clustering can refine traditional segmentation logic, supporting more nuanced portfolio monitoring and re-engagement strategies in financial services. Full article
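The Hungarian-algorithm alignment step mentioned in the abstract, mapping machine-learned cluster ids onto RFM segment ids so the two labelings are comparable, can be sketched as follows. The toy labels below are invented for illustration; the paper's actual features and cluster counts are not reproduced here.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def align_clusters(ml_labels, rfm_labels, n):
    """Map each ML cluster id to the RFM segment id that maximizes overlap,
    via the Hungarian algorithm (linear_sum_assignment minimizes cost, so
    the overlap matrix is negated)."""
    overlap = np.zeros((n, n), dtype=int)
    for m, r in zip(ml_labels, rfm_labels):
        overlap[m, r] += 1
    rows, cols = linear_sum_assignment(-overlap)
    return dict(zip(rows.tolist(), cols.tolist()))

# Toy example: 3 ML clusters whose ids are a permutation of 3 RFM segments.
ml_labels  = [0, 0, 1, 1, 2, 2, 0, 1]
rfm_labels = [2, 2, 0, 0, 1, 1, 2, 0]
mapping = align_clusters(ml_labels, rfm_labels, 3)
print(mapping)  # {0: 2, 1: 0, 2: 1}
```

After relabeling through such a mapping, agreement measures like the Adjusted Rand Index or Homogeneity can be read off a directly comparable confusion matrix.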