
Information Processing in Complex Biological Systems

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Complexity".

Deadline for manuscript submissions: 31 August 2025

Special Issue Editor


Prof. Dr. Young-Seok Choi
Guest Editor
Department of Electronics and Communications Engineering, Kwangwoon University, Seoul 01897, Republic of Korea
Interests: neural engineering; brain–computer interface; neural and biomedical signal processing; information-theoretic machine learning

Special Issue Information

Dear Colleagues,

This Special Issue focuses on theoretical and experimental approaches to understanding information processing in complex biological and artificial systems, with particular emphasis on entropy-based analysis and optimization. We welcome contributions that advance our understanding of how biological and artificial systems encode, process, and transmit information, especially on the following topics:

  1. Information theory applications in biological and artificial neural systems;
  2. Knowledge distillation and model compression techniques in neural networks;
  3. Entropy-based approaches to understanding system complexity and efficiency;
  4. Information processing optimization in resource-constrained environments;
  5. Bio-inspired approaches to efficient information processing;
  6. Computational modeling of information flow in complex systems;
  7. Low-rank and sparse representations in biological and artificial systems;
  8. Multi-scale information integration and processing;
  9. Energy-efficient information processing in biological and artificial systems;
  10. Quantitative measures of information compression and preservation.

We welcome both original research articles and comprehensive reviews covering these topics. Of particular interest are studies that:

  • Investigate information-theoretic approaches to system optimization;
  • Explore novel methods for efficient information processing;
  • Examine parallels between biological and artificial information processing systems;
  • Address resource constraints in information processing systems.

Prof. Dr. Young-Seok Choi
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information theory
  • complex biological or adaptive systems
  • knowledge distillation
  • model compression
  • neural networks
  • information entropy
  • system optimization
  • computational efficiency
  • low-rank representations
  • resource-constrained computing
  • biological information processing
  • artificial intelligence

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (1 paper)


Research

16 pages, 2258 KiB  
Article
Lightweight Pre-Trained Korean Language Model Based on Knowledge Distillation and Low-Rank Factorization
by Jin-Hwan Kim and Young-Seok Choi
Entropy 2025, 27(4), 379; https://doi.org/10.3390/e27040379 - 2 Apr 2025
Abstract
Natural Language Processing (NLP) stands at the forefront of artificial intelligence research, empowering computational systems to comprehend and process human language as used in everyday contexts. Language models (LMs) underpin this field, striving to capture the intricacies of linguistic structure and semantics by assigning probabilities to sequences of words. The trend towards large language models (LLMs) has shown significant performance improvements with increasing model size. However, deploying LLMs on resource-limited devices such as mobile and edge devices remains a challenge. This issue is particularly pronounced in languages other than English, including Korean, where pre-trained models are relatively scarce. Addressing this gap, we introduce a novel lightweight pre-trained Korean language model that leverages knowledge distillation and low-rank factorization techniques. Our approach distills knowledge from a 432 MB (approximately 110 M parameters) teacher model into student models of substantially reduced sizes (e.g., 53 MB ≈ 14 M parameters, 35 MB ≈ 13 M parameters, 30 MB ≈ 11 M parameters, and 18 MB ≈ 4 M parameters). The smaller student models further employ low-rank factorization to minimize the parameter count within the Transformer’s feed-forward network (FFN) and embedding layer. We evaluate the efficacy of our lightweight model across six established Korean NLP tasks. Notably, our most compact model, KR-ELECTRA-Small-KD, attains over 97.387% of the teacher model’s performance despite an 8.15× reduction in size. Remarkably, on the NSMC sentiment classification benchmark, KR-ELECTRA-Small-KD surpasses the teacher model with an accuracy of 89.720%. These findings underscore the potential of our model as an efficient solution for NLP applications in resource-constrained settings.
(This article belongs to the Special Issue Information Processing in Complex Biological Systems)
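
The abstract above combines two standard compression techniques: knowledge distillation from a larger teacher model and low-rank factorization of the Transformer's FFN and embedding weights. The following is a minimal, hypothetical PyTorch sketch of both ideas, not the authors' implementation; the layer sizes, rank, temperature, and loss weighting are illustrative assumptions only.

```python
# Illustrative sketch (not the paper's code) of (a) low-rank factorization of
# a Transformer FFN weight and (b) a soft-label knowledge-distillation loss.
# All dimensions and hyperparameters below are assumed for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LowRankLinear(nn.Module):
    """Replace a d_in x d_out weight with two rank-r factors,
    cutting parameters from d_in*d_out to roughly r*(d_in + d_out)."""
    def __init__(self, d_in: int, d_out: int, rank: int):
        super().__init__()
        self.down = nn.Linear(d_in, rank, bias=False)  # d_in -> r
        self.up = nn.Linear(rank, d_out)               # r -> d_out

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.up(self.down(x))

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature: float = 2.0, alpha: float = 0.5):
    """Blend hard-label cross-entropy with a softened teacher/student KL term."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2  # standard scale correction for the softened targets
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example: a rank-64 factorization of a 768 -> 3072 FFN projection stores
# 64*(768+3072) ≈ 246K weights instead of 768*3072 ≈ 2.36M.
ffn_in = LowRankLinear(768, 3072, rank=64)
```

In the setting the abstract describes, factorizations like this shrink the student's parameter count, while the distillation loss transfers the teacher's soft predictions during training.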