Data Science

A topical collection in Encyclopedia (ISSN 2673-8392). This collection belongs to the section "Mathematics & Computer Science".

Viewed by 15333

Editors


Prof. Dr. Kamran Munir
Collection Editor
Department of Computer Science and Creative Technologies, University of the West of England, Bristol BS16 1QY, UK
Interests: data science; big data analytics; artificial intelligence; cloud computing; project management

Prof. Dr. José Raúl Romero
Collection Editor
Department of Computer Science and Numerical Analysis, School of Engineering Sciences, University of Cordoba, Cordoba, Spain
Interests: data science; search-based software engineering; intelligent systems; model-driven engineering

Prof. Dr. Khalid Hafeez
Collection Editor
VC Secretariat, Balochistan University of Information Technology Engineering and Management Sciences, Quetta 87300, Pakistan
Interests: digital transformation; knowledge management; strategic management; supply chain management; Industry 4.0

Topical Collection Information

Dear Colleagues,

The Topical Collection Data Science has helped us to develop interdisciplinary linkages between the computer, statistics, mathematics, information, and intelligence sciences, and it has fostered cross-domain interactions between academia and industry in data science and big data analytics. The collection welcomes contributions related, but not limited, to the following topics of interest:

  • Data science and analytical methods;
  • The machine learning foundations of data science;
  • Infrastructures, tools and systems focusing on data processing and analytics;
  • Real-world data science applications and case studies;
  • Learning from data with domain knowledge;
  • Emerging data science applications;
  • Human-centric data science;
  • Data science for the next digital frontier (telecommunications and 5G, predictive maintenance, sustainability and the environment, etc.);
  • Systems for practical applications of data science, data analytics and applied machine learning, demonstrating real-world impact;
  • Solutions or advances towards understanding the issues related to deploying data science technologies/solutions in the real world;
  • Processes and methodologies related to data science.

Prof. Dr. Kamran Munir
Prof. Dr. José Raúl Romero
Prof. Dr. Khalid Hafeez
Collection Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the collection website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Encyclopedia is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1000 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • data science
  • big data management
  • applied machine learning
  • predictive analytics
  • data with domain knowledge

Published Papers (3 papers)

2024

24 pages, 610 KiB  
Review
Optimisation of Small-Scale Aquaponics Systems Using Artificial Intelligence and the IoT: Current Status, Challenges, and Opportunities
by Abdul Aziz Channa, Kamran Munir, Mark Hansen and Muhammad Fahim Tariq
Encyclopedia 2024, 4(1), 313-336; https://doi.org/10.3390/encyclopedia4010023 - 8 Feb 2024
Cited by 5 | Viewed by 3875
Abstract
Environmental changes, water scarcity, soil depletion, and urbanisation are making it harder to produce food using traditional methods in various regions and countries. Aquaponics is emerging as a sustainable food production system that produces fish and plants in a closed-loop system. Aquaponics is not dependent on soil or external environmental factors. It uses fish waste to fertilise plants and can save up to 90–95% of water. Aquaponics is an innovative and promising system for growing food, but it has its challenges. It is a complex ecosystem that requires multidisciplinary knowledge and proper monitoring of all crucial parameters, and it involves high maintenance and initial investment costs. Artificial intelligence (AI) and the Internet of Things (IoT) are key technologies that can overcome these challenges. Numerous recent studies focus on the use of AI and the IoT to automate the process, improve efficiency and reliability, provide better management, and reduce operating costs. However, these studies often address only limited aspects of the system, each considering different domains and parameters of the aquaponics system. This paper aims to consolidate the existing work, identify the state-of-the-art use of the IoT and AI, explore the key parameters affecting growth, analyse the sensing and communication technologies employed, highlight the research gaps in this field, and suggest future research directions. Based on the reviewed research, energy efficiency and economic viability were found to be major bottlenecks of current systems. Moreover, inconsistencies in sensor selection, a lack of publicly available data, and the limited reproducibility of existing work were common issues among the studies.
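As a minimal sketch of the kind of rule-based monitoring that such AI/IoT systems automate (this example is not from the paper; the parameter names and thresholds are illustrative assumptions), a Python fragment checking sensor readings against safe ranges might look as follows:

    # Illustrative only: commonly monitored aquaponics parameters with assumed safe ranges.
    SAFE_RANGES = {"ph": (6.0, 7.5), "water_temp_c": (18.0, 30.0)}

    def check_reading(parameter: str, value: float) -> str:
        # Flag any sensor reading that falls outside its assumed safe range.
        low, high = SAFE_RANGES[parameter]
        if value < low or value > high:
            return f"ALERT: {parameter}={value} is outside the safe range {low}-{high}"
        return f"OK: {parameter}={value}"

    print(check_reading("ph", 5.4))            # ALERT: ph=5.4 is outside the safe range 6.0-7.5
    print(check_reading("water_temp_c", 24.0)) # OK: water_temp_c=24.0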

2023

11 pages, 618 KiB  
Entry
Large Language Models and Logical Reasoning
by Robert Friedman
Encyclopedia 2023, 3(2), 687-697; https://doi.org/10.3390/encyclopedia3020049 - 30 May 2023
Cited by 1 | Viewed by 4695
Definition
In deep learning, large language models are typically trained on data from a corpus as representative of current knowledge. However, natural language is not an ideal form for the reliable communication of concepts. Instead, formal logical statements are preferable, since they are subject to verifiability, reliability, and applicability. Another reason for this preference is that natural language is not designed for an efficient and reliable flow of information and knowledge, but instead arose as an evolutionary adaptation shaped by a prior set of natural constraints. As a formally structured language, logical statements are also more interpretable. They may be informally constructed in the form of a natural language statement, but a formalized logical statement is expected to follow a stricter set of rules, such as the use of symbols to represent the logic-based operators that connect multiple simple statements into verifiable propositions.
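As a minimal illustration of the contrast the entry draws (this example is not taken from the entry), the classical syllogism "all humans are mortal; Socrates is human; therefore Socrates is mortal" can be written as a formal first-order statement:

    \forall x\,(H(x) \rightarrow M(x)),\; H(s) \;\vdash\; M(s)

Here H(x) and M(x) abbreviate "x is human" and "x is mortal", s denotes Socrates, \forall and \rightarrow are the logic-based operators connecting the simple statements, and \vdash records that the conclusion is derivable from the premises, giving a proposition whose validity can be checked mechanically.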

7 pages, 346 KiB  
Entry
Tokenization in the Theory of Knowledge
by Robert Friedman
Encyclopedia 2023, 3(1), 380-386; https://doi.org/10.3390/encyclopedia3010024 - 20 Mar 2023
Cited by 6 | Viewed by 4107
Definition
Tokenization is a procedure for recovering the elements of interest in a sequence of data. This term is commonly used to describe an initial step in the processing of programming languages, and also for the preparation of input data in the case of artificial neural networks; however, it is a generalizable concept that applies to reducing a complex form to its basic elements, whether in the context of computer science or in natural processes. In this entry, the general concept of a token and its attributes are defined, along with its role in different contexts, such as deep learning methods. Included here are suggestions for further theoretical and empirical analysis of tokenization, particularly regarding its use in deep learning, as it is a rate-limiting step and a possible bottleneck when the results do not meet expectations.
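As a minimal sketch of the general concept (not code from the entry), the following Python fragment recovers word and punctuation tokens from a character sequence using an assumed regular-expression rule; deep learning pipelines typically rely on learned subword vocabularies instead:

    import re

    def tokenize(text: str) -> list[str]:
        # Recover the elements of interest (here, words and punctuation marks)
        # from a raw character sequence; the pattern is an illustrative assumption.
        return re.findall(r"\w+|[^\w\s]", text)

    print(tokenize("Tokenization reduces a complex form to its basic elements."))
    # ['Tokenization', 'reduces', 'a', 'complex', 'form', 'to', 'its', 'basic', 'elements', '.']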
