Search Results (98)

Search Parameters:
Keywords = No-SQL database

27 pages, 2223 KB  
Article
Off-the-Shelf AAL—A Practical Approach to Face the Population Shift
by Gerhard Leitner
Appl. Sci. 2026, 16(5), 2251; https://doi.org/10.3390/app16052251 - 26 Feb 2026
Viewed by 249
Abstract
Although the concept of Active and Assisted Living (AAL) has been a prominent topic in academia and in industry for decades, the widespread adoption of related technologies remains well below expectations. The underlying causes are multifaceted. The installation and retrofitting of such systems typically require substantial financial investments, significant manual effort, and specialized expertise for setup and maintenance. Existing solutions lack flexibility and are difficult to tailor to the individual living situations and diverse needs of the primary target group, older adults. While state-of-the-art smart home platforms would, in principle, be capable of supporting a broad range of AAL functionalities and could be adapted to different usage contexts, much of the research in this domain has been conducted in artificial settings, such as laboratory environments or model houses, conditions that fail to fully capture the complexity and variability of real-world living environments of the elderly population. In this paper, we explore the potential, opportunities, and limitations of integrating low-cost hardware with open-source software components in residential environments representative of older adults’ everyday lives. Our work is based on a longitudinal case study conducted over several years in an actual household, focusing on delivering fundamental AAL functionality. By documenting the iterative development and real-world deployment of the system, this study offers practical insights into the feasibility and challenges of implementing on-site AAL support under realistic conditions. Full article
(This article belongs to the Special Issue Human Activity Recognition (HAR) in Healthcare, 3rd Edition)

23 pages, 1201 KB  
Article
Comparative Read Performance Analysis of PostgreSQL and MongoDB in E-Commerce: An Empirical Study of Filtering and Analytical Queries
by Jovita Urnikienė, Vaida Steponavičienė and Svetoslav Atanasov
Big Data Cogn. Comput. 2026, 10(2), 66; https://doi.org/10.3390/bdcc10020066 - 19 Feb 2026
Viewed by 996
Abstract
This paper presents a comparative analysis of read performance for PostgreSQL and MongoDB in e-commerce scenarios, using identical datasets in a resource-constrained single-host environment. The results demonstrate that PostgreSQL executes complex analytical queries 1.6–15.1 times faster, depending on the query type and data volume. The study employed synthetic data generation with the Faker library across three stages, processing up to 300,000 products and executing each of 6 query types 15 times. Both filtering and analytical queries were tested on non-indexed data in a controlled localhost environment with PostgreSQL 17.5 and MongoDB 7.0.14, using default configurations. PostgreSQL showed 65–80% shorter execution times for multi-criteria queries, while MongoDB required approximately 33% less disk space. These findings suggest that normalized relational schemas are advantageous for transactional e-commerce systems where analytical queries dominate the workload. The results are directly applicable to small and medium e-commerce developers operating in budget-constrained, single-host deployment environments when choosing between relational and document-oriented databases for structured transactional data with read-heavy analytical workloads. A minimal indexed validation confirms that the baseline trends remain consistent under a simple indexing configuration. Future work will examine broader indexing strategies, write-intensive workloads, and distributed deployment scenarios. Full article
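The study's contrast between relational and document-oriented query styles can be made concrete with a sketch. Everything below is invented for illustration (the `products` fields, the sample documents, and the pipeline are not from the paper), and the MongoDB-style aggregation is evaluated in plain Python rather than against a live server:

```python
# The same multi-criteria analytical intent, expressed two ways.

# Relational form: one SQL statement over a normalized "products" table.
sql = """
SELECT category, AVG(price) AS avg_price, COUNT(*) AS n
FROM products
WHERE price BETWEEN 10 AND 100 AND stock > 0
GROUP BY category
ORDER BY avg_price DESC;
"""

# Document form: the equivalent MongoDB aggregation pipeline.
pipeline = [
    {"$match": {"price": {"$gte": 10, "$lte": 100}, "stock": {"$gt": 0}}},
    {"$group": {"_id": "$category",
                "avg_price": {"$avg": "$price"},
                "n": {"$sum": 1}}},
    {"$sort": {"avg_price": -1}},
]

docs = [
    {"category": "books", "price": 20, "stock": 3},
    {"category": "books", "price": 40, "stock": 1},
    {"category": "toys",  "price": 500, "stock": 9},  # filtered out: price
    {"category": "games", "price": 15, "stock": 0},   # filtered out: stock
]

# Pure-Python stand-in for the $match/$group/$sort stages, to show the flow.
matched = [d for d in docs if 10 <= d["price"] <= 100 and d["stock"] > 0]
groups = {}
for d in matched:
    groups.setdefault(d["category"], []).append(d["price"])
result = sorted(
    ({"_id": c, "avg_price": sum(p) / len(p), "n": len(p)}
     for c, p in groups.items()),
    key=lambda r: -r["avg_price"],
)
print(result)  # [{'_id': 'books', 'avg_price': 30.0, 'n': 2}]
```

The paper's finding is that PostgreSQL executes the first form substantially faster than MongoDB executes the second on non-indexed data; the sketch only shows what "multi-criteria filtering plus aggregation" means structurally.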

26 pages, 1629 KB  
Article
Performance Evaluation of MongoDB and RavenDB in IIoT-Inspired Data-Intensive Mobile and Web Applications
by Mădălina Ciumac, Cornelia Aurora Győrödi, Robert Ștefan Győrödi and Felicia Mirabela Costea
Future Internet 2026, 18(1), 57; https://doi.org/10.3390/fi18010057 - 20 Jan 2026
Cited by 1 | Viewed by 603
Abstract
The exponential growth of data generated by modern digital applications, including systems inspired by Industrial Internet of Things (IIoT) requirements, has accelerated the adoption of NoSQL databases due to their scalability, flexibility, and performance advantages over traditional relational systems. Among document-oriented solutions, MongoDB and RavenDB stand out due to their architectural features and their ability to manage dynamic, large-scale datasets. This paper presents a comparative analysis of MongoDB and RavenDB, focusing on the performance of fundamental CRUD (Create, Read, Update, Delete) operations. To ensure a controlled performance evaluation, a mobile and web application for managing product orders was implemented as a case study inspired by IIoT data characteristics, such as high data volume and frequent transactional operations, with experiments conducted on datasets ranging from 1000 to 1,000,000 records. Beyond the core CRUD evaluation, the study also investigates advanced operational scenarios, including join processing strategies (lookup versus document inclusion), bulk data ingestion techniques, aggregation performance, and full-text search capabilities. These complementary tests provide deeper insight into the systems’ architectural strengths and their behavior under more complex and data-intensive workloads. The experimental results highlight MongoDB’s consistent performance advantage in terms of response time, particularly with large data volumes, while RavenDB demonstrates competitive behavior and offers additional benefits such as built-in ACID compliance, automatic indexing, and optimized mechanisms for relational retrieval and bulk ingestion. The analysis does not propose a new benchmarking methodology but provides practical insights for selecting an appropriate document-oriented database for data-intensive mobile and web application contexts, including IIoT-inspired data characteristics, based on a controlled single-node experimental setting, while acknowledging the limitations of that single-host environment. Full article

29 pages, 2803 KB  
Article
Benchmarking SQL and NoSQL Persistence in Microservices Under Variable Workloads
by Nenad Pantelic, Ljiljana Matic, Lazar Jakovljevic, Stefan Eric, Milan Eric, Miladin Stefanović and Aleksandar Djordjevic
Future Internet 2026, 18(1), 53; https://doi.org/10.3390/fi18010053 - 15 Jan 2026
Viewed by 953
Abstract
This paper presents a controlled comparative evaluation of SQL and NoSQL persistence mechanisms in containerized microservice architectures under variable workload conditions. Three persistence configurations—SQL with indexing, SQL without indexing, and a document-oriented NoSQL database, including supplementary hybrid SQL variants used for robustness analysis—are assessed across read-dominant, write-dominant, and mixed workloads, with concurrency levels ranging from low to high contention. The experimental setup is fully containerized and executed in a single-node environment to isolate persistence-layer behavior and ensure reproducibility. System performance is evaluated using multiple metrics, including percentile-based latency (p95), throughput, CPU utilization, and memory consumption. The results reveal distinct performance trade-offs among the evaluated configurations, highlighting the sensitivity of persistence mechanisms to workload composition and concurrency intensity. In particular, indexing strategies significantly affect read-heavy scenarios, while document-oriented persistence demonstrates advantages under write-intensive workloads. The findings emphasize the importance of workload-aware persistence selection in microservice-based systems and support the adoption of polyglot persistence strategies. Rather than providing absolute performance benchmarks, the study focuses on comparative behavioral trends that can inform architectural decision-making in practical microservice deployments. Full article
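The p95 latency metric the study reports is worth making concrete. A minimal sketch with synthetic latency samples, using the nearest-rank definition of a percentile (real benchmark harnesses may interpolate instead):

```python
# Percentile-based latency (p95): the value below which 95% of requests fall.
# Unlike the mean, it exposes tail latency under contention.

def percentile(samples, p):
    """Nearest-rank percentile: smallest value covering p% of samples."""
    ordered = sorted(samples)
    rank = max(1, -(-p * len(ordered) // 100))  # ceil(p/100 * n)
    return ordered[rank - 1]

# Synthetic request latencies in ms; the two outliers model a contended tail.
latencies_ms = [12, 15, 11, 14, 250, 13, 16, 12, 18, 900]
p95 = percentile(latencies_ms, 95)
print(p95)  # 900 -- the mean (~126 ms) would hide how bad the tail is
```

This is why percentile metrics, rather than averages, drive the workload-sensitivity conclusions in benchmarks like this one.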

24 pages, 9146 KB  
Article
A Model for a Serialized Set-Oriented NoSQL Database Management System
by Alexandru-George Șerban and Alexandru Boicea
Information 2026, 17(1), 84; https://doi.org/10.3390/info17010084 - 13 Jan 2026
Viewed by 1135
Abstract
Recent advancements in data management highlight the increasing focus on large-scale integration and analytics, with the management of duplicate information becoming a more resource-intensive and costly task. Existing SQL and NoSQL systems inadequately address the semantic constraints of set-based data, either by compromising relational fidelity or through inefficient deduplication mechanisms. This paper presents a set-oriented centralized NoSQL database management system (DBMS) that enforces uniqueness by construction, thereby reducing downstream deduplication and enhancing result determinism. The system utilizes in-memory execution with binary serialized persistence, achieving O(1) time complexity for exact-match CRUD operations while maintaining ACID-compliant transactional semantics through explicit commit operations. A comparative performance evaluation against Redis and MongoDB highlights the trade-offs between consistency guarantees and latency. The results reveal that enforced set uniqueness completely eliminates duplicates, incurring only moderate latency trade-offs compared to in-memory performance measures. The model can be extended for fuzzy queries and imprecise data by retrieving the membership function information. This work demonstrates that the set-oriented DBMS design represents a distinct architectural paradigm that addresses data integrity constraints inadequately handled by contemporary database systems. Full article
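The idea of uniqueness enforced by construction, combined with explicit commit semantics, can be illustrated with a toy in-memory store. This is a hypothetical sketch, not the paper's system: the names are invented, and binary serialized persistence, isolation, and recovery are omitted:

```python
# Toy set-oriented store: duplicates cannot exist by construction, exact-match
# operations are O(1) average case, and writes become durable only on commit().

class SetStore:
    def __init__(self):
        self._committed = set()
        self._staged_add = set()
        self._staged_del = set()

    def insert(self, item):
        # A duplicate insert is silently absorbed: uniqueness by construction.
        self._staged_add.add(item)
        self._staged_del.discard(item)

    def delete(self, item):
        self._staged_del.add(item)
        self._staged_add.discard(item)

    def contains(self, item):
        # Exact-match read, O(1) average; staged changes shadow committed state.
        if item in self._staged_del:
            return False
        return item in self._staged_add or item in self._committed

    def commit(self):
        # Explicit transaction boundary: apply staged changes atomically.
        self._committed |= self._staged_add
        self._committed -= self._staged_del
        self._staged_add.clear()
        self._staged_del.clear()

store = SetStore()
store.insert("a"); store.insert("a"); store.insert("b")
store.commit()
print(sorted(store._committed))  # ['a', 'b'] -- the duplicate never existed
```

The point of the design, as the abstract argues, is that no downstream deduplication pass is ever needed: the second `insert("a")` is a no-op rather than a row to clean up later.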
(This article belongs to the Section Information Systems)

10 pages, 960 KB  
Proceeding Paper
SQL and NoSQL Databases: A Comparative Study with Perspectives on IA-Based Migration Approach
by Soukayna Abdellaoui, Wafae Abbaoui, Loubna Meziane, Brahim El Bhiri and Soumia Ziti
Eng. Proc. 2025, 112(1), 72; https://doi.org/10.3390/engproc2025112072 - 20 Nov 2025
Viewed by 4091
Abstract
Traditional relational databases have been widely used for many years and have been the go-to choice for the industry. However, the emergence of NoSQL databases coincided with the increasing popularity of the Internet, social networks, and cloud computing. Data migration has become a crucial topic due to the rapid growth of applications and the ever-expanding amount of data being collected. This has led to a shift from relational database management systems (RDBMS) to NoSQL (“Not Only SQL”) databases. This article compares the characteristics of both database architectures, SQL and NoSQL, focusing on aspects such as scalability and performance, flexibility, and security. Additionally, the role of AI in streamlining the migration process is explored. Full article

24 pages, 1442 KB  
Article
Enhancing Student Motivation and Competencies via the WWH Teaching Method: A Case Study on the NoSQL Database Course
by Bin Yu, Yihong Liu, Yuhui Fan, Shaohua Liu, Xiaoyan Li and Ruoyu Li
Electronics 2025, 14(22), 4453; https://doi.org/10.3390/electronics14224453 - 14 Nov 2025
Viewed by 600
Abstract
NoSQL databases are vital for modern big data applications, yet traditional teaching methods struggle with lagging content, insufficient practice, and low student engagement. To address these issues, this paper proposes the WWH-integrated teaching method “Why learn, What learn, How learn” for a NoSQL database course. WWH combines three core approaches: the general–special method, which structures knowledge from foundational concepts to specialized technologies; the comparative method, which contextualizes NoSQL value via real-scenario analysis; and the theory–practice combination method, which links concepts to hands-on tasks, supplemented by the problem-guidance and key-highlighting strategies. A quasi-experiment with two cohorts (80 students each; 2023 cohort as control, 2024 as experimental) validated WWH. Quantitative results showed significant improvements: theoretical exam scores rose by 9.2 points (t(158) = 9.21, p < 0.001) and experimental scores by 10.3 points (t(158) = 7.92, p < 0.001), and classroom discussion rates increased from 45.2% to 82.7% (χ2(1) = 28.90, p < 0.001). Qualitative analysis of student essays and project reports further confirmed deeper conceptual understanding, stronger tradeoff awareness, and enhanced knowledge integration in the experimental cohort. This study provides an evidence-based, student-centered framework for modernizing NoSQL instruction, better preparing students for industry data management needs. Full article

16 pages, 832 KB  
Proceeding Paper
Leveraging MongoDB in Real-Time Emotion Recognition from Video for Scalable and Efficient Data Handling
by Haikal Muhammad Kurniawan, Muhammad Maulidan, Muhammad Faisal Zulmaulidin and Alun Sujjada
Eng. Proc. 2025, 107(1), 84; https://doi.org/10.3390/engproc2025107084 - 11 Sep 2025
Viewed by 708
Abstract
Real-time emotion recognition from video poses significant challenges in handling large-scale and continuously growing datasets. Traditional relational databases often fail to meet the scalability and efficiency requirements of such applications. MongoDB, a NoSQL database, offers significant advantages in scalability, speed, and data management, making it an ideal choice for video-based emotion recognition systems. This paper explores the use of MongoDB to optimize the management of video data in real-time emotion recognition, leveraging its features like sharding, indexing, and replication. We demonstrate how MongoDB’s advanced features enhance the performance and reliability of emotion recognition systems by reducing latency and processing time. Through experimental results, we show that MongoDB outperforms traditional relational databases and other NoSQL solutions in handling large datasets efficiently. Future work will explore integrating MongoDB with cloud platforms to improve scalability and incorporate advanced deep learning algorithms for better emotion recognition accuracy. Full article

34 pages, 7582 KB  
Article
Proposed SmartBarrel System for Monitoring and Assessment of Wine Fermentation Processes Using IoT Nose and Tongue Devices
by Sotirios Kontogiannis, Meropi Tsoumani, George Kokkonis, Christos Pikridas and Yorgos Kotseridis
Sensors 2025, 25(13), 3877; https://doi.org/10.3390/s25133877 - 21 Jun 2025
Cited by 4 | Viewed by 4509
Abstract
This paper introduces SmartBarrel, an innovative IoT-based sensory system that monitors and forecasts wine fermentation processes. At the core of SmartBarrel are two compact, attachable devices—the probing nose (E-nose) and the probing tongue (E-tongue), which mount directly onto stainless steel wine tanks. These devices periodically measure key fermentation parameters: the nose monitors gas emissions, while the tongue captures acidity, residual sugar, and color changes. Both utilize low-cost, low-power sensors validated through small-scale fermentation experiments. Beyond the sensory hardware, SmartBarrel includes a robust cloud infrastructure built on open-source Industry 4.0 tools. The system leverages the ThingsBoard platform, supported by a NoSQL Cassandra database, to provide real-time data storage, visualization, and mobile application access. The system also supports adaptive breakpoint alerts and real-time adjustment to the nonlinear dynamics of wine fermentation. The authors developed a novel deep learning model called V-LSTM (Variable-length Long Short-Term Memory) to enable predictive analytics. This auto-calibrating architecture supports variable layer depths and cell configurations, enabling accurate forecasting of fermentation metrics. Moreover, the system includes two fuzzy logic modules: a device-level fuzzy controller to estimate alcohol content based on sensor data and a fuzzy encoder that synthetically generates fermentation profiles using a limited set of experimental curves. Experimental results validate SmartBarrel’s ability to monitor fermentation parameters. Additionally, the implemented models show that the V-LSTM model outperforms existing neural network classifiers and regression models, reducing RMSE loss by at least 45%. Furthermore, the fuzzy alcohol predictor achieved a coefficient of determination (R2) of 0.87, enabling reliable alcohol content estimation without direct alcohol sensing. Full article
(This article belongs to the Special Issue Applications of Sensors Based on Embedded Systems)

19 pages, 7037 KB  
Article
An Artificial Intelligence Home Monitoring System That Uses CNN and LSTM and Is Based on the Android Studio Development Platform
by Guo-Ming Sung, Sachin D. Kohale, Te-Hui Chiang and Yu-Jie Chong
Appl. Sci. 2025, 15(3), 1207; https://doi.org/10.3390/app15031207 - 24 Jan 2025
Cited by 2 | Viewed by 1673
Abstract
This paper presents an artificial intelligence home environment monitoring system developed using the Android Studio platform. A database was constructed within a server to store sensor data. The proposed system comprises multiple sensors, a message queueing telemetry transport (MQTT) communication protocol, cloud data storage and computation, and end device control. A mobile application was developed using MongoDB, a document-oriented NoSQL database management system written in C++ that is well suited to processing large volumes of sensor data. The k-nearest neighbor (KNN) algorithm was used to impute missing data. Node-RED development software was used within the server as a data-receiving, storage, and computing environment that is convenient to manage and maintain. Data on indoor temperature, humidity, and carbon dioxide concentrations are transmitted to a mobile phone application through the MQTT communication protocol for real-time display and monitoring. The system can control a fan or warning light through the mobile application to maintain ambient temperature inside the house and to warn users of emergencies. A long short-term memory (LSTM) model and a convolutional neural network (CNN) model were used to predict indoor temperature, humidity, and carbon dioxide concentrations. Average relative errors in the predicted values of humidity and carbon dioxide concentration were approximately 0.0415% and 0.134%, respectively, for data storage using the KNN algorithm. For indoor temperature prediction, the LSTM model had a mean absolute percentage error of 0.180% and a root-mean-squared error of 0.042 °C. The CNN–LSTM model had a mean absolute percentage error of 1.370% and a root-mean-squared error of 0.117 °C. Full article
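The KNN imputation step described above can be sketched in a few lines. The readings, the single-feature distance, and k = 2 are all invented for illustration; the paper's actual features and parameters may differ:

```python
# KNN-style imputation of a missing sensor value: average the value over the
# k rows whose known feature (here, temperature) is closest to the target row.

readings = [  # (temperature, humidity); None marks the value to impute
    (21.0, 45.0),
    (22.0, 47.0),
    (25.0, 60.0),
    (21.5, None),
]

def impute_knn(rows, row_idx, col, k=2):
    target = rows[row_idx]
    # Rank candidate rows by distance on the feature that is present.
    neighbors = sorted(
        (r for i, r in enumerate(rows) if i != row_idx and r[col] is not None),
        key=lambda r: abs(r[0] - target[0]),
    )[:k]
    # Impute as the mean of the k nearest neighbors' known values.
    return sum(r[col] for r in neighbors) / len(neighbors)

humidity = impute_knn(readings, 3, 1, k=2)
print(humidity)  # 46.0 -- mean of the two temperature-nearest rows
```

A production pipeline would normalize features and use all available columns in the distance, but the mechanism is the same: fill the gap from the rows that look most similar.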

33 pages, 27167 KB  
Article
Enhancing Supply Chain Efficiency in India: A Sustainable Framework to Minimize Wastage Through Authentication and Contracts
by S. Mahaboob Hussain, Akula Balakrishna, K. T. Narasimha Naidu, Prakash Pareek, Nishit Malviya and Manuel J. C. S. Reis
Sustainability 2025, 17(3), 808; https://doi.org/10.3390/su17030808 - 21 Jan 2025
Cited by 6 | Viewed by 4089
Abstract
A recent study by the Food and Agriculture Organization (FAO) reveals that approximately 40% of India’s food production is wasted each year due to inefficient and fragmented supply chain management. An effective supply chain is essential to ensure the efficient use of resources and to maximize profitability. A platform is required to enhance communication, facilitate contract agreements, and streamline collaboration among stakeholders to reach their target destinations. In some cases, suppliers of raw materials struggle to find buyers, which creates gaps in the system. To address these issues, we have developed a unique methodology. We designed a platform that connects all stakeholders in the supply chain through a transparent, two-way communication channel, while ensuring high levels of authenticity and trust. Additionally, we created a comprehensive framework that includes profiles for farmers, retailers, manufacturers, and distributors. This application aims to bridge communication gaps and improve the overall efficiency of the system. To ensure the application’s integrity with the database, we tested various databases for optimal performance and authentication. During our evaluation, we observed that the Postgres database system accounted for 48.3% of the total processing time, with a latency of 10,941.3 milliseconds, whereas PgBouncer accounted for 5.4% of the time consumed and required 1230.75 milliseconds before data processing started. Considering authentication as a parameter, we also evaluated the Firebase (NoSQL) database for our application. It optimizes database operations to enhance the platform’s speed and efficiency, which has the potential to improve performance in the supply chain management system. Full article

26 pages, 2692 KB  
Article
Automated Research Review Support Using Machine Learning, Large Language Models, and Natural Language Processing
by Vishnu S. Pendyala, Karnavee Kamdar and Kapil Mulchandani
Electronics 2025, 14(2), 256; https://doi.org/10.3390/electronics14020256 - 9 Jan 2025
Cited by 7 | Viewed by 5956
Abstract
Research expands the boundaries of a subject, economy, and civilization. Peer review is at the heart of research and is understandably an expensive process. This work, with a human in the loop, aims to support the research community in multiple ways. It predicts quality and acceptance, and recommends reviewers. It helps authors and editors evaluate research work using machine learning models developed from a dataset comprising 18,000+ research papers (some from top Artificial Intelligence conferences such as NeurIPS and ICLR), together with their reviews, aspect scores, and accept/reject decisions. Using machine learning algorithms such as Support Vector Machines, deep learning recurrent neural network architectures such as LSTM, a wide variety of pre-trained word vectors using Word2Vec, GloVe, FastText, the transformer-based BERT and DistilBERT, Google’s Large Language Model (LLM) PaLM 2, and a TF-IDF vectorizer, a comprehensive system is built. For the system to be readily usable and to facilitate future enhancements, a frontend, a Flask server in the cloud, and a NoSQL database at the backend are implemented, making it a complete system. The work is novel in using a unique blend of tools and techniques to address most aspects of building a system to support the peer review process. The experiments result in an 86% test accuracy on acceptance prediction using DistilBERT. Results from other models are comparable, with PaLM-based LLM embeddings achieving 84% accuracy. Full article
(This article belongs to the Special Issue Data-Centric Artificial Intelligence: New Methods for Data Processing)

26 pages, 1559 KB  
Article
Real-Time Text-to-Cypher Query Generation with Large Language Models for Graph Databases
by Markus Hornsteiner, Michael Kreussel, Christoph Steindl, Fabian Ebner, Philip Empl and Stefan Schönig
Future Internet 2024, 16(12), 438; https://doi.org/10.3390/fi16120438 - 22 Nov 2024
Cited by 17 | Viewed by 7228
Abstract
Based on their ability to efficiently and intuitively represent real-world relationships and structures, graph databases are gaining increasing popularity. In this context, this paper proposes an innovative integration of a Large Language Model into NoSQL databases and Knowledge Graphs to bridge the gap in the field of Text-to-Cypher queries, focusing on Neo4j. Using the Design Science Research Methodology, we developed a Natural Language Interface which can receive user queries in real time, convert them into Cypher Query Language (CQL), and perform targeted queries, allowing users to choose from different graph databases. In addition, user interaction is expanded by an additional chat function based on the chat history, as well as an error correction module, which elevates the precision of the generated Cypher statements. Our findings show that the chatbot is able to accurately and efficiently solve the tasks of database selection, chat history referencing, and CQL query generation. The developed system therefore makes an important contribution to enhanced interaction with graph databases, and provides a basis for the integration of further and multiple database technologies and LLMs, due to its modular pipeline architecture. Full article
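The end product of such a pipeline is a Cypher string sent to Neo4j. In the paper an LLM performs the translation; the trivial template below only stands in for it so the output shape is concrete, and the graph schema (Company, EMPLOYS, Person) is invented:

```python
# Stand-in for Text-to-Cypher generation: in the real system an LLM maps
# free-form text to a Cypher query; here a fixed template fakes that step
# so the generated artifact (a Cypher string) is visible and checkable.

def to_cypher(entity: str, rel: str, target: str) -> str:
    # A generated query would also need validation before execution; the
    # paper's error-correction module plays that role.
    return (
        f"MATCH (a:{entity})-[:{rel}]->(b:{target}) "
        f"RETURN a.name, b.name LIMIT 25"
    )

# User intent: "Which companies employ which people?" (hypothetical schema)
query = to_cypher("Company", "EMPLOYS", "Person")
print(query)
```

The interesting engineering in the paper lives where this sketch cheats: choosing the right database, grounding the query in the actual schema, and repairing invalid Cypher before it reaches Neo4j.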
(This article belongs to the Special Issue Generative Artificial Intelligence in Smart Societies)

13 pages, 3001 KB  
Article
Concurrent Access Performance Comparison Between Relational Databases and Graph NoSQL Databases for Complex Algorithms
by Elena Lupu, Adriana Olteanu and Anca Daniela Ionita
Appl. Sci. 2024, 14(21), 9867; https://doi.org/10.3390/app14219867 - 28 Oct 2024
Cited by 1 | Viewed by 5362
Abstract
Databases are a fundamental element of contemporary software applications. The most widely used and recognized type in practice is the relational database, valued for its ability to store and organize data in tabular structures, its emphasis on data consistency and integrity, and its use of a standardized query language, SQL. However, with the rapid increase in both the volume and complexity of data, relational databases have recently encountered challenges in effectively modeling this expanding information. To address performance challenges, new database systems have emerged, offering alternative approaches to data modeling—these are known as NoSQL databases. In this paper, we present an indoor navigation application designed to operate on both a relational database, Microsoft SQL Server, and a graph-based NoSQL database, Neo4j. We describe the algorithms implemented for testing and the performance metrics analyzed to draw our conclusions. The results revealed Neo4j’s strength in managing data with complex relationships but also exposed its limitations in handling concurrent access, where SQL Server demonstrated significantly greater stability. Full article
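The kind of multi-hop traversal that favors a graph database in an indoor navigation setting can be sketched as a breadth-first shortest route over an invented floor plan. In Neo4j this collapses to a single `shortestPath` pattern, while in SQL it typically becomes a recursive CTE over an edge table:

```python
# Shortest route between rooms, the core query of an indoor navigation app.
# The floor plan is made up; real apps would weight edges by distance.
from collections import deque

floor_plan = {  # adjacency list: room -> directly connected rooms
    "lobby": ["hall", "cafe"],
    "hall": ["lobby", "lab"],
    "cafe": ["lobby", "lab"],
    "lab": ["hall", "cafe", "office"],
    "office": ["lab"],
}

def shortest_route(graph, start, goal):
    # Breadth-first search: the first path reaching the goal is shortest
    # in hop count. In Cypher (hypothetical schema) the same intent reads:
    #   MATCH p = shortestPath((a:Room {name:'lobby'})-[*]-(b:Room {name:'office'}))
    #   RETURN p
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # goal unreachable

route = shortest_route(floor_plan, "lobby", "office")
print(route)  # ['lobby', 'hall', 'lab', 'office']
```

A graph database stores the adjacency structure natively, so each hop is a pointer chase rather than a join, which is the strength the study observed for Neo4j; its concurrency findings are a separate axis that this single-threaded sketch does not touch.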
(This article belongs to the Special Issue State-of-the-Art of Knowledge Graphs and Their Applications)

21 pages, 4733 KB  
Entry
MongoDB: Meeting the Dynamic Needs of Modern Applications
by Mukesh Rathore and Sikha S. Bagui
Encyclopedia 2024, 4(4), 1433-1453; https://doi.org/10.3390/encyclopedia4040093 - 27 Sep 2024
Cited by 6 | Viewed by 6837
Definition
This entry reviews MongoDB’s fundamentals, architectural features, advantages, and limitations, providing a comprehensive understanding of its capabilities. MongoDB’s impact on the database landscape is profound, challenging traditional relational databases and influencing the adoption of NoSQL solutions globally. With its continued growth, innovation, and commitment to addressing evolving market needs, MongoDB remains a pivotal player in modern data management, empowering organizations to build scalable, efficient, and high-performance applications. Full article
(This article belongs to the Section Mathematics & Computer Science)
