Search Results (265)

Search Parameters:
Keywords = corpus criticism

19 pages, 732 KB  
Systematic Review
From the Digital Divide to Algorithmic Vulnerability: A Systematic Review of Social Stratification in the AI Era (2015–2025)
by Manuel José Mera Cedeño, Gertrudis Amarilis Laínez Quinde, Wilson Alexander Zambrano Vélez and César Ernesto Roldán Martínez
Soc. Sci. 2026, 15(5), 326; https://doi.org/10.3390/socsci15050326 (registering DOI) - 15 May 2026
Abstract
The present study seeks to synthesize the scientific evidence from the last decade (2015–2025) regarding the transition from inequality in technological access toward social stratification mediated by automated decision-making systems. Following PRISMA 2020 guidelines and the SPIDER model, a corpus of 74 high-impact records from Scopus, Web of Science, ProQuest, and PsycINFO was examined. The results reveal an exponential growth in scientific production since 2018, marking a shift from infrastructure-based inequality toward a systemic stratification mediated by algorithmic opacity. Three critical sectors of exclusion are categorized: the socio-health nexus, the labor market, and the educational ecosystem. Methodologically, quantitative algorithmic auditing predominates (58%), although mixed sociotechnical approaches have increased by 25% since 2021 to capture experiences of intersectional vulnerability. The study concludes that AI acts as an active agent of social reproduction, necessitating a transition toward “Algorithmic Justice” and “Human-Centric Governance.” Finally, a “Reinstating AI” framework is proposed to democratize technological development and mitigate systemic biases, offering a roadmap for researchers and policymakers in the pursuit of technological sovereignty.
30 pages, 5537 KB  
Review
Conceptual Plurality in Transition Programmes for Newly Hired Nurses: An Umbrella Review
by Marcello Torre, Cristina Arrigoni, Rosario Caruso, Antonio Maria Giuseppe Staffa, Desiree Lucà and Arianna Magon
Nurs. Rep. 2026, 16(5), 163; https://doi.org/10.3390/nursrep16050163 - 13 May 2026
Abstract
Background/Objectives: Nurse transition programmes are widely implemented to support newly hired nurses and promote workforce retention. Despite the growing number of published reviews, conceptual inconsistency and methodological heterogeneity limit the interpretability and cumulative value of the evidence. This umbrella review aimed to synthesise and critically examine review-level evidence on nurse transition programmes, clarifying programme typologies, contexts, methodological approaches, reported outcomes, and thematic patterns. Methods: An umbrella review was conducted in accordance with PRISMA 2020 guidance. Systematic searches were performed in CINAHL, PubMed, Scopus, Web of Science, and Google Scholar, supplemented by citation tracking. Results: Fourteen reviews published between 2010 and 2025 were included: 12 reviews of primary studies and two reviews of secondary evidence (one umbrella review and one meta-review). Programme models and outcome measures were highly heterogeneous, and primary study overlap was slight (CCA = 2.55), indicating that reviews in the corpus drew on largely non-overlapping sets of primary studies. Transition programmes for new nurses commonly use one-on-one preceptorships with supernumerary practice, simulation-based learning, and active methods like case studies and reflective journaling to build competence and confidence. Their duration varies from a few days to 12 months, aligning with the progressive learning curve of new graduates. Professional outcomes, particularly competence and confidence, were consistently reported, whereas organisational outcomes, such as retention, showed mixed, methodologically constrained evidence. Patient-level outcomes were rarely examined. Thematic analysis revealed a shift over time from individual professional readiness towards implementation and organisational considerations. Conclusions: Given this conceptual plurality, there is an urgent need to standardise key indicators for evaluating the effectiveness of nurse transition programmes across healthcare settings globally.
19 pages, 963 KB  
Brief Report
The Circular Economy of EU Construction and Demolition Waste: Persistent Barriers, Digital Innovation, and the Emerging Energy Security Imperative
by Fernando Pacheco-Torgal, Yining Ding and Xin-Yu Zhao
Sustainability 2026, 18(10), 4851; https://doi.org/10.3390/su18104851 - 12 May 2026
Abstract
Construction and demolition waste (CDW) constitutes the largest single waste stream in the European Union by weight, yet the EU’s circular material use rate remains low, at around 12%, indicating substantial distance from policy ambitions for circular resource use. This paper presents a systematic narrative review of the literature on circular economy integration in CDW management, with a focus on the EU context. The review pursues three objectives: (i) to critically assess the gap between reported CDW recovery performance and genuine material circularity; (ii) to systematically identify and analyse persistent barrier domains to circular economy adoption in CDW management; and (iii) to evaluate the potential of digital and governance-oriented innovations to address these barriers. The review scope is explicitly delimited to the EU regulatory and institutional context, drawing on a corpus of 42 sources identified through systematic Scopus searches. The review identifies five persistent barrier domains—legal, technical, social, behavioural, and economic—with regulatory fragmentation and secondary material devaluation as the most structurally entrenched. Apparent compliance with the 70% recovery target under Directive 2008/98/EC conceals widespread downcycling and inconsistent reporting. A decisive paradigm shift is observed in recent research, from material characterisation towards systemic circularity, digital demolition frameworks, and governance. Emerging technologies—including AI-enabled sorting, Building Information Modelling, Digital Twins, and Digital Product Passports—offer significant potential to enhance material traceability, recovery quality, and decision-making across the CDW value chain. However, technological innovation alone is insufficient; Design for Deconstruction remains an underutilised upstream strategy, and lasting progress depends on coherent regulatory frameworks, institutional coordination, and market conditions that support circular practices. Future research should therefore focus on governance mechanisms, longitudinal performance assessment, and the scalability of digitally enabled circular solutions.
34 pages, 399 KB  
Article
Urban Fear, Criminality and the Erosion of Intangible Cultural Access in Machala: A Critical Qualitative Content Analysis of Ecuadorian National Digital Press
by Fernanda Tusa, Ignacio Aguaded and Santiago Tejedor
Heritage 2026, 9(5), 187; https://doi.org/10.3390/heritage9050187 - 12 May 2026
Abstract
This article examines how the Ecuadorian national digital press has represented the relationship between criminal violence, declining mobility, tourism contraction, and the erosion of intangible cultural access in Machala, Puerto Bolívar, and the route to Jambelí during 2025. This study aims to explain how mediated representations of insecurity can contribute to the symbolic narrowing of culturally meaningful urban–coastal spaces, even when those spaces remain materially present and formally open. The article responds to a gap in the literature at the intersection of critical heritage studies, media framing, urban fear, and Latin American security studies. The existing research has examined heritage as social practice, media representation of crime, and urban securitization, but has rarely connected these fields to explain how criminal violence erodes lived access to intangible cultural environments in secondary port cities of the Global South. Methodologically, this study applies qualitative content analysis to a purposive corpus of eight focal journalistic texts published in Ecuadorian digital outlets, such as El Universo, El Comercio, Expreso, El Mercurio, Extra, Primicias, GK, and La Hora. Deductive–inductive coding was complemented by descriptive article-level indicators of themes, keyword clusters, and temporal distribution. The findings show that the press did not merely report violent events; it progressively reorganized the symbolic meaning of Machala by re-signifying Puerto Bolívar, the marine environment, the cabotage pier, and the maritime route to Jambelí as spaces of risk, interruption, and conditional access. This study contributes conceptually by defining intangible cultural access and symbolic enclosure, empirically by documenting the mediated erosion of coastal public–cultural life, and practically by proposing integrated policy actions for security governance, cultural reactivation, local commerce, maritime mobility, and responsible public communication.
(This article belongs to the Section Cultural Heritage)
21 pages, 861 KB  
Article
Evaluation of NeMo Guardrails as a Firewall for User–LLM Interaction
by Antônio João Azambuja, Marcos Guilherme, João Victor Fernandes de Castro, Jean Phelipe de Oliveira Lima, Leonardo B. Oliveira and Anderson da Silva Soares
Future Internet 2026, 18(5), 252; https://doi.org/10.3390/fi18050252 - 9 May 2026
Abstract
The rapid integration of Large Language Models (LLMs) into critical personal and professional environments has exacerbated security risks, particularly adversarial attacks such as prompt injection and jailbreaking, which aim to bypass safety alignment. This study evaluates the efficacy of NVIDIA’s Llama-3.1-nemoguard-8b-content-safety model acting as a semantic firewall to mitigate these threats. To ensure a robust assessment, we utilized the ‘Do Not Answer’ dataset, augmented with 939 synthetically generated benign prompts to create a balanced corpus of 1878 samples. The evaluation methodology encompasses a risk-category analysis, standard binary classification metrics, and a novel metric, the Compensation Rate, which measures the firewall’s ability to block responses when the underlying LLM fails. Results indicate a high Precision (94.57%) but a moderate Sensitivity (51.97%), uncovering a critical performance trade-off: the model exhibits a conservative bias, prioritizing high precision to minimize false positives at the expense of recall for nuanced adversarial prompts, particularly in categories involving sensitive data leakage and misinformation. Furthermore, the proposed Compensation Rate achieved 34.8%, suggesting that the semantic firewall successfully mitigated 34.8% of instances where the foundational LLM’s internal safety alignment failed. These findings indicate that while the system effectively blocks explicit threats, its efficacy as a secondary defense diminishes against context-dependent vulnerabilities, notably data exfiltration and misinformation.
41 pages, 2342 KB  
Systematic Review
Artificial Intelligence in Complex Manufacturing Systems: A Systematic Review of Validation Rigor and Deployment Readiness in Predictive Maintenance
by Cesar Felipe Henao Villa, David Alberto Garcia Arango, Luis Fernando Garcés Giraldo, Rosana Alejandra Meleán Romero, Alejandro Valencia-Arias and José Alexander Velásquez Ochoa
Information 2026, 17(5), 456; https://doi.org/10.3390/info17050456 - 8 May 2026
Abstract
This systematic review (PRISMA 2020) examines 89 studies—64 peer-reviewed articles and 25 arXiv preprints (2007–2026)—addressing the gap between AI research and operational predictive maintenance (PdM) deployment in complex manufacturing systems. Analyzing five thematic clusters in non-stationary and stochastic environments, we evaluated predictive performance and deployment readiness. Deep learning dominates remaining useful life (RUL) forecasting; however, 65.6% of studies employ weak or unclear validation protocols (Tier 0–1), lacking real-world robustness testing. Fault diagnosis increasingly integrates Edge-AI, yet Explainable AI (XAI) adoption remains scarce (15.6%), undermining industrial trustworthiness. No study reached operational field validation beyond temporal or cross-domain split, reflecting a systematic disconnection from deployed manufacturing systems. We introduce a novel Deployment Readiness Score (DRS) framework and identify critical barriers: data scarcity, environmental non-stationarity, computational constraints, and black-box model distrust. Recommendations include standardized temporal validation protocols, multi-site field studies, and architecture-integrated explainability. The 25 arXiv preprints (2024–2026) exhibit a mean DRS nearly three times that of the peer-reviewed corpus, signaling nascent convergence toward deployment-mature research. This review was not pre-registered.
(This article belongs to the Special Issue Surveys in Information Systems and Applications)
26 pages, 1746 KB  
Review
Mapping the Convergence of Frontier Technologies for Major Environmental Challenges: A Chemical and Molecular Perspective on the Use of AI for Climate Action and Antimicrobial Resistance
by Segundo Jonathan Rojas-Flores, Rafael Liza, Renny Nazario-Naveda, Félix Díaz, Daniel Delfin-Narciso, Moisés Gallozzo Cardenas and Luis Cabanillas-Chirinos
Molecules 2026, 31(10), 1571; https://doi.org/10.3390/molecules31101571 - 8 May 2026
Abstract
The planet faces the critical interconnected challenges of climate change and antimicrobial resistance (AMR); these two crises mutually reinforce each other, threatening global health and ecosystem stability. This study conducts a systematic documentary analysis to map the convergence and identify the structural gaps between two key technological domains: artificial intelligence (AI) for climate action and molecular methods for AMR. The methodology was based on a corpus of 179 scientific documents indexed in Scopus (2010–2025), analyzed with data science tools to identify trends, collaborations, and impact. Quantitative results revealed clear leadership by the United States, accounting for 37.4% of publications, followed by China (26.8%); this leadership reflects the concentration of high-throughput molecular surveillance infrastructure and data science clusters essential for monitoring the environmental resistome. In terms of scientific impact, Spain showed the highest average, with 32.8 citations per article. The most influential work, a review on food security and sustainability, accumulated 275 citations. Network analysis identified authors such as Zhu, Yongguan, with 240 citations in total, as central nodes in international collaborations. Thematically, metagenomics and machine learning emerged as mature and interconnected research cores. This analysis confirms a solid yet still fragmented relationship between the two fields. The analysis reveals that, while metagenomic tools dominate the current literature, a gap persists in correlating genotypic resistance potential with functional phenotypic expression under changing climatic stressors. The results confirm a solid yet still fragmented foundation, highlighting the need for hybrid platforms that transition from descriptive bibliometrics to functional integration for designing systemic solutions. Future work should prioritize the development of hybrid platforms, such as intelligent biosensors, and collaborative governance frameworks that accelerate effective responses to these dual crises.
(This article belongs to the Section Natural Products Chemistry)
24 pages, 3491 KB  
Review
Epigenomic Biomarker Discovery from Biomedical Literature: AI and Text Mining Toward Health Monitoring Frameworks
by Ji-Hye Oh, Hee-Jo Nam, Soo Hyun Seo and Hyun-Seok Park
Appl. Sci. 2026, 16(10), 4622; https://doi.org/10.3390/app16104622 - 8 May 2026
Abstract
Epigenomic regulation, particularly DNA methylation, plays a critical role in gene expression control and has emerged as an important source of biomarkers for disease diagnosis, risk prediction, and longitudinal health monitoring. As high-throughput sequencing technologies have expanded, epigenomic research has rapidly grown, producing a large and complex body of biomedical literature. This review presents an AI-driven literature-level analysis aimed at uncovering structural patterns and research trends related to epigenomic biomarker discovery. Using a large corpus of full-text articles collected from PubMed and PubMed Central, we applied text mining techniques including keyword frequency analysis, document-level co-occurrence analysis, topic clustering, contextual concordance analysis, and temporal trend analysis. Rather than evaluating individual experiments, this approach examines the broader research landscape to identify recurring conceptual structures and methodological patterns. The analysis reveals that epigenomic biomarker research is organized into several interconnected domains, including disease-focused epigenomics, chromatin regulation studies, transcriptomic integration research, and cancer-related epigenomic investigations. The rapid growth of publications since 2010 further reflects the increasing importance of high-throughput epigenomic profiling and biomarker-driven research. These findings demonstrate that AI-driven literature mining provides a scalable framework for uncovering epigenomic biomarker knowledge and translating it toward AI-enabled health monitoring systems. Such approaches may support biomarker prioritization, early disease detection, and data-driven health monitoring within precision health environments.
17 pages, 303 KB  
Article
Donald J. Trump’s Discursive and Neurocommunicational Playbook
by Almudena Barrientos-Báez, Humberto Azpurua and David Caldevilla-Domínguez
Journal. Media 2026, 7(2), 99; https://doi.org/10.3390/journalmedia7020099 - 7 May 2026
Abstract
The present article employs a discursive and neurocommunicational lens to analyze Donald J. Trump’s discursive playbook. By means of a qualitative approach founded upon Critical Discourse Analysis and framing theory, the present study examines the manner in which the tactics of denial, deflection, and discrediting have structured the politician’s political communication since 2016 and throughout his new presidential term. The corpus under scrutiny encompasses a range of sources, including official speeches, social media posts, media coverage, and academic literature. A particular emphasis is placed on the Epstein case, which serves to expose the structural limitations of his rhetorical strategy. The findings demonstrate the efficacy of these tactics in polarized contexts, characterized by the mobilization of emotions such as fear, anger, and victimhood, while reinforcing cognitive biases such as dissonance and confirmation bias. However, in situations involving crises that are characterized by strong social consensus—such as sexual abuse, corruption, or demands for transparency—discursive saturation has been shown to generate contradictions, cognitive fatigue, and psychological reactance, leading to a so-called ‘boomerang effect’. From a neurocommunicational perspective, this phenomenon elucidates the process by which emotional triggers, which initially serve to strengthen audience cohesion, can subsequently act as factors of disaffection and internal fragmentation. The study posits that Trump’s rhetorical populism is contingent on a delicate equilibrium between emotional mobilization and perceived credibility, whose disruption threatens to undermine the stability of his leadership and facilitate the emergence of alternative leadership figures within the MAGA movement.
21 pages, 1348 KB  
Article
AI-Driven Generation of Old English: A Framework for Low-Resource Languages
by Rodrigo Gabriel Salazar Alva, Matías Núñez, Cristian López Del Alamo and Javier Martín Arista
Big Data Cogn. Comput. 2026, 10(5), 145; https://doi.org/10.3390/bdcc10050145 - 6 May 2026
Abstract
Preserving ancient languages is essential for understanding the cultural and linguistic heritage of humanity. Old English, however, remains critically under-resourced, which limits its accessibility to modern natural language processing (NLP) techniques. We present a scalable framework that uses advanced large language models (LLMs) to generate high-quality Old English texts to address this gap. In this study, we specifically employ state-of-the-art models, including Llama-3.1-8B and Mistral-7B, as our foundation models, which are then adapted to the unique characteristics of Old English. Our approach combines parameter-efficient fine-tuning (Low-Rank Adaptation (LoRA)), data augmentation via back-translation, and a dual-agent pipeline that separates content generation (in English) and translation (into Old English). Evaluation with automated metrics (BLEU, METEOR, and CHRF) shows improvements over baseline models, with BLEU scores increasing from 26 to over 65 for English-to-Old English translation. Expert human assessment confirms high grammatical accuracy and stylistic fidelity in the generated texts, with average scores of 9.0/10 for inflection and word order, 9.1/10 for lexical authenticity, and 7.8/10 for semantic coherence. These results demonstrate that the framework can reliably expand limited historical corpora while maintaining linguistic integrity, with immediate practical applications in digital humanities research, computational philology, and the development of educational resources for Old English study. Beyond expanding the Old English corpus, our method offers a practical blueprint for revitalizing other endangered languages, thus linking AI innovation with the goals of cultural preservation.
11 pages, 268 KB  
Review
Ten Years of Congenital Zika Syndrome: From Outbreak to a Decade of Clinical, Therapeutic, and Preventive Advances in a Tropical Disease Context
by Fabrício Silva Pessoa
Trop. Med. Infect. Dis. 2026, 11(5), 124; https://doi.org/10.3390/tropicalmed11050124 - 6 May 2026
Abstract
A decade has elapsed since the first recognized cluster of congenital anomalies associated with Zika virus (ZIKV) was reported in Brazil in 2015, culminating in the formal delineation of Congenital Zika Syndrome (CZS) as a specific pattern of birth defects. This narrative review examines the ten-year trajectory of CZS as a tropical infectious disease, from its initial emergence and public health emergency declaration by the World Health Organization (WHO) in February 2016, through evolving epidemiological, clinical, and scientific understanding. CZS is characterized by a spectrum of severe neurological manifestations—including microcephaly, subcortical calcifications, malformations of cortical development, ventriculomegaly, and corpus callosum abnormalities—alongside ophthalmic, auditory, and musculoskeletal complications. Transmitted primarily by Aedes aegypti mosquitoes in tropical and subtropical regions, ZIKV disproportionately affects low- and middle-income countries in Latin America, Africa, and Southeast Asia, underscoring its nature as a quintessential tropical disease linked to poverty, inadequate vector control, and health inequity. Over ten years, substantial advances have been made in understanding ZIKV pathogenesis, neurodevelopmental outcomes, diagnostic criteria, and multidisciplinary clinical management of affected children. In the therapeutic and preventive domain, over 45 vaccine candidates have been identified, with 16 reaching Phase 1 or 2 clinical trials by late 2025, though no licensed vaccine or specific antiviral therapy yet exists. This review contextualizes CZS within the broader framework of neglected tropical diseases, evaluates its global and family-level burden, and critically appraises progress and remaining gaps in clinical care, vaccination, and vector control over the past ten years.
(This article belongs to the Special Issue Emerging Vector-Borne Diseases and Public Health Challenges)
34 pages, 605 KB  
Article
AMNDA: An Adaptive Multi-Layer, Lifecycle-Aware Defense Architecture for Multi-Stage Cyberattacks with Azure-Based Validation
by Zlatan Morić, Vedran Dakić, Damir Regvart and Jasmin Redžepagić
Electronics 2026, 15(9), 1939; https://doi.org/10.3390/electronics15091939 - 3 May 2026
Abstract
Modern enterprise breaches are no longer isolated events but coordinated, multi-stage campaigns whose success depends on the defender’s inability to translate detection into timely containment. While existing frameworks—such as attack-lifecycle models, Zero Trust architectures, and detection-driven systems—provide valuable capabilities, they lack a formal mechanism for coupling inferred adversarial state with coordinated, cross-layer enforcement. This paper presents AMNDA, an Adaptive Multi-layer, stage-aware Network Defense Architecture that operationalizes lifecycle-aware defense through explicit state-to-control mapping and executable orchestration. Adversarial progression is modeled as a probabilistic state-transition process, and inferred states are systematically mapped to synchronized controls across edge protection, identity governance, internal segmentation, and behavioral detection. A formally defined orchestration function transforms detection outputs into stage-conditioned policy updates, enforcing monotonic tightening of containment as adversarial capability escalates. AMNDA is implemented and validated in a reproducible Microsoft Azure environment. Empirical results show that stage-aligned enforcement actions execute within 1.0–3.1 s, while detection latency remains the dominant constraint, with a median of 1034 s across the validation corpus. This separation reveals a critical operational insight: in modern cloud environments, the limiting factor in lifecycle defense is not enforcement capability but detection timing. The contribution of AMNDA is therefore not a new detection technique but a formal, deployable architecture that converts attack-stage inference into coordinated, low-latency containment. By bridging lifecycle modeling, Zero Trust principles, and automated orchestration, the proposed approach establishes a practical foundation for state-aware, adaptive cyber defense.
24 pages, 1038 KB  
Article
Avant-Garde Poetry and the Tékhnē of Traditional Versification
by Evgenii Kazartsev and Nikita Kirichenko
Arts 2026, 15(5), 97; https://doi.org/10.3390/arts15050097 - 2 May 2026
Abstract
This article offers a theoretically nuanced and empirically grounded investigation into the paradoxical afterlife of classical versification within the poetic practices of the Russian and Soviet avant-garde. Challenging the persistent historiographic narrative that equates avant-garde poetics with an unequivocal rupture from tradition, the [...] Read more.
This article offers a theoretically nuanced and empirically grounded investigation into the paradoxical afterlife of classical versification within the poetic practices of the Russian and Soviet avant-garde. Challenging the persistent historiographic narrative that equates avant-garde poetics with an unequivocal rupture from tradition, the study demonstrates that canonical metrical forms—most notably iambic tetrameter—continued to operate as structurally productive, albeit critically reconfigured, elements within experimental verse. Drawing on a broad corpus encompassing poetic manifestos, verse texts, and prose writings by Vladimir Maiakovskii, Ilia Sel’vinskii, Semen Kirsanov, and Nikolai Aseev, the authors combine close formal analysis with quantitative prosodic modeling, including linguistic and speech models derived from Kolmogorov–Taranovsky verse theory. The article argues that avant-garde poets did not simply negate inherited metrics but subjected them to a process of internal recomposition, shifting attention from meter as a fixed scheme to rhythm as a dynamic, semantically charged construct. While rhythmic innovation is shown to be consciously engineered in verse, the analysis of verse-like fragments in prose reveals persistent, unconscious attachments to “classical” rhythmic patterns, particularly the Pushkinian alternating rhythm. This tension between declarative rejection and latent continuity illuminates the avant-garde’s distinctive mode of negotiating tradition: not abolishing it, but instrumentalizing it within a broader project of total artistic reorganization. The study thus reframes avant-garde prosody as a site where innovation and inheritance coexist in a state of productive contradiction, reshaping our understanding of modernist poetic technique. Full article
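Quantitative prosodic modeling in the Kolmogorov–Taranovsky tradition typically tallies how often each ictus (strong metrical position) actually carries stress across a corpus of lines. A minimal sketch, assuming lines are pre-encoded as binary stress strings (the sample encodings below are invented for illustration, not drawn from the study's corpus):

```python
# Rhythmic profile for iambic tetrameter: each line is a string of 8
# syllable slots, "1" = stressed, "0" = unstressed; ictuses fall on
# even syllables (string indices 1, 3, 5, 7).

def ictus_profile(lines):
    """Percentage of lines that stress each of the four ictuses."""
    ictus_positions = [1, 3, 5, 7]
    counts = [0, 0, 0, 0]
    for line in lines:
        for i, pos in enumerate(ictus_positions):
            if line[pos] == "1":
                counts[i] += 1
    return [round(100 * c / len(lines), 1) for c in counts]

sample = [
    "01010101",  # fully stressed iambic line
    "01010001",  # third ictus omitted
    "01010001",  # third ictus omitted again
    "01000101",  # second ictus omitted
]
profile = ictus_profile(sample)  # stress frequency per ictus, in %
```

A dip on an inner ictus with near-obligatory stress on the final one is the kind of "alternating rhythm" profile the article associates with the Pushkinian tradition persisting beneath avant-garde experiment.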

30 pages, 859 KB  
Article
Singular Design Foresight: A Foundational Method for Auditable Anticipation and Decision Closure
by Pablo Lara-Navarra, Antonia Ferrer-Sapena and Enrique A. Sánchez-Pérez
Forecasting 2026, 8(3), 38; https://doi.org/10.3390/forecast8030038 - 2 May 2026
Viewed by 315
Abstract
Singular Design Foresight (SDF) is proposed as a foundational methodological framework for advancing Design Foresight (DF) toward a more explicit, traceable, and evaluable scientific discipline. The framework formalizes DF as a structured cycle in which qualitative foresight inputs—such as signals, trends, and expert interpretations—are progressively transformed into analyzable representations that support decision closure under conditions of structural uncertainty. SDF combines an expert-defined conceptual universe with semantic projections to relate textual and contextual evidence to anticipatory constructs, enabling the generation of traceable indicators and structured configurations of viable futures. Within this architecture, the Stakeholder Viability Principle (SVP) functions as a filtering mechanism that delimits relevant futures according to continuity, agency, and axiological coherence, while Social Singularity captures context-specific critical transitions that shape when and why decision closure becomes necessary. The framework is organized in alignment with Design Science Research (DSR), adopting an evaluation logic centered on validity, utility, and attribution. Rather than presenting conclusive system-level validation, the article synthesizes summative evidence from previously published studies on semantic projections, singularity detection, and mixed expert–corpus foresight applications to support the plausibility, internal coherence, and operational feasibility of the proposed framework, while delimiting full integrated validation as a future research objective. SDF does not aim to provide deterministic prediction; instead, it enables auditable anticipatory representations and justified closure under uncertainty. In this sense, the framework is compatible with forecasting understood as the production of evaluable anticipations under explicit assumptions, while preserving the interpretive and situated character of strategic decision-making. 
Full article
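One way to picture the abstract's "semantic projections" is scoring textual evidence against an expert-defined conceptual universe to yield traceable indicators. The sketch below uses bag-of-words cosine overlap; the concept names and keyword sets are hypothetical, and the published method presumably uses richer semantic representations than word overlap.

```python
# Hedged sketch: project an evidence text onto expert-defined concepts
# to obtain normalized indicators. Concepts and vocabularies are invented.
import math

CONCEPTS = {
    "continuity": {"stability", "persistence", "resilience"},
    "agency":     {"actor", "decision", "capability"},
}

def project(text, concepts=CONCEPTS):
    """Return a normalized indicator per concept (cosine of binary bags of words)."""
    tokens = set(text.lower().split())
    scores = {}
    for name, vocab in concepts.items():
        overlap = len(tokens & vocab)
        denom = math.sqrt(len(tokens)) * math.sqrt(len(vocab))
        scores[name] = overlap / denom if denom else 0.0
    return scores

indicators = project("regional actors face a decision on grid resilience")
```

Each indicator is traceable in the sense the framework requires: the score decomposes into explicit keyword matches, so an auditor can see exactly which evidence supported which anticipatory construct.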

35 pages, 1316 KB  
Article
The Rhetoric of Energy Transition Coverage: Analyzing Lexical Patterns and Rhetorical Strategies as Framing Tools in News Discourse of English-Language Mainstream Media
by Ekaterina Veselinovna Teneva
Journal. Media 2026, 7(2), 95; https://doi.org/10.3390/journalmedia7020095 - 1 May 2026
Viewed by 672
Abstract
The 2021–2024 global energy crisis intensified the energy transition, with mainstream media coverage playing a pivotal role in shaping public perceptions. Guided by Burke’s and Lippmann’s theories, and supported by corpus-based critical and rhetorical discourse analyses, this interdisciplinary study aimed to analyze the role of lexical patterns and rhetorical strategies in framing the transition within a corpus of 1341 news articles retrieved from the websites of five English-language mainstream media outlets. Corpus-based analysis identified generic frames, including economic consequences, responsibility, conflict, technological, emotion, and moral duty frames. Rhetorical discourse analysis revealed specific frames, including economic opportunities, technological progress and challenges, energy security and independence, global leadership, energy partnerships, partisan divide, global disparities, corporate greenwashing, necessity, hope, and uncertainty frames, that indicated an ambivalence in the framing of the transition, thereby contributing to the polarization and manipulation of public opinion. The findings indicated a discrepancy: while British, American, and Brazilian media focused more on political divides, Indian and Chinese media emphasized energy partnerships and patriotism. Appeals to experts were less frequent, whereas appeals to emotions were often employed to shape public perceptions. The findings illustrate how lexical patterns and rhetorical strategies function as powerful framing tools within journalism, applied linguistics, and media rhetoric. Full article
(This article belongs to the Special Issue Media, Journalism and Environmental Resilience)
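The corpus-based step of identifying generic frames from lexical patterns can be sketched as dictionary-based tagging. The frame names below follow the abstract, but the keyword lists are hypothetical stand-ins for the study's actual coding scheme:

```python
# Toy sketch of lexical-pattern frame tagging over one article.
# Keyword lists are invented placeholders, not the study's lexicon.

FRAME_LEXICON = {
    "economic_consequences": {"cost", "price", "investment", "jobs"},
    "conflict":              {"dispute", "opposition", "battle"},
    "technological":         {"innovation", "grid", "hydrogen", "solar"},
}

def tag_frames(article_text):
    """Count lexical-pattern hits per frame for a single article."""
    tokens = article_text.lower().split()
    return {
        frame: sum(tok.strip(".,") in vocab for tok in tokens)
        for frame, vocab in FRAME_LEXICON.items()
    }

hits = tag_frames("Rising price of solar investment fuels political dispute.")
```

Aggregating such counts across the 1341-article corpus, and comparing them by outlet, is the kind of quantitative footing on which the qualitative rhetorical analysis of specific frames can then build.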
