
AI-Powered Material Science and Engineering | Interview with Dr. Fernando Gomes de Souza Junior—Editorial Board Member of Materials

27 November 2025

The integration of artificial intelligence (AI) with materials science and engineering has become one of the most dynamic and transformative frontiers in contemporary research. By leveraging AI techniques such as machine learning, deep learning, and data-driven modeling, scientists can now accelerate materials discovery, optimize material properties, and predict performance with unprecedented efficiency. Recognizing its immense potential, MDPI has launched the “AI-Powered Material Science and Engineering” event. We were sincerely honored to interview Dr. Fernando Gomes de Souza Junior, an Editorial Board Member of Materials (ISSN: 1996-1944).

Name: Dr. Fernando Gomes de Souza Junior
Affiliation: Biopolymers & Sensors Lab., Macromolecules Institute, Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brazil
Interests: natural resources; polymerization; nanocomposites; characterization; imaging; environmental recovery; nanomedicine; sensors; machine learning; data mining

The following is a short interview with Dr. Fernando Gomes de Souza Junior:

1. Could you introduce yourself and provide a concise overview of your research field?
Hi and it’s a pleasure to be here. My name is Fernando G. de Souza Jr. I am a Professor at the Federal University of Rio de Janeiro (UFRJ), and my work sits at the intersection of materials science and engineering, with a specialized focus on biopolymers, nanocomposites, data analysis, experimental design, biofuels, artificial intelligence, and machine learning.
My journey began in 1994, when I enrolled in chemistry at the Federal University of Espírito Santo. Back then, we were still using Windows 3.11—the first encounters with computers felt almost magical. It was during this era that I sent my first email, near the end of the 1990s, and began to realize how profoundly technology could transform scientific research. Throughout my undergraduate studies, my master’s degree (at UENF in materials science and engineering), and my doctorate (at the Institute of Macromolecules, UFRJ, working with conductive polymers), I consistently faced one recurring challenge: the explosion of scientific data generated by instruments such as electrometers, spectrometers, and sensors. This compelled me to learn programming—first in BASIC, later in more advanced languages—to automate measurements, process results, and extract meaning from numerical chaos. My postdoctoral work led me into data analysis and experimental design, where I began constructing statistical models capable of precisely describing the formation and performance processes of the materials we study.
Today, my research group focuses on biopolymers and nanocomposites, particularly on addressing their central economic and technical challenge: they are, on average, 25% more expensive than their petrochemical analogs. Overcoming this barrier requires more than simply substituting raw materials—it demands functional innovation, which in turn calls for nanomodification strategies guided by data-driven optimization. This is where artificial intelligence entered as a catalyst: not merely as a tool, but as a new paradigm of scientific thinking.

2. What has been the greatest challenge you have faced in your research career?
This is a very interesting question—and I believe it doesn’t have a single answer. When I reflect on the evolution of my career since 1994, I see that the greatest challenge wasn’t merely technical—it was cultural and systemic: learning to adapt to the accelerating pace of technological change while simultaneously fighting for investments—both public and private—that can translate this change into real scientific advancement. Universities are fundamental institutions for training qualified personnel, and this became clear to me during my undergraduate research, master’s, doctorate, and, ultimately, through my professorship selection process. But the true leap came when I confronted the absurd volume of data produced by high-precision instruments—data that, without adequate tools, was useless. That’s when I began writing my first code, realizing the importance of programming, multivariate statistics, and factorial experimental design. But the most recent—and perhaps the deepest—challenge is different: text mining of scientific and patent literature.
Today, what challenges me most is extracting hidden knowledge from the literature: articles, patents, technical reports. It’s not just about reading more—it’s about understanding what is not being said, identifying unexplored gaps, and detecting invisible connections between seemingly unrelated fields. For example, while scientific literature emphasizes new nanoparticles, novel synthesis techniques, or thermal properties, patents focus on durability, flexibility, lifespan, and industrial scalability. This discrepancy is rich—yet invisible without AI. This is precisely where we are now focusing: developing machine learning and generative AI models to mine these texts, extract patterns, identify trends, and—most importantly—generate novel hypotheses from data that already exists but remains unread. This is our current challenge: transforming information into strategic knowledge. And this requires more than algorithms—it demands scientific vision, critical curiosity, and persistence.

3. In your view, what are the main advantages of integrating artificial intelligence into materials science and engineering? How has AI transformed your research methods or outcomes?
This is an excellent question—because it touches the core of the revolution we are living through. The integration of AI into materials science is not a mere enhancement; it is a redefinition of scientific methodology itself. Many of the problems we face—complex, multivariate, involving hundreds of interacting variables—would be impossible to solve without these tools.
One concrete example: we developed a butylene succinate oligomer for use as a bio-based phase-change material (bio-PCM), a material that stores and releases heat to regulate ambient temperature. Optimizing its thermal properties involves dozens of parameters: monomer-to-catalyst ratio, reaction temperature, time, pressure, additives, etc. With traditional methods, testing all combinations would take years. With machine learning, we built predictive models that identified optimal conditions for maximum thermal efficiency and cyclic stability—in weeks. And this has enormous social impact: residential climate control consumes staggering amounts of energy. If we can develop materials that reduce this demand, we contribute to energy justice and resilience amid severe climate change.
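The interview does not specify which model family was used for this optimization. As a minimal sketch of the general workflow, assuming a small table of synthesis experiments with hypothetical column names (catalyst_ratio, temp_C, time_h, latent_heat_J_g), a surrogate regressor can be fit and then queried over a grid of untested conditions:

```python
# Hedged sketch: surrogate-model optimization of hypothetical bio-PCM synthesis data.
# Column names, values, and the chosen regressor are illustrative, not the authors' actual setup.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical results of a handful of synthesis experiments.
experiments = pd.DataFrame({
    "catalyst_ratio":  [0.5, 1.0, 1.5, 2.0, 1.0, 1.5, 0.5, 2.0],
    "temp_C":          [150, 150, 170, 170, 190, 190, 210, 210],
    "time_h":          [2,   4,   2,   4,   6,   2,   6,   4],
    "latent_heat_J_g": [88,  95,  102, 110, 118, 105, 92,  112],  # target property
})

X = experiments[["catalyst_ratio", "temp_C", "time_h"]]
y = experiments["latent_heat_J_g"]

# Fit a simple surrogate model of property vs. synthesis conditions.
model = GradientBoostingRegressor(random_state=0).fit(X, y)
print("CV R^2:", cross_val_score(model, X, y, cv=4).mean())

# Query the surrogate over a grid of untested conditions and pick the most
# promising candidates for the next round of experiments.
grid = pd.DataFrame(
    [(r, t, h) for r in np.linspace(0.5, 2.0, 7)
               for t in range(150, 211, 10)
               for h in (2, 4, 6)],
    columns=["catalyst_ratio", "temp_C", "time_h"],
)
grid["predicted_latent_heat"] = model.predict(grid)
print(grid.nlargest(5, "predicted_latent_heat"))
```

In practice such a surrogate would be refined iteratively, with each round of predictions guiding the next batch of laboratory experiments.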
Another example: in the field of biofuels, we used machine learning to discover novel catalysts. Instead of randomly testing hundreds of compounds, we trained models using molecular structures and catalytic performance data—and the models pointed us toward promising candidates we would never have considered.
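As an illustration of this kind of virtual screening, the sketch below assumes molecular structures are given as SMILES strings and featurized with RDKit descriptors; the structures, performance values, and descriptor choice are placeholders rather than the actual catalyst dataset:

```python
# Hedged sketch: ranking hypothetical catalyst candidates by predicted performance.
# SMILES strings, descriptors, and yields are illustrative only.
from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.ensemble import RandomForestRegressor

def featurize(smiles: str) -> list[float]:
    """Turn a molecular structure into a small numeric descriptor vector."""
    mol = Chem.MolFromSmiles(smiles)
    return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol), Descriptors.TPSA(mol)]

# Hypothetical training set: known compounds and their measured catalytic performance.
known = {
    "CCO": 0.42,
    "CC(=O)O": 0.55,
    "c1ccccc1O": 0.61,
    "CCN(CC)CC": 0.35,
    "OC(=O)c1ccccc1": 0.68,
}
X = [featurize(s) for s in known]
y = list(known.values())

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Virtual screening: score untested candidates instead of synthesizing them all.
candidates = ["CC(=O)Oc1ccccc1C(=O)O", "OCC(O)CO", "CC(C)(C)O"]
scores = model.predict([featurize(s) for s in candidates])
for s, p in sorted(zip(candidates, scores), key=lambda t: -t[1]):
    print(f"{s:>25s}  predicted performance: {p:.2f}")
```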
We also developed a text classification system to understand how science and industry perceive the same material differently. We used Scopus (scientific literature) and patent databases (WIPO, USPTO). Result? In science, the focus is on new techniques, new nanoparticles, new properties. In patents, the focus is on lifespan, flexibility, production cost, scalability. This divergence reveals a critical gap between what science produces and what industry needs. And AI allows us to visualize, quantify, and—ultimately—bridge it.
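A minimal sketch of such a classifier follows; the six example snippets are invented stand-ins for Scopus abstracts and patent claims, and the most discriminative terms of a simple TF-IDF plus logistic regression model expose the vocabulary gap described above:

```python
# Hedged sketch: contrasting how papers and patents describe the same material class.
# The example texts are placeholders, not entries from Scopus, WIPO, or USPTO.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

docs = [
    # toy "scientific abstract" snippets
    "novel nanoparticle synthesis improves thermal conductivity of the nanocomposite",
    "we report a new characterization technique for biopolymer crystallinity",
    "surface functionalization enhances dispersion and mechanical properties",
    # toy "patent claim" snippets
    "a durable flexible film with extended lifespan and low production cost",
    "scalable manufacturing process ensuring long service life of the coating",
    "cost effective formulation with improved durability for industrial use",
]
labels = ["science"] * 3 + ["patent"] * 3

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)
clf = LogisticRegression().fit(X, labels)

# Classes are sorted alphabetically, so positive weights point toward "science"
# and negative weights toward "patent"; the extremes reveal each community's focus.
terms = np.array(vectorizer.get_feature_names_out())
order = np.argsort(clf.coef_[0])
print("most 'patent'-flavored terms: ", terms[order[:5]])
print("most 'science'-flavored terms:", terms[order[-5:]])
```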
But perhaps the work I am most proud of is the development of a unique, unprecedented scale for assessing the hazard of micro- and nanoplastics. No standardized global metric existed. We aggregated data from hundreds of articles—toxicity, size, shape, surface charge, chemical composition, environmental behavior—and trained an AI model to classify the relative risk of each particle type. This would have been impossible without AI. Only through the capacity to process, correlate, and generalize such vast data at scale could we create a tool now being used by research groups worldwide. AI doesn’t merely accelerate research—it redefines what is possible to investigate.
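The published scale itself is not reproduced here; as a hedged sketch of the underlying idea, the snippet below combines numeric and categorical particle descriptors (all values invented) in a single pipeline and returns a relative hazard class with associated probabilities:

```python
# Hedged sketch: a relative-hazard classifier for micro- and nanoplastic particles.
# Feature names, categories, and labels are illustrative stand-ins for the
# literature-aggregated dataset described in the interview.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

particles = pd.DataFrame({
    "size_nm":      [80, 500, 2000, 150, 5000, 300],
    "zeta_mV":      [-35, -10, -5, 20, -2, 15],
    "shape":        ["sphere", "fiber", "fragment", "sphere", "fiber", "fragment"],
    "polymer":      ["PS", "PET", "PE", "PS", "PP", "PVC"],
    "hazard_class": ["high", "medium", "low", "high", "medium", "high"],  # toy labels
})

features = particles.drop(columns="hazard_class")
labels = particles["hazard_class"]

# Mixed numeric and categorical descriptors are combined in one pipeline.
prep = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore"), ["shape", "polymer"])],
    remainder="passthrough",
)
clf = Pipeline([
    ("prep", prep),
    ("model", RandomForestClassifier(n_estimators=300, random_state=0)),
]).fit(features, labels)

# Score a previously unseen particle type.
new_particle = pd.DataFrame(
    [{"size_nm": 120, "zeta_mV": -30, "shape": "sphere", "polymer": "PS"}]
)
print(dict(zip(clf.classes_, clf.predict_proba(new_particle)[0])))
```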

4. Looking ahead to the next decade, what do you see as the main opportunities and potential advances in materials science and engineering driven by AI?
This is another excellent question—and I believe that, above all, we must focus on more efficient methods for extracting scientific data. Much of what we seek to discover is already written—but hidden within thousands of articles, theses, patents, and technical reports. The next great leap will come from intelligent web scraping, semantic extraction, and the use of Large Language Models (LLMs) to uncover connections between concepts, disciplines, and fields. It’s not just about keyword searches. It’s about understanding:

  • What are the most critical gaps in biopolymer nanocomposites?
  • Which material combinations have been tested and failed—but never documented as “failures”?
  • Which patents are blocking innovation due to overly aggressive protection strategies? 

These are the new problems of science—and AI is the only tool capable of solving them.
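One of many possible toolchains for this kind of semantic mining is sketched below; the embedding model, the example sentences, and the reading of low similarity as an unexplored connection are all assumptions made for illustration:

```python
# Hedged sketch: surfacing semantic links (or gaps) between sentences drawn from
# papers and patents. Model name and example texts are assumptions; any
# sentence-embedding model or LLM embedding endpoint could be substituted.
from sentence_transformers import SentenceTransformer, util

paper_claims = [
    "magnetic nanoparticles improve the thermal stability of biopolymer blends",
    "lignin fillers increase the stiffness of polylactic acid composites",
]
patent_claims = [
    "a packaging film whose service life exceeds twelve months outdoors",
    "a moldable compound with reduced raw material cost for automotive parts",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
emb_papers = model.encode(paper_claims, convert_to_tensor=True)
emb_patents = model.encode(patent_claims, convert_to_tensor=True)

# Low similarity between a scientific claim and every patent claim hints at a
# result industry has not yet picked up, or a need science has not yet met.
sims = util.cos_sim(emb_papers, emb_patents)
for i, claim in enumerate(paper_claims):
    print(f"{claim[:50]}...  best patent match: {sims[i].max().item():.2f}")
```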
Moreover, property optimization will remain a pillar—but no longer in isolation. The ideal strategy now integrates four pillars:

  1. Data analysis (to understand what already exists);
  2. Experimental design (to define next steps efficiently);
  3. Computational simulation (Monte Carlo, molecular dynamics);
  4. Machine learning (to predict, generalize, and suggest). 

We have already succeeded in predicting properties of nanocomposites—such as thermal conductivity, mechanical strength, or degradation rate—based solely on chemical composition. This eliminates hundreds of experiments. And what’s even more powerful: these models are reusable. A model trained on biopolymers can, with minimal adjustments, be applied to synthetic polymers, ceramics, or even hydrogels.
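How such model reuse is implemented is not detailed in the interview; one minimal way to sketch the idea, using synthetic data and scikit-learn's warm_start option as a stand-in for a full transfer-learning setup, is to pre-train on a large source dataset and then continue optimization on a small dataset from a different material family:

```python
# Hedged sketch: reusing a property model trained on one material family as the
# starting point for another. Data, features, and the warm-start strategy are
# illustrative assumptions, not the authors' actual procedure.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Source task: many synthetic "biopolymer composite" samples, composition -> property.
X_bio = rng.uniform(0, 1, size=(200, 4))                       # e.g. filler %, plasticizer %, ...
y_bio = X_bio @ [0.8, -0.3, 0.5, 0.1] + rng.normal(0, 0.02, 200)

# Target task: only a handful of measurements for a different polymer family.
X_new = rng.uniform(0, 1, size=(15, 4))
y_new = X_new @ [0.7, -0.25, 0.45, 0.15] + rng.normal(0, 0.02, 15)

# Pre-train on the large source dataset, then fine-tune on the small target set
# by continuing optimization from the learned weights (warm_start=True).
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                     warm_start=True, random_state=0)
model.fit(X_bio, y_bio)            # pre-training on the source family
model.set_params(max_iter=200)
model.fit(X_new, y_new)            # fine-tuning on the new family

print("fine-tuned predictions:", model.predict(X_new[:3]))
```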
The next decade will be defined by generative models—not just to predict, but to invent. Imagine a model that, given a functional objective—“a material that is biodegradable, lightweight, highly impact-resistant, and degrades within 6 months in moist soil”—generates hundreds of plausible compositions, suggests molecular structures, viable synthesis routes, and even cost estimates. This is already possible. In ten years, it will be routine. Materials science will cease to be a science of trial and error—and become a science of data-guided computational design.

5. As the Editorial Board Member of Materials, could you share your experience with MDPI?
I greatly appreciate the opportunities offered by MDPI—and I have had an exceptionally positive experience as a member of the Editorial Board of Materials. I’ve had the privilege of leading several Special Issues—thematic collections that have been highly relevant and, I believe, motivated the community to pursue new knowledge in emerging areas.
What impresses me most is the professionalism with which MDPI engages its editorial board. They do not treat us as volunteers—they treat us as partners. There is genuine care in communication: timely reminders, strategic suggestions, clear incentives. They constantly remind us of how we can contribute to the dissemination of knowledge. They also grant us access to a global database of researchers, enabling—even indirectly—connections with colleagues across all continents. This broadens our perspective, expands our collaborations, and amplifies our impact.
The commitment to open science and open access is fundamental. Knowledge cannot be a privilege. When an article is published in Materials, it is available to any student at a public university in Brazil, Africa, India, or Latin America—without financial barriers. This is a paradigm shift—and MDPI is leading it.
Editorial turnaround is rapid, without excessive bureaucracy or unnecessary delays, while quality control remains academically rigorous.
Another point I deeply value: the recognition of reviewers. MDPI offers accumulable vouchers that can be used to cover article processing charges for our own publications. This is extraordinary. It creates a virtuous cycle: you review, you contribute to the quality of science, and you are directly rewarded. It’s a system that values the invisible labor of science—and for me, this is the most important thing.
Being a member of the Editorial Board of Materials by MDPI is, without doubt, one of the most enriching experiences of my academic life. It is a publisher that understands science is a collective effort—and that to advance, it requires transparency, speed, equity, and recognition. And that—simply—is the future of scientific publishing.