Proceeding Paper

Using Large Language Models for Ontology Development †

Darko Andročec
Faculty of Organization and Informatics, University of Zagreb, Pavlinska 2, 42000 Varaždin, Croatia
Presented at the International Conference on Electronics, Engineering Physics and Earth Science (EEPES 2025), Alexandroupolis, Greece, 18–20 June 2025.
Eng. Proc. 2025, 104(1), 9; https://doi.org/10.3390/engproc2025104009
Published: 22 August 2025

Abstract

This paper explores the application of Large Language Models (LLMs) for ontology development, focusing specifically on cloud service ontologies. We demonstrate how LLMs can streamline the ontology development process by following a modified Ontology Development 101 methodology using Perplexity AI. Our case study shows that LLMs can effectively assist in defining scope, identifying existing ontologies, generating class hierarchies, creating properties, and populating instances. The resulting cloud service ontology integrates concepts from multiple standards and existing ontologies. While LLMs cannot fully automate ontology creation, they significantly reduce development time and complexity, serving as valuable assistants in the ontology engineering process.

1. Introduction

Semantic Web technologies and ontologies have a wide range of applications in science and practice. They enable the conceptualization and classification of knowledge in a given domain and support the interoperability or integration of computer systems with similar purposes. When creating an ontology, domain experts must manually define the main concepts and the connections between them. In most cases, evaluating an ontology also requires experts from the same domain who did not participate in its construction. When interoperability is achieved with Semantic Web techniques, services and applications are manually annotated with tags that refer to classes from the ontologies used. All of these tasks are labor-intensive. This manual effort reduces the appeal of the Semantic Web, so some applications instead rely on machine learning to automate parts of the process, such as integration or the partial realization of interoperability.
Recently, we have witnessed the rapid progress of large language models (LLMs), popularized through publicly available web chat interfaces such as ChatGPT, Claude, DeepSeek, and others. LLMs can understand natural language and answer user questions. For ontology creation, these technologies can be used to extract the most important concepts, and the connections between them, from relevant documents. In an OWL ontology, these correspond to classes and subclasses as well as object and data properties. One can also try to extend existing ontologies, either by extending class hierarchies or by adding relevant instances drawn from digital documents or web sources. Another possible research direction is how LLMs can be used for ontology evaluation, either as a replacement for experts or as a complement to them; yet another is whether LLMs can be used to compare different ontologies and determine semantic similarities between them. The paper is organized as follows. We begin by reviewing relevant existing work on the application of LLMs to the creation and improvement of ontologies (Section 2). Then, in Section 3, we present a case study of using LLMs in the ontology development process to create a cloud service ontology that builds on existing similar ontologies. The conclusion and future work are described in the last section.

2. Related Work

We first review existing work on using LLMs to develop new ontologies, enhance existing ontologies, align ontologies, and evaluate developed ontologies. These works are described and analyzed in the following subsections, together with our perspective on them.

2.1. Development of New Ontologies

Some existing works investigate how different large language models (LLMs) can be used to develop new ontologies for a specific domain. Saeedizade and Blomqvist [1] explored the ability of LLMs to create OWL ontologies directly from ontological requirements. They found that LLMs can produce OWL modeling suggestions and alternatives that are at least as good as those generated by novice human modelers; in their experiments, the GPT-4 model proved to be the best for this task. Giglou et al. [2] claim that LLMs can only work as assistants for ontology construction. Fathallah et al. [3] combined the NeOn methodology with LLMs to translate textual domain descriptions into ontologies. They concluded that LLMs are not suitable for automatically creating a whole ontology for a specific domain, but that they can significantly reduce the time needed to develop one.

2.2. Enhancing Existing Ontologies

The next application is to use LLMs together with domain texts to enhance existing ontologies with new classes, properties, or individuals. Du et al. [4] discuss how LLMs can be used for concept extraction and for hierarchical and non-hierarchical relation extraction in the ontology development and enhancement process. Zhao et al. [5] show how to use GPT-based LLMs to refine an ontology, addressing possible ontology issues and adding missing semantic entities and/or relationships. Wu et al. [6] show how to extend medical ontologies using LLMs to identify and classify medical symptoms mentioned on online forums.

2.3. Alignment of Ontologies

Ontology alignment, or ontology matching, is used to identify semantic correspondences between ontologies, and some recent works use LLMs for this purpose. Amini et al. [7] explored prompting GPT-4 to address the complex challenge of ontology alignment. Giglou et al. [8] presented the LLMs4OM framework for matching ontologies with LLMs and tested it on 20 tasks and datasets from different domains. Snijder et al. [9] concluded that an integration of BERT and GPT performs best for ontology matching in the labor market domain. Sousa et al. [10] integrated LLMs into an approach for generating expressive correspondences between ontologies.

2.4. Ontology Evaluation

Some existing works explore how to use LLMs for ontology evaluation. Tsaneva et al. [11] designed an LLM-driven approach (based on ChatGPT-4) for verifying ontology restrictions in order to identify concrete defects in an ontology. Shah et al. [12] investigated how to use LLMs to generate and validate user intent taxonomies.

3. Use Case

Next, we incorporate LLM technology into a well-established ontology development methodology. We decided to create a cloud service ontology by using Perplexity AI [13] within a modified Ontology Development 101 methodology [14]. Perplexity AI is connected to the open internet and can pull information from live websites. We go through all the steps of the Ontology Development 101 methodology and test how LLMs can be used within each step to create the cloud service ontology.

3.1. Define the Scope of the Ontology

The scope of the developed ontology is cloud services and their capabilities. Public cloud providers usually include this information in the documentation of their application programming interfaces. When building an ontology, we need to know what types of questions it should answer, for what purpose it is built, and what its domain and scope of application are. Here, we can ask an LLM about all possible scopes of a cloud service ontology and then choose, or manually refine, what best fits our ontology. We defined the following prompt and sent it to the Perplexity AI search form: “What can be the scope of cloud service ontology?” We received a lengthy and detailed answer that can be summarized as follows: the scope of cloud service ontologies encompasses service standardization, discovery, selection, resource management, interoperability, requirements engineering, and consumer-centric service negotiation, ultimately enhancing the efficiency and effectiveness of cloud computing environments.
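The same prompt can also be issued programmatically rather than through the web form. The following minimal Python sketch assumes Perplexity’s OpenAI-compatible chat completions endpoint; the endpoint URL, model name, and response layout shown here are assumptions for illustration, not the setup used in our experiments, which relied on the web interface.

```python
import requests

# Minimal sketch of prompting an LLM from a script (assumed OpenAI-compatible API).
API_URL = "https://api.perplexity.ai/chat/completions"  # assumed endpoint
API_KEY = "YOUR_API_KEY"  # placeholder; never hard-code real keys

payload = {
    "model": "sonar",  # assumed model identifier
    "messages": [
        {"role": "user",
         "content": "What can be the scope of cloud service ontology?"},
    ],
}
headers = {"Authorization": f"Bearer {API_KEY}"}

response = requests.post(API_URL, json=payload, headers=headers, timeout=60)
response.raise_for_status()
# OpenAI-compatible responses put the answer under choices[0].message.content.
print(response.json()["choices"][0]["message"]["content"])
```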

3.2. Reuse of the Existing Ontologies

The second step of the Ontology Development 101 methodology is to find similar existing ontologies and see whether they, or parts of them, can be reused in our ontology.
Perplexity AI builds on pretrained LLMs (Pro subscribers have access to GPT-4 Omni (GPT-4o), Claude 3.7 Sonnet, Gemini 2.0 Flash, GPT-4.5, and Grok-2) trained on a vast amount of data and documents, and it can additionally access the current versions of web resources, so it can help us find existing similar ontologies. Our next prompt to Perplexity AI was: “Please list existing ontologies of cloud service. If they are publicly available give me appropriate links to their source files”. The result includes the major cloud service ontologies described in Table 1. It is useful to examine these existing ontologies and reuse the concepts that fit the defined purpose of the new ontology we are developing.
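Once a reusable ontology such as CoCoOn has been identified, its concepts can be brought into the new ontology either by importing it or by subclassing its classes. The sketch below shows this pattern with the rdflib library; the namespace IRIs and class names are placeholders, since the exact IRIs must be taken from the source files returned by Perplexity AI.

```python
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/cloud-service-ontology#")  # our new ontology (placeholder IRI)
COCOON = Namespace("http://example.org/cocoon#")              # placeholder; use the real CoCoOn IRI

g = Graph()
g.bind("ex", EX)
g.bind("cocoon", COCOON)

# Declare our ontology and import the reused one so editors such as Protégé resolve it.
onto = URIRef("http://example.org/cloud-service-ontology")
g.add((onto, RDF.type, OWL.Ontology))
g.add((onto, OWL.imports, URIRef("http://example.org/cocoon")))

# Reuse an existing concept by subclassing it (class names are illustrative).
g.add((EX.GPUComputeService, RDF.type, OWL.Class))
g.add((EX.GPUComputeService, RDFS.subClassOf, COCOON.ComputeService))

print(g.serialize(format="turtle"))
```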

3.3. Enumeration of the Important Terms

The next step in the Ontology Development 101 methodology is to list all important terms needed to build the new ontology. We tried several prompts to retrieve important terms from the similar existing ontologies found via Perplexity AI, but we did not obtain good results; even recognizing the OWL classes in existing ontologies was a problem for Perplexity AI. We also tried this with Claude.AI, which provided better results, as it listed the main classes from the existing ontologies. Nevertheless, LLMs can be used here to obtain links to the most important available similar ontologies; we can then manually or semi-automatically extract the main terms used in those ontologies, for example with a short script such as the one sketched below. With Perplexity AI, we obtained meaningful answers when using the more general prompt: “What are the most important terms when defining cloud service ontology?”
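A minimal sketch of such a semi-automatic step follows: it loads a locally downloaded ontology file with rdflib and prints its named classes and labels as candidate terms. The file name is a placeholder for whichever source file was retrieved in the previous step.

```python
from rdflib import Graph, URIRef
from rdflib.namespace import OWL, RDF, RDFS

g = Graph()
# Placeholder path: a local copy of one of the retrieved ontologies (e.g., CoCoOn), RDF/XML syntax.
g.parse("existing_cloud_ontology.owl", format="xml")

# Enumerate named OWL classes and their labels as candidate terms for the new ontology.
for cls in g.subjects(RDF.type, OWL.Class):
    if isinstance(cls, URIRef):  # skip anonymous (blank-node) class expressions
        label = g.value(cls, RDFS.label)
        print(cls, "->", label)
```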

3.4. Definition of the Ontology’s Classes and Their Hierarchy

The next step is to build the class hierarchy of the ontology. This task can be completed by asking the LLM (in our case, Perplexity AI) to generate the ontology in OWL format; we can then simply open and inspect it in an ontology editor such as Protégé. This can be accomplished with a simple prompt, “Please create a class hierarchy of the ontology of cloud service in OWL format”, or with a complex prompt that includes a detailed description of the intended ontology’s domain, the cloud standards and main ontology terms we want to include, or the current definitions of a specific cloud provider’s application programming interfaces. When executing the simple prompt in Perplexity AI, we obtain an ontology that is mostly built on the existing CoCoOn ontology [15]. In the same chat session, we then ask, “Please add into this ontology concepts from other relating ontologies and cloud computing standards”, to create a more comprehensive cloud service ontology. In its answer, Perplexity AI provides the generated ontology and explains which existing ontologies and standards were used. If we want to include other documents or links, we can do so in subsequent prompts and improve and extend the ontology until we are satisfied. In the end, we will probably need to improve the ontology manually, but this is much faster than the classical approach in which all steps are done by hand. The initial version of the created cloud service ontology contained 39 OWL classes. The main class hierarchy is shown in Figure 1.
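The OWL file returned by Perplexity AI can simply be saved and opened in Protégé, but the same structure can also be rebuilt or post-processed in code. The sketch below constructs a small illustrative fragment of such a hierarchy with rdflib and writes it as RDF/XML; the class names are examples in the spirit of Figure 1, not the exact 39 classes generated in our session.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/cloud-service-ontology#")  # placeholder ontology IRI
g = Graph()
g.bind("ex", EX)

# Illustrative fragment of a cloud service class hierarchy (parent -> children).
hierarchy = {
    "CloudService": ["ComputeService", "StorageService", "NetworkService", "DatabaseService"],
    "ComputeService": ["VirtualMachine", "ContainerService", "ServerlessFunction"],
    "StorageService": ["ObjectStorage", "BlockStorage", "FileStorage"],
}

for parent, children in hierarchy.items():
    g.add((EX[parent], RDF.type, OWL.Class))
    for child in children:
        g.add((EX[child], RDF.type, OWL.Class))
        g.add((EX[child], RDFS.subClassOf, EX[parent]))

# Save as RDF/XML so the file can be opened and inspected in Protégé.
g.serialize(destination="cloud_service_ontology.owl", format="pretty-xml")
```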

3.5. Definition of the Ontology’s Data and Object Properties

In a similar manner, we can ask Perplexity AI to create data and object properties, or we can first ask it to create a complete ontology containing all types of OWL concepts (classes, object properties, and data properties). In our experiments, we asked Perplexity AI to “add more data and object properties to the created OWL ontology”. After this prompt, the generated ontology contained 23 object properties and 17 data properties.
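The sketch below shows how such properties can be expressed with rdflib; the property names, domains, and ranges are illustrative stand-ins, not the exact 23 object and 17 data properties generated by Perplexity AI.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS, XSD

EX = Namespace("http://example.org/cloud-service-ontology#")  # placeholder ontology IRI
g = Graph()
g.bind("ex", EX)

# Object property linking a service to the region it is deployed in (names illustrative).
g.add((EX.hasRegion, RDF.type, OWL.ObjectProperty))
g.add((EX.hasRegion, RDFS.domain, EX.CloudService))
g.add((EX.hasRegion, RDFS.range, EX.Region))

# Data properties describing pricing and capacity.
g.add((EX.pricePerHour, RDF.type, OWL.DatatypeProperty))
g.add((EX.pricePerHour, RDFS.domain, EX.CloudService))
g.add((EX.pricePerHour, RDFS.range, XSD.decimal))

g.add((EX.memorySizeGB, RDF.type, OWL.DatatypeProperty))
g.add((EX.memorySizeGB, RDFS.domain, EX.ComputeService))
g.add((EX.memorySizeGB, RDFS.range, XSD.integer))

print(g.serialize(format="turtle"))
```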

3.6. Creation of the Ontology’s Instances

Using the previously described prompts, Perplexity AI did not create any instances in the cloud service ontology, so we requested them with the following prompt: “Please create some OWL instances (individuals) for the generated ontology.” After this prompt, Perplexity AI created instances representing real-world cloud services from major providers, such as AWS, Azure, and Google Cloud Platform, and their different types of resources. Each instance includes appropriate data properties (such as pricing, CPU cores, and memory size) and object properties (relationships between resources). These instances demonstrate how the ontology can be populated with real-world cloud services and their characteristics. A total of 19 individuals were created using the above-described procedure.
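A minimal sketch of what such a populated individual looks like is given below; the identifier, price, and sizes are made-up illustrative values, not figures taken from any provider’s price list or from the 19 individuals generated in our session.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import OWL, RDF, XSD

EX = Namespace("http://example.org/cloud-service-ontology#")  # placeholder ontology IRI
g = Graph()
g.bind("ex", EX)

# Illustrative individual for a small virtual machine offering.
vm = EX.ExampleProvider_SmallVM
g.add((vm, RDF.type, OWL.NamedIndividual))
g.add((vm, RDF.type, EX.VirtualMachine))

# Data properties: pricing, CPU cores, memory size (values are invented for illustration).
g.add((vm, EX.pricePerHour, Literal("0.05", datatype=XSD.decimal)))
g.add((vm, EX.cpuCores, Literal(2, datatype=XSD.integer)))
g.add((vm, EX.memorySizeGB, Literal(4, datatype=XSD.integer)))

# Object property: relationship to another resource in the ontology.
g.add((vm, EX.hasRegion, EX.EuropeWest))

print(g.serialize(format="turtle"))
```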

4. Conclusions

This paper explored the application of Large Language Models (LLMs) for ontology development, focusing specifically on cloud service ontologies. We demonstrated how LLMs can be integrated into the established Ontology Development 101 methodology to streamline the ontology engineering process. Our experiments with Perplexity AI showed that LLMs can effectively assist in multiple phases of ontology development, including scope definition, identification of existing ontologies, class hierarchy generation, property creation, and instance population.
The resulting cloud service ontology integrated concepts from multiple standards and existing ontologies, creating a comprehensive semantic framework for describing cloud resources and services. While LLMs proved valuable in accelerating the development process, we observed certain limitations. LLMs struggled with enumerating important terms from existing ontologies and required multiple iterations of prompting to generate a complete ontology with all necessary elements.
Our findings suggest that LLMs are best positioned as assistants in the ontology development process rather than fully autonomous ontology creators. They significantly reduce the time and effort required for ontology development but still require human oversight and refinement. This semi-automated approach represents a promising middle ground between fully manual ontology engineering and completely automated approaches.
Future work could investigate in more detail how LLMs can be used for ontology evaluation. Additionally, comparing results from different LLM systems could provide more comprehensive insights, as different models might identify valuable ontological elements that others miss. A significant challenge for future research is developing methods to determine if a specific ontology was developed solely using LLMs, which has implications for scientific contribution claims in the field of ontology engineering.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study.

Conflicts of Interest

The author declares no conflicts of interest.

References

  1. Saeedizade, M.J.; Blomqvist, E. Navigating Ontology Development with Large Language Models. In The Semantic Web; Meroño Peñuela, A., Dimou, A., Troncy, R., Hartig, O., Acosta, M., Alam, M., Paulheim, H., Lisena, P., Eds.; Springer Nature: Cham, Switzerland, 2024; pp. 143–161.
  2. Babaei Giglou, H.; D’Souza, J.; Auer, S. LLMs4OL: Large Language Models for Ontology Learning. In The Semantic Web–ISWC 2023; Payne, T.R., Presutti, V., Qi, G., Poveda-Villalón, M., Stoilos, G., Hollink, L., Kaoudi, Z., Cheng, G., Li, J., Eds.; Springer Nature: Cham, Switzerland, 2023; pp. 408–427.
  3. Fathallah, N.; Das, A.; Giorgis, S.D.; Poltronieri, A.; Haase, P.; Kovriguina, L. NeOn-GPT: A Large Language Model-Powered Pipeline for Ontology Learning. In The Semantic Web: ESWC 2024 Satellite Events; Meroño Peñuela, A., Corcho, O., Groth, P., Simperl, E., Tamma, V., Nuzzolese, A.G., Poveda-Villalón, M., Sabou, M., Presutti, V., Celino, I., et al., Eds.; Springer Nature: Cham, Switzerland, 2025; pp. 36–50.
  4. Du, R.; An, H.; Wang, K.; Liu, W. A Short Review for Ontology Learning: Stride to Large Language Models Trend. arXiv 2024, arXiv:2404.14991.
  5. Zhao, Y.; Vetter, N.; Aryan, K. Using Large Language Models for OntoClean-based Ontology Refinement. arXiv 2024, arXiv:2403.15864.
  6. Wu, G.; Ling, C.; Graetz, I.; Zhao, L. Ontology extension by online clustering with large language model agents. Front. Big Data 2024, 7, 1463543.
  7. Amini, R.; Norouzi, S.S.; Hitzler, P.; Amini, R. Towards Complex Ontology Alignment Using Large Language Models. In Knowledge Graphs and Semantic Web; Tiwari, S., Villazón-Terrazas, B., Ortiz-Rodríguez, F., Sahri, S., Eds.; Springer Nature: Cham, Switzerland, 2025; pp. 17–31.
  8. Babaei Giglou, H.; D’Souza, J.; Engel, F.; Auer, S. LLMs4OM: Matching Ontologies with Large Language Models. arXiv 2024, arXiv:2404.10317.
  9. Snijder, L.L.; Smit, Q.T.S.; de Boer, M.H.T. Advancing Ontology Alignment in the Labor Market: Combining Large Language Models with Domain Knowledge. Proc. AAAI Symp. Ser. 2024, 3, 31208.
  10. Sousa, G.; Lima, R.; Trojahn, C. Complex Ontology Matching with Large Language Model Embeddings. arXiv 2025, arXiv:2502.13619.
  11. Tsaneva, S.; Vasic, S.; Sabou, M. LLM-driven Ontology Evaluation: Verifying Ontology Restrictions with ChatGPT. Semant. Web ESWC Satell. Events 2024, in press.
  12. Shah, C.; Bender, E.M.; Zamani, H.; Bota, H.; Awadallah, A.H.; Diaz, F.; Lalmas, M. Using Large Language Models to Generate, Validate, and Apply User Intent Taxonomies. arXiv 2024, arXiv:2309.13063.
  13. Zala, D. Perplexity AI Review: Research Tool That Doesn’t Hallucinate. Available online: https://dhruvirzala.com/perplexity-ai-review/ (accessed on 14 March 2025).
  14. Noy, N.; McGuinness, D. Ontology Development 101: A Guide to Creating Your First Ontology. Stanford Knowledge Systems Laboratory, 2001. Available online: http://protege.stanford.edu/publications/ontology_development/ontology101.pdf (accessed on 14 March 2025).
  15. Ali, A.; Shamsuddin, S.; Eassa, F. Ontology-based Cloud Services Representation. Res. J. Appl. Sci. Eng. Technol. 2014, 8, 83–94.
  16. Moscato, F.; Aversa, R.; Di Martino, B.; Fortis, T.-F.; Munteanu, V. An Analysis of mOSAIC ontology for Cloud Resources annotation. In Proceedings of the Federated Conference on Computer Science and Information Systems, Szczecin, Poland, 18–21 September 2011; pp. 973–980.
  17. Androcec, D.; Vrcek, N. Ontologies for Platform as Service APIs Interoperability. Cybern. Inf. Technol. 2016, 16, 4.
  18. Banditwattanawong, T.; Masdisornchote, M. Infrastructure-as-a-Service Ontology for Consumer-Centric Assessment. Adv. Sci. Technol. Eng. Syst. J. 2023, 8, 37–45.
  19. Rekik, M.; Boukadi, K.; Ben-Abdallah, H. Cloud description ontology for service discovery and selection. In Proceedings of the 10th International Joint Conference on Software Technologies (ICSOFT), Colmar, France, 20–22 July 2015; pp. 1–11.
  20. Ma, Y.B.; Jang, S.H.; Lee, J.S. Ontology-Based Resource Management for Cloud Computing. In Intelligent Information and Database Systems; Springer: Berlin/Heidelberg, Germany, 2011; pp. 343–352.
Figure 1. Main classes of the LLM-generated cloud service ontology.
Table 1. Retrieved existing cloud service ontologies and their descriptions.
Name of the Ontology | Brief Description of the Ontology
Cloud Computing Ontology (CoCoOn) [15] | An OWL-based ontology that defines functional and non-functional concepts, attributes, and relations of infrastructure services.
mOSAIC Cloud Ontology [16] | Aims to provide common access to cloud services and enable discovery in cloud federations.
PaaS API Ontology [17] | Focuses on remote operations of PaaS providers’ APIs and interoperability problems among different platform-as-a-service offers.
IaaS Ontology [18] | A consumer-centric ontology with 15 primary subclasses and 340 individual classes for IaaS assessment.
Cloud Description Ontology [19] | Designed for service discovery and selection in cloud federation environments.
Cloud Resource Ontology [20] | Developed by Y. Ma et al. for resource management in cloud environments.