Bridging the Semantic Gap in BIM Interior Design: A Neuro-Symbolic Framework for Explainable Scene Completion
Abstract
1. Introduction
- (1) Neuro-symbolic style–logic alignment for BIM interior recommendation: a unified framework bridging visual aesthetics and explicit design logic to reduce the style–logic semantic gap.
- (2) Automated and scalable design-logic extraction: a data-driven pipeline (Faster R-CNN + FP-Growth) to mine tacit matching patterns from successful design cases and formalize them as a knowledge graph.
- (3) Context-aware scene completion via knowledge-guided fusion: a scene-level recommendation strategy that dynamically weights object influences under global logical rules.
- (4)
2. Related Work
2.1. BIM-Oriented Object Library Retrieval and Design Automation
2.2. Visual Compatibility Learning and Scene-Level Interior Modeling
2.3. Knowledge Graphs, Neuro-Symbolic AI, and Explainable Recommendation
3. Problem Formulation and System Architecture
3.1. Mathematical Problem Formulation
- a. Furniture Style Representation:
- b. Logical Constraints as Probabilistic Weights:
- c. The Recommendation Objective (a compact formalization consistent with Algorithm 1 is sketched below):
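To make the three components concrete, the following is a minimal sketch of one formalization consistent with Algorithm 1, not a verbatim reproduction of the paper's equations: the scene style vector is assumed to be a weight-normalized aggregation (in the spirit of Equation (8)) and the objective a top-K cosine-similarity ranking (Equation (9)); the symbols v(p_j), w_j, v(g_i), and B(P, K) follow the notation used later in Algorithm 1.

```latex
% Sketch of a formalization consistent with Algorithm 1 (assumed normalization; the paper's Equation (8) may differ).
\[
  v(P) \;=\; \frac{\sum_{j=1}^{m} w_j \, v(p_j)}{\sum_{j=1}^{m} w_j},
  \qquad
  s_i \;=\; \frac{v(P) \cdot v(g_i)}{\lVert v(P) \rVert \, \lVert v(g_i) \rVert},
  \qquad
  B(P, K) \;=\; \text{the } K \text{ candidates } g_i \in G \text{ with the largest } s_i .
\]
```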
3.2. The Neuro-Symbolic System Architecture
- (1) Visual Path: It retrieves (or computes) the style vectors v(pi) of all existing furniture from BIM-derived 2D renderings/projections of the corresponding 3D BIM families and scene objects, and accesses them from the vector database.
- (2) Logical Path: It queries the Knowledge Graph to determine the dynamic attention weights w from the relationships between the existing furniture categories and the target furniture type (a minimal lookup sketch follows this list).
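As a concrete illustration of the logical path (not part of the original system), the sketch below models the FMKG as a plain dictionary keyed by (existing category, target category) pairs whose values are association-rule confidences used as attention weights. The function name `search_rule`, the fallback weight, and the dictionary representation are hypothetical stand-ins for the actual graph store; the example confidences are taken from Appendix A.

```python
# Minimal sketch of the Logical Path lookup; the real system queries a knowledge graph,
# while this toy version uses a dictionary of rule confidences (values from Appendix A).
FMKG_EDGES = {
    ("table", "chair"): 0.703,    # {table} -> {chair}, confidence 70.3%
    ("sofa", "chair"): 0.408,     # {sofa} -> {chair}, confidence 40.8%
    ("cabinet", "chair"): 0.628,  # {cabinet} -> {chair}, confidence 62.8%
    ("bed", "chair"): 0.430,      # {bed} -> {chair}, confidence 43.0%
}

DEFAULT_WEIGHT = 0.1  # hypothetical fallback when no rule links the two categories


def search_rule(existing_category: str, target_category: str) -> float:
    """Return the dynamic attention weight w for an existing furniture category
    with respect to the target furniture type (SearchRule in Algorithm 1)."""
    return FMKG_EDGES.get((existing_category, target_category), DEFAULT_WEIGHT)


if __name__ == "__main__":
    # Weighting a living-room context when the user requests a chair.
    for category in ["table", "sofa", "cabinet", "bed"]:
        print(category, "->", search_rule(category, "chair"))
```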
4. Methods
4.1. Visual-Semantic Knowledge Extraction and Graph Construction
4.1.1. Visual Perception via Deep Learning
4.1.2. Mining Tacit Design Logic
- Frequency Quantification: In a first traversal of the original dataset, we count the occurrences of each furniture category (the 1-item furniture itemsets). To filter out noise and retain statistically significant design elements, we set a minimum support threshold of Smin = 0.18.
- Header Table Creation: The frequent furniture items are sorted in descending order of support to generate the item header table.
- Tree Construction: In a second traversal, we reorder the items in each transaction record according to the header table. A root node is created (default count 1), and items from each record are inserted into the tree as branches. If an item shares the same path as an existing record, its node count is incremented; otherwise, a new branch is formed, linked by node pointers to maintain structural connectivity.
- Path Extraction: For each specific furniture item, we trace all prefix paths in the FP-tree that terminate at that item. These prefix paths, each weighted by the count of the terminating item node (the minimum count along the path), constitute the Conditional Pattern Base, which represents the specific contextual environment for that furniture.
- Contextual Pruning: For instance, as shown in Figure 4, for the item “sofa,” the FP-tree reveals three distinct design paths: {chair:1, bed:1, table:1, sofa:1}, {chair:1, sofa:1}, and {table:1, sofa:1}. Consequently, the conditional pattern base for “sofa” is derived as {{chair, bed, table}:1, {chair}:1, {table}:1}.
- Conditional FP-Tree Generation: We merge nodes within this base and prune items that fall below the support threshold. For example, if “bed” has a cumulative count of 1 (below the threshold), it is removed. This yields a pruned Conditional FP-Tree for the sofa (e.g., branches <chair:2, table:1> and <table:1>). Applying this procedure recursively yields the frequent itemsets and their respective support counts for all furniture categories (a minimal mining sketch follows this list).
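The sketch below reproduces this mining step with the off-the-shelf FP-Growth implementation in mlxtend rather than the paper's own FP-tree code; the toy transactions and the confidence threshold are illustrative placeholders, while the 0.18 minimum support matches the value stated above.

```python
# Illustrative FP-Growth mining of furniture co-occurrence patterns.
# mlxtend stands in for the paper's own FP-tree implementation; transactions are toy room inventories.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import fpgrowth, association_rules

# Each transaction is the set of furniture categories detected in one design case
# (in the paper these come from Faster R-CNN detections on rendered scenes).
transactions = [
    ["chair", "bed", "table", "sofa"],
    ["chair", "sofa"],
    ["table", "sofa"],
    ["table", "chair", "cabinet"],
    ["bed", "cabinet"],
]

encoder = TransactionEncoder()
onehot = pd.DataFrame(encoder.fit_transform(transactions), columns=encoder.columns_)

# Frequent itemsets at the stated minimum support S_min = 0.18.
frequent = fpgrowth(onehot, min_support=0.18, use_colnames=True)

# Association rules of the form {A} -> {B}; the 0.4 confidence threshold is a placeholder.
rules = association_rules(frequent, metric="confidence", min_threshold=0.4)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```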
4.2. Furniture Recommendation Using Scene Style Learning
| Algorithm 1. Recommendation algorithm for BIM scene completion |
| Input: BIM furniture style vector library G = {v(gi) | i = 1, 2, …, n}; |
| BIM scene context P = {pj | j = 1, 2, …, m} |
| Output: Top-K harmonious furniture list B(P, K) |
| 1: S = [] // list of (candidate, similarity) pairs |
| 2: t = target furniture type required by the user |
| 3: // Step 1: compute the scene style vector |
| 4: for each item pj in P: |
| 5:   wj = SearchRule(T(pj), t) // query the attention weight from the FMKG |
| 6: v(P) = calculate({wj}, {v(pj)}) // weighted aggregation of all v(pj), Equation (8) |
| 7: // Step 2: rank candidates |
| 8: for each candidate gi in G: |
| 9:   si = CosineSimilarity(v(P), v(gi)) // Equation (9) |
| 10:   add the pair (gi, si) to S |
| 11: Sort S in descending order of si |
| 12: B = {items from S[0], …, S[K − 1]} |
| 13: return B |
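A minimal NumPy sketch of Algorithm 1 follows, assuming the style vectors are already extracted as fixed-length arrays and that `search_rule` is the FMKG lookup sketched earlier. The function names `scene_style_vector` and `recommend`, and the weight-normalized sum used for Equation (8), are assumptions for illustration, not the paper's exact implementation.

```python
# Minimal sketch of Algorithm 1 (assumed weighted-average form of Equation (8);
# the paper's exact aggregation may differ).
from typing import Dict, List, Tuple
import numpy as np


def scene_style_vector(scene: List[Tuple[str, np.ndarray]],
                       target_type: str,
                       search_rule) -> np.ndarray:
    """Step 1: fuse the style vectors of existing furniture, weighted by FMKG rules."""
    weights = np.array([search_rule(category, target_type) for category, _ in scene])
    vectors = np.stack([vector for _, vector in scene])
    return (weights[:, None] * vectors).sum(axis=0) / weights.sum()


def recommend(library: Dict[str, np.ndarray],
              scene: List[Tuple[str, np.ndarray]],
              target_type: str,
              search_rule,
              k: int = 3) -> List[Tuple[str, float]]:
    """Step 2: rank candidate furniture by cosine similarity to the scene style vector."""
    v_p = scene_style_vector(scene, target_type, search_rule)
    scores = []
    for name, v_g in library.items():
        similarity = float(np.dot(v_p, v_g) /
                           (np.linalg.norm(v_p) * np.linalg.norm(v_g)))
        scores.append((name, similarity))
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:k]
```

In this sketch, the candidates whose style vectors are closest (by cosine similarity) to the fused scene vector are returned first, mirroring the descending sort in step 11 of Algorithm 1.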
5. Experiments
5.1. Experimental Setup
5.2. Results and Mechanism Analysis
5.2.1. Efficacy of Visual-Semantic Feature Extraction
5.2.2. Comparative Analysis: Global Context vs. Local Features
5.2.3. Robustness Across Diverse Scenarios
5.2.4. In-Authoring Explainability Visualization in Revit
6. Discussion
6.1. Addressing the “Semantic Gap” in BIM Interior Assistance
6.2. Interpreting the Empirical Gains: Why Neuro-Symbolic Fusion Helps
6.3. Representation Choices: Gram-Matrix Style Features and Credible Published Alternatives
6.4. Robustness of FMKG Construction: Thresholds, Sparsity, and Normalized Association Measures
6.5. Scope, Failure Modes, and Extensions That Preserve the Paper’s Core Framing
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
| Number | Association Rules | Confidence (%) |
|---|---|---|
| 1 | {sofa} → {chair} | 40.8 |
| 2 | {cabinet} → {sofa} | 44.2 |
| 3 | {sofa} → {cabinet} | 47.0 |
| 4 | {sofa} → {bed} | 54.9 |
| 5 | {sofa} → {table} | 67.3 |
| 6 | {table} → {sofa} | 53.5 |
| 7 | {cabinet} → {table} | 57.0 |
| 8 | {bed} → {sofa} | 41.2 |
| 9 | {bed} → {chair} | 43.0 |
| 10 | {chair} → {bed} | 46.1 |
| 11 | {table} → {cabinet} | 48.2 |
| 12 | {cabinet} → {chair} | 62.8 |
| 13 | {chair} → {cabinet} | 53.7 |
| 14 | {bed} → {cabinet} | 51.5 |
| 15 | {cabinet} → {bed} | 64.6 |
| 16 | {table} → {chair} | 70.3 |
| 17 | {chair} → {table} | 71.1 |
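For reference, the confidence values above follow the standard association-rule definition over the mined design cases; for example, rule 16 states that 70.3% of the cases containing a table also contain a chair:

```latex
\[
  \mathrm{conf}(A \rightarrow B) \;=\; \frac{\mathrm{supp}(A \cup B)}{\mathrm{supp}(A)},
  \qquad \text{e.g.} \qquad
  \mathrm{conf}(\{\text{table}\} \rightarrow \{\text{chair}\})
  \;=\; \frac{\mathrm{supp}(\{\text{table}, \text{chair}\})}{\mathrm{supp}(\{\text{table}\})}
  \;=\; 70.3\%.
\]
```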
References






[Table: the ten BIM test scenes, No. 1–10; scene thumbnails not reproduced.]
Top-3 recommendation scores per target category for the example BIM scene (recommendation thumbnails omitted):

| Results (BIM Scene) | Chair | Table | Sofa | Bed | Cabinet |
|---|---|---|---|---|---|
| Rank 1 | 0.9069 | 0.8851 | 0.9343 | 0.9462 | 0.8934 |
| Rank 2 | 0.9009 | 0.8694 | 0.9327 | 0.9387 | 0.8734 |
| Rank 3 | 0.8923 | 0.8664 | 0.9265 | 0.9304 | 0.8730 |
| Results (BIM Scene) | Chair | Table | Sofa | Bed | Cabinet |
|---|---|---|---|---|---|
| Rank 1 | 1 | 1 | 1 | 1 | 1 |
| Rank 2 | 1 | 1 | 1 | 1 | 1 |
| Rank 3 | 0 | 0 | 1 | 1 | 1 |

(Recommendation thumbnails omitted.)
| Results (Furniture) | Chair | Table | Sofa | Bed | Cabinet |
|---|---|---|---|---|---|
| Rank 1 | 1 | 1 | 1 | 0 | 1 |
| Rank 2 | 1 | 1 | 0 | 0 | 0 |
| Rank 3 | 0 | 0 | 1 | 1 | 1 |

(Recommendation thumbnails omitted.)
| Results (Furniture) | Chair | Table | Sofa | Bed | Cabinet |
|---|---|---|---|---|---|
| Rank 1 | 0 | 1 | 1 | 1 | 0 |
| Rank 2 | 0 | 0 | 1 | 1 | 1 |
| Rank 3 | 0 | 1 | 1 | 0 | 0 |

(Recommendation thumbnails omitted.)
Precision and recall by reference style. (In the original layout, the x.1–x.3 columns were grouped under “Furniture” and the x columns under “Scene”.)

| Reference style | Precision (%) | Recall (%) |
|---|---|---|
| 1.1 (Furniture) | 53.3 | 30.8 |
| 1.2 (Furniture) | 60.0 | 34.6 |
| 1 (Scene) | 86.7 | 50.0 |
| 2.1 (Furniture) | 53.3 | 34.8 |
| 2.2 (Furniture) | 46.7 | 30.4 |
| 2 (Scene) | 80.0 | 52.2 |
| 3.1 (Furniture) | 40.0 | 25.0 |
| 3.2 (Furniture) | 60.0 | 37.5 |
| 3 (Scene) | 66.7 | 41.7 |
| 4.1 (Furniture) | 33.3 | 23.8 |
| 4.2 (Furniture) | 53.3 | 38.1 |
| 4 (Scene) | 60.0 | 42.9 |
| 5.1 (Furniture) | 46.7 | 33.3 |
| 5.2 (Furniture) | 60.0 | 42.9 |
| 5 (Scene) | 80.0 | 57.1 |
| 6.1 (Furniture) | 33.3 | 22.7 |
| 6.2 (Furniture) | 46.7 | 31.8 |
| 6 (Scene) | 53.3 | 36.4 |
| 7.1 (Furniture) | 53.3 | 32.0 |
| 7.2 (Furniture) | 53.3 | 32.0 |
| 7 (Scene) | 73.3 | 44.0 |
| 8.1 (Furniture) | 46.7 | 31.8 |
| 8.2 (Furniture) | 46.7 | 31.8 |
| 8 (Scene) | 66.7 | 45.5 |
| 9.1 (Furniture) | 60.0 | 32.1 |
| 9.2 (Furniture) | 46.7 | 25.0 |
| 9.3 (Furniture) | 40.0 | 21.4 |
| 9 (Scene) | 86.7 | 46.4 |
| 10.1 (Furniture) | 53.3 | 26.1 |
| 10.2 (Furniture) | 46.7 | 30.4 |
| 10.3 (Furniture) | 33.3 | 21.7 |
| 10 (Scene) | 66.7 | 43.5 |











































































