IFC BIM Model Enrichment with Space Function Information Using Graph Neural Networks

Adam Buruzs, Milos Sipetic, Brigitte Blank-Landeshammer, Gerhard Zucker

Research output: Contribution to journal › Article › peer-review


The definition of room functions in Building Information Modeling (BIM) using IfcSpace entities is an important quality requirement that is often not fulfilled. This paper presents a three-step method for enriching open BIM representations based on Industry Foundation Classes (IFC) with room function information (e.g., kitchen, living room, foyer). In the first step, a geometric algorithm for detecting and defining IfcSpace entities and injecting them into IFC models is presented. After deriving the IfcSpace entities, a geometric method for calculating the graph of connections between spaces based on accessibility is described; this information is not explicitly stored in IFC models. In the final step, a graph convolution-based neural network that uses the accessibility graph to classify the IfcSpace entities is described. Local node features are automatically extracted from the geometry and neighboring elements. With the help of a Graph Convolutional Network (GCN), the connection and spatial context information is utilized by the neural network for the classification decision, in addition to the more commonly used local features of the spaces. To evaluate the classification accuracy, the model was tested on a set of residential building IFC models. A weighted variant of the standard GCN was also implemented and tested, yielding a slight improvement in classification accuracy.
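The weighted graph-convolution step described in the abstract can be illustrated with a minimal sketch. The following is a self-contained plain-Python illustration, assuming the common symmetrically normalised propagation rule D^{-1/2}(A + I)D^{-1/2} H W; the function name, the toy graph, and the optional edge weights (e.g. a door-based connection strength) are illustrative assumptions, not the paper's actual implementation.

```python
import math

def gcn_layer(features, edges, weight, edge_weights=None):
    """One GCN propagation step over a room accessibility graph.

    features: list of per-space feature vectors (e.g. area, neighbour counts)
    edges: list of (i, j) undirected accessibility connections between spaces
    weight: trainable linear transform as an in_dim x out_dim list of lists
    edge_weights: optional dict (i, j) -> weight, e.g. door-based strength
    """
    n = len(features)
    # Build the adjacency matrix with self-loops (A + I), optionally weighted.
    adj = [[0.0] * n for _ in range(n)]
    for i in range(n):
        adj[i][i] = 1.0
    for (i, j) in edges:
        w = edge_weights.get((i, j), 1.0) if edge_weights else 1.0
        adj[i][j] = w
        adj[j][i] = w
    # Symmetric normalisation: D^{-1/2} (A + I) D^{-1/2}.
    deg = [sum(row) for row in adj]
    for i in range(n):
        for j in range(n):
            adj[i][j] /= math.sqrt(deg[i] * deg[j])
    # Aggregate neighbour features, apply the linear transform, then ReLU.
    out = []
    for i in range(n):
        agg = [sum(adj[i][j] * features[j][k] for j in range(n))
               for k in range(len(features[0]))]
        row = [max(0.0, sum(agg[k] * weight[k][c] for k in range(len(agg))))
               for c in range(len(weight[0]))]
        out.append(row)
    return out
```

Stacking such layers lets each space's classification draw on features of spaces one hop away per layer, which is how the connection and spatial context information enters the decision alongside the local features.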
Original language: English
Pages (from-to): 2937.1-2937.12
Number of pages: 1
Issue number: 15
Publication status: Published - 2022

Research Field

  • Digitalisation and HVAC Technologies in Buildings


Keywords

  • BIM; IFC; architecture model enrichment; machine learning; IfcSpace


