Producing the materials needed to meet the demand for new buildings has a significant environmental impact. Meanwhile, resources are lost every year during building demolitions, and almost thirty percent of the waste generated in the European Union comes from the construction industry. This project develops an innovative digital decision support system, tested on four demonstrators, that integrates various digital tools and processes (3D scanning, imagery localization and point cloud classification) to help define the most sustainable and economical deconstruction and reuse strategy for a building. The results show that it is possible to quantify, qualify and map the flows of products and materials from buildings undergoing renovation, redevelopment or deconstruction, and to assess their potential for reuse and recycling even before work begins.
Developed platform
In order to use the resources available in existing buildings, the first step is to identify them. The aim of this research is therefore to characterize and classify demolition waste on pre-demolition sites. Several companies already operate in the deconstruction sector; our added value is embedding technology in this process to produce a more accurate and reliable material report. The potential customers of this service are building owners, architects, and construction and demolition contractors.
The report data is displayed through a user interface. The goal is a material research tool that shows the amount of each material, its value, where it is located, and images related to it. It should also provide information for coordinating the deconstruction actions. The current interface focuses on clearly presenting the documentation, spatial features, and material analysis of a demolition site. Further functionality would include deeper integration with planning the demolition phase over time, both from a physical standpoint (the logical and structural order in which elements will be removed) and from a data standpoint (integration with BIM and project planning software).
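As an illustration of the kind of record such a tool could expose, the sketch below defines a minimal material entry; the field names, units and types are assumptions for illustration, not the project's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class MaterialEntry:
    """One row of the material report shown in the user interface.
    Field names and units are illustrative assumptions, not the project's data model."""
    material: str                    # e.g. "brick", "timber"
    quantity_m3: float               # estimated volume in cubic meters
    estimated_value_eur: float       # estimated reuse / resale value
    location_xyz: tuple              # centroid in the site point cloud
    image_paths: list = field(default_factory=list)   # photos documenting the material

# Hypothetical entry for reclaimed brick
entry = MaterialEntry("brick", 12.4, 800.0, (3.2, 1.1, 0.0), ["img/brick_001.jpg"])
print(entry)
```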
Process
First, second, and third iterations.
First, we inspect the demolition site; second, we use the gathered data to perform a geometric reconstruction; third, we classify and locate the materials; and finally, we process the information and present it in a user interface.
Step 1 | Building Inspection
Drones capture images of the site in order to digitize it. We use drones instead of ground robots because they can operate over rough terrain, reach greater heights, and also allow exterior inspections.
Step 2 | Geometric Reconstruction
The exploration flight has two desired outputs: an octomap from ORB-SLAM2, used for a subsequent autonomous flight, and a colored dense point cloud from photogrammetry. The point cloud is used to produce a 3D environment in which all acquired data is shown.
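As a rough illustration of this step, the sketch below uses Open3D as a stand-in (the project relies on ORB-SLAM2 and photogrammetry tooling) to turn a dense colored point cloud into a coarse occupancy octree of the kind a subsequent autonomous flight could plan against. The file name and voxel sizes are assumptions.

```python
import open3d as o3d

# A sketch only: the project derives its octomap from ORB-SLAM2 and its dense
# cloud from photogrammetry; Open3D stands in here to show how a colored point
# cloud can become a coarse occupancy octree for flight planning.
pcd = o3d.io.read_point_cloud("site_photogrammetry.ply")   # assumed file name

# Downsample so the octree reflects occupancy rather than raw sensor density.
pcd_down = pcd.voxel_down_sample(voxel_size=0.05)          # 5 cm voxels (assumed)

# Occupied leaves approximate the free/occupied space the second flight plans against.
octree = o3d.geometry.Octree(max_depth=8)
octree.convert_from_point_cloud(pcd_down, size_expand=0.01)

o3d.visualization.draw_geometries([octree])
```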
The environment is processed in CloudCompare to extract the architectural and structural elements from the point cloud. Once all elements exist as geometry files, the surfaces or meshes are imported into Grasshopper for further cleaning. The output of the whole process is a set of quad mesh elements labeled ceiling, beams, columns, floor, and wall.
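The element extraction itself is done interactively in CloudCompare and Grasshopper; as a hedged sketch of a comparable automated step, the snippet below peels the dominant planar surfaces off a point cloud with RANSAC in Open3D and labels them crudely by orientation.

```python
import open3d as o3d

# Sketch of an automated alternative to the manual CloudCompare step: peel off
# the dominant planar surfaces with RANSAC and label them crudely by orientation.
pcd = o3d.io.read_point_cloud("site_photogrammetry.ply")   # assumed file name
remaining = pcd
elements = []

for _ in range(5):  # number of planes to extract is an assumption
    plane, inliers = remaining.segment_plane(distance_threshold=0.02,
                                             ransac_n=3,
                                             num_iterations=1000)
    a, b, c, d = plane  # plane equation ax + by + cz + d = 0
    label = "floor/ceiling" if abs(c) > 0.9 else "wall"  # near-horizontal vs. vertical
    elements.append((label, remaining.select_by_index(inliers)))
    remaining = remaining.select_by_index(inliers, invert=True)

for label, cloud in elements:
    print(label, len(cloud.points), "points")
```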
Step 3 | Classification and Location
From the imagery of the first step, we annotate the locations of the relevant materials. A classification algorithm is applied to a grid of smaller patches of each image, and the patches are colorized according to their predicted category. Localizing the materials involves training the algorithm, performing the material classification, and finally the material localization.
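A minimal sketch of the patch-grid pass is shown below: it tiles an inspection photo, classifies each tile, and tints it with the category color. The classify_patch function, file names, color table, and patch size are placeholders; the trained classifier itself is described in the paragraphs that follow.

```python
import cv2

# classify_patch, file names, and the patch size are placeholders; the trained
# BRISK bag-of-words classifier itself is described in the paragraphs below.
COLORS = {"brick": (0, 0, 255), "timber": (0, 255, 255), "none": (128, 128, 128)}

def classify_patch(patch):
    return "none"  # placeholder for the trained model

img = cv2.imread("inspection_photo.jpg")    # assumed file name
patch = 128                                 # patch size in pixels (assumed)
overlay = img.copy()

# Classify each grid cell and fill it with its category color.
for y in range(0, img.shape[0] - patch, patch):
    for x in range(0, img.shape[1] - patch, patch):
        label = classify_patch(img[y:y + patch, x:x + patch])
        cv2.rectangle(overlay, (x, y), (x + patch, y + patch), COLORS[label], -1)

# Blend the color grid over the photo so the material regions stay readable.
cv2.imwrite("classified_overlay.jpg", cv2.addWeighted(overlay, 0.4, img, 0.6, 0))
```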
The image classifier is trained with one set of images per category, including both close-ups that capture the texture of the material and views of it as part of various building elements that capture its shape. A 'none' category, containing imagery of people and other objects commonly found on a demolition site, is also included to avoid false positives.
Classification starts with a mathematical description of the regions around certain points of interest, or features. In this case, the algorithm finds these features at the 'corners' between edges in image brightness. The descriptor algorithm (BRISK) then performs a series of simple binary comparisons of brightness in a pattern around each feature and combines them into a binary string. Finally, these descriptors are clustered to obtain a limited set of 'codewords' that can describe our images, and a model is trained to associate each of our categories with a characteristic histogram of these codewords.
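The sketch below approximates this pipeline with OpenCV and scikit-learn: BRISK descriptors are clustered into a codebook, each image becomes a codeword histogram, and a classifier is trained on those histograms. The project's exact parameters and model are not documented here, and clustering float-cast binary descriptors with k-means is a simplification (a Hamming-distance clustering would be more faithful to BRISK).

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

brisk = cv2.BRISK_create()

def brisk_descriptors(path):
    # Binary BRISK descriptors, one row per detected keypoint.
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = brisk.detectAndCompute(gray, None)
    return desc

def bow_histogram(desc, codebook):
    # Assign each descriptor to its nearest codeword and build a normalized histogram.
    words = codebook.predict(desc.astype(np.float32))
    hist, _ = np.histogram(words, bins=np.arange(codebook.n_clusters + 1))
    return hist / max(hist.sum(), 1)

def train(train_paths, train_labels, n_words=200):
    # train_paths / train_labels are assumed lists of image paths and categories
    # (e.g. "brick", "timber", "none"); every image is assumed to yield keypoints.
    descs = [brisk_descriptors(p) for p in train_paths]
    codebook = KMeans(n_clusters=n_words, n_init=10).fit(
        np.vstack(descs).astype(np.float32))
    features = [bow_histogram(d, codebook) for d in descs]
    model = SVC(kernel="rbf").fit(features, train_labels)
    return codebook, model

def classify(path, codebook, model):
    # Predict the material category of a new image.
    return model.predict([bow_histogram(brisk_descriptors(path), codebook)])[0]
```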
Step 4 | Material Report and Detailed Inspection
The last step is to gather all the information generated in the previous steps and assemble it for display in the user interface. Using CloudCompare and the images with the material locations, we build the point cloud and the colored mesh.
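As a hedged sketch of the coloring step, and assuming a per-point material label has already been obtained by projecting the image classification onto the cloud, the snippet below tints the point cloud by material with Open3D; the file names and color table are assumptions.

```python
import numpy as np
import open3d as o3d

# Assumes a per-point material label has already been produced by projecting the
# image classification onto the cloud; file names and the color table are assumptions.
COLORS = {"brick": [0.8, 0.2, 0.2], "timber": [0.9, 0.8, 0.3], "none": [0.6, 0.6, 0.6]}

pcd = o3d.io.read_point_cloud("site_photogrammetry.ply")        # assumed file name
labels = np.load("per_point_labels.npy", allow_pickle=True)     # assumed: one label per point

# Tint every point with its material color so the interface can show material locations.
pcd.colors = o3d.utility.Vector3dVector(np.array([COLORS[str(l)] for l in labels]))
o3d.io.write_point_cloud("material_colored.ply", pcd)
```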
Once this colored mesh is built, an important piece of information that we can obtain from it is the set of waypoints for a second flight. If the data is ambiguous and more information is needed to build a reliable report, we can send the drone to capture it in an autonomous flight. This second flight uses the information collected from the first one (octomap + waypoints).
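A minimal sketch of how such waypoints could be derived is shown below: points flagged as uncertain are clustered and a camera position is backed off from each cluster. The stand-off distance and cluster count are assumptions, and a real planner would also check the octomap for collision-free positions.

```python
import numpy as np
from sklearn.cluster import KMeans

# Stand-off distance and cluster count are assumptions; a real planner would also
# check the octomap for collision-free, reachable camera positions.
def waypoints_from_uncertain_points(uncertain_xyz, n_waypoints=8, standoff=1.5):
    centers = KMeans(n_clusters=n_waypoints, n_init=10).fit(uncertain_xyz).cluster_centers_
    centroid = uncertain_xyz.mean(axis=0)
    directions = centers - centroid
    directions[:, 2] = 0.0                                  # keep the stand-off horizontal
    norms = np.linalg.norm(directions, axis=1, keepdims=True)
    directions = directions / np.where(norms == 0, 1, norms)
    return centers + standoff * directions                  # one camera waypoint per cluster

# Example with synthetic "needs another look" points
pts = np.random.rand(200, 3) * np.array([10.0, 10.0, 3.0])
print(waypoints_from_uncertain_points(pts))
```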
Matter Site is a project of IaaC, Institute for Advanced Architecture of Catalonia, developed in the Master in Robotics and Advanced Construction in 2019-2020 by students Anna Batallé, Irem Yagmur Cebeci, Matthew Gordon and Roberto Vargas; faculty: Aldo Sollazzo and Daniel Serrano.
