
Design exploration of quantitative performance and geometry typology for indoor arena based on self-organizing map and multi-layered perceptron neural network

Year  2020 

Author(s)  Wang Pan, Yimin Sun, Michela Turrin, Christian Louter, Sevil Sariyildiz 

Link  https://repository.tudelft.nl/islandora/object/uuid:4ef2a27b-5668-4242-8757-e1416de4aaf0?collection=research  

ML Tags

Surrogate Modelling

Neural Networks

Self-Organizing Maps

Local Linear Mapping

Topic Tags

Architectural Design and Building

> Software & Plug-ins Used


  • Rhinoceros, Grasshopper, Karamba for Indoor Arena Generator (IAG) 

  • Matlab (programming) toolbox for Self-Organizing Maps (SOM) 

  • Matlab (programming) custom code for Local Linear Map (LLM) 

  • Matlab (programming) multi-layer perceptron neural network (MLPNN) toolbox for feedforward neural network 

> Workflow


LLM-SOM  


Thesis Report Figure 4: The workflow of SOM-LLM


MLPNN-SOM 

Thesis Report Figure 5: The workflow of SOM-MLPNN

> Summary


The approach comprises two methods that share steps 1, 2, 3, 5, and 6. Only step 4 differs between them, namely in the ML technique used: LLM in one method and MLPNN in the other. 


Step 1: A parametric model generates design options by varying its parameters 


Step 2: A SOM is used to organize the generated geometries  
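The thesis uses Matlab's SOM toolbox; as an illustration of the idea in step 2, the minimal NumPy sketch below trains a small SOM on toy "design parameter" vectors and assigns each design to its best-matching unit (BMU). The grid size, decay schedules, and data are all illustrative choices, not values from the thesis.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, sigma0=1.5, seed=0):
    """Train a minimal Self-Organizing Map on the row vectors in `data`."""
    rng = np.random.default_rng(seed)
    h, w = grid
    # One prototype (weight) vector per grid node.
    weights = rng.random((h * w, data.shape[1]))
    # Grid coordinates, used by the neighbourhood function.
    coords = np.array([(i, j) for i in range(h) for j in range(w)], dtype=float)
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)        # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)  # shrinking neighbourhood radius
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            theta = np.exp(-d2 / (2 * sigma ** 2))  # neighbourhood weighting
            weights += lr * theta[:, None] * (x - weights)
    return weights, coords

def bmu_index(weights, x):
    """Cluster assignment: index of the best-matching unit for sample x."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))

# Toy geometry parameters: two well-separated groups of design vectors.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.2, 0.05, (20, 3)),
                  rng.normal(0.8, 0.05, (20, 3))])
weights, coords = train_som(data)
a = bmu_index(weights, data[0])
b = bmu_index(weights, data[-1])
print(a != b)  # the two groups land on different SOM nodes
```

The BMU index is what later steps rely on: it is the cluster label that groups similar geometries together on the map.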


Step 3: Design of Experiments is used to obtain labelled data from the inputs: the performance of each geometry is obtained through simulation, and the simulation result becomes the label of that data point. Additional design options are generated using the SOM clustering information 


Step 4 (LLM-SOM): A data approximation model is trained with LLM to predict the performance data of the design alternatives. The LLM approximates the performance data from the distribution of the input data and the reference data; therefore, there is no validation process. 
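One common form of a Local Linear Map stores, for each reference node, a local linear model around its prototype and predicts with the winning node's model. The sketch below illustrates that idea with plain NumPy least squares on a toy linear performance indicator; the grid of prototypes stands in for trained SOM nodes, and none of the names or numbers come from the thesis.

```python
import numpy as np

def fit_llm(X, y, prototypes):
    """For each reference node, fit an intercept and local slope by least squares."""
    n_nodes, dim = prototypes.shape
    b = np.zeros(n_nodes)
    A = np.zeros((n_nodes, dim))
    # Winner-takes-all assignment of each sample to its nearest prototype.
    win = np.argmin(((X[:, None, :] - prototypes[None]) ** 2).sum(axis=2), axis=1)
    for k in range(n_nodes):
        mask = win == k
        if mask.sum() < dim + 1:
            continue  # too few samples for a stable local fit
        dx = X[mask] - prototypes[k]
        G = np.hstack([np.ones((mask.sum(), 1)), dx])  # [intercept, slopes]
        coef = np.linalg.lstsq(G, y[mask], rcond=None)[0]
        b[k], A[k] = coef[0], coef[1:]
    return b, A

def predict_llm(x, prototypes, b, A):
    """Predict with the winning node's local model: b_k + A_k @ (x - w_k)."""
    k = np.argmin(((prototypes - x) ** 2).sum(axis=1))
    return b[k] + A[k] @ (x - prototypes[k])

# Toy example: a performance indicator that is linear in two design parameters.
rng = np.random.default_rng(0)
X = rng.random((200, 2))
y = 3 * X[:, 0] - 2 * X[:, 1]
grid = np.linspace(0.17, 0.83, 3)
protos = np.array([(u, v) for u in grid for v in grid])  # stand-in for SOM nodes
b, A = fit_llm(X, y, protos)
pred = predict_llm(np.array([0.5, 0.5]), protos, b, A)
print(round(float(pred), 3))  # -> 0.5, i.e. 3*0.5 - 2*0.5
```

Because each node only fits samples in its own neighbourhood, the model can track locally varying behaviour that a single global linear fit would miss.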

 

Step 4 (MLPNN-SOM): A data approximation model is trained with an MLPNN to predict the performance data of the design alternatives. The MLPNN used has 3 hidden layers with 6, 6, and 10 nodes. Mean Squared Error (MSE) is the cost/error function minimized during training, and the data is split into training, validation, and test sets. LIMITATIONS: an MLPNN is trained to predict only one kind of performance data, so multiple MLPNN models are needed to predict multiple performance indicators. 
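The thesis trains the MLPNN with Matlab's toolbox; the sketch below swaps in scikit-learn's `MLPRegressor` as an illustrative stand-in. The (6, 6, 10) hidden-layer sizes and the squared-error loss follow the report; the synthetic data, split sizes, and other hyperparameters are assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((300, 4))                    # synthetic design parameters
y = X[:, 0] * 2 + np.sin(3 * X[:, 1])      # stand-in performance indicator

# Hold out a test set; early_stopping carves out an internal validation split,
# loosely mirroring the report's training/validation/test scheme.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(6, 6, 10), max_iter=5000,
                     early_stopping=True, random_state=0)
model.fit(X_tr, y_tr)                      # minimizes squared error (MSE)
pred = model.predict(X_te)
print(pred.shape)                          # one prediction per test design
```

As the limitation above notes, one network per performance indicator is trained in the thesis workflow, so predicting several indicators means fitting several such models.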


Step 5: Using the SOM clustering from Step 2, the design alternatives are clustered within the design space. The clustering allows designers to understand the design space while reviewing specific design options.  


Step 6: The data is visualized using Rhino and Grasshopper and allows the designer to explore options in four different ways: 

  • Explore based on performance data 

  • Explore based on design objectives related to extreme performance data [review minimum or maximum values for specific indicators] 

  • Explore based on design constraints related to multiple performance indicators [display designs within a cluster and show how they meet the constraints] 

  • Explore based on geometry and related performance value 
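The constraint-based exploration in step 6 amounts to filtering design alternatives by their (predicted) performance values. The tiny sketch below illustrates this; the field names, values, and limits are hypothetical, not taken from the thesis.

```python
# Each design carries its predicted performance indicators (hypothetical fields).
designs = [
    {"id": 1, "mass_kg": 850, "deflection_mm": 40},
    {"id": 2, "mass_kg": 920, "deflection_mm": 25},
    {"id": 3, "mass_kg": 780, "deflection_mm": 55},
]

# One predicate per performance indicator (hypothetical limits).
constraints = {"mass_kg": lambda v: v < 900,
               "deflection_mm": lambda v: v < 50}

# Keep only designs that satisfy every constraint.
feasible = [d for d in designs
            if all(check(d[key]) for key, check in constraints.items())]
print([d["id"] for d in feasible])  # -> [1]
```

In the actual workflow this filtering happens inside Rhino/Grasshopper, where the surviving designs can be displayed per SOM cluster alongside their performance values.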


After the MLPNN or LLM has been trained, steps 3 and 4 can be skipped and the trained model can be used to evaluate the designs generated by the parametric model. 


The authors present a case study in the paper comparing the two methods; please refer to the paper for further information about the case study. 



LIMITATIONS: The correlation between predicted and simulated performance is sometimes low, depending on the performance indicator. The model is well trained for the data set used, but the accuracy of its generalization in practice is unknown. 

> Possible Applications


For ideas on how to implement some of the techniques mentioned above, please see

‘Possible applications for students to try with SOM’
