OptiStruct Optimisation with RapidMiner Machine Learning
Optimisation is great, right? Well, yes, but as with all things it can be improved upon. In our traditional optimisation process (hopefully readers will be familiar with the DRCO workflow: Design variables, Responses, Constraints, Objectives), we generally use mechanical results as the responses, which then also define our constraints and objectives. These could take the form of maximum stresses, or displacements across a group of nodes. However, as the world becomes increasingly connected and we have access to more and more datasets from multiple sources, we might want to include other responses in order to obtain a more optimal design in competitive markets. This could be a cost calculation, or marketing data on aesthetics. It could even be a reduced order model of another simulation if we don't want to perform multiphysics optimisation. This blog post describes the development of a workflow combining RapidMiner, a leading brand in data analytics, with OptiStruct in order to use 'external' responses in optimisation. Hint: it won't really be external!
This is not a new topic: my colleague Charles Mortished discusses here how we can use Altair PhysicsAI as a response, via the DRESP3 card, to control displacement behaviour during optimisation. However, if we're feeling lazy then we might not want to use a Compose link, and can instead turn to the DRESP2 card, which lets us create a functional combination of our responses and design variables.
The first step is to create a scenario, and for simplicity we'll base it on this example from the OptiStruct help (OSE: 0930). It contains a quarter of a radar dish under gust loading, with a total of 17 gauge (thickness) design variables defined on the model. Because this is an example, we have no real external data, so we generate a dataset using a Design of Experiments (DOE) in HyperStudy. We add an extra column representing our external measurement, generated as a random combination of the values of our input parameters.
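To make the setup concrete, here is a minimal sketch of how such a dummy dataset could be generated. The run count, variable ranges, and column names are all illustrative assumptions, not values from the actual HyperStudy DOE:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

N_RUNS = 100   # number of DOE runs (assumption, not stated in the post)
N_DVARS = 17   # gauge design variables in OSE: 0930

# Sample the design variables uniformly, standing in for the HyperStudy DOE
# (the real DOE samples actual gauge ranges from the model).
doe = rng.uniform(1.0, 5.0, size=(N_RUNS, N_DVARS))

# Dummy external measurement: a fixed random combination of the inputs,
# mirroring the post's "random combination of the values of our input parameters".
weights = rng.uniform(-1.0, 1.0, size=N_DVARS)
measurement = doe @ weights

columns = [f"dv{i + 1}" for i in range(N_DVARS)]
dataset = pd.DataFrame(doe, columns=columns)
dataset["measurement"] = measurement
# In the actual workflow this table is exported to Excel for RapidMiner to read.
```

The measurement being a plain linear combination of the inputs is what makes a linear regression model a reasonable choice later on.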
With this dataset, we can build a visual process in RapidMiner Studio to produce a linear regression model that predicts the value of this dummy external measurement based on the values of the design variables in the OptiStruct model. A characteristic of RapidMiner is that it makes machine learning and data analytics available to many more users, with drag-and-drop operators used to build reusable processes and no coding required. In order of operation, we read in the Excel file, set the measurement we want to predict as the label, select the attributes to be used, and train a linear regression model inside a validation operator. This is shown below along with the validation subprocess.
This workflow is just the start of what we can achieve with data analytics, and we could touch on so many more topics such as Feature Engineering. I find the RapidMiner Academy and YouTube channel to be excellent references and would recommend perusing them in your own time. Checking the performance of the model we have just created, we see a root mean squared error of 0.040, which is more than adequate for this example.
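For readers who prefer code to operators, the same model family and error check can be sketched in a few lines. This is only an analogy for the RapidMiner process, using toy data rather than the post's actual DOE results:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the DOE dataset: 17 design variables and a measurement
# that is a noisy linear combination of them (assumption for illustration).
X = rng.uniform(1.0, 5.0, size=(100, 17))
true_w = rng.uniform(-1.0, 1.0, size=17)
y = X @ true_w + rng.normal(0.0, 0.05, size=100)

# Fit a linear regression (with intercept) by least squares, the same model
# family RapidMiner's Linear Regression operator trains.
A = np.hstack([X, np.ones((100, 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Root mean squared error, analogous to what the validation operator reports.
rmse = float(np.sqrt(np.mean((A @ coef - y) ** 2)))
print(f"RMSE: {rmse:.3f}")
```

The coefficients in `coef` are exactly what we later need to transfer into the OptiStruct equation card.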
Now that we have our machine learning model, we want to link it back to our OptiStruct model. We could write out the required OptiStruct cards by hand, either in a text editor or in our HyperWorks preprocessor. Whilst either approach works, there is always the possibility of introducing errors when manually copying over values. The following cards are what we need to add to our bulk data, and it would be far better if we could output them from RapidMiner in a faster, more automatic way.
We can do this using three main operators: Create Document, Split Document into Collection, and Combine Documents. These, combined with macros and loops, can be used to create the following process, which automates the card writing and minimizes the required user input.
This may look a little complex, and we won't go through every single operator in the process, but we can describe its effect. We input the IDs of the design variables from our OptiStruct model, and an equal number of letters to match the coefficients in the created function. The process then handles the rest through splitting, macros, and looping to write out the desired cards in the required format. These can then be imported into our existing HyperMesh model as a solver deck, automatically adding our additional response and the referenced design equation.
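The card-writing step that RapidMiner's document operators perform could equally be scripted. Below is a minimal sketch that emits a DEQATN card (the regression as a function of the design variables) and a DRESP2 card referencing it. The coefficient values and design-variable IDs are placeholders, and the small-field layout and continuation rules should be double-checked against the OptiStruct reference guide before use:

```python
def write_cards(dv_ids, coefs, intercept, eqid=100, respid=100, label="EXTRESP"):
    """Return DEQATN and DRESP2 bulk-data cards for a linear regression.

    Placeholder sketch only; field widths and continuation handling must be
    verified against the OptiStruct reference guide.
    """
    letters = [chr(ord("a") + i) for i in range(len(dv_ids))]
    # DEQATN: f(a,b,...) = intercept + c1*a + c2*b + ...
    terms = "+".join(f"{c:.4g}*{v}" for c, v in zip(coefs, letters))
    equation = f"f({','.join(letters)})={intercept:.4g}+{terms}"
    lines = [f"DEQATN  {eqid:<8d}{equation[:56]}"]
    rest = equation[56:]
    while rest:  # continuation lines for long equations
        lines.append("+       " + rest[:56])
        rest = rest[56:]
    # DRESP2 referencing the equation, then listing the design variables.
    lines.append(f"DRESP2  {respid:<8d}{label:<8s}{eqid:<8d}")
    row = "+       DESVAR  "
    for i, dv in enumerate(dv_ids):
        row += f"{dv:<8d}"
        if (i + 1) % 7 == 0:   # seven IDs fit after the DESVAR flag
            lines.append(row.rstrip())
            row = "+               "
    if row.strip("+ "):
        lines.append(row.rstrip())
    return "\n".join(lines)

# Hypothetical usage: 17 design variables with equal coefficients.
cards = write_cards(list(range(1, 18)), [0.1] * 17, 2.5)
print(cards)
```

In the actual workflow these lines are what get imported into HyperMesh as a solver deck, adding the response and its design equation in one step.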
We're now ready to set up and run an optimisation in which we maximize this external response under imposed mechanical constraints (max von Mises stress < 200 MPa, max displacement < 50 mm). The displacement results can be seen below.
We can also see the predicted value from our machine learning model increase as OptiStruct iterates and improves the design, eventually reaching a maximum possible value whilst meeting the physical requirements.
We could also run an optimisation where we use the machine learning model as a constraint, perhaps with a lower bound of 50, and instead aim to minimize the mass. Again, we can see the results and the external response progression. From this, we can see that the lower bound on our external response is a driving constraint on our optimisation. The mass objective multiplied by 1000 has also been plotted for reference to show the progress of our objective.
This article covers the technical aspect of integrating the data analytics and optimisation domains using RapidMiner and OptiStruct. We also showed the results of two successful optimisations where we used a linear regression model as an objective and constraint function respectively. These techniques could be used for any optimisation where you want to not just optimise mechanically, but to truly design an optimised part. If you have any optimisations which you feel would benefit from applying machine learning and data analytics, please reach out or comment below.
Comments

This is great Roland!
I can see many possibilities in combining external objective parameters with a structural performance assessment; it's great that this workflow looks so straightforward.
Really interesting article!