Utilizing the power of gradients in predictive modelling

Jonathan Ollar_21112
Altair Employee

This post is about gradients. Gradients are the partial derivatives of some scalar response with respect to the design variables: for instance, how much the tip displacement (response) of a wing changes as a function of the thicknesses (design variables) of the wing ribs. Knowing this is extremely useful in optimization, because once you know the direction of steepest descent you don't need much inspiration to put together a very simple optimization algorithm, sketched below. Don't get me wrong, there is a whole lot more to it than that, but a key ingredient in many efficient optimization methods is the use of gradients.
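To show just how simple that algorithm can be, here is a minimal gradient descent sketch in Python. Everything here is illustrative (the function name `gradient_descent`, the step size, the quadratic test function); it is not how any particular optimizer in HyperStudy works, just the bare idea of stepping against the gradient:

```python
import numpy as np

def gradient_descent(f_grad, x0, step=0.1, tol=1e-6, max_iter=1000):
    """Minimize a scalar function, given a callable returning its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = f_grad(x)                  # gradient points in the direction of steepest ascent
        if np.linalg.norm(g) < tol:    # stop when the slope is (nearly) flat
            break
        x = x - step * g               # so step the opposite way: steepest descent
    return x

# Example: minimize f(x, y) = (x - 3)^2 + (y + 1)^2, whose gradient is known analytically
grad = lambda x: np.array([2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)])
print(gradient_descent(grad, x0=[0.0, 0.0]))   # converges towards [3, -1]
```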

Another area where gradients are very useful is predictive modelling. Adding gradient information can greatly improve the quality of a predictive model. The image below shows a predictive model that does not use gradients (left) and one that does (right). The one on the right is clearly a much better prediction, despite being trained on the same points, simply because it also exploits the gradient information at those points.

[Image: predictive model trained without gradients (left) vs. with gradients (right)]
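To make the idea concrete, here is a minimal sketch of gradient-enhanced fitting, assuming a plain least-squares polynomial model rather than HyperStudy's own fit methods; the 1-D test function and all names are mine, purely for illustration. The point is that each training point contributes both a value equation and a slope equation, so the same runs carry roughly twice the information:

```python
import numpy as np

# Sample a 1-D function and its derivative at a few training points
f  = lambda x: np.sin(3.0 * x)
df = lambda x: 3.0 * np.cos(3.0 * x)
x_train = np.array([0.0, 0.5, 1.0])

# Cubic polynomial basis and its derivative, evaluated at the training points
powers = np.arange(4)                                            # 1, x, x^2, x^3
A_val  = x_train[:, None] ** powers                              # rows: value equations
A_grad = powers * x_train[:, None] ** np.maximum(powers - 1, 0)  # rows: slope equations

# Stack value and gradient equations into one least-squares system
A = np.vstack([A_val, A_grad])
b = np.concatenate([f(x_train), df(x_train)])
coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)

# The fitted polynomial now matches both the values and the slopes at the samples
x_test = np.linspace(0.0, 1.0, 5)
print((x_test[:, None] ** powers) @ coeffs)  # predictions
print(f(x_test))                             # truth, for comparison
```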

HyperStudy has the infrastructure in place to take advantage of gradients in predictive modelling. Below is an example with more variables. Want to try it yourself? Open HyperStudy and fire up the "Beverage Can" tutorial; everything is set up for you:

Design Variables:
[Image: design variables]

Responses: 
[Image: responses]

Gradients:
[Image: gradients]
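In the Beverage Can tutorial the gradients are already provided for you. If your own solver reports only responses, one common (if costly, since it needs an extra run per variable) way to approximate them is finite differences. The sketch below is a generic Python illustration, not a HyperStudy API; `response` stands for any wrapper around a solver run, and the names are hypothetical:

```python
import numpy as np

def forward_diff_gradient(response, x, rel_step=1e-6):
    """Approximate d(response)/dx_i by forward finite differences."""
    x = np.asarray(x, dtype=float)
    f0 = response(x)                         # baseline run
    grad = np.empty_like(x)
    for i in range(x.size):
        h = rel_step * max(abs(x[i]), 1.0)   # scale the step to the variable
        xp = x.copy()
        xp[i] += h                           # perturb one design variable
        grad[i] = (response(xp) - f0) / h    # slope along that variable
    return grad
```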

Results:
With a design of experiments of just 10 runs, these are the results I am getting without gradients (left) and with gradients (right). They clearly show that using gradients can greatly improve your predictive models, so if you have them, use them.

[Image: results without gradients (left) vs. with gradients (right)]
Please comment if you try this yourself, and let us know how much it improved the quality of your predictive model!