
HyperStudy Optima Results Terms

User: "Andreas_21886"
Altair Community Member
Updated by Andreas_21886

I'm working on a design project where I need to compare four designs of a component in a mechanism. I have an Excel workbook driven by HyperStudy through four optimisation solvers, producing a set of optimal points for each design, which I can then compare across the four designs to pick the best option.

I've written a Python script which reads the Excel reports exported by HyperStudy and compiles them into a single plot with shared axes, to make the comparison easier and clearer for readers. Digging into the results, I saw that when plotting optima, HyperStudy draws a number of unique points from a wider pool.
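For readers doing something similar, here is a minimal sketch of that kind of shared-axis comparison plot. The column names (`mass`, `stress`) and the data-loading helper are hypothetical stand-ins; in practice each design's frame would come from something like `pd.read_excel("design_1.xlsx")` on the HyperStudy export.

```python
import numpy as np
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, suitable for scripted report generation
import matplotlib.pyplot as plt

# Hypothetical stand-in for reading one design's HyperStudy xlsx export.
def load_design(seed: int) -> pd.DataFrame:
    rng = np.random.default_rng(seed)
    return pd.DataFrame({
        "mass": rng.uniform(1.0, 2.0, 20),
        "stress": rng.uniform(100.0, 200.0, 20),
    })

designs = {f"Design {i}": load_design(i) for i in range(1, 5)}

# One subplot per design; sharex/sharey put every design on identical axes
# so the optima can be compared directly by eye.
fig, axes = plt.subplots(1, 4, sharex=True, sharey=True, figsize=(12, 3))
for ax, (name, df) in zip(axes, designs.items()):
    ax.scatter(df["mass"], df["stress"])
    ax.set_title(name)
    ax.set_xlabel("mass")
axes[0].set_ylabel("stress")
fig.tight_layout()
fig.savefig("comparison.png")
```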

When it says "x unique points", "bad values", etc., what is this trying to convey, and is there any way to filter for the unique results in the Excel report export?

I haven't been able to find any explanations in the help documentation. 


    User: "Adriano_Koga"
    Altair Employee
    Accepted Answer
    Updated by Adriano_Koga

    I believe you're talking about the statistics of all your optimization runs. HyperStudy creates statistics based on the models run, indicating, among other things, how many unique/different designs were run, outliers, etc.

    It is probably related to the post processing step, under Health.

    https://2022.help.altair.com/2022/hwdesktop/hst/topics/design_exploration/post_process_integrity_r.htm#post_process_integrity_r  

    User: "Andreas_21886"
    Altair Community Member
    OP
    Updated by Andreas_21886

    Hi Adriano,

    Thanks for the response. Those stats were what HyperStudy was pulling in, but it was showing them to me in the side bar of the optima plot. Either way, I took a look at the stats, but they didn't really help. According to them there were no bad values.

    Digging into the data, though, I found a number of repeated values, and that seems to be what the wider pool was. So effectively HyperStudy was filtering out repeated values but my script wasn't, so to me it seemed like there was much more data than there actually was.

    This is interesting because it means that in the xlsx report outputs, HyperStudy will still give you repeated optimum values. That's just something for people to be aware of if they come across this post or a similar issue.
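    For anyone hitting the same thing: dropping the exact repeats yourself before plotting is a one-liner in pandas. This is a sketch with synthetic data and hypothetical column names standing in for the exported optima sheet.

    ```python
    import pandas as pd

    # Synthetic stand-in for an optima table exported by HyperStudy;
    # the column names here are hypothetical.
    optima = pd.DataFrame({
        "mass":   [1.2, 1.2, 1.5, 1.2, 1.5],
        "stress": [150, 150, 120, 150, 120],
    })

    # Dropping exact duplicate rows recovers the "unique points" count
    # that HyperStudy reports in its plot side bar.
    unique_optima = optima.drop_duplicates().reset_index(drop=True)
    print(len(optima), "points,", len(unique_optima), "unique")  # 5 points, 2 unique
    ```

    If the exported values differ only by floating-point noise, rounding first (e.g. `optima.round(6).drop_duplicates()`) may be needed, since `drop_duplicates` only removes exact matches.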

    Anyway, thank you for the help nonetheless.