Weird learner parameters in EvolutionaryParameterOptimization

_paul_ New Altair Community Member
Hi,

I'm not sure if this is a bug, but when I embed the NearestNeighbors learner
inside EvolutionaryParameterOptimization, the latter lets me choose among
parameters for the learner that are not actually visible in the learner itself. For
example, within the evolutionary optimization I can select
NearestNeighbors.kernel_gamma as a parameter to be optimized, but when
I check the learner, there is no such parameter. Here is the code to reproduce it:

<operator name="Root" class="Process" expanded="yes">
    <description text="#ylt#p#ygt#This process is also a parameter optimization process like the first one discussed in the meta group. In this case, an evolutionary approach is used for the search of the best parameter combination. This approach is often more appropriate and leads to better results without defining the parameter combinations which should be tested (as for the Grid Search and  the quadratic parameter optimization approaches).#ylt#/p#ygt# #ylt#p#ygt#The parameters for the evolutionary parameter optimization approach are defined in the same way as for the other parameter optimization operators. Instead of a comma separated list of parameters which should be checked the user has to define a colon separated pair which is used as lower and upper bound for the specific parameters.#ylt#/p#ygt# "/>
    <operator name="ExampleSource" class="ExampleSource">
        <parameter key="attributes" value="../data/polynomial.aml"/>
    </operator>
    <operator name="ParameterOptimization" class="EvolutionaryParameterOptimization" expanded="yes">
        <list key="parameters">
          <parameter key="NearestNeighbors.kernel_gamma" value="[0.0;Infinity]"/>
        </list>
        <parameter key="max_generations" value="10"/>
        <parameter key="tournament_fraction" value="0.75"/>
        <parameter key="crossover_prob" value="1.0"/>
        <operator name="IteratingPerformanceAverage" class="IteratingPerformanceAverage" expanded="yes">
            <parameter key="iterations" value="3"/>
            <operator name="Validation" class="XValidation" expanded="yes">
                <parameter key="number_of_validations" value="2"/>
                <parameter key="sampling_type" value="shuffled sampling"/>
                <operator name="NearestNeighbors" class="NearestNeighbors">
                </operator>
                <operator name="ApplierChain" class="OperatorChain" expanded="yes">
                    <operator name="Test" class="ModelApplier">
                        <list key="application_parameters">
                        </list>
                    </operator>
                    <operator name="Performance" class="Performance">
                    </operator>
                </operator>
            </operator>
        </operator>
        <operator name="Log" class="ProcessLog">
            <parameter key="filename" value="paraopt.log"/>
            <list key="log">
              <parameter key="C" value="operator.LibSVMLearner.parameter.C"/>
              <parameter key="degree" value="operator.LibSVMLearner.parameter.degree"/>
              <parameter key="error" value="operator.IteratingPerformanceAverage.value.performance"/>
            </list>
        </operator>
    </operator>
</operator>
Best,
Paul

Answers

  • cherokee New Altair Community Member
    Hi Paul,

    this is not a bug. kNN does have this parameter. It is (only?) used when you select KernelEuclideanDistance as the numerical measure. The GUI knows this and displays the parameter only when it is needed (try it!). The EvolutionaryParameterOptimization does not know that, so the parameter is always listed there (see the configuration sketch at the end of this thread).

    Greetings,
    Michael
  • land New Altair Community Member
    Hi Paul,
    Michael is totally correct.
    Unfortunately, I don't have a clue how the Optimization Dialog could take these parameter dependencies into account as well...

    Greetings,
      Sebastian
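
For readers who want the optimized NearestNeighbors.kernel_gamma to actually take effect, the learner has to be switched to the kernel-based distance Michael mentions. Below is a minimal sketch of the learner operator only; the keys measure_types, numerical_measure and kernel_type, as well as their values, are assumptions here and may be named differently in your RapidMiner version, so check the operator's parameter list before copying.

<operator name="NearestNeighbors" class="NearestNeighbors">
    <!-- Assumed keys: switch the numerical measure to the kernel-based distance
         so that kernel_gamma is actually used by the learner. -->
    <parameter key="measure_types" value="NumericalMeasures"/>
    <parameter key="numerical_measure" value="KernelEuclideanDistance"/>
    <!-- kernel_gamma is the value that EvolutionaryParameterOptimization will vary. -->
    <parameter key="kernel_type" value="radial"/>
    <parameter key="kernel_gamma" value="1.0"/>
</operator>

With a configuration along these lines, a kernel_gamma value chosen by the optimizer is applied through the kernel distance; with the default Euclidean distance the parameter exists internally but has no effect, which is why the GUI hides it while the optimization dialog still offers it.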