(URGENT) How to iterate over n different training and test sets?

misaghb (New Altair Community Member)
edited November 5 in Community Q&A

Hi,

I want to train a model from several training sets and apply it to several test sets.

I have 60 train sets (train_i.dat) and 60 test sets (test_i.dat) in the same folder.
How can I design a loop structure that, in the first iteration, loads train_1.dat, trains a model, and applies it to test_1.dat; in the second iteration loads train_2.dat and applies the model to test_2.dat; and so on?

Then, using the "IteratingPerformanceAverage" operator, I can calculate the average performance over all iterations.

Can anybody tell me what operator tree structure and XML configuration I should use to solve this problem?

I need the solution urgently.

Thanks.


Answers

  • steffen (New Altair Community Member)
    Hello misaghb

    Assignment? Meeting? ;D

    The solution to your problem is RapidMiner macros: the %{a} macro in the file names below is replaced by the current iteration number, so each pass of the loop loads a different pair of files.

    <operator name="Root" class="Process" expanded="yes">
        <operator name="IteratingPerformanceAverage" class="IteratingPerformanceAverage" expanded="yes">
            <parameter key="iterations" value="60"/>
            <operator name="load_train" class="ExampleSource">
                <parameter key="attributes" value="train_%{a}.aml"/>
                <parameter key="decimal_point_character" value=""/>
            </operator>
            <operator name="example_model" class="NaiveBayes">
            </operator>
            <operator name="load_test" class="ExampleSource">
                <parameter key="attributes" value="test_%{a}.aml"/>
                <parameter key="decimal_point_character" value=""/>
            </operator>
            <operator name="ModelApplier" class="ModelApplier">
                <list key="application_parameters">
                </list>
            </operator>
        </operator>
    </operator>
    Hope this was "fast" enough.

    Steffen
  • misaghb (New Altair Community Member)

    Dear Steffen,
    Thanks a lot for your kind and fast reply.
    It looks great.  :D

    Thanks again.
    - misagh.