"Forecasting Performance is not always Coefficient Correlation?"
Ahmed
New Altair Community Member
Hi,
I am running some prediction models using a neural network, and I always use Forecasting Performance as an indicator of how good my model is on out-of-sample data and/or under cross-validation. My interpretation of the Forecasting Performance number was that it is the same as the correlation coefficient, but in my current experiment I am getting a different number: I dumped the output results into an Excel sheet, measured the correlation coefficient myself, and found it is 10% (absolute) better than the Forecasting Performance number!
I don't know if this is a bug or something I'm misunderstanding.
Any tips, experts?
Cheers,
-Ahmed
Answers
To be more specific, the Forecasting Performance is 0.68 while the correlation coefficient is 0.77 for this particular example. This is really driving me crazy, as previously I was getting the same result as the correlation coefficient.
Thanks,
Ahmed
Hi Ahmed,
Which operator are you referring to?
Greetings,
Sebastian
Hi Sebastian,
Ahmed emailed me about this question and I realized that I didn't have an answer either. What he wants to know is how the Forecasting Performance operator in the Time Series plugin calculates its accuracy on the training data set. Is it MAPE, RMSE, etc.?
Tom
Hi Sebastian,
I am using the Forecasting Performance operator.
It is found under Series -> Evaluation -> Performance.
As I am using RapidMiner for my study, I was wondering if I could get some explanation of this operator's results. Is it CC, MPE, RMSE, etc.?
Thanks again,
Ahmed
Hi Ahmed, hi Tom,
ok, now I found the operator. Sorry, forgot to take a look in the time series extension. Here's the explanation given by Criterions description:<p>Measures the number of times a regression prediction correctly determines the trend.
Greetings,
* This performance measure assumes that the attributes of each example represents the
* values of a time window, the label is a value after a certain horizon which should
* be predicted. All examples build a consecutive series description, i.e. the labels
* of all examples build the series itself (this is, for example, the case for a windowing
* step size of 1). This format will be delivered by the Series2ExampleSet operators provided by
* RapidMiner.</p>
Example: Let's think of a series v1...v10 and a sliding window with window width 3, step size 1, and prediction horizon 1. The resulting example set is then:
T1 T2 T3  L   P
----------------
v1 v2 v3  v4  p1
v2 v3 v4  v5  p2
v3 v4 v5  v6  p3
v4 v5 v6  v7  p4
v5 v6 v7  v8  p5
v6 v7 v8  v9  p6
v7 v8 v9  v10 p7
The second-to-last column (L) corresponds to the label, i.e. the value which should be predicted, and the last column (P) corresponds to the predictions. The columns T1, T2, and T3 correspond to the regular attributes, i.e. the points which should be used as learning input.
This performance measure then calculates the actual trend between the last time point in the series (T3 here) and the actual label (L) and compares it to the trend between T3 and the prediction (P). It counts the cases where both trends point in the same direction (i.e. their product is non-negative) and divides this count by the total number of examples, i.e. [(if ((v4-v3)*(p1-v3)>=0), 1, 0) + (if ((v5-v4)*(p2-v4)>=0), 1, 0) + ...] / 7 in this example.
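To see why this number can differ from a plain correlation coefficient, here is a minimal sketch in Java (my own illustration under the assumptions above, not the actual RapidMiner code) that computes both measures from the last window value (T3), the label (L), and the prediction (P) of each example:

import java.util.Arrays;

public class TrendAccuracySketch {

    // Fraction of examples where the predicted trend (P vs. T3) points in
    // the same direction as the actual trend (L vs. T3), per the formula above.
    static double trendAccuracy(double[] last, double[] label, double[] pred) {
        int hits = 0;
        for (int i = 0; i < label.length; i++) {
            if ((label[i] - last[i]) * (pred[i] - last[i]) >= 0) {
                hits++;
            }
        }
        return (double) hits / label.length;
    }

    // Plain Pearson correlation coefficient between labels and predictions.
    static double correlation(double[] label, double[] pred) {
        double meanL = Arrays.stream(label).average().orElse(0);
        double meanP = Arrays.stream(pred).average().orElse(0);
        double cov = 0, varL = 0, varP = 0;
        for (int i = 0; i < label.length; i++) {
            cov  += (label[i] - meanL) * (pred[i] - meanP);
            varL += (label[i] - meanL) * (label[i] - meanL);
            varP += (pred[i] - meanP) * (pred[i] - meanP);
        }
        return cov / Math.sqrt(varL * varP);
    }

    public static void main(String[] args) {
        // Made-up values laid out like the table above: one row per example.
        double[] last  = {3, 4, 5, 6, 7, 8, 9};               // T3: v3 ... v9
        double[] label = {4, 5, 6, 7, 8, 9, 10};              // L:  v4 ... v10
        double[] pred  = {4.5, 3.6, 6.2, 5.8, 7.7, 9.4, 8.9}; // P:  p1 ... p7

        System.out.println("trend accuracy: " + trendAccuracy(last, label, pred)); // 4/7 ~ 0.57
        System.out.println("correlation:    " + correlation(label, pred));         // ~ 0.93
    }
}

On these made-up values the predictions track the level of the series quite closely (correlation about 0.93) while getting the direction of the next step wrong in three of seven cases (trend accuracy about 0.57), so the two measures can legitimately disagree, just as in Ahmed's experiment.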
Greetings,
Sebastian