"Out of Memory Error"
D_M
New Altair Community Member
Hi,
My data set has around 40,000 attributes with 1,000 positive and 1,000 negative examples.
I am using the following operators:
Root
TextInput (TFIDF vector creation)
StringTokenizer
TokenLengthFilter
W-GainRatioAttributeEval
AttributeWeightSelection
XValidation
LibSVMLearner
OperatorChain
ModelApplier
Performance
I am using the RapidMiner GUI. My machine has 2 GB of RAM and runs Windows Vista.
I am getting a heap space error on the W-GainRatioAttributeEval operator.
Is there any way to get rid of this problem?
Answers
Hi,
yes, of course there is. In fact, there are a bunch of solutions. The simplest is: buy more RAM and switch to 64 bit. But I think you are aware of this already.
So, two ways out:
- Exchange the WEKA operator with the RapidMiner equivalent "InfoGainRatioWeighting". This should save you the memory otherwise needed to duplicate the data for WEKA.
- Did you check the memory monitor? How much RAM does your RapidMiner installation actually consume? Depending on how you are invoking RapidMiner, there may be reasons why not enough RAM is reserved for Java (see the example below).
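As an illustration for the second point (a sketch only; the jar location is an assumption and your launch script or desktop shortcut may look different), if RapidMiner is started directly from a java command line, you can raise the maximum heap Java is allowed to reserve with the standard -Xmx option:

java -Xmx1536m -jar lib/rapidminer.jar

Note that on a 32-bit Windows JVM with 2 GB of physical RAM, heap sizes much above roughly 1.5 GB typically cannot be allocated, so switching to 64 bit remains the real fix for data of this size.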
Greetings,
Sebastian