Memory problems
Viktor_Meyer
Hello,
I'm using RapidMiner version 4.6 and the example source operator to load 1.2 million data rows with 65 attributes. I have 4 GB of RAM on my Mac. Is there any way to read in this amount of data? I always run into an out-of-memory problem, and before that happens it takes 12 hours or more just to read the data. Is there a way to do this, ideally a bit faster?
Greets Viktor
dragoljub
Update to the latest RapidMiner, 5.0.009; the old version is no longer supported. A simple calculation of how large your file is in binary compared to the RAM you have available will tell you whether it can fit in memory. If not, use a database and process the data on the fly without loading it all into memory.
-Gagi
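Gagi's back-of-the-envelope calculation can be sketched like this. The 8-bytes-per-value assumption (one double per numeric cell) and the 2-4x overhead factors are assumptions, not RapidMiner specifics:

```python
# Rough estimate of the in-memory size of a numeric example set.
# Assumes (hypothetically) 8 bytes per value, i.e. one double per cell.
rows = 1_200_000
attributes = 65
bytes_per_value = 8

raw_bytes = rows * attributes * bytes_per_value
print(f"raw data: {raw_bytes / 1024**3:.2f} GiB")

# Overhead for object headers, indices, and parsing buffers is an
# assumption; 2-4x is a common rule of thumb on the JVM.
for overhead in (2, 4):
    print(f"with {overhead}x overhead: {overhead * raw_bytes / 1024**3:.2f} GiB")
```

The raw data alone is roughly 0.6 GiB, so with parsing and object overhead it is plausible that loading it exhausts a 4 GB machine once the OS and JVM take their share.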
fischer
Hi,
let me add that we also fixed some memory problems in 5.0.009, so it may actually be worth updating.
Best,
Simon
Viktor_Meyer
Okay, and how do I get the .dat file into a database? I've tried loading it with RapidMiner (AML) and exporting it to Ingres, but that also fails because of insufficient memory.
cheers
viktor
land
Hi Viktor,
if you have exported it as an .aml file, you probably generated it with RapidMiner? You could then export the data in chunks, reload the chunks with a Loop Files operator, and append each one to the table. Since it's a more complex process, I'll save the design work until you confirm that this is actually possible.
Greetings,
Sebastian
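Outside of RapidMiner, the chunked approach Sebastian describes can be sketched in plain Python: read a large flat file a fixed number of rows at a time and append each chunk to a database table, so the whole data set never has to fit in memory at once. The file path, table name, and chunk size below are illustrative assumptions, and SQLite stands in for Ingres:

```python
# Sketch of chunked loading: stream a large CSV into a database table
# CHUNK_ROWS rows at a time, assuming the first line is a header.
import csv
import sqlite3
from itertools import islice

CHUNK_ROWS = 100_000  # tune to available memory

def load_in_chunks(csv_path, db_path, table="examples"):
    con = sqlite3.connect(db_path)
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        cols = ", ".join(f'"{c}"' for c in header)
        placeholders = ", ".join("?" for _ in header)
        con.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})')
        while True:
            chunk = list(islice(reader, CHUNK_ROWS))
            if not chunk:
                break
            con.executemany(
                f'INSERT INTO "{table}" VALUES ({placeholders})', chunk
            )
            con.commit()  # flush each chunk before reading the next
    con.close()
```

Peak memory stays bounded by the chunk size rather than the file size, which is the same idea as looping over exported chunks and appending them to the table.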