Memory problems

Viktor_Meyer New Altair Community Member
edited November 5 in Community Q&A
Hello,

I'm using RapidMiner version 4.6 with the ExampleSource operator to load 1.2 million data rows with 65 attributes. I have 4 GB of RAM on my Mac. Is there any way to read in this amount of data? I always run into an out-of-memory problem, and even before that it takes 12 hours or more to read the data. Is there a way to do this, ideally a bit faster?

Greetings,
Viktor

Answers

  • dragoljub New Altair Community Member
    Update to the latest RapidMiner, 5.0.009; the old version is not supported anymore. A simple calculation of how large your data is in binary, compared with the RAM you have available, will tell you whether it can fit in memory (see the back-of-envelope sketch below). If not, use a database and process the data on the fly without loading it all into memory.
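
    For instance, a minimal back-of-envelope estimate in Python (a sketch only; the 8-byte figure assumes each value is held as a double, and the real in-memory footprint will be larger):

    ```python
    rows = 1_200_000      # 1.2 million data rows, from the original post
    attributes = 65       # attributes per row
    bytes_per_value = 8   # assumption: each value stored as an 8-byte double

    raw_bytes = rows * attributes * bytes_per_value
    print(f"Raw data alone: {raw_bytes / 1024**3:.2f} GB")  # ~0.58 GB

    # Object overhead, indexes, and copies made along the operator chain can
    # multiply this several times, so 4 GB of RAM (shared with the OS and the
    # JVM itself) can still be too little.
    ```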

    -Gagi
  • fischer New Altair Community Member
    Hi,

    let me add that we also fixed some memory problems in 5.0.009, so it may actually be worth updating.

    Best,
    Simon
  • Viktor_Meyer New Altair Community Member
    Okay, and how do I get the .dat file into a database? I tried loading it with RapidMiner (AML) and exporting it to Ingres, but that also fails because of insufficient memory.

    Cheers,
    Viktor
  • land New Altair Community Member
    Hi Viktor,
    if you exported it as an .aml file, you probably generated it with RapidMiner? In that case you could export the data in chunks, then reload the chunks with a Loop Files operator and append each one to the database table (the sketch below shows the general idea). Since it's a more complex process, I'll save the design until you confirm that this is actually possible :)
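
    Outside RapidMiner, the chunk-and-append pattern looks roughly like this (a sketch only; the file name, table layout, and SQLite target are assumptions for illustration, since Ingres would need its own driver):

    ```python
    import csv
    import sqlite3

    CHUNK_SIZE = 100_000  # rows per batch; keeps only one chunk in memory

    # Hypothetical table with 65 numeric attributes, matching the post.
    conn = sqlite3.connect("example.db")
    columns = ", ".join(f"att{i} REAL" for i in range(65))
    conn.execute(f"CREATE TABLE IF NOT EXISTS data ({columns})")
    placeholders = ", ".join("?" for _ in range(65))
    insert_sql = f"INSERT INTO data VALUES ({placeholders})"

    with open("data.dat", newline="") as f:
        reader = csv.reader(f)          # assumption: comma-separated values
        chunk = []
        for row in reader:
            chunk.append(row)
            if len(chunk) == CHUNK_SIZE:
                conn.executemany(insert_sql, chunk)
                conn.commit()
                chunk.clear()           # free the batch before reading more
        if chunk:                       # flush the final partial batch
            conn.executemany(insert_sql, chunk)
            conn.commit()

    conn.close()
    ```

    Inside RapidMiner the equivalent would be roughly a Loop Files operator iterating over the exported chunk files, with a database-write operator set to append inside the loop.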

    Greetings,
    Sebastian