Web crawling: overcoming the memory limit by splitting URLs into subsamples and combining the results
In777
Hello,

I retrieve data from a large number of web pages (more than 30,000) with the Get Pages operator. I have imported all of my URLs into the repository from an Excel file. I then process the retrieved pages with regular expressions (I extract several categories) and write the category information to Excel, one row per URL. The process works fine with a small number of URLs, but my computer does not have enough memory to process all of the web pages at once.

I would like to split the URLs into pieces of about 2,000 each, run the process on each piece separately, and join the resulting Excel files together at the end. I looked at the sampling operators, but most of them produce a random sample, and I want to keep the order in which the URLs are crawled (if possible). I think I need a loop, but I cannot figure out where to start: I do not know which loop operator to use (I think I need Loop Examples), nor how to make it write the results to several Excel files. (I suspect I should instead write the results dynamically to a SQL database rather than to Excel.) Could anybody help me with this issue?
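To make the batching idea concrete, here is a rough Python sketch of what I am trying to do (my real process uses RapidMiner operators, so this is only an illustration; the URL list, the category regex, the batch size of 2,000, and the database file name are all placeholders):

```python
import re
import sqlite3
import urllib.request

URLS = ["https://example.com/page1"]  # placeholder: the full URL list imported from Excel
BATCH_SIZE = 2000

conn = sqlite3.connect("results.db")
conn.execute("CREATE TABLE IF NOT EXISTS categories (url TEXT, category TEXT)")

# Walk the URL list in fixed-size slices, preserving the original order.
for start in range(0, len(URLS), BATCH_SIZE):
    batch = URLS[start:start + BATCH_SIZE]
    rows = []
    for url in batch:
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip pages that fail to load
        match = re.search(r'class="category">([^<]+)<', html)  # placeholder regex
        rows.append((url, match.group(1) if match else ""))
    # Persist each batch before starting the next, so memory use stays bounded.
    conn.executemany("INSERT INTO categories VALUES (?, ?)", rows)
    conn.commit()

conn.close()
```

If something like this is possible with a loop operator (each iteration processing one slice and appending its results before the next slice starts), that would solve my problem, since joining the results at the end becomes a single query or export instead of merging many Excel files.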
Find more posts tagged with: AI Studio, Web Mining, Split
Comments
There are no comments yet