Radoop Error: could not upload the necessary components to the directory of HDFS
kimusu2002
New Altair Community Member
Hi,
I am having a problem with this Radoop error: could not upload the necessary components to the directory of HDFS. It says that Radoop can't upload into the directory '/tmp/radoop/27eca174758add21906d3b197af684e7/'.
So I changed the permissions of '/tmp/radoop/' and also of '/tmp/radoop/27eca174758add21906d3b197af684e7/' on the namenode in the VM, then typed 'hadoop fs -ls /tmp/radoop', and the results showed that the permissions had been changed. So I went ahead and re-ran the process that contains the Radoop Nest, and the same error popped up again, and the permissions of the directory '/tmp/radoop' were automatically changed back to what they were before.
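For reference, this is roughly what I ran on the namenode (a sketch from memory; the hashed directory name comes from my own connection):
hadoop fs -chmod -R 777 /tmp/radoop
hadoop fs -ls /tmp/radoop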
Could someone give me some pointers, please?
FYI, I was able to connect to the cluster and also to explore all the Hive tables.
Thanks heaps!
Answers
-
Hi kimusu2002,
in Radoop there are often various users involved. Sometimes it is Hive that writes the table, not the user you specified. Are you sure that both your user and the Hive user have access to this directory? Can you try to give access rights to all users?
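Something like this should open the directory up to all users, including the Hive user (a sketch; assuming hdfs is the HDFS superuser, as on the Cloudera VMs, and that /tmp/radoop is your Radoop temp directory):
sudo -u hdfs hadoop fs -chmod -R 777 /tmp/radoop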
Since you are using Radoop, you most likely have a support contract. You can use our professional support at http://support.rapidminer.com/ .
Cheers,
Martin
-
Hi Team,
I get the same error, and I also get the following when testing the connection:
[May 9, 2015 4:18:43 PM] SEVERE: Wrong FS: hdfs://localhost.localdomain:8020/user/radoop/.staging, expected: hdfs://192.168.93.133:8020
[May 9, 2015 4:18:43 PM] SEVERE: MapReduce staging directory test failed. The Radoop client cannot write the staging directory. Please consult your Hadoop administrator.
I checked all the permissions, but it is not working.
Infrastructure: I am using the CDH4 VM from Cloudera, and the user in the VM is cloudera. I can see the folder /tmp/radoop being created with user group "radoop", but I still have issues.
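For what it is worth, the "Wrong FS" part suggests that the NameNode address I entered in the Radoop connection (192.168.93.133) does not match the fs.defaultFS the cluster itself reports (localhost.localdomain). I tried to confirm this on the VM with something like the following (assuming the hdfs client on CDH4 supports getconf):
hdfs getconf -confKey fs.defaultFS
hadoop fs -ls hdfs://192.168.93.133:8020/tmp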
It's a bit urgent; can anyone please help?
Thanks in advance.
Regards,
Krishna.
-
Hi All,
I have got the above problem resolved; it was related to the connection. But I still get the error mentioned in this thread, "Error: could not upload the necessary components to the directory of HDFS". Can you please help me with this?
This is a bit urgent.
Regards,
Krishna.
-
Hi,
I would recommend asking directly at support.rapidminer.com . Since you have Radoop, you should have a support contract.
Best,
Martin
-
Could you try to run
chmod 777 /tmp
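If the directory in the error is on HDFS rather than the local filesystem, the equivalent would presumably be (a sketch; assuming hdfs is the HDFS superuser, as on the Cloudera VM):
sudo -u hdfs hadoop fs -chmod 777 /tmp
-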
Hi,
my colleague Peter pointed me to this: https://support.rapidminer.com/customer/portal/questions/11551531-radoop-error-could-not-upload-the-necessary-components-to-the-directory-of-hdfs
This should help.
Cheers,
Martin