Have you tried to run the example job as the superuser? It seems like this
might be an issue where hadoop.tmp.dir doesn't have the correct
permissions. hadoop.tmp.dir and dfs.data.dir should be owned by the unix
user running your Hadoop daemons and be owner-readable and owner-writable.
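A quick way to check this (a rough sketch; /tmp/hadoop-demo is only a stand-in, so substitute the real hadoop.tmp.dir and dfs.data.dir values from your hadoop-site.xml):

```shell
# Sketch only: replace TMP_DIR with the actual hadoop.tmp.dir value
# from your hadoop-site.xml; /tmp/hadoop-demo is a placeholder.
TMP_DIR=/tmp/hadoop-demo
mkdir -p "$TMP_DIR"

# The directory should be owner-readable and owner-writable
chmod u+rwx "$TMP_DIR"

ls -ld "$TMP_DIR"    # inspect owner and mode by eye
if [ -r "$TMP_DIR" ] && [ -w "$TMP_DIR" ]; then
    echo "ok: readable and writable"
else
    echo "problem: fix ownership with chown/chmod"
fi
```

Run this as the same user that runs the daemons; if it reports a problem, chown the directory to that user.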

Can you confirm this is the case? Thanks,


On Fri, Jun 26, 2009 at 1:29 PM, Mulcahy, Stephen wrote:
[Apologies for the top-post, sending this from a dodgy webmail client]
Hi Alex,

My hadoop-site.xml is as follows,

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->





Any comments welcome,


-----Original Message-----
From: Alex Loddengaard
Sent: Fri 26/06/2009 18:32
To: core-user@hadoop.apache.org
Subject: Re: Permissions needed to run RandomWriter ?

Hey Stephen,

What does your hadoop-site.xml look like? The Exception is in
java.io.UnixFileSystem, which makes me think that you're actually creating
and modifying directories on your local file system instead of HDFS. Make
sure "fs.default.name" looks like "hdfs://your-namenode.domain.com:PORT".
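For reference, a minimal override would look something like this (hostname and port are placeholders, not values from your cluster):

```xml
<!-- Sketch only: substitute your namenode's hostname and port -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://your-namenode.domain.com:8020</value>
</property>
```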


On Fri, Jun 26, 2009 at 4:40 AM, stephen mulcahy wrote:

I've just installed a new test cluster and I'm trying to give it a quick
smoke test with RandomWriter and Sort.

I can run these fine with the superuser account. When I try to run them as
another user I run into problems even though I've created the output
directory and given permissions to the other user to write to this
directory. i.e.

1. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo
mkdir: org.apache.hadoop.fs.permission.AccessControlException: Permission
denied: user=smulcahy, access=WRITE, inode="":hadoop:supergroup:rwxr-xr-x

OK - we don't have permissions anyways

2. hadoop@hadoop01:/$ hadoop fs -mkdir /foo


3. hadoop fs -chown -R smulcahy /foo


4. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo/test


5. smulcahy@hadoop01:~$ hadoop jar /usr/lib/hadoop/hadoop-*-examples.jar
randomwriter /foo
java.io.IOException: Permission denied
        at java.io.UnixFileSystem.createFileExclusively(Native Method)
        at java.io.File.checkAndCreate(File.java:1704)
        at java.io.File.createTempFile(File.java:1793)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)

Any suggestions on why step 5 is failing even though I have write
permissions to /foo? Do I need permissions on some other directory also,
or ... ?
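[Editorial note: the trace points at java.io.File.createTempFile, which RunJar uses to unpack the job jar under hadoop.tmp.dir on the local filesystem, so HDFS permissions on /foo are not what is being checked at that step. A rough local reproduction of the failure mode, with a stand-in path:]

```shell
# Rough sketch: /tmp/unjar-demo stands in for hadoop.tmp.dir.
# Run as a non-root user, since root bypasses permission checks.
DEMO=/tmp/unjar-demo
mkdir -p "$DEMO"
chmod a-w "$DEMO"    # unwritable, like a mis-owned tmp dir

# mktemp plays the role of File.createTempFile in RunJar
if mktemp "$DEMO/unjar.XXXXXX" >/dev/null 2>&1; then
    echo "temp file created (are you running as root?)"
else
    echo "Permission denied, matching the RunJar trace"
fi

chmod u+w "$DEMO"    # restore so cleanup works
rm -rf "$DEMO"
```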



Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
http://di2.deri.ie http://webstar.deri.ie http://sindice.com

group: common-user
posted: Jun 26, '09 at 11:40a
active: Jun 30, '09 at 11:20a
