FAQ
Jason Venner writes ...
My first guess is that, for whatever reason, the user you are running the
job as cannot write to the directory specified by hadoop.tmp.dir in your
configuration.
This is usually in the system temporary area and not an issue.
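For reference, hadoop.tmp.dir is normally set (or defaulted) in core-site.xml; a minimal sketch of the stock default, which would put David's scratch space in /tmp/hadoop-david (the exact value in his configuration is an assumption, not taken from this thread):

```xml
<!-- core-site.xml: hadoop.tmp.dir is the base for Hadoop's local scratch space.
     ${user.name} expands to the user running the job, e.g. /tmp/hadoop-david. -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/tmp/hadoop-${user.name}</value>
</property>
```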

Assuming the files go in /tmp/hadoop-david, local permissions seem to be fine:

[david@tweety ~]$ cd /tmp/hadoop-david
[david@tweety hadoop-david]$ mkdir foo
[david@tweety hadoop-david]$ rmdir foo

Using hadoop fs commands, all appears fine too:

[david@tweety ~]$ hadoop fs -ls
Found 1 items
-rw-r--r-- 1 david supergroup 28081 2009-10-06 23:27 /user/david/docnotes.txt
[david@tweety ~]$ hadoop fs -mkdir foo
[david@tweety ~]$ hadoop fs -touchz foo/bar.txt
[david@tweety ~]$ hadoop fs -rm foo/bar.txt
Deleted hdfs://localhost/user/david/foo/bar.txt
[david@tweety ~]$ hadoop fs -rmr foo
Deleted hdfs://localhost/user/david/foo

But trying to run MaxTemperature (this time with a valid input file
and output directory) I still get an error:

[david@tweety java]$ hadoop MaxTemperature sample.txt output
09/10/13 16:18:29 INFO jvm.JvmMetrics: Initializing JVM Metrics with
processName=JobTracker, sessionId=
09/10/13 16:18:29 WARN mapred.JobClient: Use GenericOptionsParser for
parsing the arguments. Applications should implement Tool for the
same.
Exception in thread "main"
org.apache.hadoop.security.AccessControlException:
org.apache.hadoop.security.AccessControlException: Permission denied:
user=david, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x

The error message includes "root". I wonder if this is a clue? I'm out of ideas.
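It may be: the exception names the HDFS path /tmp (inode "tmp" owned by root:supergroup, mode rwxr-xr-x), not the local /tmp tested above, so the job's staging area apparently lives under an HDFS /tmp that only root can write to. A sketch of how one might confirm and fix this, assuming the HDFS superuser here is root (the commands are illustrative, not taken from this thread):

```shell
# List ownership of the top-level HDFS directories; /tmp should show
# root supergroup with mode rwxr-xr-x, matching the exception.
hadoop fs -ls /

# As the HDFS superuser, open /tmp to all users. (Mode 1777, the usual
# sticky-bit setting for a shared /tmp, may not be supported by older
# Hadoop releases; plain 777 works everywhere.)
sudo -u root hadoop fs -chmod 777 /tmp
```

Alternatively, hadoop.tmp.dir could be pointed at a directory the job user owns.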

Discussion Overview
group: hdfs-user @ hadoop
posted: Oct 12, '09 at 4:52p
active: Oct 14, '09 at 5:05p
posts: 11
users: 3
website: hadoop.apache.org...
irc: #hadoop
