FAQ
Hi, I tested the Cloudera CDH4 Hadoop test programs. When I run some of them I
always get an error at the end: *results.log (Permission denied).
I think there is a permission problem here. I already did hadoop fs -chmod
777 /
but it didn't help. I also executed the programs as the root user, but that
didn't help either.

$ hadoop jar hadoop-mapreduce-client-jobclient-2.0.0-cdh4.0.1-tests.jar TestDFSIO -write -nrFiles 5 -fileSize 200
12/07/17 11:29:39 INFO fs.TestDFSIO: TestDFSIO.0.0.6
12/07/17 11:29:39 INFO fs.TestDFSIO: nrFiles = 5
12/07/17 11:29:39 INFO fs.TestDFSIO: fileSize (MB) = 200.0
12/07/17 11:29:39 INFO fs.TestDFSIO: bufferSize = 1000000
12/07/17 11:29:39 INFO fs.TestDFSIO: baseDir = /benchmarks/TestDFSIO
12/07/17 11:29:40 INFO fs.TestDFSIO: creating control file: 209715200 bytes, 5 files
12/07/17 11:29:41 INFO fs.TestDFSIO: created control files for: 5 files
....
12/07/17 11:30:33 INFO util.NativeCodeLoader: Loaded the native-hadoop library
java.io.FileNotFoundException: TestDFSIO_results.log (Permission denied)
at java.io.FileOutputStream.openAppend(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:192)
at org.apache.hadoop.fs.TestDFSIO.analyzeResult(TestDFSIO.java:518)
at org.apache.hadoop.fs.TestDFSIO.run(TestDFSIO.java:607)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.fs.TestDFSIO.main(TestDFSIO.java:448)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:144)
at org.apache.hadoop.test.MapredTestDriver.run(MapredTestDriver.java:112)
at org.apache.hadoop.test.MapredTestDriver.main(MapredTestDriver.java:120)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)




$ hadoop jar hadoop-mapreduce-client-jobclient-2.0.0-cdh4.0.1-tests.jar nnbench -operation create_write -maps 10 -reduces 5 -blockSize 1 -bytesToWrite 0 -numberOfFiles 500 -replicationFactorPerFile 3 -readFileAfterOpen true
....
12/07/17 11:24:16 INFO util.NativeCodeLoader: Loaded the native-hadoop library
java.io.FileNotFoundException: NNBench_results.log (Permission denied)
at java.io.FileOutputStream.openAppend(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:192)
at org.apache.hadoop.hdfs.NNBench.analyzeResults(NNBench.java:458)
at org.apache.hadoop.hdfs.NNBench.run(NNBench.java:603)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.hdfs.NNBench.main(NNBench.java:577)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:144)
at org.apache.hadoop.test.MapredTestDriver.run(MapredTestDriver.java:112)
at org.apache.hadoop.test.MapredTestDriver.main(MapredTestDriver.java:120)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)


$ hadoop dfsadmin -report
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Configured Capacity: 191644925952 (178.48 GB)
Present Capacity: 191644925952 (178.48 GB)
DFS Remaining: 188033765968 (175.12 GB)
DFS Used: 3611159984 (3.36 GB)
DFS Used%: 1.88%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0
-------------------------------------------------
report: Access denied for user ubuntu. Superuser privilege is required


  • Philip Zeyliger at Jul 18, 2012 at 3:07 am
    Hi keamas,

    The 'root' user of HDFS is typically 'hdfs'. So, I suspect you will
    succeed if you "sudo -u hdfs" and run the same things.

    Very likely, the program is relying on /user/ubuntu existing and being
    writable. Again, you can try creating that as 'hdfs'. Changing / to 777
    would have helped, but chmod without -R is not recursive, so most likely
    you still have /user, which itself has tighter permissions.

    Several of the MR programs default to your "home" directory which is
    /user/{username} in HDFS.
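
    For example, a minimal sketch of both suggestions (the jar name and the
    'ubuntu' username are taken from the original post):

    $ sudo -u hdfs hadoop fs -mkdir /user/ubuntu
    $ sudo -u hdfs hadoop fs -chown ubuntu /user/ubuntu
    $ sudo -u hdfs hadoop jar hadoop-mapreduce-client-jobclient-2.0.0-cdh4.0.1-tests.jar TestDFSIO -write -nrFiles 5 -fileSize 200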

    Cheers,

    -- Philip
    On Tue, Jul 17, 2012 at 2:41 AM, keamas wrote:
    ....
  • Keamas at Jul 18, 2012 at 6:40 am
    Hey, this one worked with sudo -u hdfs:

    ubuntu@node9:/usr/lib/hadoop-mapreduce$ sudo -u hdfs hadoop dfsadmin -report
    [sudo] password for ubuntu:
    DEPRECATED: Use of this script to execute hdfs command is deprecated.
    Instead use the hdfs command for it.
    Configured Capacity: 191644925952 (178.48 GB)
    Present Capacity: 191644925952 (178.48 GB)
    DFS Remaining: 188193910784 (175.27 GB)
    DFS Used: 3451015168 (3.21 GB)
    DFS Used%: 1.8%
    Under replicated blocks: 0
    Blocks with corrupt replicas: 0
    Missing blocks: 0
    -------------------------------------------------
    Datanodes available: 3 (3 total, 0 dead)
    Live datanodes:
    Name: 10.110.11.77:50010 (node7.sky.lab)
    Hostname: node7.sky.lab
    Rack: /default
    Decommission Status : Normal
    Configured Capacity: 63881641984 (59.49 GB)
    DFS Used: 1134297088 (1.06 GB)
    Non DFS Used: 0 (0 KB)
    DFS Remaining: 62747344896 (58.44 GB)
    DFS Used%: 1.78%
    DFS Remaining%: 98.22%
    Last contact: Wed Jul 18 08:24:17 CEST 2012

    Name: 10.110.11.78:50010 (node8.sky.lab)
    Hostname: node8.sky.lab
    Rack: /default
    Decommission Status : Normal
    Configured Capacity: 63881641984 (59.49 GB)
    DFS Used: 1182420992 (1.1 GB)
    Non DFS Used: 0 (0 KB)
    DFS Remaining: 62699220992 (58.39 GB)
    DFS Used%: 1.85%
    DFS Remaining%: 98.15%
    Last contact: Wed Jul 18 08:24:17 CEST 2012

    Name: 10.110.11.79:50010 (node9.sky.lab)
    Hostname: node9.sky.lab
    Rack: /default
    Decommission Status : Normal
    Configured Capacity: 63881641984 (59.49 GB)
    DFS Used: 1134297088 (1.06 GB)
    Non DFS Used: 0 (0 KB)
    DFS Remaining: 62747344896 (58.44 GB)
    DFS Used%: 1.78%
    DFS Remaining%: 98.22%
    Last contact: Wed Jul 18 08:24:17 CEST 2012

    First I did chmod 777:
    ubuntu@node9:/usr/lib/hadoop-mapreduce$ sudo -u hdfs hadoop fs -chmod 777 /
    ubuntu@node9:/usr/lib/hadoop-mapreduce$ sudo -u hdfs hadoop fs -chmod 777 /user/
    ubuntu@node9:/usr/lib/hadoop-mapreduce$ sudo -u hdfs hadoop fs -chmod 777 /user/ubuntu

    After this I tried it as the ubuntu user and with sudo -u hdfs, but the
    other test programs still didn't work.

    ubuntu@node9:/usr/lib/hadoop-mapreduce$ sudo -u hdfs hadoop jar hadoop-mapreduce-client-jobclient-2.0.0-cdh4.0.1-tests.jar TestDFSIO -write -nrFiles 5 -fileSize 200
    12/07/18 08:28:51 INFO fs.TestDFSIO: TestDFSIO.0.0.6
    12/07/18 08:28:51 INFO fs.TestDFSIO: nrFiles = 5
    12/07/18 08:28:51 INFO fs.TestDFSIO: fileSize (MB) = 200.0
    12/07/18 08:28:51 INFO fs.TestDFSIO: bufferSize = 1000000
    12/07/18 08:28:51 INFO fs.TestDFSIO: baseDir = /benchmarks/TestDFSIO
    12/07/18 08:28:52 INFO fs.TestDFSIO: creating control file: 209715200 bytes, 5 files
    12/07/18 08:28:52 INFO fs.TestDFSIO: created control files for: 5 files
    12/07/18 08:28:52 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    12/07/18 08:28:53 INFO mapred.FileInputFormat: Total input paths to process : 5
    12/07/18 08:28:53 WARN conf.Configuration: dfs.https.address is deprecated. Instead, use dfs.namenode.https-address
    12/07/18 08:28:53 WARN conf.Configuration: io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
    12/07/18 08:28:53 INFO mapred.JobClient: Running job: job_201207161740_0028
    12/07/18 08:28:54 INFO mapred.JobClient: map 0% reduce 0%
    12/07/18 08:29:07 INFO mapred.JobClient: map 20% reduce 0%
    12/07/18 08:29:09 INFO mapred.JobClient: map 60% reduce 0%
    12/07/18 08:29:10 INFO mapred.JobClient: map 100% reduce 0%
    12/07/18 08:29:48 INFO mapred.JobClient: map 100% reduce 33%
    12/07/18 08:29:49 INFO mapred.JobClient: map 100% reduce 100%
    12/07/18 08:29:51 INFO mapred.JobClient: Job complete: job_201207161740_0028
    12/07/18 08:29:51 INFO mapred.JobClient: Counters: 33
    12/07/18 08:29:51 INFO mapred.JobClient: File System Counters
    12/07/18 08:29:51 INFO mapred.JobClient: FILE: Number of bytes read=248
    12/07/18 08:29:51 INFO mapred.JobClient: FILE: Number of bytes written=343737
    12/07/18 08:29:51 INFO mapred.JobClient: FILE: Number of read operations=0
    12/07/18 08:29:51 INFO mapred.JobClient: FILE: Number of large read operations=0
    12/07/18 08:29:51 INFO mapred.JobClient: FILE: Number of write operations=0
    12/07/18 08:29:51 INFO mapred.JobClient: HDFS: Number of bytes read=1200
    12/07/18 08:29:51 INFO mapred.JobClient: HDFS: Number of bytes written=1048576078
    12/07/18 08:29:51 INFO mapred.JobClient: HDFS: Number of read operations=16
    12/07/18 08:29:51 INFO mapred.JobClient: HDFS: Number of large read operations=0
    12/07/18 08:29:51 INFO mapred.JobClient: HDFS: Number of write operations=7
    12/07/18 08:29:51 INFO mapred.JobClient: Job Counters
    12/07/18 08:29:51 INFO mapred.JobClient: Launched map tasks=5
    12/07/18 08:29:51 INFO mapred.JobClient: Launched reduce tasks=1
    12/07/18 08:29:51 INFO mapred.JobClient: Data-local map tasks=5
    12/07/18 08:29:51 INFO mapred.JobClient: Total time spent by all maps in occupied slots (ms)=163480
    12/07/18 08:29:51 INFO mapred.JobClient: Total time spent by all reduces in occupied slots (ms)=12550
    12/07/18 08:29:51 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
    12/07/18 08:29:51 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
    12/07/18 08:29:51 INFO mapred.JobClient: Map-Reduce Framework
    12/07/18 08:29:51 INFO mapred.JobClient: Map input records=5
    12/07/18 08:29:51 INFO mapred.JobClient: Map output records=25
    12/07/18 08:29:51 INFO mapred.JobClient: Map output bytes=374
    12/07/18 08:29:51 INFO mapred.JobClient: Input split bytes=640
    12/07/18 08:29:51 INFO mapred.JobClient: Combine input records=0
    12/07/18 08:29:51 INFO mapred.JobClient: Combine output records=0
    12/07/18 08:29:51 INFO mapred.JobClient: Reduce input groups=5
    12/07/18 08:29:51 INFO mapred.JobClient: Reduce shuffle bytes=496
    12/07/18 08:29:51 INFO mapred.JobClient: Reduce input records=25
    12/07/18 08:29:51 INFO mapred.JobClient: Reduce output records=5
    12/07/18 08:29:51 INFO mapred.JobClient: Spilled Records=50
    12/07/18 08:29:51 INFO mapred.JobClient: CPU time spent (ms)=18820
    12/07/18 08:29:51 INFO mapred.JobClient: Physical memory (bytes) snapshot=1105924096
    12/07/18 08:29:51 INFO mapred.JobClient: Virtual memory (bytes) snapshot=6380105728
    12/07/18 08:29:51 INFO mapred.JobClient: Total committed heap usage (bytes)=899416064
    12/07/18 08:29:51 INFO mapred.JobClient: org.apache.hadoop.mapreduce.lib.input.FileInputFormatCounter
    12/07/18 08:29:51 INFO mapred.JobClient: BYTES_READ=130
    12/07/18 08:29:51 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    java.io.FileNotFoundException: TestDFSIO_results.log (Permission denied)
    at java.io.FileOutputStream.openAppend(Native Method)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:192)
    at org.apache.hadoop.fs.TestDFSIO.analyzeResult(TestDFSIO.java:518)
    at org.apache.hadoop.fs.TestDFSIO.run(TestDFSIO.java:607)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.fs.TestDFSIO.main(TestDFSIO.java:448)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:144)
    at org.apache.hadoop.test.MapredTestDriver.run(MapredTestDriver.java:112)
    at org.apache.hadoop.test.MapredTestDriver.main(MapredTestDriver.java:120)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
  • Mauricio Vacas at Sep 18, 2012 at 7:27 pm
    Ran into this same issue. The permission issue for the DFSIO benchmark is
    that it can't write to the local file system. If you run the following
    command from the /tmp folder, it should work:
    sudo -u hdfs hadoop jar /usr/lib/hadoop-0.20-mapreduce/hadoop-2.0.0-mr1-cdh4.0.1-test.jar TestDFSIO -write -nrFiles 5 -fileSize 200
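
    Spelled out with the working-directory change (illustrative; same jar
    path as above):

    $ cd /tmp
    $ sudo -u hdfs hadoop jar /usr/lib/hadoop-0.20-mapreduce/hadoop-2.0.0-mr1-cdh4.0.1-test.jar TestDFSIO -write -nrFiles 5 -fileSize 200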

    Cheers,
    Mauricio
    On Tuesday, July 17, 2012 11:40:30 PM UTC-7, keamas wrote:
    ....
  • Keamas at Jul 23, 2012 at 7:30 pm
    OK, I did the chmod 777:
    ubuntu@node9:/usr/lib/hadoop-mapreduce$ sudo -u hdfs hadoop fs -chmod -R 777 /
    and also added the ubuntu user as a superuser in the Hue web GUI, but
    this didn't help.
    I also executed it with sudo -u hdfs, which is the HDFS root. Also no
    success.

    But when I execute it as the normal root user, it works.
    (In Ubuntu you need to activate the root user with sudo passwd root.)
  • Keamas at Jul 23, 2012 at 7:36 pm
    Oh, I forgot: maybe someone can tell me why and how it would work with
    other users.
    This would be interesting for me...

    On Monday, July 23, 2012 at 9:30:20 PM UTC+2, keamas wrote:
    ....
  • Vikram Srivastava at Jul 24, 2012 at 12:05 am
    You need to create a home directory for each user. Typically that dir is
    "/user/<username>" on HDFS. Steps are:
    1. As the hdfs user, create /user/<username> (sudo -u hdfs hadoop fs -mkdir /user/foo)
    2. Make <username> the owner of this dir (sudo -u hdfs hadoop fs -chown -R foo /user/foo)
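
    A quick way to confirm the change took effect (a sketch; 'foo' is the
    placeholder username from the steps above):

    $ hadoop fs -ls /user    # /user/foo should now show foo as its owner
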
    On Mon, Jul 23, 2012 at 12:36 PM, keamas wrote:
    ....
  • Keamas at Jul 24, 2012 at 5:24 am
    Hey, thank you. I did it, but I still get the same error:

    ubuntu@node9:~$ sudo -u hdfs hadoop fs -mkdir /user/ubuntu
    [sudo] password for ubuntu:
    mkdir: `/user/ubuntu': File exists
    ubuntu@node9:~$ sudo -u hdfs hadoop fs -chown -R ubuntu /user/ubuntu
    ubuntu@node9:~$ cd /usr/lib/hadoop-mapreduce
    ubuntu@node9:/usr/lib/hadoop-mapreduce$ hadoop jar hadoop-mapreduce-client-jobclient-2.0.0-cdh4.0.1-tests.jar TestDFSIO -write -nrFiles 5 -fileSize 200
    12/07/24 07:20:58 INFO fs.TestDFSIO: TestDFSIO.0.0.6
    12/07/24 07:20:58 INFO fs.TestDFSIO: nrFiles = 5
    12/07/24 07:20:58 INFO fs.TestDFSIO: fileSize (MB) = 200.0
    12/07/24 07:20:58 INFO fs.TestDFSIO: bufferSize = 1000000
    12/07/24 07:20:58 INFO fs.TestDFSIO: baseDir = /benchmarks/TestDFSIO
    12/07/24 07:20:58 INFO fs.TestDFSIO: creating control file: 209715200 bytes, 5 files
    12/07/24 07:20:59 INFO fs.TestDFSIO: created control files for: 5 files
    12/07/24 07:20:59 INFO mapreduce.JobSubmissionFiles: Permissions on staging directory hdfs://node9.sky.lab:8020/user/ubuntu/.staging are incorrect: rwxrwxrwx. Fixing permissions to correct value rwx------
    12/07/24 07:20:59 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    12/07/24 07:20:59 INFO mapred.FileInputFormat: Total input paths to process : 5
    12/07/24 07:20:59 WARN conf.Configuration: dfs.https.address is deprecated. Instead, use dfs.namenode.https-address
    12/07/24 07:20:59 WARN conf.Configuration: io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
    12/07/24 07:21:00 INFO mapred.JobClient: Running job: job_201207161740_0058
    12/07/24 07:21:01 INFO mapred.JobClient: map 0% reduce 0%
    12/07/24 07:21:16 INFO mapred.JobClient: map 20% reduce 0%
    12/07/24 07:21:18 INFO mapred.JobClient: map 60% reduce 0%
    12/07/24 07:21:19 INFO mapred.JobClient: map 100% reduce 0%
    12/07/24 07:21:53 INFO mapred.JobClient: map 100% reduce 33%
    12/07/24 07:21:55 INFO mapred.JobClient: map 100% reduce 100%
    12/07/24 07:21:57 INFO mapred.JobClient: Job complete: job_201207161740_0058
    12/07/24 07:21:57 INFO mapred.JobClient: Counters: 33
    12/07/24 07:21:57 INFO mapred.JobClient: File System Counters
    12/07/24 07:21:57 INFO mapred.JobClient: FILE: Number of bytes read=246
    12/07/24 07:21:57 INFO mapred.JobClient: FILE: Number of bytes written=347611
    12/07/24 07:21:57 INFO mapred.JobClient: FILE: Number of read operations=0
    12/07/24 07:21:57 INFO mapred.JobClient: FILE: Number of large read operations=0
    12/07/24 07:21:57 INFO mapred.JobClient: FILE: Number of write operations=0
    12/07/24 07:21:57 INFO mapred.JobClient: HDFS: Number of bytes read=1200
    12/07/24 07:21:57 INFO mapred.JobClient: HDFS: Number of bytes written=1048576078
    12/07/24 07:21:57 INFO mapred.JobClient: HDFS: Number of read operations=16
    12/07/24 07:21:57 INFO mapred.JobClient: HDFS: Number of large read operations=0
    12/07/24 07:21:57 INFO mapred.JobClient: HDFS: Number of write operations=7
    12/07/24 07:21:57 INFO mapred.JobClient: Job Counters
    12/07/24 07:21:57 INFO mapred.JobClient: Launched map tasks=5
    12/07/24 07:21:57 INFO mapred.JobClient: Launched reduce tasks=1
    12/07/24 07:21:57 INFO mapred.JobClient: Data-local map tasks=5
    12/07/24 07:21:57 INFO mapred.JobClient: Total time spent by all maps in occupied slots (ms)=158566
    12/07/24 07:21:57 INFO mapred.JobClient: Total time spent by all reduces in occupied slots (ms)=10191
    12/07/24 07:21:57 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
    12/07/24 07:21:57 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
    12/07/24 07:21:57 INFO mapred.JobClient: Map-Reduce Framework
    12/07/24 07:21:57 INFO mapred.JobClient: Map input records=5
    12/07/24 07:21:57 INFO mapred.JobClient: Map output records=25
    12/07/24 07:21:57 INFO mapred.JobClient: Map output bytes=374
    12/07/24 07:21:57 INFO mapred.JobClient: Input split bytes=640
    12/07/24 07:21:57 INFO mapred.JobClient: Combine input records=0
    12/07/24 07:21:57 INFO mapred.JobClient: Combine output records=0
    12/07/24 07:21:57 INFO mapred.JobClient: Reduce input groups=5
    12/07/24 07:21:57 INFO mapred.JobClient: Reduce shuffle bytes=498
    12/07/24 07:21:57 INFO mapred.JobClient: Reduce input records=25
    12/07/24 07:21:57 INFO mapred.JobClient: Reduce output records=5
    12/07/24 07:21:57 INFO mapred.JobClient: Spilled Records=50
    12/07/24 07:21:57 INFO mapred.JobClient: CPU time spent (ms)=18230
    12/07/24 07:21:57 INFO mapred.JobClient: Physical memory (bytes) snapshot=1200144384
    12/07/24 07:21:57 INFO mapred.JobClient: Virtual memory (bytes) snapshot=6375264256
    12/07/24 07:21:57 INFO mapred.JobClient: Total committed heap usage (bytes)=1025703936
    12/07/24 07:21:57 INFO mapred.JobClient: org.apache.hadoop.mapreduce.lib.input.FileInputFormatCounter
    12/07/24 07:21:57 INFO mapred.JobClient: BYTES_READ=130
    12/07/24 07:21:57 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    java.io.FileNotFoundException: TestDFSIO_results.log (Permission denied)
    at java.io.FileOutputStream.openAppend(Native Method)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:192)
    at org.apache.hadoop.fs.TestDFSIO.analyzeResult(TestDFSIO.java:518)
    at org.apache.hadoop.fs.TestDFSIO.run(TestDFSIO.java:607)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.fs.TestDFSIO.main(TestDFSIO.java:448)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:144)
    at org.apache.hadoop.test.MapredTestDriver.run(MapredTestDriver.java:112)
    at org.apache.hadoop.test.MapredTestDriver.main(MapredTestDriver.java:120)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)




    On Tuesday, July 24, 2012 at 2:05:41 AM UTC+2, Vikram Srivastava wrote:
    ....
  • Yuki at Oct 11, 2012 at 11:49 pm
    I ran into a similar problem, and while I'm not sure exactly how to fix
    the permissions issue directly, I worked around it by appending -resFile
    /tmp/TestDFSIOresults.txt (or just about any location in HDFS where you
    know the hdfs user is already the owner, which you can check with the
    command "bin/hadoop fs -ls -R /"):

    /usr/lib/hadoop$ sudo -u hdfs bin/hadoop jar hadoop-mr2-test.jar TestDFSIO -write -nrFiles 1 -fileSize 512 -resFile /tmp/TestDFSIOresults.txt

    I don't think I saw anyone here mention that workaround, so I thought I'd
    respond. Thanks!
    On Tuesday, July 17, 2012 2:41:25 AM UTC-7, keamas wrote:
    ....
  • Dhanasekaran Anbalagan at Mar 6, 2013 at 10:36 am
    Hi Marcel,

    Thanks, it works for me as the hdfs user. But the same command,
    hadoop jar hadoop-test.jar TestDFSIO -write -nrFiles 10 -fileSize 1000
    -resFile /tmp/TestDFSIOresults.txt, is not working as other users.
    On Thursday, October 11, 2012 7:49:30 PM UTC-4, Yuki wrote:
    ....
  • Cycle Peter at May 9, 2013 at 11:21 pm
    It's writing to the Linux filesystem, not to HDFS. It's either an
    undocumented feature of the test or a bug. It works for some people
    because the user who invokes the test has write permission on the local
    filesystem directory /tmp.
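
    A quick way to check that locally before launching the job (a sketch;
    the client appends to TestDFSIO_results.log in its current working
    directory):

    $ cd /tmp
    $ touch TestDFSIO_results.log && rm TestDFSIO_results.log   # if this succeeds, the benchmark's local append will too
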
    On Wednesday, March 6, 2013 2:36:54 AM UTC-8, Dhanasekaran Anbalagan wrote:
    ....
  • Abhishek Jha at Jul 29, 2013 at 9:00 am
    Basically it creates the log file, i.e. TestDFSIO_results.log, in the
    Linux filesystem, not HDFS, as said above. I just changed my present
    working directory to $HOME and it worked like a charm. The invoking user
    has to have write permission in the pwd.
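
    For example (illustrative; same jar and arguments as earlier in the
    thread):

    $ cd "$HOME"
    $ hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.0.1-tests.jar TestDFSIO -write -nrFiles 5 -fileSize 200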

    Thanks,
    Abhishek
    On Friday, 10 May 2013 04:51:36 UTC+5:30, cycle...@gmail.com wrote:
    ....
