Hi,

I tried to run the following Java code and it doesn't work. I have pasted the code below:

import java.io.IOException;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class testHadoop {

    public static final String DIR_HADOOP = "hdfs://my.machine.com";
    public static final String PORT_HADOOP = "9000";

    public static void main(String[] args) {
        // Point the client at the remote NameNode.
        Configuration config = new Configuration();
        config.set("fs.default.name", DIR_HADOOP + ":" + PORT_HADOOP);

        try {
            FileSystem hadoopFileSystem = FileSystem.get(config);

            // Create a "test" directory under the HDFS working directory.
            String directory = "test";
            Path hadoopDirectory = new Path(hadoopFileSystem.getWorkingDirectory() + "/" + directory);
            hadoopFileSystem.mkdirs(hadoopDirectory);

            // Upload a local file into it.
            Path sourceDirectory = new Path("C://Windows/media/ringout.wav");
            hadoopFileSystem.copyFromLocalFile(sourceDirectory, hadoopDirectory);

            // Copy the uploaded file back to the local disk.
            Path sourceFile = new Path(hadoopFileSystem.getWorkingDirectory() + "/test/ringout.wav");
            Path targetDirectory = new Path("C://");
            hadoopFileSystem.copyToLocalFile(sourceFile, targetDirectory);

            // Clean up the HDFS directory.
            hadoopFileSystem.delete(hadoopDirectory, true);
        } catch (IOException ex) {
            Logger.getLogger(testHadoop.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}



The result of this code is an exception:

org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=varadero\miguelangel, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:96)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:58)
        at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:914)
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:262)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1120)
        at hadoop.testHadoop.main(testHadoop.java:37)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=varadero\miguelangel, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:176)
        at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:157)
        at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.checkPermission(PermissionChecker.java:105)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4514)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4484)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:1766)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:1735)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:542)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

        at org.apache.hadoop.ipc.Client.call(Client.java:740)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
        at $Proxy0.mkdirs(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy0.mkdirs(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:912)
        ... 3 more



What happened?



Miguel Ángel Álvarez de la Concepción

Departamento de Lenguajes y Sistemas Informáticos

Escuela Técnica Superior de Ingeniería Informática

Universidad de Sevilla

Phone: 954.556.086

Email: maalvarez@us.es


  • Jeff Zhang at Mar 9, 2010 at 4:16 pm
    Add a UGI configuration like this:
    conf.set("hadoop.job.ugi", your_hadoop_user_name + "," + your_hadoop_group_name);


    --
    Best Regards

    Jeff Zhang
  • Miguel Ángel Álvarez de la Concepción at Mar 10, 2010 at 10:57 am
    Thanks!



    Now the error occurs when copying back the remote file I uploaded before:



    java.io.IOException: Cannot run program "chmod": CreateProcess error=2, The system can’t find the specified file
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:459)
            at org.apache.hadoop.util.Shell.runCommand(Shell.java:149)
            at org.apache.hadoop.util.Shell.run(Shell.java:134)
            at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:286)
            at org.apache.hadoop.util.Shell.execCommand(Shell.java:354)
            at org.apache.hadoop.util.Shell.execCommand(Shell.java:337)
            at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:481)
            at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:473)
            at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:280)
            at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:372)
            at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:484)
            at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:465)
            at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:372)
            at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:208)
            at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:142)
            at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1216)
            at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1197)
            at hadoop.testHadoop.main(testHadoop.java:53)
    Caused by: java.io.IOException: CreateProcess error=2, The system can’t find the specified file
            at java.lang.ProcessImpl.create(Native Method)
            at java.lang.ProcessImpl.<init>(ProcessImpl.java:81)
            at java.lang.ProcessImpl.start(ProcessImpl.java:30)
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:452)
            ... 17 more



    Thanks again for your help.



  • Jeff Zhang at Mar 10, 2010 at 1:11 pm
    It seems you are running it on Windows, so you should install Cygwin and add
    C:/cygwin/bin to the Path environment variable.
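
    The copyToLocalFile call fails because the local side goes through RawLocalFileSystem, which shells out to "chmod" to set permissions on the file it writes (see the stack trace above), so a chmod binary must be resolvable on the PATH of the JVM. As a quick sanity check, here is a small sketch that tries to launch chmod the same way; ChmodCheck is only an illustrative name.

    // Illustrative sketch: spawns "chmod" the way Hadoop's Shell utility does.
    // Without Cygwin's bin directory on PATH this fails with the same
    // "CreateProcess error=2" seen in the stack trace above.
    public class ChmodCheck {
        public static void main(String[] args) throws Exception {
            try {
                Process p = new ProcessBuilder("chmod", "--version").start();
                System.out.println("chmod found on PATH, exit code " + p.waitFor());
            } catch (java.io.IOException e) {
                System.out.println("chmod not found on PATH: " + e);
            }
        }
    }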



    --
    Best Regards

    Jeff Zhang
  • Miguel Ángel Álvarez de la Concepción at Mar 10, 2010 at 1:53 pm
    I have installed Hadoop on CentOS (Linux) and the test code is running on Windows.

    Do I need Cygwin to run the test code?



  • Jeff Zhang at Mar 10, 2010 at 2:23 pm
    Yes, I think so.

    --
    Best Regards

    Jeff Zhang
  • Miguel Ángel Álvarez de la Concepción at Mar 10, 2010 at 3:46 pm
    I solved the problem by reading the file back through a stream instead of using copyToLocalFile.

    I'm posting the whole code for anyone who wants to see it:



    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class testHadoop {

        public static final String DIR_HADOOP = "hdfs://my.machine.com";
        public static final String PORT_HADOOP = "9000";

        public static void main(String[] args) {
            Configuration config = new Configuration();
            config.set("fs.default.name", DIR_HADOOP + ":" + PORT_HADOOP);
            // Act as the HDFS superuser/group instead of the local Windows account.
            config.set("hadoop.job.ugi", "root, supergroup");

            try {
                FileSystem hadoopFileSystem = FileSystem.get(config);

                Path hadoopDirectory = new Path(hadoopFileSystem.getWorkingDirectory() + "/test");
                hadoopFileSystem.mkdirs(hadoopDirectory);

                Path directorioOrigen = new Path("C://Windows/media/ringout.wav");
                hadoopFileSystem.copyFromLocalFile(directorioOrigen, hadoopDirectory);

                // Read the remote file back through a stream instead of
                // copyToLocalFile, so Hadoop's local-filesystem code (and its
                // chmod call) is never involved.
                Path ficheroOrigen = new Path(hadoopFileSystem.getWorkingDirectory() + "/test/ringout.wav");
                FSDataInputStream in = hadoopFileSystem.open(ficheroOrigen);
                FileOutputStream out = new FileOutputStream("C://ringout.wav");

                while (in.available() > 0) {
                    out.write(in.readByte());
                }

                out.close();
                in.close();

                hadoopFileSystem.delete(hadoopDirectory, true);
            } catch (IOException ex) {
                Logger.getLogger(testHadoop.class.getName()).log(Level.SEVERE, null, ex);
            }
        }
    }
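
    A note on the copy loop: InputStream.available() is not a reliable end-of-file test in general, although for an FSDataInputStream over HDFS it reports the bytes remaining in the file, which is why the loop above terminates correctly. A shorter, buffered equivalent is Hadoop's own IOUtils helper; this sketch reuses hadoopFileSystem, ficheroOrigen and config from the code above.

    // Sketch of an alternative to the byte-by-byte while loop above.
    FSDataInputStream in = hadoopFileSystem.open(ficheroOrigen);
    FileOutputStream out = new FileOutputStream("C://ringout.wav");
    // Copies through a buffer and, with the final "true", closes both streams.
    org.apache.hadoop.io.IOUtils.copyBytes(in, out, config, true);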



    Thanks for everything!


