one Java process?
Regards,
Xiaobo Gu
On Thu, Aug 11, 2011 at 1:07 PM, Lance Norskog wrote:
If the server is dedicated to this job, you might as well give it
10-15g. After that shakes out, try changing the number of mappers &
reducers.
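For concreteness, a rough sketch of where those knobs live on a 0.20.x-style install (the numbers below are illustrative guesses, not values prescribed anywhere in this thread):

    # conf/hadoop-env.sh -- heap for each Hadoop daemon JVM, in MB.
    # Note this applies to every daemon (NameNode, DataNode, JobTracker,
    # TaskTracker) separately, so budget for all of them on one box.
    export HADOOP_HEAPSIZE=12288

    <!-- conf/mapred-site.xml -- per-node task slots and per-task child heap
         (0.20.x property names; values are just one option for an 8-core box) -->
    <property>
      <name>mapred.tasktracker.map.tasks.maximum</name>
      <value>4</value>
    </property>
    <property>
      <name>mapred.tasktracker.reduce.tasks.maximum</name>
      <value>2</value>
    </property>
    <property>
      <name>mapred.child.java.opts</name>
      <value>-Xmx1024m</value>
    </property>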
--
Lance Norskog
goksron@gmail.com
On Tue, Aug 9, 2011 at 2:06 AM, Xiaobo Gu wrote:
Hi Adi,
Thanks for your response. On an SMP server with 32 GB of RAM and 8 cores, what would you suggest for HADOOP_HEAPSIZE? The server will be dedicated to a single-node Hadoop setup with one DataNode instance, and it will run 4 map and reduce tasks.
Regards,
Xiaobo Gu
On Sun, Aug 7, 2011 at 11:35 PM, Adi wrote:
Caused by: java.io.IOException: error=12, Not enough space

You either do not have enough memory allocated to your Hadoop daemons (via HADOOP_HEAPSIZE) or not enough swap space.
-Adi
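A few checks that usually go with that diagnosis (exact commands depend on the OS; the "Not enough space" wording looks like the Solaris ENOMEM message, where fork()ing the /bin/ls child has to reserve swap roughly comparable to the parent JVM's committed memory):

    swap -s                           # Solaris: allocated/reserved/available swap summary
    free -m                           # Linux: physical memory and swap, in MB
    ps -o pid,vsz,args -p <jvm-pid>   # virtual size of the JVM that is doing the fork
                                      # (<jvm-pid> is a placeholder for the task or daemon PID)

If swap turns out to be the constraint, the usual options are to add swap or to lower the daemon and child heaps so fork() has headroom.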
On Sun, Aug 7, 2011 at 5:48 AM, Xiaobo Gu wrote:
Hi,
I am trying to write a map-reduce job to convert CSV files to SequenceFiles, but the job fails with the following error:
java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=12, Not enough space
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:200)
    at org.apache.hadoop.util.Shell.run(Shell.java:182)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:461)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:444)
    at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:540)
    at org.apache.hadoop.fs.RawLocalFileSystem.access$100(RawLocalFileSystem.java:37)
    at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:417)
    at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getOwner(RawLocalFileSystem.java:400)
    at org.apache.hadoop.mapred.TaskLog.obtainLogDirOwner(TaskLog.java:176)
    at org.apache.hadoop.mapred.TaskLogsTruncater.truncateLogs(TaskLogsTruncater.java:124)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:264)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
    at org.apache.hadoop.mapred.Child.main(Child.java:253)
Caused by: java.io.IOException: error=12, Not enough space
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:53)
    at java.lang.ProcessImpl.start(ProcessImpl.java:65)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
    ... 16 more
    at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:442)
    at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getOwner(RawLocalFileSystem.java:400)
    at org.apache.hadoop.mapred.TaskLog.obtainLogDirOwner(TaskLog.java:176)
    at org.apache.hadoop.mapred.TaskLogsTruncater.truncateLogs(TaskLogsTruncater.java:124)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:264)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
    at org.apache.hadoop.mapred.Child.main(Child.java:253)