Hi Guys,

I am trying to copy Hadoop data from one cluster to another, but I keep
getting this error: *Server returned HTTP response code: 500 for URL*

My distcp command is:
scripts/hadoop.sh distcp
hftp://c13-hadoop1-nn-r0-n1:50070/user/dwadmin/live/data/warehouse/facts/page_events/day=2011-05-17 hdfs://phx1-rb-dev40-pipe1.cnet.com:9000/user/sgehlot

Note that I have *day=2011-05-17* in my file path.

I found this online: https://issues.apache.org/jira/browse/HDFS-31

Does this issue still exist? Could it be the reason my job is failing?
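The streamFile URL in the stack trace below offers a clue: the failing file under _logs/history/ already has percent-escapes in its name (%3A, %3D), yet the datanode URL shows %253A and %253D. Since %25 is the escape for '%', that means the already-encoded name was URL-encoded a second time, which would be consistent with the kind of encoding bug HDFS-31 describes. A quick illustration of the effect (plain shell, no cluster needed; the filename is taken from the log below):

```shell
# Name of the failing history file as stored in HDFS -- it already
# contains percent-escapes (%3A for ':', %3D for '='):
name='CopyFactsToHive%3A+page_events+day%3D2011-05-17'

# URL-encoding it a second time turns every '%' into '%25', producing
# exactly the %253A / %253D seen in the failing streamFile URL:
printf '%s\n' "$name" | sed 's/%/%25/g'
# -> CopyFactsToHive%253A+page_events+day%253D2011-05-17
```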

Job Error log:

2011-05-18 11:34:56,505 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
Initializing JVM Metrics with processName=MAP, sessionId=
2011-05-18 11:34:56,713 INFO org.apache.hadoop.mapred.MapTask:
numReduceTasks: 0
2011-05-18 11:34:57,039 INFO org.apache.hadoop.tools.DistCp: FAIL
day=2011-05-17/_logs/history/c13-hadoop1-nn-r0-n1_1291919715221_job_201012091035_41977_dwadmin_CopyFactsToHive%3A+page_events+day%3D2011-05-17
: java.io.IOException: Server returned HTTP response code: 500 for URL:
http://c13-hadoop1-wkr-r10-n4.cnet.com:50075/streamFile?filename=/user/dwadmin/live/data/warehouse/facts/page_events/day=2011-05-17/_logs/history/c13-hadoop1-nn-r0-n1_1291919715221_job_201012091035_41977_dwadmin_CopyFactsToHive%253A+page_events+day%253D2011-05-17&ugi=sgehlot,user
at
sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1313)
at org.apache.hadoop.hdfs.HftpFileSystem.open(HftpFileSystem.java:157)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:398)
at org.apache.hadoop.tools.DistCp$CopyFilesMapper.copy(DistCp.java:410)
at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:537)
at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:306)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
at org.apache.hadoop.mapred.Child.main(Child.java:170)

2011-05-18 11:35:06,118 WARN org.apache.hadoop.mapred.TaskTracker: Error
running child
java.io.IOException: Copied: 0 Skipped: 5 Failed: 1
at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:572)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
at org.apache.hadoop.mapred.Child.main(Child.java:170)
2011-05-18 11:35:06,124 INFO org.apache.hadoop.mapred.TaskRunner: Runnning
cleanup for the task

Any help is appreciated.

Thanks,
Sonia


  • Bill Graham at May 18, 2011 at 9:35 pm
    Are you able to distcp folders that don't have special characters?

    What versions are the two clusters running, and if there's a mismatch,
    are you running the job on the destination cluster? If there is a
    mismatch, you'll need to use hftp:

    http://hadoop.apache.org/common/docs/current/distcp.html#cpver
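    For a cross-version copy, the source is read through hftp (the
    namenode's read-only HTTP interface) and the job runs on the
    destination cluster, writing via hdfs://. A minimal sketch using a
    hypothetical helper to assemble the command from the hostnames in this
    thread (ports 50070 and 9000 assumed from the original command):

```shell
# Hypothetical helper: assemble a cross-version distcp invocation.
# hftp is read-only and HTTP-based, so the copy must run on the
# destination cluster, reading hftp:// and writing hdfs://.
build_distcp() {
  src_host=$1; dst_host=$2; src_path=$3; dst_path=$4
  echo "hadoop distcp hftp://${src_host}:50070${src_path} hdfs://${dst_host}:9000${dst_path}"
}

# Echoes the command rather than running it, so it can be inspected first:
build_distcp c13-hadoop1-nn-r0-n1 phx1-rb-dev40-pipe1.cnet.com \
  /user/dwadmin/live/data/warehouse/facts/page_events/day=2011-05-17 \
  /user/sgehlot
```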
  • Sonia gehlot at May 18, 2011 at 9:43 pm
    Yes, I am able to distcp folders that don't have special characters in
    the file path.

    I think the Hadoop version is the same on both clusters, but I am using
    hftp anyway, and yes, I am running the job on the destination cluster.

Discussion Overview
group: common-user
categories: hadoop
posted: May 18, '11 at 7:45p
active: May 18, '11 at 9:43p
posts: 3
users: 2
website: hadoop.apache.org...
irc: #hadoop

2 users in discussion

Sonia gehlot: 2 posts; Bill Graham: 1 post
