FAQ
I've carefully followed the instructions at
http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_%28Single-Node_Cluster%29,
but the task fails with the following stack trace:

08/05/13 19:16:26 INFO mapred.JobClient: map 100% reduce 0%
08/05/13 19:18:55 INFO mapred.JobClient: Task Id : task_200805131750_0005_m_000001_0, Status : FAILED
Map output lost, rescheduling: getMapOutput(task_200805131750_0005_m_000001_0,0) failed :
org.apache.hadoop.util.DiskChecker$DiskErrorException: Could not find task_200805131750_0005_m_000001_0/file.out.index in any of the configured local directories
        at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathToRead(LocalDirAllocator.java:359)
        at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathToRead(LocalDirAllocator.java:138)
        at org.apache.hadoop.mapred.TaskTracker$MapOutputServlet.doGet(TaskTracker.java:2253)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:689)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
        at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:427)
        at org.mortbay.jetty.servlet.WebApplicationHandler.dispatch(WebApplicationHandler.java:475)
        at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:567)
        at org.mortbay.http.HttpContext.handle(HttpContext.java:1565)
        at org.mortbay.jetty.servlet.WebApplicationContext.handle(WebApplicationContext.java:635)
        at org.mortbay.http.HttpContext.handle(HttpContext.java:1517)
        at org.mortbay.http.HttpServer.service(HttpServer.java:954)
        at org.mortbay.http.HttpConnection.service(HttpConnection.java:814)
        at org.mortbay.http.HttpConnection.handleNext(HttpConnection.java:981)
        at org.mortbay.http.HttpConnection.handle(HttpConnection.java:831)
        at org.mortbay.http.SocketListener.handleConnection(SocketListener.java:244)
        at org.mortbay.util.ThreadedServer.handle(ThreadedServer.java:357)
        at org.mortbay.util.ThreadPool$PoolThread.run(ThreadPool.java:534)

My hadoop-site.xml file is as follows:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>

  <property>
    <name>fs.default.name</name>
    <value>localhost:54310</value>
  </property>

  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:54311</value>
  </property>

  <property>
    <name>hadoop.tmp.dir</name>
    <value>tmp_storage</value>
  </property>

  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>

  <property>
    <name>mapred.map.tasks.speculative.execution</name>
    <value>false</value>
  </property>

  <property>
    <name>mapred.reduce.tasks.speculative.execution</name>
    <value>false</value>
  </property>

</configuration>


Any help would be appreciated.

Thanks
Shimon


  • Arun C Murthy, May 13, 2008 at 4:46 pm

    <property>
    <name>hadoop.tmp.dir</name>
    <value>tmp_storage</value>
    </property>
    Could you try changing the above to an absolute path and check?
    That path should be valid on each of the tasktrackers.
    Of course, you can configure each tasktracker independently by
    editing its hadoop-site.xml.

    Arun
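
    A minimal sketch of the suggested fix (the directory /app/hadoop/tmp is only an example, not mandated by the thread; any absolute path that exists and is writable by the Hadoop user on every node will do):

    ```xml
    <!-- hadoop.tmp.dir changed from the relative "tmp_storage" to an
         absolute path; /app/hadoop/tmp is illustrative only -->
    <property>
      <name>hadoop.tmp.dir</name>
      <value>/app/hadoop/tmp</value>
    </property>
    ```

    Create the directory first (e.g. mkdir -p /app/hadoop/tmp) and make sure the user running the Hadoop daemons owns it before restarting. With a relative value such as tmp_storage, each daemon resolves the path against its own working directory, so the tasktracker serving map output may look in a directory the map task never wrote to — which matches the "Could not find ... in any of the configured local directories" error above.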

Discussion Overview
group: common-user @ hadoop
posted: May 13, '08 at 4:41p
active: May 13, '08 at 4:46p
posts: 2
users: 2
website: hadoop.apache.org...
irc: #hadoop

2 users in discussion: Shimon Golan (1 post), Arun C Murthy (1 post)
