Hi

I am relatively new to Hadoop. After installing Hadoop on 3 machines, I
tried running the word count example on one of the machines, running as a
single node only. However, when I run the word count example with the
following command in the terminal:

hadoop@user5:~$ /home/hadoop/Desktop/hadoop/bin/hadoop jar
/home/hadoop/Desktop/hadoop/hadoop-0.19.1-examples.jar wordcount gutenberg
gut-out

where hadoop is my user account, gutenberg is the directory holding the
text files for the word count example, and gut-out is where the result is
to be stored,

it starts the map-reduce job, but reduce gets stuck at 0% even though map
reaches 100%. The console output is as follows. I need help; I have been
stuck on this problem for 3 days!

09/07/16 12:32:01 INFO mapred.FileInputFormat: Total input paths to process
: 3
09/07/16 12:32:44 INFO mapred.JobClient: Running job: job_200907161230_0001
09/07/16 12:32:45 INFO mapred.JobClient: map 0% reduce 0%
09/07/16 12:33:33 INFO mapred.JobClient: map 1% reduce 0%
09/07/16 12:33:37 INFO mapred.JobClient: map 3% reduce 0%
09/07/16 12:33:54 INFO mapred.JobClient: map 5% reduce 0%
09/07/16 12:33:57 INFO mapred.JobClient: map 7% reduce 0%
09/07/16 12:34:07 INFO mapred.JobClient: map 9% reduce 0%
09/07/16 12:34:14 INFO mapred.JobClient: map 11% reduce 0%
09/07/16 12:34:21 INFO mapred.JobClient: map 12% reduce 0%
09/07/16 12:34:29 INFO mapred.JobClient: map 14% reduce 0%
09/07/16 12:34:37 INFO mapred.JobClient: map 16% reduce 0%
09/07/16 12:34:44 INFO mapred.JobClient: map 18% reduce 0%
09/07/16 12:34:51 INFO mapred.JobClient: map 20% reduce 0%
09/07/16 12:34:58 INFO mapred.JobClient: map 22% reduce 0%
09/07/16 12:35:09 INFO mapred.JobClient: map 24% reduce 0%
09/07/16 12:35:41 INFO mapred.JobClient: map 25% reduce 0%
09/07/16 12:36:01 INFO mapred.JobClient: map 27% reduce 0%
09/07/16 12:36:10 INFO mapred.JobClient: map 29% reduce 0%
09/07/16 12:36:34 INFO mapred.JobClient: map 31% reduce 0%
09/07/16 12:36:58 INFO mapred.JobClient: map 33% reduce 0%
09/07/16 12:37:08 INFO mapred.JobClient: map 35% reduce 0%
09/07/16 12:37:15 INFO mapred.JobClient: map 37% reduce 0%
09/07/16 12:37:29 INFO mapred.JobClient: map 38% reduce 0%
09/07/16 12:37:31 INFO mapred.JobClient: map 40% reduce 0%
09/07/16 12:37:47 INFO mapred.JobClient: map 42% reduce 0%
09/07/16 12:37:48 INFO mapred.JobClient: map 44% reduce 0%
09/07/16 12:38:04 INFO mapred.JobClient: map 46% reduce 0%
09/07/16 12:38:06 INFO mapred.JobClient: map 48% reduce 0%
09/07/16 12:38:22 INFO mapred.JobClient: map 49% reduce 0%
09/07/16 12:38:23 INFO mapred.JobClient: map 51% reduce 0%
09/07/16 12:38:39 INFO mapred.JobClient: map 53% reduce 0%
09/07/16 12:38:40 INFO mapred.JobClient: map 55% reduce 0%
09/07/16 12:39:17 INFO mapred.JobClient: map 59% reduce 0%
09/07/16 12:39:37 INFO mapred.JobClient: Task Id :
attempt_200907161230_0001_m_000000_0, Status : FAILED
Too many fetch-failures
09/07/16 12:39:37 WARN mapred.JobClient: Error reading task outputConnection
refused
09/07/16 12:39:37 WARN mapred.JobClient: Error reading task outputConnection
refused
09/07/16 12:39:43 INFO mapred.JobClient: map 61% reduce 0%
09/07/16 12:40:06 INFO mapred.JobClient: map 64% reduce 0%
09/07/16 12:40:25 INFO mapred.JobClient: map 66% reduce 0%
09/07/16 12:40:27 INFO mapred.JobClient: map 68% reduce 0%
09/07/16 12:40:46 INFO mapred.JobClient: map 70% reduce 0%
09/07/16 12:40:48 INFO mapred.JobClient: map 72% reduce 0%
09/07/16 12:41:06 INFO mapred.JobClient: map 74% reduce 0%
09/07/16 12:41:07 INFO mapred.JobClient: map 75% reduce 0%
09/07/16 12:41:27 INFO mapred.JobClient: map 77% reduce 0%
09/07/16 12:41:28 INFO mapred.JobClient: map 79% reduce 0%
09/07/16 12:41:44 INFO mapred.JobClient: map 81% reduce 0%
09/07/16 12:41:47 INFO mapred.JobClient: map 83% reduce 0%
09/07/16 12:42:03 INFO mapred.JobClient: map 85% reduce 0%
09/07/16 12:42:06 INFO mapred.JobClient: map 87% reduce 0%
09/07/16 12:42:42 INFO mapred.JobClient: map 88% reduce 0%
09/07/16 12:42:45 INFO mapred.JobClient: map 90% reduce 0%
09/07/16 12:43:37 INFO mapred.JobClient: map 92% reduce 0%
09/07/16 12:43:40 INFO mapred.JobClient: map 94% reduce 0%
09/07/16 12:44:30 INFO mapred.JobClient: map 96% reduce 0%
09/07/16 12:44:34 INFO mapred.JobClient: map 98% reduce 0%
09/07/16 12:45:21 INFO mapred.JobClient: map 100% reduce 0%
09/07/16 12:46:27 INFO mapred.JobClient: Task Id :
attempt_200907161230_0001_m_000001_0, Status : FAILED
Too many fetch-failures
09/07/16 12:46:27 WARN mapred.JobClient: Error reading task outputConnection
refused
09/07/16 12:46:27 WARN mapred.JobClient: Error reading task outputConnection
refused
09/07/16 12:46:32 INFO mapred.JobClient: map 98% reduce 0%
09/07/16 12:46:34 INFO mapred.JobClient: map 100% reduce 0%
09/07/16 12:52:46 INFO mapred.JobClient: Task Id :
attempt_200907161230_0001_m_000002_0, Status : FAILED
Too many fetch-failures
09/07/16 12:52:46 WARN mapred.JobClient: Error reading task outputConnection
refused
09/07/16 12:52:46 WARN mapred.JobClient: Error reading task outputConnection
refused
09/07/16 12:52:51 INFO mapred.JobClient: map 98% reduce 0%
09/07/16 12:53:01 INFO mapred.JobClient: map 100% reduce 0%
09/07/16 12:59:02 INFO mapred.JobClient: Task Id :
attempt_200907161230_0001_m_000003_0, Status : FAILED
Too many fetch-failures
09/07/16 12:59:02 WARN mapred.JobClient: Error reading task outputConnection
refused
09/07/16 12:59:02 WARN mapred.JobClient: Error reading task outputConnection
refused
09/07/16 12:59:07 INFO mapred.JobClient: map 98% reduce 0%
09/07/16 12:59:15 INFO mapred.JobClient: map 100% reduce 0%
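A hedged sketch (not from the original post) of sanity checks on the job's input and output paths before re-running; the paths are taken from the command above, so adjust HADOOP_HOME for your install:

```shell
# Sanity checks before re-running the job. Paths assumed from the command
# in this post; adjust HADOOP_HOME for your install.
HADOOP_HOME=/home/hadoop/Desktop/hadoop
HADOOP="$HADOOP_HOME/bin/hadoop"

if [ -x "$HADOOP" ]; then
    # Confirm the input text files really are in HDFS.
    "$HADOOP" dfs -ls gutenberg
    # Hadoop 0.19 refuses to run if the output directory already exists,
    # so remove a stale gut-out left over from a failed attempt.
    "$HADOOP" dfs -rmr gut-out
fi
```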
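For the repeated failures in the log, a hedged diagnostic sketch, not a confirmed fix: "Too many fetch-failures" together with "Error reading task output ... Connection refused" usually means the reduce side cannot reach a TaskTracker's map-output HTTP server (port 50060 by default) at the hostname that tracker advertises. Typical checks, assuming a Linux host:

```shell
# The hostname the TaskTracker will advertise to reducers.
HOST=$(hostname)
echo "TaskTracker will advertise: $HOST"

# The hostname should resolve to an address other nodes can reach, not a
# loopback alias such as 127.0.1.1 (a common Ubuntu /etc/hosts default).
getent hosts "$HOST" || echo "WARNING: $HOST does not resolve"

# See whether anything is listening on the map-output port (50060 is the
# default for mapred.task.tracker.http.address).
if command -v netstat >/dev/null 2>&1; then
    netstat -tln | grep 50060 || echo "nothing listening on 50060"
fi
```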



--
View this message in context: http://www.nabble.com/Error-in-running-hadoop-examples-tp24520521p24520521.html
Sent from the Hadoop lucene-users mailing list archive at Nabble.com.

Discussion Overview
group: common-user @
categories: hadoop
posted: Jul 16, '09 at 5:03p
active: Jul 16, '09 at 5:08p
posts: 2
users: 2
website: hadoop.apache.org...
irc: #hadoop

2 users in discussion
Jean-Daniel Cryans: 1 post; Pooja Dave: 1 post
