Hi, I was still not able to start a MapReduce job across multiple datanodes
even after I started the HDFS and Task services on all nodes.

I can see there is 1 active node and 3 dead datanodes in my cluster. What
happened? Do I need to configure something in CM?

Thanks for your help :) I would greatly appreciate it.


Dead Datanodes: 3
  Node      Decommissioned
  :50010    false
  :50010    false
  :50010    false
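A quick way to confirm the NameNode's view of live vs. dead datanodes from the command line is `hdfs dfsadmin -report` (on older CDH releases, `hadoop dfsadmin -report`). A sketch of pulling the dead-node count out of that report; the sample text below stands in for real cluster output, whose exact wording varies by Hadoop version:

```shell
# Summary section as printed by `hdfs dfsadmin -report`
# (stand-in sample text; run the real command on the NameNode host)
report="Live datanodes (1):
Dead datanodes (3):"

# Extract the dead-node count from the summary line
dead=$(printf '%s\n' "$report" | sed -n 's/^Dead datanodes (\([0-9]*\)).*/\1/p')
echo "dead datanodes: $dead"
```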


  • Darren Lo at Jul 25, 2013 at 11:35 pm
    What do the stderr, stdout, and role logs say for the failing datanodes?

    If you click on the Hosts tab, is there any problem connecting to those
    hosts? When was the last heartbeat (should be within the last 15 seconds)?
    When you run the Host Inspector, does it flag any issues?
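    For context on the two heartbeats involved here: the 15-second interval
    above is the Cloudera Manager agent heartbeat to the CM server. HDFS itself
    uses a separate heartbeat, and with stock settings the NameNode only marks
    a DataNode dead after about 10.5 minutes of silence
    (2 × dfs.namenode.heartbeat.recheck-interval + 10 × dfs.heartbeat.interval).
    The defaults, shown as an hdfs-site.xml reference sketch:

    ```xml
    <!-- hdfs-site.xml defaults governing when a DataNode is marked dead -->
    <property>
      <name>dfs.heartbeat.interval</name>
      <value>3</value> <!-- seconds between DataNode heartbeats -->
    </property>
    <property>
      <name>dfs.namenode.heartbeat.recheck-interval</name>
      <value>300000</value> <!-- milliseconds; NameNode recheck window -->
    </property>
    <!-- dead after 2*300s + 10*3s = 630s, roughly 10.5 minutes of silence -->
    ```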

    On Thu, Jul 25, 2013 at 4:14 PM, GIS Song wrote:


    --
    Thanks,
    Darren

Discussion Overview
group: cm-users
categories: hadoop
posted: Jul 25, '13 at 11:14p
active: Jul 25, '13 at 11:35p
posts: 2
users: 2
website: cloudera.com
irc: #hadoop

2 users in discussion
GIS Song: 1 post
Darren Lo: 1 post
