FAQ
Dear all,

I was running some Java code when a Java heap size problem popped up.

While I was increasing the Java heap size in Cloudera Manager, the browser
hung. After that I saw three validation errors saying that the NameNode
directory, the DataNode directory, and the HDFS checkpoint directory need to
be set. I don't know what to do or how to find out what those directories
should be; they were generated automatically by CM.

So please help; you can't imagine the feeling of losing all your work all of
a sudden.


  • Dalia Hassan at Dec 17, 2012 at 7:34 pm
    The required directories are:
    SecondaryNameNode Settings: HDFS Checkpoint Directory
    fs.checkpoint.dir, dfs.namenode.checkpoint.dir

    DataNode Data Directory
    dfs.data.dir, dfs.datanode.data.dir

    NameNode Data Directories
    dfs.name.dir, dfs.namenode.name.dir
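
    If the values have been wiped from the CM interface, they can often be
    recovered from a node where the HDFS roles were recently running. Below is
    a minimal sketch; the paths under /var/run/cloudera-scm-agent and /dfs are
    assumptions about a typical CM-managed layout and may differ on your
    cluster.

        # Look for the directory properties in the per-role configs that CM
        # generates for running processes (path is an assumption):
        sudo sh -c 'grep -A1 -E "dfs.name.dir|dfs.data.dir|fs.checkpoint.dir" \
            /var/run/cloudera-scm-agent/process/*/hdfs-site.xml'

        # Failing that, check which data directories actually exist on disk;
        # CM defaults are often under /dfs (assumption):
        ls -ld /dfs/nn /dfs/dn /dfs/snn 2>/dev/null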
  • Adam Smieszny at Dec 17, 2012 at 11:13 pm
    Hi Dalia,

    I'm unclear on the problem - are all of those parameters currently EMPTY in
    your Cloudera Manager interface?
    Hadoop would not be able to function without any values for these params,
    so I doubt that is the case if you've previously been running successfully.

    Please let us know what caused the validation warning that popped up in the
    Cloudera Manager interface. Also, if you can share the number of nodes you
    have and the disks that are mounted to those nodes (mount points, mount
    options), that would be useful.

    Thanks,
    Adam

    --
    Adam Smieszny
    Cloudera | Systems Engineer | http://www.linkedin.com/in/adamsmieszny
    917.830.4156
  • Dalia Hassan at Dec 18, 2012 at 6:47 am
    Yes Adam,

    All of a sudden those three parameters were empty. That's why Hadoop is not
    functioning properly now. Is there any way to work out the values manually?
    I believe they are filled in automatically when nodes are added to the
    cluster, so could anyone send me those parameters from their Cloudera
    Manager?

    Required parameters: NameNode, Secondary NameNode, and DataNode directories.

    I don't know how this happened: my browser hung, and when I checked the
    interface again I found the fields empty. Maybe I chose something wrong
    while it was hung.

    "the disks that are mounted to those nodes (mount points, mount options)"
    --> how do I find those? (See the commands sketched below.)

    Thanks in advance,
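
    For the question about mount points and options, these are standard Linux
    commands you can run on each node (a quick sketch; output varies by
    system):

        # Mounted filesystems, their sizes, and mount points:
        df -h

        # Mount options for each mounted filesystem:
        mount
        # or:
        cat /proc/mounts

        # Block-device layout (disks and partitions):
        lsblk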
  • Dalia Hassan at Dec 18, 2012 at 12:16 pm
    Hi Adam,

    I have CM installed on a different cluster, so I took the directory names
    from that cluster and entered them into my cluster. Thank God, it worked
    out :D
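
    For anyone hitting the same problem without a second cluster to copy from:
    on a default CM installation the values are commonly along the lines below.
    These exact paths are assumptions; check which directories actually exist
    on your disks before saving anything.

        # NameNode Data Directories (dfs.name.dir / dfs.namenode.name.dir):
        /dfs/nn
        # DataNode Data Directory (dfs.data.dir / dfs.datanode.data.dir):
        /dfs/dn
        # HDFS Checkpoint Directory (fs.checkpoint.dir / dfs.namenode.checkpoint.dir):
        /dfs/snn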
