FAQ
hey all,

I want to run an experimental cluster,
but my machines have limited disk capacity.
I want each node in my cluster to have
around 50,000 blocks.

I don't want to make the block size smaller
(1K, 4K, etc.).

I saw SimulatedFSDataset in the HDFS code base.
Could anybody shed some light on how to use this
in a real cluster, i.e. a cluster where everything
is the same except that the blocks are simulated?

Any hint is appreciated.

thanks a lot.
Thanh


  • Thanh Do at May 16, 2011 at 4:24 pm
    got it!

    just set the option

    dfs.datanode.simulateddatastorage

    in your config files.
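    For example, a minimal hdfs-site.xml sketch (the property name is the one
    above; treating it as a boolean you set to true is my assumption):

        <!-- hdfs-site.xml on each datanode: enable simulated block storage.
             Assumed boolean flag; true should make the datanode use
             SimulatedFSDataset instead of real on-disk block files. -->
        <property>
          <name>dfs.datanode.simulateddatastorage</name>
          <value>true</value>
        </property>

    The datanode should then simulate its blocks in memory rather than
    writing real block data to disk, while the namenode and the rest of
    the cluster behave as usual.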

Discussion Overview

group: hdfs-user
categories: hadoop
posted: May 16, '11 at 2:42p
active: May 16, '11 at 4:24p
posts: 2
users: 1 (Thanh Do: 2 posts)
website: hadoop.apache.org...
irc: #hadoop
