FAQ
I have installed HBase using CDH Manager. Our application started pushing
data into HBase, but I want to have a look at the inserted data using the
hbase shell. However, I get the following error. Please let me know if I am
missing any step.

hadoop_user@MACHNAME ~$ hbase shell
Exception in thread "main" java.lang.NoClassDefFoundError: HotSpot(TM)
Caused by: java.lang.ClassNotFoundException: HotSpot(TM)
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: HotSpot(TM). Program will exit.


  • Harsh J at Oct 4, 2012 at 8:54 am
    Hi Vikram,

    What version of a JVM are you using, and what is the OS here?

    Can you type "java -version" and send us the output please?


    --
    Harsh J
  • Vikram patil at Oct 4, 2012 at 8:55 am
    Hi Harsh,

    java version "1.6.0_30"
    Java(TM) SE Runtime Environment (build 1.6.0_30-b12)
    Java HotSpot(TM) 64-Bit Server VM (build 20.5-b03, mixed mode)
    I might be missing basic steps which enable usage of hbase shell with
    cloudera installation


    Regards
    Vikram
  • Harsh J at Oct 4, 2012 at 11:35 am
    Vikram,

    The error you are receiving is pretty odd.

    Can you send us a few more outputs please?

    - Does invoking "/usr/lib/hbase/bin/hbase shell" work?
    -- If it does, please send output of "bash -x hbase shell"
    -- If it does not, please send output of "bash -x
    /usr/lib/hbase/bin/hbase shell"

    - Can you send us /etc/hbase/conf/hbase-env.sh contents?
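The "bash -x" suggestion works because bash's xtrace mode echoes every command, with its variables already expanded, prefixed by '+', to stderr; that trace is what exposes the value each assignment in a launcher script actually receives. A minimal illustration (with a hypothetical script name, not the actual hbase script):

```shell
# Write a tiny launcher-style script, then run it under bash -x.
cat > /tmp/trace_demo.sh <<'EOF'
GREETING="hello"
echo "$GREETING world"
EOF

# The xtrace output goes to stderr; each executed command appears with
# variables expanded and a leading '+'.
bash -x /tmp/trace_demo.sh 2> /tmp/trace_demo.err
cat /tmp/trace_demo.err
```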


    --
    Harsh J
  • Vikram patil at Oct 4, 2012 at 11:54 am
    Harsh,

    The first suggestion didn't work, so I tried the second. It ended in an
    error but gave the following output (this is just partial output). I have
    colored what I think might be causing the issue:



    CLASS='org.jruby.Main -X+O /usr/lib/hbase/bin/../bin/hirb.rb'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs -Dhbase.log.file=hbase.log'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/..'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str='
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str=
    -Dhbase.root.logger=INFO,console'
    + '[' 'xJava HotSpot(TM) 64-Bit Server VM warning: MaxNewSize (262144k) is
    equal to or greater than the entire heap (262144k). A new generation size
    of 262080k will be used.
    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64'
    '!=' x ']'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str=
    -Dhbase.root.logger=INFO,console -Djava.library.path=Java HotSpot(TM)
    64-Bit Server VM warning: MaxNewSize (262144k) is equal to or greater than
    the entire heap (262144k). A new generation size of 262080k will be used.
    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64'
    + '[' '' '!=' '' ']'
    + exec /usr/local/jdk/bin/java '-XX:OnOutOfMemoryError=kill -9 %p'
    -Xmx1000m -XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str=
    -Dhbase.root.logger=INFO,console -Djava.library.path=Java 'HotSpot(TM)'
    64-Bit Server VM warning: MaxNewSize '(262144k)' is equal to or greater
    than the entire heap '(262144k).' A new generation size of 262080k will be
    used.
    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64
    -classpath
    '/etc/hbase/conf.cloudera.hbase1:/usr/local/jdk/lib/tools.jar:/usr/lib/hbase/bin/..:/usr/lib/hbase/bin/../hbase-0.92.1-cdh4.0.1-security.jar:/usr/lib/hbase/bin/../hbase-0.92.1-cdh4.0.1-security-tests.jar:/usr/lib/hbase/bin/../hbase.jar:/usr/lib/hbase/bin/../lib/activation-1.1.jar:/usr/lib/hbase/bin/../lib/aopalliance-1.0.jar:/usr/lib/hbase/bin/../lib/asm-3.2.jar:/usr/lib/hbase/bin/../lib/aspectjrt-1.6.5.jar:/usr/lib/hbase/bin/../lib/avro-1.5.4.jar:/usr/lib/hbase/bin/../lib/avro-ipc-1.5.4.jar:/usr/lib/hbase/bin/../lib/commons-beanutils-1.7.0.jar:/usr/lib/hbase/bin/../lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hbase/bin/../lib/commons-cli-1.2.jar:/usr/lib/hbase/bin/../lib/commons-codec-1.4.jar:/usr/lib/hbase/bin/../lib/commons-collections-3.2.1.jar:/usr/lib/hbase/bin/../lib/commons-configuration-1.6.jar:/usr/lib/hbase/bin/../lib/commons-daemon-1.0.3.jar:/usr/lib/hbase/bin/../lib/commons-digester-1.8.jar:/usr/lib/hbase/bin/../lib/commons-el-1.0.jar:/usr/lib/hbase/bin/../lib/commons-httpclient-3.1.jar:/usr/lib/hbase/bin/../lib/commons-io-2.1.jar:/usr/lib/hbase/bin/../lib/commons-lang-2.5.jar:/usr/lib/hbase/bin/../lib/commons-logging-1.1.1.jar:/usr/lib/hbase/bin/../lib/commons-logging-api-1.1.jar:/usr/lib/hbase/bin/../lib/commons-net-3.1.jar:/usr/lib/hbase/bin/../lib/core-3.1.1.jar:/usr/lib/hbase/bin/../lib/gmbal-api-only-3.0.0-b023.jar:/usr/lib/hbase/bin/../lib/grizzly-framework-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-framework-2.1.1-tests.jar:/usr/lib/hbase/bin/../lib/grizzly-http-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-http-server-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-http-servlet-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-rcm-2.1.1.jar:/usr/lib/hbase/bin/../lib/guava-11.0.2.jar:/usr/lib/hbase/bin/../lib/guice-3.0.jar:/usr/lib/hbase/bin/../lib/guice-servlet-3.0.jar:/usr/lib/hbase/bin/../lib/high-scale-lib-1.1.1.jar:/usr/lib/hbase/bin/../lib/httpclient-4.0.1.jar:/usr/lib/hbase/bin/../lib/httpcore-4.0.1.jar:/usr/lib/hbase/bin/../lib/jackson-core-as
l-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-mapper-asl-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/bin/../lib/jamon-runtime-2.3.1.jar:/usr/lib/hbase/bin/../lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/bin/../lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/bin/../lib/javax.inject-1.jar:/usr/lib/hbase/bin/../lib/javax.servlet-3.0.jar:/usr/lib/hbase/bin/../lib/jaxb-api-2.1.jar:/usr/lib/hbase/bin/../lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/bin/../lib/jdiff-1.0.9.jar:/usr/lib/hbase/bin/../lib/jersey-client-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-core-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-grizzly2-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-guice-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-json-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-server-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-test-framework-core-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-test-framework-grizzly2-1.8.jar:/usr/lib/hbase/bin/../lib/jets3t-0.6.1.jar:/usr/lib/hbase/bin/../lib/jettison-1.1.jar:/usr/lib/hbase/bin/../lib/jetty-6.1.26.jar:/usr/lib/hbase/bin/../lib/jetty-util-6.1.26.jar:/usr/lib/hbase/bin/../lib/jruby-complete-1.6.5.jar:/usr/lib/hbase/bin/../lib/jsch-0.1.42.jar:/usr/lib/hbase/bin/../lib/json-simple-1.1.jar:/usr/lib/hbase/bin/../lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1.jar:/usr/lib/hbase/bin/../lib/jsr305-1.3.9.jar:/usr/lib/hbase/bin/../lib/kfs-0.3.jar:/usr/lib/hbase/bin/../lib/libthrift-0.7.0.jar:/usr/lib/hbase/bin/../lib/log4j-1.2.16.jar:/usr/lib/hbase/bin/../lib/management-api-3.0.0-b012.jar:/usr/lib/hbase/bin/../lib/netty-3.2.4.Final.jar:/usr/lib/hbase/bin/../lib/oro-2.0.8.jar:/usr/lib/hbase/bin/../lib/protobuf-java-2.4.0a.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5.jar:/usr/lib/hbase/bin/../lib/slf4j-api-1.6.1.jar:/usr/lib/hbase/bin/../lib/snappy-java-1.0.3.2.jar:/usr/lib/hbase/bin/../lib/stax-ap
i-1.0.1.jar:/usr/lib/hbase/bin/../lib/velocity-1.7.jar:/usr/lib/hbase/bin/../lib/xmlenc-0.52.jar:/usr/lib/hbase/bin/../lib/zookeeper.jar:/etc/hadoop/conf:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/.//*:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/.//*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/./:/usr/lib/hadoop-0.20-mapreduce/lib/*:/usr/lib/hadoop-0.20-mapreduce/.//*'
    org.jruby.Main -X+O /usr/lib/hbase/bin/../bin/hirb.rb
    Exception in thread "main" java.lang.NoClassDefFoundError: HotSpot(TM)
    Caused by: java.lang.ClassNotFoundException: HotSpot(TM)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: HotSpot(TM). Program will exit.
  • Harsh J at Oct 4, 2012 at 12:06 pm
    Vikram,

    Many thanks for sending these in.

    The issue is the occurrence of this message in the midst of generating
    a library path.
    + '[' 'xJava HotSpot(TM) 64-Bit Server VM warning: MaxNewSize (262144k) is equal to or greater than the entire heap (262144k). A new generation size of 262080k will be used.
    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64'
    '!=' x ']'

    It looks to me like you have some bad arguments added to either
    /etc/hadoop/conf/hadoop-env.sh or to /etc/hbase/conf/hbase-env.sh (One
    of the -X JVM properties). Can you share the contents of these two
    files?
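The mechanism Harsh describes can be sketched in a few lines (with a hypothetical stand-in function; the real computation happens inside /usr/lib/hbase/bin/hbase):

```shell
# Sketch of the failure mode: a JVM warning emitted during a helper
# invocation is swallowed into the value captured by command substitution.
get_native_path() {
  # Stands in for the java invocation that computes the native lib path.
  # A misconfigured -XX flag makes the JVM print a warning first:
  echo "Java HotSpot(TM) 64-Bit Server VM warning: MaxNewSize (262144k) is equal to or greater than the entire heap (262144k)."
  echo "/usr/lib/hadoop/lib/native"
}

JAVA_LIBRARY_PATH=$(get_native_path)   # the warning text is captured too
echo "$JAVA_LIBRARY_PATH"

# When this value is later spliced unquoted into the final command line as
# -Djava.library.path=$JAVA_LIBRARY_PATH, the warning's words become
# separate arguments, and java treats the first stray word, "HotSpot(TM)",
# as the main class name -- hence the NoClassDefFoundError above.
```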


    --
    Harsh J
  • Vikram patil at Oct 4, 2012 at 12:17 pm
    Harsh,

    But HBase and HDFS are both running fine without any issues; only the
    shell is not able to start up.

    I was even able to start the REST connector for HBase. I have pasted both
    env.sh files here.

    ==> /etc/hbase/conf/hbase-env.sh <==
    export HBASE_OPTS="-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M"
    export HBASE_CLASSPATH=`echo $HBASE_CLASSPATH | sed -e
    "s|$ZOOKEEPER_CONF:||"`


    tail -3000 /etc/hadoop/conf/hadoop-env.sh

    # Set Hadoop-specific environment variables here.

    # Disable IPv6.
    export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -XX:NewSize=64m
    -XX:MaxNewSize=256m $HADOOP_OPTS"

    # The only required environment variable is JAVA_HOME. All others are
    # optional. When running a distributed configuration it is best to
    # set JAVA_HOME in this file, so that it is correctly defined on
    # remote nodes.

    # The java implementation to use. Required.
    # export JAVA_HOME=/usr/lib/j2sdk1.5-sun

    # Extra Java CLASSPATH elements. Optional.
    # export HADOOP_CLASSPATH="<extra_entries>:$HADOOP_CLASSPATH"

    # The maximum amount of heap to use, in MB. Default is 1000.
    export HADOOP_HEAPSIZE=256

    # Extra Java runtime options. Empty by default.
    # export HADOOP_OPTS="-server -Xmx256M -Xms64M $HADOOP_OPTS"

    # Command specific options appended to HADOOP_OPTS when specified
    export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_NAMENODE_OPTS"
    export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_SECONDARYNAMENODE_OPTS"
    export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_DATANODE_OPTS"
    export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_BALANCER_OPTS"
    export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_JOBTRACKER_OPTS"
    # export HADOOP_TASKTRACKER_OPTS=
    # The following applies to multiple commands (fs, dfs, fsck, distcp etc)
    # export HADOOP_CLIENT_OPTS

    # Extra ssh options. Empty by default.
    # export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"

    # Where log files are stored. $HADOOP_HOME/logs by default.
    # export HADOOP_LOG_DIR=${HADOOP_HOME}/logs

    # File naming remote slave hosts. $HADOOP_HOME/conf/slaves by default.
    # export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves

    # host:path where hadoop code should be rsync'd from. Unset by default.
    # export HADOOP_MASTER=master:/home/$USER/src/hadoop

    # Seconds to sleep between slave commands. Unset by default. This
    # can be useful in large clusters, where, e.g., slave rsyncs can
    # otherwise arrive faster than the master can service them.
    # export HADOOP_SLAVE_SLEEP=0.1

    # The directory where pid files are stored. /tmp by default.
    # export HADOOP_PID_DIR=/var/hadoop/pids

    # A string representing this instance of hadoop. $USER by default.
    # export HADOOP_IDENT_STRING=$USER

    # The scheduling priority for daemon processes. See 'man nice'.
    # export HADOOP_NICENESS=10


    Regards,
    Vikram
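The thread breaks off before a resolution is posted, but given the diagnosis above, a plausible fix (an assumption, not a confirmed resolution) is to keep the generation-sizing flags out of the global HADOOP_OPTS: with HADOOP_HEAPSIZE=256, the -XX:MaxNewSize=256m equals the entire heap, which is exactly what triggers the HotSpot warning that pollutes the captured library path.

```shell
# Hypothetical edit to /etc/hadoop/conf/hadoop-env.sh (a sketch, not a
# fix confirmed by this thread): drop the NewSize flags from the global
# options so helper invocations emit no startup warning ...
export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true $HADOOP_OPTS"

# ... and, if the young-generation tuning is still wanted for daemons,
# apply it per-daemon instead of globally:
export HADOOP_NAMENODE_OPTS="-XX:NewSize=64m -XX:MaxNewSize=256m -Dcom.sun.management.jmxremote $HADOOP_NAMENODE_OPTS"
```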

l-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-mapper-asl-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/bin/../lib/jamon-runtime-2.3.1.jar:/usr/lib/hbase/bin/../lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/bin/../lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/bin/../lib/javax.inject-1.jar:/usr/lib/hbase/bin/../lib/javax.servlet-3.0.jar:/usr/lib/hbase/bin/../lib/jaxb-api-2.1.jar:/usr/lib/hbase/bin/../lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/bin/../lib/jdiff-1.0.9.jar:/usr/lib/hbase/bin/../lib/jersey-client-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-core-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-grizzly2-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-guice-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-json-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-server-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-test-framework-core-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-test-framework-grizzly2-1.8.jar:/usr/lib/hbase/bin/../lib/jets3t-0.6.1.jar:/usr/lib/hbase/bin/../lib/jettison-1.1.jar:/usr/lib/hbase/bin/../lib/jetty-6.1.26.jar:/usr/lib/hbase/bin/../lib/jetty-util-6.1.26.jar:/usr/lib/hbase/bin/../lib/jruby-complete-1.6.5.jar:/usr/lib/hbase/bin/../lib/jsch-0.1.42.jar:/usr/lib/hbase/bin/../lib/json-simple-1.1.jar:/usr/lib/hbase/bin/../lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1.jar:/usr/lib/hbase/bin/../lib/jsr305-1.3.9.jar:/usr/lib/hbase/bin/../lib/kfs-0.3.jar:/usr/lib/hbase/bin/../lib/libthrift-0.7.0.jar:/usr/lib/hbase/bin/../lib/log4j-1.2.16.jar:/usr/lib/hbase/bin/../lib/management-api-3.0.0-b012.jar:/usr/lib/hbase/bin/../lib/netty-3.2.4.Final.jar:/usr/lib/hbase/bin/../lib/oro-2.0.8.jar:/usr/lib/hbase/bin/../lib/protobuf-java-2.4.0a.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5.jar:/usr/lib/hbase/bin/../lib/slf4j-api-1.6.1.jar:/usr/lib/hbase/bin/../lib/snappy-java-1.0.3.2.jar:/usr/lib/hbase/bin/../lib/stax-ap
i-1.0.1.jar:/usr/lib/hbase/bin/../lib/velocity-1.7.jar:/usr/lib/hbase/bin/../lib/xmlenc-0.52.jar:/usr/lib/hbase/bin/../lib/zookeeper.jar:/etc/hadoop/conf:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/.//*:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/.//*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/./:/usr/lib/hadoop-0.20-mapreduce/lib/*:/usr/lib/hadoop-0.20-mapreduce/.//*'
    org.jruby.Main -X+O /usr/lib/hbase/bin/../bin/hirb.rb
    Exception in thread "main" java.lang.NoClassDefFoundError: HotSpot(TM)
    Caused by: java.lang.ClassNotFoundException: HotSpot(TM)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: HotSpot(TM). Program will exit.
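    The trace above shows the root cause: the JVM's MaxNewSize warning is
    emitted on stdout and gets captured into the -Djava.library.path value.
    A minimal sketch of that failure mode, using a hypothetical helper
    function rather than the actual hbase wrapper script:

    ```shell
    #!/bin/sh
    # Hypothetical stand-in for the step where the hbase wrapper asks a JVM
    # for java.library.path: the MaxNewSize warning and the real value both
    # arrive on stdout.
    get_library_path() {
      echo "Java HotSpot(TM) 64-Bit Server VM warning: MaxNewSize (262144k) is equal to or greater than the entire heap (262144k)."
      echo "/usr/lib/hadoop/lib/native"
    }

    # Command substitution captures BOTH lines, so the warning text is later
    # word-split into bogus java arguments such as "HotSpot(TM)", which the
    # JVM then tries to load as the main class.
    LIB_PATH=$(get_library_path)
    echo "captured: $LIB_PATH"

    # Once the warning is gone (MaxNewSize fixed), only the real path is
    # captured; taking the last line here approximates that.
    CLEAN=$(get_library_path | tail -n 1)
    echo "clean: $CLEAN"
    ```

    This is why an apparently unrelated hadoop-env.sh flag breaks only the
    shell: daemons don't build their library path via this substitution.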
    On Thursday, October 4, 2012 5:05:38 PM UTC+5:30, Harsh J wrote:

    Vikram,

    The error you are receiving is pretty odd.

    Can you send us a few more outputs please?

    - Does invoking "/usr/lib/hbase/bin/hbase shell" work?
    -- If it does, please send output of "bash -x hbase shell"
    -- If it does not, please send output of "bash -x
    /usr/lib/hbase/bin/hbase shell"

    - Can you send us /etc/hbase/conf/hbase-env.sh contents?
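    A quick sketch of what the -x trace buys you, using a throwaway script
    (/tmp/demo.sh, an assumption here) as a stand-in for the hbase wrapper:

    ```shell
    #!/bin/sh
    # 'bash -x' echoes every command after expansion, prefixed with '+',
    # which makes the final value of variables like HBASE_OPTS visible.
    cat > /tmp/demo.sh <<'EOF'
    OPTS="-Xmx256M -Xms64M"
    echo "launching with $OPTS"
    EOF

    # The trace goes to stderr; merge it into stdout to read one stream.
    bash -x /tmp/demo.sh 2>&1
    ```

    In the real case, the same trace exposes the polluted HBASE_OPTS value
    that a normal run never prints.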
    On Thu, Oct 4, 2012 at 2:25 PM, vikram patil wrote:
    Hi Harsh,

    java version "1.6.0_30"
    Java(TM) SE Runtime Environment (build 1.6.0_30-b12)
    Java HotSpot(TM) 64-Bit Server VM (build 20.5-b03, mixed mode)
    I might be missing some basic step that enables use of the HBase shell
    with a Cloudera installation.


    Regards
    Vikram
    On Thursday, October 4, 2012 2:19:42 PM UTC+5:30, Harsh J wrote:

    Hi Vikram,

    What version of a JVM are you using, and what is the OS here?

    Can you type "java -version" and send us the output please?

    On Thu, Oct 4, 2012 at 2:03 PM, vikram patil <patil...@gmail.com>
    wrote:
    I have installed HBase using CDH Manager. Our application started
    pushing data into HBase, but I want to have a look at the inserted
    data using the HBase shell. However, I get the following error.
    Please let me know if I am missing any step.

    hadoop_user@MACHNAME ~$ hbase shell
    Exception in thread "main" java.lang.NoClassDefFoundError:
    HotSpot(TM)
    Caused by: java.lang.ClassNotFoundException: HotSpot(TM)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: HotSpot(TM). Program will exit.


    --
    Harsh J


    --
    Harsh J


    --
    Harsh J
  • Harsh J at Oct 4, 2012 at 12:23 pm
    Yes, this problem may not affect service uptime, but you may
    currently be running without native codec support (possibly by
    accident).

    This is the erroneous line in hadoop-env.sh that sets an invalid
    MaxNewSize, leading to a JVM warning and a script failure that
    eventually boils down to the hbase shell invocation failure:

    export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -XX:NewSize=64m
    -XX:MaxNewSize=256m $HADOOP_OPTS"

    Please remove or comment out that line and retry. MaxNewSize should
    not be close to the heap size. Why did you add this line, by the way?
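    The sizing rule can be sanity-checked before adding such flags; a small
    sketch using the values from this thread (-Xmx256M against
    -XX:MaxNewSize=256m, with a 1/4 ratio as an illustrative target):

    ```shell
    #!/bin/sh
    # Values mirror the thread: -Xmx256M heap, -XX:MaxNewSize=256m.
    heap_kb=262144
    maxnew_kb=262144

    # HotSpot warns when MaxNewSize is equal to or greater than the entire
    # heap; keeping the young generation at a fraction of the heap avoids it.
    if [ "$maxnew_kb" -ge "$heap_kb" ]; then
      echo "WARN: MaxNewSize (${maxnew_kb}k) >= heap (${heap_kb}k)"
      echo "suggested MaxNewSize: $((heap_kb / 4))k"
    else
      echo "OK"
    fi
    ```

    With these inputs the check flags the configuration and suggests 65536k
    (64 MB), which matches the -XX:NewSize=64m already in the file.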
    On Thu, Oct 4, 2012 at 5:47 PM, vikram patil wrote:
    Harsh,

    But HBase and HDFS are both running fine without any issues; only the
    shell is not able to start up.

    I was even able to start the REST connector for HBase. I have pasted
    both env.sh files here.

    ==> /etc/hbase/conf/hbase-env.sh <==
    export HBASE_OPTS="-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M"
    export HBASE_CLASSPATH=`echo $HBASE_CLASSPATH | sed -e
    "s|$ZOOKEEPER_CONF:||"`


    tail -3000 /etc/hadoop/conf/hadoop-env.sh

    # Set Hadoop-specific environment variables here.

    # Disable IPv6.
    export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -XX:NewSize=64m
    -XX:MaxNewSize=256m $HADOOP_OPTS"

    # The only required environment variable is JAVA_HOME. All others are
    # optional. When running a distributed configuration it is best to
    # set JAVA_HOME in this file, so that it is correctly defined on
    # remote nodes.

    # The java implementation to use. Required.
    # export JAVA_HOME=/usr/lib/j2sdk1.5-sun

    # Extra Java CLASSPATH elements. Optional.
    # export HADOOP_CLASSPATH="<extra_entries>:$HADOOP_CLASSPATH"

    # The maximum amount of heap to use, in MB. Default is 1000.
    export HADOOP_HEAPSIZE=256

    # Extra Java runtime options. Empty by default.
    # export HADOOP_OPTS="-server -Xmx256M -Xms64M $HADOOP_OPTS"

    # Command specific options appended to HADOOP_OPTS when specified
    export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_NAMENODE_OPTS"
    export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_SECONDARYNAMENODE_OPTS"
    export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_DATANODE_OPTS"
    export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_BALANCER_OPTS"
    export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_JOBTRACKER_OPTS"
    # export HADOOP_TASKTRACKER_OPTS=
    # The following applies to multiple commands (fs, dfs, fsck, distcp etc)
    # export HADOOP_CLIENT_OPTS

    # Extra ssh options. Empty by default.
    # export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"

    # Where log files are stored. $HADOOP_HOME/logs by default.
    # export HADOOP_LOG_DIR=${HADOOP_HOME}/logs

    # File naming remote slave hosts. $HADOOP_HOME/conf/slaves by default.
    # export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves

    # host:path where hadoop code should be rsync'd from. Unset by default.
    # export HADOOP_MASTER=master:/home/$USER/src/hadoop

    # Seconds to sleep between slave commands. Unset by default. This
    # can be useful in large clusters, where, e.g., slave rsyncs can
    # otherwise arrive faster than the master can service them.
    # export HADOOP_SLAVE_SLEEP=0.1

    # The directory where pid files are stored. /tmp by default.
    # export HADOOP_PID_DIR=/var/hadoop/pids

    # A string representing this instance of hadoop. $USER by default.
    # export HADOOP_IDENT_STRING=$USER

    # The scheduling priority for daemon processes. See 'man nice'.
    # export HADOOP_NICENESS=10


    Regards,
    Vikram

    On Thu, Oct 4, 2012 at 5:35 PM, Harsh J wrote:

    Vikram,

    Many thanks for sending these in.

    The issue is the occurrence of this message in the midst of generating
    a library path.
    + '[' 'xJava HotSpot(TM) 64-Bit Server VM warning: MaxNewSize (262144k)
    is equal to or greater than the entire heap (262144k). A new generation
    size of 262080k will be used.
    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64'
    '!=' x ']'

    It looks to me like you have some bad arguments added to either
    /etc/hadoop/conf/hadoop-env.sh or to /etc/hbase/conf/hbase-env.sh (One
    of the -X JVM properties). Can you share the contents of these two
    files?

    --
    Harsh J
  • Vikram patil at Oct 4, 2012 at 12:31 pm
    Yup, it worked.

    Actually, we wanted to run the Hadoop cluster with a cap on the maximum
    memory usage of any Hadoop-related service. Currently we have many
    applications, including the DataNode, HBase Master, and HBase
    RegionServer, on a machine with only 4 GB of physical memory, so I was
    experimenting with that parameter.

    Regards,
    Vikram
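    For capping memory per service without a global young-generation
    override, the per-daemon options in hadoop-env.sh are the usual lever.
    A hedged sketch (the specific values are illustrative assumptions, not
    a recommendation for this 4 GB node):

    ```shell
    # Illustrative hadoop-env.sh fragment (values are assumptions, tune per
    # node): cap each daemon's heap individually instead of constraining
    # the young generation of every JVM launched via HADOOP_OPTS.
    export HADOOP_HEAPSIZE=256   # default max heap in MB for hadoop commands
    export HADOOP_NAMENODE_OPTS="-Xmx512m $HADOOP_NAMENODE_OPTS"
    export HADOOP_DATANODE_OPTS="-Xmx256m $HADOOP_DATANODE_OPTS"
    ```

    The per-daemon *_OPTS variables are appended last on the daemon command
    line, so their -Xmx wins over the generic defaults.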
i-1.0.1.jar:/usr/lib/hbase/bin/../lib/velocity-1.7.jar:/usr/lib/hbase/bin/../lib/xmlenc-0.52.jar:/usr/lib/hbase/bin/../lib/zookeeper.jar:/etc/hadoop/conf:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/.//*:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/.//*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/./:/usr/lib/hadoop-0.20-mapreduce/lib/*:/usr/lib/hadoop-0.20-mapreduce/.//*'
    org.jruby.Main -X+O /usr/lib/hbase/bin/../bin/hirb.rb
    Exception in thread "main" java.lang.NoClassDefFoundError: HotSpot(TM)
    Caused by: java.lang.ClassNotFoundException: HotSpot(TM)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: HotSpot(TM). Program will exit.
    On Thursday, October 4, 2012 5:05:38 PM UTC+5:30, Harsh J wrote:

    Vikram,

    The error you are receiving is pretty odd.

    Can you send us a few more outputs please?

    - Does invoking "/usr/lib/hbase/bin/hbase shell" work?
    -- If it does, please send output of "bash -x hbase shell"
    -- If it does not, please send output of "bash -x
    /usr/lib/hbase/bin/hbase shell"

    - Can you send us /etc/hbase/conf/hbase-env.sh contents?

    On Thu, Oct 4, 2012 at 2:25 PM, vikram patil <patil...@gmail.com>
    wrote:
    Hi Harsh,

    java version "1.6.0_30"
    Java(TM) SE Runtime Environment (build 1.6.0_30-b12)
    Java HotSpot(TM) 64-Bit Server VM (build 20.5-b03, mixed mode)
    I might be missing some basic steps needed to use the hbase shell
    with a Cloudera installation.


    Regards
    Vikram
  • Vikram patil at Oct 4, 2012 at 12:32 pm
    Could you please suggest ways to restrict the memory usage of the
    Hadoop and HBase services?

    Regards,
    Vikram
    On Thu, Oct 4, 2012 at 6:01 PM, vikram patil wrote:

    Yup it worked.

    Actually, we wanted to run the Hadoop cluster with a cap on the maximum
    memory usage of any Hadoop-related service. Currently we have many
    applications, including the datanode, HBase master, and HBase region
    server, on a machine with only 4GB of physical memory, so I was
    experimenting with that parameter.

    Regards,
    Vikram

    On Thu, Oct 4, 2012 at 5:52 PM, Harsh J wrote:

    Yes, this problem may not affect service uptime, but you may
    currently (and accidentally) be running without any native codec
    support.

    This is the erroneous line in hadoop-env.sh that sets an invalid
    MaxNewSize, triggering a JVM warning and a script failure that
    eventually surfaces as the hbase shell invocation failure:

    export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -XX:NewSize=64m
    -XX:MaxNewSize=256m $HADOOP_OPTS"

    Please remove or comment it out and retry. MaxNewSize should not
    be close to the heap size. Why did you add this line, by the way?
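
A safer hadoop-env.sh fragment might look like the sketch below (the sizes are illustrative assumptions, not recommendations); the key point is to either drop the new-generation flags entirely or keep -XX:MaxNewSize well below the total heap:

```shell
# Sketch of a corrected hadoop-env.sh fragment (illustrative values).
# Either omit the new-generation flags and let the JVM choose defaults,
# or keep MaxNewSize a small fraction of the heap (here: 64m of 256m).
export HADOOP_HEAPSIZE=256   # total daemon heap, in MB
export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -XX:MaxNewSize=64m $HADOOP_OPTS"
echo "$HADOOP_OPTS"
```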

    On Thu, Oct 4, 2012 at 5:47 PM, vikram patil <patilvikram@gmail.com>
    wrote:
    Harsh,

    But HBase and HDFS are both running fine without any issues; only the
    shell fails to start.

    I was even able to start the REST connector for HBase. I have pasted
    both env.sh files here.

    ==> /etc/hbase/conf/hbase-env.sh <==
    export HBASE_OPTS="-XX:+HeapDumpOnOutOfMemoryError
    -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M"
    export HBASE_CLASSPATH=`echo $HBASE_CLASSPATH | sed -e
    "s|$ZOOKEEPER_CONF:||"`


    tail -3000 /etc/hadoop/conf/hadoop-env.sh

    # Set Hadoop-specific environment variables here.

    # Disable IPv6.
    export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -XX:NewSize=64m
    -XX:MaxNewSize=256m $HADOOP_OPTS"

    # The only required environment variable is JAVA_HOME. All others are
    # optional. When running a distributed configuration it is best to
    # set JAVA_HOME in this file, so that it is correctly defined on
    # remote nodes.

    # The java implementation to use. Required.
    # export JAVA_HOME=/usr/lib/j2sdk1.5-sun

    # Extra Java CLASSPATH elements. Optional.
    # export HADOOP_CLASSPATH="<extra_entries>:$HADOOP_CLASSPATH"

    # The maximum amount of heap to use, in MB. Default is 1000.
    export HADOOP_HEAPSIZE=256

    # Extra Java runtime options. Empty by default.
    # export HADOOP_OPTS="-server -Xmx256M -Xms64M $HADOOP_OPTS"

    # Command specific options appended to HADOOP_OPTS when specified
    export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_NAMENODE_OPTS"
    export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_SECONDARYNAMENODE_OPTS"
    export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_DATANODE_OPTS"
    export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_BALANCER_OPTS"
    export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_JOBTRACKER_OPTS"
    # export HADOOP_TASKTRACKER_OPTS=
    # The following applies to multiple commands (fs, dfs, fsck, distcp etc)
    # export HADOOP_CLIENT_OPTS

    # Extra ssh options. Empty by default.
    # export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o
    SendEnv=HADOOP_CONF_DIR"
    # Where log files are stored. $HADOOP_HOME/logs by default.
    # export HADOOP_LOG_DIR=${HADOOP_HOME}/logs

    # File naming remote slave hosts. $HADOOP_HOME/conf/slaves by default.
    # export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves

    # host:path where hadoop code should be rsync'd from. Unset by default.
    # export HADOOP_MASTER=master:/home/$USER/src/hadoop

    # Seconds to sleep between slave commands. Unset by default. This
    # can be useful in large clusters, where, e.g., slave rsyncs can
    # otherwise arrive faster than the master can service them.
    # export HADOOP_SLAVE_SLEEP=0.1

    # The directory where pid files are stored. /tmp by default.
    # export HADOOP_PID_DIR=/var/hadoop/pids

    # A string representing this instance of hadoop. $USER by default.
    # export HADOOP_IDENT_STRING=$USER

    # The scheduling priority for daemon processes. See 'man nice'.
    # export HADOOP_NICENESS=10


    Regards,
    Vikram

    On Thu, Oct 4, 2012 at 5:35 PM, Harsh J wrote:

    Vikram,

    Many thanks for sending these in.

    The issue is that this warning message appears in the middle of the
    generated library path:
    + '[' 'xJava HotSpot(TM) 64-Bit Server VM warning: MaxNewSize
    (262144k)
    is equal to or greater than the entire heap (262144k). A new
    generation
    size of 262080k will be used.
    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64'
    '!=' x ']'

    It looks to me like you have some bad arguments added to either
    /etc/hadoop/conf/hadoop-env.sh or to /etc/hbase/conf/hbase-env.sh (One
    of the -X JVM properties). Can you share the contents of these two
    files?
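
For context, the failure mechanism is the launcher capturing that JVM warning via command substitution while building java.library.path. A hypothetical minimal reproduction (fake_java merely stands in for the real java helper invocation inside the script; the names here are assumptions, not the actual hbase launcher code):

```shell
# Hypothetical reproduction of the failure: a launcher captures the
# output of a java helper invocation to build a library path. If the
# JVM emits a warning into that output, the warning is captured too.
fake_java() {
  # Stands in for the script's real "java <helper class>" call.
  echo 'Java HotSpot(TM) 64-Bit Server VM warning: MaxNewSize (262144k) is equal to or greater than the entire heap (262144k).'
  echo '/usr/lib/hadoop/lib/native'
}
JAVA_LIBRARY_PATH=$(fake_java)
# The variable now starts with the warning text. An unquoted later use,
# such as "exec java -Djava.library.path=$JAVA_LIBRARY_PATH Main",
# splits on whitespace, so the JVM sees 'HotSpot(TM)' as the main class.
echo "$JAVA_LIBRARY_PATH"
```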

    On Thu, Oct 4, 2012 at 5:15 PM, vikram patil <patilvikram@gmail.com>
    wrote:
    Harsh,

    The first suggestion didn't work, so I tried the second. It ended in
    an error but produced the following (partial) output. I have
    highlighted what I think might be causing the issue:



    CLASS='org.jruby.Main -X+O /usr/lib/hbase/bin/../bin/hirb.rb'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs
    -Dhbase.log.file=hbase.log'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/..'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str='
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str=
    -Dhbase.root.logger=INFO,console'
    + '[' 'xJava HotSpot(TM) 64-Bit Server VM warning: MaxNewSize
    (262144k)
    is
    equal to or greater than the entire heap (262144k). A new generation
    size
    of 262080k will be used.
    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64'
    '!=' x ']'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str=
    -Dhbase.root.logger=INFO,console -Djava.library.path=Java HotSpot(TM)
    64-Bit
    Server VM warning: MaxNewSize (262144k) is equal to or greater than
    the
    entire heap (262144k). A new generation size of 262080k will be
    used.
    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64'
    + '[' '' '!=' '' ']'
    + exec /usr/local/jdk/bin/java '-XX:OnOutOfMemoryError=kill -9 %p'
    -Xmx1000m
    -XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str=
    -Dhbase.root.logger=INFO,console -Djava.library.path=Java
    'HotSpot(TM)'
    64-Bit Server VM warning: MaxNewSize '(262144k)' is equal to or
    greater
    than
    the entire heap '(262144k).' A new generation size of 262080k will be
    used.
    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64
    -classpath
    '/etc/hbase/conf.cloudera.hbase1:/usr/local/jdk/lib/tools.jar:/usr/lib/hbase/bin/..:/usr/lib/hbase/bin/../hbase-0.92.1-cdh4.0.1-security.jar:/usr/lib/hbase/bin/../hbase-0.92.1-cdh4.0.1-security-tests.jar:/usr/lib/hbase/bin/../hbase.jar:/usr/lib/hbase/bin/../lib/activation-1.1.jar:/usr/lib/hbase/bin/../lib/aopalliance-1.0.jar:/usr/lib/hbase/bin/../lib/asm-3.2.jar:/usr/lib/hbase/bin/../lib/aspectjrt-1.6.5.jar:/usr/lib/hbase/bin/../lib/avro-1.5.4.jar:/usr/lib/hbase/bin/../lib/avro-ipc-1.5.4.jar:/usr/lib/hbase/bin/../lib/commons-beanutils-1.7.0.jar:/usr/lib/hbase/bin/../lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hbase/bin/../lib/commons-cli-1.2.jar:/usr/lib/hbase/bin/../lib/commons-codec-1.4.jar:/usr/lib/hbase/bin/../lib/commons-collections-3.2.1.jar:/usr/lib/hbase/bin/../lib/commons-configuration-1.6.jar:/usr/lib/hbase/bin/../lib/commons-daemon-1.0.3.jar:/usr/lib/hbase/bin/../lib/commons-digester-1.8.jar:/usr/lib/hbase/bin/../lib/commons-el-1.0.jar:/usr/lib/hbase/bin/../lib/commons-httpclient-3.1.jar:/usr/lib/hbase/bin/../lib/commons-io-2.1.jar:/usr/lib/hbase/bin/../lib/commons-lang-2.5.jar:/usr/lib/hbase/bin/../lib/commons-logging-1.1.1.jar:/usr/lib/hbase/bin/../lib/commons-logging-api-1.1.jar:/usr/lib/hbase/bin/../lib/commons-net-3.1.jar:/usr/lib/hbase/bin/../lib/core-3.1.1.jar:/usr/lib/hbase/bin/../lib/gmbal-api-only-3.0.0-b023.jar:/usr/lib/hbase/bin/../lib/grizzly-framework-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-framework-2.1.1-tests.jar:/usr/lib/hbase/bin/../lib/grizzly-http-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-http-server-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-http-servlet-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-rcm-2.1.1.jar:/usr/lib/hbase/bin/../lib/guava-11.0.2.jar:/usr/lib/hbase/bin/../lib/guice-3.0.jar:/usr/lib/hbase/bin/../lib/guice-servlet-3.0.jar:/usr/lib/hbase/bin/../lib/high-scale-lib-1.1.1.jar:/usr/lib/hbase/bin/../lib/httpclient-4.0.1.jar:/usr/lib/hbase/bin/../lib/httpcore-4.0.1.jar:/usr/lib/hbase/bin/../lib/jackson-core-as
l-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-mapper-asl-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/bin/../lib/jamon-runtime-2.3.1.jar:/usr/lib/hbase/bin/../lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/bin/../lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/bin/../lib/javax.inject-1.jar:/usr/lib/hbase/bin/../lib/javax.servlet-3.0.jar:/usr/lib/hbase/bin/../lib/jaxb-api-2.1.jar:/usr/lib/hbase/bin/../lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/bin/../lib/jdiff-1.0.9.jar:/usr/lib/hbase/bin/../lib/jersey-client-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-core-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-grizzly2-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-guice-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-json-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-server-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-test-framework-core-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-test-framework-grizzly2-1.8.jar:/usr/lib/hbase/bin/../lib/jets3t-0.6.1.jar:/usr/lib/hbase/bin/../lib/jettison-1.1.jar:/usr/lib/hbase/bin/../lib/jetty-6.1.26.jar:/usr/lib/hbase/bin/../lib/jetty-util-6.1.26.jar:/usr/lib/hbase/bin/../lib/jruby-complete-1.6.5.jar:/usr/lib/hbase/bin/../lib/jsch-0.1.42.jar:/usr/lib/hbase/bin/../lib/json-simple-1.1.jar:/usr/lib/hbase/bin/../lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1.jar:/usr/lib/hbase/bin/../lib/jsr305-1.3.9.jar:/usr/lib/hbase/bin/../lib/kfs-0.3.jar:/usr/lib/hbase/bin/../lib/libthrift-0.7.0.jar:/usr/lib/hbase/bin/../lib/log4j-1.2.16.jar:/usr/lib/hbase/bin/../lib/management-api-3.0.0-b012.jar:/usr/lib/hbase/bin/../lib/netty-3.2.4.Final.jar:/usr/lib/hbase/bin/../lib/oro-2.0.8.jar:/usr/lib/hbase/bin/../lib/protobuf-java-2.4.0a.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5.jar:/usr/lib/hbase/bin/../lib/slf4j-api-1.6.1.jar:/usr/lib/hbase/bin/../lib/snappy-java-1.0.3.2.jar:/usr/lib/hbase/bin/../lib/stax-ap
i-1.0.1.jar:/usr/lib/hbase/bin/../lib/velocity-1.7.jar:/usr/lib/hbase/bin/../lib/xmlenc-0.52.jar:/usr/lib/hbase/bin/../lib/zookeeper.jar:/etc/hadoop/conf:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/.//*:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/.//*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/./:/usr/lib/hadoop-0.20-mapreduce/lib/*:/usr/lib/hadoop-0.20-mapreduce/.//*'
    org.jruby.Main -X+O /usr/lib/hbase/bin/../bin/hirb.rb
    Exception in thread "main" java.lang.NoClassDefFoundError:
    HotSpot(TM)
    Caused by: java.lang.ClassNotFoundException: HotSpot(TM)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: HotSpot(TM). Program will exit.
  • Harsh J at Oct 4, 2012 at 12:38 pm
    Vikram,

    Glad to know it worked. To control maximum memory usage, you may only
    need to tweak the JVM maximum heap properties for each service.

    In Cloudera Manager, you can do this by editing the "Java Heap Size"
    style properties on the configuration page of any service. Set it to
    the number of bytes of memory you want the service to use and save;
    it will validate your configs automatically if the value is too low.
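
Outside Cloudera Manager, the same caps can be sketched directly in the env files. A minimal fragment, with illustrative sizes for a memory-constrained 4 GB node (the values are assumptions, not recommendations):

```shell
# Illustrative per-daemon heap caps for a memory-constrained node.
# HADOOP_HEAPSIZE / HBASE_HEAPSIZE set the default daemon heap in MB;
# per-daemon *_OPTS variables can override individual services.
export HADOOP_HEAPSIZE=256
export HBASE_HEAPSIZE=512
export HADOOP_DATANODE_OPTS="-Xmx256m $HADOOP_DATANODE_OPTS"
echo "datanode: $HADOOP_DATANODE_OPTS"
```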

    What version of a JVM are you using, and what is the OS here?

    Can you type "java -version" and send us the output please?

    On Thu, Oct 4, 2012 at 2:03 PM, vikram patil
    <patil...@gmail.com>
    wrote:
    I have installed hbase using CDH manager. Our application
    started
    pushing
    data into Hbase. But I want o have a look at inserted data
    using
    hbase
    shell. But I get following error. Please let me know if I am
    missing
    any
    step.

    hadoop_user@MACHNAME ~$ hbase shell
    Exception in thread "main" java.lang.NoClassDefFoundError:
    HotSpot(TM)
    Caused by: java.lang.ClassNotFoundException: HotSpot(TM)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at
    sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: HotSpot(TM). Program will
    exit.


    --
    Harsh J


    --
    Harsh J


    --
    Harsh J


    --
    Harsh J


    --
    Harsh J
  • Vikram patil at Oct 4, 2012 at 12:56 pm
I used similar settings and it worked after all. Thank you very much for your help with this issue.

    Regards,
    Vikram
    On Thu, Oct 4, 2012 at 6:08 PM, Harsh J wrote:

    Vikram,

    Glad to know it worked. For controlling max memory usage, you may only
    need to tweak the JVM maximum heap properties for each service.

In Cloudera Manager, this is doable by editing the "Java Heap Size"-style properties on the configuration page of any service. Set it to the number of bytes of memory you want the service to use and save; the configuration is validated automatically and flagged if it is too low.
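Outside Cloudera Manager, the same effect comes from the heap variables in the env files, which translate into a plain -Xmx flag on each daemon's JVM. A minimal sketch (the 256 MB values are illustrative, not recommendations):

```shell
# /etc/hadoop/conf/hadoop-env.sh -- maximum heap for Hadoop daemons, in MB
export HADOOP_HEAPSIZE=256

# /etc/hbase/conf/hbase-env.sh -- maximum heap for HBase daemons, in MB
export HBASE_HEAPSIZE=256
```

Bounding memory this way is safer than hand-written -XX generation sizing, since the JVM picks generation sizes consistent with the heap cap.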
    On Thu, Oct 4, 2012 at 6:02 PM, vikram patil wrote:
Could you please suggest ways to restrict the memory usage of the Hadoop and HBase services?

    Regards,
    Vikram

    On Thu, Oct 4, 2012 at 6:01 PM, vikram patil wrote:

    Yup it worked.

Actually, we wanted to run the Hadoop cluster with a cap on the maximum memory usage of any Hadoop-related service. Currently we have many applications, including the DataNode, HBase Master, and HBase RegionServer, on a machine with only 4 GB of physical memory, so I was experimenting with that parameter.

Regards,
    Vikram

    On Thu, Oct 4, 2012 at 5:52 PM, Harsh J wrote:

Yes, this problem may not affect service uptime, but you may currently be running without any native codec support by accident.

This is the erroneous line in hadoop-env.sh that's setting an invalid MaxNewSize, leading to a JVM warning and a script failure that eventually boils down to the hbase shell invocation failure:

export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -XX:NewSize=64m
-XX:MaxNewSize=256m $HADOOP_OPTS"

Please remove or comment it out and retry. The MaxNewSize should not be close to the heap size. Why have you added this line, by the way?
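A corrected form of that line might look as follows; this is a sketch, and the sizes are illustrative assumptions rather than recommendations:

```shell
# Option 1: drop the generation flags entirely and let the JVM size the
# new generation itself.
export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true $HADOOP_OPTS"

# Option 2: keep explicit sizing, but leave the new generation well below
# the total heap (e.g. 64 MB against a 256 MB heap).
# export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -XX:NewSize=32m -XX:MaxNewSize=64m $HADOOP_OPTS"
```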

    On Thu, Oct 4, 2012 at 5:47 PM, vikram patil <patilvikram@gmail.com>
    wrote:
    Harsh,

But HBase and HDFS are both running fine without any issues; only the shell is not able to start up.

I was even able to start the REST connector for HBase. I have pasted both env.sh files here.

    ==> /etc/hbase/conf/hbase-env.sh <==
    export HBASE_OPTS="-XX:+HeapDumpOnOutOfMemoryError
    -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M"
    export HBASE_CLASSPATH=`echo $HBASE_CLASSPATH | sed -e
    "s|$ZOOKEEPER_CONF:||"`


    tail -3000 /etc/hadoop/conf/hadoop-env.sh

    # Set Hadoop-specific environment variables here.

    # Disable IPv6.
    export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -XX:NewSize=64m
    -XX:MaxNewSize=256m $HADOOP_OPTS"

    # The only required environment variable is JAVA_HOME. All others
    are
    # optional. When running a distributed configuration it is best to
    # set JAVA_HOME in this file, so that it is correctly defined on
    # remote nodes.

    # The java implementation to use. Required.
    # export JAVA_HOME=/usr/lib/j2sdk1.5-sun

    # Extra Java CLASSPATH elements. Optional.
    # export HADOOP_CLASSPATH="<extra_entries>:$HADOOP_CLASSPATH"

    # The maximum amount of heap to use, in MB. Default is 1000.
    export HADOOP_HEAPSIZE=256

    # Extra Java runtime options. Empty by default.
    # export HADOOP_OPTS="-server -Xmx256M -Xms64M $HADOOP_OPTS"

    # Command specific options appended to HADOOP_OPTS when specified
    export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_NAMENODE_OPTS"
    export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_SECONDARYNAMENODE_OPTS"
    export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_DATANODE_OPTS"
    export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_BALANCER_OPTS"
    export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote
    $HADOOP_JOBTRACKER_OPTS"
    # export HADOOP_TASKTRACKER_OPTS=
    # The following applies to multiple commands (fs, dfs, fsck, distcp
    etc)
    # export HADOOP_CLIENT_OPTS

    # Extra ssh options. Empty by default.
    # export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o
    SendEnv=HADOOP_CONF_DIR"

    # Where log files are stored. $HADOOP_HOME/logs by default.
    # export HADOOP_LOG_DIR=${HADOOP_HOME}/logs

    # File naming remote slave hosts. $HADOOP_HOME/conf/slaves by
    default.
    # export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves

    # host:path where hadoop code should be rsync'd from. Unset by
    default.
    # export HADOOP_MASTER=master:/home/$USER/src/hadoop

    # Seconds to sleep between slave commands. Unset by default. This
    # can be useful in large clusters, where, e.g., slave rsyncs can
    # otherwise arrive faster than the master can service them.
    # export HADOOP_SLAVE_SLEEP=0.1

    # The directory where pid files are stored. /tmp by default.
    # export HADOOP_PID_DIR=/var/hadoop/pids

    # A string representing this instance of hadoop. $USER by default.
    # export HADOOP_IDENT_STRING=$USER

    # The scheduling priority for daemon processes. See 'man nice'.
    # export HADOOP_NICENESS=10


    Regards,
    Vikram

    On Thu, Oct 4, 2012 at 5:35 PM, Harsh J wrote:

    Vikram,

    Many thanks for sending these in.

    The issue is the occurrence of this message in the midst of
    generating
    a library path.
    + '[' 'xJava HotSpot(TM) 64-Bit Server VM warning: MaxNewSize
    (262144k)
    is equal to or greater than the entire heap (262144k). A new
    generation
    size of 262080k will be used.
    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64'
    '!=' x ']'

    It looks to me like you have some bad arguments added to either
    /etc/hadoop/conf/hadoop-env.sh or to /etc/hbase/conf/hbase-env.sh
    (One
    of the -X JVM properties). Can you share the contents of these two
    files?
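The failure mode visible above can be reproduced in miniature: the launcher captures a helper JVM's stdout with command substitution to build java.library.path, so any warning the JVM prints lands in the variable and is later word-split into bogus arguments. A hypothetical sketch (`get_prop` stands in for the real helper invocation; it mimics, rather than reproduces, the actual hbase script):

```shell
# Stand-in for the helper JVM call whose stdout the launcher captures; it
# emits a HotSpot-style warning before the real answer.
get_prop() {
  echo 'Java HotSpot(TM) 64-Bit Server VM warning: MaxNewSize (262144k) is equal to or greater than the entire heap (262144k).'
  echo '/usr/lib/hadoop/lib/native'
}

# The launcher captures stdout wholesale, warning text included.
JAVA_LIBRARY_PATH="$(get_prop)"

# Building the final command line with an unquoted expansion word-splits the
# warning into separate arguments (quoting the expansion would prevent this):
set -- java -Djava.library.path=$JAVA_LIBRARY_PATH -classpath hbase.jar

# The first stray token is "HotSpot(TM)"; the JVM treats the first
# non-option token as the main class name, hence the ClassNotFoundException.
echo "stray argument: $3"   # → stray argument: HotSpot(TM)
```

In the real script the fix is upstream: once the bad -XX flags are removed, the JVM prints no warning and the captured value is clean again.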

On Thu, Oct 4, 2012 at 5:15 PM, vikram patil <patilvikram@gmail.com> wrote:
    Harsh,

The first suggestion didn't work, so I tried the second. It resulted in an error but gave the following output (this is just partial output). I have highlighted what I think might be causing the issue:



    CLASS='org.jruby.Main -X+O /usr/lib/hbase/bin/../bin/hirb.rb'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError
    -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError
    -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs
    -Dhbase.log.file=hbase.log'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError
    -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs
    -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/..'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError
    -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs
    -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str='
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError
    -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs
    -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str=
    -Dhbase.root.logger=INFO,console'
    + '[' 'xJava HotSpot(TM) 64-Bit Server VM warning: MaxNewSize
    (262144k)
    is
    equal to or greater than the entire heap (262144k). A new
    generation
    size
    of 262080k will be used.

    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64'
    '!=' x ']'
    + HBASE_OPTS='-XX:+HeapDumpOnOutOfMemoryError
    -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs
    -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str=
    -Dhbase.root.logger=INFO,console -Djava.library.path=Java
    HotSpot(TM)
    64-Bit
    Server VM warning: MaxNewSize (262144k) is equal to or greater
    than
    the
    entire heap (262144k). A new generation size of 262080k will be
    used.

    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64'
    + '[' '' '!=' '' ']'
    + exec /usr/local/jdk/bin/java '-XX:OnOutOfMemoryError=kill -9 %p'
    -Xmx1000m
    -XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC
    -XX:+CMSIncrementalMode -Xmx256M -Xms64M
    -Dhbase.log.dir=/usr/lib/hbase/bin/../logs
    -Dhbase.log.file=hbase.log
    -Dhbase.home.dir=/usr/lib/hbase/bin/.. -Dhbase.id.str=
    -Dhbase.root.logger=INFO,console -Djava.library.path=Java
    'HotSpot(TM)'
    64-Bit Server VM warning: MaxNewSize '(262144k)' is equal to or
    greater
    than
    the entire heap '(262144k).' A new generation size of 262080k will
    be
    used.

    /usr/lib/hadoop/lib/native:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64
    -classpath

    '/etc/hbase/conf.cloudera.hbase1:/usr/local/jdk/lib/tools.jar:/usr/lib/hbase/bin/..:/usr/lib/hbase/bin/../hbase-0.92.1-cdh4.0.1-security.jar:/usr/lib/hbase/bin/../hbase-0.92.1-cdh4.0.1-security-tests.jar:/usr/lib/hbase/bin/../hbase.jar:/usr/lib/hbase/bin/../lib/activation-1.1.jar:/usr/lib/hbase/bin/../lib/aopalliance-1.0.jar:/usr/lib/hbase/bin/../lib/asm-3.2.jar:/usr/lib/hbase/bin/../lib/aspectjrt-1.6.5.jar:/usr/lib/hbase/bin/../lib/avro-1.5.4.jar:/usr/lib/hbase/bin/../lib/avro-ipc-1.5.4.jar:/usr/lib/hbase/bin/../lib/commons-beanutils-1.7.0.jar:/usr/lib/hbase/bin/../lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hbase/bin/../lib/commons-cli-1.2.jar:/usr/lib/hbase/bin/../lib/commons-codec-1.4.jar:/usr/lib/hbase/bin/../lib/commons-collections-3.2.1.jar:/usr/lib/hbase/bin/../lib/commons-configuration-1.6.jar:/usr/lib/hbase/bin/../lib/commons-daemon-1.0.3.jar:/usr/lib/hbase/bin/../lib/commons-digester-1.8.jar:/usr/lib/hbase/bin/../lib/commons-el-1.0.jar:/usr/lib/hbase/bin/../lib/commons-httpclient-3.1.jar:/usr/lib/hbase/bin/../lib/commons-io-2.1.jar:/usr/lib/hbase/bin/../lib/commons-lang-2.5.jar:/usr/lib/hbase/bin/../lib/commons-logging-1.1.1.jar:/usr/lib/hbase/bin/../lib/commons-logging-api-1.1.jar:/usr/lib/hbase/bin/../lib/commons-net-3.1.jar:/usr/lib/hbase/bin/../lib/core-3.1.1.jar:/usr/lib/hbase/bin/../lib/gmbal-api-only-3.0.0-b023.jar:/usr/lib/hbase/bin/../lib/grizzly-framework-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-framework-2.1.1-tests.jar:/usr/lib/hbase/bin/../lib/grizzly-http-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-http-server-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-http-servlet-2.1.1.jar:/usr/lib/hbase/bin/../lib/grizzly-rcm-2.1.1.jar:/usr/lib/hbase/bin/../lib/guava-11.0.2.jar:/usr/lib/hbase/bin/../lib/guice-3.0.jar:/usr/lib/hbase/bin/../lib/guice-servlet-3.0.jar:/usr/lib/hbase/bin/../lib/high-scale-lib-1.1.1.jar:/usr/lib/hbase/bin/../lib/httpclient-4.0.1.jar:/usr/lib/hbase/bin/../lib/httpcore-4.0.1.jar:/usr/lib/hbase/bin/../lib/jackson-core-as
l-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-mapper-asl-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/bin/../lib/jamon-runtime-2.3.1.jar:/usr/lib/hbase/bin/../lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/bin/../lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/bin/../lib/javax.inject-1.jar:/usr/lib/hbase/bin/../lib/javax.servlet-3.0.jar:/usr/lib/hbase/bin/../lib/jaxb-api-2.1.jar:/usr/lib/hbase/bin/../lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/bin/../lib/jdiff-1.0.9.jar:/usr/lib/hbase/bin/../lib/jersey-client-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-core-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-grizzly2-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-guice-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-json-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-server-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-test-framework-core-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-test-framework-grizzly2-1.8.jar:/usr/lib/hbase/bin/../lib/jets3t-0.6.1.jar:/usr/lib/hbase/bin/../lib/jettison-1.1.jar:/usr/lib/hbase/bin/../lib/jetty-6.1.26.jar:/usr/lib/hbase/bin/../lib/jetty-util-6.1.26.jar:/usr/lib/hbase/bin/../lib/jruby-complete-1.6.5.jar:/usr/lib/hbase/bin/../lib/jsch-0.1.42.jar:/usr/lib/hbase/bin/../lib/json-simple-1.1.jar:/usr/lib/hbase/bin/../lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1.jar:/usr/lib/hbase/bin/../lib/jsr305-1.3.9.jar:/usr/lib/hbase/bin/../lib/kfs-0.3.jar:/usr/lib/hbase/bin/../lib/libthrift-0.7.0.jar:/usr/lib/hbase/bin/../lib/log4j-1.2.16.jar:/usr/lib/hbase/bin/../lib/management-api-3.0.0-b012.jar:/usr/lib/hbase/bin/../lib/netty-3.2.4.Final.jar:/usr/lib/hbase/bin/../lib/oro-2.0.8.jar:/usr/lib/hbase/bin/../lib/protobuf-java-2.4.0a.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5.jar:/usr/lib/hbase/bin/../lib/slf4j-api-1.6.1.jar:/usr/lib/hbase/bin/../lib/snappy-java-1.0.3.2.jar:/usr/lib/hbase/bin/../lib/stax-ap
i-1.0.1.jar:/usr/lib/hbase/bin/../lib/velocity-1.7.jar:/usr/lib/hbase/bin/../lib/xmlenc-0.52.jar:/usr/lib/hbase/bin/../lib/zookeeper.jar:/etc/hadoop/conf:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/.//*:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/.//*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/./:/usr/lib/hadoop-0.20-mapreduce/lib/*:/usr/lib/hadoop-0.20-mapreduce/.//*'
    org.jruby.Main -X+O /usr/lib/hbase/bin/../bin/hirb.rb
    Exception in thread "main" java.lang.NoClassDefFoundError:
    HotSpot(TM)
    Caused by: java.lang.ClassNotFoundException: HotSpot(TM)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: HotSpot(TM). Program will exit.

Discussion Overview
group: scm-users
category: hadoop
posted: Oct 4, '12 at 8:33a
active: Oct 4, '12 at 12:56p
posts: 12
users: 2
website: cloudera.com
irc: #hadoop

2 users in discussion

Vikram patil: 7 posts Harsh J: 5 posts
