FAQ
I can't seem to get HBase to run using the Hadoop I have connected to my S3
bucket.

Running:
HBase 0.19.2
Hadoop 0.19.2

hadoop-site.xml
<configuration>

<property>
<name>fs.default.name</name>
<value>s3://hbase</value>
</property>

<property>
<name>fs.s3.awsAccessKeyId</name>
<value>ID</value>
</property>

<property>
<name>fs.s3.awsSecretAccessKey</name>
<value>SECRET</value>
</property>
</configuration>

and Hadoop seems to start up with no problem.

my hbase-site.xml

<configuration>
<property>
<name>hbase.master</name>
<value>174.129.15.236:60000</value>
<description>The host and port that the HBase master runs at.
A value of 'local' runs the master and a regionserver in
a single process.
</description>
</property>

<property>
<name>hbase.rootdir</name>
<value>s3://hbase</value>
<description>The directory shared by region servers.
</description>
</property>

</configuration>


keeps giving me

2009-08-06 17:20:44,526 ERROR org.apache.hadoop.hbase.master.HMaster: Can not start master
java.lang.NoClassDefFoundError: org/jets3t/service/S3ServiceException
    at org.apache.hadoop.fs.s3.S3FileSystem.createDefaultStore(S3FileSystem.java:84)
    at org.apache.hadoop.fs.s3.S3FileSystem.initialize(S3FileSystem.java:74)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1367)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:56)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1379)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:215)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:120)
    at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:186)
    at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:156)
    at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:96)
    at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:78)
    at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1013)
    at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1057)
Caused by: java.lang.ClassNotFoundException: org.jets3t.service.S3ServiceException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
    at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)


What am I doing wrong here?

Ananth T Sarathy


  • Tim robertson at Aug 7, 2009 at 3:03 pm
    Do you need to add the Amazon S3 toolkit on the HBase classpath
    directly to use S3 as a store?
    http://developer.amazonwebservices.com/connect/entry.jspa?externalID=617&categoryID=47

    I'm guessing based on the "java.lang.NoClassDefFoundError:
    org/jets3t/service/S3ServiceException"

    Cheers

    Tim
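
    One way to put the jets3t jar on HBase's classpath, as a sketch using the
    0.19-era directory layout shown later in this thread (the jar name and
    paths below are my assumptions — Hadoop's own lib/ directory may already
    ship a jets3t jar you can reuse; adjust to whatever the toolkit actually
    provides):

    ```shell
    # conf/hbase-env.sh (fragment) -- assumed jar path, adjust as needed
    export HBASE_CLASSPATH=/usr/hadoop-0.19.2/lib/jets3t-0.6.1.jar

    # Or simply copy the jar into HBase's lib directory, then restart the
    # master so the new classpath takes effect:
    #   cp /usr/hadoop-0.19.2/lib/jets3t-0.6.1.jar /usr/hbase-0.19.2/lib/
    ```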


    On Fri, Aug 7, 2009 at 4:50 PM, Ananth T. Sarathy wrote:
  • Ananth T. Sarathy at Aug 7, 2009 at 4:52 pm
    Tim,
    that got me a little further! Thanks...

    But now I get a different error.

    hbase-site.xml

    <configuration>
    <property>
    <name>hbase.master</name>
    <value>174.129.15.236:60000</value>
    <description>The host and port that the HBase master runs at.
    A value of 'local' runs the master and a regionserver in
    a single process.
    </description>
    </property>

    <property>
    <name>hbase.rootdir</name>
    <value>s3://testbucket</value>
    <description>The directory shared by region servers.
    </description>
    </property>
    </configuration>

    I copied a hadoop-site.xml with my access and secret keys into conf/ in
    HBase. I also tried the s3://ID:SECRET@bucket form, and that didn't
    work.

    Fri Aug 7 12:47:45 EDT 2009 Starting master on ip-10-244-131-228
    ulimit -n 1024
    2009-08-07 12:47:45,850 INFO org.apache.hadoop.hbase.master.HMaster:
    vmName=Java HotSpot(TM) Client VM, vmVendor=Sun Microsystems Inc.,
    vmVersion=14.1-b02
    2009-08-07 12:47:45,850 INFO org.apache.hadoop.hbase.master.HMaster:
    vmInputArguments=[-Xmx1000m, -XX:+HeapDumpOnOutOfMemoryError,
    -Dhbase.log.dir=/usr/hbase-0.19.2/bin/../logs,
    -Dhbase.log.file=hbase-root-master-ip-10-244-131-228.log,
    -Dhbase.home.dir=/usr/hbase-0.19.2/bin/.., -Dhbase.id.str=root,
    -Dhbase.root.logger=INFO,DRFA,
    -Djava.library.path=/usr/hbase-0.19.2/bin/../lib/native/Linux-i386-32]
    2009-08-07 12:47:48,535 ERROR org.apache.hadoop.hbase.master.HMaster: Can not start master
    org.apache.hadoop.fs.s3.S3Exception: org.jets3t.service.S3ServiceException: S3 PUT failed for '/' XML Error Message: <?xml version="1.0" encoding="UTF-8"?><Error><Code>BucketAlreadyExists</Code><Message>The requested bucket name is not available. The bucket namespace is shared by all users of the system. Please select a different name and try again.</Message><BucketName>testbucket</BucketName><RequestId>C0C7F562713BDE97</RequestId><HostId>ifY4rPOqmasjPkH+EiTS3LsgRzuDcbUTHy+y8p4HMnJWN1kUXCUe+FvYSZhIlYHg</HostId></Error>
        at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.createBucket(Jets3tFileSystemStore.java:108)
        at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.initialize(Jets3tFileSystemStore.java:96)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy0.initialize(Unknown Source)
        at org.apache.hadoop.fs.s3.S3FileSystem.initialize(S3FileSystem.java:76)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1367)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:56)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1379)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:215)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:120)
        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:186)
        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:156)
        at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:96)
        at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:78)
        at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1013)
        at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1057)
    Caused by: org.jets3t.service.S3ServiceException: S3 PUT failed for '/' XML Error Message: <?xml version="1.0" encoding="UTF-8"?><Error><Code>BucketAlreadyExists</Code><Message>The requested bucket name is not available. The bucket namespace is shared by all users of the system. Please select a different name and try again.</Message><BucketName>testbucket</BucketName><RequestId>C0C7F562713BDE97</RequestId><HostId>ifY4rPOqmasjPkH+EiTS3LsgRzuDcbUTHy+y8p4HMnJWN1kUXCUe+FvYSZhIlYHg</HostId></Error>
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRequest(RestS3Service.java:416)
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRestPut(RestS3Service.java:800)
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.createObjectImpl(RestS3Service.java:1399)
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.createBucketImpl(RestS3Service.java:1270)
        at org.jets3t.service.S3Service.createBucket(S3Service.java:1558)
        at org.jets3t.service.S3Service.createBucket(S3Service.java:1257)
        at org.jets3t.service.S3Service.createBucket(S3Service.java:1284)
        at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.createBucket(Jets3tFileSystemStore.java:103)
        ... 20 more


    Ananth T Sarathy

    On Fri, Aug 7, 2009 at 11:02 AM, tim robertson wrote:

  • Tim robertson at Aug 7, 2009 at 5:30 pm
    Pointing out the obvious, but something somewhere is trying to create a
    bucket that already exists.

    Sorry, but I don't think I can help further - perhaps change
    s3://testbucket to s3://testbucket2 just to be sure it is not that you
    have created it in another process by accident?

    Cheers

    Tim
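
    If the collision is the problem, a hedged guess at a working
    hbase-site.xml fragment: since S3 bucket names are shared across all
    accounts, a generic name like "testbucket" is likely owned by someone
    else, so pick a globally unique name (the value below is a placeholder),
    and give hbase.rootdir a directory component rather than pointing it at
    the bare bucket:

    ```xml
    <property>
    <name>hbase.rootdir</name>
    <!-- placeholder: a bucket name nobody else owns, plus a path component -->
    <value>s3://some-unique-bucket-name/hbase</value>
    </property>
    ```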


    On Fri, Aug 7, 2009 at 6:51 PM, Ananth T. Sarathy wrote:
  • Ananth T. Sarathy at Aug 7, 2009 at 5:34 pm
    Thanks for whatever help you could give... when I try something else I
    get

    Fri Aug 7 13:31:34 EDT 2009 Starting master on ip-10-244-131-228
    ulimit -n 1024
    2009-08-07 13:31:34,829 INFO org.apache.hadoop.hbase.master.HMaster: vmName=Java HotSpot(TM) Client VM, vmVendor=Sun Microsystems Inc., vmVersion=14.1-b02
    2009-08-07 13:31:34,830 INFO org.apache.hadoop.hbase.master.HMaster: vmInputArguments=[-Xmx1000m, -XX:+HeapDumpOnOutOfMemoryError, -Dhbase.log.dir=/usr/hbase-0.19.2/bin/../logs, -Dhbase.log.file=hbase-root-master-ip-10-244-131-228.log, -Dhbase.home.dir=/usr/hbase-0.19.2/bin/.., -Dhbase.id.str=root, -Dhbase.root.logger=INFO,DRFA, -Djava.library.path=/usr/hbase-0.19.2/bin/../lib/native/Linux-i386-32]
    2009-08-07 13:31:37,247 ERROR org.apache.hadoop.hbase.master.HMaster: Can not start master
    java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: s3://testbucketananth-ROOT-
        at org.apache.hadoop.fs.Path.initialize(Path.java:140)
        at org.apache.hadoop.fs.Path.<init>(Path.java:50)
        at org.apache.hadoop.hbase.HTableDescriptor.getTableDir(HTableDescriptor.java:651)
        at org.apache.hadoop.hbase.regionserver.HRegion.getRegionDir(HRegion.java:2362)
        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:156)
        at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:78)
        at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1013)
        at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1057)
    Caused by: java.net.URISyntaxException: Relative path in absolute URI: s3://testbucketananth-ROOT-
        at java.net.URI.checkPath(URI.java:1787)
        at java.net.URI.<init>(Path.java:137)
        ... 10 more
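
    The URISyntaxException above can be reproduced outside Hadoop. This
    sketch (class and method names are mine, not Hadoop's) mimics how a
    rootdir URI with no path component breaks once the -ROOT- region
    directory name is appended to it, and how a rootdir with a directory
    component joins cleanly instead:

    ```java
    import java.net.URI;
    import java.net.URISyntaxException;

    public class RootDirUriDemo {
        // Builds a scheme-qualified URI the way Hadoop's Path effectively
        // does; returns the URISyntaxException message, or null on success.
        // java.net.URI rejects an absolute (scheme-qualified) URI whose path
        // does not begin with '/'.
        static String joinRootDir(String bucket, String child) {
            try {
                new URI("s3", bucket, child, null, null);
                return null;
            } catch (URISyntaxException e) {
                return e.getMessage();
            }
        }

        public static void main(String[] args) {
            // rootdir = s3://testbucketananth (no path), child = "-ROOT-"
            System.out.println(joinRootDir("testbucketananth", "-ROOT-"));
            // → Relative path in absolute URI: s3://testbucketananth-ROOT-

            // With a directory component the join is a valid absolute path.
            System.out.println(joinRootDir("testbucketananth", "/hbase/-ROOT-"));
            // → null
        }
    }
    ```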



    But the bigger issue is why it is trying to create a bucket at all, and
    how I tell it not to.
    Ananth T Sarathy

    On Fri, Aug 7, 2009 at 1:28 PM, tim robertson wrote:


Discussion Overview
group: common-user
categories: hadoop
posted: Aug 7, '09 at 2:51p
active: Aug 7, '09 at 5:34p
posts: 5
users: 2
website: hadoop.apache.org...
irc: #hadoop
