Allen Wittenauer
Oct 16, 2010 at 9:34 pm
On Oct 16, 2010, at 1:08 PM, Bruce Williams wrote:
I am doing a student Independent Study Project, and Harvey Mudd has given
me 13 Sun Netra X1s that I can use as a dedicated Hadoop cluster. Right now
they are without an OS.
If anyone with experience running Hadoop on Solaris could contact me off list,
even just to say they have done it and it works, it would be appreciated.
That's my cue! :)
We have a few grids that are running Solaris. It mostly works out of the box as long as you are aware of three things:
- There are some settings in hadoop-env.sh and in the path that need adjusting. Rather than re-quote them here, they were added to the Hadoop FAQ a week or two ago, so definitely take a look there.
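For flavor, the adjustments are along these lines (illustrative only; the specific paths and variables here are my assumptions from memory, so treat the FAQ as the canonical list):

```shell
# hadoop-env.sh additions for Solaris (illustrative sketch; check the Hadoop FAQ)
# The Hadoop scripts expect BSD-style tools like whoami on the PATH;
# on Solaris those live under /usr/ucb and /usr/xpg4/bin.
export PATH=/usr/ucb:/usr/xpg4/bin:$PATH

# Give the scripts an explicit identity string rather than relying on whoami.
export HADOOP_IDENT_STRING=`/usr/xpg4/bin/id -un`
```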
- The native compression libraries will need to be compiled. Depending upon what you are doing and how performant the machines are, this may or may not make a big difference. Compiling under Solaris with gcc will work fine (but it is gcc... ugh!). Only a few minor changes are required to compile with SUNWspro. [I have patches lying around here somewhere if anyone wants to play with them.]
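The gcc build is roughly the stock Ant native build (sketch below; the exact target and output layout are assumptions from the builds of that era, so verify against your tree):

```shell
# Illustrative: build Hadoop's native compression libraries with gcc on Solaris.
# Target and property names are from the stock Ant build; confirm in build.xml.
cd $HADOOP_HOME
ant -Dcompile.native=true compile-native

# The resulting .so files land under build/native/<platform>/lib; they need to
# end up under lib/native/<platform> where the daemons look for them.
```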
- The Solaris JRE is a mixed-mode implementation. So keep in mind that -d32 and -d64 have meaning and do work as advertised. You'll likely want to pick a bitsize and use that for all your Hadoop daemons and tasks, especially if you plan on using any JNI like the compression libraries.
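Concretely, pinning the bit size means passing -d32 or -d64 everywhere a JVM gets launched (a sketch; the property name for task JVMs is from the mapred configs of that era, so double-check your version):

```shell
# Illustrative: pin every Hadoop JVM to one bit size (64-bit here).
# A -d32 JNI library will not load into a -d64 JVM, or vice versa.

# Daemons pick this up from hadoop-env.sh:
export HADOOP_OPTS="-d64 $HADOOP_OPTS"

# Task JVMs are launched separately; set the same flag in mapred-site.xml,
# e.g. mapred.child.java.opts = "-d64 -Xmx512m"
```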