I verified the checksum and signature and ran some of the example
programs. I also ran Apache RAT to check that the sources are licensed correctly.
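For anyone else reviewing, the checksum step can be sketched as below. The file names are placeholders standing in for the actual release artifacts (the real .md5/.mds file is downloaded from the release candidate directory, not generated locally), and the signature check additionally needs the release manager's public key:

```shell
# Stand-in tarball and checksum file so the sketch is self-contained;
# with a real candidate you download both from the RC directory instead.
printf 'release bits\n' > hadoop-0.21.0.tar.gz
md5sum hadoop-0.21.0.tar.gz > hadoop-0.21.0.tar.gz.md5

# The actual verification step a reviewer performs:
md5sum -c hadoop-0.21.0.tar.gz.md5 && echo "checksum OK"

# Signature check (requires importing the signer's public key first):
# gpg --verify hadoop-0.21.0.tar.gz.asc hadoop-0.21.0.tar.gz
```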
I only examined the merged artifact. Do we need to release the others?
The out-of-box experience isn't great. There is no top-level
documentation explaining how to get started. In a subsequent release we
might add links to the various documents.
Also, 'bin/start-all.sh' warns that one should use start-dfs.sh and
start-mapred.sh instead, but those scripts, out-of-the-box, complain
that they can't find Common.
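A possible workaround sketch: the split scripts locate the Common installation through an environment variable. HADOOP_COMMON_HOME is my assumption here; check bin/hadoop-config.sh in the unpacked tree for the variable the scripts actually read, and the path below is hypothetical:

```shell
# Point the HDFS/MapReduce start scripts at the Common install tree.
# HADOOP_COMMON_HOME is assumed from hadoop-config.sh; verify locally.
export HADOOP_COMMON_HOME=/opt/hadoop-0.21.0   # hypothetical install path
echo "HADOOP_COMMON_HOME=$HADOOP_COMMON_HOME"
```

If the variable is set before invoking start-dfs.sh / start-mapred.sh, the "can't find Common" complaint should go away, assuming the scripts resolve Common this way.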
There are some empty Java source files in the release, namely:
These issues should not prevent this release, in my opinion.
On 08/16/2010 10:29 PM, Tom White wrote:
I have created a new candidate build for Hadoop 0.21.0. This fixes
MAPREDUCE-2012 and MAPREDUCE-2014 which were found for the previous
release candidate (1).
This release is being classified as a minor release, which means that
it is API compatible with 0.20.2.
*** This release candidate has not been tested extensively, so it
should not be considered stable.
*** Please download, test and vote before the vote closes on Thursday 19 August.

http://people.apache.org/~tomwhite/hadoop-0.21.0-candidate-2/
The hadoop-0.21.0.tar.gz file is an old-style combined release which
includes Common, HDFS, and MapReduce.