New hadoop-0.20.2 user here. Following directions in the c++
dirs, I've successfully built and installed the C++ hadooppipes and
hadooputils libraries on Mac OS X 10.6 (via configure; make), and compiled
the various word-count example executables (also configure; make), and
have also set up my hadoop dfs and environment, and used -put to copy
some text files into the dfs's input directory.
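For anyone retracing these steps, here is roughly what I ran (paths are
from memory and may differ in your checkout; in 0.20.2 I believe the C++
sources live under src/c++/pipes and src/c++/utils, and the example
filenames below are mine):

```shell
# Build and install the pipes and utils libraries from the C++ dirs
cd src/c++/utils && ./configure && make install
cd ../pipes && ./configure && make install

# Build the example executables the same way (configure; make)
cd ../../examples/pipes && ./configure && make

# Copy some local text files into the dfs input directory
bin/hadoop fs -put local-texts/*.txt input
```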
When I run one of the examples (wordcount-simple), however, the output
dir contains 2 part-* files (2 tasks), but I see no file that contains
the cumulative results of the two. In other words, I think mapping is
happening, but reducing is not.
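(One thing I've since wondered: each reduce task writes its own part-*
file, so two part files may just mean two reducers ran, with the counts
split between them, rather than reduction failing. If so, forcing a
single reduce task, or merging the outputs, should give one file. A
sketch, assuming the standard 0.20.2 pipes invocation and that the
wordcount-simple binary was pushed to HDFS at bin/wordcount-simple:)

```shell
# Re-run with a single reduce task so all counts land in one part-00000
bin/hadoop pipes \
  -D hadoop.pipes.java.recordreader=true \
  -D hadoop.pipes.java.recordwriter=true \
  -D mapred.reduce.tasks=1 \
  -input input -output output-single \
  -program bin/wordcount-simple

# Or merge the existing per-reducer outputs into one local file
bin/hadoop fs -getmerge output merged-wordcount.txt
```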
I am using the C++ pipes examples as-is from hadoop-0.20.2. Is
there something wrong with what I'm seeing? What can I do to begin to
debug this? FWIW, I was able to run the Java example WordCount.jar, and
see a single output file with the cumulative results.
I also tried adding print statements (cout) to the wordcount
programs to see what's happening, but I don't see any of those
messages being logged anywhere. Is there a way to effectively do
logging from a pipes program?
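(For what it's worth, I gather that a pipes task's stdout/stderr never
reaches the console of the submitting process; Hadoop captures each task
attempt's stdout, stderr, and syslog into per-attempt directories under
the tasktracker's userlogs dir. So writing to std::cerr and reading
those files seems to be the way to debug. A sketch, assuming a default
single-node install where logs live under ${HADOOP_HOME}/logs; the
attempt directory name below is illustrative:)

```shell
# List the per-task-attempt log directories
ls ${HADOOP_HOME}/logs/userlogs

# Inspect the stderr of one map attempt (name is illustrative)
cat ${HADOOP_HOME}/logs/userlogs/attempt_201001010000_0001_m_000000_0/stderr
```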
Thanks for any help!