Mark N wrote:
I want to show the status of M/R jobs on a user interface. Should I read the
default Hadoop counters to display the status of map/reduce tasks?
I can read the status of map/reduce tasks using JobClient (the Hadoop
default counters). I could then have a Java web service exposing these
functions so that other modules (such as C#, VB.NET) can access them and
show the status on a UI.
Is this a correct approach?
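For reference, the JobClient route Mark describes might look roughly like the sketch below. This is untested and uses the old org.apache.hadoop.mapred API; the JobTracker address and job ID are placeholders, and it needs a live cluster to run against.

```java
// Sketch only: polling map/reduce progress and counters via the old
// org.apache.hadoop.mapred JobClient API. Host, port, and job ID are
// placeholders, not real values.
import org.apache.hadoop.mapred.Counters;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.JobID;
import org.apache.hadoop.mapred.RunningJob;

public class JobStatusReader {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf();
        conf.set("mapred.job.tracker", "jobtracker-host:8021"); // placeholder
        JobClient client = new JobClient(conf);

        // placeholder job ID
        RunningJob job = client.getJob(JobID.forName("job_200907150128_0001"));
        if (job != null) {
            System.out.printf("map %.0f%% reduce %.0f%%%n",
                    job.mapProgress() * 100, job.reduceProgress() * 100);
            Counters counters = job.getCounters();
            System.out.println(counters); // dump the default counters
        }
    }
}
```

A web service wrapping this would just call the same methods on each request.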
1. The HTML pages themselves are a user interface. They could be cleaned
up, made more secure, etc, but anything you do there would benefit
everyone. It would also be much easier to test than any rich client, as
we can use HtmlUnit to stress the site.
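A minimal HtmlUnit probe of the JobTracker pages, as mentioned above, could be sketched like this; the host is a placeholder (50030 was the default JobTracker HTTP port), and it assumes the HtmlUnit jar on the classpath.

```java
// Sketch: fetching the JobTracker web UI with HtmlUnit.
// "jobtracker-host" is a placeholder.
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class JobTrackerPageProbe {
    public static void main(String[] args) throws Exception {
        WebClient webClient = new WebClient();
        HtmlPage page = webClient.getPage(
                "http://jobtracker-host:50030/jobtracker.jsp");
        System.out.println(page.getTitleText()); // sanity check the page loaded
    }
}
```

From there a real test would assert on page content, or loop to stress the site.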
2. There's some JSP output of job status as XML forthcoming; grab the
trunk's code and take a look.
3. There are things like Hadoop Studio, which provides a GUI
(http://www.hadoopstudio.org/), and some Ant tasks in the Hadoop distro.
You are much better off using someone else's code than writing your own.
You aren't going to be able to talk from .NET to Hadoop right now. I've
discussed having a long-haul route to Hadoop,
"Mombasa" (http://www.slideshare.net/steve_l/long-haul-hadoop), but I
haven't implemented anything I'd recommend anyone other than myself and
my colleagues use, as it's pretty unstable.
I do think a good long-haul API for job submission would be nice, one
that also works with higher-level job queues, like Cascading, Pig, and
other Hadoop workflows. I also think we should steer clear of WS-*,
because it's wrong, although that will mean you won't be able to
generate VB.NET or C# stubs straight from WSDL.
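Avoiding WS-* doesn't mean .NET is locked out of reading status: anything that can do an HTTP GET and parse XML can consume a plain endpoint. A hypothetical servlet along these lines (class name, URL shape, and XML layout are all made up for illustration, and it needs a cluster plus the servlet API to run):

```java
// Sketch of a plain-HTTP XML status endpoint (no WS-*/WSDL).
// A C#/VB.NET client would just GET /status?id=job_... and parse the XML.
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.JobID;
import org.apache.hadoop.mapred.RunningJob;

public class JobStatusServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        JobClient client = new JobClient(new JobConf());
        RunningJob job = client.getJob(JobID.forName(req.getParameter("id")));
        if (job == null) {
            resp.sendError(HttpServletResponse.SC_NOT_FOUND, "no such job");
            return;
        }
        resp.setContentType("text/xml");
        resp.getWriter().printf(
                "<job id=\"%s\"><map>%f</map><reduce>%f</reduce></job>",
                job.getID(), job.mapProgress(), job.reduceProgress());
    }
}
```

The forthcoming JSP-as-XML output mentioned in point 2 would make even this unnecessary.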
Summary: use the web GUI and help improve it if you can, and try things
like Hadoop Studio if that is not enough. If you want to help build a
long-haul API, that would be good, but it's going to involve a lot of
effort and you won't see benefits for a while.