I have a Mapper class which needs access to several dependencies (such as a
db). It seems that because the framework is new'ing up instances of my
Mapper class, I have little control over its lifecycle. Up to now, I've been
setting them up as static fields in that class, which is not ideal. Is there
a better way to inject my dependencies?
Thanks,
Lowell

  • Fan wei fang at Sep 5, 2009 at 4:49 am
    Dear Lowell,
    I have heard about JVM reuse. Hope this can help.
    Fang.
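
    For context, "JVM reuse" presumably refers to Hadoop's task-JVM reuse,
    which lets static fields initialized by one task survive for later tasks
    of the same job on the same node. A minimal sketch of enabling it with
    the old mapred API follows; the setter and property name are as they
    existed around the 0.19/0.20 releases, so treat them as assumptions and
    check the documentation for your version.

    import org.apache.hadoop.mapred.JobConf;

    public class JvmReuseExample {
        public static void main(String[] args) {
            // Job configuration for the old (org.apache.hadoop.mapred) API.
            JobConf conf = new JobConf(JvmReuseExample.class);

            // -1 asks the framework to reuse one child JVM for an unlimited
            // number of tasks of this job on a node, so static fields set up
            // by the first task remain visible to later tasks.
            conf.setNumTasksToExecutePerJvm(-1);

            // Equivalent property form:
            // conf.setInt("mapred.job.reuse.jvm.num.tasks", -1);
        }
    }

    With reuse enabled, expensive static initialization pays off across
    tasks, but any static state must then be safe to share between tasks.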
  • Eric Sammer at Sep 8, 2009 at 4:06 pm

    Lowell:

    The way I've handled this in the past is to use the Mapper to
    instantiate a Spring ApplicationContext with all dependencies wired
    together. The advantage of this is that the guts of the application
    remain (mostly) unaware of the M/R infrastructure and can be tested
    without it. This also allows me to write the Mapper once and continue
    to iterate on the application logic by redeploying the Spring config.
    This also keeps the Mapper stateless between invocations of the M/R job
    which, in my opinion, is good.

    The downside is the startup overhead of Spring. In my case that's fine;
    even with it, the M/R setup / tear down overhead is still less than what
    the processing would cost without M/R on a single node (which simply
    isn't possible anyway). This also creates a dependency on Spring, which
    for me isn't a problem.

    You can replace Spring with your DI framework of choice, of course, but
    this pattern works well for me. Hope this helps!

    Best regards.
    --
    Eric Sammer
    eric@lifless.net
    http://esammer.blogspot.com
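
    A minimal sketch of the Spring-in-the-Mapper pattern Eric describes
    above, assuming the old mapred API and a Spring XML file named beans.xml
    on the classpath that defines a bean called recordService; the bean,
    interface, and file names here are illustrative, not from the original
    post.

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;
    import org.springframework.context.ApplicationContext;
    import org.springframework.context.support.ClassPathXmlApplicationContext;

    // Hypothetical application-side interface wired up in beans.xml.
    interface RecordService {
        String process(String rawRecord);
    }

    public class SpringBackedMapper extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, Text> {

        // Resolved from the Spring context; the Mapper only delegates to it.
        private RecordService recordService;

        @Override
        public void configure(JobConf job) {
            // Build the Spring context once per task (or hold it in a static
            // field if you also enable JVM reuse and want it once per JVM).
            ApplicationContext ctx =
                    new ClassPathXmlApplicationContext("beans.xml");
            recordService = (RecordService) ctx.getBean("recordService");
        }

        public void map(LongWritable key, Text value,
                        OutputCollector<Text, Text> output, Reporter reporter)
                throws IOException {
            // All real logic lives in the Spring-wired service, so it can be
            // unit-tested without any M/R infrastructure.
            output.collect(value, new Text(recordService.process(value.toString())));
        }
    }

    In a real job, the RecordService implementation and its database
    dependencies would be declared in beans.xml, so rewiring the application
    only means redeploying that file, which is the iteration property Eric
    highlights.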
  • Varene Olivier at Sep 15, 2009 at 3:39 pm
    Just my two cents, but if your resources can be instantiated at
    configuration time, you can use the configure() method from the
    MapReduceBase class to assign your local static objects the values
    taken from the job configuration object:


    public class Normalize_Reduce extends MapReduceBase implements
            Reducer<..., ..., ..., ...>
    {
        private ... ;

        // Method to get back some knowledge about your job configuration
        public void configure(JobConf job)
        {
        }

        public void reduce(... p_key, Iterator<...> p_values,
                OutputCollector<..., ...> output, Reporter reporter)
                throws IOException
        {
        }
    }

    I might not have understood your needs, but I hope it helps.
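
    To make the skeleton above concrete, here is a hedged sketch in which
    the reducer pulls a JDBC URL out of the JobConf in configure() and keeps
    it in an instance field; the property name my.app.jdbc.url and the
    default value are illustrative, not part of the original post.

    import java.io.IOException;
    import java.util.Iterator;

    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reducer;
    import org.apache.hadoop.mapred.Reporter;

    public class NormalizeReduce extends MapReduceBase
            implements Reducer<Text, Text, Text, Text> {

        // Filled in once per task from the job configuration.
        private String jdbcUrl;

        @Override
        public void configure(JobConf job) {
            // The driver sets this with job.set("my.app.jdbc.url", "...").
            jdbcUrl = job.get("my.app.jdbc.url", "jdbc:derby:memory:scratch");
        }

        public void reduce(Text key, Iterator<Text> values,
                           OutputCollector<Text, Text> output, Reporter reporter)
                throws IOException {
            // A real implementation would open the database connection here
            // (or lazily) using jdbcUrl; this sketch just echoes the value.
            while (values.hasNext()) {
                output.collect(key, new Text(values.next() + " via " + jdbcUrl));
            }
        }
    }

    Anything heavier than a string (a connection pool, a parsed config) can
    be built in configure() from such values, which is the approach
    described above.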




Discussion Overview
group: mapreduce-user
categories: hadoop
posted: Sep 4, '09 at 8:26p
active: Sep 15, '09 at 3:39p
posts: 4
users: 4
website: hadoop.apache.org...
irc: #hadoop
