FAQ
I frequently run small versions of my Hadoop jobs as local, single-machine
jobs to debug my logic.
When I do this, I always get only a single reduce task, even when I call
setNumReduceTasks with, say, 10.

I want to debug some issues with the number of reducers, and I wonder if
there is a way to force local Hadoop jobs to emulate multiple reduce tasks.
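[Editor's note: in 0.20-era Hadoop, the LocalJobRunner caps local jobs at a single reducer, which is why setNumReduceTasks appears to be ignored. One way to sanity-check reducer assignment without a cluster is to replicate the default HashPartitioner's formula, (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks, in plain Java. A minimal stand-alone sketch, with no Hadoop dependency; the sample keys are made up:]

```java
import java.util.Arrays;

public class PartitionCheck {
    // Mirrors Hadoop's default HashPartitioner:
    // partition = (hash & Integer.MAX_VALUE) % numReduceTasks
    static int partitionFor(String key, int numReduceTasks) {
        return (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks;
    }

    public static void main(String[] args) {
        int reducers = 10;
        int[] counts = new int[reducers];
        // Hypothetical keys; substitute a sample of your real map-output keys.
        for (String key : Arrays.asList("apple", "banana", "cherry", "date")) {
            int p = partitionFor(key, reducers);
            counts[p]++;
            System.out.println(key + " -> reducer " + p);
        }
        System.out.println("distribution: " + Arrays.toString(counts));
    }
}
```

If the distribution printed here is skewed, the reducer-count problem is in the keys or a custom Partitioner, not in the local runner.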

--
Steven M. Lewis PhD
4221 105th Ave Ne
Kirkland, WA 98033
206-384-1340 (cell)
Institute for Systems Biology
Seattle WA


  • Gangl, Michael E (388K) at Oct 29, 2010 at 6:17 pm
    I believe you can only do this if you run in pseudo-distributed mode:

    http://hadoop.apache.org/common/docs/r0.20.0/quickstart.html#PseudoDistributed

    -Mike
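
    [Editor's note: per the linked 0.20 quickstart, pseudo-distributed mode amounts to pointing MapReduce at a local JobTracker daemon instead of the LocalJobRunner. The fragment below uses the quickstart's stock value (localhost:9001); adjust to your setup:]

    ```xml
    <!-- conf/mapred-site.xml: run MapReduce against a local JobTracker -->
    <configuration>
      <property>
        <name>mapred.job.tracker</name>
        <value>localhost:9001</value>
      </property>
    </configuration>
    ```

    With this in place (plus the matching fs.default.name entry in core-site.xml from the same quickstart page), jobs honor setNumReduceTasks.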


    On 10/29/10 11:10 AM, "Steve Lewis" wrote:
    > I want to debug some issues with the number of reducers and wonder if
    > there is a way to force local Hadoop jobs to emulate multiple reduce tasks.

Discussion Overview
group: common-user @
category: hadoop
posted: Oct 29, '10 at 6:11p
active: Oct 29, '10 at 6:17p
posts: 2
users: 2
website: hadoop.apache.org...
irc: #hadoop
