FAQ
I need to read and check entries against a common file (say, a list of stop
words), so that each map function can check these stop words from the file.
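
For reference, here is a rough sketch of what I have in mind, assuming the
stop-word file is pushed to every task node with DistributedCache and loaded
into an in-memory HashSet once per mapper (class names and paths below are
placeholders, not working code I have yet):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashSet;
import java.util.Set;

import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class StopWordMapper extends Mapper<LongWritable, Text, Text, LongWritable> {

    private final Set<String> stopWords = new HashSet<String>();

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        // The driver registered the stop-word file with
        // DistributedCache.addCacheFile(...), so a local copy already
        // sits on this node; read it once per mapper, not once per record.
        Path[] cached = DistributedCache.getLocalCacheFiles(context.getConfiguration());
        if (cached != null && cached.length > 0) {
            BufferedReader reader = new BufferedReader(new FileReader(cached[0].toString()));
            try {
                String line;
                while ((line = reader.readLine()) != null) {
                    stopWords.add(line.trim().toLowerCase());
                }
            } finally {
                reader.close();
            }
        }
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        for (String word : value.toString().split("\\s+")) {
            // O(1) in-memory lookup instead of searching a file per word.
            if (!word.isEmpty() && !stopWords.contains(word.toLowerCase())) {
                context.write(new Text(word), new LongWritable(1));
            }
        }
    }
}

The driver would call DistributedCache.addCacheFile(new URI("/user/me/stopwords.txt"),
job.getConfiguration()) before submitting, with that path being wherever the
list lives in HDFS.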

Should I create a Hadoop MapFile in HDFS so that it can be searched faster?
Would that add memory overhead, since the index is loaded into memory?
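
For comparison, my understanding is that a lookup against such a MapFile
would go roughly like this (untested; the path and the key/value types are my
assumptions):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.io.MapFile;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;

public class MapFileLookup {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // "/user/me/stopwords.map" is a placeholder for a MapFile
        // directory (data + index) written earlier with MapFile.Writer.
        MapFile.Reader reader = new MapFile.Reader(fs, "/user/me/stopwords.map", conf);
        try {
            // Only the sparse key index is held in memory; get() then
            // seeks in the data file, so each lookup costs a disk seek.
            boolean isStopWord = reader.get(new Text("the"), NullWritable.get()) != null;
            System.out.println("stop word? " + isStopWord);
        } finally {
            reader.close();
        }
    }
}

That is where my memory worry comes from: the HashSet keeps every word in
RAM, while the MapFile keeps only the index in RAM but pays a seek per lookup.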

Another approach could be a grep inside the map/reduce job itself.

Any suggestions would be appreciated.

thanks

--
Nipen Mark
