I've tried 'cat'. This is the error I get: 'cat: Source must be a file.'
This happens when I try to fetch all the part files from a directory as a
single .csv file.
Something like this:
hadoop dfs -cat hdfs://master:54310/user/hadoop-user/output/solr/
cat: Source must be a file.
This is what the directory looks like:
hadoop dfs -ls hdfs://master:54310/user/hadoop-user/output/solr/
Found 3 items
drwxr-xr-x - hadoop supergroup 0 2010-03-12 16:36 /user/hadoop-user/output/solr/_logs
-rw-r--r-- 2 hadoop supergroup 64882566 2010-03-12 16:36 /user/hadoop-user/output/solr/part-00000
-rw-r--r-- 2 hadoop supergroup 51388943 2010-03-12 16:36 /user/hadoop-user/output/solr/part-00001
It seems -getmerge can merge everything into one file, but it cannot write
to stdout, while 'cat' can write to stdout, but it seems I have to fetch
the parts one by one.
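For example, if '-cat' accepts a glob over the part files (I have not
verified this on my version), a single command like this should do it,
with solr.csv as an example local name:
hadoop dfs -cat 'hdfs://master:54310/user/hadoop-user/output/solr/part-*' > solr.csv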
Or am I missing something?
thanks,
alex
On Tue, Mar 16, 2010 at 11:28 AM, Varene Olivier wrote:
Hello Alex,
get writes a file to your local filesystem:
hadoop dfs [-get [-ignoreCrc] [-crc] <src> <localdst>]
with
src : the path of your file in HDFS
localdst : the name of the local file that will receive the data (from src)
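For example, with one of the part files from your listing (the local name
/tmp/part-00000.csv is just an example):
hadoop dfs -get /user/hadoop-user/output/solr/part-00000 /tmp/part-00000.csv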
To get the results to STDOUT,
you can use cat
hadoop dfs [-cat <src>]
with src : the path of your file in HDFS
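For example, again with a part file from your listing, the STDOUT can be
redirected into a local file:
hadoop dfs -cat /user/hadoop-user/output/solr/part-00000 > part-00000.csv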
Regards
Olivier
Alex Parvulescu wrote:
Hello,
Is there a reason why 'hadoop dfs -get' will not output to stdout?
I see 'hadoop dfs -put' can handle stdin. It would seem that dfs should
also support outputting to stdout.
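For example, this reads from STDIN ('-' as the source, at least on the
versions I have seen), with a made-up destination path:
echo 'a,b,c' | hadoop dfs -put - /user/hadoop-user/input/test.csv
But there does not seem to be an equivalent '-' destination for -get.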
thanks,
alex