I have written a program that creates sequence files from given text files.
The program takes the following input parameters:
1. Local source directory - contains all the input text files
2. Destination HDFS URI - location on hdfs where sequence file will be copied
The key for a sequence-record is the file-name.
The value for a sequence-record is the content of the text file.
The program runs fine for a large number of input text files. But if the size of a single input text file is > 100 MB, it throws the following exception:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
I am using org.apache.hadoop.io.SequenceFile.Writer to create the sequence file, with the Text class as both the key class and the value class.
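For reference, my writer loop looks roughly like the following sketch (class and argument names are placeholders, and error handling is trimmed):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SeqFileWriterSketch {

    // The key for a sequence-record is just the file name, not the full local path.
    static String keyFor(File f) {
        return f.getName();
    }

    public static void main(String[] args) throws IOException {
        String localSrcDir = args[0];  // e.g. /data/input-texts (placeholder)
        String hdfsDestUri = args[1];  // e.g. hdfs://namenode:8020/out/texts.seq (placeholder)

        Configuration conf = new Configuration();
        SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(new Path(hdfsDestUri)),
                SequenceFile.Writer.keyClass(Text.class),
                SequenceFile.Writer.valueClass(Text.class));
        try {
            for (File f : new File(localSrcDir).listFiles()) {
                // The entire file content is materialized in memory here as one
                // Text value, which is presumably where the heap fills up for
                // very large input files.
                byte[] content = Files.readAllBytes(f.toPath());
                writer.append(new Text(keyFor(f)), new Text(content));
            }
        } finally {
            IOUtils.closeStream(writer);
        }
    }
}
```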
I tried increasing the maximum heap size for the program, but it throws the same error.
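This is roughly how I raised the heap (the -Xmx value and jar/class names are illustrative, not my actual settings):

```shell
# Raise the client-side JVM max heap before launching the job.
# HADOOP_CLIENT_OPTS is picked up by the `hadoop` launcher script.
export HADOOP_CLIENT_OPTS="-Xmx2048m"
# then run, e.g.:
#   hadoop jar seqwriter.jar com.example.SeqFileWriter <local-src-dir> <hdfs-dest-uri>
```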
Can you provide your suggestions?