I want to generate an hourly report on streaming stock data. Currently I have
one spout that reads the data and passes it to a bolt that does some cleaning;
that bolt passes the data on to a final bolt. The final bolt holds a HashMap
whose key is the company name and whose value is an ArrayList of values
produced by some calculations — each key ends up with 10-15 values in its
ArrayList. Finally, I write the result to a physical file. If the volume of
data grows, the HashMap will keep growing as well, and at some point I will
run out of memory. What is the best approach to adopt in this case?
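One common way to keep the map bounded is to flush and clear it on a fixed interval (in Storm this is often driven by a tick tuple) rather than letting state accumulate forever. The sketch below is plain Java, not the actual Storm bolt API — `HourlyAggregator`, `accumulate`, and `flush` are hypothetical names used only to illustrate the flush-and-clear pattern; in a real topology `flush()` would run on the hourly tick and write to your file.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch (not Storm API): per-company aggregation whose
// in-memory state is cleared on every periodic flush, so the map's size
// is bounded by one interval's worth of data, not the whole stream.
public class HourlyAggregator {
    // company name -> values computed for it during the current hour
    private final Map<String, List<Double>> state = new HashMap<>();
    // stands in for the physical file the report is written to
    private final List<String> reportLines = new ArrayList<>();

    // Called once per cleaned tuple (company, value).
    public void accumulate(String company, double value) {
        state.computeIfAbsent(company, k -> new ArrayList<>()).add(value);
    }

    // Called once per hour (e.g. on a Storm tick tuple): emit the report
    // for this interval, then clear the map to reclaim memory.
    public void flush() {
        for (Map.Entry<String, List<Double>> e : state.entrySet()) {
            reportLines.add(e.getKey() + " -> " + e.getValue());
        }
        state.clear();
    }

    public int inMemoryKeys() { return state.size(); }
    public List<String> reportLines() { return reportLines; }
}
```

If even one hour of state is too large to hold in memory, the usual next step is to move the aggregation out of the bolt entirely, e.g. into an external store such as Redis or HBase keyed by company name, keeping the bolt itself stateless.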