I want to generate an hourly report on streaming stock data. Currently I have one spout that reads the data and passes it to a bolt; that bolt does some cleaning and passes the data on to a final bolt. The final bolt holds a HashMap whose key is the company name and whose value is an ArrayList of values computed from the stream; for any particular key there are 10-15 values in the list. Finally, I store the result in a physical file. If the volume of data grows, the HashMap will keep growing too, and at some point I will run out of memory. What is the best approach to adopt in this case?
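One common way to bound memory in a setup like the one described above is to flush the per-company map to the report file and clear it on a threshold, rather than letting it grow for the life of the topology. The sketch below is a plain-Java illustration of that idea, not actual Storm code: the class name `HourlyAggregator` and its methods are hypothetical, and in a real Storm bolt the hourly flush would typically be driven by a tick tuple instead of the entry-count cap used here.

```java
import java.util.*;

// Hypothetical sketch: per-company aggregation with a flush threshold so the
// in-memory map never grows without bound. In a real Storm final bolt the
// flush would be triggered by an hourly tick tuple; a simple key-count cap
// stands in for that trigger here.
class HourlyAggregator {
    private final Map<String, List<Double>> byCompany = new HashMap<>();
    private final int maxKeys; // flush once this many companies accumulate
    // Stands in for appending the results to the physical report file.
    private final List<Map<String, List<Double>>> flushedBatches = new ArrayList<>();

    HourlyAggregator(int maxKeys) {
        this.maxKeys = maxKeys;
    }

    // Called once per cleaned tuple arriving from the upstream bolt.
    void add(String company, double value) {
        byCompany.computeIfAbsent(company, k -> new ArrayList<>()).add(value);
        if (byCompany.size() >= maxKeys) {
            flush();
        }
    }

    // Write out the accumulated results and free the heap they occupied.
    void flush() {
        if (byCompany.isEmpty()) return;
        flushedBatches.add(new HashMap<>(byCompany)); // i.e. write to the file
        byCompany.clear();
    }

    int pendingKeys() { return byCompany.size(); }
    int flushCount() { return flushedBatches.size(); }
}
```

The same pattern works with a time-based trigger: on each tick, flush whatever has accumulated since the last tick, so the map only ever holds one window's worth of data. For data too large even for one window, the usual next step is to move the aggregation state out of the bolt's heap into an external store (e.g. a database keyed by company).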

Bhagwant Bhobe


Discussion Overview
group: storm-user
posted: Jun 28, '13 at 12:14a
active: Jun 28, '13 at 1:23a
