I'm rephrasing a previous performance question, in light of new data...
I have a Lucene index of about 0.5 GB.
Currently, performance is good: at most 200 milliseconds per search (with
complex boolean queries, but never retrieving more than the top 200 results).
The question: how much can the index grow before there is noticeable
performance degradation?
1) Does anyone have production experience with, say, a 5 GB index? A 10 GB one?
If so, are there recommendations about merge policy, file sizes, and so on?
If performance does degrade, I have other solutions (involving a change in
application logic), but I don't want to go down that path unless necessary.
2) Also, about 5% of my documents are editable (= the application
occasionally deletes them and adds a modified document instead).
The other 95% are "immutable" (never deleted or edited).
Can Lucene take advantage of this? E.g., will it be smart enough to keep
the changes in a single small file (which alone needs optimizing), while
the other files remain unchanged?
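For what it's worth, the delete-then-re-add pattern described above is what Lucene's IndexWriter.updateDocument(Term, Document) does atomically. A minimal sketch, assuming a Lucene 2.x-era API (the class name UpdateSketch and the "id"/"body" field names are made up for illustration):

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.Term;
import org.apache.lucene.store.RAMDirectory;

public class UpdateSketch {
    public static void main(String[] args) throws Exception {
        // In-memory index just for the sketch; a real index would use FSDirectory.
        RAMDirectory dir = new RAMDirectory();
        IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(), true);

        // Index the original document, keyed by an untokenized "id" field.
        Document doc = new Document();
        doc.add(new Field("id", "42", Field.Store.YES, Field.Index.UN_TOKENIZED));
        doc.add(new Field("body", "original text", Field.Store.YES, Field.Index.TOKENIZED));
        writer.addDocument(doc);

        // "Edit" = delete every document matching the term, then add the new
        // version, as a single atomic operation.
        Document edited = new Document();
        edited.add(new Field("id", "42", Field.Store.YES, Field.Index.UN_TOKENIZED));
        edited.add(new Field("body", "modified text", Field.Store.YES, Field.Index.TOKENIZED));
        writer.updateDocument(new Term("id", "42"), edited);

        writer.close();
    }
}
```

Note that a delete only marks the document as deleted inside its segment; the space is reclaimed when that segment is merged or the index is optimized, so segments containing only the immutable documents are not rewritten by the update itself.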