FAQ
The 2GB limitation only applies when you try to load the whole index into
memory on a 32-bit box.
Our index is larger than 13 gigabytes and it works fine.
I think there must be an error somewhere in your design. You can use Luke to
look at what is happening inside your index.
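
If you want to check programmatically rather than with Luke, a rough sketch
along these lines will print some basic statistics; the index path is a
placeholder and the code assumes the Lucene 1.9/2.0-era API:

    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.index.TermEnum;

    public class IndexStats {
        public static void main(String[] args) throws Exception {
            // Placeholder path; point it at the real index directory.
            IndexReader reader = IndexReader.open("D:/index");
            System.out.println("docs: " + reader.numDocs()
                    + " (maxDoc " + reader.maxDoc() + ")");

            // Walk the term dictionary; a very large term count is what makes
            // wildcard queries expand into so many clauses.
            TermEnum terms = reader.terms();
            long termCount = 0;
            while (terms.next()) {
                termCount++;
            }
            terms.close();
            System.out.println("terms: " + termCount);

            reader.close();
        }
    }
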
On 8/14/06, Van Nguyen wrote:

Hi,



I have a 7GB index (about 45 fields per document across roughly 5.5 million
docs) running on a Windows 2003 32-bit machine (dual proc, 2GB memory). The
index is optimized. A search on this index (a wildcard query with a sort)
just "hangs": at first CPU usage is 100%, then it drops to 50% after a minute
or so, and then there is no CPU utilization at all… but the thread is still
trying to perform the search. I've tried this both in my J2EE app and in a
standalone main program. Is this due to the 2GB limitation of the 32-bit OS?
(I didn't realize the index would be this big… I just let it run over the
weekend.)
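
For reference, the search is of this form (a wildcard query combined with a
sort); the index path and field names below are placeholders for the real
ones, assuming the Lucene API of that era:

    import org.apache.lucene.index.Term;
    import org.apache.lucene.search.Hits;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.Query;
    import org.apache.lucene.search.Sort;
    import org.apache.lucene.search.WildcardQuery;

    public class WildcardSortSearch {
        public static void main(String[] args) throws Exception {
            // Placeholder index path and field names.
            IndexSearcher searcher = new IndexSearcher("D:/index");

            // The wildcard query is rewritten into a query over every matching
            // term, and the sort fills a FieldCache entry for the whole index
            // on first use, so the first such search can be very expensive.
            Query query = new WildcardQuery(new Term("description", "rent*"));
            Hits hits = searcher.search(query, new Sort("name"));

            System.out.println("hits: " + hits.length());
            searcher.close();
        }
    }
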



If this is due to the 2GB limitation of the 32-bit OS, and since I have this
7GB index built (and optimized) already, is there a way to split it into
2GB indices without having to re-index? Or is this due to another factor?



Van

United Rentals
Consider it done.™
800-UR-RENTS
unitedrentals.com




--
Yueyu Lin
