I sent a message to Ken Williams <firstname.lastname@example.org> but have
had no answer. Perhaps some kind soul with some familiarity with the
matter can have a look at this. I have a script and data at
I want to rank a set of refused documents by best fit to a set of
accepted documents. For each refused document, I train on the set of all
documents except that single refused document, then test that single
refused document against the trained set. Here is my problem: at
each test the memory requirement of the running script seems to
increase. I have tried to free memory as much as I could by
unassigning the hypothesis object, the knowledge set object, etc., to
no avail; the memory hog slows my machine to a crawl as the
number of tests conducted goes into the hundreds.
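Since the script itself isn't shown here, a rough sketch of the leave-one-out
loop described above, written in Python for illustration (`train` and `score`
are hypothetical placeholders for whatever the real categorizer does, e.g.
building a knowledge set and producing a hypothesis):

```python
import gc

def leave_one_out_scores(accepted, refused, train, score):
    """For each refused doc, train on everything except it, then score it.

    `train` and `score` are hypothetical stand-ins for the real library
    calls; the point is the lifetime of the per-iteration objects.
    """
    results = []
    for i, doc in enumerate(refused):
        # Training pool: all accepted docs plus every refused doc but this one.
        pool = accepted + refused[:i] + refused[i + 1:]
        model = train(pool)          # fresh model per held-out document
        results.append((doc, score(model, doc)))
        # Drop the per-iteration objects before the next round, and run the
        # collector in case they contain reference cycles.
        del model, pool
        gc.collect()
    return results
```

If memory still grows after every per-iteration reference is dropped like
this, the leak is likely inside the library itself (state cached somewhere
that outlives the objects you unassign) rather than in the driving loop.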
Any hints greatly appreciated!
Thomas Krichel http://openlib.org/home/krichel