Ioan, often hard drive speed limits you more than processor speed. That is,
it might be faster to load 5M from disk and unpack it than to load an
unpacked 25M from disk.
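The trade-off above can be sketched with the JDK's own GZIP streams: compress a large, repetitive buffer (like the 25M → 5M case mentioned), then measure the decompression step. This is an illustrative micro-benchmark only, not OpenNLP's actual loading path, and the numbers will depend entirely on your disk and CPU.

```java
import java.io.*;
import java.util.zip.*;

public class CompressedReadSketch {
    // Compress a byte array with GZIP.
    static byte[] gzip(byte[] data) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        }
        return bos.toByteArray();
    }

    // Decompress a GZIP byte array back to its original form.
    static byte[] gunzip(byte[] data) throws IOException {
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(data))) {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = gz.read(buf)) > 0) bos.write(buf, 0, n);
            return bos.toByteArray();
        }
    }

    public static void main(String[] args) throws IOException {
        // Highly repetitive data compresses well, similar to the 25M -> 5M ratio above.
        byte[] raw = new byte[25 * 1024 * 1024];
        byte[] packed = gzip(raw);
        System.out.println("raw=" + raw.length + " packed=" + packed.length);

        long t0 = System.nanoTime();
        byte[] unpacked = gunzip(packed);
        long unpackMs = (System.nanoTime() - t0) / 1_000_000;
        System.out.println("unpack took " + unpackMs + " ms, bytes=" + unpacked.length);
    }
}
```

If decompression time is small next to the saved disk I/O, the zipped form wins; profiling on the target machine is the only way to know for a given model.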


On Wed, Sep 4, 2013 at 11:59 AM, Jörn Kottmann wrote:
On 08/26/2013 03:00 PM, Ioan Barbulescu wrote:

Hi guys

Short question, please:

Currently, the opennlp models come as zipped files.

Is it possible to use them in an expanded / un-zipped form?
(and how?)

Zipped is very neat and clean, but it adds some time when reading the models.
I am interested in speeding up the load time as much as possible.

You can probably repackage the zip files without using compression.
Anyway, I doubt that it adds much time; did you profile the loading code?
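Repackaging without compression can be done with the JDK's `java.util.zip` alone: rewrite each entry as `ZipEntry.STORED`, which keeps the single-file packaging but skips deflation on read. A minimal sketch (note that STORED entries require the size and CRC to be set before writing):

```java
import java.io.*;
import java.util.zip.*;

public class ZipRepack {
    // Rewrite a zip file so that every entry is STORED (no compression).
    public static void repack(File in, File out) throws IOException {
        try (ZipFile zf = new ZipFile(in);
             ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(out))) {
            java.util.Enumeration<? extends ZipEntry> entries = zf.entries();
            while (entries.hasMoreElements()) {
                ZipEntry src = entries.nextElement();
                byte[] data = zf.getInputStream(src).readAllBytes();

                // STORED entries must declare size, compressed size, and CRC up front.
                CRC32 crc = new CRC32();
                crc.update(data);
                ZipEntry dst = new ZipEntry(src.getName());
                dst.setMethod(ZipEntry.STORED);
                dst.setSize(data.length);
                dst.setCompressedSize(data.length);
                dst.setCrc(crc.getValue());

                zos.putNextEntry(dst);
                zos.write(data);
                zos.closeEntry();
            }
        }
    }
}
```

The same effect is available from the command line with `zip -0`, which stores entries without deflating them.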

As far as I know, the slowest part is building the maxent model; maybe that
can be sped up. I have never profiled that part of OpenNLP.

The life-cycle of a model should be the same as that of your application;
maybe you can just find a way to reuse the models instead of loading them
over and over again.
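The reuse idea can be sketched as a small cache that loads each model at most once and hands the same instance to every caller. The class below is generic stdlib Java, not part of OpenNLP; the loader function is where you would plug in the actual model constructor.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Caches loaded models by path so each one is read from disk only once.
public class ModelCache<M> {
    private final Map<String, M> cache = new ConcurrentHashMap<>();
    private final Function<String, M> loader;

    public ModelCache(Function<String, M> loader) {
        this.loader = loader;
    }

    // First call for a path invokes the loader; later calls reuse the instance.
    public M get(String path) {
        return cache.computeIfAbsent(path, loader);
    }
}
```

With OpenNLP, the loader would presumably wrap something like `new TokenizerModel(new FileInputStream(path))`; OpenNLP model objects are safe to share across the tools that use them, so one cached instance per model file is enough.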


Discussion Overview
group: dev
posted: Aug 26, '13 at 1:01p
active: Sep 9, '13 at 4:12p
