FAQ
We have a lot of serious problems because we lack a database of installed
distributions, releases and files. There are serious problems with
implementing one given A) the limitations of the standard Perl install and B)
wedging it into existing systems. But I think I have a solution. It's similar
to how metadata was slipped into the ecosystem without requiring authors to
rewrite their releases or install a bunch of extra modules: it just happens
as part of the normal CPAN module upgrade process.

I've been thinking that a minimal package database could be created by putting
some hooks into ExtUtils::Install::install(), which every Perl build system
ultimately uses, to record what gets installed. That way when
ExtUtils::Install is upgraded, the user gets a build database without
upgrading everything else.

This would be a fairly straightforward process at install time...

1) Copy everything to a temp directory
2) Record everything in that temp directory
3) Copy everything from temp into the real location

You could probably optimize this by skipping the copy to temp and just have
install() record stuff as it goes by, but this is the dumb, simple, robust way
to do it.
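The three steps might look something like this. A minimal sketch, assuming absolute source and destination paths; `install_with_recording()` is a hypothetical name, and a real version would live behind ExtUtils::Install::install().

```perl
#!/usr/bin/perl
# Hypothetical sketch of the dumb, simple, robust install-and-record flow.
use strict;
use warnings;
use File::Temp qw(tempdir);
use File::Find qw(find);
use File::Copy qw(copy);
use File::Path qw(make_path);
use File::Spec;

sub install_with_recording {
    my ($from, $to) = @_;

    # 1) Copy everything to a temp directory.
    my $temp = tempdir(CLEANUP => 1);
    _copy_tree($from, $temp);

    # 2) Record everything in that temp directory.
    my @files;
    find({ no_chdir => 1, wanted => sub {
        push @files, File::Spec->abs2rel($File::Find::name, $temp)
            if -f $File::Find::name;
    } }, $temp);

    # 3) Copy everything from temp into the real location.
    _copy_tree($temp, $to);

    return \@files;    # what the real hook would write to the database
}

sub _copy_tree {
    my ($src, $dst) = @_;
    find({ no_chdir => 1, wanted => sub {
        return unless -f $File::Find::name;
        my $rel  = File::Spec->abs2rel($File::Find::name, $src);
        my $dest = File::Spec->catfile($dst, $rel);
        make_path((File::Spec->splitpath($dest))[1]);
        copy($File::Find::name, $dest) or die "copy failed: $!";
    } }, $src);
}
```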

Storage is a problem. The only reliable "database" Perl ships with is DBM, an
on-disk hash, so we can't get too fancy. It might take several DBM files, but
this is enough to record information and do simple queries. What are those
queries?

* What version of the database is this?
* What distributions are installed?
* What release of a distribution is installed?
* What files are in that release?
* What version is that release?
* What location was a release installed into? (core, vendor, site, custom)
* What are the checksums of those files?

And the basic operations we need to support.

* Add a release (i.e. install).
* Delete a release (and its files).
* Delete an older version of a release (as part of install).
* Delete an older version of a release, only if it's in the same install
location. This is so CPAN installs don't delete vendor-installed modules.
* Verify the files of a release.
* List distributions/releases installed.
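The trickiest operation above is the location-guarded delete. A sketch of it, where the record layout (`$db->{$dist}` hashrefs with `version`, `location`, and `files`) is a hypothetical illustration:

```perl
# Hypothetical: delete an older release only if it is in the same
# install location as the incoming one.
use strict;
use warnings;
use version;

sub delete_older_release_if_same_location {
    my ($db, $dist, $new_release) = @_;
    my $old = $db->{$dist} or return 0;    # nothing installed, nothing to do

    # A CPAN install into "site" must not delete a vendor-installed copy.
    return 0 unless $old->{location} eq $new_release->{location};
    return 0 unless version->parse($old->{version})
                  < version->parse($new_release->{version});

    unlink @{ $old->{files} };
    delete $db->{$dist};
    return 1;
}
```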

It would also store the MYMETA data which gives us a lot of information (such
as dependencies) for free.

This is all totally doable, and efficient enough, with a small pile of DBM
files and Storable. Where to put the database is a bit more complicated, see
the list of open problems below.
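A minimal sketch of that storage layer: one SDBM file (the DBM that always ships with Perl) with Storable-frozen records as values. Note SDBM limits each key+value pair to roughly 1KB, which is one reason several DBM files may be needed; the function names here are hypothetical.

```perl
# Hypothetical DBM + Storable storage layer for the install database.
use strict;
use warnings;
use Fcntl;
use SDBM_File;
use Storable qw(freeze thaw);

sub open_install_db {
    my ($path) = @_;
    tie my %db, 'SDBM_File', $path, O_RDWR | O_CREAT, 0644
        or die "Cannot open install db at $path: $!";
    return \%db;
}

sub record_release {
    my ($db, $dist, $release) = @_;
    $db->{$dist} = freeze($release);    # structured record in a flat DBM
}

sub get_release {
    my ($db, $dist) = @_;
    return exists $db->{$dist} ? thaw($db->{$dist}) : undef;
}
```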

There's lots and lots of additional information that could be stored, and
queries and operations to allow, but if we can get the basics working it'll
allow a heap of new solutions. And I think this is a SMOP.


Future possibilities include...

* Auto-upgrade to SQLite if ExtUtils::Install::DB::SQLite is installed.

If a special module is installed we can offer SQLite support (or whatever) for
a more advanced database. At install time it would copy the existing DBM
system into its own database.
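The detection itself could be as simple as trying to load the optional backend. ExtUtils::Install::DB::SQLite is the hypothetical module named above, and the DBM fallback name is equally hypothetical:

```perl
# Hypothetical backend detection: prefer the optional SQLite module if
# it loads, otherwise fall back to the always-available DBM database.
use strict;
use warnings;

sub best_backend {
    return 'ExtUtils::Install::DB::SQLite'
        if eval { require ExtUtils::Install::DB::SQLite; 1 };
    return 'ExtUtils::Install::DB::DBM';    # always-available fallback
}
```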

In general, more functionality can be added as more optional (or bundled)
dependencies are available to the system. Through it all the basic DBM
database would continue to be redundantly maintained to provide a fallback
should those optional modules break or go away.

* Extra hooks into the install system.

ExtUtils::Install is sort of a black box. If it started to do more than just
copy files it would need a more interesting API. Rather than trying to cram
more options into install() it would be worthwhile to write a new API. Build
systems can check for the existence of the new API and use that if available
and do more interesting things with the database. This would be necessary to
support uninstall.


Problems include...

* Anything installed before the new ExtUtils::Install is lost.

Just have to live with that. It will slowly go away as the new
ExtUtils::Install gets into core, Perl is released, and vendor Perls update
their Perl or core modules. It'll take time, but we're in this for the long
run.

* Anything installed outside the normal blib process is lost.

Initially, this is an acceptable loss.

Ideally the install process would be expanded to better deal with things which
are not Perl libraries or programs. Build systems which have their own
methods of installing these things could add them directly using the install
database API. A lot of hand waving here.

* Upgrading the database.

I'd like to put some thought into how things are laid out initially to avoid a
lot of major revisions, and thought into what information should be recorded
so it's available later, but eventually we're going to want to change the
"schema", such as it is with DBM files.

I figure this can happen as part of upgrading ExtUtils::Install. It checks
what version of the database you have and performs the necessary transforms to
bring it up to the current version. We know how to do this, just have to keep
it in mind and remember to implement it.
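That upgrade step could walk the recorded schema version up to the current one, one transform at a time. A sketch, where the transforms themselves are hypothetical placeholders:

```perl
# Hypothetical version-stepped schema upgrades for the install database.
use strict;
use warnings;

our $CURRENT_SCHEMA = 3;
my %upgrade_from = (
    1 => sub { $_[0]{checksums} ||= {} },    # 1 -> 2: add checksum table
    2 => sub { $_[0]{locations} ||= {} },    # 2 -> 3: add location table
);

sub upgrade_db {
    my ($db) = @_;
    $db->{schema_version} ||= 1;
    while ($db->{schema_version} < $CURRENT_SCHEMA) {
        my $step = $upgrade_from{ $db->{schema_version} }
            or die "No upgrade path from schema version $db->{schema_version}";
        $step->($db);
        $db->{schema_version}++;
    }
    return $db;
}
```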

* Where to put the database? What about non-standard install locations?

$Config{archlib} would seem the obvious location, but it presents a
permissions problem. If a non-root user installs into their home
directory, you don't want them needing root to write to the installation
database. There are several ways to deal with this.

One is to simply not record non-standard install locations, but this loses
data and punishes all those local::lib users out there.

Another is to have a separate install database for non-standard install
locations. This makes sense to me, but it brings in the sticky problem
of having to merge install databases. Sticky, but still a SMOP. Once you
have to implement merging anyway, it now makes sense to have an install
database for each install location. One for core. One for vendor. One for
site. And one for each custom location. This better fits how Perl layers
module installs, with several advantages:

* allows separation of permissions
* allows queries of what's installed based on what's in @INC

That second one is important. When a normal user queries the database, they
want to get what's installed in the standard library location. When a
local::lib user queries the database, they want to get what's installed in the
standard library locations AND their own local lib.
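That merged query could work like this: given one database per install location, ordered like @INC, the first location seen wins, just as @INC shadowing works for modules. A sketch, assuming each database is a hashref of dist => release record:

```perl
# Hypothetical merge of per-location install databases, in @INC order.
use strict;
use warnings;

sub merge_install_dbs {
    my @dbs = @_;    # one hashref per install location, in @INC order
    my %merged;
    for my $db (@dbs) {
        # First database wins, mirroring how @INC shadows modules.
        $merged{$_} //= $db->{$_} for keys %$db;
    }
    return \%merged;
}
```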


In summary...

Not perfect, but it gets us off the ground. It's not a great database, but it
does the important job of recording the critical install-time data for later
use. It's implementable within the current system. It doesn't require a bunch
of dependencies, just one upgrade. It works with most existing module
releases. It solves a major design problem with the Perl module system.

I think it's a Simple(?!) Matter Of Programming in ExtUtils::Install to get it
off the ground. IMO the most important bit of coordination is putting some
thought into what the basic database should look like so we don't have to
worry about complicated upgrades later.

Thoughts?

  • Alberto Simoes at Dec 16, 2012 at 11:09 am
    Thoughts?
    Seems a good idea.
    Anybody willing to submit a TPF grant to work on this? O:-)
  • Johan Vromans at Dec 16, 2012 at 5:35 pm

    Michael G Schwern writes:

    We have a lot of serious problems because we lack a database of
    installed distributions, releases and files.
    No, that is not our problem.

    Our problem is that we want to handle it ourselves. This may have been a
    good approach in the dark ages, but nowadays there are better solutions.

    For example, on Sat, 15 Dec 2012 14:15:24 -0800 you wrote:
    When it comes to packaging, my first thought is always WDDD? What Does
    Debian Do? They usually do it right.
    Debian, and most other systems have decent package- and install
    managers. *They* maintain the database with installed distributions,
    releases and files. The only good approach for us is to play with them.

    So, an enhanced META.yaml or whatsoever may be a good idea, but *only*
    to generate a deb control file, or rpm spec file, or innosetup file and
    so on.

    It is no longer necessary to handle everything ourselves. We're not
    alone anymore.

    -- Johan
  • Leon Timmermans at Dec 16, 2012 at 9:11 pm

    On Sun, Dec 16, 2012 at 6:34 PM, Johan Vromans wrote:
    Debian, and most other systems have decent package- and install
    managers. *They* maintain the database with installed distributions,
    releases and files. The only good approach for us is to play with them.

    So, an enhanced META.yaml or whatsoever may be a good idea, but *only*
    to generate a deb control file, or rpm spec file, or innosetup file and
    so on.

    It is no longer neccessary to handle everything ourselves. We're not
    alone anymore.
    There are many ways to deploy stuff, not everyone uses rpm/deb, there
    are good reasons not to do so: for starters it assumes you have root
    privileges.

    Leon
  • Johan Vromans at Dec 16, 2012 at 9:24 pm
    [Quoting Leon Timmermans, on December 16 2012, 22:10, in "Re: How To Build A P"]
    There are many ways to deploy stuff, not everyone uses rpm/deb,
    there are good reasons not to do so: for starters it assumes you
    have root privileges.
    Reality overtook these ancient views a long time ago.

    Sudo wrappers help nonroot users.

    On Android and iOS, users are not root yet they install apps all of
    the time.

    And so on.

    Time to change views.

    -- Johan
  • Alberto Simões at Dec 16, 2012 at 9:31 pm

    On 16/12/12 21:23, Johan Vromans wrote:
    [Quoting Leon Timmermans, on December 16 2012, 22:10, in "Re: How To Build A P"]
    There are many ways to deploy stuff, not everyone uses rpm/deb,
    there are good reasons not to do so: for starters it assumes you
    have root privileges.
    Reality has overtaken these ancient views for a long time already.

    Sudo wrappers help nonroot users.
    In fact, there is a fair share of users on shared servers. But on the
    other hand, there are a lot of users installing a local Perl for each
    of their projects, knowing that this way upgrading a module for one of
    their projects will not ruin other projects running on the same
    machine.

    Therefore, restricting this magic to system-wide modules doesn't seem
    fair, nor really useful.
    On Android and iOS, users are not root yet they install apps all of
    the time.
    iOS is single-user AFAIK. Android at the moment is single-user as well
    (with some new attempts at multi-user). So not a great example.

    My 2 cents,
    ambs
  • Michael G Schwern at Dec 17, 2012 at 12:43 am

    On 2012.12.16 1:10 PM, Leon Timmermans wrote:
    On Sun, Dec 16, 2012 at 6:34 PM, Johan Vromans wrote:
    Debian, and most other systems have decent package- and install
    managers. *They* maintain the database with installed distributions,
    releases and files. The only good approach for us is to play with them.

    So, an enhanced META.yaml or whatsoever may be a good idea, but *only*
    to generate a deb control file, or rpm spec file, or innosetup file and
    so on.

    It is no longer neccessary to handle everything ourselves. We're not
    alone anymore.
    There are many ways to deploy stuff, not everyone uses rpm/deb, there
    are good reasons not to do so: for starters it assumes you have root
    privileges.
    I agree with Johan that there are better ways to do this. I also agree with
    Leon that they are not available to all users and operating systems.

    I would much, much rather use an existing, reliable system than build our own.
    Really really really! But Perl is a cross platform, cross environment
    language. Not all the environments we work with have good package managers.
    Off the top of my head: Windows, a huge, hidden number of our users, has
    nothing usable, and the Solaris package manager is a joke. Not all users in
    those environments have access to those package managers. Not everybody wants
    to use them to manage Perl. For example, cpan2rpm is a commendable effort but
    also a nightmare.

    In addition, there are some people who live on one OS and are happy and
    comfortable with how that OS does things. And there are people who happily
    jump from OS to OS and want to do things the Perl way. It's nice to be able to
    sit down on any given machine, grab cpanm, perlbrew and local::lib and get
    everything running smoothly.

    OS packages are oriented towards one install per machine. Perl is per project
    and/or per user. Furthermore, Perl is part of the operating system. Making
    the needs of your project not conflict with the needs of the operating system
    is tricky. There are ways to shoehorn OS packages to do these things, but it
    doesn't work well, many admins don't know how to do that, and it's different
    from OS to OS.

    If none of that convinces you, I'll say this: we've been advancing the "build
    OS packages" route for years now and it's never really worked. There's tons of
    Perl shops out there with tangled, messed up Perl/CPAN installations who find
    it difficult to upgrade in part because they can't replicate their
    installation. My current client is one of them. My best practice for dealing
    with big Perl installs is to make one giant OS package of a non-system perl
    and all the necessary CPAN modules rather than making individual CPAN module
    packages and their interdependencies come out right. This is kind of gross,
    but it's manageable.

    Most shops that make it work have a Perl expert on hand. They should not need
    one.

    This is in no way in conflict with advancing cpan2package tools. If done
    right the package database will make packaging Perl modules *easier*. Those
    same hooks and APIs I talked about to allow for enhanced package database
    functionality could also be used by OS build systems. The needs of the
    package database will force modules to become more normalized and provide more
    and better metadata. At worst, it won't make it any more difficult.

    Open Source is not a zero sum game.


    --
    The interface should be as clean as newly fallen snow and its behavior
    as explicit as Japanese eel porn.
  • Leon Timmermans at Dec 16, 2012 at 7:58 pm

    On Sat, Dec 15, 2012 at 11:59 PM, Michael G Schwern wrote:
    We have a lot of serious problems because we lack a database of installed
    distributions, releases and files. There are serious problems with
    implementing one given A) the limitations of the standard Perl install and B)
    wedging it into existing systems. But I think I have a solution. Its similar
    to how meta data was slipped into the ecosystem without requiring authors to
    rewrite their releases or install a bunch of extra modules. It just happens
    as part of the normal CPAN module upgrade process.

    I've been thinking that a minimal package database could be created by putting
    some hooks into ExtUtils::Install::install(), which every Perl build system
    ultimately uses, to record what gets installed. That way when
    ExtUtils::Install is upgraded, the user gets a build database without
    upgrading everything else.

    This would be a fairly straight forward process at install time...

    1) Copy everything to a temp directory
    2) Record everything in that temp directory
    3) Copy everything from temp into the real location

    You could probably optimize this by skipping the copy to temp and just have
    install() record stuff as it goes by, but this is the dumb, simple, robust way
    to do it.

    Storage is a problem. The only reliable "database" Perl ships with is DBM, an
    on disk hash, so we can't get too fancy. It might take several DBM files, but
    this is enough to record information and do simple queries. What are those
    queries?

    * What version of the database is this?
    * What distributions are installed?
    * What release of a distribution is installed?
    * What files are in that release?
    * What version is that release?
    * What location was a release installed into? (core, vendor, site, custom)
    * What are the checksums of those files?

    And the basic operations we need to support.

    * Add a release (ie. install).
    * Delete a release (and its files).
    * Delete an older version of a release (as part of install).
    * Delete an older version of a release, only if its in the same release
    location. This is so CPAN installs don't delete vendor installed modules.
    * Verify the files of a release.
    * List distributions/releases installed.

    It would also store the MYMETA data which gives us a lot of information (such
    as dependencies) for free.
    I can agree with all of that. Actually, starting a discussion about
    this was on my todo-list for the last QA hackathon but I didn't get
    around to it. Ideally, it should replace not only packlists but also
    perllocal.
    This is all totally doable, and efficient enough, with a small pile of DBM
    files and Storable. Where to put the database is a bit more complicated, see
    the list of open problems below.
    Given that Storable's format isn't forward-compatible, something more
    stable such as JSON would be more appropriate.
    There's lots and lots and lots of additional information which could be stored
    and queries and operations to allow, but if we can get the basics working
    it'll allow a heap of new solutions. And I think this is a SMOP.


    Future possibilities include...

    * Auto-upgrade to SQLite if ExtUtils::Install::DB::SQLite is installed.

    If a special module is installed we can offer SQLite support (or whatever) for
    a more advanced database. At install time it would copy the existing DBM
    system into its own database.

    In general, more functionality can be added as more optional (or bundled)
    dependencies are available to the system. Through it all the basic DBM
    database would continue to be redundantly maintained to provide a fallback
    should those optional modules break or go away.
    Having a proper database would be really nice, but I'm not sure if
    it's going to be worth the hassle if we have a robust system already.
    * Upgrading the database.

    I'd like to put some thought into how things are laid out initially to avoid a
    lot of major revisions, and thought into what information should be recorded
    so its available later, but eventually we're going to want to change the
    "schema", such as it is with DBM files.

    I figure this can happen as part of upgrading ExtUtils::Install. It checks
    what version of the database you have and performs the necessary transforms to
    bring it up to the current version. We know how to do this, just have to keep
    it in mind and remember to implement it.

    * Where to put the database? What about non-standard install locations?

    $Config{archlib} would seem the obvious location, but it presents a
    permissions problem. If a non-root user installs into their home
    directory, you don't want them needing root to write to the installation
    database. There's several ways to deal with this.

    One is to simply not record non-standard install locations, but this loses
    data and punishes all those local::lib users out there.

    Another is to have a separate install database for non-standard install
    locations. This makes sense to me, but it brings in the sticky problem
    of having to merge install databases. Sticky, but still a SMOP. Once you
    have to implement merging anyway, it now makes sense to have an install
    database for each install location. One for core. One for vendor. One for
    perl. And one for each custom location. This has a lot of advantages to
    better fit how Perl layers module installs.

    * allows separation of permissions
    * allows queries of what's installed based on what's in @INC

    That second one is important. When a normal user queries the database, they
    want to get what's installed in the standard library location. When a
    local::lib user queries the database, they want to get what's installed in the
    standard library locations AND their own local lib.
    The combination of these is problematic. You might upgrade EU::Install
    in your local module path, but not have write permissions on the
    system paths. In practice, we might have to support all our older
    versions :-|
    Not perfect, but gets us off the ground. Its not a great database, but it
    does the important job of recording the critical install-time data for later
    use. Its implementable within the current system. It doesn't require a bunch
    of dependencies, just one upgrade. It works with most existing module
    releases. It solves a major design problem with the Perl module system.

    I think it's a Simple(?!) Matter Of Programming in ExtUtils::Install to get it
    off the ground. IMO the most important bit of coordination is putting some
    thought into what the basic database should look like so we don't have to
    worry about complicated upgrades later.
    I'm not sure it's as simple as you make it sound, but it is a good
    idea nonetheless.

    Leon
  • Michael G Schwern at Dec 17, 2012 at 12:54 am

    On 2012.12.16 11:57 AM, Leon Timmermans wrote:
    I can agree with all of that. Actually, starting a discussion about
    this was on my todo-list for the last QA hackathon but I didn't get
    around to it. Ideally, it should replace not only packlists but also
    perllocal
    I was thinking about what you said about packlists, and I wonder how much
    information one could scrape out of them. Would it be enough to reconstruct
    at least that a group of files belongs to a release? That would be enough to
    be able to fully uninstall a requested module. For example, if the user asks
    to uninstall ExtUtils::MakeMaker the database could have seen that
    ExtUtils/MakeMaker.pm was in a packlist together with ExtUtils/MM_Unix.pm and
    so on and uninstall them. Probably, given that their original purpose was to
    provide an uninstaller.
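    Scraping a .packlist is simple enough to sketch: one installed file per
    line, optionally followed by key=value options, which is enough to recover
    "this group of files belongs to that release". A hypothetical helper:

```perl
# Hypothetical .packlist scraper: returns the installed file paths,
# dropping any trailing key=value options on each line.
use strict;
use warnings;

sub read_packlist {
    my ($path) = @_;
    open my $fh, '<', $path or die "Cannot read $path: $!";
    my @files;
    while (my $line = <$fh>) {
        chomp $line;
        next unless length $line;
        my ($file) = split /\s+/, $line, 2;    # drop trailing options
        push @files, $file;
    }
    close $fh;
    return \@files;
}
```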

    Also what's with this .meta directory I see popping up? I missed a memo.

    This is all totally doable, and efficient enough, with a small pile of DBM
    files and Storable. Where to put the database is a bit more complicated, see
    the list of open problems below.
    Given that Storable's format isn't forward-compatible, something more
    stable such as JSON would be more appropriate.
    That's a good point about Storable. JSON requires a dependency.
    ExtUtils::Install could bundle JSON::PP, but it would be simpler to use
    Data::Dumper. It makes de/serialization faster and simpler. The main
    disadvantage is it's only readable by Perl, but that's ok since this pile of
    DBM files will be opaque to everything but the Perl API. Too much of a mess
    to contemplate otherwise.
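    The Data::Dumper approach could look like this: dump with stable
    settings, read back with eval. A sketch with hypothetical function
    names; eval'ing is only tolerable because the install database reads
    files it wrote itself.

```perl
# Hypothetical Data::Dumper round-trip for install database records.
use strict;
use warnings;
use Data::Dumper;

sub serialize_record {
    my ($data) = @_;
    local $Data::Dumper::Terse    = 1;    # no leading '$VAR1 ='
    local $Data::Dumper::Sortkeys = 1;    # stable output across runs
    return Dumper($data);
}

sub deserialize_record {
    my ($text) = @_;
    my $data = eval $text;    # only for files the database itself wrote
    die "Corrupt record: $@" if $@;
    return $data;
}
```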

    * Auto-upgrade to SQLite if ExtUtils::Install::DB::SQLite is installed.

    If a special module is installed we can offer SQLite support (or whatever) for
    a more advanced database. At install time it would copy the existing DBM
    system into its own database.

    In general, more functionality can be added as more optional (or bundled)
    dependencies are available to the system. Through it all the basic DBM
    database would continue to be redundantly maintained to provide a fallback
    should those optional modules break or go away.
    Having a proper database would be really nice, but I'm not sure if
    it's going to be worth the hassle if we have a robust system already.
    That's the great thing about Open Source and possibilities. You just have to
    make the possibilities available and let someone surprise you!

    * Where to put the database? What about non-standard install locations?

    $Config{archlib} would seem the obvious location, but it presents a
    permissions problem. If a non-root user installs into their home
    directory, you don't want them needing root to write to the installation
    database. There's several ways to deal with this.

    One is to simply not record non-standard install locations, but this loses
    data and punishes all those local::lib users out there.

    Another is to have a separate install database for non-standard install
    locations. This makes sense to me, but it brings in the sticky problem
    of having to merge install databases. Sticky, but still a SMOP. Once you
    have to implement merging anyway, it now makes sense to have an install
    database for each install location. One for core. One for vendor. One for
    perl. And one for each custom location. This has a lot of advantages to
    better fit how Perl layers module installs.

    * allows separation of permissions
    * allows queries of what's installed based on what's in @INC

    That second one is important. When a normal user queries the database, they
    want to get what's installed in the standard library location. When a
    local::lib user queries the database, they want to get what's installed in the
    standard library locations AND their own local lib.
    The combination of these is problematic. You might upgrade EU::Install
    in your local module path, but not have write permissions on the
    system paths. In practice, we might have to support all our older
    versions :-|
    Erg, good point. That very likely scenario is definitely going to require
    some thought.

    Not perfect, but gets us off the ground. Its not a great database, but it
    does the important job of recording the critical install-time data for later
    use. Its implementable within the current system. It doesn't require a bunch
    of dependencies, just one upgrade. It works with most existing module
    releases. It solves a major design problem with the Perl module system.

    I think it's a Simple(?!) Matter Of Programming in ExtUtils::Install to get it
    off the ground. IMO the most important bit of coordination is putting some
    thought into what the basic database should look like so we don't have to
    worry about complicated upgrades later.
    I'm not sure it's as simple as you make it sound, but it is a good
    idea nonetheless.
    Many devils in the details. Glad you like it. Thanks for looking it over.


    --
    60. "The Giant Space Ants" are not at the top of my chain of command.
    -- The 213 Things Skippy Is No Longer Allowed To Do In The U.S. Army
    http://skippyslist.com/list/
  • Leon Timmermans at Dec 17, 2012 at 1:31 am

    On Mon, Dec 17, 2012 at 1:53 AM, Michael G Schwern wrote:
    I was thinking about what you said about packlists, and I wonder how much
    information one could scrape out of them. Would it be enough to reconstruct
    at least that a group of files belongs to a release? That would be enough to
    be able to fully uninstall a requested module. For example, if the user asks
    to uninstall ExtUtils::MakeMaker the database could have seen that
    ExtUtils/MakeMaker.pm was in a packlist together with ExtUtils/MM_Unix.pm and
    so on and uninstall them. Probably given their original purpose was to
    provide an uninstaller.
    You can use them to uninstall (I assume that's the reason why Debian
    disables them for vendor packages). It can get a little messy when
    modules are split or some such, but that's relatively rare anyway.
    Also what's with this .meta directory I see popping up? I missed a memo.
    AFAIK that's cpanminus specific. AFAIK it stores meta information so
    that carton can use it. Ask Miyagawa for the details.
    That's a good point about Storable. JSON requires a dependency.
    ExtUtils::Install could bundle JSON::PP, but it would be simpler to use
    Data::Dumper. It makes de/serialization faster and simpler. The main
    disadvantage is its only readable by Perl, but that's ok since this pile of
    DBM files will be opaque to everything but the Perl API. Too much of a mess
    to contemplate otherwise.
    JSON::PP is already in modern perl releases, so it only requires a
    dependency on older perls.

    Leon
  • Tim Bunce at Dec 17, 2012 at 8:36 am

    On Sun, Dec 16, 2012 at 04:53:49PM -0800, Michael G Schwern wrote:
    On 2012.12.16 11:57 AM, Leon Timmermans wrote:

    * Where to put the database? What about non-standard install locations?
    Another is to have a separate install database for non-standard install
    locations.
    A separate install database for each install location seems like the only
    workable approach.
    This makes sense to me, but it brings in the sticky problem
    of having to merge install databases. Sticky, but still a SMOP. Once you
    have to implement merging anyway, it now makes sense to have an install
    database for each install location. One for core. One for vendor. One for
    perl. And one for each custom location. This has a lot of advantages to
    better fit how Perl layers module installs.

    * allows separation of permissions
    * allows queries of what's installed based on what's in @INC
    Perhaps that could be taken one step further: one per installed distribution.

    Then, what's kept at each install location is a cached summary of what's
    installed below it. One that can be cross-checked against the individual
    distribution 'databases' and rebuilt from them.

    That seems more robust against various kinds of 'damage'.
    That second one is important. When a normal user queries the database, they
    want to get what's installed in the standard library location. When a
    local::lib user queries the database, they want to get what's installed in the
    standard library locations AND their own local lib.
    I.e., the default view is "what's installed in my @INC".
    The combination of these is problematic. You might upgrade EU::Install
    in your local module path, but not have write permissions on the
    system paths. In practice, we might have to support all our older
    versions :-|
    Erg, good point. That very likely scenario is definitely going to require
    some thought.
    *nods*

    Here's where "one install database per distribution with a cache
    database at the install location" offers another benefit.
    The "per distribution install database" can be kept in a very simple
    plain text format that targets readability and future-proofing,
    while the "cache database at the install location" can target
    performance.

    If an install location has an incompatible version of the db,
    the per distribution dbs could be read instead. That's slow but workable
    and seems reasonable for that presumably uncommon case.
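    That layout might be sketched as a trivially readable plain-text record
    per distribution, plus a cache that is pure derived data, rebuildable at
    any time. The key: value line format and function names are hypothetical
    illustrations:

```perl
# Hypothetical per-distribution plain-text record plus rebuildable cache.
use strict;
use warnings;

sub parse_dist_record {
    my ($text) = @_;
    my %record;
    for my $line (split /\n/, $text) {
        next unless $line =~ /^(\w+):\s*(.*)$/;
        push @{ $record{$1} }, $2;    # repeatable keys, e.g. one per file
    }
    return \%record;
}

sub rebuild_cache {
    # The cache is derived data: throw it away and regenerate it from
    # the per-distribution records whenever it is missing or incompatible.
    my (@records) = @_;
    my %cache;
    $cache{ $_->{distribution}[0] } = $_ for @records;
    return \%cache;
}
```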
    I can think of a few further options as well.

    Tim.
  • Adam Kennedy at Dec 17, 2012 at 5:42 pm
    Packlist 2.0

    MYMETA + installed file details

    ?
    On Dec 17, 2012 12:36 AM, "Tim Bunce" wrote:
    On Sun, Dec 16, 2012 at 04:53:49PM -0800, Michael G Schwern wrote:
    On 2012.12.16 11:57 AM, Leon Timmermans wrote:

    * Where to put the database? What about non-standard install
    locations?
    Another is to have a separate install database for non-standard
    install
    locations.
    A separate install database for each install location seems like the only
    workable approach.
    This makes sense to me, but it brings in the sticky problem
    of having to merge install databases. Sticky, but still a SMOP.
    Once you
    have to implement merging anyway, it now makes sense to have an
    install
    database for each install location. One for core. One for vendor.
    One for
    perl. And one for each custom location. This has a lot of
    advantages to
    better fit how Perl layers module installs.

    * allows separation of permissions
    * allows queries of what's installed based on what's in @INC
    Perhaps that could be taken one step further: one per installed
    distribution.

    Then, what's kept at each install location is a cached summary of what's
    installed below it. One that can be cross-checked against the individual
    distribution 'databases' and rebuilt from it.

    That seems more robust against various kinds of 'damage'.
    That second one is important. When a normal user queries the database,
    they want to get what's installed in the standard library location.
    When a local::lib user queries the database, they want to get what's
    installed in the standard library locations AND their own local lib.
    I.e., the default view is "what's installed in my @INC".
    The combination of these is problematic. You might upgrade EU::Install
    in your local module path, but not have write permissions on the
    system paths. In practice, we might have to support all our older
    versions :-|
    Erg, good point. That very likely scenario is definitely going to require
    some thought.
    *nods*

    Here's where "one install database per distribution with a cache
    database at the install location" offers another benefit.
    The "per distribution install database" can be kept in a very simple
    plain text format that targets readability and future-proofing,
    while the "cache database at the install location" can target
    performance.

    If an install location has an incompatible version of the db,
    the per distribution dbs could be read instead. That's slow but workable
    and seems reasonable for that presumably uncommon case.
    I can think of a few further options as well.

    Tim.
  • Ask Bjørn Hansen at Dec 17, 2012 at 7:24 pm

    On Dec 17, 2012, at 9:36, Tim Bunce wrote:

    A separate install database for each install location seems like the only
    workable approach.
    It seems to me that the database indeed will have to be[1] "per library root" and the tools using the database will need to know to do lookups everywhere and merge the results.


    Ask

    [1] Where "will have to" means "to fit *my* universe".
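The "do lookups everywhere and merge the results" behaviour could be sketched as below. The data shapes are assumptions (each per-location database modelled as a plain hash of dist => info); the point is only the merge rule: walk locations in @INC order, with earlier locations shadowing later ones, so the default view is "what's installed in my @INC".

```perl
use strict;
use warnings;

# Merge per-location install databases, ordered like @INC.
# First (highest-priority) location wins for each distribution.
sub merged_view {
    my (@location_dbs) = @_;
    my %merged;
    for my $db (@location_dbs) {
        for my $dist (keys %$db) {
            $merged{$dist} //= $db->{$dist};
        }
    }
    return \%merged;
}
```

A local::lib user would pass their local database first, then site, then core, and see local upgrades shadow the system copies.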
  • Tim Bunce at Dec 20, 2012 at 10:38 am

    On Mon, Dec 17, 2012 at 08:23:51PM +0100, Ask Bjørn Hansen wrote:
    On Dec 17, 2012, at 9:36, Tim Bunce wrote:

    A separate install database for each install location seems like the only
    workable approach.
    It seems to me that the database indeed will have to be[1] "per library root" and the tools using the database will need to know to do lookups everywhere and merge the results.

    Ask

    [1] Where "will have to" means "to fit *my* universe".
    Yes, that's what I had in mind.

    Tim.
  • Leon Timmermans at Dec 20, 2012 at 10:42 am

    On Mon, Dec 17, 2012 at 9:36 AM, Tim Bunce wrote:
    A separate install database for each install location seems like the only
    workable approach.
    One minor complication of that is that, in the strictest sense, an
    "install location" isn't all that well defined. Or perhaps I should say every
    dist has 8 install locations with unspecified relationship on the
    filesystem (lib, arch, bin, script, libdoc, bindoc, libhtml, binhtml).
    In practice you may want to use lib *and* arch (they are never used
    simultaneously, but stuff in lib may be shared over different perl
    installations, contrary to arch which should be for one specific
    install).

    Leon
  • Tim Bunce at Dec 20, 2012 at 5:56 pm

    On Thu, Dec 20, 2012 at 11:42:06AM +0100, Leon Timmermans wrote:
    On Mon, Dec 17, 2012 at 9:36 AM, Tim Bunce wrote:
    A separate install database for each install location seems like the only
    workable approach.
    One minor complication of that is that, in the strictest sense, an
    "install location" isn't all that well defined. Or perhaps I should say every
    dist has 8 install locations with unspecified relationship on the
    filesystem (lib, arch, bin, script, libdoc, bindoc, libhtml, binhtml).
    In practice you may want to use lib *and* arch (they are never used
    simultaneously, but stuff in lib may be shared over different perl
    installations, contrary to arch which should be for one specific
    install).
    Good point.
    Strengthening the definition of an "install location" would be good.
    Supporting only the common subset of possible layouts seems reasonable.

    Tim.
  • Jan Dubois at Dec 28, 2012 at 1:45 am

    ---------- Forwarded message ----------
    From: Jan Dubois <jand@activestate.com>
    Date: Thu, Dec 27, 2012 at 5:40 PM
    Subject: Re: How To Build A Perl Package Database
    To: Leon Timmermans <fawaka@gmail.com>


    On Thu, Dec 20, 2012 at 2:42 AM, Leon Timmermans wrote:
    On Mon, Dec 17, 2012 at 9:36 AM, Tim Bunce wrote:
    A separate install database for each install location seems like the only
    workable approach.
    One minor complication of that is that, in the strictest sense, an
    "install location" isn't all that well defined. Or perhaps I should say every
    dist has 8 install locations with unspecified relationship on the
    filesystem (lib, arch, bin, script, libdoc, bindoc, libhtml, binhtml).
    In practice you may want to use lib *and* arch (they are never used
    simultaneously, but stuff in lib may be shared over different perl
    installations, contrary to arch which should be for one specific
    install).
    I wonder if it isn't time to deprecate all the complex install combinations
    that may have made sense when hard disk space was rather limited.

    In ActivePerl we enforce a pretty simplified install layout to be able to
    create an intuitive package manager:

    - no sharing of directories between different versions
    - every install area has its own bin, etc, lib, and html subdirectories
    - the etc subdirectory records packages installed in that location

    Here is how PPM looks (on my Mac, but it is rather similar on Windows or
    Unix systems too):
    $ ppm area
    ┌────────┬──────┬────────────────────────────────────────┐
    │ name   │ pkgs │ lib                                    │
    ├────────┼──────┼────────────────────────────────────────┤
    │ user*  │ 38   │ /Users/jan/Library/ActivePerl-5.16/lib │
    │ (site) │ n/a  │ /usr/local/ActivePerl-5.16/site/lib    │
    │ (perl) │ 245  │ /usr/local/ActivePerl-5.16/lib         │
    └────────┴──────┴────────────────────────────────────────┘

    $ ls /Users/jan/Library/ActivePerl-5.16
    bin etc html lib

    $ ppm query XML
    ┌──────────────────────┬─────────┬─────────────────────────────────────────────────────────┬──────┐
    │ name                 │ version │ abstract                                                │ area │
    ├──────────────────────┼─────────┼─────────────────────────────────────────────────────────┼──────┤
    │ XML-NamespaceSupport │ 1.11    │ a simple generic namespace support class                │ user │
    │ XML-Parser           │ 2.41-r1 │ A perl module for parsing XML documents                 │ perl │
    │ XML-SAX              │ 0.99    │ Simple API for XML                                      │ user │
    │ XML-SAX-Base         │ 1.08    │ Base class for SAX Drivers and Filters                  │ user │
    │ XML-Simple           │ 2.20    │ Easily read/write XML (esp config files)                │ perl │
    │ XML-Stream           │ 1.23    │ Creates an XML Stream connection and parses return data │ user │
    └──────────────────────┴─────────┴─────────────────────────────────────────────────────────┴──────┘
    (6 packages installed matching 'XML')

    $ ppm files XML-Stream
    /Users/jan/Library/ActivePerl-5.16/lib/XML/Stream.pm
    /Users/jan/Library/ActivePerl-5.16/lib/XML/Stream/IO/Select/Win32.pm
    /Users/jan/Library/ActivePerl-5.16/lib/XML/Stream/Namespace.pm
    /Users/jan/Library/ActivePerl-5.16/lib/XML/Stream/Node.pm
    /Users/jan/Library/ActivePerl-5.16/lib/XML/Stream/Parser.pm
    /Users/jan/Library/ActivePerl-5.16/lib/XML/Stream/Parser/DTD.pm
    /Users/jan/Library/ActivePerl-5.16/lib/XML/Stream/Tree.pm
    /Users/jan/Library/ActivePerl-5.16/lib/XML/Stream/XPath.pm
    /Users/jan/Library/ActivePerl-5.16/lib/XML/Stream/XPath/Op.pm
    /Users/jan/Library/ActivePerl-5.16/lib/XML/Stream/XPath/Query.pm
    /Users/jan/Library/ActivePerl-5.16/lib/XML/Stream/XPath/Value.pm
    /Users/jan/Library/ActivePerl-5.16/lib/auto/XML/Stream/.packlist

    $ ppm remove XML-NamespaceSupport
    XML-NamespaceSupport: required by XML-SAX
    ppm remove failed: No packages uninstalled

    Having a separate perl/bin and perl/site/bin and perl/vendor/bin is
    somewhat inconvenient for adding things to the $PATH, but it makes it
    possible to install an updated core package into the site directory, and
    later uninstall it without breaking the original core version. We don't use
    perl/vendor but instead merge all pre-installed packages into the core
    directories to keep $PATH a little shorter.

    Anyways, I just wanted to say that without putting some restrictions on how
    modules (and corresponding scripts) can be installed, creating a package
    manager would seem to get even more complex than ExtUtils::MakeMaker... :)

    Cheers,
    -Jan
  • Leon Timmermans at Dec 29, 2012 at 12:19 am

    On Fri, Dec 28, 2012 at 2:44 AM, Jan Dubois wrote:
    I wonder if it isn't time to deprecate all the complex install combinations
    that may have made sense when hard disk space was rather limited.

    In ActivePerl we enforce a pretty simplified install layout to be able to
    create an intuitive package manager:

    no sharing of directories between different versions
    every install area has its own bin, etc, lib, and html subdirectories
    the etc subdirectory records packages installed in that location
    I think that ship has already sailed, if only because different
    distributions have very different layouts.
    Anyways, I just wanted to say that without putting some restrictions on how
    modules (and corresponding scripts) can be installed, creating a package
    manager would seem to get even more complex than ExtUtils::MakeMaker...
    What kind of restrictions are you thinking about exactly?

    Leon
  • Jan Dubois at Dec 31, 2012 at 5:50 pm

    On Fri, Dec 28, 2012 at 4:18 PM, Leon Timmermans wrote:

    Anyways, I just wanted to say that without putting some restrictions on how
    modules (and corresponding scripts) can be installed, creating a package
    manager would seem to get even more complex than ExtUtils::MakeMaker...
    :)

    What kind of restrictions are you thinking about exactly?
    Mostly I would prohibit sharing of directories between Perl installations,
    and even within a single installation, the sharing of directories between
    install locations.

    E.g. the default configuration right now has $Config{installbin} and
    $Config{installsitebin} pointing to the same directory. This means that
    if you install ExtUtils::ParseXS from CPAN, you end up with the new version
    of the module in $Config{installsitelib}, but the xsubpp script installed
    into $Config{installsitebin} will overwrite the core version already in
    $Config{installbin} because they are the same directory.

    This means it is now impossible to remove the ExtUtils::ParseXS module from
    the "site" install location and revert to the core version.
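The directory collision Jan describes can be detected from %Config. This sketch just groups the bin-directory Config entries and flags any that resolve to the same path; the choice of which entries to inspect is an assumption.

```perl
use strict;
use warnings;
use Config;

# Detect script directories shared between install areas: if "core" and
# "site" bin directories are the same path, a site install of a script
# (e.g. xsubpp) silently overwrites the core copy.
my %bins = (
    core   => $Config{installbin},
    site   => $Config{installsitebin},
    vendor => $Config{installvendorbin} // '',
);
my %seen;
for my $area (sort keys %bins) {
    next unless $bins{$area};
    push @{ $seen{ $bins{$area} } }, $area;
}
for my $dir (sort keys %seen) {
    print "$dir <= @{ $seen{$dir} }\n";
    warn "shared bin directory between areas: @{ $seen{$dir} }\n"
        if @{ $seen{$dir} } > 1;
}
```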

    Even if you don't care about "delete" functionality in your package
    manager, you may still want to preserve the integrity of core install.
    Otherwise it is possible for the package manager to update a package that it
    itself relies upon, breaking the package manager. Then it is impossible
    to fix this situation for a regular user without doing a complete reinstall
    of Perl itself.

    For this reason the ActivePerl package manager explicitly removes the
    "site" directories from @INC and only uses the modules originally included
    in the distribution.

    Cheers,
    -Jan
  • Leon Timmermans at Dec 31, 2012 at 6:39 pm

    On Mon, Dec 31, 2012 at 6:50 PM, Jan Dubois wrote:
    Mostly I would prohibit sharing of directories between Perl installations,
    and even within a single installation, the sharing of directories between
    install locations.

    E.g. the default configuration right now has $Config{installbin} and
    $Config{installsitebin} pointing to the same directory. This means that
    if you install ExtUtils::ParseXS from CPAN, you end up with the new version
    of the module in $Config{installsitelib}, but the xsubpp script installed
    into $Config{installsitebin} will overwrite the core version already in
    $Config{installbin} because they are the same directory.

    This means it is now impossible to remove the ExtUtils::ParseXS module from
    the "site" install location and revert to the core version.

    Even if you don't care about "delete" functionality in your package manager,
    you may still want to preserve the integrity of core install. Otherwise it
    is possible for the package manager to update a package that it itself relies
    upon, breaking the package manager. Then it is impossible to fix this situation
    for a regular user without doing a complete reinstall of Perl itself.

    For this reason the ActivePerl package manager explicitly removes the "site"
    directories from @INC and only uses the modules originally included in the
    distribution.
    I think that would clash with most vendor distributed perls (or at
    least it does with both Debian and Red Hat). It would be nice if this
    system were able to integrate with them, instead of them nuking
    it to prevent users from doing something stupid.

    Leon
  • Demerphq at Jan 2, 2013 at 2:18 pm

    On 31 December 2012 19:38, Leon Timmermans wrote:
    On Mon, Dec 31, 2012 at 6:50 PM, Jan Dubois wrote:
    Mostly I would prohibit sharing of directories between Perl installations,
    and even within a single installation, the sharing of directories between
    install locations.

    E.g. the default configuration right now has $Config{installbin} and
    $Config{installsitebin} pointing to the same directory. This means that
    if you install ExtUtils::ParseXS from CPAN, you end up with the new version
    of the module in $Config{installsitelib}, but the xsubpp script installed
    into $Config{installsitebin} will overwrite the core version already in
    $Config{installbin} because they are the same directory.

    This means it is now impossible to remove the ExtUtils::ParseXS module from
    the "site" install location and revert to the core version.

    Even if you don't care about "delete" functionality in your package manager,
    you may still want to preserve the integrity of core install. Otherwise it
    is possible for the package manager to update a package that it itself relies
    upon, breaking the package manager. Then it is impossible to fix this situation
    for a regular user without doing a complete reinstall of Perl itself.

    For this reason the ActivePerl package manager explicitly removes the "site"
    directories from @INC and only uses the modules originally included in the
    distribution.
    I think that would clash with most vendor distributed perls (or at
    least it does with both Debian and Red Hat). It would be nice if this
    system were able to integrate with them, instead of them nuking
    it to prevent users from doing something stupid.
    FWIW I think that Perl should use one install format and the distros
    should not fight that.

    IMO a lot of MakeMakers problems come from trying to please too many
    people and ending up pleasing no one.

    Yves

    --
    perl -Mre=debug -e "/just|another|perl|hacker/"
  • Johan Vromans at Dec 20, 2012 at 4:48 pm

    Tim Bunce writes:

    A separate install database for each install location seems like the
    only workable approach.
    Store the complete distribution in a git repository?

    -- Johan
  • Philippe Bruhat (BooK) at Jan 2, 2013 at 12:23 am

    On Thu, Dec 20, 2012 at 05:48:10PM +0100, Johan Vromans wrote:
    Tim Bunce <tim.bunce@pobox.com> writes:
    A separate install database for each install location seems like the
    only workable approach.
    Store the complete distribution in a git repository?
    One issue I had when trying to store distributions with Git::CPAN::Hook is
    that if a file does not change between two versions of a distribution,
    then git won't "detect" it.

    That was an issue for me, as I tried to create special tree objects for
    each distribution, so that "install this version of the distribution"
    would basically mean "apply the patch that creates the whole tree". This
    does not work if the tree does not contain files that haven't changed
    between versions.

    Depending on what you meant, that could be an issue for you too.

    --
    Philippe Bruhat (BooK)

    When you double-cross a friend, you triple-cross yourself.
    (Moral from Groo The Wanderer #8 (Epic))
  • Johan Vromans at Jan 2, 2013 at 9:30 pm
    [Quoting Philippe Bruhat (BooK), on January 2 2013, 01:23, in "Re: How To Build A P"]
    One issue I had when trying to store distributions with
    Git::CPAN::Hook is that if a file does not change between two
    versions of a distribution, then git won't "detect" it.
    This doesn't look like a problem if the approach is to checkout a
    particular tag corresponding to a specific install.

    -- Johan
    http://johan.vromans.org/seasons_greetings.html
  • Demerphq at Dec 17, 2012 at 1:21 pm

    On 17 December 2012 01:53, Michael G Schwern wrote:
    On 2012.12.16 11:57 AM, Leon Timmermans wrote:
    I can agree with all of that. Actually, starting a discussion about
    this was on my todo-list for the last QA hackathon but I didn't get
    around to it. Ideally, it should replace not only packlists but also
    perllocal
    I was thinking about what you said about packlists, and I wonder how much
    information one could scrape out of them. Would it be enough to reconstruct
    at least that a group of files belongs to a release? That would be enough to
    be able to fully uninstall a requested module. For example, if the user asks
    to uninstall ExtUtils::MakeMaker the database could have seen that
    ExtUtils/MakeMaker.pm was in a packlist together with ExtUtils/MM_Unix.pm and
    so on and uninstall them. Probably so, given that their original purpose
    was to provide an uninstaller.

    Also what's with this .meta directory I see popping up? I missed a memo.

    This is all totally doable, and efficient enough, with a small pile of DBM
    files and Storable. Where to put the database is a bit more complicated, see
    the list of open problems below.
    Given that Storable's format isn't forward-compatible, something more
    stable such as JSON would be more appropriate.
    That's a good point about Storable. JSON requires a dependency.
    ExtUtils::Install could bundle JSON::PP, but it would be simpler to use
    Data::Dumper. It makes de/serialization faster and simpler. The main
    disadvantage is it's only readable by Perl, but that's ok since this pile of
    DBM files will be opaque to everything but the Perl API. Too much of a mess
    to contemplate otherwise.
    IMO the question is whether you want the data human readable or not.

    If you don't care then use Sereal as a replacement for Storable. Same
    feature set pretty much except it is faster and produces smaller
    output.
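Schwern's "DBM files plus a Perl-native serializer" idea could look roughly like this sketch, using SDBM_File (the one DBM core Perl reliably ships) with Data::Dumper for the values. The key and record shapes are invented for illustration; note SDBM's small per-record size limit is one reason a real design might need several DBM files.

```perl
use strict;
use warnings;
use Fcntl qw(O_RDWR O_CREAT);
use SDBM_File;
use Data::Dumper;

# Serialize a record to a compact one-line Perl literal...
sub freeze {
    local $Data::Dumper::Terse    = 1;
    local $Data::Dumper::Indent   = 0;
    local $Data::Dumper::Sortkeys = 1;
    return Dumper($_[0]);
}

# ...and thaw it by evaluating the literal (Perl-only, as noted above).
sub thaw {
    my $record = eval $_[0];
    die $@ if $@;
    return $record;
}

sub dbm_store {
    my ($file, $key, $record) = @_;
    tie my %db, 'SDBM_File', $file, O_RDWR | O_CREAT, 0644 or die "tie: $!";
    $db{$key} = freeze($record);
    untie %db;
}

sub dbm_fetch {
    my ($file, $key) = @_;
    tie my %db, 'SDBM_File', $file, O_RDWR | O_CREAT, 0644 or die "tie: $!";
    my $record = defined $db{$key} ? thaw($db{$key}) : undef;
    untie %db;
    return $record;
}
```

Swapping `freeze`/`thaw` for Sereal, as Yves suggests, would change only these two functions.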

    Yves

    --
    perl -Mre=debug -e "/just|another|perl|hacker/"
  • Eric Wilhelm at Dec 17, 2012 at 1:21 am
    A list of installed files and the contents of MYMETA should suffice. We talked about trying to fix packlists some years ago but I don't have that ref handy, sorry. OS package management isn't enough; it needs to be per @INC path, I think. -E
    --
    Sent from my Android phone with K-9.
  • David Cantrell at Jan 4, 2013 at 12:42 pm

    On Sun, Dec 16, 2012 at 08:57:48PM +0100, Leon Timmermans wrote:
    On Sat, Dec 15, 2012 at 11:59 PM, Michael G Schwern wrote:
    Storage is a problem. The only reliable "database" Perl ships with is DBM, an
    on disk hash, so we can't get too fancy. It might take several DBM files, but
    this is enough to record information and do simple queries.
    I'd just go for JSON or YAML, which are already supported by large parts
    of the toolchain.
    And the basic operations we need to support.

    * Add a release (ie. install).
    * Delete a release (and its files).
    and stuff that depends on it?
    * Delete an older version of a release (as part of install).
    Good, this covers clearing out any files that existed in the previous
    version but no longer do in the new version, and would be very useful.
    * Delete an older version of a release, only if it's in the same release
    location. This is so CPAN installs don't delete vendor installed modules.
    * Verify the files of a release.
    * List distributions/releases installed.
    * Downgrade a release to a previously installed one?
    * Rollback all modules to a point in time?

    It's not as if disk space is expensive these days.

    --
    David Cantrell | Official London Perl Mongers Bad Influence

    All principles of gravity are negated by fear
    -- Cartoon Law V
  • Michael G. Schwern at Jan 4, 2013 at 10:24 pm

    On 1/4/13 4:41 AM, David Cantrell wrote:
    On Sun, Dec 16, 2012 at 08:57:48PM +0100, Leon Timmermans wrote:
    On Sat, Dec 15, 2012 at 11:59 PM, Michael G Schwern wrote:
    Storage is a problem. The only reliable "database" Perl ships with is DBM, an
    on disk hash, so we can't get too fancy. It might take several DBM files, but
    this is enough to record information and do simple queries.
    I'd just go for JSON or YAML, which are already supported by large parts
    of the toolchain.
    For performance reasons, I wish to avoid having to slurp in a gigantic
    JSON file before doing anything.

    OTOH one could go the .git route and use the filesystem itself as a
    key/value store. /$database/$table/$key.json. That would both avoid
    DBM bugs and allow the database to be human readable.

    Great idea!
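The filesystem-as-key/value-store layout could be sketched like this, one JSON file per record at $database/$table/$key.json, using core JSON::PP. This is an assumed layout, not a spec; real code would also need to escape keys that aren't filesystem-safe.

```perl
use strict;
use warnings;
use JSON::PP;
use File::Path qw(make_path);

my $json = JSON::PP->new->canonical;

# Write one record as $database/$table/$key.json (human-readable, no DBM).
sub kv_put {
    my ($database, $table, $key, $record) = @_;
    make_path("$database/$table");
    open my $fh, '>', "$database/$table/$key.json" or die "open: $!";
    print {$fh} $json->encode($record);
    close $fh;
}

# Read it back; a missing file simply means "no such key".
sub kv_get {
    my ($database, $table, $key) = @_;
    open my $fh, '<', "$database/$table/$key.json" or return undef;
    local $/;
    return $json->decode(<$fh>);
}
```

Because each record is a separate small file, nothing gigantic has to be slurped up front, which addresses the performance worry above.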

    And the basic operations we need to support.

    * Add a release (ie. install).
    * Delete a release (and its files).
    and stuff that depends on it?
    That brings up two issues. The first is a reverse dependency database
    so you can both A) delete a release and all that depends on it but also
    B) prevent a user from deleting a release which still has dependencies.
    Thanks to MYMETA, the reverse dep database is easy to build. I don't
    recall if I had that in my list, good catch.
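The reverse-dependency map is indeed cheap to build once each installed dist's prereqs are known. In this sketch the input shape is an assumption (dist => arrayref of required dists, as one might extract from each MYMETA.json); inverting it gives both deletion cascades (A) and deletion refusal (B).

```perl
use strict;
use warnings;

# Invert a forward dependency map (dist => [dists it requires],
# e.g. extracted from MYMETA prereqs) into requirement => [dependents].
sub reverse_deps {
    my ($forward) = @_;
    my %reverse;
    for my $dist (keys %$forward) {
        push @{ $reverse{$_} }, $dist for @{ $forward->{$dist} };
    }
    return \%reverse;
}

# Case B: a dist with dependents should not be deleted.
# Returns the list of dependents (empty list means safe to delete).
sub blocked_from_delete {
    my ($reverse, $dist) = @_;
    return @{ $reverse->{$dist} || [] };
}
```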

    There's a second feature which I really like, and that's when a package
    manager can tell the difference between what was explicitly asked for
    and what was installed automatically to resolve an issue. MacPorts has
    this feature and it's handy to clean up a bloated installation.
    Unfortunately this will be difficult for ExtUtils::Install to know, but
    it's worth keeping in mind.

    * Delete an older version of a release, only if it's in the same release
    location. This is so CPAN installs don't delete vendor installed modules.
    * Verify the files of a release.
    * List distributions/releases installed.
    * Downgrade a release to a previously installed one?
    * Rollback all modules to a point in time?
    That requires keeping a record of previously installed items, which
    would be useful for many things, but potentially a PITA to get right
    because it's not simply a matter of storing a few JSON files but also
    their linkages. You could just keep a copy of everything after every
    operation, possibly using symlinks to slim down on disk space, but at a
    certain point you're creating a version control system. Pinto has been
    having similar problems.

    OTOH if it's all file-based you could use a version control system if
    available.

    Either way, worth recording the data for the future to work out what to
    do with it.
  • Adam Kennedy at Jan 4, 2013 at 11:33 pm
    I'll say it a second time...

    Packlist 2.0

    Take MYMETA, add an extra key with the list that will be installed, install
    it in the usual place as we do now.

    Package manager scans the filesystem for the packlist files.

    Might seem slow, but on SSDs scanning the filesystem like that is super
    super fast.
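The scan itself is a short File::Find walk. This sketch looks for today's .packlist files; a "Packlist 2.0" file would presumably live in the same per-dist auto/ locations and be found the same way.

```perl
use strict;
use warnings;
use File::Find;

# Walk the given install roots (e.g. @INC) collecting .packlist files.
sub find_packlists {
    my (@roots) = @_;
    my @dirs = grep { -d } @roots;
    my @found;
    return @found unless @dirs;
    find(sub { push @found, $File::Find::name if $_ eq '.packlist' }, @dirs);
    return @found;
}
```

Whether this is fast enough cold (as opposed to on a warm cache or an SSD) is exactly the question Leon raises below.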

    Adam

    On Sat, Jan 5, 2013 at 9:24 AM, Michael G. Schwern wrote:
    On 1/4/13 4:41 AM, David Cantrell wrote:
    On Sun, Dec 16, 2012 at 08:57:48PM +0100, Leon Timmermans wrote:
    On Sat, Dec 15, 2012 at 11:59 PM, Michael G Schwern wrote:
    Storage is a problem. The only reliable "database" Perl ships with is
    DBM, an on disk hash, so we can't get too fancy. It might take several
    DBM files, but this is enough to record information and do simple queries.
    I'd just go for JSON or YAML, which are already supported by large parts
    of the toolchain.
    For performance reasons, I wish to avoid having to slurp in a gigantic
    JSON file before doing anything.

    OTOH one could go the .git route and use the filesystem itself as a
    key/value store. /$database/$table/$key.json. That would both avoid
    DBM bugs and allow the database to be human readable.

    Great idea!

    And the basic operations we need to support.

    * Add a release (ie. install).
    * Delete a release (and its files).
    and stuff that depends on it?
    That brings up two issues. The first is a reverse dependency database
    so you can both A) delete a release and all that depends on it but also
    B) prevent a user from deleting a release which still has dependencies.
    Thanks to MYMETA, the reverse dep database is easy to build. I don't
    recall if I had that in my list, good catch.

    There's a second feature which I really like, and that's when a package
    manager can tell the difference between what was explicitly asked for
    and what was installed automatically to resolve an issue. MacPorts has
    this feature and it's handy to clean up a bloated installation.
    Unfortunately this will be difficult for ExtUtils::Install to know, but
    it's worth keeping in mind.

    * Delete an older version of a release, only if it's in the same release
    location. This is so CPAN installs don't delete vendor installed
    modules.
    * Verify the files of a release.
    * List distributions/releases installed.
    * Downgrade a release to a previously installed one?
    * Rollback all modules to a point in time?
    That requires keeping a record of previously installed items, which
    would be useful for many things, but potentially a PITA to get right
    because it's not simply a matter of storing a few JSON files but also
    their linkages. You could just keep a copy of everything after every
    operation, possibly using symlinks to slim down on disk space, but at a
    certain point you're creating a version control system. Pinto has been
    having similar problems.

    OTOH if it's all file-based you could use a version control system if
    available.

    Either way, worth recording the data for the future to work out what to
    do with it.
  • Philippe Bruhat (BooK) at Jan 4, 2013 at 11:48 pm

    On Sat, Jan 05, 2013 at 10:33:04AM +1100, Adam Kennedy wrote:
    I'll say it a second time...

    Packlist 2.0

    Take MYMETA, add an extra key with the list that will be installed, install
    it in the usual place as we do now.

    Package manager scans the filesystem for the packlist files.
    Or the distribution -> packlist index can be stored in a well-known location
    (top of the install directory I suppose).
    Might seem slow, but on SSDs scanning the filesystem like that is super
    super fast.
    --
    Philippe Bruhat (BooK)

    When you wander near evil, Security is only a function of foolishness...
    (Moral from Groo The Wanderer #21 (Epic))
  • Leon Timmermans at Jan 4, 2013 at 11:59 pm

    On Sat, Jan 5, 2013 at 12:33 AM, Adam Kennedy wrote:
    I'll say it a second time...

    Packlist 2.0

    Take MYMETA, add an extra key with the list that will be installed, install
    it in the usual place as we do now.

    Package manager scans the filesystem for the packlist files.

    Might seem slow,
    A cold «time echo q | instmodsh >/dev/null» takes 22 seconds on my
    local perl (hot it takes 0.8 seconds). I have no reason to assume the
    new system would be faster (it might even be slower since it has to
    parse the data). Not to mention it's trashing your caches.
    but on SSDs scanning the filesystem like that is super
    super fast.
    Maybe, but not everyone is using those…

    Leon

Discussion Overview
group: cpan-workers
categories: perl
posted: Dec 15, '12 at 10:59p
active: Jan 4, '13 at 11:59p
posts: 31
users: 13
website: cpan.org
