At the London Perl Workshop I gave a talk on the CPAN River, and how development and release practices should mature as a dist moves up river. This was prompted by the discussions we had at Berlin earlier this year.

Writing the talk prompted a bunch of ideas, one of which is having a “water quality” metric, which gives some indication of whether a dist is a good one to rely on (needs a better name). I’ve come up with a first definition, and calculated the metric for the different stages of the river:

http://neilb.org/2015/12/22/cpan-river-water-quality.html

Any thoughts on what factors should be included in such a metric? I think it should really include factors that it would be hard for anyone to argue with. Currently the individual factors are:

Not having too many CPAN Testers fails
Having a META.json or META.yml file
Specifying the min perl version required for the dist
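[Editor's note: as an illustration only — the factor names, the 20% fail-rate threshold, and the equal weighting below are assumptions, not Neil's actual definition — a score like this could be computed as the fraction of factors a dist passes:]

```python
def water_quality(dist):
    # Each factor is a boolean; the score is the fraction of factors passed.
    # Keys and the 0.2 fail-rate cutoff are illustrative assumptions.
    factors = [
        dist.get("cpan_testers_fail_rate", 1.0) <= 0.2,  # not too many CPAN Testers fails
        dist.get("has_meta_file", False),                # META.json or META.yml present
        dist.get("declares_min_perl", False),            # min perl version specified
    ]
    return sum(factors) / len(factors)

score = water_quality({
    "cpan_testers_fail_rate": 0.05,  # 5% of reports are FAILs
    "has_meta_file": True,
    "declares_min_perl": False,
})
```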

Cheers,
Neil

At some point I’ll share the slides from my talk, but slideshare doesn’t handle keynote presentations, and the exported powerpoint from keynote is broken (neither powerpoint nor slideshare can handle it!)


  • David Golden at Dec 23, 2015 at 12:17 am
    I thought the "min perl version" is a tough metric without considering what
    version of Perl it will actually run on. I would refine that metric to
    "declared min perl version >= actual perl version required". Figuring out
    the latter could perhaps be done via CPAN Testers -- if all of 5.6 fails,
    then we know it's 5.8 or better. But if there is at least one 5.6 pass,
    then it works on 5.6. And if it works on 5.6, I think omission of a
    minimum perl version is no big deal.
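    [Editor's note: David's inference rule could be sketched roughly as follows — take the lowest perl version with at least one PASS report. The `(version, grade)` pair format is an assumption for the sketch; real CPAN Testers data is richer.]

```python
def inferred_min_perl(reports):
    # reports: iterable of (perl_version, grade) pairs, e.g. ("5.8.9", "PASS").
    # The effective minimum is the lowest version with at least one PASS.
    passes = {version for version, grade in reports if grade == "PASS"}
    if not passes:
        return None
    return min(passes, key=lambda v: tuple(map(int, v.split("."))))

reports = [("5.6.2", "FAIL"), ("5.8.9", "PASS"), ("5.10.1", "PASS")]
min_perl = inferred_min_perl(reports)
# A declared minimum at or below this version is consistent with the evidence.
```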

    I don't want to see us go down the Kwalitee route where people put a minimum
    perl version of "5" or something just to get a better water quality score.

    Generally, I think some subset of the core Kwalitee metrics and some
    adaptation of your adoption criteria (e.g. time since any release by
    author) would be a place to look for "water quality" metrics. I do think
    you need to find a way to distinguish what water quality is trying to
    measure distinct from Kwalitee.

    David

    --
    David Golden <xdg@xdg.me> Twitter/IRC/Github: @xdg
  • Kenichi Ishigaki at Dec 23, 2015 at 6:46 am
    CPANdeps (http://deps.cpantesters.org) has been providing useful
    information on water quality. It might be enough to make a better or
    more opinionated presentation of it for the upriver authors. IMHO META
    files and min perl version specifications depend more on when a
    distribution was released and don't fit well as water quality metrics.

    2015-12-23 9:16 GMT+09:00 David Golden <xdg@xdg.me>:
    > I thought the "min perl version" is a tough metric without considering what
    > version of Perl it will actually run on. I would refine that metric to
    > "declared min perl version >= actual perl version required". Figuring out
    > the latter could perhaps be done via CPAN Testers -- if all of 5.6 fails,
    > then we know it's 5.8 or better. But if there is at least one 5.6 pass,
    > then it works on 5.6. And if it works on 5.6, I think omission of a
    > minimum perl version is no big deal.

    This is something I've been wishing to add to Kwalitee but haven't,
    because of the performance cost of min-version detectors.

    > I don't want to see us go down the Kwalitee route where people put a minimum
    > perl version of "5" or something just to get a better water quality score.
    >
    > Generally, I think some subset of the core Kwalitee metrics and some
    > adaptation of your adoption criteria (e.g. time since any release by author)
    > would be a place to look for "water quality" metrics. I do think you need
    > to find a way to distinguish what water quality is trying to measure
    > distinct from Kwalitee.

    +1

    Kenichi

  • Neil Bowers at Dec 24, 2015 at 9:10 am

    > CPANdeps (http://deps.cpantesters.org) has been providing useful
    > information on water quality. It might be enough to make a better or
    > opinionated presentation of it for the upriver authors. IMHO META
    > files and min version specification depends more on when a
    > distribution is released and don't well fit for water quality metrics.
    I’m not convinced on min version either, but am leaning towards including it, if we can come up with a definition that’s practical and useful.

    I think “has a META.yml or META.json” is worth keeping in, as there are a number of benefits to having one, and I suspect there’s at least some correlation between dists that don’t have a META file and dists that haven’t listed all prereqs (eg in the Makefile.PL).

    That said, I’m really just experimenting here, trying to find things that are useful indicators for whether a dist is good to rely on.

    Neil
  • Karen Etheridge at Dec 24, 2015 at 6:00 pm
    > I think “has a META.yml or META.json” is worth keeping in
    I'm surprised this one is being discussed at all. IMO, not having a META
    file should disqualify the distribution from being considered at all. At
    Berlin last year we talked about making it mandatory, and held off "for
    now" so the outliers could be fixed. Having META should be non-negotiable
    for a well-formed CPAN distribution.

  • Sawyer X at Dec 24, 2015 at 7:04 pm
    I have to agree with that, albeit probably less angry about it. :)
  • Kenichi Ishigaki at Dec 25, 2015 at 1:07 am

    2015-12-25 3:00 GMT+09:00 Karen Etheridge <perl@froods.org>:
    > > I think “has a META.yml or META.json” is worth keeping in
    > I'm surprised this one is being discussed at all. IMO, not having a META
    > file should disqualify the distribution from being considered at all. At
    > Berlin last year we talked about making it mandatory, and held off "for now"
    > so the outliers could be fixed. Having META should be non-negotiable for a
    > well-formed CPAN distribution.
    I agree with this, and if the water quality metrics are meant to be
    complete, then this one must be included, without doubt. However, I'm
    less sure it makes sense to stress it as one of a new, small set of
    water quality metrics, because only about 3 percent of distributions
    shipped in 2015 lacked a META.yml, and that percentage is getting
    smaller year by year, so there may be little room for improvement for
    most active authors. Of course, it would help older, possibly
    abandoned distributions, but that's a different story.
  • David Golden at Dec 25, 2015 at 1:22 am
    I'm mostly surprised at the way-upriver distributions that lack it. I
    wonder how many of those are dual-life dists that don't ship with the
    kind of tooling that "CPAN best practice" releases use.

    David
  • David Cantrell at Jan 4, 2016 at 3:48 pm

    On Wed, Dec 23, 2015 at 03:46:48PM +0900, Kenichi Ishigaki wrote:

    > CPANdeps (http://deps.cpantesters.org) has been providing useful
    > information on water quality.
    The main limitation to using it to judge water quality is that it only
    considers the most recent version of every dependency. That is, it
    samples the water quality once, ignoring the factory upstream that shits
    its pants a coupla times a year.

    --
    David Cantrell | London Perl Mongers Deputy Chief Heretic

          If you can't imagine how I do something, it's
          because I have a better imagination than you
  • Philippe Bruhat (BooK) at Jan 4, 2016 at 9:39 pm

    On Mon, Jan 04, 2016 at 03:47:58PM +0000, David Cantrell wrote:
    > On Wed, Dec 23, 2015 at 03:46:48PM +0900, Kenichi Ishigaki wrote:
    >
    > > CPANdeps (http://deps.cpantesters.org) has been providing useful
    > > information on water quality.
    >
    > The main limitation to using it to judge water quality is that it only
    > considers the most recent version of every dependency. That is, it
    > samples the water quality once, ignoring the factory upstream that shits
    > its pants a coupla times a year.
    Well, it tells you if there's *currently* shit in the water.

    --
      Philippe Bruhat (BooK)

      When you create a climate of peace, you have only fair weather.
      But where the climate is one of violence, it can only rain blood.
                                        (Moral from Groo The Wanderer #120 (Epic))
  • Neil Bowers at Dec 23, 2015 at 10:44 pm

    > I thought the "min perl version" is a tough metric without considering what version of Perl it will actually run on. I would refine that metric to "declared min perl version >= actual perl version required". Figuring out the latter could perhaps be done via CPAN Testers -- if all of 5.6 fails, then we know it's 5.8 or better. But if there is at least one 5.6 pass, then it works on 5.6. And if it works on 5.6, I think omission of a minimum perl version is no big deal.
    I nearly didn’t include the “min perl version” in this, as there’s no easy clean definition. As you say, using in conjunction with CPAN Testers results might produce something usable. The thing that prompted me to include it was the part of my talk about the fact that the OSes and versions of perl supported by your dist is the intersection of those your code supports and those supported by all of your upstream dependencies. Once we’re past the important events of the next few days[*], I’ll have more time to spend on this.
    > I don't want to see us go down the Kwalitee route where people put a minimum perl version of "5" or something just to get a better water quality score.

    Indeed.
    > Generally, I think some subset of the core Kwalitee metrics and some adaptation of your adoption criteria (e.g. time since any release by author) would be a place to look for "water quality" metrics. I do think you need to find a way to distinguish what water quality is trying to measure distinct from Kwalitee.
    Agreed.

    Part of my motivation for a separate and simpler measure was the feedback I had on the PR challenge, where some authors weren’t happy to get PRs that addressed failing CPANTS metrics. In general the message I got was “I agreed with some parts of kwalitee, but not other bits”. I’m hoping we can identify a small set of metrics that are hard to argue with when considering distributions that start moving up river.

    Neil

    [*] for example, taking my son to Star Wars tomorrow :-)
  • Neil Bowers at Dec 24, 2015 at 9:01 am
    Given an email I had off-list, I’ll clarify something related to the PR challenge (PRC):

    Through the year I had the occasional email from *authors* whose distributions had been assigned, and who got a PR that addressed kwalitee fails and nothing else. They weren’t happy with these PRs.

    Recently, I sent a questionnaire to all authors who’d had at least one distribution assigned in the PRC. I got quite a few more comments from authors saying that they didn’t want to get kwalitee PRs.

    On the flip-side, some *participants* got assignments where they said “the only thing I can think to do is kwalitee improvements, which I don’t want to do, so please can I have a different assignment”.

    Originally I didn’t plan to run the PRC in 2016, but enough people have asked to do it again next year that I’m now going to, but with some changes.

    In particular I’m going to email all authors with a repo and get them to opt-in, rather than the system for 2015, which was opt-out.

    Neil
  • Douglas Bell at Dec 23, 2015 at 3:59 am

    On Dec 22, 2015, at 5:05 PM, Neil Bowers wrote:

    > Any thoughts on what factors should be included in such a metric? I think it should really include factors that it would be hard for anyone to argue with. Currently the individual factors are:
    >
    > Not having too many CPAN Testers fails
    > Having a META.json or META.yml file
    > Specifying the min perl version required for the dist
    Number (and age if possible) of open tickets might show if someone's paying attention to the dist. Like David said, much like the adoption criteria. The issues don't have to be valid, they could even be spam for all it matters, as long as someone's taking care of them.
  • Neil Bowers at Dec 23, 2015 at 10:49 pm
    > Number (and age if possible) of open tickets might show if someone's paying attention to the dist. Like David said, much like the adoption criteria. The issues don't have to be valid, they could even be spam for all it matters, as long as someone's taking care of them.
    This is a tricky issue, as I found when trying to tune the adoption criteria. There are plenty of big name dists that have a lot of open issues, and always do.

    My current thought on this is that if no issues are getting dealt with in some timeframe, then it fails the metric. Even if a dist has a pile of open issues, if at least some issues are getting dealt with, then as you show, that indicates some level of maintainer engagement. That still has failure modes though: someone might have adopted a dist that they’re really not up to maintaining, so they avoid the large / scary / critical issues.

    Neil
  • Douglas Bell at Dec 23, 2015 at 11:14 pm

    On Dec 23, 2015, at 4:49 PM, Neil Bowers wrote:
    > > Number (and age if possible) of open tickets might show if someone's paying attention to the dist. Like David said, much like the adoption criteria. The issues don't have to be valid, they could even be spam for all it matters, as long as someone's taking care of them.
    > This is a tricky issue, as I found when trying to tune the adoption criteria. There are plenty of big name dists that have a lot of open issues, and always do.
    >
    > My current thought on this is that if no issues are getting dealt with in some timeframe, then it fails the metric. Even if a dist has a pile of open issues, if at least some issues are getting dealt with, then as you show, that indicates some level of maintainer engagement. That still has failure modes though: someone might have adopted a dist that they’re really not up to maintaining, so they avoid the large / scary / critical issues.
    Yes, absolute ticket count is not as good as ticket movement or churn, even if a release doesn't necessarily result. A clean river is a steady-flowing river.
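    [Editor's note: one way to sketch this "churn, not count" idea — a dist passes if *some* ticket saw activity within a recent window, regardless of how many tickets are open. The field name and the 180-day window are illustrative assumptions.]

```python
from datetime import date, timedelta

def tickets_churning(tickets, today, window_days=180):
    # Pass if any ticket has activity within the window; the total number
    # of open tickets is deliberately ignored, per the thread's suggestion.
    cutoff = today - timedelta(days=window_days)
    return any(t["last_activity"] >= cutoff for t in tickets)

today = date(2015, 12, 24)
busy_dist = [
    {"last_activity": date(2013, 1, 1)},   # old, untouched ticket
    {"last_activity": date(2015, 11, 2)},  # recently commented on
]
quiet_dist = [{"last_activity": date(2013, 1, 1)}]
```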
  • Sawyer X at Dec 24, 2015 at 10:34 am
    [top-posted]

    Further context as someone maintaining distributions with long-running
    issues. There are many reasons an issue could stay open for a long time:

    * It requires much more consideration (and could relate to multiple
    branches of reference implementation or different steps along the way)
    * It's a reminder of a very low-priority issue.
    * It's a reminder to rethink a topic.
    * It's a low-hanging fruit kept so early contributors could pick it up.
    ("Up for grabs" issue tag, for instance.)
    * It's kept until another issue is resolved.
    * It's kept for a while until the original person who opened it will
    confirm it was resolved or still exists.
    * Someone asked to handle it and they're given their time to do so
    (depending on complexity and prioritization).
    * Some PRs need - as I describe it - time to ripen. I believe whoever dealt
    with that knows what I mean.

    It's very hard to judge by issues. Perhaps comments on issues? I believe
    issues should at least be commented on (and I'm a terrible offender at
    this).


  • Helmut Wollmersdorfer at Dec 24, 2015 at 1:54 pm
    [top-posted]

    I agree with all of the reasons and could add even more.

    Measuring how issues are taken care of automatically would require
    standardized best practices, e.g.:

    - status: needs more info, not reproducible, not a bug, wishlist, new feature
    - severity: critical, important, normal, minor, cosmetic
    - priority: asap, release +x, future major release, won't solve

    A maintainer should react and take decisions in a reasonable time.


  • Adam Kennedy at Dec 23, 2015 at 5:32 am
    You could try collecting a bunch of these different metrics and then running a regression analysis against the graph-wise recursive downstream dep count for everything on CPAN, to see which metrics fall out in the real world.
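    [Editor's note: a toy version of this idea, using a plain Pearson correlation between one candidate metric and recursive downstream dependent counts. All data below is made up for illustration; a real analysis would use the full CPAN dependency graph and a proper multivariate regression.]

```python
def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length sequences.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

downstream = [1200, 300, 40, 5, 2]  # made-up recursive dependent counts
has_meta   = [1, 1, 1, 0, 0]        # made-up candidate metric values
r = pearson(has_meta, downstream)   # how well does the metric track reality?
```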

    So many times we come up with arbitrary scoring systems that don't actually match to the real things that happen in the wild.

    Adam

    Sent from my iPhone
  • Neil Bowers at Dec 23, 2015 at 11:11 pm

    > You could try collecting up a bunch of these different metrics and then run a regression analysis against the graph wise recursive downstream dep count for everything on CPAN and see which metrics fall out in the real world.
    I might have a dabble at this, perhaps roping in help from someone more mathematically, er rigorous, than me.
    > So many times we come up with arbitrary scoring systems that don't actually match to the real things that happen in the wild.
    Having played with various scoring metrics, the one I use for CPAN Testers seems to be pretty reliable, and good for this purpose. A CPAN Testers fail for one of your upstream dependencies could indicate someone unable to install your dist.

    The other measure that worked well for the adoption criteria is the bug scoring: basically have multiple bugs (not wishes) been raised since the last release, and was that last release more than N months ago. That basically indicates that people are using it, but there doesn’t appear to be an engaged maintainer.
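    [Editor's note: a minimal sketch of that bug-scoring rule, with assumed thresholds — flag a dist when two or more bugs (not wishlist items) have been filed since the last release and that release is more than N months old.]

```python
from datetime import date

def looks_unmaintained(last_release, bugs_since_release, today, months=12):
    # Bugs accumulating against a stale release suggest users without
    # an engaged maintainer. Thresholds (2 bugs, 12 months) are assumed.
    age_months = ((today.year - last_release.year) * 12
                  + (today.month - last_release.month))
    return bugs_since_release >= 2 and age_months > months
```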

    Neil
  • Tim Bunce at Dec 23, 2015 at 6:47 pm

    On Tue, Dec 22, 2015 at 11:05:03PM +0000, Neil Bowers wrote:
    > At the London Perl Workshop I gave a talk on the CPAN River, and how development and release practices
    > should mature as a dist moves up river. This was prompted by the discussions we had at Berlin earlier
    > this year.
    >
    > Writing the talk prompted a bunch of ideas, one of which is having a “water quality” metric, which
    > gives some indication of whether a dist is a good one to rely on (needs a better name).
    I've no idea if this is useful, but https://en.wikipedia.org/wiki/Turbidity says

         Turbidity is the cloudiness or haziness of a fluid caused by large
         numbers of individual particles that are generally invisible to the
         naked eye, similar to smoke in air. The measurement of turbidity is a
         key test of water quality.

    Maybe someone can riff on that, or https://en.wikipedia.org/wiki/Water_quality

    Tim.
  • Shlomi Fish at Dec 23, 2015 at 8:52 pm
    Hi Neil,

    happy holidays.

    I had put my thoughts about this in my CPAN Module-Rank document as part of the
    Rethinking-CPAN initiative:

    * https://bitbucket.org/shlomif/rethinking-cpan/src/a4c4eec3d769089a6a2664649f103a16e9690a8b/CPAN-Module-Rank/cpan-module-rank.pod?at=default&fileviewer=file-view-default

    Regards,

      Shlomi Fish


    --
    -----------------------------------------------------------------
    Shlomi Fish

    There is an IGLU Cabal, but its only purpose is to deny the existence of an
    IGLU Cabal.
         — Martha Greenberg

Discussion Overview
group: cpan-workers
categories: perl
posted: Dec 22, '15 at 11:05p
active: Jan 4, '16 at 9:39p
posts: 21
users: 12
website: cpan.org
