Jos I. Boumans wrote:
> that a Very Large Number as a minimum version is said to be higher than
> whatever version it finds in the installed version of a Module on the OS.
So if I gave you a special version object that is guaranteed to be larger than
any other version object, you would be happy? Not saying that I know how to do
that, but if I get clever that may be a way to approach this corner case.
There is no generic fix possible, since the size of the largest IV
varies depending on the system architecture. And no, I am not
interested in changing the internal representation to UVs (for one
thing, that would just change where the overflow happens).
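To make the architecture dependence concrete, here is a small sketch (not part of version.pm's API) of how a script can discover this perl's largest IV. It relies on `~0` being UV_MAX, so shifting right one bit yields IV_MAX exactly, without floating-point rounding:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Config;

# ~0 is UV_MAX (all bits set); shifting right one bit gives IV_MAX exactly.
# On a 64-bit perl this is 9223372036854775807; on a 32-bit perl, 2147483647.
my $iv_max = ~0 >> 1;

# $Config{ivsize} is the size of an IV in bytes (4 or 8).
printf "IVs are %d bytes wide; IV_MAX is %s\n", $Config{ivsize}, $iv_max;
```

Any version component larger than that value cannot be stored in an IV on that build, which is why no single hard-coded cutoff works everywhere.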
> A few thoughts:
>
> 1) Wrap it in an eval
It's written in XS (the pure-Perl implementation exists only for
compatibility purposes); it is also in the v5.10.0 core as C code
(adapted from the XS).
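For reference, suggestion (1) would look something like the sketch below. The version string used here is made up for illustration, and, per the point above, this only helps if the overflow actually dies cleanly rather than corrupting state inside the XS/C code:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use version;    # provides qv()

# Attempt to build a version object from an absurdly large component,
# trapping any die() from the parser with a block eval.
my $v = eval { qv('1.2.3') };    # a parse that is expected to succeed

if ( !defined $v || $@ ) {
    warn "could not parse version: $@";
    $v = qv('0');                # hypothetical fallback, for illustration
}

print "got version ", $v->normal, "\n";
```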
> 2) Ask the system what the biggest IV is and then compare it to
> the input passed to qv().
The problem isn't just that the number passed in is too large; there
are also some internal mathematical operations during creation of the
version object that can cause the term to blow up (that's where the
test is actually firing).
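Suggestion (2) could still catch the simple case of an oversized input component. A minimal sketch, using string comparison so the check itself never performs the overflowing arithmetic (the helper name is made up for illustration; and as noted above, this cannot guard the parser's internal math):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $IV_MAX = ~0 >> 1;    # largest IV on this build of perl

# Hypothetical guard: true if a decimal version component fits in an IV.
# Compares by digit count first, then lexically, so we never overflow.
sub component_fits_iv {
    my ($n) = @_;
    my $max = "$IV_MAX";
    return length($n) < length($max)
        || ( length($n) == length($max) && $n le $max );
}

print component_fits_iv("12345")                ? "ok\n" : "too big\n";   # ok
print component_fits_iv("99999999999999999999") ? "ok\n" : "too big\n";   # too big
```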
> I find it hard to believe that the only course of action is to let
> the program abruptly end.
FWIW, that code is actually inherited directly from the original v-string
tokenizing code. I'm not sure what the most appropriate course of action
would be, since overflowing your number is "a very bad thing" and the code
cannot be trusted at all at that point. What should happen when the input
value is too large to store?

