FAQ
What's better about Ruby than Python? I'm sure there's something. What is
it?

This is not a troll. I'm language shopping and I want people's answers. I
don't know beans about Ruby or have any preconceived ideas about it. I have
noticed, however, that every programmer I talk to who's aware of Python is
also talking about Ruby. So it seems that Ruby has the potential to compete
with and displace Python. I'm curious on what basis it might do so.

--
Cheers, www.3DProgrammer.com
Brandon Van Every Seattle, WA

20% of the world is real.
80% is gobbledygook we make up inside our own heads.


  • Raymond Hettinger at Aug 18, 2003 at 2:03 am
    "Brandon J. Van Every" <vanevery at 3DProgrammer.com> wrote in message
    news:bhpbc6$1plpe$1 at ID-203719.news.uni-berlin.de...
    What's better about Ruby than Python? I'm sure there's something. What is
    it?
    Code blocks, automatic properties, continuations,
    and excellent Japanese documentation ;-)


    Raymond Hettinger
  • Erik Max Francis at Aug 18, 2003 at 2:59 am

    "Brandon J. Van Every" wrote:

    What's better about Ruby than Python? I'm sure there's something.
    What is
    it?
    Wouldn't it make much more sense to ask Ruby people this, rather than
    Python people?

    --
    Erik Max Francis && max at alcyone.com && http://www.alcyone.com/max/
    __ San Jose, CA, USA && 37 20 N 121 53 W && &tSftDotIotE
    / \ Never had very much to say / Laugh last, laugh longest
    \__/ Des'ree
  • Asun Friere at Aug 18, 2003 at 7:21 am
    Erik Max Francis <max at alcyone.com> wrote in message news:<3F4040F8.87901505 at alcyone.com>...
    "Brandon J. Van Every" wrote:
    What's better about Ruby than Python? I'm sure there's something.
    What is
    it?
    Wouldn't it make much more sense to ask Ruby people this, rather than
    Python people?
    Maybe he can't speak Japanese?
  • Brandon J. Van Every at Aug 18, 2003 at 7:38 am

    Asun Friere wrote:
    Erik Max Francis <max at alcyone.com> wrote in message
    news:<3F4040F8.87901505 at alcyone.com>...
    "Brandon J. Van Every" wrote:
    What's better about Ruby than Python? I'm sure there's something.
    What is
    it?
    Wouldn't it make much more sense to ask Ruby people this, rather than
    Python people?
    Maybe he can't speak Japanese?
    I can't actually. Didn't know that Ruby was Nippocentric. And no, it
    wouldn't make "more sense" to ask the Ruby people this. They are going to
    give an answer that's biased from the Ruby perspective. For purposes of
    this post, I'm interested in the Python biased perspective.

    --
    Cheers, www.3DProgrammer.com
    Brandon Van Every Seattle, WA

    20% of the world is real.
    80% is gobbledygook we make up inside our own heads.
  • Alex Martelli at Aug 18, 2003 at 3:26 pm

    Erik Max Francis wrote:

    "Brandon J. Van Every" wrote:
    What's better about Ruby than Python? I'm sure there's something.
    What is it?
    Wouldn't it make much more sense to ask Ruby people this, rather than
    Python people?
    Might, or might not, depending on one's purposes -- for example, if
    one's purposes include a "sociological study" of the Python community,
    then putting questions to that community is likely to prove more
    revealing of information about it, than putting them elsewhere:-).

    Personally, I gladly took the opportunity to follow Dave Thomas'
    one-day Ruby tutorial at last OSCON. Below a thin veneer of syntax
    differences, I find Ruby and Python amazingly similar -- if I was
    computing the minimum spanning tree among just about any set of
    languages, I'm pretty sure Python and Ruby would be the first two
    leaves to coalesce into an intermediate node:-).

    Sure, I do get weary, in Ruby, of typing the silly "end" at the end
    of each block (rather than just unindenting) -- but then I do get
    to avoid typing the equally-silly ':' which Python requires at the
    _start_ of each block, so that's almost a wash:-). Other syntax
    differences such as '@foo' versus 'self.foo', or the higher significance
    of case in Ruby vs Python, are really just about as irrelevant to me.

    Others no doubt base their choice of programming languages on just
    such issues, and they generate the hottest debates -- but to me that's
    just an example of one of Parkinson's Laws in action (the amount of
    debate on an issue is inversely proportional to the issue's actual
    importance).

    One syntax difference that I do find important, and in Python's
    favour -- but other people will no doubt think just the reverse --
    is "how do you call a function which takes no parameters". In
    Python (like in C), to call a function you always apply the
    "call operator" -- trailing parentheses just after the object
    you're calling (inside those trailing parentheses go the args
    you're passing in the call -- if you're passing no args, then
    the parentheses are empty). This leaves the mere mention of
    any object, with no operator involved, as meaning just a
    reference to the object -- in any context, without special
    cases, exceptions, ad-hoc rules, and the like. In Ruby (like
    in Pascal), to call a function WITH arguments you pass the
    args (normally in parentheses, though that is not invariably
    the case) -- BUT if the function takes no args then simply
    mentioning the function implicitly calls it. This may meet
    the expectations of many people (at least, no doubt, those
    whose only previous experience of programming was with Pascal,
    or other languages with similar "implicit calling", such as
    Visual Basic) -- but to me, it means the mere mention of an
    object may EITHER mean a reference to the object, OR a call
    to the object, depending on the object's type -- and in those
    cases where I can't get a reference to the object by merely
    mentioning it I will need to use explicit "give me a reference
    to this, DON'T call it!" operators that aren't needed otherwise.
    I feel this impacts the "first-classness" of functions (or
    methods, or other callable objects) and the possibility of
    interchanging objects smoothly. Therefore, to me, this specific
    syntax difference is a serious black mark against Ruby -- but
    I do understand why others would think otherwise, even though
    I could hardly disagree more vehemently with them:-).
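To see the difference concretely, here is a minimal sketch in modern Python (the function name `greet` is made up for illustration): merely mentioning a function yields a reference, while the trailing parentheses perform the call.

```python
def greet():
    return "hello"

f = greet           # mere mention: a reference to the function object, no call
print(f is greet)   # True -- both names refer to the same object
print(f())          # the trailing parentheses are the call operator: "hello"
```

In Ruby, by contrast, the bare mention `greet` would already perform the call, and getting a reference instead requires an explicit operator such as `method(:greet)`.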

    Below the syntax, we get into some important differences in
    elementary semantics -- for example, strings in Ruby are
    mutable objects (like in C++), while in Python they are not
    mutable (like in Java, or I believe C#). Again, people who
    judge primarily by what they're already familiar with may
    think this is a plus for Ruby (unless they're familiar with
    Java or C#, of course:-). Me, I think immutable strings are
    an excellent idea (and I'm not surprised that Java, independently
    I think, reinvented that idea which was already in Python), though
    I wouldn't mind having a "mutable string buffer" type as well
    (and ideally one with better ease-of-use than Java's own
    "string buffers"); and I don't give this judgment because of
    familiarity -- before studying Java, apart from functional
    programming languages where _all_ data are immutable, all the
    languages I knew had mutable strings -- yet when I first saw
    the immutable-string idea in Java (which I learned well before
    I learned Python), it immediately struck me as excellent, a
    very good fit for the reference-semantics of a higher level
    programming language (as opposed to the value-semantics that
    fit best with languages closer to the machine and farther from
    applications, such as C) with strings as a first-class, built-in
    (and pretty crucial) data type.
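A small sketch of what this immutability means in practice, using `io.StringIO` as one possible stand-in for the "mutable string buffer" wished for above:

```python
import io

s = "ciao"
try:
    s[0] = "C"                # item assignment on an immutable str fails
except TypeError as exc:
    print("immutable:", exc)

# one possible "mutable string buffer": accumulate, then snapshot
buf = io.StringIO()
buf.write("ciao")
buf.write("!")
print(buf.getvalue())         # ciao!
```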

    Ruby does have some advantages in elementary semantics -- for
    example, the removal of Python's "lists vs tuples" exceedingly
    subtle distinction. But mostly the score (as I keep it, with
    simplicity a big plus and subtle, clever distinctions a notable
    minus) is against Ruby (e.g., having both closed and half-open
    intervals, with the notations a..b and a...b [anybody wants
    to claim that it's _obvious_ which is which?-)], is silly --
    IMHO, of course!). Again, people who consider having a lot of
    similar but subtly different things at the core of a language
    a PLUS, rather than a MINUS, will of course count these "the
    other way around" from how I count them:-).
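Python, for comparison, uses a single half-open convention everywhere (ranges and slices both exclude the endpoint), which is one reason the closed-vs-half-open pair above stands out; a quick sketch:

```python
print(list(range(1, 4)))      # [1, 2, 3] -- the endpoint 4 is excluded
s = "abcdef"
print(s[1:4])                 # "bcd" -- slices follow the same convention
# half-open intervals compose cleanly: a split at any k loses nothing
assert s[:3] + s[3:] == s
```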

    Don't be misled by these comparisons into thinking the two
    languages are _very_ different, mind you. They aren't. But
    if I'm asked to compare "capelli d'angelo" to "spaghettini",
    after pointing out that these two kinds of pasta are just
    about indistinguishable to anybody and interchangeable in any
    dish you might want to prepare, I would then inevitably have
    to move into microscopic examination of how the lengths and
    diameters imperceptibly differ, how the ends of the strands
    are tapered in one case and not in the other, and so on -- to
    try and explain why I, personally, would rather have capelli
    d'angelo as the pasta in any kind of broth, but would prefer
    spaghettini as the pastasciutta to go with suitable sauces for
    such long thin pasta forms (olive oil, minced garlic, minced
    red peppers, and finely ground anchovies, for example - but if
    you sliced the garlic and peppers instead of mincing them, then
    you should choose the sounder body of spaghetti rather than the
    thinner evanescence of spaghettini, and would be well advised
    to forego the anchovies and add instead some fresh spring basil
    [or even -- I'm a heretic...! -- light mint...] leaves -- at
    the very last moment before serving the dish). Ooops, sorry,
    it shows that I'm traveling abroad and haven't had pasta for
    a while, I guess. But the analogy is still pretty good!-)

    So, back to Python and Ruby, we come to the two biggies (in
    terms of language proper -- leaving the libraries, and other
    important ancillaries such as tools and environments, how to
    embed/extend each language, etc, etc, out of it for now -- they
    wouldn't apply to all IMPLEMENTATIONS of each language anyway,
    e.g., Jython vs Classic Python being two implementations of
    the Python language!):

    1. Ruby's iterators and codeblocks vs Python's iterators
    and generators;

    2. Ruby's TOTAL, unbridled "dynamicity", including the ability
    to "reopen" any existing class, including all built-in ones,
    and change its behavior at run-time -- vs Python's vast but
    _bounded_ dynamicity, which never changes the behavior of
    existing built-in classes and their instances.
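For concreteness, Python's side of point 1 looks like this (a minimal generator sketch; Ruby would typically express the same loop by passing a code block to an iterator method such as `3.downto(1)`):

```python
def countdown(n):
    # a generator: each yield suspends the function and hands a value out
    while n > 0:
        yield n
        n -= 1

for i in countdown(3):
    print(i)                  # 3, then 2, then 1
```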

    Personally, I consider [1] a wash (the differences are so
    deep that I could easily see people hating either approach
    and revering the other, but on MY personal scales the pluses
    and minuses just about even up); and [2] a crucial issue --
    one that makes Ruby much more suitable for "tinkering", BUT
    Python equally more suitable for use in large production
    applications. It's funny, in a way, because both languages
    are so MUCH more dynamic than most others, that in the end
    the key difference between them from my POV should hinge on
    that -- that Ruby "goes to eleven" in this regard (the
    reference here is to "Spinal Tap", of course). In Ruby,
    there are no limits to my creativity -- if I decide that
    all string comparisons must become case-insensitive, _I CAN
    DO THAT_! I.e., I can dynamically alter the built-in string
    class so that
        a = "Hello World"
        b = "hello world"
        if a == b
          print "equal!\n"
        else
          print "different!\n"
        end
    WILL print "equal". In Python, there is NO way I can do
    that. For the purposes of metaprogramming, implementing
    experimental frameworks, and the like, this amazing dynamic
    ability of Ruby is _extremely_ appealing. BUT -- if we're
    talking about large applications, developed by many people
    and maintained by even more, including all kinds of libraries
    from diverse sources, and needing to go into production in
    client sites... well, I don't WANT a language that is QUITE
    so dynamic, thank you very much. I loathe the very idea of
    some library unwittingly breaking other unrelated ones that
    rely on those strings being different -- that's the kind of
    deep and deeply hidden "channel", between pieces of code that
    LOOK separate and SHOULD BE separate, that spells d-e-a-t-h
    in large-scale programming. By letting any module affect the
    behavior of any other "covertly", the ability to mutate the
    semantics of built-in types is just a BAD idea for production
    application programming, just as it's cool for tinkering.
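The Python side of this boundary can be sketched directly: the built-in str type refuses modification at run-time, and the closest you can get is an explicit subclass whose changed behavior stays confined to its own instances (the class name `IStr` is made up for illustration):

```python
# the built-in type itself cannot be altered at run-time:
try:
    str.__eq__ = lambda a, b: a.lower() == b.lower()
except TypeError as exc:
    print("rejected:", exc)

# bounded alternative: the change only affects explicit IStr instances
class IStr(str):
    def __eq__(self, other):
        return self.lower() == str(other).lower()
    __hash__ = str.__hash__   # keep IStr usable in dicts and sets

print(IStr("Hello World") == "hello world")   # True
print("Hello World" == "hello world")         # False -- plain str is unchanged
```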

    If I had to use Ruby for such a large application, I would
    try to rely on coding-style restrictions, lots of tests (to
    be rerun whenever ANYTHING changes -- even what should be
    totally unrelated...), and the like, to prohibit use of this
    language feature. But NOT having the feature in the first
    place is even better, in my opinion -- just as Python itself
    would be an even better language for application programming
    if a certain number of built-ins could be "nailed down", so
    I KNEW that, e.g., len("ciao") is 4 (rather than having to
    worry subliminally about whether somebody's changed the
    binding of name 'len' in the __builtins__ module...). I do
    hope that eventually Python does "nail down" its built-ins.
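The worry can be demonstrated concretely: rebinding the shared built-in name is perfectly legal in Python today, which is exactly why "nailing down" built-ins appeals. A minimal sketch (modern Python spells the module `builtins`; the demo restores the binding afterwards):

```python
import builtins

real_len = builtins.len
print(len("ciao"))             # 4

builtins.len = lambda obj: 42  # any module can rebind the shared built-in...
print(len("ciao"))             # 42 -- silently wrong for every other module too

builtins.len = real_len        # ...so a responsible demo restores it
print(len("ciao"))             # 4 again
```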

    But the problem's minor, since rebinding built-ins is a deprecated
    as well as rare practice in Python. In Ruby,
    it strikes me as major -- just like the _too powerful_ macro
    facilities of other languages (such as, say, Dylan) present
    similar risks in my own opinion (I do hope that Python never
    gets such a powerful macro system, no matter the allure of
    "letting people define their own domain-specific little
    languages embedded in the language itself" -- it would, IMHO,
    impair Python's wonderful usefulness for application
    programming, by presenting an "attractive nuisance" to the
    would-be tinkerer who lurks in every programmer's heart...).


    Alex
  • Borcis at Aug 18, 2003 at 4:08 pm

    Alex Martelli wrote:
    Me, I think immutable strings are
    an excellent idea (and I'm not surprised that Java, independently
    I think, reinvented that idea which was already in Python)
    IIRC Applesoft Basic on the Apple II had immutable strings,
    way back in the late seventies.
  • Alexander Schmolck at Aug 18, 2003 at 4:30 pm

    Alex Martelli <aleaxit at yahoo.com> writes:

    (I do hope that Python never gets such a powerful macro system, no matter
    the allure of "letting people define their own domain-specific little
    languages embedded in the language itself" -- it would, IMHO, impair
    Python's wonderful usefulness for application programming, by presenting an
    "attractive nuisance" to the would-be tinkerer who lurks in every
    programmer's heart...).
    I don't think a powerful but potentially dangerous feature poses much of a
    problem as long as there is little danger of inadvertently using it and little
    incentive to inappropriately use it (e.g. you could do all sorts of stupid
    things in python, like redefining __builtins__.len, but generally there isn't
    much of an incentive to do so, so many of the reasons why e.g. Java
    programmers might think python an unsuitable language for larger projects
    don't really apply in practice. My feeling is this is *not* true to the same
    extent for Ruby, where similar things are both encouraged and, if I don't
    misremember, can happen inadvertently).

    Would you still have a problem with macros in python if utilizing them
    required some obvious and explicit mechanism (say a 'use_custom_syntax'
    statement right at the beginning of a module that wants to use macros), so
    that their use could easily be controlled by e.g. project managers?

    'as
  • Brandon J. Van Every at Aug 18, 2003 at 6:30 pm

    Alexander Schmolck wrote:
    Would you still have a problem with macros in python if utilizing them
    required some obvious and explicit mechanism (say a
    'use_custom_syntax' statement right at the beginning of a module that
    wants to use macros), so that their use could easily be controlled by
    e.g. project managers?
    Yes you would. In open source communities, you'd get different
    philosophical camps, and people in one camp would embed 'use_custom_syntax'
    in some *.h file (yeah yeah I'm a C++ programmer) that all the other *.h
    files use. When you grab that code, you're not really going to want to do
    the work of making more specific 'use_custom_syntax' directives, as it would
    break in all sorts of subtle ways. So now the 'culture of feature'
    supplants the culture of control. The same could happen in a big commercial
    project, for that matter. Thus it may be wise not to allow this kind of
    customization at all.

    --
    Cheers, www.3DProgrammer.com
    Brandon Van Every Seattle, WA

    20% of the world is real.
    80% is gobbledygook we make up inside our own heads.
  • Alexander Schmolck at Aug 18, 2003 at 10:51 pm

    "Brandon J. Van Every" <vanevery at 3DProgrammer.com> writes:

    Alexander Schmolck wrote:
    Would you still have a problem with macros in python if utilizing them
    required some obvious and explicit mechanism (say a
    'use_custom_syntax' statement right at the beginning of a module that
    wants to use macros), so that their use could easily be controlled by
    e.g. project managers?
    Yes you would. In open source communities, you'd get different
    philosophical camps, and people in one camp would embed 'use_custom_syntax'
    in some *.h file (yeah yeah I'm a C++ programmer) that all the other *.h
    files use.
    In the proposed scheme every module that wants to use custom syntax has to say
    so, period. So there is no silent infestation (these syntaxes can't be
    stealthily exported).
    When you grab that code, you're not really going to want to do the work of
    making more specific 'use_custom_syntax' directives, as it would break in
    all sorts of subtle ways. So now the 'culture of feature' supplants the
    culture of control. The same could happen in a big commercial project, for
    that matter. Thus it may be wise not to allow this kind of customization at
    all.
    I don't really understand what you're saying. To reiterate my point: the
    language could force every module that wants to use some custom syntax
    internally to explicitly say so at its beginning. It would be easy (from a
    technical perspective) for project managers (open source or otherwise) to only
    allow certain, particularly trusted programmers to create such modules or ban
    them outright. Or allow the use of certain custom syntaxes but only approved
    ones (created by experienced programmers) and only in special situations.

    Just like you wouldn't allow every second rate programmer to write the
    system-critical libraries everything is built upon, you wouldn't allow just
    about anybody to create their own mini-languages without a need. OTOH if that
    need arises, you can get some good programmers to design the necessary syntax
    transformations and I can't really see how this would be worse than having
    them write some ad-hoc interpreter from scratch which likely will suffer from
    an inferior design and worse reliability and performance.

    'as
  • Brandon J. Van Every at Aug 19, 2003 at 6:25 am

    Alexander Schmolck wrote:
    "Brandon J. Van Every" <vanevery at 3DProgrammer.com> writes:
    Alexander Schmolck wrote:
    Would you still have a problem with macros in python if utilizing
    them required some obvious and explicit mechanism (say a
    'use_custom_syntax' statement right at the beginning of a module
    that wants to use macros), so that their use could easily be
    controlled by e.g. project managers?
    Yes you would. In open source communities, you'd get different
    philosophical camps, and people in one camp would embed
    'use_custom_syntax' in some *.h file (yeah yeah I'm a C++
    programmer) that all the other *.h files use.
    In the proposed scheme every module that wants to use custom syntax
    has to say so, period. So there is no silent infestation (these
    syntaxes can't be stealthily exported).
    I wasn't talking about stealth, I was talking about developmental inertia.
    The least common denominator is functionality turned ON in some root *.h
    file, and everyone else inherits the functionality. Are you saying that
    modules in Python don't inherit?

    If you don't want the functionality to end up being "always ON" by fait
    accompli, then you must ensure that functionality is irrevocably, "always
    OFF." In other words, don't provide the functionality.
    It would be
    easy (from a technical perspective) for project managers (open source
    or otherwise) to only allow certain, particularly trusted programmers
    to create such modules or ban them outright.
  • Harry George at Aug 18, 2003 at 7:27 pm

    Alexander Schmolck <a.schmolck at gmx.net> writes:

    Alex Martelli <aleaxit at yahoo.com> writes:
    (I do hope that Python never gets such a powerful macro system, no matter
    the allure of "letting people define their own domain-specific little
    languages embedded in the language itself" -- it would, IMHO, impair
    Python's wonderful usefulness for application programming, by presenting an
    "attractive nuisance" to the would-be tinkerer who lurks in every
    programmer's heart...).
    I don't think a powerful but potentially dangerous feature poses much of a
    problem as long as there is little danger of inadvertently using it and little
    incentive to inappropriately use it (e.g. you could do all sorts of stupid
    things in python, like redefining __builtins__.len, but generally there isn't
    much of an incentive to do so, so many of the reasons why e.g. Java
    programmers might think python an unsuitable language for larger projects
    don't really apply in practice. My feeling is this is *not* true to the same
    extent for Ruby, where similar things are both encouraged and, if I don't
    misremember, can happen inadvertently).

    Would you still have a problem with macros in python if utilizing them
    required some obvious and explicit mechanism (say a 'use_custom_syntax'
    statement right at the beginning of a module that wants to use macros), so
    that their use could easily be controlled by e.g. project managers?

    'as
    Yes, it is a problem under any circumstances.

    "The first step in writing a macro is to recognize that every time you
    write one, you are defining a new language". P. Norvig, "Paradigms of
    Artificial Intelligence Programming", c 1992, pg 66.

    In the Lisp world, you use the hundreds of macros in CL because they
    *are* the language. But home-grown (or vendor-supplied) macros are
    basically a lock-in mechanism. New syntax, new behavior to learn, and
    very little improvement in readability or efficiency of expression
    (the common rationales for macros).

    The python language is just fine as is. If you really, really need
    something like a macro, consider a template body which is filled in
    and exec'd or eval'd at run time.

    --
    harry.g.george at boeing.com
    6-6M31 Knowledge Management
    Phone: (425) 342-5601
  • Alexander Schmolck at Aug 18, 2003 at 10:37 pm

    Harry George <harry.g.george at boeing.com> writes:
    In the Lisp world, you use the hundreds of macros in CL because they
    *are* the language. But home-grown (or vendor-supplied) macros are
    basically a lock-in mechanism. New syntax, new behavior to learn, and
    very little improvement in readability or efficiency of expression
    (the common rationales for macros).
    How can you claim with a straight face that the sophisticated object, logic
    programming, constraint programming, lazy evaluation etc systems people have
    developed in scheme and CL over the years have brought "very little
    improvement in readability or efficiency"?
    The python language is just fine as is.
    No it isn't. Like every other language I know python sucks in a variety of
    ways (only on the whole, much less so), but I don't claim I know how to fix
    this with a macro system. I'm just not sure I buy Alex's argument that an
    introduction of something equivalent in expressive power to say CL's macro
    system would immediately wreck the language.

    The trick with adding expressiveness is doing it in a manner that doesn't
    invite abuse. Python is doing pretty well in this department so far; I think
    it is easily more expressive than Java, C++ and Perl and still causes less
    headache (Perl comes closest, but at the price of causing even greater
    headache than C++, if that's possible).
    If you really, really need something like a macro, consider a template body
    which is filled in and exec'd or eval'd at run time.
    I've actually written a library where most of the code is generated like this
    (and it works fine, because only trivial code transformations are needed that
    can be easily accommodated by simple templating (no parsing/syntax tree
    manipulations necessary)).

    But show me how to write something like CL's series package that way (or
    better yet, something similar for transforming array and matrix manipulations
    from some reader-friendly representation into something efficient).


    'as
  • Andrew Dalke at Aug 19, 2003 at 1:11 am

    Alexander Schmolck:
    No it isn't. Like every other language I know python sucks in a variety of
    ways (only on the whole, much less so), but I don't claim I know how to fix
    this with a macro system.
    What about the other way around? Make a macro for Lisp or
    Scheme which converts Python into the language then evals
    the result?

    Given how easy it is to parse Python (there are several Python
    parsers for Python) and the number of people who have popped
    up with Lisp background, I'm surprised no one has done that
    for fun. After all, there is Python for C, Java, .Net, and for
    Python (PyPy) and variations like Pyrex and Vyper. But
    none for Lisp?

    (I think I remember mention of one some years ago, .. I think
    *I* posted that link to c.l.py, but I don't remember when and
    can't find it via Google.)
    But show me how to write something like CL's series package that way (or
    better yet, something similar for transforming array and matrix
    manipulations
    from some reader-friendly representation into something efficient).
    The Boost code for C++ suggests a different way to do the latter.
    (I don't think templates are the same as hygienic macros.)

    Andrew
    dalke at dalkescientific.com
  • Alexander Schmolck at Aug 19, 2003 at 10:32 pm

    "Andrew Dalke" <adalke at mindspring.com> writes:

    Alexander Schmolck:
    No it isn't. Like every other language I know python sucks in a variety of
    ways (only on the whole, much less so), but I don't claim I know how to fix
    this with a macro system.
    What about the other way around? Make a macro for Lisp or
    Scheme which converts Python into the language then evals
    the result?
    Actually, at least one person has been working on this for scheme (I've never
    heard about it again and he targeted about the most useless scheme
    implementation around).

    One thing that makes such an attempt fairly unattractive for anyone with
    finite amounts of time is that Python isn't that much use without its
    supporting C/C++ modules, and schemes/lisps suck in the FFI department (every
    lisp/scheme has its own way of interfacing to C).
    Given how easy it is to parse Python (there are several Python
    parsers for Python) and the number of people who have popped
    up with Lisp background, I'm surprised no one has done that
    for fun. After all, there is Python for C, Java, .Net, and for
    Python (PyPy) and variations like Pyrex and Vyper. But
    none for Lisp?
    Would certainly be interesting.
    (I think I remember mention of one some years ago, .. I think
    *I* posted that link to c.l.py, but I don't remember when and
    can't find it via Google.)
    But show me how to write something like CL's series package that way (or
    better yet, something similar for transforming array and matrix
    manipulations
    from some reader-friendly representation into something efficient).
    The Boost code for C++ suggests a different way to do the latter.
    (I don't think templates are the same as hygienic macros.)
    Could you say a little bit more about it? In Python I think one could to some
    extent use operator overloading and a special expression class that
    simplifies the literal expression the programmer stated, but I guess one
    would then need to explicitly request evaluation (and operator overloading
    isn't quite powerful enough to easily incorporate 'alien' class-instances, too).

    Is the C++ code something along those lines, or different?


    'as
  • Andrew Dalke at Aug 19, 2003 at 11:30 pm

    Alexander Schmolck:
    One thing that makes such an attempt fairly unattractive for anyone with
    finite amounts of time is that Python isn't that much use without its
    supporting C/C++ modules, and schemes/lisps suck in the FFI department (every
    lisp/scheme has its own way of interfacing to C).
    If Parrot gets enough Python working for the challenge, then someone
    could write a Lisp-y language targeting Parrot, and take advantage
    of the work of others.

    For that matter, there's Lisps on the JVM, no? Could support
    Jython.
    Could you say a little bit more about it? In Python I think one could to some
    extent use operator overloading and a special expression class that
    simplifies the literal expression the programmer stated, but I guess one
    would then need to explicitly request evaluation (and operator overloading
    isn't quite powerful enough to easily incorporate 'alien' class-instances, too).

    Is the C++ code something along those lines, or different?
    Different. The problem with the operator overloading approach is
    the creation and description of intermediate objects. If you do

    a = b + c * d

    then "c*d" creates an intermediate temporary.

    Template expressions solve it by providing a description of
    how to do the add, early enough that the compiler can optimize
    based on the whole expression. E.g., for vectors, the compiler
    could generate code equivalent to

    for (i = 0; i < n; i++) {
        a[i] = b[i] + c*d[i];
    }
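A toy Python analogue of the expression-template idea (all class names hypothetical, and no claim that this is how Boost does it): operator overloading builds a lazy description of the computation instead of computing each subexpression eagerly, and a single loop at the end evaluates it element by element, with no intermediate vector ever materialized for c*d.

```python
class Expr:
    """Lazy elementwise expression: indexing computes one element on demand."""
    def __init__(self, fn, n):
        self.fn, self.n = fn, n
    def __getitem__(self, i):
        return self.fn(i)
    def __len__(self):
        return self.n
    def __add__(self, other):
        # build a description of the add; no elements are computed yet
        return Expr(lambda i: self[i] + other[i], len(self))

class Vec(Expr):
    def __init__(self, data):
        self.data = list(data)
        super().__init__(self.data.__getitem__, len(self.data))
    def __rmul__(self, scalar):
        # scalar * vec also stays lazy
        return Expr(lambda i: scalar * self[i], len(self))

def evaluate(expr):
    # one loop, no temporaries: a[i] = b[i] + c*d[i]
    return [expr[i] for i in range(len(expr))]

b, d = Vec([1, 2, 3]), Vec([4, 5, 6])
c = 10
print(evaluate(b + c * d))    # [41, 52, 63]
```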

    Here's an old reference
    http://infm.cineca.it/infm_help/parallel/poop/KAY.html

    Andrew
    dalke at dalkescientific.com
  • Harry George at Aug 19, 2003 at 1:56 pm

    Alexander Schmolck <a.schmolck at gmx.net> writes:

    Harry George <harry.g.george at boeing.com> writes:
    In the Lisp world, you use the hundreds of macros in CL becuase they
    *are* the language. But home-grown (or vendor supplied) macros are
    basically a lockin mechanism. New syntax, new behavior to learn, and
    very little improvement in readability or efficiency of expresison
    (the commmon rationales for macros).
    How can you claim with a straight face that the sophisticated object, logic
    programming, constraint programming, lazy evaluation etc systems people have
    developed in scheme and CL over the years have brought "very little
    improvement in readability or efficiency"?
    When I want logic programming I go to Prolog, and I bind to/from
    prolog with python. If I want lazy evaluation, I do it in python (see
    e.g., xoltar). My concern is based primarily on experience with the
    ICAD language, where massive use of macros provides lazy evaluation at
    the expense of an utterly different language. We are finding the
    same KBE work can often be done cleaner and simpler in Python.

    The issue is not "can I do it at all". Lisp is great for that. It is
    rather "do I need a wholly new syntax".
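    Harry's point that lazy evaluation needs no new syntax can be shown with
    a memoizing thunk (a made-up miniature in the spirit of xoltar, not its
    actual API):

```python
# A memoizing thunk: the computation is delayed until force() is
# called, and then runs at most once.
_UNSET = object()

class Lazy:
    def __init__(self, thunk):
        self.thunk = thunk
        self.value = _UNSET
    def force(self):
        if self.value is _UNSET:
            self.value = self.thunk()
            self.thunk = None   # let the closure be collected
        return self.value

log = []
x = Lazy(lambda: (log.append("ran"), 6 * 7)[1])
# nothing has been evaluated yet
first = x.force()    # runs the thunk once: 42
second = x.force()   # cached: 42 again, thunk not re-run
```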
    The python language is just fine as is.
    No it isn't. Like every other language I know python sucks in a variety of
    ways (only on the whole, much less so), but I don't claim I know how to fix
    this with a macro system. I'm just not sure I buy Alex's argument that an
    introduction of something equivalent in expressive power to say CL's macro
    system would immediately wreck the language.

    The trick with adding expressiveness is doing it in a manner that doesn't
    invite abuse. Python is doing pretty well in this department so far; I think
    it is easily more expressive than Java, C++ and Perl and still causes less
    headache (Perl comes closest, but at the price of causing even greater
    headache than C++, if that's possible).
    That's the point: Lisp macros invite abuse. They are wonderfully
    powerful and expressive. And they therefore support invention of new
    worlds which must be learned by others. Python (so far) resists the
    "creeping featurism", yet is still valuable in a very wide array of
    situations.

    To make an analogy with natural languages: English is relatively
    successful not just through economic dominance but also through paring
    away nuances of grammar. Yes, there are times and places where French
    or Sanskrit or Morse code are more potent languages, but for a large
    set of communications problems, English works quite well.

    (If you are worried I'm a language chauvinist, see:
    http://www.seanet.com/~hgg9140/languages/index.html )
    If you really, really need something like a macro, consider a template body
    which is filled in and exec'd or eval'd at run time.
    I've actually written a library where most of the code is generated like this
    (and it works fine, because only trivial code transformation are needed that
    can be easily accommodated by simple templating (no parsing/syntax tree
    manipulations necessary)).
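    Harry's fill-in-a-template-and-exec-it suggestion, in miniature (the
    accessor-generating template below is invented for illustration, not
    taken from either post):

```python
# Trivial code generation: substitute names into a source template,
# exec it, and graft the resulting functions onto a class.  Pure
# textual substitution -- no syntax-tree manipulation needed.
TEMPLATE = """
def get_{name}(self):
    return self._{name}
def set_{name}(self, value):
    self._{name} = value
"""

def add_accessors(cls, *names):
    for name in names:
        namespace = {}
        exec(TEMPLATE.format(name=name), namespace)
        setattr(cls, "get_" + name, namespace["get_" + name])
        setattr(cls, "set_" + name, namespace["set_" + name])

class Point:
    pass

add_accessors(Point, "x", "y")
p = Point()
p.set_x(3)
print(p.get_x())   # 3
```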

    But show me how to write something like CL's series package that way (or
    better yet, something similar for transforming array and matrix manipulations
    from some reader-friendly representation into something efficient).
    Why reimplement the series package? That is a good example of rampant
    CL overkill. In Steele's CLTL2, it takes 33 pages to explain. It is
    great for people who are in the language day in and day out, and can
    therefore keep the whole shebang in their mental working set. For
    anyone who has other commitments (e.g., me and 30 other engineers I
    work with), the nuances of series are too complex for practical use.
    In code reviews we have to bring out CLTL2 whenever someone uses any
    of the fancy macros. You can get the same functionality with python
    "for" or "while" and a few idioms.

    As for array and matrix manipulation, I want a good C-based library
    with python binding (e.g., gsl), at times helped by some syntactic
    sugar (Numeric). What I don't need is a brand new language for matrix
    manipulation (wasn't APL for that?). If you mean a human readable
    treatment that can be converted to those libraries, I'd have to point
    to MathML. If you mean the programming syntax itself looks like
    vector math, I'd use Numeric overloads up to a point, but beyond that
    people get confused and you (I at least) need explicitly named
    functions anyway.
    'as
    I'll concede that the macro issue is a personal taste sort of thing.
    If you live inside a single mental world, you can afford to grow and
    use fancy macros. If (like me) your day includes a dog's breakfast
    of tasks, then the overhead is too great for the payoff.

    --
    harry.g.george at boeing.com
    6-6M31 Knowledge Management
    Phone: (425) 342-5601
  • Brandon J. Van Every at Aug 18, 2003 at 6:26 pm

    Alex Martelli wrote:
    Don't be misled by these comparisons into thinking the two
    languages are _very_ different, mind you. They aren't. But
    if I'm asked to compare "capelli d'angelo" to "spaghettini",
    after pointing out that these two kinds of pasta are just
    about undistinguishable to anybody and interchangeable in any
    dish you might want to prepare, I would then inevitably have
    to move into microscopic examination of how the lengths and
    diameters imperceptibly differ, how the ends of the strands
    are tapered in one case and not in the other, and so on -- to
    try and explain why I, personally, would rather have capelli
    d'angelo as the pasta in any kind of broth, but would prefer
    spaghettini as the pastasciutta to go with suitable sauces for
    such long thin pasta forms (olive oil, minced garlic, minced
    red peppers, and finely ground anchovies, for example - but if
    you sliced the garlic and peppers instead of mincing them, then
    you should choose the sounder body of spaghetti rather than the
    thinner evanescence of spaghettini, and would be well advised
    to forego the anchovies and add instead some fresh spring basil
    [or even -- I'm a heretic...! -- light mint...] leaves -- at
    the very last moment before serving the dish).
    What a wonderful run-on sentence. You must be an A. A. Milne fan.
    Ooops, sorry,
    it shows that I'm traveling abroad and haven't had pasta for
    a while, I guess. But the analogy is still pretty good!-)
    What I take away from it, is Python and Ruby are far more similar than
    different. So then one looks at industrial evolution - GUIs, tools,
    community size, marketing, volunteer organization, mainstream commercial
    use. Python is clearly much farther along than Ruby.
    2. Ruby's TOTAL, unbridled "dynamicity", including the ability
    to "reopen" any existing class, including all built-in ones,
    and change its behavior at run-time -- vs Python's vast but
    _bounded_ dynamicity, which never changes the behavior of
    existing built-in classes and their instances.
    Others have mentioned this. I imagine it would be a big ticket item for
    some. I can't figure out why I'd care myself, but maybe as I get into my
    diplomacy AI, I will.
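    The boundary Alex describes is easy to see from the interpreter: classes
    you define yourself are freely modifiable at run time, but Python refuses
    the same change to a built-in (whereas Ruby will happily reopen String).
    A quick sketch:

```python
class Greeter:
    def hello(self):
        return "hello"

# User-defined classes are open: rebind a method at run time.
Greeter.hello = lambda self: "hi there"
print(Greeter().hello())        # hi there

# Built-in classes are closed: the equivalent move raises TypeError.
try:
    str.hello = lambda self: "hi there"
except TypeError:
    print("can't add methods to built-in str")
```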
    BUT Python equally more suitable for use in large production
    applications.
    Yes, this definitely matters from 10,000 miles up.

    --
    Cheers, www.3DProgrammer.com
    Brandon Van Every Seattle, WA

    20% of the world is real.
    80% is gobbledygook we make up inside our own heads.
  • John J. Lee at Aug 18, 2003 at 11:01 pm

    "Brandon J. Van Every" <vanevery at 3DProgrammer.com> writes:

    Alex Martelli wrote:
    Don't be misled by these comparisons into thinking the two
    [...]
    spaghettini as the pastasciutta to go with suitable sauces for
    such long thin pasta forms (olive oil, minced garlic, minced
    [...]
    [or even -- I'm a heretic...! -- light mint...] leaves -- at
    the very last moment before serving the dish).
    What a wonderful runon sentence. You must be an A. A. Milne fan.
    [...]

    Well, art is art, isn't it? Still, on the other hand, water is water!
    And east is east and west is west and if you take cranberries and stew
    them like applesauce they taste much more like prunes than rhubarb
    does.

    Groucho Marx.


    John
  • Brandon J. Van Every at Aug 19, 2003 at 6:26 am

    Well, art is art, isn't it? Still, on the other hand, water is water!
    And east is east and west is west and if you take cranberries and stew
    them like applesauce they taste much more like prunes than rhubarb
    does.

    Groucho Marx.
    I almost got it.

    --
    Cheers, www.3DProgrammer.com
    Brandon Van Every Seattle, WA

    20% of the world is real.
    80% is gobbledygook we make up inside our own heads.
  • John J. Lee at Aug 19, 2003 at 12:09 pm

    "Brandon J. Van Every" <vanevery at 3DProgrammer.com> writes:

    Well, art is art, isn't it? Still, on the other hand, water is water!
    And east is east and west is west and if you take cranberries and stew
    them like applesauce they taste much more like prunes than rhubarb
    does.

    Groucho Marx.
    I almost got it.
    Not meant to be 'got'. Your mention of 'runon' sentences just
    reminded me of it, and that was good enough excuse for me :-)


    John
  • John Wilson at Aug 18, 2003 at 7:25 pm

    Alex Martelli wrote:

    Me, I think immutable strings are
    an excellent idea (and I'm not surprised that Java, independently
    I think, reinvented that idea which was already in Python),
    Gosling claims that Java contains no new ideas. I have heard him say that
    every feature of Java is in at least two other programming languages. Java
    takes many things from CLU (including immutable strings). CLU looks to be an
    influence on Python too.

    John Wilson
    The Wilson Partnership
    http://www.wilson.co.uk
  • Roy Smith at Aug 18, 2003 at 8:09 pm

    Alex Martelli wrote:
    I do hope that Python never gets such a powerful macro system
    I'm with Alex on this. Macros suck. What you usually end up with is
    essentially two different languages, with different syntaxes, and which
    don't interact very well. If nothing else, this really screws up emacs
    auto-indenting :-(

    One of the few things I like about C++ is that between const, templates,
    and inline, the need for the macro preprocessor has been almost
    eliminated. Still, you see a lot of code which goes out of its way to
    do fancy things with macros, almost always with bad effect.

    I don't even want to talk about the various systems which make use of
    things like m4.

    Why do you need macros? There's a few things people do with them:

    1) Define constants. In Python, you just define symbols in your module,
    and get over the fact that there really is no such thing as a constant
    in Python.

    2) Define efficient pseudo-functions. In Python, you just define a
    function (or method) and get over the fact that it's not as efficient as
    a macro. If I cared about microseconds, I wouldn't be writing in Python.

    3) File inclusion. In Python, you don't include files, you import
    modules.

    4) Conditional compilation. In Python, you can conditionally define
    anything you want at import time.

    5) Inventing your own language constructs. In Python, you just don't do
    this.
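    Point 4 really does reduce to ordinary run-time definition in Python; the
    standard idiom is a try/except around an import, with a fallback
    definition (a sketch, using math.hypot purely as an example):

```python
# "Conditional compilation" the Python way: decide at import time
# which definition exists.  Callers just use hypot(); no preprocessor.
try:
    from math import hypot          # fast C implementation, if available
except ImportError:
    def hypot(x, y):                # pure-Python fallback
        return (x * x + y * y) ** 0.5

print(hypot(3.0, 4.0))   # 5.0
```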
  • Doug Tolton at Aug 18, 2003 at 10:14 pm

    On Mon, 18 Aug 2003 16:09:47 -0400, Roy Smith wrote:
    Why do you need macros? There's a few things people do with them:

    1) Define constants. In Python, you just define symbols in your module,
    and get over the fact that there really is no such thing as a constant
    in Python.

    2) Define efficient pseudo-functions. In Python, you just define a
    function (or method) and get over the fact that it's not as efficient as
    a macro. If I cared about microseconds, I wouldn't be writing in Python.

    3) File inclusion. In Python, you don't include files, you import
    modules.

    4) Conditional compilation. In Python, you can conditionally define
    anything you want at import time.

    5) Inventing your own language constructs. In Python, you just don't do
    this.

    I don't agree at all. Yes when you are defining a macro you are in
    essence defining a new mini-language. This is perhaps one of the most
    powerful features of Lisp. Programming closer to the application
    domain, *greatly* enhances both the readability and the reusability of
    code.

    Good Lisp programmers use Macros all the time. They are incredibly
    useful and powerful. The reason you don't do this in python is
    because the feature isn't available. That doesn't mean it *shouldn't*
    be available. Python is Open Source, how would someone writing a
    Macro lock you in? Just don't use the macro.

    Just like anything else, macros can be overused and abused. However
    I maintain that if you don't see the usefulness of macros, you don't
    really understand them. Essentially using Python over Machine
    language is just using one big ass macro language. They are there to
    allow you to create higher level abstractions, and tools that are more
    specifically useful to your application domain than a general purpose
    tool.

    Python is a Macro Language of Machine Language. Why don't you just
    program everything in Machine Language? Macros are to Python as
    Python is to C as C is to Machine Language.

    Python is great, as the trend shows, working at higher levels of
    abstraction though is the ultimate goal.



    Doug Tolton
    (format t "~a@~a~a.~a" "dtolton" "ya" "hoo" "com")
  • Andrew Dalke at Aug 19, 2003 at 12:16 am
    Doug Tolton
    I don't agree at all. Yes when you are defining a macro you are in
    essence defining a new mini-language. This is perhaps one of the most
    powerful features of Lisp. Programming closer to the application
    domain, *greatly* enhances both the readability and the reusability of
    code.
    For that domain. And rarely does the author of a package,
    much less a macro, understand "the domain as understood by other
    people" vs. personal understanding.

    This topic has come up before. Laura Creighton made several
    comments on macros, the most notable of which is:

    lac:
    ] Writing your own Lisp Macro System is better than sex. I
    ] _know_ -- 18 year old me turned down _lots_ of opportunities
    ] for sex to go hack on her macro system. Thus if we introduce
    ] this to the language, I think that it is _inevitable_ that we will
    ] fragment the Python community into a plethora of mutually
    ] unintelligible dialects. I don't want this. Thus I don't want a
    ] macro facility in the language _because_ it would be so cool.
    That doesn't mean it *shouldn't* be available [in Python].
    Python is Open Source, how would someone writing a
    Macro lock you in? Just don't use the macro.
    Another writing from Laura seems relevant:
    http://mail.python.org/pipermail/python-list/2001-May/042102.html

    My interpretation - I don't customize my apps, nor even
    my .cshrc (except for one alias (alias ls 'ls -l \!* | grep ^d')
    an 'unset noclobber', 'set ignoreeof', and the PATH and
    LD_LIBRARY_PATH - and I wish I didn't need those)
    I don't, because I don't like to think. At least not spend my
    time puzzling out slight changes. I like my changes either
    none or a lot, that is, use Python as-is or write a converter
    (or use another language).
    Just like anything else, Macro's can be over used and abused. However
    I maintain that if you don't see the usefulness of macros, you don't
    really understand them.
    That's not the argument against them. It's that they are too useful,
    each person makes their own dialect, the community breaks down
    as the different branches do their own thing, and one person's so-
    called "Python" code looks different than another's.

    I know I am nowhere near as good a language designer as Guido,
    Larry Wall, Matz, and the others, though I think I'm pretty decent.
    I don't have the essential hubris to say that I know better how
    to tweak Python-the-language to fit my own domain.
    Essentially using Python over Machine
    language is just using one big ass macro language.
    You confuse two meanings of the word 'macro' here.
    Any assembly language worth its salt has "macros", which
    are pre-assembled sets of code. Use the macro and it
    generates the code. But you can't use those macros to
    rewrite the actual language like you can with hygienic
    macros. It doesn't have the proper tail-biting recursive nature.

    Andrew
    dalke at dalkescientific.com
  • Doug Tolton at Aug 20, 2003 at 5:28 pm

    On Mon, 18 Aug 2003 18:16:07 -0600, "Andrew Dalke" wrote:

    Doug Tolton
    I don't agree at all. Yes when you are defining a macro you are in
    essence defining a new mini-language. This is perhaps one of the most
    powerful features of Lisp. Programming closer to the application
    domain, *greatly* enhances both the readability and the reusability of
    code.
    For that domain. And rarely does the author of a package,
    much less a macro, understand "the domain as understood by other
    people" vs. personal understanding.
    It depends what you are talking about. If you are talking about
    making some large cross industry library I might be inclined to agree,
    but when it comes to building good high level abstractions within a
    company, this argument doesn't make sense. Any feature has to be used
    in the proper context for it to be useful, Macros are also this way.
    This topic has come up before. Laura Creighton made several
    comments on macros, the most notable of which is:

    lac:
    ] Writing your own Lisp Macro System is better than sex. I
    ] _know_ -- 18 year old me turned down _lots_ of opportunities
    ] for sex to go hack on her macro system. Thus if we introduce
    ] this to the language, I think that it is _inevitable_ that we will
    ] fragment the Python community into a plethora of mutually
    ] unintelligible dialects. I don't want this. Thus I don't want a
    ] macro facility in the language _because_ it would be so cool.
    I just don't find that argument compelling. By that logic we should
    write the most restrictive language possible on the most restrictive
    platform possible (ie VB on Windows) because allowing choice is
    clearly a bad thing.

    Don't introduce a feature because it would be so cool that everyone
    would use it? That's just plain weird.

    The Python Core language would still always be controlled by Guido,
    but I don't see the problem with a community of people writing cool
    macro's for python.

    Linux is based on this concept of allowing people to extend the
    system, it doesn't seem to have suffered from it.

    That doesn't mean it *shouldn't* be available [in Python].
    Python is Open Source, how would someone writing a
    Macro lock you in? Just don't use the macro.
    Another writing from Laura seems relevant:
    http://mail.python.org/pipermail/python-list/2001-May/042102.html

    My interpretation - I don't customize my apps, nor even
    my .cshrc (except for one alias (alias ls 'ls -l \!* | grep ^d')
    an 'unset noclobber', 'set ignoreeof', and the PATH and
    LD_LIBRARY_PATH - and I wish I didn't need those)
    I don't, because I don't like to think. At least not spend my
    time puzzling out slight changes. I like my changes either
    none or a lot, that is, use Python as-is or write a converter
    (or use another language).
    Same argument as above, I don't agree with this logic. Python is a
    great language, but that doesn't mean it couldn't be better. If
    that were the case, development would cease.

    Why do we allow people to write functions even, I mean you have to
    learn the syntax for calling them, what the variables are and what
    they do. Bah, we should make everyone use only built in functions, if
    they want a different one, use a different language. What? It makes
    no sense to me.
    Just like anything else, Macro's can be over used and abused. However
    I maintain that if you don't see the usefulness of macros, you don't
    really understand them.
    That's not the argument against them. It's that they are too useful,
    each person makes their own dialect, the community breaks down
    as the different branches do their own thing, and one person's so-
    called "Python" code looks different than another's.
    So don't allow people to customize the system huh? Then why is Python
    Open Source? That's the *entire* point of Open Source, so that people
    can tweak and customize to their own environment. Do you have any
    specific examples that are comparable where customization broke a
    community down? This sounds like baseless hypothetical speculation to
    me.
    I know I am nowhere near as good a language designer as Guido,
    Larry Wall, Matz, and the others, though I think I'm pretty decent.
    I don't have the essential hubris to say that I know better how
    to tweak Python-the-language to fit my own domain.
    You are saying you don't know how to tweak a language to fit your
    specific domain better than a general purpose language? And you are
    saying you are a pretty good language designer? If you don't know
    your specific domain well enough to adapt a general purpose language
    to it better than it is already written there are several
    possibilities:
    1) You don't know your domain that well
    2) You work in a very general purpose domain
    3) You aren't a very good language designer

    Designing a good language is all about designing the right high level
    abstractions. Even a medium skilled designer should be able to design
    a language that maps better to their specific domain than a general
    purpose domain (actually implementing is of course a vastly different
    story). The whole point of macros, though, is to allow you to leverage
    the facilities the language provides while at the same time abstracting
    the common idioms.
    Essentially using Python over Machine
    language is just using one big ass macro language.
    You confuse two meanings of the word 'macro' here.
    Any assembly language worth its salt has "macros", which
    are pre-assembled sets of code. Use the macro and it
    generates the code. But you can't use those macros to
    rewrite the actual language like you can with hygienic
    macros. It doesn't have the proper tail-biting recursive nature.
    I am not talking about Assembly Macros. I was comparing hygienic
    macros to the ability to make useful high level abstractions. Python
    is an abstraction of Machine Language whereas Macros would allow you
    to abstract Python.

    You are in essence saying that Python is perfect, that no one could
    make a more useful abstraction than it already has, and that saying
    that one could do so is hubristic. I reject your argument and your
    logic as specious. I think what makes Python so useful is the high
    level abstractions it offers. The fact that it lets me do things
    that *I* know are right for my domain. That it doesn't make the
    assumption that Guido knows best for my domain (because I guarantee
    you I know my domain better than Guido does). Python doesn't treat me
    like the idiot programmer who can't be given a powerful tool because
    it might hurt me. Ultimately this is the basis of Java / C# / Visual
    Basic. Don't give the programmer room, he might hurt himself, or
    abuse something. That paradigm is filled, there are many languages
    that restrict programmers because they might misuse a feature, or they
    are just too dumb to get it right. I say fine, leave the languages
    like Java / C# / VB to those people, but let's make Python a language
    that allows people the room to do it the way it needs to be done, not
    so much the way Guido or whoever thinks it should be done.

    Just my 2p



    Doug Tolton
    (format t "~a@~a~a.~a" "dtolton" "ya" "hoo" "com")
  • Aahz at Aug 20, 2003 at 6:06 pm
    In article <s497kvcr4h4r6rf9s7tlcq1gjqpj277304 at 4ax.com>,
    Doug Tolton wrote:
    I just don't find that argument compelling. By that logic we should
    write the most restrictive language possible on the most restrictive
    platform possible (ie VB on Windows) because allowing choice is
    clearly a bad thing.
    It's all about maintaining balance. After all, Python forces you to
    format your code in a specific way rather than allowing the freedom of
    C/C++ or Perl.
    Don't introduce a feature because it would be so cool that everyone
    would use it? That's just plain weird.

    The Python Core language would still always be controlled by Guido,
    but I don't see the problem with a community of people writing cool
    macro's for python.
    Guido's mantra is readability. If you can come up with a concrete
    suggestion for a macro system that won't affect Python's readability,
    please propose a PEP. Otherwise, there's not much use arguing about it,
    because people won't pay attention.
    --
    Aahz (aahz at pythoncraft.com) <*> http://www.pythoncraft.com/

    This is Python. We don't care much about theory, except where it intersects
    with useful practice. --Aahz
  • Andrew Dalke at Aug 20, 2003 at 7:41 pm

    Doug Tolton:
    It depends what you are talking about. If you are talking about
    making some large cross industry library I might be inclined to agree,
    but when it comes to building good high level abstractions within a
    company, this argument doesn't make sense. Any feature has to be used
    in the proper context for it to be useful, Macros are also this way.
    As a consultant, I don't have the luxury of staying inside a singular
    code base. By your logic, I would need to learn each different
    high level abstraction done at my clients' sites. And given the usual
    software engineering experience a chemist or biologist has, those
    are unlikely to be good.
    I just don't find that argument compelling. By that logic we should
    write the most restrictive language possible on the most restrictive
    platform possible (ie VB on Windows) because allowing choice is
    clearly a bad thing.
    The inference is that programming language abstractions should
    not be more attractive than sex. Classes, functions, and modules
    are not. Arguing as you do is an all-or-nothing approach which
    overly polarizes the discussion.
    Linux is based on this concept of allowing people to extend the
    system, it doesn't seem to have suffered from it.
    I don't use Linus's kernel. The machine I have with Linux on it
    runs a modified version distributed by a company and with all
    the other parts needed to make a useful environment for my
    work. And I loathe the times I need to recompile the kernel,
    even though I actually have done kernel mods on Minix in OS
    class back in school.

    In a similar vein, the different distributions once upon a time were
    very divergent on where files were placed, how startup scripts
    worked, which libraries were included, and how things were
    configured in general (e.g., which libc to use?). If I wanted to
    distribute precompiled binaries, I was in a bind because I
    would need to ship all the variations, even though it's just for
    "Linux".

    There's more consensus now, but it took a lot of time.

    In short, my comment is that Linux does allow the diversity,
    it did cause problems, and people now decide that that
    diversity isn't worth it, at least for most uses. For me as an
    applications developer, that diversity just makes my life more
    complicated.
    Same argument as above, I don't agree with this logic. Python is a
    great language, that doesn't mean it couldn't be better though. If
    that were the case development would be cease.
    What if Python had a marker so people could tell the interpreter that
    the next few lines are Lisp code, or Perl, or Tcl, or any other language.
    Would the result be more flexible? Yes? More powerful? Yes.
    Better? I think not.
    Why do we allow people to write functions even, I mean you have to
    learn the syntax for calling them, what the variables are and what
    they do. Bah, we should make everyone use only built in functions, if
    they want a different one, use a different language. What? It makes
    no sense to me.
    Your argument of an extreme has no weight because it's different
    than what I'm saying.

    Extra power and flexibility can have bad effects, not just on the
    language but on the community built around the language. Software
    development is rarely a singleton affair, so a good language should
    also optimize the ability for different people to use each others'
    libraries.

    Functions and modules and objects, based on experience, promote
    code sharing. Macros, with their implicit encouragement of domain
    specific dialect creation, do not.
    So don't allow people to customize the system huh? They why is Python
    Open Source? That's the *entire* point of Open Source, so that people
    can tweak and customize to their own environment.
    Err, no. I use open source because it's cheap, because the tools
    are of good quality, because if something breaks I can track down
    the problem, and as a risk management bonus, if the project ever
    dies, I can still maintain things on my own.

    I never, ever, ever, want to get into the case where I'm maintaining
    my own private, modified version of Python.
    Do you have any
    specific examples that are comparable where customization broke a
    community down? This sounds like baseless hypothetical speculation to
    me.
    Lisp.

    A language which allows very smart people the flexibility to
    customize the language, means there will be many different flavors,
    which don't all taste well together.

    A few years ago I tested out a Lisp library. It didn't work
    on the Lisp system I had handy, because the package system
    was different. There was a comment in the code which said
    "change this if you are using XYZ Lisp", which I did, but
    that's a barrier to use if I ever saw one.
    You are saying you don't know how to tweak a language to fit your
    specific domain better than a general purpose language? And you are
    saying you are a pretty good language designer? If you don't know
    your specific domain well enough to adapt a general purpose language
    to it better than it is already written there are several
    possibilities:
    1) You don't know your domain that well
    2) You work in a very general purpose domain
    3) You aren't a very good language designer
    4) a small change in a language to better fit my needs has
    subtle and far-reaching consequences down the line. Instead,
    when I do need a language variation, I write a new one
    designed for that domain, and not tweak Python.
    Designing a good language is all about designing the right high level
    abstractions. Even a medium skilled designer should be able to design
    a language that maps better to their specific domain than a general
    purpose domain (actually implementing is of course a vastly different
    story).
    But if Python is HERE ............................... and my domain is HERE
    I'm not going to try to force them together.
    You are in essence saying that Python is perfect, that no one could
    make a more useful abstraction than it already has, and that saying
    that one could do so is hubristic.
    I looked up 'hubris' just now. It's the wrong word for me to use.

    http://dictionary.reference.com/search?q=hubris
    hubris: Overbearing pride or presumption; arrogance

    I don't mean 'overbearing', I mean perhaps 'confidence'.
    'arrogance' is also the wrong word. Something without the
    negative overtones.

    Andrew
    dalke at dalkescientific.com
  • Jacek Generowicz at Aug 21, 2003 at 12:59 pm

    "Andrew Dalke" <adalke at mindspring.com> writes:

    As a consultant, I don't have the luxury of staying inside a singular
    code base. By your logic, I would need to learn each different
    high level abstraction done at my clients' sites.
    The alternative is to understand (and subsequently recognize) the
    chunks of source code implementing a given pattern for which no
    abstraction was provided (often implemented slightly differently in
    different parts of the code, sometimes with bugs), each time that it
    occurs.

    I'd rather use multimethods that implement the visitor pattern.

    I'd rather look at multimethods, than at code infested with
    implementations of the visitor pattern.

    (The above comments are _not_ about the visitor pattern per se.)
    The inference is that programming language abstractions should not
    be more attractive than sex.
    Why ever not? Don't you want to put the joy back into programming :-)
    Functions and modules and objects, based on experience, promote
    code sharing. Macros, with their implicit encouragement of domain
    specific dialect creation, do not.
    I don't believe you can reasonably draw a rigid and well-defined
    boundary between functions, modules and objects on one side, and
    macros on the other. They all offer means of abstraction. All are open
    to abuse. All can be put to good use.

    In all four cases, I'd rather have the opportunity to create
    abstractions, rather than not.

    I find your suggestion that macros are in some way more "domain
    specific" than modules, or objects or functions, bogus.
    A language which allows very smart people the flexibility to
    customize the language, means there will be many different flavors,
    which don't all taste well together.
    A few years ago I tested out a Lisp library. It didn't work
    on the Lisp system I had handy, because the package system
    was different. There was a comment in the code which said
    "change this if you are using XYZ Lisp", which I did, but that
    that's a barrier to use if I ever saw one.
    You are confusing the issues of

    - extensibility,
    - standard non conformance,
    - not starting from a common base,
    - languages defined by their (single) implementation.

    A few days ago I tested out a C++ library. It didn't work on the C++
    system I had handy because the STL implementation was
    different/template support was different. etc. etc.

    A few days ago I tested out a Python library. It didn't work on the
    implementation I had handy because it was Jython.
    4) a small change in a language to better fit my needs has
    subtle and far-reaching consequences down the line. Instead,
    when I do need a language variation, I write a new one
    designed for that domain, and not tweak Python.
    So, what you are saying is that faced with the alternatives of

    a) Tweaking an existing, feature rich, mature, proven language, to
    move it "closer" to your domain.

    b) Implementing a new language from scratch, for use in a single
    domain

    you would choose the latter?

    If so, you are choosing the path which pretty much guarantees that
    your software will take much longer to write, and that it will be a
    lot buggier.

    It's an extreme form of Greenspunning.

    How do you reconcile
    when I do need a language variation, I write a new one designed for
    that domain, and not tweak Python.
    with
    Functions and modules and objects, based on experience, promote
    code sharing. Macros, with their implicit encouragement of domain
    specific dialect creation, do not.
    ?

    You criticize macros for not encouraging code sharing (they do, by
    encouraging you to share the (vast) underlying language while reaching
    out towards a specific domain), while your preferred solution seems to
    be the ultimate code non-sharing, by throwing away the underlying
    language, and re-doing it.
  • Borcis at Aug 21, 2003 at 1:56 pm

    Jacek Generowicz wrote:
    You criticize macros for not encouraging code sharing (they do, by
    encouraging you to share the (vast) underlying language while reaching
    out towards a specific domain), while your preferred solution seems to
    be the ultimate code non-sharing, by throwing away the underlying
    language, and re-doing it.
    This criticism can't help looking frivolous, imho. You appear to be confusing
    "language" with "speech". But I do believe there *must* exist a sane niche for
    (perhaps mutated) macros (in some lisp-like sense).

    Cheers, B.
  • Jacek Generowicz at Aug 21, 2003 at 2:27 pm

    Borcis <borcis at users.ch> writes:

    Jacek Generowicz wrote:
    You criticize macros for not encouraging code sharing (they do, by
    encouraging you to share the (vast) underlying language while reaching
    out towards a specific domain), while your preferred solution seems to
    be the ultimate code non-sharing, by throwing away the underlying
    language, and re-doing it.
    This criticism can't help looking frivolous,
    Only in so far as the original thesis is frivolous.
    You appear to be confusing "language" with "speech".
    I'm not sure what you mean by this.

    Are you saying that macros are "language" because you've heard the
    buzz-phrase that "macros allow you to modify the language", while
    functions, classes and modules are "speech", because no such
    buzz-phrases about them abound ?

    If so, then you are erecting artificial boundaries between different
    abstraction mechanisms. (All IMHO, of course.)
  • Andrew Dalke at Aug 21, 2003 at 6:43 pm

    Jacek Generowicz:
    The alternative is to understand (and subsequently recognize) the
    chunks of source code implementing a given pattern for which no
    abstraction was provided (often implemented slightly differently in
    different parts of the code, sometimes with bugs), each time that it
    occurs.
    Agreed and understood. My understanding of macros were that
    they provide flexibility beyond what classes could do. However,
    at present, the only example I've seen for when to use a macro
    came from a method cache implementation that I could implement
    in Python using the normal class behaviour, so I don't have a
    good idea of when macros would be appropriate *for* *Python*.

    When this topic has come up before, others mentioned how
    macros would theoretically be able to, say, modify list.sort
    to return the sorted list after it has been modified in-place.

    Given the not infrequent request for the feature, I know that
    if it was allowed, then some of my clients would have done
    that, making it harder for me to know if what I'm looking at
    is core Python behaviour or modified.

    What you say is true, but most of the code I look at is
    based on fundamental Python types, from which I can
    be assured of fixed behaviour, or classes and functions,
    where I can be assured that they are free to do their
    own thing. The patterns of behaviour are fixed and
    the opportunities for change well defined.

    Macros, as I understand it, blurs those lines.
    The inference is that programming language abstractions should not
    be more attractive than sex.
    Why ever not? Don't you want to put the joy back into programming :-)
    Mmmm, I've reached the point in my life where the importance
    of social interactions and community building is starting to outweigh
    my interests in programming all the time.

    Plus, programming in Python is pretty joyful. It fits my mind
    quite nicely.
    I don't believe you can reasonably draw a rigid and well-defined
    boundary between functions, modules and objects on one side, and
    macros on the other. They all offer means of abstraction. All are open
    to abuse. All can be put to good use.
    I never said macros couldn't be put to good use.

    Let me repeat a statement I made in another post on this topic.

    I will grant that Lisp or Scheme is the end-all and be-all of
    languages. Where's the language between those and Python?

    Is it possible to have a language which is more flexible than
    Python but which doesn't encourage the various dialectization
    historically evident in the Lisp/Scheme community?

    Could most macros also be written without macros, using
    classes or lambdas? How often are the benefits of macros
    that much greater than classes&functions to merit their
    inclusion.

    Does it take skill to know when to use one over the other?
    Do people use macros too often? When do they hinder
    misunderstanding? Are they more prone to misuse than
    classes&functions?

    You say that macros can be put to good use, *and I
    believe you*. I say that their specific advantages over
    classes is rare enough that having an extra mechanism
    for implementing behaviours does not warrant their
    inclusion into Python - a language designed with an
    emphasis on readability and usability over the extreme
    flexibility emphasis from Lisp - given the statements
    made by people here with strong Lisp backgrounds
    on how using those seductive macros made it harder
    to share code with outsiders.
    In all four cases, I'd rather have the opportunity to create
    abstractions, rather than not.
    The complaint about macros has been their tendency to
    increase a single person's abilities at the cost of overall
    loss in group understanding. I've heard references to
    projects where that didn't occur, but am not swayed
    by it because those seem staffed by people with
    extraordinarily good programming skills almost never
    found among the chemists and biologists I work with.
    I find your suggestion that macros are in some way more "domain
    specific" than modules, or objects or functions, bogus.
    Sorry? I thought one of the main points of macros is that
    they allow additional flexibility to customize the language as
    appropriate for a given domain. Now you say that that's
    not the case?
    You are confusing the issues of

    - extensibility,
    - standard non conformance,
    - not starting from a common base,
    - languages defined by their (single) implementation.
    Indeed, and to some extent deliberately. The point I'm
    trying to make is that different, very smart people like
    "Lisp", but insist on variations. There is clisp and elisp
    and scheme and guile and ... a long list of Lisps.

    As I understand it, macros can be used to make one
    lisp variation act like another.

    Given that, the conclusion I infer is that different, very
    smart people would use macros to make Python more
    "right", even though there's different perceptions of
    what is "right."
    A few days ago I tested out a C++ library. It didn't work on the C++
    system I had handy because the STL implementation was
    different/template support was different. etc. etc.
    Did you really or are you making that up for the
    sake of rhetoric?
    A few days ago I tested out a Python library. It didn't work on the
    implementation I had handy because it was Jython.
    Ditto.

    That's not to say you couldn't have those problems. I just
    don't like you making up a response in the face of something
    what did actually occur.

    If it takes more than four decades for different Lisp implementations
    to agree on how to import a module, then I think there's
    a problem. And I believe that that problem is that too
    many people who make a Lisp do it because they have a
    feeling they know what is right vs. wrong, and emphasize
    boosting personal abilities over group ones, and that
    tendency is magnified by the inclusion of macros.
    So, what you are saying is that faced with the alternatives of

    a) Tweaking an existing, feature rich, mature, proven language, to
    move it "closer" to your domain.

    b) Implementing a new language from scratch, for use in a single
    domain

    you would choose the latter?
    False dichotomy.

    I needed to evalute a user-defined expression where the
    variable names are computed based on calling an associated
    function. The functions may take a long time to compute and
    most names are not used in an expression, so I want to
    compute the names only when used.

    This is different from Python because Python's exec wants
    all the variable names defined first. Hence, this is a new language.

    Did I start from scratch? No! I used Python to build the
    parse tree then tweaked a few nodes of that tree to change
    the name lookup into the right form, then generated the
    function from that parse tree.

    The syntax was the same as Python's, but the behaviour different.
    Though back at the Python level, it's a "call this function to get
    the needed result", and has precisely the same nature as
    any other Python function would have.
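    [A rough sketch of the lazy-name idea Andrew describes, in present-day
    Python rather than via parse-tree surgery: eval() accepts any mapping
    object for its locals, so name lookups can trigger computation on
    demand. All names and producer functions here are invented for
    illustration; this is not Andrew's actual code.]

```python
class LazyNames:
    """Mapping that calls a producer function the first time a name is used."""
    def __init__(self, producers):
        self._producers = producers   # name -> zero-argument function
        self._cache = {}

    def __getitem__(self, name):
        if name not in self._cache:
            # Compute the value only when the expression actually uses it.
            self._cache[name] = self._producers[name]()
        return self._cache[name]

producers = {
    "fast": lambda: 2,
    "slow": lambda: 40,       # imagine this one is expensive to compute
    "never": lambda: 1 / 0,   # never referenced, so never evaluated
}
result = eval("fast + slow", {"__builtins__": {}}, LazyNames(producers))
print(result)  # 42
```

    The syntax stays plain Python; only the name-lookup behaviour differs,
    which is essentially the "same syntax, different behaviour" point made
    above.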
    If so, you are choosing the path which pretty much guarantees that
    your software will take much longer to write, and that it will be a
    lot buggier.
    Indeed. And that would be silly. Luckily, you didn't think
    of all the alternatives available to me.

    And some languages are not-at-all close to Python. Eg, I wanted
    to implement a domain-specific language called MCL using my
    PyDaylight package. I ended up writing a parser for MCL and
    converting the result into Python code, then exec'ing the Python
    code. I would not want to write a macro for Python to support
    MCL natively. Blech!

    Similarly, SMILES is a language for describing molecules. It
    isn't a programming language at all, so I just write a parser for it.

    I once made a language for manipulating molecules. It was a
    a simple command language with syntax like
    load "abc.pdb" into a
    select "resname LYS" from a into b
    save b as "lysine.pdb"

    I wrote it in Perl, but didn't want the language to be part of
    Perl (nor of Python). It wasn't a Perl library because Perl's
    syntax was too complicated for the target audience. It wasn't
    very powerful because it only needed to load a molecule,
    select portions of the molecule, select subsets, copy, rotate
    and translate sets of atoms, and save a set to a file.

    Why in the world would I want to extend Lisp/Perl/Python/
    whatever to support that language directly?
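    [For scale: the whole three-command language above fits in a
    conventional little interpreter. A sketch with hypothetical handler
    callbacks standing in for the real molecule code; the handler names
    and tuple representations are made up.]

```python
import shlex

def run(script, load, select, save):
    """Interpret the three-command molecule language with a plain parser."""
    env = {}
    for line in script.splitlines():
        words = shlex.split(line)          # honours the quoted arguments
        if not words:
            continue
        if words[0] == "load" and words[2:3] == ["into"]:
            env[words[3]] = load(words[1])
        elif words[0] == "select" and words[2:3] == ["from"]:
            env[words[5]] = select(env[words[3]], words[1])
        elif words[0] == "save" and words[2:3] == ["as"]:
            save(env[words[1]], words[3])
        else:
            raise SyntaxError("unknown command: " + line)

# Stub handlers that just record what happened.
log = []
run('load "abc.pdb" into a\n'
    'select "resname LYS" from a into b\n'
    'save b as "lysine.pdb"',
    load=lambda fname: ("mol", fname),
    select=lambda mol, expr: ("subset", mol, expr),
    save=lambda mol, fname: log.append((mol, fname)))
```

    No host-language extension mechanism is involved at all, which is the
    point of the question.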
    It's an extreme form of Greenspunning.
    ??? Apparently not Alan Greenspan.

    Google yields 4 hits (3 if I exclude your post). It seems
    to be "writing a lot of code yourself instead of using existing
    packages"

    Then you don't know my work at all and are making
    unjustified comments based on false preconceptions.
    How do you reconcile ...
    You criticize macros for not encouraging code sharing (they do, by
    encouraging you to share the (vast) underlying language while reaching
    out towards a specific domain), while your preferred solution seems to
    be the ultimate code non-sharing, by throwing away the underlying
    language, and re-doing it.
    Because you assume when I say "language" that I mean a powerful
    language like Lisp or Python when it's really a domain specific
    languages which often aren't even Turing complete or even a programming
    language, and which are easy to convert into Python, and designed for
    someone with only a little programming experience.

    Please tell me how you would implement this language

    load "abc.pdb" into a
    select "resname LYS" from a into b
    save b as "lysine.pdb"

    as a macro in Lisp. I'll assume 'load-pdb' loads a PDB file into
    a object which holds a set of atoms, and that object has the
    method 'select-atoms' which creates a new (sub)set and also
    has the method 'save-as' for saving those atoms in the right format.

    And the above is all that the user can type.

    How in the world does macros make handling that language
    any easier than the standard parser-based non-macro solution?


    Andrew
    dalke at dalkescientific.com
  • Dave Kuhlman at Aug 21, 2003 at 10:52 pm
    Andrew Dalke wrote:

    [snip]
    The complaint about macros has been their tendency to
    increase a single person's abilities at the cost of overall
    loss in group understanding. I've heard references to
    projects where that didn't occur, but am not swayed
    by it because those seem staffed by people with
    extraordinarily good programming skills almost never
    found among the chemists and biologists I work with.
    I just took a quick look at the "Revised5 Report on the
    Algorithmic Language Scheme". Macros in Scheme5 are called
    "hygienic macros", so apparently there are some dirty macros that
    we should worry about. My understanding is that macros came to
    Scheme slowly, with resistance, with a good deal of thought, and
    with restrictions.

    "More recently, Scheme became the first programming language
    to support hygienic macros, which permit the syntax of a
    block-structured language to be extended in a consistent and
    reliable manner."

    See:

    http://www.schemers.org/Documents/Standards/R5RS/HTML/

    http://www.schemers.org/Documents/Standards/R5RS/HTML/r5rs-Z-H-3.html

    http://www.schemers.org/Documents/Standards/R5RS/HTML/r5rs-Z-H-7.html#%_sec_4.3

    I have the same worry that some others on this thread have
    expressed, that a macro capability in Python would enable others
    to write code that I would not be able to read or to figure out.

    [snip]

    Dave

    --
    Dave Kuhlman
    http://www.rexx.com/~dkuhlman
    dkuhlman at rexx.com
  • Kenny Tilton at Aug 21, 2003 at 11:57 pm

    Dave Kuhlman wrote:
    Andrew Dalke wrote:

    [snip]
    The complaint about macros has been their tendency to
    increase a single person's abilities at the cost of overall
    loss in group understanding. I've heard references to
    projects where that didn't occur, but am not swayed
    by it because those seem staffed by people with
    extraordinarily good programming skills almost never
    found amoung the chemists and biologists I work with.

    I just took a quick look at the "Revised5 Report on the
    Algorithmic Language Scheme". Macros in Scheme5 are called
    "hygienic macros", so apparently there are some dirty macros that
    we should worry about.
    (defmacro whoops (place &body code)
      `(let ((x (random 3)))
         (setf ,place (progn ,@code))))

    ...wreaks havoc if the code refers to an X it had bound to something
    else and expected it to be used. otoh:

    (defmacro c? (&body code)
      `(lambda (self) ,@code))

    ...is cool because I can do the anaphoric thing and provide the user
    with a uniform referent to the instance owning the slot, ala SmallTalk
    or C++ with "this".


    My understanding is that macros came to
    Scheme slowly, with resistance, with a good deal of thought, and
    with restrictions.

    "More recently, Scheme became the first programming language
    to support hygienic macros, which permit the syntax of a
    block-structured language to be extended in a consistent and
    reliable manner."

    See:

    http://www.schemers.org/Documents/Standards/R5RS/HTML/

    http://www.schemers.org/Documents/Standards/R5RS/HTML/r5rs-Z-H-3.html

    http://www.schemers.org/Documents/Standards/R5RS/HTML/r5rs-Z-H-7.html#%_sec_4.3

    I have the same worry that some others on this thread have
    expressed, that a macro capability in Python would enable others
    to write code that I would not be able to read or to figure out.
    Do what I do. Don't look at anyone else's code (if you have to work on
    it, rewrite it anyway) and never show your code to anyone.

    Naw, c'mon, everyone seems to be conceding macros are powerful. But you
    are going to be held back by fear and worry? Well, I am a lispnik, we
    always opt for power and damn the torpedoes, as when we go for the
    productivity win of untyped variables and give up on bugs strong static
    typing is supposed to find.



    --

    kenny tilton
    clinisys, inc
    http://www.tilton-technology.com/
    ---------------------------------------------------------------
    "Career highlights? I had two. I got an intentional walk from
    Sandy Koufax and I got out of a rundown against the Mets."
    -- Bob Uecker
  • Andrew Dalke at Aug 22, 2003 at 4:26 am

    Kenny Tilton:
    as when we go for the
    productivity win of untyped variables and give up on bugs strong static
    typing is supposed to find.
    You do realize that strong typing and static typing are different
    things?

    What does (the lisp equivalent of) 2.5 + "a" do?

    In Python, a strongly typed language, it raises an exception. I
    consider that a good thing.

    But Python is not statically typed.
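    [The distinction in a few lines of Python: the *object* carries the
    type and refuses silent coercion (strong typing), while the *name* is
    free to rebind to a different type at runtime (dynamic typing).]

```python
x = 2.5
try:
    x + "a"                     # strong typing: no implicit float/str coercion
except TypeError as err:
    print("rejected:", err)

x = "now a string"              # dynamic typing: the name freely rebinds
print(x + "!")                  # fine, both operands are strings
```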

    Andrew
    dalke at dalkescientific.com
  • Jacek Generowicz at Aug 22, 2003 at 7:24 am

    "Andrew Dalke" <adalke at mindspring.com> writes:

    Kenny Tilton:
    as when we go for the
    productivity win of untyped variables and give up on bugs strong static
    typing is supposed to find.
    You do realize that strong typing and static typing are different
    things?
    Note that Kenny said "untyped _variables_" not "untyped objects" or
    "untyped language".
    What does (the lisp equivalent of) 2.5 + "a" do?
    Common Lisp complains that "a" is not a number.
    In Python, a strongly typed language, it raises an exception.
    Common Lisp is strongly and dynamically typed, just like Python
    ... although CL does allow type declarations, for optimization
    purposes.
  • Andrew Dalke at Aug 22, 2003 at 4:01 pm

    Kenny Tilton:
    as when we go for the
    productivity win of untyped variables and give up on bugs strong
    static
    typing is supposed to find.
    Jacek Generowicz:
    Note that Kenny said "untyped _variables_" not "untyped objects" or
    "untyped language".
    Ahh, I hadn't caught that.

    I did know Lisp had strong dynamic typing with optional static typing,
    which was why I was surprised he mentioned it. I still don't understand
    why he said it given that Python similarly meets the quoted statement.

    Andrew
    dalke at dalkescientific.com
  • Kenny Tilton at Aug 22, 2003 at 4:54 pm

    Andrew Dalke wrote:
    Kenny Tilton:
    as when we go for the
    productivity win of untyped variables and give up on bugs strong
    static
    typing is supposed to find.
    Jacek Generowicz:
    Note that Kenny said "untyped _variables_" not "untyped objects" or
    "untyped language".

    Ahh, I hadn't caught that.

    I did know Lisp had strong dynamic typing with optional static typing,
    which was why I was surprised he mentioned it. I still don't understand
    why he said it given that Python similarly meets the quoted statement.
    ?? I was just giving another example of how lisp errs on the side of
    letting us shoot ourselves in the foot. I was not saying anything about
    Python.

    --

    kenny tilton
    clinisys, inc
    http://www.tilton-technology.com/
    ---------------------------------------------------------------
    "Career highlights? I had two. I got an intentional walk from
    Sandy Koufax and I got out of a rundown against the Mets."
    -- Bob Uecker
  • Terry Reedy at Aug 22, 2003 at 1:30 am
    "Dave Kuhlman" <dkuhlman at rexx.com> wrote in message
    news:bi3ibj$4um7m$1 at ID-139865.news.uni-berlin.de...
    http://www.schemers.org/Documents/Standards/R5RS/HTML/r5rs-Z-H-3.html
    http://www.schemers.org/Documents/Standards/R5RS/HTML/r5rs-Z-H-7.html#%_sec_4.3
    I have the same worry that some others on this thread have
    expressed, that a macro capability in Python would enable others
    to write code that I would not be able to read or to figure out.
    According to the second referenced page ("Background"), not a
    completely baseless worry, although the Scheme divergence may have had
    nothing to do with macros:

    "The first description of Scheme was written in 1975...
    Three distinct projects began in 1981 and 1982 to use variants of
    Scheme for courses at MIT, Yale, and Indiana University [21, 17,
    10]...
    As Scheme became more widespread, local dialects began to diverge
    until students and researchers occasionally found it difficult to
    understand code written at other sites. Fifteen representatives of the
    major implementations of Scheme therefore met in October 1984 to work
    toward a better and more widely accepted standard for Scheme.
    "

    Reading the third referenced page on Macros, I notice that the amount
    of syntax definition for the macro sublanguage is as large as a
    substantial portion (one-third?) of that for core Python (if condensed
    to the same density). So, just by definitional bulk, having it in the
    language would not be a free ride.

    Terry J. Reedy
  • Jacek Generowicz at Aug 22, 2003 at 9:10 am
    You wrote lots. Forgive me if I don't address everything. I wish I had
    the time to address all your points more carefully.

    "Andrew Dalke" <adalke at mindspring.com> writes:
    However, at present, the only example I've seen for when to use a
    macro came from a method cache implementation that I could implement
    in Python using the normal class behaviour,
    Or using decorators, or lexical closures. Similarly in Lisp, there is
    more than one way to do it, and many would not choose the macro option
    when implementing a memoizer.
    so I don't have a good idea of when macros would be appropriate
    *for* *Python*.
    (Looks like you've found a need for them yourself - see later on.)

    Well, frankly, in the context of Python, I find the whole discussion a
    bit abstract. Lisp macros fundamentally rely on the fact that Lisp
    programs are represented as lists, and that Lisp includes excellent
    support for manipulating such lists. The consequence of this (source
    code being a form of data) is that it is extremely easy to manipulate
    Lisp source code within Lisp. You can already achieve similar things
    in Python by typing your source code as a string, and performing
    string manipulations on it, and then calling eval ... but it is many
    orders of magnitude more painful to do it this way.

    Alternatively, get your hands on the parse-tree (I believe there's a
    module to help), and mess around with that. That should be much easier
    that playing with strings, but still much more of a pain than Lisp
    macros.
    When this topic has come up before, others mentioned how
    macros would theoretically be able to, say, modify list.sort
    to return the sorted list after it has been modified in-place.
    You don't need macros for that. With the advent of new-style classes,
    you subclass list, override the sort function, and replace
    __builtins__.list (maybe there's more to it, but as this is not
    something I ever intend to do, forgive me for not checking the
    details.)
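    [The subclassing route works without macro machinery; a minimal sketch
    (the class name is invented, and the __builtins__ replacement step is
    deliberately left out):]

```python
class ChainingList(list):
    """A list whose sort() returns the list instead of None."""
    def sort(self, *args, **kwargs):
        super().sort(*args, **kwargs)   # sort in place, as list.sort does
        return self                     # then return self for chaining
```

    Usage: `ChainingList([3, 1, 2]).sort()` yields `[1, 2, 3]` directly,
    rather than None.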
    Given the not infrequent request for the feature, I know that
    if it was allowed, then some of my clients would have done
    that, making it harder for me to know if what I'm looking at
    is core Python behaviour or modified.
    That's not the point of macros. The point is not to modify existing
    behaviour behind one's back. The point is to add new behaviour
    ... much like it is with functions, classes etc.
    What you say is true, but most of the code I look at is
    based on fundamental Python types, from which I can
    be assured of fixed behaviour, or classes and functions,
    where I can be assured that they are free to do their
    own thing. The patterns of behaviour are fixed and
    the opportunities for change well defined.

    Macros, as I understand it, blurs those lines.
    I don't think this has anything to do with macros. This is a
    consequence of a language allowing one to re-bind built-ins. I remind you
    that Python allows this, today.
    I will grant that Lisp or Scheme is the end-all and be-all of
    languages.
    :-) Well, at least that's clear :-) :-)

    [NB, I love Python, and wouldn't want to go without it, for a plethora
    of reasons.]
    Where's the language between those and Python?
    I am not sure that there is a need for one; Python and Lisp are
    already very close, as compared to other languages. But maybe some
    intermediate language would serve a good purpose.
    Is it possible to have a language which is more flexible than
    Python but which doesn't encourage the various dialectization
    historically evident in the Lisp/Scheme community?
    Hmmm. I think this "dialectization of the Lisp/Scheme community" is a
    bit like the "dialectization of the Algol Family community" or the
    "dialectization of the functonal community" or the "dialectization of
    the scripting (whatever that means) community"

    The fundamental feature of Lisps is that they represent their source
    code in a format which they themselves can manipulate easily. (Often
    people try to characterize Lisps by the 4 or 5 fundamental operators
    that you need to make all the rest, but I don't find this an
    interesting perspective.) What's wrong with there being different
    languages with this characteristic? What's wrong with there being
    different functional languages? What's wrong with there being
    different scripting (whatever that means) languages ?

    Maybe you refer to the fact that, should you wish to make a completely
    new Lisp-like language, then starting with an already existing lisp,
    and writing your first implementation in that (with the help of
    macros), is usually by far the best way of going about it.

    (This is exactly how Scheme was created, IIRC)

    But your new language is exactly that. A new language. Faithful users
    of the language in which you wrote that first implementation, will not
    suddenly find that the language they know and love has been broken.
    Could most macros also be written without macros, using
    classes or lambdas?
    Heh. You can write a lot of macros which don't need to be macros (so
    don't). But there are some things for which macros are absolutely
    necessary.

    (Actually, in Lisp you could get by with functions and "'" (the quote)
    ... but you'd still be writing macros, without official language
    support for them.)

    I guess that it boils down to delaying evaluation until you have had
    the opportunity to manipulate your source.
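    [The "delaying evaluation" point can be mimicked, clumsily, in Python
    with thunks. An ordinary function evaluates all its arguments before
    the call, so a control construct written as a function needs explicit
    wrapping; a macro would insert that wrapping at the source level, so
    callers never see the lambdas. Function names here are invented.]

```python
def if_func(cond, then_value, else_value):
    # An ordinary function: BOTH branch expressions are evaluated at the
    # call site, so if_func(True, 1, 1 / 0) raises ZeroDivisionError anyway.
    return then_value if cond else else_value

def if_lazy(cond, then_thunk, else_thunk):
    """Delay evaluation by wrapping each branch in a zero-argument callable."""
    return then_thunk() if cond else else_thunk()

print(if_lazy(True, lambda: 1, lambda: 1 / 0))  # 1; the division never happens
```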
    How often are the benefits of macros
    that much greater than classes&functions to merit their
    inclusion.

    Does it take skill to know when to use one over the other?
    Often, it does. Some are no-brainers (Control structures, for example).
    Do people use macros too often?
    Some probably do. This is probably true of classes too.
    When do they hinder misunderstanding?
    When they are badly designed. This is true of classes too.
    Are they more prone to misuse than classes&functions?
    I guess it is generally true, that the more powerful and versatile the
    tool, the more prone it is to misuse.
    The complaint about macros has been their tendency to increase a
    single person's abilities at the cost of overall loss in group
    understanding.
    No, no, NO, Noooooooooo ! :-)

    At the top of your reply, you agreed about my point that abstracting a
    frequently repeated pattern is preferable to re-implementing it all over
    your code. _This_ is what macros are about.

    One can write useless and obfuscating functions and classes, just like
    one can write useless and obfuscating macros.
    I've heard references to projects where that didn't occur, but am
    not swayed by it because those seem staffed by people with
    extraordinarily good programming skills almost never found amoung
    the chemists and biologists I work with.
    I do not advocate Lisp as a language for people who are not prepared
    to invest serious time to understanding it, which probably includes
    most people whose primary activity is not programming.

    That is why I think Python is _extremely_ useful and necessary. It
    provides a significant portion of the power of Lisp, for a small
    initial investment.

    I do not expect CERN physicists to use Lisp, I do expect them to use
    Python.
    The point I'm trying to make is that different, very smart people
    like "Lisp", but insist on variations.
    Very smart and not-so-smart people like "scripting languages", but
    insist on variations. There's Perl, Python, Ruby ...
    There is clisp
    Clisp is just one implementation of an ANSI standardized Lisp (Common Lisp).
    and elisp
    Elisp is the scripting language of Emacs.

    Now, Common Lisp is an all-purpose stand-alone language designed for
    constructing complicated systems; Elisp is designed for configuring
    and extending Emacs.

    Complaining that this is "insisting on variations of lisp", and that
    it is somehow a BAD THING, is a bit like complaining about the
    co-existence of Python and Occam, as "insisting on variations of
    languages with significant indentation".
    As I understand it, macros can be used to make one lisp variation
    act like another.
    This is true to some extent. But just because it can be done, doesn't
    mean that very many people actually ever want to do it. Yes, sometimes
    CLers want to fake up a Scheme-like continuation, and the language
    allows them to do it. Great. But that (pretending to be another
    already existing language) is not the point of macros.
    A few days ago I tested out a C++ library. It didn't work on the C++
    system I had handy because the STL implementation was
    different/template support was different. etc. etc.
    Did you really, or are you making that up for the
    sake of rhetoric?
    Sorry, I should have made clear that I made it up for the sake of
    rhetoric. However, the only thing that is untrue is the "A few days
    ago" bit. It has happened repeatedly in the past.
    If it takes more than four decades for different Lisp
    implementations to agree on how to import a module, then I think
    there's a problem.
    Again, you are confusing "different implementations" with "different
    languages".

    Implementations of ANSI Common Lisp agree on, well, anything defined
    within the standard.

    Implementations of different languages clearly do not agree. This is
    true of those in the Lisp family, just as it is for members of any
    other family of languages.
    I needed to evalute a user-defined expression where the variable
    names are computed based on calling an associated function. The
    functions may take a long time to compute and most names are not
    used in an expression, so I want to compute the names only when
    used.
    You want lazy evaluation ?
    Did I start from scratch? No! I used Python to build the parse
    tree
    In Lisp, you start off with the parse tree. That's the great thing
    about it.
    then tweaked a few nodes of that tree to change the name lookup into
    the right form, then generated the function from that parse tree.
    You've just Greenspunned Lisp macros.

    Manipulating the parse-tree is exactly what Lisp macros are about.
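    A sketch of that parse-tree tweak in today's Python, using the ast
    module (the `lookup` function and the sample expression are invented
    for illustration):

```python
import ast

class LazyNames(ast.NodeTransformer):
    """Rewrite each variable reference into lookup('<name>'), so the
    (possibly expensive) value is computed only if the expression
    actually reaches that name."""
    def visit_Name(self, node):
        if isinstance(node.ctx, ast.Load):
            return ast.Call(
                func=ast.Name(id="lookup", ctx=ast.Load()),
                args=[ast.Constant(value=node.id)],
                keywords=[],
            )
        return node

def compile_lazy(expr_src):
    tree = ast.parse(expr_src, mode="eval")      # build the parse tree
    tree = LazyNames().visit(tree)               # tweak its nodes
    ast.fix_missing_locations(tree)
    return compile(tree, "<user-expr>", "eval")  # generate the function

def lookup(name):
    # Stand-in for the slow, computed-on-demand name lookups.
    return {"a": 2, "b": 3}[name]

code = compile_lazy("a + b * 2")
print(eval(code, {"lookup": lookup}))  # prints 8
```

    The syntax stays Python's, but the name-lookup behaviour has been
    changed under it, which is exactly the kind of rewrite a Lisp macro
    performs on its argument forms.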
    The syntax was the same as Python's, but the behaviour different.
    In Lisp you would typically give a name to this behaviour, and then
    you would be able to use it alongside the original language.

    For example, if the user-defined expression is

    (+ (foo 2) (bar a b c d))

    The tweaked version would be

    (lazy-eval (+ (foo 2) (bar a b c d)))

    Writing a macro to make Lisp a uniformly lazy language (as you seem to
    have been suggesting one might proceed, way up-post), is definitely
    not the way to do it.
    Though back at the Python level, it's a "call this function to get
    the needed result",
    In Lisp it's "call this macro to get the needed result".
    and has precisely the same nature as any other Python function would
    have.
    And has precisely the same nature as any other Lisp macro would have.
    And some languages are not-at-all close to Python. Eg, I wanted
    to implement a domain-specific language called MCL
    What, "Macintosh Common Lisp" ? :-)
    using my PyDaylight package. I ended up writing a parser for MCL
    and converting the result into Python code, then exec'ing the Python
    code.
    In CL this is done with reader macros; a means of altering the way the
    parse tree is constructed from the source data.
    Similarly, SMILES is a language for describing molecules.
    Why in the world would I want to extend Lisp/Perl/Python/
    whatever to support that language directly?
    a) Because it's easier to build it on top of Lisp than from scratch.

    b) Because you never know what future requirements you might have.
    It's an extreme form of Greenspunning.
    ??? Apparently not Alan Greenspan.
    Greenspun's Tenth Rule of Programming:

    "Any sufficiently complicated C or Fortran program contains an
    ad hoc, informally-specified, bug-ridden, slow implementation of
    half of Common Lisp."

    (Of course, it's a general statement about developing in low-level
    languages as compared to developing in higher-level ones.)
    Please tell me how you would implement this language

    load "abc.pdb" into a
    select "resname LYS" from a into b
    save b as "lysine.pdb"

    as a macro in Lisp. I'll assume 'load-pdb' loads a PDB file into
    a object which holds a set of atoms, and that object has the
    method 'select-atoms' which creates a new (sub)set and also
    has the method 'save-as' for saving those atoms in the right format.

    And the above is all that the user can type.

    How in the world do macros make handling that language
    any easier than the standard parser-based non-macro solution?
    I don't think I understand the true meaning of your question.


    Anyway, your parse tree example shows that you DO understand and use macros
    ... you just don't know that that's the name of what you are doing :-)
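    For comparison, a minimal parser-based (non-macro) Python solution to
    Andrew's three-statement language might look like the following. The
    atom data and the object model are stubs invented so the sketch runs:

```python
import shlex

# Stub object model matching Andrew's description: load-pdb loads a set
# of atoms, select-atoms filters it, save-as writes it out.  The atom
# strings and the 'saved' dict are made up purely for demonstration.
saved = {}

class AtomSet:
    def __init__(self, atoms):
        self.atoms = atoms
    def select_atoms(self, query):
        return AtomSet([a for a in self.atoms if query in a])
    def save_as(self, filename):
        saved[filename] = self.atoms

def load_pdb(filename):
    return AtomSet(["LYS 1", "GLY 2", "LYS 3"])  # fake file contents

def run(script):
    env = {}
    for line in script.splitlines():
        words = shlex.split(line)   # honours the quoted strings
        if not words:
            continue
        if words[0] == "load":      # load "abc.pdb" into a
            env[words[3]] = load_pdb(words[1])
        elif words[0] == "select":  # select "expr" from a into b
            env[words[5]] = env[words[3]].select_atoms(words[1])
        elif words[0] == "save":    # save b as "out.pdb"
            env[words[1]].save_as(words[3])
        else:
            raise SyntaxError("unknown statement: " + line)

run('''load "abc.pdb" into a
select "LYS" from a into b
save b as "lysine.pdb"''')
print(saved["lysine.pdb"])  # prints ['LYS 1', 'LYS 3']
```

    Since the users can only ever type these three statements, a
    hand-rolled interpreter like this is arguably all the machinery the
    problem calls for.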
  • Alex Martelli at Aug 20, 2003 at 9:15 pm
    Doug Tolton wrote:
    ...
    Linux is based on this concept of allowing people to extend the
    system, it doesn't seem to have suffered from it.
    Linus Torvalds sits at the center and rejects a LOT of proposed
    modifications to the kernel -- everybody's free to distribute such
    patches separately, but they're NOT part of Linux. You can do
    exactly the same with Python: send patches to Guido, have him
    reject them, distribute them separately with no claim they're part
    of Python (e.g., Stackless is in exactly this situation).

    Powerful macros in Python would *BYPASS* the crucial filtering
    role of the chief architect -- Linus or Guido in these cases. And
    here's where the "attractive nuisance" side of powerful macros,
    the allure that would make oodles of youngsters flock to them,
    amplifies their danger: as it's much easier -- and satisfying to
    many -- to "play the amateur language designer" by coding macros,
    rather than (e.g.) to write device drivers for Linux, the flood
    would quite likely be huge.

    Same argument as above, I don't agree with this logic. Python is a
    great language, that doesn't mean it couldn't be better though. If
    that were the case, development would cease.
    But WHO will make it better: Guido, whose skill as a language
    designer is proven, or a hundred authors of sets of macros? It
    is just too easy to "play language designer" -- far more people
    will do it than are actually GOOD at language design.

    Why do we allow people to write functions even, I mean you have to
    Because "once and only once" is the mantra of programming: in a
    language that lacks user-written functions or procedures, application
    programmers have to copy-and-paste code, a horrid way to program.
    learn the syntax for calling them, what the variables are and what
    they do. Bah, we should make everyone use only built in functions, if
    they want a different one, use a different language. What? It makes
    no sense to me.
    It doesn't, because the number of different SYNTAX FORMS needed for
    powerful expression is incredibly tiny (so that even Python, which is
    a small language, can easily afford redundance there!) while the
    number of different COMPUTATIONS (functions and procedures) needed for
    even a small application exceeds the numbers that can be reasonably
    provided as built-ins. In other words, it makes no sense because you
    are comparing, not even apples and oranges, but rather cantaloupes and
    termites -- completely different things.

    So don't allow people to customize the system huh? They why is Python
    Open Source? That's the *entire* point of Open Source, so that people
    can tweak and customize to their own environment. Do you have any
    People can "tweak and customize" "their own environment" in Windows,
    too (ever seen TweakUI and friends?!), so if your point was well taken
    open-source would not exist. Since it does, it proves your point is
    deeply mistaken.

    Designing a good language is all about designing the right high level
    abstractions. Even a medium skilled designer should be able to design
    a language that maps better to their specific domain than a general
    I entirely, utterly, totally and completely disagree with this point.

    This is like saying that even a medium skilled musician should be
    able to write music that plays better to their specific audience than
    great music written by a genius who's never personally met any of
    the people in the audience: it's just completely false. I want to
    use a language designed by a genius, and I want to listen to music
    written by Bach, Haendel, Mozart, and the like.

    Moreover, judging by the way BY FAR most languages around are in
    fact designed, it's abundantly clear that "medium skilled language
    designers" are a VERY scarce breed indeed. And yet with powerful
    macros everybody and their cousin WILL play the amateur language
    designer. No thanks. If you want Dylan, Common Lisp, or Scheme,
    you know where to find them. Please leave *ONE* language alone,
    with the integrity and conceptual beauty AND usefulness that can
    only come from having *ONE* designer -- a genius-level one --
    firmly at the helm.

    abuse something. That paradigm is filled, there are many languages
    that restrict programmers because they might misuse a feature, or they
    are just too dumb to get it right. I say fine, leave the languages
    like Java / C# / VB to those people, but let's make Python a language
    that allows people the room to do it the way it needs to be done, not
    so much the way Guido or whoever thinks it should be done.
    Let's leave Python as *ONE* language, WITHIN which everything does
    work as you say -- not a *MYRIAD* subtly incompatible languages, each
    partly designed by a different guy, mostly mediocre at language
    design. Just as many languages are overly restrictive, so many
    others are overly permissive (see the above mentioned examples)
    thanks to powerful macro systems. PLEASE leave Python alone at the
    SWEET SPOT, at JUST THE RIGHT COMPROMISE -- neither too permissive
    nor too restrictive. GvR's genius (and/or luck) made it that way;
    don't branch the language into a zillion mediocre ones.


    Alex
  • Cliff Wells at Aug 20, 2003 at 10:57 pm

    On Wed, 2003-08-20 at 14:15, Alex Martelli wrote:

    Designing a good language is all about designing the right high level
    abstractions. Even a medium skilled designer should be able to design
    a language that maps better to their specific domain than a general
    I entirely, utterly, totally and completely disagree with this point.

    This is like saying that even a medium skilled musician should be
    able to write music that plays better to their specific audience than
    great music written by a genius who's never personally met any of
    the people in the audience: it's just completely false. I want to
    use a language designed by a genius, and I want to listen to music
    written by Bach, Haendel, Mozart, and the like.
    I was with you until this point. Your preference for music written by
    geniuses only says that you are part of that audience for that type of
    music; it says nothing about whether Mozart would play well to the
    audience in a mosh pit. I don't think it invalidates your point about
    programming languages, it's just a bad (in fact, incorrect) example.

    Regards,

    --
    Cliff Wells, Software Engineer
    Logiplex Corporation (www.logiplex.net)
    (503) 978-6726 (800) 735-0555
  • Olivier Drolet at Aug 21, 2003 at 5:08 am
    Alex Martelli <aleax at aleax.it> wrote in message news:<ExR0b.21526$zN5.666078 at news1.tin.it>...
    Doug Tolton wrote:
    ...
    Linux is based on...
    (...)
    ... a zillion mediocre ones.


    Alex

    Macros, as found in Common Lisp, do not change the underlying language
    at all! Common Lisp macros, when run, always expand into 100% ANSI
    Common Lisp code! Using macros to become more productive is no
    different from using function abstractions or class hierarchies to
    become more productive. They all require that you, the programmer,
    become familiar with them. Macros don't cause Common Lisp to fork
    any more than function or class abstractions do. They only alter the
    readability of "program code" (usually for the better), just like
    function or class abstractions do.

    Saying that all hell will break loose in the Python community seems
    rather unfounded and a bit knee-jerk. None of what you claim would
    eventually happen within Python circles is currently happening within
    the Common Lisp community. After years of macro use, ANSI Common Lisp
    is still the same. Macros don't bypass ANSI committees any more than
    they would the Guidos of this world. On the contrary, they preclude
    the need to bypass them in the first place, and all parties end up
    getting what they need: on the one hand, a static base language, and
    on the other, much greater expressiveness.

    Speaking of expressiveness, someone asked on comp.lang.lisp.fr what
    macros were good for, concretely, and what quantitative difference
    they made in commercial applications (cf. "Macros: est-ce utile ?
    (attn Marc)"). The responses (in French) were quite enlightening. It
    boils down to using multiple macros, in multiple instances, thus
    allowing total code size (of otherwise pure CL code) to be reduced by VERY
    significant margins. You can think of it as reuse (as per OOP) or as
    code compression.

    Macros do not have to be used all the time or at all. There are times
    when a macro should not be used, e.g. when a function would do just
    fine. But they are very powerful. As Paul Graham put it, macros allow
    you to program up towards the problem at hand, as opposed to adapting
    the problem to fit the language specification. They allow greater
    expressiveness, when you need it. They allow you to "use" many lines
    of code you no longer have to write. And the lines of code you don't
    have to write are also the lines of code you don't have to debug (as
    it were).

    Cheers.
  • Andrew Dalke at Aug 21, 2003 at 7:13 am

    Olivier Drolet:
    Macros, as found in Common Lisp, do not change the underlying language
    at all! Common Lisp macros, when run, always expand into 100% ANSI
    Common Lisp code!
    I've created a new language, called Speech. It's based on the core
    primitives found in the International Phonetic Alphabet. I've made some
    demos using Speech. One is English and another is Xhosa. This just
    goes to show how powerful Speech is because it can handle so many
    domains. And it's extensible! Anything you say can be expressed in
    Speech!
    Using macros to become more productive is no
    different from using function abstractions or class hierarchies to
    become more productive.
    One of the most incredible things about Speech is that you can
    create domain-specific words. Verbing doesn't weird nouns -
    it's perfectly cromulent and embiggens the soul!
    They all require that you, the programmer, become familiar with them.
    Someone else's new words simply require that you, the programmer,
    become familiar with them. You can even get dictionaries which
    explain how each word is used in (most of) the different contexts you
    might find it in. In many cases you can even infer the meaning, because
    of built-in redundancy.

    Macros don't cause Common Lisp to fork
    any more than function or class abstractions do.
    Making new words doesn't cause Speech to fork any more than
    making new sentences does.
    They only alter the
    readability of "program code" (usually for the better), just like
    function or class abstractions do.
    They only alter the comprehension of "talking" (usually for the
    better).
    Saying that all hell will break loose in the Python community seems
    rather unfounded and a bit knee-jerk
    Saying that different people will decide to use only a part of Speech,
    and not understand all the variations when needed seems rather
    unfounded and a bit knee-jerk.

    ...
    After years of macro use, ANSI Common Lisp
    is still the same. Macros don't bypass ANSI committees any more than
    they would the Guidos of this world.
    After years of Speech use, ANSI Common Speech is still
    the same. New Speech doesn't bypass ANSI committees any more
    than they would the Shakespeares and Chief Josephs of the world.
    On the contrary, they preclude
    the need to bypass them in the first place, and all parties end up
    getting what they need: on the one hand, a static base language, and
    on the other, much greater expressiveness.
    On the contrary, Speech definitions preclude the need to make
    new languages, and all parties end up getting what they need:
    on the one hand, a static core set of syllables, and on the other,
    much greater expressiveness.
    Speaking of expressiveness, someone asked on comp.lang.lisp.fr what
    macros were good for, concretely, and what quantitative difference
    they made in commercial applications (cf. "Macros: est-ce utile ?
    (attn Marc)"). The responses (in French) were quite enlightening. It
    boils down to using multiple macros, in multiple instances, thus
    allowing total code size (of otherwise pure CL code) to be reduced by VERY
    significant margins. You can think of it as reuse (as per OOP) or as
    code compression.
    Speaking of expressiveness, someone in France suggested that
    "courriel" was a better fit for their local Dialect of Speech, instead of
    the word "e-mail". That's just one perfect example of how different
    groups can customize Speech for internal consistency in a given
    domain. This allows users to keep a consistent set of pronunciation
    rules and reduce the dictionary size.
    Macros do not have to be used all the time or at all. There are times
    when a macro should not be used, e.g. when a function would do just
    fine.
    Creating new words of Speech does not have to be done all the
    time or at all. There are times when an existing word or a sentence
    (a set of existing words) would do just fine.
    But they are very powerful.
    But making new words is very powerful.
    As Paul Graham put it, macros allow
    you to program up towards the problem at hand, as opposed to adapting
    the problem to fit the language specification. They allow greater
    expressiveness, when you need it. They allow you to "use" many lines
    of code you no longer have to write. And the lines of code you don't
    have to write are also the lines of code you don't have to debug (as
    it were).
    As Some Dude put it, neologisms allow you to Talk about the problem
    at hand, as opposed to adapting the problem to fit the words you
    know. They allow greater expressiveness and precision, when you need
    it. They allow you to "use" many sentences you no longer have to say.
    And the words you don't say are the ones you don't need to pronounce
    (as it were).


    In short, no one is denying that the ability to create new macros is
    a powerful tool, just like no one denies that creating new words is
    a powerful tool. But both require extra training and thought for
    proper use, and while they are easy to write, it puts more effort
    for others to understand you. If I stick to Python/English then
    more people can understand me than if I mixed in a bit of Erlang/
    Danish, *even* *if* the latter makes a more precise description
    of the solution.

    By this analogy, Guido is someone who can come up with words
    that a lot of people find useful, while I am someone who can come
    up withs words appropriate to my specialization, while most
    people come up with words which are never used by anything
    other than close friend. Like, totally tubular dude.

    And since I know you want a demo of Speech, here's an
    example of Xhosa and it's direct translation into English,
    which doesn't have a single word for "going for a purpose"

    Ndiya kulala. -- I am going for the purpose of sleeping.

    And here's an example of Swedish with a translation into
    English, which lacks some of the genealogical terms

    min mormor -- my maternal grandmother

    I can combine those and say

    Umormor uya kulala -- my maternal grandmother is going
    for the purpose of sleeping.

    See how much more precise that is because I can select
    words from different Dialects of Speech?

    Andrew
    dalke at dalkescientific.com
  • Borcis at Aug 21, 2003 at 8:48 am

    Andrew Dalke wrote:
    Olivier Drolet:
    Macros, as found in Common Lisp, do not change the underlying language
    at all! Common Lisp macros, when run, always expand into 100% ANSI
    Common Lisp code!

    I've created a new language, called Speech. It's based on the core
    primitives found in the International Phonetic Alphabet. I've made some
    demos using Speech. One is English and another is Xhosa. This just
    goes to show how powerful Speech is because it can handle so many
    domains. And it's extensible! Anything you say can be expressed in
    Speech!
    I believe you unwittingly locate an issue. Machine translation of human
    languages has been an inescapable project for computer scientists, a challenge
    that has consistently proven harder to achieve than expected. Idiomatic
    machine translation of *programming* languages, in comparison, looks like a
    toy problem, an appetizer. But all the endless debates in the p.l. newsgroups
    certainly show one thing: we don't expect idiomatic translation between
    computer languages to solve our problems. While it clearly could.

    I believe our reasons for not doing it boil down to: (a) the issue of
    *conserving* programming-language *biodiversity* not having gained
    attention as the key issue it is, and (b) a lack of imagination by
    programmers too engrossed with pet-language advocacy.

    What I mean is that the metaphor you use puts the joke on you (or us). You
    should really distinguish between the case of translating between *existing*
    "sibling" languages (be they human languages or programming languages) and
    on the other hand the case of translating between a newly-bred variant of a language and a
    parent language.

    Isn't it the case that most objections to macros fail to persist if we
    set the purpose of our "macro" facility to that of *grafting one
    language's surface syntax and basic control structures onto another
    language's objects and libraries*? Well enough, to set a positive
    criterion, that (say) we could edit what will really run as code in
    the second language using (basically) the first language's syntax
    mode under Emacs?
  • Anton Vredegoor at Aug 21, 2003 at 12:28 pm
    "Andrew Dalke" wrote:

    [defines cognitive macro called Speech]
    I've created a new language, called Speech. It's based on the core
    primitives found in the International Phonetic Alphabet. I've made some
    demos using Speech. One is English and another is Xhosa. This just
    goes to show how powerful Speech is because it can handle so many
    domains. And it's extensible! Anything you say can be expressed in
    Speech!
    [snip lots of examples using it, trying to make macros look bad?]
    In short, no one is denying that the ability to create new macros is
    a powerful tool, just like no one denies that creating new words is
    a powerful tool. But both require extra training and thought for
    proper use, and while they are easy to write, it puts more effort
    for others to understand you. If I stick to Python/English then
    more people can understand me than if I mixed in a bit of Erlang/
    Danish, *even* *if* the latter makes a more precise description
    of the solution.
    You have found a wonderful analogy, however you seem to assume that
    your prejudices are so self-explanatory that the conclusion that
    macros are bad is natural.

    I am not a native English speaker, and so my expressiveness in this
    language is severely handicapped, while I consider myself a person
    with good linguistic abilities.

    Obviously there are people from other linguistic backgrounds
    participating in discussions in this newsgroup who have the same
    problems. Maybe they are greater speakers than me and yet have still
    more problems using English.

    However this does not in the least cause this newsgroup to deviate
    substantially (or maybe it does but as a non-native speaker I can not
    discern the difference) from English. Rather we all strive to speak
    the same language in order to make as many people as possible
    understand what we are saying.

    While using Python as a programming language we strive for Pythonicity
    and for combining elegance with concisiveness and readability. We are
    using docstrings to comment our code and answer questions about it in
    the newsgroup. Helpful people debug our code and assist in formulating
    our algorithms in Python.

    IMO there is a strong tendency towards unification and standardization
    among the readers of this newsgroup and the need to conform and the
    rewards this brings are well understood.

    Seeing all this it would be a missed chance not to give the community
    the freedom of redefining the language to its advantage.

    Of course there are risks that the community would dissolve in
    mutually incompatible factions and it would be wise to slow down the
    process according to the amount of responsibility the group can be
    trusted with.

    The rewards would be incomparably great however, even to the amount
    that I would be ready to sacrifice Python only to give this thing a
    tiny chance. Suppose you could make a bet for a dollar with an
    expected reward of a thousand dollars? Statistically it doesn't matter
    whether you get a .999 chance of getting a thousand dollars or a
    .00999 chance of getting a million dollars.

    Therefore, the only thing pertinent to this question seems to be the
    risk and gain assessments.
    By this analogy, Guido is someone who can come up with words
    that a lot of people find useful, while I am someone who can come
    up withs words appropriate to my specialization, while most
    people come up with words which are never used by anything
    other than close friend. Like, totally tubular dude.
    Another relevant meme that is running around in this newsgroup is the
    assumption that some people are naturally smarter than other people.
    While I can certainly see the advantage for certain people for keeping
    this illusion going (it's a great way to make money, the market
    doesn't pay for what it gets but for what it thinks it gets) there is
    not a lot of credibility in this argument.

    The "hardware" that peoples minds are running on doesn't come in
    enough varieties to warrant such assumptions. For sure, computer
    equipment can vary a lot, but we as people all have more or less the
    same brain.

    Of course there is a lot of variation between people in the way they
    are educated and some of them have come to be experts at certain
    fields. However no one is an island and one persons thinking process
    is interconnected with a lot of other persons thinking processes. The
    idea that some kind of "genius" is solely responsible for all this
    progress is absurd, and a shameful deviation into the kind of
    "leadership" philosophies whose atrocities have caused many wars.

    To come back to linguistic issues, there's a lot of variation in the
    way people use their brain to solve linguistic problems. There are
    those that first read all the prescriptions before uttering a word and
    there are those that first leap and then look. It's fascinating to see
    "look before you leap" being deprecated in favor of "easier to ask
    forgiveness than permission" by the same people who would think twice
    before starting to program without being sure they know all the syntax.

    In my case for example studying old latin during high school there was
    a guy sitting next to me who always knew the different conjugations
    the latin words were in, and as a result he managed to get high grades
    with exact but uninteresting translations. My way of translating latin
    was a bit different, instead of translating word for word and looking
    up each form of each separate word (is it a genitivus, ablativus
    absolutus, imperativus, etcetera) I just read each word and going from
    the approximate meaning of all words put in a sequence of sentences
    I ended up with a translation that was seventy percent correct and
    that had a lot of internal consistency and elegance. It was usually
    enough to get a high enough grade and also some appraisal: "si non e
    vero, e ben trovato" or something like that.

    What this all should lead to I am not really sure, but I *am* sure
    that breaking out of formal mathematical and linguistic and
    programmatic rules is the only way to come to designs that have great
    internal consistency and that can accommodate new data and
    procedures.

    It is sometimes impossible for a language designer to exactly pinpoint
    the reasons for a certain decision, while at the same time being sure
    that it is the right one.

    The ability to maintain internal consistency and the tendency of other
    people to fill in the gaps so that the final product seems coherent is
    IMO the main reason for this strange time-travel-like ability of
    making the right decisions even before all the facts are available.

    Well, maybe I have made the same mistake as you by providing arguments
    to the contrary of my intention of advocating the emancipation of the
    average Python user to the level of language designer.

    However if I have done so, rest assured that my intuition "knows" from
    before knowing all the facts that this is the way to go, and the
    rewards are infinitely more appealing than the risks of breaking up
    the Python community are threatening.

    One way or the other this is the way programming will be in the
    future, and the only question is: Will Python -and the Python
    community- be up to the task of freeing the programmer's expressiveness
    and at the same time provide a home and starting point to come back
    to, or will it be left behind as so many other valiant effort's fate
    has been?

    Anton
  • Borcis at Aug 21, 2003 at 1:28 pm

    Anton Vredegoor wrote:
    The ability to maintain internal consistency and the tendency of other
    people to fill in the gaps so that the final product seems coherent is
    IMO the main reason for this strange time-travel-like ability of
    making the right decisions even before all the facts are available.
    Wow :)
  • Alex Martelli at Aug 21, 2003 at 4:15 pm
    Anton Vredegoor wrote:
    ...
    tiny chance. Suppose you could make a bet for a dollar with an
    expected reward of a thousand dollars? Statistically it doesn't matter
    whether you get a .999 chance of getting a thousand dollars or a
    .00999 chance of getting a million dollars.
    This assertion is false and absurd. "Statistically", of course,
    expected-value is NOT the ONLY thing about any experiment. And
    obviously the utility of different sums need not be linear -- it
    depends on the individual's target-function, typically influenced
    by other non-random sources of income or wealth.

    Case 1: with whatever sum you win you must buy food &c for a
    month; if you have no money you die. The "million dollars chance"
    sees you dead 99.9901 times out of 100, which to most individuals
    means huge negative utility; the "thousand dollars chance" gives
    you a 99.9% chance of surviving. Rational individuals in this
    situation would always choose the 1000-dollars chance unless the
    utility to them of the unlikely million was incredibly huge (which
    generally means there is some goal enormously dear to their heart
    which they could only possibly achieve with that million).

    Case 2: the sum you win is in addition to your steady income of
    100,000 $/month. Then, it may well be that $1000 is peanuts of
    no discernible use to you, while a cool million would let you
    take 6 months' vacation with no lifestyle reduction and thus has
    good utility to you. In this case a rational individual would
    prefer the million-dollars chance.

    Therefore, the only thing pertinent to this question seems to be the
    risk and gain assessments.
    Your use of 'therefore' is inappropriate because it suggests the
    following assertion (which _is_ mathematically speaking correct)
    "follows" from the previous paragraph (which is bunkum). The
    set of (probability, outcome) pairs DOES mathematically form "the
    only thing pertinent" to a choice (together with a utility function
    of course -- but you can finesse that by expressing outcome as
    utility directly) -- the absurdity that multiplying probability
    times outcome (giving an "expected value") is the ONLY relevant
    consideration is not necessary to establish that.

    Another relevant meme that is running around in this newsgroup is the
    assumption that some people are naturally smarter than other people.
    While I can certainly see the advantage for certain people for keeping
    this illusion going (it's a great way to make money, the market
    doesn't pay for what it gets but for what it thinks it gets) there is
    not a lot of credibility in this argument.
    *FOR A GIVEN TASK* there can be little doubt that different people
    do show hugely different levels of ability. Mozart could write
    far better music than I ever could -- I can write Python programs
    far better than Silvio Berlusconi can. That does not translate into
    "naturally smarter" because the "given tasks" are innumerable and
    there's no way to measure them all into a single number: it's quite
    possible that I'm far more effective than Mozart at the important
    task of making and keeping true friends, and/or that Mr Berlusconi
    is far more effective than me at the important tasks of embezzling
    huge sums of money and avoiding going to jail in consequence (and
    THAT is a great way to make money, if you have no scruples).

    Note that for this purpose it does not matter whether the difference
    in effectiveness at given tasks comes from nature or nurture, for
    example -- just that it exists and that it's huge, and of that, only
    a madman could doubt. If you have the choice whom to get music
    from, whom to get Python programs from, whom to get as an accomplice
    in a multi-billion scam, you should consider the potential candidates'
    proven effectiveness at these widely different tasks.

    In particular, effectiveness at design of programming languages can
    be easily shown to vary all over the place by examining the results.

    Of course there is a lot of variation between people in the way they
    are educated and some of them have come to be experts at certain
    fields. However no one is an island, and one person's thinking process
    is interconnected with a lot of other persons' thinking processes. The
    Of course Mozart would have been a different person -- writing
    different kinds of music, or perhaps doing some other job, maybe
    mediocrely -- had he not been born when and where he was, the son
    of a music teacher and semi-competent musician, and so on. And
    yet huge numbers of other people were born in perfectly similar
    circumstances... but only one of them wrote *HIS* "Requiem"...

    there are those that first leap and then look. It's fascinating to see
    "look before you leap" being deprecated in favor of "easier to ask
    forgiveness than permission" by the same people that would think twice
    to start programming before being sure to know all the syntax.
    Since I'm the person who intensely used those two monickers to
    describe different kinds of error-handling strategies, let me note
    that they're NOT intended to generalize. When I court a girl I
    make EXTREMELY sure that she's interested in my advances before I
    push those advances beyond certain thresholds -- in other words in
    such contexts I *DEFINITELY* "look before I leap" rather than choosing
    to make inappropriate and unwelcome advances and then have to "ask
    forgiveness" if/when rebuffed (and I despise the men who chose the
    latter strategy -- a prime cause of "date rape", IMHO).

    And there's nothing "fascinating" in this contrast. The amount of
    damage you can inflict by putting your hands or mouth where they
    SHOULDN'T be just doesn't compare to the (zero) amount of "damage"
    which is produced by e.g. an attempted access to x.y raising an
    AttributeError which you catch with a try/except.
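
    [The contrast Alex draws between the two error-handling strategies can
    be made concrete in a few lines of Python. `Point` below is a made-up
    class for illustration, not anything from the thread:]

    ```python
    class Point:
        """Minimal illustrative class: it has an `x` attribute but no `y`."""
        def __init__(self):
            self.x = 1

    p = Point()

    # LBYL -- "look before you leap": check first, then act.
    y_lbyl = p.y if hasattr(p, "y") else 0

    # EAFP -- "easier to ask forgiveness than permission": act, catch the failure.
    try:
        y_eafp = p.y
    except AttributeError:
        # The failed attribute access did no damage at all; we simply handle it.
        y_eafp = 0

    print(y_lbyl, y_eafp)
    ```

    [Both styles arrive at the same value; the attempted access that fails
    and is caught costs nothing, which is exactly Alex's point.]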



    Alex
  • Anton Vredegoor at Aug 21, 2003 at 9:06 pm

    Alex Martelli wrote:
    Anton Vredegoor wrote:
    ...
    tiny chance. Suppose you could make a bet for a dollar with an
    expected reward of a thousand dollars? Statistically it doesn't matter
    whether you get a .999 chance of getting a thousand dollars or a
    .00999 chance of getting a million dollars.
    This assertion is false and absurd. "Statistically", of course,
    expected-value is NOT the ONLY thing about any experiment. And
    obviously the utility of different sums need not be linear -- it
    depends on the individual's target-function, typically influenced
    by other non-random sources of income or wealth.
    Non linear evaluation functions? Other random sources? Seems you're
    trying to trick me. I did write statistically, which implies a large
    number of observations. Of course people seldom get to experiment with
    those kinds of money, but a simple experiment in Python using a random
    number generator should suffice to prove the concept.
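
    [The experiment Anton alludes to might be sketched like this. This is
    an illustrative sketch, not code from the thread; the probabilities
    are chosen so both bets share the same expected value of about $999
    (0.999 × $1,000 ≈ 0.000999 × $1,000,000 -- the ".00999" quoted
    upthread is presumably a slip, since it gives a tenfold larger
    expectation):]

    ```python
    import random

    def simulate(p, payoff, trials, rng):
        """Empirical mean winnings of a lottery paying `payoff` with probability `p`."""
        wins = sum(1 for _ in range(trials) if rng.random() < p)
        return wins * payoff / trials

    rng = random.Random(42)  # fixed seed so the run is repeatable
    trials = 1_000_000

    # Two bets with (roughly) the same expected value of ~$999 per play:
    safe = simulate(0.999, 1_000, trials, rng)          # near-certain $1,000
    risky = simulate(0.000999, 1_000_000, trials, rng)  # long-shot $1,000,000

    print(f"safe  bet mean: ${safe:.0f}")
    print(f"risky bet mean: ${risky:.0f}")
    ```

    [Over enough trials the two means converge, which is Anton's point;
    Alex's objection is that the variance (and hence the utility) of the
    two bets differs enormously, which the run-to-run spread of `risky`
    makes visible if you vary the seed.]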

    [snip tricky example cases]
    Another relevant meme that is running around in this newsgroup is the
    assumption that some people are naturally smarter than other people.
    While I can certainly see the advantage for certain people for keeping
    this illusion going (it's a great way to make money, the market
    doesn't pay for what it gets but for what it thinks it gets) there is
    not a lot of credibility in this argument.
    *FOR A GIVEN TASK* there can be little doubt that different people
    do show hugely different levels of ability. Mozart could write
    far better music than I ever could -- I can write Python programs
    far better than Silvio Berlusconi can. That does not translate into
    "naturally smarter" because the "given tasks" are innumerable and
    there's no way to measure them all into a single number: it's quite
    possible that I'm far more effective than Mozart at the important
    task of making and keeping true friends, and/or that Mr Berlusconi
    is far more effective than me at the important tasks of embezzling
    huge sums of money and avoiding going to jail in consequence (and
    THAT is a great way to make money, if you have no scruples).
    And you're eliminating Mr. Berlusconi's friends from the equation?
    Seems like trick play again to me. Why are there so few famous
    classical female philosophers or musicians? Surely you're not going to
    tell me that's just because only the very gifted succeed in becoming
    famous?
    there are those that first leap and then look. It's fascinating to see
    "look before you leap" being deprecated in favor of "easier to ask
    forgiveness than permission" by the same people that would think twice
    to start programming before being sure to know all the syntax.
    Since I'm the person who intensely used those two monickers to
    describe different kinds of error-handling strategies, let me note
    that they're NOT intended to generalize. When I court a girl I
    make EXTREMELY sure that she's interested in my advances before I
    push those advances beyond certain thresholds -- in other words in
    such contexts I *DEFINITELY* "look before I leap" rather than choosing
    to make inappropriate and unwelcome advances and then have to "ask
    forgiveness" if/when rebuffed (and I despise the men who chose the
    latter strategy -- a prime cause of "date rape", IMHO).

    And there's nothing "fascinating" in this contrast. The amount of
    damage you can inflict by putting your hands or mouth where they
    SHOULDN'T be just doesn't compare to the (zero) amount of "damage"
    which is produced by e.g. an attempted access to x.y raising an
    AttributeError which you catch with a try/except.
    Somehow one has to establish that a certain protocol is supported.
    However trying to establish a protocol implies supporting the protocol
    oneself. Perhaps not initiating protocols that one doesn't want to see
    supported is the best way to go here.

    Anton
  • Kenny Tilton at Aug 21, 2003 at 4:25 pm

    Anton Vredegoor wrote:
    IMO there is a strong tendency towards unification and standardization
    among the readers of this newsgroup and the need to conform and the
    rewards this brings are well understood.
    Your comment reminds me of a brouhaha over the legendary IF* macro
    (search comp.lang.lisp via Google). A fellow cooked up (IF* [THEN] ...
    ELSE or ELSE-IF ... END-IF), and then used it in a big package his
    employer released (so you had to go find the macro!). He took a little
    heat for that, the gist being, "if you want to use Basic, use Basic."

    Unrelated to macros, on Google you'll also see yours truly getting
    eviscerated for using camelCase. "Dude, we use hyphens".

    So, yeah, yer technically opening up the floodgates, but the social
    pressure is pretty effective at keeping Lisp Lispy and would be at
    keeping Python...Pythonic?


    --

    kenny tilton
    clinisys, inc
    http://www.tilton-technology.com/
    ---------------------------------------------------------------
    "Career highlights? I had two. I got an intentional walk from
    Sandy Koufax and I got out of a rundown against the Mets."
    -- Bob Uecker
  • Jacek Generowicz at Aug 21, 2003 at 1:22 pm

    "Andrew Dalke" <adalke at mindspring.com> writes:

    Ndiya kulala. -- I am going for the purpose of sleeping.

    And here's an example of Swedish with a translation into
    English, which lacks some of the genealogical terms

    min mormor -- my maternal grandmother

    I can combine those and say

    Umormor uya kulala -- my maternal grandmother is going
    for the purpose of sleeping.

    See how much more precise that is because I can select
    words from different Dialects of Speech?
    You are absolutely right. "Umormor uya kulala" is less readable than
    "My maternal grandmother is going for the purpose of sleeping", to
    someone who is familiar with English, but unfamiliar with Xhosa and
    Swedish.

    Now explain the Mor/Far concept and the "going for a purpose"
    concept to said English speaker, and present him with text in which
    combinations of the concepts are used repeatedly.

    _Now_ ask yourself which is more readable.

    For this reason it is rarely a good idea to define a macro for a
    single use. However, it becomes an excellent idea if the idea the
    macro expresses must be expressed repeatedly. The same is true of
    functions, classes, modules ...
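
    [Jacek's rule of thumb -- abstract only what recurs -- applies to plain
    Python functions just as it does to Lisp macros. A hypothetical sketch,
    with a made-up `clamp` helper:]

    ```python
    # Used once, the inline expression is clearer than any helper would be:
    value = max(0, min(255, 300))  # clamp a single number to the 0..255 range

    # Expressed repeatedly, naming the idea pays off:
    def clamp(n, lo, hi):
        """Restrict n to the closed interval [lo, hi]."""
        return max(lo, min(hi, n))

    channels = [-20, 128, 300]
    clamped = [clamp(c, 0, 255) for c in channels]
    print(clamped)  # [0, 128, 255]
    ```

    [A reader must learn `clamp` once, just as the English speaker must
    learn "mormor" once -- after that, each repeated use is shorter and
    more precise than spelling the idea out.]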

Discussion Overview
group: python-list
categories: python
posted: Aug 18, '03 at 1:58a
active: Aug 28, '03 at 12:20a
posts: 276
users: 75
website: python.org
