FAQ
I'm trying to write a class factory to create new classes dynamically at runtime from simple 'definition' files that happen to be written in Python as well. I'm using a class factory since I couldn't find a way to use properties with dynamically generated instances, for example:

I would prefer this, but it doesn't work:

class Status(object):
    pass

def makeStatus(object):
    def __init__(self,definitions):
        for key,function in definitions:
            setattr(self,key,property(function))
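(For reference: this fails because a property is a descriptor, and the descriptor protocol only fires when the property is found on the class; storing one on an instance just stores the property object itself. A minimal sketch, with made-up names:)

```python
class C(object):
    pass

c = C()
c.x = property(lambda self: 42)  # goes into the instance dict
print(c.x)                       # the property object itself, not 42

class D(object):
    pass

D.x = property(lambda self: 42)  # goes onto the class
print(D().x)                     # -> 42, the descriptor protocol fires
```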

this works (and it's fine by me):

def makeStatus(definitions):
    class Status(object):
        pass
    for key,function in definitions:
        setattr(Status,key,property(function))
    return Status()

but I would also like the functions to only be evaluated when necessary since some may be costly, so I want to do the following:

def makeStatus(definitions):
    class Status(object):
        pass
    for key,function,data in definitions:
        setattr(Status,key,property(lambda x: function(data)))
    return Status()

but all my properties now act as if they were invoked with the same data even though each one should have been a new lambda function with its own associated data. It seems Python is 'optimizing' all the lambdas to the same object even though that's clearly not what I want to do. Anyone have any suggestions as to:

1) why
2) what I should do
3) a better way in which to implement this pattern
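(The effect can be reproduced without properties or factories at all; every closure created in a loop refers to the loop variable itself, not to its value at creation time. A minimal sketch:)

```python
# all three lambdas close over the same variable i, so they all
# see its final value once the loop has finished
fns = [lambda: i for i in range(3)]
print([f() for f in fns])  # -> [2, 2, 2], not [0, 1, 2]
```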


Cheers!
-Craig

  • Craig Yoshioka at Jun 14, 2010 at 10:36 pm
    Sorry, the first example should be:

    class Status(object):
        def __init__(self,definitions):
            for key,function in definitions:
                setattr(self,key,property(function))
  • Ian Kelly at Jun 14, 2010 at 10:36 pm

    On Mon, Jun 14, 2010 at 4:06 PM, Craig Yoshioka wrote:
    def makeStatus(definitions):
        class Status(object):
            pass
        for key,function,data in definitions:
            setattr(Status,key,property(lambda x: function(data)))
        return Status()

    but all my properties now act as if they were invoked with the same data even though each one should have been a new lambda function with its own associated data. It seems Python is 'optimizing' all the lambdas to the same object even though that's clearly not what I want to do. Anyone have any suggestions as to:

    1) why
    Because the 'data' variable isn't local to the lambda function, so
    when it's called, 'data' is looked up in the outer (makeStatus) scope,
    where its value is whatever was last assigned when the loop ran. (The
    same applies to 'function', which is also rebound on each iteration.)
    2) what I should do
    Make it local to the lambda function:

    setattr(Status, key, property(lambda x, function=function, data=data: function(data)))
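A quick check of the default-argument trick, with made-up definitions (both 'function' and 'data' are frozen as defaults, since both are rebound by the loop):

```python
def makeStatus(definitions):
    class Status(object):
        pass
    for key, function, data in definitions:
        # default values are evaluated once, at lambda creation time,
        # so each property keeps its own function and data
        setattr(Status, key,
                property(lambda x, function=function, data=data: function(data)))
    return Status()

s = makeStatus([("double", lambda d: d * 2, 10),
                ("plus_one", lambda d: d + 1, 10)])
print(s.double)    # -> 20
print(s.plus_one)  # -> 11
```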

    Cheers,
    Ian
  • Thomas Jollans at Jun 14, 2010 at 10:38 pm

    On 06/15/2010 12:06 AM, Craig Yoshioka wrote:

    def makeStatus(definitions):
        class Status(object):
            pass
        for key,function,data in definitions:
            setattr(Status,key,property(lambda x: function(data)))
        return Status()

    but all my properties now act as if they were invoked with the same data even though each one should have been a new lambda function with its own associated data. It seems Python is 'optimizing' all the lambdas to the same object even though that's clearly not what I want to do. Anyone have any suggestions as to:

    1) why
    (I'm not 100% sure about this)
    I think that when Python encounters "function(data)" while executing any
    one of the lambdas, it looks in the scope of the factory function and uses
    the last value data had there - which has since changed. This old trick
    might help: (if it doesn't, my analysis was wrong)
    2) what I should do
    setattr(Status, key, property(lambda x, f=function, d=data: f(d)))

    3) a better way in which to implement this pattern
    how about this:

    class Status(object):
        def __init__(self, definitions):
            """definitions must be a { key: function, ... } mapping"""
            self.functions = definitions
            self.cache = {}

        def __getattribute__(self, name):
            # look the two dicts up via object.__getattribute__ so that
            # this method doesn't recurse into itself
            cache = object.__getattribute__(self, 'cache')
            functions = object.__getattribute__(self, 'functions')
            if name in cache:
                return cache[name]
            elif name in functions:
                cache[name] = functions[name]()
                return cache[name]
            else:
                return super(Status, self).__getattribute__(name)

    This doesn't use properties (why should it?) and proposes a different
    format for the definitions: a dict instead of a sequence of tuples.
    dict([(a,b), (c,d)]) == {a: b, c: d}, of course, so that's no problem.

    Have fun,
    Thomas
  • Peter Otten at Jun 15, 2010 at 6:06 am

    An alternative implementation of the above idea:
    >>> from functools import partial
    >>> def get_alpha(data): return 2**data
    ...
    >>> def get_beta(data): return data + 42
    ...
    >>> definitions = [("alpha", get_alpha, 10), ("beta", get_beta, 20)]
    >>> class Status(object):
    ...     definitions = dict((k, partial(v, d)) for k, v, d in definitions)
    ...     def __getattr__(self, name):
    ...         if name not in self.definitions:
    ...             raise AttributeError
    ...         print "calculating", name
    ...         value = self.definitions[name]()
    ...         setattr(self, name, value)
    ...         return value
    ...
    >>> st = Status()
    >>> st.alpha
    calculating alpha
    1024
    >>> st.alpha
    1024
    >>> st.beta
    calculating beta
    62
    >>> st.beta
    62
    >>> st.gamma
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "<stdin>", line 5, in __getattr__
    AttributeError
    >>> del st.beta
    >>> st.beta
    calculating beta
    62

    This has the advantage that there is no overhead for attributes whose value
    has already been calculated.

    Peter

Discussion Overview
group: python-list @ python.org
categories: python
posted: Jun 14, '10 at 10:06p
active: Jun 15, '10 at 6:06a
posts: 5
users: 4
