Hi,

I have a program that needs to check periodically whether a "dictionary" has changed since the last time I used it. It's part of a multi-layered "memoizing" caching scheme: the dictionary contains parameters that feed an expensive regeneration calculation, so if the parameters haven't changed, I can just return the cached value. As it happens, though, the "dictionary" is actually going to be a durus.PersistentDict.

One way to do this is to check every key in the dictionary against a "snapshot" to see if it has changed. This is obviously inefficient in both space and time, though, especially if the dictionary is largish or I have to check it often.

Is there a way I can exploit the PersistentDict object's existing change-control code to optimize this? Ideally I'd do something like:

    if mydict._p_changed:
        # check for specific changes if dict has changed
        ...

The only problem is that _p_changed is presumably reset every time the PersistentDict is saved -- that happens at database commits, right? If so, and my last check was before the last commit, then _p_changed alone won't give me the right answer.

So, could I access the system time of the last commit somehow? Then I could keep a timestamp from each access and do a test based on that:

    if time_of_commit > time_of_last_access(mydict) or mydict._p_changed:
        # check...

I've got other optimizations in mind, but this would be a big time saver, because most of the time that I do this check the object won't have changed, so I'd like to short-circuit the detailed check as early as possible. (I've put rough sketches of both the brute-force comparison and the short-circuit I'm after in a couple of postscripts below.)

Cheers,
Terry

--
Terry Hancock (hancock@AnansiSpaceworks.com)
Anansi Spaceworks http://www.AnansiSpaceworks.com
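
P.S. For concreteness, here is roughly the brute-force version I have now. Nothing Durus-specific: just a plain-dict snapshot compared key by key on every access (expensive_regeneration stands in for the real calculation, and "cache" is just a dict I keep around):

    def params_changed(live, snapshot):
        # compare every key/value in the live dict against the last snapshot
        if len(live) != len(snapshot):
            return True
        for key, value in live.items():
            if key not in snapshot or snapshot[key] != value:
                return True
        return False

    def cached_result(params, cache):
        # regenerate only when the parameters have actually changed
        if 'result' not in cache or params_changed(params, cache['snapshot']):
            cache['snapshot'] = dict(params.items())
            cache['result'] = expensive_regeneration(params)   # the expensive part
        return cache['result']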
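
P.P.S. And this is the shape of the short-circuit I'm hoping for. time_of_last_commit() is invented -- that's exactly the piece I don't know how to get out of Durus -- and last_access is just the time.time() I recorded at my previous check:

    def needs_detailed_check(mydict, last_access):
        # If nothing has been committed since I last looked, and the object
        # isn't dirty in memory, skip the per-key comparison entirely.
        # time_of_last_commit() is hypothetical; last_access came from time.time().
        if time_of_last_commit(mydict) <= last_access and not mydict._p_changed:
            return False
        return True

    # usage, roughly:
    #     if needs_detailed_check(mydict, last_access):
    #         ... fall back to the per-key snapshot comparison ...
    #     last_access = time.time()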