On Tuesday, 25 October 2005 at 12:45, David Binger wrote:
> On Oct 25, 2005, at 4:00 AM, Kai Schmidt wrote:
> > To make a longer story short: I have trouble with (my) Durus in my
> > somewhat larger programs. I see "memory allocation errors" very often,
> > so that at the moment Durus is not usable for me. It just dies, with no
> > messages to follow :=(  I'd like to know why those errors are there.
> How big are these data sets?
> Can your python make big dictionaries?
> Here is a test loop posted recently.

Thanks for the answer. My demands on the ODB are not at all high, so
that is surely not the reason ...

I ran that def() below and gave up at about 3 million entries, when my
(huge) swap partition started to fill up. Very nice to watch on the
various graphical (KDE) system monitors, by the way.

I can now reproduce those memory errors with a Durus database containing
only 100 persistent dict()s. Can I assume that there is nothing really
wrong in my sample code? I have gcc 4.0.2 -- could that be a problem?
IMHO I should not be able to produce a memory/pointer problem from pure
Python, should I?

Is there a way to run Durus with Python only? No C helper? To test ...

Ciao
Kai

> def bigdict():
>     d = {}
>     count = 1
>     while True:
>         try:
>             d['x%d' % count] = str(count)
>             count += 1
>             if count % 1000 == 0:
>                 print count
>         except MemoryError, e:
>             print count, e
>             return d
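For reference, the quoted test loop can also be run to a fixed bound instead of until MemoryError, which makes it usable as a quick sanity check. This is a minimal modernized sketch in Python 3 syntax; the `limit` parameter and its default are my additions, not part of the original loop:

```python
def bigdict(limit=10000):
    """Build a dict of string keys/values, as in the quoted test loop,
    but stop after `limit` entries instead of running until MemoryError."""
    d = {}
    count = 1
    while count <= limit:
        try:
            d['x%d' % count] = str(count)
            count += 1
        except MemoryError as e:
            # Report how far we got before allocation failed.
            print(count, e)
            break
    return d
```

With a small limit this finishes immediately; with the limit removed it behaves like the original loop and will eventually hit MemoryError (or drive the machine into swap, as described above).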