> If the database is going to be the choker for this, have you considered
> continuing as you've begun?  CGI has the nice property that it spawns
> a Quixote process per request, so no request is ever left waiting for
> its Quixote process to finish some other request.  Of course that's
> also the drawback of CGI if the overhead of starting up a process
> becomes too large a portion of the cost of servicing the request.
>
> I only ask because I know how often I've looked right past the obvious
> while pondering the arcana of a more interesting solution.

Very good point.  It's often remarked here that CGI performance is
terrible, and compared to the alternatives, it is.  However, I was
recently testing CGI capabilities on a server, and I threw together a
simple test script in Python (without using Quixote), and the startup
overhead was insignificant.  That made me realize that it wasn't CGI,
per se, that was slow, but the way my apps started.  Through lazy
imports, and by making sure your modules are compiled to pyc/ptlc, it
should be possible to get decent, if not stellar, performance even
from CGI.  Of course, it may not be feasible to make enough imports
lazy for that approach to be worthwhile, but it warrants
consideration.  (There's a small sketch of the idea below.)

Bear in mind that the usual mechanism, where modules get compiled to
pyc/ptlc and written to disk automatically on first import, doesn't
apply if the user your web server runs as lacks permission to write
to the directory those modules live in.  You may have to pre-compile
them yourself.  Easy enough with a script like those Oleg has posted
to this list before (compyle-ptl.py and compyleall-ptl.py, I think
they were called).

Jason
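
To make the lazy-import idea concrete, here is a minimal Python 2-era
CGI sketch; the module and handler names are made up for
illustration, not taken from any real app.  The point is just that
the expensive import is deferred into the one handler that needs it,
so the common path never pays for it:

    #!/usr/bin/env python
    import cgi  # cheap, and needed for every request anyway

    def show_report(form):
        # Imagine this handler drags in a large dependency tree.
        # Importing it here, rather than at module level, means
        # requests that never ask for a report never pay the cost.
        import reporting  # hypothetical heavyweight module
        return reporting.render(form)

    def show_front_page(form):
        return "<html><body>fast path, no big imports</body></html>"

    def main():
        form = cgi.FieldStorage()
        if form.getvalue("page") == "report":
            body = show_report(form)
        else:
            body = show_front_page(form)
        print "Content-Type: text/html"
        print
        print body

    if __name__ == "__main__":
        main()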
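
As for pre-compiling when the server user can't write the compiled
files itself, the standard library's compileall module handles the
plain .py half (the path below is a placeholder); .ptl files need
Quixote's own compiler, which is what Oleg's scripts take care of:

    # Run this as a user who CAN write to the application directory,
    # e.g. from a deploy script or a cron job.
    import compileall
    compileall.compile_dir("/path/to/your/app", force=1)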