On May 30, 2006, at 9:16 PM, Peter Wilkinson wrote:

> What I'd like to do is use QP and Durus, both very nice, and Durus
> in particular works much better with our data access patterns. My
> question is really whether I should just run an external Durus
> process to deal with sharing of the data across the different QP
> sites, or if there is another way to organise things that would work
> better.

Is the database very large? Do all of your sites have a similar access
pattern?

The critical performance factor, at least for sites with access
patterns like ours, is having the objects you need already loaded into
the Durus client connection cache, so your architecture should keep
that in mind. Ideally, every client has a cache big enough to hold all
of the objects it might ever use. Obviously, we don't always have that
option, but adding RAM may be the cheapest way to improve performance.

If all of your sites use the same data with a similar access pattern,
then you might want to use just one QP site and have your Publisher
sort one request from another, rather like (I assume) your Medusa
application does now.

Running a single, external Durus process with multiple QP sites, as you
suggest, should also work. You could have a local QP site that runs the
shared Durus database server and perhaps offers an administrator view
of the database, something like the 'browse' feature in the proto demo.
You would need to do some customization in your sites' Publisher
constructors, though, to make sure that they all get connections to the
right Durus server (a rough sketch of what I mean is below).

Also, note that it is possible to run an application with connections
to two or more independent Durus databases.
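A few rough sketches, all untested. For the cache, the knob is the
cache_size argument to the Connection constructor; the number below is
just a placeholder, so size it to your working set and your RAM:

    from durus.client_storage import ClientStorage
    from durus.connection import Connection

    # Connect to a running Durus storage server.  cache_size bounds
    # the number of loaded object instances this client keeps in
    # memory; a cache big enough for your whole working set avoids
    # round-trips to the server.
    storage = ClientStorage(host='localhost', port=2972)
    connection = Connection(storage, cache_size=500000)
    root = connection.get_root()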
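By customizing the Publisher constructors, I mean something roughly
like the following. I'm making up the SitePublisher name and the
base-class import here, so adjust both to match how your sites are
actually laid out:

    from durus.client_storage import ClientStorage
    from durus.connection import Connection
    from qp.pub.publish import Publisher  # adjust to your QP version

    class SitePublisher(Publisher):
        # Every site points at the one shared Durus server instead of
        # each opening its own private database.
        durus_host = 'localhost'
        durus_port = 2972  # the port the shared server listens on

        def __init__(self, *args, **kwargs):
            Publisher.__init__(self, *args, **kwargs)
            self.connection = Connection(
                ClientStorage(host=self.durus_host,
                              port=self.durus_port))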
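For the last point, using two independent databases is just a matter of
holding two connections. Each connection has its own cache and its own
commit/abort cycle, and a persistent object in one database cannot
reference an object in the other:

    from durus.client_storage import ClientStorage
    from durus.connection import Connection

    users = Connection(ClientStorage(host='localhost', port=2972))
    logs = Connection(ClientStorage(host='localhost', port=2973))

    users.get_root()['peter'] = 'example'  # write to one database...
    users.commit()                         # ...and commit it alone;
    logs.abort()                           # the other is untouched.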