I am very pleased to have found a hosting provider that is directly Python-friendly and indirectly qp-friendly on their non-dedicated / shared-server offering, i.e. I can compile my own stuff and run multiple long-running processes without getting stung in the pocket - which is what qp needs.

The service is set up so that qp sits behind a reverse proxy (probably Apache), hence my question/dilemma. Ignoring HTTPS for the time being: from behind a reverse proxy, is it better to serve standard HTTP requests via the built-in qp HTTP server, or via an SCGI setup with Apache or lighttpd (preferred) as an intermediary between qp and the reverse proxy?

The site will have high traffic - high traffic in the sense of a large number of clients and requests. It will not be serving high-bandwidth content, but rather answering smallish requests from lots of clients, so I need to make sure it is architected robustly for that profile of use.

So the options are:

1. HTTP: reverse proxy --> qp + built-in web server
   pros - simple to set up and run
   cons - this will be a high-traffic site, so I'm not sure the reverse proxy offers enough of a buffer on its own. I recall reading on the mailing list that using the built-in qp HTTP server for high-traffic sites was not advisable; I'm not sure whether that still applies behind a reverse proxy.

2. SCGI: reverse proxy --> lighttpd + mod_scgi --> qp SCGI
   pros - qp and SCGI work very well together, and lighttpd has proved itself more than capable of handling high volumes of requests, so I have no doubt it will protect qp.
   cons - an extra process in the chain (lighttpd), so I expect it must incur some additional overhead compared to option 1.

Tristan
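P.S. For concreteness, the lighttpd side of option 2 would look roughly like the following. This is only a sketch: the port number and URL prefix are assumptions on my part, and it presumes the qp SCGI process is already listening on localhost.

```
# Load lighttpd's SCGI module
server.modules += ( "mod_scgi" )

# Forward all requests to the qp SCGI process.
# 127.0.0.1:4000 is an assumed address/port - match it to
# whatever the qp SCGI server is actually configured to use.
scgi.server = ( "/" =>
  (( "host" => "127.0.0.1",
     "port" => 4000,
     "check-local" => "disable" ))
)
```

The reverse proxy in front would then just forward plain HTTP to lighttpd's listen port, so qp itself never faces the outside world directly.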