There are many things to say about performance. For now, I'll go into the performance of this site on the web, i.e. through the CGI interface.

Currently, the web site returns every page through CGI. A 2 MB standalone executable containing [Tcl/Tk] and MetaKit [http://www.equi4.com/metakit/] (called [TclKit]) gets launched for every page you look at. That executable then starts doing things like sourcing [Don Libes]' CGI package, among lots of other things. The HTML you see is generated on-the-fly for ''each'' request from the plain text (with its wiki-specific markup); a sketch of this per-request flow appears at the bottom of this page. Considering what is going on, it's in fact amazing how fast the site is. But this won't scale well as the site grows.

The plan is to generate static pages, and to revert to this CGI-based approach only for editing, searching, change histories, etc. (a sketch of this caching idea also appears at the bottom of this page). It's not hard IMO, and it can be done without altering any URLs; it just hasn't been done yet... ''is it time (already?) to implement this "over-drive"?''

''Unfortunately, all this overhead also exposes a limitation of the current code. This version uses file locking to prevent two concurrent accesses from clobbering and damaging the database. But while trying to figure out how to do that in the most general way, I forgot to add a mechanism which is important for CGI access: the system should retry and wait when it finds a lock, instead of just giving up (see the retry sketch at the bottom of this page). As a result, the current system will occasionally return an HTTP error. This is harmless: you can simply re-fetch the page (or re-save your edits). I'll add retries in the next revision of WiKit, of course...'' -- JC
----
Actually, it is only 'harmless' in some cases; other times, the browser discards the data from the form, and the user has to start over.
----
'''Nov 28, 2001''' - after the umpteenth day of seeing huge delays and failures in accessing the Tcl'ers Wiki, I finally found some time to dive in.
* The most embarrassing observation is that at some point, static page caching was turned off again. Turning it back on should bring a very substantial responsiveness increase as the cache fills up over the next few days (the gain is that Apache serves pages directly, instead of launching CGI for each request).
* Another fact which became clear over the past few weeks is that more and more machines or people are "web-sucking" this area, pulling in all pages one by one. The "/robots.txt" file has been adjusted (an example appears at the bottom of this page), but it seems that not all of that activity has stopped.
* But the real clincher is following the "Expand" link at the bottom of the page. That leads to a huge page with all referenced pages tacked on at the end. On "Recent Changes", that really is a big thing. The HTML generation code is not terribly efficient (yet; there's new stuff coming), so such a request leads to multi-minute CPU runs, with all accesses after it failing for some time. For that reason, all "Expand" links have been removed for now.
* There appear to be cases left where a single request takes a minute or more. I'm trying to track down what that could be.
* And lastly, this is using an older-format Tclkit/MetaKit as engine, both of which have improved quite a bit over time. A quick trial some months ago backfired (a damaged database, possibly due to multi-user interference, because locking has changed) and was reversed. But once everything proves stable, I'll revisit this issue and look for a way to reliably migrate to the latest code.
----
[Category Wikit] - [Category Tcler's Wiki]
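----
To make the per-request flow described above concrete, here is a minimal sketch of what one CGI invocation has to do: Tclkit starts up, opens the MetaKit datafile, fetches the raw wiki text for the requested page, and converts it to HTML on the fly. The view layout (a "pages" view with "name" and "page" properties), the datafile name, and the trivial markup converter are illustrative assumptions, not Wikit's actual code.

 package require Mk4tcl

 # Trivial stand-in for the real wiki-markup-to-HTML converter.
 proc wiki-to-html {text} {
     return "<pre>[string map {& &amp; < &lt; > &gt;} $text]</pre>"
 }

 # Open the MetaKit datafile read-only; the file name is made up.
 mk::file open db wikit.tkd -readonly

 # Take the page title from the query string, as a plain CGI script
 # might; real Wikit parses its request differently.
 set title $env(QUERY_STRING)

 # Find the row in the "pages" view whose "name" property matches.
 set rows [mk::select db.pages name $title]
 if {[llength $rows] == 0} {
     puts "Content-type: text/html\n"
     puts "<p>No such page.</p>"
     exit
 }
 set text [mk::get db.pages![lindex $rows 0] page]

 # Generate the HTML on the fly, for this one request only.
 puts "Content-type: text/html\n"
 puts [wiki-to-html $text]

All of this happens in a freshly launched process for every single page view, which is why caching the generated HTML pays off so much.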
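----
The "over-drive" plan and the page-caching point above come down to the same write-through idea: the first CGI request renders a page and saves the HTML where Apache can serve it as a plain file, and an edit deletes the cached copy so the next request re-renders it. A minimal sketch, with a hypothetical cache directory:

 # Hypothetical directory that Apache serves static files from.
 set cachedir /var/www/wiki

 # After rendering a page through CGI, save the HTML so that the
 # next request is served by Apache directly, without launching CGI.
 proc cache-page {name html} {
     global cachedir
     set fd [open [file join $cachedir $name.html] w]
     puts -nonewline $fd $html
     close $fd
 }

 # Editing a page must invalidate its cached copy (and that of pages
 # whose rendering depends on it, such as "Recent Changes").
 proc invalidate-page {name} {
     global cachedir
     file delete [file join $cachedir $name.html]
 }

Since edits are rare compared to page views, almost all traffic ends up as static file serving, which is exactly the responsiveness gain described above. And because the cached files sit at the same paths, no URLs need to change.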
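----
The retry-and-wait behaviour the locking note asks for can be as simple as a bounded loop around an atomic lock-file creation. A sketch, assuming a lock file next to the database (the file name and the timings are made up):

 # Try to create the lock file atomically; CREAT+EXCL fails if it
 # already exists, i.e. if another request currently holds the lock.
 proc acquire-lock {lockfile {retries 20} {delayms 250}} {
     for {set i 0} {$i < $retries} {incr i} {
         if {![catch {open $lockfile {WRONLY CREAT EXCL}} fd]} {
             close $fd
             return 1
         }
         after $delayms   ;# wait a bit, then try again
     }
     return 0
 }

 proc release-lock {lockfile} {
     file delete $lockfile
 }

 if {[acquire-lock wikit.lock]} {
     # ... read or update the database here ...
     release-lock wikit.lock
 } else {
     # Only after all retries fail do we give up, and even then a
     # friendly "busy" page beats a bare HTTP error that makes the
     # browser discard the user's form data.
     puts "Content-type: text/html\n"
     puts "<p>The database is busy; please retry in a moment.</p>"
 }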
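----
For the "web-sucking" item: a robots.txt along these lines keeps well-behaved crawlers away from the CGI entry point altogether (the path is illustrative; the actual adjustment made to this site's file may differ):

 User-agent: *
 Disallow: /cgi-bin/

Badly behaved robots simply ignore robots.txt, which would explain why not all of that activity has stopped.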