Attending the Tcl 2002 conference, I saw a number of presentations that mentioned in passing a problem I am dealing with in my own application: automatic (or semiautomatic) generation of Tk GUIs based on metadata descriptions. Applications of this take several forms:

   * GUI builders need to interpret user gestures into code that builds an interface.
   * Some XML applications interpret XML to produce user interfaces for editing values.

I can see how a front-end/back-end system that reads and interprets file contents or user gestures, then calls constructors to build the interface, might make sense. Here are my questions:

   * Is there such a library now?
   * If there were such a library, how would you expect it to be used?
   * If there were such a library, how would you expect it to be organized?

Conceivably, the generated interface might be on one system connected to a remote system.

Thoughts? Pointers to wiki pages?

David S. Cargo (dcargo@marix.com)
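As a rough sketch of what such a metadata-driven front end might look like, here is a small generator that turns a list of field descriptions into Tk widget-creation commands. It is written in Python only so the idea is easy to follow; the metadata schema (`name`, `label`, `type`) and the `generate_tk_form` helper are invented for illustration and do not correspond to any existing library.

```python
def generate_tk_form(fields, parent=".f"):
    """Emit Tcl/Tk commands for a labeled form described by metadata.

    Each field is a dict with an invented schema: "name" (required),
    "label" (optional display text), and "type" (optional; "boolean"
    maps to a checkbutton, anything else to an entry).
    """
    lines = [f"frame {parent}"]
    for row, field in enumerate(fields):
        name = field["name"]
        label = field.get("label", name)
        # One label widget per field, gridded next to its value widget.
        lines.append(f'label {parent}.lbl_{name} -text "{label}"')
        if field.get("type") == "boolean":
            lines.append(f"checkbutton {parent}.val_{name} -variable {name}")
        else:
            lines.append(f"entry {parent}.val_{name} -textvariable {name}")
        lines.append(
            f"grid {parent}.lbl_{name} {parent}.val_{name} -row {row} -sticky w"
        )
    return "\n".join(lines)

# Example metadata description; a real system might read this from XML.
metadata = [
    {"name": "host", "label": "Host name"},
    {"name": "debug", "label": "Enable debugging", "type": "boolean"},
]
script = generate_tk_form(metadata)
print(script)
```

The same split suggests one possible organization for such a library: a back end that parses the metadata (or interprets user gestures) into a neutral field list, and a front end that walks that list calling widget constructors, locally or on a remote interpreter.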