Version 1 of Data Driven Tk GUI Construction

Updated 2002-09-23 19:00:01

Attending the Tcl 2002 conference, I saw a number of presentations that mentioned in passing a problem that I am dealing with in my application:

Automatic (or semiautomatic) generation of Tk GUIs based on metadata descriptions.

The applications of this take several forms.

  • GUI builders need to interpret user gestures into code to build an interface
  • Some XML applications interpret XML to produce user interfaces for editing values

I can see where a front-end/back-end system — one that reads and interprets file contents or user gestures, then calls constructors to build the interface — might make some sense.
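To make the idea concrete, here is a minimal sketch of what the back-end half might look like: a Tcl proc that walks a metadata description and creates the widgets it names. The list format (widget command, window path, options) is invented for illustration, not an existing library's format.

```tcl
# Hypothetical metadata format: {widgetCommand windowPath ?option value ...?}
set ui {
    {label  .l -text "Name:"}
    {entry  .e -textvariable name}
    {button .b -text "OK" -command {puts $name}}
}

proc buildUI {spec} {
    foreach item $spec {
        set cmd  [lindex $item 0]
        set path [lindex $item 1]
        set opts [lrange $item 2 end]
        # Create the widget from its description ...
        $cmd $path {*}$opts
        # ... and apply a simple default geometry.
        grid $path -sticky ew
    }
}

buildUI $ui
```

A front end (an XML parser, or a GUI builder recording gestures) would only need to emit lists of this shape; the builder proc stays the same.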

Here are the questions:

  • Is there such a library now?
  • If there were such a library, how would you expect it to be used?
  • If there were such a library, how would you expect it to be organized?

Conceivably, the generated interface might run on one system while the application it controls runs on a remote system.

Thoughts? Pointers to wiki pages?

David S. Cargo ([email protected])


Traditionally, this sort of thing has been done via the option database, I believe; I think Brent Welch's book describes how to do it. XML seems like a nicer solution to me, though. But I don't think I'd tie "user gestures" in too strongly. On the other hand, it's easy to see how a GUI builder could store the program's structure as a DOM tree, which could then be loaded and saved as an XML document. Then, if the elements were ordered appropriately, the app could avoid the cost of building the DOM at load time by parsing the XML document as a sequence of streaming events (a la TclXML) and constructing the interface as it goes. -- WHD
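For readers who haven't seen the option-database approach: resources added with [option add] are picked up automatically by widgets created afterward, so appearance and even text can be driven by data rather than hard-coded per widget. A small sketch (the resource names here are examples, not a convention):

```tcl
# Resource patterns keyed by widget class ...
option add *Entry.background white
option add *Button.relief    raised
# ... or by widget name.
option add *ok.text "OK"

# Widgets created after the options are added consult the
# option database for any option not given explicitly:
entry  .name   ;# background comes from *Entry.background
button .ok     ;# text "OK" and raised relief come from the database
```

An application could read such resource settings from a file (e.g. via [option readfile]), which is a simple, pre-XML form of metadata-driven configuration — though it only covers widget options, not structure or layout, which is where the XML/DOM approach above goes further.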