A Tcl extension for neural network processing using the FANN library can be found at tcl-fann.
---
(RS would call this "A little feed-forward back-propagation learning neural network".)
There came a time when I needed to run a series of simulations of a feed-forward back-propagation (FFBP) neural network. Rather than spend just an hour doing the homework, I instead spent over ten hours writing a full-fledged network builder and simulator. As you can see from the screenshot above, the result is a little tool that lets the user graphically draw a network and then run simulations on it.
Features include:
The bipolar function is g(x) = 2f(x) - 1, where f(x) is the sigmoid function. Conveniently, since f'(x) = f(x)(1 - f(x)) and f(x) = (1 + g(x))/2, the derivative simplifies to g'(x) = 0.5(1 + g(x))(1 - g(x)), so it can be computed directly from a node's output. A more general-purpose steepest-descent algorithm may be found at Differentiation and steepest-descent, though my FFBP does not use that code.
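In Tcl the activation and its derivative come out to one-liners. This is an illustrative sketch only; the proc names are mine, not taken from ffbp.tcl:

   # f(x): the standard logistic sigmoid
   proc sigmoid {x} { expr {1.0/(1.0 + exp(-$x))} }

   # g(x) = 2f(x) - 1: bipolar activation, ranging over (-1, 1)
   proc bipolar {x} { expr {2.0*[sigmoid $x] - 1.0} }

   # g'(x) = 0.5*(1 + g)*(1 - g), taking the already-computed output g
   proc bipolarDeriv {g} { expr {0.5*(1.0 + $g)*(1.0 - $g)} }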
Usage
Shift-click the canvas to place a node. Shift-drag between nodes to add a weight link. Reposition nodes by dragging them around. Double-click a node or weight to change its properties.
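For readers unfamiliar with Tk, a binding scheme like that is typically wired up along these lines; the snippet below is illustrative only and is not taken from ffbp.tcl:

   # A minimal Tk sketch of such canvas bindings; illustrative only.
   package require Tk
   canvas .c -width 400 -height 300 -background white
   pack .c

   # Shift-click the canvas to place a node.
   bind .c <Shift-Button-1> {
       .c create oval [expr {%x - 10}] [expr {%y - 10}] \
           [expr {%x + 10}] [expr {%y + 10}] -fill white -tags node
   }

   # Drag a node to reposition it.
   .c bind node <B1-Motion> {
       .c coords current [expr {%x - 10}] [expr {%y - 10}] \
           [expr {%x + 10}] [expr {%y + 10}]
   }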
The real power of the program is its ability to run a number of trials and to learn after each test datum. Doing this requires modifying the code a bit; see the last three functions in ffbp.tcl. The current functions demonstrate how to learn the XOR function; the corresponding network is saved in ffbp.net. A self-contained sketch of the same idea, written from scratch rather than taken from ffbp.tcl, follows.
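This trains a 2-2-1 network on the bipolar encoding of XOR with plain gradient descent; every name in it is hypothetical, since it does not use ffbp.tcl's API:

   # Bipolar activation and its derivative, as above.
   proc bipolar {x} { expr {2.0/(1.0 + exp(-$x)) - 1.0} }
   proc bipolarDeriv {g} { expr {0.5*(1.0 + $g)*(1.0 - $g)} }

   # Random initial weights; w(layer,from,to), index 0 is the bias input.
   for {set j 1} {$j <= 2} {incr j} {
       for {set i 0} {$i <= 2} {incr i} { set w(1,$i,$j) [expr {rand() - 0.5}] }
   }
   for {set i 0} {$i <= 2} {incr i} { set w(2,$i,1) [expr {rand() - 0.5}] }

   set eta 0.5
   # XOR in bipolar encoding: {x1 x2 target}, all values in {-1, 1}
   set patterns {{-1 -1 -1} {-1 1 1} {1 -1 1} {1 1 -1}}

   for {set epoch 0} {$epoch < 5000} {incr epoch} {
       foreach p $patterns {
           lassign $p x(1) x(2) t
           set x(0) 1.0; set h(0) 1.0
           # Forward pass: inputs -> hidden -> output.
           for {set j 1} {$j <= 2} {incr j} {
               set net 0.0
               for {set i 0} {$i <= 2} {incr i} {
                   set net [expr {$net + $w(1,$i,$j)*$x($i)}]
               }
               set h($j) [bipolar $net]
           }
           set net 0.0
           for {set i 0} {$i <= 2} {incr i} {
               set net [expr {$net + $w(2,$i,1)*$h($i)}]
           }
           set y [bipolar $net]
           # Backward pass: output delta, then hidden deltas.
           set dy [expr {($t - $y)*[bipolarDeriv $y]}]
           for {set j 1} {$j <= 2} {incr j} {
               set dh($j) [expr {$dy*$w(2,$j,1)*[bipolarDeriv $h($j)]}]
           }
           # Weight updates (steepest descent).
           for {set i 0} {$i <= 2} {incr i} {
               set w(2,$i,1) [expr {$w(2,$i,1) + $eta*$dy*$h($i)}]
           }
           for {set j 1} {$j <= 2} {incr j} {
               for {set i 0} {$i <= 2} {incr i} {
                   set w(1,$i,$j) [expr {$w(1,$i,$j) + $eta*$dh($j)*$x($i)}]
               }
           }
       }
   }

After the loop, running the forward pass over the four patterns should give outputs close to the -1/+1 targets.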
While you are at it, take a look at another kind of neural network, Hopfield Networks.
To do list:
Downloads
Version 0.2 - http://tcl.jtang.org/ffbp/ffbp-0.2.tar.gz
Version 0.1 - http://tcl.jtang.org/ffbp/ffbp-0.1.tar.gz
See also ANN