Theo Verelst: Process algebra refers to mathematical formulations and reasoning rules for defining processes, made up of events; here we focus on communication events which occur between processes. A process can be represented by an identifier, and is seen as something which can somehow decide to engage in a communication with another process, or perhaps with itself; we take it that we can have n processes, of which two communicate at any one instant in time.

Because we do not explicitly discern what goes on inside a process (sometimes the word 'agent' is preferable, though a more than loose coupling with the Unix idea of a process is not a bad frame of thought), the only thing we can be sure of is when a communication takes place, which can be given a notation such as com(Pa,Pb) or com(Pa,Pb,Mi), where P is a process and M is a message, the communication content. The former is the more general formulation, which leaves room for more reasoning.

We can assume a process changes state after a communication of a certain kind with a certain other process has taken place, and we could make a list of all possible sequences of communications a set of processes or agents can take part in, possibly in an inductive way. For instance:

 com(Pa,Pb,M1)

or:

 com(Pa,Pb,M1)
 com(Pa,Pc,M2)
 com(Pb,Pc,M2)
 com(Pb,Pc,M3)
 com(Pb,Pa,M3)

We could take it that the first two arguments of the com operator are order insensitive.

A general way of speaking about processes somehow relevant to a problem is 'composition', which means they are enabled to communicate. We could also serialize them, and get a certain sequence:

 Pa | Pb     composition
 Pa ; Pb     sequence

A limit (restriction) operator can be used to 'rule out' a certain communication from the set of possible communications between certain composed processes:

 (Pa | Pc) \{M3}

would indicate the parallel composition of two processes, restricting the allowed communications.
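The notions above can be made concrete in a few lines of Tcl. This is only a toy sketch with ad-hoc helper names (com, restrict are not from any library): a trace is a Tcl list of communications, each communication a list {Pa Pb M}, and restrict implements the \{M} operator by filtering out communications carrying a forbidden message.

```tcl
proc com {pa pb {msg ""}} {
    # The first two arguments are order insensitive:
    # normalise by sorting the two process names.
    return [list {*}[lsort [list $pa $pb]] $msg]
}

proc restrict {trace forbidden} {
    # Drop every communication whose message is in the forbidden set.
    set result {}
    foreach c $trace {
        if {[lindex $c 2] ni $forbidden} {
            lappend result $c
        }
    }
    return $result
}

set trace [list [com Pa Pb M1] [com Pa Pc M2] [com Pb Pa M3]]
puts [restrict $trace {M3}]
```

Note that com Pb Pa M3 and com Pa Pb M3 produce the same normalised communication, so order insensitivity holds by construction.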
''State progression''

We could take it that each communication generates a new state, where a state can be a generic term, possibly referring to communicating agents which are not linked one to one with our usual idea of a process.

''Traces''

''(Time) Ordering''

'''Strong and weak Bisimulation'''

''Design considerations''

''Links with real world programming''

''Important Limitations''

The law of the ever prevailing permutation or shuffle operator, or 'how the factorial grows fastest'...

In every system with events and possible reordering of events, or of anything else such as program statements, the very idea of reordering, and of all possible reorderings of the events in such a system, leads to a huge number of possibilities even for a small number of events. Take 10 events, and there are already about 3.6 million different orderings. Put a few such experiments in a row, and that number is raised to the power of the number of times you repeat the reordering experiment; it outgrows the number of atoms in the universe as possible memory locations sooner than one might project. Hence the NP-completeness considerations in some areas of computer science. Any simulator which 'honestly' does extensive searching of the state space of communicating processes just a touch more complicated than absolutely simplistic already needs powerful computers to run, and the only good way out is to use good mathematical formulations and derivation theorems, and smart problem definitions.

''The relevance of it all''

Anyone not seriously aware of machine-code-level debugging on a multitasking platform must learn either that or the type of theory mentioned here... Everything in computers where such concepts are used, such as communicating processes on Unix or Windows, machines communicating over some network, and telephone exchanges and their equivalents, somehow needs these types of issues to be dealt with.
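The growth claim is easy to check: the number of reorderings of n distinct events is n!, and Tcl's arbitrary-precision integers let us compute it directly.

```tcl
# Number of reorderings of n distinct events is n factorial.
proc fact {n} {
    set f 1
    for {set i 2} {$i <= $n} {incr i} {
        set f [expr {$f * $i}]
    }
    return $f
}

puts [fact 10]   ;# 3628800, i.e. about 3.6 million
puts [fact 20]   ;# already 19 digits
```

Repeating the reordering experiment k times independently gives (n!)^k possibilities, which is where the astronomical numbers come from.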
Either by forcing everything to be serialized and under control that way; or by making the communication issues basically simple, rigidly enforcing error correction when synchronisation is lost, and only detecting general compliance or non-compliance with the system's sequencing logic; or by intelligently using protocols and extensive testing of the types of communication, and hoping all hell will not break loose; or by defining everything mathematically, and 'proving' one's way out; or by some combination of these, including (perhaps Monte Carlo) statistical simulation. All of which require one to understand the concepts involved to have even some success.

OOers lack this kind of knowledge and understanding almost by default, and unfortunately seem to want to acquire a world image where the great root object makes everything come right in the end or something, while in fact there is no easy way out of parallel programming concepts and problems by simply succumbing to the concept of nested messages. At all.

''Tcl examples''

''References''

   * Communicating Sequential Processes, by C. A. R. Hoare
   * CCS, by R. Milner

Both are a few decades old. I've seen at least one more recent book with similar theory, which was good to read; I'll see if I can find some more full-blown references.

Related: SDL (Specification and Description Language)

----
[AM] I have prepared a paper on a related subject in which I describe how to use Tcl for describing a process or set of processes. The main example still requires some work and then it ought to be ready ... (Now to find the opportunity to make the stupid thing work as I want it to.)
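As a seed for the ''Tcl examples'' section above, here is a minimal sketch (the helper name interleave is ad hoc, not from any library) showing how the parallel composition Pa | Pb of two processes with fixed serialized traces amounts to the shuffle of those traces, which is exactly where the factorial blow-up comes from.

```tcl
# All interleavings (shuffles) of two serialized event traces,
# preserving the internal order of each trace.
proc interleave {xs ys} {
    if {![llength $xs]} { return [list $ys] }
    if {![llength $ys]} { return [list $xs] }
    set result {}
    foreach t [interleave [lrange $xs 1 end] $ys] {
        lappend result [linsert $t 0 [lindex $xs 0]]
    }
    foreach t [interleave $xs [lrange $ys 1 end]] {
        lappend result [linsert $t 0 [lindex $ys 0]]
    }
    return $result
}

# Pa performs a1 then a2; Pb performs b1: three possible global orders.
foreach t [interleave {a1 a2} {b1}] { puts $t }
```

The number of interleavings of traces of lengths m and n is the binomial coefficient (m+n choose m), so even modest traces composed in parallel produce a state space that quickly becomes infeasible to search exhaustively.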