I'm sure it is useful, I'm not sure it is popular, and I'm also sure a lot of things would be less miserable with some more knowledge about the subject.

by [Theo Verelst]

Of course feel free to add (preferably identified).

My official, and finished, education in the area is that I have a Masters degree in Electrical Engineering, and before I left my position, for reasons of a completely different nature than content, I was maybe between a few months and a year from a PhD in the area of advanced computer design, at the same network theory section where I graduated. Not that I think mightily high of all fellow workers from that time, but at least I'm qualified. Not that I normally care that much, but when it concerns knowledge about computers, and especially when certain circuits and people in it are involved, I take it to be essential that I make such clear.

Computers existed in some form long ago: counting devices such as the abacus go back to ancient China, mechanical adding machines appeared in seventeenth-century Europe, and certainly the telephone exchanges of a small century ago were already computing some complicated things. In the Second World War, analog computers were used to compute bomb trajectories, to account for the nonlinear (quadratic, roughly) influence of air resistance and wind on the bombs. A bit later, tube based computers were tried, applying binary principles; Boolean algebra itself is a lot older than that, George Boole published it in the mid nineteenth century. Such machines would produce amazing amounts of heat and use a lot of power, and of course every hour or day one of the tubes would blow, and it would have to be fixed.

Things started to accelerate after the transistor became cheaply available, and especially when the first and further digital chips appeared and even became cheaply available. I know this from experience since about 1977 or so, when as a still beginning teenager I bought those for hobbying together (working) circuits. That was a little while before the advancing technology brought forth the Z80, and the PET, the TRS-80, the Apple and other computers started to become widely sold, in the time when the Intel 8080 processor was well known (I had a book on it, but hardly dreamt of owning one...). In about 1979 (from memory) serious versions of the TRS-80 and other microcomputer systems became consumer goods as well, that is, they were widely used outside business and at much lower prices.

Before that, many machines of great innovative value were made in medium and major business and science settings, such as supercomputers, all kinds of (IBM) mainframes, and the interesting PDP-9 and PDP-11 and others. It was on such systems, and also on some smaller (for instance CP/M based) ones, that most of the principles were tried out which 20 or 30 years later are still fashionable, such as disc based operating systems, multitasking, multi-user operation, memory management such as paging and virtual memory, shared libraries, and also caching, pipelining, and parallel systems (for instance in early and later supercomputers). Remember that our modern and desirable (I mean that, too) Linux comes in many senses directly from easily 20 year old and older quality source files, including the X Window System based window handling, which is from before Apple and MS Windows, and the Atari and such.

----

A long introduction to make the main point that computers and their software have undergone a quite logical historical growth, and should not be taken as the product of some hidden and obscure process of some people having power over bits.
''Processor''

The heart of a computer. It consists of all kinds of registers (small internal memory locations), counters (such as the program counter and the stack pointer), an Arithmetic and Logic Unit (ALU), which can perform all kinds of computations such as adding and subtracting, and all kinds of buses and connection adapters to the outside world, primarily to the main memory and some peripherals. (A toy fetch-and-execute sketch appears at the bottom of this page.)

''Memory''

The main memory of a single (non-parallel) computer system of the ordinary, normally to be assumed von Neumann type of architecture is a big list of storage locations, organised as a row of bytes, double words, or even 4 or 8 bytes in parallel. Mostly the actual memory chips are simply linearly addressed, possibly in blocks of memory. Memory can be read from (non-destructively) and written to by applying digital 'read' or 'write' signals, together with a physical (binary) address to access a certain memory location and a data word, which is a number of bits, usually a small integer times 8 bits, that can be written into the memory or read from it. (A minimal sketch of this in C also appears at the bottom of this page.) No matter what kind of object oriented, net enabled, new hype bladibla language one uses, every ordinary PC and workstation-like computer has these concepts at the very heart of its operation.

''Instruction fetch and execution''

''Peripherals''

''Machine Code''

''High Level Programming Languages''

''Processes''

''Threads''

''Library''

''Objects''

So a thread can be a method of some object associated with a certain process, possibly stored persistently on a hard disc.

''Process and thread switching''

''Networking basics''

''Efficiency essentials''

''Compiling or interpreting''

''Concurrency versus parallelism''

''Accessing disc and network stations''
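To make the ''Memory'' description above concrete, here is a minimal sketch in C (the page itself contains no code, so this is only an illustration; the names mem_read and mem_write and the 64 KB size are invented for the example). It models main memory as one flat, linearly addressed array of bytes, with a read operation that is non-destructive and a write operation that takes an address plus a data word, just as described above.

 #include <stdint.h>
 #include <stdio.h>

 #define MEM_SIZE 65536           /* 64 KB of byte-addressable storage */

 static uint8_t memory[MEM_SIZE]; /* the big linear row of storage locations */

 /* Read one byte; reading is non-destructive, memory keeps its contents. */
 uint8_t mem_read(uint16_t address)
 {
     return memory[address];
 }

 /* Write one byte: a 'write' signal amounts to an address plus a data word. */
 void mem_write(uint16_t address, uint8_t data)
 {
     memory[address] = data;
 }

 int main(void)
 {
     mem_write(0x1000, 42);            /* store the value 42 at address 0x1000 */
     printf("%d\n", mem_read(0x1000)); /* prints 42; reading again would, too */
     return 0;
 }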
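And in the same hedged spirit, the ''Processor'' and ''Instruction fetch and execution'' entries can be illustrated with a toy fetch-decode-execute loop, again in C. This is an invented miniature von Neumann machine, not any real instruction set: the opcode names, the single accumulator register, and the little program in memory are all made up for the sketch. The program counter fetches an instruction byte from the memory array, the switch decodes it, and the execution step updates a register or a memory location.

 #include <stdint.h>
 #include <stdio.h>

 /* A toy machine: one accumulator register, a program counter, and three
  * invented opcodes, enough to show the fetch-decode-execute cycle. */
 enum { OP_HALT = 0, OP_LOAD_IMM = 1, OP_ADD_IMM = 2, OP_STORE = 3 };

 int main(void)
 {
     uint8_t mem[256] = {           /* program and data share one memory */
         OP_LOAD_IMM, 40,           /* acc = 40          */
         OP_ADD_IMM,   2,           /* acc = acc + 2     */
         OP_STORE,    0x80,         /* mem[0x80] = acc   */
         OP_HALT
     };
     uint8_t pc  = 0;               /* program counter: address of next instruction */
     uint8_t acc = 0;               /* accumulator: one small internal register */

     for (;;) {
         uint8_t opcode = mem[pc++];    /* fetch: read instruction byte, advance pc */
         switch (opcode) {              /* decode, then execute */
         case OP_LOAD_IMM: acc = mem[pc++];      break;
         case OP_ADD_IMM:  acc += mem[pc++];     break; /* the ALU at work */
         case OP_STORE:    mem[mem[pc++]] = acc; break;
         case OP_HALT:     printf("mem[0x80] = %d\n", mem[0x80]); return 0;
         }
     }
 }

Running it prints mem[0x80] = 42. A real processor does the same fetch, decode, execute round in hardware, with many more registers, instructions, and bus signals, but the cycle itself is exactly this loop.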