To incite some discussion and controversy.

'''Basic question''': With machines getting faster and memory cheaper, would it make sense to write compilers which have only bits and pieces written in C, Ada, ..., and whose main components (like the complex manipulation of whatever data structures are needed to represent and optimize code) are actually written in a scripting language, like [Tcl], [Python], ...?

'''And secondary''': if we have such a compiler for, for example, the [C language], does it make sense to package it up as an extension for the scripting language the majority of it is written in, for example Tcl?

Have fun thinking about this -- [AK] :)

----

Note that this is not so much about taking an existing compiler and making it scriptable, thus allowing others to change its behaviour, but more about making use of the high-level data structures available in scripting languages to make the implementation of algorithms for data-flow analysis and the like more ... understandable and/or maintainable.

Also note that this is not about configuration files, but about implementing parsers, etc. for system languages like C. In a wider sense such parsers are also useful in tools like [Source Navigator] which extract cross-reference information out of sources.

----

Sub-projects to think of ...

* [Scripted Lexing] ----------- Examples: [Lexing C], [Lexing SQL]
* [Scripted Parsing] ---------- Examples: [Parsing C], [Parsing SQL]
* [Scripted Code Generation] -- Examples: none yet.

----

Related pages in this Wiki:

* The [GE ICE Tcl compiler].
* The [Python] Specializing Compiler, [Psyco].
* [CriTcl].

-----

'''Other references'''

* [jcw] has his notes about a similar topic at http://www.equi4.com/moam/compilers.html
* Another way we might want to take this appears here [http://lambda.weblogs.com/discuss/msgReader$4072].
* [DKF] notes that the language [SML] uses an internal compiler, though its source format is not C but SML itself. OTOH, it does mean that building the binary is interesting, especially on supported binary architectures ...

----

'''Comments, notes, and discussion'''

DOS C compilers used to offer in-line assembly language as a matter of course. Why shouldn't a scripting language offer in-line C as a matter of course? - [WHD]

We already have inline C with [CriTcl], but that calls an external compiler, [gcc] for now. The secondary question is more about calling the/a scripted compiler, directly loaded into the language interpreter. - [AK]

True. It would be nice to have [CriTcl]'s functionality while dispensing with any kind of external compiler. But other than providing an easy way to inline C in a Tcl script, what good would a scripted compiler be? E.g., how would you use scripts to change the compiler's behavior? - [WHD]

I was thinking less about changing its behaviour and more about making use of the high-level data structures available in scripting languages to make the implementation of algorithms for data-flow analysis and the like more ... understandable and/or maintainable. - [AK]

[AM] Because of Critcl I am working on a package that abstracts the concept of a compiler and a linker away from the platform dependencies. This way Critcl will be able to support "any" compiler/linker without the user (or the Critcl programmer) having to jump through hoops.

[Vince] That sounds great!

[NEM] There is [Babel], by Paul Duffin. However, like Feather, it may take a while to get hold of any code from Paul.
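For reference, this is what the inline C discussed above looks like with [CriTcl] today. The external compiler is invoked behind the scenes the first time the command is used, and it is exactly this external step that a scripted compiler would replace:

   package require critcl

   # Define a C function and expose it as the Tcl command "add".
   # critcl hands the C fragment to the external compiler (gcc)
   # when the command is first needed.
   critcl::cproc add {int a int b} int {
       return a + b;
   }

   puts [add 2 3]   ;# prints 5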
-----

Merging the above threads and thoughts into a more cohesive picture. Note the obvious connection to [Starkit]s.

In the context of using a scripted compiler to extend the interpreter of a scripting language I see three main components:

1. The interpreter itself, possibly written in a combination of its scripting language and a system language. It has a mechanism for loading script code, and shared libraries as defined by the OS it is running on.
1. The compiler package. It takes files containing code written in a system language, and/or files written in a mixture of the scripting language and the system language, and compiles them. In Tcl this compiler can evolve out of [CriTcl]. The compiler is able to generate three different results, listed below.
1. A package for loading slim binaries. This package provides a command which reads a slim binary, compiles its contents into (in-memory) machine code, and links that into the interpreter.

Compiler results:

1. ''[Slim Binaries]''. Such files contain data close to machine code, but not quite. They are easy to compile (or map) to machine code, hence very efficient at runtime, but also portable. If the source is a combination of scripting and system language code, the slim binaries could contain either the script code or the portable bytecode used by the interpreter.
1. In-memory ''machine code''. This can be achieved by combining the previous item with the package for loading slim binaries. For efficiency we just have to create a path where it is not necessary to write the slim binary to a file before mapping it to machine code.
1. A ''binary library'' containing machine code in a format native to the target processor and OS. Note the emphasis on ''target'' processor: cross-compilation is well within our long-range goals.
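To make component 3 a little more concrete, here is a purely hypothetical sketch of the Tcl interface such a loader might present. The package name ''slimload'' and all of its commands are invented for illustration; none of them exist:

   # Hypothetical interface sketch; nothing here exists yet.
   package require slimload

   # Read and verify a slim binary from disk.
   set code [slimload::read mypackage.slim]

   # Map the portable representation to in-memory machine code.
   set obj [slimload::compile $code]

   # Link the machine code into the running interpreter, making
   # the commands it defines available.
   slimload::link $obj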
Reasoning and scenarios behind the above:

1. The interpreter core (component 1), a mixture of a system language and its own language, compiles itself, either for the processor/OS it is currently running on, or for a different processor/OS. The second case is standard cross-compiling, i.e. porting the core to a new platform after the compiler has been extended to generate code for that platform. The first case makes sense too: it can be used to have the interpreter core pick up changes in the compiler, like better optimization algorithms. It is these two cases for which we need the compiler (component 2) to be able to generate native binary libraries (result 3).
1. A scenario for package authors: a package containing a mixture of a system language and its own language is loaded, and the system language parts are compiled into in-memory machine code for use by the other parts. This requires result 2, and, of course, the compiler itself.
1. Extending the above to deployment, it makes sense, IMHO, to precompile the system language parts into a dense portable encoding like slim binaries, which can be shipped everywhere, are still as fast as machine code, and do not have the overhead of truly parsing the system language as in the scenario above. In this scenario we do not need the full-fledged optimizing compiler package (FFOCP) at the target, only a loader package for the slim binaries, i.e. component 3. Actually the FFOCP would be detrimental, as the overhead of optimizing would negate the gain we get from having to load only a small file.

The above are the scenarios I thought of when I wrote up the lists of required packages and compiler results. A new scenario I thought of recently:

1. If the target host of a deployed package has the FFOCP too, it could use not only slim binaries quickly mapped to machine code, but also a native library generated by the FFOCP in its spare time, or in batch mode, containing machine code more highly optimized than that produced by the loader. It is not clear whether the gain from more highly optimized machine code outweighs having to load a large native binary library. The research regarding slim binaries suggests that it does not, at least for short-running processes, IMHO. For long-running processes the initial overhead could easily be matched by the gains in speed. The problem is to determine the break-even point, i.e. the point where it makes sense to switch from one to the other.

I should note that the researchers in the area of slim binaries also investigate optimizing the machine code of heavily used procedures at runtime, in parallel to the actual application, using spare cycles in a low-priority thread. The above scenario could be seen as an evolution of this, where the optimization results are written to disk for sharing with future instances of any application using the same procedures. -- [AK]

Thoughts ... Maybe components 1+2 and result 3 are sufficient as a first step? Also: binary code would be a great way to secure part of an app, which in turn could then supply keys for decoding the rest. With slim binaries it gets even better, because the same "code" runs cross-platform. - [jcw]

'''1+2/3''' is essentially [CriTcl] without an external compiler, and as such a logical first step. But note that for result 3 we might need result 2 as source. - [AK]

----

Comments on parsing C and other languages.

The text processing capabilities of scripting languages can make this part easier as well. ''[string] map'', [regsub], [split], etc. are powerful tools. Use the first, for example, to detect the various types of tokens and insert marker characters to separate them. After this pre-tokenization phase we can listify the string into the token list via split. Easy, fast, ... This takes care of the lexer (a small sketch of the technique appears at the bottom of this page). Parsing ... [Yeti], or similar. - [AK]

I've never really put much time into learning all that arcana ... especially since Tcl makes it dead easy to write config files as Tcl ;) Use the [source], Luke ;) - [Ro]

Well, the comments above are not about config files, but about the scripted compiler for C, etc. It is also useful in tools like [Source Navigator] which extract cross-reference information out of sources. - [AK]

----

[DKF]: How would you go about debugging such a beast? Without debugging ... well, let's just say that's highly scary, shall we?

[AK]: Test suites for the components, trace logs (package require log), data structure dumps (trees, graphs) fed into visualization tools, symbolic debuggers like in [TclPro].

----
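Here is a minimal sketch of the pre-tokenization technique described in the parsing discussion above, using ''[string] map'' and [split]. A real C lexer would need a much larger map and special handling of string literals, comments, and multi-character operators like '''==''':

   # Wrap each interesting character in a separator character,
   # then split on that separator. \x01 is assumed not to occur
   # in the input.
   proc lex {code} {
       set sep \x01
       set map [list]
       foreach c {( ) \{ \} ; , + - * = < >} {
           lappend map $c $sep$c$sep
       }
       foreach c [list { } \t \n] {
           lappend map $c $sep
       }
       # Splitting produces empty strings where separators are
       # adjacent; drop them.
       set tokens [list]
       foreach t [split [string map $map $code] $sep] {
           if {$t ne ""} { lappend tokens $t }
       }
       return $tokens
   }

   puts [join [lex "int main() { return 0; }"] |]
   # prints int|main|(|)|{|return|0|;|}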