Tcl/Tk and Expect

If you want to understand why TCL is in many cases a more viable choice than, say, Perl, Python and several other languages, you need to think of it as both a scripting and a macro language, not as a pure scripting language. It is also better suited to embedding into larger programs (that was the design goal). TCL was amazingly successful as a macro language for several important tools (for example, the famous Expect), mail tools and, probably most prominently, networking tools. Some molecular modeling applications use TCL too (the Medea interface). Thanks to the Tk toolkit, TCL was and still is a popular tool for converting old command-line applications to GUI interfaces (Tkman), although "Webalization" (via HTML) is a better approach now. Tk was reimplemented in other scripting languages (Perl, Python) and became the de facto standard for creating simple GUI interfaces.

At one point Sun supported the development of TCL on Solaris to the extent that it hired John K. Ousterhout. His team produced a couple of new, better versions of TCL, and the final quality was high enough for the product to be deployed on Solaris, but Sun became distracted by Java and John K. Ousterhout left Sun for his first TCL startup.

The killer applications for TCL proved to be Expect, a uniquely powerful sysadmin tool, and Tk, a GUI development toolkit. For a nice introduction see Getting Started With Expect (HTML Format).

Expect is available for any flavor of Unix, including Solaris and Linux. An older free version is available for Windows too; ActiveState has a newer non-free version for Windows.


Expect is a great tool in a system administrator's arsenal and can be used to easily automate tasks that require periodic user input. This allows the administrator to make better use of their time than watching an application or utility just to spot the next time it requires input.

The idea of Expect was developed by Don Libes in 1987. The bulk of version 2 was designed and written between January and April 1990. Minor evolution occurred after that until Tcl 6.0 was released. At that time (October 1991) approximately half of Expect was rewritten for version 3. See the Expect history for more information; the HISTORY file is included with the Expect distribution. Around January 1993, an alpha version of Expect 4 was introduced. It included Tk support as well as a large number of enhancements. A few changes were made to the user interface itself, which is why the major version number was changed. A production version of Expect 4 was released in August 1993. In October 1993, an alpha version of Expect 5 was released to match Tcl 7.0. A large number of enhancements were made, including some changes to the user interface itself, which is why the major version number was changed (again).
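To make the idea concrete, here is a minimal sketch of the spawn/expect/send pattern that such automation relies on. This script is not from the Expect distribution; the host name, prompt patterns, and responses are invented placeholders for illustration only.

```tcl
#!/usr/bin/expect -f
# Spawn an interactive program, wait for its prompts, and answer them.
# "example.com" and the prompt patterns below are made-up placeholders.
spawn ftp example.com
expect "Name*:"
send "anonymous\r"
expect "Password:"
send "user@example.com\r"
expect "ftp>"
send "bye\r"
expect eof
```

The same three commands (spawn, expect, send) cover most automation of interactive tools: passwd changes across many hosts, telnet/ssh sessions, installers that demand keystrokes, and so on.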

The production version of Expect 5 was released in March 1994. In the summer of 1999, substantial rewriting of Expect was done in order to support Tcl 8.2. (Expect was never ported to Tcl 8.1, which contained fundamental deficiencies.) This included the creation of an exp-channel driver and object support in order to take advantage of the new regexp engine and UTF/Unicode. The Expect book became available in January 1995. It describes Expect 5 as it is today, rather than as it was when originally released. Thus, if you have not upgraded Expect since before getting the book, you should upgrade now.

TCL/Tk were originally developed by John Ousterhout at the University of California, Berkeley, and then at Sun. Here is a short biographical sketch from his homepage:

John K. Ousterhout is founder and CEO of Electric Cloud, Inc. He is also creator of the Tcl scripting language and is well known for his work in distributed operating systems, high-performance file systems, and user interfaces. Prior to founding Electric Cloud, Ousterhout was founder and CEO of Scriptics Corporation, Distinguished Engineer at Sun Microsystems, and Professor of Computer Science at U.C. Berkeley. He received a BS degree in Physics from Yale University and a PhD in Computer Science from Carnegie Mellon University. Ousterhout is a member of the National Academy of Engineering and a Fellow of the ACM. He has received numerous awards, including the ACM Software System Award, the ACM Grace Murray Hopper Award, the National Science Foundation Presidential Young Investigator Award, and the U.C. Berkeley Distinguished Teaching Award.

Here is how he describes the events:

I got the idea for Tcl while on sabbatical leave at DEC's Western Research Laboratory in the fall of 1987. I started actually implementing it when I got back to Berkeley in the spring of 1988; by summer of that year it was in use in some internal applications of ours, but there was no Tk. The first external releases of Tcl were in 1989, I believe. I started implementing Tk in 1989, and the first release of Tk was in 1991.

TCL has a very simple structure. Each line starts with a command, such as dotask, followed by a number of arguments. Each command is implemented as a C function that is responsible for handling all the arguments. The design was brilliant in its simplicity -- a complete opposite of the PL/1-style design of Perl. In addition to TCL, John Ousterhout developed the Tk toolkit, which also became the most popular (and TCL-independent) GUI development toolkit for Unix. It took on a life of its own and actually became more popular than TCL itself; it has now been ported to many languages. When we talk about TCL we often mean TCL/Tk, but we still need to distinguish:

TCL is the Tool Command Language -- an embeddable scripting language

TK is the Tool Kit -- a graphical interface development tool.
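The uniform "command arg arg ..." structure is easy to see at the script level. A small sketch, reusing the dotask name from above (the proc body and arguments are invented for illustration; in an embedding C program, each such command would be registered as a C function via Tcl_CreateObjCommand):

```tcl
# Every Tcl line is "command arg arg ...", whether built-in or user-defined.
# A user-defined command is invoked exactly like a built-in one.
proc dotask {name {times 1}} {
    for {set i 0} {$i < $times} {incr i} {
        puts "doing task: $name"
    }
}
dotask backup        ;# prints "doing task: backup" once
dotask cleanup 2     ;# prints "doing task: cleanup" twice
```

Because the interpreter does nothing but dispatch the first word to a handler, an application that embeds TCL can add its own commands without touching the language core -- which is exactly what Expect and Tk do.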

Sun was the first and only company that realized the potential of TCL in the Unix environment; John Ousterhout joined SunLabs as a Distinguished Engineer in 1993.

But TCL was not sexy enough, and after Sun started its Java marketing campaign, TCL became an orphan in the corporation. Without a major corporate sponsor, TCL was never able to realize its full potential in Unix, where it could have become a universal application-level macro language. That's really sad.

John Ousterhout left Sun and in 1998 founded Scriptics Corp. (see Tcl's inventor leaves Sun to form startup - SunWorld - February 1998). The life of a startup is a difficult one: in late 2000 it changed its name to Ajuba Solutions and then disappeared after its acquisition by Interwoven. Later he founded Electric Cloud, Inc.

A Tcl core team was organized to drive further development of TCL/Tk:

The Tcl Core Team (TCT) is a group of Tcl experts who collectively manage the development of the open source Tcl core, including the Tcl scripting language, the Tk toolkit, the Tcl Developer Exchange Web site, and other things yet to be determined. The Tcl Core Team was formed in August 2000 with an initial membership elected by the Tcl community. The TCT currently contains the following members: Mo DeJong, Andreas Kupries, Donal Fellows, Karl Lehenbauer, Mark Harrison, Michael McLennan, D. Richard Hipp, Jan Nijtmans, Jeffrey Hobbs, John Ousterhout, George A. Howlett, Don Porter, Jim Ingham, and Brent Welch. As of this writing (early October, 2000) the TCT is just starting to get organized. It will probably take several weeks before we figure out the best way to work together and get shared resources such as the source repositories and Web site set up. More information will appear here as things develop.

From a computer science perspective, Tcl looks like a more interesting development than Perl, as it was the first open source embeddable language. IMHO Perl is a kind of mainframe (anti-Unix) style language that very much reminds me of PL/1 (the "all things for all people" approach: there are always several ways to accomplish anything, etc. -- a bonanza for consultants and book writers ;-), while Tcl is a more Unix-style language. It does one thing well: it is a really decent command language for various tools. And that feature instantly attracted several talented authors. Not only was the famous Expect based on TCL (Expect can probably serve as an example of a tool that is more popular than the underlying language ;-). AOLserver is based on Tcl! StoryServer is based on Tcl! Source Navigator is based on Tk! Tcl does awesome XML! Comanche is based on Tk! ...

It's very sad that Tcl never became prominent in either the Solaris or the Linux environment. BTW that definitely attests to Linux as a neo-conservative movement, as Tcl has tremendous potential to lift a Unix-style OS to a different level by integrating and streamlining the command line and GUI, as well as providing a common internal language to a million obscure Unix utilities, some of which have outlived their usefulness. One needs to understand that despite their non-orthogonality and obscurity (I would like to see a person who really understands all the differences between find and grep in interpreting regular expressions ;-), the current set of Unix utilities represents quite an innovative, semantically rich set of commands for the operating system.

Actually there was one attempt in this direction -- here I would like to mention tksh, a very interesting development that also failed to get proper attention from the Unix/Linux community. Due to the ability to use Tcl as a tool macro language, there are a lot of Tcl-based applications (see Top Applications) -- many more than in any other language (even REXX) -- and that makes an important difference between TCL and other scripting languages. John Ousterhout's views on scripting were best formulated in his famous paper Scripting: Higher Level Programming for the 21st Century. In the paper The salacious truth of the scripting paradigm shift to Tcl-Tk and Python, Nicholas Petreley put it very well:

So expect scripting in general to take off soon. Of the scripting languages available, I predict Tcl/Tk and Python will eventually gain the popularity Perl now enjoys. Although it's somewhat of an apples and oranges comparison, both Tcl/Tk and Python are easier to learn than Perl. And like Perl, there is a version of Tcl/Tk and Python for practically every platform, including Windows 95 and Windows NT.

The WWW initially raised the visibility of Perl dramatically -- to a certain extent at the expense of Tcl and other Unix-based scripting languages. Then it was overtaken by PHP, a simpler but inferior language. Perl used to dominate the CGI scripting arena, and as CGI fell into oblivion, Perl for the WWW fell into oblivion too. After that it returned to the area from which it initially came: a tool for writing more complex sysadmin scripts. Not because it is perfect for the purpose (the string-handling capabilities of Perl are probably too heavily based on regular expressions -- compare, for example, with Lex-based approaches or with parse in REXX), but because it is close enough to the shell to be "psychologically compatible" for those who program in C or shell.

But the main problem with Perl is that it cannot easily be used as an internal macro language. At the same time this is an extremely important problem area, and here Tcl really shines. Although it's somewhat of an apples-to-oranges comparison, I think most people agree that Tcl is easier to learn than Perl or Python and is more extendable. In portability Tcl comes very close to Perl -- there is a version of Tcl/Tk for practically every major platform. Contrary to some Perl advocates' statements, the speeds of Tcl and Perl are very similar (provided, of course, that both implementations use similar implementation techniques, such as conversion to p-code, JVM-like approaches, etc.). See Why Use Tcl for additional details.

Like Tcl itself, Tk is a simple and elegant toolkit that proved to be useful not only to TCL programmers but also in other scripting languages, including Python and Perl. The major benefit of Tk is its conciseness. The classic minimal Tk application is indeed tiny:

pack [button .mybutton -text Bye-bye -command exit]

Compare this with Java and you will understand that newer is not always better. This small but complete Tcl/Tk program puts a button on the screen and exits when the button is pushed, with proper behavior for iconifying, clicking, and other modern GUI conventions.

This economy of expression has two aspects. One is using TCL (or another scripting language) to create a remarkably concise and at the same time powerful toolkit. Typical attempts to create a Java-based GUI application require learning a lot of Java infrastructure, methods, classes, etc. before a beginner can even attempt a working application. That is not true for Tk -- you are productive almost instantly. That's why a number of programmers discover Tk by accident. There are plenty of anecdotes about developers who chose Tk as a quick prototyping tool, then discovered they were able to complete a full-blown application by refining the prototype.

Again, it's important to stress that Tk is remarkably concise and that Ousterhout defined very reasonable default values for Tk that greatly simplify programming. Tk programmers often write prototype implementations and fill in details as requirements emerge. Tk encourages this prototype-based "scripting style" more than any other toolkit: scripting languages are comfortable with variable numbers of arguments, default arguments, and arguments of variable type.
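A sketch of how far those defaults go (the widget names and labels are invented for illustration): every color, font, border, and size below is left to Tk, and only the options that matter to the application are spelled out.

```tcl
# Three widgets, no geometry math, no fonts, no colors -- all defaulted.
label  .prompt -text "Name:"
entry  .name
button .ok -text "OK" -command {puts "Hello, [.name get]"}
pack .prompt .name .ok -side left -padx 2
```

Turning the prototype into a polished application is then a matter of adding options (-font, -width, grid instead of pack, and so on) to code that already works.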

As Cameron Laird and Kathryn Soraiz wrote in their excellent paper Tk sets the standard, Tk can certainly hold its own against other GUI toolkits. It takes only an hour or so to begin producing meaningful applications, and there are plenty of books and tutorials to help.

However, the king of the hill now, at least temporarily, is the Web interface. It is probably the hottest fashion trend, and the true toolkit of choice recently is not Tk but DHTML in the Web browser environment. As everybody who has spent two or more years in software development knows (or at least suspects ;-), the software market is driven by fashion. IMHO fashion can dominate this market for long stretches of years. What is bad is that fashionable trends are not progressive in all respects; in some areas they can even be reactionary. For example, Linux, one of the most powerful fashion trends of the late 90s and early 2000s, does not help TCL/Tk much. Probably because of Linus Torvalds' limited knowledge of languages other than C, Linux was shaped into a pretty conservative Unix-style desktop that lacks the flexibility and power of Amiga or OS/2 with REXX as both scripting and macro language. Both GNOME and KDE are essentially traditional-style GUIs that are not bound to a scripting language. That means that a bright future for Tk is not a given, and the market can capriciously turn its back on Tk. In that case the toolkit's usage might become limited to those bright developers and administrators who realize its indispensability for extremely rapid production of portable, intelligent, well-behaved GUI applications. But I hope this will not happen.

Dr. Nikolai Bezroukov


Source code in Tcl: http://wiki.tcl.tk/9235

February 11, 2014 | Wiki de Calcul Québec

Example: a module file's contents are simple enough; a good starting point is to take an existing file as an example and to modify the variables it contains to adapt it to the module that you wish to install.

File: modulefiles/mpi/openmpi/1.6.3_intel

#%Module1.0#####################################################################
##
## OPENMPI MPI lib
##
proc ModulesHelp { } {
    puts stderr "\tAdds the OpenMPI library to your environment. "
}
module-whatis "(Category_______) mpi"
module-whatis "(Name___________) OpenMPI"
module-whatis "(Version________) 1.6.3"
module-whatis "(Website________) http://www.open-mpi.org/"
module-whatis "(Authorship_____) The Open MPI Team"
module-whatis "(Compiler_______) Intel 2013"
module-whatis "(Flags__________) CFLAGS='-O3 -xHOST -Wall' ../openmpi-1.6.3/configure --prefix=prefix --with-threads "
module-whatis " --enable-mpi-thread-multiple --with-openib --enable-shared --enable-static --with-ft=cr --enable-ft-thread "
module-whatis " --with-blcr=/software/apps/blcr/0.8.4 --with-blcr-libdir=/software/apps/blcr/0.8.4/lib --with-tm=/opt/torque "
module-whatis " CFLAGS='CFLAGS' --with-io-romio-flags='--with-file-system=testfs+ufs+nfs+lustre'"
module-whatis "(Dependencies___) Intel 2013"
conflict mpi
prereq compilers/intel/2013
set synopsys /software/MPI/openmpi/1.6.3_intel
set blcr_synopsys /software/apps/blcr/0.8.4
prepend-path PATH $synopsys/bin:$blcr_synopsys/bin
prepend-path LD_LIBRARY_PATH $synopsys/lib:$blcr_synopsys/lib
prepend-path C_INCLUDE_PATH $synopsys/include
prepend-path CXX_INCLUDE_PATH $synopsys/include
prepend-path CPP_INCLUDE_PATH $synopsys/include
prepend-path CPLUS_INCLUDE_PATH $synopsys/include
prepend-path MANPATH $synopsys/share/man:$blcr_synopsys/man
setenv OMPI_MCA_plm_rsh_num_concurrent 960

Let us consider this example in detail. The file starts with a comment (lines starting with the number sign #), which specifies that it is a module using format 1.0.
Other comments show that this is a module for the MPI library OpenMPI. The actual module then starts by defining a function, ModulesHelp. This function is called when a user runs the following command:

[name@server $] module help mpi/openmpi/1.6.3_intel

This command outputs the message "Adds the OpenMPI library to your environment." to standard error. Note: all messages displayed by the module command use standard error (stderr) instead of standard output (stdout). The module then continues with a list of commands, including the following:

module-whatis -- Allows for a more elaborate description of the module, which is shown using module whatis module_name.
conflict -- Specifies that this module cannot be loaded if the given module was already loaded.
prereq -- Specifies a pre-required module.
set -- Defines a variable that is internal to the module.
prepend-path -- Prefixes an environment variable of the PATH type with the specified path.
setenv -- Defines an environment variable.

The above module defines a detailed description using the module-whatis command. Following that, it specifies that the module cannot be loaded together with another mpi module, which means that only one mpi module can be loaded at a time. After that, it specifies that the Intel compiler, version 2013, is required. Then the module adds some directories to various environment variables, and finally defines a new environment variable (an OpenMPI control parameter in this case).

There are two versions of the Environment Modules package. An experimental version is written in Tcl; the stable, traditional version is written in C. Both versions use the same modulefiles and command-line syntax, except that the Tcl version can use an abbreviated form of the "switch" command.

Tcl version (beta): you must have tclsh somewhere in your default $PATH, version 8.0 or newer; in addition, you must install the files in the init directory someplace that all of your users/systems can access (i.e., there is no automated install for the Tcl version as yet). http://modules.cvs.sourceforge.net/viewvc/modules/modules/tcl/

C version (released versions): this version requires compilation and linking with the libtcl*.a libraries. The first link is the main site; the others are mirrors. http://sourceforge.net/project/showfiles.php?group_id=15538 http://prdownloads.sourceforge.net/modules

SourceForge has a nifty feature called "Monitoring" which allows you to be notified when a project releases new files. For more information or to sign up, go to http://sourceforge.net/projects/modules and under "Latest File Releases" click on the envelope icon.

ADMIN Magazine ... ... ...

If you want to change your compiler or libraries -- basically anything to do with your environment -- you might be tempted to change your $PATH in the .bashrc file (if you are using Bash) and then log out and log back in whenever you need to change your compiler/MPI combination. Initially this sounds like a pain, and it is, but it works to some degree. It doesn't work in the situation where you want to run multiple jobs, each with a different compiler/MPI combination. For example, say I have a job using the GCC 4.6.2 compilers with Open MPI 1.5.2, and then a job using GCC 4.5.3 and MPICH2. If I have both jobs in the queue at the same time, how can I control my .bashrc to make sure each job has the correct $PATH? The only way to do this is to restrict myself to one job in the queue at a time; when it's finished, I can change my .bashrc and submit a new job. Because you are using a different compiler/MPI combination from what is in the queue, even for something as simple as code development, you have to watch when the job is run to make sure your .bashrc matches your job.

The Easy Way

A much better way to handle compiler/MPI combinations is to use Environment Modules. (Be careful not to confuse "environment modules" with "kernel modules.") According to the website, "The Environment Modules package provides for the dynamic modification of a user's environment via modulefiles." Although this might not sound earth shattering, it actually is a quantum leap for using multiple compilers/MPI libraries, but you can use it for more than just that, which I will talk about later. You can use Environment Modules to alter or change environment variables such as $PATH, $MANPATH, $LD_LIBRARY_PATH, and others.
Because most job scripts for resource managers, such as LSF, PBS-Pro, and MOAB, are really shell scripts, you can incorporate Environment Modules into the scripts to set the appropriate $PATH for your compiler/MPI combination, or any other environment variables an application requires for operation.

How you install Environment Modules depends on how your cluster is built. You can build it from source, as I will discuss later, or you can install it from your package manager; just be sure to look for Environment Modules.

Using Environment Modules

To begin, I'll assume that Environment Modules is installed and functioning correctly, so you can now test a few of the options typically used. In this article, I'll be using some examples from TACC. The first thing to check is what modules are available to you by using the module avail command:

[laytonjb@dlogin-0 ~]$ module avail
------------------------------------------- /opt/apps/intel11_1/modulefiles -------------------------------------------
fftw3/3.2.2  gotoblas2/1.08  hdf5/1.8.4  mkl/10.2.4.032  mvapich2/1.4  netcdf/4.0.1  openmpi/1.4
------------------------------------------------ /opt/apps/modulefiles ------------------------------------------------
gnuplot/4.2.6  intel/11.1(default)  papi/3.7.2  intel/10.1  lua/5.1.4  pgi/10.2
-------------------------------------------------- /opt/modulefiles ---------------------------------------------------
Linux  TACC  TACC-paths  cluster
----------------------------------------------- /cm/shared/modulefiles ------------------------------------------------
acml/gcc/64/4.3.0                   fftw3/gcc/64/3.2.2                  mpich2/smpd/ge/open64/64/1.1.1p1
acml/gcc/mp/64/4.3.0                fftw3/open64/64/3.2.2               mpiexec/0.84_427
acml/gcc-int64/64/4.3.0             gcc/4.3.4                           mvapich/gcc/64/1.1
acml/gcc-int64/mp/64/4.3.0          globalarrays/gcc/openmpi/64/4.2     mvapich/open64/64/1.1
acml/open64/64/4.3.0                globalarrays/open64/openmpi/64/4.2  mvapich2/gcc/64/1.2
acml/open64-int64/64/4.3.0          hdf5/1.6.9                          mvapich2/open64/64/1.2
blacs/openmpi/gcc/64/1.1patch03     hpl/2.0                             netcdf/gcc/64/4.0.1
blacs/openmpi/open64/64/1.1patch03  intel-cluster-checker/1.3           netcdf/open64/64/4.0.1
blas/gcc/64/1                       intel-cluster-runtime/2.1           netperf/2.4.5
blas/open64/64/1                    intel-tbb/ia32/22_20090809oss       open64/4.2.2.2
bonnie++/1.96                       intel-tbb/intel64/22_20090809oss    openmpi/gcc/64/1.3.3
cmgui/5.0                           iozone/3_326                        openmpi/open64/64/1.3.3
default-environment                 lapack/gcc/64/3.2.1                 scalapack/gcc/64/1.8.0
fftw2/gcc/64/double/2.1.5           lapack/open64/64/3.2.1              scalapack/open64/64/1.8.0
fftw2/gcc/64/float/2.1.5            mpich/ge/gcc/64/1.2.7               sge/6.2u3
fftw2/open64/64/double/2.1.5        mpich/ge/open64/64/1.2.7            torque/2.3.7
fftw2/open64/64/float/2.1.5         mpich2/smpd/ge/gcc/64/1.1.1p1

This command lists what environment modules are available. You'll notice that TACC has a very large number of possible modules that provide a range of compilers, MPI libraries, and combinations. A number of applications show up in the list as well. You can check which modules are "loaded" in your environment by using the list option with the module command:

[laytonjb@dlogin-0 ~]$ module list
Currently Loaded Modulefiles:
1) Linux   2) intel/11.1   3) mvapich2/1.4   4) sge/6.2u3   5) cluster   6) TACC

This indicates that when I log in, I have six modules already loaded for me. If I want to use any additional modules, I have to load them manually:

[laytonjb@dlogin-0 ~]$ module load gotoblas2/1.08
[laytonjb@dlogin-0 ~]$ module list
Currently Loaded Modulefiles:
1) Linux        3) mvapich2/1.4   5) cluster   7) gotoblas2/1.08
2) intel/11.1   4) sge/6.2u3      6) TACC

You can just cut and paste from the list of available modules to load the ones you want or need. (This is what I do, and it makes things easier.) By loading a module, you will have just changed the environment variables defined for that module. Typically this is $PATH, $MANPATH, and $LD_LIBRARY_PATH.

To unload or remove a module, just use the unload option with the module command, but you have to specify the complete name of the environment module:

[laytonjb@dlogin-0 ~]$ module unload gotoblas2/1.08
[laytonjb@dlogin-0 ~]$ module list
Currently Loaded Modulefiles:
1) Linux   2) intel/11.1   3) mvapich2/1.4   4) sge/6.2u3   5) cluster   6) TACC

Notice that the gotoblas2/1.08 module is no longer listed. Alternatively, you can unload all loaded environment modules using module purge:

[laytonjb@dlogin-0 ~]$ module purge
[laytonjb@dlogin-0 ~]$ module list
No Modulefiles Currently Loaded.

You can see here that after the module purge command, no more environment modules are loaded. If you are using a resource manager (job scheduler), you are likely creating a script that requests the resources and runs the application. In this case, you might need to load the correct Environment Modules in your script. Typically, after the part of the script in which you request resources (in the PBS world, these are defined as #PBS commands), you load the environment modules you need. Now that you've seen a few basic commands for using Environment Modules, I'll go into a little more depth, starting with installing from source. Then I'll use the module in a job script and write my own module.

Building Environment Modules for Clusters

In my opinion, the quality of open source code has improved over the last several years to the point at which building and installing is fairly straightforward, even if you haven't built any code before. If you haven't built code, don't be afraid to start with Environment Modules. For this article, as an example, I will build Environment Modules on a "head" node in the cluster in /usr/local. I will assume that you have /usr/local NFS-exported to the compute nodes, or some other filesystem or directory that is mounted on the compute nodes (perhaps a global filesystem?).
If you are building and testing your code on a production cluster, be sure to check that /usr/local is mounted on all of the compute nodes. To begin, download the latest version -- it should be a *.tar.gz file. (I'm using v3.2.6, but the latest as of writing this article is v3.2.9.) To make things easier, build the code in /usr/local. The documentation that comes with Environment Modules recommends that it be built in /usr/local/Modules/src. As root, run the following commands:

% cd /usr/local
% mkdir Modules
% cd Modules
% mkdir src
% cp modules-3.2.6.tar.gz /usr/local/Modules/src
% gunzip -c modules-3.2.6.tar.gz | tar xvf -
% cd modules-3.2.6

At this point, I would recommend you carefully read the INSTALL file; it will save your bacon. (The first time I built Environment Modules, I didn't read it and had lots of trouble.) Before you start configuring and building the code, you need to fulfill a few prerequisites. First, you should have Tcl installed, as well as the Tcl Development package. Because I don't know what OS or distribution you are running, I'll leave to you the tasks of installing Tcl and Tcl Development on the node where you will be building Environment Modules. At this point, you should configure and build Environment Modules. As root, enter the following commands:

% cd /usr/local/Modules/src/modules-3.2.6
% ./configure
% make
% make install

The INSTALL document recommends making a symbolic link in /usr/local/Modules connecting the current version of Environment Modules to a directory called default:

% cd /usr/local/Modules
% sudo ln -s 3.2.6 default

The reason they recommend the symbolic link is that, if you upgrade Environment Modules to a new version, you build it in /usr/local/Modules/src and then create a symbolic link from /usr/local/Modules/<new> to /usr/local/Modules/default, which makes it easier to upgrade. The next thing to do is copy one (possibly more) of the init files for Environment Modules to a global location for all users.
For my particular cluster, I chose to use the sh init file. This file configures Environment Modules for all of the users. I chose the sh version rather than csh or bash because sh is the least common denominator:

% sudo cp /usr/local/Modules/default/init/sh /etc/profile.d/modules.sh
% chmod 755 /etc/profile.d/modules.sh

Now users can use Environment Modules by just putting the following in their .bashrc or .profile:

% . /etc/profile.d/modules.sh

As a simple test, you can run the above script and then type the command module. If you get some information about how to use modules, such as what you would see if you used the -help option, then you have installed Environment Modules correctly.

Environment Modules in Job Scripts

In this section, I want to show you how you can use Environment Modules in a job script. I am using PBS for this quick example, with this code snippet for the top part of the job script:

#PBS -S /bin/bash
#PBS -l nodes=8:ppn=2
. /etc/profile.d/modules.sh
module load compiler/pgi6.1-X86_64
module load mpi/mpich-1.2.7
(insert mpirun command here)

At the top of the code snippet are the PBS directives that begin with #PBS. After the PBS directives, I invoke the Environment Modules startup script (modules.sh). Immediately after that, you should load the modules you need for your job. For this particular example, taken from a three-year-old job script of mine, I've loaded a compiler (pgi 6.1-x86_64) and an MPI library (mpich-1.2.7).

Building Your Own Module File

Creating your own module file is not too difficult. If you happen to know some Tcl, then it's pretty easy; however, even if you don't know Tcl, it's simple to follow an example to create your own. The modules themselves define what you want to do to the environment when you load the module. For example, you can create new environment variables that you might need to run the application, or change $PATH, $LD_LIBRARY_PATH, or $MANPATH so a particular application will run correctly.
Believe it or not, you can even run code within the module or call an external application. This makes Environment Modules very, very flexible. To begin, remember that all modules are written in Tcl, so this makes them very programmable. For the example, here, all of the module files go in /usr/local/Modules/default/modulefiles. In this directory, you can create subdirectories to better label or organize your modules. In this example, I'm going to create a module for gcc-4.6.2 that I build and install into my home account. To begin, I create a subdirectory called compilers for any module file that has to do with compilers. Environment Modules has a sort of template you can use to create your own module. I used this as the starting point for my module. As root, do the following: % cd /usr/local/Modules/default/modulefiles % mkdir compilers % cp modules compilers/gcc-4.6.2 The new module will appear in the module list as compilers/gcc-4.6.2. I would recommend that you look at the template to get a feel for the syntax and what the various parts of the modulefile are doing. Again, recall that Environment Modules use Tcl as its language but you don't have to know much about Tcl to create a module file. The module file I created follows: #%Module1.0##################################################################### ## ## modules compilers/gcc-4.6.2 ## ## modulefiles/compilers/gcc-4.6.2. 
## Written by Jeff Layton
##
proc ModulesHelp { } {
    global version modroot
    puts stderr "compilers/gcc-4.6.2 - sets the Environment for GCC 4.6.2 in my home directory"
}

module-whatis "Sets the environment for using gcc-4.6.2 compilers (C, Fortran)"

# for Tcl script use only
set topdir /home/laytonj/bin/gcc-4.6.2
set version 4.6.2
set sys linux86

setenv CC $topdir/bin/gcc
setenv GCC $topdir/bin/gcc
setenv FC $topdir/bin/gfortran
setenv F77 $topdir/bin/gfortran
setenv F90 $topdir/bin/gfortran

prepend-path PATH $topdir/include
prepend-path PATH $topdir/bin
prepend-path MANPATH $topdir/man
prepend-path LD_LIBRARY_PATH $topdir/lib

The file might seem a bit long, but it is actually fairly compact. The first section provides help with this particular module if a user asks for it (the line that begins with puts stderr); for example:

home8:~> module help compilers/gcc-4.6.2
----------- Module Specific Help for 'compilers/gcc-4.6.2' --------
compilers/gcc-4.6.2 - sets the Environment for GCC 4.6.2 in my home directory

You can have multiple strings by using several puts stderr lines in the module (the template has several lines). After the help section in the procedure ModulesHelp, another line provides some simple information when a user uses the whatis option; for example:

home8:~> module whatis compilers/gcc-4.6.2
compilers/gcc-4.6.2 : Sets the environment for using gcc-4.6.2 compilers (C, Fortran)

After the help and whatis definitions is a section where I create whatever environment variables are needed, as well as modify $PATH, $LD_LIBRARY_PATH, and $MANPATH or other standard environment variables. To make life a little easier for me, I defined some local variables: topdir, version, and sys. I only used topdir, but I defined the other two variables in case I needed to go back and modify the module (the variables can help remind me what the module was designed to do).
In this particular modulefile, I defined a set of environment variables pointing to the compilers (CC, GCC, FC, F77, and F90). After defining those environment variables, I modified $PATH, $LD_LIBRARY_PATH, and $MANPATH so that the compiler was first in these paths by using the prepend-path directive. This basic module is pretty simple, but you can get very fancy if you want or need to. For example, you could make a module file dependent on another module file so that you have to load a specific module before you load the one you want. Or, you can call external applications -- for example, to see whether an application is installed and functioning. You are pretty much limited only by your needs and imagination.

Making Sure It Works Correctly

Now that you've defined a module, you need to check to make sure it works. Before you load the module, check to see which gcc is being used:

home8:~> which gcc
/usr/bin/gcc
home8:~> gcc -v
Reading specs from /usr/lib/gcc/i386-redhat-linux/3.4.3/specs
Configured with: ../configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --enable-shared --enable-threads=posix --disable-checking --with-system-zlib --enable-__cxa_atexit --disable-libunwind-exceptions --enable-java-awt=gtk --host=i386-redhat-linux
Thread model: posix
gcc version 3.4.3 20050227 (Red Hat 3.4.3-22.1)

This means gcc is currently pointing to the system gcc. (Yes, this is a really old gcc; I need to upgrade my simple test box at home.)
Next, load the module and check which gcc is being used:

home8:~> module avail
----------------------------- /usr/local/Modules/versions ------------------------------
3.2.6
------------------------- /usr/local/Modules/3.2.6/modulefiles --------------------------
compilers/gcc-4.6.2  dot          module-info  null
compilers/modules    module-cvs   modules      use.own
home8:~> module load compilers/gcc-4.6.2
home8:~> module list
Currently Loaded Modulefiles:
1) compilers/gcc-4.6.2
home8:~> which gcc
~/bin/gcc-4.6.2/bin/gcc
home8:~> gcc -v
Using built-in specs.
Target: i686-pc-linux-gnu
Configured with: ./configure --prefix=/home/laytonj/bin/gcc-4.6.2 --enable-languages=c,fortran --enable-libgomp
Thread model: posix
gcc version 4.6.2

This means if you used gcc, you would end up using the version built in your home directory. As a final check, unload the module and recheck where the default gcc points:

home8:~> module unload compilers/gcc-4.6.2
home8:~> module list
No Modulefiles Currently Loaded.
home8:~> which gcc
/usr/bin/gcc
home8:~> gcc -v
Reading specs from /usr/lib/gcc/i386-redhat-linux/3.4.3/specs
Configured with: ../configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --enable-shared --enable-threads=posix --disable-checking --with-system-zlib --enable-__cxa_atexit --disable-libunwind-exceptions --enable-java-awt=gtk --host=i386-redhat-linux
Thread model: posix
gcc version 3.4.3 20050227 (Red Hat 3.4.3-22.1)

Notice that after you unload the module, the default gcc goes back to the original version, which means the environment variables are probably correct. If you want to be more thorough, you should check all of the environment variables before loading the module, after the module is loaded, and then after the module is unloaded. But at this point, I'm ready to declare success!

Final Comments

For clusters, Environment Modules are pretty much the best solution for handling multiple compilers, multiple libraries, or even applications.
They are easy to use even for beginners to the command line. Just a few commands allow you to add modules to and remove them from your environment easily. You can even use them in job scripts. As you also saw, it's not too difficult to write your own module and use it. Environment Modules are truly one of the indispensable tools for clusters.
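To make the effect of those modulefile directives concrete, here is a small Python sketch (illustrative only; load_module and unload_module are hypothetical helpers, not part of Environment Modules) of what setenv and prepend-path do to the environment, and why unloading restores the original gcc:

```python
# Illustrative sketch (not Environment Modules' own code): what the
# setenv and prepend-path directives in the gcc-4.6.2 modulefile do to
# the environment, and how an unload can reverse them.
import os

def load_module(env, topdir):
    """Apply the modulefile's directives to an environment dict."""
    env["CC"] = os.path.join(topdir, "bin", "gcc")       # setenv CC ...
    env["FC"] = os.path.join(topdir, "bin", "gfortran")  # setenv FC ...
    for var, sub in [("PATH", "bin"), ("MANPATH", "man"),
                     ("LD_LIBRARY_PATH", "lib")]:
        entry = os.path.join(topdir, sub)                # prepend-path ...
        env[var] = entry + (":" + env[var] if env.get(var) else "")
    return env

def unload_module(env, topdir):
    """Undo the load: drop the variables and path entries it added."""
    for var in ("CC", "FC"):
        env.pop(var, None)
    for var in ("PATH", "MANPATH", "LD_LIBRARY_PATH"):
        parts = [p for p in env.get(var, "").split(":")
                 if p and not p.startswith(topdir)]
        env[var] = ":".join(parts)
    return env

env = {"PATH": "/usr/bin:/bin"}
load_module(env, "/home/laytonj/bin/gcc-4.6.2")
print(env["PATH"])   # the new compiler's bin directory now comes first
unload_module(env, "/home/laytonj/bin/gcc-4.6.2")
print(env["PATH"])   # back to the original search path
```

The real modulecmd records exactly what each load changed so an unload can reverse it precisely; this sketch simply strips entries under the module's top directory.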

Feb 22, 2015 | nickgeoghegan.net

Creating module files

Above we specified that module files be placed in /modules, so that's where we'll put gcc's module files. Create a gcc directory, if there isn't one:

mkdir /modules/gcc

Add the associated module file:

vim /modules/gcc/4.6.2

What's in that file, then?

#%Module1.0
proc ModulesHelp { } {
    global dotversion
    puts stderr "\tGCC 4.6.2 (gcc, g++, gfortran)"
}
module-whatis "GCC 4.6.2 (gcc, g++, gfortran)"
conflict gcc
prepend-path PATH /packages/gcc/4.6.2/bin
prepend-path LD_LIBRARY_PATH /packages/gcc/4.6.2/lib64
prepend-path LIBRARY_PATH /packages/gcc/4.6.2/lib64
prepend-path MANPATH /packages/gcc/4.6.2/man
setenv CC gcc
setenv CXX g++
setenv FC gfortran
setenv F77 gfortran
setenv F90 gfortran

Modules allows you to set default versions of packages. So, say you have 4 versions of gcc and you'd like 4.6.2 as the default version; you can set it in a version file:

vim /modules/gcc/.version

#%Module1.0
set ModulesVersion "4.6.2"

How do I use modules?

Well, it's about bloody time that we finally get to use the damn modules we've setup, otherwise you'd drive to my house and beat the piss out of me. List the modules on your system with module avail:

[nick@zoidberg ~]$ module avail
---------------------------------- /modules/ -----------------------------------
gcc/4.6.2(default) haskell/ghc/7.0.4

The (default) means that I can just load gcc without specifying the version numbers. Load a module on your system with module load. Before we do this, I'll assure you it works.

[nick@zoidberg ~]$ gcc --version
gcc (Debian 4.4.5-8) 4.4.5

Let's load gcc version 4.6.2:

[nick@zoidberg ~]$ module load gcc/4.6.2
[nick@zoidberg ~]$ gcc --version
gcc (GCC) 4.6.2

We can also load this version of gcc without specifying the version number, as 4.6.2 is the default.
[nick@zoidberg ~]$ module load gcc
[nick@zoidberg ~]$ gcc --version
gcc (GCC) 4.6.2

See what modules are loaded

The loaded modules will always contain version numbers, if you install them into the same folder structure as mine.

[nick@zoidberg ~]$ module list
Currently Loaded Modulefiles:
1) /gcc/4.6.2

Unloading modules

The syntax for unloading modules is the same as loading them.

[nick@zoidberg ~]$ module unload gcc
[nick@zoidberg ~]$ gcc --version
gcc (Debian 4.4.5-8) 4.4.5
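The .version file above is what lets a bare "module load gcc" pick 4.6.2. Here is a Python sketch of that resolution step (an assumption about the mechanism, for illustration; default_version and resolve are made-up helpers, not modulecmd's actual code):

```python
# Sketch of how a .version file lets `module load gcc` resolve to
# gcc/4.6.2 without an explicit version (illustrative, not modulecmd).
import re

DOT_VERSION = '''#%Module1.0
set ModulesVersion "4.6.2"
'''

def default_version(dot_version_text):
    """Pull the default version out of a .version file's text."""
    m = re.search(r'set\s+ModulesVersion\s+"([^"]+)"', dot_version_text)
    return m.group(1) if m else None

def resolve(name, available, defaults):
    """Turn a bare module name into name/version using the default."""
    if "/" in name:
        return name               # version given explicitly
    version = defaults.get(name)  # consult the .version default
    full = "%s/%s" % (name, version)
    return full if full in available else None

defaults = {"gcc": default_version(DOT_VERSION)}
available = {"gcc/4.6.2", "haskell/ghc/7.0.4"}
print(resolve("gcc", available, defaults))        # gcc/4.6.2
print(resolve("gcc/4.6.2", available, defaults))  # gcc/4.6.2
```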

Jim is a small footprint implementation of the Tcl programming language. It implements a large subset of Tcl and adds new features like references with garbage collection, closures, a built-in object-oriented programming system, functional programming commands, and first-class arrays. The interpreter's executable file is only 70 KB in size, and can be reduced further by excluding some commands. It is appropriate for inclusion inside existing programs, for scripting without dependencies, and for embedded systems.

Redet is a tool for developing and executing regular expressions using any of more than 50 search programs, editors, and programming languages, intended both for developing regular expressions for use elsewhere and as a search tool in its own right. For each program in each locale, a palette showing the available constructs is provided. The properties of each program are determined by runtime tests, which guarantees that they will be correct for the program version and locale. Additional features include persistent history, extensive help, a variety of character entry tools, and the ability to change locale while running. Redet is highly configurable and fully supports Unicode.

About:

Tickle Text is a fast, lightweight text editor with many features, including xterm, wish term, line numbers, find/replace, special characters, word count, color configuration, and more.

Release focus: Major feature enhancements

Changes:

A comment/uncomment feature was added. Templates for Perl and Python scripts were added. An FTP dialog to upload or download scripts and Web pages to or from a Web server was added. Saving of user preferences and configurations was added.

About: Tony's Tickle Text Editor, or TclText, is a fast, lightweight text editor. It includes an embedded xterm and printing via lpr. To be easy on the eyes, the text and/or background color can be changed, and font size can be manipulated. Changes: The ability to display line numbers was added. Buttons were moved to menus. Xterm was moved to a toplevel window.

About: Tiny Eclipse is a distribution of Eclipse for development with dynamic languages for the Web, such as JSP, PHP, Ruby, TCL, and Web Services. It features a small download size, the ability to choose the features you want to install, and GUI installers for Win32 and Linux GTK x86.

Jul 31, 2007 | www.ibm.com/developerworks

If you manage systems and networks, you need Expect. More precisely, why would you want to be without Expect? It saves hours that common tasks otherwise demand. Even if you already depend on Expect, though, you might not be aware of the capabilities described below.

Expect automates command-line interactions

You don't have to understand all of Expect to begin profiting from the tool; let's start with a concrete example of how Expect can simplify your work on AIX® or other operating systems: Suppose you have logins on several UNIX® or UNIX-like hosts and you need to change the passwords of these accounts, but the accounts are not synchronized by Network Information Service (NIS), Lightweight Directory Access Protocol (LDAP), or some other mechanism that recognizes you're the same person logging in on each machine. Logging in to a specific host and running the appropriate passwd command doesn't take long -- probably only a minute, in most cases. And you must log in "by hand," right, because there's no way to script your password? Wrong. In fact, the standard Expect distribution (full distribution) includes a command-line tool (and a manual page describing its use!) that takes over precisely this chore. passmass (see Resources) is a short script written in Expect that makes it as easy to change passwords on twenty machines as on one. Rather than retyping the same password over and over, you can launch passmass once and let your desktop computer take care of updating each individual host. You save yourself enough time to get a bit of fresh air, and you avoid multiple opportunities for the frustration of mistyping something you've already entered.

The limits of Expect

This passmass application is an excellent model -- it illustrates many of Expect's general properties: It's a great return on investment: The utility is already written, freely downloadable, easy to install and use, and saves time and effort.

Its contribution is "superficial," in some sense. If everything were "by the book" -- if you had NIS or some other domain authentication or single sign-on system in place -- or even if login could be scripted, there'd be no need for passmass. The world isn't polished that way, though, and Expect is very handy for grabbing on to all sorts of sharp edges that remain. Maybe Expect will help you create enough free time to rationalize your configuration so that you no longer need Expect. In the meantime, take advantage of it.

As distributed, passmass only logs in by way of telnet, rlogin, or slogin. I hope all current developerWorks readers have abandoned these protocols for ssh, which passmass does not fully support.

On the other hand, almost everything having to do with Expect is clearly written and freely available. It only takes three simple lines (at most) to enhance passmass to respect ssh and other options. You probably know enough already to begin to write or modify your own Expect tools. As it turns out, the passmass distribution actually includes code to log in by means of ssh, but omits the command-line parsing to reach that code. Here's one way you might modify the distribution source to put ssh on the same footing as telnet and the other protocols:

Listing 1. Modified passmass fragment that accepts the -ssh argument

    ...
    } "-rlogin" {
        set login "rlogin"
        continue
    } "-slogin" {
        set login "slogin"
        continue
    } "-ssh" {
        set login "ssh"
        continue
    } "-telnet" {
        set login "telnet"
        continue
    ...

In my own code, I actually factor out more of this "boilerplate." For now, though, this cascade of tests, in the vicinity of line #100 of passmass, gives a good idea of Expect's readability. There's no deep programming here -- no need for object-orientation, monadic application, co-routines, or other subtleties. You just ask the computer to take over typing you usually do for yourself. As it happens, this small step represents many minutes or hours of human effort saved.
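The expect/send pattern that passmass is built on can be sketched outside of Tcl. The following Python toy is purely illustrative: real Expect drives a pseudo-terminal, while the hypothetical FakeSession class here only simulates a host's prompts so the wait-for-pattern, send-reply loop is visible:

```python
# Illustrative sketch of the expect/send loop at the heart of tools like
# passmass. FakeSession is a stand-in for a remote host, NOT real Expect.
import re

class FakeSession:
    """Maps what we send to what the 'host' prints next."""
    def __init__(self):
        self.output = "login: "
        self.replies = {
            "alice\n": "Password: ",
            "old-secret\n": "$ ",
            "passwd\n": "New password: ",
            "new-secret\n": "Retype new password: ",
        }
    def send(self, text):
        self.output += self.replies.get(text, "")
    def expect(self, pattern):
        # Real Expect blocks with a timeout; this just scans the transcript.
        return re.search(pattern, self.output) is not None

session = FakeSession()
script = [                      # (pattern to wait for, reply to send)
    (r"login: ", "alice\n"),
    (r"Password: ", "old-secret\n"),
    (r"\$ ", "passwd\n"),
    (r"New password: ", "new-secret\n"),
]
for pattern, reply in script:
    assert session.expect(pattern), "did not see %r" % pattern
    session.send(reply)
print(session.output)  # the dialogue ends at the retype-password prompt
```

The password values and prompts above are invented for the demo; the point is only the alternation of pattern-matching and sending that Expect scripts automate.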

06 Jun 2006 Most computer users interact with their workstations primarily through some form of graphical user interface (GUI). In the world of Microsoft® Windows®, this interface is tightly controlled. The UNIX® world, by contrast, offers a veritable smorgasbord of different GUIs with varying degrees of functionality. The GUIs are commonly known as window managers, because they occupy a layer above the nitty-gritty X-Windows layer and they manage windows for you. UNIX has always been about freedom and the right to choose how you want your system to work. Different personality types require different things, and the UNIX GUIs have something for everyone. They range from minimalist window managers, such as twm, to large, capable tools, such as GNOME and KDE (K Desktop Environment). In between are dozens of others offering various flavors and functions. (To view a current list of some of the most common window managers, see the Resources section.) You might want some form of scripted control over your desktop that lets you program in functionality without spending an inordinate amount of time learning how to do it. Or, you might prefer a programmable window manager, so you can control its behavior as much as you control the command-line interface. A successful option to meet both needs is a minimalist window manager that runs on top of a powerful window manager, allowing you just enough flexibility to script with ease. Tcl/Tk to the rescue Originating with Dr. John Ousterhout of the University of California, Berkeley, and later of Sun Microsystems and Scriptics, the Tool Command Language and Tk GUI toolkit (Tcl/Tk) scripting language offers a simple and elegant way to code GUI widgets with minimal effort. Programming a computer to do something always involves at least two important issues: how it will look and how it will work. 
The Tcl/Tk programming language strives to make it as painless as possible to deal with the how will it look issue; it lets you easily script GUI widgets, such as windows and buttons, and attach them to procedures (the how will it work issue).

What to do?

A clean, uncluttered desktop is good in both the virtual and real worlds. Uncluttered means having just a few icons that do a lot of work to prepare the desktop. (Sometimes these are known as coffee icons -- you click them and then go get a cup of coffee while they start up a set of programs that you need to work with.) It's good to group the execution of several programs directed toward a single task into the actions of just one icon. For example, to write an article like this one, you can script a single icon that, when clicked, opens an xterm to play with, a text editor in which to edit XML files, and a browser that displays the article. With one click of an icon, all these tools are launched, configured the way you need them, with the correct screen geometry, the working directory set, and all files opened and ready to be worked on. That's scripting! You can accomplish the same result with a good bash shell script and attach the script to a desktop icon, but Tcl/Tk offers a significantly better interface to the GUI aspects.

Writer's assistant

Listing 1 shows a Tcl/Tk program that accomplishes all the tasks described in the previous section with a click of an icon. In addition to starting up the xterm, editor, and browser, it provides a button you can click at any time to convert the article from XML to HTML by running a script provided by IBM.

Listing 1. The writer's assistant

#!/usr/bin/wish -f
# 1 -- How should it look?
set PageTitle "IBM developerWorks Articles -- Studio B"
set InitDir "/home/bill/StudioB/developerworks"
set ChooseDir "Choose an Article Directory"
frame .rc -borderwidth 2
wm title . $PageTitle
wm resizable . 0 0
button .rc.b -width 20 -text "Rebuild Article" -command {rebuild-article}
button .rc.c -width 20 -text "Exit" -command {exit}
pack .rc
pack .rc.b
pack .rc.c
grid .rc.b .rc.c -sticky ew
# 2 -- What should it do?
proc rebuild-article {} {
    exec ./dw-transform.sh &
}
wm geometry . =336x32+0+707
set dir ""
while {$dir == ""} {
    set dir [tk_chooseDirectory \
        -initialdir $InitDir \
        -title $ChooseDir]
}
cd $dir
exec nedit -geometry 150x48+0+0 index.xml &
exec konsole &
exec opera -geometry 1024x680+0+0 index.html &

You can screen-scrape the code by holding down the left button of the mouse and scraping all of it. Then, start your favorite editor and paste the text into it by clicking the middle mouse button. Save the file as article.tcl and make it executable with the chmod command:

$ chmod +x article.tcl

Because window managers can differ, as can hardware (you might not even have a mouse with a middle button), these instructions are not as hard and fast as you might need. In that case, contact someone to help you. The next section considers the parts of the program.

The variables

The first portion of the code uses the traditional UNIX pound-bang string (#!) to run the Tcl/Tk interpreter known as wish. The -f switch tells wish to run the script and then exit. After this, a comment, denoted by #, begins the How should it look? block of code; this part is followed by the setting of string variables. It's always a good programming practice to group literals like this, high above the code:

#!/usr/bin/wish -f
# 1 -- How should it look?
set PageTitle "IBM developerWorks Articles -- Studio B"
set InitDir "/home/bill/StudioB/developerworks"
set ChooseDir "Choose an Article Directory"

The Tk GUI code

This tiny block of code creates a window with two buttons side-by-side. For brevity, it's hard to beat Tk when defining windows and widgets:

frame .rc -borderwidth 2
wm title . $PageTitle
wm resizable . 0 0
button .rc.b -width 20 -text "Rebuild Article" -command {rebuild-article}
button .rc.c -width 20 -text "Exit" -command {exit}
pack .rc
pack .rc.b
pack .rc.c
grid .rc.b .rc.c -sticky ew

The frame command creates a window called .rc with a borderwidth of two pixels. The wm command tells the underlying window manager to name the topmost window (. is always topmost) with the page title, and the next wm command prevents the user from resizing it. The two buttons are then defined and packed into the grid horizontally.

The button code

IBM provides a program to convert articles (written in XML) to Web pages (HTML) that have the look of the developerWorks site. The rebuild-article procedure invokes that script when you click the button of the same name. When the script finishes, it displays a dialog describing how it did in converting the XML to HTML, along with error messages (if any). If the program says that an index.html file was created, then you can click OK, give your browser the focus, and refresh the page to see the new document:

proc rebuild-article {} {
    exec ./dw-transform.sh &
}

The startup code

In the startup code, wm geometry . =336x32+0+707 puts the desktop extension window at the bottom of a 1024 x 768 desktop. Then, the code opens a dialog that lets you choose which article you want to work on. Finally, it makes the article's directory into the current working directory and starts up the editor (nedit), xterm (konsole), and browser (Opera). All of them are configured properly for work to begin immediately:

wm geometry . =336x32+0+707
set dir ""
while {$dir == ""} {
    set dir [tk_chooseDirectory \
        -initialdir $InitDir \
        -title $ChooseDir]
}
cd $dir
exec nedit -geometry 150x48+0+0 index.xml &
exec konsole &
exec opera -geometry 1024x680+0+0 index.html &

This Tcl/Tk program can be improved in many ways. Adding a button to create new articles or a button to back up all the articles to an archive would be worthy improvements, and Tcl/Tk makes it easy for you to do these things. You can also set up your preferred editor, xterm, and browser. Feel free to experiment with the look, and by all means don't limit this desktop extension to writing articles for IBM: Modify it to start up a programming environment or a Web-mastering environment.

A 50-line scientific calculator

Now you'll Tcl your desktop with a more complex GUI program: a scientific calculator written in a mere 50 lines of code (see Listing 2). The calculator evaluates expressions typed in from either the keyboard or the GUI buttons. If you enter an invalid expression, the text turns red. Just as with the program in Listing 1, you can screen-scrape this code into a file, save it as calc.tcl, chmod it as executable, and then run it.

Listing 2. The 50-line calculator

#!/usr/bin/wish -f
# 1 -- How should it look?
set cflag 0; set nextkey 0
wm title . "50-Line Calculator"
wm resizable . 0 0
grid [entry .display -textvariable expression -justify right] -columnspan 6
focus .display
bind .display <Return> equals; bind .display <KP_Enter> equals
foreach row {
    {7 8 9 + - sin(}
    {4 5 6 * / cos(}
    {1 2 3 ( ) tan(}
    {C 0 . = Pi log(}
} {
    foreach key $row {
        switch -- $key {
            C {set string {set cflag 1; set expression ""}}
            Pi {set string pi}
            = {set string equals}
            default {set string "press $key"}
        }
        lappend keys [button .[incr nextkey] -text $key -command $string]
    }
    eval grid $keys -sticky we
    set keys [list]
}
# 2 -- What should it do?
proc press {key} {
    if $::cflag {
        set ::expression ""
        if ![regexp {[0-9().]} $key] {set ::expression $::results}
        .display configure -fg black
        .display icursor end; set ::cflag 0
    }
    .display insert end $key
}
proc pi {} {
    if $::cflag {set ::expression ""}
    lappend ::expression "3.14159265"
    .display icursor end; set ::cflag 0
}
proc equals {} {
    regsub {=.+} $::expression "" ::expression
    if [catch {lappend ::expression = \
        [set ::results [expr \
        [string map {/ *1.0/} $::expression]]]}] {
        .display configure -fg red
    }
    .display xview end; set ::cflag 1
}

This code cheats a little by doubling a few of the shorter lines, but you can see that Tcl/Tk is a remarkably terse language. Not many programming languages enable you to create such a complex GUI program in so few lines of code.

The main window

Besides the usual pound-bang code to run the interpreter, you set a few variables: cflag, which controls how the main data entry widget .display accumulates characters; and nextkey, which is an index used while you build the array of keys. Then, just as in the previous program, you set the window title text and prevent it from being resized. After this, you define the widget that is the calculator's display. Several things to note here: The text is always right-justified -- as in most calculator displays, the keys accumulate at the right side of the control and scroll to the left. The focus verb puts the focus in the display so you can use the keyboard as well as the GUI buttons to enter an expression. Then, you bind the code that you want to run when either the Enter key or the keypad Enter key is pressed, and the display is complete:

#!/usr/bin/wish -f
# 1 -- How should it look?
set cflag 0; set nextkey 0
wm title . "50-Line Calculator"
wm resizable . 0 0
grid [entry .display -textvariable expression -justify right] -columnspan 6
focus .display
bind .display <Return> equals; bind .display <KP_Enter> equals

The calculator keypad

This code dynamically builds the keypad keys of the calculator -- defining each button with the text it puts into the display -- or, in the case of the C, Pi, and = keys, a special procedure to execute when the key is pressed. Not many programming languages enable you to define such a complex GUI with so many widgets in so few lines of code as Tcl/Tk:

foreach row {
    {7 8 9 + - sin(}
    {4 5 6 * / cos(}
    {1 2 3 ( ) tan(}
    {C 0 . = Pi log(}
} {
    foreach key $row {
        switch -- $key {
            C {set string {set cflag 1; set expression ""}}
            Pi {set string pi}
            = {set string equals}
            default {set string "press $key"}
        }
        lappend keys [button .[incr nextkey] -text $key -command $string]
    }
    eval grid $keys -sticky we
    set keys [list]
}

The calculator functionality

With just three procedures, you now define all the calculator functionality you need. Keep in mind that while this program runs, the action is focused in the contents of the display widget. The expr function evaluates these contents when you press the = key or the Enter key. The press procedure kicks in when most of the GUI buttons are pressed. Its purpose is to append the contents of the button's name text to the display and to the expression string, where the expression will be evaluated later. The cflag, when set, clears the expression, preparing for the next expression to be entered and evaluated:

# 2 -- What should it do?
proc press {key} {
    if $::cflag {
        set ::expression ""
        if ![regexp {[0-9().]} $key] {set ::expression $::results}
        .display configure -fg black
        .display icursor end; set ::cflag 0
    }
    .display insert end $key
}

The pi procedure inserts the value of pi into the display and expression variable:

proc pi {} {
    if $::cflag {set ::expression ""}
    lappend ::expression "3.14159265"
    .display icursor end; set ::cflag 0
}

The equals procedure executes when the = key or one of the Enter keys is pressed. By using Tcl/Tk's powerful expr verb and the library of math code it can evaluate, the expression is solved and the results are placed in, well, the results variable. If an input error occurs, such as a mismatched set of parentheses, the catch code is thrown the error by expr and changes the color of the display to red:

proc equals {} {
    regsub {=.+} $::expression "" ::expression
    if [catch {lappend ::expression = \
        [set ::results [expr \
        [string map {/ *1.0/} $::expression]]]}] {
        .display configure -fg red
    }
    .display xview end; set ::cflag 1
}
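One detail worth dwelling on is string map {/ *1.0/}: in Tcl's expr, as in C, dividing two integers truncates, so 1/2 evaluates to 0. Rewriting every "/" as "*1.0/" forces floating-point division. A quick Python sketch of the same textual trick, with Python's eval standing in for Tcl's expr (the helper name tcl_style_fix is invented for the demo):

```python
# Sketch of the calculator's division trick: rewrite "/" as "*1.0/" so
# that integer operands still produce a floating-point quotient.
def tcl_style_fix(expression):
    """Apply the same string map the Tcl code uses."""
    return expression.replace("/", "*1.0/")

expr_text = "1/2"          # in Tcl's expr this would evaluate to 0
fixed = tcl_style_fix(expr_text)
print(fixed)               # 1*1.0/2
print(eval(fixed))         # 0.5, the answer the calculator displays
```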

The documentation for the expr verb indicates that it can handle many other mathematical functions, such as asin, acos, atan, exponentials, floors, and so on. You can use the keyboard to type these in and see expr work; and you should have enough knowledge now to modify the calculator code to add buttons that support these functions. To learn more, visit the Tcl/Tk keywords list in the Resources section.

Pure Tcl desktops

Many people are so enamored of the power of GUI scripting with Tcl/Tk that they have written editors, browsers, and file and desktop window managers in pure Tcl/Tk. A partial list is included in the Resources section.

Conclusion

Every good invention that defines life in the modern world came about because someone was dissatisfied with the status quo and determined to change it in a positive way. You can improve the world through technology -- and having fun with scripting a new desktop for your workstation might open a door to other ideas with a more profound effect. Happy scripting!

Resources

Learn

AIX and UNIX articles: Check out other articles written by William Zimmerly.

Window Managers for X: Learn more about the variety of window managers available for UNIX and Linux®.

"Server clinic: Expect exceeds expectations" (developerWorks, April 2002): Learn more about a proper superset of the Tcl/Tk programming language.

Tcler's Wiki: Discover Tcl/Tk and see the writings of some of the Tcl/Tk world's best and brightest.

Tcl/Tk keywords: The list of keywords on Tcler's Wiki provides more information about specific Tcl/Tk commands and functions. Especially visit the E section and read about the expr verb to see more of what this code can do.

The Script Archive: Explore this site to learn more about the history and future of Tcl/Tk from the company founded by the original author (Dr. John Ousterhout). (Note: As of the writing of this article, the Web site was undergoing reconstruction.)

Linux.com's HOWTO on Tcl and Tk: Read this informative document and learn more about Tcl/Tk on Linux.

AIX and UNIX: Want more? The developerWorks AIX and UNIX zone hosts hundreds of informative articles and introductory, intermediate, and advanced tutorials.

Stay current with developerWorks technical events and webcasts.



[Feb 21, 2007] Unix Review Tcl Scores High in RE Performance by Cameron Laird and Kathryn Soraiz

... Regular-Expressions.info is the Web place to go for tutorials and ongoing enthusiasm on the subject of REs, and the Wikipedia article on the subject does a good job of explaining how programmers see REs. ... There are parsing tasks that simply can't be done by REs, notably XML; others that can be written as REs, but only at the cost of extreme complexity (email addresses are an example); and still others where REs work quite well, but slower or more clumsily than such alternatives as scanf, procedural string parsing, glob-style patterns, the pattern matching such languages as Erlang or Icon build in, and even interesting special-purpose parsers other than REs, such as Paul McGuire's Pyparsing or Damian Conway's Parse::RecDescent. Christopher Frenz's book hints at the range of techniques available for practical use. All this still doesn't exhaust what there is to know about Perl and its REs: Perl REs are actually "extended" REs, and there's controversy about the costs of that extension; Perl 6 is changing Perl REs again; much of Perl's RE functionality is now readily available in "mainstream" languages such as C, Java, C#; and so on. There's a relative abundance of commenters on Perl, though, so we leave these subjects to others for now.

Performance curiosities

One final misconception about Perl's REs deserves mention, however: that its REs dominate all others in convenience, correctness, performance, and so on. As a recent discussion in the Lambda the Ultimate forum remarked, Perl's performance on a well-known RE benchmark is less than a quarter of that of the equivalent Tcl. This is not unusual; Tcl's RE engine is quite mature and fast. The final point to make for this month about REs, though, is that many of these facts simply don't matter! We've seeded this column with an abundance of hyperlinks to outside discussions and explanations; read for yourself about the different implementations and uses of REs.
What's remarkable, though, is how little consequence most of these details appear to have to working programmers. The benchmark just mentioned, for example, and others that demonstrate Tcl can be hundreds of times faster than other languages in RE processing suggests that Tcl might be a favorite of programmers with big parsing jobs. It's just not so. Objectively, Tcl's RE performance is a strength, particularly in light of the fact that it correctly handles Unicode. Almost no one cares. Cisco, Oracle, IBM, Siemens, Daimler, Motorola, and many other companies and government agencies use Tcl in "mission-critical" applications. We've interviewed managers from dozens of such development staffs, and never has a decision-maker volunteered that the speed of Tcl's RE engine affected choice of the language. Even the most passionate pieces of Tcl advocacy don't mention its RE engine. Amazingly, Tcl insiders don't regard its RE engine as "wrung out". In fact, it's one of the more conservative parts of the Tcl implementation, and core maintainer Donal Fellows has abundant ideas for its improvement.
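As a tiny illustration of the Tcl RE engine under discussion (the sample string and variable names are invented for the example):

```tcl
# Extract day/month/year from a string with Tcl's regexp command.
# The "->" placeholder skips the whole-match variable.
set line "released 21 Feb 2007"
if {[regexp {(\d+)\s+(\w+)\s+(\d+)} $line -> day month year]} {
    puts "$year-$month-$day"   ;# prints 2007-Feb-21
}
```

Tcl's "advanced regular expressions" (Henry Spencer's engine) support the \d, \w and \s shorthand classes used here, in addition to POSIX classes.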

About: Myintcl is an interface for using MySQL in Tcl programs, and it is written only in pure Tcl. The API design of this package follows that of fbsql.

This article is excerpted from Linux Server Hacks, Volume Two, by William von Hagen and Brian K. Jones. Copyright © 2006, O'Reilly Media, Inc. All rights reserved. If you have multiple servers with similar or identical configurations (such as nodes in a cluster), it's often difficult to make sure the contents and configuration of those servers are identical. It's even more difficult when you need to make configuration modifications from the command line, knowing you'll have to execute the exact same command on a large number of systems (better get coffee first). You could try writing a script to perform the task automatically, but sometimes scripting is overkill for the work to be done. Fortunately, there's another way to execute commands on multiple hosts simultaneously. A great solution for this problem is an excellent tool called multixterm, which enables you to simultaneously open xterms to any number of systems, type your commands in a single central window, and have the commands executed in each of the xterm windows you've started. Sound appealing? Type once, execute many -- it sounds like a new pipelining instruction set. multixterm requires Expect and Tk. The most common way to run multixterm is with a command like: multixterm -xc "ssh %n" host1 host2

Tcl-Tk Resources at Noumena Corporation Tcl is a programming language with too many features to be easily compartmentalized.

The feature list includes:

Extensible: You can easily add new commands and libraries to a Tcl interpreter.

Embeddable: You can use Tcl as a scripting language within another, larger application.

Orthogonal: The syntax is simple and consistent; Tcl code is easy to write and also easy to read.

Modern: All the modern programming constructs are supported, including simple objects and namespaces.

Multiplatform: The Tcl interpreter has been ported to more platforms than Java. The same scripts can be run on everything from IBM big iron to real-time embedded kernels.

Powerful: The Tcl architects have provided powerful and well-thought-out generalizations that make it fast and easy to write what would otherwise be complex applications.

The best thing about Tcl is that it's a tool that lets you concentrate on solving a problem, not dealing with a language. In 30 years of programming, I've never found a language I enjoy working with so much.

lf331, SoftwareDevelopment An introduction to the TclMySQL library

In this article you will learn how to install and use MySQLTcl, a Tcl library which makes it possible to issue SQL queries (select, insert, delete...) to a MySQL database server from Tcl scripts. The versions of Tcl, MySQL server and the MySQLTcl library covered in this article are, respectively, 8.4.2, 4.0.15 and 2.40. Tcl stands for Tool Command Language and was invented by John Ousterhout [1]. Tcl is actually two things: a scripting language and an interpreter. Tcl is a structured programming language which uses three basic data structures: strings, lists and arrays. Features of Tcl include regular expressions [2], third-party Tcl extension libraries and Tk, a toolkit for writing graphical applications in Tcl. MySQL is a very popular database server in the open source community which I think needs no introduction. MySQLTcl is a Tcl library which allows querying a MySQL database server from a Tcl script. Currently, the authors and maintainers of this Tcl library are Paolo Brutti (Paolo.Bruti at tlsoft.it), Tobias Ritzau (tobri at ida.liu.se) and Artur Trzewik (mail at xdobry.de).
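A hedged sketch of how such a script fits together (host, credentials and table names below are invented; mysqlconnect/mysqlsel/mysqlclose are the classic MySQLTcl 2.x commands, and the whole database section is guarded so the sketch degrades gracefully when no server or package is present):

```tcl
# Build a query string in pure Tcl (real code should escape $val,
# e.g. with mysqlescape, before interpolating it).
proc make_query {table col val} {
    return "select * from $table where $col='$val'"
}

# Hypothetical connection parameters; skipped if mysqltcl is unavailable.
catch {
    package require mysqltcl
    set db [mysqlconnect -host localhost -user test -password secret -db testdb]
    foreach row [mysqlsel $db [make_query users name joe] -list] {
        puts $row   ;# each row comes back as a Tcl list
    }
    mysqlclose $db
}
```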

[Oct 25, 2005] SGUIL - The Analyst Console for Network Security Monitoring.

Sguil (pronounced sgweel) is built by network security analysts for network security analysts. Sguil's main component is an intuitive GUI that provides realtime events from snort/barnyard. It also includes other components which facilitate the practice of Network Security Monitoring and event driven analysis of IDS alerts. The sguil client is written in tcl/tk and can be run on any operating system that supports tcl/tk (including Linux, *BSD, Solaris, MacOS, and Win32).

[Oct 22, 2005] Some tips and tricks for Tcl/Tk and extensions on Solaris.

Getting Tcl/Tk for Solaris: ActiveState is building binaries of ActiveTcl for Solaris, available at http://aspn.activestate.com/ASPN/Downloads/ActiveTcl/ . It is the most up-to-date and comprehensive binary available for Solaris right now. Unfortunately, the BLT extension is not yet included (as of July 2001).

http://sunfreeware.com/ is another useful location for binary packages, but the Tcl and Expect versions there are rather dated.

Building Tcl/Tk on Solaris: First, make absolutely sure you have eradicated all instances of /usr/ucb from your environment. Remove instances from your PATH, LD_LIBRARY_PATH, etc. until the following command comes up empty:

    env | grep /usr/ucb

The reason for doing this is that there are, in various combinations of Solaris, conflicts between the functions found in the /usr/ucb libraries and the /usr/lib libraries. Mixing the two is almost certain to cause malfunctioning applications on various iterations of Solaris. For instance, one conflict that LV has found is that at times, using functions compiled against the /usr/ucb libraries results in directory-reading functions which expect a different format than libraries compiled against non-ucb libraries. This manifests itself as globs failing, or directory entries missing letters, etc.

A decent ANSI C compiler should be your next quest, and you basically have two choices -- the free gcc compiler or Sun's non-free SunPro C compiler. gcc: pre-built binaries exist for specific versions of Solaris. SunPro/Forte C: the name has apparently evolved now; more information is available at http://www.sun.com/forte/c/ as are trial downloads. Historically the SunPro compiler has done a better job at high optimization levels, which are mostly irrelevant for Tcl/Tk and extensions.

Other build tools: Be certain to have /usr/ccs/bin in your PATH, so that you can find make, ar, and other useful tools.

The Build: Once you have either of the above compilers installed and in your path, Tcl source and most extensions should build with:

    ./configure && make && make test && make install

If you want to use cc, consider a slight variant:

    CC=cc
    export CC
    ./configure && make && make test && make install

Otherwise, configure defaults to looking for gcc.

WARNING! Solaris comes with a file called /usr/ucb/cc. This is a shell script that sets environment flags, etc., to put the ucb libraries into the compile environment. Be certain NOT to run that version of cc (see above).

Martin Lemburg 11.02.2003: I just built Tcl 8.3.5 on Solaris 2.8 and had the following problems: the configure script doesn't build in the usage of the switch HAVE_TIMEZONE_VAR

the configure script uses CC instead of cc (on IRIX and AIX no problem)

the configure script doesn't build in the usage of the libraries socket and nsl. I have had these problems with the configure script since building Tcl 8.0.5 on Solaris in 1999. Couldn't someone on the distribution team change this? Note that I have been building Tcl on Solaris since the late 80s and I don't see the problem that Martin reports. Also, I don't see the situation being quite as desperate with regard to removing ucb from your environment. You just need to make certain that you don't use /usr/ucb/cc as your compiler. Also, note that Solaris 8 comes with an installation disk that contains gcc, if you can't afford Sun ONE's compiler. The remark regarding CC above is, however, quite important. Another thing I have found when building some Tcl applications and extensions is that when I install Tcl/Tk into some non-standard directory, I sometimes have to edit the generated Makefile to include a -R flag for each -L flag that is used when creating .so libraries or any executable programs. Martin Lemburg 11.02.2003: We don't use ucb. We have restrictions on the usage of our compilation/linkage environment. We are even not allowed to use gcc.

We use /opt/SUNWspro/WS6U2/bin/CC as CC and /opt/SUNWspro/WS6U2/bin/cc as cc.

Compiling with CC results in an abort while trying to compile the first source file

Could someone suggest what should be done so that the configure script recognizes the usage of the timezone variable?

Could someone suggest what should be done so that the configure script recognizes the usage of libsocket and libnsl?

The problems I have had since 1999 building Tcl on Solaris were always the same. The workstation I built on has been updated regularly since 1999, from Solaris 2.5 to 2.8, but it is probably not configured in an optimal way.

[Sept 28, 2005] Dave Bodenstab's Home Page

Tcl/Tk Reference Guide The Tcl/Tk Reference Guide is a typeset quick reference guide to John Ousterhout's Tcl script language and Tk toolkit. It is made available by Paul Raines via ftp from here. That version is written for Tcl/Tk version 8.0, but Tcl/Tk has marched on -- the current release is 8.4.3. I find the reference guide extremely useful, so I have updated it to coincide with the current Tcl/Tk release: 8.4.3. In addition, I have added a section for TclX, the Img package, selected packages from the Tcl standard library, the Tktable package, the vu package (bargraph, dial and pie widgets) and a final section on Vim's Tcl interface. For those who have latex installed, I have also included the TeX input files for the dictionary, Hash, pgtcl, RBtree, sendx, setops, svipc, syslog, tcLex, tclgdbm, tclreadline, Tkspline, tree, trf and trfcrypt packages. The latest version of the reference guide is 8.4.3 -- click here to download.

Tk Canvas tree widget Sometime in the past I found Allan Brighton's Tk canvas tree widget package (version 8.0.4) at (I believe) ftp://ftp.archive.eso.org/pub/tree/tree-8.0.4.tar.gz, but I have not been able to locate it again anywhere on the Web (as of July, 2003). However, Mr. Brighton's 8.0.3 version is available here, and the documentation is here (both URLs verified 24-Feb-2004). Since the 8.0.4 version seems to have disappeared, I have ported it to Tk 8.4. It can be found here. I've tried to clearly identify any changes I make by using RCS; therefore the original, unmodified source is available for reference as well as the changes I have made. Before building the package you must extract the source from the RCS files:

    $ tar -xzf tree-8.0.4.1.tgz   # Unpack the tar archive
    $ cd tree-8.0.4.1
    $ cd src
    $ co *,v                      # Extract updated source from RCS files
    $ cd ..
    $ ./configure
    $ make

It has been reported that this (Brighton's) package conflicts with the BWidget package. Brighton's package uses the "Tree" class name for the widget it creates, which conflicts with the tree widget from BWidgets, which uses the same class name. Since the BWidget tree binds <Destroy> to the class Tree, that binding is called when Brighton's C++ tree is destroyed. The solution is either to rename Brighton's "tree" command to anything else such that "string totitle new_cmd_name" does not return "Tree", or to modify the source.

[Aug 12, 2005] MacDevCenter.com Build a Simple 3D Pipeline in Tcl by Michael J. Norton

As a kid, did you ever take something apart just to see how it worked? I still remember taking apart my old Apple ][ just to see if I could get it back together. The integrated circuits sat on sockets, so you could easily pry them off with a small screwdriver. It was an interesting exercise in seeing how computer hardware functions. It's been several decades since I last dismantled an Apple ][, but I still find myself taking things apart just to satisfy my curiosity. I don't take apart computers that much anymore. I mean, where's the fun in taking apart a laptop? Eeek! My interest has shifted to old game software autopsies. Old popular video game source code can be found everywhere on the net. Not only that, but game console development environments are becoming available to the hobbyist. Development environments exist on Linux for the GameCube (see also Linux on the GameCube), the Sony PlayStation 2, and the old Sega Dreamcast console. These game console platforms have some interesting homebrewed 3D demos on the internet. These incredible hobbyist efforts sparked my interest in reverse engineering 3D graphics for the game console environment. In some cases, excluding the PlayStation 2, the Linux environments don't support the OpenGL API, which means coding 3D from scratch is necessary. Do you need any of these development environments to experiment with game console 3D programming? Certainly not. I am surrounded by computers at work and at home: Macintosh, PC, Sun, and SGI. Despite the differences in hardware and software on all of these systems, they all have a nice little scripting language, Tcl, which is more than capable of stepping up to the task. Using Tcl, you can literally prototype algorithms in the interpreter much in the same way you would prototype an electronics circuit on an experimenter's breadboard. In our case, we're going to assemble a game console to experiment with using Tcl. Pretty cool, huh?

[Jul 5, 2005] SourceForge.net Project Info - tkdiff

Free Software Magazine - Make it right using Tcl. Published on paper in May 2005 (will be free on the 15th of June, 2005). Software testing with Tcl for Apache Rivet. By David N. Welton

Any sufficiently complex software system has bugs, and those of us who aspire to produce high-quality work seek not only to minimize these, but to guarantee that our code does what we say it ought to. One proven way to eliminate bugs and ensure that code behaves as documented is to test the program. Easy enough to do by hand when there isn't much functionality. However, when the system grows more complex, and there are many possible environmental factors with various permutations, it quickly becomes obvious that we need to automate our testing. This article aims to provide the reader with a few notions about software testing in general, and then concentrates on a specific example, using the test suite written in Tcl for Apache Rivet, in order to demonstrate a real-world approach to testing a particular program.

Software testing. Testing is often an afterthought: even for large, complex systems, or expensive, proprietary software, testing is never going to directly generate revenue or add new features to the program, and so it has to compete for scarce developer time. Even in free software work, more often than not, it's more fun to spend time hacking on a cool new feature than writing test cases to make sure everything works exactly as advertised. This means that it's important to try to get the most from the time you dedicate to testing - to get better value for money. In technical terms, it is desirable to maximize the "code coverage" for the time invested. Coverage refers to how much of the code is exercised by the test suite. Think of running a simple program with options -x, -y, and -z.
If you run it with the -x option, the "paths" through the other options will not be taken, and you won't know if they really work or not. So you have to expand your code coverage to all three code paths. Generalizing, the two main approaches are "white box" and "black box" testing. White box testing is testing a program based on knowledge of its internal workings. For example, writing a series of tests that give different input to each C function in a program, and then checking to ensure that it behaves correctly. Obviously, you need the source code, and the ability to rebuild the program, in order to do this type of testing. One of the most important reasons to use this approach is that it is theoretically possible to test most or all of the code paths in the program. In practice, though, the effort required to do this may be significant - imagine that you have a C function that takes a struct as input, and that struct is in turn generated by other functions, and so on. It's clear that things can quickly get more complicated. Black box (or "functional") testing involves running a program against a specification to make sure it produces the correct output. For instance, testing to ensure the ls program works correctly: a very simple test would be to create a directory with files of known sizes and creation times, run the ls program, and compare its output with the known contents of the directory.

Apache Rivet. Apache Rivet is a server-side Tcl system for the creation of dynamic web pages. Think JSP or PHP, but using Tcl, a free, multi-platform, general-purpose scripting language. For example: <b><? puts "The date is: [clock format [clock seconds]]" ?></b> It is best to test the software with as little modification to the environment as possible, meaning that the test suite will run using the copy of Apache already installed on the computer - having to create a special copy of Apache would defeat the purpose.
The goal of the test suite is to be able to start the web server with configuration options of our choosing, send HTTP requests, receive answers, stop the server, and then create a report. Because it's so tightly integrated with Apache and Tcl, white-box or unit testing would be difficult. It would be very laborious to create and fill in all of the arguments that are passed to each function in the C code, because they usually reference complex C structures such as the Tcl interp struct, or the Apache request struct, which rely, in turn, on lots of configuration and setup. The effort required to make most of this work would probably be more than that involved in creating Rivet itself! So, a "black box" testing style would provide more coverage for the time dedicated to it. Not much is required from the test suite itself; the real work is in devising clever ways to test as much of Rivet's functionality as possible. An application is required that allows program tests to be performed quickly and flexibly, and provides a lot of tools for interacting with Rivet and Apache: Reading and writing files in order to manipulate Rivet's configuration files.

Process control, to control the Apache process itself.

Sockets and an implementation of the HTTP protocol in order to send requests to the Rivet-enabled web server.

Good string matching and regular expression support. Being a fan of Tcl, I chose it. The Tcl test suite (tcltest) ships as part of the core Tcl distribution, and is an excellent base upon which to build a series of tests for all kinds of applications.
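The black-box ls example described above can be combined with the tcltest package into a minimal sketch (the test name and the demo directory are invented, and a Unix-style ls is assumed):

```tcl
# Black-box test of ls via tcltest, which ships with core Tcl.
package require tcltest
namespace import ::tcltest::*

test ls-1.1 {ls lists a directory of known contents} -setup {
    # Build a directory whose contents we know in advance
    file mkdir demo
    close [open demo/a.txt w]
    close [open demo/b.txt w]
} -body {
    # Compare ls output against the known contents
    lsort [split [exec ls demo] \n]
} -cleanup {
    file delete -force demo
} -result {a.txt b.txt}

cleanupTests
```

The -setup/-body/-cleanup/-result structure keeps the fixture, the behavior under test, and the expected specification cleanly separated, which is exactly the black-box style the article advocates.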

USENIX Third Annual Tcl-Tk Workshop, 1995 Two Years with TkMan: Lessons and Innovations

Or, Everything I Needed to Know about Tcl/Tk I Learned from TkMan by Thomas A. Phelps

Among Tcl/Tk applications, TkMan is unusual. Whereas Tcl was written to glue together C functions, TkMan is written entirely in Tcl. And TkMan is the beneficiary of years of battle testing by 1000s of users on every flavor of UNIX. The extreme position created by the demands of this large, diverse audience and the (self-imposed) limitation of remaining strictly within Tcl brought to the fore a severe set of implementation issues, and provoked a variety of solutions ranging from general methods to a low-level bag of tricks with regard to speeding up Tcl scripts, exploiting Tcl as its own scripting language, configuring applications and interoperating with other tools. Although developed to meet these particular, extreme requirements, most of the resulting solutions can be broadly applied, and this paper shares this lore with the aim of helping other authors develop elegant, efficient and robust Tcl/Tk-based applications. Download the full text of this paper in PDF, ASCII (35,680 bytes) and POSTSCRIPT (149,581 bytes) form.

Writing a Tcl Extension in only 7 years by Don Libes. Usenix paper

Expect is a tool for automating interactive applications. Expect was constructed using Tcl, a language library designed to be embedded into applications. This paper describes experiences with Expect and Tcl over a seven year period. These experiences may help other extension designers as well as the Tcl developers or developers of any other extension language see some of the challenges that a single extension had to deal with while evolving at the same time as Tcl. Tcl and Expect users may also use these 'war stories' to gain insight into why Expect works and looks the way it does today. View the full text of this paper in HTML form, PDF (52688 Bytes) form, and POSTSCRIPT (213776 Bytes) form.

The Tclsh Spot by Clif Flynt (PDF). ;login:, October 2004, Volume 29, Number 5



freshmeat.net Project details for XOTclIDE

XOTclIDE is an integrated (interactive) development environment for the Tcl and XOTcl languages; XOTcl is a highly flexible object-oriented extension of Tcl. It provides a Smalltalk-like programming environment similar to Squeak. It allows graphical introspection and editing of a running system (based on Tcl's and XOTcl's introspection facilities). The system state can be saved in the form of Tcl packages. It can optionally use a version control system based on a relational database (MySQL, PostgreSQL, ODBC, or SQLite). All system changes are saved to the database on the fly. It provides browsers for viewing and editing XOTcl classes and objects. It also contains syntax highlighting and a static code checker for Tcl and XOTcl. It contains many developer plugins. Release focus: Minor feature enhancements Changes:

This release includes a new plug-in: visual regexp. A new wizard helps convert Tcl procedures to object-oriented methods. The version control database can also be driven with an MS Access file (via ODBC). A new Variable-Access-Tracker helps observe read and write access to global variables and object variables. The values of variables can be displayed in the user-friendly GUI. Author:

Artur Trzewik <mail (at) xdobry (dot) de> [contact developer]

freshmeat.net Project details for tkdiff

TkDiff now has inline diff highlighting and a current line comparison window. Initial support for Subversion has been added. TkDiff 4.0 is CDE, Windows, and MacOS X aware.

The Year In Scripting Languages Lua-Perl-Python-Ruby-Tcl 2002

Tcl 8.4 released. The 8.4 release includes a number of significant features, including

a new Virtual File System (VFS) layer that allows (in principle) filesystem activity to be diverted away from the native operating system to something else; drivers exist for a number of underlying "access methods", including FTP, HTTP, Metakit, WebDAV, etc.

64 bit support - both files and arithmetic, also on 32-bit systems

major performance improvements - 8.4 is the fastest Tcl version yet

enhanced thread support, stability, and performance

Unicode and I18N improvements

three new Tk widget types, most other widgets have been enhanced

native MacOS X Aqua support

Details of these and other changes can be found at [1]. Or start your Tcl explorations from this new URL (redirect): http://www.tcl.tk/ The introduction of VFS to the core enables Starkits [2], which provide easy single-file deployment of applications and packages (like Java jar files, but with more power). They can be interpreted using either Tclkit, a single file Tcl/Tk interpreter available on numerous platforms, or ActiveTcl, the ActiveState Tcl distribution. An archive of Starkits is available at [3].

Automating rsync with a Simple Expect Script This short article provides an example script that uses Expect to automate a series of rsync operations using an ssh tunnel.

Expect is a great tool in a system administrator's arsenal and can be used to easily automate tasks that require periodic user input. This can allow the administrator to make better use of their time than watching the application or utility to spot the next time it requires input. In the following example Expect is used to automate the inputting of a password for a series of rsync commands tunneled through ssh. The script automates a series of rsync operations using only the password for access to the remote host, so that the security of the two machines is not reduced by making the source machine trust the destination machine in any way (for example .rhosts or an ssh key with an empty passphrase). The script reads a password from the user and then holds that password in a variable for use each time the ssh application that rsync is using as a tunnel asks for it. The "stty -echo" prevents the password from being echoed to the screen when it is typed in, and the "stty echo" turns it back on.

    #!/usr/bin/expect -f
    spawn date
    expect "#"
    send_user "The password for HOSTNAME: "
    stty -echo
    expect_user -re "(.*)\n" {set PASSPH $expect_out(1,string)}
    send_user "\n"
    stty echo
    set timeout -1
    spawn date
    expect "#"
    spawn rsync -ave ssh --numeric-ids HOSTNAME:/etc /sdc/
    expect "password:" { send "$PASSPH\n" }
    expect "#"
    spawn date
    expect "#"
    spawn rsync -ave ssh --numeric-ids HOSTNAME:/admin /sdc/
    expect "password:" { send "$PASSPH\n" }
    expect "#"
    spawn date
    expect "#"
    spawn rsync -ave ssh --numeric-ids HOSTNAME:/home /sdd
    expect "password:" { send "$PASSPH\n" }
    expect "#"
    spawn date
    expect "#"
    spawn rsync -ave ssh --numeric-ids HOSTNAME:/mail /sdd
    expect "password:" { send "$PASSPH\n" }
    expect "#"
    spawn date
    expect "#"
    spawn rsync -ave ssh --numeric-ids HOSTNAME:/work /sdc/
    expect "password:" { send "$PASSPH\n" }
    expect "#"
    spawn date
    expect "#"

(Submitted by Noel Mon Nov 17, 2003) Copyright 1999-2003 Noel Davis

Tcl Core Team Interview - OSNews.com

The Tcl programming language has been immensely successful in its almost 15 years of existence. Tcl stands for 'Tool Command Language' and is pronounced 'tickle' by those in the know. It's probably most famous for the Tk graphical toolkit, which has for years set the standard for rapid, scriptable, cross-platform GUI development, but the language is used throughout a staggering variety of applications, from the open source web server that runs AOL's web site, to code for network testing, to code to run oil rigs.

Tcl/Tk Interview, Part 1. Unlike Perl or Python, which are still maintained by their original authors (Larry Wall and Guido van Rossum), Tcl ownership has changed hands - Dr. John Ousterhout, who wrote Tcl while a professor at UC Berkeley, has moved on to other endeavors (although he does keep an eye on his creation), and others have stepped forward to not only maintain Tcl, but work on improving and extending it. Several of the "core team", including the lead maintainer, Jeff "The Tcl Guy" Hobbs, Andreas Kupries, Donal Fellows and Mark Harrison, were kind enough to take some time to respond to some questions about a variety of topics via email. We start by asking Jeff about how the torch was passed...

Jeff, can you tell us a little about how 'ownership' of Tcl passed from Dr. Ousterhout to yourself?

Jeff: John hired me to do it. :) It actually took three tries (third time's the charm). The first time I met him was while he was at Sun Labs. I had been working with Tcl for a while already, both at the Tcl user level as well as the Tcl/C level, including extension authoring. I was visiting to show off some research I was doing at the University of Oregon using Tcl (related to wearable computers). I met all of the Sun Labs Tcl team at that time. It was actually following that visit, after talking with John and Jacob Levy, that I was inspired (with some encouragement from them) to write the first Tcl plugin for Netscape. It was a windowless version, but it worked. Anyway, I digress ... I had an offer to join the Sun Tcl team after graduation but turned it down, as I wasn't so sure about moving to Silicon Valley and working for a large company. I received another offer as he was forming Scriptics, but that coincided with the time that I had received an offer from Siemens in Munich. The idea of a startup was more appealing, but the choice between Munich and Mountain View was ... well, we know how that turned out. :) The third and successful offer came while I was at Siemens in 1999. It took John some convincing to get me out of Munich, but basically the management of core development at Scriptics (which it was still called at that time) was all the sweetening that was necessary. Previous offers had been more oriented towards being a part of the core team, but as Scriptics was becoming more successful, John had essentially stopped coding to work on the business side, and other previous core team members were focused on other development. At that time the management of the core was nearly non-existent, which was becoming apparent to the community as well. I have a very nice graph from the February 2000 Tcl conference that shows the marked decline in open bugs and feature requests following my starting with Scriptics in August 1999. I wasn't the only one qualified to take over from John, although I did have years of experience with Tcl/Tk by then. I have always had the luck of having had a job that intimately involved Tcl development since university (where I started using it). Other core team members like Brent Welch and Scott Stanton were interested in other things. One important facet that I did bring to it, which perhaps only an outsider could bring, was enthusiasm, and a project like this requires lots of that, plus energy and the right stuff to keep going. ... ... ...

What successes has tclcore had?

Donal: In my humble opinion, TIPs are a *great* success.

Jeff: Well, 8.4 made it out. TIPs are nice. This is a very open question though.

Mark: The TIP system (http://www.tcl.tk/cgi-bin/tct/tip/) has worked very well.

Who are your inspirations? Who or what inspired you when growing as a programmer?

Mark: My first inspiration was my father, who was also a programmer. In the 60s and 70s he worked on a global information distribution system that could transmit weather data to every U.S. military installation in the world in under two minutes, with a downtime of less than five minutes per year. This on a system with about the same computing power as a Gameboy. I was also inspired by Brian Kernighan's writings. He is still the model of exposition and clear thought I turn to when I try to explain things to other people. It was a high point in my career to work with him on our Tcl book.

Jeff: John Ousterhout has actually been a big inspiration for me. He answered my questions (and even promptly!) when I started using Tcl, which left a great impression on me. He was also a good mentor.

Donal: Alas, I'm not someone who has heroes. Though I'll admit to wanting the respect of other Tcl programmers...

What is the coolest thing you've seen Tcl used for?

Mark: Two early Tcl projects that totally blew me away were the Shell Auger Platform (in which Tcl programs controlled all the wellhead operations for 32 drillheads on a floating oil rig in the Gulf of Mexico), and the use of Tcl in the Mars Explorer project (the one that didn't crash!).

Donal: Now that's a hard question. There were many cool things being done with Tcl at the last Tcl conference (and there's much impressive activity elsewhere too), but the use of Tcl for the electronic version of the DWB (German dictionary) is impressive because it is work by mostly non-programmers and is highly exposed to the public too. And of course the use of Tcl as an OS on a piece of hardware is fantastically funky too. And then there's Tclkit/Starkits... [*]

Sun's Internet Security Technologies -- Safe-Tcl

Safe-Tcl is the security model for Tcl. It makes it possible to run Tcl code from untrusted sources - such as code downloaded over the Internet - safely, to protect against viruses or simply bad coding. Safe-Tcl uses a simple padded cell model where untrusted scripts are executed in a special controlled environment that limits what they can do. The padded cell approach is similar to the kernel-space/user-space mechanism used for protection in operating systems for the last 30 years. How does Safe-Tcl work? An application such as your Web browser can have more than one Tcl interpreter. Each interpreter has its own set of commands and variables, and a script can only use the commands and variables in the interpreter where it runs. The scripts that make up the application run in a Tcl interpreter called the "master interpreter", which has all of Tcl's commands. If an application wishes to execute a Tcl script that it doesn't trust, such as one in a Web page, it creates a separate interpreter in which to run that script. This interpreter is called the "slave interpreter".
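A minimal sketch of the padded cell in action (the arithmetic and the file name are arbitrary examples):

```tcl
# Create a safe slave interpreter: dangerous commands such as
# open, exec, socket and file are hidden from it.
set slave [interp create -safe]

# Pure computation still works inside the cell
puts [interp eval $slave {expr {6 * 7}}]   ;# prints 42

# Attempting I/O from the slave fails, because open is a hidden command
if {[catch {interp eval $slave {open /etc/passwd r}} err]} {
    puts "blocked: $err"
}

interp delete $slave
```

A master can selectively re-expose functionality with interp alias, routing a slave's command to a vetting proc in the master, which is how the "padded cell" is loosened in a controlled way.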