\newcommand{\Lcs}[1]{{\ttfamily\char'134#1}}
\providecommand{\PS}{PostScript}
\def\MP{MetaPost}
\providecommand{\PDF}{PDF}
\title{TUG95 at St. Petersburg Beach, Florida}
\author[Michel Goossens]{Michel Goossens\\\emph{Email: } \texttt{m.goossens@cern.ch}}
\begin{Article}

[\emph{Editor's note:} Michel has kindly allowed us to print portions of his longer report on TUG 95, which will be distributed on the Internet.]

\section{Monday July 24th}

Monday morning saw the official opening of the Conference, with TUG95 Organizing Committee Chairperson Mimi Burbank welcoming all participants before passing the floor to Michel Goossens, who thanked everybody for coming and emphasized the different structure of this conference as compared to previous years: this time we had more workshop-oriented sessions (in the afternoons) and were opening up the presentations to the world of electronic publishing at large, and SGML/HTML, Adobe Acrobat, CD-ROMs, hypertext, etc.\ in particular. At that point the representatives of NTG said a few words about the third edition of their 4All\TeX{} CD-ROM, and Wietse Dol took the occasion to offer the first pressed CD to Donald Knuth, who was a guest of honour at this 16th ($2^{2^{2}}$) \TeX{} Annual Meeting.

The first talk was by Ji\v{r}\'{\i} Zlatu\v{s}ka, who showed how \MF{} and \TeX{} can work together to typeset combinations of text and graphics. His new approach, based on \TeX's extended ligature mechanism, reduces the number of \MF{} passes to one and also simplifies the \TeX--\MF{} interface. This permits easier typesetting of text along curves and, in particular, allows one to generate beautiful institutional seals and logos in various forms and combinations starting from the same base elements. He noted that, although \PS{} is often the first choice for including graphics in \TeX{} documents, \MF{} often offers improved legibility of logos and letters at smaller sizes.

The next speaker, Richard Kinch, discussed his work on building reliable \PS{} Type1 and TrueType outlines for the Computer Modern fonts. He emphasized that several \MF{} primitives (stroked pens, overlapping ink) have no equivalent in these formats, since they support only non-overlapping B\'ezier curves. His program, \texttt{MetaFog}, handles most of the difficult problems associated with these conversions. He went on to discuss how \texttt{MetaFog} works, pointing out some of its drawbacks and making a plea for including hints inside the \MF{} sources. Finally, he compared his results to other outline instances of the CM fonts.

After refreshments Alan Hoenig gave another of his almost ``perfect'' pedagogical talks, this time showing how one can use Adobe's Poetica font set, comprising 21 fonts in two families, by exploiting the possibilities of the virtual font mechanism. Alan showed us how his macro package, together with the font metrics generated by Alan Jeffrey's \texttt{fontinst} package, can typeset a sonnet by Shakespeare as though it were written in the most beautiful calligraphy of a scribe.

One can never exaggerate the importance of documentation, and to teach first-year students the benefits of that approach, an experiment was started at Texas A\&M University using Knuth's \texttt{WEB} system, which fully exploits the combination of code and documentation in the same source. It was found that students who had to ``mix'' program description and code acquired increased problem-solving skills.
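For readers who have never seen a \texttt{WEB} source, here is a toy fragment (my own illustration, not taken from any real program) showing how prose and Pascal interleave in named sections:
\begin{verbatim}
@ We compute the greatest common divisor of |u| and |v|
with Euclid's algorithm; the declarations appear elsewhere.

@<Compute the gcd of |u| and |v|@>=
while v <> 0 do
  begin t := u mod v;
    u := v;  v := t;
  end
\end{verbatim}
The \texttt{@} sections carry the explanation (typeset by \TeX), the \texttt{@<\dots @>} names let the code be presented in whatever order suits the exposition, and \texttt{tangle} reassembles everything for the compiler.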
The students tended to analyse their problems not merely in terms of the programming language used, but in terms of the more general literate programming paradigm. Thanks to this increased awareness, the students who took the \texttt{WEB} course were also more successful in grasping data structures and program development in general.

The morning session was concluded by W{\l}odek Bzyl, who showed how, by extending Norman Ramsey's generic literate tool \texttt{noweb} with a few stand-alone front-end programs, it becomes relatively easy to create a \texttt{\TeX-WEB} system that a novice user can readily understand. The system is extensible, allowing customized styles and additional features. As an example he showed the literate source of \texttt{plain.tex}, the \TeX{} source of Knuth's \texttt{plain} format, and proudly handed a printed copy to Donald Knuth, who browsed through it with great interest.

After lunch Sebastian Rahtz discussed some advanced, exotic features of Timothy van Zandt's \texttt{PSTricks} package, which provides an easy interface between the \PS{} and \TeX{} languages (\texttt{PSTricks}'s operating principles were described at TUG94---see \TUB\ 15(3), pp.\ 239--246). The talk was a shortened version of a three-hour presentation by Denis Girou, a well-known \texttt{PSTricks} guru, at the GUTenberg meeting on June 1st in Montpellier (France). By using the electronic version of the slides, Sebastian could easily zoom in on fractions of text and drawings. Fractals, complicated curves like cycloids (``this is the first time that curves of this type published in books on calculus will actually be drawn correctly'', Don Knuth remarked when seeing the beautiful and precise graphs), and three-dimensional multi-colour calendars were only a few of the graphics gems made possible by this approach (a small sketch of such a parametric plot appears at the end of this day's summary).

Jerry Marsden introduced his \texttt{FAS\TeX} system, a library of standardized system-independent shortcuts for \TeX{} commands. At present versions for Mac and Unix exist. This approach speeds up the keying of input material and greatly increases the accuracy of the final text. Abbreviations exist for most of the well-known formats and extensions, like the AMS packages, and commands can easily be edited or added. The approach is especially interesting in an environment where many non-specialist typists have to work together, so that consistency and ease of input are important considerations.

After tea Dennis Kletzing showed how he uses his \texttt{multienumerate} package to handle complicated list structures. This environment handles narrow numbered list entries by bundling them in multiple columns. The drawback of the approach is that the user must specify the actual layout by typing the explicit position of each entry via different \Lcs{mitem\emph{ijk}} commands, where \texttt{\emph{ijk}} are column identifiers (see the second sketch at the end of this day's summary). With a little more \TeX{} programming a lot of the positioning could probably be made automatic, but the approach shows that it is relatively straightforward to extend \LaTeX{} to cope with moderately simple but useful structures.

Jon Stenerson described the experience gained with the style files he developed for use with \emph{Scientific Word}, which he presented last year at TUG94 (\TUB\ 15(3), pp.\ 247--254). He thought that the basic ideas of his original approach were still sound, but that most of the files will have to be rewritten for streamlining and to better reflect his present thoughts on the subject.
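Here, as promised, is a minimal \texttt{PSTricks} sketch of a cycloid in the spirit of what Sebastian showed (my own reconstruction, not code from the talk; it needs the \texttt{pst-plot} part of the distribution). \Lcs{parametricplot} hands the parameter \texttt{t} to the embedded \PS{} code in degrees, hence the $\pi/180$ conversion for the $x$ coordinate:
\begin{verbatim}
\begin{pspicture}(0,-0.5)(13,2.5)
  % cycloid of a rolling unit circle:
  %   x = t - sin t,  y = 1 - cos t   (t in radians)
  \parametricplot[plotpoints=200]{0}{720}{%
    t 0.0174533 mul t sin sub  % x = t*pi/180 - sin t
    1 t cos sub}               % y = 1 - cos t
\end{pspicture}
\end{verbatim}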
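And a sketch of the \texttt{multienumerate} interface as I understood it from the talk; the command names encode which columns of a row are filled (\texttt{x}) or left empty (\texttt{o}), though the exact spelling below may differ from the released package:
\begin{verbatim}
\begin{multienumerate}
  \mitemxxx{$a=1$}{$b=2$}{$c=3$}% a three-column row
  \mitemxx{$d=4$}{$e=5$}        % a two-column row
  \mitemxox{$f=6$}{$g=7$}       % middle column left empty
\end{multienumerate}
\end{verbatim}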
At the end of the day the editors of the journals published by the various \TeX{} User Groups gave an overview of the problems encountered and of ways of improving communication and the mutual re-publication of worthwhile material. A few weeks before the conference, an electronic discussion list coordinated by Christina Thiele (TUG and former editor of \emph{TTN}) and Gerard van Nes (NTG, editor of \emph{MAPS}) had been set up and allowed the editors to exchange valuable information. Presentations were made by Sebastian Rahtz (editor of the \ukt's \BV), Michel Goossens (for Jacques Andr\'e, editor of GUTenberg's \emph{Cahiers GUT\-en\-berg} and \emph{Lettre GUT\-en\-berg}), Christina Thiele (previous editor of TUG's \emph{TTN}), Luzia Dietsche (editor of \textsc{Dante}'s \emph{Die \TeX{}nische Kom\"odie}), Gerard van Nes (editor of NTG's \emph{MAPS}), Wietse Dol (editor of the Euro\TeX95 Proceedings), Barbara Beeton (editor of \TUB), and W{\l}odek Bzyl (editor of GUST's magazine). It soon became evident from the various talks that most of the problems encountered are common to most user groups' publications; in particular there is often an extreme dependence on a single individual, so that when s/he is unavailable the whole production process suffers. It was clear that a production team of a few individuals is the only way out of this situation, to ensure that issues can be produced at more or less regular intervals. Other themes were the difficulty of finding authors, and volunteers to proofread, correct and edit the articles. All in all, editing a journal is a non-trivial task and involves the dedication and hard work of a lot of individuals. As Jacques Andr\'e summed it all up: ``An editor is like an organist at Sunday mass: if the music is good, no one hears it; if it is bad, everyone cries.'' At the end of the session Gerard distributed \emph{MAPS} awards to Christina Thiele for her many years as \emph{TTN} editor and to Mimi Burbank for her hard work organizing the TUG95 Conference.

\section{Tuesday July 25th}

As Donald Knuth only rarely attends TUG conferences these days, it was a real pleasure to have him with us at TUG95, and we gave him the floor for the first part of the morning. After answering the ``usual'' first question---``when is volume four of \emph{The Art of Computer Programming} coming out?'' (it will be published over several years, with about 200 pages coming out every six months or so)---he talked at length about \TeX{} and how he would do it in basically the same way if he were to start over today. A more detailed account of Knuth's presentation, based on notes taken by Christina Thiele, will be published elsewhere.

After the break John Hobby, the author of \MP{}, gave a live demonstration of his program, which is now used by Knuth himself to prepare the graphical material of his books. \MP{} implements a picture-drawing language based on Knuth's \MF{}, but it outputs \PS{} commands instead of bitmaps. Moreover, it gives access to all features of \PS{} and allows easy inclusion of text and graphics (for more details on \MP{} see the user guide available on CTAN, or an introductory article in the Euro\TeX92 Proceedings, pp.\ 21--36, Prague, Sep.\ 1992).

The last set of presentations was about ``future systems''. First Robin Fairbairns gave an introduction to Unicode.
He reviewed the winding road from 6-bit proprietary codes for encoding information on computers, to 7-bit ASCII-like codes, then to 8-bit EBCDIC and the ISO 8859-xx standards, specific language encodings, the 16-bit East-Asian JIS, GB, KS and Big Five codes, and on to Unicode and 32-bit ISO 10646 (for a detailed discussion of questions of encoding and multilingualism see, e.g., \emph{Cahiers GUTenberg} 20, pp.\ 1--53). It is worth noting that Unicode is the internal encoding used by Haralambous and Plaice's $\Omega$ program, a 16-bit extension of \TeX{} (\TUB\ 15(3), pp.\ 320--324 and 344--352). Ji\v{r}\'{\i} Zlatu\v{s}ka then brought us up to date on the \etex{} project, whose first version had just been distributed to developers. Peter Breitenlohner complemented Ji\v{r}\'{\i}'s talk with some more technical details on \etex{}. It all sounds like an interesting development.

During the first part of the afternoon Wietse Dol showed how easy it is to use the ``plug-and-play'' 4All\TeX{} CD system for PCs. An interactive live presentation showed the ease with which the system can be installed from the CD, and also how applications can be run.

\section{Wednesday July 26th}

Mark Swift was the first speaker of the day, and in his talk \emph{Modularity in \LaTeX} he explained that \LaTeX{} should be built in a highly modular way. In particular, an abstraction of functional modules not mapped onto filenames would be an important step, as shown by recent discussions on the \texttt{LATEX3} discussion list, where the ``forced'' uniqueness of filenames in the basic \LaTeX{} distribution, like \texttt{article.cls}, was questioned. The speaker proposed some possible extensions to \TeX{} and discussed his work on the \texttt{frankenstein} package, which adds certain kinds of modularity to \LaTeX.

Bart Wage of Elsevier Science in Amsterdam gave an interesting description of how journals are handled from source copy to printed/electronic document. Text is converted into SGML, figures are kept in various formats (e.g., TIFF, JPEG), and \LaTeX{} sources are also translated into SGML using the Elsevier DTD. \LaTeX{} allows for easy typesetting, but it has no formal DTD, making extensive tagging somewhat difficult, while SGML allows for a formal DTD, with respect to which all document elements can be explicitly tagged (a schematic example appears below). Formal specifications exist for math, tables, bibliographic material, etc. Elsevier see SGML as an ideal exchange format between different source representations. All documents are translated into SGML and stored in the ``Warehouse'', which forms the common repository for the various further uses of the documents. This is extremely important for electronic documents, where re-use and structure-awareness are of prime importance. Bart emphasized that a journal is not just a collection of articles, but a real web of cross-links to related topics and references, and that the future of publishing lies in the optimization of these facilities for all potential users.

Pierre Mackay then read a paper on \emph{Modern Catalan Typographical Conventions} written by Gabriel Valiente Feruglio, who could not attend. It was an interesting journey in search of typographic rules for scientific Catalan texts. The author complained that no normative typographical conventions existed for his language, and then went on to propose a set based on his study of several reference texts. Finally he introduced a set of possible \TeX{} definitions implementing these rules.
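Coming back to Bart Wage's description of explicit tagging, here is a schematic fragment of an SGML-tagged article header; the element names are invented for the illustration and are not those of the actual Elsevier DTD:
\begin{verbatim}
<article>
  <front>
    <title>On the Reuse of Journal Articles</title>
    <author><fnm>B.</fnm><snm>Wage</snm></author>
    <abstract>Documents stored in the Warehouse ...</abstract>
  </front>
  <body> ... </body>
</article>
\end{verbatim}
Every element is explicitly opened and closed with respect to the DTD, which is what makes the ``Warehouse'' documents re-usable by programs that never look at any formatted output.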
After coffee Petr Sojka gave one of the best technical papers of TUG95 (Petr got the ``Knuth'' prize of the Conference for discussing something important that Knuth did not think about when developing \TeX), a follow-up to his detailed paper on the problems of hyphenation with \TeX{} presented in Gda\'nsk in September 1994 (Euro\TeX94 Proceedings, pp.\ 59--68). This time he discussed the problems of hyphenating long compound words, which occur very often in German, Dutch, and the Slavic languages; since in these languages the constituent parts are not signalled by a hyphen or other fill character, it is often difficult, if not impossible, to hyphenate such words correctly (an illustration appears below). Petr therefore suggested extensions to the hyphenation algorithms of \TeX{} to treat such cases successfully, and discussed in a generic way which basic functionalities would be needed. Perhaps something to be implemented in (one of) the ``successor(s)-to-\TeX'', he commented (and Don seemed to agree).

The last talk of the morning was by Sebastian Rahtz, who discussed the translation of \LaTeX{} sources into SGML. His presentation was a complement to Bart Wage's earlier that morning. After working on the conversion problem for several months, Sebastian came to the conclusion that the only foolproof way is to use \TeX{} itself to output SGML, a solution implemented by ICPC in Dublin. He is actually using an intermediate approach (pioneered by him and myself at CERN), which translates most \LaTeX{} commands into SGML by redefining macros, and then extracts the text from the dvi file. This system copes with almost all \LaTeX{} commands (including math).

The afternoon had presentations by the convenors of various working groups. First Norman Walsh presented the work of the \texttt{tds} (\TeX{} Directory Structure) working group. He explained the rationale of the choices that have been made, emphasizing that one of the basic constraints had been ISO-9660, which (only) allows for directories eight deep and limits files to ``8+3'' case-insensitive names (to DOS users this will sound like a blessing, I am sure). Since not all \TeX{} engines support an optimized recursive directory search, major attention was paid to proposing an efficient structure that minimizes losses of efficiency when \TeX{} searches for package and font-related files (a sample of the layout appears below). It was emphasized that a production run-time directory structure like \texttt{tds} is different in nature from an archive like CTAN, and that the two cannot be married easily.

Tomas Rokicki and Michael Sofka then discussed the work of the dvi-standard committee, especially the standardization of the various \Lcs{special} commands, which had been discussed by an extremely active group of interested implementors meeting over several ``working breakfasts and lunches''. I think that real progress was made in this area, where a normative syntax has been awaited for too long. I am very grateful for the enthusiasm shown by these people, and am convinced that we shall see their work bear fruit in the near future (Tom said he will be working actively on his program \texttt{dvips} over the next few months, so we can be sure, knowing Tom's reputation, that many of the hyper- and other goodies discussed during the conference will become part of this and other popular dvi drivers).
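One concrete example of such a \Lcs{special} convention, much discussed during the week, is the Hyper\TeX{} scheme for hypertext anchors (sketched here from memory, so treat the details as indicative only):
\begin{verbatim}
The proof is given in
\special{html:<a href="#thm2">}Theorem 2\special{html:</a>}.
...
\special{html:<a name="thm2">}\special{html:</a>}%
\textbf{Theorem 2.} ...
\end{verbatim}
A driver that knows the convention turns these into live links; one that does not simply ignores them, which is exactly the graceful behaviour a \Lcs{special} standard should guarantee.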
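Returning to Petr Sojka's talk, the compound-word problem is easy to demonstrate in German (using the shorthands of \texttt{german.sty} for the manual work-around):
\begin{verbatim}
% \- marks a break point, but suppresses all
% pattern-generated points elsewhere in the word:
Donau\-dampfschiffahrt

% german.sty's "- marks the compound boundary while
% keeping the patterns active in the remaining parts:
Donau"-dampfschiffahrt
\end{verbatim}
Petr's point was precisely that authors should not have to mark such boundaries by hand: the hyphenation machinery itself ought to cope with them.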
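And a sample of the \texttt{tds} layout as presented (taken from the draft, so details may have changed since):
\begin{verbatim}
texmf/tex/latex/base/article.cls
texmf/tex/plain/base/plain.tex
texmf/fonts/tfm/public/cm/cmr10.tfm
texmf/fonts/source/public/cm/cmr10.mf
texmf/doc/latex/base/usrguide.tex
\end{verbatim}
Every path component fits in ``8+3'', and the format/supplier/typeface levels keep searches shallow and predictable.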
After the refreshment break T.~V.~Raman gave a practical demonstration of his ASTER system, which allows one to ``hear'' \LaTeX{} sources, including mathematical formulas, being read out. His system uses a speech synthesizer via an augmented Emacs editor running with Common Lisp, and is able to analyse, decode and then transcribe well-structured \LaTeX{} documents into audible form. This last point is extremely important since, as already pointed out by Sebastian Rahtz in his talk earlier that day, the various (ambiguous) ways in which mathematics can be coded in \TeX{} mean that there exists no automatic way to parse arbitrary \TeX{} source into something more generally usable, such as SGML or audible sound. This was the loudest plea yet for using well-structured markup.

Just preceding the Conference dinner the editors of the various \TeX-related magazines had a second meeting to discuss ways of improving communication. It was decided to write a short overview of the experiences of each team for \emph{TTN}, and to provide cross-references to each other's publications on the user groups' respective WWW pages, where tables of contents of the magazines will be posted (in fact, GUT\-enberg has already decided to make all articles of the \emph{Cahiers} and \emph{Lettre GUTenberg} freely available on the Internet via WWW, initially in \PS{} form only). It was also proposed that all non-English publications try to provide an abstract in both the local language and English, so that these abstracts can be published in \TUB\ (or elsewhere). Editors were also asked to signal potential articles that might be interesting for translation into English and publication in \TUB\ (of course, editors can also translate articles from \TUB\ into their national language!). Presently, CSTUG, GUST, GUTenberg, NTG, TUG, and \ukt\ are working on a TEXART CD-ROM that will make all publications of those user groups available on this electronic medium (and on the Web). It was also agreed that each author should be asked for permission to reprint her/his article(s) in this way. During the meeting Gerard had the pleasure of offering the third \emph{MAPS} award to Barbara Beeton, the most senior and long-standing editor present, for her 16 years of effort to make \TUB\ an example of the typographic quality that can be achieved with \TeX{}.

At the end of the conference dinner in the evening a set of books written (and dedicated) by Donald Knuth was put up for sale to the highest bidder. Even Lamport's \LaTeX{} book got a nice inscription by the hand of Knuth. After I let the volumes of \emph{The Art of Computer Programming} escape, I concentrated on the ``big one'', namely the five volumes of \emph{Computers \& Typesetting}, which became mine for the nice little round sum of \$700. Adorned with the dedication of Donald Knuth, this will remain one of the treasures of my personal library! The sales were an outstanding success, and about \$1800 was collected towards the funding of the Euro\TeX{} bus which will take participants from Russia and central Europe to the Euro\TeX95 Conference in Papendal (the Netherlands). Many thanks to Addison-Wesley, who donated the books.

\section{Thursday July 27th}

In his presentation T.~V.~Raman gave an overview of ASTER---an Audio System For Technical Readings---the system he had demonstrated the day before. ASTER renders \LaTeX{} documents in an audible way, so that visually impaired persons can ``listen'' to their contents. Raman emphasized the importance of clear generic markup in the input source document, to ease the extraction of the logical structure, which can then be easily translated into an internal representation.
ASTER then renders the information by applying rendering rules written in AFL---Audio Formatting Language---to the internal representation. In a sense AFL is to audio formatting what \PS{} is to visual formatting (although AFL is by far not as complex). In conclusion he emphasized that one needs a semantics-oriented DTD to produce a high-quality audio document. Since no such completely general DTD can be constructed, one has to use the facilities provided by \LaTeX{} and its hypertext extensions.

Mark Doyle next reviewed the purpose and history of the Los Alamos preprint server, one of the first (and most successful) document servers on the Web. In fact it started in the area of (theoretical) High Energy Physics, in close collaboration with CERN (where WWW was ``invented''). Today several tens of thousands of preprints are available online, and over 20,000 users visit the server each day. Although at present most documents are only available as (mainly \TeX) source and standard \PS{}, they are now producing \PDF{} versions that include cross-references to other documents on the Web, using the \texttt{hyper\TeX} conventions and \PDF\ hypertext links. In this way cross-references to other documents can easily be turned into active links.

During the next half-hour I gave an introduction to Nikos Drakos' tool \texttt{latex2html} and showed how, by simple customization, the visual quality of the output HTML files can be substantially improved. I went on to show how the \texttt{latex2html} system also allows for interconnecting separate documents. I ended with a few examples of HTML3 output generated by an ad-hoc program developed at CERN and viewed with the HTML3-capable \texttt{arena} browser.

After the break Sebastian Rahtz showed how, with his \texttt{hypertex} package (which shares some code with the Hyper\TeX{} work discussed earlier by Mark Doyle), it is easy to turn \LaTeX{} documents into hyper-documents. Their ``hyper'' contents can be enriched by adding supplementary information about \LaTeX's cross-references via \Lcs{special} commands of the kind sketched in Wednesday's summary. These are picked up and translated into \PDF{}'s \texttt{pdfmark} commands by Mark Doyle's ``hypertext'' \texttt{dvihps} program, an extension of Tom Rokicki's \texttt{dvips} program. Tom stated that these extensions will end up, in one form or another, in the forthcoming upgrade of standard \texttt{dvips}.

The afternoon started with a second presentation of the \emph{4All\TeX} system and, as always, there was great admiration amongst all those present for the ease with which it is possible to ``plug and play'', i.e., to set up and run the system without much ado. It became all the more evident that such a CD-ROM for Unix is a real need, and a recurring proposal for the next great thing that TUG should come up with (and we are surely thinking about a way to get this done).

During the next hour I gave an introduction to SGML, using HTML as an example of a DTD, and showed that it is not difficult to understand the structure and syntax of a DTD, and from there to figure out the various possible document elements, their attributes, and the entities that are available to the user. Work on other DTDs for mathematics and tables was briefly mentioned, as were a few tools for authoring and checking SGML documents. I came away with the feeling that at the end of my talk most of the audience had a more balanced view of what SGML is, and what it is not.
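For the record, here is the flavour of the DTD declarations I walked through, modelled on (but simplified from) the HTML DTD:
\begin{verbatim}
<!ELEMENT UL - -  (LI)+      -- unordered list     -->
<!ELEMENT LI - O  (%text;)*  -- end tag omissible  -->
<!ATTLIST UL
          compact  (compact)  #IMPLIED >
\end{verbatim}
Such declarations say exactly which elements may appear where, which tags may be omitted (the \texttt{O}), and which attributes exist; that, and no more, is what the DTD standardizes.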
I therefore hope that my presentation will also contribute to eliminating most of the artificial animosity between the SGML and \TeX{} worlds. As Sebastian, myself and a few of the other speakers tried to show, SGML is about structure and \TeX{} is about typesetting, and the two tools are therefore complementary, and both useful.

Chris Hamlin, in the last scheduled talk of the day, described the production work at the American Physical Society and, as expected, it is similar in content, form, and structure to what we had heard from other speakers (at this conference Elsevier, or at other conferences Springer, OUP, etc.): a mixture of \TeX{} and other word-processor inputs is accepted by the production team. The proportion of \TeX{} sources varies widely between publications (from almost nothing in the chemical journals to well over 50--60\% in some of the physics journals). Various house styles are available, and at present ways are being investigated to translate the inputs into SGML, to take full advantage of electronic publishing tools.

The last part of the afternoon was reserved for the TUG Business meeting.

\section{Friday July 28th}

Already Friday. It seemed as though the Conference had only just started, but the bags at the sides of the room and the now-empty vendors' room made us realize that we were only here for another few hours.

The morning started with a paper submitted by Jonathan Fine, but read by Alan Hoenig in Jonathan's absence. The title was \emph{New perspectives in \TeX{} Macros}, and it dealt with a possible way of combining the advantages of both SGML and \TeX. Jonathan's \TeX{} macro package \textsc{simsim} takes SGML and style files as input and generates pages formatted with \TeX{} as output. \textsc{simsim} comes with an SGML parser, and the style files are used to link \TeX{} actions to SGML events. The \textsc{simsim} system also offers a programming environment for writing \TeX{} macros and style files. Issues of performance have not yet been addressed directly, but on sample documents the speed was comparable to that of \LaTeX{}. All in all an interesting idea, and I look forward to seeing Jonathan's finished product soon.

Sergey Lesenko then told us about his \texttt{t1part} tool, which partially embeds Type1 \PS{} font files into a document. The principle is to include the \PS{} code for only those characters that are actually referenced. This can result in huge savings in size if one uses only a few characters from many fonts (the procedure is based on the same model that includes only the necessary Type3 bitmaps for characters built with \MF). Tom Rokicki and Sergey have been working together over the last few months, and this facility will be built into the ``next'' version of Tom's \texttt{dvips}. I mentioned that Basil Malyshev has a somewhat similar utility, \texttt{fload}, that uses the publicly available \texttt{ghostscript} program to make a map of all referenced fonts and then includes only the characters needed. Basil's approach can be used for any kind of \PS{} file, so that it is complementary to Sergey's, which is well integrated with \TeX{} and needs no supplementary external program. During the discussion there were some interesting remarks on copyright issues connected with including Type1 fonts inside documents. It was felt that, although partial font loading would make pirating fonts less effective, this does not mean that all font vendors would agree to let us include their fonts in this way in files distributed electronically (on CD-ROMs or the Internet). To be continued\ldots

A more technical talk, on \MF{} this time, was Jeremy Gibbons' presentation \emph{Dotted and dashed lines in \MF}. He showed that drawing evenly spaced dotted and dashed lines in \MF{} is a non-trivial task, and he proposed several solutions. He introduced the notions \emph{evenly spaced in time} as opposed to \emph{equally spaced in space}, and went on to show that they are far from identical, since points can move at different ``speeds'' in space as they progress along a path evenly in time. Using recursive adaptive refinement techniques he showed how one can solve the problem in \MF{}; his procedure can be extended to allow for dashes, or alternating dashes and dots. As recursive techniques have the unwanted feature that they can overflow the stack, Jeremy also proposed a solution based on an iterative non-adaptive technique that, although perhaps less elegant and automatic, does the job almost equally well. At the end of his talk he showed several attempts at drawing an attractive muskrat, the logo of the \emph{Mississippi Muskrats} jazz band he used to play in.
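Jeremy worked in \MF{}, which has no arc-length primitives (hence the refinement techniques). In \MP{}, where \texttt{arclength} and \texttt{arctime} are built in, the time-versus-space distinction can be shown in a few lines (my own sketch, not Jeremy's code):
\begin{verbatim}
beginfig(1);
  path p;  numeric n;  n = 20;
  p = (0,0){up}..(40,20)..(80,0){up};
  for i=0 upto n:
    % equal steps in *time*: dots bunch up wherever
    % the moving point is slow
    drawdot point (i/n)*length p of p
      withpen pencircle scaled 2;
    % equal steps in *arc length*: evenly spaced dots
    % (drawn as a second row, shifted down)
    drawdot (point (arctime ((i/n)*arclength p) of p) of p
      shifted (0,-30)) withpen pencircle scaled 2;
  endfor
endfig;
end
\end{verbatim}
The first row of dots crowds together where the point moves slowly along the path; the second row is evenly spaced along the curve itself.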
The last scheduled talk was by Robin Fairbairns. After explaining the principles of the \PS{} multiple master Type1 font format, Robin showed how a first, crude system for using these fonts with \TeX{} was set up. All font instances are expressed in terms of weights with respect to the master designs. These weights are calculated by the \PS{} interpreter from the design parameters via the \PS{} operator \texttt{ConvertDesignVector}. Version 3.x of the \texttt{ghostscript} program was used to extract the weights, which were then used to generate the Adobe Font Metric (AFM) instances from the AFM files of the master designs. Then Rokicki's \texttt{afm2tfm} program was run to generate the corresponding \texttt{tfm} files needed by \TeX{}, and a header file was defined to allow \texttt{dvips} to actually specify the font instances from the weight vector. This setup was used to typeset the latest issue of \ukt's magazine \BV\ in Minion, one of Adobe's multiple master fonts.

The morning ended with the ``Closing Ceremony'' and the announcement of the TUG95 prize winners. Christina Thiele, as vendor and public-relations liaison, thanked the various companies who had vendor booths or otherwise contributed to the TUG95 conference, in particular Addison-Wesley for the books they donated (which were put up for sale on the Wednesday evening in aid of the Euro\TeX95 bus, with a few still left for another sales idea we are playing with to spice up the TUG96 bursary; stay tuned to \TUB\ or \emph{TTN}!). Mimi Burbank, as Chair of the Organizing Committee, thanked all the people at SCRI who helped her financially and organizationally, and by providing PCs and a printer. She also thanked the extremely efficient hotel staff for their never-ending devotion to a job well done.

Sebastian Rahtz, the chair of the Program Committee, then announced the prize winners for TUG95. Just before coffee that morning all participants had been asked to write down an ordered list of the four papers they liked most, and on the basis of that list it was Raman who was selected for best presentation, best paper, and most important contribution to the \TeX{} world (and humanity, one person wrote). The Knuth prize, for the paper discussing something that he ``forgot'' in his \TeX{} program, went to Petr Sojka for his work on hyphenation.
Other prizes went to Richard Kinch for \texttt{MetaFog} (he put his prize copy of Textures up for sale, so that the Euro\TeX{} bus again got somewhat more money to take home), to Alan Hoenig for his marvellous Poetica work, to Jeremy Gibbons for his entertaining and erudite explanation of \MF{}, and to Sergey Lesenko and Tom Rokicki for partial font downloading and work on the dvi standards. Many thanks are due to the fine \TeX\ vendors Blue Sky Research, Y\&Y, PC\TeX, and Richard Kinch, who generously donated copies of their products for the prizes.

Of course we did not forget our friends from NTG, without whom this conference would not have been the same. Their \emph{4All\TeX} CD-ROM was one of the highlights of this conference (they sold about 40 copies, and the remaining 60 were taken to the TUG office to be sold to the North American TUG community). Their ``presence'', good humour, the organization of the book auction, the coordination of the TEX-ED initiative, and the hundred or so photos they took made them an unforgettable part of this meeting. Therefore a signed copy of the \MF{} book was given to Wietse Dol. In a gesture underlining their dedication to \TeX{} and TUG, Wietse then offered TUG the two golden (original) CD masters of the third edition of \emph{4All\TeX}, which had just appeared. I had the pleasure of receiving them in the name of TUG, and I promised that they would be framed and displayed in a prominent place in the TUG office in San Francisco. The gifts concluded with the UK \TeX\ Users Group and TUG presenting 2 bottles of wine and 2 boxes of chocolates to Don Knuth, maintaining the `2' theme begun by NTG's 2 CDs at the start of the conference.

Finally it was my duty to formally end the TUG95 meeting, and after thanking Knuth for his presence, which made this $2^{2^{2}}$th meeting even more special, I reiterated the thanks to all vendors, SCRI, and the hotel staff for their display of (southern) American hospitality. Then I invited all participants to the next (17th) TUG annual meeting in 1996 in Dubna (Russia, 150~km north of Moscow, on the Volga River), where from July 28th to August 1st TUG96 will be hosted by the Joint Institute for Nuclear Research.

During the afternoon Alan Hoenig gave a practical introduction to the use of virtual fonts. He showed how they can be used to create new characters as various combinations of glyphs and rules, and described how Alan Jeffrey's \texttt{fontinst} package allows one to easily install \PS{} font families. In his usual pedagogical approach Alan made it all sound extremely simple and straightforward, and all fifty participants in this last ``event'' of the conference came away with the feeling that they were ready to generate some virtual fonts themselves.
\end{Article}