
TeX Is Wonderful, What Is TeX?

March 9, 2011


Two stories about the beginning of TeX

Don Knuth is one of the founders of Computer Science, that is, CS with capital letters. He has worked in many areas of our field, received perhaps every award possible, and is certainly one of its giants.

Today I want to talk about some old and new stories about TeX.

Knuth created TeX in the late 1970’s as his personal typesetting system. The reason he did this is simple: he disliked the typesetting job that was done on one of his books. The rest of us would have lived with it, ignored it, or maybe switched publishers. But Knuth decided to follow the rule: if you want something done right, write a program. Okay, that is not exactly how the saying goes, but it is close enough.

We owe Knuth a huge thanks. Today TeX is an invaluable tool that makes our papers and books beautiful.

Two Stories About TeX

• Where’s the …? Jeff Ullman moved in the late 1970’s from Princeton University to Stanford University, where he still is. Jeff is famous, famous for many things, but perhaps most for his great collection of textbooks on various areas of computer science. When he moved to Stanford he was using an old typesetting system called troff to write his books. It was adequate, the books came out fine, but he wanted to try the then-new system that Knuth had just invented—he wanted to think about switching to TeX.

Jeff did the obvious. He got some basic info on TeX from one of Knuth’s students and tried it out. He liked the output and thought he would switch over to TeX. But recall that in those days there was no Internet, not even email, so he needed help on how to use TeX. He realized he needed a formal definition of the syntax of TeX: the BNF for what is legal TeX would be quite useful, he thought. Recall BNF is a nice way to define syntax. Here is the BNF for US postal addresses from Wikipedia, lightly abridged:
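
<postal-address> ::= <name-part> <street-address> <zip-part>
<name-part> ::= <personal-part> <last-name> <opt-suffix-part> <EOL>
              | <personal-part> <name-part>
<personal-part> ::= <initial> "." | <first-name>
<street-address> ::= <house-num> <street-name> <opt-apt-num> <EOL>
<zip-part> ::= <town-name> "," <state-code> <ZIP-code> <EOL>
<opt-suffix-part> ::= "Sr." | "Jr." | <roman-numeral> | ""
<opt-apt-num> ::= <apt-num> | ""

Each ::= rule defines the nonterminal on its left as one of the alternatives, separated by |, on its right; quoted items like "Sr." are literal text.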

So Jeff went to see Knuth and asked him for the grammar for TeX. Knuth looked at him and said,

What grammar? There is no BNF, there is no description.

Jeff was shocked. Knuth was famous for his research into formal grammars; he had invented LR parsing in 1965. How could he design a language without any formal grammar? Well, he did. The legal inputs to TeX were those strings Knuth’s program accepted—no more, no less.

• You did … what? While I was at Berkeley in the same time period, the late 1970’s, we invited Knuth to visit us as a distinguished visitor. Part of being distinguished (in academia there is no free lunch) was that he had to give a series of lectures. They could be on whatever he wanted, but he had to give them.

One of the lectures was on the design, implementation, and testing of the first version of TeX. The room for this talk, as for all his talks, was filled to overflowing. Knuth began to explain how he wrote the first version of TeX. My faculty colleagues grew more and more uncomfortable as Knuth spoke. The high-level version of what he said was:

I sat down and started to type in the entire program in Pascal. After X hours the entire system was entered. Then I tried to compile it all. After fixing a few syntax errors the system compiled. I then tried to typeset a sample piece of TeX. Again after fixing a few errors I got this all to work …

My colleagues were seeing, with their own eyes and ears, Knuth violate all the rules of software building that they had been teaching their students: build a specification, then code and debug modules, then create test sets, and on and on. They could not believe it.

Of course the problem was, and is, that Knuth is brilliant. He is one of the few people in the world who could have built the first TeX this way and gotten it to work in any finite amount of time. Since there was no specification and no BNF, there was no way to tell what was legal or not. While the students heard all the wrong statements from Knuth, I think no lasting damage was done. Everyone realized they were watching a virtuoso performance by a master. It was like Wolfgang Mozart explaining how he wrote concertos—you could listen to what he said, but you could not do it yourself.

Open Problems

I have a TeX open problem. The \href command from the hyperref package takes \href{a}{b} and outputs the string b as a pointer to the URL a. I wanted it to act differently: I want it to output the string b with a superscript footnote mark, and add the footnote, numbered say 1, containing the URL a, somewhat as Wikipedia does. Of course the next one would use a different number for the footnote.

I tried to do it myself and failed. I then tried TeX Stack Exchange, and got lots of almost-solutions, but none actually worked yet. People were very nice and helpful, explaining lots of TeX details, but the problem seems to be that the interaction between URLs and footnotes is tricky. Also recall that a URL can contain very nasty characters.

As we go to press—is that the right term?—I just got a solution for my TeX problem. Here it is, thanks to Martin Scharrer via TeX Stack Exchange:
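
In outline the fix looks like this; take it as a sketch rather than the exact code posted (the macro name \hreffootnote follows the discussion in the comments, and hyperref must be loaded). The key is hyperref’s internal \hyper@normalise, which re-reads the URL argument with verbatim-like catcodes so nasty characters such as % # & _ survive:

\makeatletter
% \hreffootnote{url}{text}: typeset "text" followed by a footnote
% containing the url. \hyper@normalise sanitizes the url argument
% before \hreffootnote@ ever sees it.
\DeclareRobustCommand{\hreffootnote}{\hyper@normalise\hreffootnote@}
\newcommand\hreffootnote@[2]{#2\footnote{\url@{#1}}}
\makeatother

Compare the \myhref version Scharrer posts in the comments below.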

54 Comments
  1. March 9, 2011 8:59 am

    … seeing Knuth violate all the rules of software building that they had been teaching their students

    One hypothesis to explain this is, as you say, that Knuth is a genius and ordinary programmers shouldn’t try to emulate him.

    But I think that the alternative hypothesis, namely that these “rules” are situational rather than universal (that is, they apply to some software projects and not to others, the latter set including TeX), ought to be considered.

    • March 10, 2011 2:08 am

      Very true. It’s the same issue with design patterns. Some people interpret them as “You have to always use one of the official design patterns for everything.” whereas they really only make sense more as “Occasionally, you may run into a case like this, in which case, this approach might be handy.”

      Software development is not a sequence of absolutes.

    • Dr. Philip Carey permalink
      March 11, 2011 9:10 pm

      The only valid hypothesis is that Knuth is a genius.

  2. March 9, 2011 9:41 am

    I have been using TeX to write mathematical articles for the last two decades. It is wonderful, but as your open problem indicates, sometimes you need expert advice. On the other hand, mathematical papers written in another format have been received with suspicion, particularly when submitting, say, to arXiv or to a scientific journal. Is it true that a good, trustworthy mathematical paper should be written using TeX?

  3. March 9, 2011 10:09 am

    Hi Dick,

    regarding your href macro, would this definition (in LaTeX) not work, assuming you have loaded the url or hyperref package?

    \newcommand\myhref[2]{#2\footnote{\url{#1}}

    • March 9, 2011 10:10 am

      Ah, the joys of matching braces. Of course, it should read

      \newcommand\myhref[2]{#2\footnote{\url{#1}}}

      • March 9, 2011 12:40 pm

        Sorry, should have thought about those nasty characters first. This should work with hyperref though (see my answer on TeX Stack Exchange):

        \makeatletter
        \newcommand\myhref@[2]{#2\footnote{\url@{#1}}}
        \DeclareRobustCommand{\myhref}{\hyper@normalise\myhref@}
        \makeatother

  4. March 9, 2011 10:29 am

    Any chance you will do a future post about Joris van der Hoeven and his TeXmacs software?

  5. sigma permalink
    March 9, 2011 10:45 am

    Professor Lipton,

    Your history of TeX is not as I understand that history.

    My memory is that Knuth wrote the first version of TeX in the language SAIL (Stanford Artificial Intelligence Language) on a DEC PDP-10 computer. The output was intended for a new Xerox Dover laser printer.

    For getting TeX to type URLs, that's easy: just use the idea of 'verbatim' in Knuth's 'The TeXbook'.

    Your example of \href is not TeX but LaTeX.

    Writing some verbatim macros is a good exercise for TeX padawan learners. I wrote three such macros early in my usage of TeX.

    The macro that is sufficient for URLs in text I call \IV for 'inline verbatim'. The code is just:

    % \uncatcodespecials is not predefined in plain TeX; a standard
    % definition (it gives each special character catcode 12) is:
    \def\uncatcodespecials{\def\do##1{\catcode`##1=12 }\dospecials}
    % \IV opens a group, switches to verbatim-like catcodes, then
    % reads everything up to the next copy of its delimiter character.
    \def\IV{\begingroup\IVsetup\IVdo}
    \def\IVsetup{%
    \tt\spaceskip=0.5em
    \catcode`\`=\active
    \uncatcodespecials\obeyspaces
    }
    \def\IVdo#1{\def\next##1#1{##1\endgroup}\next}

    So, to use it, pick a character, say, '|', not in the URL, and then type, say,

    \IV|http://rjlipton.wordpress.com/|

    and you will be successful!

    I use TeX (not LaTeX) for all my higher quality word whacking. The main reason is that one of my more important tools and interests is applied math, and TeX remains by a wide margin the best way to type math. Otherwise, I have over 100 macros in TeX, such as \IV above, good for letters, foils, and papers. That TeX is solid and solidly frozen helps: some of my TeX macros are over 10 years old, but they continue to run just fine.

    • rjlipton permalink*
      March 9, 2011 12:43 pm

      I am corrected about the language, sorry.

    • rjlipton permalink*
      March 9, 2011 12:43 pm

      The URL problem is that I have a few hundred of them and wanted to avoid editing them all.

      • sigma permalink
        March 9, 2011 1:51 pm

        Only a few hundred URLs? Not a problem!

        Just use a good editor when typing into TeX (assembler, Fortran, PL/I, C, C++, Visual Basic .NET, C#, XML, ASPX, e-mail, blog posts, recipes, etc.), e.g., KEdit.

        With KEdit, there are various ways.

        (1) Macro. Just write a little macro in the KEdit version of M. Cowlishaw's elegant macro language Rexx.

        (2) Interactive. Type in TeX where each URL is on its own line, and then use just the simplest features of KEdit. So,

        top
        all|http://|
        c//\IV|/ *
        c/ /|/ *

        Done.

  6. March 9, 2011 10:56 am

    Very nice anecdotes.

    I am just wondering why you did not choose to just write b, then add a footnote, which contains the link a, like this: b\footnote{a}?

    • rjlipton permalink*
      March 9, 2011 12:42 pm

      Alexia,

      I think that will have problems with _$& in URLs

      • March 9, 2011 1:13 pm

        Then you could do b\footnote{\url{a}}.

        The problem with the nice solution given to you on Stack Exchange* is that if there is punctuation after the \href{}, then the footnote number will appear before that punctuation mark, which is not OK.

        * not the one you give here, but the one here:

        http://tex.stackexchange.com/questions/12774/changing-hrefs-to-footnotes

        modified into

        \let\oldhref\href
        \renewcommand{\href}[2]{{#2}\footnote{\url{#1}}}

        to be closer to what you wanted.

      • sigma permalink
        March 9, 2011 7:15 pm

        Right: the characters '_$&%', along with, of course, '\', will give TeX a big headache.

        But, a verbatim macro to the rescue! Just type

        \IV|_$&%\|

        using my macro \IV above, and TeX will remain all nice and happy.

    • March 10, 2011 3:12 am

      It’s nice to have \href{a}{b} in the source and be able to quickly switch between a print version (with footnotes) and an online version (pdf with inline links).
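
      For instance, a toggle along these lines (a sketch, assuming hyperref is loaded; the robust-URL issues discussed above still apply):

      % Print version: each \href becomes its text plus a footnoted URL;
      % comment out \printversiontrue to get the online pdf with live links.
      \newif\ifprintversion \printversiontrue
      \ifprintversion
        \let\oldhref\href
        \renewcommand{\href}[2]{#2\footnote{\url{#1}}}
      \fi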

  7. March 9, 2011 11:29 am

    Thanks for the great post. Don Knuth is an amazing person.

    I read the explanation of the solution on Stack Exchange, but I was wondering if you could elaborate a bit on the code. In particular, I’m confused about the functionality of the \hreffootnote function and the use of the @@ in the commands.

    I’ve been using LaTeX for a couple of years, but have pretty much stuck with simple macros.

    Another question, which is somewhat related: Is it possible to get a syntax-highlighting editor (e.g. vim) to recognize that the % in a URL is not actually a comment? I find it very annoying to edit files like this where the highlighting is messed up.

  8. bonheur2004@free.fr permalink
    March 9, 2011 11:45 am

    MOZART

    Please don’t say Wolfgang Mozart: his full name is Wolfgang Amadeus Mozart. Just from the meaning of this second given name – Amadeus – it adds a very important meaning to the whole. Besides: in Austria we just call him WOLFERL.

    Regards
    Jack

  9. March 9, 2011 12:10 pm

    Great stories. I used to do my math homework in college in TeX and I remember that one time when I didn’t have enough time and wrote it out by hand, the grader pretended to deduct points because it wasn’t in TeX.

    By the way, I had to read the following sentence multiple times to try to make sense of it and I’m convinced you mean “were” when you say “where”: “The legal inputs to TeX where those strings Knuth’s program accepted—no more no less.”

  10. March 9, 2011 1:29 pm

    Hmmm … my own recollection of the history of TeX is somewhat different … that Knuth initially envisioned TeX as a summer project suitable for a student … and that the whole project proved to be *immensely* more difficult, took years longer, and the resulting computer code was far more error-afflicted, than Knuth anticipated.

    Three relevant articles by Knuth (among many) discussing the above points are Mathematical typography (1979), Literate programming (1984), and The errors of TeX (1989).

    • March 9, 2011 1:39 pm

      I should have mentioned that the above article Mathematical typography suggests that TeX *does* have a well-defined grammar and syntax … the grammar is a bijective map to the set of blocks of lead type, and the syntax is a bijective map to centuries-old aesthetic rules for typesetting blocks in frames … all of which “grammar” and “syntax” Knuth carefully reviews (with examples!) in Mathematical typography.

      That’s why (La)TeX embraces idiomatic constructs like \makebox[][]{}: the typographic naturality of these constructs becomes evident when one associates them, not to any kind of mathematical logic, but rather to the physical processes of typesetting.

  11. Sam Tetruashvili permalink
    March 9, 2011 1:58 pm

    The difference between a novice and a master is that a novice thinks in terms of rules, while a master can see beyond the rules and just go on intuition. My guess is that this is what Knuth did when developing the first version of TeX. I’d argue that one should always do this when developing the first version of anything due to the simple fact that you won’t truly know what it is until you finish building it. After you’ve done this you can go back and make a revised version that fits in with the rules. If you spend all of your time trying to follow all the rules you won’t get anywhere in software development.

  12. March 9, 2011 4:51 pm

    Harumph! You could have been a _little_ more explicit about your problem on the TeX StackExchange site. Your exact question was “Can I define a command that takes an argument but does not eval the argument?” and when asked for clarification it turned into “Can I redefine \href{a}{b} to \footnote{\href{a}{b}}?”. So the lesson to learn is: include context when asking a question, otherwise people might just answer the question you ask, not the question you mean to ask. (In case it’s not clear (this is the internet, after all), the “Harumph” was meant in jest.) To anyone wondering about the various solutions offered, or with further questions about doing stuff in TeX (or LaTeX), please do come over to the TeX StackExchange site. We’re quite friendly and do our best to answer your questions … even if they aren’t quite the question you meant to ask!

  13. March 9, 2011 7:08 pm

    I use LyX (http://www.lyx.org/) to write papers in TeX (LaTeX), so as a rule it is not necessary to think about “software building violations”, just as it is not necessary to think about RTF “grammar and syntax” to read or write an rtf-file. Knuth said that book writing is the same as program writing, but I think these are very different things.

  14. March 9, 2011 9:25 pm

    The Mozart comparison is apt. A young musician is said to have asked Mozart how to get into composition. Mozart told him it was best to stick to short pieces for a while.

    “But, Maestro”, said the young musician in frustration, “you wrote a symphony when you were 8.”

    “Yes”, answered Mozart, “but I wasn’t asking anyone what to write.”

  15. Paul Beame permalink
    March 10, 2011 2:48 am

    I started typesetting my first papers in troff before we had access to TeX. The model of formatted text was still entirely characters that moved around to different spots on the page, as opposed to boxes and glue. (For example, creating large braces, as in a case statement, involved stacking the “top of brace”, “middle of brace”, and “bottom of brace” characters with the right number of vertical lines.) It looked truly ugly, but we could print it on a laser printer, which was great. After one try at TeX one would never want to go back. It was the first time that mathematics actually looked good. The model of boxes and glue was a big advance over troff, though it does not seem to have a model for text that “flows” around an object/box. (Is there a way to do this in TeX at all?)

    TeX’s biggest wins were for its approach to formatting displayed mathematics and Knuth’s much better dynamic program for line-breaking paragraphs.

    However, when I was writing my dissertation I only had access to tex82 and not LaTeX (though it was 1986), and some things in TeX were extraordinarily painful. The dissertation had long chapters that I wanted to be easy to leaf through, and I spent a long time working with TeX’s page-breaking procedures to come up with code that would make the page header on a page of my thesis be the title of the first new section that began on the page (or the current section if there was no new one). This would have been trivial in LaTeX, which I had access to a month later, but it gave me an idea of how unpleasant page-breaking in TeX really is. (By the time the page is broken, a bunch of text on the following page is current, but the decision is largely local.)
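
    In LaTeX the mark mechanism gives this almost for free; here is a minimal sketch with the fancyhdr package (a sketch, assuming the standard article class):

    \usepackage{fancyhdr}
    \pagestyle{fancy}
    % Each \section records its title as a mark.
    \renewcommand{\sectionmark}[1]{\markright{#1}}
    % \rightmark expands to the first section mark on the current page,
    % or to the mark still in force if no new section starts there.
    \fancyhead[R]{\rightmark}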

    LaTeX made working with TeX hugely easier, but page-breaking is still a bit of a mess – just try to get your diagrams where you want them, or figure out what you need to edit out to get your conference paper to fit in 10 pages. Moreover, though it is more like a proper programming language, it also seems to be a bit of an embarrassment in that respect – I am not sure how many of its problems are inherited from TeX and how many are simply inherited from the TeX ethos.

    • March 10, 2011 7:49 am

      For text that flows around boxes, the standard LaTeX package “wrapfig.sty” works reasonably well, especially in conjunction with adjustments to caption typography via “caption.sty”.
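
      A minimal sketch of wrapfig usage (the black rule is just a placeholder for a real graphic):

      \documentclass{article}
      \usepackage{wrapfig}
      \usepackage{caption}
      \begin{document}
      % "r" places the figure at the right margin; the paragraph text
      % that follows wraps around it.
      \begin{wrapfigure}{r}{0.4\textwidth}
        \centering
        \rule{0.35\textwidth}{3cm}  % placeholder image
        \caption{A wrapped figure.}
      \end{wrapfigure}
      Enough body text here will flow around the figure for several lines.
      \end{document}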

      However, when wraparound figures begin to occupy more than (about) 30-40% of the text area, page-breaking becomes unreliable and unaesthetic, such that empirical fiddling is required. In such cases, in the header of the document, the following LaTeX parameter adjustments may help:

      \renewcommand{\topfraction}{1.0}
      \renewcommand{\bottomfraction}{1.0}

      These (in effect) instruct LaTeX “never mind aesthetics, just place figures exactly where I say.”

      If one is desirous of achieving the very highest level of typographic quality, then the magisterial “memoir” class (layout) and “pgf” package (graphics) are worth studying (their aggregate documentation is well in excess of one thousand pages). That’s where I first learned the trick of adjusting “\topfraction” and “\bottomfraction”, for example. Good typography is *not* simple!

      All of the above packages (including TeX itself) largely encode the expertise of the master Monotype machine operators of the 1930s whose high-quality typographic output Knuth greatly admired … that is why TeX’s mathematical toolset for manipulating “boxes” of text so directly reflects the physical toolset of Monotype technology for manipulating slugs of metal type.

    • March 10, 2011 10:08 am

      Paul’s comment is perfect context to add mine. When I arrived at the Oxford Mathematical Institute in late 1981, all they had were a couple of dual-carriage typewriters with a Greek/Symbol wheel that one had to physically hoist into place to type a symbol. There was also a gorgeous typesetting system through O.U.P. that was championed by one professor, but per-use it was prohibitively expensive for students. I took up the cause of first trying to get something like “troff” installed on the University systems (they couldn’t afford even the £793/yr. required to upgrade Real*8 to Real*16 on their public FORTRAN system), but when IBM PC’s came out in 1983 I switched to that direction. I did not know about TeX let alone PC installations of it, but I did learn about a WYSIWYG program called T^3, which is still going strong as Scientific WorkPlace (with subsets Scientific Word and Scientific Notebook) made by MacKichan Software.

      Through the generosity of Roger Penrose and the other Institute directors—allowing that the Mathematical Prizes Fund could be tapped for my purpose—my request was approved in fall 1984. The Toshiba 24-pin dot-matrix printer also came out then, with fine enough resolution to overturn an existing University ban on dot-matrix printers for dissertations. This was set up coincidentally in room T-3 on the Institute’s third floor. I finished testing the installation between 2am and 4am after a Friday (12/14/84) that began with filing long paperwork for my Junior Research Fellowship at immigration in East Croydon south of London, getting out only at the 5pm closing time. Then, rather than return to Oxford for dinner and have the whole evening to work, I was led to ring on the London flat of an African exchange student whom I had known at my Oxford college during the term. Just a half-hour before, she and her daughter had learned of her husband’s suicide in Africa. I stayed with them until the last train. This all came with me set to fly home on the 16th. The system was used for a few dozen dissertations including eventually mine.

      The original dot-matrix fonts supplied were somewhat crudely drawn—maybe they were digitized. The Slavic Languages and Literature Department, who bought the same system and contacted me for appointment the day after I returned in January, expressed disappointment right away, so I started improving the Cyrillic font myself dot-by-dot. Personal circumstances from March onward led me to improve all the other fonts likewise, and I had the full monk-at-a-calligraphy-board experience, but on a 24 x 18 grid, a thousand times over. Everyone used my fonts, so I joked that I had “written 20 theses” before I finished my own. My jigsaw-puzzle pieces for integral signs that could be lifted to any height with a nice slant (not just ramrod-straight) were the one design that the company adopted, for otherwise they sensibly switched to Metafont for their font rendering, rather than a monk! 🙂

  16. leibnizengine permalink
    March 10, 2011 9:59 am

    This story can be another sample use case to test “P is not NP”:

    1. Knuth’s program can be abstracted by a finite discrete series (such as hypergeometric series) of procedures or routines. The program was a polynomial. Knuth’s system can be abstracted by a finite discrete series of programs (such as hypergeometric series). The system was a polynomial.

    2. Knuth had the thoughts on the program in his mind (Fourier Transformation). Hence, the polynomial series (obtained in 1) became a non-polynomial. But there are special cases of being a polynomial.
    2.1 Knuth’s thoughts could be terminated by scaling (q-factor scales) the non-polynomial (obtained in 2) in his mind to obtain a polynomial.
    2.2 Knuth’s thoughts could be terminated by thinking through a code segment to obtain a polynomial (a parameter subset).
    2.3 Knuth could cut off his partial thoughts (by a delta function) to obtain an orthogonal polynomial (on the unit circle) to start coding.
    2.4 Knuth’s thoughts may have errors. The errors had a complex representation. The errors were in the imaginary domain.
    2.5 Knuth’s thoughts constructed continuation in the upper plane. Continuation was entire. This “entire” means the homographic thoughts (non-polynomials). The interrupted thoughts should be abstracted by the generalized structures such as hypergeometric structures, etc.

    3. Continuation in the upper plane was transformed into the lower plane. Hence, Knuth started to type in the program from what he had in his mind (Laplace Transformation). Continuation started to occur in the real domain and the imaginary domain. Now, continuation can be evaluated by the lines of code (LOC) and phase (time used) in the real domain.

    4. Knuth kept on typing. Iterative continuation (thinking and typing) obtained the Hermite polynomial in the real and the imaginary domain. Hence, he could enter the “entire” program in X hours. This “entire” means the modulus and phase completeness of the polynomial. Continuation existed in the imaginary domain such as the thoughts, unknown bugs, a TODO list, etc.

    5. Knuth used continuation for self checking and verified that the “entire” program and the system were entered. Continuation existed in the imaginary domain such as his thoughts, experience, unknown bugs, a wish list, deferred items, etc.

    6. Knuth used the compiler for verification. The verification was terminated by the errors (A complex kernel terminated continuation). Continuation has symmetric relations so Knuth did trials by errors. Now, continuation was evaluated by the number of the bugs to fix, the lines of code to change, and phase (time) in the real domain.

    7. The derivation of continuation kept the structure of continuation. Hence, Knuth wanted to fix the bug. Iterative continuation (thinking and fixing a bug) obtained the Hermite polynomial in the real and the imaginary domain. Hence, he could fix the bug in the real domain. Continuation existed in the real domain and the imaginary domain such as his thoughts, experience, known bugs, etc.

    8. Iterative continuation (thinking, typing, compiling, and fixing) obtained the Hermite polynomial in the real and imaginary domain. Knuth got this all to work… Upon completion of the work (in the real domain), the above items (1-8 in the complex domain) became his experience (a non-polynomial in the imaginary domain). He achieved both modulus and phase completeness in the real domain. Continuation existed in the imaginary domain.

    9. Knuth shared his experience (in the imaginary domain) with the audience through Laplace transformation to construct a flow of information (non-polynomial) in the complex domain. So the audience could see and hear his experience.

    10. Imaginary kernels cannot remove continuation. Hence, continuation still exists in the imaginary domain. Knuth may not think about this anymore but we are discussing his experience now (in the complex domain).

    11. Software engineering practices such as specifications, rules, and tests can be used for verification (upper plane and lower plane checking). For the projects involved with two parties or multiple parties, these rules, and practices are critical to make polynomial decisions.
    To terminate continuation, we need rules to obtain orthogonal relations for the decisions in both continuous and discrete cases. However, they form an imaginary kernel. The imaginary kernel does not remove continuation. Hence, we need to execute the rules.
    Knuth used his knowledge, skills, and compiler rules to obtain polynomial completeness in the real domain.

    12. “No BNF” and grammar story just support that continuation is the key between non-polynomials and polynomials. Modern software including operating systems and applications are designed and implemented for general purposes and scaling functionalities. Generalization and scaling contain continuation, which form non-polynomials. For example, we can use C or java to develop a business application or a mathematical package. Hence, we have specifications and expression languages for different programming languages. Knuth created the system for a special purpose.
    Specialization (special values and special cases) can terminate continuation to create polynomials. Hence, there are assumptions and restrictions to apply Knuth’s experience.

    This story illustrates the following key points:

    1. Knuth relied on continuation and verification to obtain polynomials in the real domain (his work) and in the imaginary domain (his thoughts).

    2. Polynomials and non-polynomials exist in the real domain and the imaginary domain.

    3. Continuation should be studied as the generalized structures discussed in my proof on “P vs. NP”.

    There are many details that cannot be addressed here. For details, please see my proof on “P vs. NP”.

  17. Dr. Philip Carey permalink
    March 11, 2011 2:58 pm

    Two facts: Donald Knuth is the greatest mind alive and TeX is the greatest achievement in the history of CS.

  18. March 12, 2011 5:00 am

    Thanks for the wonderful article, RJ.

  19. Praki permalink
    March 13, 2011 12:30 pm

    I attribute what little software writing skills I have to studying Knuth’s books. After having gone through some of his works in detail (I have almost all of his books), I have realized that his ability to handle complexity is extremely high. He is a rare CS person who has invented so many new things, written widely used software, and published works of lasting quality. Despite the high regard I have for him, I don’t believe that TeX worked flawlessly the first time on all valid input. Knuth kept a detailed log of TeX bugs, from which you can form your own impression.

    Knuth writes somewhere that he believes he has fixed the last bug in TeX. It’s a marvelously complex piece of software and such a claim can only be made by a true master!

    Thanks for the post

  20. TexMan permalink
    March 13, 2011 3:35 pm

    While TeX is wonderful, I have some misgivings about recommending that a student use it. Please feel free to correct me:

    1. TeX is pronounced “tech”, but most people use LaTeX, a user-friendly overlay on the TeX engine developed by Lamport. LaTeX can be pronounced “latex”. I recommend a dedicated word processor like TeXworks or LyX that has templates for easy markup. Many students, though, would benefit greatly from the spell check and grammar check, which some of these programs fail to make as simple as MSWord does. It also disappoints me to see fine-tuning with respect to spacing. If this is really necessary, MSWord is a better solution.

    2. The original TeX font CM is a bit spindly, but using XeTeX, any font on the machine is usable; a sketch appears after this list. Thus, it is no longer easy to distinguish an MSWord document from a LaTeX document (possibly some ligatures might still be distinct). As such, TeX is now increasingly an affectation, like insisting on manual gears for driving cars. But if it is used, it should be used well, not just with the default font, giant margins, and single spacing in the article class.

    3. For math input, MathType is developing into an excellent solution. One can write by hand or type using a template and have the math converted into TeX for insertion into the document. I think it is now a standalone program.

    4. It is probably easier to draw diagrams in MSWord, or use a dedicated diagram graphics program, and then insert a JPG/PNG file into a LaTeX document. The truly dedicated TeX user can use a package such as TikZ.

    5. For collaborative work with students, I’ve not been able to find a good TeX package that allows document comparison or highlighted insertions added into the margins. Supposedly, one can use the same techniques that programmers use for collaboration. Any suggestions would be welcome.
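
    Regarding point 2: a minimal XeLaTeX sketch using the fontspec package; the font name here is only an example, any font installed on the machine will do.

    \documentclass{article}
    \usepackage{fontspec}   % compile with xelatex (or lualatex)
    \setmainfont{Georgia}   % example: any installed system font
    \begin{document}
    This text is set in the chosen system font, not Computer Modern.
    \end{document}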

    • Random permalink
      March 18, 2011 6:20 pm

      How anyone can advocate the use of Word, with its ever-changing formats, inflated file sizes, and broken compatibilities, is beyond me.

      But I’ll give you my best reason to disdain Word: any work done through a series of clicks is as good as lost. If you ever achieve something non-trivial in Word, how are you ever going to remember it? Whereas anything you do in (La)TeX can be recovered easily by searching the source.

      So much so that I am going through the trouble of learning the basics of tikz. Yes, it’s a huge program that I only dabble in, yes it can be a royal pain in the buttocks, but at least any figure that I draw can then easily be edited, modified, adapted, ported…

      • March 19, 2011 12:56 am

        As much as I know I’m probably just feeding the troll, in case you’re being serious, I’ll give you a serious response: you’ve missed the point of user interfaces and made a solid counterexample to your own argument.

        In Word, I don’t need to remember how to do something non-trivial. The user interface should ideally make it clear how to do the thing you’d like to do, and for the most part, I find that Word 2007 and later accomplish that. The opposite is true of TeX. You need to remember how to do everything, or remember what keywords to search for. For example, it took me hours of searching on how to position a figure in TeX, only to find a website explaining that TeX will sometimes completely ignore the figure positioning you specify and there’s nothing you can do about it. Who designs a system in which you can’t do something as basic as position a figure?

        Also, that brings up the other issue you mentioned: format compatibility. I can open Word 95 documents perfectly well in Word 2007, but not vice versa. TeX has exactly the same issue! LaTeX does not support a vast majority of common image formats, but pdfLaTeX does, and LaTeX won’t tell you that it doesn’t support the image format you’re giving it; it’ll just spew out a ton of meaningless error messages that have nothing to do with the fact that it’s an obsolete version demanding obsolete formats. Word 95 will at least admit that it’s incapable of opening the file instead of breaking.

        I know you’ll probably just reply with TeX fanboy ramblings, since I’m just feeding a troll, but if you’re actually being serious, you should really try to think about what you post before you post it.

    • TexMan permalink
      March 23, 2011 3:01 pm

      Thanks for the replies. I’ve been looking at the most recent version of LyX. It looks like they are addressing some of the limitations — spell check on the fly and document comparison. Some more collaborative tools and grammar check could be things they should look at in the future.

    • bob permalink
      April 3, 2011 3:20 pm

      This is actually a reply to Neil Dickson. Are you seriously telling us that Word, apparently marketed by your former employer, is a better platform for writing mathematics than TeX?

      As far as usability, my experience using versions of Word for decades now is that it breaks constantly. It generates corrupted files at the drop of a hat. It is not usable for writing science that includes much more than text.

      And frankly, your projection and name-calling reflect poorly on any argument you are trying to advance.

  21. March 13, 2011 7:08 pm

    Actually, there is a BNF for TeX. See page 268 and following of The TeXbook (my copy is the 6th printing). It may not amount to a complete formal description of TeX; TeX is, after all, a macro language. On the other hand, if you include the preprocessor as part of the language, as the standards documents do, you also cannot write a 100% complete formal description of the C language in BNF.

    • rjlipton permalink*
      March 14, 2011 11:21 am

      Jeffrey

      Thanks for the pointer. I was talking about the beginning, for sure; TeX has changed over the years, of course.

  22. March 14, 2011 6:31 am

    My proofs on “P vs. NP” are located at: http://leibnizengine.wordpress.com/2011/02/23/p-np/

    1. If you are curious about the mystery of the Universe with Space and Time, it is written for you, please unlock it now !!!

    2. If you cannot appreciate the proofs, you may not solve the problems now. Do not worry, keep your wonders to solve the problems !!!

  23. March 19, 2011 1:58 pm

    I use ConTeXt, and not LaTeX or TeX. My footnoted-clickable-link macro is:

    % Define \Site[id][url][description] to put in a clickable link with footnoted URL
    \def\Site[#1][#2][#3]{\useURL[#1][#2][][#3]\from[#1]\footnote[#1]{\type{#2}}}
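
    Used, for example, like this (the label “gll” is just an identifier I made up):

    \Site[gll][http://rjlipton.wordpress.com/][this blog]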

    I think a lot of things are substantially easier in ConTeXt than in LaTeX or Plain TeX, though the lack of documentation means that it can be an adventure finding that out.

  24. Anonymous permalink
    March 27, 2011 10:42 pm

    I do not use LaTeX, I prefer Plain TeX.

    Plain TeX is much better.

  25. April 4, 2011 8:31 pm

    I use TeX only under duress, because I know the Old Testament, troff, from forty years’ experience. I have found the New Testament, while modestly more capable, not to be qualitatively different enough to justify investing the time to become really fluent in it. Dick Lipton has exposed a plausible reason. TeX, like troff, “just growed”, albeit in master hands. There is no comprehensive theory behind it–no solid mental model that one can internalize. One must learn it haphazardly, as one does a “natural” language.
    The lack of a comprehensive model may be inescapable, because typesetting involves a large component of taste. Knuth himself cited taste as a cause for detouring into typography from his calling as encyclopedist. On the way he found some useful submodels, but no unifying theory.

