New material for Volume 4 will first appear in beta-test form as fascicles of approximately 128 pages each, issued approximately twice per year. These fascicles will represent my best attempt to write a comprehensive account; but computer science has grown to the point where I cannot hope to be an authority on all the material covered in these books. Therefore I'll need feedback from readers in order to prepare the official volumes later.
And after Volumes 1--5 are done, God willing, I plan to publish Volume 6 (the theory of context-free languages) and Volume 7 (Compiler Techniques), but only if the things I want to say about those topics are still relevant and still haven't been said. Volumes 1--5 represent the central core of computer programming for sequential machines; the subjects of Volumes 6 and 7 are important but more specialized.
Whatever your background, if you need to do any serious computer programming, you will find your own good reason to make each volume in this series a readily accessible part of your scholarly or professional library.
It was in 1962, sixty years ago, that Addison-Wesley suggested that Knuth write a book on compilers. He initially planned the book as a twelve-chapter work and, in typical academic fashion, the first three chapters would be based on notes for a course. However, once he embarked on writing, he decided on a more ambitious work: one that would comprehensively codify the theories, methods, and algorithms of computer programming, an almost virgin subject at the time.
The next volume of what was then expected to be a seven-volume set, Seminumerical Algorithms, was published the following year, consisting of a further two chapters, on random numbers and on arithmetic. Two more chapters were combined to create the third volume, Sorting and Searching, which appeared in 1973 and tended to attract more attention, perhaps because of the practical importance of the topic. These first three volumes virtually created the subject of computer science overnight and led to Knuth being described as the Euclid of computer science, which gives you some idea of just how important the work is. Volumes 1-3 were first combined into a slipcase in 1988.
It's really great reading if you do low-level programming (think C or assembler), care about efficient programming, or work close to the hardware (such as microprocessors). It describes the very low level of a program and a computer.
If you're into a higher level of programming (Java, C#, Python, etc.), it is probably going to confuse you unless you're building libraries for it, since most of the 'hard stuff' (double precision, floating point, sorting and searching through lists ...) is abstracted away. Obviously 'someone' has to know how it works in the end, someone has to write the compilers; I haven't started on the rest of the volumes because that's not "me".
I use it from time to time, mainly as a reference book, most recently this spring, when I needed a reference on a data structure (circular linked lists) for a paper. I've found it useful often when doing professional computer programming and hardware design (for instance, where the hardware has to support some software algorithm efficiently, or efficient algorithms in driver software allow hardware simplification).
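The circular linked list mentioned above is simple enough to sketch in a few lines of C. This is my own minimal illustration (keeping a pointer to the tail so both ends are reachable in constant time), not code from the book:

```c
#include <stdlib.h>

/* A circular singly linked list: the last node points back to the
   first, so the whole list is reachable from any node. We keep a
   pointer to the TAIL; tail->next is then the head, which makes
   both front insertion and head access O(1). */
struct node {
    int value;
    struct node *next;
};

/* Insert at the front; returns the (possibly new) tail. */
struct node *push_front(struct node *tail, int value) {
    struct node *n = malloc(sizeof *n);
    n->value = value;
    if (tail == NULL) {
        n->next = n;          /* a single node points to itself */
        return n;
    }
    n->next = tail->next;     /* new node becomes the head */
    tail->next = n;
    return tail;
}

/* Sum all values; the traversal stops after exactly one lap. */
int sum_all(struct node *tail) {
    if (tail == NULL) return 0;
    int total = 0;
    struct node *p = tail->next;   /* start at the head */
    do {
        total += p->value;
        p = p->next;
    } while (p != tail->next);
    return total;
}
```

The `do`/`while` loop is the characteristic idiom: since there is no NULL terminator, you traverse until you come back to where you started.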
If you're working for Oracle and coding the Oracle database and are looking for an algorithm to squeeze a bit more performance out of the engine, go ahead and buy the book; you might find something in there. But most programmers are using sets and dictionaries in a programming language that already has a decent implementation of these algorithms, and they won't be helped by some technique that might squeeze a few more cycles out of the computer but that nobody will reward them for. The Knuth book is for high fliers.
For his major contributions to the analysis of algorithms and the design of programming languages, and in particular for his contributions to the "art of computer programming" through his well-known books in a continuous series by this title.
As the topics covered by volumes 1-3 developed, Knuth revised them. He was deeply disappointed when, in 1973, he saw the typesetting from Addison-Wesley for the second edition of volume 2. By then the publishing industry had replaced traditional mechanical typesetting technology with computerized typesetting that did not reproduce the high quality of the original printings of volumes 1-3.
Consequently, in 1977 Knuth began developing a new typesetting system to enable high quality computerized typesetting, in particular for TAOCP. This system was announced in his 1978 American Mathematical Society Gibbs Lecture entitled \"Mathematical Typography\" and published in the Bulletin (New Series) of the American Mathematical Society, volume 1, 1979, pp. 337-372. Knuth had two goals for his system:
Knuth considers TAOCP his masterwork, and in 1993 he retired early to spend more time writing additional volumes. He had produced revised editions of volumes 1-3 in 1978-1979. He designed a new hypothetical computer, MMIX, to replace the MIX computer of volumes 1-3 for the analysis of algorithms; this new computer was described in Knuth's 1999 book devoted to the topic.
The first TAOCP book is the most foundational, as the title suggests, and if you only have it in your budget to buy one of these books, it's the one I'd recommend. The first hundred-or-so pages give a crash course on the mathematical concepts and tools that Knuth employs most. The next hundred pages are a description of the imaginary computer that Knuth uses in all his code examples, as well as some basic programming techniques. This brings us to one of the biggest complaints about this series of textbooks: all of the code examples are written in assembly! But I had a lot of fun learning how Knuth's strange imaginary computer worked and writing a bunch of programs in MIX assembly. The highlight of this section is a MIX simulator, written in MIX itself! The fact that this simulator only requires about 300 lines of assembly really demonstrates how bare-bones this computer is. But of course, the goal of these books isn't to teach the reader how to program in assembly. The hope is that seeing the algorithms implemented at such a low level will make programming them in any higher-level language a breeze. And of course, every program is accompanied by a detailed description of the algorithm in the highest-level programming language of all: plain old English! This is the format of the second half of the book, which introduces linked lists and trees, and uses these concepts to describe methods for automatic garbage collection and dynamic storage allocation in a low-level computer.
This is the TAOCP book I see referenced/recommended the least, and admittedly it is the one I have found least useful as a reference so far. But, for me, it was the most enjoyable read of them all. The first two hundred pages deal with methods for generating random numbers and checking whether a sequence is sufficiently random. At times, Knuth gets quite philosophical, and the main takeaway of this chapter is that there isn't a totally well-defined concept of what a random sequence should be, at least as far as computers are concerned. The second chapter of this volume deals with arithmetic, starting with methods for representing floating-point numbers in memory and then diving into algorithms for adding, multiplying, factoring, etc. A detailed analysis of every single algorithm is given, and Knuth doesn't just restrict himself to real numbers; this chapter generalises many of these operations to modular arithmetic, polynomials, matrices, and power series as well. All in all, this volume concerns the concepts that modern programmers take for granted on a day-to-day basis: the ability of a computer to perform basic arithmetic operations quickly.
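As a taste of the material, here is a minimal C sketch of a linear congruential generator, the family of random-number recurrences that the first chapter of this volume analyzes at length. The particular multiplier and increment below are the well-known Numerical Recipes constants, used here purely for illustration; choosing such constants well is precisely what Knuth's analysis is about:

```c
#include <stdint.h>

/* Linear congruential generator: x_{n+1} = (a*x_n + c) mod 2^32.
   The modulus is implicit in unsigned 32-bit overflow. These a, c
   values are the Numerical Recipes constants, not a recommendation
   from the book itself. */
static uint32_t lcg_state = 1u;   /* the seed */

uint32_t lcg_next(void) {
    lcg_state = 1664525u * lcg_state + 1013904223u;
    return lcg_state;
}
```

Generators like this are fast and simple but have well-understood weaknesses (e.g., the low-order bits have short periods), which is exactly the kind of defect the chapter's statistical tests are designed to expose.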
Because of the emphasis placed on sorting and searching algorithms in the undergraduate curriculum, this volume is the most likely to show up on a computer science class's "recommended reading" list. True to its title, Volume 3 is split almost evenly between sorting, in the first half, and searching in the second. All of the bread-and-butter algorithms and data structures are covered, from quicksort to binary heaps to tries, which makes this a very important volume, but also perhaps a bit boring, since any computer science major will have seen many of these concepts in undergraduate classes (though certainly in only a fraction of the detail that Knuth devotes to them). There is a somewhat large section, complete with a wonderful foldout, devoted to the ins and outs of merging large files stored on huge physical rolls of magnetic tape. I found this section quite entertaining because of the novelty of learning about how people used to store data, but most of it is quite useless to the modern reader. I believe that Knuth intends to scrap it and replace it with something else in the next edition.
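To illustrate the kind of bread-and-butter material covered, here is a plain textbook quicksort in C (last-element pivot, Lomuto-style partition). This sketch is mine, not the book's; Knuth's treatment goes much further, analyzing refinements such as median-of-three pivot selection and cutoffs to insertion sort:

```c
/* In-place quicksort over a[lo..hi], inclusive bounds.
   Partition: everything < pivot moves to the front, then the
   pivot is swapped into its final position at index i. */
static void swap_ints(int *a, int *b) { int t = *a; *a = *b; *b = t; }

void quicksort(int *a, int lo, int hi) {
    if (lo >= hi) return;             /* 0 or 1 elements: sorted */
    int pivot = a[hi], i = lo;
    for (int j = lo; j < hi; j++)
        if (a[j] < pivot)
            swap_ints(&a[i++], &a[j]);
    swap_ints(&a[i], &a[hi]);         /* pivot into final position */
    quicksort(a, lo, i - 1);
    quicksort(a, i + 1, hi);
}
```

The last-element pivot makes the code short but degrades to quadratic time on already-sorted input, which is one reason the pivot-selection refinements get so much attention in the book.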
The talk inspired me to read The Art of Computer Programming, Volume 4A: Combinatorial Algorithms. As he described in the talk, he gives a reward check (for 1 hexadecimal dollar) to anyone who finds an error in one of his books, so I set myself the goal of finding an error in the book while on vacation. After a lot of study, I thought I'd found an error in a diagram, but then I realized I was confused. Next, I found a blatant error in an appendix, but that error had already been discovered and was listed in the errata. Finally I found an error on page 574 that I hoped hadn't been found yet and sent it off to Professor Knuth.

I was delighted to receive my reward check today, which Wikipedia calls "among computerdom's most prized trophies". That's a big exaggeration, but nonetheless I'm happy to get one. Note that the check is from the fictional Bank of San Seriffe to avoid check-fraud problems from all the images of his checks on the web.

As for the book itself, it's interesting even if you're not after a reward, although very challenging to read. Volume 4A describes combinatorial algorithms, including more ways of computing permutations and combinations than you can shake a stick at. It also has an extensive discussion of BDDs and ZDDs, which are newish data structures for representing sets. The section on bitwise tricks and techniques is interesting if you like HAKMEM-style tricks such as reversing the bits in an integer through a short sequence of incomprehensible operations.

I have to admit that trying to find an error in a book is a strange vacation goal, but I'm glad I succeeded, and Knuth's book taught me some interesting algorithms in the process.

P.S. I was very surprised to see this article on the front page of Hacker News. To answer the questions there and below, the error I found was in Volume 4A, page 574 (as the memo line on the check shows).
The solution to exercise 67 on that page says a particular circuit uses 6 ANDN gates, which I thought should be NAND gates. It gets a bit more complicated because Knuth was referring to the ANDN op code for MMIX, but there was still a mistake with and-not versus not-and. (The other error I noticed was n choose k on page 824, but I checked the errata and it had already been found.)
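The bit-reversal trick alluded to above really is a short sequence of fairly incomprehensible operations. Here is a standard C formulation for 32-bit words (a common version of the trick, not a quote from the book):

```c
#include <stdint.h>

/* Reverse the bits of a 32-bit word by swapping progressively
   larger groups: adjacent bits, then pairs, nibbles, bytes, and
   finally the two halfwords. Five steps total, no loops. */
uint32_t reverse_bits(uint32_t x) {
    x = ((x & 0x55555555u) << 1) | ((x >> 1) & 0x55555555u);
    x = ((x & 0x33333333u) << 2) | ((x >> 2) & 0x33333333u);
    x = ((x & 0x0F0F0F0Fu) << 4) | ((x >> 4) & 0x0F0F0F0Fu);
    x = ((x & 0x00FF00FFu) << 8) | ((x >> 8) & 0x00FF00FFu);
    x = (x << 16) | (x >> 16);
    return x;
}
```

Each mask isolates alternating groups of bits so that a shift-and-OR swaps them; doubling the group size each step reverses the whole word in O(log n) operations instead of the obvious 32-iteration loop.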