A New Testament for Special Functions?
By Barry A. Cipra
SIAM News, March 8, 1998
In contemplation of created things
The improbable intricacies of nature are the basis for a well-worn and, for many, compelling argument for the existence of a divine, if sometimes sadistic, creator. Could additional evidence be cited in the extraordinary mathematics of special functions?
So far, theologians have steered clear of mathematical alphas and omegas (not to mention the gammas and zetas), but that may be only for lack of access to information about the remarkable properties of special functions. Some upcoming and ongoing projects for spreading the word about special functions were outlined last summer at the SIAM 45th Anniversary Meeting, in the minisymposium "Handbooks for Special Functions and the World Wide Web."
The minisymposium was organized by Richard Askey of the University of Wisconsin and Willard Miller, Jr., who recently became director of the Institute for Mathematics and Its Applications at the University of Minnesota. Participants discussed the need for updated handbooks that reflect the field's progress in recent decades. One of the central themes was the question of how to utilize computer technology in creating future compendia of special functions for the myriad mathematicians, scientists, and engineers whose work they constantly bedevil.
Beyond Sin (and Cos)
One characteristic of these functions is a tendency to appear in a variety of mathematical and physical circumstances. Make no mistake about it: Special functions are ubiquitous. Bessel functions, for example, crop up just about any time there's a differential equation in a cylindrically symmetric domain, such as an optical fiber. (You'll also come across them when taking the Fourier transform of radially symmetric functions in R^n.) The Hungarian mathematician Paul Turan, Askey says, considered "special functions" a misnomer: They should be called useful functions.
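As an illustration (not from the article), two standard characterizations of the Bessel function J_0 can be played off against each other in a few lines of stdlib Python: its power series, and the integral representation J_0(x) = (1/pi) * integral from 0 to pi of cos(x sin t) dt, approximated here by the midpoint rule. The function names are made up for this sketch.

```python
import math

def bessel_j0_series(x, terms=40):
    """J_0(x) via its power series: sum over m >= 0 of (-1)^m (x/2)^(2m) / (m!)^2."""
    total = 0.0
    for m in range(terms):
        total += (-1) ** m * (x / 2) ** (2 * m) / math.factorial(m) ** 2
    return total

def bessel_j0_integral(x, n=10000):
    """J_0(x) via (1/pi) * integral_0^pi cos(x sin t) dt, midpoint rule with n panels."""
    h = math.pi / n
    return sum(math.cos(x * math.sin((k + 0.5) * h)) for k in range(n)) * h / math.pi

# The two routes agree to many digits; 2.4048... is near the first zero of J_0.
print(bessel_j0_series(2.4048), bessel_j0_integral(2.4048))
```

That two such different-looking formulas compute the same function is exactly the sort of identity-richness the next paragraph describes.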
The ubiquity of special functions is due in part to the vast number of mathematical identities they satisfy. Each identity is in effect a kind of symmetry, and functions are often characterized by their symmetries. Modern physics especially has become a study of symmetry (and symmetry breaking). Likewise, probability theorists frequently draw conclusions based on the equal likelihood of events. And combinatorialists love to count things in orderly arrangements.
All this leads to the need for a detailed description of individual special functions, by means of formulas, tables, and graphs, and also for a list of as many identities as can possibly be found. That's where handbooks come in.
For the last half century, those needs have been met by two publications: the so-called Bateman Project, which appeared in the 1950s, and the Handbook of Mathematical Functions, first published in 1964 by the National Bureau of Standards (renamed the National Institute of Standards and Technology, or NIST, in 1988). The Bateman Project is named after Harry Bateman, a Caltech mathematician who died in 1946, leaving behind a vast store of information on special functions. The task of bringing his manuscript to completion was headed by Arthur Erdelyi. The project led to the three-volume Higher Transcendental Functions and the two-volume Tables of Integral Transforms.
"The Bateman Project was a tremendous success," Miller says. "It trained a generation of mathematicians who did research in special functions. The absolute top people in the field were involved, and they made some very good choices about what material to include."
The NBS Handbook was also highly successful. Edited by Milton Abramowitz and Irene Stegun, the Handbook contains 29 chapters ranging from "Mathematical Constants" (the book's first entry is the square root of 2 to 20 digits) to "Laplace Transforms." Two of the chapters are the work of an author well known to readers of SIAM News: Philip J. Davis, who began his career at NBS.
Daniel Lozier, a mathematician at NIST, attributes the success of the Handbook to its "comprehensiveness, authoritativeness, timeliness, and applicability." The Handbook "appeared at just about the time electronic computers were beginning to make their influence strongly felt in applied mathematics," Lozier says. "Thus, it served to sum up the state of a highly developed prior mathematical technology, namely mathematical tables and interpolation from them, at a time of transition to a new technology."
Four decades of technology later, it's time for an update.
But now at last the sacred influence
"The Bateman Project badly needs to be redone," says Askey. Fifty years of progress since Bateman's original compilation have swelled the ranks of special functions and given researchers new insights into the structure of the field. In particular, a class of functions known as q-series has taken center stage. In addition, special functions specialists feel they've finally found the "right" way to generalize orthogonal polynomials to several variables. Neither topic appears in the Bateman Project.
The theory of multivariate orthogonal polynomials might seem to be a simple exercise in adding y's and z's to formulas that start out with x's alone. But it ain't that easy. The problem, oddly enough, is that it's too easy to generalize the one-variable functions, in too many different ways, most of which turn out to be mathematical cul-de-sacs.
"If we go back to the 19th century, people tried to define special functions in several variables by using the wisdom they had in one variable," explains Mourad Ismail of the University of South Florida. "It turned out that was not good enough." What was needed, he says, was the development of other areas of mathematics that would motivate the generalizations. Those areas now exist, in the form of quantum groups, root systems, and analysis on Grassmannians, among other new subjects in pure and applied mathematics.
"We have the advantage of 50 years of accumulated wisdom," says Ismail. As for q-series, Askey says, the idea can be traced back to Euler in the 18th century and Gauss in the 19th, although it is only recently that researchers have realized how extensive the theory is. A q-series can be thought of as coming from a noncommutative version of the binomial theorem: (x + y)^4, for example, instead of being expanded simply as x^4 + 4x^3y + 6x^2y^2 + 4xy^3 + y^4, is expanded first as xxxx + (xxxy + xxyx + xyxx + yxxx) + . . . , after which the rule yx = qxy is used to obtain x^4 + (q^3 + q^2 + q + 1)x^3y + (q^4 + q^3 + 2q^2 + q + 1)x^2y^2 + (q^3 + q^2 + q + 1)xy^3 + y^4. The "q-polynomials" that arise, along with their infinite cousins, the q-hypergeometric series, turn out to have applications in fields ranging from coding theory to quantum physics.
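The coefficients in the expansion quoted above can be verified mechanically. The sketch below (an illustration, not part of the article; `q_binomial` is a name invented here) identifies each word in the raw expansion of (x + y)^n by the positions of its y's and counts how many yx-to-qxy swaps are needed to normal-order it; collecting the resulting powers of q recovers the polynomial coefficients in the text, known as Gaussian binomial coefficients.

```python
from itertools import combinations
from collections import Counter

def q_binomial(n, k):
    """Coefficient of x^(n-k) y^k in (x + y)^n under the rule yx = q*xy,
    returned as a dict mapping powers of q to integer coefficients.
    Normal-ordering a word costs one factor of q per (y, x) pair
    in which the y stands to the left of the x."""
    coeff = Counter()
    for ys in combinations(range(n), k):  # positions of the y's in the word
        # for the i-th y at position p: x's to its right = all positions to
        # its right, minus the remaining y's
        inv = sum((n - 1 - p) - (k - 1 - i) for i, p in enumerate(ys))
        coeff[inv] += 1
    return dict(coeff)

# The x^3 y and x^2 y^2 coefficients of (x + y)^4 from the article:
print(sorted(q_binomial(4, 1).items()))  # [(0, 1), (1, 1), (2, 1), (3, 1)]
print(sorted(q_binomial(4, 2).items()))  # [(0, 1), (1, 1), (2, 2), (3, 1), (4, 1)]
```

The first line is 1 + q + q^2 + q^3 and the second is 1 + q + 2q^2 + q^3 + q^4, matching the expansion in the text; setting q = 1 recovers the ordinary binomial coefficients 4 and 6.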
Ismail and Walter van Assche at the Katholieke Universiteit Leuven in Belgium are heading an international effort to bring the Bateman Project up to date. The new version is called the Askey-Bateman project, in honor of Askey's contributions to the field, Ismail says. It is being developed in cooperation with the SIAM Activity Group on Special Functions and Orthogonal Polynomials, he adds.
Ismail envisions the production of seven or eight volumes over the next decade. He and Askey, along with Roelof Koekoek at the Delft University of Technology and René Swarttouw at the Free University of Amsterdam, are working on the first volume, on orthogonal polynomials. Work is also under way on a second volume, on special functions in number theory and combinatorics.
A Revised Standard Version
Electronic handbooks make a lot of sense, says Miller. "You should be able, given the definition of a function, to graph it almost immediately or compute with it almost immediately. There's no reason not to have that functionality."

Lozier sees a two-stage process for the Digital Library of Mathematical Functions, NIST's planned Web-based successor to the Abramowitz and Stegun Handbook. The first stage is essentially to mimic the existing handbook in electronic form, with the added advantages of interactive computer graphics, symbolic manipulation, and links to algorithms and software for numerical computations. The second, more problematic stage will be to develop a standard reference library of mathematical software "capable of computing function values to at least quadruple precision over very large ranges of the input arguments and parameters," says Lozier. "But we require even more: that the accuracy of quadruple precision is reliable in the sense that all errors arising from truncation and rounding are monitored and kept under strict control."
That's easier said than done. "The obstacles to the construction of such a library are mathematical and computational," Lozier points out. "Algorithms with strict bounds on truncation and rounding errors are not generally available for special functions." But that's why the computers still need people: "These obstacles provide an opportunity for creative mathematicians and computer scientists," Lozier stresses.
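As a toy illustration of the kind of error control Lozier describes (this is not NIST's library; `exp_controlled` and its tolerances are invented for this sketch), one can compute exp(x) in Python's stdlib decimal arithmetic with an explicit truncation cutoff and guard digits against accumulated rounding:

```python
from decimal import Decimal, getcontext

def exp_controlled(x, digits=36):
    """exp(x) for |x| <= 1 by Taylor series in decimal arithmetic.

    Truncation control: stop once the next term, which bounds the size of
    the remaining tail for |x| <= 1, falls below the target tolerance.
    Rounding control: carry 10 guard digits throughout, then round once
    at the end. Illustrative only; a real reference library needs a much
    sharper error analysis and argument reduction for large |x|.
    """
    guard = 10
    getcontext().prec = digits + guard
    x = Decimal(x)
    assert abs(x) <= 1, "argument reduction is out of scope for this sketch"
    tol = Decimal(10) ** -(digits + guard // 2)
    term = Decimal(1)
    total = Decimal(1)
    n = 0
    while abs(term) > tol:
        n += 1
        term *= x / n  # next Taylor term x^n / n!
        total += term
    getcontext().prec = digits
    return +total  # unary plus rounds the result to the final precision

print(exp_controlled("1"))  # e to 36 significant digits
```

The point is not the particular function but the bookkeeping: every digit delivered is backed by an explicit bound on what was thrown away, which is precisely what is "not generally available" for most special functions.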
The new handbooks will be a fitting start for the new millennium. There's little doubt, however, that more revisions, based on discoveries yet undreamt, will be called for down the line. And it's unlikely that the whole truth will ever be revealed. The theory of special functions could well be, as Milton put it,
An Edifice too large for [Man] to fill,