Renaissance of MacMahon's Partition Analysis: Symbolic Summation at RISC-Linz
by Peter Paule
The Combinatorics Group at the Research Institute for Symbolic Computation (RISC, Kepler University Linz, Austria) has specialised in developing computer algebra algorithms that simplify sum expressions arising in combinatorics and related fields. One of its recent achievements is the revitalisation of a hundred-year-old method of MacMahon's.
In his famous book Combinatory Analysis (1916), MacMahon introduced partition analysis as a method in additive number theory for solving problems whose constraints take the form of linear diophantine (ie, integer-coefficient) inequalities or equations. For several decades this method remained dormant. In recent research work carried out jointly with G.E. Andrews (Penn State, USA), who also initiated the project, we developed a new algorithmic approach to MacMahon's method and implemented it in the form of the Mathematica package Omega. Omega can be used in various ways: for finding all non-negative integer solutions to a given (homogeneous) linear diophantine system of inequalities or equations; for deriving closed form representations of multivariate generating functions with linear diophantine constraints on the summation parameters; or as a heuristic tool for combinatorial studies.
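A toy instance of what partition analysis computes can be checked directly. The sketch below (written with sympy as a stand-in; the Omega package itself runs under Mathematica) verifies the simplest such identity: eliminating the single constraint a >= b >= 0 yields the closed form generating function sum_{a >= b >= 0} x^a y^b = 1/((1-x)(1-xy)).

```python
# Minimal sketch (not the Omega package itself): for the single constraint
# a >= b >= 0, partition analysis yields the closed-form generating function
#     sum_{a >= b >= 0} x^a y^b = 1 / ((1 - x)(1 - x*y)).
# We compare both sides as power series truncated at x-degree N.
from sympy import symbols, expand

x, y = symbols('x y')
N = 8  # truncation order in the x-degree

# Left side: brute-force enumeration of the constrained lattice points.
lhs = sum(x**a * y**b for a in range(N + 1) for b in range(a + 1))

# Right side: Taylor-expand the closed form in x (y only enters via x*y).
rhs = (1 / ((1 - x) * (1 - x * y))).series(x, 0, N + 1).removeO()

assert expand(lhs - rhs) == 0
print("generating function verified up to x-degree", N)
```

The same elimination principle, applied repeatedly, handles systems of many constraints; that is where the algorithmic content of Omega lies.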
Example (MacMahon's solid partitions on the cube): Consider all partitions of a given integer into eight non-negative integer parts a1 up to a8, where ai must be greater than or equal to aj whenever an edge is directed from ai to aj on the corresponding cube. How Omega is used to compute the generating function is shown in Figure 1. The computation takes only a few seconds. In MacMahon's time the situation was quite different: he cleverly split the problem into eighteen (!) subproblems which add up to the desired result. We cite from MacMahon's paper: "Mr. A.B. KEMPE, Treas. R. S., has verified this conclusion by a different and most ingenious method of summation, which also readily yields the result for any desired restriction on the part-magnitude." For Omega this refined problem, too, is a routine exercise!
The Omega project started some four years ago as a new branch of our long-standing and ongoing project on symbolic summation. Historically, the starting point of modern symbolic summation is Gosper's algorithm (1978), which constitutes a decision procedure for indefinite hypergeometric summation. A summand term f(k) is called hypergeometric if f(k+1) / f(k) equals a fixed rational function r(k) for all k. The algorithm computes a hypergeometric term g(k) such that g(k+1) - g(k) = f(k) for all k; if no hypergeometric solution exists, the algorithm detects this. So in the positive case the sum f(0)+f(1)+...+f(n) has the closed form representation g(n+1) - g(0).
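To make this concrete: an implementation of Gosper's algorithm is available in sympy (used here as an assumption-free stand-in for the Mathematica environment the RISC packages target). The sketch sums f(k) = k·k!, which is hypergeometric since f(k+1)/f(k) = (k+1)²/k, and also shows the algorithm certifying that no hypergeometric antidifference exists for f(k) = 1/k.

```python
# Illustration of Gosper's algorithm via its sympy implementation
# (a sketch; the RISC packages themselves run under Mathematica).
from sympy import symbols, factorial
from sympy.concrete.gosper import gosper_sum

k, n = symbols('k n', integer=True, nonnegative=True)

# f(k) = k * k! is hypergeometric: f(k+1)/f(k) = (k+1)**2 / k.
# Gosper's algorithm finds the closed form of sum_{k=0}^{n} f(k),
# which equals (n+1)! - 1.
closed = gosper_sum(k * factorial(k), (k, 0, n))
assert closed.subs(n, 6) == sum(j * factorial(j) for j in range(7))  # 7! - 1

# For f(k) = 1/k no hypergeometric g(k) with g(k+1) - g(k) = f(k) exists
# (the harmonic numbers are not hypergeometric); gosper_sum certifies
# this negative answer by returning None.
assert gosper_sum(1/k, (k, 1, n)) is None
```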
Despite being a first breakthrough, its applicability was long considered quite restricted, since most hypergeometric summations arising in practice are definite. Informally, definite sums admit closed form representations only for a particular choice of the summation bounds, as, for instance, in the binomial theorem.
Zeilberger's creative telescoping (1990) removed this limitation (see Figure 3). He observed that only a slight variation of Gosper's method is needed to turn it into an algorithm which computes recurrences as highly useful representations of definite hypergeometric sums. For details we recommend the book A=B by Petkovsek, Wilf and Zeilberger.
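A hand-checkable instance of the principle (our illustration, not the RISC implementation): for F(n,k) = C(n,k), creative telescoping produces the recurrence F(n+1,k) - 2F(n,k) = G(n,k+1) - G(n,k) with the so-called certificate G(n,k) = -C(n,k-1). Summing over all k makes the right side telescope to zero, so S(n) = sum_k F(n,k) satisfies S(n+1) = 2S(n), hence S(n) = 2^n. Both the certificate identity and the resulting evaluation can be verified mechanically:

```python
# Verify a creative-telescoping certificate for F(n,k) = binomial(n,k):
#     F(n+1,k) - 2*F(n,k) = G(n,k+1) - G(n,k),  G(n,k) = -binomial(n,k-1).
# Summed over all k, the right side telescopes to 0, giving the recurrence
# S(n+1) = 2*S(n) for S(n) = sum_k F(n,k), hence S(n) = 2**n.
from sympy import binomial

def F(n, k):
    return binomial(n, k)

def G(n, k):
    return -binomial(n, k - 1)   # binomial(n, -1) = 0 in sympy

for n in range(8):
    for k in range(n + 3):       # also check the vanishing boundary terms
        assert F(n + 1, k) - 2 * F(n, k) == G(n, k + 1) - G(n, k)
    # the telescoped recurrence pins down the closed form:
    assert sum(F(n, k) for k in range(n + 1)) == 2**n
```

The point of the method is that the recurrence and its certificate are found automatically; checking them, as above, is then purely routine.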
Due to the success of Zeilberger's breakthrough, symbolic summation has developed into a well-recognised discipline of computer algebra. The primary interest of the RISC Combinatorics Group is to contribute to this field by designing new algorithms and by providing prototype implementations in the form of computer algebra packages. Obviously the applicability of these tools is not restricted to purely combinatorial investigations. The problem of simplifying complicated sum expressions also arises in other mathematical areas, for example, in number theory (see Figure 2), in special functions, in the analysis of algorithms, or in statistical mechanics.
Other applications for such procedures concern theorem proving as well as automatic validation of mathematical tables. There is a strong interaction with the THEOREMA group at RISC; see Buchbergers article in this issue. Automatic validation plays an important role in projects like the NIST Digital Library of Mathematical Functions for which Paule serves as editor; see http://dlmf.nist.gov/.
We note that the software developed by our group has already been used to generate computer proofs of identities which were conjectured by (human) researchers, but for which no human proof had been found before; eg, see Figure 2 with Ahlgren's identity.
A significant part of the achievements of the RISC Combinatorics Group concerns various extensions and generalisations of Zeilberger's theory. For instance, with Riese's package qZeil for q-hypergeometric sums one can generate automatic proofs of formulae which in the limit yield the celebrated Rogers-Ramanujan identities.
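For readers unfamiliar with these identities: the first Rogers-Ramanujan identity states that sum_{n>=0} q^(n^2)/((1-q)(1-q^2)...(1-q^n)) equals prod_{n>=0} 1/((1-q^(5n+1))(1-q^(5n+4))). The following sketch is a plain-Python coefficient comparison of truncated formal power series, a sanity check rather than a qZeil proof:

```python
# Compare both sides of the first Rogers-Ramanujan identity as formal
# power series in q, truncated at order M (a sanity check, not a proof).
M = 40

def mul(a, b):
    # product of two truncated power series (coefficient lists of length M)
    c = [0] * M
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b[:M - i]):
                c[i + j] += ai * bj
    return c

def geom(e):
    # truncated series of 1/(1 - q**e) = 1 + q**e + q**(2*e) + ...
    s = [0] * M
    for i in range(0, M, e):
        s[i] = 1
    return s

# Left side: sum_{n >= 0} q**(n**2) / ((1-q)(1-q**2)...(1-q**n))
lhs = [0] * M
n = 0
while n * n < M:
    term = [0] * M
    term[n * n] = 1
    for j in range(1, n + 1):
        term = mul(term, geom(j))
    lhs = [a + b for a, b in zip(lhs, term)]
    n += 1

# Right side: prod over parts congruent to 1 or 4 modulo 5
rhs = [1] + [0] * (M - 1)
for e in range(1, M):
    if e % 5 in (1, 4):
        rhs = mul(rhs, geom(e))

assert lhs == rhs
print("Rogers-Ramanujan identity verified up to q**%d" % (M - 1))
```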
Another important objective is to design algorithms for multiple (q-)hypergeometric sums. Based on Sister Celine's method, described also in Rainville's book Special Functions, Wilf and Zeilberger established a corresponding algorithmic proof theory. To transform their setting into efficient procedures, we combined it with ideas of Verbaeten and with techniques from modular arithmetic. This makes it possible to find closed form evaluations of two- and three-fold sums over (q-)hypergeometric summands which until now had been considered algorithmically infeasible.
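To convey the flavour of such closed form evaluations, here is a textbook two-fold hypergeometric sum (an easy illustration of ours, not one of the hard instances meant above): sum_k sum_j C(n,k) C(k,j) = 3^n, since the inner sum gives 2^k and the outer sum then gives (1+2)^n.

```python
# A simple two-fold hypergeometric sum with a closed form evaluation:
#     sum_{k=0}^{n} sum_{j=0}^{k} C(n,k) * C(k,j) = 3**n,
# because the inner sum equals 2**k and the outer sum is then (1 + 2)**n.
from math import comb

for n in range(12):
    double_sum = sum(comb(n, k) * comb(k, j)
                     for k in range(n + 1) for j in range(k + 1))
    assert double_sum == 3**n
print("double-sum identity checked for n = 0..11")
```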
An important step toward extending summation theory from hypergeometric input to summands of a more general type (eg, including harmonic numbers) has been made by Schneider. In his PhD work he generalised Karr's approach, which can be seen as a difference field analogue of Risch integration. Schneider was the first to notice that Karr's summation algorithm from 1980 makes implicit use of the creative telescoping principle. This observation, in combination with other new ideas, enabled him to design powerful algorithms for simplifying indefinite and definite nested sums and for solving linear recurrences of arbitrary order in certain difference field extensions. His PhD thesis (2001) presents an impressive list of nontrivial applications, including nested sums originating from a vicious walker problem in statistical mechanics.
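An example of the kind of identity such algorithms discover automatically (verified below by exact rational arithmetic, not with Schneider's difference field machinery): the nested sum sum_{k=1}^n H_k simplifies to (n+1)H_n - n, where H_k = 1 + 1/2 + ... + 1/k is the k-th harmonic number.

```python
# Check the nested-sum simplification
#     sum_{k=1}^{n} H_k = (n + 1) * H_n - n
# exactly, where H_k is the k-th harmonic number.
from fractions import Fraction

def H(m):
    # exact k-th harmonic number as a rational
    return sum(Fraction(1, j) for j in range(1, m + 1))

for n in range(1, 25):
    assert sum(H(k) for k in range(1, n + 1)) == (n + 1) * H(n) - n
print("nested-sum identity checked for n = 1..24")
```

Note that the summand H_k is not hypergeometric, so this example lies outside the scope of the Gosper-Zeilberger setting; it is exactly the kind of input the difference field approach handles.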