Algebra and Computation at SZTAKI
by András Benczúr, Gábor Ivanyos and Lajos Rónyai

There has always been a strong and productive connection between mathematics and computing. Mathematical ideas have turned out to be useful in the realm of computation, and, conversely, advances of a computational nature have led to mathematical discoveries. Algebra and computation play a major role in some research projects at SZTAKI's Informatics Laboratory.

Symbolic Computation with Algebras, Modules and Representations
The study of the computational complexity of the structure of associative algebras was initiated by researchers of our Laboratory in the eighties. Since then, several polynomial-time algorithms have been proposed for such problems. Our experience with associative algebras turned out to be eminently useful when, together with our colleagues at the Technische Universiteit Eindhoven, we developed efficient algorithms for computations with finite-dimensional Lie algebras. The methods have been implemented by Willem de Graaf as part of ELIAS, the Eindhoven LIe Algebra System. These procedures are now part of the standard libraries of the computational algebra systems GAP and MAGMA.

Interestingly, ideas that surfaced during the development of the Lie algebra algorithms proved fruitful for associative algebras as well. In particular, such constructions can be used to generate random elements of the radical (the maximal nilpotent ideal) of a finite-dimensional associative algebra. This enables us to avoid solving the huge systems of linear equations used in earlier methods. This technique, supplemented with methods of a graph-theoretic flavour, led to an almost optimal randomized procedure for computing the structure of a matrix algebra over a finite field. The idea of randomization appears to be new in the context of radical computation. The basic method is of Monte Carlo type. There is, however, a somewhat slower Las Vegas variant (it never gives incorrect results, but with small probability may report failure).

Based on very similar ideas, together with Klaus Lux, one of the developers of the C-MeatAxe package for computations with modular representations of finite groups, we found a simple extension of the constructive irreducibility test of the MeatAxe to those cases where the original method did not work efficiently. The extension has been implemented in the new release of C-MeatAxe. The interested reader is referred to the homepage of the MeatAxe: http://www.math.rwth-aachen.de/LDFM/homes/MTX/.

Theoretical Applications of Gröbner Bases
Gröbner bases are a standard tool of computer algebra; perhaps less well known is the fact that they are valuable in theoretical investigations as well. As early as the 1920s, Francis S. Macaulay used them in his treatment of ideals in polynomial rings. It was recognised more recently that they can also be applied to the study of finite (combinatorial) structures. Along this line, we considered the following setting. Let F be a set system over the n-element base set [n] (ie, F is a collection of subsets of [n]). Represent the elements X of F by their characteristic vectors: these are vectors of length n whose ith component equals 1 if i is in X, and 0 otherwise. One can view polynomials in n variables as functions on these vectors. The ideal I of all polynomials vanishing on the characteristic vectors of F carries a great deal of information about the set system F. For this reason, it is of interest to know the Gröbner bases and standard monomials of such ideals.
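To make this setting concrete, here is a minimal sketch (Python with SymPy; the tiny parameters and all names are our own illustrative choices, not code from the project) for the complete uniform family of all k-subsets of [n]. For this family the vanishing ideal is generated by the polynomials x_i^2 - x_i (each coordinate is 0 or 1) and x_1 + ... + x_n - k (exactly k coordinates equal 1), and a Gröbner basis of these generators decides ideal membership by reduction.

  from sympy import symbols, groebner, Mul

  n, k = 4, 2
  xs = symbols(f'x1:{n + 1}')             # x1, ..., x4

  # Characteristic vectors of k-subsets: 0/1 vectors with exactly k ones.
  gens = [x**2 - x for x in xs] + [sum(xs) - k]
  G = groebner(gens, *xs, order='lex')

  # A polynomial vanishes on every characteristic vector of F exactly
  # when it reduces to 0 modulo the Groebner basis of the vanishing ideal.
  p = Mul(*xs)                            # x1*x2*x3*x4 is 0 on every 2-subset of [4]
  print(G.contains(p))                    # True

The monomials not divisible by any leading monomial of G are the standard monomials mentioned above; they form a basis of the space of functions on the characteristic vectors.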
We found that the standard monomials are closely connected with the combinatorial concept of shattering and with higher incidence matrices. We have described the Gröbner bases and the standard monomials for the complete uniform families (when F is the set of all k-subsets of [n], for some fixed k). In this way we could extend results of Richard M. Wilson and Péter Frankl on inclusion matrices. In the future we hope to compute Gröbner bases and the attached information for more general families F. Prospective application areas are combinatorics and the theory of computation (circuit complexity in particular).

Singular Value Decomposition and the WWW
Good search engines are essential in helping us find our way across the ocean of Web resources. With some simplification, there are two major approaches to aiding Web navigation: one is based on the similarity of word frequencies across documents, the other on the hyperlink structure of the Web, the Web Graph. Apparently the major engines use these approaches more or less separately. For example, Google ranks its hits by using the hyperlink structure, Northern Light clusters the hit list by word frequencies, while ResearchIndex provides the results of the various available ranking methods separately. In a recently launched project we aim to combine word-frequency similarities with the hyperlink structure, and to tailor a unified method for documents written in Hungarian.

Surprisingly, algebraic techniques can be applied in both navigational aid methods. The Singular Value Decomposition (SVD), sometimes also called the Lánczos decomposition, has long been used by statisticians (in principal component analysis) to extract the essential information stored in matrices. It was realized only recently that, when applied to the matrix of word frequencies in documents, the truncated SVD drastically reduces the dimension of the space representing the documents, providing compression, noise filtering and a so-called Latent Semantic Indexing of the documents (see the first sketch at the end of this section). It was realized even more recently that the SVD, when applied to the adjacency matrix of the Web Graph, is capable of ranking pages by their relevance to a given query. The first such method, the HITS algorithm of Kleinberg from 1998, approximates the leading singular vectors of the adjacency matrix of the subgraph formed by the neighbourhood of the search hit list (see the second sketch below). In our experiments we test variants of the SVD computation and of its interpretation, in order to obtain more accurate and more efficient retrieval methods.
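The first sketch (Python with NumPy) shows the truncated-SVD step behind Latent Semantic Indexing; the tiny term-document matrix and the rank r = 2 are made-up illustrative values, not data from the project.

  import numpy as np

  # Rows = terms, columns = documents; entries are word frequencies.
  A = np.array([[2., 0., 1., 0.],
                [1., 1., 0., 0.],
                [0., 2., 0., 1.],
                [0., 0., 1., 2.]])

  U, s, Vt = np.linalg.svd(A, full_matrices=False)

  r = 2                                         # keep the r largest singular values
  A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]   # best rank-r approximation of A

  # Each document becomes a short vector in the r-dimensional latent space;
  # document similarity is measured there instead of in the full term space.
  docs_latent = np.diag(s[:r]) @ Vt[:r, :]
  print(docs_latent.round(2))

By the Eckart-Young theorem, A_r is the best rank-r approximation of A, which is what justifies the compression and noise-filtering interpretation above.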
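The second sketch is a similarly minimal version of the HITS iteration (again Python with NumPy; the four-page link graph is made up for illustration). The alternating hub/authority updates converge to the leading singular vectors of the adjacency matrix, which is its connection to the SVD.

  import numpy as np

  # A[i, j] = 1 if page i links to page j (a made-up 4-page graph).
  A = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 1],
                [0, 0, 1, 0]], dtype=float)

  hubs = np.ones(A.shape[0])
  for _ in range(50):
      auth = A.T @ hubs                 # good authorities are cited by good hubs
      auth /= np.linalg.norm(auth)
      hubs = A @ auth                   # good hubs cite good authorities
      hubs /= np.linalg.norm(hubs)

  print('authority scores:', auth.round(3))
  print('hub scores:', hubs.round(3))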