1. Symbolic Computation
  2. Computer Algebra
  3. Computational Algebra
  4. Combinatorial Matrix Theory
  5. Combinatorial Optimization
  6. Commutative Algebra & Algebraic Geometry
  7. Combinatorial Designs
  8. Discrete Mathematics
  9. Combinatorics

  1. Combinatorial Optimization

    The search for many types of combinatorial matrices can be formulated as a Combinatorial Optimization problem, via certain symmetric matrices that correspond naturally to the periodic and non-periodic (aperiodic) autocorrelation functions associated with a finite sequence. One can therefore use powerful Combinatorial Optimization methods to search efficiently for such matrices.
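
    To make the two quantities concrete, here is a small illustrative sketch (notation PAF/NPAF is mine) of the periodic and non-periodic (aperiodic) autocorrelation functions of a finite {+1, -1} sequence, the objects that drive the optimization formulation:

```python
def periodic_autocorrelation(a, s):
    """PAF_s(a) = sum_i a[i] * a[(i+s) mod n], indices taken cyclically."""
    n = len(a)
    return sum(a[i] * a[(i + s) % n] for i in range(n))

def aperiodic_autocorrelation(a, s):
    """NPAF_s(a) = sum_{i=0}^{n-1-s} a[i] * a[i+s], no wrap-around."""
    n = len(a)
    return sum(a[i] * a[i + s] for i in range(n - s))

seq = [1, 1, 1, -1]   # a Barker sequence of length 4
paf  = [periodic_autocorrelation(seq, s)  for s in range(1, len(seq))]
npaf = [aperiodic_autocorrelation(seq, s) for s in range(1, len(seq))]
# for this sequence every periodic autocorrelation vanishes: paf == [0, 0, 0]
```

    Sequences whose autocorrelations satisfy prescribed conditions (for instance, all periodic autocorrelations zero) then correspond to the combinatorial matrices being sought.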

  2. Combinatorial Designs via Computational Algebra

    Many types of Combinatorial Designs, such as Hadamard matrices and D-optimal designs, can be studied via a Computational Algebra formalism. An ideal in a suitable multivariate polynomial ring is associated with the combinatorial design, thus allowing Computational Algebra tools (Gröbner bases, Hilbert functions) to be used in the study of combinatorial designs. These ideals exhibit many symmetries and are highly structured. The associated algebraic varieties possess equally interesting properties.
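
    As a toy instance of this formalism (the encoding below is my own minimal illustration, not the ideals studied in the source): the 2x2 Hadamard matrices [[a, b], [c, d]] form the variety of the ideal generated by the entry equations x**2 - 1 = 0 together with the row-orthogonality equation.

```python
from sympy import symbols, groebner, solve

a, b, c, d = symbols('a b c d')
polys = [a**2 - 1, b**2 - 1, c**2 - 1, d**2 - 1,  # entries must be +1 or -1
         a*c + b*d]                               # the two rows are orthogonal
G = groebner(polys, a, b, c, d, order='lex')      # structured basis of the ideal
sols = solve(polys, [a, b, c, d], dict=True)      # points of the variety
# each solution is a 2x2 Hadamard matrix [[a, b], [c, d]]
```

    Already at this size the Gröbner basis exposes the structure of the ideal, and the solutions of the system enumerate the designs.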
    A dual binary tree formalism enables the use of powerful combinatorial optimization techniques (ranking, pruning, simulated annealing, genetic algorithms) as well as high-performance computing techniques. We are currently using high-performance computing (SHARCnet, WestGrid) to complete exhaustive searches for certain types of combinatorial designs (Hadamard matrices and D-optimal designs). The binary tree formalism is appropriate for developing heuristic search criteria as well.
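
    A minimal branch-and-prune sketch of the binary tree idea (the pruning rule below is a simple necessary condition chosen for illustration, not the ranking/pruning machinery referred to above): search the binary tree of {+1, -1} sequences of length n for those whose periodic autocorrelations all vanish. Such a sequence satisfies (sum a_i)**2 = n, so any subtree whose partial sum can no longer reach +-sqrt(n) can be cut off.

```python
import math

def paf(a, s):
    n = len(a)
    return sum(a[i] * a[(i + s) % n] for i in range(n))

def search(n):
    found, target = [], math.isqrt(n)
    def extend(prefix, psum):
        remaining = n - len(prefix)
        if abs(psum) > target + remaining:   # prune: +-sqrt(n) is unreachable
            return
        if remaining == 0:                   # leaf: full-length sequence
            if all(paf(prefix, s) == 0 for s in range(1, n)):
                found.append(tuple(prefix))
            return
        for x in (1, -1):                    # the two children of the node
            extend(prefix + [x], psum + x)
    extend([], 0)
    return found

solutions = search(4)   # first rows of circulant Hadamard matrices of order 4
```

    In the serious searches the pruning criteria are much stronger, but the tree-shaped state space and the cut-off logic are the same.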

  3. Computer Algebra & Celestial Mechanics:
    Central configurations in the N-body problem of Celestial Mechanics.

    A poster on Central Configurations

  4. Computer Algebra:
    Symbolic-Numeric Algorithms for Polynomials (SNAP).

  5. Commutative Algebra & Algebraic Geometry:
    Implicitization and Parameterization of algebraic curves, surfaces, hypersurfaces.

    Implicitization and parameterization are two fundamental problems in computational algebraic geometry. Implicitization is the conversion of parametric equations of a geometric object to an implicit equation. The opposite process is referred to as parameterization and is in general a harder problem to address. Both problems have immediate practical applications to computer-aided geometric design (CAGD), where one typically requires thousands of conversions between implicit and parametric forms of equations of curves and surfaces in real time. Thus it is important to have good algorithms at hand for both problems.

    Many efficient and robust algorithms for implicitization and parameterization have been developed in Symbolic Computation using a wide variety of techniques such as resultants, Gröbner bases, characteristic sets, perturbations, multidimensional Newton formulae, symmetric functions, elimination, moving frames, linear systems, integral bases, polynomial factorization and adjoint functions. These algorithms provide constructive answers to the implicitization and parameterization problems and have been implemented in commercial and freely available Computer Algebra Systems as well as in stand-alone programs. Some of the available algorithms are suitable for special categories of curves and surfaces only.
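
    To make the resultant technique concrete, a standard textbook example (not from the source): implicitize the rational parameterization of the unit circle, x = (1 - t**2)/(1 + t**2), y = 2t/(1 + t**2), by clearing denominators and eliminating the parameter t with a resultant.

```python
from sympy import symbols, resultant, factor, expand

t, x, y = symbols('t x y')
p = x * (1 + t**2) - (1 - t**2)   # numerator of x - (1 - t^2)/(1 + t^2)
q = y * (1 + t**2) - 2 * t        # numerator of y - 2t/(1 + t^2)
implicit = factor(resultant(p, q, t))   # eliminate t
# implicit vanishes exactly on the curve: x**2 + y**2 - 1, up to a
# constant factor
```

    The resultant of the two cleared equations with respect to t is a polynomial in x and y alone, and its vanishing is precisely the implicit equation of the curve.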

    We developed a new algorithm to perform implicitization of curves, surfaces and hypersurfaces. The algorithm is based on an interpretation of the implicitization problem as an eigenproblem, using ideas from Rayleigh-Ritz theory in the Calculus of Variations. It has a symbolic and a numeric mode and reduces the implicitization problem to the computation of the nullspace of a certain matrix. This particularly appealing feature expands dramatically the algorithm's range of applicability, to include parametric families of curves, surfaces and hypersurfaces. The algorithm works for parametric equations specified by very general classes of functions and is implemented as part of the standard distribution of Maple.
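
    A numeric sketch of the nullspace idea (a simplified interpolation variant of my own, not the Maple implementation itself): evaluate all monomials of degree at most 2 at sample points of the parametric unit circle; a nullspace vector of the resulting matrix collects the coefficients of the implicit equation.

```python
import numpy as np

ts = np.linspace(0.1, 2.9, 12)          # sample values of the parameter
x = (1 - ts**2) / (1 + ts**2)           # parametric unit circle
y = 2 * ts / (1 + ts**2)

# monomial basis 1, x, y, x^2, x*y, y^2 evaluated at the sample points
M = np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])
_, _, Vt = np.linalg.svd(M)
coeffs = Vt[-1]             # right-singular vector of the smallest singular value
coeffs /= coeffs[3]         # normalize so the x^2 coefficient is 1
# coeffs is approximately [-1, 0, 0, 1, 0, 1], i.e. x^2 + y^2 - 1 = 0
```

    Because the matrix is assembled purely from evaluations, the same computation goes through when the parametric equations involve functions far more general than polynomials, which is the source of the wide applicability mentioned above.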

    Several theoretical improvements and their corresponding practical optimizations are still possible with the aim to speed-up considerably the execution of the algorithm: Using modular approaches to avoid intermediate expression swell, analyze the structure of the implicitization matrices to design fast algorithms for the nullspace computation, develop a sparse implicitization theory and identify and extend the available techniques for predicting the degree of the implicit equation in advance. Some other aspects of the algorithm pose intriguing problems of combinatorial nature.

  6. Entropy Computations

    Symbolic sequences arise naturally in quite different contexts, ranging from Nonlinear Dynamical Systems (Symbolic Dynamics) to Theoretical Computer Science (substitutive sequences), Automata Theory (m-automatic sequences) and Biology (DNA and RNA sequences). Entropy-like quantities are a very useful tool for the analysis of such sequences. Of special interest are the block entropies, which extend Shannon's classical definition of the entropy of a single state to the entropy of a succession of states. Other standard techniques are Fourier transforms, symbol-to-symbol correlation functions and recurrence and escape time statistics. The development of new visualization techniques is necessary in order to aid human perception in discerning patterns and more complex correlations in these symbolic sequences. A good starting point for the development of new visualization techniques is the analysis of substitutive sequences, a very general class of symbolic sequences that has attracted considerable interest recently.
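
    Block entropies made concrete (an illustrative sketch, using the Thue-Morse substitutive sequence 0 -> 01, 1 -> 10 as the example): the block entropy H_n is the Shannon entropy of the empirical distribution of the length-n blocks occurring in the sequence.

```python
import math
from collections import Counter

def thue_morse(n_iter):
    """Iterate the substitution 0 -> 01, 1 -> 10 starting from 0."""
    s = [0]
    for _ in range(n_iter):
        s = [b for a in s for b in ((0, 1) if a == 0 else (1, 0))]
    return s

def block_entropy(seq, n):
    """Shannon entropy (in bits) of the length-n blocks observed in seq."""
    blocks = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
    total = sum(blocks.values())
    return -sum(c / total * math.log2(c / total) for c in blocks.values())

tm = thue_morse(12)                            # 4096 symbols
H = [block_entropy(tm, n) for n in (1, 2, 3)]  # H_1, H_2, H_3
```

    The growth of H_n with the block length n measures how quickly new correlations appear in the sequence, which is exactly the kind of information the visualization techniques described above are meant to expose.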

  7. Computational Mathematics and Statistics

    In the early stages of an experimental situation, a large number of factors is likely to have been identified as possibly having an influence on the response. However, it is believed that only a few of these actually have a substantial effect, a situation known as factor sparsity. The small number of active factors can be identified through a screening experiment. Screening designs are frequently used by experimenters to help understand the impact of a large number of factors in relatively few trials. Traditionally, Hadamard matrices have been used for such purposes. After the identification of the active factors, the columns that correspond to these factors are selected from the screening design to form a new orthogonal factorial design for further investigation. Algebraic Geometry and Gröbner bases theory can solve such problems under the assumptions that an interaction of factors is more likely to be important if its parent factors are also important (effect inheritance) and that lower-order interactions are more likely to be important than higher-order interactions (effect hierarchy).
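
    A small worked instance of the screening setup (the example and the choice of "active" factors are mine): the Sylvester Hadamard matrix of order 8 serves as a two-level screening design for up to 7 factors in 8 runs, and the columns of the factors identified as active are kept for the follow-up orthogonal design.

```python
import numpy as np

def sylvester_hadamard(k):
    """Hadamard matrix of order 2**k by the Sylvester doubling construction."""
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

H = sylvester_hadamard(3)        # order-8 matrix: 8 runs
design = H[:, 1:]                # drop the all-ones column -> 7 factors
active = [0, 3, 5]               # factors found active (hypothetical choice)
follow_up = design[:, active]    # orthogonal design for further investigation
# columns of a Hadamard matrix are pairwise orthogonal, so
# follow_up.T @ follow_up is 8 times the identity
```

    Orthogonality of the selected columns is what guarantees that the follow-up experiment estimates the effects of the active factors independently of one another.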

Page maintained by: Ilias S. Kotsireas
Last Update: September 2009