Dataset supporting the paper:
Z. Huang, M. England, J.H. Davenport and L.C. Paulson
Using Machine Learning to Decide When to Precondition Cylindrical Algebraic Decomposition with Groebner Bases.
Proceedings of the 18th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC '16), pp. 45--52. IEEE, 2016.
Digital Object Identifier: 10.1109/SYNASC.2016.020
############################################################
The paper describes an experiment applying machine learning to decide when to precondition Cylindrical Algebraic Decomposition (CAD) with Groebner Bases.
The following software was used:
1. Maple 2017
Used to generate the random polynomials that form the dataset, and to compute their algebraic features, the Groebner Bases, and the CADs.
Maple is a proprietary Computer Algebra System. See this URL for details: http://www.maplesoft.com/products/maple
Although the software is proprietary, the algorithms used are either simple and self-explanatory (feature generation) or documented in other papers we cited (CAD).
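The 28 features themselves are defined in Table 1 of the paper. As a purely illustrative sketch (not the paper's Maple code, and not its exact feature set), the following shows how a few simple algebraic features of this kind might be computed from a polynomial system; the term representation and the three-variable setting are assumptions made here for the example.

```python
# Hypothetical sketch of computing simple algebraic features of a
# polynomial system.  A polynomial is represented as a list of terms,
# each term a pair (coefficient, (dx, dy, dz)) of exponents in three
# variables.  The real 28 features are those of Table 1 of the paper.

def features(system):
    """Return a dict of simple features for a list of polynomials."""
    all_terms = [t for poly in system for t in poly]
    degs = [sum(exps) for _, exps in all_terms]
    feats = {
        "num_polys": len(system),
        "total_terms": len(all_terms),
        "max_total_degree": max(degs),
    }
    # Proportion of terms in which each variable occurs.
    for v in range(3):
        occurs = sum(1 for _, exps in all_terms if exps[v] > 0)
        feats[f"prop_var_{v}"] = occurs / len(all_terms)
    return feats

# Example: the system {x^2*y - 1, y*z + x}
system = [
    [(1, (2, 1, 0)), (-1, (0, 0, 0))],
    [(1, (0, 1, 1)), (1, (1, 0, 0))],
]
print(features(system))
```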
2. SVM Light
To build and test the support vector machines.
SVM Light is a free, C-based implementation of Support Vector Machines. See this URL for details: http://svmlight.joachims.org
For details of how the experiment was run see the paper.
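SVM Light reads training and test data in a simple plain-text format: one example per line, a +1/-1 label followed by `index:value` pairs with 1-based feature indices in increasing order. The helper below is an illustrative sketch of producing such lines (the labels mirror this dataset's convention, where +1 marks problems for which GB preconditioning is beneficial); it is not part of the distributed dataset.

```python
# Illustrative sketch: format one example in SVM Light's input format,
# "<label> <index>:<value> ...", with 1-based indices and label +1/-1.

def svmlight_line(label, values):
    """Format one example given a label (+1/-1) and its feature values."""
    pairs = " ".join(f"{i}:{v}" for i, v in enumerate(values, start=1))
    sign = "+1" if label > 0 else "-1"
    return f"{sign} {pairs}"

print(svmlight_line(1, [0.5, 0.0, 2.0]))   # -> "+1 1:0.5 2:0.0 3:2.0"
```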
3. WEKA
To run the feature selection experiments.
WEKA is a free Java library to support machine learning. See this URL for details: http://www.cs.waikato.ac.nz/ml/weka/
For details of how the experiment was run see the paper.
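WEKA offers many attribute-selection methods; the specific ones used are described in the paper. As a toy illustration only (this is not WEKA's algorithm), a filter-style selection might rank features by how far their class means are separated and keep the top-ranked ones:

```python
# Toy illustration of filter-style feature ranking (not WEKA's own
# method): score each feature by the absolute gap between its mean
# value on the +1 examples and on the -1 examples.

def rank_features(examples):
    """examples: list of (label, values) with label +1/-1.
    Returns feature indices sorted by decreasing class-mean gap."""
    pos = [v for lab, v in examples if lab > 0]
    neg = [v for lab, v in examples if lab < 0]
    n = len(examples[0][1])

    def mean(rows, j):
        return sum(r[j] for r in rows) / len(rows)

    scores = [abs(mean(pos, j) - mean(neg, j)) for j in range(n)]
    return sorted(range(n), key=lambda j: scores[j], reverse=True)

examples = [(+1, [1.0, 5.0]), (+1, [1.2, 4.0]),
            (-1, [1.1, 1.0]), (-1, [0.9, 2.0])]
print(rank_features(examples))  # feature 1 separates the classes best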
This dataset contains the following supplementary material.
- The Maple output showing the random generation of polynomials and the calculation of the Groebner Bases and CADs.
These are organised into files ij.txt, where i is the number of terms in the random polynomials and j is their degree.
- For the 213 problems used in the test we also provide the file beforeScale.txt, which gives the full 28 features (as described in Table 1 of the paper), each prefixed with +1 or -1 according to whether or not GB preconditioning is beneficial.
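For readers wanting to load beforeScale.txt programmatically, the sketch below assumes each line is whitespace-separated, a +1/-1 label followed by 28 numeric feature values; the exact delimiter in the distributed file may differ, so treat this as a starting point rather than a definitive reader.

```python
# Hypothetical sketch of reading one line of beforeScale.txt, assuming
# whitespace separation: a +1/-1 label then 28 feature values.

def parse_line(line):
    """Return (label, features) for one line of beforeScale.txt."""
    parts = line.split()
    label = int(parts[0])            # +1: GB preconditioning beneficial
    assert label in (+1, -1)
    features = [float(p) for p in parts[1:]]
    return label, features

label, feats = parse_line("+1 " + " ".join(["0.0"] * 28))
print(label, len(feats))  # -> 1 28
```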