TRANSCRIPT
Meta-algorithmic techniques in SAT solving:
Automated configuration, selection and beyond
Holger H. Hoos
BETA Lab, Department of Computer Science
University of British Columbia, Canada
SAT/SMT Summer School
Trento, Italy, 2012/06/12
What fuels progress in SAT solving?
I Insights into SAT (theory)
I Creativity of algorithm designers → heuristics
I Advanced debugging techniques → fuzz testing, delta-debugging
I Principled experimentation → statistical techniques, experimentation platforms
I SAT competitions
Holger Hoos: Meta-algorithmic techniques in SAT solving 2
2009: 5 of 27 medals 2011: 28 of 54 medals
Meta-algorithmic techniques
I algorithms that operate upon other algorithms (SAT solvers)
≠ meta-heuristics
I here: algorithms whose inputs include one or more SAT solvers
I configurators (e.g., ParamILS, GGA, SMAC)
I selectors (e.g., SATzilla, 3S)
I schedulers (e.g., aspeed; also: 3S, SATzilla)
I (parallel) portfolios (e.g., ManySAT, ppfolio)
I run-time predictors
I experimentation platforms (e.g., EDACC, HAL)
Why are meta-algorithmic techniques important?
I no one knows how best to solve SAT (or any other NP-hard problem) → no single dominant solver
I state-of-the-art performance is often achieved by combinations of various heuristic choices (e.g., pre-processing; variable/value selection heuristics; restart rules; data structures; ...) → complex interactions, unexpected behaviour
I performance can be tricky to assess due to
I differences in behaviour across problem instances
I stochasticity
→ potential for suboptimal choices in solver development and applications
Why are meta-algorithmic techniques important?
I human intuitions can be misleading, and human abilities are limited → substantial benefit from augmentation with computational techniques
I use of fully specified procedures (rather than intuition / ad hoc choices) can improve reproducibility and facilitate scientific analysis / understanding
SAT is a hotbed for meta-algorithmic work!
I Drosophila problem for computing science (and beyond)
I prototypical NP-hard problem
I prominent in various areas of CS and beyond
I important applications
I conceptual simplicity aids solver design / development
I active and diverse community
I SAT competitions
Outline
1. Introduction
2. Algorithm configuration
3. Algorithm selection
4. Algorithm scheduling
5. Parallel algorithm portfolios
6. Programming by Optimisation (PbO)
Traditional algorithm design approach:
I iterative, manual process
I designer gradually introduces/modifies components or mechanisms
I performance is tested on benchmark instances
I design often starts from a generic or broadly applicable problem-solving method (e.g., an evolutionary algorithm)
Note:
I During the design process, many decisions are made.
I Some choices take the form of parameters, others are hard-coded.
I Design decisions interact in complex ways.
Problems:
I Design process is labour-intensive.
I Design decisions are often made in an ad-hoc fashion, based on limited experimentation and intuition.
I Human designers typically over-generalise observations and explore few designs.
I Implicit assumptions of independence and monotonicity are often incorrect.
I The number of components and mechanisms tends to grow in each stage of the design process.
→ complicated designs, unfulfilled performance potential
Solution: Automated Algorithm Configuration
I Key idea: expose design choices as parameters; use an automated procedure to effectively explore the resulting configuration space
I Given: algorithm A with parameters p and configuration space C; set Π of benchmark instances; performance metric m
I Objective: find the configuration c∗ ∈ C for which A performs best on Π according to metric m
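As a toy illustration of this problem statement, the following sketch (all names hypothetical) treats the configurator as black-box random search over the configuration space C, scoring each sampled configuration by its mean cost over the benchmark set Π:

```python
import random

def configure(run_algorithm, config_space, instances, budget, seed=0):
    """Minimal random-search configurator (illustrative sketch).
    run_algorithm(config, instance) -> cost (lower is better);
    config_space: parameter name -> list of allowed values."""
    rng = random.Random(seed)
    best_config, best_score = None, float("inf")
    for _ in range(budget):
        # Sample one configuration c from the configuration space C.
        config = {p: rng.choice(vals) for p, vals in config_space.items()}
        # Evaluate metric m: here, mean cost over all instances in Pi.
        score = sum(run_algorithm(config, i) for i in instances) / len(instances)
        if score < best_score:
            best_config, best_score = config, score
    return best_config, best_score
```

Real configurators (ParamILS, GGA, SMAC) are far more sample-efficient, but the input/output contract is the same.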
Algorithm configuration
Example: SAT-based software verification
Hutter, Babic, HH, Hu (2007)
I Goal: solve a suite of SAT-encoded software verification instances as fast as possible
I new DPLL-style SAT solver Spear (by Domagoj Babic)
= highly parameterised heuristic algorithm (26 parameters, ≈ 8.3 × 10^17 configurations)
I manual configuration by the algorithm designer
I automated configuration using ParamILS, a generic algorithm configuration procedure (Hutter, HH, Stützle 2007)
Spear: Empirical results on software verification benchmarks
solver                         num. solved   mean run-time
MiniSAT 2.0                    302/302       161.3 CPU sec
Spear original                 298/302       787.1 CPU sec
Spear, generic opt. config.    302/302        35.9 CPU sec
Spear, specific opt. config.   302/302         1.5 CPU sec
I ≈ 500-fold speedup through use of an automated algorithm configuration procedure (ParamILS)
I new state of the art (winner of the 2007 SMT Competition, QF_BV category)
SPEAR on software verification
[scatter plot: auto-config. vs. default config. run time in CPU sec, log scale from 10^-2 to 10^4]
Advantages of using automated configurators:
I enables better exploration of larger design spaces → better performance
I lets the human designer focus on higher-level issues → more effective use of human expertise / creativity
I uses principled, fully formalised methods to find good configurations → better reproducibility, fairer comparisons, insights
I can be used to customise algorithms for specific application contexts with minimal human effort → expanded range of successful applications
Approaches to automated algorithm configuration
I Standard optimisation techniques (e.g., CMA-ES – Hansen & Ostermeier 2001; MADS – Audet & Orban 2006)
I Advanced sampling methods (e.g., REVAC – Nannen & Eiben 2006–9)
I Racing (e.g., F-Race – Birattari, Stützle, Paquete, Varrentrapp 2002; Iterative F-Race – Balaprakash, Birattari, Stützle 2007)
I Model-free search (e.g., ParamILS – Hutter, HH, Stützle 2007; Hutter, HH, Leyton-Brown, Stützle 2009; GGA – Ansótegui, Sellmann, Tierney 2009)
I Sequential model-based optimisation (e.g., SPO – Bartz-Beielstein 2006; SMAC – Hutter, HH, Leyton-Brown 2011–12)
Iterated Local Search

(animation: initialisation → local search → perturbation → local search → selection using an acceptance criterion → perturbation → ...)
ParamILS
I iterated local search in configuration space
I initialisation: pick best of default + R random configurations
I subsidiary local search: iterative first improvement, change one parameter in each step
I perturbation: change s randomly chosen parameters
I acceptance criterion: always select the better configuration
I number of runs per configuration increases over time; ensure that the incumbent always has the same number of runs as challengers
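The loop above can be sketched as follows; this is an illustrative toy version, not the actual ParamILS implementation (it omits the adaptive capping and run-count management of the real tool, and all names are assumptions):

```python
import random

def paramils_sketch(evaluate, space, default, rounds=20, R=5, s=2, seed=0):
    """Toy ParamILS-style iterated local search over a configuration space.
    evaluate(config) -> cost (lower is better); space: param -> value list."""
    rng = random.Random(seed)

    def local_search(config):
        # Subsidiary local search: iterative first improvement,
        # changing one parameter in each step.
        config, improved = dict(config), True
        while improved:
            improved = False
            for p in space:
                for v in space[p]:
                    cand = dict(config)
                    cand[p] = v
                    if evaluate(cand) < evaluate(config):
                        config, improved = cand, True
        return config

    # Initialisation: best of default + R random configurations.
    candidates = [default] + [
        {p: rng.choice(vs) for p, vs in space.items()} for _ in range(R)]
    incumbent = local_search(min(candidates, key=evaluate))
    for _ in range(rounds):
        # Perturbation: change s randomly chosen parameters.
        cand = dict(incumbent)
        for p in rng.sample(list(space), s):
            cand[p] = rng.choice(space[p])
        cand = local_search(cand)
        # Acceptance criterion: always keep the better configuration.
        if evaluate(cand) <= evaluate(incumbent):
            incumbent = cand
    return incumbent
```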
SATenstein: Automatically Building Local Search Solvers for SAT
KhudaBukhsh, Xu, HH, Leyton-Brown (2009)

Frankenstein:
create a perfect human being from scavenged body parts

SATenstein:
create perfect SAT solvers using components scavenged from existing solvers

General approach:
I components from GSAT, WalkSAT, dynamic local search and G2WSAT algorithms
I flexible SLS framework (derived from UBCSAT)
I find performance-optimising instantiations using ParamILS
Challenge:
I 41 parameters (mostly categorical)
I over 2 · 10^11 configurations
I 6 well-known distributions of SAT instances (QCP, SW-GCP, R3SAT, HGEN, FAC, CBMC-SE)
I 11 challenger algorithms (including all winning SLS solvers from the SAT competitions 2003–2008)
Result:
I factor 70–1000 performance improvements over the best challengers on QCP, HGEN, CBMC-SE
I factor 1.4–2 performance improvements over the best challengers on SW-GCP, R3SAT, FAC
SATenstein-LS vs VW on CBMC-SE
[scatter plot: SATenstein-LS[CBMC-SE] vs. VW median runtime in CPU sec, log scale 10^-2 to 10^3; points marked "solved by both" / "solved by one"]
SATenstein-LS vs Oracle on CBMC-SE
[scatter plot: SATenstein-LS[CBMC-SE] vs. Oracle median runtime in CPU sec, log scale 10^-2 to 10^3; points marked "solved by both" / "solved by one"]
Algorithm selection
Off-line (per-distribution) algorithm selection:
I Given: set S of algorithms for a problem, representative distribution (or set) Π of problem instances
I Objective: select from S the algorithm expected to solve instances from Π most efficiently (w.r.t. an aggregate performance measure) = single best solver
SAT competitions
Methods for off-line algorithm selection
I exhaustive evaluation: run all solvers on all instances
I racing methods: eliminate solvers as soon as they "fall significantly behind" the current leader (Maron & Moore 1994; Birattari, Stützle, Paquete, Varrentrapp 2002)
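A minimal sketch of such a racing loop, with a fixed slack factor standing in for the statistical tests that real racing methods (e.g., F-Race) use; all names are hypothetical:

```python
def race(solvers, instances, run, slack=2.0):
    """Off-line algorithm selection by racing: evaluate all surviving
    solvers instance by instance, and drop any solver whose cumulative
    cost falls significantly behind the current leader.
    run(solver, instance) -> cost (e.g., running time)."""
    totals = {s: 0.0 for s in solvers}
    alive = set(solvers)
    for inst in instances:
        for s in alive:
            totals[s] += run(s, inst)
        leader = min(totals[s] for s in alive)
        # Eliminate solvers that have fallen significantly behind
        # (here: total cost more than `slack` times the leader's).
        alive = {s for s in alive if totals[s] <= slack * leader + 1e-9}
    return min(alive, key=lambda s: totals[s])
```

Note how eliminated solvers consume no further evaluation time, which is the whole point of racing over exhaustive evaluation.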
Racing (for Algorithm Selection)

(animation: algorithms 1–5 are run on a growing set of problem instances; algorithms that fall behind are eliminated until a winner remains)
Potential problem:
inhomogeneous benchmark sets / distributions → may not correctly identify the single best solver

Solutions:
I use sets of instances at each stage of the race
I racing within (reasonably) homogeneous subsets
I order instances according to difficulty (ordered racing) (Styles & HH – in preparation)

Note: racing can also be used for algorithm configuration (requires special techniques for handling large configuration spaces) (Balaprakash, Birattari, Stützle 2007)
Instance-based algorithm selection (Rice 1976):
I Given: set S of algorithms for a problem, problem instance π
I Objective: select from S the algorithm expected to solve π most efficiently, based on (cheaply computable) features of π

Note:
Best-case performance is bounded by the oracle, which selects the best s ∈ S for each π = virtual best solver (VBS)
Instance-based algorithm selection
[diagram: instance → feature extractor → selector → chosen component algorithm]
Key components:
I set of (state-of-the-art) solvers
I set of cheaply computable, informative features
I efficient procedure for mapping features to solvers (selector)
I training data
I procedure for building a good selector based on training data (selector builder)
Methods for instance-based selection:
I classification-based: predict the best solver, e.g., using ...
I decision trees (Guerri & Milano 2004)
I case-based reasoning (Gebruers, Guerri, Hnich, Milano 2004)
I (weighted) k-nearest neighbours (Malitsky, Sabharwal, Samulowitz, Sellmann 2011; Kadioglu, Malitsky, Sabharwal, Samulowitz, Sellmann 2011)
I pairwise cost-sensitive decision forests + voting (Xu, Hutter, HH, Leyton-Brown 2012)
I regression-based: predict the running time of each solver, select the one predicted to be fastest (Leyton-Brown, Nudelman, Shoham 2003; Xu, Hutter, HH, Leyton-Brown 2007–9)
Instance features:
I Use generic and problem-specific features that correlate with performance and can be computed (relatively) cheaply:
I number of clauses, variables, ...
I constraint graph features
I local & complete search probes
I Use as features statistics of distributions, e.g., the variation coefficient of node degree in the constraint graph
I Consider combinations of features (e.g., pairwise products → quadratic basis function expansion).
SATzilla 2007–9 (Xu, Hutter, HH, Leyton-Brown):
I use state-of-the-art complete (DPLL/CDCL) and incomplete(local search) SAT solvers
I extract (up to) 84 polytime-computable instance features
I use ridge regression on selected features to predict solver run-times from instance features (one model per solver)
I run solver with best predicted performance
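A reduced sketch of this regression-based scheme, with ridge regression shrunk to a single scalar feature so the code stays self-contained; the real SATzilla uses up to 84 features, feature selection and more careful model validation. All names are illustrative:

```python
def fit_ridge_1d(xs, ys, lam=1e-3):
    """Fit y ~ w*x + b with an L2 penalty lam on the slope w
    (scalar special case of ridge regression)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs) + lam
    w = num / den
    return w, my - w * mx

def build_selector(training, feature):
    """Learn one run-time model per solver and return a selector that
    picks the solver with the best predicted run-time on an instance.
    training: solver -> list of (instance, observed runtime)."""
    models = {}
    for solver, runs in training.items():
        xs = [feature(inst) for inst, _ in runs]
        ys = [t for _, t in runs]
        models[solver] = fit_ridge_1d(xs, ys)
    def select(instance):
        x = feature(instance)
        # One prediction per solver; run the predicted-fastest one.
        return min(models, key=lambda s: models[s][0] * x + models[s][1])
    return select
```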
Some bells and whistles:
I use pre-solvers to solve 'easy' instances quickly
I build run-time predictors for various types of instances; use a classifier to select the best predictor based on instance features
I predict the time required for feature computation; if that time is too long (or an error occurs), use a back-up solver
I use the method of Schmee & Hahn (1979) to deal with censored run-time data
→ prizes in 5 of the 9 main categories of the 2009 SAT Solver Competition (3 gold, 2 silver medals)
The problem with standard classification approaches
Crucial assumption: solvers behave similarly on instances with similar features
But do they really?
I uninformative features
I correlated features
I feature normalisation (tricky!)
I cost of misclassification
SATzilla 2011–12 (Xu, Hutter, HH, Leyton-Brown 2012):
I uses cost-based decision forests to directly select a solver based on instance features
I one predictive model for each pair of solvers (which of the two is better?)
I majority voting (over pairwise predictions) to select the solver to be run
(further details, results next Monday, 11:30 session)
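The voting step can be sketched as follows, with the learned decision forests abstracted into a pluggable predict_winner function (a hypothetical interface, not the actual SATzilla code):

```python
from itertools import combinations

def pairwise_vote(solvers, predict_winner, features):
    """Select a solver by majority vote over pairwise predictions.
    predict_winner(s1, s2, features) -> s1 or s2, standing in for the
    per-pair cost-sensitive decision forest."""
    votes = {s: 0 for s in solvers}
    for s1, s2 in combinations(solvers, 2):
        votes[predict_winner(s1, s2, features)] += 1
    # Break ties by solver name so the result is deterministic.
    return max(solvers, key=lambda s: (votes[s], s))
```

One advantage of the pairwise scheme: each model only has to answer "which of these two is better?", a much easier target than predicting absolute run-times.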
Hydra: Automatically Configuring Algorithms for Portfolio-Based Selection
Xu, HH, Leyton-Brown (2010)

Note:
I SATenstein builds solvers that work well on average on a given set of SAT instances; but: may have to settle for compromises on broad, heterogeneous sets
I SATzilla builds an algorithm selector based on a given set of SAT solvers; but: success depends entirely on the quality of the given solvers

Idea: combine the two approaches → portfolio-based selection from a set of automatically constructed solvers
Configuration + Selection = Hydra
[diagram: parametric algorithm (multiple configurations) + feature extractor → selector]
Simple combination:
1. build solvers for various types of instances using automated algorithm configuration
2. construct a portfolio-based selector from these

Drawback: requires suitably defined sets of instances

Better solution:
iteratively build & add solvers that improve the performance of the given portfolio
→ Hydra

Note: builds portfolios solely using
I generic, highly configurable solver (e.g., SATenstein)
I instance features (as used in SATzilla)
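The iterative loop can be sketched as follows; configure and portfolio_cost are hypothetical stand-ins for the actual configurator (e.g., ParamILS) and the selector-based portfolio evaluation:

```python
def hydra_sketch(configure, portfolio_cost, steps):
    """Toy Hydra loop: in each step, configure a new solver to minimise
    the cost of the portfolio that results from adding it, then add it.
    configure(objective) -> config minimising objective(config);
    portfolio_cost(list_of_configs) -> aggregate performance."""
    portfolio = []
    for _ in range(steps):
        # Configure against the marginal objective: performance of the
        # current portfolio extended with the candidate configuration,
        # so new solvers are rewarded only where the portfolio is weak.
        new = configure(lambda c: portfolio_cost(portfolio + [c]))
        portfolio.append(new)
    return portfolio
```

The key design choice is the objective: a candidate that merely duplicates an existing portfolio member gains nothing, so the configurator is pushed toward complementary configurations.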
Results on mixture of 6 well-known benchmark sets
[scatter plot: PAR score of Hydra[BM, 7] vs. Hydra[BM, 1], log scale 10^-2 to 10^3]
Results on mixture of 6 well-known benchmark sets
[plot: PAR score (0–300) vs. number of Hydra steps (0–8); curves: Hydra[BM] training, Hydra[BM] test, SATenFACT on training, SATenFACT on test]
Note:
I Hydra can use arbitrary algorithm configurators and selector builders
I different approaches are possible: e.g., ISAC, based on feature-based instance clustering and distance-based selection (Kadioglu, Malitsky, Sellmann, Tierney 2010)
Algorithm scheduling
Note:
SATzilla and 3S∗ use a sequence of (pre-)solvers before instance-based selection.
∗ Kadioglu, Malitsky, Sabharwal, Samulowitz, Sellmann (2011)
Algorithm Scheduling
Questions:
1. How to determine that sequence?
2. How much performance can be obtained from solver scheduling alone?
Methods for algorithm scheduling:
I exhaustive search (as done in SATzilla) → expensive; limited to few solvers and cutoff times
I based on an optimisation procedure
I using integer programming (IP) techniques: 3S – Kadioglu et al. (2011)
I using an answer-set-programming (ASP) formulation + solver: aspeed – HH, Kaminski, Schaub, Schneider (2012)
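As a simple stand-in for the IP/ASP formulations, a greedy heuristic can assemble a sequential schedule by repeatedly picking the (solver, cutoff) slice that solves the most still-unsolved instances per unit of time spent. This is illustrative only; 3S and aspeed solve the underlying optimisation problem exactly, and all names here are assumptions:

```python
def greedy_schedule(runtimes, cutoffs, budget):
    """Greedily build a sequential solver schedule within a time budget.
    runtimes[solver][instance] -> time that solver needs on that instance;
    cutoffs: candidate per-slice time limits."""
    unsolved = set(next(iter(runtimes.values())))  # all instance names
    schedule, used = [], 0.0
    while unsolved:
        best, best_rate = None, 0.0
        for s, times in runtimes.items():
            for cut in cutoffs:
                if used + cut > budget:
                    continue  # slice would exceed the overall budget
                solved = sum(1 for i in unsolved if times[i] <= cut)
                if solved / cut > best_rate:
                    best, best_rate = (s, cut), solved / cut
        if best is None:
            break  # budget exhausted, or nothing more is solvable
        s, cut = best
        schedule.append(best)
        unsolved -= {i for i in unsolved if runtimes[s][i] <= cut}
        used += cut
    return schedule
```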
Preliminary performance results:
Performance of pure scheduling can be surprisingly close to that of combined scheduling + selection (full SATzilla).
(HH, Kaminski, Schaub, Schneider – in preparation;
Xu, Hutter, HH, Leyton-Brown – in preparation)
Notes:
I the ASP solver clasp used by aspeed is powered by a (state-of-the-art) SAT solver core
I pure algorithm scheduling (e.g., aspeed) does not require instance features
I sequential schedules can be parallelised easily and effectively (HH, Kaminski, Schaub, Schneider 2012)
Parallel algorithm portfolios
Key idea:
Exploit complementary strengths by running multiple algorithms(or instances of a randomised algorithm) concurrently.
→ risk vs. reward (expected running time) tradeoff; robust performance on a wide range of instances
Huberman, Lukose, Hogg (1997); Gomes & Selman (1997,2000)
Note:
I can be realised through time-sharing / multi-tasking
I particularly attractive for multi-core / multi-processor architectures
Application to decision problems (like SAT, SMT):
Concurrently run the given component solvers until the first of them solves the instance.
running time on instance π = (# solvers) × (running time of VBS on π)
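A minimal sketch of this "first one wins" execution using Python threads; illustrative only, since real portfolio solvers run their components as separate processes on separate cores (and may share clauses between them):

```python
import concurrent.futures

def run_portfolio(solvers, instance):
    """Run all component solvers on one instance concurrently and return
    the first result obtained; the remaining runs are abandoned.
    Each solver is a callable taking the instance and returning an answer
    (hypothetical interface)."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=len(solvers)) as ex:
        futures = [ex.submit(s, instance) for s in solvers]
        done, not_done = concurrent.futures.wait(
            futures, return_when=concurrent.futures.FIRST_COMPLETED)
        for f in not_done:
            f.cancel()  # best effort; already-running threads keep going
        return next(iter(done)).result()
```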
Examples:
I ManySAT (Hamadi, Jabbour, Sais 2009; Guo, Hamadi,Jabbour, Sais 2010)
I Plingeling (Biere 2010–11)
I ppfolio (Roussel 2011)
→ excellent performance (see 2009, 2011 SAT competitions)
Types of parallel portfolios:
I static vs instance-based vs dynamic
I with or without communication (SAT: clause sharing)
Methods for constructing (static) parallel portfolios:
I manual (by human expert, based on experimentation)
I classification maximisation (Petrik & Zilberstein 2006)
I case-based reasoning + greedy construction heuristic (Yun & Epstein 2012)
Constructing portfolios from a single parametric solver
HH, Leyton-Brown, Schaub, Schneider (under review)

Key idea: take a single parametric solver and find configurations that together form an effective parallel portfolio (analogous to Hydra).

Note: this makes it possible to obtain parallel solvers automatically from sequential sources (automatic parallelisation)
Methods for constructing such portfolios:
I global optimisation: simultaneous configuration of all component solvers
I greedy construction: add + configure one component at a time
Preliminary results on competition application instances (4 components)
solver PAR1 PAR10 #timeouts
ManySAT (1.1) 1887 16 003 213/679
ManySAT (2.0) 1998 17 373 232/679
Plingeling (276) 1850 15 437 205/679
Plingeling (587) 1684 13 812 183/679
Greedy-MT4(Lingeling) 1717 13 712 181/679
ppfolio 1646 13 310 176/679
CryptoMiniSat 1600 12 271 161/679
VBS over all of the above 1282 10 296 136/679
Algorithm configuration
The next step:
Programming by Optimisation (PbO)
HH (2010–12)

Key idea:
I specify large, rich design spaces of solvers for a given problem
I avoid premature, uninformed, possibly detrimental design choices
I actively develop promising alternatives for design components
I automatically make choices to obtain a solver optimised for the given use context
[diagram: from a design space of solvers, an optimised solver (e.g., a parallel portfolio or an instance-based selector) is obtained automatically for the given application context]
Levels of PbO:
Level 4: Make no design choice prematurely that cannot be justified compellingly.
Level 3: Strive to provide design choices and alternatives.
Level 2: Keep and expose design choices considered during software development.
Level 1: Expose design choices hardwired into existing code (magic constants, hidden parameters, abandoned design alternatives).
Level 0: Optimise settings of parameters exposed by existing software.
Success in optimising speed:
Application, design choices                         Speedup     PbO level
SAT-based software verification (Spear), 41         4.5–500 ×   2–3
  Hutter, Babic, HH, Hu (2007)
AI Planning (LPG), 62                               3–118 ×     1
  Vallati, Fawcett, Gerevini, HH, Saetti (2011)
Mixed integer programming (CPLEX), 76               2–52 ×      0
  Hutter, HH, Leyton-Brown (2010)
... and solution quality:
University timetabling, 18 design choices, PbO level 2–3 → new state of the art; UBC exam scheduling (Fawcett, Chiarandini, HH 2009)
Software development in the PbO paradigm
[diagram: PbO-<L> source(s) + design space description → PbO-<L> weaver → parametric <L> source(s) → PbO design optimiser (using benchmark inputs and use context) → instantiated <L> source(s) → deployed executable]
Design space specification
Option 1: use language-specific mechanisms
I command-line parameters
I conditional execution
I conditional compilation (ifdef)
Option 2: generic programming language extension
Dedicated support for . . .
I exposing parameters
I specifying alternative blocks of code
Advantages of generic language extension:
I reduced overhead for programmer
I clean separation of design choices from other code
I dedicated PbO support in software development environments
Key idea:
I augmented sources: PbO-Java = Java + PbO constructs, . . .
I tool to compile down into target language: weaver
Exposing parameters
...
numerator -= (int) (numerator / (adjfactor+1) * 1.4);
...

becomes:

...
##PARAM(float multiplier=1.4)
numerator -= (int) (numerator / (adjfactor+1) * ##multiplier);
...
I parameter declarations can appear at arbitrary places (before or after the first use of the parameter)
I access to parameters is read-only (values can only be set/changed via command line or config file)
Specifying design alternatives
I Choice: set of interchangeable fragments of code that represent design alternatives (instances of the choice)
I Choice point: location in a program at which a choice is available
##BEGIN CHOICE preProcessing
<block 1>
##END CHOICE preProcessing
##BEGIN CHOICE preProcessing=standard
<block S>
##END CHOICE preProcessing
##BEGIN CHOICE preProcessing=enhanced
<block E>
##END CHOICE preProcessing
##BEGIN CHOICE preProcessing
<block 1>
##END CHOICE preProcessing
...
##BEGIN CHOICE preProcessing
<block 2>
##END CHOICE preProcessing
##BEGIN CHOICE preProcessing
<block 1a>
##BEGIN CHOICE extraPreProcessing
<block 2>
##END CHOICE extraPreProcessing
<block 1b>
##END CHOICE preProcessing
The Weaver
transforms PbO-<L> code into <L> code (<L> = Java, C++, ...)

I parametric mode:
I expose parameters
I make choices accessible via (conditional, categorical) parameters

I (partial) instantiation mode:
I hardwire (some) parameters into code (expose others)
I hardwire (some) choices into code (make others accessible via parameters)
Design space optimisation
Use previously discussed techniques for . . .
I automated algorithm configuration (e.g., ParamILS, . . . )
I instance-based selector construction (e.g., Hydra)
I parallel portfolio construction (e.g., Schneider, )
Much room for further improvements:
I better optimisation procedures (e.g., for configuration)
I meta-optimisation (optimising the optimisers)
I scenario-based optimiser selection
I parallel portfolios of design optimisation procedures
Cost & concerns
But what about ...
I Computational complexity?
I Cost of development?
I Limitations of scope?
Computationally too expensive?
Spear revisited:
I total configuration time on software verification benchmarks: ≈ 30 CPU days
I wall-clock time on a 10-CPU cluster: ≈ 3 days
I cost on Amazon Elastic Compute Cloud (EC2): 61.20 USD (= 42.58 EUR)
I 61.20 USD pays for ...
I 1:45 hours of an average software engineer
I 8:26 hours at minimum wage
Too expensive in terms of development?
Design and coding:
I tradeoff between performance/flexibility and overhead
I overhead depends on the level of PbO
I traditional approach: cost from manual exploration of design choices!

Testing and debugging:
I design alternatives for individual mechanisms and components can be tested separately
→ effort linear (rather than exponential) in the number of design choices
Limited to the “niche” of NP-hard problem solving?
Some PbO-flavoured work in the literature:
I computing-platform-specific performance optimisation of linear algebra routines (Whaley et al. 2001)
I optimisation of sorting algorithms using genetic programming (Li et al. 2005)
I compiler optimisation (Pan & Eigenmann 2006; Cavazos et al. 2007)
I database server configuration (Diao et al. 2003)
The road ahead
I Support for PbO-based software development
I Weavers for PbO-C, PbO-C++, PbO-Java
I PbO-aware development platforms
I Improved / integrated PbO design optimiser
I Best practices
I Many further applications
I Scientific insights
Communications of the ACM, 55(2), pp. 70–80, February 2012
www.prog-by-opt.net
Meta-algorithmic techniques . . .
I leverage computational power to construct better algorithms
I liberate human designers from boring, menial tasks and let them focus on higher-level design issues
I facilitate principled design of heuristic algorithms
I profoundly change how we build and use algorithms
Three (slightly provocative?) predictions
I Meta-algorithmic techniques will become indispensable for solving SAT, SMT and all other NP-hard problems.
I Programming by Optimisation (or a closely related approach) will become as ubiquitous as compilation.
I Driven by SAT, SMT, PbO and advances in automated testing / debugging, program synthesis from higher-level designs will become practical and widely used.