TRANSCRIPT
IBM Labs in Haifa © Copyright IBM
SVRH: Non-local stochastic CSP solver with learning of high-level topography characteristics
Yehuda Naveh, Simulation Based Verification Technologies, System Verification and Modeling
CSP for simulation-based hardware verification
CSP is a fundamental technology for Simulation-Based Verification
The basis of all our generators: X-Gen, GenesysPro, Piparazzi, DeepTrans, FPgen
The Generation Core (GEC) provides a tool-box for stimuli generation, with focus on CSP: modeling, representation, solution
Octopus provides an umbrella for non-GEC activities: SVRH, …
This Presentation
CSP for test generation (brief)
The stochastic solver (using the SVRH algorithm)
Future considerations
Constraints in test case generation

EA: 0x????????_????????
RA: 0x????????_????????
Huge domains (2^64); quality of solution: random uniformity (e.g., Dechter et al., 2002)

Architecture: the effective address translates into the real address in a complex way.
Testing knowledge: contention on a cache row.
Verification task: real address in some corner memory space; effective address aligned to 64K.

A trivially consistent assignment is correct, but worthless:
EA: 0x0002FF00_00000000
RA: 0x0002FF00_00000000

Progressive randomization of the unconstrained bits yields a uniform, high-quality solution:
EA: 0x????????_????0000   RA: 0x0002FF??_????????
EA: 0x????????_????0000   RA: 0x0002FF??_??A4D???
EA: 0x0B274FAB_0DBC0000   RA: 0x0002FFC5_90A4D000
Systematic Methods (DPLL, MAC, k-consistency, …)

[Figure: constraint network with variables EA and RA, connected by the constraints EA-RA Translation, User: Aligned EA, and User: Special RA's; X marks show values pruned from the domains]

1. Reach a level of consistency through successive reductions of sets
2. Choose an assignment for a variable, and maintain the consistency
Systematic Methods

[Figure: the same EA-RA constraint network after further propagation; additional X marks show more domain values pruned]
Limitations of Systematic Methods: An example

[Constraint table garbled in extraction: four binary constraints over pairs of a, b, c; only solution a = b = c]
a, b, c ∈ {0, ..., N}, N = 2^64

Local consistency holds at the onset, so a systematic solver effectively chooses randomly, with probability 1/N of being correct.
(Solution reached in 600 million years.)
Limitations of Systematic Methods: Another example
tionrepresentabinary in their s1' five haveeach ,,.2
*.1
cba
cba
642,,...,0,, NNcba
Already a single reduction of domains is hard
Stochastic approaches: defining the metrics

State: a tuple representing a single assignment to each variable, e.g., (a, b, c), one out of 2^(3×64) states
Cost: a function from the set of states to {0} ∪ R+; Cost = 0 iff all constraints are satisfied by the state, e.g.,
Cost(a, b, c) = (a − b)^2 + (a·b − c)^2

Simulated Annealing, GSAT, …
All are local search
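As an illustration of such a cost metric, here is a minimal sketch (my own, not from the talk), using squared violations of two example constraints, a = b and c = a·b:

```python
# Minimal sketch of a cost metric over states (illustrative; the constraints
# a == b and c == a * b are examples, not the talk's exact formula).

def cost(state):
    a, b, c = state
    # One non-negative violation term per constraint; zero iff satisfied.
    return (a - b) ** 2 + (a * b - c) ** 2

assert cost((3, 3, 9)) == 0    # both constraints satisfied
assert cost((3, 4, 9)) == 10   # violates a == b (1) and c == a*b (9)
```

A stochastic solver only ever sees this scalar; the shape of the cost surface over the 2^(3×64) states is the "topography" the later slides refer to.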
SVRH: Simulated Variable Range Hopping, a non-local search

If a state is represented by bits, many bits may be flipped in a single step.
Check states on all length-scales in the problem. Hop to a new state if it is better.
Learn the topography.
Get domain-knowledge strategies as input.
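One non-local move can be sketched as follows; the interface and the target constant are my assumptions, not the real SVRH engine:

```python
import random

# Sketch of one non-local move (assumed interface; not the real SVRH engine).
# Flip a contiguous run of bits whose length is drawn over all length-scales
# of an nbits-wide state; hop to the candidate only if its cost is lower.

def hop(state, cost, nbits=64):
    scale = 1 << random.randrange(nbits.bit_length())    # run length 1, 2, ..., 64
    start = random.randrange(nbits)
    mask = (((1 << scale) - 1) << start) & ((1 << nbits) - 1)
    candidate = state ^ mask                             # many bits flip at once
    return candidate if cost(candidate) < cost(state) else state

# Example: cost = Hamming distance to an (assumed) target state.
target = 0x0002FF0000000000
hamming = lambda s: bin(s ^ target).count("1")
state = 0
for _ in range(50_000):
    state = hop(state, hamming)
```

Because the run length spans every scale from a single bit to the full word, the walk can tunnel across regions where single-bit (local) moves are all uphill.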
SVRH: Learning

Cost normalization
Topography parameters: length scales; preferred directions (rigidity, correlations between variables)
Decision-making parameters ("learning the heuristics"):
  Should the next hop be domain-knowledge-based or stochastic?
  Should it rely on step-size learning?
  Should it rely on correlation learning?
Never abandon totally stochastic trials!
SVRH: Domain-knowledge strategies
Variables: a, b, c : 128 bit integers
Constraints: a = b = c
a = …0011100101110100110101000101001010100101010
b = …0101001010101010010110111101010100010100010
c = …1100001010100101010000011111010101001010111
Strategy: Check all bits, one at a time.
Solution guaranteed in 384 steps.
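The strategy above can be sketched in a few lines (illustrative code, not IBM's; the Hamming-distance cost is an assumption): visiting every bit of every variable once, and keeping a flip only when it lowers the cost, equalizes the three variables in at most 3 × 128 = 384 steps.

```python
# Illustrative sketch of the "check all bits, one at a time" strategy for the
# constraint a = b = c over wide integers (not IBM's code).

def cost(a, b, c):
    # Hamming violations of a == b and b == c; zero iff a == b == c.
    return bin(a ^ b).count("1") + bin(b ^ c).count("1")

def equalize(a, b, c, nbits=128):
    vals = [a, b, c]
    for i in range(3):                       # for each variable...
        for bit in range(nbits):             # ...check each bit, one at a time
            trial = vals.copy()
            trial[i] ^= 1 << bit
            if cost(*trial) < cost(*vals):   # hop only if strictly better
                vals = trial
    return vals

a, b, c = equalize(0x3A5F, 0x1234, 0xFFFF, nbits=16)
assert a == b == c == 0x1234
```

The pass over a's bits pulls a onto b, the pass over b makes no move (any flip trades one violation for another), and the pass over c pulls c onto b, which is the 384-step guarantee on the slide.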
SVRH: Domain-knowledge strategies
Variables: a, b, c : 128 bit integers
Constraints: a = b = c
a = …0011100101110100110101000101001010100100010
b = …0101001010101010010110111101010100010100010
c = …1100001010100101010000011111010101001010111
Strategy: Check all bits, one at a time.
Solution guaranteed in 384 steps.
SVRH: Domain-knowledge strategies
Variables: a, b, c : 128 bit integers
Constraints: a = b = c
a = …0011100101110100110101000101001010000100010
b = …0101001010101010010110111101010100010100010
c = …1100001010100101010000011111010101001010111
Strategy: Check all bits, one at a time.
Solution guaranteed in 384 steps.
SVRH: Domain-knowledge strategies
Variables: a, b, c : 128 bit integers
Constraints: c = (a * b)_msb, number of 1's in c = 5
a = …0011100101110100110101000101001010100101010
b = …0101001010101010010110111101010100010100010
c = …0100000000100000010000000010001001000000000
Strategy: Flip simultaneously all permutations of adjacent bits
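The adjacent-bit strategy can be sketched as a move generator (the name and interface are my assumptions): flipping two neighboring bits in one step slides a set bit sideways, which preserves the number of 1's, exactly what the "five 1's in c" constraint needs.

```python
# Illustrative move generator (assumed interface, not IBM's code): propose
# hops that flip a pair of adjacent bits simultaneously. When the pair is
# '01' or '10', the flip moves a set bit to the neighboring position and
# preserves the popcount.

def adjacent_pair_moves(value, nbits):
    for i in range(nbits - 1):
        yield value ^ (0b11 << i)   # flip bits i and i+1 in one step

c = 0b00100
moves = list(adjacent_pair_moves(c, 5))
# Sliding the single set bit left or right keeps the popcount at 1:
assert bin(c ^ (0b11 << 1)).count("1") == 1   # 0b00100 -> 0b00010
assert bin(c ^ (0b11 << 2)).count("1") == 1   # 0b00100 -> 0b01000
```

Such popcount-preserving hops stay on the feasible surface of the counting constraint while exploring the multiplication constraint.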
SVRH: Decision algorithm
Decide type (‘random’, ‘learned’, ‘user-defined’) according to previous successesIf ‘random’
Choose a random step-sizecreate a random attempt with chosen step-size
If ‘learned’:decide learn-type (‘step-size’, ‘direction’, ‘…’) according to previous successes if ‘step-size’:
Choose a step-size which was previously successful (weighted)create a random attempt with chosen step size.
if ‘direction’Choose a direction which was previously successful (weighted)create a random attempt with chosen direction
etc.If ‘user-defined’
Get next user-defined attempt
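The decision step above can be sketched as weighted sampling over past successes; the success counters, the +1 smoothing, and the attempt interface here are my assumptions, not the engine's real API.

```python
import random

# Hedged sketch of the SVRH decision step: pick an attempt type with
# probability proportional to its past successes (plus-one smoothed so no
# type is ever abandoned), then refine within the 'learned' branch.

def decide(successes, user_attempts):
    kinds = ("random", "learned", "user-defined")
    kind = random.choices(kinds,
                          weights=[successes.get(k, 0) + 1 for k in kinds])[0]
    if kind == "random":
        step = 1 << random.randrange(7)                # random step-size
        return ("random", step)
    if kind == "learned":
        learn = random.choices(
            ("step-size", "direction"),
            weights=[successes.get("step-size", 0) + 1,
                     successes.get("direction", 0) + 1])[0]
        return ("learned", learn)                      # weighted by past wins
    return ("user-defined", next(user_attempts, None)) # next user attempt
```

The +1 smoothing mirrors the slide's warning to never abandon totally stochastic trials: even a type with zero recorded successes keeps a nonzero probability.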
SVRH: Usage

Variables: a set of bits
Constraints: a cost function; suggested initialization: random but consistent; fully constrained bits are removed from the problem
Domain-knowledge strategies: the user provides sets of bits to be flipped; the engine decides when to use them
Commonly used constraints and strategies are provided as building blocks

See also: Nareyek '01; Michel and Van Hentenryck '02, '03.
Results: toy example

[Same garbled constraint set as the earlier example: a, b, c ∈ {0, ..., N}, N = 2^64; only solution a = b = c]

Local consistency at onset: a systematic solver chooses randomly, with probability 1/N of being correct (solution reached in 600 million years).
SVRH: solution reached in less than a second (similarly for N = 2^128).
Results: FP-Gen

Comparison with ZChaff for the floating-point multiply benchmark (133 solvable tasks):

                          ZChaff                    SVRH
Max length                64 bit                    128 bit
Average time              200.5 sec                 0.97 sec
Best ratio                2861 sec                  0.3 sec
Worst ratio               25 sec                    5.7 sec
Quality (extreme case)    0p0=0x43F0000000000000    0p0=0x070E342575271FFA
                          0p1=0xB180000000000000    0p1=0x9560F399ECF4E191
Reports UNSAT             Yes                       No
Results: FP-Gen

Comparison with all engines for the all-instructions benchmark (150 tasks):

                                    Lion   ZChaff   Mixed   SVRH
Number of successes                 29     59       88      93
Average time per success (sec)      0.3    65       3       9
Results: Low Autocorrelation Binary Sequence (LABS)

A hard combinatorial problem and a well-known optimization benchmark.
Objective: minimize the autocorrelation of a string of N 1's and -1's.
CLS is a special-purpose algorithm designed for this problem.

N    CLS backtracks   SVRH attempts   CLS hours   SVRH hours
45   368 × 10^6       136 × 10^6      14.70       1.32
46   35               179             1.44        1.74
47   165              242             7.30        2.35
48   78               -               3.36        > 5.00
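For reference, the standard LABS objective minimized above can be stated as E(s) = Σ_{k≥1} C_k(s)², where C_k is the aperiodic autocorrelation of the ±1 sequence s at lag k; a direct (unoptimized) sketch:

```python
# Standard LABS energy (reference formulation, not the CLS or SVRH code):
# E(s) = sum over lags k >= 1 of C_k(s)^2, with
# C_k(s) = sum_i s[i] * s[i + k] for a sequence s of +1/-1 values.

def labs_energy(s):
    n = len(s)
    return sum(sum(s[i] * s[i + k] for i in range(n - k)) ** 2
               for k in range(1, n))

assert labs_energy([1, 1, -1]) == 1        # C_1 = 0, C_2 = -1
assert labs_energy([1, 1, 1, -1]) == 2     # the length-4 Barker sequence
```

Each evaluation is O(N²), which is why the benchmark counts backtracks and attempts rather than raw evaluations.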
Future

Basics: input language; static learning (more on the next slide)
Test generation: Piparazzi, G-Pro, X-Gen
Formal verification: bounded model checking, SAT
Formal ↔ Sim
Future – more basic aspects

Pruning steps in an overall SVRH (stochastic) framework: a hybrid algorithm, with an extensive literature
Report UNSAT (the holy grail): first experiments on SAT are somewhat encouraging, but only in their infancy
More sophisticated learning: correlations (Bayesian?), dynamics, other; nothing done yet, cannot predict usefulness