
Bad and Good Ways of Post-Processing Biased Random Numbers

Markus Dichtl
Siemens AG, Corporate Technology

Overview

This talk comes in two parts:

• A bad way
• Good ways

Why Post-Processing?

Observation: All physical random numbers seem to deviate from the statistical ideal.

Post-processing is used to remove or reduce these deviations from the ideal.

The Most Frequent Statistical Problem

Bias: A deviation of the probability of 1-bits from the ideal value ½. For statistically independent bits with probability p of 1-bits:

Bias ε = p – 1/2

The Bad Scheme

[Diagram: TRNG → bijective, easily invertible quasigroup transformation → Output]

In their FSE 2005 paper, “Unbiased Random Sequences from Quasigroup String Transformations”, Markovski, Gligoroski, and Kocarev suggested this scheme for TRNG post-processing.

What is a Quasigroup? (I)

A quasigroup is a set Q with a mapping * : Q × Q → Q such that all equations of the form a * x = b and y * a = b are uniquely solvable for x and y, for all a and b.

What is a Quasigroup? (II)

A function is a quasigroup iff its function table is a Latin square.

*   0 1 2 3
0   2 1 0 3
1   3 0 1 2
2   1 2 3 0
3   0 3 2 1

(row operand * column operand)
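As an aside (a minimal sketch, not from the original talk; the function name is ours), the Latin-square property of an order-4 table is easy to check mechanically, assuming all entries lie in 0..3:

#include <stdbool.h>

/* A table is a Latin square iff every value occurs exactly once
   in each row and each column (entries assumed to be 0..3). */
bool is_latin_square(const unsigned char t[4][4])
{
    for (int i = 0; i < 4; i++) {
        unsigned row_seen = 0, col_seen = 0;
        for (int j = 0; j < 4; j++) {
            row_seen |= 1u << t[i][j];
            col_seen |= 1u << t[j][i];
        }
        if (row_seen != 0xFu || col_seen != 0xFu)
            return false;
    }
    return true;
}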

The e-Transformation

The e-transformation maps a string a1a2…an and a "leader" b0 (with b0 * b0 ≠ b0) to the string b1b2…bn by

bi = bi-1 * ai for i = 1, …, n

[Diagram: a1, a2, a3 are fed in successively; b0 * a1 = b1, b1 * a2 = b2, b2 * a3 = b3]

The E-Algorithm

E-algorithm: k-fold application of the e-transformation (fixed leader and quasigroup).

According to the recommendations of the original paper for highly biased input, we choose k=128 for a quasigroup of order 4.
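A minimal C sketch of both steps (not the original authors' code; Q is the example quasigroup from the earlier slide, whose symbols are the 2-bit values 0..3, and any leader works for this Q since Q[b][b] ≠ b for all b):

#include <stddef.h>

/* The example order-4 quasigroup from the earlier slide */
static const unsigned char Q[4][4] = {
    {2, 1, 0, 3},
    {3, 0, 1, 2},
    {1, 2, 3, 0},
    {0, 3, 2, 1}
};

/* e-transformation: b[i] = b[i-1] * a[i], chained from the leader */
void e_transform(const unsigned char *a, unsigned char *b,
                 size_t n, unsigned char leader)
{
    unsigned char prev = leader;
    for (size_t i = 0; i < n; i++) {
        prev = Q[prev][a[i]];
        b[i] = prev;
    }
}

/* E-algorithm: k-fold application with a fixed leader, in place;
   k = 128 is the parameter choice discussed above */
void E_algorithm(unsigned char *s, size_t n, unsigned char leader, int k)
{
    for (int j = 0; j < k; j++)
        e_transform(s, s, n, leader);
}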

The Good News about the Bad Scheme

As the quasigroup mapping is bijective, it can do no harm.

The entropy of the output is just the entropy of the input.

The HB TRNG

The authors of the quasigroup post-processing paper claim that it is suitable for highly biased input like 99.9 % 0-bits and 0.1 % 1-bits (bias −0.499).

We call this generator HB (for High Bias)

Attack

We attack HB post-processed with the E-algorithm based on a quasigroup of order 4 and k = 128.

As almost all input bits are 0, we guess them to be 0 and determine the output by applying the E-algorithm.

The probability of guessing two bits correctly is 0.999² = 0.998001.

If we guess wrongly, we use the inverse E-Algorithm to determine the correct input for continuing the attack.

Attack with Quasigroup Unknown

It does not help much to keep the quasigroup and leader secret, as there are only 1728 choices of quasigroup of order 4 and leader.

Simplified attack suggested by an anonymous reviewer: apply the inverse E-algorithm for all 1728 choices; the correct one is identified by many 0-bits in the output.
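Both the correction step above and the reviewer's simplified attack rely on inverting the e-transformation. Since a * x = b is uniquely solvable in a quasigroup, this only takes a left-division table; a sketch reusing Q from above (names are ours):

/* Left division: LD[u][v] is the unique x with Q[u][x] == v */
static unsigned char LD[4][4];

void build_left_division(void)
{
    for (int u = 0; u < 4; u++)
        for (int x = 0; x < 4; x++)
            LD[u][Q[u][x]] = (unsigned char)x;
}

/* Inverse e-transformation: recover a[] from b[] and the leader */
void inverse_e_transform(const unsigned char *b, unsigned char *a,
                         size_t n, unsigned char leader)
{
    unsigned char prev = leader;
    for (size_t i = 0; i < n; i++) {
        a[i] = LD[prev][b[i]];
        prev = b[i];
    }
}

Applying inverse_e_transform k times inverts the E-algorithm; for the simplified attack one runs this for each of the 1728 quasigroup/leader choices and looks for an output dominated by 0-bits.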

What is Going on in the E-Algorithm?

Bias is replaced with dependency, and this is achieved very slowly.

And now for something quite different

One anonymous FSE 2007 reviewer: The paper needs to be much more up-front about the fact that you are demolishing apples while promoting the virtues of oranges.

We have to give up the idea of bijective post-processing (apples) of random numbers and look at compressing functions instead (oranges).

Von Neumann Post-Processing

John von Neumann (1951)

00 → discard
01 → 0
10 → 1
11 → discard

For statistically independent but biased input: perfectly balanced and independent output.

Problem: Unbounded latency
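A minimal sketch of the extractor in C (function and buffer names are ours):

#include <stddef.h>

/* Von Neumann extractor: consume input bits in non-overlapping pairs.
   01 -> 0, 10 -> 1; 00 and 11 produce nothing, hence the unbounded latency.
   Returns the number of output bits written. */
size_t von_neumann(const unsigned char *in, size_t n, unsigned char *out)
{
    size_t produced = 0;
    for (size_t i = 0; i + 1 < n; i += 2)
        if (in[i] != in[i + 1])
            out[produced++] = in[i];   /* 01 -> 0, 10 -> 1 */
    return produced;
}

Each kept pair is 01 or 10, and for independent bits these two patterns are equally likely (both have probability p(1 − p)), which is why the output is exactly balanced.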

A Dilemma

Perfect output statistics and bounded latency exclude each other.

Popular Examples for Bounded Latency Algorithms

XOR

Feeding the RNG bits into an LFSR, reading output from the LFSR at a lower rate (see the sketch below)
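A sketch of the LFSR approach (the register length, taps, seed, and rate here are illustrative assumptions, not from the talk):

#include <stdint.h>

static uint32_t lfsr = 0xACE1u;   /* arbitrary nonzero initial state */

/* Clock one raw RNG bit into a 32-bit Fibonacci LFSR
   (taps 32, 22, 2, 1; chosen here only for illustration) */
void lfsr_feed(unsigned bit)
{
    uint32_t fb = ((lfsr >> 31) ^ (lfsr >> 21) ^ (lfsr >> 1) ^ lfsr ^ bit) & 1u;
    lfsr = (lfsr << 1) | fb;
}

/* Read one output bit per two input bits (input/output rate 2) */
unsigned lfsr_postprocess(unsigned bit0, unsigned bit1)
{
    lfsr_feed(bit0);
    lfsr_feed(bit1);
    return lfsr & 1u;
}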

Algorithms for Fixed Input/Output Rate

No perfect solution! We consider the input/output rate 2.

For single bits: XOR is optimal!

Bias after XOR: 2ε². With p = ½ + ε, P(a ⊕ b = 1) = 2p(1 − p) = ½ − 2ε², so a bias of ε is reduced to one of magnitude 2ε².

What we are Looking for

Input: 16 bits
Output: 8 bits

Input is assumed to be statistically independent, but biased. We cannot assume to know the numerical value of the bias ε.

The Function H

2 Bytes are mapped to 1.

The Function H in C

unsigned char rotateleft(unsigned char x, int n)   /* 8-bit left rotation; helper the slide presupposes */
{
    return (unsigned char)(((x << n) | (x >> (8 - n))) & 0xFF);
}

unsigned char H(unsigned char a, unsigned char b)
{
    return a ^ rotateleft(a, 1) ^ b;   /* ^ is XOR in C */
}

Entropy Comparison: H and XOR

2 bytes are mapped to 1 byte.

What about Low Biases?

Probability of a 1-bit: 0.51 (bias 0.01)

Entropy of one output byte with XOR: 7.9999990766751
Entropy of one output byte with H: 7.9999999996305, which is 2499 times closer to 8.
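Figures like these can be reproduced by exhaustive enumeration over all 2¹⁶ inputs; a self-contained sketch (ours, not the author's program; compile with -lm):

#include <math.h>
#include <stdio.h>

static unsigned char rotl8(unsigned char x, int n)
{
    return (unsigned char)(((x << n) | (x >> (8 - n))) & 0xFF);
}

int main(void)
{
    const double p = 0.51;   /* probability of a 1-bit, bias 0.01 */
    double ph[256] = {0}, px[256] = {0};

    /* Each 16-bit input of Hamming weight w has probability p^w (1-p)^(16-w) */
    for (unsigned v = 0; v < 0x10000u; v++) {
        unsigned char a = (unsigned char)(v & 0xFF);
        unsigned char b = (unsigned char)(v >> 8);
        int w = 0;
        for (unsigned t = v; t; t >>= 1)
            w += (int)(t & 1u);
        double pr = pow(p, w) * pow(1.0 - p, 16 - w);
        ph[(unsigned char)(a ^ rotl8(a, 1) ^ b)] += pr;   /* H   */
        px[(unsigned char)(a ^ b)]               += pr;   /* XOR */
    }

    /* Shannon entropy of the output byte distributions */
    double eh = 0.0, ex = 0.0;
    for (int i = 0; i < 256; i++) {
        if (ph[i] > 0.0) eh -= ph[i] * log2(ph[i]);
        if (px[i] > 0.0) ex -= px[i] * log2(px[i]);
    }
    printf("H:   %.13f\nXOR: %.13f\n", eh, ex);
    return 0;
}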

Probabilities of Raw Bytes

Byte Probabilities for XOR
[table not reproduced in the transcript]

Byte Probabilities for H (Part)
[table not reproduced in the transcript]

Why H is so Good and a New Challenge

That the lowest power of ε in the byte probabilities of H is ε³ explains why H is better than XOR, which has ε² terms.

Challenge: make further powers of ε disappear!

The Functions H2 and H3 in C

unsigned char H2(unsigned char a, unsigned char b)
{
    return a ^ rotateleft(a, 1) ^ rotateleft(a, 2) ^ b;
}

unsigned char H3(unsigned char a, unsigned char b)
{
    return a ^ rotateleft(a, 1) ^ rotateleft(a, 2) ^ rotateleft(a, 4) ^ b;
}

Properties of H2 and H3

Lowest ε-power in the byte probabilities:

H2: ε⁴

H3: ε⁵

Going Further

Of course, we also want to get rid of ε⁵!

It seems that linear methods cannot achieve this.

What must be done?

We must partition the 2¹⁶ 16-bit values into 256 sets of 256 elements each, in such a way that in the sums of the probabilities of each set the powers ε¹ through ε⁵ cancel out.

The probabilities of the 16-bit values depend only on the Hamming weight w, and there are only 17 possibilities (w = 0, …, 16). The different Hamming weights occur with different frequencies.

Occurrences and Probabilities for 16-bit-values

Observation

If we add the probability of a 16-bit tuple and the probability of its bitwise complement, then all odd ε-powers cancel out. (A tuple of Hamming weight w has probability p^w q^(16−w) with p = ½ + ε, q = ½ − ε; its complement has probability p^(16−w) q^w, and the sum is invariant under ε → −ε.) So we add them to our sets only together.

Considerable simplification of the problem

The Simplified Problem

The Solution S

The 256 sets of the solution S fall into 7 types:

[Table: the seven set types A to G, with the number of sets of each type and their composition by Hamming weight (w = 0, …, 8); not legibly reproduced in the transcript]

Byte Probabilities of S

Byte Probabilities of S and XOR

Entropy Comparison of S, H, and XOR

Negative Results

The ε⁶ terms cannot be eliminated. (Proved by linear programming techniques.)

When considering mappings from 32 to 16 bits, the probabilities of the output values contain ninth or lower powers of ε.

Conclusion

The quasigroup TRNG post-processing suggested by Markovski, Gligoroski, and Kocarev does not work. It is based on faulty mathematics.

The fixed input/output rate TRNG post-processing functions suggested in this talk are considerably better than the previously known algorithms. There are open questions concerning the systematic construction of such functions.