[Ach] removed outdated info on Linux RNG / haveged

micah micah at riseup.net
Sun May 7 17:49:52 CEST 2017


Aaron Zauner <azet at azet.org> writes:

> Also the use of `haveged` is recommended, which is a bad idea as this
> daemon can create blocking situations during key generation
> effectively creating a deadlock and thus security problems.

Can you detail what those blocking cases are? 

> haveged's design is from 2002, it has never been audited, there're only
> papers by the original authors available. Additionally, the design
> rationale is based on 2002 ISA for architectures like UltraSparc II -
> these are far from relevant these days. The removed section already
> mentioned that haveged's memory footprint is too high for embedded
> use-cases, additionally in most embedded boards the design will not
> even work.

I accept these as valid reasons to remove haveged. However, are there
not cases where haveged is still useful, even if it is not perfect? If
it is producing entropy, even poor entropy, that is then mixed into the
entropy pool with other sources, isn't the output essentially
impossible to predict? Even if you know the bits that are delivered
into the entropy pool, once they are mixed with bits you don't know,
that knowledge might help in calculating the result, but it would be
far from easy.

I haven't kept up on the newer kernel pool changes, but let's say
haveged is so bad that someone can pre-determine a predictable pattern
in what it will spit out (while it somehow still passes FIPS-140). You
feed this data into /dev/random, which hands it to
drivers/char/random.c, where it is run through a SHA hash and mixed
before it actually feeds the kernel entropy pool.
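
To make that concrete (a hedged sketch of the userspace side, not
haveged's actual code, and assuming a ~4.x-era drivers/char/random.c):
there are two ways a daemon can hand bytes to the pool, and only one
of them lets it claim entropy credit.

/* Sketch: how userspace data reaches the kernel pool (Linux ~4.x).
 * A plain write() to /dev/random mixes the bytes into the input pool
 * but credits no entropy; the RNDADDENTROPY ioctl (which haveged uses)
 * mixes the bytes *and* credits however many bits the caller claims. */
#include <fcntl.h>
#include <linux/random.h>
#include <stdlib.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

int main(void)
{
    unsigned char buf[32] = {0};   /* bytes from some userspace source */

    int fd = open("/dev/random", O_WRONLY);
    if (fd < 0)
        return 1;

    /* 1) Mix only: stirred into the pool, entropy estimate unchanged.
     *    Harmless even if buf is fully attacker-known. */
    if (write(fd, buf, sizeof(buf)) < 0)
        return 1;

    /* 2) Mix and credit: the kernel trusts the entropy_count we claim,
     *    so this is where a broken source actually matters. */
    struct rand_pool_info *rpi = malloc(sizeof(*rpi) + sizeof(buf));
    if (!rpi)
        return 1;
    rpi->entropy_count = 8 * sizeof(buf);   /* claimed bits */
    rpi->buf_size = sizeof(buf);
    memcpy(rpi->buf, buf, sizeof(buf));
    ioctl(fd, RNDADDENTROPY, rpi);          /* needs CAP_SYS_ADMIN */

    free(rpi);
    close(fd);
    return 0;
}

Whether the entropy claim in case (2) is honest is exactly the open
question about haveged's design; the mixing itself happens either way.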

However, if that data is mixed with other random data from the system,
are we comfortable accepting that the SHA of that mixture is totally
unpredictable? Even if an attacker knows their input and knows the hash
function, and thus could calculate what the hash of that input alone
would be, they don't know the initial pool state, and so cannot know
how their input affects it when it goes through the mixing function. So
all they have done is add entropy.

The haveged daemon has not been audited, so it could have some logic
mistakes in it, or maybe it has been compromised entirely. But, "the
fact that an intelligent attacker can construct inputs that will produce
controlled alterations to the pool's state is not important because we
don't consider such inputs to contribute any randomness.  The only
property we need with respect to them is that the attacker can't
increase his/her knowledge of the pool's state. Since all additions are
reversible (knowing the final state and the input, you can reconstruct
the initial state), if an attacker has any uncertainty about the initial
state, he/she can only shuffle that uncertainty about, but never cause
any collisions (which would decrease the uncertainty)." (comments from
random.c).
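
A toy illustration of that comment (my own sketch, nothing like the
kernel's real mixing code): as long as the mix step is invertible for
any fixed input, attacker-chosen data can only permute the set of
possible pool states, never merge two of them, so the attacker's
uncertainty about the state never shrinks.

/* Toy, invertible "mix" (NOT the kernel's actual function).  For any
 * fixed input, state -> mix(state, input) is a bijection, so feeding
 * in known data maps the possible states one-to-one and can never
 * cause a collision (which is what would lose uncertainty). */
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

static uint32_t rol32(uint32_t x, int r) { return (x << r) | (x >> (32 - r)); }
static uint32_t ror32(uint32_t x, int r) { return (x >> r) | (x << (32 - r)); }

static uint32_t mix(uint32_t state, uint32_t input)   { return rol32(state, 7) ^ input; }
static uint32_t unmix(uint32_t state, uint32_t input) { return ror32(state ^ input, 7); }

int main(void)
{
    uint32_t secret_state = 0xdeadbeef;  /* unknown to the attacker */
    uint32_t known_input  = 0x41414141;  /* fully attacker-controlled */

    uint32_t after = mix(secret_state, known_input);

    /* Knowing the final state and the input walks you back to the old
     * state -- but the attacker knows neither state, so their input only
     * shuffles an unknown value around; it never reveals anything. */
    printf("recovered: %08" PRIx32 " (started from %08" PRIx32 ")\n",
           unmix(after, known_input), secret_state);
    return 0;
}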

Isn't a mixed entropy pool that isn't depleted better than a depleted
pool that spits out weak results?
