Entropy sources: How do NIST rules impact risk assessments?

NIST recently released new guidance on entropy sources used for random bit generation. Judith Myerson explains these recommendations and how they alter cryptography principles.

Entropy in computing is the randomness collected by an operating system or application for use in cryptography or other processes that require random data. This randomness -- also known as the unpredictability of data -- is often collected from entropy sources, including hardware sources -- such as mouse movements -- and specially provided randomness generators.
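To make this concrete, the short Python sketch below simply draws bytes from the operating system's entropy pool, which the OS fills from hardware events such as interrupt timings and input activity. It is a minimal illustration of consuming OS-collected entropy, not a NIST-prescribed interface, and the 32-byte request size is only an example.

```python
# Minimal sketch: drawing randomness the OS has collected from hardware
# events. The secrets module wraps the OS entropy interface
# (e.g., /dev/urandom or getrandom) for cryptographic use.
import secrets

key_material = secrets.token_bytes(32)  # 256 bits from the OS entropy pool (size is illustrative)
print(key_material.hex())
```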

The goal of generating random data is to make it extremely difficult for an adversary to guess a company's cryptographic keys. However, the entropy sources used for random bit generation are not always tested properly, and some programmers can exploit those testing flaws to gain a better chance of predicting sequences of random bits.

In addition, the methods of identifying and testing entropy sources vary from one operating system to another; a testing approach that works well for one operating system may not work well for another. Because of this, the National Institute of Standards and Technology (NIST) recommends identifying and testing multiple entropy sources and treating them as an important part of an enterprise's risk assessment.

Entropy sources

NIST divides an entropy source into three main components: a noise source, an optional conditioning component and a health testing component.

Because the noise source is the most important asset in an enterprise risk assessment, developers should be interviewed about how well they understand its entropy behavior and whether they can produce a consistent source of entropy. NIST recommends that only independent noise sources, such as thermal noise and keystroke timings, be considered. Dependent noise sources, such as packet arrival times in a communication network and hard drive access times, should not be considered.

If the noise source sample data is not binary, it is digitized into bits. The noise source may also include post-processing operations that improve the entropy rate of the data. The digitized output, together with any post-processed noise source data, is referred to as raw data.
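As a hypothetical illustration of digitization, the sketch below treats simulated 8-bit analog-to-digital converter readings as non-binary noise samples and reduces each one to a single bit by keeping its least significant bit; the sample values, sample count and bit-selection rule are all assumptions made for the example, not part of the NIST guidance.

```python
# Hypothetical digitization step: non-binary noise samples (simulated
# 8-bit ADC readings) are reduced to bits by keeping the least
# significant bit of each sample. The resulting bit stream, together
# with any post-processed samples, forms the "raw data" described above.
import random

adc_samples = [random.randint(0, 255) for _ in range(16)]  # stand-in for analog noise readings
raw_bits = [s & 1 for s in adc_samples]                     # digitize: keep the least significant bit
print(adc_samples)
print(raw_bits)
```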

The optional conditioning component is the second asset; it is used to improve the entropy rate of the output bits, and a higher rate means a cryptographic implementation built on those bits is more likely to succeed. A developer should be able to relate the entropy rate to the behavior of the noise source, review NIST's list of approved cryptographic algorithms for guidance, or choose an alternative algorithm to implement the conditioning component.
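The sketch below illustrates one way a conditioning component might look, assuming SHA-256 is the chosen conditioning function; the input size, output size and use of os.urandom as a stand-in for raw noise are assumptions for the example rather than requirements from the guidance.

```python
# Sketch of an optional conditioning component: raw noise bytes with a
# modest entropy rate are compressed into a shorter output whose per-bit
# entropy rate is higher. SHA-256 is used here only as an illustrative
# conditioning function.
import hashlib
import os

raw_noise = os.urandom(64)                        # stand-in for 64 bytes of raw noise data
conditioned = hashlib.sha256(raw_noise).digest()  # 32 bytes of conditioned output
print(conditioned.hex())
```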

As the third asset, health tests should be conducted to ensure the noise source operates properly. If the entropy source fails its startup, continuous or on-demand tests, error messages should indicate the cause of the failure.
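As an example of what a continuous health test can look like, the sketch below follows the spirit of the repetition count test NIST describes, which flags a noise source that gets stuck repeating one value; the claimed entropy per sample, the false-alarm exponent and the sample stream are all illustrative assumptions.

```python
# Sketch of a repetition-count-style health test: fail if any value
# repeats in a run longer than a cutoff derived from the claimed
# entropy per sample (H) and a false-alarm probability of 2^-20.
import math

def repetition_count_test(samples, h_per_sample, alpha_exp=20):
    cutoff = 1 + math.ceil(alpha_exp / h_per_sample)  # C = 1 + ceil(20 / H)
    run_value, run_length = None, 0
    for s in samples:
        if s == run_value:
            run_length += 1
            if run_length >= cutoff:
                return False  # health test failure: value repeated too often
        else:
            run_value, run_length = s, 1
    return True

print(repetition_count_test([3, 7, 7, 2, 9, 9, 9, 1], h_per_sample=2.0))  # True for this short stream
```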

NIST suggests three command interfaces to conduct the tests: GetEntropy, GetNoise and HealthTest.

A GetEntropy call returns a string for the requested entropy, while a GetNoise call returns raw, digitized samples from the noise source, along with the status of the request. Likewise, a HealthTest call asks the entropy source to conduct a test of its health. This call is worth supporting, as it is acceptable for validation under Federal Information Processing Standard 140, which covers security requirements for cryptographic modules.
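The sketch below shows how those three calls might be wrapped in code; the class name, method signatures and return values are illustrative assumptions (with the OS entropy pool standing in for a real noise source), not an API defined by NIST.

```python
# Illustrative wrapper around the three conceptual calls NIST describes.
import os

class EntropySource:
    def get_entropy(self, n_bits):
        """GetEntropy: return a bit string carrying the requested entropy."""
        return os.urandom((n_bits + 7) // 8)

    def get_noise(self, n_samples):
        """GetNoise: return raw, digitized noise samples plus a request status."""
        samples = list(os.urandom(n_samples))  # stand-in for raw digitized samples
        return samples, "SUCCESS"

    def health_test(self):
        """HealthTest: ask the source to run its health tests and report the result."""
        samples, status = self.get_noise(1024)
        return "SUCCESS" if status == "SUCCESS" and len(set(samples)) > 1 else "FAILURE"

source = EntropySource()
print(source.get_entropy(128).hex())
print(source.health_test())
```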

Entropy source validation

Entropy source validation ensures that all the requirements in the guidance are met; as part of the process, the developer submits the entropy source to an accredited lab for validation testing.

The process of entropy validation starts by collecting the entropy estimation data from entropy sources. Entropy is estimated by using an independent and identically distributed (IID) track and a non-IID track; the IID track is used for entropy sources that generate independent and identically distributed samples. After determining the entropy estimation for both tracks, a min-entropy estimate per sample is calculated -- this is used to measure the difficulty of guessing the entropy source's output, which contains the secret values of cryptographic keys.
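To show what a per-sample min-entropy estimate means, the sketch below applies the simplest IID-track estimator, the most common value estimate, to a toy sample set: the min-entropy per sample is -log2(p_max), where p_max is the relative frequency of the most common value. The data is made up, and the confidence-interval correction NIST applies to p_max is omitted for brevity.

```python
# Most common value estimate (IID track): min-entropy = -log2(p_max).
import math
from collections import Counter

samples = [0, 1, 1, 2, 3, 1, 0, 2, 1, 3, 2, 0, 1, 3, 2, 1]  # toy sample data
p_max = Counter(samples).most_common(1)[0][1] / len(samples)
min_entropy = -math.log2(p_max)
print(f"p_max = {p_max:.3f}, min-entropy ~ {min_entropy:.3f} bits per sample")
```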

If the output passes the restart test, the entropy estimate is updated: when conditioning is used, the estimate is updated and then validated; when conditioning is not used, the estimate is validated but not updated. If the output does not pass the restart test, validation fails and the entropy cannot be estimated.

Entropy overestimation can occur when the entropy estimate is calculated from a single, long output sequence, and an attacker who can access multiple noise source outputs after a restart could exploit that overestimation. To mitigate this risk, the developer should conduct restart tests that re-evaluate the entropy estimate for the noise source using outputs from multiple restarts of the source. Once the re-evaluated entropy estimate is acceptable, the developer can repeat the entropy validation testing process.
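The sketch below illustrates the restart idea: samples are collected from several restarts of the noise source, min-entropy is estimated across the rows (within a restart) and the columns (across restarts), and the smallest estimate is kept so that a single long sequence cannot overstate the entropy. The number of restarts, samples per restart and the most-common-value estimator are illustrative assumptions.

```python
# Restart-style re-evaluation: take the minimum of row-wise and
# column-wise min-entropy estimates over a matrix of restart samples.
import math
import random
from collections import Counter

def mcv_min_entropy(samples):
    """Most common value estimate of min-entropy per sample."""
    p_max = Counter(samples).most_common(1)[0][1] / len(samples)
    return -math.log2(p_max)

restarts = [[random.randint(0, 3) for _ in range(64)] for _ in range(8)]  # 8 restarts x 64 samples
row_estimate = min(mcv_min_entropy(row) for row in restarts)              # within each restart
col_estimate = min(mcv_min_entropy(col) for col in zip(*restarts))        # across restarts
print(f"restart-based min-entropy estimate: {min(row_estimate, col_estimate):.3f} bits per sample")
```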

Conclusion

NIST guidance on entropy sources can assist in implementing the recommendations of an enterprise risk assessment. Entropy source validation is evolving, and the impact of future changes in entropy sources, such as identification and testing methods, should be monitored.
