Cybersecurity professionals have long worried that the eventual introduction of useful, fault-tolerant quantum computers will make modern encryption keys obsolete, exposing massive amounts of data encrypted today to bad actors within a few years.

Today’s AES and RSA encryption models rely on the randomness of keys and prime numbers to thwart bad actors trying to use computers to decrypt the data they’ve stolen.

“If a password or code is an unguessable string of numbers, it’s harder to crack,” experts at the National Institute of Standards and Technology (NIST) wrote this month. “Many of our cryptographic systems today use random number generators to produce secure keys.”

A lot of work already is being done to create and deploy post-quantum encryption amid worries that threat groups will not only crack the encryption of data stolen in the future but also decrypt data already exfiltrated, which bad actors may be holding on to until they can use quantum computing to unlock it – a tactic referred to as “harvest now, decrypt later.”

NIST last year released three post-quantum cryptography standards.

The Need for Real Randomness

The problem is that such so-called random number generation really isn’t purely random, the scientists wrote, adding that “classical computer algorithms can only create pseudo-random numbers, and someone with enough knowledge of the algorithm or the system could manipulate it or predict the next number.”
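The predictability NIST describes is easy to demonstrate. Below is a minimal Python sketch (an illustration, not any particular system's generator): a classical pseudo-random generator re-seeded with the same value reproduces its entire output, while a cryptographically secure generator draws on operating-system entropy and has no seed for an attacker to learn.

```python
import random
import secrets

# A classical PRNG is deterministic: anyone who knows the seed
# (or recovers the internal state) can reproduce every "random" value.
random.seed(42)
first_run = [random.randint(0, 255) for _ in range(4)]

random.seed(42)  # same seed -> identical sequence, fully predictable
second_run = [random.randint(0, 255) for _ in range(4)]
assert first_run == second_run

# A cryptographically secure generator draws on OS entropy instead,
# so there is no seed value for an attacker to learn or guess.
key = secrets.token_bytes(32)  # e.g., a 256-bit symmetric key
print(len(key) * 8)  # 256
```

Even the secure variant, however, ultimately depends on the quality of the entropy the operating system collects – which is the gap certified quantum randomness aims to close.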

Real randomness is important not only in encryption but also in a wide range of areas of modern society: selecting jury candidates, assigning resources through public lotteries, ensuring unpredictable outcomes in gaming (dice rolls and card shuffling), statistical analysis (a random sample standing in for a larger population), simulations (modeling real-world scenarios), and blockchain technology (node allocation and shard reconfiguration).
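For the everyday uses above, Python's standard library already exposes an OS-entropy-backed generator. The sketch below is illustrative only; the sizes and names are arbitrary choices for the example.

```python
import secrets

# secrets.SystemRandom draws from OS entropy rather than a fixed seed.
rng = secrets.SystemRandom()

deck = list(range(52))
rng.shuffle(deck)                            # card shuffling
roll = rng.randint(1, 6)                     # a dice roll
jury_pool = rng.sample(range(10_000), k=12)  # selecting 12 candidates

assert sorted(deck) == list(range(52))  # same cards, new order
assert 1 <= roll <= 6
assert len(set(jury_pool)) == 12        # no candidate drawn twice
```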

“True randomness is something that nothing in the universe can predict in advance,” NIST physicist Krister Shalm said in the report, adding that even if a random number generator used seemingly random processes in nature, it’s difficult to verify that those numbers are truly random.

Enter CURBy

NIST and the University of Colorado Boulder have developed a way to verify the randomness of numbers by building a random number generator that uses quantum entanglement as a certifiable source of randomness. The system, called the Colorado University Randomness Beacon – or CURBy – is publicly accessible and uses a protocol that enables any user to verify its outputs.

CURBy, detailed in a paper published in Nature, is the product of a NIST-run Bell test – a physics experiment that tests whether the universe operates according to quantum mechanics – that creates raw randomness from quantum entanglement, in this case entangled photon pairs. According to the NIST scientists behind CURBy, this quantum nonlocality ensures the outcomes are fundamentally unpredictable and thus random.

The system takes the random results of the Bell test, refines them, and makes them into random numbers that are available to anyone through a website.

“The Bell test measures pairs of ‘entangled’ photons whose properties are correlated even when separated by vast distances,” the researchers wrote. “When researchers measure an individual particle, the outcome is random, but the properties of the pair are more correlated than classical physics allows, enabling researchers to verify the randomness.”

“CURBy is one of the first publicly available services that operates with a provable quantum advantage. That’s a big milestone for us,” Shalm said. “The quality and origin of these random bits can be directly certified in a way that conventional random number generators are unable to.”

Early Steps

NIST took the first steps toward CURBy with experimental Bell tests in 2015 that established that quantum mechanics is truly random, and three years later it developed methods for using Bell tests to build the first sources of true randomness.

Over the past few years, the NIST team hardened the experiment and had it run automatically so it could provide random numbers on demand.

The process begins by generating a pair of entangled photons inside a nonlinear crystal. The photons travel through optical fiber to separate labs, where their polarizations are measured; the results of those measurements are truly random. The process is repeated 250,000 times a second. The data is sent to a computer at the university, processed into 512-bit strings of binary numbers, and then published.
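The shape of that pipeline – many raw measurement bits in, one published 512-bit string out – can be sketched in a few lines of Python. This is purely illustrative: CURBy uses certified randomness-extraction protocols, not a plain hash, and the function name and batch size here are invented for the example.

```python
import hashlib
import secrets

def publish_block(raw_bits: bytes) -> str:
    """Condense a batch of raw measurement data into a 512-bit hex string.

    Illustrative sketch only: SHA-512 stands in for the real certified
    extraction step, showing only the many-bits-in, 512-bits-out shape.
    """
    return hashlib.sha512(raw_bits).hexdigest()

# Stand-in for one second of Bell-test outcomes (250,000 measurements).
raw = secrets.token_bytes(250_000 // 8)
out = publish_block(raw)
print(len(out) * 4)  # 512 (each hex character encodes 4 bits)
```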

The Twine Protocol

Key to everything is the Twine protocol, which ensures transparency. Every step of the randomness generation process can be verified by the protocol, which the researchers described as a “novel set of quantum-compatible blockchain technologies” that marks each set of data with a hash, a digital fingerprint that allows any user to verify the data behind each random number.

Intertwining the hash chains creates timestamps that link the data together into a traceable structure, allowing anyone using the Twine protocol to immediately see when data has been manipulated.
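The tamper-evidence property of a hash chain can be shown with a toy example. This is a generic sketch of the hash-chain idea, not the actual Twine protocol: each entry commits to the previous entry's hash, a timestamp, and its data, so altering any record breaks every later link.

```python
import hashlib
import json
import time

def chain(records):
    """Build a toy hash chain over a list of data records."""
    prev = "0" * 64
    blocks = []
    for data in records:
        block = {"prev": prev, "time": time.time(), "data": data}
        # Hash the canonical serialization of the block.
        prev = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
        blocks.append((block, prev))
    return blocks

def verify(blocks):
    """Recompute every hash; any edited record breaks the chain."""
    prev = "0" * 64
    for block, h in blocks:
        if block["prev"] != prev:
            return False
        if hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest() != h:
            return False
        prev = h
    return True

blocks = chain(["batch-1", "batch-2", "batch-3"])
assert verify(blocks)
blocks[1][0]["data"] = "tampered"  # any edit...
assert not verify(blocks)          # ...is immediately detectable
```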

“The Twine protocol lets us weave together all these other beacons into a tapestry of trust,” said Jasper Palfree, a research assistant on the project at the university.

In its first 40 days of operation, the protocol produced random numbers at a 99.7% success rate – 7,434 successes out of 7,454 tries.
