Instead of relying on the difficulty of factoring large numbers, a McEliece system hides your data by first encoding it with an error-correcting code (ECC) and then deliberately adding noise. Decoding a general, seemingly random linear code is believed to be hard even for quantum computers, which is the whole point of this construction.

Normally, ECCs make it easy to recover a signal that has had noise added to it; they’re not “codes” in the encryption sense at all. The crucial step in McEliece cryptography is that you start with an easy-to-decode code and then disguise it as a hard-to-decode one by pre- and post-multiplying its generator matrix by secret matrices. Those matrices, which turn the hard problem back into the easy one and vice versa, become the private key.
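The scrambling step can be sketched in a few lines of numpy. This is a toy, not real McEliece: it uses the tiny Hamming(7,4) code as the stand-in "easy" code (real systems use much larger Goppa codes), and the names `G`, `S`, and `P` follow the standard textbook notation for the generator, scrambler, and permutation matrices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generator matrix of the Hamming(7,4) code: an "easy" code whose
# single-bit errors are trivially correctable (toy stand-in for the
# large Goppa codes real McEliece uses).
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])

def random_invertible(k):
    """Random invertible k x k matrix over GF(2)."""
    while True:
        S = rng.integers(0, 2, size=(k, k))
        if round(np.linalg.det(S)) % 2 == 1:  # odd determinant => invertible mod 2
            return S

k, n = G.shape
S = random_invertible(k)                      # secret scrambler
P = np.eye(n, dtype=int)[rng.permutation(n)]  # secret column permutation
G_pub = (S @ G @ P) % 2                       # published, "hard-looking" generator

# Encryption: encode a 4-bit message with the public matrix,
# then deliberately flip one bit (the added noise).
m = np.array([1, 0, 1, 1])
c = (m @ G_pub) % 2
e = np.zeros(n, dtype=int)
e[rng.integers(n)] = 1
ciphertext = c ^ e
```

Only someone holding `S` and `P` can undo the scrambling, strip the noise with the easy code's decoder, and recover `m`; an attacker sees only `G_pub`, which looks like a random code.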

If they are realized, quantum computers could accelerate the development of new chemistries, drugs, and materials. The systems also could break much of today's public-key encryption, which has made their development a top priority for several nations.

Across the board, quantum computing could provide companies and countries with a competitive edge.

“Quantum computing is at the forefront of national initiatives,” said Amy Leong, senior vice president at FormFactor. “There have been more than $20 billion in investments announced across 15 countries here. Geopolitical powerhouses like the U.S. and China are certainly leading the race to claim quantum supremacy, followed by a host of others from Europe and Asia.”

The race is heating up among nations as well as between different organizations.

[Figure: quantum computing system. Source: Intel]

A quantum system consists of a processor that incorporates the qubits. The gates that operate on those qubits come in two kinds: one-qubit gates and two-qubit gates.
Let’s say you have a quantum processor with 16 qubits, arranged in a two-dimensional 4 × 4 array.
The first three rows (top to bottom) may be used for one-qubit gates, while the last row may be used for two-qubit gates.
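The distinction between the two kinds of gates is easy to see in matrix form. Below is a minimal numpy sketch (not tied to any particular hardware) using the two most common examples: the one-qubit Hadamard gate and the two-qubit CNOT gate.

```python
import numpy as np

# One-qubit gate: the Hadamard H sends |0> to an equal superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Two-qubit gate: CNOT flips the target qubit when the control qubit is |1>.
CNOT = np.array([
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
])

ket0 = np.array([1.0, 0.0])    # |0>
ket1 = np.array([0.0, 1.0])    # |1>

plus = H @ ket0                # (|0> + |1>)/sqrt(2): measures 0 or 1, each with prob 1/2
state_10 = np.kron(ket1, ket0) # |10>: control qubit on, target off
state_11 = CNOT @ state_10     # CNOT turns |10> into |11>
```

A one-qubit gate is a 2 × 2 matrix acting on a single qubit's state; a two-qubit gate is a 4 × 4 matrix acting on the joint state of a pair, which is what lets qubits influence each other.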

The processing functions are complex.
In classical computing, you put a number into the computer, it calculates the function, and gives you an output.

Let’s say you have a problem with n bits of data. “If you have n bits, you have 2^n possible states. That’s an exponentially large number of states, and a classical computer can only work on them one at a time.”
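That exponential growth is visible if you simulate a quantum register directly. A minimal sketch (the function name `uniform_superposition` is ours, for illustration): describing n qubits classically takes a vector of 2^n amplitudes, which is exactly why such simulations blow up so quickly.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def uniform_superposition(n):
    """State vector for n qubits, each sent through a Hadamard gate.
    The register needs 2**n amplitudes, one per classical bit-string."""
    state = np.array([1.0])  # empty register
    for _ in range(n):
        state = np.kron(state, H @ np.array([1.0, 0.0]))
    return state

for n in (1, 4, 16):
    print(n, uniform_superposition(n).size)   # 1 -> 2, 4 -> 16, 16 -> 65536
```

At n = 50 the vector would already need about 10^15 amplitudes, beyond the memory of most classical machines; the quantum hardware holds that state natively.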

“There are many who in Round 1 of this started trash-talking D-Wave before they’d ever met the company,” Jurvetson says. “Just the mere notion that someone is going to be building and shipping a quantum computer–they said, ‘They are lying, and it’s smoke and mirrors.’”

Seven years and many demos and papers later, the company isn’t any less controversial. Any blog post or news story about D-Wave instantly grows a shaggy beard of vehement comments, both pro- and anti-.
The critics argue that D-Wave is insufficiently transparent, that it overhypes and underperforms, that its adiabatic approach is unpromising, that its machines are no faster than classical computers and that the qubits in those machines aren’t even exhibiting quantum behavior at all–they’re not qubits, they’re just plain old bits, and Google and the media have been sold a bill of goods.

“But I’m not convinced that those effects, right now, are playing any causal role in solving any problems faster than we could solve them with a classical computer. Nor do I think there’s any good argument that D-Wave’s current approach, scaled up, will lead to such a speedup in the future.

It might, but there’s currently no good reason to think so.”

Not only is it hard for laymen to understand the arguments in play, it’s hard to understand why there even is an argument. Either D-Wave has unlocked fathomless oceans of computing power or it hasn’t–right? But it’s not that simple.
D-Wave’s hardware isn’t powerful enough or well enough understood to show serious quantum speedup yet, and you can’t just open the hood and watch the qubits do whatever they’re doing. There isn’t even an agreed-upon method for benchmarking a quantum computer.

Christopher Monroe at the University of Maryland and the Joint Quantum Institute has created a 20-qubit system, which may be the world’s record. Unless, of course, you’re counting D-Wave.

D-Wave’s co-founder and chief technology officer is a 42-year-old Canadian named Geordie Rose with big bushy eyebrows, a solid build and a genial but slightly pugnacious air–he was a competitive wrestler in college.

In 1998 Rose was finishing up a Ph.D. in physics at the University of British Columbia, but he couldn’t see a future for himself in academia. After taking a class on entrepreneurship, Rose identified quantum computing as a promising business opportunity.

Not that he had any more of a clue than anybody else about how to build a quantum computer, but he did have a hell of a lot of self-confidence. “When you’re young you feel invincible, like you can do anything,” Rose says.

This difference between classical and quantum correlations is subtle, but it’s essential for the speedup provided by quantum computers.

If you want to learn more, see the tutorial exploring quantum entanglement with Q# and Azure Quantum.
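Independently of any particular toolkit, the core of that tutorial fits in a few lines of numpy: prepare a Bell state and sample measurements from it. This is a hedged illustration, not the Q# tutorial's code; the gate matrices are the standard Hadamard and CNOT.

```python
import numpy as np

rng = np.random.default_rng(1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
])

ket0 = np.array([1.0, 0.0])

# Start in |00>, put the first qubit in superposition, then entangle.
bell = CNOT @ np.kron(H @ ket0, ket0)   # (|00> + |11>) / sqrt(2)

# Sample measurements in the computational basis. Only outcomes 00 and 11
# ever occur, so the two qubits are perfectly correlated: a correlation no
# pair of independent classical coins can reproduce across all bases.
probs = np.abs(bell) ** 2
samples = rng.choice(4, size=1000, p=probs)
```

Here outcome index 0 encodes |00> and index 3 encodes |11>; indices 1 and 2 (|01> and |10>) never appear, which is the entanglement the text describes.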

Quantum computers vs quantum simulators

A quantum computer is a machine that combines classical and quantum computation. Current quantum computers follow a hybrid model: a classical computer controls a quantum processor.
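The shape of that hybrid model can be sketched in a few lines. In this toy (the function `qpu_expectation` is a made-up stand-in, simulated classically here), a classical outer loop repeatedly calls the "quantum" part and steers its parameter, the way a variational algorithm's controller drives a real quantum processor.

```python
import numpy as np

def qpu_expectation(theta):
    """Stand-in for a quantum processor call: prepare R_y(theta)|0> and
    return the expectation value of Z. On real hardware this step would
    run on the QPU; here it is simulated classically."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return state[0] ** 2 - state[1] ** 2   # <Z> = cos(theta)

# The classical controller: scan the parameter and keep the best angle.
thetas = np.linspace(0, np.pi, 201)
best = min(thetas, key=qpu_expectation)
```

The division of labor is the point: the quantum processor only evaluates one small subroutine, while everything else, including deciding what to ask it next, stays classical.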

The development of quantum computers is still in its infancy.

Quantum hardware is expensive and most systems are located in universities and research labs. Where classical computers use familiar silicon-based chips, quantum computers use quantum systems such as atoms, ions, photons, or electrons.

Examples include computations in chemistry and physics, such as discovering better antibiotics and medications; researchers are also investigating potential use cases in machine learning and AI, as well as ways that quantum computing could speed up optimisation problems in industries like finance or transportation.

For those specific applications, quantum computing could provide a dramatic speed-up on top of classical computing. The concept is similar to the use of Graphics Processing Units (GPUs), which can augment a Central Processing Unit (CPU) for certain use cases like machine learning.

On top of a GPU, therefore, scientists and engineers might one day have the choice of tapping into the capabilities of a Quantum Processing Unit (QPU).

But even if using a QPU, classical computers will still be central to the computation.

Hacking bitcoin wallets with quantum computers could happen – but cryptographers are racing to build a workaround

Stefan Thomas really could have used a quantum computer this year.

The German-born programmer and crypto trader forgot the password to unlock his digital wallet, which contains 7,002 bitcoin, now worth $265 million. Quantum computers, which promise to be dramatically faster than traditional computers on certain problems, could one day help crack such a code.

Though quantum computing is still very much in its infancy, governments and private-sector companies such as Microsoft and Google are working to make it a reality.

According to one theory, at that moment it’s operating in two slightly different universes at the same time, one in which it’s 1, one in which it’s 0; the physicist David Deutsch once described quantum computing as “the first technology that allows useful tasks to be performed in collaboration between parallel universes.” Not only is this excitingly weird, it’s also incredibly useful. If a single quantum bit (or as they’re inevitably called, qubits, pronounced cubits) can be in two states at the same time, it can perform two calculations at the same time.

Two quantum bits could perform four simultaneous calculations; three quantum bits could perform eight; and so on. The power grows exponentially.

The supercooled niobium chip at the heart of the D-Wave Two has 512 qubits and therefore could in theory perform 2^512 operations simultaneously.
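To get a sense of how large 2^512 is, Python's arbitrary-precision integers can print it exactly:

```python
# 2**512 is a 155-digit number -- far more states than there are atoms
# in the observable universe (roughly 10**80).
n_states = 2 ** 512
print(len(str(n_states)))   # 155
```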
