Quantum Information and Computation for Dummies
Quantum computers are devices capable of performing computations using quantum bits, or qubits.
The first thing we need to understand is what a qubit actually is. A “classical computer,” like the one you’re reading this on (whether a desktop, laptop, tablet, or phone), is also referred to as a “binary computer” because all of the functions it performs are based on either ones or zeros.
On a binary computer the processor uses transistors to perform calculations. Each transistor can be on or off, which indicates the one or zero used to compute the next step in a program or algorithm.
There’s more to it than that, but the important thing to know about binary computers is that the ones and zeros they use as the basis for computations are called “bits.”
Quantum computers don’t use bits; they use qubits. Qubits, aside from sounding way cooler, have capabilities that bits don’t. Instead of being only a one or a zero, a qubit can effectively be both at the same time. One popular implementation, the “spin qubit,” is often pictured as “spinning” while unobserved: instead of ones and zeros, its states are described as “up,” “down,” or a combination of both.
Superposition
Qubits can be more than one thing at a time because of a strange phenomenon called superposition. Quantum superposition can be explained by flipping a coin. We know that the coin will land in one of two states: heads or tails. This is how binary computers think. While the coin is still spinning in the air, assuming your eye isn’t quick enough to ‘observe’ its actual state, the coin is effectively in both states at the same time. Essentially, until the coin lands it has to be considered both heads and tails simultaneously. A clever scientist by the name of Schrödinger explained this phenomenon with a thought experiment about a cat that is both alive and dead at the same time.
Observation theory
Qubits work on the same principle. An area of related study often called “observation theory” dictates that when a quantum particle isn’t being watched it can act like a wave; observe it, and it behaves like a particle. Basically, the universe acts one way when we’re looking and another way when we aren’t. This means quantum computers, using their qubits, can simulate the subatomic particles of the universe in a way that’s actually natural: they speak the same language as an electron or proton, basically. Different companies are approaching qubits in different ways because, as of right now, working with them is incredibly difficult. Since observing them changes their state, and using them creates noise – the more qubits you have, the more errors you get – measuring them is challenging to say the least.
This challenge is exacerbated by the fact that most quantum processors have to be kept at temperatures near absolute zero (colder than space) and require an amount of power that is unsustainably high for the quality of computations. Right now, quantum computers aren’t worth the trouble and money they take to build and operate.
In the future, however, they’ll change our entire understanding of biology, chemistry, and physics. Simulations at the molecular level could be conducted that actually imitate physical concepts in the universe we’ve never been able to reproduce or study.
Quantum Supremacy
For quantum computers to become useful to society we’ll have to achieve certain milestones first. The point at which a quantum computer can process information and perform calculations that a binary computer can’t is called quantum supremacy. Quantum supremacy isn’t all fun and games, though; it presents another set of problems. When quantum computers are fully functional, even modest systems in the 100-qubit range may be able to slice through binary security like a hot knife through butter.
This is because those qubits, which can be two things at once, can explore multiple candidate solutions to a problem at the same time. They don’t have to follow binary logic like “if one thing happens do this, but if another thing happens do something else.” Individual qubits can, in effect, take both branches at once while in superposition, and then produce the optimum result when properly observed.
Currently there’s a lot of buzz about quantum computers, and rightfully so. Google is pretty sure its new Bristlecone processor will achieve quantum supremacy this year. And it’s hard to bet against Google or one of the other big tech companies. Especially when Intel has already put a quantum processor on a silicon chip and you can access IBM’s in the cloud right now.
No matter your feelings on quantum computers, qubits, or half-dead/half-alive cats, the odds are pretty good that quantum computers will follow the same path that IBM’s mainframes did. They’ll get smaller, faster, more powerful, and eventually we’ll all be using them, even if we don’t understand the science behind them.
Overview
Quantum algorithms are usually described, in the commonly used circuit model of quantum computation, by a quantum circuit which acts on some input qubits and terminates with a measurement. A quantum circuit consists of simple quantum gates, each of which acts on at most a fixed number of qubits. Quantum algorithms may also be stated in other models of quantum computation, such as the Hamiltonian oracle model.[5] Quantum algorithms can be categorized by the main techniques used by the algorithm. Some commonly used techniques/ideas in quantum algorithms include phase kick-back, phase estimation, the quantum Fourier transform, quantum walks, amplitude amplification and topological quantum field theory. Quantum algorithms may also be grouped by the type of problem solved; see, for instance, the survey on quantum algorithms for algebraic problems.[6]
Algorithms based on the quantum Fourier transform
The quantum Fourier transform is the quantum analogue of the discrete Fourier transform, and is used in several quantum algorithms. The Hadamard transform is also an example of a quantum Fourier transform over an n-dimensional vector space over the field F2. The quantum Fourier transform can be efficiently implemented on a quantum computer using only a polynomial number of quantum gates.
Deutsch–Jozsa algorithm
Main article: Deutsch–Jozsa algorithm
The Deutsch–Jozsa algorithm solves a black-box problem which probably requires exponentially many queries to the black box for any deterministic classical computer, but can be done with exactly one query by a quantum computer. If we allow both bounded-error quantum and classical algorithms, then there is no speedup since a classical probabilistic algorithm can solve the problem with a constant number of queries with small probability of error. The algorithm determines whether a function f is either constant (0 on all inputs or 1 on all inputs) or balanced (returns 1 for half of the input domain and 0 for the other half).
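To make the single-query claim concrete, here is a minimal state-vector simulation of the Deutsch–Jozsa algorithm (a sketch in Python, assuming NumPy is available; the function names are my own). The oracle is applied as a phase, and after the final Hadamard layer the amplitude of |0…0⟩ is ±1 for a constant function and exactly 0 for a balanced one:

```python
import numpy as np

def hadamard_n(n):
    """Build H^(tensor n) as a 2^n x 2^n matrix."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    out = np.array([[1.0]])
    for _ in range(n):
        out = np.kron(out, H)
    return out

def deutsch_jozsa(f, n):
    """Classify a promise function f on n-bit inputs as 'constant' or
    'balanced' using one (simulated) quantum query via a phase oracle."""
    N = 2 ** n
    Hn = hadamard_n(n)
    state = np.zeros(N)
    state[0] = 1.0                                       # |0...0>
    state = Hn @ state                                   # uniform superposition
    state = np.array([(-1) ** f(x) for x in range(N)]) * state  # one oracle query
    state = Hn @ state                                   # interfere
    # amplitude of |0...0> is +-1 iff f is constant, 0 iff f is balanced
    return 'constant' if abs(state[0]) > 0.5 else 'balanced'

print(deutsch_jozsa(lambda x: 0, 3))      # constant
print(deutsch_jozsa(lambda x: x & 1, 3))  # balanced
```

A deterministic classical algorithm would need to evaluate f on 2^(n-1) + 1 inputs in the worst case before it could be sure.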
Simon's algorithm
Main article: Simon's algorithm
Simon's algorithm solves a black-box problem exponentially faster than any classical algorithm, including bounded-error probabilistic algorithms. This algorithm, which achieves an exponential speedup over all classical algorithms that we consider efficient, was the motivation for Shor's factoring algorithm.
Quantum phase estimation algorithm
Main article: Quantum phase estimation algorithm
The quantum phase estimation algorithm is used to determine the eigenphase of an eigenvector of a unitary gate given a quantum state proportional to the eigenvector and access to the gate. The algorithm is frequently used as a subroutine in other algorithms.
Shor's algorithm
Main article: Shor's algorithm
Shor's algorithm solves the discrete logarithm problem and the integer factorization problem in polynomial time,[7] whereas the best known classical algorithms take super-polynomial time. These problems are not known to be in P, nor are they known to be NP-complete. Shor's is also one of the few quantum algorithms that solves a non–black-box problem in polynomial time where the best known classical algorithms run in super-polynomial time.
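The number-theoretic skeleton of Shor's algorithm can be sketched classically (Python; names are my own). The quantum computer's only job is the order-finding step, which is done here by brute force — that step is precisely the part that takes super-polynomial time classically and polynomial time quantumly:

```python
import math
import random

def order(a, N):
    """Multiplicative order of a mod N: the step Shor's algorithm performs
    with the quantum Fourier transform; brute-forced here for illustration."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, seed=0):
    """Classical sketch of Shor's reduction from factoring to order finding."""
    rng = random.Random(seed)
    while True:
        a = rng.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g                      # lucky guess: a shares a factor with N
        r = order(a, N)
        if r % 2 == 1:
            continue                      # need an even order; try another a
        y = pow(a, r // 2, N)
        if y == N - 1:
            continue                      # trivial square root of 1; retry
        return math.gcd(y - 1, N)         # nontrivial factor of N

f = shor_factor(15)
print(f, 15 // f)   # the two prime factors of 15
```

The point of the reduction: a nontrivial square root y of 1 mod N (y² ≡ 1, y ≢ ±1) gives a factor via gcd(y − 1, N), and an even-order a supplies such a y.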
Hidden subgroup problem
The abelian hidden subgroup problem is a generalization of many problems that can be solved by a quantum computer, such as Simon's problem, solving Pell's equation, testing the principal ideal of a ring R, and factoring. Efficient quantum algorithms are known for the abelian hidden subgroup problem.[8] The more general hidden subgroup problem, where the group isn't necessarily abelian, is a generalization of the previously mentioned problems as well as graph isomorphism and certain lattice problems. Efficient quantum algorithms are known for certain non-abelian groups. However, no efficient algorithms are known for the symmetric group, which would give an efficient algorithm for graph isomorphism,[9] or for the dihedral group, which would solve certain lattice problems.[10]
Boson sampling problem
Main article: Boson sampling
The boson sampling problem in an experimental configuration assumes[11] an input of a moderate number of bosons (e.g. photons of light) that are randomly scattered into a large number of output modes constrained by a defined unitarity. The problem is then to produce a fair sample of the probability distribution of the output, which depends on the input arrangement of bosons and the unitarity.[12] Solving this problem with a classical computer algorithm requires computing the permanent of the unitary transform matrix, which may be either impossible or take a prohibitively long time. In 2014, it was proposed[13] that existing technology and standard probabilistic methods of generating single-photon states could be used as input into a suitable quantum-computable linear optical network, and that sampling of the output probability distribution would be demonstrably superior using quantum algorithms. In 2015, an investigation predicted[14] that the sampling problem had similar complexity for inputs other than Fock-state photons, and identified a transition in computational complexity from classically simulatable to just as hard as the boson sampling problem, dependent on the size of coherent amplitude inputs.
Estimating Gauss sums
A Gauss sum is a type of exponential sum. The best known classical algorithm for estimating these sums takes exponential time. Since the discrete logarithm problem reduces to Gauss sum estimation, an efficient classical algorithm for estimating Gauss sums would imply an efficient classical algorithm for computing discrete logarithms, which is considered unlikely. However, quantum computers can estimate Gauss sums to polynomial precision in polynomial time.[15]
Fourier fishing and Fourier checking
We have an oracle consisting of n random Boolean functions mapping n-bit strings to a Boolean value. We are required to find n n-bit strings z1, ..., zn such that, for the Hadamard–Fourier transform f̃, at least 3/4 of the strings satisfy

|f̃(zi)| ≥ 1

and at least 1/4 satisfy

|f̃(zi)| ≥ 2.
This can be done in Bounded-error Quantum Polynomial time (BQP).[16]
Algorithms based on amplitude amplification
Amplitude amplification is a technique that allows the amplification of a chosen subspace of a quantum state. Applications of amplitude amplification usually lead to quadratic speedups over the corresponding classical algorithms. It can be considered to be a generalization of Grover's algorithm.
Grover's algorithm
Main article: Grover's algorithm
Grover's algorithm searches an unstructured database (or an unordered list) with N entries for a marked entry, using only O(√N) queries, whereas classically O(N) queries are required even if we allow bounded-error probabilistic algorithms.[17]
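The O(√N) behavior is easy to see in a small state-vector simulation (a sketch in Python with NumPy; names are illustrative). Each Grover iteration is an oracle sign flip on the marked amplitude followed by "inversion about the mean", and roughly (π/4)√N iterations concentrate almost all probability on the marked entry:

```python
import numpy as np

def grover(marked, n, iterations=None):
    """Simulate Grover search for one marked item among N = 2^n entries."""
    N = 2 ** n
    if iterations is None:
        iterations = int(round(np.pi / 4 * np.sqrt(N)))  # ~ (pi/4) sqrt(N)
    state = np.full(N, 1 / np.sqrt(N))       # uniform superposition
    for _ in range(iterations):
        state[marked] *= -1                  # oracle: flip the marked amplitude
        state = 2 * state.mean() - state     # diffusion: inversion about the mean
    return int(np.argmax(state ** 2))        # most probable measurement outcome

print(grover(marked=42, n=8))   # 42, using ~13 oracle queries instead of up to 256
```

The diffusion step `2 * state.mean() - state` is exactly the operator 2|s⟩⟨s| − I, where |s⟩ is the uniform superposition.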
Bohmian mechanics is a non-local hidden-variable interpretation of quantum mechanics. It has been shown that a non-local hidden-variable quantum computer could implement a search of an N-item database in at most O(∛N) steps. This is slightly faster than the O(√N) steps taken by Grover's algorithm. Neither search method will allow quantum computers to solve NP-complete problems in polynomial time.[18]
Quantum counting
Quantum counting solves a generalization of the search problem. It solves the problem of counting the number of marked entries in an unordered list, instead of just detecting whether one exists. Specifically, it counts the number of marked entries in an N-element list with error ε, making only Θ((1/ε)·√(N/k)) queries, where k is the number of marked elements in the list.[19][20] More precisely, the algorithm outputs an estimate k′ for k, the number of marked entries, with accuracy |k − k′| ≤ εk.
Solving linear systems of equations
Main article: Quantum algorithm for linear systems of equations
In 2009 Aram Harrow, Avinatan Hassidim, and Seth Lloyd formulated a quantum algorithm for solving linear systems. The algorithm estimates the result of a scalar measurement on the solution vector of a given linear system of equations.[21]
Provided the linear system is sparse and has a low condition number κ, and the user is interested in the result of a scalar measurement on the solution vector rather than the values of the solution vector itself, the algorithm has a runtime of O(log(N)κ²), where N is the number of variables in the linear system. This offers an exponential speedup over the fastest classical algorithm, which runs in O(Nκ) (or O(N√κ) for positive semidefinite matrices).
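To ground the symbols, here is a short classical illustration (Python with NumPy; the system is a toy example of my own). κ is the ratio of the largest to smallest singular value, and the quantity the quantum algorithm estimates is a scalar readout r·x rather than the full solution vector x:

```python
import numpy as np

N = 8
A = np.diag(np.linspace(1.0, 4.0, N))   # sparse (diagonal), Hermitian system
b = np.ones(N) / np.sqrt(N)
x = np.linalg.solve(A, b)               # classical solve of A x = b
kappa = np.linalg.cond(A)               # condition number: 4.0 / 1.0 = 4
r = np.ones(N) / np.sqrt(N)             # a "measurement" direction
print(kappa, r @ x)                     # the quantum algorithm estimates r.x
                                        # in time ~ log(N) * kappa^2
```

The exponential gap is in the N-dependence: log(N) qubits suffice to hold the 2^(log N)-dimensional state, while any classical method that writes down x pays at least O(N).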
Algorithms based on quantum walks
Main article: Quantum walk
A quantum walk is the quantum analogue of a classical random walk, which can be described by a probability distribution over some states. A quantum walk can be described by a quantum superposition over states. Quantum walks are known to give exponential speedups for some black-box problems.[22][23] They also provide polynomial speedups for many problems. A framework for the creation of quantum walk algorithms exists and is quite a versatile tool.[24]
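A coined discrete-time quantum walk on a line is simple to simulate and shows the qualitative difference from a classical random walk: the quantum walk spreads ballistically (standard deviation proportional to the number of steps) rather than diffusively (proportional to its square root). A minimal NumPy sketch, assuming a Hadamard coin and a symmetric initial state:

```python
import numpy as np

def quantum_walk(steps):
    """Hadamard-coined discrete-time quantum walk on the line.
    Returns the probability distribution over positions -steps..steps."""
    pos = 2 * steps + 1
    amp = np.zeros((2, pos), dtype=complex)            # amp[coin, position]
    amp[:, steps] = [1 / np.sqrt(2), 1j / np.sqrt(2)]  # symmetric start at 0
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(steps):
        amp = H @ amp                                  # coin flip at every site
        shifted = np.zeros_like(amp)
        shifted[0, :-1] = amp[0, 1:]                   # coin 0 steps left
        shifted[1, 1:] = amp[1, :-1]                   # coin 1 steps right
        amp = shifted
    return (np.abs(amp) ** 2).sum(axis=0)              # measure the position

def spread(steps):
    """Standard deviation of the walker's position after `steps` steps."""
    p = quantum_walk(steps)
    xs = np.arange(-steps, steps + 1)
    return float(np.sqrt((p * xs**2).sum() - (p * xs).sum() ** 2))

print(spread(50))   # ~27, versus sqrt(50) ~ 7 for the classical random walk
```

The linear-in-steps spread is what some quantum-walk algorithms exploit to hit faster than classical sampling would allow.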
Element distinctness problem
Main article: Element distinctness problem
The element distinctness problem is the problem of determining whether all the elements of a list are distinct. Classically, Ω(N) queries are required for a list of size N, since this problem is at least as hard as the search problem, which requires Ω(N) queries. However, it can be solved in Θ(N^(2/3)) queries on a quantum computer. The optimal algorithm is by Andris Ambainis.[25] Yaoyun Shi first proved a tight lower bound when the size of the range is sufficiently large.[26] Ambainis[27] and Kutin[28] independently (and via different proofs) extended his work to obtain the lower bound for all functions.
Triangle-finding problem
Main article: Triangle finding problem
The triangle-finding problem is the problem of determining whether a given graph contains a triangle (a clique of size 3). The best-known lower bound for quantum algorithms is Ω(N), but the best algorithm known requires O(N^1.297) queries,[29] an improvement over the previous best O(N^1.3) queries.[24][30]
Formula evaluation
A formula is a tree with a gate at each internal node and an input bit at each leaf node. The problem is to evaluate the formula, which is the output of the root node, given oracle access to the input.
A well-studied formula is the balanced binary tree with only NAND gates.[31] This type of formula requires Θ(N^c) queries using randomness,[32] where c = log₂(1 + √33)/4 ≈ 0.754. With a quantum algorithm, however, it can be solved in Θ(N^0.5) queries. No better quantum algorithm for this case was known until one was found for the unconventional Hamiltonian oracle model.[5] The same result for the standard setting soon followed.[33]
Fast quantum algorithms for more complicated formulas are also known.[34]
Group commutativity
The problem is to determine whether a black-box group, given by k generators, is commutative. A black-box group is a group with an oracle function, which must be used to perform the group operations (multiplication, inversion, and comparison with identity). We are interested in the query complexity, which is the number of oracle calls needed to solve the problem. The deterministic and randomized query complexities are Θ(k²) and Θ(k) respectively.[35] Any quantum algorithm requires Ω(k^(2/3)) queries, and the best known quantum algorithm uses O(k^(2/3) log k) queries.[36]
BQP-complete problems
Computing knot invariants
Witten showed that the Chern–Simons topological quantum field theory (TQFT) can be solved in terms of Jones polynomials. A quantum computer can simulate a TQFT, and thereby approximate the Jones polynomial,[37] which, as far as we know, is hard to compute classically in the worst case.
Quantum simulation
The idea that quantum computers might be more powerful than classical computers originated in Richard Feynman's observation that classical computers seem to require exponential time to simulate many-particle quantum systems.[38] Since then, the idea that quantum computers can simulate quantum physical processes exponentially faster than classical computers has been greatly fleshed out and elaborated. Efficient (that is, polynomial-time) quantum algorithms have been developed for simulating both bosonic and fermionic systems,[39] and in particular, the simulation of chemical reactions beyond the capabilities of current classical supercomputers is expected to require only a few hundred qubits.[40] Quantum computers can also efficiently simulate topological quantum field theories.[41] In addition to its intrinsic interest, this result has led to efficient quantum algorithms for estimating quantum topological invariants such as Jones[42] and HOMFLY polynomials,[43] and the Turaev–Viro invariant of three-dimensional manifolds.[44]
Hybrid quantum/classical algorithms
Hybrid quantum/classical algorithms combine quantum state preparation and measurement with classical optimization.[45] These algorithms generally aim to determine the ground-state eigenvector and eigenvalue of a Hermitian operator.
QAOA
The quantum approximate optimization algorithm is a toy model of quantum annealing which can be used to solve problems in graph theory.[46] The algorithm makes use of classical optimization of quantum operations to maximize an objective function.
Variational Quantum Eigensolver
The VQE algorithm applies classical optimization to minimize the energy expectation of an ansatz state to find the ground state energy of a molecule.[47] This can also be extended to find excited energies of molecules.[48]
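The structure of VQE is easy to see in a one-qubit toy problem (a sketch in Python with NumPy; the Hamiltonian and ansatz below are stand-ins of my own, and a simple grid search plays the role of the classical optimizer):

```python
import numpy as np

# Toy VQE: one-qubit Hamiltonian H = Z + 0.5 X as a stand-in for a molecular
# Hamiltonian, with the hardware-efficient ansatz |psi(theta)> = Ry(theta)|0>.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X

def energy(theta):
    """Energy expectation <psi(theta)|H|psi(theta)> the quantum device
    would estimate by repeated preparation and measurement."""
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])  # Ry(theta)|0>
    return psi @ H @ psi

# Classical optimizer: a coarse grid search over the single parameter.
thetas = np.linspace(0, 2 * np.pi, 2001)
best = min(energy(t) for t in thetas)
exact = np.linalg.eigvalsh(H)[0]        # exact ground-state energy
print(best, exact)                      # both ~ -1.118
```

In a real VQE run, `energy` is evaluated on quantum hardware from measurement statistics, and a gradient-based or gradient-free optimizer replaces the grid search; the ansatz must be expressive enough to reach the ground state.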
BIG THINGS HAPPEN when computers get smaller. Or faster. And quantum computing is about chasing perhaps the biggest performance boost in the history of technology. The basic idea is to smash some barriers that limit the speed of existing computers by harnessing the counterintuitive physics of subatomic scales.
If the tech industry pulls off that, ahem, quantum leap, you won’t be getting a quantum computer for your pocket. Don’t start saving for an iPhone Q. We could, however, see significant improvements in many areas of science and technology, such as longer-lasting batteries for electric cars or advances in chemistry that reshape industries or enable new medical treatments. Quantum computers won’t be able to do everything faster than conventional computers, but on some tricky problems they have advantages that would enable astounding progress.
It’s not productive (or polite) to ask people working on quantum computing when exactly those dreamy applications will become real. The only thing for sure is that they are still many years away. Prototype quantum computing hardware is still embryonic. But powerful—and, for tech companies, profit-increasing—computers powered by quantum physics have recently started to feel less hypothetical.
Like the befuddling math underpinning quantum computing, some of the expectations building around this still-impractical technology can make you lightheaded. If you squint out the window of a flight into SFO right now, you can see a haze of quantum hype drifting over Silicon Valley. But the enormous potential of quantum computing is undeniable, and the hardware needed to harness it is advancing fast. If there were ever a perfect time to bend your brain around quantum computing, it’s now. Say “Schrodinger’s superposition” three times fast, and we can dive in.
The History of Quantum Computing Explained
The prehistory of quantum computing begins early in the 20th century, when physicists began to sense they had lost their grip on reality.
First, accepted explanations of the subatomic world turned out to be incomplete. Electrons and other particles didn’t just neatly carom around like Newtonian billiard balls, for example. Sometimes they acted like waves instead. Quantum mechanics emerged to explain such quirks, but introduced troubling questions of its own. To take just one brow-wrinkling example, this new math implied that physical properties of the subatomic world, like the position of an electron, didn’t really exist until they were observed.
If you find that baffling, you’re in good company. A year before winning a Nobel for his contributions to quantum theory, Caltech’s Richard Feynman remarked that “nobody understands quantum mechanics.” The way we experience the world just isn’t compatible with it. But some people grasped it well enough to redefine our understanding of the universe. And in the 1980s a few of them—including Feynman—began to wonder if quantum phenomena like subatomic particles’ “don’t look and I don’t exist” trick could be used to process information. The basic theory or blueprint for quantum computers that took shape in the 80s and 90s still guides Google and others working on the technology.
Before we belly flop into the murky shallows of quantum computing 101, we should refresh our understanding of regular old computers. As you know, smartwatches, iPhones, and the world’s fastest supercomputer all basically do the same thing: they perform calculations by encoding information as digital bits, aka 0s and 1s. A computer might flip the voltage in a circuit on and off to represent 1s and 0s, for example.
Quantum computers do calculations using bits, too. After all, we want them to plug into our existing data and computers. But quantum bits, or qubits, have unique and powerful properties that allow a group of them to do much more than an equivalent number of conventional bits.
Qubits can be built in various ways, but they all represent digital 0s and 1s using the quantum properties of something that can be controlled electronically. Popular examples—at least among a very select slice of humanity—include superconducting circuits, or individual atoms levitated inside electromagnetic fields. The magic power of quantum computing is that this arrangement lets qubits do more than just flip between 0 and 1. Treat them right and they can flip into a mysterious extra mode called a superposition.
QUANTUM LEAPS
1980
Physicist Paul Benioff suggests quantum mechanics could be used for computation.
1981
Nobel-winning physicist Richard Feynman, at Caltech, coins the term quantum computer.
1985
Physicist David Deutsch, at Oxford, maps out how a quantum computer would operate, a blueprint that underpins the nascent industry of today.
1994
Mathematician Peter Shor, at Bell Labs, writes an algorithm that could tap a quantum computer’s power to break widely used forms of encryption.
2007
D-Wave, a Canadian startup, announces a quantum computing chip it says can solve Sudoku puzzles, triggering years of debate over whether the company’s technology really works.
2013
Google teams up with NASA to fund a lab to try out D-Wave’s hardware.
2014
Google hires the professor behind some of the best quantum computer hardware yet to lead its new quantum hardware lab.
2016
IBM puts some of its prototype quantum processors on the internet for anyone to experiment with, saying programmers need to get ready to write quantum code.
2017
Startup Rigetti opens its own quantum computer fabrication facility to build prototype hardware and compete with Google and IBM.
You may have heard that a qubit in superposition is both 0 and 1 at the same time. That’s not quite true and also not quite false—there’s just no equivalent in Homo sapiens’ humdrum classical reality. If you have a yearning to truly grok it, you must make a mathematical odyssey WIRED cannot equip you for. But in the simplified and dare we say perfect world of this explainer, the important thing to know is that the math of a superposition describes the probability of discovering either a 0 or 1 when a qubit is read out—an operation that crashes it out of a quantum superposition into classical reality. A quantum computer can use a collection of qubits in superpositions to play with different possible paths through a calculation. If done correctly, the pointers to incorrect paths cancel out, leaving the correct answer when the qubits are read out as 0s and 1s.
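The readout rule described above has a simple mathematical form: a qubit is a pair of complex amplitudes, and measuring it yields 0 or 1 with probabilities given by the squared magnitudes of those amplitudes. A minimal sketch (Python with NumPy; the numbers are chosen purely for illustration):

```python
import numpy as np

# A qubit state is two complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Reading it out yields 0 with probability |a|^2 and 1 with probability |b|^2.
a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)        # an equal superposition
p0 = abs(a) ** 2                              # probability of reading 0

rng = np.random.default_rng(0)
samples = rng.random(100_000) < p0            # 100k simulated readouts
print(p0, samples.mean())                     # 0.5, and empirically ~0.5
```

Note the amplitudes can be complex and can be negative: that sign structure is what lets the paths through a computation cancel, which mere probabilities cannot do.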
For some problems that are very time consuming for conventional computers, this allows a quantum computer to find a solution in far fewer steps than a conventional computer would need. Grover’s algorithm, a famous quantum search algorithm, could find you in a phone book with 100 million names with just 10,000 operations. If a classical search algorithm just spooled through all the listings to find you, it would require 50 million operations, on average. For Grover’s and some other quantum algorithms, the bigger the initial problem—or phonebook—the further behind a conventional computer is left in the digital dust.
The reason we don’t have useful quantum computers today is that qubits are extremely finicky. The quantum effects they must control are very delicate, and stray heat or noise can flip 0s and 1s, or wipe out a crucial superposition. Qubits have to be carefully shielded, and operated at very cold temperatures, sometimes only fractions of a degree above absolute zero. Most plans for quantum computing depend on using a sizable chunk of a quantum processor’s power to correct its own errors, caused by misfiring qubits.
Recent excitement about quantum computing stems from progress in making qubits less flaky. That’s giving researchers the confidence to start bundling the devices into larger groups. Startup Rigetti Computing recently announced it has built a processor with 128 qubits made with aluminum circuits that are super-cooled to make them superconducting. Google and IBM have announced their own chips with 72 and 50 qubits, respectively. That’s still far fewer than would be needed to do useful work with a quantum computer—it would probably require at least thousands—but as recently as 2016 those companies’ best chips had qubits only in the single digits. After tantalizing computer scientists for 30 years, practical quantum computing may not exactly be close, but it has begun to feel a lot closer.
What the Future Holds for Quantum Computing
Some large companies and governments have started treating quantum computing research like a race—perhaps fittingly it’s one where both the distance to the finish line and the prize for getting there are unknown.
Google, IBM, Intel, and Microsoft have all expanded their teams working on the technology, with a growing swarm of startups such as Rigetti in hot pursuit. China and the European Union have each launched new programs measured in the billions of dollars to stimulate quantum R&D. And in the US, the Trump White House has created a new committee to coordinate government work on quantum information science. Several bills were introduced to Congress in 2018 proposing new funding for quantum research, totaling upwards of $1.3 billion. It’s not quite clear what the first killer apps of quantum computing will be, or when they will appear. But there’s a sense that whoever is first to make these machines useful will gain big economic and national security advantages.
JARGON FOR THE QUANTUM QURIOUS
What's a qubit?
A device that uses quantum mechanical effects to represent 0s and 1s of digital data, similar to the bits in a conventional computer.
What's a superposition?
It's the trick that makes quantum computers tick, and makes qubits more powerful than ordinary bits. A superposition is in an intuition-defying mathematical combination of both 0 and 1. Quantum algorithms can use a group of qubits in a superposition to shortcut through calculations.
What's quantum entanglement?
A quantum effect so unintuitive that Einstein dubbed it “spooky action at a distance.” When two qubits in a superposition are entangled, certain operations on one have instant effects on the other, a process that helps quantum algorithms be more powerful than conventional ones.
What's quantum speedup?
The holy grail of quantum computing—a measure of how much faster a quantum computer could crack a problem than a conventional computer could. Quantum computers aren’t well-suited to all kinds of problems, but for some they offer an exponential speedup, meaning their advantage over a conventional computer grows explosively with the size of the input problem.
Back in the world of right now, though, quantum processors are too simple to do practical work. Google is working to stage a demonstration known as quantum supremacy, in which a quantum processor would solve a carefully designed math problem beyond the reach of existing supercomputers. But that would be a historic scientific milestone, not proof that quantum computing is ready to do real work.
As quantum computer prototypes get larger, the first practical use for them will probably be for chemistry simulations. Computer models of molecules and atoms are vital to the hunt for new drugs or materials. Yet conventional computers can’t accurately simulate the behavior of atoms and electrons during chemical reactions. Why? Because that behavior is driven by quantum mechanics, the full complexity of which is too great for conventional machines. Daimler and Volkswagen have both started investigating quantum computing as a way to improve battery chemistry for electric vehicles. Microsoft says other uses could include designing new catalysts to make industrial processes less energy intensive, or even to pull carbon dioxide out of the atmosphere to mitigate climate change.
Quantum computers would also be a natural fit for code-breaking. We’ve known since the 90s that they could zip through the math underpinning the encryption that secures online banking, flirting, and shopping. Quantum processors would need to be much more advanced to do this, but governments and companies are taking the threat seriously. The National Institute of Standards and Technology is in the process of evaluating new encryption systems that could be rolled out to quantum-proof the internet.
Tech companies such as Google are also betting that quantum computers can make artificial intelligence more powerful. That’s further in the future and less well mapped out than chemistry or code-breaking applications, but researchers argue they can figure out the details down the line as they play around with larger and larger quantum processors. One hope is that quantum computers could help machine-learning algorithms pick up complex tasks using many fewer than the millions of examples typically used to train AI systems today.
Despite all the superposition-like uncertainty about when the quantum computing era will really begin, big tech companies argue that programmers need to get ready now. Google, IBM, and Microsoft have all released open source tools to help coders familiarize themselves with writing programs for quantum hardware. IBM has even begun to offer online access to some of its quantum processors, so anyone can experiment with them. Long term, the big computing companies see themselves making money by charging corporations to access data centers packed with supercooled quantum processors.
What’s in it for the rest of us? Despite some definite drawbacks, the age of conventional computers has helped make life safer, richer, and more convenient—many of us are never more than five seconds away from a kitten video. The era of quantum computers should have similarly broad-reaching, beneficial, and impossible-to-predict consequences. Bring on the qubits.
More Information:
http://quantumalgorithmzoo.org
https://thierry-breton.com/en/the-quantum-challenge/
https://www.digitaltrends.com/computing/microsoft-provides-free-quantum-computing-lessons/
https://www.wired.com/story/wired-guide-to-quantum-computing/
https://www.forbes.com/sites/chadorzel/2017/04/17/what-sorts-of-problems-are-quantum-computers-good-for/
https://cloudblogs.microsoft.com/quantum/2018/06/06/the-microsoft-approach-to-quantum-computing/
https://blogs.microsoft.com/ai/microsoft-doubles-quantum-computing-bet/
https://thenextweb.com/microsoft/2018/07/24/microsoft-debuts-free-quantum-computer-programming-katas/
https://thenextweb.com/artificial-intelligence/2018/03/15/understanding-quantum-computers-the-basics/