Dataset schema (column: type, observed range):
text: string, lengths 4.06k to 10.7k
id: string, lengths 47 to 47
dump: string, 20 classes
url: string, lengths 26 to 321
file_path: string, lengths 125 to 142
language: string, 1 class
language_score: float64, 0.71 to 0.98
token_count: int64, 1.02k to 2.05k
score: float64, 3.5 to 4.53
int_score: int64, 4 to 5
Researchers at EPFL in Switzerland have used superconducting technology to make two types of optomechanical lattice, including a honeycomb-shaped lattice that can mimic some of the physics of graphene. [Image: Andrea Bancora / EPFL]

Scientists in Switzerland have shown how a superconducting circuit can be used to investigate the physics of topological lattices such as strained graphene. They say their work—an implementation of what is known as cavity optomechanics—could potentially be used to create highly entangled mechanical states, a valuable resource for quantum computing and communication based on mechanical oscillators (Nature, doi: 10.1038/s41586-022-05367-9).

Optically probing mechanical systems

Optomechanical systems use electromagnetic fields to control the vibrations of mechanical objects, taking advantage of the fact that light carries momentum and can therefore exert pressure. At visible wavelengths, such systems involve a laser propagating in an optical cavity whose end mirror is free to vibrate. Microwave devices instead couple longer-wavelength radiation to an LC circuit featuring a vibrating capacitor.

In recent years, physicists have also started to use optomechanics to probe the quantum behavior of macroscopic mechanical systems. This has involved manipulating such systems in a number of ways, such as cooling them to their quantum ground states or entangling mechanical oscillators located some distance from one another. Such systems, however, tend to comprise only one or two optomechanical modes. Researchers would like to develop 2D lattices of optomechanical oscillators, as these could shed light on more complex phenomena such as the topology of light and sound or the quantum many-body dynamics of macroscopic systems.

Precise control via improved fabrication

Building optomechanical lattices where each building block consists of mechanical and optical modes requires very precise control of the properties of individual lattice sites. Tobias Kippenberg, Amir Youssefi and colleagues at the Swiss Federal Institute of Technology Lausanne (EPFL) have now shown how this is possible by constructing an optomechanical system from a superconducting circuit.

Key to the work is a parallel-plate vacuum capacitor, which features a suspended top plate that can vibrate. The conventional method for fabricating such capacitors makes it hard to control the size of the gap between the device’s two plates, and with that the resonant mechanical and microwave frequencies as well as the coupling strength between them. Kippenberg and colleagues got round this problem by devising a new fabrication process. First, they etched a trench in a silicon substrate; then, they placed a thin slice of aluminum on the bottom of the trench to serve as the capacitor’s lower plate, before covering that with a layer of silicon dioxide. After that, they leveled off the surface of the silicon dioxide using chemical mechanical polishing and rested a second aluminum plate on top of the leveled surface. By finally removing the silicon dioxide layer, they were able to suspend the upper aluminum plate precisely above the lower one.

The fact that the superconducting circuit had to be cooled to cryogenic temperatures induced a tensile stress in the upper plate. This kept the plate flat and the gap size dependent on the depth of the trench, thereby restricting fluctuations in the resonant frequencies of the microwaves and mechanical oscillations to 0.5% and 1%, respectively.
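As a rough back-of-the-envelope illustration of why controlling the plate gap controls the circuit's microwave resonance (a simple parallel-plate estimate, not a calculation from the paper; the inductance, plate area and gap below are invented values): with C proportional to 1/d and f = 1/(2π√(LC)), the frequency scales as √d, so a given fractional spread in the gap translates into roughly half that spread in frequency.

```python
import math

# Back-of-envelope sketch (simple parallel-plate model; L, area and gap are
# assumed illustrative values, not the EPFL device's parameters).
L = 2e-9            # inductance in henries (assumed)
area = 1e-8         # plate area in m^2 (assumed)
eps0 = 8.854e-12    # vacuum permittivity, F/m

def microwave_freq(gap):
    """LC resonance frequency for a vacuum parallel-plate capacitor of the given gap."""
    C = eps0 * area / gap
    return 1.0 / (2 * math.pi * math.sqrt(L * C))

gap = 250e-9                          # nominal plate gap in metres (assumed)
f0 = microwave_freq(gap)
f1 = microwave_freq(gap * 1.01)       # gap 1% larger
print(f"f0 = {f0/1e9:.3f} GHz, 1% gap change -> {abs(f1 - f0)/f0:.2%} frequency shift")
# prints roughly a 0.5% shift, i.e. df/f is about half of dd/d
```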
The researchers fabricated multiple instances of these capacitors, each one linked to a spiral-shaped inductor to produce a distinct LC resonator. Each resonator was in turn magnetically coupled to its neighbors, with the coupling magnitude determined by the physical distance between resonators.

Edge states and a honeycomb lattice

The EPFL team implemented two types of circuit. In one, the researchers lined up ten resonators in a chain to mimic what are known as topologically protected edge states. In the other they arranged 24 resonators in the shape of a honeycomb lattice, with the couplings along one of the three lattice axes set so that alternate couplings had high and low values. This allowed them to reproduce the physics of one-carbon-atom-thick sheets of graphene under strain—as can occur, for example, if the material is adsorbed onto substrates like silicon dioxide.

By studying the optomechanical interactions between the resonators, the researchers were able to directly measure the resonators’ collective behavior and work out the full Hamiltonian, a function expressing the system’s combined kinetic and potential energy. They say that previously it had only been possible to measure the behavior of such superconducting circuits indirectly, using (for example) near-field scanning probes or laser scanning microscopy.

The researchers reckon that such optomechanical lattices could in future shed light on “the rich physics in multimode optomechanics.” Using degenerate mechanical oscillators, they explain, should make it possible to “create collective long-range interactions and observe strong cooperative effects on mechanical motion.” Also, the system might enable highly entangled mechanical states to be created, according to the researchers—something that could benefit future quantum information technology based on mechanical oscillators.
<urn:uuid:0dcbbe4a-becf-4af4-8531-29888b3d74d7>
CC-MAIN-2023-14
https://www.optica-opn.org/home/newsroom/2022/december/2d_optomechanical_lattice_mimics_graphene_physics/?feed=News
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296949097.61/warc/CC-MAIN-20230330035241-20230330065241-00781.warc.gz
en
0.930118
1,072
3.921875
4
Australian scientists put the quantum world on a microchip

This article is an installment of Future Explored, a weekly guide to world-changing technology.

An Australian startup just modeled a molecule on a microchip, placing atoms in silicon with sub-nanometer precision. This ability to simulate molecules on the atomic scale — where matter is ruled by quantum mechanics — could improve our understanding of the quantum world and lead to the creation of incredible new materials, such as high-temperature superconductors or super-efficient solar cells. “We could start to mimic how nature behaves and then we can start to make new kinds of materials and devices that the world has never seen before,” said Michelle Simmons, founder of Silicon Quantum Computing, the startup responsible for the microchip.

A couple of million years after making our first stone tools, humans discovered that when we zoom in on matter, looking at the atoms and subatomic particles that comprise it, they adhere to a different set of rules than the ones that govern objects on a larger scale. These rules (“quantum mechanics”) can have their own useful applications — MRI scanners, solar cells, and atomic clocks all take advantage of quantum phenomena.

But while it’s easy to heft a rock and extrapolate that it might be good for bashing things, it’s not so easy to see or understand how matter behaves on the quantum scale — especially since observation itself affects quantum systems. We can use computer programs to simulate how some small molecules behave on the atomic or subatomic level, but that isn’t a viable option for larger molecules: there are too many possible interactions between their particles. “If we can start to understand materials at [the quantum] level, we can design things that have never been made before,” Simmons told ScienceAlert. “The question is: how do you actually control nature at that level?”

The quantum simulator

The answer, it seems, is by modeling molecules on silicon chips. For a recent study, the SQC team successfully manufactured a microchip at the atomic scale, creating 10 uniformly sized artificial atoms — also known as “quantum dots” — and then using a scanning tunneling microscope to precisely position the dots in silicon. The team modeled their chip after the structure of polyacetylene, a molecule made from carbon and hydrogen atoms connected by alternating single and double carbon bonds. Once it was built, they could apply an electric charge to one part of the chip (the “source”) and study how it moved along the chain of atoms to exit at another part (the “drain”). “We’re literally building it from the bottom up, where we are mimicking the polyacetylene molecule by putting atoms in silicon with the exact distances that represent the single and double carbon-carbon bonds,” said Simmons.

Based on theoretical predictions, polyacetylene is supposed to behave differently depending on whether the carbon chain begins and ends with double carbon bonds or single carbon bonds. To check if their modeling technique was accurate, the researchers created one chip based on each version — and saw that the number of electrical peaks did change as the current ran through each version.
“This confirms long-standing theoretical predictions and demonstrates our ability to precisely simulate the polyacetylene molecule,” according to SQC. The team also observed an electron existing in two places simultaneously, an example of the quantum phenomenon of superposition. “What [this model is] showing is that you can literally mimic what actually happens in the real molecule, and that’s why it’s exciting because the signatures of the two chains are very different,” said Simmons.

The team chose a 10-dot chain of the polyacetylene molecule to demonstrate its tech because that’s something we can simulate with classical computers. Now they’re looking to scale up. “We’re near the limit of what classical computers can do, so it’s like stepping off the edge into the unknown,” said Simmons. “And this is the thing that’s exciting — we can now make bigger devices that are beyond what a classical computer can model.” These future quantum models could be for materials that lead to new batteries, pharmaceuticals, and more, predicts Simmons. “It won’t be long before we can start to realize new materials that have never existed before,” she said.
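One rough way to see why the two chip versions give different signatures (a toy tight-binding sketch with made-up hopping values, not SQC's device model): a 10-site chain with alternating strong and weak bonds only hosts a pair of states near zero energy when the chain terminates on weak bonds, and those extra states can show up as extra peaks as charge moves from source to drain.

```python
import numpy as np

# Toy model of a 10-dot chain with alternating "double/single bond" couplings.
# The hopping strengths below are invented for illustration only.
def midgap_states(t_edge, t_other, n=10, tol=0.2):
    """Count eigenstates near zero energy for a chain whose end bonds have strength t_edge."""
    H = np.zeros((n, n))
    for i in range(n - 1):
        t = t_edge if i % 2 == 0 else t_other
        H[i, i + 1] = H[i + 1, i] = t
    energies = np.linalg.eigvalsh(H)
    return int(np.sum(np.abs(energies) < tol))

print(midgap_states(t_edge=1.0, t_other=0.5))  # strong (double) bonds at the ends -> 0
print(midgap_states(t_edge=0.5, t_other=1.0))  # weak (single) bonds at the ends  -> 2
```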
<urn:uuid:bed56780-8698-4ea2-8c5d-564200806952>
CC-MAIN-2023-14
https://www.freethink.com/hard-tech/quantum-simulator
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296943484.34/warc/CC-MAIN-20230320144934-20230320174934-00382.warc.gz
en
0.921596
1,033
3.8125
4
In the last post of this series, we discussed how supercharging quantum computing with Quantum Mechanics’ principles allows high computational power. To come to terms with this, we must first delve into the math behind quantum memory.

A computer needs memory. It stores input and output data as a transitional place to operate on data. We care about this functionality because data encodes states. In the classical sense, a state refers to the particular arrangement that something is in at a specific moment. Examples of a classical state are the position of a door: either open or closed; the color of a marker: red, blue, yellow, etc.; the value of a bit: 0 or 1 / false or true; and so on. As you know from daily life, these states are discrete; that is, there is only one particular arrangement a system can be in.

However, a quantum state is continuous. Given a set of basis states, a quantum state may be a combination of those basis states. Translated to concrete terms, a quantum state of the colored marker could be 50% red, 25% blue, and 25% yellow. Another well-known example is Schrodinger’s Cat thought experiment; there is a 50-50 chance that the cat is either dead or alive. Intuitively, this does not make sense for how we perceive things to behave in the real world. A bit, by definition, is either 0 or 1; it cannot be in a 0.5 position or some other state because you know what state it is in by seeing the state.

The critical point in our perception is seeing, or rather more generally, measuring. Measurement is what differentiates them. When you measure a classical state, you expect the same state to always be the same; a 1 will always be a 1. On the other hand, a measurement of a quantum state need not always be the same. While it is true that a state must be found in some defined arrangement, a quantum state allows you to find a different defined arrangement each time. By nature, this measurement process on a quantum state is random, and the specific quantum state defines its probability distribution. This hand-wavy explanation of states becomes challenging to track fast, so to understand, we use math to describe it.

To represent a state we use the ket notation |·⟩. For example, the state of the color red and the logical state 0 can be represented as |red⟩ and |0⟩, respectively. This is known as the bra-ket notation or Dirac notation in quantum mechanics. To say that the system is in one of these states, we write |ψ⟩ = |red⟩ or |ψ⟩ = |0⟩. For now, this syntax works perfectly for defining the classical state of a system. However, to expand the usefulness of this nomenclature for our needs, we must think of |ψ⟩ as a vector in some space, with each basis state being an axis. For a mixed state, we could write something like |ψ⟩ = α|0⟩ + β|1⟩ for a bit. A state such as 2|0⟩ or 3|1⟩ would not make much sense, because a scaled classical state should not change being in that state. Therefore, we can restrict ourselves to having |ψ⟩ as a unit vector. This forces our state to be a point on the unit circle, in which we know α² + β² = 1. This is interesting because the probability is always in the range [0, 1], which applies to both α² and β², and the sum of the probabilities of all possible states should equal 1, another rule these two follow. Then, we can argue that the square of the state coefficient is the probability of being in that state. Thus, we call these coefficients probability amplitudes. We can verify this argument with the example of a quantum state being 50% in |0⟩ and 50% in |1⟩.

Mathematically, we could write that if α² and β² refer in some sense to the probability or proportion, then we would want α² = β² = 1/2 in this case. For this to be true, and for the vector to be normalized, we must have α = β = 1/√2, which means that α² = β² = 1/2, or a 50-50 chance for |0⟩ and |1⟩. What about negative numbers, though, for probability amplitudes? It turns out that Quantum Mechanics allows not only negative numbers but also complex numbers; the probability is then the squared magnitude of the amplitude, |α|². The theory must work!

We have expanded the two possible states of a bit from this quantum mechanical model of states to a whole space of possible linear combinations. We call this quantum mechanical bit a qubit. The state of a qubit is represented in the form |ψ⟩ = α|0⟩ + β|1⟩, where α and β are complex numbers. The space with all the possibilities of its state is shown geometrically by the Bloch Sphere. The Bloch Sphere is not a geometric representation of the vector form of the state (as that would take four dimensions: two real and two imaginary axes), but rather it is a mapping of every possible state of a single qubit. The basis states are |0⟩ for the positive z-axis and |1⟩ for the negative z-axis. Any state with real probability amplitudes lies in the x-z plane. For example, the equal-probability state (|0⟩ + |1⟩)/√2 is the rightmost point of the sphere, on the positive x-axis. Any point with a y component will have a complex probability amplitude. In general, any state can be mapped on the Bloch Sphere using spherical coordinates: |ψ⟩ = cos(θ/2)|0⟩ + e^(iφ) sin(θ/2)|1⟩.

From the behavior implanted in this mathematical foundation, two key properties arise from quantum mechanics: superposition and entanglement. We discussed before the ability of a quantum state to combine multiple basis states simultaneously. This is called superposition. When the state of a system is a mixture of basis states, we say that it is a superposed state. The remarkable aspect of superposed states is that measurement is probabilistic by nature. While classical computation finds this property problematic, we will see later that it is this property that provides the quantum computer with its incredible parallel power.

For a group of interacting quantum states, like in a quantum computer, another helpful property arises: entanglement. It connects multiple quantum states such that the measurement of one quantum state also obtains information about the other states. For example, if I prepared two qubits with specific states independent of each other and measured their states, I would obtain results only dependent on the initial state I set them in. However, if I made the qubits interact in some predictable way such that I flipped the value of one quantum state depending on the value of the other state, then from the measurement of either qubit, I could know the state of the other qubit without having measured it. Schrodinger’s Cat also gives us an analogy: knowing the state of the cat or the detonator gives you information about the other. An alive cat means an untriggered detonator and vice versa. As we will find, entanglement enables us to exploit the parallel power from superposition.

With this crash course on qubits, mathematical state representation, measurement, and some quantum properties, we can now tackle how a quantum computer works, which we will discuss in the next post.
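A quick numerical companion to the discussion above (plain NumPy with illustrative values chosen for this post's level; the ArrayFire simulator listed in the sources is a separate project and is not used here): it checks normalization and measurement probabilities for the 50-50 state, recovers the Bloch-sphere angles, and shows the all-or-nothing correlations of an entangled Bell state.

```python
import numpy as np

# Single-qubit state |psi> = alpha|0> + beta|1>  (illustrative amplitudes)
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = np.array([alpha, beta])

# Normalization and Born-rule probabilities
assert np.isclose(np.linalg.norm(psi), 1.0)          # |alpha|^2 + |beta|^2 = 1
p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
print(p0, p1)                                         # 0.5 0.5 -> the 50-50 state

# Bloch-sphere angles from |psi> = cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>
theta = 2 * np.arccos(abs(alpha))
phi = np.angle(beta) - np.angle(alpha)
print(np.degrees(theta), np.degrees(phi))             # 90.0 90.0 -> the +y axis point

# Repeated measurement reproduces the probability distribution
rng = np.random.default_rng(7)
samples = rng.choice([0, 1], size=100_000, p=[p0, p1])
print(samples.mean())                                 # ~0.5

# Entanglement: in the Bell state (|00> + |11>)/sqrt(2), only the outcomes
# 00 and 11 ever occur, so measuring one qubit fixes the other.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
print(np.abs(bell) ** 2)                              # [0.5 0.  0.  0.5]
```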
Sources and image credits:
Nielsen, M., & Chuang, I. (2010). Quantum Computation and Quantum Information: 10th Anniversary Edition. Cambridge: Cambridge University Press. doi:10.1017/CBO9780511976667
Townsend, J. S. A Modern Approach to Quantum Mechanics.
Why are complex numbers needed in quantum mechanics? Some answers for the introductory level. American Journal of Physics 88, 39 (2020). https://doi.org/10.1119/10.0000258
Dirac, P. (1939). A new notation for quantum mechanics. Mathematical Proceedings of the Cambridge Philosophical Society, 35(3), 416-418. doi:10.1017/S0305004100021162
Superposed state image: Andrew Daley, Quantum Optics and Quantum Many-Body Systems. https://qoqms.phys.strath.ac.uk/research_qc.html
Bloch sphere image: Smite-Meister, own work, CC BY-SA 3.0. https://commons.wikimedia.org/w/index.php?curid=5829358
Schrodinger’s Cat image: Mother Jones. https://www.motherjones.com/kevin-drum/2018/09/schrodingers-cat-is-alive-one-twelfth-of-the-time/
ArrayFire Quantum Simulator: https://github.com/arrayfire/afQuantumSim
<urn:uuid:5e31a1f4-da00-4aa5-b81b-d588eb0e5243>
CC-MAIN-2023-14
https://arrayfire.com/blog/quantum-states-vs-classical-states/
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296945372.38/warc/CC-MAIN-20230325191930-20230325221930-00182.warc.gz
en
0.903872
1,716
3.640625
4
Quantum computers promise huge speedups on some computational problems because they harness a strange physical property called entanglement, in which the physical state of one tiny particle depends on measurements made of another. In quantum computers, entanglement is a computational resource, roughly like a chip’s clock cycles — kilohertz, megahertz, gigahertz — and memory in a conventional computer. In a recent paper in the journal Proceedings of the National Academy of Sciences, researchers at MIT and IBM’s Thomas J. Watson Research Center show that simple systems of quantum particles exhibit exponentially more entanglement than was previously believed. That means that quantum computers — or other quantum information devices — powerful enough to be of practical use could be closer than we thought. Where ordinary computers deal in bits of information, quantum computers deal in quantum bits, or qubits. Previously, researchers believed that in a certain class of simple quantum systems, the degree of entanglement was, at best, proportional to the logarithm of the number of qubits. “For models that satisfy certain physical-reasonability criteria — i.e., they’re not too contrived; they’re something that you could in principle realize in the lab — people thought that a factor of the log of the system size was the best you can do,” says Ramis Movassagh, a researcher at Watson and one of the paper’s two co-authors. “What we proved is that the entanglement scales as the square root of the system size. Which is really exponentially more.” That means that a 10,000-qubit quantum computer could exhibit about 10 times as much entanglement as previously thought. And that difference increases exponentially as more qubits are added. Logical or physical? This matters because of the distinction, in quantum computing, between logical qubits and physical qubits. A logical qubit is an abstraction used to formulate quantum algorithms; a physical qubit is a tiny bit of matter whose quantum states are both controllable and entangled with those of other physical qubits. A computation involving, say, 100 logical qubits would already be beyond the capacity of all the conventional computers in the world. But with most of today’s theoretical designs for general-purpose quantum computers, realizing a single logical qubit requires somewhere around 100 physical qubits. Most of the physical qubits are used for quantum error correction and to encode operations between logical qubits. Since preserving entanglement across large groups of qubits is the biggest obstacle to developing working quantum devices, extracting more entanglement from smaller clusters of qubits could make quantum computing devices more practical. Qubits are analogous to bits in a conventional computer, but where a conventional bit can take on the values 0 or 1, a qubit can be in “superposition,” meaning that it takes on both values at once. If qubits are entangled, they can take on all their possible states simultaneously. One qubit can take on two states, two qubits four, three qubits eight, four qubits 16, and so on. It’s the ability to, in some sense, evaluate computational alternatives simultaneously that gives quantum computers their extraordinary power. In the new paper, Peter Shor, the Morss Professor of Applied Mathematics at MIT, and Movassagh, who completed his PhD with Shor at MIT, analyze systems of qubits called spin chains. 
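As a quick arithmetic check on the scaling comparison quoted earlier in this article (square root of the system size versus its logarithm; illustrative arithmetic only, not a calculation from the paper):

```python
import math

# sqrt(n) versus log2(n) entanglement scaling for a 10,000-qubit system
n = 10_000
print(math.sqrt(n))                     # 100.0
print(math.log2(n))                     # ~13.3
print(math.sqrt(n) / math.log2(n))      # ~7.5, i.e. roughly an order of magnitude more
```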
In quantum physics, “spin” describes the way a bit of matter — it could be an electron, or an atom, or a molecule — orients itself in a magnetic field. Shor and Movassagh consider bits of matter with five possible spin states: two up states, two corresponding down states, and a zero, or flat, state. Previously, theorists had demonstrated strong entanglement in spin chains whose elements had 21 spin states and interacted with each other in complex ways. But such systems would be extremely difficult to build in the lab. Chain, chain, chain A spin chain can be envisioned as a sequence of particles lined up next to each other. Interactions between the spins of adjacent particles determine the total energy of the system. Shor and Movassagh first considered the set of all possible orientations of their spin chain whose net energy was zero. That means that if somewhere there was a spin up, of either of the two types, somewhere there had to be a corresponding spin down. Then they considered the superposition of all those possible states of the spin chain. But the major breakthrough of the paper was to convert that superposition into the lowest-energy state of a Hamiltonian. A Hamiltonian is a matrix — a big grid of numbers — that figures in the standard equation for describing the evolution of a quantum system. For any given state of the particles in the system, the Hamiltonian provides the system’s total energy. In the previous 30 years, Movassagh says, no one had found an example of a Hamiltonian whose lowest-energy state corresponded to a system with as much entanglement as his and Shor’s exhibits. And even for Shor and Movassagh, finding that Hamiltonian required a little bit of luck. “Originally, we wanted to prove a different problem,” Movassagh says. “We tried to come up with a model that proved some other theorem on generic aspects of entanglement, and we kept failing. But by failing, our models became more and more interesting. At some point, these models started violating this log factor, and they took on a life of their own.” Pros and cons “It’s a beautiful result, a beautiful paper,” says Israel Klich, an associate professor of physics at the University of Virginia. “It certainly made for a lot of interest in some parts of the physics community. The result is in fact very, very succinct and simple. It’s a relatively simple Hamiltonian whose ground state one can understand by simple combinatorial means.” “Inspired by this work, we recently introduced a new variation on this model that is even more entangled, which has, actually, linear scaling of entanglement,” Klich adds. “The reason this was possible is that if you look at the ground state wave function, it’s so easy to understand how entanglement builds up there, and that gave us the idea of how to string it on to be even more entangled.” But John Cardy, an emeritus professor of physics at Oxford University and a visiting professor at the University of California at Berkeley, doesn’t find the MIT researchers’ Hamiltonian so simple. “If you read the description of the Hamiltonian, it takes a lot of description,” he says. “When we have physically reasonable Hamiltonians, we can just write them down in one expression. They do have an equation that tells you what the Hamiltonian is. 
But to explain what all those ingredients are requires this whole formalism that is deliberately designed, as far as I can tell, to get the result that they want.” “But I don’t want to sound unduly negative, because this is the way that science proceeds,” he adds. “You find one counterexample, then you might find others that are more reasonable.”
<urn:uuid:66881313-81dc-4e5a-a866-3d69db1591b5>
CC-MAIN-2023-14
https://news.mit.edu/2016/simple-quantum-computers-1118
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296950373.88/warc/CC-MAIN-20230402012805-20230402042805-00183.warc.gz
en
0.940422
1,551
3.796875
4
A concept of computer engineering, neuromorphic computing refers to designing computers that are based on the systems found in the human brain and nervous system. Driven by the vast potential of the human brain, neuromorphic computing aims at computers that can work as efficiently as the brain does without requiring the bulky hardware and software of conventional machines. Inspired by the human brain and the functioning of the nervous system, neuromorphic computing was introduced as a concept in the 1980s. Yet it has taken the front seat in recent times, as Artificial Intelligence has led scientists to advance neuromorphic computing across the field of technology. One of the technological advancements that has rekindled scientists' interest in neuromorphic computing is the development of the Artificial Neural Network (ANN) model. Since traditional computers, built around CPUs (Central Processing Units), are not well suited to neuromorphic computing, modern computers are now being built with dedicated hardware to support the technology. Backed by this hardware, computers can act and work more like the human brain. With the help of algorithms and data, neuromorphic computing enables computers to work rapidly and on low energy too.

While the definition of this concept can sound complicated, the working of neuromorphic computing makes its essence easier to grasp. Let's begin with the working of neuromorphic computing. The working of neuromorphic devices begins with Artificial Neural Networks (ANN) comprising millions of artificial neurons. These neurons are similar to the neurons of the human brain. Layers of these artificial neurons pass signals to one another, enabling a machine to act and work like the human brain. These electric signals, or spikes, convert an input into an output, which is how neuromorphic machines operate. The passing on of electric spikes functions on the basis of Spiking Neural Networks (SNN). This spiking neural network architecture further enables an artificial machine to perform functions that humans carry out daily, such as visual recognition and interpretation of data. Since the artificial neurons only consume power when electric spikes pass through them, neuromorphic machines consume far less power than traditional computers. By imitating the neuro-biological networks present in the human brain, neuromorphic computing machines perform tasks efficiently and effectively.

Bringing the ability to work like the human brain, neuromorphic computing has advanced developments across the field of technology. Early computer engineering produced traditional computers that consumed a lot of space. Computers based on neuromorphic computing consume much less space, with a built-in capability to work faster and better. Neuromorphic computers are specifically known for their rapid response, because their processing is extremely fast. Compared to traditional computers, neuromorphic computers are built to work like a human brain, and so their rapid response is a major highlight.
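A minimal sketch of the spiking idea described above (a generic leaky integrate-and-fire neuron in plain Python; the parameters are invented and this is not the model used by any particular neuromorphic chip): the unit integrates incoming spikes, leaks charge over time, and only produces output spikes, and hence consumes energy, when enough input spikes arrive close together.

```python
import numpy as np

rng = np.random.default_rng(42)
steps, dt = 200, 1e-3                        # simulate 200 ms in 1 ms steps
tau, threshold, weight = 20e-3, 1.0, 0.3     # leak time constant, firing threshold, synaptic weight

input_spikes = rng.random(steps) < 0.15      # sparse random input spike train
v, output_spikes = 0.0, []
for t in range(steps):
    v -= (dt / tau) * v                      # passive leak of the membrane potential
    if input_spikes[t]:
        v += weight                          # integrate an incoming spike
    if v >= threshold:                       # threshold crossed: emit a spike and reset
        output_spikes.append(t)
        v = 0.0

print(f"{int(input_spikes.sum())} input spikes -> {len(output_spikes)} output spikes")
```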
Owing to the concept of Spiking Neural Networks (SNN), neuromorphic machines work when electric spikes or signals pass through their artificial neurons. Because these neurons do work only when spikes pass through them, the machines consume little energy. Modern computers have a knack for adaptability, and so do neuromorphic computers. With higher adaptability, neuromorphic computers keep pace with the evolving demands of technology, adapting over time and working efficiently as conditions change. Machines built on the principle of neuromorphic computing are also fast learners: they form and refine algorithms from the interpretation of data as new data is fed into them, which enables rapid learning. Another striking feature of neuromorphic computing is its compact architecture. Unlike traditional computers that consumed vast space, neuromorphic computers are mobile and handy; they do not require much space and are highly efficient in terms of space occupancy.

An essential realm of AI, neuromorphic computing is significant because of its advanced technology. By making artificial computers function like the human brain, it has opened the doors to better technology and rapid growth in computer engineering. Not only does it lead to rapid growth, but neuromorphic computing chips have also changed the way computers work. From the analysis of data to machine learning algorithms, computers can do almost anything today. While neuromorphic computing was introduced as a concept in the 1980s, it has only been brought into the limelight in recent times. With numerous applications in physics, data analytics, and numerical algorithms, the significance of this concept is hard to overstate. Even though it faces many challenges, it is still leading the effort to make computers work along the lines of the human brain. "We’ve seen a lot of progress in scaling and industrialization of neuromorphic architectures. Still, building and deploying complete neuromorphic solutions will require overcoming some additional challenges."

Artificial Intelligence technology intends to impart human abilities to computers so that they work like humans. Neuromorphic computing, for its part, attempts to engineer computers that work like the human brain. Comprising millions of artificial neurons that pass electric signals to one another, neuromorphic computing has been a revolutionary concept in the realm of Artificial Intelligence. Through this approach to information processing, neuromorphic computers have become leaders of what many call the third wave of AI. The third generation of AI has led scientists to draw parallels with the human brain and its abilities, such as the interpretation of data and adaptation. With the help of machine learning, one of the techniques of AI, neuromorphic computing has advanced information processing and enabled computers to work with better and bigger technology. Thanks to AI, neuromorphic computing has reinvented its place in the field of technology and is pushing the limits of AI further.
Intertwined with each other, neuromorphic computing and AI have a long way to go as both attempt to mimic human abilities and imitate them in computer software. "Intel Labs is driving computer-science research that contributes to this third generation of AI. Key focus areas include neuromorphic computing, which is concerned with emulating the neural structure and operation of the human brain, as well as probabilistic computing, which creates algorithmic approaches to dealing with the uncertainty, ambiguity, and contradiction in the natural world." (Intel, Neuromorphic Computing)

In simple terms, the future of Artificial Intelligence is neuromorphic computing. Setting forth the third wave or era of AI, neuromorphic computing will take over the technological advancements of the field and become its driving force. While the current wave of AI faces a number of challenges, such as heavy processing hardware and software storage requirements, the third wave of neuromorphic computing in AI will most likely resolve these challenges and enable the human-like activities performed by computers. Neuromorphic chips, being manufactured by big tech companies like IBM, will be the key factor in making computers function like the human nervous system. "The neuromorphic computing market is poised to grow rapidly over the next decade to reach approximately $1.78 billion (around Rs11,570 crore) by 2025, according to a 10 April report by US-based Research and Markets. The reason is simple—the growing interest of companies in Artificial Intelligence, or AI, which can always do with more and more computing power." (Neuromorphic computing, the future of AI)

To conclude, neuromorphic computing will bring forth the untouched capabilities of AI and set a revolutionary example in the coming years. The objective of neuromorphic computing is to make computers behave like a human brain and work along the lines of the human nervous system; to that end, it posits the engineering of computers comprising millions of artificial silicon neurons that transfer electric spikes to one another. In the long run, this concept will gain more relevance and regard as it is all set to bring about the third wave of Artificial Intelligence.
<urn:uuid:03106bf7-003d-42df-8e8e-4392cad2941e>
CC-MAIN-2023-14
https://www.analyticssteps.com/blogs/what-neuromorphic-computing-working-and-features
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296950373.88/warc/CC-MAIN-20230402012805-20230402042805-00183.warc.gz
en
0.937368
1,887
4.03125
4
US Takes an Important Step Toward Quantum Internet (Inside Science) -- While researchers continue to make quantum computers increasingly capable, regular computers still hold a massive advantage: Their data, represented in sequences of zeros and ones, can ride the information superhighway. Quantum computers, which instead run on quantum superpositions of zeros and ones, can’t use the internet to communicate with each other. Multiple projects across the world are working to create a “quantum internet,” a network where quantum computers can share and exchange information. One such project, a collaboration between Brookhaven National Lab and Stony Brook University in New York, recently hit a major milestone: demonstrating that quantum bits, or qubits, from two distant quantum computers can be entangled in a third location. This is a critical step in creating a quantum internet, and significantly, the researchers did it over standard internet cables. “Part of the challenge of building a quantum internet is, to what extent can I even get quantum information through the kinds of fiber networks that we use for normal communications?” said Joseph Lykken, deputy director of research at Fermi National Accelerator Laboratory and head of the Fermilab Quantum Institute. “That’s really important, and they’re doing this at a longer distance at Brookhaven-Stony Brook than I think almost anybody else.” A new kind of computing needs a new kind of internet Quantum computers aren’t superpowerful versions of classical computers. Instead, they approach computing in a whole new way. They can theoretically take advantage of quantum mechanical concepts such as superposition and entanglement to solve certain types of problems -- for example, ones that show up when encrypting data or simulating chemical reactions -- much faster than traditional approaches. Quantum computing technology is still in the early stages of development, and many of the most promising applications remain unrealized. Other applications may have yet to be discovered. Similarly, the “quantum internet” will not be a superfast and secure version of today’s internet. Instead, it will likely have particular applications transferring quantum information between computers. To do this, the computers’ qubits are entangled, meaning they are put in a superposition in which their separate possible quantum states become dependent on each other and the qubits then become a single quantum system. Measuring the state of one of these qubits breaks the superposition, immediately influencing the state of the others -- and this measurement/entanglement process is how quantum information can be transmitted. Entanglement between two quantum computers has been experimentally possible for several years, but the team at Brookhaven and Stony Brook has gone one step further: They have created the longest quantum network in the United States by showing that two quantum computers can be entangled using a third node. This is the first step in building a network where many computers can “talk” to each other through a central node. To do the experiment, the researchers faced a challenge unique to quantum systems: In order to entangle quantum particles, which make up qubits, the particles must arrive at the node completely indistinguishable from one another even though they took different paths to get there. 
The more different the path, the more difficult this is -- and the network between Brookhaven and Stony Brook runs over traditional fiber-optic cables that are miles long, going under the neighborhoods and highways of Long Island. “It’s not really feasible to lay new cables everywhere, so being able to use what’s in the ground was important,” said Kerstin Kleese Van Dam, the director of Brookhaven’s Computational Science Initiative. Any unexpected interaction between one of the transmitted quantum particles and its environment might have made it distinguishable from the other. But despite all the potential sources of interference, the experiment was able to prove that the particles could travel over 70 kilometers (almost 45 miles) over traditional infrastructure and still arrive indistinguishable. “Our results demonstrate that these photons can be entangled, that the measurement will work,” said Eden Figueroa, a quantum physicist at Stony Brook University and lead scientist of the project. The recent experiment was one-way: The quantum computers sent their qubits to the node, but the node simply determined whether they could be entangled and didn’t send anything back. The next step, Figueroa said, is to entangle the computers’ quantum memories, which would be analogous to linking two traditional computers’ hard drives. “Down the line we hope that instead of just memories, we will be entangling computers -- not just connecting the hard drives but also the processing units,” Figueroa said. “Of course, that’s not easy.” How far away is the quantum internet? The remaining obstacles to a quantum internet are a blend of research questions and infrastructure concerns. One issue is that manipulating qubits between quantum computers requires synchronization and supervision in a way that the management of traditional bits doesn’t. This means that while quantum computers can’t directly exchange quantum information over the internet, they still need conventional computers that do use the internet to communicate. “You cannot build a quantum network and be successful without a classical network,” said Inder Monga, the director of the Energy Sciences Network, which provides networking services to all U.S. national labs. “You have to control, manage and synchronize the quantum devices over the classical network to really transmit information between the two ends of a quantum network.” This reliance on traditional internet means that the endeavor to build a quantum internet is very interdisciplinary, Monga and Figueroa said. It requires expertise in basic quantum computing research as well as communication infrastructure engineering. “There are as many research problems as are engineering problems,” Monga said, “and to really get to the vision of the quantum internet, it will require a strong collaboration between people and funding to solve not just the basic physics research problems but also the really grand engineering challenges as well.” A central obstacle to the quantum internet is what Figueroa calls “the holy grail of quantum communication”: a quantum repeater. A quantum repeater works like an amplifier, in that it receives a signal of quantum information and passes it on so that entanglement between computers can happen at a greater distance. This is necessary to make a quantum internet that spreads beyond Long Island. But there’s a catch: Any interaction with a qubit breaks its superposition -- and for information to be transmitted, that can’t happen until the qubit reaches its destination. 
A true quantum repeater would be able to amplify a qubit without interacting with it, a seemingly paradoxical task. The recent experiment is essentially half of a quantum repeater. Kleese Van Dam and Figueroa see a completed quantum repeater in the near future: possibly as soon as 2022, Figueroa said. They plan to transmit entanglement to a third lab in Brooklyn but need a quantum repeater to do so. “We hope that in a few years, we might actually have a working system with repeaters,” Figueroa said. “The minute we can demonstrate that quantum repeater connection, you just need to reproduce the same architecture, again and again, to connect places that are more and more distant from each other.” He sees a network across New York state in 10-15 years. The last obstacle is far more distant, in a future where the New York quantum network is connected to the one being built by Argonne National Laboratory and the University of Chicago, or the one being built in Europe. Those networks are built using fundamentally different quantum computers -- while the New York network uses computers whose qubits are embedded in single trapped atoms, the other networks use what are called solid-state systems to make and manipulate qubits. The two kinds of quantum computers perform computation with completely different architecture. “You can imagine that the actual quantum internet is going to be a collection of solid-state-based quantum computers like the ones in Chicago and atomic-based quantum computers like the ones we have here, and we have to find a way to connect all of them to really come out with a first prototype of the quantum internet,” Figueroa said. “That would be very cool. That would be like science fiction.” In July 2020, the U.S. Department of Energy released a “blueprint” of their strategy to create a national quantum internet. This effort includes the Brookhaven-Stony Brook project and the Argonne-University of Chicago project, which are in turn both supported by research at other national labs such as Fermi National Accelerator Laboratory, and Lawrence Berkeley, Oak Ridge, and Los Alamos National Laboratories. “While quantum computing has gotten a lot of press and funding, the wave is going toward quantum networking,” Figueroa said, “because unless you connect quantum computers into this quantum internet, their applications will be limited. So, it is a good time to be doing these kinds of experiments.”
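A compact way to see what "entangling two distant qubits at a third location" means (a textbook entanglement-swapping calculation in plain NumPy, not a model of the Brookhaven-Stony Brook hardware): each lab shares a Bell pair with the middle node, and a Bell-basis measurement on the two qubits at the node leaves the two lab qubits, which never interacted, in a Bell state of their own.

```python
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)       # (|00> + |11>)/sqrt(2)

# Qubit ordering: A (lab 1), B (sent to node), C (sent to node), D (lab 2)
psi = np.kron(bell, bell).reshape(2, 2, 2, 2)    # |Phi+>_AB tensor |Phi+>_CD

# Project B and C onto |Phi+> -- one of the four Bell-measurement outcomes
chi = (psi[:, 0, 0, :] + psi[:, 1, 1, :]) / np.sqrt(2)   # unnormalized state of A, D
prob = float(np.sum(np.abs(chi) ** 2))           # probability of this outcome
chi /= np.sqrt(prob)

fidelity = float(np.abs(np.vdot(bell, chi.reshape(4))) ** 2)
print(f"outcome probability = {prob:.2f}, A-D fidelity with a Bell pair = {fidelity:.2f}")
# -> outcome probability = 0.25, A-D fidelity with a Bell pair = 1.00
```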
<urn:uuid:9f2bcbda-057d-4972-86a5-679409fd8932>
CC-MAIN-2023-14
https://www.insidescience.org/news/us-takes-important-step-toward-quantum-internet
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296945218.30/warc/CC-MAIN-20230323225049-20230324015049-00383.warc.gz
en
0.928467
1,921
3.609375
4
The current cybersecurity architecture is based on cryptographic algorithms implemented all over the world. Authentication, digital signatures, encryption, electronic transactions, time-stamping, and secure network communication are a few examples of everyday security implementations that are based on cryptography. The success of these cryptographic measures depends on the computational complexity of the cryptographic protocols. For instance, the Advanced Encryption Standard (AES) used in symmetric cryptography has a key length of 128 (AES-128), 192 (AES-192), or 256 (AES-256) bits. An attacker launching a brute-force attack against AES-128 would need to try on the order of 2^127 keys, which could take thousands of years with current computing power. However, with the evolution and advancement of quantum computing, the effective complexity of these cryptographic algorithms may be reduced significantly. Although quantum computing is a great achievement for the IT sector, it can also be a serious threat to the current cyberspace architecture. In this article, we will explore the idea of quantum computing, the current advancements in the field, and the security challenges that may arise as quantum computing matures.

What is Quantum Computing

Quantum computing is the scientific study and advancement of computer technology based on quantum physics. Quantum physics is the study of energy and matter at the most fundamental (i.e., atomic and subatomic) level. To understand quantum computing, we need to understand a few quantum physics terms, including quantum, qubit, entanglement, and superposition.

Quantum: A quantum in physics is the smallest possible unit of a physical property. In quantum computing, quantum mechanics is what the systems use to calculate their outputs.

Superposition: The concept of superposition can be understood with the help of a coin example. Consider flipping a coin. In classical computing, we can only predict one of two positions of the coin, i.e., heads or tails. However, if we could see both heads and tails at the same time and all the states in between, the coin would be said to be in superposition.

Qubit: The basic unit of information in quantum computing is called the quantum bit, or simply the qubit. The role of the qubit in quantum computing is like the role of the binary bit (0, 1) in classical computing. However, the qubit can hold a superposition of all possible states.

Entanglement: Entanglement is the property that correlates the results of one qubit with those of other qubits. This property allows adding more qubits to the system and solving exponentially complex problems.

Quantum interference: Quantum interference is the natural behavior of qubits while in superposition. Reducing unwanted interference is one of the biggest challenges of quantum computer technology.

To conclude the above, quantum computing can be thought of as computation that benefits from the collective properties of quantum states. Quantum computing relies on quantum computers with three major requirements: a space to hold the qubits, a mechanism to transfer signals to the qubits, and classical computers to run commands and instructions.

Applications of Quantum Computing

Although quantum computing is at an early stage, experts have predicted its future use in different domains. Following is a brief summary of some of the most popular fields where quantum computing is expected to play a major role in the future.
Artificial Intelligence (AI) is a fast-growing field with widespread applications, such as object recognition, speech recognition, fraud detection, spam detection, and malware detection. Although traditional computers handle the majority of AI tasks efficiently, they have limitations when solving complex computational problems. Quantum computers, on the other hand, can manage these problems with better accuracy. For instance, AI is used in the diagnosis of various ailments by discovering patterns; quantum computing can make the discovery process faster and more accurate.

The finance industry uses different simulations and algorithms to find the best investment opportunities. Traditional computers are used to find the best indicators by comparing past data, but they are not as well suited to making the right decisions as quantum computers would be. Quantum computers, being faster and more capable, could identify the right investments in a very short span of time.

Quantum computing can also facilitate the weather forecasting sector. Traditional computing technology often makes wrong weather predictions, since climate indicators change dramatically, making it difficult for traditional computers to predict the changes promptly and precisely. These issues could be eased by the introduction of quantum computing to weather forecasting.

Quantum computing is expected to play a major role in solving optimization problems in industry using different quantum approaches (models). There are two main models in quantum computing used for optimization problems: (1) the universal gate model and (2) quantum annealers. Energy grids, large autonomous fleets, financial portfolios, logistics routes, and manufacturing designs are a few example sectors where quantum computing can play the optimization role.

With classical computers and technology, pharmaceutical companies spend billions of dollars and many years to launch and market a product. Quantum computing could reduce this cost and time through better drug design, more efficient testing, and faster target (patient) identification. Products designed and tested using quantum computing should be less prone to errors and trial-and-error iterations than pharmaceutical products prepared using classical R&D techniques.

Quantum Computing Security Challenges

Although quantum computing is going to be a miraculous addition to global technology, it can create undesired problems for the cybersecurity industry at the same time. In cyberspace, data is considered the most valuable asset of any organization. Global industries spend billions of dollars to secure data from being hacked. The main technique used for data security is the application of encryption algorithms. The majority of existing encryption techniques used by enterprises are based on the idea of combinatorics, a mathematical field that deals with selection, arrangement, and mathematical operations within discrete or finite systems. Quantum computing can solve such combinatorial problems far more easily and thereby break many of these encryption algorithms. This capability of quantum computers is a big threat to current encryption standards, including AES and RSA public-key cryptography.

There is a great deal of benefit for industries in adopting quantum computing in the future. Data storage methods, processing techniques, and complex calculations in very little time are the key attributes that differentiate quantum computing from classical computing.
Threats to current encryption algorithms and heavy energy consumption are the main drawbacks of quantum technology. Many organizations have started working on quantum-ready (post-quantum) algorithms to counter future quantum threats to cybersecurity.
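As a rough order-of-magnitude illustration of the threat discussed above (simple arithmetic, not an analysis of any concrete attack): Shor's algorithm gives an exponential speedup against RSA-style public-key schemes, while against symmetric ciphers such as AES, Grover's algorithm "only" reduces an exhaustive search over 2^n keys to roughly 2^(n/2) iterations.

```python
import math

# Classical brute force versus Grover-style search for an n-bit symmetric key.
for n in (128, 192, 256):
    classical_digits = (n - 1) * math.log10(2)   # ~2^(n-1) expected classical trials
    grover_digits = (n / 2) * math.log10(2)      # ~2^(n/2) Grover iterations
    print(f"AES-{n}: ~10^{classical_digits:.0f} classical trials "
          f"vs ~10^{grover_digits:.0f} Grover iterations")
# AES-128 drops to roughly 64-bit effective strength and AES-256 to roughly
# 128-bit, which is why doubling symmetric key lengths is a common
# quantum-readiness recommendation.
```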
<urn:uuid:dd12df41-3689-4464-acb2-e57531889b00>
CC-MAIN-2023-14
https://www.hackingloops.com/hacking-news/
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296949598.87/warc/CC-MAIN-20230331082653-20230331112653-00585.warc.gz
en
0.908866
1,344
3.53125
4
Every particle reaction happens because of interaction with one of the fundamental forces. Baryons and mesons (hadrons, not leptons) are affected by the strong force. The weak force affects most particles. Particle decay occurs because of the 3 forces. Electrons don’t decay, they just turn themselves into other things, some with an electric charge. The law of conservation of parity of a particle (not true for the beta decay of nuclei) states that, if an isolated ensemble of particles has a definite parity, then the parity remains invariable in the process of ensemble evolution. Parity is a property that is important in the quantum-mechanical description of a physical system. In most cases, it relates to the symmetry of the wave function representing a system of fundamental particles. A parity transformation replaces such a system with a type of mirror image. A strange particle is an elementary particle with a strangeness quantum number different from zero. The classification of particles as mesons and baryons follows the quark/anti-quark and three-quark content respectively.

Particles of matter transfer discrete amounts of energy by exchanging bosons with each other. Each fundamental force has its own corresponding boson: the strong force is carried by the “gluon”, the electromagnetic force is carried by the “photon”, and the “W and Z bosons” are responsible for the weak force. Bosons are force-carrying particles. This means that they are made up of tiny bundles of energy. Photon: light is made up of a type of boson called a photon. Mesons are hadrons that do not decay into protons, such as pions and kaons. Pions and kaons can be positive, neutral, and negative. Baryons and mesons aren’t fundamental particles and so can be split into smaller particles known as quarks. Leptons are particles that interact using the weak nuclear force. Hadrons are the heaviest particles. This group is then split up into baryons and mesons. Baryons are the heaviest particles of all, followed by mesons. Leptons are the lightest particles. These 3 families carry a force from place to place. All 3 families have anti-particles.

Symmetry is the causal structure built into the creation module. The creation module has a two-way arrow of time that is built into it. All current information is always passed back into the versatile storage unit. These informational totals can’t be changed or deleted. The closed sub-atomic quantum system is a duplicate of the macro quantum system. The two systems interact on a binary basis.

Photons mediate electromagnetic interactions between particles in quantum electrodynamics. An isolated electron at a constant velocity cannot emit or absorb a real photon; doing so would violate conservation of energy and momentum. Instead, virtual photons can transfer momentum between two charged particles. Everything quantum has a wave-particle property duality. Also, quantum indeterminacy can be quantitatively characterized by a probability distribution on the set of outcomes of measurements of an observable. The distribution is uniquely determined by the system state, and moreover, quantum mechanics provides a recipe for calculating this probability distribution. Indeterminacy in measurement was not an innovation of quantum mechanics, since it had been established early on by experimentalists that errors in measurement may lead to indeterminate outcomes.
Retrocausality, or backward causation, is a concept of cause and effect in which an effect precedes its cause in time, so that a later event affects an earlier one. In quantum physics, the distinction between cause and effect is not made at the most fundamental level, and so time-symmetric systems can be viewed as causal or retrocausal. Quantum entanglement may seem to violate causality in the context of Bell's theorem, but in reality it does not. In the subatomic realm, where the laws of quantum physics make seemingly impossible feats routine, the one thing that we always considered beyond the pale might just be true. This idea that the future can influence the present, and that the present can influence the past, is known as retrocausality. Entanglement of two particles does not violate causality when the particles are first entangled.

Key dates:
- Dalton's atomic theory; Thomson's discovery of the electron and the plum-pudding model
- Rutherford's scattering of α particles with Geiger & Marsden, Manchester
- Bohr's theory of the atom
- Rutherford – the discovery of the proton
- Dirac predicts the positron
- Pauli predicts the neutrino; Fermi names the predicted neutrino
- Anderson observes the positron
- Chadwick – the discovery of the neutron
- Segrè & Chamberlain – the anti-proton observed
- The neutrino observed
- Salam & Weinberg predict the W & Z bosons
- Perl – discovery of the tau (1784 MeV/c²)
- CERN – the W & Z bosons observed
- The Higgs boson (not quite yet)

Forces:
1. Strong (hadronic) nuclear interaction (carried by exchange bosons called gluons).
2. Electromagnetic interaction (carried by the exchange particle, the photon).
3. Weak nuclear interaction (carried by the W & Z exchange bosons).
4. Gravitational interaction (carried by the predicted graviton exchange boson).

Classification of the particles in the atom:
- Hadrons: feel the strong, hadronic interaction; made of quarks (the flavors up, down & strange, plus the newer top, bottom & charm).
  - Baryons: spin ½, so fermions; heavy hadrons; triplets of quarks. Nucleons: the proton and neutron. Others: Lambda, Sigma, Xi.
  - Mesons: spin 0, 1, 2, ..., so bosons; intermediate mass; a quark & an anti-quark. Examples: the pion, kaon, and eta.
- Leptons: spin ½, so fermions; weak interactions; fundamental particles with no size; mostly light; each has an anti-particle. Electron e and electron neutrino νe; muon μ and muon neutrino νμ; tau τ (heavy) and tau neutrino ντ.
- Quarks: fundamental particles; up, down, strange, top, bottom, charm.

COMMON SENSE DOESN'T APPLY TO QUANTUM MECHANICS. Every symmetry of the laws of physics leads to a conservation law, and every conservation law arises from a symmetry in the laws of physics. THE LAW THAT CONTROLS ALL PARTICLE INTERACTIONS IS THIS: ALL THINGS ARE TRIUNE, WITH BINARY INTERACTIVES. THIS IS THE LINKAGE BETWEEN MATTER AND FORCE-CARRYING PARTICLES. FERMIONS AND BOSONS CONTROL THE LINKAGE BETWEEN THE PARTICLE ZOO. THE REALITY OF HOW LIFE FORMS CAME ABOUT ON THIS REMOTE BLUE MARBLE IS THIS: THE EVENT ORIGINATOR WROTE THE CODE, PRODUCED THE BLUEPRINT, AND USED AN EVOLVEMENT PROCESS TO OBTAIN THE REQUISITE RESULT. IT'S ALL JUST A BINARY SOFTWARE PROGRAM. IT'S ALL ABOUT THE CODE THAT YOU START WITH. THE DESIGNER/CREATOR'S PROCESS (recap): 1st: Write the code for the upcoming big bang that will create another universe. (One universe does not an infinity make.) 2nd: Write the code for the design and descent for all intended results as the event unfolds. (One event does not an eternity make.) 3rd: Set the event in motion. All things are triune, with binary interactives. 4th: Monitor, fine-tune, adjust, and select out, ongoing. 5th: Use DESIGN AND DESCENT as the process.
Write a separately coded blueprint for the consciousness of the known thought reposers. 6th: It's not the people, it's the event. 7th: Harvesting new crops of known thought reposers was the intended result. ONE EVENT DOES NOT AN ETERNITY MAKE. EVOLUTION IS ONLY PART OF THE PROCESS. ALL THINGS ARE TRIUNE, WITH BINARY INTERACTIVES. BEYOND GOOD AND EVIL IS ONLY GOOD. IT'S NOT THE PEOPLE, IT'S THE EVENT. GET BACK TO WHERE YOU ONCE BELONGED. IN THE END, CHOICES ARE NEVER FREE. WHAT TO DO IS UP TO YOU. ONE UNIVERSE DOES NOT AN INFINITY MAKE. GETTING BEYOND EVIL IS A START.
<urn:uuid:d40b7777-c964-4230-9fbe-20ae9a1c0683>
CC-MAIN-2023-14
https://vernbender.com/18253-2/
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296950373.88/warc/CC-MAIN-20230402012805-20230402042805-00188.warc.gz
en
0.861504
1,813
4.28125
4
What are "Quantum Computers"? A quantum computer is a device for computation that makes direct use of quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers will harness the power of atoms and molecules to perform memory and processing tasks. Quantum computers differ from normal computers, which use transistors and store data in binary form, i.e. 0s and 1s; a quantum computer uses quantum properties to represent data and perform operations on it. Quantum computers are still largely confined to the lab, but some progress has been made in performing small operations on these "qubits". Large-scale quantum computers could solve certain problems much faster than any classical computer running the best currently known algorithms, such as integer factorization using Shor's algorithm or the simulation of quantum many-body systems. There exist quantum algorithms, such as Simon's algorithm, that run faster than any possible probabilistic classical algorithm.

Shor's algorithm: this algorithm is used for integer factorization and can find the prime factors of a given number N in polynomial time. It matters for data encryption and protection, where security rests on the difficulty of finding the prime factors of large numbers.

Simon's algorithm: a quantum algorithm that solves a black-box (oracle) problem exponentially faster than any probabilistic classical algorithm; related ideas appear in string and expression matching, Fourier transforms, and more.

How is data represented in quantum computing and how is this different from normal computers? A classical computer has a memory made up of bits, where each bit represents either a one or a zero. A quantum computer maintains a sequence of qubits. A single qubit can represent a one, a zero, or, crucially, any quantum superposition of these; moreover, a pair of qubits can be in any quantum superposition of 4 states, and three qubits in any superposition of 8. In general, the physical state of a qubit is the superposition |ψ⟩ = α|0⟩ + β|1⟩, where α and β are complex numbers with |α|² + |β|² = 1. The state of a qubit can be described as a vector in a two-dimensional Hilbert space, a complex vector space. The special states |0⟩ and |1⟩ are known as the computational basis states, and form an orthonormal basis for this vector space. [Image used under the GNU Free Documentation License 1.2] The Bloch sphere is a representation of a qubit, the fundamental building block of quantum computers.

The Turing machine, developed by Alan Turing in the 1930s, is a theoretical device that consists of a tape of unlimited length that is divided into little squares. Each square can either hold a symbol (1 or 0) or be left blank. A read-write device reads these symbols and blanks, which gives the machine its instructions to perform a certain program. Does this sound familiar? Well, in a quantum Turing machine, the difference is that the tape exists in a quantum state, as does the read-write head. This means that the symbols on the tape can be either 0 or 1 or a superposition of 0 and 1; in other words, the symbols are both 0 and 1 (and all points in between) at the same time. While a normal Turing machine can only perform one calculation at a time, a quantum Turing machine can perform many calculations at once. Today's computers, like a Turing machine, work by manipulating bits that exist in one of two states: a 0 or a 1.
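To make the state description above concrete, here is a small numpy sketch of a single-qubit superposition |ψ⟩ = α|0⟩ + β|1⟩ and of how the number of amplitudes doubles with each added qubit. The amplitudes are arbitrary illustrative values, not taken from any source:

```python
import numpy as np

# Single qubit: |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # illustrative amplitudes
qubit = np.array([alpha, beta])
print(abs(alpha) ** 2 + abs(beta) ** 2)          # -> 1.0, normalization holds

# Joining qubits takes a tensor product, so the amplitude count doubles per qubit:
two_qubits = np.kron(qubit, qubit)
print(two_qubits.shape)                          # -> (4,)
for n in (1, 2, 3, 10, 20):
    print(n, "qubit(s) ->", 2 ** n, "amplitudes")
```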
Quantum computers aren't limited to two states; they encode information as quantum bits, or qubits, which can exist in superposition. Qubits are realized in atoms, ions, photons or electrons, together with their respective control devices, all working together to act as computer memory and a processor. Because a quantum computer can hold these multiple states simultaneously, it has the potential to be millions of times more powerful than today's most powerful supercomputers.

Quantum computers also utilize another aspect of quantum mechanics known as entanglement. One problem with the idea of quantum computers is that if you try to look at the subatomic particles, you could bump them, and thereby change their value. If you look at a qubit in superposition to determine its value, the qubit will assume the value of either 0 or 1, but not both.

Can a real quantum computer be made? To make a practical quantum computer, scientists have to devise ways of making measurements indirectly to preserve the system's integrity. Entanglement provides a potential answer. In quantum physics, if you apply an outside force to two atoms, it can cause them to become entangled, and the second atom can take on the properties of the first atom. Left alone, an atom's spin stays undetermined in every direction; the instant it is disturbed, it chooses one spin, or one value, and at the same time the second, entangled atom chooses the opposite spin, or value. This allows scientists to know the value of the qubits without actually looking at them.

If functional quantum computers can be built, they will be valuable in factoring large numbers, and therefore extremely useful for decoding and encoding secret information. If one were to be built today, no information on the Internet would be safe. Our current methods of encryption are simple compared to the complicated methods possible in quantum computers. Quantum computers could also be used to search large databases in a fraction of the time that it would take a conventional computer. Other applications could include using quantum computers to study quantum mechanics, or even to design other quantum computers. But quantum computing is still in its early stages of development, and many computer scientists believe the technology needed to create a practical quantum computer is years away. Quantum computers must have at least several dozen qubits to be able to solve real-world problems, and thus serve as a viable computing method.
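The "always opposite" (or, for other entangled states, "always equal") measurement pattern described above can be imitated by sampling from a Bell state's outcome distribution. This is a toy classical simulation of the statistics, not a real quantum experiment:

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>) / sqrt(2), amplitudes over the basis 00, 01, 10, 11.
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(phi_plus) ** 2          # Born-rule probabilities for the four outcomes

rng = np.random.default_rng(0)
samples = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(samples)   # only "00" and "11" ever appear: the two qubits are perfectly correlated
```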
<urn:uuid:573d868c-f5fd-47ee-bce9-28ab84fc9743>
CC-MAIN-2023-14
https://dailyjag.com/technology/what-are-quantum-computers/
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296945372.38/warc/CC-MAIN-20230325191930-20230325221930-00189.warc.gz
en
0.925973
1,245
3.6875
4
QC101 Quantum Computing & Intro to Quantum Machine Learning

What you'll learn
- Use quantum cryptography to communicate securely
- Develop, simulate, and debug quantum programs on IBM Qiskit and Microsoft Q#
- Run quantum programs on a real quantum computer through IBM Quantum Experience
- Use Dirac's notation and quantum physics models to analyze quantum circuits
- Train a Quantum Support Vector Machine (Quantum Machine Learning) on real-world data and use it to make predictions
- Learn data science and how quantum computing can help in artificial intelligence / machine learning
- Learn why machine learning will be the killer-app for quantum computing
- 12th grade level high-school Math and Physics
- You must have studied Math and Physics up to 12th grade level and *enjoyed* it. Quantum Computing is primarily about Math & Physics. There is very little coding involved.
- 12th grade level, high school Math: complex numbers, linear algebra, probability, statistics, & Boolean logic

Welcome to the bestselling quantum computing course on Udemy! Quantum Computing is the next wave of the software industry. For certain problems, quantum computers are exponentially faster than today's classical computers. Problems that were considered too difficult for computers to solve, such as the simulation of protein folding in biological systems and the cracking of RSA encryption, are expected to become tractable on quantum computers.

How fast are quantum computers? A 64-qubit quantum register spans about 18 billion billion (2⁶⁴) basis states that all take part in each step of computation. Compare that to the 8 bytes that your home computer can process in each step of computation!

Companies like Google, Intel, IBM, and Microsoft are investing billions in their quest to build quantum computers. If you master quantum computing now, you will be ready to profit from this technology revolution.

This course teaches quantum computing from the ground up. The only background you need is 12th grade level high-school Math and Physics. IMPORTANT: You must enjoy Physics and Math to get the most out of this course. This course is primarily about analyzing the behavior of quantum circuits using Math and Quantum Physics. While everything you need to know beyond 12th grade high school science is explained here, you must be aware that Quantum Physics is an extremely difficult subject. You might frequently need to stop the video and replay the lesson to understand it.

QUANTUM MACHINE LEARNING
It appears that the killer-app for quantum computing will be machine learning and artificial intelligence. Quantum machine learning algorithms provide a significant speed-up in training. This speed-up can result in more accurate predictions. While understanding quantum algorithms requires mastery of complex math, using quantum machine learning is relatively simple. Qiskit encapsulates machine learning algorithms inside an API that mimics the popular Scikit-Learn machine-learning toolkit. So you can use quantum machine learning almost as easily as you would traditional ML! Quantum machine learning can be applied in the back-end to train models, and those trained models can be used in consumer gadgets. This means that quantum machine learning might enhance your everyday life even if quantum computers remain expensive!

We begin by learning about basic math. You might have forgotten the math you learned in high-school. I will review linear algebra, probability, Boolean algebra, and complex numbers.
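For scale, here is a back-of-the-envelope check of that 2⁶⁴ figure in Python, along with the classical memory a full state vector of that size would need (assuming 16 bytes per complex amplitude, a common double-precision layout; this is an illustration, not a claim about any particular machine):

```python
n_qubits = 64
amplitudes = 2.0 ** n_qubits            # ~1.8e19 basis-state amplitudes
bytes_needed = amplitudes * 16          # complex128: 16 bytes per amplitude

print(f"{amplitudes:.3e} amplitudes")
print(f"~{bytes_needed / 1e18:.0f} exabytes to store the state vector classically")
```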
Quantum physics is usually considered unapproachable because it deals with the behavior of extremely tiny particles. But in this course, I will explain quantum physics through the behavior of polarized light. Light is an everyday phenomenon and you will be able to understand it easily. Next we learn about quantum cryptography. Quantum cryptography is provably unbreakable. I will explain the BB84 quantum protocol for secure key sharing. Then we will learn about the building-blocks of quantum programs which are quantum gates. To understand how quantum gates work, we will study quantum superposition and quantum entanglement in depth. We will apply what we have learned by constructing quantum circuits using Microsoft Q# (QSharp) and IBM Qiskit. For those of you who don't know the Python programming language, I will provide a crisp introduction of what you need to know. We will begin with simple circuits and then progress to a full implementation of the BB84 quantum cryptography protocol in Qiskit. We will learn how to use Qiskit's implementation of Shor's algorithm for factoring large numbers. The killer-app for quantum computing is quantum machine learning. To understand quantum machine learning, we must first learn how classical machine learning works. I provide a crisp introduction to classical machine learning and neural networks (deep learning). Finally, we will train a Quantum Support Vector Machine on real-world data and use it to make predictions. For a better learning experience, open the transcript panel. You will see a small "transcript" button at the bottom-right of the video player on Udemy's website. If you click this button, the transcript of the narration will be displayed. The transcripts for all the videos have been hand-edited for accuracy. Opening the transcript panel will help you understand the concepts better. If you missed an important concept, then you can click on text in the transcript panel to return directly to the part you want to repeat. Conversely, if you already understand the concept being presented, you can click on text in the transcript panel to skip ahead in the video. Enroll today and join the quantum revolution! Who this course is for: - Software professionals and technical managers who want to learn quantum computing and enjoy Math & Physics - Machine Learning and AI professionals who want to learn how quantum computing can be used in data science I am passionate about making technology easy to understand. I have taught students at the University of Massachusetts and guided software professionals at Cadence Design Systems, iCOMS, Empirix, Relona, and Johnson & Johnson. My goal is to help you earn more than $200,000 annually as a software professional. I focus on teaching AI and Quantum Computing because these are the highest paid skills in the industry. My courses help beginners who have a basic understanding of high school Math and coding. In about 6 months you can complete several courses and become an expert earning $200+ per hour. In addition to teaching technical skills, I also help you build leadership ability. My courses discuss trade-offs between various technical choices and help you take wise decisions. As an expert software professional, you will be able to recommend solutions, suggest implementation choices, and guide software design. My courses have a 30 day money back guarantee. Check out the free video previews and enroll today. 
I have an electrical engineering degree from IIT and a masters degree in computer science from the University of Massachusetts. I have managed software teams and helped startups launch products in international markets. I have lived most of my professional life in the Boston area. I enjoy reading science fiction and economic theory. I am a gourmet who loves to try out interesting recipes and new restaurants with friends and family.
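The BB84 protocol mentioned in the course outline above has a purely classical bookkeeping step, called sifting, that is easy to sketch without any quantum hardware: both parties pick random bases and keep only the bit positions where the bases match. This is a hedged illustration of that step only (no eavesdropper, no error correction), not the course's Qiskit implementation:

```python
import random

random.seed(1)
n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("ZX") for _ in range(n)]   # Z = rectilinear, X = diagonal
bob_bases   = [random.choice("ZX") for _ in range(n)]

# With matching bases (and no eavesdropper), Bob's measurement reproduces Alice's bit;
# with mismatched bases his result is random.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: compare bases publicly and keep only the positions where they match.
sifted = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
print(sifted)   # on average about half of the positions survive
```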
<urn:uuid:4c32ff82-8585-40fc-8db0-f18ab8a6ed27>
CC-MAIN-2023-14
https://www.udemy.com/course/qc101-introduction-to-quantum-computing-quantum-physics-for-beginners/?LSNPUBID=vedj0cWlu2Y&%3BranEAID=vedj0cWlu2Y&%3BranMID=39197&%3BranSiteID=vedj0cWlu2Y-271V9_hcDfy2AneMmUHrmw&%3Butm_medium=udemyads&%3Butm_source=aff-campaign&ref=qmedia
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296945183.40/warc/CC-MAIN-20230323194025-20230323224025-00189.warc.gz
en
0.913485
1,447
3.5625
4
Nanoelectronics – in which semiconductor components’ critical features such as logic transistors and memory measure well under 100 nm – is a new and rapidly growing field. Potential applications in quantum computing, advanced memory, and energy storage and generation make this a tiny technology with enormous potential. Image Credit: NIMEDIA/Shutterstock.com A History of Nanoelectronics The first nanoscale electronic devices were developed by researchers in the 1960s, who produced gold thin film just 10 nm in thickness as the base for a metal semiconductor junction transistor. In the late 1980s, a team of IBM researchers demonstrated the first metal oxide semiconductor field effect transistor (MOSFET) with a gate oxide thickness of just 10 nm. The product used tungsten gate technology to achieve this nanoscale dimension. The first multi-gate MOSFET, the FinFET, was developed in 1989. The FinFET, or fin field effect transistor, is a double gate MOSFET that is also three-dimensional and non-planar. In 2002, a 10 nm FinFET was fabricated. A CMOS (complementary MOS) transistor was developed in 1999 to show just how far MOSFET transistor technology could take us in terms of nanoscale electronics. Just a few years later, in 2006, researchers developed a MOSFET measuring just 3 nm, making it the smallest nanoelectronic device in the world at the time. Nanoelectronic semiconductor devices went into commercial production in the 2010s, and Samsung is currently releasing a 3 nm GAAFET, or gate all round FET on the market. How Was Nanoelectronics Made Possible? Nanoscale electronic devices are the culmination – or, more accurately, the latest product – of decades of cutting-edge research in nanosciences and nanotechnology. Ever since Richard Feynman proposed the possibility of computing with “submicroscopic” computers in a groundbreaking lecture in 1959, researchers at the forefront of physics, materials science, and instrumentation design and manufacturing development have been pursuing ever smaller microscopic measurements. This journey has led to a number of groundbreaking applications in information and communication technology (ICT) – personal computers, smartphones, Internet of Things (IoT) technology, and many more everyday game changers of the modern world rely on progress gained by nanoelectronic research. But truly, nanoelectronic devices are yet to reach our shelves. While semiconductor technology has progressed remarkably, even microscopic computers are still mostly out of reach. Sub microscopic (nanoscale) computers may be even farther off. Current semiconductor technology may even be limited in terms of minimum system sizes, and we may be approaching that limit. However, nanoelectronics currently under development may enable us to break this barrier by developing and manufacturing truly sub-microscopic, nanoscale electronic devices in the next few decades. Moving Nanoelectronics Forward Researchers propose that the best way to develop nanoelectronics is to combine microelectronic devices with nanoelectronics devices in hybrid systems. This approach builds on the progress already achieved in microelectronics, for example, in developing microelectromechanical systems (MEMS) technology which has brought numerous MEMS sensors like accelerometers and microphones, magnetometers and gyroscopes, and even power generators to the market. One current innovation pursuing this hybrid approach is a solid state quantum effect nanoelectronic device for resonance tunneling. 
The device uses a standard silicon bulk effect transistor to generate a multi-state switching device, which its developers refer to as a "resonance tunneling transistor." The resonance tunneling transistor can be used to make circuitry with greater available logic density than conventional microelectronic transistor logic is capable of.

Another nanoelectronic device in development is the single electron transistor, or SET. The SET is a switching device that controls electron tunneling and uses this to amplify a current. Two tunnel junctions, each made of two pieces of metal with a sub-nanometer-thin insulator between them, share a common electrode. Electrons must tunnel through the insulator material to get between electrodes. Because quantum tunneling is a discrete process, the charge that flows through a tunnel junction arrives in multiples of the electron's charge.

Devices such as tunneling devices and quantum dots work with quantized energy: energy in its smallest possible interacting parts, at a scale where quantum physics phenomena like particle entanglement, tunneling, and particle superposition can be observed. Nanoelectronic devices confine electrons to extremely small regions. Naturally, energy quantization and its effects on the devices and their intended functions are significant research focuses at the moment.

Another focus of research in the nanoelectronics field is investigating ways to use electrically conductive polymers in nanoscale applications of organic electronics. Researchers are studying electrically conductive nanostructured polymers, nanoparticle-based polymers, and polymer nanocomposites dispersed with conductive nanoparticles. These materials are well suited to creating organic nanoscale electronic devices because the nanopolymers serve as building blocks for both simple and complicated hierarchical nanostructures. Organic nanoelectronics and nanostructured electronic systems can also be used in conjunction with π-conjugated polymers to act as electron acceptors in next-generation organic nanoscale photovoltaic devices.
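One way to see why single-electron behavior only emerges at the nanoscale is the charging energy of a tunnel junction, E_C = e²/2C, which must be much larger than the thermal energy k_BT for Coulomb blockade to be observable. Below is a rough, hedged estimate in Python; the attofarad-scale capacitance is an assumed illustrative value, not a figure from the article:

```python
e_charge = 1.602e-19   # electron charge, C
k_B      = 1.381e-23   # Boltzmann constant, J/K

C = 1e-18              # assumed junction capacitance: 1 aF (illustrative only)
E_C = e_charge ** 2 / (2 * C)          # charging energy in joules

print(E_C / e_charge, "eV")            # ~0.08 eV
print(E_C / k_B, "K equivalent")       # ~930 K
# Room temperature corresponds to ~0.025 eV, so for a 1 aF junction the charging
# energy dominates thermal fluctuations and single-electron effects are resolvable.
```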
<urn:uuid:3459bc20-3acf-45b7-9459-558363ee79ea>
CC-MAIN-2023-14
https://www.azonano.com/article.aspx?ArticleID=6234
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296945315.31/warc/CC-MAIN-20230325033306-20230325063306-00189.warc.gz
en
0.88387
1,589
3.84375
4
Quantum computing is an emerging technology that promises to revolutionize computing as we know it. By leveraging the principles of quantum physics, quantum computers could perform calculations at speeds and with levels of accuracy that far surpass traditional computers. This means that previously intractable problems could be solved in minutes or seconds and with previously impossible levels of accuracy. In this blog post, we will explore the benefits and advantages of quantum computing and discuss how it can be used to solve challenging real-world problems.

Quantum computing has the potential to greatly increase processing power in comparison to traditional computers. By utilizing quantum-level operations, quantum computers can execute more calculations per second than classical computers. This increased processing power makes it possible for quantum computing to tackle complex problems and solve them faster than a traditional computer. Quantum computers can store and process data using qubits (quantum bits), allowing information to be stored in multiple states simultaneously. This allows quantum computers to account for many variables simultaneously, making it easier to analyze and solve complex problems. With its increased processing power, quantum computing can speed up tasks that would normally take an incredibly long time on a traditional computer.

Quantum computing can also provide a major leap forward in the efficient use of resources. Instead of using multiple processors or massive computing power, quantum computing can help make processing much faster. It can also handle more complex calculations and tasks using fewer resources than traditional computers. Quantum computing enables machines to tackle more complex problems and better utilize their available resources. This is because quantum computing is based on principles allowing more efficient use of energy and resources. For example, a single qubit (a unit of quantum information) can take part in a calculation in one step that would take multiple steps for a traditional computer. This improved efficiency means that quantum computers can process information faster, with fewer resources being used up in the process.

One of the major advantages of quantum computing is the ability to explore new types of algorithms that were not possible before. This is because quantum computing utilizes principles of quantum mechanics, which enable processing of more complex and intricate data than traditional computing. Through this, we can use quantum computing to solve complex problems more effectively, quickly, and efficiently. With quantum computing, algorithms can be written that allow a much more powerful and efficient way to process data than traditional computing. For example, a quantum algorithm called Grover's algorithm has been shown to have the potential to dramatically reduce the amount of time required to search a large database. This algorithm could be invaluable for medical diagnosis, image recognition, and fraud detection. Another type of algorithm made possible by quantum computing is quantum annealing, which can be used to find optimal solutions to complicated optimization problems. This has many applications, including scheduling, finance, machine learning, and operations research. By utilizing quantum computing, we can use new types of algorithms that weren't possible before. These algorithms are capable of solving complex problems quickly and efficiently, making them invaluable for a variety of different tasks.
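To put the Grover claim in perspective: the speedup for unstructured search is quadratic, about N/2 lookups on average classically versus roughly (π/4)·√N Grover iterations. A hedged back-of-the-envelope comparison that only counts queries (it does not implement the algorithm itself):

```python
import math

for N in (1_000, 1_000_000, 1_000_000_000):
    classical = N / 2                         # average classical lookups
    grover = (math.pi / 4) * math.sqrt(N)     # approximate number of Grover iterations
    print(f"N={N:>13,}  classical ~{classical:>13,.0f}  Grover ~{grover:>9,.0f}")
```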
The development of quantum computing has opened up a new frontier in secure information processing. Quantum computing utilizes the principles of quantum mechanics to provide a level of encryption and security that surpasses what is available with traditional computing. A quantum computer can encrypt data by utilizing entanglement, helping to ensure it remains secure and confidential even when accessed by a third party. This level of security is not achievable with traditional computing and provides an additional layer of security that can be invaluable in certain situations. With quantum computing, it is possible to securely share data and communications across different networks while ensuring the integrity of the information. This allows organizations to securely exchange confidential information, making it possible to protect sensitive data from cyber-attacks and other malicious activities. It also offers faster computation speed, allowing for more efficient data processing and further enhancing the information's security.

Quantum computing is helping to revolutionize the tech world. Unlike traditional computing, which is based on a binary system of 0s and 1s, quantum computing uses qubits that can exist in multiple states simultaneously, allowing for more efficient and powerful processing capabilities. This means that quantum computers could tackle complex tasks that were previously too difficult or even impossible to solve. One example of this is in machine learning, where quantum computing can help develop more accurate models for predicting outcomes. These models can produce results faster than traditional algorithms by utilizing the ability to run many calculations at once. Additionally, since qubits can encode more information than traditional bits, these models can also consider a wider range of data.

Quantum computing is also being applied to some of the most challenging computational problems in fields like cryptography and drug development. For example, scientists have developed quantum algorithms that, on a sufficiently large quantum computer, could crack existing encryption keys in a fraction of the time it would take using traditional computing methods. Similarly, pharmaceutical companies are exploring quantum computing to design new drugs far more rapidly than would be possible with traditional methods. In short, quantum computing is helping to revolutionize the way complex problems are solved by providing faster and more efficient computing solutions. By taking advantage of the unique properties of qubits, these computers could tackle tasks previously thought impossible. This means that researchers and professionals in various fields may soon be able to solve complex problems much more quickly and accurately than ever before.

Quantum computing is a revolutionary technology that has the potential to transform many aspects of computing. The increased processing power, more efficient use of resources, and new algorithms available through quantum computing can lead to a wide range of applications. Additionally, quantum computing promises greater security for information processing, which helps to protect data from being accessed by malicious actors. Finally, quantum computing can help us solve complex problems that are too difficult for conventional computers. With its immense potential, quantum computing could be the future of computing and change how we use computers forever.
<urn:uuid:4f56a787-5f33-48bb-b8aa-af31906776b6>
CC-MAIN-2023-14
https://www.thetechnologytrends.com/the-benefits-and-advantages-of-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296945433.92/warc/CC-MAIN-20230326044821-20230326074821-00793.warc.gz
en
0.934712
1,175
3.671875
4
In this article, we will explore how quantum algorithms can solve real-world problems, and how you can get involved in this quantum revolution! Quantum computers promise to solve certain problems that classical computers are unable to solve efficiently. Currently, the two most important and notable complexity classes are "P" and "NP." P represents problems that can be solved in polynomial time by a classical computer. For instance, asking whether a number is prime belongs to P. NP problems are problems whose answers can be verified quickly by a classical computer, even though no polynomial-time classical algorithm for solving them is known. Asking for the prime factors of a number is an NP problem: it is easy to verify that x is a prime factor of a number y, but very hard for a computer to find the factors in the first place. Whether P = NP, that is, whether the two complexity classes are actually distinct, is a famous open problem, and whoever solves it gets a million dollars!

In 1993, Ethan Bernstein and Umesh Vazirani defined a new complexity class called "bounded-error quantum polynomial time", or BQP. They defined this class to contain decision problems, problems with a yes or no answer, that quantum computers can solve efficiently. They also proved that P is a subset of BQP: a quantum computer can solve every problem that a classical computer can solve efficiently. Another relevant class of problems is PH, the "Polynomial Hierarchy," a generalization of NP. Problems in PH are NP-style problems made more complex by layering questions such as "is it true for all x" or "does it exist for a particular x."

But can a quantum computer solve problems that classical computers are unable to solve, such as the NP-hard problems? Can we use quantum computing to solve practical problems that industries or companies are facing in real life? Well, you might have heard about how Shor's algorithm might crack encryption schemes such as RSA and break into your bank account. Shor's algorithm efficiently solves the problem of factoring large numbers, a problem in NP that is believed to be intractable for classical computers (though not known to be NP-hard); check out our implementation of Shor's algorithm.

Recently, researchers at Chalmers University of Technology have been able to solve a small part of a logistics problem faced by the aviation industry, the Tail Assignment Problem: assigning airplanes to flights with the goal of minimizing connection times between flights while respecting maintenance constraints. This is a scheduling problem whose difficulty scales up exponentially with the number of flights and routes. The team at Chalmers was able to execute their algorithm on a processor with two qubits using the Quantum Approximate Optimization Algorithm, or QAOA. The research team also simulated the optimization problem for up to 278 aircraft, which, however, would require a 25-qubit processor. Read this article to find out more!

So what exactly is the Quantum Approximate Optimization Algorithm? Optimization means searching a finite or countably infinite set of potential solutions for the one that maximizes or minimizes a cost function. In the tail assignment problem, the connection times between flights should be minimized. The problem can also be defined in such a way that the total distance travelled by all airplanes over all air routes should be minimized. Let us take a simple version of an optimization problem that is easy to visualize. Consider the Travelling Salesman Problem: a salesman wants to travel through all the historic sites of the United States to sell souvenirs.
The aim is to find the shortest route the salesman can travel such that he visits all the sites and returns to his starting point. The image in the original post shows the shortest path through all the historic sites of America; it would take the salesman 50 years to travel this path! For a small number of cities, we can apply the "brute-force" solution: calculate all the possible routes and pick the shortest. For a large number of cities n, the complexity of this approach is O(n!), which is not efficient.

To find this path we use graph theory: each historic site is a vertex, and the edges drawn between the vertices represent the journeys that the salesman takes. Each edge carries a number representing the distance between the two sites. To minimize the total distance, we first convert the problem into a weighted graph and minimize the sum of the edge weights along the route. The route we are after is a Hamiltonian cycle: a cycle whose start and end point are the same and which uses each vertex of the graph exactly once.

In Qiskit, we can map this problem to an Ising Hamiltonian and minimize the value of that Hamiltonian using the Variational Quantum Eigensolver (VQE) optimizer. We will use the QuadraticProgram() function in Qiskit to build a model of the optimization problem. To find the shortest route between the vertices, we use the tsp module from qiskit.optimization.applications.ising to solve the problem, and then read off the shortest distance (the objective, i.e. the minimum value of our cost function). In the original post, the output showed that the solution is the path from 1 to 2 to 3 to 0 and back to 1, and that the minimum distance when traversing the graph is 236.0. If we want to make sure this is the correct solution, we can compare it with the brute-force method (finding the minimum sum of edge weights over all possible routes), as sketched below.

We are still in the Noisy Intermediate-Scale Quantum (NISQ) era and have a long way to go before running such optimization algorithms at scale will be feasible. In the paper, the authors estimate that 420 qubits will be necessary to run the QAOA in a short amount of time and scale it to complex optimization problems. Currently, IBM's quantum processors work with around 50 qubits. However, quantum optimization can be applied to other problems such as finding the ground-state energy of a molecule or optimizing portfolios in finance. Qiskit can solve a wide range of combinatorial optimization problems. To find out more, check [this](https://qiskit.org/textbook/ch-applications/qaoa.html).

In this paper, computer scientists found a problem that is in BQP but not in PH (relative to an oracle). This means that even if classical computers were somehow able to solve NP problems, quantum computers would still have an advantage over them, since a problem has been found that lies in BQP and not in PH, a problem that only a quantum computer can solve efficiently. This further supports the view that quantum computers have a processing capacity beyond what a classical computer can achieve.
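The brute-force check mentioned above takes only a few lines of ordinary Python: enumerate every tour, add up the edge weights, and keep the minimum. The 4-city distance matrix below is a made-up example rather than the graph from the article, and the O(n!) growth is exactly why this only works for small n:

```python
from itertools import permutations

# Symmetric distance matrix for 4 illustrative cities (arbitrary values).
dist = [[0, 48, 91, 33],
        [48, 0, 63, 71],
        [91, 63, 0, 29],
        [33, 71, 29, 0]]

def tour_length(tour):
    # Sum the edges of the cycle, returning to the starting city at the end.
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

best = min(permutations(range(len(dist))), key=tour_length)
print(best, tour_length(best))   # the shortest Hamiltonian cycle and its total length
```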
<urn:uuid:af6b72e2-e578-4474-98b0-8b6d5343cbe3>
CC-MAIN-2023-14
https://www.qmunity.tech/post/problems-that-only-quantum-computers-can-solve
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296948871.42/warc/CC-MAIN-20230328201715-20230328231715-00793.warc.gz
en
0.934615
1,411
3.578125
4
Quantum for pharma Quantum computing has the potential to revolutionize a wide range of industries, including pharma. With its ability to perform complex calculations and simulations at a much faster rate than classical computers, quantum computers have the potential to transform the way pharmaceutical companies approach problems such as drug discovery, clinical trial optimization, and supply chain management. How quantum computing works Quantum computers operate based on the principles of quantum mechanics, which govern the behavior of particles at the atomic and subatomic level. Quantum computers use quantum bits, or qubits, to store and process information. Unlike classical bits, which can only represent a value of 0 or 1, qubits can represent a combination of 0 and 1 simultaneously, allowing quantum computers to perform many calculations in parallel. This makes quantum computers much faster and more powerful than classical computers for certain types of problems. Type of problems that could be improved with quantum algorithms Quantum computers have the potential to significantly improve optimization problems in pharma. For example, they could be used to optimize clinical trial designs by identifying the most effective treatments and patient populations. Quantum computers could also be used to optimize drug discovery by identifying the most promising compounds for further testing. Quantum computers could also be used to simulate complex chemical systems, such as entire drug discovery pipelines. This could allow pharmaceutical companies to better understand and predict the outcomes of different treatments, improving risk assessment and decision-making. Quantum computers could be used to improve machine learning algorithms in pharma. For example, they could be used to analyze large amounts of data to predict the outcomes of clinical trials or to optimize the design of new drugs. This could have applications in areas such as drug discovery and clinical trial optimization. - Drug discovery. Quantum computers could be used to identify the most promising compounds for further testing, taking into account factors such as chemical structure and potential side effects. This could help pharmaceutical companies to accelerate the drug discovery process and bring new treatments to market more quickly. - Clinical trial optimization. Quantum algorithms could be used to optimize the design of clinical trials, taking into account factors such as treatment efficacy, patient populations, and risk. This could help pharmaceutical companies to reduce costs and improve the success rate of clinical trials. - Supply chain management. Quantum computers could be used to optimize the supply chain for pharmaceutical products, taking into account factors such as demand, expiration dates, and storage conditions. This could help pharmaceutical companies to reduce waste and improve efficiency. - Regulatory compliance. Quantum algorithms could be used to ensure compliance with complex regulatory requirements, such as those related to the handling and distribution of pharmaceutical products. This could help pharmaceutical companies to avoid costly penalties and maintain their reputation. - Pricing optimization. Quantum computers could be used to accurately value complex pharmaceutical contracts, such as pricing agreements with payers. This could help pharmaceutical companies to optimize their pricing strategies and improve profitability. - Research and development. 
Quantum algorithms could be used to optimize the allocation of resources for research and development projects, taking into account factors such as expected return on investment and risk. This could help pharmaceutical companies to prioritize the most promising projects and allocate resources more efficiently. - Patient stratification. Quantum algorithms could be used to identify patient subpopulations that are most likely to benefit from certain treatments, based on factors such as genetics, demographics, and medical history. This could help pharmaceutical companies to personalize treatments and improve outcomes for patients. - Adverse event prediction. Quantum computers could be used to analyze large amounts of data, such as electronic health records, to predict the likelihood of adverse events occurring during clinical trials. This could help pharmaceutical companies to identify and mitigate potential risks, improving the safety of their products. - Molecular dynamics. Quantum algorithms could be used to simulate the behavior of molecules, helping pharmaceutical companies to better understand the properties of new drugs and optimize their design. - Structural prediction. Quantum computers could be used to predict the 3D structure of proteins, which is important for understanding how drugs interact with their targets.

Quantum computing has the potential to significantly impact the pharma industry, from drug discovery to supply chain management. By leveraging the power of quantum algorithms, pharmaceutical companies can optimize their operations, reduce costs, and bring new treatments to market more quickly.

Challenges to being quantum ready - Lack of talent. One of the main challenges that organizations face in becoming quantum ready is the lack of skilled quantum personnel. Quantum computing is a relatively new field, and there is currently a shortage of professionals with expertise in this area. This can make it difficult for organizations to build in-house quantum capabilities. - Integration with current and future quantum hardware over the cloud. Another challenge is the integration of quantum computers with current and future quantum hardware over the cloud. Organizations need to ensure they can execute across different quantum hardware cloud providers without vendor lock-in or complex integration processes. - Comparison and analysis tools for algorithm execution. Organizations also need access to comparison and analysis tools to evaluate the cost, accuracy, and speed of different quantum algorithms. This can help them to choose the most appropriate algorithms for their specific needs. - Algorithms, data upload, and easy execution with no-code tools and APIs. Organizations also need tools and APIs that allow them to easily upload data and execute algorithms with minimal or no coding. This can help to reduce the learning curve and make it easier for non-technical personnel to use quantum computers and explore the benefits of different quantum algorithms. - Integration with company IT systems. Finally, organizations need to ensure that quantum algorithms and computing can be integrated with their existing IT systems and processes.

Using a Quantum-as-a-Service Platform A Quantum-as-a-Service platform, such as QCentroid's, helps organizations overcome these challenges and access the benefits of quantum computing.
One way QCentroid helps is by providing a catalog of ready-to-test algorithms from top quantum companies, reducing the need for organizations to develop their own algorithms from scratch. In addition, QCentroid provides access to quantum hardware over the cloud, allowing organizations to use quantum computers without the need to purchase and maintain their own hardware. This helps to reduce the cost and complexity of implementing quantum solutions. QCentroid provides tools for comparing the cost, accuracy, and speed of different algorithms, helping organizations to determine the best approach for a given problem. And, with easy-to-use tools and APIs for uploading algorithms and data, and executing quantum algorithms, QCentroid makes it easier for organizations to use quantum computing without a deep understanding of the underlying technology. QCentroid helps organizations to integrate quantum solutions with their existing IT systems, making it easier to take advantage of the benefits of quantum computing.
<urn:uuid:4ddbb806-75b5-4333-b811-95ad8f47c907>
CC-MAIN-2023-14
https://qcentroid.xyz/quantum-for-pharma/
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296945168.36/warc/CC-MAIN-20230323132026-20230323162026-00793.warc.gz
en
0.93393
1,385
3.59375
4
IBM's Q quantum computer. (Image source: IBM Research)

Even though artificial intelligence (AI) and machine learning (ML) are taking center stage in the world of emerging technologies, there's another technology that is slowly making its presence known to society – quantum computing. New quantum machines such as Google's Bristlecone chip and IBM's Q initiative are already appearing in headlines. IBM has even provided public access to an online quantum computer for research and experimentation purposes. The science behind quantum machines dates to the early 1900s, to a German physicist named Max Planck. But experts say quantum computing has the potential to greatly enhance the technologies of today – including AI and ML algorithms – because of its ability to perform computations exponentially faster than today's transistor-based computers. But the workings of quantum computers can be quite a bit to untangle. Here are five key concepts and questions regarding this unique computational machine:

1.) What Is Quantum Mechanics? Quantum mechanics is defined as the branch of physical science that is concerned with the behaviors of subatomic particles, waves, matter, and the energy of atoms. The term was coined by German physicist Max Born while he was conducting theoretical solid-state physics and quantum theory research in 1924. There are several unique properties of quantum mechanics, such as superposition, entanglement, collapse, and uncertainty, that factor into the application and design of quantum computers. Several related fields, like nanotechnology, structural biology, particle physics, and electronics, are also supported by quantum mechanics.

2.) What Is Quantum Hardware? Like traditional digital computers, quantum computers have three main components: inputs/outputs (I/O), memory, and a processor. The quantum computer's I/O is a physical process of manipulating the states of qubits (more on those in a moment). The qubit manipulation is based on machine states that allow quanta (photonic energy) bits to propagate through the quantum computer. The qubit is the fundamental element for storing a 1, 0, or 0-1 quantum state. Multiple qubits can be grouped to make registers that assist in storing and moving large amounts of quantum data through the quantum system. As in traditional digital computers, the processor is created by using qubit logic gates. The qubit logic gates are constructed to perform complex operations within the quantum computer. An example of a quantum logic circuit. (Image source: IBM Research)

3.) What Is a Qubit? The quantum equivalent of a bit is called a qubit. The qubit's quantum state can be a 1, 0, or 0-1. Qubits can be configured as registers for data storage or as processors using quantum logic gates. The combination of quantum logic gates allows the quantum computer to perform single or multiple operations based on unitary operators. Basic logic gates used in quantum computers are the Hadamard or H-gate, the X-gate, the CNOT gate, and transformation or phase gates (Z, S+, T, and T+). A Josephson junction and its equivalent electrical circuit form the core component of a superconducting qubit. (Image source: qsd.magnet.fsu.edu)

4.) What Is Superposition? Unlike a traditional digital computer, a quantum computer has a third state where the qubit can be 0 and 1 simultaneously. This tertiary state is called superposition.
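To make sections 3 and 4 concrete, here is a small numpy sketch using the standard textbook matrices for the gates named above: applying the Hadamard gate to |0⟩ yields an equal superposition, and an H followed by a CNOT turns |00⟩ into an entangled Bell state (a generic illustration, not the IBM circuit from the figure):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
X = np.array([[0, 1], [1, 0]])                 # X (NOT) gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # controlled-NOT, control = first qubit

zero = np.array([1, 0])
plus = H @ zero                     # equal superposition of |0> and |1>
print(np.abs(plus) ** 2)            # -> [0.5, 0.5], a 50/50 measurement split

bell = CNOT @ np.kron(plus, zero)   # H then CNOT on |00> gives (|00> + |11>)/sqrt(2)
print(bell)                         # -> [0.707, 0, 0, 0.707]
```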
A superposition is probabilistic: upon measurement, each possible value is obtained with a probability determined by the state, and the value with the highest probability is the one most likely to be observed. The analog equivalent of superposition is waves. A single physical disturbance can produce one wave, but additional waves can be superimposed to make one unique oscillatory pattern. With superposition, configuring qubits as registers allows new methods of computing complex problems using large data sets. AI and ML algorithms could therefore be processed faster using quantum superposition.

5.) What Is Entanglement? Another unique attribute of the quantum computer is the ability of two qubits to be linked without physical contact with one another. This phenomenon is called entanglement. Qubits being able to share information between them allows data processing to occur simultaneously. Traditional digital computers must use a pipeline-fetch method to keep multiple execution processes from occurring at the same time; because of entanglement, race conditions are not a concern with quantum computers. Although two entangled qubits have their own states, entanglement keeps the initial and final data bits correlated during long-distance transmission events. Teleportation, which is typically reserved for science fiction, is actually being researched, as it would be an application of entanglement that allows long-distance data transmission.

Don Wilcher is a passionate teacher of electronics technology and an electrical engineer with 26 years of industrial experience. He's worked on industrial robotics systems, automotive electronic modules/systems, and embedded wireless controls for small consumer appliances. He's also a book author, writing DIY project books on electronics and robotics technologies.
<urn:uuid:b34ed314-e0fc-4382-af41-1230d14f1097>
CC-MAIN-2023-14
https://www.designnews.com/electronics-test/quantum-computing-101-5-key-concepts-understand
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296949355.52/warc/CC-MAIN-20230330163823-20230330193823-00593.warc.gz
en
0.91585
1,152
3.765625
4
In what has been hailed as a computing milestone, a team of researchers from the University of Science and Technology of China has achieved quantum supremacy thanks to a device that can manipulate tiny particles of light. Dubbed Jiuzhang, the system performed a quantum computation called "Gaussian boson sampling", which has been shown to be intractable for classical computers. Quantum supremacy is achieved when a quantum device is proven to be able to carry out a task that a classical computer would find impossible, or take too long to complete. While Jiuzhang achieved Gaussian boson sampling in just 200 seconds, the researchers estimated that the same calculation would take the world's fastest supercomputer, Fugaku, 600 million years to complete.

Quantum supremacy has only been claimed once before. Last year, Google's researchers showed off a 54-qubit processor that they said could run a test computation in 200 seconds – a calculation that, according to the research, would take the world's biggest supercomputers 10,000 years to complete. Qubits come with unprecedented computational power due to their ability to exist in a dual quantum state, and therefore to carry out many calculations at once. Researchers expect that, armed with enough stable qubits, quantum computers will shake up industries ranging from AI to finance through transportation and supply chains.

The crux of the challenge consists of creating and maintaining enough qubits to make a quantum computer useful, and there are different ways to do so. The quantum technology developed by Google, for example, is entirely different from Jiuzhang's setup: the search giant, for its part, is investing in metal-based superconducting qubits. This is also IBM's preferred quantum technique, and both tech giants have poured large sums of money into superconducting circuits to push quantum computing research. For superconducting qubits to remain controllable, however, they need to be kept in very cold temperatures – colder than in deep space. Needless to say, making this practical is still a significant barrier. The extreme sensitivity of qubits to their external environment also means that it is hard to scale up the devices.

Instead of particles of metal, Jiuzhang manipulates photons. The device was built specifically for the quantum task that it carried out, Gaussian boson sampling, which consists of simulating and predicting the erratic behavior of photons. The task consists of injecting particles of light into a network of beam splitters and mirrors that give photons multiple choices of paths to travel through before reaching different output ports. Photons, however, come with strange quantum properties that complicate the matter: there is no way of knowing deterministically which way they will choose. What's more, if two identical photons hit the beam splitter at exactly the same time, they will stick together and both travel the same randomly-chosen path. All of this makes it very difficult for classical computers to identify patterns of photon behavior, and to predict the output configuration of photons based on how the particles were input. The difficulty of the calculation also exponentially increases as more photons get involved, which means that a Gaussian boson sampling device is difficult to scale up.

Christine Silberhorn, professor of integrated quantum optics at Paderborn University in Germany, has been working on Gaussian boson sampling for many years. "The scheme has its own challenges," she tells ZDNet.
"Scaling up the system is hard, because all components have to be engineered for a quantum experiment, and they have to work accurately together. Moreover, it requires the detections and processing of very large datasets." The researchers equipped Jiuzhang with 300 beam splitters and 75 mirrors, and said that they managed to measure up to 76 photons during their experiments – enough particles of light to make the calculation intractable for a classical computer. Cracking the Gaussian boson sampling equation has limited usefulness. For now, in fact, the experiment has done little more than show that Jiuzhang is better than classical computers at solving one very specific task – simulating the unpredictable behavior of photons. That doesn't mean, however, that a large-scale quantum computer will be built anytime soon to solve real-life problems. The value of the experiment rather lies in the proof that light-based quantum computers might be just as promising as their matter-based counterparts, which so far, courtesy of big tech's interest, have grabbed most of the headlines. "This experiment is an important milestone experiment for quantum simulations based on linear optical systems," says Silberhorn. "It demonstrates the high potential for scalable quantum computation using photons." Researchers have recently taken interest in photonic quantum computers because of the potential that particles of light have to remain stable even in uncontrolled environments. Unlike devices based on superconducting qubits, photons don't require extreme refrigeration, and could in theory scale up much faster. "The Boson sampling experiment reported by the USTC group is a real tour de force, and illustrates the potential of photonics as a quantum technology platform," Ian Walmsley, chair in experimental physics at Imperial College London, told ZDNet. "This is a real step forward in developing technologies that harness the power of quantum physics to perform tasks that that are not possible using current technologies." The new milestone achieved by the team at the University of Science and Technology of China, therefore, is likely to bring new impetus to the on-going race to build up quantum technologies. Google and IBM are only two examples of deep-pocketed players who have shown interest in developing quantum computers, and a rich ecosystem is growing at pace to bring new innovations to the space. In addition to industry players, nation states have shown strong interest in developing quantum technologies. The Chinese government, for one, is investing heavily in the field. In fact, Jian-Wei Pan, who led the research team that worked on Jiuzhang, was also behind a recent quantum cryptography breakthrough that achieved quantum key distribution over a record-breaking 745 miles.
<urn:uuid:a321a6aa-ba43-4858-bf4d-881d82fdec88>
CC-MAIN-2023-14
https://www.zdnet.com/article/quantum-supremacy-milestone-achieved-by-light-emitting-quantum-computer/
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296945472.93/warc/CC-MAIN-20230326111045-20230326141045-00195.warc.gz
en
0.954966
1,231
3.6875
4
Three scientists who made seminal contributions to the experimental study of quantum entanglement and its applications shared the 2022 Nobel Prize in Physics. John Clauser of the United States and Alain Aspect of France devised methods to definitively detect entanglement between photons, while Anton Zeilinger of the University of Vienna was the first to successfully transfer entanglement from one photon pair to another, a capability on which quantum communication relies. Quantum computing and quantum communication count among the technologies of the future because they promise rapid solutions to difficult problems and the use of “unbreakable” encryption. Particles like photons, ions, and atoms obey quantum physical phenomena like superposition and entanglement. Thanks to these effects, quantum computers can process vast amounts of data in a short amount of time, and quantum signals can be “teleported” almost instantly.

The mystery of “spooky action at a distance”

Quantum entanglement has been described as “spooky action at a distance” by Albert Einstein and as the most crucial aspect of quantum physics by Erwin Schrödinger. Until the state of one of the entangled particles is measured, the other remains in a superposition, with no definite state of its own. Only at that moment does the second one simultaneously settle on its state. All current quantum technologies rely on the observation of quantum entanglement.

One analogy for quantum entanglement is that of two balls, one white and one black, whose superposition in midair renders them gray. The ultimate color of each ball is revealed only when one of them is captured. Simultaneously, it becomes obvious that the second ball is the opposite color. This raises the question of how the balls determine which color to take on: are their colors decided at random, or do they carry hidden information that fixes the outcome in advance? In the 1960s, physicist John Stewart Bell proposed a way to settle this question empirically. According to his inequality, genuine entanglement without hidden variables would have to exhibit correlations stronger than any hidden-variable theory allows when the measurements are repeated many times. But how to test this in a realistic experiment remained uncertain.

John Clauser and Alain Aspect: The Bell test becomes practical

The first of the 2022 laureates, the American physicist John Clauser, was recognized for his work in this area. He was the first to devise an experiment showing that quantum entanglement is real and that Bell’s inequality could be violated. The scientist accomplished this by generating polarization-entangled pairs of photons. By passing these photons through various polarization filters, Clauser found out how frequently each combination occurred. The result made it clear that the entangled photons did violate Bell’s inequality. There was no way to predict or account for the strength of the correlations with hidden variables. Instead, it was a “spooky action at a distance” effect in which the measurement of one particle determines the state of the other, ending the superposition. Clauser and his team’s experiment was exceedingly inefficient, however, since only a tiny percentage of the created photons were traceable through the filters and hence measurable. French physicist Alain Aspect, the second of the 2022 laureates, took up the problem at this point.
He refined the experiment by separating the entangled photons and measuring them after they passed through two polarizers.

Anton Zeilinger: Quantum teleportation and quantum amplification

When sending optical information over long distances, for example via a fiber-optic cable, the light signal degrades, limiting the range; this is the issue that Anton Zeilinger of the University of Vienna addressed, and it is strongly connected to quantum entanglement. Over a distance of 6 miles (10 kilometers), about one photon is lost per second. Standard optical transmissions include intermediate amplifiers that account for this. Unfortunately, this cannot be done with entangled photons; the amplifier’s need to read out the signal before boosting it would destroy the quantum signal by canceling the entanglement. In 1998, Zeilinger and his group solved the problem using quantum teleportation. This stems from the discovery that one entangled pair of photons may impart that entanglement to another. As a result, all a quantum amplifier has to do to transport the entanglement and the quantum information it carries from one pair of photons to another is to guarantee that the two pairs make contact with each other under the correct conditions. This finding paves the way for the use of fiber optic cables to carry quantum communications across significant distances. Photons from the sun have also been “entangled” by scientists.

Early adopters of quantum technology

The three physicists who shared the 2022 Nobel Prize in Physics have thereby provided the groundwork for the eventual practicality of quantum technology. Their research on entangled states is groundbreaking. The Nobel Foundation explains that this is because “their results have cleared the way for new technology based upon quantum information.”
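The degree of correlation Bell had in mind can be made concrete with the CHSH form of his inequality, the variant tested in the Clauser and Aspect experiments. The short Python check below uses the textbook quantum prediction for polarization correlations; the angles are the standard optimal settings, chosen here for illustration rather than taken from the original experimental papers:

    import numpy as np

    def E(a, b):
        # Quantum prediction for the polarization correlation of entangled
        # photons measured with polarizers set at angles a and b (radians).
        return np.cos(2.0 * (a - b))

    # Standard CHSH settings (in degrees) that maximize the quantum value.
    a1, a2, b1, b2 = np.radians([0.0, 45.0, 22.5, 67.5])

    S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
    print(S)   # ~2.828, i.e. 2*sqrt(2)
    # Any local hidden-variable account obeys |S| <= 2, so entangled photons
    # violate the CHSH form of Bell's inequality by a wide margin.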
<urn:uuid:db2510bf-cb50-41ec-8005-6494ae5b3210>
CC-MAIN-2023-14
https://malevus.com/2022-physics-nobel-prize/
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296943695.23/warc/CC-MAIN-20230321095704-20230321125704-00594.warc.gz
en
0.931918
1,030
3.6875
4
I was on a vendor call last week and they were discussing their recent technological advances in quantum computing. During the discussion they mentioned a number of ways to code for quantum computers. The most popular one at the moment is based on the QIS (Quantum Information Software) Kit. I went looking for a principles of operation for quantum computers, something akin to the System/360 Principles of Operation manual that explained how to code for an IBM 360 computer. But there was no such manual. Instead there is a paper on the Open Quantum Assembly Language (QASM) that describes the quantum computational environment and coding language used in the QIS Kit.

It appears that quantum computers can be considered a special computational co-processor engine, operated in parallel with normal digital computation. This co-processor happens to provide a quantum simulation. One programs a quantum computer by creating a digital program which describes a quantum circuit that uses qubits and quantum registers to perform some algorithm on those circuits. The quantum circuit can be measured to provide a result which more digital code can interpret and potentially use to create other quantum circuits in a sort of loop.

There are four phases during the processing of a QIS Kit quantum algorithm.
- QASM compilation, which occurs solely on a digital computer. QASM source code describing the quantum circuit, together with compile-time parameters, is translated into a quantum-plus-digital intermediate representation.
- Circuit generation, which also occurs on a digital computer with access to the quantum co-processor. The intermediate representation compiled above is combined with other parameters (available from the quantum computer environment), and together these are translated into specific quantum building blocks (circuits) and some classical digital code needed and used during quantum circuit execution.
- Execution, which takes place solely on the quantum computer. The system takes as input the collection of quantum circuits defined above and runtime control parameters, and transforms these, using a high-level quantum computer controller, into low-level, real-time instructions for the quantum computer building the quantum circuits. These are then executed, and the results of the quantum circuit(s) execution create a result stream (measurements) that can be passed back to the digital program for further processing.
- Post-processing, which takes place on a digital computer and uses the results from the quantum circuit(s) execution and other intermediate results, processing these to either generate follow-on quantum circuits or output a final result for the quantum algorithm.

As qubit coherence only lasts for a short while, results from one execution of a quantum circuit cannot be passed directly to another execution of quantum circuits. These results have to be passed through some digital computations before they can be used in subsequent quantum circuits. A qubit is a quantum bit. Quantum circuits don’t offer any branching as such. The only storage for QASM is classical (digital) registers (creg) and quantum registers (qreg), which are arrays of bits and qubits respectively. There are a limited number of built-in quantum operations that can be performed on qregs and qubits. One described in the QASM paper noted above is the CNOT operation, which flips a qubit; i.e., CNOT a,b will flip a qubit in b if and only if the corresponding qubit in a is on. Quantum circuits are made up of one or more gate(s).
Gates are invoked with a set of variable parameter names and quantum arguments (qargs). QASM gates can be construed as macros that are expanded at runtime. Gates are essentially lists of unitary quantum subroutines (other gate invocations), built-in quantum functions or barrier statements that are executed in sequence and operate on the input quantum arguments (qargs) used in the gate invocation. Opaque gates are quantum gates whose circuits (code) have yet to be defined: gates whose physical implementation may be possible but whose definition has not yet been given. Essentially these operate as placeholders, to be defined in a subsequent circuit execution, or perhaps something the quantum circuit creates in real time depending on gate execution (not really sure how this would work).

In addition to built-in quantum operations, there are other statements like the measure and reset statements. The reset statement sets a qubit or a qreg's qubits to 0. The measure statement copies the state of a qubit or qreg into a digital bit or creg (digital register). There is one conditional command in QASM, the if statement. The if statement can compare a creg against an integer and, if equal, execute a quantum operation. There is one “decision” creg, used as an integer. By using if statements one can essentially construct a case statement in normal coding logic to execute quantum circuit blocks. Quantum logic within a gate can be optimized during the compilation phase, so that some operations may not be executed (e.g., if the same operation occurs twice in a gate, normally the second execution would be optimized out), unless a barrier statement is encountered, which prevents optimization.

Quantum computer cloud

In 2016, IBM started offering quantum computers in its Bluemix cloud through the IBM Quantum (Q) Experience. The IBM Q Experience currently allows researchers access to 5- and 16-qubit quantum computers. There are three pools of quantum computers: one pool, called IBMQX5, consists of eight 16-qubit computers, and two pools of five 5-qubit computers, IBMQX2 and IBMQX4. As I’m writing this, IBMQX5 and IBMQX2 are offline for maintenance but IBMQX4 is active. Google has recently released OpenFermion as open source, another software development kit for quantum computation (I will review this in another post). Although Google also seems to have quantum computers and has provided researchers access to them, I couldn’t find much documentation on them. Two other companies are working on quantum computation: D-Wave Systems and Rigetti Computing. Rigetti has its Forest 1.0 full-stack quantum programming and execution environment, but I couldn’t easily find anything on D-Wave Systems’ programming environment. Last month, IBM announced it has constructed a 50-qubit quantum computer prototype. IBM has also released 20-qubit quantum computers for customer use and plans to offer the new 50-qubit computers to customers in the future.

Picture Credit(s):
- Quantum Leap Supercomputer, IBM
- What is Quantum Computing Website
- QASM control flow, Open Quantum Assembly Language, by A. Cross, et al.
- IBM’s newly revealed 50-Qubit Quantum Processor …, Softcares blog post
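As an illustration of the register and CNOT behaviour described above, here is a small Python sketch. It uses plain numpy rather than the QIS Kit itself, so the gate matrices are written out by hand and the two-qubit ordering convention is my own choice:

    import numpy as np

    # Two-qubit basis ordering: index = 2*a + b for basis state |a b>, a = control.
    I2 = np.eye(2)
    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=float)   # flips b exactly when a is 1

    def ket(a, b):
        v = np.zeros(4)
        v[2 * a + b] = 1.0
        return v

    # The CNOT behaviour described above: b flips if and only if a is on.
    for a in (0, 1):
        for b in (0, 1):
            idx = int(np.argmax(CNOT @ ket(a, b)))
            print(f"CNOT |{a}{b}> -> |{idx // 2}{idx % 2}>")

    # A two-gate circuit: Hadamard on the first qubit, then CNOT, from |00>.
    state = CNOT @ np.kron(H, I2) @ ket(0, 0)
    print(state)   # [0.707 0. 0. 0.707]: equal superposition of |00> and |11>
    # A QASM `measure` on this state would return 00 or 11, each about half the time.

The last two lines show why results have to flow back to digital code: the quantum register holds a superposition, but the creg only ever receives one of the measured bit patterns per execution.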
<urn:uuid:865aed77-2905-4c89-b5f5-1e3f06688890>
CC-MAIN-2023-14
https://silvertonconsulting.com/2017/12/11/quantum-computer-programming/
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296950247.65/warc/CC-MAIN-20230401191131-20230401221131-00794.warc.gz
en
0.911008
1,397
3.5
4
It may be possible in the future to use information technology where electron spin is used to process information in quantum computers. It has long been the goal of scientists to be able to use spin-based quantum information technology at room temperature. Researchers from Sweden, Finland and Japan have now constructed a semiconductor component in which information can be efficiently exchanged between electron spin and light – at room temperature and above. It is well known that electrons have a negative charge, and they also have another property, namely spin. The latter may prove instrumental in the advance of information technology. To put it simply, we can imagine the electron rotating around its own axis, similar to the way in which the Earth rotates around its own axis. Spintronics – a promising candidate for future information technology – uses this quantum property of electrons to store, process, and transfer information. This brings important benefits, such as higher speed and lower energy consumption than traditional electronics. Developments in spintronics in recent decades have been based on the use of metals, and these have been highly significant for the possibility of storing large amounts of data. There would, however, be several advantages in using spintronics based on semiconductors, in the same way that semiconductors form the backbone of today’s electronics and photonics. “One important advantage of spintronics based on semiconductors is the possibility to convert the information that is represented by the spin state and transfer it to light, and vice versa. The technology is known as opto-spintronics. It would make it possible to integrate information processing and storage based on spin with information transfer through light”, says Weimin Chen, professor at Linköping University, Sweden, who led the project. As electronics used today operates at room temperature and above, a serious problem in the development of spintronics has been that electrons tend to switch and randomize their direction of spin when the temperature rises. This means that the information coded by the electron spin states is lost or becomes ambiguous. It is thus a necessary condition for the development of semiconductor-based spintronics that we can orient essentially all electrons to the same spin state and maintain it, in other words that they are spin polarized, at room temperature and higher temperatures. Previous research has achieved a highest electron spin polarization of around 60% at room temperature, untenable for large-scale practical applications. Researchers at Linköping University, Tampere University and Hokkaido University have now achieved an electron spin polarization at room temperature greater than 90%. The spin polarization remains at a high level even up to 110 °C. This technological advance, which is described in Nature Photonics, is based on an opto-spintronic nanostructure that the researchers have constructed from layers of different semiconductor materials (see description below the article). It contains nanoscale regions called quantum dots. Each quantum dot is around 10,000 times smaller than the thickness of a human hair. When a spin polarized electron impinges on a quantum dot, it emits light – to be more precise, it emits a single photon with a state (angular momentum) determined by the electron spin. 
Thus, quantum dots are considered to have a great potential as an interface to transfer information between electron spin and light, as will be necessary in spintronics, photonics and quantum computing. In the newly published study, the scientists show that it is possible to use an adjacent spin filter to control the electron spin of the quantum dots remotely, and at room temperature. The quantum dots are made from indium arsenide (InAs), and a layer of gallium nitrogen arsenide (GaNAs) functions as a filter of spin. A layer of gallium arsenide (GaAs) is sandwiched between them. Similar structures are already being used in optoelectronic technology based on gallium arsenide, and the researchers believe that this can make it easier to integrate spintronics with existing electronic and photonic components. “We are very happy that our long-term efforts to increase the expertise required to fabricate highly-controlled N-containing semiconductors is defining a new frontier in spintronics. So far, we have had a good level of success when using such materials for optoelectronics devices, most recently in high-efficiency solar-cells and laser diodes. Now we are looking forward to continuing this work and to unite photonics and spintronics, using a common platform for light-based and spin-based quantum technology”, says Professor Mircea Guina, head of the research team at Tampere University in Finland. What is spintronics? Spintronics is a technology that uses both the charge and the spin of electrons to process and carry information. The spin of an electron can be envisioned as arising when the electron rotates clockwise or anticlockwise around its axis, in the same way that the Earth rotates around its axis. The two directions of rotation are called “up” and “down”. In the electronic technology used today, the electron charge is used to represent 0 and 1, and in this way carry information. In a corresponding way, the information can be represented in spintronics using the spin state of the electrons. In the world of quantum physics, an electron can possess both directions of spin at the same time (and thus be in a state that is a mixture of 1 and 0). This is, of course, completely unthinkable in the traditional, “classical” world, and is the key to quantum computing. Spintronics is therefore promising for the development of quantum computers. Opto-spintronics involves transferring the information that is represented by the spin state of the electrons to light, and vice versa. The light, photons, can then carry the information onwards through optical fibers, very rapidly and across long distances. The spin state of the electron determines the properties of the light, or to put it more accurately, it determines whether the electromagnetic field of the light will rotate clockwise or anticlockwise around the direction of travel, in roughly the same way that a corkscrew can have a clockwise or anticlockwise direction of turn. Source: Weimin Chen, professor at Linköping University Reference: “Room-temperature electron spin polarization exceeding 90% in an opto-spintronic semiconductor nanostructure via remote spin filtering” by Yuqing Huang, Ville Polojärvi, Satoshi Hiura, Pontus Höjer, Arto Aho, Riku Isoaho, Teemu Hakkarainen, Mircea Guina, Shino Sato, Junichi Takayama, Akihiro Murayama, Irina A. Buyanova and Weimin M. Chen, 8 April 2021, Nature Photonics. 
Financial support for the research has been granted by, among other bodies, the Swedish Research Council, the Swedish Foundation for International Cooperation in Research and Higher Education (STINT), the Swedish Government Strategic Research Area in Materials Science on Functional Materials at Linköping University, the European Research Council ERC, the Academy of Finland, and the Japan Society for the Promotion of Science.
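For a sense of what the polarization figures quoted in this article mean in terms of electron populations, spin polarization is commonly defined as the normalized difference between the spin-up and spin-down populations. The populations in this small Python sketch are illustrative round numbers, not measured values from the paper:

    def polarization(n_up, n_down):
        # Standard definition: P = (n_up - n_down) / (n_up + n_down).
        return (n_up - n_down) / (n_up + n_down)

    print(polarization(80, 20))   # 0.6 -> the previous ~60% room-temperature record
    print(polarization(95, 5))    # 0.9 -> >90% polarization means roughly a 95/5 split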
<urn:uuid:682fa74b-3301-48e6-9d5f-df57b09b639d>
CC-MAIN-2023-14
https://scitechdaily.com/technology-breakthrough-enables-practical-semiconductor-spintronics/
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296943483.86/warc/CC-MAIN-20230320114206-20230320144206-00198.warc.gz
en
0.910405
1,530
3.78125
4
Silicon spin qubits satisfy the necessary criteria for quantum information processing. However, a demonstration of high-fidelity state preparation and readout combined with high-fidelity single- and two-qubit gates has been lacking. Now, scientists from Princeton University are taking a step towards using silicon-based technologies in quantum computing. Using a two-qubit silicon quantum device, scientists obtained an unprecedented level of fidelity of above 99 percent. This is the highest fidelity achieved for a two-qubit gate in a semiconductor and is on par with the best results achieved by competing technologies. Scientists were also able to capture two electrons and force them to interact. The spin state of each electron can be used as a qubit, and the interaction between the electrons can entangle these qubits. This operation is crucial for quantum computation, and scientists performed it at a fidelity level exceeding 99.8 percent.

Adam Mills, a graduate student in the Department of Physics at Princeton University, said, “Silicon spin qubits are gaining momentum [in the field]. It’s looking like a big year for silicon overall.” “In a qubit, you can encode zeros and ones, but you can also have superpositions of these zeros and ones. This means that each qubit can be simultaneously a zero and a one. This concept, called superposition, is a fundamental quality of quantum mechanics and allows qubits to do operations that seem amazing and otherworldly. In practical terms, it allows the quantum computer a greater advantage over conventional computers in, for example, factoring very large numbers or isolating the most optimal solution to a problem.”

The spin in spin qubits is a quantum property that acts as a tiny magnetic dipole that can be used to encode information. Quantum mechanically, the electron’s spin can align with the magnetic field generated in the lab (spin-up), be oriented anti-parallel to the field (spin-down), or be in a quantum superposition of spin-up and spin-down. Mills said, “In general, silicon spin qubits have advantages over other qubit types. The idea is that every system will have to scale up to many qubits. And right now, the other qubit systems have real physical limitations to scalability. Size could be a real problem with these systems. There’s only so much space you can cram these things into.” Unlike a conventional superconducting qubit, which is about 300 microns across, this two-qubit silicon quantum device is just about 100 nanometers across.

Jason Petta, the Eugene Higgins Professor of Physics at Princeton, said, “The other advantage of silicon spin qubits is that conventional electronics today are based on silicon technology. Our feeling is that if you want to make a million or ten million qubits that are required to do something practical, that’s only going to happen in a solid-state system that can be scaled using the standard semiconductor fabrication industry.” “One of the bottlenecks for the technology of spin qubits is that the two-qubit gate fidelity up until recently has not been that high. It’s been well below 90 percent in most experiments.”

For the experiment, scientists first need to capture a single electron, get it into a specific region of space and then make it dance. To do so, they constructed a cage. This took the form of a wafer-thin semiconductor made primarily out of silicon. The team patterned little electrodes onto the top of this, which created the electrostatic potential used to corral the electron.
Two of these cages, each separated by a barrier, or gate, constituted the double quantum dot. By adjusting the voltage on these gates, scientists momentarily pushed the electrons together and made them interact. They dubbed this as a two-qubit gate. Due to the interaction, each spin qubit evolves according to the state of its neighboring spin qubit, hence causing entanglement in quantum systems. Petta said that “the results of this experiment place this technology — silicon spin qubits — on an equal footing with the best results achieved by the other major competing technologies. This technology is on a strongly increasing slope, and I think it’s just a matter of time before it overtakes the superconducting systems.” “Another important aspect of this paper is that it’s not just a demonstration of a high fidelity two-qubit gate, but this device does it all. This is the first demonstration of a semiconductor spin qubit system where we have integrated the entire system’s performance — the state preparation, the readout, the single-qubit control, the two-qubit control — all with performance metrics that exceed the threshold you need to make a larger-scale system work.” - Adam Mills, Charles Guinn, Michael Gullans et al. Two-qubit silicon quantum processor with operation fidelity exceeding 99%. DOI: 10.1126/sciadv.abn5130
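To get a rough feel for why pushing two-qubit fidelity from below 90 percent to 99.8 percent matters, a crude back-of-the-envelope model helps: if errors simply compounded multiplicatively, with no error correction, the fidelity of a sequence of gates would shrink as the per-gate fidelity raised to the number of gates. Real error behaviour is more structured than this, so the numbers below are illustrative only:

    from math import log

    def depth_to_half_fidelity(gate_fidelity):
        # If errors compounded multiplicatively, n gates would retain roughly
        # gate_fidelity**n of the fidelity; solve gate_fidelity**n = 0.5 for n.
        return log(0.5) / log(gate_fidelity)

    print(round(depth_to_half_fidelity(0.90)))    # ~7 gates at sub-90% fidelity
    print(round(depth_to_half_fidelity(0.998)))   # ~346 gates at the 99.8% reported here

Even under this simplistic model, the improvement buys roughly fifty times more usable circuit depth before half the signal is lost.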
<urn:uuid:95bc2f69-1d6c-4629-a5e7-2c6665c050d5>
CC-MAIN-2023-14
https://www.techexplorist.com/silicon-qubits-quantum-computing/46323/
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296943747.51/warc/CC-MAIN-20230321225117-20230322015117-00398.warc.gz
en
0.919665
1,057
3.640625
4
Research team supersizes 'quantum squeezing' to measure ultrasmall motion Physicists at the National Institute of Standards and Technology (NIST) have harnessed the phenomenon of "quantum squeezing" to amplify and measure trillionths-of-a-meter motions of a lone trapped magnesium ion (electrically charged atom). Described in the June 21 issue of Science, NIST's rapid, reversible squeezing method could enhance sensing of extremely weak electric fields in surface science applications, for example, or detect absorption of very slight amounts of light in devices such as atomic clocks. The technique could also speed up operations in a quantum computer. "By using squeezing, we can measure with greater sensitivity than could be achieved without quantum effects," lead author Shaun Burd said. "We demonstrate one of the highest levels of quantum squeezing ever reported and use it to amplify small mechanical motions," NIST physicist Daniel Slichter said. "We are 7.3 times more sensitive to these motions than would be possible without the use of this technique." Although squeezing an orange might make a juicy mess, quantum squeezing is a very precise process, which moves measurement uncertainty from one place to another. Imagine you are holding a long balloon, and the air inside it represents uncertainty. Quantum squeezing is like pinching the balloon on one end to push air into the other end. You move uncertainty from a place where you want more precise measurements, to another place, where you can live with less precision, while keeping the total uncertainty of the system the same. In the case of the magnesium ion, measurements of its motion are normally limited by so-called quantum fluctuations in the ion's position and momentum, which occur all the time, even when the ion has the lowest possible energy. Squeezing manipulates these fluctuations, for example by pushing uncertainty from the position to the momentum when improved position sensitivity is desired. In NIST's method, a single ion is held in space 30 micrometers (millionths of a meter) above a flat sapphire chip covered with gold electrodes used to trap and control the ion. Laser and microwave pulses are applied to calm the ion's electrons and motion to their lowest-energy states. The motion is then squeezed by wiggling the voltage on certain electrodes at twice the natural frequency of the ion's back-and-forth motion. This process lasts only a few microseconds. After the squeezing, a small, oscillating electric field "test signal" is applied to the ion to make it move a little bit in three-dimensional space. To be amplified, this extra motion needs to be "in sync" with the squeezing. Finally, the squeezing step is repeated, but now with the electrode voltages exactly out of sync with the original squeezing voltages. This out-of-sync squeezing reverses the initial squeezing; however, at the same time it amplifies the small motion caused by the test signal. When this step is complete, the uncertainty in the ion motion is back to its original value, but the back-and-forth motion of the ion is larger than if the test signal had been applied without any of the squeezing steps. To obtain the results, an oscillating magnetic field is applied to map or encode the ion's motion onto its electronic "spin" state, which is then measured by shining a laser on the ion and observing whether it fluoresces. Using a test signal allows the NIST researchers to measure how much amplification their technique provides. 
In a real sensing application, the test signal would be replaced by the actual signal to be amplified and measured. The NIST method can amplify and quickly measure ion motions of just 50 picometers (trillionths of a meter), which is about one-tenth the size of the smallest atom (hydrogen) and about one-hundredth the size of the unsqueezed quantum fluctuations. Even smaller motions can be measured by repeating the experiment more times and averaging the results. The squeezing-based amplification technique allows motions of a given size to be sensed with 53 times fewer measurements than would otherwise be needed. Squeezing has previously been achieved in a variety of physical systems, including ions, but the NIST result represents one of the largest squeezing-based sensing enhancements ever reported.

NIST's new squeezing method can boost measurement sensitivity in quantum sensors and could be used to more rapidly create entanglement, which links properties of quantum particles, thus speeding up quantum simulation and quantum computing operations. The methods might also be used to generate exotic motional states. The amplification method is applicable to many other vibrating mechanical objects and other charged particles such as electrons.

More information:
S.C. Burd et al., "Quantum amplification of mechanical oscillator motion," Science (2019). science.sciencemag.org/cgi/doi … 1126/science.aaw2884
"Squeezing out higher precision," Science (2019). science.sciencemag.org/cgi/doi … 1126/science.aax0143
Journal information: Science
Provided by National Institute of Standards and Technology
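The two headline numbers in the article are mutually consistent under a standard statistics argument: averaging N repetitions improves sensitivity only as the square root of N, so a 7.3-fold gain in single-shot sensitivity should be worth roughly 7.3 squared, or about 53, times fewer measurements. This reading of the figures is an inference, not a statement from the paper, but the arithmetic checks out:

    from math import sqrt

    sensitivity_gain = 7.3                    # reported single-shot sensitivity gain
    measurement_saving = sensitivity_gain ** 2
    print(measurement_saving)                 # ~53.3, matching the "53 times fewer" figure
    print(sqrt(53))                           # ~7.28, the gain implied by the quoted saving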
<urn:uuid:a2fc7250-a7c4-43a9-91a3-1a45c1bbd72a>
CC-MAIN-2023-14
https://phys.org/news/2019-06-team-supersizes-quantum-ultrasmall-motion.html
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296948708.2/warc/CC-MAIN-20230327220742-20230328010742-00398.warc.gz
en
0.921711
1,046
3.921875
4
Quantum computing is here to shake up existing mechanical, electrical and electronic systems. Modern electronics in particular will not be the same if quantum computing gains acceptance. There are voices of support as well as dissent. In this post, we'll analyze future trends in quantum computing. Keep reading!

Quantum computers use atoms to perform calculations. The computation speed depends principally on qubits (quantum bits). These quantum bits are the fundamental building blocks of a quantum computer. Recent developments in quantum research are expected to render Moore's law obsolete by 2020. The future of quantum computers is, as of now, not very certain, particularly due to known problems in areas such as decoherence, error correction, output observation and cost. But if scientists succeed in developing a practically useful quantum computer, it may replace traditional computers in sectors such as robotics (industrial automation), cybersecurity and alternative energy. Such computers may also be deployed for solving emerging tactical problems like tsunami alerts.

Quantum computers could push computational power to a new and unanticipated peak by providing a fast and efficient platform for high-performance computing. At present, we don't have very efficient systems capable of solving tactical problems such as:
- Accurate weather forecasting
- Predicting the right patterns in stock markets
- Analyzing the molecular/DNA makeup of the human body in medical research

Today, processor die size is drastically shrinking, but there are not enough software solutions developed for harnessing the full processor potential. Computing power over the next few years will perhaps skyrocket with the advent of quantum computers. Many experts argue that the computing world today doesn't even have the right programs to actually utilize a 1 GHz mobile processor in the best possible way. It's not more processor speed but better programs we need urgently right now, or is it?

Have a look at some areas where quantum computers can play a vital role in the near future. Artificial intelligence (AI) was primarily meant to assist humans in executing complex jobs such as handling operations in the middle of a furnace blast or during space and military missions. Today, robotic systems are heavily used in the industrial automotive world for boosting production. The introduction of quantum computing could give a major boost to AI by enabling the creation of even more powerful and intelligent robots. The capability of encoding information in fuzzy quantum states would multiply the power of these artificial creatures. It would be possible to scan through large databases in a few seconds with qubits. Quantum AI techniques can dramatically speed up image acquisition and processing. Algorithms have already been developed and are ready for implementation on quantum computers. But recent failures in controlling qubits inside laboratories pose serious questions regarding the viability of quantum computing.

Robots equipped with powerful qubits would be able to break most encryption codes in near-zero time. A quantum computer could possibly crack any password in an instant. No security algorithm would then be able to provide 100% security to content placed on web servers. As far as the Internet is concerned, everything (yes, everything) will have to be redefined using quantum computers.
Qubits (known as quantum dots in solar terminology) could be deployed at scale in solar panels to replace current photovoltaic cell technology. A quantum dot is a nanoscale particle of semiconducting material that can be embedded in other materials; it could therefore revolutionize the renewable energy sector. Qubits can also be used to make quantum batteries in order to store energy generated by powerful windmills.

Teleportation (if it ever becomes a reality) would allow the transfer of matter from one place to another without traversing a physical medium. With this technology, some say, time travel could become possible, though it is still considered a myth. Quantum teleportation technology would enable humans to travel vast distances without losing a moment, as seen in sci-fi movies. Right now, it's all speculation.

Quantum computers can be connected in series to form a quantum network, thus building a smart grid. They would offer high encoding and decoding speeds with fast transfer of information (qubits). Smart energy grids would offer high efficiency in the energy delivery system. Additionally, quantum computers could be used to process the large amounts of data coming from geothermal activity.

The already developed and much-touted quantum computer from D-Wave Systems is 3600 times more powerful than a conventional PC. But the project was declared a failure on the application front by Google. Questions about the real-world feasibility of such expensive projects remain unanswered. But given that everything from cellphones and wireless networks to electricity was no less than a miracle a few dozen years ago, quantum computing too may appear as a miracle at first and slowly become an integral part of our lives.

About Amy Baker
A computer science engineer, Amy holds a Master's degree in quantum computing. She is based in Texas.
The content & opinions in this article are the author's and do not necessarily represent the views of RoboticsTomorrow.
<urn:uuid:b60f1c05-fd10-4bb0-9cf2-fc5aa4569d47>
CC-MAIN-2019-26
https://www.roboticstomorrow.com/article/2014/02/an-uncertain-future-for-quantum-computing/235/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627999539.60/warc/CC-MAIN-20190624130856-20190624152856-00379.warc.gz
en
0.918999
1,071
3.546875
4
Scientists pinpoint the singularity for quantum computers Researchers from the University of Bristol have discovered that super-powerful quantum computers, which scientists and engineers across the world are racing to build, need to be even more powerful than previously thought before they can beat today's ordinary PCs. Quantum computers are a new type of machine that operate on quantum mechanical hardware and are predicted to give enormous speed advantages in solving certain problems. Research groups at leading universities and companies, including Google, Microsoft and IBM, are part of a worldwide race to realise the first quantum computer that crosses into the 'quantum computational singularity'. This represents a problem so complex that today's top supercomputer would take centuries to find a solution, while a quantum computer could crack it in minutes. Now a team of scientists from Bristol have discovered that the boundary to this singularity is further away than previously thought. The research is reported this week in Nature Physics. The results apply to a highly influential quantum algorithm known as 'boson sampling', which was devised as a very direct route to demonstrate quantum computing's supremacy over classical machines. The boson sampling problem is designed to be solved by photons (particles of light) controlled in optical chips – technology pioneered by Bristol's Quantum Engineering and Technology Labs (QETLabs). Predicting the pattern of many photons emerging from a large optical chip is related to an extremely hard random matrix calculation. With the rapid progress in quantum technologies, it appeared as though a boson sampling experiment that crossed into the quantum computational singularity was within reach. However, the Bristol team were able to redesign an old classical algorithm to simulate boson sampling, with dramatic consequences. Dr Anthony Laing, who heads a group in QETLabs and led this research, said: "It's like tuning up an old propeller aeroplane to go faster than an early jet aircraft. "We're at a moment in history where it is still possible for classical algorithms to outperform the quantum algorithms that we expect to ultimately be supersonic. "But demonstrating such a feat meant assembling a crack team of scientists, mathematicians, and programmers." Classical algorithms expert Dr Raphaël Clifford, from Bristol's Department of Computer Science, redesigned several classical algorithms to attack the boson sampling problem, with the 1950's Metropolised Independence Sampling algorithm giving the best performance. The simulation code was optimised by QETLabs researcher 'EJ', a former LucasArts programmer. Expertise on computational complexity came from Dr Ashley Montanaro, of Bristol's School of Mathematics, while QETLabs students Chris Sparrow and Patrick Birchall worked out the projected performance of the competing quantum photonics technology. At the heart of the project and bringing all these strands together was QETLabs PhD student and first author on the paper, Alex Neville, who tested, implemented, compared, and analysed, all of the algorithms. He said: "The largest boson sampling experiment reported so far is for five photons. "It was believed that 30 or even 20 photons would be enough to demonstrate quantum computational supremacy." Yet he was able to simulate boson sampling for 20 photons on his own laptop, and increased the simulation size to 30 photons by using departmental servers. 
Alex added: "With access to today's most powerful supercomputer, we could simulate boson sampling with 50 photons." The research builds on Bristol's reputation as a centre of activity for quantum science and the development of quantum technologies. Through QETLabs, the university has embarked on an ambitious programme to bring quantum technologies out of the laboratory and engineer them into useful devices that have real-world applications for tackling some of society's toughest problems. In addition to collaborations with tech companies such as Microsoft, Google, and Nokia, start-ups and new business activities focused on quantum technologies have emerged in Bristol.

An important theme across the overall quantum research activity is developing our understanding of exactly how quantum technologies can provably outperform conventional computers. Recently, Dr Montanaro, together with Professor Noah Linden of the School of Mathematics, organised a Heilbronn Focused Research Group on the topic of quantum computational supremacy. This meeting brought some of the world leaders in the field, from both industry and academia, to Bristol for a week of intense discussions and collaboration. Among the attendees was one of the theorists who devised boson sampling, Professor Scott Aaronson, from UT Austin.

Although outperforming classical computers might take a little longer than originally hoped, Dr Laing is still optimistic about the prospects for building a device to do just that. He said: "We now have a solid idea of the technological challenge we must meet to demonstrate that quantum machines can out-compute their classical counterparts. For boson sampling, the singularity lies just beyond 50 photons. It's a tougher nut to crack than we first thought, but we still fancy our chances." With Dr Laing's group focused on practical applications of quantum technologies, the current work puts bounds on the size and sophistication of photonic devices that will be required to tackle industrially relevant problems that are beyond the capabilities of today's classical algorithms.
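The classical workhorse named earlier in the article, Metropolised Independence Sampling, is a 1950s-era Markov chain Monte Carlo method: proposals are drawn from a fixed distribution, independent of the current state, and accepted or rejected so that the chain targets the desired distribution. The Python sketch below is the generic algorithm applied to a toy Gaussian target; it is not the boson sampling simulation code from the Bristol paper:

    import math
    import random

    def metropolised_independence_sampler(log_p, propose, log_q, n_steps):
        # Proposals come from a fixed distribution q, independent of the current
        # state, and are accepted with probability min(1, w(new)/w(current)),
        # where w = p/q (normalising constants cancel in the ratio).
        x = propose()
        log_w = log_p(x) - log_q(x)
        samples = []
        for _ in range(n_steps):
            y = propose()
            log_w_y = log_p(y) - log_q(y)
            if math.log(random.random()) < log_w_y - log_w:
                x, log_w = y, log_w_y
            samples.append(x)
        return samples

    # Toy example: target is (unnormalised) N(3, 1), proposals come from N(0, 3).
    log_p = lambda x: -0.5 * (x - 3.0) ** 2
    log_q = lambda x: -0.5 * (x / 3.0) ** 2
    propose = lambda: random.gauss(0.0, 3.0)

    xs = metropolised_independence_sampler(log_p, propose=propose, log_q=log_q, n_steps=20000)
    print(sum(xs) / len(xs))   # approximately 3, the mean of the target

For boson sampling, the target distribution is over photon output patterns rather than real numbers, but the accept/reject structure the Bristol team tuned up is the same.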
<urn:uuid:b1e71c84-c02d-4bc5-a7b5-0560fab30f09>
CC-MAIN-2019-26
https://phys.org/news/2017-10-scientists-singularity-quantum.html
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998724.57/warc/CC-MAIN-20190618123355-20190618145355-00300.warc.gz
en
0.937086
1,067
3.75
4
News and videos about quantum computers (QC) are common. 'Quantum' inspires awe and mystery. Astonishing speed-ups are promised. 'Entanglement' is thrown in the mix - people become hooked. But this computer research that inspires such fascination is an area that offers the fewest opportunities for involvement or understanding. Want to learn to programme? Use tools like Scratch. Want to develop machine learning skills? There's a Python package for that. Want to learn about QC? Zip through these courses on complex vector spaces, number theory, and an undergraduate introduction to quantum mechanics. Then you can start trying to understand the basics of QC! But what about the only 'killer app' for QC - Shor's Algorithm? Well, that would strain the brain of a third-year maths undergraduate. The mysteries of quantum effects are easy to understand in the maths. In the equations all is clear. But it is a mix of maths topics that the average computer programmer is unlikely to have met.

Another approach to understanding QC involves helping other people understand it. One way to do this is to create musical problems and use QC to solve them. Discussing the solution to these problems can provide a greater insight into QC. The example in this article is the musical problem of chords, solved on a quantum D-Wave 2X.

The first company to sell quantum computers was D-Wave, who flogged a few off to people such as Google, NASA and Lockheed Martin. The D-Wave computers are adiabatic quantum computers (AQC). They are not like normal algorithmic step-by-step QC, such as those made by IBM. An adiabatic quantum computer is reminiscent of a neural network. It is based on the equations for Ising models. Ising models describe the physics of a magnetic material through the molecules within it. An AQC solves the equations of the Ising model to minimise the energy in the simulated magnetic material. The programming involves defining properties of the simulated 'molecules'. Over a period of 28 years, more than 10,000 publications came out, in areas as diverse as zoology and artificial intelligence, on applications of the Ising model.

There is an ongoing debate about how the D-Wave AQC truly functions and what speedup it can provide. Google claimed large speed increases for its quantum hardware. This is thought to be due to quantum tunnelling. When searching for low energy states, a quantum system can tunnel into nearby states. Quantum tunnelling allows physical systems to move to states in ways that would not be possible in the classical Newtonian view of the world. The systems 'tunnel' through to the new, usually inaccessible states instantaneously.

This particular musical problem was set up by assigning each note of the musical scale to one 'molecule' of the Ising model. Each molecule is modelled by a quantum bit, or qubit. At this point, the mathematical world of quantum mechanics is entered, where everything makes sense in the equations, but not in the explanation! Every qubit can be simultaneously a one and a zero (unlike a bit, which can only be one or zero). This is very simple mathematically, but makes no sense in our everyday observed world. For example, a cat cannot be both alive and dead, as Schrödinger once observed in his famous thought experiment. He was trying to imagine the laws of quantum mechanics applying to the world beyond subatomic particles. This so-called 'superposition' of one and zero is not a form of statistical or probabilistic computing. It is something more complex.
In the combination of one and zero held by this single qubit, the one and the zero also have what is known as a 'phase'. This can be thought of as the result of another strange consequence of quantum theory: everything is simultaneously a wave and a particle. An electron is a waveform, and a light wave is also a particle of light called a photon. When the qubit is actually measured, its resulting value will always be 0 or 1. For definite. What's more, the phase of the 0 and 1 in the superposition has no effect on the chance of whether a 0 or 1 is seen. But, until that observation, not only is the result indeterminate, but these phases have dramatic effects on how qubits interact. Things have clearly moved beyond the realms of how programming is normally thought about. The qubit being like a bit that is both 0 and 1 is a useful analogy, but it's incomplete.

Qubits in harmony

The D-Wave 2X dealt with many underlying complexities. Connections were set up between the 'molecules' (the musical notes) in such a way that when the D-Wave program was triggered, it generated the form of musical chord required. A simple musical rule is used. The D-Wave would be sent a note, and it would find three or four notes which included this note, and which were neither too close together nor too far apart on the piano keyboard. Try pressing three notes at the same time on the piano keyboard. If they are too close they clash; if they are too far apart they don't sound like a chord.

Each time the D-Wave was asked to harmonise a note using this algorithm, it would send me multiple possible solutions. This highlights a key element of QC - there is no single correct solution to an algorithm. The solutions are held in a superposition, and then, when observed, a single solution presents itself. This is not necessarily the precise process the D-Wave is following, but its qubits move through a number of superpositions as a solution forms.

These ideas were captured and explained in a performance at the Port Eliot Music Festival in July 2017 called 'Superposition'. It was a composition for mezzo-soprano (Juliette Pochin) and electronic sounds. The electronics were generated by a real-time music system on my laptop, connected over the internet to the D-Wave 2X at USC. The mezzo-soprano's music was pre-composed. The sounds of her voice were picked up live by the laptop, converted into energy and frequency readings, and sent to the D-Wave as a problem to be solved by the harmony generator. The D-Wave returned multiple solutions. The local laptop took the multiple chords, spread them across the musical range, and played them together. These giant chords gave the audience some sense of the multiple solutions that may have existed in the superposition inside the quantum computer.

Universal quantum computers

The next performance planned will involve the Universal QC (UQC) of IBM. UQCs have logic gate diagrams and assembly code. They have processing elements, like NOT, XOR and a form of AND gate. But… the analogy breaks down. There are also gates that change qubit phase, and a 'Hadamard' gate, which takes as input a qubit that is definitely a 1 or 0 and turns it into an indeterminate superposition. Combine a Hadamard gate with a quantum XOR gate and you have 'entangled' qubits. Entanglement, vital to QC algorithms and probably the most famous element of QC, is once again simple to see in the maths, but makes little sense if explained otherwise. Quantum computing, both adiabatic and universal, is proving a fruitful research topic.
What is lagging is true public engagement. People, and most programmers, don’t know degree-level maths. So, let’s find new approaches to explain, and perhaps one day utilise, the power of quantum computing in more comprehensible ways. Information on Alexis Kirke’s work and further projects can be found at: www.alexiskirke.com
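The chord rule described above maps naturally onto the kind of penalty-minimisation problem (a QUBO, or equivalently an Ising model) that a D-Wave annealer solves. Kirke's actual encoding is not given in the article, so the formulation and penalty values in this Python sketch are purely illustrative, and an exhaustive search stands in for the annealer's physics:

    from itertools import combinations

    NOTES = range(13)               # semitones 0..12 above the note we were sent
    TOO_CLOSE, TOO_FAR = 3, 9       # illustrative interval limits, in semitones

    def energy(chosen):
        # Penalty-style cost in the spirit of a QUBO/Ising objective: low energy
        # means "a 3-note chord containing note 0, with well-spaced intervals".
        e = 20.0 * (0 not in chosen)            # must harmonise the note we were sent
        e += 10.0 * (len(chosen) - 3) ** 2      # prefer three-note chords
        for i, j in combinations(sorted(chosen), 2):
            if (j - i) < TOO_CLOSE or (j - i) > TOO_FAR:
                e += 5.0                        # clashing or overly spread interval
        return e

    # Exhaustive search over small subsets stands in for the annealer.
    subsets = [set(c) for r in range(1, 5) for c in combinations(NOTES, r)]
    lowest = min(energy(s) for s in subsets)
    chords = [s for s in subsets if energy(s) == lowest]
    print(len(chords), chords[:4])   # many equally good chords, e.g. {0, 3, 7}, {0, 4, 7}, ...

The fact that many selections tie for the lowest energy mirrors the multiple chord solutions the D-Wave returned for each note in the 'Superposition' performance.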
<urn:uuid:2e4652c7-8f40-4806-a3eb-a9a79570ab75>
CC-MAIN-2019-26
https://www.bcs.org/content-hub/experiencing-quantum-through-music/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560628000266.39/warc/CC-MAIN-20190626094111-20190626120111-00062.warc.gz
en
0.948361
1,654
3.78125
4