Quantum Computing II: Wall Street Bets, Cyber Threats, and Richard Feynman

“Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical.” – Richard Feynman

Quantum stocks might be having their moment and investment-focused observers might view this as cutting edge. But just as AI has its academic roots in the 1960s, quantum computers were first conceived in the 1980s. It was during a lecture at MIT in May 1981 that Richard Feynman, a quantum physicist and amateur safecracker widely regarded as one of the most important thinkers of the last century (in any field), made the observation above. At the same conference, nuclear chemist Paul Benioff explained his vision of quantum computing, building on a paper he had published a year earlier. 

Unsurprisingly, Feynman’s vision was the result of frustration with the limitations of classical computers in assisting him with his quantum-physics research. Though it’s obviously true that computers at the time were woefully inadequate relative to what we have today, Feynman extrapolated their path and concluded that for certain problems, classical computers would almost certainly never improve enough to solve the kinds of problems he was interested in. 

Academia began working simultaneously on quantum algorithms and on building an actual, working quantum computer, with Oxford University’s David Deutsch and Bell Labs’ Peter Shor and Lov Grover making important early contributions to the field. (We will return in particular to Dr. Shor, now a professor at his alma mater of MIT, later.)

Governmental funding of basic science covered the lion’s share of research into quantum computing in these early decades, and it was not until 1998-2000 that the first working (if rudimentary) quantum computers were unveiled. These first models, which generated qubits using nuclear magnetic resonance rather than supercooled superconducting circuits (see Part I), ultimately convinced scientists that the NMR path would lead to a dead end due to scalability limits. The first physical quantum computers on which today’s machines are based began to emerge in 2001, at places like the Technical University of Munich, Stanford, and Los Alamos National Laboratory. 

Yet it would not be until 2019 that Google, building on earlier research, demonstrated real-world quantum supremacy for the first time. In fairness, quantum supremacy would have been accomplished sooner except that computer scientists “annoyingly” insisted on repeatedly improving classical supercomputers and thus moving the goalposts – not that we’re complaining. (Quantum supremacy describes a quantum computer not just succeeding at a specific computing task, but doing so in a manner that is demonstrably superior to what a classical computer could achieve.) 

Quantum algorithms

We touched on the nature of quantum computing previously, but it’s worth a review. Classical computing is largely about sequential calculation – doing a series of computations step by step. At speed, this can seem simultaneous: as an everyday example, when you watch a digital video, the processor is essentially plotting hundreds of thousands of pixels in sequence to form a still image, and then repeating this process dozens of times a second. It appears simultaneous to our much slower eyes and brains, but it isn’t. (Graphics processors, or GPUs, such as those used in video, streaming, and now AI, enhance this illusion by using many cores to process multiple pixels at the same time, but the processing remains fundamentally sequential.)

Perhaps the better analogy is sound. Imagine that we want to find all the places in a classical symphony where the pitch “440 Hz A” is played. A classical algorithm would evaluate every instrument’s part (first violins, cellos, French horns, etc.) from beginning to end in sequence and flag those instances. A quantum-computing approach would be to play every note in the symphonic score at the same time, creating a chaotic, cacophonic burst of noise. The quantum algorithm (think of it as emitting a separate soundwave) would then destructively interfere with – and cancel out – those notes that are unlikely to be 440 Hz A, while constructively interfering with (amplifying) those likely to be the note we’re seeking. This symphonic analogy is not quite perfect in describing quantum computing processes, but it is perhaps a more intuitive way of thinking about the difference.
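For readers who want to see the interference idea in concrete form, here is a minimal Python sketch of Grover-style amplitude amplification – the mechanism behind the “amplify the right note” intuition. To be clear, this is our own classical simulation, not a real quantum program: it simply tracks the list of amplitudes a quantum computer would hold, marks the target by flipping its sign (destructive interference), and reflects all amplitudes about their mean (constructive amplification).

```python
import math

def grover_search(n_items: int, target: int) -> list[float]:
    """Classically simulate Grover amplitude amplification.

    Start with equal amplitude on every item (every "note" played at
    once), then repeatedly mark the target (sign flip) and invert all
    amplitudes about their mean, which amplifies the marked item.
    """
    amps = [1 / math.sqrt(n_items)] * n_items
    iterations = int(math.pi / 4 * math.sqrt(n_items))  # ~optimal count
    for _ in range(iterations):
        amps[target] = -amps[target]             # oracle: mark the target
        mean = sum(amps) / n_items
        amps = [2 * mean - a for a in amps]      # diffusion: invert about mean
    return amps

amps = grover_search(16, target=5)
probs = [a * a for a in amps]                    # measurement probabilities
print(max(range(16), key=probs.__getitem__))     # item 5 now dominates
```

After just three rounds on a 16-item search space, the target’s measurement probability climbs above 95% – versus the 1/16 chance a single random classical probe would give.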

We included another explanation of quantum computing to make abundantly clear that quantum computing, as we currently understand it, will never totally replace classical computing – or AI. Instead, it will serve as a complement. Classical computing will likely always be superior at many types of tasks – just about any function you currently carry out on your smartphone or tablet, for example. Quantum computing will logically be used in those areas where classical computing falters.

There’s a certain “Everything, Everywhere All At Once” mindset to quantum computing. Pictured here: Michelle Yeoh in the film of the same name, image courtesy of Lionsgate Films

Quantum computing is currently at a stage analogous to the initial development of power grids in the late 1880s, when Thomas Edison and George Westinghouse battled over two approaches that each had a distinct set of advantages and disadvantages. 

We discussed an approach in Part I in which the qubits used in quantum computing are generated through supercooled, superconducting circuits. While this is one of the first ways that scientists discovered to generate qubits, the slow rate of progress on that front has led scientists to investigate multiple other methods, some of which seem roughly as promising. 

Trapped Ion Qubits

Yes, but it’s a good kind of trap. Source: Lucasfilm

Another method of generating qubits is to place ions (charged atoms) into an electromagnetic trap to hold them in place, then hit them with an array of precisely placed and angled lasers to cool them down. This might seem counterintuitive, since we tend to associate lasers with intense heat. However, it’s important to remember that heat can be thought of as kinetic energy – the vibration of atoms (admittedly a bit of an oversimplification). Cooling an atom means slowing down its vibrations, and this can be done by directing precisely aimed and timed photons (light particles) at the atom, damping its vibrations. It’s a bit like throwing small snowballs at a rushing linebacker to slow him down. Aimed correctly and in sufficient quantity, they would bring him to a halt – in other words, cool him down in more ways than one. 

Trapped-ion qubits are typically stable, with long coherence times. In addition, under certain circumstances, the gates that are used to manipulate trapped-ion qubits tend to be high-fidelity while still maintaining respectable speeds (though admittedly slower than gates used with superconducting quantum computers like IBM’s). All of this helps minimize errors.

Perhaps the most attractive advantage of trapped-ion computers is that they require far less conventional cooling. The trapped ions themselves become supercooled, but the immediate surroundings of a trapped-ion computer can remain at something close to room temperature. (Quantinuum/Honeywell has found that moderate cryogenics – 4 kelvin rather than just a few millikelvins above absolute zero – can help improve performance, however.)

However, the precision with which the lasers must be applied presents a serious engineering challenge, and the difficulty ramps up rapidly when scaling up the size of the system as a whole. (Coherent COHR -2.85%  and other silicon photonics companies, which we discussed in a previous Signal, are working on this problem as well.) 

When scaling up trapped-ion computers, gate speeds can also slow, negating much of their initial advantage. Although there are theoretical solutions involving photonics and optics that can mitigate these, the additional complexity means more points of potential failure, adding a different kind of risk.

Arguably the leader in this type of quantum computing is IonQ (IONQ -3.93% ), currently beginning to move more fully away from pure research and into commercial-scale manufacturing. Some enthusiastic pundits are even describing IonQ as the “next Nvidia,” and while we view it as far too early to make such an assessment given the nascent state of quantum computing research, there’s no denying that IonQ’s work – particularly its Tempo and Forte systems – shows promise. 

As we mentioned earlier, Honeywell HON -0.49%  and Quantinuum are also seeking to develop trapped-ion computers. (Quantinuum, a privately held startup in which Honeywell owns a majority stake, was formed through a spinoff of Honeywell’s quantum unit and its merger with Cambridge Quantum.) While formally separate, the two companies are developing their own trapped-ion computer, with Honeywell’s industrial expertise leveraged to produce the customized components required. Quantinuum’s trapped-ion approach differs from IonQ’s in its focus on flexibility and on “quality” – error-free computing – rather than on power as expressed by sheer number of qubits. 

Topology

Another approach makes use of topological qubits. These are quite different from the qubits described above, and topological computing is an aggressively ambitious and difficult approach. 

Though quantum physics might seem like cutting-edge science to the layperson, topological qubits take things a step further. Rather than manipulating electrons or atoms, topological qubits are generated using “quasiparticles” called Majorana anyons. Some regard these quasiparticles as a new state of matter (one of the roughly dozen states beyond the solid-liquid-gas trinity we all learned about in elementary school). 

Generating topological qubits requires scientists to engineer an entirely new class of superconductors that layers semiconductor and superconductor materials on top of each other at a microscopic level – adding to the challenges. 

Within the commercial space, arguably the most prominent proponent of this approach is Microsoft. The Redmond, Wash.-based company is betting that the risks and difficulties will be worth it. It’s akin to taking on a lot of additional risk in investing because the potential payoff could be significantly higher. Here’s why: 

Topological qubits are more commonly compared to braided ropes than to a spinning coin, and the “computing” takes place by moving strands of the braids around one another. This analogy works in explaining the primary advantage of topological qubits. As Microsoft researcher Chetan Nayak explained it, “Most qubits are like balancing a needle on its point; topological qubits are like a knot in a rope—much harder to accidentally undo.” They are more resistant to external disturbances like heat, vibration, and magnetic fields, with longer coherence times. 

As Robert Willett, a leading researcher in the field working at Nokia NOK 4.31% , describes it: The hypothesis is that a topological qubit will be “intrinsically stable and easy to control,” with “extremely low error rates, which means we would not need to build massive redundancy into quantum computers. This would make quantum computers smaller, more energy efficient and massively powerful.”

In February 2025, Microsoft announced that it had successfully come up with the requisite new material, used it to create the Majorana anyons, and incorporated them into a processor called the Majorana 1. Importantly, not everyone agrees that Microsoft has proven its efforts successful.

Just in case the critics are right, Microsoft is using its Azure Quantum platform to partner with other quantum ventures, primarily by offering its qubit-virtualization platform to pair with third-party hardware. For instance, Microsoft has partnered with Quantinuum, using Quantinuum’s hardware to test its virtualization, diagnostic, and error-correction software, and with Atom Computing, a privately held startup working on an even newer variant of quantum computing that focuses on neutral-atom qubits. (While its scalability potential is promising enough to warrant continued research, the neutral-atom approach is currently in its infancy, pursued primarily in academia and at small, private startups.)

On cybersecurity and cryptography

Much of modern cryptography and cybersecurity, including the schemes used in cryptocurrency transactions, is based on a specific type of mathematical problem: one that is easy to compute in one direction but much more difficult to reverse. Mathematicians call this a “one-way function.” Cryptography based on such mathematics is known as asymmetric cryptography.

For example, one widely used form of asymmetric cryptography is RSA. RSA is based on the fact that it is relatively easy to multiply two large prime numbers together. (By large, we mean primes of roughly 300 decimal digits each, so that their product runs to 600 or more digits.) The product of those primes forms the public key used to encrypt data. (It might be useful to think of this as akin to a conventional physical lock.) Decrypting such information requires the “private key” – knowledge of what the two prime factors are. Someone who is not authorized to know the factors would have a very difficult time determining the right key even if they were to get their hands on the lock. 
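The asymmetry is easy to demonstrate at toy scale. In the Python sketch below – our own illustration, using deliberately small primes where real RSA uses primes hundreds of digits long – “locking” is a single multiplication, while recovering the factors already requires a search:

```python
# Two known primes (the 10,000th and 100,000th primes); real RSA
# primes are ~300 digits each.
p, q = 104_729, 1_299_709
n = p * q                 # "locking": one cheap multiplication

def factor_by_trial_division(n: int) -> tuple[int, int]:
    """Recover the factors the hard way: test every odd candidate divisor.

    For 300-digit primes this loop would need on the order of 10**150
    steps -- the "one-way" gap that RSA's security rests on.
    """
    d = 3
    while n % d:
        d += 2
    return d, n // d

print(factor_by_trial_division(n))   # (104729, 1299709)
```

Even at this toy size, the unlock direction does tens of thousands of divisions to undo one multiplication; each extra digit in the primes multiplies that gap.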

That will likely cease to be true for RSA and other types of asymmetric cryptography, including elliptic curve cryptography (ECC, widely used in the cryptocurrency world). Thanks to Shor’s Algorithm, developed by Peter Shor while he was working at Bell Labs, a sufficiently capable quantum computer theoretically could recover the private key of any asymmetric scheme with dramatically less work than brute force requires. What would have taken a powerful classical supercomputer trillions of years to accomplish could be done in a matter of days – or even hours – by a fault-tolerant quantum computer with sufficient qubit processing power. 

It’s worth noting that quantum computing means that encrypted data stolen in the past might also become easily decryptable. Though some of that data might be stale by the time quantum computers catch up, the risk remains that some of it will still be sensitive and potentially damaging. Indeed, cybersecurity experts believe that as quantum computing advances, cybercriminals have begun to employ a “Harvest Now, Decrypt Later” approach, gathering up as much encrypted data as possible in anticipation that it will become decryptable – and thus valuable – someday. It’s a bit like stealing an entire safe in the hope that it will someday be possible to extract its contents. 

In the meantime, researchers are seeking to develop and deploy post-quantum cryptography (PQC). Just as weapons manufacturers are often also the leading manufacturers of defenses against those same weapons, the private-sector leaders in PQC include several leaders in quantum research. IBM (IBM -0.76% ), for example, has worked closely with the National Institute of Standards and Technology (NIST) to set standards for PQC, and has developed several PQC algorithms that have gone through NIST’s evaluation process. 

Meanwhile, Google (GOOGL -0.81% ) is looking into integrating PQC technology into its existing technology – not just its data centers, but also everyday products like its Chrome browser. Microsoft is taking a similar approach by working to build PQC protection into everyday products like Windows and Office. 

That’s not to say other companies are sitting around inactive. Marvell Technology MRVL -3.39%  claims to have integrated PQC capabilities into its hardware, developing specialized chips like its LiquidSecurity and NITROX modules that are dedicated to performing quantum-safe cryptographic operations.

Traditional cybersecurity firms like Cloudflare NET 0.08%  and Palo Alto Networks PANW -1.00%  are already implementing PQC cybersecurity – or at least they’re trying to. Both companies face challenges that are all too familiar. The first involves convincing clients to go through the hassle and cost of upgrading security against a threat that’s almost certain to be very serious someday – but doesn’t exist yet and has yet to scare them by wreaking global havoc. The second involves actually implementing the upgraded security – both companies are keenly aware that they too risk inadvertently triggering a major CrowdStrike-like global outage every time they carry out an update or upgrade.

Cryptocurrency

Unsurprisingly, the cryptography implications we just discussed have future ramifications for those involved with cryptocurrencies – the clue is in the etymology, after all. As BlackRock put it in its iShares Bitcoin Trust prospectus: “If quantum computing technology is able to advance, it could potentially undermine the viability of many of the cryptographic algorithms used across the world’s information technology infrastructure, including those used for digital assets like Bitcoin.” That’s quite a big “if,” however. In its current state, quantum computing is generally viewed as still being far from capable of making these risks imminent. Yet many researchers in the field would assert that the question is not really “if,” but rather “when.”

As we mentioned earlier, asymmetric ECC-based security is widely used to safeguard crypto wallets and validate cryptocurrency transactions – particularly when it comes to legacy addresses and wallets. While the problem is fixable with enough effort and motivation, a May 2025 report suggests that somewhere between 20% and 50% of global Bitcoin supply would be vulnerable to theft if sufficiently powerful quantum computers existed right now. Ethereum is similarly vulnerable, though the nature of Ethereum makes it arguably and theoretically easier to quantum-proof its architecture. 

Quantum technology also poses potential risks to crypto miners. Quantum computing has the potential to disrupt the field of competition – miners who continue to rely on classical computers are likely to find themselves at a serious disadvantage. This could induce some to cease operations, ultimately threatening the decentralized nature of cryptocurrencies, at least temporarily. As decentralization is a major draw of cryptocurrencies in the first place, it’s reasonable to assume that this would be followed by significant volatility – as well as debate about the path forward for each of the coins impacted. We should stress again that quantum computing in its current state is arguably quite far from posing such a threat.

Providers of quantum security and PQC services can arguably expect opportunities to arise as quantum computing advances and crypto companies seek to prepare and mitigate the emerging risks. In addition to the general quantum cybersecurity companies listed above, companies focused specifically on risks to cryptocurrency markets include:

  • BTQ Technologies BTQ: The company recently launched a testnet version of its Bitcoin Quantum network, which is positioned as a possible Bitcoin fork integrating a PQC standard endorsed by NIST. BTQ is also developing hardware-level PQC security through a Quantum Compute-in-Memory architecture that, if successful, would be dedicated to handling the security-related aspects of crypto transactions. 
  • SEALSQ LAES -4.63% : SEALSQ is focused on PQC cryptocurrency-focused infrastructure. This includes the development of cold-storage wallets that incorporate PQC chips, research into how to integrate PQC algorithms directly into blockchain core software, and making proof-of-work and proof-of-stake processes resistant to quantum-driven interference.
  • Arqit Quantum ARQQ -4.54% : Arqit is taking a different tack than the other two. Rather than trying to make existing asymmetric security resistant to quantum-based intrusions, Arqit is championing replacing it altogether with symmetric security for crypto exchanges and financial institutions. (We have not discussed symmetric cryptography in detail, but in a nutshell, symmetric cryptography is far more resistant to quantum attacks; however, it presents a different set of security issues, as well as challenges with scalability and communication/transaction validation.)

Traditional Finance

The discussions above generally apply to traditional banking and finance as well. If (when) quantum computers become sufficiently capable, the security of many aspects of banking and finance could come under risk – from bank deposits and withdrawals to equities and options trading, from cross-border wire transfers to digital identity verification and transactional execution.

The major banking giants aren’t passively waiting around to find out what happens. Indeed, something of a quantum arms race has arisen among major traditional financial institutions, as they invest in quantum efforts, partner with leaders to experiment with applications, and investigate potential risks and how to safeguard against them. 

Take JPMorgan Chase JPM -1.95% , for example. The firm is arguably the most actively involved in quantum development within the financial sector – the bank’s name even appears on a number of quantum patents and research papers. JPM has partnered with IBM to experiment with quantum-driven Monte Carlo simulations to inform portfolio optimization. The firm has also deployed an experimental quantum-secured, crypto-agile network between its major data centers, designed to resist quantum attacks. The bank has sizable stakes in quantum startups such as Quantinuum, QC Ware, and IonQ. (Fun fact: Dr. Marco Pistoia, IonQ’s new Global Head of Special Projects, formerly led JPMorgan’s quantum research efforts.)

HSBC HSBC 0.23%  is also actively preparing for a quantum future, having recently deployed Quantum Key Distribution to help protect digital records of ownership of physical assets from hacking. In September, HSBC partnered with IBM to determine whether a classical-quantum hybrid system could improve bond pricing in algorithmic trading (specifically, predicting the likelihood that an order would be filled at a given price). While the results were ultimately deemed inconclusive, the experiment suggested that the quantum approach might have been superior. (IBM also ran a similar portfolio-optimization experiment in partnership with Vanguard, with the results again proving inconclusive but hopeful.)

Quantum computing innovations in finance are not exclusive to the U.S. For example, since 2021, Italy’s Intesa Sanpaolo ISNPY has dedicated significant resources to quantum research, with partnerships with IBM -0.76%  and D-Wave QBTS -6.56% . These efforts have included experimenting with quantum algorithms for credit scoring and derivative pricing.

Conclusion

We focused on applications of quantum computing in areas likely of interest to our readers, but it’s important to note that the leading quantum companies and startups have attracted interest from a broad range of players in other industries – defense and aerospace companies like Airbus and Lockheed Martin, pharmaceutical giants like AstraZeneca, automakers like Volkswagen and Mercedes, and energy giants like ExxonMobil and BP, to name just a few. 

While the willingness of all these major corporations to invest in quantum efforts suggests confidence in the promise of quantum computing (as does the “Quantum Gold Rush” equity investors witnessed in 2025), it remains important to stress that quantum computing has yet to reach the holy grail of large-scale, fault-tolerant, widely deployable quantum computing – and there’s no guarantee that this objective will be achieved anytime soon.

Thus, as always, Signal From Noise should not be used as a source of investment recommendations but rather ideas for further investigation. We encourage you to explore our full Signal From Noise library, which includes deep dives on the race to onshore chip fabrication, the AI Merry-Go-Round, space-exploration investments, the military drone industry, the presidential effect on markets, ChatGPT’s challenge to Google Search, and the rising wealth of women. You’ll also find a recent update on AI focusing on sovereign AI and AI agents, the TikTok demographic, and the tech-powered utilities trade.
