INNOVATE is the online magazine by and for AIPLA members from IP law students all the way through retired practitioners. Designed as an online publication, INNOVATE features magazine-like articles on a wide variety of topics in IP law.
The Quantum Frontier: Disrupting AI and Igniting a Patent Race
Diego F. Freire
A contemporary computer processor, at only half the size of a penny, can carry out 11 trillion operations per second with the assistance of an impressive assembly of 16 billion transistors. This feat contrasts starkly with early transistor-based machines such as the Manchester Transistor Computer, which performed an estimated 100,000 operations per second using 92 transistors and occupied the space of a large refrigerator. For comparison, while the Manchester Transistor Computer could take several seconds or minutes to calculate the sum of two large numbers, the Apple M1 chip can do so almost instantly. This rapid acceleration of processing capability and device miniaturization is attributable to the empirical observation known as Moore's Law, named after the late Gordon Moore, co-founder of Intel, which posits that the number of transistors on an integrated circuit doubles approximately every two years.
These powerful processors have paved the way for advancements in diverse domains, including the disruptive field of artificial intelligence (AI). Nevertheless, as transistor miniaturization approaches its physical limits and Moore's Law nears its end, the field of computing is extending into the enigmatic sphere of quantum physics, the branch of physics that studies the behavior of matter and energy at atomic and subatomic scales. It is within this realm that quantum computing arises, offering the potential for exponential growth in computational performance and speed and heralding a transformative era in AI.
In this article, we explore the captivating universe of quantum computing and its prospective implications for the development of AI, and we examine the legal measures, particularly patents, that leading tech companies have adopted to protect their innovations in this rapidly advancing field.
Qubits: The Building Blocks of Quantum Computing
In classical computing, the storage and computation of information are entrusted to binary bits, which take a value of either 0 or 1. For example, a classical computer can have a specialized storage device called a register that stores one number at a time using bits. Each bit is like a slot that can be either empty (0) or occupied (1), and together the slots can represent numbers, such as the number 2 (binary 010). In contrast, quantum computing harnesses quantum bits, or qubits: infinitesimal particles, such as electrons or photons, whose quantum properties, such as spin or polarization, encode information. Distinct from their classical counterparts, qubits can exist in a superposition of states, meaning they can represent both 0 and 1 simultaneously. Unlike bits, whose slots are either empty or occupied, each qubit can be both empty and occupied at the same time, allowing a register to represent multiple numbers concurrently. While a bit register can represent only the number 2 (010), a qubit register can represent both the numbers 2 and 4 (010 and 100) simultaneously.
This superposition of states enables the parallel processing of information since multiple numbers in a qubit register can be processed at one time. For example, a classical computer may use two different bit registers to first add the number 2 to the number 4 (010 +100) and then add the number 4 to the number 1 (100+001), performing the calculations one after the other. In contrast, qubit registers, since they can hold multiple numbers at once, can perform both operations—adding the number 2 to the number 4 (010 + 100) and adding the number 4 to the number 1 (100 + 001)—simultaneously.
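This idea can be made concrete with a short classical simulation. The sketch below is not a real quantum program; it simply models a 3-qubit register as a vector of eight amplitudes using NumPy, places the numbers 2 and 4 in superposition, and applies a single "add 1" operation that updates both numbers at once, illustrating the parallelism described above.

```python
import numpy as np

# A 3-qubit register is a vector of 2**3 = 8 complex amplitudes,
# one for each basis state |000>, |001>, ..., |111>.
state = np.zeros(8, dtype=complex)

# Superposition of |010> (the number 2) and |100> (the number 4),
# each with amplitude 1/sqrt(2).
state[2] = state[4] = 1 / np.sqrt(2)

# "Add 1 mod 8" expressed as a permutation matrix (a reversible, unitary map).
add_one = np.zeros((8, 8))
for n in range(8):
    add_one[(n + 1) % 8, n] = 1

# A single matrix application updates every number in the superposition at once.
new_state = add_one @ state

# The register now holds 3 (|011>) and 5 (|101>) simultaneously.
for n, amp in enumerate(new_state):
    if abs(amp) > 1e-9:
        print(f"|{n:03b}> (the number {n}) with amplitude {amp.real:.3f}")
```

One increment touched both branches of the superposition in a single step; a classical register would need two separate additions.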
Moreover, qubits exhibit two further quantum properties, entanglement and interference, that allow intricate computations to be executed with an efficiency unattainable by classical computers. Entanglement correlates the states of qubits so that operations on one qubit are linked to the states of its entangled partners, letting a quantum algorithm coordinate information across an entire register. Interference, in turn, adjusts the probability amplitudes of the many possibilities held in superposition, amplifying paths that lead toward the correct answer and canceling those that do not. Collectively, these attributes equip quantum computers to confront challenges that would otherwise remain insurmountable for conventional computing systems, radically disrupting the field of computing and every field that depends on it.
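Both properties can be illustrated with the same kind of classical state-vector simulation. In this NumPy sketch, a Hadamard gate followed by a CNOT gate entangles two qubits into a Bell state, in which only the correlated outcomes 00 and 11 can ever be observed, and applying the Hadamard gate twice shows interference: the amplitudes for the 1 outcome cancel, returning the qubit to 0.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                  # flips the second qubit
                 [0, 1, 0, 0],                  # when the first qubit is 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Entanglement: starting from |00>, apply H to the first qubit, then CNOT.
state = np.zeros(4)
state[0] = 1                                    # the state |00>
bell = CNOT @ np.kron(H, I2) @ state

# Only |00> and |11> carry amplitude (0.707 each): measuring one qubit
# immediately determines the other.
print(np.round(bell, 3))

# Interference: applying H twice returns |0> because the two paths
# to |1> have opposite amplitudes and cancel out.
print(np.round(H @ H @ np.array([1.0, 0.0]), 3))
```

The Bell state assigns zero amplitude to the uncorrelated outcomes 01 and 10, which is the correlation that quantum algorithms exploit.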
Quantum computing embodies a transformative leap for AI, providing the capacity to process large data sets and complex algorithms at unprecedented speeds. This transformative technology has far-reaching implications in fields like cryptography, drug discovery, financial modeling, and numerous other disciplines, as it offers unparalleled computational power and efficacy. For example, a classical computer using the General Number Field Sieve (GNFS) algorithm might take several months or even years to factorize a 2048-bit number. In contrast, a quantum computer using Shor's algorithm (a quantum algorithm) could potentially accomplish this task in a matter of hours or days. This capability could be used to break the widely used RSA public-key encryption system, which would take conventional computers tens or hundreds of millions of years to break, jeopardizing the security of encrypted data, communications, and transactions across industries such as finance, healthcare, and government. Leveraging the unique properties of qubits, including superposition, entanglement, and interference, quantum computers are equipped to process vast amounts of information in parallel. This capability enables them to address intricate problems and undertake calculations at velocities that, in certain but not all cases, surpass those of classical computers by orders of magnitude.
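Why does Shor's algorithm threaten RSA? It reduces factoring to order-finding: given a number coprime to N, find the smallest r with a^r ≡ 1 (mod N), and the factors follow from a greatest-common-divisor computation. The toy sketch below runs that reduction entirely classically on a small number; the order-finding step is done by brute force here, which is precisely the step a quantum computer performs exponentially faster.

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r == 1 (mod n), found by brute force here;
    this is the step Shor's algorithm accelerates on quantum hardware."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical skeleton of Shor's reduction: factoring via order-finding."""
    assert gcd(a, n) == 1, "a must be coprime to n"
    r = order(a, n)
    if r % 2:
        return None              # odd order: retry with a different a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None              # trivial square root: retry with a different a
    return gcd(y - 1, n), gcd(y + 1, n)

# Factoring 15 with a = 7: the order of 7 mod 15 is 4, and
# gcd(7**2 - 1, 15) and gcd(7**2 + 1, 15) yield the factors 3 and 5.
print(shor_classical(15, 7))     # (3, 5)
```

For a 2048-bit RSA modulus, the brute-force loop above is hopeless; the quantum speedup applies only to the order-finding step, while the surrounding arithmetic remains classical.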
The augmented computational capacity of quantum computing promises to significantly disrupt various AI domains, including quantum machine learning, natural language processing (NLP), and optimization problems. For instance, quantum algorithms can expedite the training of machine learning models by processing extensive datasets more efficiently, enhancing performance and accelerating model development. Furthermore, quantum-boosted NLP algorithms may yield more precise language translation, sentiment analysis, and information extraction, fundamentally altering how we engage with technology.
Patent Applications Related to Quantum Computers
While quantum computers remain in their nascent phase, to date, the United States Patent and Trademark Office has received more than 6,000 applications directed to quantum computers, with over 1,800 applications being granted a United States patent. Among these applications and patents, IBM emerges as the preeminent leader, trailed by companies including Microsoft, Google, and Intel, all recognized as significant contributors to the field of AI. For instance, Microsoft is a major investor in OpenAI (the developer of ChatGPT), has developed Azure AI (a suite of AI services and tools for implementing AI into applications or services), and is integrating ChatGPT into products like Bing and Microsoft 365 Copilot. Similarly, Google has created AI breakthroughs such as AlphaGo (the AI that defeated the world champion of the board game Go), hardware like tensor processing units (TPUs) for accelerating machine learning and deep learning tasks, and its own chatbot, Bard (powered by LaMDA).
FIG. 1 Current assignments for patents and patent applications related to “Quantum Computers.”
Patents Covering Quantum Computing
The domain of quantum computing is progressing at a remarkable pace, as current research seeks to refine hardware, create error correction methodologies, and investigate novel algorithms and applications. IBM and Microsoft stand at the forefront of this R&D landscape in quantum computing. Both enterprises have strategically harnessed their research findings to secure early patents encompassing quantum computers. Nevertheless, this initial phase may merely represent the inception of a competitive race to obtain patents in this rapidly evolving field.
Quantum computing signifies a monumental leap forward for AI, offering unparalleled computational strength and efficiency. As we approach the limits of Moore's Law, the future of AI is contingent upon the harnessing of qubits’ distinctive properties, such as superposition, entanglement, and interference. The cultivation of quantum machine learning, along with its applications in an array of AI domains, including advanced machine learning, NLP, and optimization, portends a revolution in how we address complex challenges and engage with technology.
Prominent tech companies like IBM and Microsoft have demonstrated their commitment to this burgeoning field through investments and the construction of patent portfolios that encompass this technology. The evident significance of quantum computing in shaping the future of AI suggests that we may be witnessing the onset of a competitive patent race within the sphere of quantum computing.
This article was previously published.
Endnotes
1. For example, the Apple M1, announced in 2020.
2. Multicore processors and GPUs have helped extend Moore's Law by incorporating more processing units on a single chip, effectively increasing the overall transistor count and computational power. As traditional methods of scaling down transistors face physical limits, leveraging parallelism through multiple cores and specialized GPU architecture has become a key factor in maintaining the growth of computing performance.
3. Physical limits include electron leakage (caused by quantum tunneling, where electrons can "tunnel" through barriers that they should not be able to pass through, causing increased power consumption and possible errors in computation) and heat dissipation (which occurs because as transistors become smaller and more densely packed, it becomes increasingly difficult to dissipate the heat generated during their operation, which can lead to overheating and performance degradation).
4. Enabling the efficient breaking of classical cryptographic schemes while simultaneously driving the development of quantum-resistant algorithms and secure quantum communication protocols for enhanced data protection.
5. Efficiently simulating molecular interactions and chemical reactions, enabling the identification of new pharmaceutical compounds and optimization of drug properties at a faster pace than traditional methods.
6. Rapidly solving complex optimization problems, improving risk analysis, and enhancing portfolio management, leading to more accurate predictions and better decision-making tools for financial institutions.
7. For example, classical computers can be faster when doing simple arithmetic and basic calculations, text processing and data management, and running most everyday applications (like web browsers and office suites).
Diego is an associate in Levenfeld Pearlstein’s IP group. He concentrates his practice assisting clients with protecting their patent and trademark rights across various technical areas, including artificial intelligence, fintech, blockchain, and medical devices. Before attending law school, he designed medical devices as an Edison Engineer at GE Healthcare.