
Quantum Computing Overview - Part#1

By Sachin Tah


Google has just unveiled its groundbreaking next-generation quantum chip, Willow, which sets new performance records for quantum hardware. This release drastically reduces processing errors and delivers the kind of error correction researchers have been chasing for the past 30 years.


According to Google Quantum AI's director of hardware, Julian Kelly, Willow completed a benchmark computation in under five minutes that would take one of today's fastest supercomputers a mind-boggling 10 septillion (that's 10^25) years!


So, what exactly is a quantum chip or quantum computer, and how does it work? How can I program quantum computers? What kind of applications can I develop on them? In this blog, we will explore quantum computing internals at a high level and look at possible use cases for quantum processing.


Quantum Computing


Quantum computers are the next big thing in the technology industry, and they will redefine almost everything about how we embrace technology. However, they are nowhere close to replacing classical computers because they can perform only specific computations.


To comprehend computations at the quantum level, it is essential to grasp some fundamental principles of quantum mechanics. While this blog is not intended to serve as a physics lecture and I am not a physics expert, understanding a few basic terms and concepts will be beneficial.


Quantum (derived from the Latin word for "amount" or "quantity") signifies the minimum discrete unit of a physical entity. A photon is the quantum of light, and an electron carries the quantum of electric charge. Quantum mechanics operates at the atomic and subatomic levels.


At these extremely small scales, atoms and particles exhibit properties such as superposition, entanglement, and the uncertainty principle. Real-world applications built on these principles include lasers, transistors, and, of course, quantum computers.


Quantum computers use the principles of quantum mechanics to solve computational problems, and you need specialized hardware to run such computations. For certain problems, these computations are exponentially faster than on conventional silicon-based computers. Quantum computers rely on the superposition and entanglement of qubits to achieve the desired results.


Classical and Quantum Computers


To better understand quantum computers, it is helpful to compare them to their predecessors, the traditional silicon-based computers.


Processing Unit/Hardware

As you know, the heart of a conventional computer is the central processing unit (CPU), consisting of three logical units: the control unit, the arithmetic logic unit (ALU), and the memory unit.


Conventional Computer

All computations are performed by the CPU under the direction of the control unit. The data being processed is held in the memory unit, and the ALU carries out all types of calculations. Other parts of the computer system, such as storage units, IO devices, and network interfaces, enable us to interact with the computer.


Quantum computers are very different from conventional PCs, as you can see from the diagram below.


Quantum Computer

A conventional PC is still used to assign processing tasks or push algorithms inside a quantum computer. This computer is responsible for communicating with the control processor and initiating quantum operations. It translates algorithms into quantum instructions that can be executed on quantum machines and also serves as an interface for reading quantum results.


The control processor manipulates the physical qubits and logic gates in the quantum processor by sending signals. The control and measurement layer drives the physical qubits; each physical qubit is wired to this layer, and the relevant qubits receive signals while an algorithm executes. The quantum layer is where qubits are actually created and their quantum states are maintained. Changes in qubit states are read back by the control and measurement layer, and the results are returned to the classical computer as output.


Temperature control and cryogenic units keep the quantum hardware at extremely low temperatures. Qubits are extremely sensitive, and even a slight temperature change can alter a qubit's state and produce incorrect results.


Data Units

Data units are used to quantify the size of digital information; real-world examples are KB, MB, and GB.


1s & 0s

In conventional computing, the smallest unit of data is the bit, with 8 bits making up 1 byte. A bit stores a binary value, either 0 or 1, depending on the state it signifies: 0 indicates 'off,' while 1 indicates 'on.' A combination of these bits conveys meaningful information; for example, the digit 9 is represented as 1001.
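
To see this mapping concretely, here is a tiny Python snippet (purely illustrative) that prints the bit pattern of the digit 9:

    # 4-bit binary representation of the decimal number 9
    print(format(9, '04b'))  # prints "1001"

    # a byte is 8 bits, so the same value padded to a full byte
    print(format(9, '08b'))  # prints "00001001"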


Bits are stored on silicon chips using electrical currents within memory cells, and all operations are conducted using these bits only.



|ψ⟩ = α|0⟩ + β|1⟩

Quantum computers, on the other hand, use qubits as their smallest data unit. Qubits also take the values 0 and 1; however, there is one significant difference that changes everything.


Qubits can hold a combination of both values at the same time, existing in multiple states at once; this is termed superposition. Unlike silicon bits, qubits can be realized using trapped ions, superconducting circuits, or photons.


The most common qubit technologies today are trapped-ion and superconducting qubits, with topological qubits under active research. Qubits can be controlled and manipulated in various ways, for example with microwave pulses, lasers, and applied voltages.


Logic Gates

If you have an engineering background, you are likely familiar with logic gates. In traditional computer processing, logic gates are integral to every computation, ranging from basic addition to executing complex machine learning algorithms. They serve as the fundamental building blocks of any electronic device or computer circuit and are typically constructed using transistors.


A conventional CPU, the Apple M4 for example, packs around 28 billion transistors to perform its operations.

Conventional Logic Gates

Logic gates receive binary inputs and produce a single binary output. They implement Boolean algebra, where results are either 0 or 1 (bits). Conventional logic gates work in only one direction: inputs go in and a fixed output comes out.


For example, the simple NOT gate outputs 1 if the input is 0 and outputs 0 if the input is 1.


Similar to what we have in the traditional computing world, operations on qubits are performed using quantum logic gates. There is one difference, though: in the quantum world, the available logic gates are platform (hardware) specific.


As you know, quantum computers can be implemented using approaches such as superconducting circuits, trapped ions, photonics, and topological qubits. Each implementation provides native gates that may not be available on the others. Some of the best-performing hardware today uses the superconducting technique, typically built from superconducting materials such as niobium and titanium.


I am not going to cover all qubit gates here, but below are some gates available on superconducting and trapped-ion hardware, followed by a short Qiskit sketch showing a few of them in action.


Superconducting

  • X Gate (Pauli-X) - Flips the state of a qubit (the quantum analogue of the classical NOT gate), swapping the amplitudes of |0⟩ and |1⟩

  • Y Gate (Pauli-Y) - Applies both a bit flip and a phase flip

  • Z Gate (Pauli-Z) - Applies a phase flip, mapping |+⟩ to |−⟩ and vice versa

  • Hadamard Gate (H) - Creates superposition states.

  • CNOT Gate (Controlled-NOT) - Two-qubit gate, flips the second qubit if the first qubit is in state |1⟩.

  • SWAP Gate - Swaps the states of two qubits.


Trapped Ions

  • Mølmer-Sørensen Gate - A two-qubit entangling gate.

  • Rydberg Gate - Uses highly excited (Rydberg) states for qubit interactions; more commonly associated with neutral-atom platforms

  • Sideband Gate - Uses sideband transitions for qubit manipulation.
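
As a minimal sketch (assuming the open-source Qiskit Python library is installed), here is how a few of the gates listed above can be applied to a small circuit; the method names are Qiskit's, and other frameworks expose equivalent operations:

    from qiskit import QuantumCircuit

    # Three-qubit circuit demonstrating some of the gates listed above
    qc = QuantumCircuit(3)
    qc.x(0)        # Pauli-X: flip qubit 0 from |0> to |1>
    qc.h(1)        # Hadamard: put qubit 1 into superposition
    qc.z(1)        # Pauli-Z: phase flip on qubit 1
    qc.cx(0, 2)    # CNOT: flip qubit 2 because qubit 0 (the control) is |1>
    qc.swap(1, 2)  # SWAP: exchange the states of qubits 1 and 2

    print(qc.draw())  # text drawing of the circuit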


Quantum Properties


What precisely distinguishes quantum computers from conventional computers, and why do they perform at exponentially higher speeds? To comprehend this, we must revisit certain quantum properties that enable quantum computers to surpass traditional computing systems.


Superposition

Quantum Superposition

As discussed previously, a qubit is different from a bit because it can exist in multiple states at any point in time, which is termed superposition. A bit is either 0 or 1; the state of a qubit, however, is described by the expression

 

   |ψ⟩ = α|0⟩ + β|1⟩


where |ψ⟩ is the state of the qubit and α and β are complex amplitudes whose squared magnitudes give the probabilities of measuring 0 and 1, with |α|² + |β|² = 1.


It is important to understand that the state of a qubit can be maintained in a superposition state only while a quantum system is unobserved or not measured. Once measured, the state collapses into one of the basis states.
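
To make this concrete, here is a small sketch (assuming Qiskit is installed): applying a Hadamard gate to |0⟩ gives α = β = 1/√2, so a measurement returns 0 or 1 with equal probability.

    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    qc = QuantumCircuit(1)
    qc.h(0)  # Hadamard puts the qubit into an equal superposition

    state = Statevector.from_instruction(qc)  # (|0> + |1>)/sqrt(2)
    print(state)                       # both amplitudes are about 0.707
    print(state.probabilities_dict())  # {'0': 0.5, '1': 0.5}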


Entanglement

Qubit Entanglement

Entanglement is a fundamental aspect of quantum mechanics. In the field of quantum computing, entanglement allows two qubits to maintain correlated states. This implies that modifying the quantum state of one qubit will, in turn, alter the state of its entangled counterpart. Furthermore, measuring the state of one qubit not only collapses its superposition state but also the state of the entangled qubit.


This characteristic facilitates parallelism in quantum computers, enabling the manipulation of multiple qubits in a single operation rather than handling each qubit individually. The controlled-NOT (CNOT) gate primarily helps create entangled qubits. It operates on two qubits, with one as control and the other as target.
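
As a small illustration (again assuming Qiskit), a Hadamard followed by a CNOT produces the entangled Bell state (|00⟩ + |11⟩)/√2. Sampling it only ever yields 00 or 11, never 01 or 10, which is exactly the correlation described above:

    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    bell = QuantumCircuit(2)
    bell.h(0)      # put the control qubit into superposition
    bell.cx(0, 1)  # entangle: the target qubit now follows the control

    state = Statevector.from_instruction(bell)
    print(state.sample_counts(shots=1000))  # roughly {'00': 500, '11': 500}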


Interference

Quantum Interference

Interference is a quantum phenomenon, leveraged in quantum computing, that arises from the wave-like properties of quantum states. As the name implies, interference can be used to amplify or suppress certain outcomes to achieve the desired results.


Quantum algorithms leverage interference to enhance the probability of correct answers while diminishing the likelihood of incorrect ones. Qubits in superposition interact like waves, resulting in either the amplification (constructive interference) or cancellation (destructive interference) of parts of a quantum state, and we can use both effects to increase the probability of the desired results. Interference is used in various ways to shape and control quantum states and to filter out unwanted results.
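
A simple way to see interference at work (a sketch, again using Qiskit): applying the Hadamard gate twice returns the qubit to |0⟩, because the two paths leading to |1⟩ carry opposite phases and cancel, while the paths leading to |0⟩ add up.

    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    qc = QuantumCircuit(1)
    qc.h(0)  # split the state into a superposition
    qc.h(0)  # recombine: amplitudes for |1> cancel, amplitudes for |0> reinforce

    state = Statevector.from_instruction(qc)
    print(state.probabilities_dict())  # {'0': 1.0} -- destructive interference removed |1>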


Programming Quantum Hardware


Quantum Providers

In the modern era, the most efficient way to engage with new tools and technologies is through online platforms and software-as-a-service (SaaS) models. This method eliminates the need for substantial upfront investments in hardware or platforms. Fortunately, nearly all leading cloud providers offer quantum hardware as part of their service offerings.


Below is a list of various cloud providers that offer quantum hardware in the SaaS model.


  • IBM Quantum offers the IBM Q Experience, which includes quantum hardware and simulators accessible via the Python-based Qiskit framework.

  • Google Quantum AI provides access to quantum computing hardware and the open-source platform Cirq for building and testing quantum algorithms.

  • Microsoft Azure Quantum provides access to quantum hardware, software, and solutions, and supports popular quantum SDKs like Q#, Qiskit, and Cirq.

  • Amazon Braket allows users to build quantum applications and run algorithms on quantum computers.

  • Alibaba Cloud offers access to an 11-qubit quantum computer and educational resources for scientific researchers.

  • D-Wave Leap provides a cloud-based quantum processor and hybrid solver service combining quantum and classical resources.

  • Xanadu Cloud offers free access to photonic quantum computers and software support.


To execute quantum algorithms, much like any processor, there exists a collection of quantum instruction sets that convert high-level instructions into physical commands. Additionally, high-level programming languages are available to facilitate the design of programs and algorithms on a quantum computer. These high-level languages abstract the complexities of the underlying hardware and instruction sets, allowing programmers to concentrate on developing algorithms without concern for the underlying system. This is analogous to writing software code in a high-level language such as Java, which can operate on any processor with any instruction set. 


Although there are multiple frameworks, libraries, and languages available for programming quantum computers, here are some of the most popular (a short example follows the list):


  1. Qiskit - Open-source Python framework developed by IBM

  2. Cirq - Python library created by Google for running quantum code on Google quantum computers

  3. Q# - Developed by Microsoft for writing quantum algorithms

  4. Braket SDK - Developed by AWS to write Python code which runs on AWS Braket

  5. PyQuil - Developed by Rigetti Computing; it is a Python library for Quil, a quantum instruction language designed for hybrid quantum-classical algorithms
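
To give a feel for these frameworks, here is the Bell-state circuit from the entanglement section written with Google's Cirq (a sketch, assuming cirq is installed); the structure is nearly identical to the Qiskit version, only the API names change:

    import cirq

    q0, q1 = cirq.LineQubit.range(2)
    circuit = cirq.Circuit(
        cirq.H(q0),                     # superposition on the first qubit
        cirq.CNOT(q0, q1),              # entangle the pair
        cirq.measure(q0, q1, key='m'),  # measure both qubits
    )

    result = cirq.Simulator().run(circuit, repetitions=1000)
    print(result.histogram(key='m'))  # counts only for 0 (binary 00) and 3 (binary 11)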


Quantum Use Cases


Quantum computers are not going to replace traditional computers in any way, at least not anytime soon. However, they can solve certain very specific problems at remarkable speed.


Quantum computers can typically solve problems related to optimization, large-scale simulation, and certain types of cryptography.


Below are some examples of where quantum computers can be used:


  • ML & AI - Pattern identification and analysis in large datasets

  • Cryptography - Breaking current algorithms, thereby motivating stronger security and cryptography

  • Climate Predictions - Better understanding and prediction of climate change

  • Resource Management - Optimizing resources, be it water, energy, or gas

  • Drug Discovery - Simulating complex molecular structures, which can speed up the process of drug discovery


Quantum computers can be utilized in various scenarios where traditional computers are either unable to process the information or fail to provide optimized and timely results.


In my upcoming blog, I will discuss a hands-on experiment with quantum computers. Subscribe to receive notifications.


Some images are AI-generated.


Sachin Tah


