Whether it be through our PCs or the smart devices we carry around, computing controls almost everything we do in the modern world. Futurists view computational power and related capacities as key indicators of technological advancement. The computational explosion of data will have a direct impact on the fundamental pillars of society, such as healthcare, security, communications, transportation, and energy.
Classical computing, the traditional binary, control-flow, stored-program computer model, has evolved over the last few decades from room-sized machines to microchips. In Industry 4.0, new classes of advanced computing have appeared, and their utility and functionality are expanding. These emerging technologies aim to accelerate our ability to expand and improve upon our inherent human capacities.
Undoubtedly, the field of computing has experienced tremendous progress since the development of the electronic calculator in the 1960s. In our hyperconnected society, information processing has undergone revolutionary changes. According to futurist Ray Kurzweil, processing power doubles every two years on average, and humans will be able to “expand the scope of our intelligence a billion-fold.”
A decade ago, we could not have envisaged the reality of cognitive computing, but recent advances in physics and nanotechnologies have made it possible. Below are some examples of the areas where computing paradigms are rapidly changing:
Traditional or Classical computing:
Traditional or classical computing has passed through many materials-science iterations: vacuum tubes, transistors, microprocessors, and integrated circuits. It is predicated on the idea that an electrical circuit exists in only one state at any given time: either on or off. Arithmetic and Logic Units (ALUs), processor registers, and control units make up the Central Processing Unit (CPU), which handles data processing in traditional computers. In a classical computer, all processing uses the same two-level logic (high voltage/low voltage), and information is stored in memory as bits that can be either 1 or 0 (a binary system).
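The two-state, gate-based model described above can be sketched in a few lines of Python. The half adder below is a hypothetical minimal example (not drawn from the article): it is the simplest building block of an ALU, combining two logic gates to add a pair of bits.

```python
# Classical logic: every value is a pattern of bits, and computation is
# built from deterministic gates that map input bits to output bits.

def AND(a: int, b: int) -> int:
    return a & b

def XOR(a: int, b: int) -> int:
    return a ^ b

# A half adder, the simplest ALU building block: it adds two bits,
# producing a sum bit and a carry bit.
def half_adder(a: int, b: int) -> tuple[int, int]:
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # 1 + 1 = binary 10: sum bit 0, carry bit 1
```

Chaining half adders (plus OR gates) into full adders is how classical hardware builds up arithmetic on multi-bit binary numbers.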
Analog Computing:
Analog computers may process inputs and provide meaningful outputs without requiring the inputs to be translated into any specific computer language. Analog computers represent numbers using easily observable quantities like voltages or rotations rather than codes, programming languages, or algorithms. Common examples of analog computers that we use today are thermometers, speedometers, and voltmeters.
Super-Computing:
A few things set supercomputers apart. In contrast to traditional computers, they typically include many CPUs (central processing units), each comprising circuits for decoding program instructions and carrying out logical and mathematical operations in the correct order. Supercomputers differ from mainframe computers in that they emphasize raw computational speed, pairing powerful processing capabilities with large amounts of data storage.
Exascale computing is a measure of supercomputer capability. Both classical and exascale computers are built on bits. Computing systems capable of performing at least “10^18 IEEE 754 Double Precision (64-bit) operations (multiplications and/or additions) per second (exaFLOPS)” are referred to as exascale computing systems.
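To make the 10^18 figure concrete, a quick back-of-the-envelope calculation helps; the 1 GFLOPS "laptop" baseline below is an illustrative assumption, not a benchmark from the article.

```python
# Exascale threshold: at least 10**18 double-precision operations per second.
EXAFLOPS = 10**18         # operations per second at one exaFLOPS
LAPTOP_FLOPS = 10**9      # hypothetical 1 GFLOPS machine (assumption)

# Time the hypothetical laptop would need to match ONE second of exascale work.
seconds = EXAFLOPS / LAPTOP_FLOPS
years = seconds / (365 * 24 * 3600)
print(f"{years:.1f} years")  # roughly 31.7 years
```

In other words, a second of exascale computation represents decades of work for an ordinary single machine, which is why these systems are reserved for the most demanding scientific problems.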
For essential jobs, supercomputing and its corresponding high-performance computing have emerged as the means for addressing complex problems at greater speed. The Frontier supercomputer at Oak Ridge National Laboratory remains the fastest computer in the world and can compute at 1.102 quintillion operations per second (1.102 exaFLOPS).
An effective explanation of HPC may be found on Techtarget.com, which defines it as “the use of parallel processing for running advanced application programs efficiently, reliably, and quickly. Academic institutions, engineers, and scientists are the main consumers of HPC systems. A few governmental organizations, mostly the military, also use HPC for sophisticated applications.” Supercomputing and high-performance computing (HPC) go hand in hand, since both require pooling computer resources to solve problems.
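The parallel-processing idea in that definition can be illustrated with Python's standard multiprocessing module. This toy sketch (the chunk boundaries and the sum-of-squares workload are arbitrary choices, not from the article) splits one large computation across a pool of workers, just as an HPC cluster distributes work across many processors.

```python
# Split a large computation into independent chunks and run them in parallel.
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum of squares over the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    chunks = [(0, 250_000), (250_000, 500_000),
              (500_000, 750_000), (750_000, 1_000_000)]
    with Pool(4) as pool:                      # four worker processes
        total = sum(pool.map(partial_sum, chunks))
    # The parallel result matches the serial computation.
    assert total == sum(i * i for i in range(1_000_000))
    print(total)
```

Real HPC codes use the same divide-and-combine pattern, but across thousands of nodes with message passing rather than a single machine's process pool.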
Cloud Computing:
Cloud computing is the delivery of data storage and applications from remote servers over the Internet. It provides cost flexibility, mobility, and productivity gains for business users. Operations and commerce depend on the ability to safely store, prioritize, analyze, distribute, and scale that data. Business data is increasingly being moved to public and hybrid clouds, and forecasters expect most data processing tasks to be handled in cloud data centers within the next few years.
To meet the growing demands for storage and analytics, the public and private sectors are constructing bigger data repositories and pooling cloud data centers. Consider that the world's data doubles roughly every two years, and the globe now produces about 2.5 quintillion bytes of data every day.
In terms of cybersecurity, the cloud makes improved firewalling and managed security possible. One of the cloud's main benefits is knowing where the data is stored and who is in charge of its security. Given those functional needs, it makes sense to store data in the cloud or on hybrid clouds.
Edge Computing:
Edge computing is a product of our sensor society, in which everything and anything is connected (often referred to as the Internet of Things). It places computing power and analytics functions close to where data is created.
Edge computing catalyzes the move to data-driven edge infrastructure; it is used to maximize processing speed and reduce bandwidth requirements. Operations and commerce depend on the capacity to safely store, prioritize, analyze, exchange, and scale device data. To reduce latency, edge computing moves operations, data storage, and real-time processing closer to the device itself rather than depending on a central location.
Fog Computing:
Fog computing, also called fog networking or simply “fogging,” is a decentralized computing infrastructure. Data, computation, storage, and applications are positioned in the most sensible and effective locations as cloud computing (the data center) is extended to the edge of a network. Often described as being “out in the fog,” this location sits between the cloud and the data's source.
Quantum Computing:
Civilization is currently on the doorstep of quantum computing, which works by harnessing the special properties of atoms and subatomic particles. To put it simply, quantum computers employ quantum bits, or qubits, rather than the conventional binary bits of ones and zeros, processing data inputs using the distinct characteristics of subatomic particles. Atoms are used in quantum computing because they are physical systems that can exist in both the 0 and 1 states at the same time.
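A qubit's ability to hold 0 and 1 at once can be sketched numerically. This is a toy state-vector simulation in NumPy, not a real quantum device: a Hadamard gate takes the definite state |0⟩ into an equal superposition, so a measurement would yield 0 or 1 with equal probability.

```python
import numpy as np

# A qubit is a 2-element complex vector; |0> is the classical-like "off" state.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

state = H @ ket0            # (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2  # Born rule: probability of measuring 0 or 1
print(probs)                # [0.5 0.5]: equal chance of either outcome
```

A classical bit in this picture would always have probabilities [1, 0] or [0, 1]; the superposition, together with entanglement across many qubits, is what lets quantum machines represent many candidate states at once.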
Quantum computing will enable unprecedented processing speed and predictive analytics for problem-solving. Quantum technology is expected to transform fields such as real-time analytics and cybersecurity. Physicists are designing quantum computers that can outperform conventional computing and calculate at remarkable speeds, enabling a whole new type of analytics and cryptography. Moreover, because quantum computing is non-deterministic, it can in effect explore a large number of possible solutions simultaneously.
David Awschalom is a distinguished scientist at Argonne National Laboratory, the director of the Chicago Quantum Exchange (CQE), the Liew Family Professor in Molecular Engineering and Physics at the University of Chicago, and the head of Q-NEXT, a Department of Energy Quantum Information Science Center. CQE predicts that secure communication will be enabled via metropolitan-scale entangled quantum networks in the next five years. These networks could also be used to build small clusters of quantum machines for high-end computation, and quantum sensors could enhance intracellular sensing, mapping, and clocks. (Source: David Awschalom, Chicago Quantum Exchange)
Robert Liscouski, CEO of Quantum Computing Inc., believes that we will see practical applications of quantum computing in 2024. He is confident that the technology is at a point today where end users (business users, medical researchers, and cybersecurity professionals) will change the conversation from “What can quantum computing do?” to “Look what I can achieve with quantum computing.” He believes that Quantum Computing Inc. will be at the forefront of leading that discussion. (www.quantumcomputinginc.com)
We are living in a time of quantum discoveries in 2024. Although we are still in the early stages of quantum computing, it might happen sooner than we think. The accelerated investment in quantum by both the public and private sectors is an affirmation of what lies on the quantum digital horizon.
Biological Computing:
The cutting-edge field of biological computing uses biological materials to carry out tasks that would typically require materials like fiberglass and copper wire. DNA and amino acids are often utilized as the biological elements in these investigations. Biological computing performs information processing using cells (protein synthesis) as well as DNA, RNA, and proteins to create new cells; these materials can be used to manipulate the natural chemical reactions that perform computations.
Biocomputers might one day store data in the DNA of living cells. With this technology, biocomputers could store vast amounts of data and carry out sophisticated computations that are currently unattainable.
Recently, a biological computer that can monitor various elements in the environment was invented in Israel by Technion researchers. It was built inside a bacterial cell. At the moment, the computer recognizes and reports on many elements, including dangerous ones. “In the live cells, we constructed a kind of biological computer. Circuits do complex calculations in this computer, as they do in other computers,” explained Barger. “Only here, these circuits are genetic, not electronic, and information is carried by proteins and not electrons.”
Additionally, scientists at the National Institute of Standards and Technology (NIST) report that they may have created long-lived biological computers that could survive inside cells. They constructed the computers using RNA. Samuel Schaffter, a postdoctoral researcher at NIST, explained the distinction between biological and classical computing: “instead of coding with ones and zeroes, you write strings of A, T, C, and G, which are the four chemical bases that make up DNA.” (Source: “Upgraded Architecture May Bring Potent Biological Computers from Test Tubes to Cells,” NIST)
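Schaffter's point, writing strings of A, T, C, and G rather than ones and zeros, can be sketched as a simple encoding. The two-bits-per-base mapping below is an illustrative assumption for this sketch, not the actual NIST scheme.

```python
# Illustrative mapping: each DNA base carries two bits of information.
BITS_TO_BASE = {"00": "A", "01": "T", "10": "C", "11": "G"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(bits: str) -> str:
    """Translate a binary string (even length) into a strand of bases."""
    assert len(bits) % 2 == 0
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> str:
    """Recover the binary string from a strand of bases."""
    return "".join(BASE_TO_BITS[base] for base in strand)

strand = encode("0110")
print(strand)                 # "TC"
assert decode(strand) == "0110"
```

Because there are four bases rather than two voltage levels, each symbol carries twice the information of a bit, which is one reason DNA is attractive as an ultra-dense storage medium.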
Optical and Photonic Computing:
Photonic computing uses pulses of light, instead of electrical transistors, to create logic gates for computation. To meet the data processing and transfer requirements of next-generation computing, researchers at Aalto University have created light-based optical logic gates. The ultrafast processing rates of their new optical chirality logic gates are almost a million times faster than those of current technologies. Without a doubt, this is a field in computing to watch. (Source: “Millions of Times Quicker Than Existing Technology: Ultrafast Processing Is Available with a New Optical Computing Approach,” scitechdaily.com)
Chemical computing:
Chemical computing is an additional non-conventional method of computer processing. In nature, chemical systems can function as logic gates to perform computations. Computer scientist Andrew Adamatzky, director of the International Center of Unconventional Computing at the University of the West of England in Bristol, claims that “we are already using chemical computers because our brains and bodies employ communication via the diffusion of mediators, neuromodulators, hormones, etc.” He concludes, “We are chemical computers.” (Source: “The Future of Artificial Intelligence: Chemical Computing,” OpenMind, bbvaopenmind.com)
Spatial Computing:
Spatial computing blends the virtual and physical worlds seamlessly and allows users to interact with computers more naturally and intuitively. Virtual reality, augmented reality, and mixed reality headsets let you experience it. These devices display the real world while incorporating virtual objects into the frame so that they appear three-dimensional. The interface's components are integrated with the environment. Because interaction between the user and the computer system is more natural, spatial computing provides a more engaging user experience.
Human-Computer Interface:
A fascinating prospective area of advancement in AI is human-computer interaction, which has the potential to enhance human cognitive abilities and memory. Significant progress is already being made in brain/computer interface science, including brain mapping and neuromorphic chips. Brain-computer interfaces are an emerging assistive technology in which implantable sensors capture the brain's electrical signals and use them to control external devices.
It has even been demonstrated that a brain-computer interface can interpret thoughts. An electrode plate known as an ECoG (electrocorticography) array is placed in direct contact with the surface of the brain to detect electrical activity. According to Dr. Brian Brown, a professor at the Icahn School of Medicine at Mount Sinai, ECoG now lets people who are paralyzed converse with others by having their thoughts converted into text. (Source: “Can Humans Be Made ‘Super’ by Technology?,” Innotechtoday.com)
The promise of the human-computer interface was summarized in a Frontiers in Science publication involving the cooperation of academic institutions, institutes, and scientists. “We can imagine the possibilities of what may come next with the human-brain-machine interface,” the authors concluded. A human B/CI system enabled by neural nanorobotics could improve human intellect and learning abilities because it would have immediate access to all of the human information kept in the cloud. Furthermore, it could raise fully immersive virtual and augmented reality to previously unheard-of heights, enabling users to express themselves more fully and richly and have more meaningful experiences. These improvements might make it easier for people to use newly developed AI systems as tools for human augmentation, helping to lessen the impact of emerging threats to humankind. (Source: “Human Brain/Cloud Interface,” Frontiers, frontiersin.org)
Most recently, a group of researchers from Stanford University tested a novel brain-computer interface (BCI) that they claim can decipher speech at a rate of up to 62 words per minute, a 3.4-fold increase over the previous record. (Source: “Researchers Report That Users Can Send 62 Words Per Minute Using a Novel Brain-Computer Interface,” futurism.com)
With the aforementioned advancements, artificial synapses that mimic human brain function will direct the next generation of computing. The possibility of a “quantum brain,” built on an intelligent material that changes physically to learn, is also promising. Physicists pursuing the quantum brain have advanced significantly: by patterning and connecting a network of individual atoms, they have shown they can imitate the independent behavior of neurons and synapses. (Source: “The First Steps Toward a Quantum Brain: An Intelligent Material That Changes Physically to Learn,” scitechdaily.com)
Adding the Ingredients of Artificial Intelligence to Computing:
Artificial intelligence (AI) systems aim to transcend human speed and constraints by mimicking human characteristics and computing abilities in a computer. It is already taking place. Artificial intelligence (AI)-enabled computers are mostly used for automation tasks including speech recognition, learning, planning, and problem-solving. By prioritizing and acting on data, AI technologies can make decision-making more efficient, especially in bigger networks with numerous users and factors.
Computing paradigms as we know them will exponentially change when artificial intelligence is combined with classical, biological, chemical, and quantum computing. Artificial intelligence might guide and enhance quantum computing, run in a 5G or 6G environment, facilitate the Internet of Things, and stimulate materials science, biotech, genomics, and the metaverse.
Computers that can execute well beyond a quintillion calculations per second should be available within the next ten years. We will also rely on intelligent computing software solutions to automate knowledge work. Artificial intelligence technologies that improve cognitive performance across all envisioned industry verticals will support our future computing.
Advanced computing has a fascinating and mind-blowing future. It will include computers that can communicate via lightwave transmission, function as a human-machine interface, and self-assemble and teach themselves thanks to artificial intelligence. One day, computers might have sentience.
Emerging advanced computing technologies can bring about significant benefits, but there is also a risk if companies and investors are not ready to adopt them. An industry framework that prioritizes planning, moral policy guidelines, and methodical technological integration is necessary in this situation. Hopefully, that will give us a prescriptive technological framework that keeps us safe and productive.
###
Chuck Brooks is President of Brooks Consulting International. He also serves as an Adjunct Professor at Georgetown University, where he teaches graduate courses on risk management, homeland security, and cybersecurity.
As a thought leader, blogger, and event speaker, Chuck has briefed the G20 on energy cybersecurity, The US Embassy to the Holy See and Vatican on global cybersecurity cooperation. He has served on two National Academy of Science Advisory groups, including one on digitalizing the USAF, and another on securing BioTech. He has also addressed USTRANSCOM on cybersecurity and served on an industry/government Working group for CISA focused on security space systems.
In his career, Chuck has received presidential appointments for executive service by two U.S. Presidents and served as the first Director of Legislative Affairs at the DHS Science & Technology Directorate. He has been named “Cybersecurity Person of the Year” by Cyber Express, Cybersecurity Marketer of the Year, a Top Cybersecurity SME to Follow, and a “Top 5 Tech person to follow” by LinkedIn where he has over 114,000 followers on his profile.
Chuck has an MA from the University of Chicago, a BA from DePauw University, and a Certificate in International Law from The Hague Academy of International Law.