- Category: Volume 84 (Fall 2012 - Spring 2013)
- Published: 14 November 2012
- Written by SHAHARYAR AHMAD | SCIENCE EDITOR

If you have glanced at the specifications of a typical personal computer over the past few years, you have probably noticed the exponential rate at which many of the computer’s components are improving.

This trend reflects an observation made in 1965 by Intel cofounder Gordon Moore. Known as Moore’s Law, it states that the number of transistors per square inch on an integrated circuit doubles roughly every two years, according to intel.com.
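The doubling described by Moore’s Law can be sketched in a few lines of Python. The starting count and year below are hypothetical round numbers chosen for illustration, not Intel’s actual figures.

```python
def transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count assuming a doubling every `doubling_period` years."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Starting from a hypothetical 1,000,000-transistor chip in 1990:
for year in (1990, 2000, 2010):
    print(year, int(transistors(1_000_000, 1990, year)))
```

Ten years is five doubling periods, so the count grows by a factor of 2^5 = 32 per decade under this model.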

Transistors are semiconductor devices that serve as the fundamental components of most electronics. They can act as amplifiers, controlling a large electrical output signal with small changes to an input signal (much the way a small amount of effort on a faucet handle releases a large volume of water). Transistors can also act as switches, opening and closing very quickly to regulate the current flowing through an electrical circuit.

An analysis of typical personal computer specifications over the past decade shows an increase in RAM from 256 MB to 4 GB and in hard-drive space from 50 GB to 500 GB. With respect to storage capacity, we went from storing a few word-processed documents on 3.5-inch floppy discs (R.I.P.), which held 720 KB and later 1.4 MB, in the 90s and early 2000s.

Then CD-Rs arrived with upwards of 700 MB of storage, giving way to DVD-Rs with 4.7 GB, and eventually dual-layer DVD-Rs with 8.5 GB. The recent Blu-Ray discs boast a capacity of upwards of 25 GB for a single layer and 50 GB for dual-layer Blu-Ray – enough to hold upwards of 9 hours of high-definition video, and roughly a 3.7-million-percent increase in storage capacity over that of floppy discs.
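That percentage is easy to check: divide the dual-layer Blu-Ray capacity by that of a late-model floppy disc.

```python
# Comparing a 1.4 MB floppy disc to a 50 GB dual-layer Blu-Ray disc.
floppy_mb = 1.4
bluray_mb = 50 * 1024  # 50 GB expressed in MB

percent_increase = (bluray_mb - floppy_mb) / floppy_mb * 100
print(f"{percent_increase:,.0f}%")  # roughly 3.7 million percent
```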

Nevertheless, the numbers are continuing to increase.

One component that seems to have hit a wall is processing power. Even though the processing power of the average PC has improved impressively – quad-core processors running at 2.3 GHz, compared with the single-core 800 MHz Celeron processors of the turn of the millennium – its rate of increase seems to have slowed relative to the continual expansion of memory and storage described above.

A new type of computer would process information not in the binary fashion based on the logic states of “on” and “off,” but with quantum bits, known as qubits, which can exist in a superposition of both states. According to the New York Times, this would allow for comparatively lightning-fast processing speeds, enhancing computational capabilities far beyond contemporary norms.
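A minimal sketch of what a superposition means, using NumPy: a qubit can be represented as a two-component complex vector, and the probability of each measurement outcome is the squared magnitude of the corresponding amplitude. This is a toy state-vector model, not a quantum-computing library API.

```python
import numpy as np

# The two classical logic states, "off" and "on", as basis vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition of both states: (|0> + |1>) / sqrt(2)
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- a 50/50 chance of reading "off" or "on"
```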

This “quantum computer” poses an exciting endeavor for many research labs across the world. If successfully developed, it could provide ground-breaking ways of analyzing information quickly and efficiently.

According to the Clay Mathematics Institute, one form of cryptography known as RSA uses very large numbers to encode information via their prime factors. Prime factorization involves breaking such a number down into a product of prime numbers, and as the numbers get larger, it takes an increasingly long time to find the factors.
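Why factoring gets slow is visible even in the simplest method. The trial-division sketch below is illustrative only – real RSA moduli are hundreds of digits long, far beyond its reach – but it shows how the work grows with the size of the number.

```python
def prime_factors(n):
    """Factor n into primes by trial division -- simple, but hopeless for RSA-sized numbers."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime factor completely
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(15))     # [3, 5]
print(prime_factors(65537))  # [65537] -- already prime
```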

With enhanced quantum computation capabilities, the reduced time needed to factor such numbers would make RSA decryption dramatically faster. According to the New York Times, “Where quantum computers could produce an answer in days or maybe even seconds, the fastest conventional computer would take longer than 13.7 billion years.”

“If they can do this, I think the computer may eventually surpass human intellect because with the way the new web is advancing, computers could start thinking for themselves,” said junior Mike Hamilton.

Future progress in quantum computing could revolutionize the now-nascent fields of nanotechnology and drug design, enabling multidisciplinary approaches that solve epidemiological problems in cost- and time-efficient ways.

The laws obeyed by the subatomic world are quite different from the macroscopic one we are accustomed to observing. “It would be silly to speak of a ‘minus 30 percent chance of rain tomorrow,’” said Dr. Scott Anderson, electrical engineering and computer science professor at MIT according to the New York Times.

Quantum mechanics is based on numbers called amplitudes, which are closely related to probabilities but, being complex numbers, can be negative. If a photon hits a screen, said Anderson, it could do so with a positive amplitude in one way and a negative amplitude in another. The two amplitudes can interfere destructively and cancel each other out, so that the event never happens at all.
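That cancellation can be illustrated numerically; the amplitudes below are hypothetical, chosen so that the two paths cancel exactly.

```python
# Two ways the photon can reach the same spot, with opposite amplitudes.
amp_path1 = 0.5    # positive amplitude for one path
amp_path2 = -0.5   # negative amplitude for the other

total_amplitude = amp_path1 + amp_path2
probability = abs(total_amplitude) ** 2  # probability is the squared magnitude

print(probability)  # 0.0 -- the paths cancel, so the event never happens
```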

A critical goal in quantum computing is to “choreograph a computation so that the amplitudes leading to wrong answers cancel each other out, while the amplitudes leading to the right answers reinforce,” said Anderson.

Dr. Dmytro Kosenkov, physical chemistry professor at the University, said “We live in an era of the dawn of quantum computing. New machines are potentially capable of revolutionizing the field of information searching and cryptography. Now it is too early to talk about producing consumer quantum computers. Quantum computers, currently available in research labs, are able to process just a few quantum bits of information and store them only for a very short period of time – no longer than 100 microseconds (millionths of a second).”

“However, the situation is paradoxical: while there are no available quantum computers, the languages for quantum computations have been already developed (for example QCL – Quantum Computation Language),” said Dr. Kosenkov. “Once a computer is made, software will be immediately created. Nowadays, quantum computers make a great engineering challenge, while the theory is done.”
