The Evolution of Computation: Making the impossible possible

Evolution: anything and everything from Survival to Free Will


Reyna Bulchandani Year 12

The Lady Eleanor Holles School Middlesex

Shortlisted 10th July 2024

As we near the collapse of Moore’s Law, the rise in computational power slows in the face of ever-increasing demands from larger, faster and more complex algorithms. The growth of AI and machine-learning systems requires exponentially more resources, rapidly outpacing forecasted hardware developments. Biology ingrains in us that evolutionary adaptation is necessary for a species to survive fast-changing environments; it is only natural that our computers mirror this to prevent extinction.

The first step to moving forward, however counter-intuitive it may seem, is looking at and learning from our technological past. Early computers, such as the ENIAC, were built in their most rudimentary forms to automate military calculations as WWII persisted. The state-of-the-art, 27-ton machine fulfilled its purpose, calculating artillery trajectories 2,400 times faster than humans could. However, as the war concluded, the wider potential of computers was recognised, and the ENIAC’s vacuum tubes evolved into transistors, producing smaller, faster machines. Making computing accessible to the general public became the highest priority, spurring the development of integrated circuits (ICs). Each new generation of ICs has shrunk transistor dimensions, exponentially increasing transistor count. This has taken us through the invention of multi-core chips, microprocessors and microcontrollers (to name a few) as we pursued powerful yet more portable applications. Today, phones one-thousandth the size of the ENIAC, carrying out twenty-two billion times more calculations per second, are ubiquitous. Evidently, the acceptance and perceptive application of new and emerging technologies is key to ensuring the successful adaptation of computers. But how exactly could this help inform the future of our computer systems?
Rather than fixating on the same tired selection pressure of how many transistors we can crowd onto one chip, we should embrace new computational techniques. Perhaps surprisingly, the solution may lie in what is presently presumed a hindrance to development. As nanotechnology has pushed transistors towards the atomic scale, electrons have begun to exhibit quantum tunnelling. The oxide layers insulating components have become so thin that electrons leak between them, effectively rendering the transistor’s switch-like function useless; these transistors can never be fully turned off. From our current standpoint, this devastating news signals collapse for faster processors. But from another perspective, it offers boundless opportunity in the advancing field of quantum computing.

‘Qubits’, the fundamental units of information in quantum computers, exist in a superposition of two states at once; a qubit does not have a definite state, but rather a certain probability of assuming either state when it is measured by an observer. We are thereby freed from the constraint of regular binary bits only ever occupying one of two discrete states. Using this phenomenon, unprecedented levels of data manipulation can be reached, producing more efficient and accurate algorithms to revolutionise computation. For example, quantum algorithms for prime factorisation threaten the extinction of contemporary encryption, while quantum data representation in biological modelling could mirror the multi-variate nature of molecular interactions, evolving drug development beyond what current systems can achieve. It is evident that adapting future computers to our needs is crucial as we draw inspiration from the past. As we take our next steps in the world of artificial intelligence, processing demands are estimated to be doubling every three to four months.
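The probabilistic measurement described above can be sketched in a few lines of ordinary code (an illustrative classical simulation only, not real quantum hardware; the amplitude variables and the `measure` function are hypothetical names chosen for this sketch):

```python
import math
import random

def measure(alpha, beta):
    """Collapse the superposition: a qubit with amplitudes (alpha, beta)
    yields outcome 0 with probability |alpha|^2, else outcome 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition: alpha = beta = 1/sqrt(2), so the squared
# amplitudes sum to 1 and each outcome occurs with probability 0.5.
alpha = beta = 1 / math.sqrt(2)

# Repeated measurements ("shots") recover the probabilities empirically.
shots = 100_000
ones = sum(measure(alpha, beta) for _ in range(shots))
print(f"fraction measured as 1: {ones / shots:.3f}")
```

Until it is measured, the qubit is described only by the pair of amplitudes; the definite 0 or 1 appears at measurement, which is the behaviour the simulation's weighted coin-flip imitates.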
To produce the most realistic, human-like models, an exorbitant volume of data must be used to train AI systems - an amount our current hardware cannot keep pace with. As quantum computing offers significant increases in processing power, this new technology suggests a promising solution to a growing problem. Slowly but surely, major players in the technology industry are recognising the future of quantum computing, developing systems such as IBM’s Quantum System Two, built around its Heron processors. It extends a hefty 22 by 12 feet, reminiscent of its classical ancestors from almost a century ago. Similarly, the doubts plaguing the future of quantum computing echo the hesitance once felt towards the first classical computers; understanding evolution, we can only thrive through innovation, not stagnation.
