Sophisticated computer modeling will be a key tool in protecting cities from climate change impacts.
By John Elkington
So first, a question: what links the Spanish town of Guernica in 1937, New York in 2012 and a city like Shanghai in 2039?
Probability suggests there are a number of correct answers, but the one we are after is that later in the century all three, for different reasons, will be seen as important stepping stones toward a world in which embedded computing and the routine processing of “Big Data” will be ubiquitous, critical to the health of the global economy and, increasingly, taken for granted.
Peer down at Earth from an airliner at night and you see urban areas as brilliant splashes, stripes and webs of light spread across the planet’s dark surface. Less visible are the immense flows of data and information that now knit our world together. If you want to track down the origins of that light, go back to Thomas Edison’s laboratory in the 1870s.
But what if you were looking for the source of those huge data flows?
Bombing of Guernica Sparked Data Revolution
Oddly, a good place to start would be the burning ruins of the town of Guernica in the Spanish civil war. The raid, carried out by German bombers working for General Franco’s fascists, inspired one of Spanish artist Pablo Picasso’s greatest masterpieces. But it also triggered a tsunami of concern across Europe as governments suddenly rumbled the fact that their cities were vulnerable to total destruction from the air.
One consequence: a reinvigoration of efforts to crack coded signals sent by the Nazi armed forces. Polish experts first broke signals encrypted by German Enigma machines in 1932, but later struggled as encryption and security procedures became more sophisticated. Even in the early days, the odds against guessing an Enigma machine’s settings were calculated at 158 million million million to one.
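The arithmetic behind that famous figure can be sketched in a few lines. This is an illustrative reconstruction of the standard estimate for a three-rotor Enigma with a ten-pair plugboard, not Bletchley Park’s original working:

```python
from math import factorial

# Standard key-space estimate for a three-rotor Enigma machine
# (illustrative assumptions: 3 rotors chosen from 5, ten plugboard pairs).
rotor_orders = 5 * 4 * 3        # choose and order 3 of 5 available rotors
rotor_positions = 26 ** 3       # starting position of each of the 3 rotors
# Plugboard: 10 cable pairs drawn from 26 letters; divide out the order of
# the pairs (10!) and the order within each pair (2^10).
plugboard = factorial(26) // (factorial(6) * factorial(10) * 2 ** 10)

total = rotor_orders * rotor_positions * plugboard
print(f"{total:,}")  # 158,962,555,217,826,360,000
```

That product, roughly 1.59 × 10^20, is the "158 million million million" quoted above.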
Happily, as the Nazis prepared to invade Poland, the secrets of how to crack the Enigma machines were handed over to the French and British secret services. All of this came to mind as several of us headed north by train to visit Bletchley Park, the once highly secret — but now increasingly famous — center of British code-breaking in World War II. At its peak, Bletchley Park employed over 8,000 people to help break the codes used by the German and Japanese armed forces; it is said that their efforts helped shorten the war by as much as two years.
From Alan Turing to Google
A key figure at Bletchley Park was mathematician Alan Turing, now widely seen as a founding father of modern computing. As Bletchley Park CEO Iain Standen told us, just as the inventor and painter Leonardo da Vinci sketched helicopters as long ago as 1493, so Turing in the 1930s had sketched the future of computing. In Turing’s case, however, it was only a matter of years before his vision began to turn into reality.
In his brilliant book Turing’s Cathedral, subtitled “The Origins of the Digital Universe,” George Dyson explains how today’s computers track back to the work of Bletchley Park and, later, of scientists at the Institute for Advanced Study in Princeton, New Jersey, racing to build the first hydrogen bomb. From their relatively simple hardware and code evolved everything from smartphone apps to Google’s globe-straddling algorithms.
Computers Help Cities Respond to Sea Level Rise
As for the New York link, powerful computers helped model Hurricane Sandy as it built over the Atlantic and then collided with America’s east coast and, most spectacularly, New York City. The sheer cost of the damage guarantees that huge additional sums will now be invested in evolving computer modeling and early warning systems to help minimize the impact of superstorms and other symptoms of global warming.
As for cities like Shanghai, 2039 will be the centennial year of Bletchley Park’s first intelligence efforts. By that time the evolutionary race that Turing and other code breakers began will have reached levels of computing power and artificial intelligence almost unimaginable today. And like Shanghai, which is hugely vulnerable to sea level rise, most cities will be using advanced forms of IT to shrink their carbon footprints and ward off the worst effects of climate change.
More than two-thirds of the world's largest cities are at growing risk from rising sea levels. One initiative designed to help reduce the damage is Connecting Delta Cities, a network of delta cities exploring new responses to climate change. And one thing is clear: data, information and intelligence systems will be crucial in ensuring that coastal concentrations of population and industry make it through the twenty-first century in good order.
Helpfully, a new Atkins report, Future Proofing Cities, assesses the risks to 129 cities—from megacities like Bangkok to smaller cities such as Zaria in Africa. Among the key trends spotlighted:
- Nearly 900 million people are likely to live in informal settlements by 2020, many of them particularly vulnerable to the impacts of climate change and to changes in the price and availability of critical resources such as energy, water and food.
The study provides risk profiles for different cities and different types of city, linked to climate hazards, resource scarcities, and damage to ecosystems. Those responsible for future-proofing cities will be happy to know that more than 100 practical policy options are on offer.
Designing the OS of a Sustainable Economy
Just as Alan Turing and the Bletchley analysts cracked the operating code of an evil empire, so a new wave of designers, scientists and engineers is now working to devise the operating systems for a more sustainable global economy. Some see the world’s growing number of cities as tomorrow’s open-air, live-in computers, in which everything will be wired and ever-greater volumes of data will be processed.
Perhaps, if we are lucky, digital trajectories that began at Bletchley, sparked by the Guernica bombing, will accelerate again as cities compete to shrink their environmental footprints and generally smarten up for the future.