
A mirror world



The infrastructure sector has proven it is capable of achieving great things when working together – but change is sluggish, and success is piecemeal. How can companies break down competition barriers to data-sharing in order to take advantage of today’s technological edge? Cityscape speaks to Matthew Bourne of the UK Regulators Network (UKRN) and Matthew Evans of techUK to find out more

“This book describes an event that will happen someday soon: You will look into a computer screen and see reality. Some part of your world – the town you live in, the company you work for, your school system, the city hospital – will hang there in a sharp colour image, abstract but recognisable, moving subtly in a thousand places. This Mirror World you are looking at is fed by a steady rush of new data pouring in through cables. It is infiltrated by your own software creatures, doing your business. People are drawn to these software gadgets: When you switch one on, you turn the world (like an old sweater) inside out. You stuff the huge multi-institutional ratwork that encompasses you into a genie bottle on your desk. You can see over, under and through it. You can see deeply into it.”

So began the book ‘Mirror Worlds,’ a fascinating exploration by Yale University computer science professor David Gelernter of what he predicted would soon become “high-tech voodoo dolls”: a cyber-replica of real life that would revolutionise our relationship with computers and, consequently, our relationship with reality.

Perhaps the most impressive aspect of Gelernter’s predictions is that they were made in 1992: almost three decades ago, experts already expected to one day coexist with a piece of technology powerful enough to allow us to explore the physical world with unprecedented levels of detail and accuracy without ever leaving the house.

And he was right: the once-futuristic technology now exists in the form of ‘digital twins,’ or digital replicas of a physical entity that allow us to bridge the gap between the real and the virtual world through vast swathes of data. According to Atkins director of digital engineering Simon Evans, the global digital twin market was valued at $3.8bn this year and is already expected to surpass $35bn by 2025 – a staggering prediction that elucidates just how crucial this technology has become to our foundational relationship with the world around us.

But not every digital twin is created the same. Simon subscribes to the idea of a ‘digital twin spectrum,’ a range split into six identifiable elements that dictate the different characteristics of any given twin – for example, whether it uses real-time data collected via sensors, whether there is two-way data integration and interaction, whether it uses a 2D map or 3D model, and so on, until it reaches a stage of fully autonomous operation. “Broadly, the spectrum of a twin can be organised into six identifiable elements,” Simon wrote in an Atkins blog in September. “Although each element may increase in complexity and cost, it’s neither a linear nor a sequential process, so a twin might possess early or experimental features of higher-order elements before possessing the lower-order, foundational ones.”
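To make the idea concrete – and purely as an illustration, since the element names and ordering below are invented for this sketch rather than being Atkins’ own labels – a twin’s position on such a spectrum could be modelled in a few lines of Python, with higher-order features allowed to appear before the foundational ones:

from dataclasses import dataclass, field
from enum import IntEnum


class SpectrumElement(IntEnum):
    # Illustrative labels only, not Atkins' official terminology.
    STATIC_MODEL = 1          # a 2D map or 3D model of the asset
    CONNECTED_DATA = 2        # linked design and asset data
    REAL_TIME_SENSING = 3     # live data fed in via sensors
    TWO_WAY_INTEGRATION = 4   # data and commands flow in both directions
    PREDICTIVE_ANALYTICS = 5  # simulation and forecasting on the twin
    AUTONOMOUS_OPERATION = 6  # the twin operates the asset itself


@dataclass
class DigitalTwin:
    asset_name: str
    # A twin may exhibit any mix of elements: the spectrum is neither
    # linear nor sequential, so gaps are allowed.
    elements: set = field(default_factory=set)

    def maturity_profile(self) -> str:
        present = sorted(e.value for e in self.elements)
        return f"{self.asset_name}: elements {present or 'none'}"


# A pilot twin can trial a higher-order feature (predictive analytics)
# before foundational real-time sensing is in place.
pilot = DigitalTwin("Pumping station A",
                    {SpectrumElement.STATIC_MODEL,
                     SpectrumElement.PREDICTIVE_ANALYTICS})
print(pilot.maturity_profile())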

Eventually, he writes, a digital twin could evolve to “become one with the physical asset, and a ‘single version of truth,’” as promised by Gelernter’s predictions – a still relatively distant reality, but a reality nonetheless.

Of course, for the infrastructure sector to ever reach that level of autonomy, it relies on one key element: high-quality data.

A collaborative framework

Popularly nicknamed “the new gold,” well-curated and well-managed data forms the very basis of modern infrastructure projects. Building information modelling (BIM), for example, has become the standard process adopted by companies looking to plan, design, construct, operate, and maintain physical assets such as buildings, water, energy, telecoms, tunnels, bridges, or ports. Crossrail, the multibillion-pound rail network nearing completion in London, was one of the first transport projects in the UK to have a digital twin; in essence, the Crossrail team built two railways instead of just one.

This type of innovation in the infrastructure sector is certainly commendable, especially when done collaboratively. In the case of Crossrail, the company teamed up with Bentley Systems in a technology partnership to establish the Crossrail-Bentley Information Academy, which not only enabled the use of BIM on an unprecedented scale, but also helped ensure the scheme was capturing and sharing best practices on digital information with the entire supply chain.

But data is a precious thing. Most markets that fall under the ‘infrastructure’ umbrella are – or believe they are – competing both with each other and across sectors, making their data a valuable commodity that won’t easily be prised into public hands.

Indeed, a recent report by the UKRN painted a bleak picture of data-sharing across the nation’s infrastructure sector: issues with confidentiality, a lack of common standards and language, liability risks, poor quality of information, and an old-fashioned organisational culture are all significant barriers to the levels of widespread data-sharing that could truly transform how we relate to physical assets. To top it all off, the lack of a user-friendly, easily accessible central portal for pooling mountains of data into a single platform leaves any real aspiration to blur the line between physical and digital dead in the water.

Amongst its more long-term recommendations, the UKRN-commissioned report said that regulators must provide guidance and best-practice guidelines around what data can be shared, as well as support the industry to work more collaboratively and agree on common data standards, definitions, and frameworks. This largely echoed the findings of a 2017 report produced by Deloitte for the National Infrastructure Commission which, as well as prescribing a more collaborative approach, also called on the government to fully address cultural and commercial barriers to data sharing.

Time for a change

While progress is slow, perhaps the very existence of such in-depth analyses into the barriers of our digital assets is enough to demonstrate that the sector is ready to embrace change. “I think people have woken up to the need to get to grips with this,” agreed Matthew Bourne, UKRN manager. “The NIC report and government initiatives [such as the Digital Framework and Digital Transformation task groups] were a useful kickstart to that. But what consultants found, and we have observed, is that there have been a lot of initiatives in individual sectors; people are aware of the benefits, but often have been doing so in piecemeal pilots. But because of the obvious benefit, the more that regulators can do to encourage their major stakeholders to progress plans and discuss what standards and what data they’re prepared to publish, the better.”

The UKRN report identified a distinct move in favour of open data over the past few years, but data-sharing between sectors is less advanced: although the potential benefits are clear, those surveyed expressed concerns over a perfect storm of perceived risks, high costs, and closed cultures.

“The infrastructure sector knows that it has to change,” said Matthew Evans, who used to run techUK’s SmarterUK scheme and now oversees all of their market area programmes. “You only have to look at the Carillions of the world to come across that. There is a pretty good realisation at the moment that it needs to change. But at what pace can that change be brought about? It’s going to take a while to develop standards and frameworks, so you can understand why some companies are saying, ‘Well, I don’t want to run down a cul-de-sac.’ You can understand why they’re treating this with kid gloves.

“But at the same time, you have some organisations – for example, UK Power Networks – that have a very strong focus on actually investing in physical assets as a last resort, and looking at what you can do with data to open up new markets. There’s a growing push towards adopting digital solutions and moving out of that pilot stage – but it will still take a little while before we see that at a more universal level across the infrastructure sector.”

Just how long, exactly? Evans predicts that we’ll start seeing some big changes in two to three years, partly as a result of sector pressures influencing providers to look at digital technologies more closely, and partly because the technologies themselves are maturing faster. This is, of course, if we’re able to overcome the statutory barriers that exist across the currently heavily regulated infrastructure sector. Not only that, but according to the UKRN report, providers feel that “different regulatory requirements, expectations and practices appear to contribute to inconsistencies in data sharing and data quality across sectors.” Enter the need for consistent standards built on a common language and framework – though the industry is still unclear on exactly who should be responsible for setting them up.

A regulatory fix?

Evans remains agnostic as to who should take charge of this ambition. The Energy Data Taskforce that techUK participated in identified Ordnance Survey as the best body to oversee the development of a ‘data catalogue,’ which would provide visibility through standardised metadata of energy system datasets across the public and private sectors. But more generally, and across sectors, it will depend on the government’s appetite and the industry’s particular needs. “We do need to make sure that there is a sustainable funding model in place,” added Evans. “Whether that’s best under the private sector or the public sector, we don’t really have a view on it – the important thing is to make sure that it’s sustainable, and to make sure that we get the right level and quality of data within that. Whatever method best achieves those goals, we’d happily support.”
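As a very rough sketch of what a catalogue entry might look like – the field names, organisations and URL below are hypothetical, not drawn from the Energy Data Taskforce’s actual specification – standardised metadata need not be complicated:

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class DatasetRecord:
    # A minimal, hypothetical metadata record for a cross-sector data catalogue.
    title: str
    owner: str                 # the organisation that holds the data
    sector: str                # e.g. "energy", "water", "transport"
    licence: str               # sharing terms, e.g. "open" or "on request"
    update_frequency: str      # e.g. "real-time", "daily", "annual"
    access_url: Optional[str]  # None if the dataset is described but not published


catalogue = [
    DatasetRecord("Substation load profiles", "ExampleGrid Ltd", "energy",
                  "on request", "half-hourly", None),
    DatasetRecord("River level gauges", "Example Rivers Trust", "water",
                  "open", "real-time", "https://data.example.org/river-levels"),
]


def find(sector: str, open_only: bool = False) -> List[DatasetRecord]:
    # Visibility first: the catalogue shows what exists; access is negotiated separately.
    return [r for r in catalogue
            if r.sector == sector and (not open_only or r.licence == "open")]


print([r.title for r in find("water", open_only=True)])

The point of such a record is visibility: a provider can advertise that a dataset exists, and on what terms, without publishing the data itself.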

From a regulatory perspective, Matthew Bourne doesn’t think there’s any supplier unwillingness to embrace data-driven technologies; the problem is that, as a whole, the industry is still stuck in the exploratory phase of determining what these shared data standards should even be – and how to make them applicable to as many companies as possible. “This won’t necessarily come from regulation,” he noted. “Indeed, regulators have very limited powers in this area, particularly because it frankly hadn’t been sorted out at the time when most of the regulators’ plans were drawn up. It’s ultimately for the government, and indeed Parliament, to change powers in that respect.

“It’s at the point now where people are identifying the benefits,” Bourne continued. “You don’t necessarily jump to the regulatory lever until you realise that the benefits might not be unlocked without some form of coordination – but that can take many forms.”

Innovations at scale

That’s not to say that, in the meantime, different infrastructure divisions can’t thrive despite the lack of a shared language tying them together. Both Evans and Bourne picked the energy sector as a distinct frontrunner in this, partly because the industry seems to be paying more attention to it, and partly because they have a more collaborative approach to data-sharing than, say, telecoms providers (a competitive realm rife with commercial sensitivities).

Water companies are catching up, too: the Environment Agency has made open data the default within the organisation; companies are undergoing various data-cleansing pilot projects; water utilities are actively sharing details of their asset data as part of the BIM4Water task group; and some of the respondents in the UKRN report are engaging with multi-stage European Union collaboration projects which look at the potential value of data-sharing and exploiting the Internet of Things.

[Image: Leicester Cathedral 3D BIM model from Plowman Craven]

“In the water industry, there are quite interesting ways to gather data,” Matthew Evans explained. “There are some water pipe measuring tools that you can lob down a manhole which test any movement within the pipes, because that tends to be a good indication that there’s a leak somewhere [in the water distribution network]. If you can use several of these, you can start to isolate data to quite a granular level to identify where the potential leak is. The great thing about them is that they’re mobile, so the cost of actually monitoring the network drops quite a lot. It’s a neat way of doing monitoring at scale, but at a lower cost.”
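A minimal sketch of that localisation idea – every position, reading and threshold below is invented for illustration – might look like this:

# Mobile loggers at known positions along a main each report a movement score;
# the segment bounded by the two highest-scoring neighbours is the first place to dig.
readings = [
    # (position along the main in metres, movement score from the logger)
    (0, 0.11),
    (150, 0.13),
    (300, 0.62),   # elevated
    (450, 0.71),   # elevated
    (600, 0.12),
]

THRESHOLD = 0.5  # assumed anomaly level; in practice set from baseline behaviour


def suspect_segments(readings, threshold=THRESHOLD):
    # Return (start, end) spans whose bounding loggers both read above the threshold.
    segments = []
    for (pos_a, score_a), (pos_b, score_b) in zip(readings, readings[1:]):
        if score_a >= threshold and score_b >= threshold:
            segments.append((pos_a, pos_b))
    return segments


print(suspect_segments(readings))  # -> [(300, 450)]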

The industry has also been responsible for a pioneering digital surface water drainage system in Glasgow, which uses sensors and predictive weather technology to help the government pre-emptively manage floodwater. The North Glasgow Integrated Water Management System aims to create a ‘sponge city’ which passively absorbs, cleans, and uses rainfall intelligently – which might sound like something you’d find in a city like Amsterdam or Oslo, but it’s happening right here in the UK.
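The pre-emptive principle can be sketched in a few lines – the capacities, catchment figures and runoff coefficient below are invented, and the Glasgow system’s actual logic is of course far more sophisticated:

# Compare forecast rainfall against the spare capacity of a storage pond and
# start drawing it down in advance if the forecast storm would overtop it.
POND_CAPACITY_M3 = 5_000
CATCHMENT_AREA_M2 = 200_000
RUNOFF_COEFFICIENT = 0.4  # assumed fraction of rainfall reaching the pond


def predicted_inflow_m3(forecast_rain_mm: float) -> float:
    # Convert a rainfall forecast (mm) into an expected inflow volume (m3).
    return forecast_rain_mm / 1000 * CATCHMENT_AREA_M2 * RUNOFF_COEFFICIENT


def drawdown_needed_m3(current_volume_m3: float, forecast_rain_mm: float) -> float:
    # How much water to release now so the forecast storm still fits in the pond.
    spare = POND_CAPACITY_M3 - current_volume_m3
    shortfall = predicted_inflow_m3(forecast_rain_mm) - spare
    return max(0.0, shortfall)


# A 30mm forecast against a pond that is already two-thirds full:
print(drawdown_needed_m3(current_volume_m3=3_300, forecast_rain_mm=30))  # -> 700.0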

Elsewhere, the construction and transport sectors have set themselves apart by not only harnessing BIM tools, but by taking advantage of augmented and virtual reality devices to help on two fronts: within companies themselves, and in the interface between them and the customers. Augmented reality, for example – such as the Microsoft HoloLens glasses – is used by engineers working in remote sites so that they can liaise with senior members of the team, who talk them through each particular intervention whilst sitting in an office.

And where companies interact with their customers, some have been using virtual reality equipment to demonstrate in a more meaningful and tangible way what a finalised project might look like. This can be done sonically, as well: Arup, for example, uses its in-house SoundLab to create a 360-degree simulation of what a scheme will sound like once complete, in order to both inform the design stage and gather public feedback. For instance, the auralisations of what an HS2 train might sound like in different parts of England – in effect providing an objective experience of the real deal – helped inform the government on how to best mitigate potential noise pollution in rural towns.

Keep the engine running

“A Mirror World is some huge institution’s moving, true-to-life mirror image trapped inside a computer – where you can see and grasp it whole. The thick, dense, busy sub-world that encompasses you is also, now, an object in your hands. A brand-new equilibrium is born.

“The software model of your city, once it’s set up, will be available (like a public park) to however many people are interested, hundreds or thousands or millions at the same time. It will show each visitor exactly what he wants to see – it will sustain a million different views, a million different focuses on the same city simultaneously.

“Such models, such Mirror Worlds, promise to be powerful, fascinating, and gigantic in their implications. They are scientific viewing tools – microscopes, telescopes – focused not on the hugely large or small, but on the human-scale social world of organisations, institutions and machines; promising that same vast microscopic, telescopic increase in depth, sharpness and clarity of vision. Such Mirror Worlds don’t exist, yet. But most of the necessary components have been designed, built and separately test-fired, and we are now entering the assembly stages that will produce complete (albeit small-scale) prototypes. The intellectual content, the social implications of these software gizmos make them far too important to be left in the hands of the computer sciencearchy.”

Gelernter’s prediction of a digitally-enabled world, sustained through a constant and real-time drip of information, hasn’t yet become the norm in the UK. But, just as in 1992, the components needed to create this universe are at hand: we have the prototypes, the software and the skills to move this forward, and to use data as a collaborative force for good – one just as important to society as railroads or energy networks.

Data is the engine that underpins every single aspect of our civil and business community; in a sense, data itself is the infrastructure. It’s about time we treated it as such.

