While the concept of a digital twin has been around since 2002, only the rise of the Internet of Things (IoT) and scalable information systems has made it feasible to implement digital twins at scale. Indeed, they are now so important to business that the concept was named one of Gartner’s Top 10 Strategic Technology Trends for 2017.

Quite simply, a digital twin is a virtual model of a process, product or service. This pairing of the virtual and physical worlds allows analysis of data and monitoring of systems to head off problems before they even occur, prevent downtime, develop new opportunities and even plan for the future by using simulations.
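To make the idea concrete, here is a minimal sketch of a digital twin of a single physical asset. Everything in it is illustrative: the asset (a water pump), the sensor fields, the temperature limit and the maintenance rule are all assumptions, not taken from any real system. The point is simply that the virtual model mirrors telemetry from its physical counterpart and flags a problem before it becomes downtime.

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """Toy digital twin of a hypothetical water pump.

    The twin stores the latest sensor readings mirrored from the
    physical asset and flags conditions that merit intervention
    before an actual failure occurs.
    """
    asset_id: str
    max_temp_c: float = 80.0            # assumed manufacturer limit
    readings: list = field(default_factory=list)

    def ingest(self, temp_c: float, vibration_mm_s: float) -> None:
        """Mirror one telemetry sample from the physical asset."""
        self.readings.append({"temp_c": temp_c,
                              "vibration_mm_s": vibration_mm_s})

    def needs_maintenance(self) -> bool:
        """Head off downtime: trigger on a sustained over-limit trend."""
        recent = self.readings[-3:]
        return len(recent) == 3 and all(
            r["temp_c"] > self.max_temp_c for r in recent)

pump = PumpTwin(asset_id="pump-017")
for temp in (75.0, 82.0, 84.0, 86.0):
    pump.ingest(temp_c=temp, vibration_mm_s=1.2)
print(pump.needs_maintenance())  # three consecutive over-limit samples
```

Real twins add physics models, historical analytics and simulation on top, but the pairing of live data with a virtual representation is the core.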

Infrastructure owners have embraced the concept and are beginning to implement digital avatar plans for their own asset empires. But this will only benefit cities – the place of convergence for multiple infrastructure systems – if a federated approach is used for infrastructure digital twins. This would enable individual asset owners to develop and curate digital twins of their own assets, but would also enable city authorities to mandate an overarching framework so these can be combined into an integrated city model.
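The federated approach described above can be sketched as a contract: each asset owner builds and curates its own twin, while the city mandates a minimal common interface so the twins can be combined into one integrated model. The interface, class names and state schema below are invented for illustration; no such standard currently exists.

```python
from typing import Protocol

class AssetTwin(Protocol):
    """Minimal contract a city authority might mandate so that
    independently owned twins can be federated. Illustrative only."""
    asset_id: str
    def state(self) -> dict: ...    # current condition in a common schema

class WaterNetworkTwin:
    """Owned and curated by the (hypothetical) water utility."""
    asset_id = "water-north"
    def state(self) -> dict:
        return {"asset_id": self.asset_id, "sector": "water",
                "status": "ok"}

class RoadNetworkTwin:
    """Owned and curated by the (hypothetical) highways authority."""
    asset_id = "roads-central"
    def state(self) -> dict:
        return {"asset_id": self.asset_id, "sector": "transport",
                "status": "congested"}

def city_model(twins: list) -> dict:
    """The integrated city model is the federation of asset twins,
    keyed by their owner-curated identifiers."""
    return {t.asset_id: t.state() for t in twins}

snapshot = city_model([WaterNetworkTwin(), RoadNetworkTwin()])
print(snapshot["roads-central"]["status"])
```

The design choice is that the city model owns no asset data itself; it only aggregates what each owner publishes through the mandated interface, which is what lets each owner keep developing its twin independently.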

If you believe the supply side, the market would also appear to be well advanced when it comes to city-scale digital twins, with multiple offerings that come from a variety of starting points including the gaming industry. However, there is still a question as to how close city clients are to the purchase of such models to improve their cities. And are tech companies as close to a one-stop-shop city twin solution as it would appear from their websites and marketing collateral?

Some cities are developing their own systems. The municipality of Dubai, for example, has an emerging city-scale model in which services can be visualised in real time, and past data can be analysed to provide insights that assist decision-making for citizens and commercial businesses. These plans may still be a little way from delivering significant and tangible benefit to service operators and citizens, but with prototypes in place and long-term plans for delivery, the stage is set. With such far-sighted city leaders showing the way, how fast will the rest of our global cities catch up?

The answer to the questions posed above must lie in the objectives of the cities themselves, as these will guide the spend and direction of effort for the – often cash-strapped – city municipalities. And in turn this affects how fast the supply side is able to develop products that are tuned to customer need.

There is no doubt that opportunities for massive scale digital twins are significant, with the potential to bring great benefits to city and infrastructure clients such as improved asset maintenance, business transparency and better and more extensive optioneering and decision-making. The benefits fall into two categories:

  1. Reactive: Feedback and visualisations enhance real-time or near real-time interventions and improve the smooth day-to-day running of the city or asset;
  2. Predictive: Interdependency modelling based on accurate input data is used to improve longer-term scenario planning to steer appropriate (and equitable) investment decisions. For example, if one item is changed then the impacts can be modelled for all other items in the model to enable testing of:
    – Iterative/feedback loops and responses: For example, we could predict how an impending storm detected from weather data would affect flood relief systems in the city and the activities that are dependent on their smooth operation, enabling informed decisions to be made about pre-emptive action if required.
    – Sandbox scenario testing: Taking the flood risk scenario above, this could include predictive analysis of road locations and flood risk areas for new urban zones. Another example is impact testing of different charging scenarios for a city’s transport network, in order to optimise effective future use of road and public transport assets.
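The flood example above can be reduced to a toy interdependency model: a forecast storm is compared against drainage capacity, and the surplus is cascaded to the activities that depend on the relief system. All thresholds and numbers here are made up for illustration; a real model would draw on hydraulic simulation and live weather data.

```python
def flood_scenario(rain_mm_h: float, drainage_capacity_mm_h: float,
                   dependent_services: dict) -> dict:
    """Toy interdependency model (illustrative numbers only).

    If forecast rainfall exceeds drainage capacity, every dependent
    service whose flood tolerance is below the resulting surplus is
    flagged, supporting a pre-emptive decision before the storm hits.
    """
    surplus = max(0.0, rain_mm_h - drainage_capacity_mm_h)
    return {name: ("at risk" if tolerance < surplus else "ok")
            for name, tolerance in dependent_services.items()}

# Hypothetical services and their assumed tolerance to surplus water
services = {"metro_line_a": 5.0, "hospital_access_road": 15.0}
impact = flood_scenario(rain_mm_h=40.0, drainage_capacity_mm_h=30.0,
                        dependent_services=services)
print(impact)  # metro line flagged; access road within tolerance
```

Changing one input (say, an upgrade to drainage capacity) and re-running the scenario is exactly the "change one item, model the impacts on all others" pattern described above, applied across sector boundaries.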

The question is therefore no longer the ‘why’, or the ‘what’; it is the ‘how’. How will this clear benefit manifest itself, and where is the market for those looking to recoup digital twin benefits at city scale? The answer, as many of the tech giants understand, is in getting closer to the city ecosystem; how it operates, what the motivations of the stakeholders are and what the big issues are that take time, effort and resources to resolve. Most importantly, understanding that the key driver in a city is to improve people’s lives. The game changer is where disruptive technology and the fast-moving power of machine learning meet the industry expertise whose remit has always been to create better places to live, using the available tools of the day to improve outcomes for people and the efficiency with which we get there.

This requires the meeting of two historically separate industries, and implementation will be multifaceted and complex. Much of the infrastructure industry continues to work in sectoral silos that were artificially created according to historical precedent, set when optimisation across interdependent systems was simply not possible and imposed boundary conditions (assumptions) were required in order to manage decision making. With the advent of improved data processing power and the ever-increasing abundance of data, we can now address interconnected city-scale challenges that would have been unthinkable in the last century. However, before we can make great strides in applying the power of data and technology to analyse inter-sectoral city dependencies, we must be able to challenge and unpick these silos.

We have derived a smart city concept framework to categorise the associated layers of input, to assess the market capabilities, and to hold stakeholder conversations in this fast-emerging field, which has little accepted terminology or precedent. It is this latter point that is clouding the issue, creating confusion on what is exactly on offer and allowing vendors to (most likely inadvertently) oversell their services.

As the image illustrates, the layers of service build from a three-pronged data collection layer, hosted on a well-organised platform; this enables a palette of analytics tools to be selected against specific city issues, combined with an interoperability that unlocks additional insight.

Creating a viable city-scale digital twin is a massive undertaking. The key to successfully achieving it is scalability, such that digital twins of individual assets or systems are designed with the need to feed into a cumulative city-scale model in mind. Delivering early success is crucial, both from the point of view of fast-fail testing and for revealing the benefits to stakeholders so they add to the momentum and become part of the journey. It is therefore the scoping of early projects, mapped against critical city issues, that forms a crucial element of success – for the vendor, for the city and for citizens. This selection requires the combination of ‘domain knowledge’ (a good example of terminology misunderstanding – for the infrastructure industry this just means the ability to successfully deliver infrastructure projects) and advanced technical know-how.
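One way to think about that scoping step is as a simple weighted trade-off between alignment with critical city issues, feasibility of fast delivery (fast-fail), and visibility of benefit to stakeholders. The weights, criteria and candidate projects below are entirely hypothetical; the sketch only illustrates why a well-scoped early project can beat a "big-bang" city twin.

```python
def score(project: dict, weights: tuple = (0.5, 0.3, 0.2)) -> float:
    """Toy prioritisation of candidate early twin projects.

    Each criterion is scored 0-1: alignment with critical city issues,
    feasibility of fast delivery, and visibility of benefit to
    stakeholders. Weights are illustrative assumptions.
    """
    w_issue, w_feasibility, w_visibility = weights
    return (w_issue * project["issue_alignment"]
            + w_feasibility * project["feasibility"]
            + w_visibility * project["stakeholder_visibility"])

candidates = [
    {"name": "flood response twin", "issue_alignment": 0.9,
     "feasibility": 0.7, "stakeholder_visibility": 0.8},
    {"name": "full city twin", "issue_alignment": 0.9,
     "feasibility": 0.2, "stakeholder_visibility": 0.4},
]
best = max(candidates, key=score)
print(best["name"])  # the scoped early project beats the big-bang build
```

In practice the criteria themselves come from domain knowledge – knowing which projects can actually be delivered – which is the point the paragraph above makes about combining delivery expertise with technical know-how.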

I am excited by the prospect of using digital twins to advance the effectiveness of our city systems, and to improve the elegance of multi-sector decision-making. However, despite impressive progress on both sides, neither the tech industry nor the construction industry can do this effectively alone. Strong collaboration between both industries is the only way to embed the step change needed to make the most of digital twin technology.