So far in this series I’ve discussed two facets of the infrastructure construction industry’s current transformation. The first foresaw a notional ‘singularity’ resulting from a convergence of technology and practice to yield better efficiency and effectiveness, widening the traditional boundaries of the industry ecosystem to cover the asset’s entire life-cycle. The second looked at the issues raised by the current focus on standardisation in the shape of P-DfMA – a platform approach to design for manufacture and assembly.

This blog looks at the potential for automation to improve every corner of the industry, all catalysed by digitalisation. While some mistrust it, automation is usually regarded as desirable in that it can complete dull, repetitive and hazardous jobs cheaply, quickly and faultlessly. So long as it is cost-effective, reliable and quality-assured, it frees up humans to concentrate their efforts on work that requires skills and leaps of imagination that machines currently lack.

The subject of automation can usefully be split into a couple of broad and rather fuzzy categories. The first is automation that allows us to create and elaborate data into, for example, accurate construction information. I would include generative design, optimisation software, machine learning and artificial intelligence in this category. The second is automation that translates data into physical analogues and objects – or vice versa. This category includes reality-capture scanning technologies, virtual/augmented/mixed reality visualisation tools, sensors, 3-D printing, and robotics. Both categories are loaded with potential to improve the way that infrastructure is conceived, designed, built, maintained and eventually decommissioned.

The common thread here, of course, is data and, in particular, how to extract as much value as possible from it. While the most valuable warp of data is that which stretches all the way through from design and construction (the Project Information Model) to operation (the Asset Information Model), there is the weft of other digital tools that help it along the way. These organise, clarify and speed up all kinds of processes, including billing, payment, reporting, communication, resource allocation, contract management, and so on.

The key to it all is whether the information thus created can be passed on efficiently and usefully across disciplinary silos and over time. There are technical speedbumps here to do with data structure, privacy and interoperability, but more significant are the liability and contractual roadblocks. Even when innovation overcomes the speedbumps for potentially huge efficiency and effectiveness gains, firms are very wary of adopting it.

Kelly Cone, Vice President of Industry Strategy at ClearEdge3D, a reality-capture company, has written colourfully about precisely how silo-busting functionality is stymied because it is ‘contractually off limits’. As he puts it, “Our industry is sick – 50 shades of grey perverse. And, 95% of that weird, nasty, adversarial S&M stuff we live with day to day as our ‘normal operations’ in this business comes from our contracts.”

This is a huge problem for fulfilling automation’s promise in the singularity. This vision is not just the fevered ranting of a swivel-eyed soothsayer. It is shared by many others, notably the Centre for Digital Built Britain (CDBB, part of the Construction Innovation Hub), which is funded by the UK Government as part of its Construction Sector Deal.

The CDBB have set their sights on a National Digital Twin, underwritten by the Gemini Principles. This network of connected platforms is their answer to the long-lamented missing link in construction: feedback that would close the whole-life data loop.

The idea is that data from public infrastructure assets in operation is collected and shared. Beyond supporting asset management, the data generated would have two other useful destinations. First, it would slip out of the construction ecosystem to inform, for example, policy, the public, and any number of uses as yet unimagined.

Second, it would feed back to the design and construction community, giving them an unparalleled opportunity to validate solutions, thus providing the quality assurance so desperately needed to convince funders. The continuous stream could eventually produce enough data to feed the hungry maw of machine learning and artificial intelligence, triggering a virtuous spiral of ever-improving private and public value.

As with everything in the singularity, its realisation depends not just on the ingenuity and commercial savvy of the tech companies delivering it, but also on the sector being willing, organised enough and able to adopt it. My final blog looks at what these enabling conditions might be.

Read Matt’s first blog: The Vision for Technology

Read Matt’s second blog: The Quest for Standardization in Infrastructure Construction

Read Matt’s final blog: Enlightened Leadership, Trust and the Conditions Enabling Innovation

Matthew Thompson
Director of Matt Thompson Communications

Matt provides expert writing and editing services with a specialism in the built environment. Clients include building design consultancies, manufacturers, developers, publishers (including RIBA Journal), and industry organisations. He is currently a member of the RIBA’s Client Liaison Group and the CIC’s Building Quality Initiative Working Group. He was previously in charge of RIBA Publishing and was a Founding Director of the Centre for Understanding the Built Environment. Matt is also the founder of an architect’s client feedback tool.

Connect with Matt via LinkedIn