GE Digital, the company that helped promote all the hype around the internet of things back in 2013, has since calmed down considerably. Once dedicated to selling the industrial IoT, now it’s focused squarely on use cases, vertical solutions, and providing actual value. And according to CTO Colin Parris, a lot of that value will be based on digital twins.
As Parris explains, digital twins are not a new concept — indeed, the original digital twin was a NASA construct from the 1970s. What is new is that the digital twin model can now be continually updated to reflect reality. “We have always built models, but after we have built a model we stop using it,” he says. “Digital twins are a living and learning model, where we say, ‘Let’s continue instantiating that model and keeping it up to date to show the status of the assets.’”
If the first essential ingredient of today’s digital twin is that it represent the current state of an asset, the second is that it must be capable of learning and predicting how that asset will fare over time under varying conditions. This may sound simple, but building a digital twin model that can ingest new information and make predictions takes technical skills, access to data, and an understanding of what types of equations need to run.
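The "living and learning model" idea can be sketched in a few lines of code: a model object that blends each new field measurement into its state, then extrapolates that state forward. This is a minimal illustration, not GE's actual approach; the class name, the exponential-smoothing update, and the linear-degradation assumption are all mine.

```python
# Minimal sketch of a "living" digital twin: a model that is continually
# updated with new sensor readings and can project the asset's condition
# forward. Names and the linear-degradation assumption are illustrative.

class EngineTwin:
    def __init__(self, initial_health: float = 1.0):
        self.health = initial_health        # 1.0 = new, 0.0 = end of life
        self.history = [initial_health]

    def ingest(self, measured_health: float, weight: float = 0.3):
        """Blend a new field measurement into the model (exponential smoothing)."""
        self.health = (1 - weight) * self.health + weight * measured_health
        self.history.append(self.health)

    def predict(self, cycles_ahead: int) -> float:
        """Project health forward, assuming the observed degradation rate holds."""
        if len(self.history) < 2:
            return self.health
        rate = (self.history[-1] - self.history[0]) / (len(self.history) - 1)
        return max(0.0, self.health + rate * cycles_ahead)

twin = EngineTwin()
for reading in [0.98, 0.96, 0.93, 0.91]:    # readings from the physical asset
    twin.ingest(reading)
print(f"current health estimate: {twin.health:.3f}")
print(f"projected health in 50 cycles: {twin.predict(50):.3f}")
```

The point of the sketch is the contrast Parris draws: the model isn't built once and shelved, it is re-instantiated with every reading so it always reflects the asset's current state.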
Parris also believes that a successful digital twin must either be able to act on the predictions it makes or deliver those insights to a human who can then take action.
As an example of a digital twin leading to an automated response, Parris references GE’s digital ghost technology, introduced last year. The digital ghost technology is used to detect strange behavior on the control networks for manufacturing or electrical grid equipment, then quarantine potentially affected equipment before it can cause problems. The technology prevents hackers who get into a plant’s control networks from forcing machines to sabotage themselves, or from taking actions that could harm the plant’s workers. As someone who thinks of a digital twin as a model of a physical object, I found it a slight shift to think of a twin as a model of a network. It does make sense, though.
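The core mechanism behind this kind of detection can be illustrated simply: compare live readings against what the twin predicts, and quarantine any equipment whose deviation is a statistical outlier. This is a toy sketch of that idea, not GE's actual Digital Ghost; the equipment names, the z-score test, and the threshold are all assumptions.

```python
# Toy sketch of model-based anomaly detection on a control network:
# compare observed signals against the twin's expected values and
# quarantine equipment whose deviation exceeds a threshold.
# All names and numbers here are illustrative, not GE's.

import statistics

quarantined: set[str] = set()

def is_anomalous(expected: list[float], observed: list[float],
                 threshold: float = 3.0) -> bool:
    """True if the latest reading deviates abnormally from the twin's prediction."""
    residuals = [o - e for o, e in zip(observed, expected)]
    baseline = residuals[:-1]                    # history used to estimate normal noise
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline) or 1e-9   # guard against zero variance
    return abs(residuals[-1] - mean) / stdev > threshold

def monitor(equipment_id: str, expected: list[float], observed: list[float]):
    if is_anomalous(expected, observed):
        quarantined.add(equipment_id)            # isolate before it can do damage

expected = [10.0] * 10                           # twin's predicted vibration signal
normal   = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.1, 10.05]
hacked   = normal[:-1] + [25.0]                  # e.g. a spoofed actuator command

monitor("turbine-7", expected, normal)
monitor("turbine-8", expected, hacked)
print(quarantined)                               # only the compromised unit is isolated
```

The key design point is that the detector needs a trustworthy model of normal behavior to compare against, which is exactly what a continually updated twin provides.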
And as digital twins cover more than a single piece of equipment — representing, say, the design of an entire factory floor or the health of an airplane in specific environments — they will encompass both physical assets and a constantly changing array of environmental conditions.
So where does all of that data come from? In aviation, for example, GE may already have all the data and models needed to measure the physical performance of an engine it has manufactured, but it also needs to pull in data from other instruments on the plane and environmental data from the locations the plane flies to in order to assess its overall health.
Depending on the contracts that GE has with a customer, it will handle the data gathering in one of two ways. If GE is maintaining the engine and essentially offering uptime as a service, it will contract with other companies to get the data it needs. If GE is selling or licensing the digital twin, it might rely on the customer to contract with companies for the data, which it will then integrate into the digital twin. Many efforts to use digital twins are narrowly defined because that makes it easier to create a business case, build statistical models to search for a set behavior, and then find the appropriate data sources to put into those models.
I came away from my conversation with Parris with a new appreciation of just how valuable a tool a digital twin is. I also understood better than ever before just how much work it is to build them and get them set up in a way that can benefit a business. For example, although Parris talked about ensuring that digital twins can scale, I really can’t see how that will happen at GE Digital unless it builds out pre-defined models for each piece of equipment it sells, then lays out the data sources that a customer would need to integrate into those models. That will work for a few common use cases, but as customers tweak these twins for their own operations, it feels like everything will become far more customized.
In addition to our chat about digital twins, Parris gave me an update on GE Digital. Back in 2018, after a reorganization, the subsidiary was put on the auction block. But last year, GE CEO Larry Culp decided to retain the GE Digital business and refocus it on use cases around the utility grid, manufacturing, power, aviation, and oil and gas. The goal is to provide value to the customer as opposed to talking up digital transformations.
So expect to hear less about the broad benefits of a platform like Predix and more about case studies on narrowly focused IoT implementations that have a clear return on investment or business value. And yes, expect to hear more about digital twins.
The post Talking to GE Digital’s CTO about digital twins appeared first on Stacey on IoT | Internet of Things news and analysis.