PTC has assembled a robust portfolio of Internet of Things technologies that, when combined with the company’s history in digital 3D design and Product Lifecycle Management (PLM) tools, makes for some interesting new ways to bridge the physical and digital worlds. To see how it all adds up, Network World Editor in Chief John Dix caught up with Michael Campbell, Executive Vice President of PTC’s ThingWorx IoT platform. Campbell, who has been with PTC since 1995, has a background in 3D CAD and visualization, and ran the company’s CAD business for years as PTC began thinking about the convergence of IoT and 3D, and how it might all come together in augmented and virtual reality.
Before we dig deeper, give us a sense of where ThingWorx fits into PTC overall.
Basically PTC is in two really important businesses. The first is the traditional CAD (Computer Aided Design), PLM (Product Lifecycle Management), and SLM (Service Lifecycle Management) business. That’s somewhere around a billion-dollar solutions business where we help companies create and manufacture products. Then there’s the platform business, ThingWorx, which is a set of tools that allow people to build industrial IoT solutions.
So, traditionally PTC helped people create digital representations that would then be manufactured in the form of physical products. We also helped them manage the lifecycle of those products, and more recently we’ve been helping companies service those products by leveraging that rich engineering intelligence and helping them define service procedures and manage their field service technicians.
One of the more interesting things at PTC these days is how we’re leveraging the ThingWorx Platform in Smart Manufacturing/Industrie 4.0 kinds of use cases. It’s a huge opportunity and we’re aggressively pursuing smart, connected products, analyzing data coming off of those products, gaining insights into those products, and visualizing that information in augmented reality and also, in some cases, virtual reality as well.
One of the basic challenges with IoT is normalizing the collected data to make sense of it. Do you address that?
We help companies do a bunch of different things. The first is get data from a lot of different sources. We can either connect stuff directly into ThingWorx or, if the customer is using one of these IoT cloud providers — AWS, Azure, Predix, pick your favorite — we can get data out of there.
A lot of the time, people will use ThingWorx in the factory, collecting information from sensors and controllers and various other pieces of hardware. ThingWorx is a great tool for aggregating that information. But it can also bring in data from other digital resources, such as CAD and PLM and even ERP.
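The aggregation Campbell describes can be pictured as mapping each source’s payload onto one shared shape. Here is a minimal Python sketch; the field names, source keys, and `Reading` type are hypothetical illustrations, not ThingWorx’s actual schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical normalized reading; field names are illustrative only.
@dataclass
class Reading:
    source: str       # e.g. "plc" or "cloud"
    asset_id: str
    metric: str
    value: float
    timestamp: datetime

# Each upstream system names its fields differently; a per-source
# mapping table resolves them to one shared vocabulary.
FIELD_MAP = {
    "plc":   {"asset_id": "id", "metric": "tag", "value": "val"},
    "cloud": {"asset_id": "deviceId", "metric": "name", "value": "reading"},
}

def normalize(raw: dict, source: str) -> Reading:
    """Map a source-specific payload onto the common shape."""
    m = FIELD_MAP[source]
    return Reading(
        source=source,
        asset_id=str(raw[m["asset_id"]]),
        metric=str(raw[m["metric"]]),
        value=float(raw[m["value"]]),
        timestamp=datetime.now(timezone.utc),
    )
```

Once every source lands in the same shape, downstream analytics and dashboards can treat a PLC tag and a cloud device reading identically.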
Really what all of this is about is allowing you to create a digital twin of what’s happening out there in the physical world. You’ve got some smart, connected product in the world, and you want a digital equivalent of it so you can understand how it’s being operated, predict when it’s going to fail, and make sure it’s operating as efficiently as possible.
I’m hearing more and more about digital twins (See GE favors SaaS for non-differentiated apps, moves away from MPLS, has big plans for IoT). Is that an industry accepted approach by now?
The digital twin is getting more and more airplay. What goes into the digital twin? Ideally it’s everything you would ever want to know about that thing. In practical use cases, what do you care about? A digital twin could be a set of properties and their current attributes. It could be rich 3D information. It could be information from other enterprise systems like who owns that thing, when was that thing last serviced. It really depends what you want. ThingWorx is the way you aggregate that. We have this concept called the Thing Model and the Thing Model is the structure of that information. That’s what we help with.
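The Thing Model idea Campbell describes, a named thing plus whatever live properties and enterprise attributes you care about, can be sketched in a few lines. The class, field, and property names below are illustrative assumptions, not PTC’s actual API:

```python
from dataclasses import dataclass, field
from typing import Any

# Illustrative stand-in for a "Thing Model": identity plus the
# properties and enterprise attributes that matter for this twin.
@dataclass
class Thing:
    name: str
    properties: dict[str, Any] = field(default_factory=dict)  # live sensor values
    metadata: dict[str, Any] = field(default_factory=dict)    # owner, service history, CAD ref

# Hypothetical example twin of a pump.
pump = Thing(
    name="pump-17",
    properties={"pressure_kpa": 310.0, "rpm": 1450},
    metadata={"owner": "Plant 3", "last_serviced": "2017-02-14",
              "cad_model": "pump_17.asm"},
)

def update_property(thing: Thing, key: str, value: Any) -> None:
    """New telemetry from the physical product updates the twin in place."""
    thing.properties[key] = value

update_property(pump, "pressure_kpa", 295.5)
```

The point of the structure is exactly what the interview says: the twin holds current attribute values, references to rich 3D data, and enterprise facts like ownership and service history, all behind one name.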
The next thing we want to do is contextualize the data about the digital twin, say the location and the geometry. Then we want to get insights out of it by performing analytics or simulations. We provide a lot of capabilities that allow you to flex the digital twin and understand what could possibly happen out there in the physical world and get greater insight into what is happening.
I presume you ultimately want to be able to act on the information.
Right. As you gain those insights you want to be able to do something about it. We use the term orchestration to describe the idea of taking action. Orchestration takes on many different layers. One might be kicking off some digital command in a business system — Go roll a truck to service a system. It might be driving information back out to the physical device — Close a valve, stop a machine, make it go faster, whatever the case may be.
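The orchestration layers listed here (business-system commands, device commands, human tasks) amount to routing one insight to several registered actions. A hedged sketch, with entirely hypothetical event and handler names:

```python
from typing import Callable

# Hypothetical orchestration registry: an event name maps to one or
# more actions, which could hit a business system, a device, or a person.
actions: dict[str, list[Callable[[dict], str]]] = {}

def on(event: str):
    """Decorator registering a handler for an event."""
    def register(fn: Callable[[dict], str]):
        actions.setdefault(event, []).append(fn)
        return fn
    return register

@on("oil_pressure_low")
def create_service_ticket(ctx: dict) -> str:
    # Business-system side: "go roll a truck"
    return f"ticket opened for {ctx['asset']}"

@on("oil_pressure_low")
def throttle_machine(ctx: dict) -> str:
    # Device side: drive a command back to the physical thing
    return f"sent slow-down command to {ctx['asset']}"

def orchestrate(event: str, ctx: dict) -> list[str]:
    """Fire every action registered for this insight."""
    return [fn(ctx) for fn in actions.get(event, [])]
```

One anomaly can then fan out to several layers at once, which is the point of calling it orchestration rather than a single callback.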
Or it might involve human engagement — I want you to go do something. How do we facilitate that today? Companies often use a web-based mashup on the desktop or an app on a mobile phone. But leveraging the CAD model and other data, we can present that information in the context of the physical thing.
Can you give us some examples?
Here’s something we did with GE Transportation (see video above, in particular, starting at minute 4). What we’re doing here is presenting what it would be like to do maintenance work on this giant engine. There are a bunch of subsystems and data coming off the engine indicates that the oil system is unhealthy. We’re able to show the technician exactly what subsystem this is in the context of the engine itself, and then we’re able to go and look at the different levels of work that would be required and we can present this in a very compelling way, again using augmented reality.
Some of these require a Level 2 work scope, a more complex service procedure, because they contain injectors that have been recalled. In this case the data is coming from GE Predix, from their supplier system. It’s coming from CAD and it’s being digitally mapped and, in this case, being presented in the context of augmented reality.
What’s the source of the CAD here?
The original engineering CAD is Siemens CAD, and then we bring that into our ThingWorx Studio environment. We can bring in data from anywhere in order to create these AR experiences.
[VIDEO: Sample flow system]
And here is an example using a closed circulating system we built (see video above, in particular, starting at minute 1.3). It’s instrumented with sensors from National Instruments and it’s got one of those IoT-in-a-box solutions from HP. Basically all it does is pump water around, but we can introduce a clog and simulate vibration and other issues like that.
This is a ThingWorx dashboard. These are values that are coming off of this machine and we are checking for anomalies. Basically what we’ve told ThingWorx is, here are some pressure values and pressure deltas, keep an eye on those, begin learning right now when we turn the machine on. It takes a minute or two to learn and then, once it’s learned, when something goes strange, it says, “Hey, there’s been an anomaly here.”
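The learn-then-flag behavior described here, watch the pressure values for a warm-up period, then call out anything far from the learned baseline, can be approximated with a simple streaming detector. This is a toy stand-in under stated assumptions, not ThingWorx’s actual analytics algorithm:

```python
import math

class AnomalyDetector:
    """Learn a baseline from the first `learn_count` readings, then
    flag values more than `k` standard deviations from the learned mean.
    A deliberately simple stand-in for a real anomaly-detection engine."""

    def __init__(self, learn_count: int = 120, k: float = 3.0):
        self.learn_count = learn_count
        self.k = k
        self.samples: list[float] = []

    def observe(self, value: float) -> bool:
        """Return True if the value is anomalous (always False while learning)."""
        if len(self.samples) < self.learn_count:
            self.samples.append(value)          # still learning the baseline
            return False
        mean = sum(self.samples) / len(self.samples)
        var = sum((s - mean) ** 2 for s in self.samples) / len(self.samples)
        std = math.sqrt(var) or 1e-9            # guard against zero spread
        return abs(value - mean) > self.k * std
```

Feed it one pressure reading per tick: during the warm-up window it only accumulates samples, and afterward any reading far outside the learned band triggers the “Hey, there’s been an anomaly here” moment.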
We’re doing those analytics and we’re again presenting this information for the pump operator in the context of augmented reality. What we know is that something has gone wrong and we can show him where that is.
Using our technology and integration with analytics tools we can take these actual values and pump them into a computational fluid dynamics simulation so we can see problems [in the video, the purple area is air, which means there is cavitation and the pump is going to fail sooner than they want it to]. We can use that data to perform not an idealized simulation, but one that visually represents what’s actually happening in the pump right now.
So, you envision a day when a technician can walk up to a machine and “see” what is ailing it?
Right. What we would like to do, where we’re headed, is being able to present this on the pump itself in augmented reality so you can look at it and see what’s happening. You’ve seen a wind tunnel where they introduce smoke so you can see what the airstream is doing? Augmented reality is similar.
Again, lots going on here, data coming from all kinds of other systems; there is analytics, which is the anomaly detection as well as simulation and again, we’re presenting that in the context of augmented reality. Those are the types of things that people are using ThingWorx for to get insights into what’s going on with their operations and what’s going on with their products.
In terms of common use cases, is it mostly around maintenance?
A lot of them are service related, but a lot are manufacturing and inspection related. But here is a service example from a company called Sysmex. They make blood analyzers that you would find in a medical lab. The machine is smart and connected with ThingWorx so data is streaming off of it already. They ship their products with an industrial iPad.
[VIDEO: Med lab tool fix]
This lab technician has been doing some work but there is a clog in the machine (see video above, in particular, starting at minute 2:40), so he runs an automated cleaning cycle but that fails. In the past he would have to call up a service technician to come. What he can do now is use augmented reality to perform basic service himself. He scans a marker on the machine and an augmented reality experience is loaded for him that gives him instructions on how to clean the aperture.
Those are a couple of different examples of use cases, and in general what our research and our customers are telling us is that there’s a ton of value here.
This is a slightly richer example. This is a 3D printed replica of a pump from a Bobcat Skid-Steer. In this case we’re doing something a little different. We’re blending physical and digital. We’ve got the pump here in front of you that we’re looking at on an iPad [the camera pointed at the pump] and you’ll see, as I move the pump around, the 3D data superimposed on the image of the pump moves around with it, and we’ve got some different commands here. We can go get some enterprise system information like who owns this thing and when it was last serviced. We can see that there’s some sensor data around temperature and RPMs and oil.
Now there’s a problem with the oil, so how are we going to take this thing apart? What is it exactly that needs to be done? So we overlay some visual disassembly instructions, showing what screws to remove, etc.
Very slick. You’ve been demonstrating all of this on an iPad, but you can do it with goggles too?
Yes. The experience using something like Microsoft HoloLens is great because it leaves your hands free. I mentioned that GE locomotive earlier. Those engines are huge, but using HoloLens and ThingWorx View, you can bring up an image of one of the 12 engine cylinders in a GE locomotive.
That’s pretty cool.
You can actually stick your face right into it to see the parts working inside, and also instruct it to take itself apart.
Wow. So you have customers using this already?
We’ve had 1,500 companies go through a pilot program using the Studio tool, leveraging their 3D data and incorporating IoT, and the product became available last summer. We’re updating it every month, and it’s getting better with each release. The HoloLens support isn’t official yet; it’s in alpha, but it will be made available in June. Then in the Studio environment, just as I created an experience for the iPad or the iPhone, I can create it for the HoloLens as well.
So, closing the loop and coming back to IoT, wrap it all up for us.
We think of IoT in its traditional sense as a way to listen to and talk to your smart connected products, and we think about augmented reality as a way to see and experience what’s going on with those products. What this combination means is that, instead of driving, looking at the road and then looking at a digital twin of the road on your GPS, we’re mapping the GPS information to what you’re seeing through your windshield. The information is presented in context, clearly, intuitively, and it is easily digestible and much more impactful.