Using AI to Improve Thermal Imaging
April 29, 2020

Pablo Garcia

At CrowdAI, we help people make the most of their imagery—whatever the source. One type of imagery we’ve seen increase in popularity in recent years is infrared (IR) or thermal imagery. We’ve worked with these bands of imagery and video for some time now, and the opportunities to leverage AI to improve the value of IR are bigger than you may think.

Of course, COVID-19 is at the forefront of our minds as a team right now. Given our experience creating AI models for IR, we decided to explore how we might be able to help. We wanted to see what’s possible when AI and IR are combined, and turned to the FLIR ONE Pro camera we use for R&D. Could it be used for something like non-invasive temperature screening?

When it comes to temperature screening using IR, we’re aware of two main problems, and we think AI can help:

  1. The forehead is not a very good proxy for core temperature, but it turns out that the inner canthus—that little triangle in the corner of your eye—is a decent one. But this leads to the next problem.
  2. It’s hard to get a quick, reliable temperature reading for a localized region within the image. You can look at the image and see the hotspots easily enough, but what if you want the average temperature of a specific part of the image, like the inner canthus? As humans, we lack the ability to gather this type of data quickly, but this is exactly where AI can shine (a minimal sketch of the idea follows this list).
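
To make the idea concrete, here is a minimal sketch (not our production code) of the region-averaging step. It assumes the thermal frame has already been decoded into a per-pixel temperature array in degrees Celsius, and that the polygon comes from an annotation or a model prediction.

```python
import numpy as np
import cv2  # OpenCV is used here only to rasterize the polygon into a mask


def mean_region_temp(thermal_frame: np.ndarray, polygon: np.ndarray) -> float:
    """Average temperature inside one polygon.

    thermal_frame: H x W float array of per-pixel temperatures in deg C.
    polygon: N x 2 array of (x, y) vertices outlining the region of interest.
    """
    mask = np.zeros(thermal_frame.shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [polygon.astype(np.int32)], 1)
    return float(thermal_frame[mask == 1].mean())
```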

Examining Thermal Video
Using our FLIR ONE Pro with an iPhone 7 on a tripod, we recorded dozens of hours of footage of team members walking in and out of frame (in their own homes—safety first). With the resolution of the FLIR ONE Pro, we can’t see just the inner canthus itself, so we instead annotated the area immediately surrounding the corner of the eye. Team members used heat pads to elevate their body temperature in some of the videos. We ingested the videos into our platform and used our video annotation tool to draw polygons around the inner eye area of any face that was in frame.
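
For illustration only, a single frame’s labels might be exported as something like the record below. This is a hypothetical, simplified schema, not our annotation tool’s actual format.

```python
# Hypothetical, simplified annotation record for one frame of thermal video.
frame_annotation = {
    "video": "walkthrough_clip_03.mov",  # hypothetical file name
    "frame_index": 412,
    "regions": [
        {
            "label": "inner_canthus_area",
            # polygon vertices as (x, y) pixel coordinates in the thermal frame
            "polygon": [(61, 38), (67, 37), (69, 43), (62, 44)],
        },
        {
            "label": "inner_canthus_area",
            "polygon": [(84, 37), (90, 38), (89, 44), (83, 43)],
        },
    ],
}
```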

The resolution of the FLIR ONE Pro allows us to detect elevated temperatures on a person, but doesn’t immediately help us understand a more difficult concept, such as core temperature.

We want to be clear that these videos are strictly for demonstrating that AI can be used to detect elevated temperatures in thermal video. While the data wasn’t collected in a real-world deployment setting, we think the video can serve as a proxy for what might be used in a production environment.

AI enables more robust data for infrared
When run against the validation set that was set aside at the beginning, our model was able to reliably detect the area around the inner canthus of each eye. While the polygons change size a bit throughout a video clip (especially when someone moves away from the camera), the overall results suggest this approach can be useful in production. After the model labels the appropriate regions, we can simply take the average temperature value of each polygon, either providing two readings per face (one for each eye) or averaging the two polygons to get a single value per face. Our model processes the video in real time, at 15 frames per second or faster.
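
Building on the mean_region_temp sketch above, the per-face aggregation might look like the following. The model interface is hypothetical; we assume it returns, for each detected face, the list of polygons it found around that face’s inner canthi.

```python
from statistics import mean


def face_readings(thermal_frame, faces_polygons):
    """faces_polygons: one list of inner-canthus polygons per detected face."""
    readings = []
    for polygons in faces_polygons:
        if not polygons:
            continue  # skip faces where no inner-canthus region was segmented
        per_eye = [mean_region_temp(thermal_frame, poly) for poly in polygons]
        readings.append({
            "per_eye_c": per_eye,         # one reading per segmented eye region
            "face_avg_c": mean(per_eye),  # averaged reading for the whole face
        })
    return readings
```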


By labeling the area immediately around the inner canthus (green polygons), our model can instantaneously provide temperature readings for just that region, ignoring other things that might throw off that reading—such as another person in the background.

But why do it this way—why not just take the highest temperature value in the frame and call it a day?

Both our experience and the academic literature show that naively taking the highest temperature value in the frame won’t accurately represent a person’s core temperature. A number of factors, from ambient temperature and humidity to the presence of multiple people in the frame, immediately complicate this simplistic approach. This is where AI is helpful: by first reliably detecting a face and then segmenting just the area around the inner canthus, we can get a more reliable and accurate measurement of core temperature.
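
As a rough illustration of the difference, here is a minimal sketch comparing the naive frame-maximum reading with a region-based reading built on mean_region_temp from the earlier sketch; the inputs are the same assumed per-pixel temperature array and model-predicted polygons as above.

```python
import numpy as np


def naive_reading(thermal_frame: np.ndarray) -> float:
    # Hottest pixel anywhere in the frame: easily skewed by a hot drink,
    # a radiator, or another person in the background.
    return float(thermal_frame.max())


def canthus_reading(thermal_frame: np.ndarray, canthus_polygons) -> float:
    # Average only over the segmented inner-canthus regions of one face.
    per_eye = [mean_region_temp(thermal_frame, poly) for poly in canthus_polygons]
    return float(np.mean(per_eye))
```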

Improving IR for everyone
Of course, infrared images and video aren’t just being used for measuring temperatures. Whether detecting micro-leaks in gas pipelines or monitoring manufacturing processes, this type of visual data has a lot of potential. Many industries are already exploring IR, and we think that’s all for the better.

Infrared imaging can be used for a growing number of applications across industries. For example, leaks from natural gas infrastructure are undetectable to the unaided eye, but can be seen in IR imaging.

Where thermal imaging is already in use or could be useful, AI can streamline the process and reduce the need for collecting data via direct contact. Running algorithms on existing sensors in real time would allow for more objective and precise IR readings when monitoring anything from large crowds to construction sites.

If you have an interesting IR use-case, we’d love to hear from you about how you’re using this sensor and how AI can make it even better.
