Drones have proven useful in monitoring large swaths of land for wildfire, but the thermal-imaging equipment they often carry tends to be expensive, heavy and power hungry.
Some of that equipment may eventually be able to stay on the ground, thanks to new research out of Clemson University. A team has developed a way of estimating wildfire temperatures using a standard camera and artificial intelligence.
Fatemeh Afghah, an associate professor of electrical and computer engineering, partnered on the research with her former master’s student, Michael Marinaccio, who has graduated and is now working in industry.
They found that artificial intelligence can learn to estimate heat by studying patterns in ordinary red-green-blue (RGB) images, the type of standard color photos taken by digital cameras.
It could set the stage for cheaper, easier-to-deploy systems that help fire managers spot lingering hotspots hidden beneath smoke, ash or vegetation.
Afghah said she likes that the research had a practical use.
“You learn the problem from the end user, you find the solution and you get it back to them to use it,” she said. “That’s a good feeling, and that’s what I enjoy about the wildfire project.”
Marinaccio said the ability to be a part of an impactful wildfire research project, while working alongside highly motivated and talented peers, defined his graduate school experience.
“I was able to develop my research, communication and interpersonal skills while completing an original project from start to finish, and I have carried what I learned into my current career,” he said. “The IS-WiN Lab is a fantastic group, and I would highly recommend that anyone interested in communications systems, drones or wildfire monitoring seek out research opportunities in the lab.”
In the project, researchers collected pairs of RGB and thermal images that were taken during actual prescribed burns. They then used those images to train AI to see how patterns in thermal data lined up with subtle visual cues in regular RGB images.
The training helps the system estimate temperatures using only RGB images, meaning a thermal camera would not be required during deployment in certain scenarios.
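The idea of learning a mapping from RGB pixel values to temperature can be illustrated with a minimal sketch. This is not the team's SAM-TIFF pipeline; it is a toy example using synthetic paired data and a simple least-squares fit, where the RGB-to-heat weights and data are invented for illustration.

```python
# Illustrative sketch only (not the SAM-TIFF method): fit a per-pixel
# mapping from RGB values to temperature using paired RGB/thermal data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for co-registered image pairs: each row is one pixel's
# (R, G, B), and "temp" is the thermal reading at that same pixel.
rgb = rng.uniform(0, 255, size=(1000, 3))
true_w = np.array([0.8, 0.3, -0.2])   # hypothetical RGB-to-heat weights
temp = rgb @ true_w + 20 + rng.normal(0, 1.0, size=1000)

# Training: least-squares fit of temperature against RGB plus a bias term.
X = np.hstack([rgb, np.ones((1000, 1))])
w, *_ = np.linalg.lstsq(X, temp, rcond=None)

# Deployment: estimate temperature from RGB alone, no thermal camera needed.
def estimate_temp(pixel_rgb):
    return float(np.append(pixel_rgb, 1.0) @ w)

print(round(estimate_temp(np.array([200.0, 120.0, 40.0])), 1))
```

In practice the researchers used deep models trained on real prescribed-burn imagery, which can pick up far subtler visual cues than a linear fit, but the train-on-pairs, deploy-on-RGB-only workflow is the same.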
The system is less accurate than high-end thermal cameras, which can measure temperatures to within a few degrees. Still, the researchers said the results are strong enough to be useful for monitoring fires when thermal equipment is unavailable or impractical.
Researchers are calling the training framework SAM-TIFF, an acronym for Segment Anything Model-Radiometric Thermal TIFF files.
The work is part of a larger Clemson effort focused on using unmanned aircraft systems to improve wildfire detection, monitoring and response.
That effort is supported by collaborators including NASA’s FireSense program and the National Science Foundation, as well as partners working directly with fire agencies.
Researchers described their findings in a paper titled “Seeing Heat with Color – RGB-Only Wildfire Temperature Inference from SAM-Guided Multimodal Distillation using Radiometric Ground Truth.”
The paper won second place in the Best Student Paper contest at the IEEE Signal Processing Society’s Asilomar Conference on Signals, Systems, and Computers.
Hai Xiao, chair of the Holcombe Department of Electrical and Computer Engineering, said the work illustrates how researchers tackle real-world problems while creating meaningful research experiences for students.
“This kind of research shows how our faculty and students work together to address challenges that matter in South Carolina and beyond,” Xiao said. “It also reflects Clemson’s focus on giving students hands-on opportunities to contribute to research with real-world impact.”