Joshua Fu stands on the pedestrian bridge attached to the John D. Tickle engineering building.

Joshua Fu Study Clears the Air for Data Quality

Clouds of debate over the value of artificial intelligence (AI) fill the skies of popular discourse these days, but UT researchers at the forefront of AI technology breathe more easily knowing what an important tool it can be for advancing science, and in some cases for literally helping clear the air.

New research published this summer by Joshua Fu, who holds appointments as Chancellor’s Professor, John D. Tickle Professor, and James G. Gibson Professor in the Department of Civil and Environmental Engineering, examines a multi-tiered method that uses machine learning, an AI approach, to improve the quality of air-pollution data. He shared his findings in an article for The Conversation.

Traditional measurement methods track pollutants from known sources such as industry, automobiles, and wildfires, and factor in meteorological information such as wind and precipitation. That approach, however, provides only limited information about the combined impact of multiple types of pollutants.

Fu’s team added machine-modeled datasets, an approach called machine learning and measurement-model fusion (ML-MMF), to deliver a more dynamic and accurate picture of a region’s air quality. When the team compared the fused predictions with actual pollution measurements, the results were 66% more accurate.
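To make the general idea concrete, the sketch below shows one way a measurement-model fusion step can be set up: a machine learning model is trained to correct modeled ozone toward monitor observations using meteorological and emissions inputs. The file name, column names, and the choice of a gradient-boosting model are illustrative assumptions, not the study’s actual pipeline.

```python
# Illustrative sketch of a measurement-model fusion (ML-MMF) step:
# train a machine learning model to correct modeled ozone toward
# monitor observations. File name, column names, and model choice
# are assumptions for illustration, not the study's actual setup.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical dataset: one row per station-day, combining modeled ozone,
# meteorology, emissions inputs, and observed ozone from a monitor.
df = pd.read_csv("station_days.csv")
features = ["modeled_o3", "temperature", "wind_speed", "precipitation", "nox", "voc"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["observed_o3"], test_size=0.2, random_state=0
)

fusion = GradientBoostingRegressor(random_state=0)
fusion.fit(X_train, y_train)

# Compare the raw model output and the fused prediction against
# held-out observations; the fusion should track measurements more closely.
raw_mae = mean_absolute_error(y_test, X_test["modeled_o3"])
fused_mae = mean_absolute_error(y_test, fusion.predict(X_test))
print(f"raw model MAE: {raw_mae:.2f} ppb, fused MAE: {fused_mae:.2f} ppb")
```

In a setup like this, the value of the fusion shows up as a smaller error against held-out observations than the raw model output alone.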

“We combine them together, so we fix the uncertainty using ML-MMF,” said Fu. “We saw that the data represented more real-world measurements.”

The datasets measure levels of man-made ozone pollution, which forms when two types of emissions react in sunlight: nitrogen oxides (NOx) from power plants and volatile organic compounds (VOCs) from petroleum-based fuels used in automobiles and industry. These more detailed measurements can inform better ways to reduce air pollutants.

“If we reduce them, they will not form ozone and particulate matter,” said Fu.

The approach could also inform usage policies that help alleviate respiratory illnesses and other negative effects of air pollution.

His study, “Ozone Response Modeling to NOx and VOC Emissions: Examining Machine Learning Models,” looked at daily ground-level ozone averages in a case study in Taiwan, where the national Environmental Protection Agency maintains comprehensive air-quality monitoring stations across the country.
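As a rough illustration of the kind of metric involved, a daily ground-level ozone average can be computed per monitoring station from hourly readings. The snippet below is a generic sketch with hypothetical file and column names, not the paper’s exact procedure.

```python
# Sketch: daily ground-level ozone averages per monitoring station,
# computed from hourly readings. File and column names are hypothetical.
import pandas as pd

hourly = pd.read_csv("hourly_ozone.csv", parse_dates=["timestamp"])

daily_avg = (
    hourly.groupby(["station_id", hourly["timestamp"].dt.date])["o3_ppb"]
          .mean()
          .rename("daily_o3_ppb")
          .reset_index()
)
print(daily_avg.head())
```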

“We chose Taiwan because we can get very good datasets, like hospital data, the air-quality modeling data, and also the measurement data,” said Fu. “We don’t have an easy way to combine all that data together in the world.”

He believes the technique could eventually be applied in areas like Los Angeles or New York City, or in high-impact events such as the 2023 wildfires in Canada and other countries. He hopes this combination of techniques will refine air-quality guidelines and improve quality of life from an environmental standpoint. A follow-up study in the works will add information about public health impacts, using data such as mortality rates, and about potential economic costs.

“This is going to introduce this methodology to the community,” said Fu. “If you want to get a better quality of data than current modeling guidelines, this is one good approach.”


Contact

Randall Brown (865-974-0533, rbrown73@utk.edu)