A new international climate modeling study led by researchers at King Abdullah University of Science and Technology (KAUST) highlights different potential scenarios for the future climate of the Arabian Peninsula, depending on which climate policies are implemented.[1]
The Arabian Peninsula has long been known for the high temperatures and water scarcity that make living and working there a challenge. These problems will only worsen with the temperature increases predicted by all climate models, affecting a population that is expected to double between now and the end of the century.
KAUST Emeritus Professor Georgiy Stenchikov, who this month was part of the team of KAUST and international researchers awarded the ACM Gordon Bell Prize for Climate Modelling, often described as the “Nobel Prize” of high-performance computing, led the Arabian Peninsula study. The work applied a sophisticated tool known as “statistical downscaling” to climate models covering the Middle East region.
“We applied statistical downscaling to 26 global climate models under different greenhouse gas emissions scenarios, giving us a spatial resolution of 9 km,” said Stenchikov. “This fine resolution enhances our ability to detect and analyze regional warming and hotspots more effectively, presenting the most accurate regional-scale prediction of temperature change over the Middle East and North Africa.”
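In practice, statistical downscaling corrects a coarse global-model field against a fine-scale reference climatology. The sketch below illustrates one common generic variant, quantile mapping, in Python; the function name, array shapes and made-up temperature values are assumptions for illustration and are not taken from the study itself.

```python
import numpy as np

def quantile_map(coarse_series, obs_reference, model_reference):
    """Bias-correct a coarse-model temperature series by quantile mapping.

    coarse_series   : model values to downscale (e.g. a future projection)
    obs_reference   : high-resolution observations for a historical period
    model_reference : model values for the same historical period
    """
    quantiles = np.linspace(0.01, 0.99, 99)
    obs_q = np.quantile(obs_reference, quantiles)
    mod_q = np.quantile(model_reference, quantiles)
    # Find where each projected value sits in the model's historical
    # distribution, then map it onto the observed distribution.
    ranks = np.interp(coarse_series, mod_q, quantiles)
    return np.interp(ranks, quantiles, obs_q)

# Hypothetical example for a single grid cell of a single model.
rng = np.random.default_rng(0)
obs_hist = rng.normal(33.0, 2.0, 1000)    # observed summer temperatures (degrees C)
mod_hist = rng.normal(31.5, 2.5, 1000)    # same period in the coarse model
mod_future = rng.normal(35.0, 2.5, 1000)  # projected values to downscale
print(quantile_map(mod_future, obs_hist, mod_hist).mean())
```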
After applying this technique, Stenchikov, along with his colleagues and lead author Abdul Malik, found that some regions in the Middle East are heating at rates three times faster than global averages.
In the best-case scenario, in which greenhouse gas emissions reach net zero by 2050 in line with the goals of the Paris Agreement, animations from the modeling reveal that temperatures on the Arabian Peninsula will still rise by more than 2.5 degrees Celsius before the year 2100, with some parts warming 3.5 times faster than the global average.
In the more alarming case, under what climate scientists call the “high emission scenario,” several provinces in Saudi Arabia, including Riyadh, may see average temperatures rise by more than 9 degrees Celsius this century. Such rises are expected to put severe stress on both the habitability and economic productivity of the region, reiterating the importance of good climate policy.
This research highlights KAUST’s focus on tackling regional climate issues, aligning with its efforts highlighted at COP16 to combat desertification and promote sustainability.
High value metals, such as lithium, could be extracted directly from seawater, lake brines, or could be recycled from electronic waste, a study of designer nanoporous membranes suggests. The membranes incorporate ring-shaped ‘macrocycle’ molecules, which form precisely defined pores that permit only the target metal to pass[1]. Macrocycle membranes can also efficiently purify challenging mixtures of high value chemicals, such as pharmaceutical ingredients, the KAUST research team has shown[2].
Separating multicomponent mixtures is a core part of industrial activity ranging from raw minerals processing to fine chemical and pharmaceutical production. These steps have a large environmental footprint, however. Most separations involve energy-intensive heat-driven processes such as distillation and evaporation. “More effective separation methods would lead to a much more sustainable and profitable chemical industry, reducing the need for carbon capture at the end of the process,” says Suzana Nunes, who led the research.
“It is crucial to develop new materials that enable more efficient separations,” says Gyorgy Szekely, who co-led part of the work. A few industries – notably, seawater desalination – have employed membranes as an energy-efficient alternative to heat-driven separations. The membranes they use consist of tightly woven thin polymer sheets, through which water molecules can squeeze but salt cannot.
“Commercial membranes mostly separate water from salt, or separate large solutes from very small ones,” Nunes says. But because these membranes lack precisely defined pores, they are ineffective for finer-grained separations, she adds.
To efficiently separate mixtures of similarly sized molecules, Nunes and her colleagues developed membranes that incorporate ring-shaped macrocycles into their structure. “The macrocycles can act as a pore, tuned specifically for the size of molecule or ion to be transported,” Nunes says.
The team’s first challenge was to develop a versatile and scalable method for making membranes with embedded macrocycles, Szekely notes. The researchers were able to adapt interfacial polymerization, an existing method that industry already uses to manufacture membranes.
This pragmatic choice should ease industrial uptake. “Our focus is to invest in methods and materials that could be fabricated at the scales required by industry,” Nunes says.
The membrane was made by combining the macrocycle component with a molecular linker. The interfacial polymerization method brings these two components together under controlled conditions by exploiting the fact that water and organic solvents do not mix. The team prepared a macrocycle component that dissolved in water and a linker component that dissolved in an organic solvent; when the two immiscible liquids were combined, the membrane self-assembled as a thin film at their interface.
Nunes and Szekely showed that when they made membranes from a macrocycle called 18-crown-6, they could separate mixtures of closely related pharmaceutical ingredients. “The membranes showed excellent selectivity,” Szekely says. “We could separate solutes with only small differences in their molecular weight, which is highly sought after by the pharmaceutical and fine chemical industries.”
Nunes also showed that, using a macrocycle called a cyclodextrin, the membrane could selectively concentrate valuable lithium or magnesium from salty mixtures mimicking seawater and industrial brines.
Macrocyclic molecules are readily available in a wide range of sizes, suggesting that ultra-selective macrocycle membranes could be tailor-made for specific industrial separations. Mixtures of metals generated from electronic waste recycling could also be separated by this method, Nunes says. “We are now investigating different macrocycles, different forms of self-assembly and different applications,” she says. “In collaboration with industry, we are also working on scaling up the membranes.”
From skyscrapers to nanomaterials, detailed blueprints are an essential element of structural design, delineating how simple building blocks can be combined to create complex structures. A new approach for creating chemical blueprints of unprecedented complexity for porous crystalline structures such as metal-organic frameworks (MOFs) has been developed by researchers at KAUST[1].
The “merged nets” approach to MOF-blueprint creation has implications for the design of bespoke MOFs for sustainability-related applications including gas storage, catalysis and molecular separations.
Design blueprints for MOFs and related periodic porous materials called covalent organic frameworks (COFs) are based on mathematical graphs called periodic nets. “By providing predefined patterns, periodic nets allow researchers to select molecular building blocks with compatible geometries, enabling their precise assembly into desired structures,” says Hao Jiang, a research scientist in Mohamed Eddaoudi’s research group. “This approach has facilitated the systematic construction of MOFs and COFs with targeted structures and properties,” Jiang says.
Previously, researchers could rely only on the 53 edge-transitive nets, which have just one kind of edge, as blueprints for the rational design of MOFs and COFs. “These simple nets are insufficient as blueprints for more complex structures that are essential for achieving advanced properties and applications,” Jiang says. As a result, the synthesis of complex multicomponent MOFs had remained a slow and tedious process of trial and error.
The first step toward the rational design of complex MOFs came in 2018, when the team made a simple MOF called Tb-spn-MOF-1 based on the edge-transitive net known as “spn.” This MOF featured open sites that allowed additional molecular linkers to be placed as connectors between metal clusters within the material’s porous structure. The team realized that these added linkers and the metal clusters they connected could be viewed as a separate MOF based on another known designable net, “hxg.” “The whole structure merged the spn and hxg nets into a more complex net,” Jiang says.
By merging two simple nets, the team had hit upon a blueprint that was far more complex than its component parts. They then set out to enumerate all possible merged nets from the 53 edge-transitive nets. “Merging requires specific compatible net pairs,” says Jiang. To identify compatible pairs systematically, the team developed the concept of “signature nets,” which captures key structural information shared between nets. “If two nets share the same signature net, they may be compatible for merging,” Jiang says.
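The pairing rule Jiang describes can be pictured as a simple lookup: every edge-transitive net is assigned a signature net, and any two nets sharing a signature become candidates for merging. The sketch below is purely illustrative; the signature assignments are hypothetical placeholders rather than actual topological data, and only the spn and hxg pairing reflects the example mentioned above.

```python
from itertools import combinations

# Hypothetical signature assignments for a few edge-transitive nets.
# In the actual work, signatures are derived from each net's topology;
# here they are placeholders used only to illustrate the pairing rule.
signature_of = {
    "spn": "sig-A",
    "hxg": "sig-A",   # spn and hxg share a signature, so they are candidates for merging
    "pcu": "sig-B",
    "dia": "sig-C",
}

def candidate_merges(signatures):
    """Return all net pairs that share a signature net."""
    return [(a, b) for a, b in combinations(signatures, 2)
            if signatures[a] == signatures[b]]

print(candidate_merges(signature_of))   # [('spn', 'hxg')]
```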
The discovery has generated 353 new blueprints for complex MOF design. “Using the robust design capabilities of our merged net framework, we have proposed over 100 multicomponent MOF platforms,” Eddaoudi says. Merged net blueprints can be divided into four structural categories based on the periodicity of the nets to be merged, ranging from three-periodic to zero-periodic. “We validated the practicality of merged net design by crystallizing new materials representing all four categories,” he says.
The merged net methodology has great potential to accelerate the discovery of novel porous materials with impact on real-world applications in energy, environmental sustainability and beyond. “Leveraging the unprecedented design capabilities that merged nets offers, we have already started a multimillion-dollar project with Aramco to develop practical materials for the energy-efficient and cost-effective direct capture of CO2 from the atmosphere,” Eddaoudi says.
Building safe skyscrapers and stirring cups of coffee have something in common: both involve a phenomenon known as “turbulent flow,” which, in these cases, is the movement of air particles around a tower or liquid around a spoon. Now, with the aid of state-of-the-art cameras, mechanical engineers at KAUST have discovered unexpected patterns in this seemingly chaotic flow under certain conditions[1]. Their findings could help improve the construction of airplanes and underwater vehicles, as well as the efficiency of flow in pipes and industrial equipment.
Saudi Arabia is currently constructing the Jeddah Tower, planned to be the world’s tallest structure at one kilometer high. “It is vital for comfort and safety to understand how turbulent wind will shake the Jeddah Tower,” says KAUST’s Sigurdur Thoroddsen. “Turbulent flow also governs weather patterns and mixing in ocean currents and river flows.”
Physicists have spent around a century studying how vortices are generated as fluids move through pipes and around objects. Until now, however, these vortices were thought to be generated chaotically, so their behavior could not be accurately predicted or mitigated in advance. But in a new study, Thoroddsen and colleagues Abdullah Alhareth, Vivek Mugundhan and Kenneth Langley discovered some unexpected patterns in their length, occurrence and alignment.
The team used four high-speed cameras with pulsed lasers to create a precise 3D map of fluorescent microparticles added to water, as the liquid was pumped from a 500-liter tank through a roughly 3-meter-long tunnel, which narrowed along its length. “We simultaneously tracked the position and speed of around 200,000 particles to reconstruct the vortices,” says Alhareth. “The experiment required a huge amount of computer memory and around six days of supercomputer time to analyze each dataset.”
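To give a sense of what reconstructing vortices from such data involves computationally, the following generic sketch estimates vorticity from a velocity field using NumPy, assuming the tracked particle velocities have already been interpolated onto a regular 2D grid. It is an illustration only, not the team’s actual analysis pipeline.

```python
import numpy as np

def vorticity_z(u, v, dx, dy):
    """z-component of vorticity, dv/dx - du/dy, on a 2D grid."""
    dvdx = np.gradient(v, dx, axis=1)
    dudy = np.gradient(u, dy, axis=0)
    return dvdx - dudy

# Hypothetical example: a single ideal vortex centred in a small domain.
y, x = np.mgrid[-1:1:64j, -1:1:64j]
u, v = -y, x                                  # solid-body rotation
omega = vorticity_z(u, v, x[0, 1] - x[0, 0], y[1, 0] - y[0, 0])
print(omega.mean())                           # ~2 for solid-body rotation
```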
The team discovered repeating vortex structures, aligned with the flow, which were stretched out and spun quickly. “This is similar to the way that rotating figure skaters will spin faster when they draw their arms towards their body,” says Thoroddsen. “We were surprised how prominent these vortices were, how often they appear, how quickly they form and how long they last.”
Alhareth holds a joint appointment at the King Abdulaziz City for Science and Technology, and this partnership with KAUST enabled him to extend his understanding beyond water flow to air flow. “This gave me the opportunity to consider aerospace applications of our turbulence results compared with those obtained in a wind tunnel,” Alhareth explains. The findings could thus improve modeling of turbulent flows in aerodynamics, combustion, submarine and space applications, he says.
Predicting the formation of vortices will also help enable the design of clean and efficient engineering devices for a sustainable environment, says Mugundhan. “The vortices could enhance mixing and increase heat transport, or they could increase pressure drops along a pipe, so you need more pumping,” he says. “Our work will help people to model where they can enhance performance with vortices and anticipate where vortices will be a hindrance.”
Hydrogen sulfide is a toxic component found in certain types of natural gas and can be generated during refining processes. Now, KAUST researchers have developed a technology that transforms this waste product into sulfur and hydrogen, both useful industrial chemicals[1]. By pinpointing exactly how this reaction works, the research paves the way for further improvements in the catalyst’s performance.
“We have developed a method to engineer a catalyst that has reached very high levels of activity in laboratory conditions,” says KAUST’s Pedro Castaño, who co-led the work in collaboration with the petroleum and natural gas company Saudi Aramco. “A significant aspect of this project is that it is a true partnership, with a very effective exchange of people, ideas and responsibilities,” Castaño says.
Much of Saudi Arabia’s natural gas reserves contain hydrogen sulfide. Aside from its toxicity, hydrogen sulfide also contributes to atmospheric pollution and acid rain, so it must be removed from this “sour” natural gas. The existing industrial Claus process then converts hydrogen sulfide into water and sulfur, which is used in fertilizers and other products.
Some catalysts can turn hydrogen sulfide into sulfur and hydrogen, a clean-burning fuel that could reduce the greenhouse-gas emissions of transportation and industrial processes. In principle, this could increase the economic value of hydrogen sulfide, but to date none of the catalysts have been effective enough to deploy at industrial scale. So the KAUST team studied one of the most promising catalysts, molybdenum disulfide (MoS2), to find ways to improve it.
The atoms in MoS2 are ordered into flat sheets that can adopt two different arrangements called phases. Incoming hydrogen sulfide molecules may bind at several different locations on the faces and edges of these sheets. At some of these sites, the molecules break apart so that their atoms can recombine into hydrogen and sulfur. The efficiency of this process depends on factors such as how strongly hydrogen sulfide binds to the catalyst, and how easily sulfur and hydrogen escape once they have formed.
The researchers’ computer simulations of the two phases showed that the ideal sites lie along the edges of the MoS2 sheets, with sites containing more prominent sulfur atoms being particularly important. “We need a particular balance of edges and imperfections on the MoS2, and when we understand this balance we can engineer the catalyst accordingly,” says Castaño.
The researchers then made three different MoS2 catalysts in the lab so they could study their surfaces and test their activity. They found that a catalyst prepared with a small amount of oxalic acid was the best performer, because the additive helped to generate more of the desirable edge sites.
The experiments also revealed that the strength of hydrogen-atom binding to the catalyst plays a key role in dictating the overall rate of the reaction. By applying the same theoretical approach to other materials, the team discovered that vanadium disulfide and niobium disulfide might also be promising catalysts for this reaction.
All these findings should help to further optimize such technologies so that they can be used for hydrogen sulfide decomposition at an industrial scale.
Scientists are sounding the alarm: coral reef restoration is not a distraction, but a crucial weapon in the battle against climate change and other threats to these vital ecosystems. While some critics question the effectiveness of restoration efforts, a recent paper published in Nature Climate Change[1] argues that dismissing restoration undermines a key component of coral reef conservation.
The authors — a group of leading experts in the field including lead authors Professors Raquel Peixoto and David Suggett from KAUST — acknowledge the challenges of restoring damaged reefs, particularly in the face of ongoing climate change. However, they stress that restoration plays a vital role, particularly in responding to smaller-scale localized disturbances.
“Coral reefs need to be retained, and restoration practices need to be studied and optimized while we reach carbon neutrality,” notes Peixoto. She further remarks on the urgency of the situation: “The ongoing decline of coral reefs is undeniable, and local communities have no option but to address climate and environmental impacts that are already threatening their livelihoods and homes.”
The field of coral reef restoration has made significant strides in recent years, with growing evidence of its effectiveness. Projects such as Hope Reef in Indonesia and Laughing Bird Caye National Park in Belize have demonstrated the ability of restoration efforts to shift reef carbonate budgets from negative to positive, fostering recovery and resilience.
Technological advancements have been instrumental in this progress. Genotyping and improved coral husbandry techniques have helped preserve genetic diversity in Florida’s coral populations. Meanwhile, larval enhancement projects in the Philippines have successfully accelerated recovery of breeding coral populations in areas where natural recruitment was limited.
The authors acknowledge the challenges and therefore emphasize the importance of tailoring restoration approaches to local contexts. Understanding and managing local stressors is paramount, and strategies must be customized to address the specific socio-ecological challenges of each site. This includes incorporating resilience-oriented frameworks to ensure the long-term success of interventions.
“Reef restoration has been criticized as ineffective and unscaleable based on outcomes from ‘fast fail’ experiments used to optimize the practice rather than the actual outcomes of restoration, which themselves are starting to show real promise,” says Suggett. He adds: “We need to move past the narrative that restoration is an alternative to — and a convenient distraction from — tackling climate change. Both are needed to secure a future for reefs and the millions of people worldwide who depend on them.”
While large-scale restoration efforts will be necessary as climate change intensifies, the authors also highlight the critical role of smaller-scale projects, particularly in low- to middle-income countries. Often driven by the need to protect livelihoods and resources, these local initiatives are making significant progress in preserving and restoring reefs.
The authors advocate for a comprehensive coral conservation strategy encompassing three pillars: mitigating local stressors, reducing global climate threats, and actively restoring reefs. They caution against the dangers of inaction, arguing that the risks of action must be weighed against the potential consequences of doing nothing.
Coral reef restoration is not a silver bullet, but a vital part of a multifaceted solution to the complex and urgent coral crisis. By combining active restoration with efforts to address climate change and local threats, we can give these vital ecosystems a fighting chance for survival.
The publication of this paper coincides with, and will be at the center of discussions at, “Reef Futures,” the world reef restoration conference, where the world’s leading scientists, practitioners, policymakers and investors convene to chart critically needed steps to advance restoration effectiveness worldwide.
A method that can grow a useful insulating material into exceptionally high-quality films just one atom thick, and that is suitable for industrial-scale production, has been developed by an international team led by Xixiang Zhang from KAUST[1].
The material, called hexagonal boron nitride (hBN), is used in semiconductor devices and can also enhance the performance of other two-dimensional (2D) materials such as graphene and transition metal dichalcogenides (TMDs).
Researchers can combine 2D materials to build tiny electronic components for quantum computing, electronic communications and other applications. While most 2D materials conduct electricity, hBN is one of the few that is an insulator, making it an indispensable component within many of these devices.
In the laboratory, hBN flakes are often peeled from bulk samples of the material, a time-consuming and size-limiting approach that is unsuitable for mass manufacturing. Alternatively, an industrial process called chemical vapor deposition (CVD) can produce hBN by decomposing a precursor called ammonia borane. Boron and nitrogen atoms released from the precursor then form triangular islands of hBN on a copper foil, and these islands gradually grow larger until they join together into a continuous honeycomb lattice.
The team has improved this process by growing hBN from hexagonal islands instead, producing a higher-quality film. “Hexagonal islands have fewer defects, making the final film more uniform and reliable,” says Zhang. The method depends on adding a trace of oxygen during the growth process.
As hBN islands grow on copper, their edges can be zigzags of either boron or nitrogen atoms. In the triangular islands, all three edges feature nitrogen atoms. In contrast, the hexagonal islands formed by the new method have three nitrogen edges and three boron edges.
The researchers’ theoretical calculations show that nitrogen edges are usually more energetically stable than boron edges, explaining why triangular islands are formed during CVD. But oxygen interacts with the islands and the copper foil in a way that gives the nitrogen and boron edges almost identical stabilities. This means the two types of edges can grow at the same rate, which generates hexagonal islands.
The researchers used techniques such as atomic force microscopy and high-resolution transmission electron microscopy to study the islands and films formed by their method. They found that it creates single-crystal films of hBN that are free of pinholes, exhibit low defect densities, have a uniform thickness, and possess excellent insulating properties. “This makes it especially suitable for high-performance 2D-material electronic devices and offers enhanced robustness for nanodevices,” says Bo Tian, who was part of the KAUST team, and is now based at Nanyang Technological University in Singapore.
The team used the method to grow a 25 x 70 mm film of hBN, and larger areas should be possible. “The process is now limited only by the size of the CVD system or the substrate, so it is suitable for industrial production,” Tian says. The researchers are now studying the mechanism of CVD growth of hBN in more detail, to further improve the quality and increase the size of the films they can produce.
A peer-reviewed mathematical proof published by a KAUST co-led international research team in PLOS[1] has shown that the recently published Assembly Theory (AT) approximates existing theories of algorithmic complexity and information compression. The findings cast doubt on some of the new theory’s claims to explain a range of natural phenomena from biology and evolution to physics, the Universe, and even the search for extraterrestrial life.
“We were motivated to study AT because of its broad claims and widespread misconceptions regarding information compression, computability and algorithmic complexity, which have been the focus of our group’s research for over a decade,” says Jesper Tegner from KAUST’s Living Systems Lab, who, along with Hector Zenil of King’s College London, led the research team. “AT’s attraction lies in its interdisciplinary appeal, offering the idea of a unified framework that could potentially quantify complexity across various domains, in particular selection and evolution, and solve longstanding problems. So, we set out to rigorously scrutinize AT’s claims, particularly regarding its novelty.”
The idea behind AT is to measure the complexity of molecular structures based on the number of steps required to assemble them from basic building blocks, calculated through an “assembly index.” The researchers observed that this is very similar to a popular statistical compression method called Lempel-Ziv (LZ) compression, such as that used for ZIP and PNG files, which identifies patterns in data and shrinks the file size by replacing repetitive elements with shorter representations.
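That counting idea can be made concrete in a few lines of Python. The sketch below performs a simple LZ76-style parse that counts how many distinct phrases, or “building blocks,” are needed to reproduce a string; it is a generic illustration, not the code used in the paper.

```python
def lz76_complexity(s):
    """Number of distinct phrases produced by a simple LZ76-style parse."""
    i, n, phrases = 0, len(s), 0
    while i < n:
        length = 1
        # Extend the current phrase while it still occurs earlier in the string.
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        phrases += 1
        i += length
    return phrases

print(lz76_complexity("abababababab"))   # 3: highly repetitive, few building blocks
print(lz76_complexity("abcdefghijkl"))   # 12: every character is a new phrase
```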
“This applies not just to digital data; it is effectively the same as counting how many unique blocks are needed to recreate any original object, as our own measures have always done,” says Tegner. “Our collaboration, which includes experts in cell and molecular biology, complexity science and information theory, had previously shown the strong connections between compression and selection and evolution. What the assembly index does is in fact already done better by an index introduced by our collaboration over a decade ago called the Block Decomposition Method, so it is important for us to address these misunderstandings for the scientific community.”
Through their mathematical proof, the research team demonstrated the direct relationship between AT’s assembly index and LZ compression, as well as to a measure of uncertainty in a system known as Shannon entropy.
“A well-shuffled deck of cards for example would have a high Shannon entropy, while a perfectly ordered deck would have a low entropy,” says Zenil. “Assembly theory uses the same principle, but does so with many extra steps.”
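Shannon entropy itself is straightforward to compute. The following generic example, not taken from the paper, measures the entropy of the symbol frequencies in a string: zero bits for a sequence built from one repeated symbol, two bits for four equally frequent symbols.

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy (bits per symbol) of the symbol frequencies in seq."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaaaaaaaaaa"))   # 0.0 bits: a single repeated symbol
print(shannon_entropy("abcdabcdabcdabcd"))   # 2.0 bits: four equally frequent symbols
```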
The mathematical proof showed the step-by-step equivalence between AT and existing methods based on information theory and algorithmic complexity, confirming that existing methods can achieve similar results without the need for AT’s additional conceptual layers.
“AT garnered attention in the popular media despite its unresolved theoretical challenges,” Tegner says. “This speaks to the power of framing and the appeal of interdisciplinary ideas, even if not fully substantiated. It is a cautionary tale, a reminder to maintain a critical perspective, particularly regarding claims of universal solutions to complex scientific problems.”
Creating electricity from sunlight is a promising route to renewable and carbon-free energy, yet the processes to produce this electricity need to be sustainable. A team at KAUST has worked with international colleagues to make one group of emerging materials more efficient, durable and stable.
Silicon underpins the prevailing commercial solar-cell technology; however, emerging alternatives made from organic materials, or organic–inorganic hybrids, hold potential if they can be produced more sustainably. These alternatives can be light, flexible and even transparent, making them useful in a much wider range of practical applications.
The Organic/Hybrid Materials for Energy Applications Laboratory (OMEGALAB) focuses on the development of these organic solar cells. Headed by Derya Baran, the lab aims to develop sustainable electronic materials and devices using low-energy processes and with minimal impact on the environment.
Organic solar cells neatly fit this ambition. They can be manufactured using so-called roll-to-roll printing, which may be less expensive and less energy-intensive than making traditional solar cells. This, together with the fact that organic materials can be environmentally friendly and abundantly available, reduces the technology’s impact.
The efficiency of organic solar cells has been improving and is now as high as 20 percent in the lab, so the scientists at OMEGALAB are also working toward another important goal: making the devices more durable.
One factor that limits the performance of an organic solar cell lies in its morphology, or the arrangement and structure of the different organic components. Stress in the device caused by temperature can alter the morphology and thereby degrade efficiency over time.
The team worked with colleagues at Jianghan University, China, and the University of the Basque Country, Spain, to show that the thermostability of organic photovoltaics can be significantly improved by introducing so-called thermoset materials[1].
“A cross-linked thermoset matrix is a type of polymer network formed by chemical bonds that create a three-dimensional structure,” explains Jianhua Han, a former postdoctoral fellow from KAUST, now at Julius-Maximilians-Universität Würzburg, in Germany.
“In this matrix, the polymer chains are interconnected through covalent bonds, known as cross-links, which prevent the material from softening or melting when exposed to heat,” explains Han. Using these materials, the team doubled the energy generated by their organic solar cell during an eleven-week, outdoor test.
A further challenge is that some of the changes to device architecture that aim to improve efficiency have had the unintended consequence of worsening stability. Working with colleagues from Flinders University in Australia and Middle East Technical University in Turkey, the group demonstrated that there is a trade-off between the two[2].
In this work, the team looked at the role of self-assembled monolayers. Previously, these have been shown to improve organic solar-cell operation by minimizing absorbance at the interface with the indium tin oxide (ITO) electrical contact, or anode.
They showed that the molecules in the self-assembled monolayer are not photochemically stable, degrading and decomposing under light exposure. “We establish a direct correlation between the properties of so-called self-assembled monolayers in organic solar cells and their performance and stability, an area that has been largely unexplored in the field,” explains KAUST research scientist Anirudh Sharma. The team showed that inserting a nickel oxide layer between the ITO and the self-assembled monolayer reduced the problem.
Baran believes that KAUST is the perfect place to do this type of important research. “The campus location is close to the equator, so it is exposed to plenty of direct sunlight,” she says. “It is also hot and humid, conditions where even the performance and reliability of commercial silicon solar cells need to be improved. So, any technology proven at KAUST should also be useful elsewhere in the world.”
Checking this world-wide applicability of their devices is the team’s next step. “We will conduct a global ‘round robin’ study to understand how organic photovoltaics work in different parts of the world and compare it with perovskites,” Baran says. “This will be a collective effort in collaboration with several international institutions in a study that is the first of its kind.”
A computer model that can reliably identify gases under a wide range of conditions could be used in automated systems that detect volatile organic compounds[1]. It has potential in sectors such as environmental monitoring, healthcare, and energy, where precise gas detection is essential for safety and efficiency.
“This work has significant implications for security and decision-making, essentially creating a sensor that confidently reports its predictions,” says KAUST’s Aamir Farooq, who led the work.
The model interprets data gathered by a technique called spectroscopy, which measures how gas molecules absorb particular wavelengths of light or microwaves, for example. This spectrum serves as a unique fingerprint for each gas. However, it is often challenging to detect the presence of one gas within a complex mixture, because features of different gas spectra can overlap and obscure one another.
That is where machine learning models can help. Machine learning involves feeding information to a computer algorithm so that it gradually ‘learns’ to recognize patterns and connections in the data. Once this training is complete, the model can then interpret entirely new data.
Researchers have previously trained machine learning models to identify the spectroscopic fingerprints of molecules, but these often failed to cope with real-world challenges such as noise, interference, and variations in pressure and temperature. “Previous models often required extensive datasets, and struggled to adapt to new conditions,” says Mohamed Sy, a PhD candidate in Farooq’s group who worked on the new model.
The KAUST team used several tactics to create a better machine learning model called VOC-certifire, which can identify gas molecules from their terahertz-frequency spectra. Although this model was trained on relatively limited amounts of data, it achieved the same level of accuracy as more data-intensive models.
The researchers started with simulated spectra of 12 volatile organic compounds, including ethanol and the toxic gas hydrogen sulfide, and used a strategy called augmentation to generate many variants of these spectra. This process simulated how the spectra would change at different temperatures and pressures, for example, and also introduced additional noise to mimic real-world readings.
The augmentation process produced a set of 12,000 spectra to train the model. “By training the model with these variations, it becomes more resilient, enabling it to generalize effectively with less data,” says Farooq.
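The augmentation step can be pictured as generating many perturbed copies of each clean spectrum. The sketch below is a crude, generic stand-in for that idea, using a random amplitude scaling plus Gaussian noise with made-up parameters; it does not reproduce the physics-based temperature and pressure modeling the team used.

```python
import numpy as np

def augment_spectrum(spectrum, n_variants=100, noise_level=0.02, rng=None):
    """Generate noisy, rescaled variants of a clean absorption spectrum.

    A simplified stand-in for the augmentation described in the article:
    each variant gets a random amplitude scaling (mimicking changing
    conditions) and additive Gaussian noise (mimicking sensor noise).
    """
    rng = rng or np.random.default_rng()
    scales = rng.uniform(0.8, 1.2, size=(n_variants, 1))
    noise = rng.normal(0.0, noise_level, size=(n_variants, spectrum.size))
    return scales * spectrum + noise

# Hypothetical clean spectrum with two absorption peaks.
freq = np.linspace(0, 1, 500)
clean = np.exp(-((freq - 0.3) / 0.02) ** 2) + 0.5 * np.exp(-((freq - 0.7) / 0.03) ** 2)
training_set = augment_spectrum(clean, n_variants=1000)
print(training_set.shape)   # (1000, 500): many variants of one spectrum
```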
When the team tested the model’s ability to identify spectra, they used another technique called randomized smoothing to further improve accuracy. This involved tweaking the spectrum multiple times, and asking the model to identify each variant. The gas molecule recognized in the majority of these tests is then more likely to be the correct answer — and the model can also report how confident it is in its assertion.
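Randomized smoothing, as described here, reduces to classifying many noisy copies of the input and reporting the majority label together with its vote share. Below is a minimal, model-agnostic sketch in which the classifier is a toy placeholder rather than VOC-certifire itself.

```python
import numpy as np
from collections import Counter

def smoothed_predict(classify, spectrum, n_samples=50, noise_level=0.02, rng=None):
    """Majority-vote prediction over noisy copies, with a confidence score.

    classify : any function mapping a spectrum to a label (a placeholder here
               for a trained model).
    """
    rng = rng or np.random.default_rng()
    votes = Counter(
        classify(spectrum + rng.normal(0.0, noise_level, spectrum.size))
        for _ in range(n_samples)
    )
    label, count = votes.most_common(1)[0]
    return label, count / n_samples   # predicted gas and its vote share

# Toy placeholder classifier: the label depends on which half holds the peak.
toy_classifier = lambda s: "ethanol" if s.argmax() < s.size // 2 else "H2S"
spectrum = np.exp(-((np.linspace(0, 1, 500) - 0.3) / 0.02) ** 2)
print(smoothed_predict(toy_classifier, spectrum))   # ('ethanol', ~1.0)
```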
The team found that VOC-certifire outperformed three rival models, and could even “offer a standardized level of accuracy that human experts may not consistently achieve,” says Sy. The researchers have continued to develop the model and will report their latest work at the prestigious Neural Information Processing Systems conference in December 2024.