A better understanding of how tiny airborne pollutant particles, or aerosols, promote sulfate formation could help improve management of air quality, a KAUST-led team[1] has shown. The findings could improve models used to predict and reduce secondary pollution.
Aerosols pose serious environmental and health threats, especially in Asia and the Middle East and North Africa region. Primary aerosols are generated by outdoor and household sources, such as power plants, industries, agriculture, automobiles, wildfires, dust storms, cooking activities, and incense burning. Secondary aerosols result from chemical reactions in the atmosphere.
Together, primary and secondary aerosols cause respiratory and cardiovascular disorders that lead to seven million premature deaths annually. In Saudi Arabia, which has one of the world’s highest aerosol concentrations, air pollution has shortened life expectancy by almost 1.5 years.
Aerosols from burning biomass represent 60 to 85 percent of the total primary organic aerosols emitted annually and are likely to increase as wildfires become more frequent and severe with intensifying climate change. They also absorb sunlight and worsen haze, which accelerates the warming of the Earth’s atmosphere.
Sulfate is a major aerosol component that results from the oxidation of sulfur dioxide during haze events. The Middle East emits more than 15 percent of global sulfur dioxide. Conventional air quality models have attributed this reaction to oxidants present in the gas phase, yet fail to explain the elevated sulfate levels observed when haze occurs.
Some molecules in wildfire smoke absorb light and can transition to long-lived, high-energy states called triplet excited states. In a triplet state, the molecules show unique reactivity and can initiate reactions with other compounds.
Inspired by the ability of these brown carbon molecules to form reactive species inside atmospheric particles, a team led by Chak Chan and postdoc Zhancong Liang has now proposed that triplet states generated in biomass-burning organic aerosols account for the ‘missing sulfate’.
The researchers assessed the photochemical reactivity of the aerosol particles in the multiphase oxidation of sulfur dioxide. They burned typical biomass to produce particles and then used an aerosol flow reactor to better mimic the reactions of submicron particulates in the atmosphere.
The team discovered that biomass smoke contained photosensitizers and generated triplet states that were key in stimulating the oxidation of sulfur dioxide into sulfate in the aerosol particles. The aerosol particles also displayed oxidation rates three orders of magnitude higher than those in bulk solution.
Simulations further indicated that triplet-state-driven sulfate formation enhances sulfate levels in wildfire-prone regions that are rich in biomass-burning organic aerosols. “Incorporating our kinetic parameters into atmospheric models can improve predictions of secondary pollution and help improve understanding of the environmental impact of global warming,” Liang says.
Next, the researchers plan to develop a predictive framework for the reactivity of various biomass-burning organic aerosols using their chemical structures. “The diversity and mechanistic ambiguity of organic aerosols make it complex to parameterize and predict triplet-state-driven reactions,” Liang says, noting the need to use samples with diverse compositions to investigate interactions with different atmospheric molecules, such as volatile organic compounds.
The team also collaborates with Imed Gallouzi, Chair of the KAUST Center of Excellence for Smart Health, on health impacts of atmospheric particles.
A mathematical proof established more than 140 years ago provided the key for a KAUST-led team to develop a computational method for accurately simulating complex biophysical processes, such as the spread of disease and the growth of tumors[1].
Reaction-diffusion equations are widely used to model the dynamics of complex systems by providing a macroscopic mathematical description of the interplay between local interactions and random motion.
“Reaction-diffusion equations play a crucial role in many clinical applications, including computational epidemiology — an interdisciplinary field that combines mathematics and computational science — to enhance our understanding and control of the spatiotemporal spread of diseases in real time,” says Rasha Al Jahdali from the KAUST research team.
“This discipline is vital for informing health policy decisions worldwide. Since studying epidemiological phenomena is often complex and sometimes unfeasible, it is important to develop efficient, robust, and predictive algorithms for reaction-diffusion processes.”
These equations capture real physical mechanisms in the form of continuous ‘partial differential’ equations that describe rates of change over space and time. While mathematically ideal, such equations can be very difficult to solve for the purposes of simulating complex biophysical processes, because computers must rely on numerical methods that approximate them through discrete calculations.
“Discretizing continuous reaction-diffusion equations using numerical schemes allows us to use computers to solve them numerically,” says Al Jahdali. “This involves approximating continuous variables and their derivatives or rates of change at specific points in space and time. The problem is that existing discretization methods often lack the ability to maintain stability and accuracy when simulating complex, nonlinear interactions like those inherent in biological systems.”
That means existing numerical schemes often produce ‘unphysical’ results or break down, particularly with spatially varying reaction-diffusion equations, which limits their utility for making reliable predictions in a clinical context.
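To make the idea of discretization concrete, here is a deliberately simple sketch, in Python, of an explicit finite-difference update for a one-dimensional reaction-diffusion equation; the equation, parameters and step sizes are illustrative assumptions, and this basic scheme is exactly the kind that can lose stability or accuracy, which the KAUST framework is designed to overcome.

```python
import numpy as np

# Minimal illustration of discretizing a 1D reaction-diffusion equation
#   du/dt = D * d^2u/dx^2 + r * u * (1 - u)
# on a grid with explicit finite differences. Parameters are illustrative only;
# this is NOT the adaptive, provably stable scheme developed by the KAUST team.

D, r = 0.1, 1.0            # diffusion coefficient and reaction rate (assumed)
nx, L = 101, 10.0          # number of grid points and domain length
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D       # time step chosen to respect the explicit stability limit

x = np.linspace(0.0, L, nx)
u = np.exp(-((x - L / 2) ** 2))        # initial condition: a localized bump

for _ in range(2000):
    lap = np.zeros_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2   # discrete second derivative
    u += dt * (D * lap + r * u * (1 - u))                # reaction + diffusion update
    u[0], u[-1] = u[1], u[-2]                            # simple zero-flux boundaries

print(f"max u after integration: {u.max():.3f}")
```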
To solve this problem, Al Jahdali and her colleagues turned to an old mathematical proof, known as Lyapunov’s direct method, which tests for the existence of a stable solution of a time-varying system without needing to solve the underlying partial differential equations.
“Our computational framework leverages Lyapunov’s direct method to develop fully discrete and ‘smart’ self-adapting schemes of arbitrary accuracy in space and time,” says Al Jahdali. “This new computational framework provides robust and accurate solutions suitable for applications in complex environments, correctly capturing the dynamics of phenomena, which is crucial for accurately modeling real-world scenarios in biological and clinical applications.”
The researchers applied their method to a commonly used spatially varying ‘susceptible-infected’ model for predicting the endemic spread of disease, showing that their numerical approach stayed consistent with the original physically accurate reaction-diffusion solution. They also used their approach to model the treatment of a tumor in the brain using virotherapy, involving complex interactions in space and time based on known biochemical processes.
“Our approach demonstrates superior performance compared to traditional numerical methods for solving reaction-diffusion partial differential equations, which will enable more reliable and physically consistent results,” says Al Jahdali. “This represents a significant step in developing computational methods that are not only theoretically sound, but practically useful for addressing pressing global problems.”
Optoelectronic devices developed at KAUST that behave as either synapses or neurons, and adapt and reconfigure their response to light, could find use in optical neuromorphic information processing and edge computing[1].
The team from KAUST has designed and fabricated metal-oxide semiconductor capacitors (MOSCaps) based on the 2D material hafnium diselenide (HfSe2) that act as smart memories. The devices feature a vertical stack structure where HfSe2 is sandwiched between layers of aluminum oxide (Al2O3) and placed on a p-type silicon substrate. A transparent indium tin oxide (ITO) layer sits on top, allowing light to enter from above.
“When hafnium diselenide nanosheets are integrated into charge-trapping memory devices through solution-based processes, they enable both optical data sensing and retention capabilities,” says graduate student Bashayr Alqahtani. This allows the device to be reconfigured to sense light or store optical data after the light source is removed, depending on the bias conditions. “Our device is based on a two-terminal capacitive memory, which shows promise for device 3D stacking, paving the way for more adaptive and energy-efficient solutions,” she explains.
Experiments show that the charge trapping and capacitance of MOSCaps change with light conditions, allowing them to serve as smart memories that learn using light. As a result, optical signals can be used to train and alter the response of the device, while electrical bias signals can be used to erase the device. In particular, the team has shown that exposure to blue light with a wavelength of 465nm can reinforce or strengthen the response to red light at 635nm, a behavior known as associative learning. In the terminology of neuromorphic computing, the MOSCap acts like an artificial synapse showing both long-term potentiation (increase in synaptic response) and long-term depression (weakening of synaptic response).
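To picture potentiation and depression in the abstract (this is a toy update rule, not a physical model of the HfSe2 MOSCap), the sketch below nudges a synaptic weight upward with each simulated light pulse and downward with each electrical erase pulse; the rates and bounds are assumed values.

```python
# Toy artificial synapse: optical pulses strengthen the stored weight (potentiation),
# electrical erase pulses weaken it (depression). Constants are illustrative only.

def potentiate(w, n_pulses, rate=0.15, w_max=1.0):
    for _ in range(n_pulses):
        w += rate * (w_max - w)      # each light pulse nudges the weight toward its maximum
    return w

def depress(w, n_pulses, rate=0.15, w_min=0.0):
    for _ in range(n_pulses):
        w -= rate * (w - w_min)      # each erase pulse nudges the weight toward its minimum
    return w

w = 0.1
w = potentiate(w, n_pulses=20)       # long-term potentiation under light
print(f"after optical training: {w:.2f}")
w = depress(w, n_pulses=20)          # long-term depression under electrical bias
print(f"after electrical erase: {w:.2f}")
```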
“This work investigates how artificial neurons respond and adapt to optical stimuli — specifically, changes in light intensity, duration and wavelength,” explains Nazek El-Atab, who led the team. “This research is crucial for understanding these smart memories’ capabilities and improving their adaptive learning mechanisms.”
The team used these characteristics to run a simulation predicting that a capacitive synaptic array circuit based on the devices could recognize handwritten digits from the industry-standard MNIST database with an accuracy of 96%. They also show that, in principle, the adaptive sensing capabilities of a MOSCap neuron could be used for exoplanet detection, spotting with 90% accuracy the transient dips in a star’s light intensity as an exoplanet periodically passes in front of it.
“These devices demonstrate in-memory light sensing capabilities that make them ideal for edge computing applications,” comments El-Atab. “They show particular promise for artificial intelligence applications where rapid processing and storage of large data volumes is essential, especially when it comes to optical data. The range of potential applications can be wide — from autonomous vehicles to virtual reality and IoT systems.”
Separating and purifying closely related mixtures of molecules can be some of the most energy-intensive processes in the chemical industry, and they contribute to its globally significant carbon footprint. In many cases, traditional industrial separation protocols could be replaced using the latest energy-efficient nanofiltration membranes — but finding the best separation technology for each industrial use case through testing is slow and expensive.
A computational tool that can reduce this work by comparing separation technologies for a given chemical mixture, and predict the most efficient and inexpensive technology for the task, has been developed by researchers at KAUST[1].
“We are able to predict the separation of millions of molecules relevant across industries such as pharmaceuticals, pesticides, and pigments,” says Gyorgy Szekely, who led the research.
Commercial nanofiltration membranes can slash the energy cost of chemical separations, compared to traditional heat-driven methods such as evaporation and distillation, by selectively filtering out the desired product. Nanofiltration does not work in all cases, however. “Predicting the separation performance of membranes for different chemical mixtures is a notoriously difficult challenge,” Szekely says.
To develop their overall chemical separation technology selection tool, Szekely and his team compiled a collection of nearly 10,000 nanofiltration measurements from the scientific literature, focusing on commercially available membranes.
The researchers used machine learning to analyze the data, generating an AI model able to predict the nanofiltration performance for untested chemical mixtures. This information was combined with mechanistic models to estimate the energy and cost requirements of a chemical separation if it was performed by nanofiltration, evaporation or extraction.
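A rough sketch of how such a hybrid pipeline can be wired together is shown below: a generic machine-learning regressor predicts solute rejection from molecular descriptors, and simple mechanistic formulas estimate the energy of nanofiltration versus evaporation. The descriptors, training data, constants and functions are placeholders, not the models or values used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Illustrative hybrid model: a data-driven predictor of nanofiltration rejection
# combined with crude mechanistic energy estimates. All numbers are placeholders.

rng = np.random.default_rng(0)
X = rng.random((500, 3))            # e.g. scaled molecular weight, logP, polar surface area (assumed)
y = np.clip(0.2 + 0.7 * X[:, 0] + 0.1 * rng.normal(size=500), 0, 1)  # synthetic rejection data

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

def nanofiltration_energy_kwh_per_m3(pressure_bar=30.0, pump_efficiency=0.8):
    # Pump work ~ pressure * volume; 1 bar over 1 m3 is 100 kJ = 0.0278 kWh
    return pressure_bar * 0.0278 / pump_efficiency

def evaporation_energy_kwh_per_m3(latent_heat_kj_per_kg=850.0, density_kg_per_m3=790.0):
    # Heat needed to vaporize a generic organic solvent, with no heat recovery
    return latent_heat_kj_per_kg * density_kg_per_m3 / 3600.0

candidate = np.array([[0.6, 0.3, 0.5]])          # descriptors of an untested solute (assumed)
rejection = model.predict(candidate)[0]

print(f"predicted rejection: {rejection:.2f}")
print(f"nanofiltration ~{nanofiltration_energy_kwh_per_m3():.1f} kWh/m3, "
      f"evaporation ~{evaporation_energy_kwh_per_m3():.0f} kWh/m3")
```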
“Our novel hybrid modelling approach enables us to evaluate millions of potential separation options, to identify the most suitable and energy-efficient technology for any given chemical separation task,” says Gergo Ignacz, a member of Szekely’s team. “This will allow industry to make better-informed decisions that significantly reduce operating costs, energy consumption, and carbon emissions,” he adds.
The predictive power of the hybrid model was experimentally validated using three industrially relevant case studies, Szekely says. “We found an excellent match between the values that our model predicted and the measured values for these processes,” he says.
The researchers showed that the carbon dioxide emissions of pharmaceutical purifications could be reduced by up to 90 percent by selecting the most efficient technology for the task. Overall, the energy consumption and carbon dioxide emissions of industrial separations could be cut by an average 40 percent using this method, they estimated.
One surprising finding was the stark difference between the best method and the other two methods for any given separation, Ignacz says. “For most cases, either nanofiltration, evaporation, or extraction emerged as a clear winner, with one method significantly outperforming the others based on economic and energy metrics, leaving little middle ground,” he says.
Although the predictive power of the model proved to be high, there is still room for improvement and further validation, Szekely says. “Our tools are available as open access through the OSN Database at www.osndatabase.com, and we encourage the community to use them,” he says.
The Red Sea is significantly warmer and more saline, even in its deepest waters, than other marine basins. These conditions offer opportunities for unique ecosystems to develop, such as deep warm-water coral frameworks. However, little is known about these deep coral ecosystems and their distribution in the Red Sea.
KAUST researchers, together with scientists in the U.S. and Italy, used a computer modeling approach combined with data gathered on ocean voyages to determine the likely distribution of deep warm-water coral frameworks in the northern Red Sea and the Gulf of Aqaba.
“This region has steep underwater slopes very close to shore, meaning that deep-sea ecosystems could be directly impacted by human coastal activities,” says Ph.D. candidate Megan Nolan, who worked on the project under the supervision of KAUST faculty member Francesca Benzoni. “The NEOM development on Saudi Arabia’s northern coast is keen to safeguard marine ecosystems and invest in sustainable tourism there.”
Both shallow and deep coral frameworks are built by stony corals and offer a range of ecosystem services, including providing habitat for fish and other creatures. These frameworks form the base of coral reefs and are crucial to promoting marine biodiversity, yet even deep-sea reefs are now under threat from ocean acidification, global warming and other damage caused by human activity.
Coral frameworks are extensive enough to modulate the shape of the seafloor. The living framework is accompanied by the rubble of dead coral, which offers a vital substrate for different corals and other species to grow. However, not all deep-sea coral species form frameworks.
“Even the Caryophylliidae and Dendrophylliidae coral families that we observed in this study do not always form frameworks. The exact conditions and processes needed to trigger these formations are puzzling,” says Nolan.
Studying these enigmatic structures is hampered by accessibility issues and high costs. While Remotely Operated Vehicles (ROVs) and submersibles offer considerable insights, they can only survey narrow transects of the seafloor each time. Nolan and co-workers developed a habitat suitability model using ROV and sensor data gathered from the NEOM-funded “Deep Blue” expedition on the M/V OceanXplorer in 2020. They knew the location of several deep-sea frameworks built by different corals and used this information to predict further likely locations for each species.
“Habitat suitability models collate data on specific environmental conditions in places where frameworks are known to survive and then assess the whole study region to identify areas with similar conditions,” says Nolan.
The model assessed 12 variables, including water temperature, salinity and seafloor topography. The team found that a total area of over 250km2 was suitable for deep-coral frameworks in the study zone.
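Conceptually, a habitat suitability model works like a classifier trained on the environmental conditions at known framework sites. The minimal sketch below illustrates the idea with a generic logistic-regression model on three invented variables; the actual study used 12 variables and ROV survey data, so everything here is a simplified assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy habitat suitability model: learn which environmental conditions accompany known
# coral-framework sites, then score every grid cell in a study area.
# Variables and values are invented for illustration only.

rng = np.random.default_rng(1)

# Columns: temperature (C), salinity (PSU), depth (m) at surveyed locations
framework_sites = np.array([[21.8, 40.5, 350], [21.6, 40.6, 420], [21.9, 40.4, 300]])
absence_sites = rng.normal([24.0, 39.0, 80], [1.0, 0.5, 30], size=(30, 3))

X = np.vstack([framework_sites, absence_sites])
y = np.array([1] * len(framework_sites) + [0] * len(absence_sites))

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a grid of candidate cells across the study region (placeholder grid)
grid = rng.normal([22.5, 40.0, 250], [1.5, 0.8, 150], size=(1000, 3))
suitability = model.predict_proba(grid)[:, 1]

print(f"cells flagged as suitable: {(suitability > 0.5).sum()} of {len(grid)}")
```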
Frameworks built by the Caryophylliidae corals probably cover more than 100km2 of sea floor, exclusively in the northern Red Sea. Dendrophylliidae coral frameworks could cover a wider range, at least 150km2 of seafloor across both the northern Red Sea and the Gulf of Aqaba.
On a second OceanXplorer voyage, the researchers visited some of the locations suggested by their model and found coral frameworks present either exactly where predicted or close by.
“Frameworks of this size could act as significant carbon factories, potentially influencing carbon and nutrient cycling in the Red Sea,” concludes Benzoni. “This work is part of a wider effort to characterize deep-sea species, and KAUST’s teams will continue to investigate this understudied environment.”
A new study by KAUST researchers reveals that the simple act of shielding ocean thermometers from the sun may improve coral reef restoration efforts[1]. The study, published in PLOS Climate, emphasizes the need for standardized practices to ensure reliable temperature measurements. It also positions KAUST to advise and set guidelines for coral restoration and monitoring efforts in the Red Sea.
Coral reefs occupy about 1% of the ocean, but they support over 25% of marine life. Economically, they are worth several trillion dollars annually and support the livelihood of over one billion people. Consequently, coral restoration and conservation efforts are rapidly growing for reefs worldwide, including in the Red Sea.
Rising ocean temperatures are the primary cause of coral bleaching and death around the world. Many corals already live at their maximum temperature threshold, and increases as small as 1°C can have devastating consequences. The role of subtle temperature changes on coral populations underscores the importance of accurately measuring ocean temperatures when planning restoration efforts worldwide. The need for water temperature measurements has led to a rapid growth in commercially available temperature loggers, yet there is little guidance on best practices.
“A little more than ten years ago, there were only one or two companies selling temperature loggers. Today, more than 10 companies offer popular products,” said KAUST postdoc Walter Rich, who was part of the study.
The KAUST team led by Michael D. Fox determined that the accuracy of 10 common temperature loggers manufactured by six companies varied widely and that almost all models produced erroneously high temperatures when not shielded from direct sunlight.
To gauge how widespread the problem is, the team — working with the KAUST Coral Restoration Initiative, responsible for the largest coral restoration effort in the Red Sea — reviewed 329 coral reef studies published between 2013 and 2022. Fewer than 5% of the studies reported intentionally shading their loggers, for example by inserting them inside a PVC tube. This lack of proper deployment methods threatens the accuracy of global reef temperature records; indeed, experiments conducted by the research team found that unprotected loggers recorded temperatures that were erroneously high, some by as much as 3°C.
The potential implications of inaccurately high-temperature measurements are widespread. For example, strategies for coral restoration may be assessing incorrect temperature thresholds for corals, risking poor selection of locations and species. The scientists argue that for proper scientific comparisons, research groups should embrace standards for temperature logger deployment and calibration and transparently report their methods.
Moreover, with Saudi Arabia beginning its huge investment in coral restoration and reef monitoring, the findings offer a cheap and straightforward way to lower the cost and increase the success of the Kingdom’s coral restoration efforts.
“Our motivation for the study was to provide guidelines for Red Sea coral restoration and environmental monitoring on Saudi Arabian reefs. By demonstrating the benefits of this simple solution, we hope to facilitate new efforts to monitor ocean temperature that will have tremendous benefits to our understanding of coral survival in the Red Sea and beyond,” said Fox.
If your genes could set an alarm clock, EZH1 might be the one ringing the bell.
A new study has revealed how this underappreciated protein ensures the rhythmic expression of genes in skeletal muscle, aligning them with the body’s 24-hour internal cycles[1].
A KAUST-led research team showed that EZH1 plays a dual role in circadian regulation. It both stabilizes a critical protein called RNA Polymerase II — the molecular engine of gene transcription — and reshapes the structure of chromatin, the tightly packed form of DNA, to activate or silence genes on schedule.
Thanks to this surprising versatility, EZH1 functions like a maestro, ensuring that genes involved in metabolism, sleep and other essential processes rise and fall on cue. When the protein’s activity declines — as may occur with aging — genetic timing can falter, leading to metabolic imbalances and disease.
“EZH1-mediated rhythmicity could play a key role in maintaining the fidelity and adaptability of tissue-specific genetic programs,” says Peng Liu, who co-led the study with colleague Valerio Orlando.
The research team — which included imaging specialist Satoshi Habuchi and stem-cell biologist Mo Li — made these discoveries by studying skeletal muscle tissue from mice and conducting experiments on cultured mouse muscle cells. By tracking gene expression and protein activity over a 24-hour cycle, they found that EZH1 levels oscillate in tandem with other master circadian genes. This rhythmic pattern allows EZH1 to regulate thousands of other genes tied to the body’s internal clock.
One of EZH1’s key roles, the researchers discovered, is stabilizing RNA Polymerase II — the enzyme that converts DNA-encoded instructions into RNA intermediaries — in a process known as transcription, which drives protein production and cellular function. At the same time, EZH1 modifies chromatin by adding or removing chemical tags. These epigenetic adornments make genes more or less accessible for transcription.
The two functions of EZH1 work together to maintain a precise rhythm of gene expression, notes Orlando, an unexpected finding that, he says, “highlights the importance of continuing to explore the basic aspects of epigenetic mechanisms.”
The study also revealed the consequences of EZH1 malfunction. When the researchers disrupted EZH1 in cells, they found that the rhythmic expression of numerous genes fell out of sync.
This misalignment could lead to problems such as impaired muscle repair, disrupted metabolism and increased susceptibility to age-related diseases — conditions that might one day be treated by targeting EZH1 and its pathways with new medicines. Supporting this idea, the KAUST-led team demonstrated that restoring EZH1 function largely reinstated the disrupted rhythms.
Many questions remain, including how EZH1 functions outside muscle tissue. Genome-wide analyses showed that disrupting EZH1 dampened the rhythmic transcription of over 1,000 circadian-regulated genes. While some of these genes are tied to skeletal muscle function, many are involved in broader metabolic and cellular repair pathways.
As researchers continue to unravel the complexities of EZH1, the latest findings underscore a critical new insight: this understudied protein orchestrates genetic timekeeping with precision, ensuring our bodies run like clockwork.
An AI-powered tool from KAUST researchers is helping scientists trace hidden connections between diseases, revealing insights into how one illness might lead to another and, by extension, how treating one illness could help prevent another[1].
By systematically combing through medical literature and real-world patient data, this tool maps cause-and-effect relationships, creating a framework that could guide targeted therapeutic strategies and uncover potential for drug repurposing.
Think of it as the ultimate disease relationship detective. Using natural language processing, the tool scans vast quantities of biomedical research to pinpoint causal connections — like how high blood pressure can set the stage for heart failure.
“Instead of treating diseases as unrelated outcomes, our approach facilitates the identification of shared risk factors among causally linked diseases,” says Sumyyah Toonsi, a graduate student in the Bio-Ontology Research Group. “This deepens our understanding of human diseases and enhances the performance of risk-prediction tools for personalized medicine.”
The tool’s power lies in its ability to go beyond mere association. Traditional methods might highlight which diseases commonly co-occur, but the KAUST tool — developed by Toonsi and her team under the guidance of computer scientist Robert Hoehndorf — identifies which diseases can trigger others.
For example, type 2 diabetes leads to high blood sugar, causing small blood vessel disease, ultimately resulting in a diabetic eye condition. Mapping these relationships suggests that treating one “upstream” condition may help prevent or lessen downstream complications.
To achieve these insights, the tool integrates scientific literature with data from the UK Biobank, a large-scale health database of about half a million Britons. This dual approach validates disease connections by checking that diseases follow a logical sequence, with causes preceding outcomes. This process strengthens the evidence of causation while highlighting new connections that might otherwise be overlooked.
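The general shape of such a pipeline can be pictured with the toy sketch below, which assumes hypothetical cause-effect pairs (as if mined from the literature) and simplified patient records with diagnosis dates; the team's actual extraction and validation methods are far more sophisticated.

```python
from datetime import date

# Toy version of the two-step idea: (1) collect candidate cause -> effect disease pairs
# (hard-coded here; in practice mined from literature with NLP), then (2) keep only
# pairs whose temporal order is supported in patient records (cause diagnosed first).
# Disease names, pairs and records are hypothetical.

candidate_edges = [
    ("type 2 diabetes", "small vessel disease"),
    ("small vessel disease", "diabetic retinopathy"),
    ("hypertension", "heart failure"),
]

patients = [
    {"type 2 diabetes": date(2010, 3, 1), "small vessel disease": date(2014, 6, 9)},
    {"hypertension": date(2008, 1, 5), "heart failure": date(2016, 11, 2)},
    {"small vessel disease": date(2012, 2, 2), "diabetic retinopathy": date(2015, 7, 20)},
]

def temporally_supported(cause, effect, records, min_patients=1):
    """Count patients who carry both diagnoses with the cause recorded first."""
    n = sum(1 for r in records
            if cause in r and effect in r and r[cause] < r[effect])
    return n >= min_patients

validated = [(c, e) for c, e in candidate_edges
             if temporally_supported(c, e, patients)]

for cause, effect in validated:
    print(f"{cause} -> {effect}")
```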
Among its discoveries, the tool unearthed surprising links. As Toonsi explains, “We found endocrine, metabolic and nutritional diseases to be leading drivers of diseases in other categories,” including cardiovascular, nervous system and inflammatory diseases of the gut and eye. “This is interesting because many metabolic diseases can be managed with lifestyle changes, opening opportunities for broad disease prevention,” she says.
A standout feature is the tool’s ability to improve polygenic risk scores (PRS) — calculations that assess a person’s genetic susceptibility to disease. Standard PRS models don’t account for how one genetic variant might affect multiple diseases, but by adding causal disease relationships, the KAUST tool produces an enhanced PRS that improves prediction accuracy, especially for complex diseases.
This helps disentangle pleiotropic effects, where a single gene variant can impact multiple conditions. By factoring in these causal links, the tool offers a more holistic view of genetic risk.
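One simple way to picture the enhancement, though not the exact formulation used in the study, is to boost an individual's standard risk score for a target disease with the scores of its causally upstream diseases, as in the hedged sketch below; the causal map, scores and weight are invented.

```python
# Toy enhanced polygenic risk score (PRS): the risk for a target disease is boosted
# by the PRS of diseases that causally lead to it. All values are placeholders,
# not the model described in the study.

causal_parents = {
    "heart failure": ["hypertension", "type 2 diabetes"],
}

standard_prs = {              # standardized PRS values for one individual (assumed)
    "heart failure": 0.4,
    "hypertension": 1.2,
    "type 2 diabetes": 0.9,
}

def enhanced_prs(disease, prs, parents, upstream_weight=0.3):
    """Combine a disease's own PRS with a weighted sum of its causal parents' PRS."""
    base = prs[disease]
    upstream = sum(prs[p] for p in parents.get(disease, []) if p in prs)
    return base + upstream_weight * upstream

print(f"standard PRS: {standard_prs['heart failure']:.2f}")
print(f"enhanced PRS: {enhanced_prs('heart failure', standard_prs, causal_parents):.2f}")
```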
Now freely available to the research community, this tool represents a major advancement for scientists exploring disease connections. Its potential applications range from refining prevention strategies to suggesting new uses for existing drugs. As researchers further investigate disease pathways, this tool could serve as a key resource in the quest to decode the interconnected landscape of human health.
A bespoke copper catalyst could slash the energy penalty of one of the most foundational processes in the chemical industry. The catalyst can harness sunlight to selectively oxidize methane, the main component of natural gas, into formaldehyde, a versatile chemical feedstock from which high-value products from polymers to pharmaceuticals can be made[1].
Despite its natural abundance, methane poses challenges as a chemical feedstock. “It’s difficult to liquefy, has high transportation costs, and cannot be used directly as a chemical raw material,” says Chengyang Feng, a researcher in the sustainable energy advanced catalysis group of Huabin Zhang, who led the research. “Large amounts of methane are directly flared or vented in gas and oil fields, leading to resource wastage and environmental issues,” Zhang adds.
Where methane is still used for chemical production, it is first converted into a more reactive intermediate called syngas, but this conversion is energy intensive, requiring high temperatures and pressures.
An alternative, milder approach is to react methane with oxygen from the air, in a process powered by sunlight, to create formaldehyde. “Converting methane to formaldehyde transforms the gas into a valuable liquid that is already used widely in the chemical and pharmaceutical industries,” Feng says. “This reduces the difficulty of storage and transportation and improves economic returns.”
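For reference, the overall selective oxidation can be summarized by the balanced equation below; the detailed radical pathway on the catalyst involves several intermediate steps that are not shown.

```latex
\mathrm{CH_4} + \mathrm{O_2} \xrightarrow{\ \text{light, Cu photocatalyst}\ } \mathrm{HCHO} + \mathrm{H_2O}
```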
The challenge of photocatalytically reacting oxygen with methane is that it typically generates a mixture of products, including methanol and even carbon dioxide, as well as formaldehyde. The reaction outcome critically depends on the oxygen activation step: formaldehyde is formed when oxygen activation generates reactive species called hydroperoxyl radicals.
To promote hydroperoxyl radical generation and formaldehyde production, the team designed a photocatalyst based on a material called a metal-organic framework (MOF). At the nanoscale, these porous crystalline substances consist of metal sites held together with carbon-based organic linkers in a highly regular repeating pattern. By changing the metals and the linkers that the material is made from, different molecular architectures can be accessed.
The researchers created MOFs in which the metal sites were fully bonded to the linkers, forming a chemically inert backbone. “When single copper atoms are then specifically anchored within this framework, these sites become the sole active centers in the catalyst, enabling precise modulation of the reaction pathway,” says Zhang.
The team tailored the environment around each copper atom so that oxygen molecules could only adsorb to the activating metal end-on. “By changing the initial adsorption mode of oxygen, we achieved the selective formation of hydroperoxyl radicals and the generation of formaldehyde,” says Feng. The photocatalyst converted methane into formaldehyde in high yield and with near-100 percent selectivity.
The team’s next target is to harness more of the energy in sunlight, Zhang says. “Our photocatalysts only respond to the ultraviolet and part of the visible light spectrum, leading to substantial solar energy loss,” he says.
The team aims to develop catalysts that capture the remaining solar energy as heat to help drive the reaction. “Our next major endeavor will be to test our optimized catalyst at scale, outdoors, using sunlight.”
Saudi Arabia’s government aims to generate more than half of the country’s electricity from renewable sources by 2030. This goal is particularly pertinent in the NEOM region, where an ambitious large-scale project is underway to build a community powered entirely by renewable energy sources.
KAUST researchers have developed a clustering-optimization model that could help to design an integrated multisector energy system for NEOM[1]. Crucially, their model factors in days when weather conditions make it extremely difficult to meet total electricity demand in that part of the world: for example, when limited solar irradiation or a lack of wind means the system comes close to being unable to supply the required electricity.
“Existing optimization models use weather input data, but usually ignore outliers, which is unhelpful when it comes to determining reliability in renewable power generation,” says Ricardo Lima at KAUST. Lima worked on the project with colleagues including Jefferson Riera and Justin Ezekiel, under the supervision of KAUST faculty members Omar Knio and Martin Mai.
“The inherent intermittency of wind and solar power means that it is vital to factor in weather pattern variability, including extreme events,” continues Lima. “Integrating renewable technologies across sectors is another important consideration. We took a novel multisector approach to optimize the proposed system.”
The researchers used past weather data (2008-2018) from the NEOM region in their model. Their approach incorporates the interactions between electricity generation, water desalination and heating generation, allowing for the exchange of information about demand and generation throughout the system.
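A rough sketch of the clustering step is shown below: daily weather profiles are compressed into a handful of representative days with k-means, while the most extreme low-resource days are kept as their own representatives rather than being averaged away. The synthetic data, thresholds and cluster count are assumptions, not the team's actual settings.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy version of the clustering idea: compress years of hourly weather into a few
# representative days for the optimization model, but keep extreme low-wind /
# low-solar days instead of averaging them away. Data and thresholds are synthetic.

rng = np.random.default_rng(7)
n_days = 365 * 11                                   # roughly 2008-2018
days = rng.random((n_days, 48))                     # 24h solar + 24h wind profile per day

resource = days.mean(axis=1)                        # crude daily resource availability
extreme = resource < np.quantile(resource, 0.02)    # flag the worst 2% of days

kmeans = KMeans(n_clusters=10, n_init=10, random_state=0).fit(days[~extreme])
representative_days = np.vstack([kmeans.cluster_centers_, days[extreme]])

print(f"{representative_days.shape[0]} representative days "
      f"({extreme.sum()} extreme days preserved)")
```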
The model could be expanded to include district cooling systems and novel energy storage methods. The results provide valuable insights for both NEOM and Saudi Arabia’s national power systems.
“Our optimization method searches through many combinations of renewable power generation, water desalination, geothermal energy and heat conversion technologies to identify the best operational conditions to meet hourly demands,” says Riera. “Our goal is to design an efficient system that minimizes yearly investments and operating costs.”
The model highlights several key points for planners to consider. Firstly, designing 100% renewable systems without factoring in extreme weather conditions results in energy demands not being met. This leads to a continued reliance on external power or water supplies at certain times. However, if sufficient storage and flexibility are built into the system — and extreme weather is factored in — the reliability of the system significantly improves.
“NEOM’s system could benefit from harnessing concentrated solar power and overall costs would be reduced by investing in geothermal energy, particularly for the heating sector,” says Lima. “For NEOM’s power system, the costs of extreme weather events could be mitigated by including 10% of energy generated from fossil fuels.”
The team will continue to refine and optimize their model further. For example, the model does not yet consider fluctuations in renewable technologies’ capital and operational costs, which may affect the resulting energy system. The team also plans to incorporate hydrogen production and its conversion to other chemicals in the model.
“This specific work relied on historical weather data, but it could easily be adapted to use projections from climate change models for specific regions. We hope to leverage these projections to design resilient energy systems in the face of climate change,” concludes Knio.