
Cardiovascular diseases remain a leading cause of death worldwide. Every year, thousands of children are born with congenital heart disease. Reliable, efficient tools that can accurately assess heart function are critical for diagnosing and treating heart disease.

Now, KAUST researcher Raúl Tempone, in collaboration with scientists at the German Aerospace Center in Cologne, has developed a novel machine learning method for analyzing real-time cardiac magnetic resonance imaging (MRI) scans[1].

“Cardiac real-time MRI is an invaluable method of scanning the heart, and can acquire up to 50 frames per second, which is excellent in clinical terms,” says Tempone. “However, this generates thousands of images that are near-impossible to process manually.”

A typical short scan at 30 frames per second for 10 seconds can produce around 4,500 images to annotate across 15 ‘slices’, or cross-sectional images through the heart. Neural networks can accurately segment most of these images, but they fail to consistently segment the ventricular cavity in the outer slices at the base and apex of the heart. As a result, the networks do not always reliably estimate the ventricular volume — the amount of blood inside the heart’s ventricles throughout the cardiac cycle — in these outer slices.

In the current clinical workflow, a clinician must perform a visual quality check to separate reliably segmented inner slices from the unreliable outer slices. This is particularly important for patients with rare heart diseases and unusual anatomy, such as univentricular hearts.

“We wanted to find a way to attain trustworthy estimates of ventricular volume across the entire heart, and analyze the most pertinent images accurately,” says Tempone. Ventricular volume is a critical measurement in the assessment of cardiac diseases, but it changes continuously in complex ways depending on the patient’s breathing and heartbeat.

“The volume of a heart ventricle over time is not arbitrary: it is dominated by a small number of frequencies, primarily associated with the heart rate and breathing pattern,” says Tempone. “Our modeling framework uses sparse Bayesian learning (SBL) to identify the most relevant frequencies for each patient.”

Their model learns a patient-specific ‘frequency fingerprint’ from ventricular volume curves extracted from the reliably segmented inner slices. Those dominant frequencies correspond primarily to the heart rhythm and breathing. The SBL model identifies which frequencies really matter, and automatically prunes irrelevant frequency components during training. The model then selects the most informative frames (time points) for manual labeling on the unreliable outer slices, to reduce uncertainty.
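
To make the idea concrete, here is a minimal Python sketch of how sparse Bayesian learning can recover a frequency fingerprint from a ventricular volume curve. It is not the authors' implementation: the frame rate, synthetic volume signal, candidate-frequency grid, and pruning threshold are illustrative assumptions, and scikit-learn's ARDRegression stands in for the paper's SBL model.

```python
import numpy as np
from sklearn.linear_model import ARDRegression

# Minimal sketch (not the authors' code): learn a sparse "frequency
# fingerprint" of a ventricular volume curve with sparse Bayesian learning.
np.random.seed(0)

fs = 30.0                      # frames per second (real-time MRI frame rate)
t = np.arange(0, 10, 1 / fs)   # 10-second acquisition

# Synthetic volume curve: heartbeat (~1.2 Hz) + breathing (~0.25 Hz) + noise
volume = (100
          + 20 * np.sin(2 * np.pi * 1.2 * t)
          + 8 * np.sin(2 * np.pi * 0.25 * t)
          + np.random.normal(0, 1.0, t.size))

# Candidate frequencies: a coarse grid covering breathing and heart rates
freqs = np.arange(0.1, 3.0, 0.05)

# Fourier-type design matrix: one sine and one cosine column per candidate
X = np.hstack([np.sin(2 * np.pi * freqs * t[:, None]),
               np.cos(2 * np.pi * freqs * t[:, None])])

# ARD regression is a standard sparse Bayesian learner: it drives the weights
# of irrelevant columns (frequencies) toward zero during training.
model = ARDRegression()
model.fit(X, volume)

# Amplitude per candidate frequency; keep only the dominant components
amp = np.hypot(model.coef_[:freqs.size], model.coef_[freqs.size:])
fingerprint = freqs[amp > 0.1 * amp.max()]
print("Dominant frequencies (Hz):", np.round(fingerprint, 2))
```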

“With a few carefully chosen manual labels, we can reconstruct accurate ventricular volume curves across the entire heart,” says Tempone. “Crucially, we can also quantify uncertainty: in other words, our model clearly states when automated measurements can, and can’t, be trusted.”

In tests on real-time MRI data from two patients with univentricular hearts, the approach required only a few manually labeled images to accurately predict ventricular volume on the challenging outer slices.

“Our model-based labeling strategy is generic and can be integrated into other cardiac MRI workflows,” concludes Tempone. “The model could easily be transferred into other medical image analysis pipelines that have similar data and label scarcity issues.”

 

Millions of servings of sustainably sourced fish are being lost each year: data from a study of thousands of tropical reefs show that reef fish stocks could provide far more food if managed sustainably.

Coral reef fish are a critical source of nutritious food for many tropical communities, and managing such resources sustainably offers potential food security benefits, especially for nations facing high burdens of malnutrition.

“Well-managed reef fisheries can deliver sustainable supplies of nutritious aquatic foods while helping reef ecosystems and dependent communities increase their resilience to other stresses,” says Jessica Zamborain-Mason, interdisciplinary marine scientist at KAUST, who led the international research team. “Millions of people are losing out on sustainably sourced and nutritious food supplies.”

According to Zamborain-Mason, the finding underscores the need for effective reef fisheries management as the global burden of malnutrition grows each year and reef ecosystems face unprecedented pressures from climate change[1].

Reef fisheries are inherently complex. They capture many species at once and are often located in data-poor regions with limited monitoring and management. This makes it difficult for scientists to collect and analyze data to assess their sustainability levels.

“What’s different now is the availability of large, fisheries-independent datasets that compile reef fish and associated information from across the globe,” says Zamborain-Mason. “These new datasets reveal patterns that were not visible with localized and scattered data.”

The team analyzed global data from 1,211 individual reef sites and 23 jurisdictions identified as being below their maximum production levels. Rather than focusing on loss, the researchers investigated what could be gained by replenishing reef fish stocks and ensuring that reef fisheries are sustainably managed.

Their results show that allowing fish stocks to recover could generate from 20,000 up to 162 million additional sustainable servings of reef fish annually, depending on the jurisdiction. For a single jurisdiction such as Indonesia, this would be enough to meet the recommended fish intake of an additional 1.4 million people each year. Achieving these gains, however, requires active recovery of fish stocks, with many reefs needing to double their current biomass.

While recovery timeframes will depend on the state of depletion, the researchers estimate that, on average across all reefs examined, recovery could take from six years to almost 50 years, depending on the level of fishing allowed.
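
The dependence of recovery time on fishing pressure can be illustrated with a simple logistic surplus-production model; this is a generic textbook sketch, not the study's own model, and the growth rate, carrying capacity, and starting biomass below are arbitrary illustrative values.

```python
# Minimal sketch (not the study's model): how the allowed fishing pressure
# changes the time needed for a depleted reef fish stock to double its
# biomass under logistic growth.

def years_to_double(r=0.3, K=100.0, B0=25.0, F=0.0, dt=0.01, max_years=100):
    """Integrate dB/dt = r*B*(1 - B/K) - F*B until biomass reaches 2*B0."""
    B, t = B0, 0.0
    while B < 2 * B0 and t < max_years:
        B += dt * (r * B * (1 - B / K) - F * B)
        t += dt
    return t if B >= 2 * B0 else float("inf")

for F in (0.0, 0.05, 0.1, 0.15):
    t = years_to_double(F=F)
    label = (f"~{t:.1f} years to double" if t != float("inf")
             else "does not double at this fishing level")
    print(f"Fishing mortality F={F:.2f}: {label}")
```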

“If nations invest now in sustainably rebuilding their reef fishery resources, they can increase long-term yields and aquatic food supplies,” says Zamborain-Mason. “This could reduce their reliance on imports, improve nutritional and health outcomes, and boost ecosystem and economic resilience.”

Zamborain-Mason hopes their findings will encourage rigorous and systematic reef fisheries monitoring and management, while also raising awareness of the critical role of fisheries management for human health.

Such interventions could include greater investments in fisheries monitoring, management and enforcement, and the integration of fisheries into future food security plans and policies. Zamborain-Mason also considers that Saudi Arabia has the potential to provide a world-leading example of climate-resilient and sustainable reef fisheries management.

“By improving our understanding of reef fisheries and aquatic food systems in the region, we can provide scientifically grounded management information and support the Kingdom’s mission centered on sustainability, food security, and public health,” concludes Zamborain-Mason.

 

 

Stand in direct sunlight and you can feel it: the Sun delivers its biggest kick not as visible light, but as the infrared energy we experience as heat. Capturing this rarely tapped form of solar power could offer an efficient way to generate renewable fuels, KAUST researchers have shown[1].

“Saudi Arabia has some of the richest solar energy resources on Earth,” says Yunzhi Wang, a Ph.D. student in the lab of Huabin Zhang, who co-led the research. Harnessing this energy to split water molecules and release hydrogen could convert Saudi Arabia’s solar riches into an exportable resource. “Hydrogen is a versatile clean fuel, considered one of the most promising renewable energy sources,” Wang adds.

Although a range of known photocatalyst materials can harness the energy in sunlight to split hydrogen from water, most of these materials only absorb ultraviolet light, which makes up less than 5% of the solar spectrum. More recently, an unconventional type of photocatalyst made of organic molecules has been developed, which can capture the infrared light that makes up more than 50% of the Sun’s power.

To achieve efficient water splitting with these materials, however, researchers have had to use blends of two organic photocatalysts. Sunlight striking these materials generates pairs of excited-state electrons and positively charged holes. If the electron-hole pair immediately recombines, the captured solar energy is lost.

To counter this recombination, KAUST researchers mixed electron donor and electron acceptor photocatalysts, creating blends that quickly draw the excited electron and hole apart and enable hydrogen production.

“Producing these organic photocatalyst blends for real-world applications poses significant challenges,” Zhang says. It can be difficult to get the two components to mix, requiring complex fabrication processes. The resulting materials can also be unstable. “Single-component organic photocatalysts would therefore be more suitable for large-scale applications,” Zhang adds.

In a significant step toward that goal, the team has now designed an organic photocatalyst that does not need blending with another material to achieve efficient hydrogen production. “The photocatalyst is a type of long-chain organic material called a double-cable polymer,” Wang says.

“Double-cable polymers contain electron donor units as the conjugated polymer backbone and electron acceptors as side units,” Wang explains. The team showed that combining donor and acceptor groups in the same molecule successfully supported rapid electron-hole dissociation, enabling efficient solar-powered hydrogen production. “The hydrogen evolution rate of the double-cable polymer nanoparticles was 57 times higher than that of nanoparticles made using only the donor component of the polymer,” Wang says.

Zhang teamed up with KAUST colleague Omar Mohammed, whose group studies carrier dynamics in advanced materials, to probe the double-cable polymer photocatalyst’s behavior in detail. “Our group investigates carrier dynamics in various materials using advanced spectroscopic techniques,” Mohammed explains. The team used pulsed laser light to study the materials in their photo-excited state.

“Because of the rapid processes at play, we need to use an ultrashort laser pulse,” says Partha Maity, a research scientist in Mohammed’s team. “We conducted nanosecond transient absorption spectroscopy measurements to accurately investigate the lifetimes of the excited species,” Maity explains. The analysis showed that excited species in the double-cable polymer nanoparticles were surprisingly long-lived, enabling the boosted hydrogen production that Zhang’s team had observed.

Zhang and his team are now testing their double-cable polymer nanoparticles’ solar-driven hydrogen production performance using a customized panel reactor. “Fresh water is a precious resource and so the team is exploring hydrogen production from industrial wastewater splitting,” he says.

“Double-cable polymer photocatalysts have great potential beyond hydrogen production,” Zhang adds. “We plan to use them to fabricate single-component organic photocatalysts for other potential reactions.”

“The research is already generating promising results,” Mohammed says. “We have other interesting collaborative works underway with the Zhang group.”

A tiny sensor that detects hazardous head impacts the instant they occur could reshape safety monitoring in sports, transportation and other high-risk settings.

The device, developed by researchers at KAUST, acts like a safety switch that activates in response to sudden acceleration, sensing forces from any direction and gauging their severity in real time[1].

Roughly the size of a small fingernail, the sensor can be attached to football helmets, ski goggles, industrial hard hats or children’s headbands. Drawing no power in its normal standby state, it switches on only when a shock closes the internal electrical circuit through mechanical contact between the movable and fixed structures. This means the sensor can operate for long periods without draining the battery or requiring routine upkeep.

“It’s like a seatbelt for the brain,” says Yousef Algoos, an electromechanical engineer who built the device as a PhD student in the KAUST Robotics, Intelligent Systems, and Control Group led by Eric Feron. Algoos co-led the study together with Mohammad Younis from the State University of New York at Binghamton, United States.

“By combining omnidirectional precision, multi-threshold capability and passive operation, this innovation paves the way for next-generation wearable safety systems for concussion detection and impact monitoring in sports, transportation and daily life,” Algoos says. “There is no sensor on the market that offers this combination of features,” he adds.

The KAUST sensor detects impacts by mechanically distinguishing minor bumps from dangerous blows, without continuous monitoring or power-hungry electronics. Figure adapted from Algoos et al., Scientific Reports 15, 37713 (2025), cropped for layout, CC BY-NC-ND 4.0.

The project began in response to a personal loss. In 2018, Algoos’s brother Abdullah died following a car accident that led to head trauma and internal bleeding that doctors were slow to diagnose. “That experience opened my eyes to the life-saving importance of early smart detection tools on the spot,” Algoos says.

Existing head-impact monitors are used only in limited settings, in part because most rely on accelerometers that must be powered continuously. That constant activity drains batteries, requires bulky housings and limits the technology largely to elite sports or research environments.

The KAUST sensor takes a different approach. Rather than tracking movement nonstop, it stays dormant until a sharp jolt pushes a suspended mass inside the chip into contact with one of several concentric electrodes. Each contact corresponds to a different acceleration threshold, allowing the device to distinguish minor bumps from more dangerous blows without software, power-hungry circuitry or continuous monitoring.
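
The decision logic that the mechanical design encodes can be written out for illustration. In the actual device this mapping is implemented purely in hardware, by which concentric electrode the suspended mass touches, with no software or continuous sampling; the threshold values and severity labels below are hypothetical, not the device's calibrated settings.

```python
# Illustration only: the sensor realizes this logic mechanically; the
# electrode-to-threshold mapping here is a made-up example.

THRESHOLDS_G = {          # electrode index -> peak acceleration it represents
    1: 20,                # innermost electrode: minor bump
    2: 60,                # middle electrode: moderate impact
    3: 100,               # outermost electrode: severe impact
}

def classify_contact(electrode_index: int) -> str:
    """Map which concentric electrode was touched to an impact severity."""
    if electrode_index == 0:
        return "no impact (circuit open, sensor drawing no power)"
    g = THRESHOLDS_G.get(electrode_index, max(THRESHOLDS_G.values()))
    if g < 50:
        return f"minor bump (~{g} g): no action"
    if g < 100:
        return f"significant impact (~{g} g): flag for assessment"
    return f"severe impact (>= {g} g): trigger alert"

for contact in (0, 1, 2, 3):
    print(contact, "->", classify_contact(contact))
```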

To validate the design, the team subjected the chip to a series of controlled laboratory tests. Using a drop-table apparatus, they delivered shocks from multiple directions and found that the sensor consistently triggered at acceleration levels associated with mild and severe head trauma, regardless of the direction of impact.

With those results in hand, the researchers are preparing for the next stage: mounting the sensors on crash-test dummies to evaluate how they respond during complex, whole-body impacts.

Although still a prototype, Algoos says the technology could eventually “trigger an immediate alert through a mobile app, an audible signal, or a wireless notification to caregivers, coaches, or first responders, depending on the final product design.”

With a patent already filed, “we are now exploring commercialization pathways with potential partners,” he adds.

 

A new cellular atlas charts, in unprecedented detail, how human B cells are built step by step — and offers a window into how this developmental process goes awry in leukemia.

The KAUST-led investigation provides a sweeping inventory of the genes and regulatory switches that guide early B cell differentiation. This offers a blueprint that could open avenues for both diagnostics and therapies for B-cell acute lymphoblastic leukemia (B-ALL), the most common childhood cancer[1].

“We show that leukemia subtypes line up with distinct regulatory signatures that mirror differentiation in healthy individuals, which may help explain why some leukemias behave differently,” says David Gomez-Cabrero, a computational biologist who co-led the study.

“Because each stage of healthy B-cell development carries a recognizable signature,” adds Núria Planell, a co-leader of this project and former postdoc in Gomez-Cabrero’s group, “those patterns could be used for better classifications of leukemia, or eventually flag abnormal development earlier.”

B cells are the body’s antibody factories, generated in the bone marrow through a tightly ordered series of steps. These transitions are controlled by a handful of regulatory proteins that flip genes on or off.

Most of what biologists know about these switches comes from mice. Whether human cells follow the same rules has been less clear. The new work now provides the first comprehensive human roadmap, showing how genetic regulators orchestrate each step of B cell maturation and how those programs can be hijacked in leukemia.

Gomez-Cabrero — together with Jesper Tegnér and other members of their AI4BioMedicine lab, along with collaborators in Spain, Sweden, the United Kingdom, and the United States — combined experimental and computational approaches. They charted accessibility of stretches of DNA, and how that shaped gene activity, across eight successive stages of B cell precursors from 13 healthy donors. Integrating these datasets, they assembled a regulatory atlas of unprecedented depth that captures both master switches and the downstream programs they control. Those regulatory programs were further validated at single-cell resolution.
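
One common way to link a regulatory region to a candidate target gene in this kind of integration is to correlate the region's chromatin accessibility with the gene's expression across developmental stages. The short sketch below illustrates that idea only; it is not the authors' pipeline, and the stage labels and values are synthetic placeholders.

```python
import numpy as np
from scipy.stats import spearmanr

# Minimal sketch (not the authors' pipeline): correlate accessibility of a
# candidate enhancer with expression of a candidate target gene across
# developmental stages. All numbers below are synthetic placeholders.

stages = [f"stage_{i}" for i in range(1, 9)]          # eight B cell precursor stages

# Accessibility of a candidate enhancer (e.g., ATAC-seq signal, arbitrary units)
accessibility = np.array([0.2, 0.3, 0.5, 1.1, 2.4, 2.1, 1.0, 0.6])

# Expression of a candidate target gene (e.g., normalized RNA-seq counts)
expression = np.array([1.0, 1.2, 2.0, 4.5, 9.8, 8.7, 4.1, 2.5])

assert len(stages) == len(accessibility) == len(expression)

rho, pval = spearmanr(accessibility, expression)
print(f"Spearman rho = {rho:.2f}, p = {pval:.3f}")
if rho > 0.7 and pval < 0.05:
    print("Accessibility tracks expression: candidate regulator-target link")
```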

New insights from this resource included the discovery that ELK3, a transcription factor previously overlooked in human B cell biology, appears to spur proliferation during early development. The researchers traced its regulatory program and found it tied directly to cell cycle and growth pathways already implicated in cancer, suggesting a potential vulnerability that malignant cells may exploit.

Further clarity came from layering disease data onto the healthy-cell framework. When leukemia samples were aligned with the atlas, distinct patterns emerged: Subtypes of B-ALL bore the hallmarks of specific developmental states. By mirroring normal stages, the authors infer, these leukemias may inherit biological traits that shape their aggressiveness and resistance to therapy — insights that may help refine prognosis and treatment decisions.

“Turning these insights into treatments requires follow-up lab testing, but our map points to where to look first,” explains study co-author Xabier Martinez de Morentin, a postdoctoral scientist at KAUST.

Importantly, researchers worldwide can explore the atlas through B-rex[2], a publicly available web application built by Alberto Maíllo, a KAUST research scientist.

Meanwhile, Gomez-Cabrero and his colleagues are extending the approach in Saudi Arabia to local blood cancers, with a focus on regional genomic variation and lifestyle-specific risk factors too often absent from global studies. By grounding their atlas in the realities of local biology, they hope to reveal cancer signatures that might otherwise remain invisible and to help bring precision medicine closer to patients across the Gulf region.

 

Plants run on growth regulators, tiny chemical signals that choreograph development, but their lines of communication are more malleable than once assumed. A trio of KAUST-led studies underscore this flexibility by showing that the signaling and metabolism of specialized plant hormones known as strigolactones are regulated not by rigid locks and keys, but through enzymes and receptors capable of adapting to context and competitors.

Together, the findings point to a common theme: plants preserve control over these influential hormones by building in molecular versatility. They also broaden understanding of hormone signaling in crops and across the plant kingdom, showing that even well-mapped pathways can harbor unexpected cross-connections — and that tiny amounts of hormone act through finely tuned networks responsive to both internal cues and environmental change, with clear implications for agriculture.

“Understanding strigolactone perception and controlled catabolism is integral to optimizing crop growth and minimizing losses,” says Salim Al-Babili, a plant biochemist who co-led the studies with Stefan Arold, a structural biologist at KAUST.

“By combining structural biology with plant biochemistry, we’re revealing that these signaling systems are far more dynamic than previously thought,” adds Arold. “Flexibility, not rigidity, seems to be what gives plants such adaptive power.”

The work stems from collaborations between researchers in the Center of Excellence for Sustainable Food Security and the Center of Excellence for Smart Health at KAUST. Using Arabidopsis thaliana, a favored model organism of plant biologists, the teams explored how this modest weed manages the levels and perception of strigolactones — and in doing so, uncovered unexpected strategies that could apply across the plant kingdom.

One line of investigation zeroed in on a protein called CXE15, an enzyme newly recognized for breaking down strigolactones. Unlike similar enzymes that function alone, CXE15 is only active when two identical copies come together. The resulting dimer creates a pocket large enough to hold and cleave the hormone.

This activity is not constant, however. The KAUST team, led by Umar F. Shahul Hameed from Arold’s group and Aparna Balakrishna from Al-Babili’s group, discovered that under oxidative stress, a bond forms between the enzyme’s two halves, effectively locking it shut and stopping its activity[1].

Consequently, plants can suspend strigolactone breakdown when conditions — such as a scarcity of essential nutrients — make higher hormone levels more useful. That ability to toggle hormone catabolism on and off provides a way for plants to fine-tune their architecture, encouraging root growth to forage for nutrients or signaling to symbiotic fungi when resources are limited.

Using Arabidopsis thaliana, a favored model organism of plant biologists, KAUST scientists Umar F. Shahul Hameed, Juan C. Moreno, Alexandra Vancea, and Aparna Balakrishna studied how this modest weed regulates strigolactone levels and perception. © 2025 KAUST

A complementary story emerges from the receptors that sense strigolactones. These proteins, it turns out, can also take on more than one role. In a separate study, Al-Babili, Arold, and their colleagues found that a growth regulator called zaxinone binds directly to the strigolactone receptor DWARF14. By competing with strigolactones for the same site, zaxinone blocks the receptor’s usual handshake with partner proteins and dampens strigolactone signaling[2].

“We finally solved the enigma of the connection between zaxinone and strigolactones,” says Juan C. Moreno, a plant molecular biologist in Al-Babili’s group at KAUST, and lead author of the study. “These findings not only show that a receptor of a plant hormone can bind different growth regulators, but also unravel, for the first time, a protein interaction partner for zaxinone,” he explains.

The third study takes this insight even further, showing how perception of strigolactones is translated into action. Using cryogenic electron microscopy, the KAUST team, led by Alexandra Vancea from Arold’s group, revealed the structure and dynamics of the protein complex that forms after the strigolactone is bound and processed by its receptor. The structural snapshots of this receptor–ligase–substrate complex reveal that strigolactone hydrolysis triggers a sequence of flexible, cooperative interactions among the three proteins, turning a fleeting chemical signal into a genetic response[3].

This work shows that even this key step of signaling depends on dynamic molecular choreography, not static binding — echoing the same theme of flexibility that runs through all three discoveries.

The three studies reveal that from hormone breakdown to receptor signaling, plants rely on adaptable molecular assemblies to fine-tune their growth and environmental responses.

Taken together, these advances change the framing of strigolactone biology. Regulation does not occur solely through biosynthesis or simple receptor turnover. Instead, plants employ specialized enzymes whose activity can be switched off under stress, receptors that can accommodate structurally distinct metabolites competing for control, and signaling assemblies that dynamically adjust their configuration to fine-tune gene expression.

For agriculture, these insights could one day inform strategies to design crops with greater resilience or more finely tuned architecture — reducing vulnerability to parasitic weeds or enhancing beneficial fungal partnerships. And for plant biology, they are a reminder that even the most intensively studied laboratory species still hold molecular secrets capable of reshaping scientific dogma.

The continued growth of solar and wind power is reshaping global energy systems, creating an urgent demand for storage technologies that are both durable and affordable. Sodium-ion batteries are an attractive alternative to lithium technologies because sodium is abundant, widely distributed, and inexpensive.

Despite their promise, high-voltage sodium batteries have remained difficult to commercialize due to a fundamental materials challenge: the electrolyte must stabilize both the highly reactive sodium metal anode and the high-voltage cathode — two surfaces that typically require opposite chemical conditions to remain stable.

“Traditionally, additives that protect one side of the battery tend to damage the other,” says Husam Alshareef, a materials scientist at KAUST who leads the Center of Excellence for Renewable Energy and Storage Technologies (CREST). “This trade-off has been a major barrier to developing practical high-voltage sodium batteries.”

Alshareef and his collaborators at CREST have now overcome this long-standing limitation by introducing a new class of electrolyte additives called non-solvating additives (NSAs). The approach offers a simple, low-cost route to stabilizing both electrodes simultaneously, enabling long-life sodium batteries that operate at voltages comparable to commercial lithium-ion systems[1].

The study focuses on how the additive interacts with ions in the electrolyte. Most existing additives are strongly solvating: they bind tightly to sodium ions, follow them to the anode during charging, and often decompose there, destabilizing the sodium metal surface. Meanwhile, they leave the cathode insufficiently protected against high-voltage degradation.

The KAUST team took the opposite approach. They identified a fluorinated ether molecule — 1,1,2,2-tetrafluoroethyl 2,2,3,3-tetrafluoropropyl ether (TTE) — that interacts weakly with sodium ions but preferentially binds to negatively charged anions. “Because these additives do not cling to sodium ions, they aren’t dragged to the anode, where they could be harmful,” explains Dong Guo, the study’s lead author. “Instead, they travel with the anions toward the cathode, where their protective effect is actually needed.”

This “anti-solvation” mechanism enables the use of a tiny amount of additive — just 3% by weight — to form a robust, stable interphase on the high-voltage cathode. The result is a battery that withstands aggressive cycling conditions once thought incompatible with sodium chemistry. In tests, cells retained 90% capacity after 1,200 cycles, while the sodium metal anode achieved an exceptionally high efficiency of 99.92%, indicating strong long-term cycling stability.
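
As a rough sanity check on what such cycling numbers imply (this arithmetic is ours, not part of the paper's analysis), the average per-cycle capacity retention behind a 90% figure after 1,200 cycles can be computed directly; note that this is a different quantity from the anode's 99.92% Coulombic efficiency.

```python
# Back-of-envelope arithmetic (not from the paper): average per-cycle capacity
# retention implied by keeping 90% of capacity after 1,200 cycles.

cycles = 1200
final_retention = 0.90

per_cycle = final_retention ** (1 / cycles)
print(f"Average capacity retention per cycle: {per_cycle:.6f} ({per_cycle:.4%})")
# -> roughly 0.999912 per cycle, i.e. about 0.009% average fade per cycle
```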

The team also validated the approach in Ah-scale pouch cells, achieving energy densities of approximately 180 Wh kg⁻¹, comparable to those of lithium iron phosphate (LFP) batteries, which are widely used in stationary storage and mid-range electric vehicles. Because the NSA formulation works as a drop-in additive for standard ether electrolytes and relies on commercially available chemicals, it is immediately compatible with industrial manufacturing processes.

“This is a practical and scalable solution,” Alshareef says. “By rethinking how additives behave inside the electrolyte, we can unlock high-voltage sodium batteries without relying on complex or expensive chemistries.”

With lithium supply constraints and cost pressures shaping global energy markets, sodium batteries are becoming increasingly important for grid storage, backup power systems, and cost-sensitive electric vehicles. The NSA concept could accelerate this transition by narrowing the performance gap between sodium- and lithium-based technologies while maintaining the resource and cost advantages of sodium.

“Our approach combines high voltage, long cycle life, and low cost,” adds Guo. “It opens the door to sustainable, lithium-free energy storage at scale.”

 

 

Extremophile algae that thrive in acidic hot springs could be ideal industrial partners for supporting sustainable manufacturing. These tough and versatile algae can be grown in the dark and fed on a wide range of organic wastes, and they are especially productive when their growth environment is spiked with CO₂, KAUST researchers have shown[1].

“Galdieria’s evolution in hot, corrosive volcanic pools has equipped it to colonize new habitats few other organisms can handle,” says Mauricio Lopez-Portillo Masson, a Ph.D. candidate in the lab of Kyle Lauersen, who led the research. “Galdieria can be found in industrial waste sites with high dissolved gases, high temperatures, and acidic conditions, in addition to its native hot springs,” Lopez-Portillo Masson adds.

“Resilient and metabolically flexible, Galdieria can switch between photosynthesis and feeding on glucose or other organic molecules to meet its energy needs, which has sparked industrial interest,” Lauersen explains. As it grows, Galdieria produces a rich, balanced blend of essential amino acids, suggesting potential as a novel food or high-quality livestock feed. The alga also produces a blue pigment called phycocyanin, with applications as a natural blue food coloring, a cosmetic ingredient, and an antioxidant nutraceutical.

As Galdieria’s natural environment is in acid hot springs, its phycocyanin is heat- and acid-stable, and thus well-suited for industrial food pasteurization or cosmetics production. “Because Galdieria can eat many carbon sources and can be grown in fermenters like those used for brewing, the field’s long-term goal is to use Galdieria to turn sugar or other organic carbon into this valuable blue pigment,” Lauersen says.

In collaboration with Peter Lammers at Arizona State University, the team identified a Galdieria strain from Yellowstone National Park, Galdieria yellowstonensis, that reliably produces photosynthetic pigments, including phycocyanin, even when fed a glucose diet. The next step was to find the ideal conditions for growing these unusual microbes at scale.

While screening potential conditions, the team discovered that spiking the fermentation tank with CO₂ significantly boosted Galdieria’s growth. That result might be expected when the alga is grown in the light and consuming CO₂ during photosynthesis, but the team recorded the same boost even when Galdieria was grown on glucose in the dark.

Working with Michael Fox and KAUST’s Core Labs, the team used stable-isotope measurements to confirm that CO₂ acts as a trigger, rather than being incorporated into the alga’s biomass. These growing conditions may resemble the CO₂-rich environment in hydrothermal vents where the species evolved, Lopez-Portillo Masson says. “It’s as if the cells recognize that they’re ‘home’ when CO₂ levels are high.”

Testing the CO₂-enriched growth conditions in a real-world example of circular waste management, the team showed that the alga happily consumed confectionery waste from a local Mars chocolate bar factory, generating phycocyanin-containing Galdieria biomass.

Other industries generate much more carbon waste than the 20 tons generated annually by the chocolate factory, Lauersen adds. “Glycerol is also a preferred Galdieria food: in the Kingdom, some companies produce hundreds of tons of waste glycerol per month.

“From existing local waste streams, we could grow 150 tons of algal biomass per month, which can be used for animal feed, cosmetics, pigments, and many other applications,” Lauersen says. Such resource circularity aligns with the Saudi Vision 2030 goals and is within the Kingdom’s Research, Development, and Innovation Authority (RDIA) strategic pillars for Sustainable Environment and Essential Needs, as well as Economies of the Future.

 

A wearable device that alerts people with food allergies before a reaction begins has the potential to reduce life-threatening anaphylaxis and transform allergy management from reactive to preventive care.

The AllergE patch is a microneedle-based biosensor developed by researchers at KAUST that painlessly detects immunoglobulin E (IgE), the antibody that triggers allergic reactions, directly from the fluid beneath the skin[1].

Food allergies — especially to eggs, nuts, milk, and seafood — are a growing public health concern. Conventional allergy tests, such as skin pricks and blood draws, are invasive, time-consuming, and carry the risk of provoking mild reactions. By contrast, the AllergE patch is painless and quick, relying on an array of tiny, porous needles — each less than a millimeter long and about the width of a human hair.

Inside each microneedle sit DNA strands, known as aptamers, that act as molecular sentinels. When these encounter IgE antibodies, they twist into new shapes that generate an electrochemical signal. A flexible electrode and a small reader then translate the signal into measurable data — a setup that the researchers say could eventually sync with a smartphone for remote monitoring at home.
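
Electrochemical aptasensors of this kind are typically read out by fitting a calibration curve of signal change against log-concentration and then inverting it for new measurements. The sketch below illustrates that general readout step only; it is not the device's firmware, and every calibration value in it is a made-up placeholder rather than data from the study.

```python
import numpy as np

# Minimal sketch (not the device's firmware): invert a signal-vs-log(conc.)
# calibration curve to estimate IgE concentration. Illustrative values only.

cal_conc_pg_ml = np.array([30, 100, 300, 1000, 3000])       # known IgE standards
cal_signal_uA = np.array([0.12, 0.21, 0.30, 0.41, 0.50])    # measured current change

# Linear fit: signal = slope * log10(concentration) + intercept
slope, intercept = np.polyfit(np.log10(cal_conc_pg_ml), cal_signal_uA, 1)

def ige_concentration(signal_uA: float) -> float:
    """Invert the calibration curve to estimate IgE in pg/mL."""
    return 10 ** ((signal_uA - intercept) / slope)

print(f"Signal 0.35 uA -> ~{ige_concentration(0.35):.0f} pg/mL IgE")
```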

The result is a continuous readout of a person’s allergy-related immune levels, all in real time. “This smart patch could one day help prevent anaphylaxis and enable safe, at-home early allergy sensitization detection,” says Dana Alsulaiman, the study’s corresponding author. Researchers hope that early, noninvasive detection could help families monitor allergy risks before exposure leads to a dangerous immune response.

In lab tests on artificial skin substitutes and on explanted skin from a human donor, the device detected IgE concentrations as low as 30 picograms per milliliter — hundreds of times more sensitive than most current assays. It could also distinguish IgE from structurally similar antibodies that dominate the body’s immune defenses but play no role in mediating allergic responses.

The microneedles are manufactured using two-photon polymerization, a high-resolution 3D lithography technique that allows precise control over their geometry and strength. That level of structural precision ensures the needles reach just beneath the surface to collect the tiny amounts of fluid needed for analysis, deep enough to sense, but shallow enough to stay painless and unobtrusive, all without breaking or losing functionality over repeated uses.

Though the technology is still at an early stage, the researchers envision the AllergE patch as part of a broader family of skin-worn diagnostics capable of tracking immune molecules, hormones, and other biomarkers linked to inflammation and disease. The KAUST team includes members of the Lab of Biomedical Materials and Devices (BioMAD Lab), led by Alsulaiman, and the Sensors Lab, led by Khaled Salama, the study’s co-corresponding author.

But it all started with a simple goal: to make allergy testing faster, safer, and more accessible. The project was prompted by the experience of Esraa Fakeih, a former Ph.D. student in Salama’s lab, who has experienced the dangers of severe food allergies both herself and within her family. Around five years ago, she and her young nieces each suffered serious reactions after eating foods containing trace amounts of allergens.

“These situations inspired the idea of a wearable patch that can help prevent severe reactions,” Fakeih says — an idea that goes beyond being a technical challenge to be a personal mission. And for millions of people at risk of severe allergies, a wearable sensor like AllergE could transform management of allergic reactions — ideally preventing them altogether, or at least providing precious extra minutes of warning when every second counts.

 

Editing the genome of human cells to disable, repair, or replace faulty genes holds great potential for treating debilitating conditions like sickle cell disease and cancer.

Several therapies that use the CRISPR-Cas9 tool — molecular scissors that precisely cut DNA to edit problematic genes — have now reached the clinic, including a treatment for sickle cell disease and β-thalassemia that has recently been approved by UK and US regulators. Many other potential treatments are undergoing clinical trials worldwide.

However, the impact of this method on epigenetics remains insufficiently explored. To address this gap, a KAUST study employed a novel technique to describe some of the complex epigenetic implications of CRISPR-Cas9 editing[1].

The study shows unexpected side effects of genomic editing that extend beyond changes to DNA sequences. The work, by Mengge Wang, Yingzi Zhang, Chongwei Bi, and Mo Li, revealed that CRISPR-Cas9 can also disrupt the cell’s epigenetic landscape, leading to changes in DNA methylation that persist long after the DNA breaks have been repaired.

“Scientists have extensively studied the genetic consequences of Cas9-induced DNA breaks, yet their potential epigenetic impact remains largely unknown,” says Wang. “Our work shows that genome editing can leave behind epigenetic scars — lasting changes in DNA methylation that endure well after the DNA itself is repaired.”

DNA methylation acts like a biochemical ‘switch’ that controls which genes are turned on or off without changing the underlying genetic code. When these switches are disrupted, genes may become abnormally silenced or activated, potentially affecting how cells grow, differentiate, or respond to stress. Such epigenetic alterations are often linked to diseases like cancer, aging-related disorders, and developmental abnormalities.

Most previous studies relied on short-read or PCR-based methods, which cannot simultaneously capture DNA sequence and methylation information, especially at single-molecule resolution. These methods lose the native chemical context of DNA, notes Wang, making it difficult to study how CRISPR-Cas9-induced DNA breaks affect genome methylation.

“By contrast, we used long-read nanopore sequencing of native DNA that overcomes these limitations and provides a complete view of both sequence and methylation changes on the DNA molecule,” explains Zhang. “We tracked how the epigenetic ‘marks’ around Cas9 cutting sites changed after DNA repair.”
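
The basic comparison behind such tracking can be sketched as follows: given per-read methylation calls from nanopore sequencing, compute the methylation frequency in a window around a cut site before and after editing. This is an illustrative outline only, not the authors' customized analysis; the cut-site coordinate, window size, and call data are synthetic placeholders.

```python
import numpy as np

# Minimal sketch (not the authors' analysis): compare per-CpG methylation
# frequencies around a Cas9 cut site before and after editing.

CUT_SITE = 1_000_000
WINDOW = 2_000          # +/- 2 kb around the cut site

def methylation_frequency(calls, start, end):
    """calls: list of (position, is_methylated) pairs from individual reads."""
    in_window = [m for pos, m in calls if start <= pos <= end]
    return sum(in_window) / len(in_window) if in_window else float("nan")

rng = np.random.default_rng(0)
positions = rng.integers(CUT_SITE - WINDOW, CUT_SITE + WINDOW, size=500)

# Synthetic example: ~20% baseline methylation, rising to ~60% after repair
before = [(int(p), rng.random() < 0.2) for p in positions]
after = [(int(p), rng.random() < 0.6) for p in positions]

f_before = methylation_frequency(before, CUT_SITE - WINDOW, CUT_SITE + WINDOW)
f_after = methylation_frequency(after, CUT_SITE - WINDOW, CUT_SITE + WINDOW)
print(f"Methylation near cut site: {f_before:.0%} before vs {f_after:.0%} after editing")
```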

“Together with insights from our stem cell models and a customized analysis strategy, we demonstrated that Cas9-induced DNA breaks can cause profound, stable changes in local DNA methylation patterns,” says Wang. “Importantly, this effect is universal — it occurs regardless of the chromatin context or cell type.”

Changes in DNA methylation could influence gene expression long after the DNA itself is repaired. “We found that similar DNA methylation changes also occur during natural DNA damage repair processes in unedited human cells,” adds Zhang. “It may be possible to harness controlled breaks and subsequent repair processes to correct abnormal methylation patterns associated with cancer, genetic disorders, or aging.”

“This discovery reveals a previously overlooked layer of complexity in genome editing and DNA repair, emphasizing the need to consider epigenetic stability alongside genetic accuracy,” says Li. “Recognizing and monitoring these effects will help improve the safety of therapeutic genome editing.”

The researchers believe that understanding these processes could support the development of new therapeutic strategies. “Controlled DNA breaks could be harnessed to correct abnormal methylation patterns associated with cancer or genetic disease — turning a potential side effect into a precision tool for epigenetic therapy,” concludes Li.