The rapid climate change we are experiencing today is mainly driven by the greenhouse gases we humans keep releasing into the air.
But new evidence from ancient Antarctic ice cores suggests this hasn't always been the case across the past three million years of Earth's changing climate.
According to the findings of two new papers published in Nature, at certain transition points ocean temperatures could have had a greater influence over Earth's climate than greenhouse gases.
Two research teams analyzed ice cores extracted from the Allan Hills, a blue ice region of Antarctica. The Allan Hills cores are samples of some of the world's oldest ice, with some dating as far back as 6 million years ago.
A drilled core at the Allan Hills during the 2024-2025 field season.
(Jenna Epifanio/COLDEX)
Blue ice areas like the Allan Hills make up only about 1 percent of the surface area of Antarctica's ice sheet, and they're named as such because strong winds blow away any new snow, keeping older, glacial ice exposed at the surface.
The Allan Hills region has barely moved horizontally or vertically over time, making it a uniquely suitable site for taking cores of very, very old ice.
Ice cores are some of our best natural 'archives' of Earth's long-term climate.
They don't necessarily contain a complete, continuous record. The Allan Hills cores, for example, contain layers that are out of chronological order thanks to the way the ice was deposited across the millennia.
But each layer of ice contains a climate snapshot that can tell us a lot about what was going on at the time of freezing, and there are ways of decoding their secrets.
Certain isotopes in the ice hint at ocean temperature. Impurities like volcanic ash and other particles can indicate sources of air pollution.
And, perhaps most importantly to climate scientists, the ice can trap tiny bubbles of the air, revealing the historic gas composition of the atmosphere across millions of years.
A cut section of very old ice from Allan Hills, at the NSF Ice Core Facility in Denver, Colorado.
(Peter Neff/COLDEX)
Woods Hole Oceanographic Institution paleoclimatologist Sarah Shackleton led an international team of researchers in the study focused on global ocean temperatures across the past 3 million years.
Dissolved xenon and krypton, two noble gases that dissolve in seawater at different temperatures, provided them with a way of estimating the ocean's heat.
These proxy measurements suggest the ocean cooled drastically around 2.7 million years ago, roughly matching up with the Plio-Pleistocene Transition, when the Earth gradually shifted from a warmer to cooler climate that led to glacier formation across large parts of the Northern Hemisphere.
The ice core data also suggests that average ocean temperatures stayed relatively stable across the Mid-Pleistocene Transition, another shift in glacial cycles that occurred between 1.2 and 0.8 million years ago.
Ice core drilling at Allan Hills in 2019.
(Austin Carter/COLDEX)
Meanwhile, from the same ice cores, a team led by Oregon State University geochemist Julia Marks-Peterson found atmospheric levels of carbon dioxide and methane were "broadly stable" across the past 3 million years.
"Although paleoclimate archives from Antarctic blue ice areas are complex, our records show that measurements of greenhouse gases in ice cores can be extended to the late Pliocene epoch, providing snapshots of Earth's climate system over a time of global cooling and falling sea level," Marks-Peterson and team write.
As Cambridge climatologist Eric Wolff writes in an accompanying commentary article, this suggests that either ice-sheet growth and survival were "exquisitely sensitive" to minuscule changes in carbon dioxide – or past changes in Earth's climate may have been driven by something else.
The work of Shackleton and colleagues could provide further clues to the puzzle. They found an apparent decoupling of changes in sea surface and mean ocean temperatures.
Understanding the way Earth's climate worked before we started meddling with it at a large scale is important if we want to figure out how to re-stabilize this planet we call home.
But there are limitations to interpreting these ice cores, as Shackleton recently explained in a Science Sessions podcast.
"These records are still quite new, and they're more complicated to interpret than the continuous ice cores that we're used to working with," she said.
"So, with how highly compressed the ice is, especially the oldest ice, we're also probably averaging over glacial and interglacial cycles, so we're currently unable to study how the climate evolved across glacial and interglacial periods.
"Exactly what these records capture in terms of exactly how smooth or exactly how much we're averaging over a glacial versus interglacial conditions is still an outstanding question."
You're late for an important appointment. Just as you are leaving your house, you realise your phone is flat.
Imagine you could charge it almost instantly by exploiting the strange rules of quantum physics. That's the promise of quantum batteries.
My colleagues and I at CSIRO have developed the world's first quantum battery prototypes – and the direction the technology has taken is surprising.
Collective quantum effects
You may have heard of the peculiar quantum effects of superposition and entanglement, which allow objects – mostly very tiny ones – to behave very strangely. These same effects could also allow quantum computers to solve problems that conventional computers cannot.
One strange feature of the quantum world is what are called "collective effects". They are what give quantum batteries their unique properties.
Under the right circumstances, the storage units of quantum batteries don't act individually, but behave collectively. In a counterintuitive twist, this means the units charge faster together than if they were charging alone.
Let's say your quantum battery has N storage units, and each unit takes one second to charge. Collective effects mean that if all units are charged at once, each unit will take only 1∕√N seconds to charge.
This means that the bigger your quantum battery, the less time it takes to charge. If it doubles in size, charging will take only about 70 percent as long (a factor of 1∕√2).
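The collective speed-up described above can be sketched numerically. This is purely an illustrative model of the ideal 1∕√N scaling, not a simulation of a real device; the function name and the one-second single-unit baseline are taken from the article's example.

```python
import math

def charging_time(n_units, t_single=1.0):
    """Per-unit charging time under ideal collective charging.

    Illustrative model only: assumes the 1/sqrt(N) speed-up the
    article describes, where t_single is how long one isolated
    unit takes to charge (1 second in the article's example).
    """
    return t_single / math.sqrt(n_units)

# Quadrupling the number of units halves the charging time,
# while doubling it reduces the time by a factor of 1/sqrt(2).
print(charging_time(1))            # 1.0 (seconds)
print(charging_time(4))            # 0.5
print(round(charging_time(2), 3))  # 0.707
```

Note the contrast with a conventional battery, where charging time grows with capacity rather than shrinking.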
It is as if each unit somehow knows there are other units around, and their presence makes the unit charge faster. Strange, right?
This is radically different from how conventional batteries work, where bigger batteries typically take longer to charge. That's why it might take an hour to charge your mobile phone, but your electric car needs all night.
Building a quantum battery
The idea of a quantum battery was just a theoretical curiosity for a long time. But back in 2018, I set out to demonstrate that they could actually be built.
In 2022, working with colleagues in the United Kingdom and Italy, we built a quantum battery prototype using an organic microcavity – a kind of tiny, complicated multi-layer sandwich of several different materials that traps light in a particular way.
And we were able to show for the first time the exotic behaviour where larger quantum batteries really do take less time to charge.
The world's first fully functioning proof-of-concept quantum battery engineered by CSIRO and collaborators, the University of Melbourne and RMIT.
(CSIRO)
In fact, we were able to demonstrate that the charging time decreases as 1∕√N, where N was the number of molecules in our battery. The more molecules we included, the faster the battery charged — exactly as theory had predicted.
One thing this first prototype didn't have was a way to extract the energy out of it. To do this, in our latest study, published in the journal Light: Science & Applications, we added extra layers into our device that converted the energy into an electrical current. This marks a major step towards a practical quantum battery.
Progress still to be made
So, why aren't we seeing quantum batteries in stores?
Well, the capacity of quantum batteries is still tiny (a few billion electron-volts), and the time they hold their charge is fleetingly short (a few nanoseconds). This means quantum batteries are too small to power conventional devices such as your mobile phone, at least for now.
CSIRO's clean lab for engineering prototype quantum batteries.
(CSIRO)
But quantum batteries might be perfect for powering quantum devices such as quantum computers. In fact, quantum batteries could be the exact solution quantum computers need to work at bigger scales and become practical.
While we don't have practical quantum batteries yet, we are currently working on ways to scale up our prototype's size and extend how long it can hold its charge. We hope to create a hybrid design that combines the exceptional charging speed of the quantum battery with the long storage time of the classical battery.
The progress we've made is a testament to the century of theoretical work done by quantum scientists before us.
Our first prototype's battery charge lasted nanoseconds. The Wright brothers' first plane flight lasted a little longer. Progress takes time – but quantum batteries are certainly on our horizon.
Once plants really got a foothold, they transformed our planet.
(Image credit: EyeEm Mobile GmbH/Getty Images)
Long before dinosaurs roamed the land, Earth looked very different from the planet we know today. Around 500 million years ago, most of Earth's surface was bare rock and dry soil. There were no trees, no grass and no flowers. Life existed almost entirely in the oceans.
Then something amazing happened: Plants began to grow on land.
This moment was one of the most important events in Earth's history because it changed the planet forever. As a geoscientist, I am interested in changes in the diversity of flora and fauna — that's plants and animals — over time.
Predecessors of plants lived in water
The story of plants begins in the water. The earliest plantlike organisms were simple, tiny green life-forms such as algae. You can still see algae today as seaweed along beaches or as green slime on rocks in ponds.
Algae have lived in Earth's oceans and lakes for over 1 billion years. They can make their own food, using sunlight, water and carbon dioxide to create sugars. This process is called photosynthesis; it releases oxygen — the gas we need to breathe — as a byproduct.
At first, Earth's atmosphere had very little oxygen. Over millions of years, photosynthesizing organisms like algae and some bacteria slowly released oxygen into the air. This change, sometimes called the Great Oxygenation Event, made it possible for larger and more complex life to evolve. Without oxygen-producing organisms, animals, including humans, could never have existed.
Scientists believe the first true plants evolved from green algae around 470 million years ago. These early plants lived in shallow water near shorelines, where conditions changed often. Sometimes they were underwater, and sometimes they were exposed to air. This habitat helped them slowly adapt to life on land.
Getting a foothold on dry land
Moving onto land was not easy. Water plants are supported by water and can absorb nutrients easily, but land plants faced new challenges. How would they avoid drying out? How could they stand upright without floating? How would they get water and nutrients from dry ground?
To survive, early plants evolved important new features. One key adaptation was a waxy coating, called a cuticle, which helped keep water inside the plant. Plants also developed stronger cell walls that allowed them to stand upright against gravity. Simple rootlike structures, called rhizoids, helped anchor plants to the ground and absorb water and minerals from the soil.
The earliest land plants were very small and simple. They looked similar to modern mosses, liverworts and hornworts, which still grow today in damp places like forest floors and stream edges. These plants did not have true roots or stems, and they stayed close to the ground. Fossils of early land plants, such as Cooksonia, date back to about 430 million years ago and show small branching stems only an inch or two tall.
The Y-shaped fossil in this rock is Cooksonia barrandei, the oldest terrestrial plant in the world (432 million years old), seen at the National Museum in Prague, Czech Republic.
(Image credit: Skot, CC BY-SA 4.0, via Wikimedia Commons)
Even though these plants were tiny, they had a huge impact on Earth. As plants spread across land, their roots helped break down rocks into soil, a process called weathering. This created richer soil that could support more life.
Plants also released more oxygen into the atmosphere, improving air quality and helping animals breathe. Plants created new habitats and food sources, allowing insects and other animals to move from water onto land.
Increasing complexity across millions of years
Once plants became established on land, evolution continued. Around 420 million years ago, plants evolved vascular tissue: tiny tubes that transport water and nutrients throughout the plant. This adaptation allowed plants to grow taller and stronger because water could be moved upward from the roots to the leaves. These vascular plants included early relatives of ferns and club mosses.
With vascular tissue, plant life really started to flourish. By about 360 million years ago, vast forests covered much of Earth. Giant ferns and treelike plants, some over 100 feet (30 meters) tall, dominated the landscape. Over time, dead plant material from these forests was buried and compressed, eventually forming coal, which people still use as an energy source today.
Another major step in plant evolution was the development of seeds, around 380 million years ago, first appearing in seed ferns. Seed plants, such as early conifers — a group that includes modern pine trees — could reproduce without needing water for fertilization. Seeds protected plant embryos and allowed plants to survive harsh conditions like drought or cold.
The most recent major plant evolution happened around 140 million years ago, when flowering plants, what scientists call angiosperms, appeared. Flowers helped plants attract animals like insects and birds, which spread pollen and seeds. Fruits developed to protect seeds and help them travel. Today, flowering plants make up most of the plants we see, including trees, grasses, fruits and vegetables.
The first plants didn't just survive; they transformed Earth. They changed the atmosphere, built soil, and created ecosystems that allowed animals to thrive on land. Thanks to plant evolution, Earth became a green, living planet full of diverse life.
Actor Dick Van Dyke at the Daytime Emmys Awards in 2024.
(Rodin Eckenroth/Stringer/Getty Images)
Dick Van Dyke, the legendary American actor and comedian who starred in classics such as Mary Poppins and Chitty Chitty Bang Bang, turned 100 on December 13. The beloved actor credits his remarkable longevity to his positive outlook and never getting angry.
While longevity of course comes down to many factors – including genetics and lifestyle – there is some truth to Van Dyke's claims.
Numerous studies have shown that keeping stress levels low and maintaining a positive, optimistic outlook are correlated with longevity.
For instance, in the early 1930s, researchers asked a group of 678 novice nuns – most of whom were around 22 years of age – to write an autobiography when they joined a convent.
Six decades later, researchers analysed their works. They also compared their analyses with the women's long-term health outcomes.
The researchers found that women who expressed more positive emotion early in life (such as saying they felt grateful, instead of resentful) lived an average of ten years longer than those whose writing tended to be more negative.
A UK study also found that people who were more optimistic lived between 11% and 15% longer than their pessimistic counterparts.
And, in 2022, a study that looked at around 160,000 women from a range of ethnic backgrounds found that those who reported being more optimistic were more likely to live into their 90s compared to pessimists.
One potential explanation for these outcomes is related to the effects anger has on our hearts.
People who tend to have a more positive or optimistic outlook on life appear to be better at managing or controlling their anger. This is important, as anger can have a number of significant effects on the body.
Dick Van Dyke credits his remarkable longevity to his positive outlook and never getting angry.
(Scott Dudelson/ Stringer/ Getty Images)
Anger triggers the release of adrenaline and cortisol, the body's primary stress hormones – particularly in men. Even brief angry outbursts can lead to a decline in cardiovascular health.
The added strain that chronic stress and anger put on the cardiovascular system has been linked to increased risk of developing conditions such as heart disease, stroke, and type 2 diabetes.
These diseases account for roughly 75% of early deaths. While stress and anger aren't the only causes of these diseases, they contribute to them significantly.
So when Dick Van Dyke says he doesn't get angry, it may well be one of the reasons for his longevity.
There's also a deeper, cellular explanation behind stress's influence on longevity, which relates to our telomeres. These are protective caps found on the ends of our chromosomes (the packages of DNA information found in our cells).
In young, healthy cells, telomeres remain long and sturdy. But as we age, telomeres gradually shorten and fray. Once they become too worn, cells struggle to divide and repair themselves. This is one reason ageing accelerates over time.
Stress has been linked to faster telomere shortening, which makes it harder for cells to communicate and renew. In other words, stress-inducing emotions such as uncontrolled anger may speed up the ageing process.
One study also found that meditation, which can help reduce stress, is positively associated with telomere length. So better anger management might just help support a longer life.
Added to this is the fact optimists appear to be more likely to engage in healthy habits, such as regular exercise or healthy eating, which can further support health and longevity by lowering risk of cardiovascular disease. Even Dick Van Dyke himself still tries to exercise at least three times a week.
Improving longevity
If you want to live as long as Dick Van Dyke, there are things you can do to manage your stress and anger levels.
Contrary to popular belief, trying to "let out" anger by punching a bag, shouting into a pillow, or running until the feeling passes doesn't actually help. These actions keep the body in a heightened state, which impacts the cardiovascular system and can prolong the stress response.
A calmer approach works better. Slowing down your breathing, counting your breaths, or using other relaxation techniques (such as yoga) can help calm the cardiovascular system rather than overstimulate it. Over time, this reduces strain on the heart, which can help you live longer. Aim to do this whenever you're feeling particularly stressed or angry.
You can also boost positive emotions by trying to be more present in your daily life. By staying present, you become more aware of what's happening around you and within you.
For instance, if you're planning to go out for dinner with your partner, try to be more intentional in how you go about it.
This could include booking a restaurant you both truly like, or asking to eat in a quieter spot in the restaurant so you have more time to catch up. Slow down and try to pay attention to the moment, taking in all the senses you're experiencing as much as you can.
You can also boost positive emotions by making time for play. For adults, play means doing something simply because it's enjoyable – not because it has any specific purpose. Play will give you a boost of positive emotions, which may in turn benefit your health.
Dick Van Dyke's advice may be correct. While we can't control everything that has an impact on our health, learning to manage anger and make room for a more positive outlook in life can help support both well-being and longevity.
Research suggests you can keep your brain sharp into old age by learning languages and creating art – and it seems birdwatching may have similar effects.
A new study from scientists in Canada found that the brains of experienced birdwatchers showed denser, more complex tissue structures in brain regions linked to attention and perception, compared to novices.
The findings feed into the idea of neuroplasticity, that the way we work our brains can actually rewire them to some extent – potentially in ways that can protect against cognitive decline through later life.
"Regions involved in attention and perception showed structural modification in experts, and these same regions were selectively engaged to support identification in challenging circumstances," write the researchers in their published paper.
"Results also suggest that knowledge acquisition might mitigate age-related decline in circumscribed brain regions supporting expert performance."
The researchers scanned for mean diffusivity in the brain, a measure of the complexity of brain tissue.
(Wing et al., J. Neurosci., 2026)
The study examined both brain structure and brain processing at the same time. To this end, MRI scans were carried out on 29 birdwatching experts and 29 birdwatching novices, matched for age and education. During the MRI, the participants were asked to identify images of different birds, and the scans were later analyzed for a measure of brain complexity called "mean diffusivity."
"The measure we used is the diffusion of water molecules in the brain," says neuroscientist Erik Wing, at the Rotman Research Institute in Canada. "One way of putting it is that there's less constraint on where water goes in the brains of experts."
Sure enough, experienced birdwatchers were found to have lower mean diffusivity in areas of their brains linked to spotting birds, as if these had undergone a system upgrade. What's more, when these participants were shown birds they weren't familiar with, it was these brain regions that lit up.
There were more subtle hints towards protection against cognitive decline in later life, too: brain tissue naturally gets less complex (a higher mean diffusivity) as we get older, but this appeared to be progressing more slowly than normal in the expert birders.
"Acquiring skills from birding could be beneficial for cognition as people age," says Wing.
Birdwatchers are a suitable study group for research like this, because their hobby involves a combination of picking out key details from a lot of visual information, and keeping attention levels high over extended periods – after all, you never know when a rare bird may appear.
While these brain differences are encouraging, it's important not to interpret the findings too broadly. The participants weren't given memory or cognition tests, for example, so all we can say for sure is that the experts' brains seemed fine-tuned to their particular speciality.
It's also difficult to prove cause and effect in a one-off study like this, in which the participants weren't tracked over time. It's possible that there were already specific, beneficial characteristics of the experts' brains that led them to get into birdwatching in the first place.
A northern cardinal (Cardinalis cardinalis).
(David Kanigan/Pexels)
However, it's more likely that years of engaging in this hobby have tuned the birdwatchers' brains in this way – and the researchers suggest that future studies could investigate whether these optimizations might be used for other cognitive tasks outside the realm of birdwatching.
We know from related studies that learning to play an instrument or speak a new language can shift the structures of the brain and potentially delay some of the natural decline as we age. The new study suggests that birdwatching could also exercise related brain regions and potentially protect against cognitive decline.
"Given findings that older experts can harness specialized knowledge to support cognition involving their domain of experience, future work will be needed to uncover how age-related structural trajectories affect specialized performance later in life," write the researchers.
Researchers have uncovered a surprising connection between gut bacteria and the development of ALS and frontotemporal dementia. Their work suggests that certain microbes produce inflammatory sugars capable of triggering immune responses that damage brain cells.
Credit: Shutterstock
A new study reveals how gut bacteria may influence the onset of ALS and frontotemporal dementia.
A newly uncovered gut-brain connection may help explain why two devastating neurological diseases develop in some people but not others.
Researchers at Case Western Reserve University found evidence that gut bacteria may contribute to brain damage in Amyotrophic Lateral Sclerosis (ALS) and Frontotemporal Dementia (FTD). Their study points to certain bacterial sugars as a possible trigger for immune activity that harms cells, while also suggesting a way that process might be stopped.
ALS, often known for its effects on movement, destroys motor neurons and gradually leads to worsening muscle weakness and paralysis. FTD affects the frontal and temporal regions of the brain, disrupting personality, behavior, and language.
A Molecular Clue to Disease Risk
Most ALS and FTD cases still have no clear cause. Scientists have long investigated genetics, environmental exposures, diet, and head trauma, but the full picture has remained incomplete.
The new study, published in Cell Reports, offers a possible answer to one of the field’s most persistent questions. The researchers identified a molecular link between gut microbes and disease risk that may help explain why some people with inherited susceptibility go on to develop ALS or FTD while others do not.
Aaron Burberry.
Credit: Case Western Reserve University
“We found that harmful gut bacteria produce inflammatory forms of glycogen (a type of sugar), and that these bacterial sugars trigger immune responses that damage the brain,” said Aaron Burberry, assistant professor in the Department of Pathology at the Case Western Reserve School of Medicine.
Burberry, the study’s senior investigator, reported that 70% of the 23 ALS/FTD patients examined had dangerous glycogen levels. Among people without these neurological conditions, only about one-third showed similarly high levels.
New Opportunities for Treatment
The findings could have immediate value for patient care. The research highlights new biological targets for treating ALS and FTD and identifies biomarkers that may help doctors determine which patients could benefit from therapies aimed at the gut.
The discovery also opens the door to treatments designed to break down harmful sugars in the digestive tract. It may also support the development of medications that target communication between the gut and the brain, offering potential new strategies to slow or prevent these devastating disorders.
Alex Rodriguez-Palacios, assistant professor in the Digestive Health Research Institute at the School of Medicine, said the researchers used their findings to reduce the harmful sugars, which “improved brain health and extended lifespan.”
Explaining the C9ORF72 Mutation
The results are especially important for people who carry the C9ORF72 mutation, the most common genetic cause of ALS and FTD. The study suggests that gut bacteria may act as a key environmental trigger, helping explain why some carriers develop the diseases while others do not.
Scientists in the university’s Department of Pathology and Digestive Health Research Institute are advancing research on neurodegenerative diseases through their ability to study germ-free mouse models. These mice are raised in completely sterile environments without bacteria, which allows researchers to isolate the effects of specific microbes on brain disease.
Fabio Cominelli, Distinguished University Professor and director of the Digestive Health Research Institute, leads this program. It relies on an innovative “cage-in-cage” sterile housing system developed by Rodriguez-Palacios, a specialized capability available at only a few institutions worldwide and critical to the discovery.
Advanced Tools for Gut–Brain Research
This system makes it possible to perform large-scale microbiological studies needed to examine the complex communication between the gut and the brain. Traditional laboratory methods can typically support only a small number of mice at a time, which limits this type of research.
“To understand when and why harmful microbial glycogen is produced, the team will next conduct larger studies surveying gut microbiome communities in ALS/FTD patients before and after disease onset,” Burberry said. “Clinical trials to determine whether glycogen degradation in ALS/FTD patients could slow disease progression are also supported by our findings and could begin in a year.”
The vivid hue of the lush green hills of Ireland may not have arisen from the mechanism many textbooks have touted for decades.
Chlorophyll, the pigment that allows plants to derive energy from light, is often said to reflect green light, making many plants an eye-catching emerald. But this is a common misconception.
According to a 2020 paper, chlorophyll does not reflect light at all. Instead, it merely absorbs blue and red light more strongly, leaving the green light more likely to scatter out of the leaf, probably from structures such as the plant's cell walls.
Chlorophyll still plays a role in determining plants' green color – but it's not doing it the way many textbooks state, and that may actually be far more interesting and complex than the accepted explanation.
"Plant leaves are green because green light is less efficiently absorbed by chlorophylls a and b than red or blue light, and therefore green light has a higher probability to become diffusely reflected from cell walls than red or blue light," writes a team led by molecular plant biologist Olli Virtanen of the University of Turku in Finland.
"Chlorophylls do not reflect light."
Ireland's weather makes it particularly conducive to rainbows, too. (mikroman6/Moment/Getty Images)
The common explanation for the green color seen in many plants relies on a very basic rule of optics: The color of an object is determined by the wavelength of light it reflects. For a flat, homogenous object such as a Lego brick, this is true – the reflection spectrum is essentially a mirror of the absorption spectrum.
However, a plant leaf is rather more complex than a Lego brick, made up of multiple structures and materials. Such a heterogeneous object may interact with light in a more complicated fashion, with one component absorbing light, and another doing the scattering.
The way chlorophyll absorbs light has been well understood for decades – its absorption is strongest in the violet-blue and red parts of the visible spectrum, and weakest in the green wavelength range.
Green light also isn't as useless to plants as many people assume. The researchers note that leaves absorb green wavelengths only about 20 to 30 percent less efficiently than red or blue light. Because green light penetrates deeper into leaves and plant canopies, it can help drive photosynthesis in lower layers that other wavelengths reach only weakly.
But a weaker absorption of green light doesn't mean that chlorophyll is reflecting it.
To investigate, Virtanen and his colleagues conducted a series of experiments to see how leaves of different colors reflect light – not just green leaves, but yellow and white leaves, too, which have different levels of chlorophyll. Yellow leaves have much less chlorophyll than green leaves of the same plant species; and white leaves have none at all.
The researchers measured how much light of each wavelength range was reflected by the leaves – and, fascinatingly, the yellow and white leaves reflected more green light than the green leaves.
The green leaves reflected less than 10 percent of the green light that shone upon them. The yellow leaves reflected around twice as much green light as the green leaves, while the white leaves reflected about 30 percent of the green light.
If the chlorophyll were responsible for reflecting green light, then leaves with less chlorophyll, or none at all, should have reflected less of it. The fact that they didn't tells us something else is doing the scattering.
That something else, the researchers believe, is likely to be the cellulose in the walls of the plants' cells, although further research will be needed to confirm this.
You might also be asking yourself: if leaves without chlorophyll reflect more green light, why don't they appear greener? And if leaves with chlorophyll reflect so little green light, why do they appear so vividly green?
The answers to these questions concern the properties of light itself and our vision, respectively.
The white and yellow leaves don't just reflect green light more strongly; they reflect more light across the whole spectrum. The color that is reflected most strongly is the color we see. For yellow leaves, that color is obviously yellow.
For white leaves, the reflectivity is even across the spectrum – think of how a prism splits light into its constituent colors. Mix all those colors back together, and you get white light.
But under normal daylight conditions, there's something curious about the human eye. It's most sensitive to light in green wavelengths. Green appears brighter to our eyes than other wavelengths shining with the same intensity.
This means that only a little bit of green is needed for it to dominate – so, even though green leaves do absorb most green light, the little that's scattered by other structures in the leaves is sufficient for a vivid viridian color.
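As a rough illustration of that weighting, the sketch below combines the under-10-percent green reflectance reported above with hypothetical red and blue reflectances (assumed values, not from the study) and luminosity weights that approximate the eye's photopic sensitivity at representative wavelengths. Even a small amount of reflected green dominates the perceived brightness:

```python
# Illustrative sketch: why a little reflected green can dominate perception.
# Luminosity weights approximate the CIE photopic sensitivity of the human
# eye at representative wavelengths. Only the ~8% green reflectance comes
# from the article; the red and blue reflectances are assumed for illustration.

luminosity = {"blue_450nm": 0.038, "green_555nm": 1.0, "red_610nm": 0.50}
reflectance = {"blue_450nm": 0.05, "green_555nm": 0.08, "red_610nm": 0.05}

# Perceived contribution of each band = reflected fraction x eye sensitivity
perceived = {band: reflectance[band] * luminosity[band] for band in luminosity}
dominant = max(perceived, key=perceived.get)

for band, value in sorted(perceived.items(), key=lambda kv: -kv[1]):
    print(f"{band}: perceived brightness ~ {value:.4f}")
print("dominant band:", dominant)
```

Under these assumptions the green band's perceived contribution is several times that of red and far above blue, even though the leaf reflects only a sliver of the green light falling on it.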
"With these data," the researchers write, "we set out to falsify and correct the common misconception about chlorophyll reflecting green light."
So there you go. You now know why your four-leaf clover is green. The source of its mystical powers, however, lies somewhere over the rainbow.
Lithium has become one of the most critical materials in the global shift toward renewable energy, yet the methods used to obtain it remain slow, resource-intensive, and geographically limited. Researchers at Columbia Engineering are exploring a new approach that could transform how the metal is extracted from natural brines.
Credit: Shutterstock
A new solvent-based technique could change how lithium is extracted from brines, potentially making the process faster, cheaper, and viable in places where conventional methods fail.
Few elements are as key to the clean energy transition as lithium, and global demand for it is soaring. The metal powers the rechargeable batteries inside electric vehicles and the massive storage systems that allow solar and wind energy to supply electricity long after the sun sets and the wind calms. Unfortunately, current methods for producing lithium are slow and require high-quality feedstocks found in relatively few locations on Earth.
Ironically, the environmental costs are also significant. Refining the mineral behind clean energy requires large amounts of land and can pollute water supplies that local communities depend on.
In a new paper, researchers from Columbia Engineering describe a method for extracting lithium that could dramatically shorten processing times, unlock reserves that existing methods cannot access, and reduce environmental impact. Their technique uses a temperature-sensitive solvent to extract lithium directly from brines found in deposits around the world.
Unlike current technologies, this approach can efficiently extract lithium even when it is present in very low concentrations or mixed with chemically similar materials.
The results, detailed in a paper published in Joule, show that the innovation, called switchable solvent selective extraction or S3E (pronounced "S three E"), extracts lithium with strong selectivity: roughly 10-fold over sodium and 12-fold over potassium. The process also excludes magnesium, a common contaminant in lithium brines, by triggering a chemical precipitation step that separates it out.
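One common way such selectivity figures are computed is the standard separation factor: the lithium-to-competitor ratio in the extract divided by the same ratio in the feed brine. The sketch below uses that textbook definition with entirely hypothetical concentrations chosen so the lithium/sodium figure works out to the roughly 10x reported; it is not the paper's own calculation or data.

```python
# Hedged sketch of the standard separation-factor metric often used to
# report extraction selectivity. All concentrations are hypothetical,
# picked to illustrate a ~10x Li/Na selectivity like the one reported.

def separation_factor(li_extract, x_extract, li_feed, x_feed):
    """Ratio of (Li : competitor) in the extract vs. in the feed brine."""
    return (li_extract / x_extract) / (li_feed / x_feed)

# Hypothetical concentrations in mg/L
feed = {"Li": 200.0, "Na": 60000.0}       # lithium dilute, sodium abundant
extract = {"Li": 180.0, "Na": 5400.0}     # lithium strongly enriched

alpha_li_na = separation_factor(extract["Li"], extract["Na"],
                                feed["Li"], feed["Na"])
print(f"Li/Na selectivity: {alpha_li_na:.1f}x")
```

The point of the metric is that selectivity is about ratios, not absolute amounts: a process can leave most of the sodium behind even while pulling out slightly less lithium than was in the feed.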
Improving on Solar Evaporation
Roughly 40% of global lithium production begins with salty brines stored in large underground reservoirs beneath deserts. Nearly all of that lithium is extracted using a technique called solar evaporation, in which brine is pumped into sprawling ponds that bake under the desert sun—sometimes for up to two years—until enough water evaporates.
This approach is only feasible in dry, flat regions with vast areas of land, such as Chile’s Atacama Desert or parts of Nevada. It also consumes large volumes of water in places that can scarcely afford it.
“There’s no way solar evaporation alone can match future demand,” said Ngai Yin Yip, La Von Duddleson Krumb Associate Professor of Earth and Environmental Engineering at Columbia University. “And there are promising lithium-rich brines, like those in California’s Salton Sea, where this method simply can’t be used at all.”
Unlike conventional lithium recovery methods, S3E does not rely on binding chemicals or extensive post-processing. Instead, the process exploits how lithium ions interact with water molecules in a solvent system that changes its behavior with temperature.
At room temperature, the solvent pulls lithium and water from the brine. When heated, it releases the lithium, along with water, into a purified stream and regenerates itself for reuse.
An Approach with Tremendous Potential
In laboratory tests using synthetic brines modeled on the Salton Sea (a geothermal region in Southern California estimated to contain enough lithium to supply more than 375 million EV batteries), the system recovered nearly 40% of the lithium after just four cycles with the same solvent batch. That performance suggests a potential path toward continuous operation.
“This is a new way to do direct lithium extraction,” said Yip. “It’s fast, selective, and easy to scale. And it can be powered by low-grade heat from waste sources or solar collectors.”
The team emphasized that this is a proof-of-concept study. The system hasn’t yet been optimized for yield or efficiency. But even in this early form, S3E appears promising enough to offer an alternative to evaporation ponds and hard-rock mining, the two approaches that dominate the lithium supply chain today and come with steep tradeoffs.
As the global clean energy transition picks up speed, technologies like S3E could play a crucial role in keeping it on track—by making it possible to extract lithium faster, more cleanly, and from more places than ever before.
“We talk about green energy all the time,” said Yip. “But we rarely talk about how dirty some of the supply chains are. If we want a truly sustainable transition, we need cleaner ways to get the materials it depends on. This is one step in that direction.”
Our relationships shape our health in many ways. Friends and family can provide support during difficult times and encourage healthy habits. But not all relationships are positive – some can be a persistent source of stress.
A new study published in the journal PNAS asked what happens when the stress in our lives comes from the people around us. The researchers focused on difficult ties in people's social networks – individuals they called "hasslers".
The researchers wondered whether difficult relationships might affect aging in the same way as other chronic stressors.
Stress is not always bad for us. Short bursts of stress can help us learn coping skills, become more adaptable, and trigger hormone and brain changes that prepare us for future challenges. But long-term stress – such as poverty, discrimination, or unemployment – can wear down the body and speed up aging.
Participants were asked to name people they spent time with, talked to about personal or health matters, or who influenced their health habits. Crucially, participants were also asked whether there were people in their network who often caused them stress or made life difficult – the hasslers.
Only those reported as often causing stress were classified as hasslers. People who only occasionally caused stress were not considered hasslers. Importantly, the same person could be nominated in multiple categories, meaning that a single relationship could serve several social roles.
Our relationships shape our health in many ways.
(Peopleimages/Canva)
People taking part also provided saliva samples to calculate two complementary measures of biological aging. The first measures your biological age relative to your age in years. In other words, is your body older or younger than your numerical age? The second measures how quickly you are aging right now.
Almost 30% of participants had at least one hassler in their social network, with about 10% reporting at least two hasslers, confirming that hasslers are reasonably common and "negative" ties are part of our social worlds.
This is certainly worth noting since negative ties and their effects are understudied in comparison to positive or neutral ties. Each additional hassler was associated with roughly nine months higher biological age, and with a slightly faster pace of biological aging (1.5% faster).
Since the saliva samples were only measured once, we can't be sure how this builds up over time, but if the pace of aging is faster for the rest of your life, it certainly feels worth reflecting on.
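The two reported associations can be put together in a back-of-envelope way: roughly nine months of extra biological age and a roughly 1.5% faster pace of aging per additional hassler. The sketch below does that arithmetic for a hypothetical 50-year-old with two hasslers; these are correlational estimates from the study, not a predictive model, and the baseline values are assumptions.

```python
# Back-of-envelope sketch of the associations reported in the study:
# ~9 months of extra biological age and a ~1.5% faster pace of aging
# per additional hassler. Correlational estimates only, not a model;
# the example person (age 50, two hasslers, baseline pace 1.0) is hypothetical.

MONTHS_PER_HASSLER = 9
PACE_FACTOR_PER_HASSLER = 1.015

def biological_age_years(chronological_years, n_hasslers):
    """Chronological age plus ~9 months per reported hassler."""
    return (chronological_years * 12 + n_hasslers * MONTHS_PER_HASSLER) / 12

def pace_of_aging(baseline_pace, n_hasslers):
    """Baseline pace compounded by ~1.5% per reported hassler."""
    return baseline_pace * PACE_FACTOR_PER_HASSLER ** n_hasslers

bio_age = biological_age_years(50, 2)
pace = pace_of_aging(1.0, 2)
print(f"estimated biological age: {bio_age:.1f} years")
print(f"pace of aging: {pace:.3f}x baseline")
```

Two hasslers would thus correspond to about a year and a half of extra biological age and a roughly 3% faster pace, small per person but notable at the population level.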
This effect was strongest when the difficult relationship was between family members, rather than between friends or acquaintances. This might reflect the challenges in extricating oneself from family relationships.
Family ties are the hardest to cut
It's a lot easier to slowly distance oneself from an acquaintance than to discard a relationship that may have existed for your entire lifetime and which is embedded in other close relationships. Besides, most relationships aren't purely positive or negative. Even the most stressful family relationships can have some positive aspects – and vice versa.
Only 3.5% of friendships were classified as hasslers, compared with almost 10% of parents and of children, supporting the notion that hasslers are more difficult to discard when they are part of our families.
Interestingly, negative relationships with spouses and partners did not show the same association with accelerated aging. One possible explanation is that occasional conflict or stress within these partnerships happens alongside substantial support, which could mitigate the physiological consequences of these negative interactions.
Also, hasslers were less likely to appear across multiple domains of interaction – such as both a confidant and a companion. In contrast, supportive relationships often spanned several domains of social life.
Once relationships become difficult, people might gradually reduce the number of ways they interact. Or, high-conflict relationships may be less likely to develop into deeply embedded ties that we engage with in multiple ways.
Nonetheless, it's worth considering alternative explanations before we ditch our hassler ties. Experiencing accelerated aging could make people feel more poorly, and perhaps more irritable.
Irritable people might more easily interpret interactions as "hassling", meaning that accelerated aging could be contributing to perceptions of hasslers, rather than the other way around.
Similarly, depression can both accelerate the aging process and contribute to generally negative evaluations of different aspects of life, including relationships. Not all of us are equally likely to have hasslers in our networks. Women, smokers, and those with greater histories of life stress in childhood tended to report more hasslers.
Extra hasslers were also associated with poorer evaluations of one's own health, more anxiety and depression symptoms, more long-term health conditions, and higher body weight, suggesting that difficult ties are relevant across several aspects of health.
Negative social ties might act similarly to other chronic stressors in our lives, influencing health and well-being, with accelerated aging as one potential pathway identified in this study.
Although it's important to nurture our social connections, these findings suggest we should also reflect on those connections that often bring "hassle" to our daily lives.