May 5, 2024

Gemini South reveals origin of unexpected differences in giant binary stars

Using the Gemini South telescope, a team of astronomers has confirmed for the first time that differences in binary stars' compositions can originate from chemical variations in the cloud of stellar material from which they formed. The results help explain why stars born from the same molecular cloud can have different chemical compositions and host different planetary systems, and they pose challenges to current stellar and planet formation models.

It is estimated that up to 85% of stars exist in binary star systems, some even in systems with three or more stars. These stellar pairs are born together out of the same molecular cloud from a shared abundance of chemical building blocks, so astronomers would expect to find that they have nearly identical compositions and planetary systems. However, for many binaries that isn't the case. While some proposed explanations attribute these dissimilarities to events occurring later in the stars' evolution, a team of astronomers has confirmed for the first time that they can actually originate before the stars even began to form.

Led by Carlos Saffe of the Institute of Astronomical, Earth and Space Sciences (ICATE-CONICET) in Argentina, the team used the Gemini South telescope in Chile, one half of the International Gemini Observatory, supported in part by the U.S. National Science Foundation and operated by NSF NOIRLab. With the new, precise Gemini High Resolution Optical SpecTrograph (GHOST), the team studied the different wavelengths of light, or spectra, given off by a pair of giant stars, which revealed significant differences in their chemical make-up. "GHOST's extremely high-quality spectra offered unprecedented resolution," said Saffe, "allowing us to measure the stars' stellar parameters and chemical abundances with the highest possible precision." These measurements revealed that one star had higher abundances of heavy elements than the other. To disentangle the origin of this discrepancy, the team used a unique approach.

Previous studies have proposed three possible explanations for observed chemical differences between binary stars. Two of them involve processes that would occur well into the stars' evolution: atomic diffusion, or the settling of chemical elements into gradient layers depending on each star's temperature and surface gravity; and the engulfment of a small, rocky planet, which would introduce chemical variations in a star's composition.

The third possible explanation looks back at the beginning of the stars' formation, suggesting that the differences originate from primordial, or pre-existing, areas of nonuniformity within the molecular cloud. In simpler terms, if the molecular cloud has an uneven distribution of chemical elements, then stars born within that cloud will have different compositions depending on which elements were available at the location where each formed.

So far, studies have concluded that all three explanations are probable; however, these studies focused solely on main-sequence binaries. The main sequence is the stage where a star spends most of its existence, and the majority of stars in the Universe are main-sequence stars, including our Sun. Saffe and his team, by contrast, observed a binary consisting of two giant stars. These stars possess extremely deep and strongly turbulent external layers, or convective zones. Owing to the properties of these thick convective zones, the team was able to rule out two of the three possible explanations.

The continuous swirling of fluid within the convective zone would make it difficult for material to settle into layers, meaning giant stars are less sensitive to the effects of atomic diffusion -- ruling out the first explanation. The thick external layer also means that a planetary engulfment would not change a star's composition much since the ingested material would rapidly be diluted -- ruling out the second explanation. This leaves primordial inhomogeneities within the molecular cloud as the confirmed explanation. "This is the first time astronomers have been able to confirm that differences between binary stars begin at the earliest stages of their formation," said Saffe.

"Using the precision-measurement capabilities provided by the GHOST instrument, Gemini South is now collecting observations of stars at the end of their lives to reveal the environment in which they were born," says Martin Still, NSF program director for the International Gemini Observatory. "This gives us the ability to explore how the conditions in which stars form can influence their entire existence over millions or billions of years."

Three consequences of this study are of particular significance. First, these results offer an explanation for why astronomers see binary stars with such different planetary systems. "Different planetary systems could mean very different planets -- rocky, Earth-like, ice giants, gas giants -- that orbit their host stars at different distances and where the potential to support life might be very different," said Saffe.

Second, these results pose a crucial challenge to the concept of chemical tagging -- using chemical composition to identify stars that came from the same environment or stellar nursery -- by showing that stars with different chemical compositions can still have the same origin.

Finally, observed differences previously attributed to planetary impacts on a star's surface will need to be reviewed, as they might now be seen as having been there from the very beginning of the star's life.

Read more at Science Daily

Webb captures top of iconic Horsehead Nebula in unprecedented detail

NASA's James Webb Space Telescope has captured the sharpest infrared images to date of a zoomed-in portion of one of the most distinctive objects in our skies, the Horsehead Nebula. These observations show the top of the "horse's mane" or edge of this iconic nebula in a whole new light, capturing the region's complexity with unprecedented spatial resolution.

Webb's new images show part of the sky in the constellation Orion (The Hunter), on the western side of a dense region known as the Orion B molecular cloud. Rising from turbulent waves of dust and gas is the Horsehead Nebula, otherwise known as Barnard 33, which resides roughly 1,300 light-years away.

The nebula formed from a collapsing interstellar cloud of material, and glows because it is illuminated by a nearby hot star. The gas clouds surrounding the Horsehead have already dissipated, but the jutting pillar is made of thick clumps of material and therefore is harder to erode. Astronomers estimate that the Horsehead has about five million years left before it too disintegrates. Webb's new view focuses on the illuminated edge of the top of the nebula's distinctive dust and gas structure.

The Horsehead Nebula is a well-known photodissociation region, or PDR. In such a region, ultraviolet (UV) light from young, massive stars creates a mostly neutral, warm area of gas and dust between the fully ionized gas surrounding the massive stars and the clouds in which they are born. This UV radiation strongly influences the chemistry of these regions and acts as a significant source of heat.

These regions occur where interstellar gas is dense enough to remain mostly neutral, but not dense enough to prevent the penetration of UV light from massive stars. The light emitted from such PDRs provides a unique tool to study the physical and chemical processes that drive the evolution of interstellar matter in our galaxy, and throughout the universe from the early era of vigorous star formation to the present day.

Due to its proximity and its nearly edge-on geometry, the Horsehead Nebula is an ideal target for astronomers to study the physical structures of PDRs and the molecular evolution of the gas and dust within their respective environments, and the transition regions between them. It is considered one of the best regions in the sky to study how radiation interacts with interstellar matter.

Thanks to Webb's MIRI and NIRCam instruments, an international team of astronomers has revealed for the first time the small-scale structures of the illuminated edge of the Horsehead. As UV light evaporates the dust cloud, dust particles are swept away from the cloud, carried along with the heated gas. Webb has detected a network of thin features tracing this movement. The observations have also allowed astronomers to investigate how the dust blocks and emits light, and to better understand the multidimensional shape of the nebula.

Next, astronomers intend to study the spectroscopic data that has been obtained to gain insights into the evolution of the physical and chemical properties of the material observed across the nebula.

Read more at Science Daily

Ice shelves fracture under weight of meltwater lakes

When air temperatures in Antarctica rise and glacier ice melts, water can pool on the surface of floating ice shelves, weighing them down and causing the ice to bend. Now, for the first time in the field, CIRES-led research shows that ice shelves don't just buckle under the weight of meltwater lakes -- they fracture. As the climate warms and melt rates in Antarctica increase, this fracturing could cause vulnerable ice shelves to collapse, allowing inland glacier ice to spill into the ocean and contribute to sea level rise.

"Ice shelves are extremely important for the Antarctic Ice Sheet's overall health as they act to buttress or hold back the glacier ice on land," said Alison Banwell, a CIRES scientist in the Earth Science and Observation Center (ESOC) and lead author of the study published today in the Journal of Glaciology. "Scientists have predicted and modeled that surface meltwater loading could cause ice shelves to fracture, but no one had observed the process in the field, until now."

The new work may help explain how the Larsen B Ice Shelf abruptly collapsed in 2002. In the months before its catastrophic breakup, thousands of meltwater lakes littered the ice shelf's surface, which then drained over just a few weeks.

To investigate the impacts of surface meltwater on ice shelf stability, Banwell and her colleagues from the University of Cambridge, University of Oxford, and University of Chicago traveled to the George VI Ice Shelf on the Antarctic Peninsula in November 2019. First, the team identified a depression or "doline" in the ice surface that had formed by a previous lake drainage event where they thought meltwater was likely to pool again on the ice. Then, they ventured out into the frigid landscape on snowmobiles, pulling all their science equipment and safety gear behind on sleds.

Around the doline, the team installed high-precision GPS stations to measure small changes in elevation at the ice's surface, water-pressure sensors to measure lake depth, and a timelapse camera system to capture images of the ice surface and meltwater lakes every 30 minutes.

In 2020, the COVID-19 pandemic brought their fieldwork to a screeching halt. When the team finally made it back to their field site in November 2021, only two GPS stations and one timelapse camera remained; the two other GPS stations and all of the water-pressure sensors had been flooded and buried in solid ice. Fortunately, the surviving instruments captured the vertical and horizontal movement of the ice's surface and images of the meltwater lake that formed and drained during the record-high 2019/2020 melt season.

GPS data indicate that the ice in the center of the lake basin flexed downward about a foot in response to the increased weight from meltwater. That finding builds upon previous work led by Banwell that produced the first direct field measurements of ice shelf buckling caused by meltwater ponding and drainage.

The team also found that the horizontal distance between the edge and center of the meltwater lake basin increased by over a foot. This was most likely due to the formation and/or widening of circular fractures around the meltwater lake, which the timelapse imagery captured. Their results provide the first field-based evidence of ice shelf fracturing in response to a surface meltwater lake weighing down the ice.

"This is an exciting discovery," Banwell said. "We believe these types of circular fractures were key in the chain reaction style lake drainage process that helped to break up the Larsen B Ice Shelf."

The work supports modeling results that show the immense weight of thousands of meltwater lakes and subsequent draining caused the Larsen B Ice Shelf to bend and break, contributing to its collapse.

Read more at Science Daily

Scientists identify new brain circuit in mice that controls body's inflammatory reactions

The brain can direct the immune system to an unexpected degree, detecting, ramping up and tamping down inflammation, a new study in mice from researchers at Columbia's Zuckerman Institute shows.

"The brain is the center of our thoughts, emotions, memories and feelings," said Hao Jin, PhD, a co-first author of the study published online today in Nature. "Thanks to great advances in circuit tracking and single-cell technology, we now know the brain does far more than that. It is monitoring the function of every system in the body."

Future research could identify drugs that can target this newfound brain circuit to help treat a vast range of disorders and diseases in which the immune system goes haywire.

"This new discovery could provide an exciting therapeutic venue to control inflammation and immunity," said Charles S. Zuker, PhD, the study's senior author, a principal investigator at Columbia's Zuckerman Institute and a Howard Hughes Medical Institute investigator.

Recent work from the Zuker lab and other groups is revealing the importance of the body-brain axis, a vital pathway that conveys data between the organs and the brain. For example, Dr. Zuker and his colleagues discovered that sugar and fat entering the gut use the body-brain axis to drive the craving and strong appetite for sugary and fatty foods.

"We found all these ways in which the body is informing the brain about the body's current state," said co-first author Mengtong Li, PhD, a postdoctoral researcher in the Zuker lab. "We wanted to understand how much farther the brain's knowledge and control of the body's biology went."

The scientists looked for connections the brain might have with inflammation and innate immunity, the defense system shared by all animals and the most ancient component of the immune system. Whereas the adaptive immune system remembers previous encounters with intruders to help it resist them if they invade again, the innate immune system attacks anything with common traits of germs. The relative simplicity of innate immunity lets it respond to new insults more quickly than adaptive immunity.

Prior studies in humans revealed that electrically stimulating the vagus nerve -- a bundle of thousands of nerve fibers linking the brain and the body's internal organs -- could reduce the response linked to a specific inflammatory molecule. However, much remained unknown about the nature of this body-brain system: for instance, the generality of the brain's modulation of immunity and the inflammatory response, the selective lines of communication between the body and the brain, the logic of the underlying neural circuit, and the identity of the vagal and brain components that monitor and regulate inflammation.

The Zuker lab turned to a bacterial compound that sets off innate immune responses. The scientists found that giving this molecule to mice activated the caudal nucleus of the solitary tract, or cNST, which is tucked inside the brainstem. The cNST plays a major role in the body-brain axis and is the primary target of the vagus nerve.

The scientists showed that chemically suppressing the cNST resulted in an out-of-control inflammatory response to the immune insult: levels of pro-inflammatory molecules released by the immune system were more than three times higher than usual, and levels of anti-inflammatory immune compounds were roughly three times lower than normal. In contrast, artificially activating the cNST reduced pro-inflammatory molecule levels by nearly 70 percent and increased anti-inflammatory chemical levels almost tenfold.

"Similar to a thermostat, this newfound brain circuit helps increase or decrease inflammatory responses to keep the body responding in a healthy manner," said Dr. Jin, who started this study as a postdoctoral researcher in Dr. Zuker's lab. Dr. Jin is now a tenure track investigator at the National Institute of Allergy and Infectious Diseases. "In retrospect, it makes sense to have a master arbiter controlling this vital response."

Previous vagus nerve stimulation research in humans suggests the findings go beyond mice. The new research may also be in line with thousands of years of thought on the potential influence of the mind on the body.

"A lot of psychosomatic effects could actually be linked to brain circuits telling your body something," Dr. Jin noted.

The scientists identified the specific groups of neurons in the vagus nerve and in the cNST that help detect and control pro- and anti-inflammatory activity. "This opens up a new window into how the brain monitors and modulates body physiology," said Dr. Zuker, a professor of biochemistry, molecular biophysics and neuroscience at Columbia's Vagelos College of Physicians and Surgeons.

Discovering ways to control this newfound brain circuit may lead to novel therapies for common autoimmune diseases such as rheumatoid arthritis, type 1 diabetes, multiple sclerosis, neurodegenerative diseases, lupus, inflammatory bowel disease and Crohn's disease, as well as conditions such as long COVID syndrome, immune rejection of transplanted organs, and the potentially deadly outbursts known as cytokine storms that COVID infections can trigger.

Read more at Science Daily

May 4, 2024

Astronomers' simulations support dark matter theory

Computer simulations by astronomers, including researchers at the University of California, Irvine, support the idea that dark matter -- matter that no one has yet directly detected but which many physicists think must exist to explain several aspects of the observable universe -- is real.

The work addresses a fundamental debate in astrophysics -- does invisible dark matter need to exist to explain how the universe works the way it does, or can physicists explain how things work based solely on the matter we can directly observe? Currently, many physicists think something like dark matter must exist to explain the motions of stars and galaxies.

"Our paper shows how we can use real, observed relationships as a basis to test two different models to describe the universe," said Francisco Mercado, lead author and recent Ph.D. graduate from the UC Irvine Department of Physics & Astronomy who is now a postdoctoral scholar at Pomona College. "We put forth a powerful test to discriminate between the two models."

The test involved running computer simulations with both types of matter -- normal and dark -- to explain the presence of intriguing features measured in real galaxies. The team reported their results in Monthly Notices of the Royal Astronomical Society.

The features in galaxies the team found "are expected to appear in a universe with dark matter but would be difficult to explain in a universe without it," said Mercado. "We show that such features appear in observations of many real galaxies. If we take these data at face value, this reaffirms the position of the dark matter model as the one that best describes the universe we live in."

These features, Mercado noted, describe patterns in the motions of stars and gas in galaxies that seem to be possible only in a universe with dark matter.

"Observed galaxies seem to obey a tight relationship between the matter we see and the inferred dark matter we detect, so much so that some have suggested that what we call dark matter is really evidence that our theory of gravity is wrong," said co-author James Bullock, professor of physics at UCI and dean of the UCI School of Physical Sciences. "What we showed is that not only does dark matter predict the relationship, but for many galaxies it can explain what we see more naturally than modified gravity. I come away even more convinced that dark matter is the right model."

The features also appear in observations made by proponents of a dark matter-free universe. "The observations we examined -- the very observations where we found these features -- were conducted by adherents of dark matter-free theories," said co-author Jorge Moreno, associate professor of physics and astronomy at Pomona College. "Despite their obvious presence, little-to-no analysis was performed on these features by that community. It took folks like us, scientists working with both regular and dark matter, to start the conversation."

Moreno added that he expects debate within his research community to follow in the wake of the study, but that there may be room for common ground, as the team also found that such features only appear in their simulations when there is both dark matter and normal matter in the universe.

"As stars are born and die, they explode into supernovae, which can shape the centers of galaxies, naturally explaining the existence of these features," said Moreno. "Simply put, the features we examined in observations require both the existence of dark matter and the incorporation of normal-matter physics."

Now that the dark matter model of the universe appears to be the leading one, the next step, Mercado explained, is to see if it remains consistent across a dark matter universe.

Read more at Science Daily

Rock solid evidence: Angola geology reveals prehistoric split between South America and Africa

An SMU-led research team has found that ancient rocks and fossils from long-extinct marine reptiles in Angola clearly show a key part of Earth's past -- the splitting of South America and Africa and the subsequent formation of the South Atlantic Ocean.

With their easily visualized "jigsaw-puzzle fit," it has long been known that the western coast of Africa and the eastern coast of South America once nestled together in the supercontinent Gondwana -- which broke off from the larger landmass of Pangea.

The research team says the southern coast of Angola, where they dug up the samples, arguably provides the most complete geological record ever recorded on land of the two continents moving apart and the opening of the South Atlantic Ocean. The rocks and fossils found there date from 130 million years ago to 71 million years ago.

"There are places that you can go to in South America, for instance, where you can see this part of the split or that part of it, but in Angola, it's all laid out in one place," said Louis L. Jacobs, SMU professor emeritus of Earth Sciences and president of ISEM. Jacobs is the lead author of a study published in The Geological Society, London, Special Publications.

"Before this, there was not a place known to go and see the rocks on the surface that really reflected the opening of the South Atlantic Ocean, because they're now in the ocean or eroded away," Jacobs said.

Angola rocks and fossils tell the whole story

Africa and South America started to split around 140 million years ago, causing gashes in Earth's crust called rifts to open up along pre-existing weaknesses. As the tectonic plates beneath South America and Africa moved apart, magma from the Earth's mantle rose to the surface, creating new oceanic crust and pushing the continents away from each other. Eventually, the South Atlantic Ocean filled the void between these two newly formed continents.

Scientists have previously found evidence of these events through geophysics and well cores drilled through the ocean floor.

But these tell-tale signs have never been found in one place, or been so clearly visible for anyone to see, said study co-author Michael J. Polcyn, research associate in the Huffington Department of Earth Sciences and senior research fellow, ISEM at SMU.

"It's one thing for a geophysicist to be able to look at seismic data and make inferences from that," he said. "It's quite another thing to be able to take a school field trip out to the rock formations, or outcrops, and say this is when the lava was spreading from eastern South America. Or this was when it was a continuous land."

Essentially, Angola presents the opportunity for someone to easily walk through each phase of this geologically significant chapter in Earth's history.

"That gives Angola major bragging rights," Jacobs said.

Jacobs, Polcyn and Diana P. Vineyard -- who is a research associate at SMU -- worked with an international team of paleontologists, geologists and others to analyze both the rock formations they found in eight different locations on the coast and the fossils within them.

Fieldwork in Angola's Namibe Province began in 2005. At that time, the research team recognized particular types of sediments, which gave them a good indication of what the western coast of Africa had been like at various stages millions of years ago. For instance, fields of lava revealed volcanic outpourings, and faults or breaks showed where the continents were being rifted apart. Sediments and salt deposits showed ocean flooding and evaporation, while overlying oceanic sediments and marine reptiles showed completion of the South Atlantic Ocean.

Paleontologists, meanwhile, discovered fossils in Angola from large marine reptiles that lived late in the Cretaceous Period, right after the South Atlantic Ocean was completed and while it grew wider.

By bringing together experts from a wide range of fields, "we were able to document when there was no ocean at all, to when there was a fresh enough ocean for those reptiles to thrive and have enough to eat," Vineyard said.

Many of the ancient fossils are currently on display at the Smithsonian's National Museum of Natural History "Sea Monsters Unearthed: Life in Angola's Ancient Seas" exhibit, which was co-produced with SMU -- a nationally-ranked Dallas-based private university.

Read more at Science Daily

Did a magnetic field collapse trigger the emergence of animals?

The Ediacaran Period, spanning from about 635 to 541 million years ago, was a pivotal time in Earth's history. It marked a transformative era during which complex, multicellular organisms emerged, setting the stage for the explosion of life.

But how did this surge of life unfold and what factors on Earth may have contributed to it?

Researchers from the University of Rochester have uncovered compelling evidence that Earth's magnetic field was in a highly unusual state when the macroscopic animals of the Ediacaran Period diversified and thrived. Their study, published in Communications Earth & Environment, raises the question of whether these fluctuations in Earth's ancient magnetic field led to shifts in oxygen levels that may have been crucial to the proliferation of life forms millions of years ago.

According to John Tarduno, the William Kenan, Jr. Professor in the Department of Earth and Environmental Sciences, one of the most remarkable life forms during the Ediacaran Period was the Ediacaran fauna. They were notable for their resemblance to early animals -- some even reached more than a meter (three feet) in size and were mobile, indicating they probably needed more oxygen compared to earlier life forms.

"Previous ideas for the appearance of the spectacular Ediacaran fauna have included genetic or ecologic driving factors, but the close timing with the ultra-low geomagnetic field motivated us to revisit environmental issues, and, in particular, atmospheric and ocean oxygenation," says Tarduno, who is also the Dean of Research in the School of Arts & Sciences and the School of Engineering and Applied Sciences.

Earth's magnetic mysteries

About 1,800 miles below us, liquid iron churns in Earth's outer core, creating the planet's protective magnetic field. Though invisible, the magnetic field is essential for life on Earth because it shields the planet from solar wind -- streams of radiation from the sun. But Earth's magnetic field wasn't always as strong as it is today.

Researchers have proposed that an unusually low magnetic field might have contributed to the rise of animal life. However, it has been challenging to examine the link because of limited data about the strength of the magnetic field during this time.

Tarduno and his team used innovative strategies and techniques to examine the strength of the magnetic field by studying magnetism locked in ancient feldspar and pyroxene crystals from the rock anorthosite. The crystals contain magnetic particles that preserve magnetization from the time the minerals were formed. By dating the rocks, researchers can construct a timeline of the development of Earth's magnetic field.

Leveraging cutting-edge tools, including a CO2 laser and the lab's superconducting quantum interference device (SQUID) magnetometer, the team precisely analyzed the crystals and the magnetism locked within them.
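The underlying inference is straightforward in principle: a crystal's natural remanent magnetization is compared with the magnetization it acquires in a known laboratory field, and that ratio scales the laboratory field to the ancient one. Below is a minimal sketch of that Thellier-style relation commonly used in paleointensity work; the numbers are purely illustrative placeholders (chosen to mimic the reported contrast of up to 30 times), not the study's data, and the paper's exact protocol is not detailed here.

```python
# Thellier-style paleointensity estimate (illustrative sketch only): the ancient
# field is inferred from the ratio of natural remanent magnetization (NRM) lost
# to thermoremanent magnetization (TRM) gained in a known laboratory field.
lab_field_uT = 30.0       # applied laboratory field, microtesla (hypothetical value)
modern_field_uT = 30.0    # rough present-day surface field strength, for comparison
nrm_lost = 0.12           # normalized NRM removed over a heating step (hypothetical)
trm_gained = 3.4          # normalized TRM acquired over the same step (hypothetical)

ancient_field_uT = lab_field_uT * (nrm_lost / trm_gained)
print(f"estimated ancient field: ~{ancient_field_uT:.1f} uT, "
      f"about {modern_field_uT / ancient_field_uT:.0f}x weaker than today")
```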

A weak magnetic field

Their data indicate that Earth's magnetic field at times during the Ediacaran Period was the weakest field known to date -- up to 30 times weaker than the magnetic field today -- and that the ultra-low field strength lasted for at least 26 million years.

A weak magnetic field makes it easier for charged particles from the sun to strip away lightweight atoms such as hydrogen from the atmosphere, causing them to escape into space. If hydrogen loss is significant, more oxygen may remain in the atmosphere instead of reacting with hydrogen to form water vapor. These reactions can lead to a buildup of oxygen over time.

The research conducted by Tarduno and his team suggests that during the Ediacaran Period, the ultraweak magnetic field caused a loss of hydrogen over at least tens of millions of years. This loss may have led to increased oxygenation of the atmosphere and surface ocean, enabling more advanced life forms to emerge.

Tarduno and his research team previously discovered that the geomagnetic field recovered in strength during the subsequent Cambrian Period, when most animal groups began to appear in the fossil record, reestablishing the protective shield and allowing life to thrive.

"If the extraordinarily weak field had remained after the Ediacaran, Earth might look very different from the water-rich planet it is today: water loss might have gradually dried Earth," Tarduno says.

Core dynamics and evolution

The work suggests that understanding planetary interiors is crucial in contemplating the potential of life beyond Earth.

"It's fascinating to think that processes in Earth's core could be linked ultimately to evolution," Tarduno says. "As we think about the possibility of life elsewhere, we also need to consider how the interiors of planets form and develop."

Read more at Science Daily

May 3, 2024

Webb telescope probably didn't find life on an exoplanet -- yet

Recent reports of NASA's James Webb Space Telescope finding signs of life on a distant planet understandably sparked excitement. A new study challenges this finding, but also outlines how the telescope might verify the presence of the life-produced gas.

The UC Riverside study, published in the Astrophysical Journal Letters, may be a disappointment to extraterrestrial enthusiasts but does not rule out the near-future possibility of discovery.

In 2023 there were tantalizing reports of a biosignature gas in the atmosphere of planet K2-18b, which seemed to have several conditions that would make life possible.

Many exoplanets, meaning planets orbiting other stars, are not easily comparable to Earth. Their temperatures, atmospheres, and climates make it hard to imagine Earth-type life on them.

However, K2-18b is a bit different. "This planet gets almost the same amount of solar radiation as Earth. And if atmosphere is removed as a factor, K2-18b has a temperature close to Earth's, which is also an ideal situation in which to find life," said UCR project scientist and paper author Shang-Min Tsai.

K2-18b's atmosphere is mainly hydrogen, unlike our nitrogen-based atmosphere. But there was speculation that K2-18b has water oceans, like Earth. That would make K2-18b a potential "Hycean" world, meaning one that combines a hydrogen atmosphere with water oceans.

Last year, a Cambridge team revealed methane and carbon dioxide in the atmosphere of K2-18b using JWST -- molecules that could point to signs of life.

"What was icing on the cake, in terms of the search for life, is that last year these researchers reported a tentative detection of dimethyl sulfide, or DMS, in the atmosphere of that planet, which is produced by ocean phytoplankton on Earth," Tsai said. DMS is the main source of airborne sulfur on our planet and may play a role in cloud formation.

Because the telescope data were inconclusive, the UCR researchers wanted to understand whether enough DMS could accumulate to detectable levels on K2-18b, which is about 120 light years away from Earth. As with any planet that far away, obtaining physical samples of atmospheric chemicals is impossible.

"The DMS signal from the Webb telescope was not very strong and only showed up in certain ways when analyzing the data," Tsai said. "We wanted to know if we could be sure of what seemed like a hint about DMS."

Based on computer models that account for the physics and chemistry of DMS, as well as the hydrogen-based atmosphere, the researchers found that it is unlikely the data show the presence of DMS. "The signal strongly overlaps with methane, and we think that picking out DMS from methane is beyond this instrument's capability," Tsai said.

However, the researchers believe it is possible for DMS to accumulate to detectable levels. For that to happen, plankton or some other life form would have to produce 20 times more DMS than is present on Earth.

Detecting life on exoplanets is a daunting task, given their distance from Earth. To find DMS, the Webb telescope would need to use an instrument better able to detect infrared wavelengths in the atmosphere than the one used last year. Fortunately, the telescope will use such an instrument later this year, revealing definitively whether DMS exists on K2-18b.

"The best biosignatures on an exoplanet may differ significantly from those we find most abundant on Earth today. On a planet with a hydrogen-rich atmosphere, we may be more likely to find DMS made by life instead of oxygen made by plants and bacteria as on Earth," said UCR astrobiologist Eddie Schwieterman, a senior author of the study.

Given the complexities of searching far-flung planets for signs of life, some wonder about the researchers' continued motivation.

Read more at Science Daily

'Gap' in carbon removal: Countries' plans to remove CO2 not enough

New research involving the University of East Anglia (UEA) suggests that countries' current plans to remove CO2 from the atmosphere will not be enough to comply with the 1.5 ºC warming limit set out under the Paris Agreement.

Since 2010, the United Nations environmental organisation UNEP has taken an annual measurement of the emissions gap -- the difference between countries' climate protection pledges and what is necessary to limit global heating to 1.5 ºC, or at least below 2 ºC.

The UNEP Emissions Gap Reports are clear: climate policy needs more ambition. This new study now explicitly applies this analytical concept to carbon dioxide removal (CDR) -- the removal of the most important greenhouse gas, CO2, from the atmosphere.

The study, published today in the journal Nature Climate Change, was led by the Berlin-based Mercator Research Institute on Global Commons and Climate Change (MCC) and involved an international team of scientists.

"In the Emissions Gap Reports, carbon removals are only accounted for indirectly," said lead author Dr William Lamb, of the MCC Applied Sustainability Science working group.

"After all, the usual benchmark for climate protection pledges is net emissions, ie emissions minus removals. We are now making transparent the specific ambition gap in scaling up removals.

"This planetary waste management will soon place completely new requirements on policymakers and may even become a central pillar of climate protection in the second half of the century."

Co-author Dr Naomi Vaughan, of the Tyndall Centre for Climate Change Research at UEA, added: "Carbon dioxide removal methods have a small but vital role to play in achieving net zero and limiting the impacts of climate change.

"Our analysis shows that countries need more awareness, ambition and action on scaling up CDR methods together with deep emissions reductions to achieve the aspirations of the Paris Agreement."

According to the study, if national targets are fully implemented, annual human-induced carbon removals could increase by a maximum of 0.5 gigatonnes of CO2 (500 million tonnes) by 2030, and by a maximum of 1.9 gigatonnes by 2050.

This contrasts with the 5.1 gigatonne increase required in a 'focus scenario', which the research team takes as typical from the latest Intergovernmental Panel on Climate Change (IPCC) assessment report.

There, global heating, calculated over the entire course of this century, is limited to 1.5 ºC, and a particularly rapid expansion of renewable energies and reduction of fossil emissions is depicted as the core climate protection strategy.

But, the focus scenario still relies on scaling up carbon removals. The gap for the year 2050 is therefore at least 3.2 gigatonnes of CO2 (5.1 minus a maximum of 1.9).

An alternative focus scenario, also derived from the IPCC, assumes a significant reduction in global energy demand, due to politically initiated behaviour changes as the core element of climate protection strategy.

Here, carbon removals would increase by a more modest amount: 2.5 gigatonnes in 2050. Fully implemented national targets would be close to sufficient when compared to this scenario, with a gap in 2050 of 0.4 gigatonnes.
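The gap in each case is a simple subtraction: the increase in annual removals the scenario requires by 2050, minus the maximum increase that fully implemented national targets would deliver. A minimal sketch using the rounded figures quoted above (in gigatonnes of CO2 per year):

```python
# Carbon-removal gap for 2050, using the rounded figures quoted above (Gt CO2/yr).
pledged_increase = 1.9   # maximum increase if national targets are fully implemented

required_increase = {
    "focus scenario (rapid renewables expansion)": 5.1,
    "alternative focus scenario (reduced energy demand)": 2.5,
}

for scenario, required in required_increase.items():
    print(f"{scenario}: gap of {required - pledged_increase:.1f} Gt CO2 in 2050")

# First scenario: 3.2 Gt, as stated in the study. Second scenario: 0.6 Gt with
# these rounded inputs; the study quotes 0.4 Gt, presumably from unrounded values.
```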

The research team points out the problem of sustainability limits in scaling up carbon removals; for example, the associated land area demand will come to jeopardise biodiversity and food security. Nevertheless, there is still plenty of room for designing fair and sustainable land management policies.

In addition, novel carbon removal options, such as air filter systems, or 'enhanced rock weathering', have hardly been promoted by politicians to date.

They currently only remove 0.002 gigatonnes of CO2 per year from the atmosphere, compared to 3 gigatonnes through conventional options such as afforestation, and they are unlikely to significantly increase by 2030. According to the scenarios, they must become more prevalent than conventional options by 2100.

Since only 40 countries have so far quantified their removal plans in their long-term low emissions development strategies, the study also draws on other national documents and best-guess assumptions.

"The calculation should certainly be refined," said Dr Lamb. "But our proposal using the focus scenarios further opens the discourse on how much carbon removal is necessary to meet the Paris Agreement.

"This much is clear: without a rapid reduction in emissions towards zero, across all sectors, the 1.5 ºC limit will not be met under any circumstances."

Read more at Science Daily

Significant new discovery in teleportation research -- Noise can improve the quality of quantum teleportation

In teleportation, the state of a quantum particle, or qubit, is transferred from one location to another without sending the particle itself. This transfer requires quantum resources, such as entanglement between an additional pair of qubits. In the ideal case, the qubit state can be teleported perfectly. However, real-world systems are vulnerable to noise and disturbances, which reduce and limit the quality of the teleportation.
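For reference, the textbook (noiseless) qubit-teleportation protocol that such experiments build on can be simulated in a few lines. The sketch below is a generic numpy illustration of that standard protocol, not the researchers' hybrid polarisation-frequency scheme; the example state [0.6, 0.8] and the random seed are arbitrary choices.

```python
import numpy as np

# Single-qubit gates and a two-qubit CNOT (first qubit is the control).
I = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])
H = np.array([[1., 1.], [1., -1.]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# Qubit 0 holds the state to teleport; qubits 1 (Alice) and 2 (Bob) share a Bell pair.
psi = np.array([0.6, 0.8])                      # a|0> + b|1>, already normalized
bell = np.array([1., 0., 0., 1.]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
state = np.kron(psi, bell)                      # 3-qubit state, qubit order (0, 1, 2)

# Alice: CNOT from qubit 0 onto qubit 1, then a Hadamard on qubit 0 (Bell-basis rotation).
state = np.kron(CNOT, I) @ state
state = np.kron(H, np.eye(4)) @ state

# Alice measures qubits 0 and 1; sample one of the four possible outcomes.
amps = state.reshape(2, 2, 2)                   # indices: qubit 0, qubit 1, qubit 2
probs = np.sum(np.abs(amps) ** 2, axis=2)       # outcome probabilities for (m0, m1)
rng = np.random.default_rng(0)
m0, m1 = divmod(int(rng.choice(4, p=probs.ravel())), 2)

# Bob's qubit collapses accordingly; he applies the classical corrections X^m1, then Z^m0.
bob = amps[m0, m1] / np.sqrt(probs[m0, m1])
if m1:
    bob = X @ bob
if m0:
    bob = Z @ bob

print("Alice's outcome:", (m0, m1), "| Bob recovers:", bob)   # equals psi for every outcome
```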

Researchers from the University of Turku, Finland, and the University of Science and Technology of China, Hefei, have now proposed a theoretical idea and carried out corresponding experiments to overcome this problem. In other words, the new approach enables high-quality teleportation despite the presence of noise.

"The work is based on an idea of distributing entanglement -- prior to running the teleportation protocol -- beyond the used qubits, i.e., exploiting the hybrid entanglement between different physical degrees of freedom," says Professor Jyrki Piilo from the University of Turku.

Conventionally, the polarisation of photons has been used for the entanglement of qubits in teleportation, while the current approach exploits the hybrid entanglement between the photons' polarisation and frequency.

"This allows for a significant change in how the noise influences the protocol, and as a matter of fact our discovery reverses the role of the noise from being harmful to being beneficial to teleportation," Piilo describes.

With conventional qubit entanglement in the presence of noise, the teleportation protocol does not work. In a case where there is initially hybrid entanglement and no noise, the teleportation does not work either.

"However, when we have hybrid entanglement and add noise, the teleportation and quantum state transfer occur in almost perfect manner," says Dr Olli Siltanen whose doctoral dissertation presented the theoretical part of the current research.

In general, the discovery enables almost ideal teleportation despite the presence of a certain type of noise when using photons for teleportation.

"While we have done numerous experiments on different facets of quantum physics with photons in our laboratory, it was very thrilling and rewarding to see this very challenging teleportation experiment successfully completed," says Dr Zhao-Di Liu from the University of Science and Technology of China, Hefei.

"This is a significant proof-of-principle experiment in the context of one of the most important quantum protocols," says Professor Chuan-Feng Li from the University of Science and Technology of China, Hefei.

Read more at Science Daily