Categories
Science

AI could make alien contact more likely for SETI’s “Project Hail Mary” moment

“Project Hail Mary,” a science fiction novel that has just been adapted into a big-screen motion picture, tells the story of an unlikely astronaut who unexpectedly encounters an alien during a desperate mission to save their respective civilizations.

The astronaut (played by Ryan Gosling in the film) and the alien must quickly figure out whether they are friends or enemies. They also need to develop a translation system that bridges two completely different modes of communication.

It all makes for a life-and-death space drama reminiscent of “Apollo 13” – but the day is fast approaching when advances in astronomy and artificial intelligence could largely remove the drama from extraterrestrial contact.

Seth Shostak, senior astronomer at the SETI Institute, says he wouldn’t be at all surprised if our first encounter with aliens came in the form of AI-to-AI contact.

“I suspect the aliens will be machines, because that’s what we do, right?” he says in the latest episode of the Fiction Science podcast. “We are in the early stages of building machines that can do things that humans have had to do in the past. I’m sure that in 100 years the most powerful intelligence on this planet will no longer be a soft, squishy biological thing. It will be a machine. So if we hear the aliens, I suspect it’s more than likely that they too will be machines.”

If you’re worried that discussing AI and the search for aliens requires a deep dive into spoilers, have no fear: Artificial intelligence plays no real role in the movie “Project Hail Mary.” In the novel by Andy Weir on which the film is based, it is mentioned only once – and only to explain why the planners of the novel’s life-or-death mission decided against using AI. (We will, however, get into spoilers toward the end of this post, so be warned.)

For more than 65 years, astronomers have searched the sky for radio signals that may have been emitted by extraterrestrial civilizations. “The usual approach is to build a receiver that can monitor thousands – today even millions – of different channels at the same time,” says Shostak. “And you can just look at how this ability has improved over time. It turns out that it follows something called Moore’s Law… which says that the speed of electronics more or less doubles every two years.”
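Shostak’s Moore’s-Law framing is easy to sanity-check with arithmetic. Here is a minimal sketch; the starting channel count and doubling period are illustrative assumptions, not specs of any actual receiver:

```python
# Back-of-the-envelope: receiver channel capacity doubling every two years,
# the Moore's-Law-style rate Shostak describes. The starting count and the
# doubling period are illustrative assumptions, not real receiver specs.

def channels_after(years, start_channels=1_000, doubling_period=2):
    """Channels monitored after `years`, doubling every `doubling_period` years."""
    return start_channels * 2 ** (years / doubling_period)

# Ten doublings turn 1,000 channels into roughly a million:
print(int(channels_after(20)))  # 1024000
```

Ten doublings – about 20 years at this rate – is exactly the jump from “thousands” to “millions” of channels that Shostak describes.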

It takes a lot of computing power to monitor millions of channels, and Shostak is sure that AI will accelerate the search for extraterrestrial intelligence, better known as SETI.

There’s already evidence of this: Last November, the Breakthrough Listen Initiative reported that an AI system developed in collaboration with NVIDIA could process real-time data from telescopes searching for fast radio bursts at a speed more than 600 times faster than the current data pipeline. The system improved detection accuracy by 7% and reduced false alarms by almost an order of magnitude.

“This technology not only allows us to find known signal types more quickly – it also allows us to discover completely unexpected signal morphologies,” said Andrew Siemion, principal investigator of the Breakthrough Listen Initiative, in a press release. “An advanced civilization could use burst communications, modulated signals, or transmission schemes that we haven’t even imagined yet. This AI system can learn to recognize patterns that might be completely missed by a human.”

A few years ago, another team of astronomers used a machine learning algorithm to identify potential extraterrestrial signals missed by other data processing systems. (But don’t get too excited: Follow-up observations did not confirm that the signals came from alien civilizations. You would have heard about it if they had.)

AI tools could help astronomers overcome some of the obstacles facing the SETI search. For example, a group of researchers recently reported that signals from extraterrestrial civilizations could be disrupted by stormy space weather. Improved pattern recognition software could potentially detect the signal hidden in the cosmic noise.

AI models could also come into play to interpret alien messages as they are found. But Shostak isn’t so focused on that challenge. “Even if we never understand what the aliens are saying, just the fact that we can receive the signal and recognize that it is an artificial signal – in other words, generated by some technology – is very interesting, because we have proven that they are there,” says Shostak.

Understanding what the aliens are saying “would be interesting to know, but I would consider that a secondary benefit to detecting their presence,” he says.

Seth Shostak is a senior astronomer at the SETI Institute. (Photo by SETI Institute)

Shostak compares the challenge of deciphering alien messages to the one archaeologists faced with Egyptian hieroglyphs. “The best way to decipher the hieroglyphs is to have lots of people working on the problem, so just make them public,” he says. “I think the same logic applies here.”

Douglas Vakoch, the president of METI International, has spent a lot of time studying the problem of message translation. You can tell as much from his organization’s acronym, which stands for “Messaging Extraterrestrial Intelligence.” He says AI can play a supporting role in detecting and deciphering alien messages, but not the starring role.

“We need to be aware that when we humans try to find patterns hidden in radio static, we might start with a few simple guidelines that look a lot like the explicit rules of AI. But often we don’t see exactly why our rules are inadequate, because we don’t spell them out clearly,” Vakoch told me via email. “AI forces us to be aware of how we’re trying to solve problems, and simply by learning how an AI tries to solve a problem, we can say, ‘You missed something critical. You need to do this instead.’ ”

In his opinion, discovering an alien message is only half the battle.

“An even greater challenge will be to understand what it means. And this is where humans will continue to play a role, even as AI becomes more sophisticated in the coming years,” said Vakoch. “Deciphering a message from aliens will be far more ambiguous. AI could help us spot patterns in alien messages that humans would miss, but we will still need people to figure out what the message means.”

How long will it take to make contact with aliens? Will we have to wait for a life-or-death mission to a distant star system? More than 20 years ago, Shostak predicted that we would find evidence of aliens by around 2025 – and for more than 15 years, he has been betting a cup of coffee on it.

Now Shostak admits he may have to pay up. “The next time I see you, I’ll buy you a cup of coffee,” he says. “We haven’t found them yet. … Maybe it was just wishful thinking, but honestly I think it was more based on the known rate of improvement in the alien discovery experiments.”

Maybe SETI astronomers just need more time to take advantage of Moore’s Law and AI. Perhaps it will be another 20 or 200 years before the premise of “Project Hail Mary” comes to pass and we make contact with extraterrestrial travelers. But in the meantime, I’ll take that cup of coffee.

Here come the spoilers

If you haven’t read “Project Hail Mary” yet, you may want to stop reading here. Some of the film’s plot twists have interesting parallels to real-world science, and I can’t help but point them out.

“Project Hail Mary” is set to hit theaters on March 20 and is already receiving rave reviews. For more from Seth Shostak, check out Big Picture Science, the podcast he co-hosts, and look for his columns in Astronomy magazine.

My co-host for the Fiction Science podcast is Dominica Phetteplace, an award-winning author, graduate of the Clarion West Writers Workshop and based in San Francisco. To learn more about Phetteplace, visit DominicaPhetteplace.com.

This report was originally published on Cosmic Log, home base of the Fiction Science podcast. Stay tuned for future episodes of Fiction Science on Apple, Spotify, Player.fm, Pocket Casts and Podchaser. Fiction Science is included in FeedSpot’s Top 100 Science Fiction Podcasts. If you enjoy Fiction Science, please rate the podcast and subscribe to receive notifications for future episodes.

Categories
Science

How jagged moon dust may help future astronauts

Moon dust can be a pain – but it is also, quite literally, the ground we must traverse if we ever want a permanent human settlement on the moon. And in that particular use case, its adhesive, jagged, static-prone properties may actually be beneficial, according to a new paper recently published in Research by a team at Beihang University, who analyzed the mechanical properties of samples returned from the far side of the moon by the Chang’e-6 mission.

Chang’e-6 is the first mission ever to bring back samples from the far side of the moon. It collected them from the South Pole-Aitken (SPA) basin – the largest, deepest and oldest known impact crater in the solar system, which formed about 4.2 billion years ago. That formation caused significant changes in the geotechnical properties of the basin’s soil compared to near-side soils previously collected by NASA astronauts and Chinese landers.

But it is difficult to test these properties on Earth. Simulants can’t fully do justice to the real thing, and there simply isn’t enough real lunar regolith on Earth to provide unlimited samples to every interested researcher. Some tests would also destroy the sample, making it useless for other research later. That’s why the authors came up with an alternative: perform non-destructive testing and then run a simulation.

Fraser explains how big a problem dust is.

They chose the discrete element method (DEM) for the model. This mathematical approach simulates the behavior of bulk solids by calculating the physical interactions, friction and collisions of millions of individual particles. As input, it takes the particle’s shape and some of its physical properties, and as output, it can create a “digital twin” of the soil that future rovers or astronauts will have to traverse without ever touching another sample.
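The core loop of a discrete element method can be sketched in a few lines. This is a deliberately toy one-dimensional version with made-up parameters – real regolith DEM codes, including the one used in the paper, track millions of irregular 3D particles with friction and rotation – but it shows the basic idea of resolving contacts particle by particle:

```python
# Toy 1D discrete element method (DEM) sketch: spheres on a line interact
# through a linear repulsive spring whenever they overlap. All names and
# parameter values here are illustrative assumptions, not the paper's model.

def dem_step(x, v, radius=0.5, mass=1.0, k=100.0, dt=1e-3):
    """Advance particle positions x and velocities v by one explicit step."""
    n = len(x)
    f = [0.0] * n
    for i in range(n):
        for j in range(i + 1, n):
            overlap = 2 * radius - abs(x[j] - x[i])
            if overlap > 0:  # particles in contact: push them apart
                direction = 1.0 if x[i] < x[j] else -1.0
                f[i] -= direction * k * overlap
                f[j] += direction * k * overlap
    for i in range(n):  # semi-implicit Euler update
        v[i] += f[i] / mass * dt
        x[i] += v[i] * dt
    return x, v

# Two particles approaching head-on compress the contact and rebound.
x, v = [0.0, 2.0], [1.0, -1.0]
for _ in range(2000):
    x, v = dem_step(x, v)
print(v[0] < 0 < v[1])  # True: velocities have reversed after the collision
```

Scaling this same contact bookkeeping to millions of irregular grains is what makes DEM computationally demanding – and what makes the resulting “digital twin” useful.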

To get there, however, the authors first had to touch some samples. They used high-resolution X-ray micro-computed tomography (micro-CT) to scan part of the sample returned by Chang’e-6. This non-destructive imaging technique, combined with a convolutional neural network, allowed the researchers to reconstruct nearly 350,000 individual particles for analysis.

Analysis of this data set showed some clear differences between the far-side sample and near-side samples. Notably, the far-side sample has fewer large, coarse particles than near-side samples, and its particles have lower “sphericity” – a measure of how close a particle’s shape is to a true sphere.
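Sphericity is commonly quantified with Wadell’s definition: the surface area of a sphere having the particle’s volume, divided by the particle’s actual surface area. A small sketch of that textbook formula (the paper’s exact shape metric may differ):

```python
import math

# Wadell sphericity: surface area of a sphere with the particle's volume,
# divided by the particle's actual surface area. A perfect sphere scores 1;
# jagged, elongated grains score lower. (Illustrative textbook definition;
# the paper's exact shape metric may differ.)

def sphericity(volume, surface_area):
    return math.pi ** (1 / 3) * (6 * volume) ** (2 / 3) / surface_area

# A unit sphere: V = 4*pi/3, A = 4*pi  ->  sphericity exactly 1
V, A = 4 * math.pi / 3, 4 * math.pi
print(round(sphericity(V, A), 6))  # 1.0

# A cube of side 1: V = 1, A = 6  ->  about 0.806
print(round(sphericity(1.0, 6.0), 3))  # 0.806
```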

Targeting dust with an electric field is one way to combat it, as Fraser explains.

After feeding this data set into their DEM program, the authors found that the regolith is exceptionally strong, at the upper limit of measurements from Apollo-era samples. This is mainly due to a high internal friction angle and strong cohesion between dust particles. Most likely, the jaggedness of the particles – the very thing that makes them so destructive in machinery and human lungs – actually improves their mechanical behavior in the ground. Furthermore, the mechanical strength of the samples was increased by “cementation” from glassy agglutinates, most likely created by micrometeoroid impacts. These make up about 30% of the sample and serve as a glue that holds the remaining particles together.
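“Internal friction angle” and “cohesion” enter soil strength through the standard Mohr-Coulomb relation, tau = c + sigma * tan(phi). The numbers below are placeholders, not the paper’s measured values; the point is simply that raising either parameter raises the shear strength:

```python
import math

# Mohr-Coulomb shear strength: tau = c + sigma * tan(phi), where c is the
# cohesion and phi the internal friction angle. All numeric values below
# are made-up placeholders, not the paper's measured regolith parameters.

def shear_strength(normal_stress_kpa, cohesion_kpa, friction_angle_deg):
    return cohesion_kpa + normal_stress_kpa * math.tan(math.radians(friction_angle_deg))

sigma = 10.0  # kPa of normal stress pressing the grains together
weak = shear_strength(sigma, cohesion_kpa=0.5, friction_angle_deg=30)
strong = shear_strength(sigma, cohesion_kpa=1.0, friction_angle_deg=50)
print(round(weak, 2), round(strong, 2))  # 6.27 12.92
```

Higher friction angle and cohesion – the combination the Chang’e-6 analysis reports – roughly doubles the strength in this toy comparison.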

To build large-scale infrastructure such as a future Artemis habitat or the International Lunar Research Station, understanding the fundamentals of the soil is crucial. This unique geotechnical investigation of the far side shows how diverse the samples can be. And while it may take a while to actually build anything on the far side (due to communication issues), it’s still good to know that when we do, there’s a solid foundation waiting for us – even if that same solid ground could eventually destroy our machines, and harm us with long enough exposure.

Learn more:

Research / EurekaAlert – Building on the far side: AI analysis suggests more stable foundations for future moon bases

H. Wang et al. – Particle morphology controls bulk mechanical behavior of far-side lunar regolith from Chang’e-6 samples and deep learning

UT – The sticky moon dust problem gets a mathematical solution

UT – Flexible force fields can protect our return to the moon

Categories
Science

Starshade concept could reveal Earth-like exoplanets

Finding Earth-like exoplanets with the composition and ingredients for life as we know it is the Holy Grail of exoplanet hunting. Since the discovery of the first exoplanets in the 1990s, scientists have been pushing the boundaries of exoplanet hunting with new and exciting methods. One of these methods is direct imaging, in which the host star is carefully blocked out within the observing telescope to reveal the orbiting exoplanets that were originally hidden in the star’s immense glow.

Only about 1.5 percent of confirmed exoplanets were discovered using this method. One reason for this is atmospheric turbulence, which makes ground-based telescope observations difficult. However, a team of researchers has proposed improving this method to find an Earth-like exoplanet while mitigating this turbulence.

Here, Universe Today discusses these results, published in a recent study in Nature Astronomy, which examines combining ground-based telescopes with a space-based “starshade.” We also share insights from the study’s lead author, Dr. Ahmed Mohamed Soliman, a scientist and technologist at NASA’s Jet Propulsion Laboratory, who discussed the motivation behind the study, how the method compares to current direct imaging methods and upcoming missions, and the next steps toward making this concept a reality. So what was the motivation behind this study?

“Many people think that only large space telescopes like the Nancy Grace Roman Space Telescope, the James Webb Space Telescope or the planned Habitable Worlds Observatory can search for life outside our solar system, but they don’t know what our NASA NIAC-funded study – Hybrid Observatory for Earth-like Exoplanets (HOEE) – can do,” Dr. Soliman told Universe Today.

For the study, Dr. Soliman and his colleagues proposed a hybrid ground-space observatory concept that pairs a 99-meter (325-foot)-diameter orbiting starshade with multiple powerful ground-based telescopes. These include the Extremely Large Telescope (ELT), the Giant Magellan Telescope (GMT) and the Thirty Meter Telescope (TMT) – the ELT and GMT are located in Chile’s Atacama Desert, while the TMT is planned for Hawaii. As the starshade blocks the star’s bright light and reveals the previously hidden exoplanets, the ground-based telescopes will work to determine whether those exoplanets are Earth-like.

Dr. Soliman tells Universe Today that the goal will be to identify dozens of Earth-sized exoplanets. He also points out that this concept will take only minutes to identify entire solar systems, including Earth-like exoplanets orbiting Sun-like stars, and that it will only take hours to identify potential biosignatures.

“In addition, as shown in our Nature Astronomy study, the ELT’s advanced adaptive optics can correct for atmospheric turbulence, enabling clear imaging of habitable exoplanets and detection of potential life under moderate weather conditions,” Dr. Soliman told Universe Today. “The planet should lie within the star’s habitable zone, where conditions allow the existence of oxygen and water. For a Sun-like star, this corresponds to about 1 astronomical unit (AU), the distance between the Earth and the Sun. For nearby stars, this corresponds to an angle of about 0.1 arcsecond. HOEE can observe at an angle of only 0.058 mas [milliarcseconds] from the star.”
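The 0.1-arcsecond figure follows directly from the definition of the parsec: 1 AU viewed from 1 parsec subtends exactly 1 arcsecond, so the separation in arcseconds is simply the orbit in AU divided by the distance in parsecs. A quick check, with a nearby-star distance of 10 parsecs assumed for illustration:

```python
# Small-angle check of the quoted figure: a planet orbiting at a AU,
# seen from d parsecs away, subtends a/d arcseconds. (The parsec is
# defined so that 1 AU at 1 pc spans exactly 1 arcsecond.)

def separation_arcsec(orbit_au, distance_pc):
    return orbit_au / distance_pc

# An Earth analog around a Sun-like star 10 parsecs away:
print(separation_arcsec(1.0, 10.0))  # 0.1 arcsecond, matching the quote
```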

As mentioned earlier, the direct imaging method uses special instruments within telescopes to block the star’s glare and reveal the previously hidden exoplanets. This instrument, called a coronagraph, is an internal blocking method, while the proposed starshade serves as an external blocking method. Numerous ground-based telescopes use coronagraphs to study exoplanets, including the Very Large Telescope and the Magellan Telescopes in Chile, and the Subaru Telescope and the Gemini North telescope in Hawaii.

Examples of currently active space-based telescopes that use coronagraphs to study exoplanets include NASA’s James Webb Space Telescope (JWST) and Hubble Space Telescope (HST), while the European Space Agency’s Solar and Heliospheric Observatory and India’s Aditya-L1 telescope use coronagraphs to study our Sun. But how does this hybrid concept compare to current direct imaging methods?

Dr. Soliman tells Universe Today: “Current space telescopes such as the James Webb Space Telescope and the soon-to-fly Nancy Grace Roman Space Telescope use internal coronagraphs for direct imaging, but their contrast is not strong enough to directly detect real Earth-like planets in habitable zones. Existing ground-based telescopes also lack the required contrast and resolution. A hybrid system combining a space-based starshade with large ground-based telescopes would significantly improve starlight suppression and angular resolution and make direct detection of Earth-like exoplanets possible.”

The Nancy Grace Roman Space Telescope is currently scheduled to launch between September 2026 and May 2027 and will operate at the Sun-Earth Lagrange point L2, about 1.5 million kilometers from Earth on the side opposite the Sun, beyond the Moon’s orbit. This is where JWST currently resides, as the location offers an uninterrupted view of the night sky while being shielded from the Sun’s heat and maintaining clear radio communications for sending data back. In contrast, HST orbits the Earth roughly every 95 minutes; its communications are periodically blocked by our planet when it attempts to send data back to NASA, and its view of the sky is likewise partially blocked by our planet.

In addition to the upcoming Nancy Grace Roman Space Telescope, another planned space-based telescope has the potential to deliver groundbreaking exoplanet research: the Habitable Worlds Observatory (HWO), scheduled to launch in the late 2030s or early 2040s. HWO’s primary goal is to directly image and identify at least 25 Earth-like exoplanets and search for biosignatures, potentially using a coronagraph or starshade to support its direct imaging techniques. But how does the starshade concept proposed in this study compare to HWO?

“HWO will be more flexible in terms of targeting and monitoring cadence,” Dr. Soliman told Universe Today. “HOEE, on the other hand, can observe several times faster because it uses a ground telescope about six times larger than the HWO. HOEE achieves an angular resolution [about] six times finer and allows detection of planets embedded in the circumstellar dust produced by comets and asteroids in exoplanetary systems. HOEE can be a technological stepping stone and complement HWO or even accelerate the characterization of exoplanets before the launch of HWO.”
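The “six times larger aperture, six times finer resolution” scaling comes from the diffraction limit, theta ≈ 1.22 * lambda / D. A rough sketch (the apertures below are round illustrative numbers, not actual HWO or ELT specifications):

```python
import math

# Rayleigh diffraction limit: theta ~ 1.22 * lambda / D (radians). At a
# fixed wavelength, a telescope with ~6x the aperture resolves ~6x finer
# detail, which is the scaling Dr. Soliman invokes. The aperture values
# below are round illustrative numbers, not actual mission specs.

def diffraction_limit_mas(wavelength_m, aperture_m):
    theta_rad = 1.22 * wavelength_m / aperture_m
    return theta_rad * 180 / math.pi * 3600 * 1000  # radians -> milliarcseconds

lam = 550e-9  # visible light
space = diffraction_limit_mas(lam, 6.5)    # ~6.5 m space-telescope aperture
ground = diffraction_limit_mas(lam, 39.0)  # ~39 m ELT-class aperture
print(round(space, 2), round(ground, 2))
```

The ratio of the two results equals the ratio of the apertures, so a sixfold jump in mirror size buys a sixfold jump in angular resolution.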

The journey to bring a space mission from concept to reality often takes years, sometimes decades, of designs, tests, funding proposals, rejections, approvals, more tests, redesigns, and countless committees deciding its fate. These ventures are funding-driven but also require a case for the mission’s scientific value. For example, the United States National Academies produce the Decadal Survey, a 10-year plan that sets out scientific goals for planetary science, astrophysics, earth sciences and space physics.

The most recent was the Astro2020 Decadal Survey, which established three key goals for NASA’s space exploration for the 2030s and beyond: identifying habitable exoplanets, studying black holes and neutron stars, and tracing the evolution of galaxies. The starshade concept proposed in this study falls under Astro2020’s habitable-worlds goal. So what are the next steps to make this hybrid space-ground starshade approach a reality?

“The next question now is: Can we actually build it and launch it?” Dr. Soliman tells Universe Today. “The starshade must be 100 meters wide and very light so that rockets can launch it into space and move it from star to star. It sounds difficult, but exciting progress is already being made at NASA’s Jet Propulsion Laboratory, NASA Goddard Space Flight Center and NASA Ames Research Center through NASA’s Starshade and NIAC programs. The Keck Institute for Space Studies has brought together top scientists and engineers to chart a clear path for a true HOEE mission, with the starshade aiming to find the very first Earth-like planet orbiting a Sun-like star.”

How will this starshade concept help identify Earth-like exoplanets in the coming years and decades? Only time will tell, and that’s why we do science!

As always, keep up the science and keep looking up!

Categories
Science

Astronauts use bacteria and fungi to harvest metals in space

It is a well-known fact that humanity must bring Earth’s environment with it if it wants to explore space and live and work on other planets. That includes life support systems that rely on biological processes – known as bioregenerative life support systems (BLSS) – but also the many types of microbes that are essential to living systems. Humans already bring microbes with them when they travel into space, particularly to the International Space Station (ISS). These microbes become part of the local environment, sticking to surfaces, growing in nooks and crannies, and getting into everything.

Given their constant presence, it is paramount that we understand how they survive in space. Additionally, they have potential uses that could enable greater self-sufficiency in space. For example, certain species of bacteria and fungi extract minerals from rocks as a source of nutrients. In a recent study aboard the ISS, researchers from Cornell and the University of Edinburgh examined how these species could be used to extract platinum from a meteorite under microgravity conditions. Their results suggest that this could be an effective method for extracting mineral resources in space and reducing dependence on Earth.

The study was led by Rosa Santomartino, an assistant professor of biological and environmental engineering at Cornell’s College of Agriculture and Life Sciences (CALS), and Alessandro Stirpe, a research fellow in microbiology at Cornell and the School of Biological Sciences at the University of Edinburgh. They were joined by researchers from the Medical University of Graz in Austria, Rice University, Cancer Research UK, the UK Center for Astrobiology at the University of Edinburgh, Kayser Space Ltd and Kayser Italia. Their study was published January 30 in npj Microgravity.

*A bioreactor manufactured by the BioAsteroid project at the University of Edinburgh. Photo credit: University of Edinburgh*

The work was part of the BioAsteroid project, a collaborative initiative between the University of Edinburgh and the European Space Agency (ESA). This project is led by Charles Cockell, a professor of astrobiology at the University of Edinburgh and senior author of the study. Cockell and his colleagues developed “biomining reactors” that were deployed on the ISS in late 2020/early 2021 to study how gravity affects the interaction between microbes and rocks in microgravity.

These reactors contained samples of an L-chondrite asteroid that were treated with the bacterium Sphingomonas desiccabilis and the fungus Penicillium simplicissimum. These microbes are promising for raw material extraction because they produce carboxylic acids that bind to minerals and release them from rocks. However, there is still uncertainty about how this mechanism works. To this end, the experiment also included a metabolomic analysis, in which part of the liquid culture was extracted and analyzed for biomolecules and secondary metabolites. As Santomartino said in a Cornell Chronicle press release:

This is likely the first experiment of its kind with [a] meteorite on the International Space Station. We wanted to keep the approach specific but also general, to increase its impact. These are two completely different species, and they will extract different things. So we wanted to understand what and how, but keep the results relevant for a broader perspective, since not much is known about the mechanisms that influence microbial behavior in space.

The experiment was conducted aboard the ISS by NASA astronaut Michael Scott Hopkins, while the researchers conducted their own control version in the laboratory. This allowed them to examine how the experiment would work in weightlessness compared to Earth’s gravity. Santomartino and Stirpe then analyzed the experiment data and showed that of the 44 different elements, 18 were extracted through biological processes. Stirpe said:

We broke the analysis down to individual elements and asked ourselves: “Okay, does extraction behave differently in space than on Earth? Are these elements extracted more if we have a bacterium or a fungus, or if we have both? Is this just noise, or can we see something that might make a little sense?” We don’t see any major differences, but there are some very interesting ones.

NASA astronaut Michael Scott Hopkins installs the experimental containers in KUBIK (left), and the six hardware units inserted into KUBIK on board the ISS (right). Image credit: ESA/NASA

Their analysis found that the microbes produced consistent results in both Earth gravity and microgravity. However, there were also clear changes in microbial metabolism, particularly in the fungal samples. In microgravity, the fungus increased its production of carboxylic acids and other molecules, leading to the extraction of more palladium, platinum and other elements. Meanwhile, the non-biological leaching control proved less effective in microgravity than on Earth. Santomartino said:

In these cases, the microbe does not improve the extraction itself, but rather ensures that the extraction remains at a constant level regardless of gravity. And this applies not only to palladium, but to various types of metals, although not all. Another complex but very interesting result, in my opinion, is that the extraction rate varies greatly depending on the metal you are considering and the microbial and gravity conditions.

This experiment successfully demonstrated the potential of “biomining” that future astronauts could use to explore the Moon and Mars. In addition to life support systems that rely on cyanobacteria and other photosynthetic organisms to purify the air and produce edible algae, microbes and fungi could be used to leach minerals from the local regolith. These, in turn, could be used to produce building materials for structures and tools, reducing the amount of supplies that need to be sent from Earth.

Additionally, biomining has potential applications here on Earth, providing a biological method for extracting metals in resource-limited environments or from mining waste. This technology could also lead to biotechnologies that facilitate the emergence of a waste-free circular economy. However, the team cautions that more research is needed because there are many variables and uncertainties surrounding the effects of space on microbes.

“Depending on the type of microbe, depending on the space conditions, depending on the method the researchers use, everything changes,” Santomartino said. “Bacteria and fungi are all so different and space conditions are so complex that there is currently no single answer. So maybe we need to dig more. I don’t want to be too poetic, but for me that’s a bit [of] the beauty of it. It’s very complex. And I like it.”

Further reading: Cornell Chronicle, npj Microgravity.

Categories
Science

VLT image reveals a “cosmic falcon” spreading its wings

The European Southern Observatory (ESO) has just released its picture of the week. The image, taken by the Very Large Telescope (VLT) in Chile, shows the nebula RCW 36, located about 2,300 light-years away in the constellation Vela. To observers, it looks like a cosmic falcon spreading its wings: the dark clouds in the center resemble the falcon’s head and body, while the filaments running to the right and left serve as wings. And the nice thing about it: the image was taken with the High Acuity Wide-field K-band Imager (HAWK-I) instrument on the VLT.

This powerful near-infrared imager is designed to capture deep, high-resolution images, allowing it to penetrate the dust and gas clouds that obscure dimmer objects such as newly forming stars. Below the falcon in the image, several new stars are visible, embedded in clouds of nebula gas and dust. The intense radiation from these massive young stars illuminates the nebula, causing it to glow blue, red and white. However, it is the population of faint brown dwarfs that was of interest to the astronomers who captured this image.

Brown dwarfs are substellar objects – essentially very large gas giants that were not massive enough to ignite hydrogen fusion in their cores. HAWK-I is ideal for this task because it combines high sensitivity with adaptive optics that correct for atmospheric turbulence. This allowed the international team, led by astronomers from the Instituto de Astrofísica e Ciências do Espaço (IA) in Lisbon, to identify the many fainter objects in the image. Their efforts are described in a paper published in *Astronomy & Astrophysics*: “Substellar population of the young massive star cluster RCW 36 in Vela.”

The study not only provided important data that will improve our understanding of brown dwarf formation, but also provided a striking picture. Afonso do Brito do Vale, a graduate student at IA and lead author of the paper, described it as “massive stars ‘pushing away’ the gas and dust clouds surrounding them, almost like an animal breaking through its eggshell for the first time.” This completes the picture and gives the impression that the falcon is protecting these baby stars and brown dwarfs as if they were its eggs. Over time, new stars will “hatch” and join the nest!

Further reading: ESO, Astronomy & Astrophysics

Categories
Science

Pictures from Mars Express reveal Mars’ pockmarked surface

ESA’s Mars Express probe has been exploring Mars from orbit for more than twenty years. Its mapping of the surface with the High Resolution Stereo Camera (HRSC) has drastically changed the way we see the Red Planet. In a recent release, ESA published a series of HRSC images highlighting the heavily cratered Arabia Terra region. The study of Martian craters provides insights into Mars’ geology, meteorology, and its long and turbulent history. The images were generated from the camera’s digital terrain model as well as its nadir and color channels.

The image above shows the Arabia Terra region, a large plain in the southern highlands that is heavily cratered by impactors that have struck the planet over time. The features are labeled (if you click on the image) and can be enlarged. The abundance of craters reflects the fact that Arabia Terra is one of the oldest geological formations on Mars, estimated to be 3.7 to 4.1 billion years old. Around that time, geological activity inside Mars wound down, causing the planet to lose its magnetosphere, after which its atmosphere was slowly stripped away by the solar wind.

*A bird’s eye view of a region in Trouvelot Crater. It shows the dark volcanic deposits covering the crater floor and a light mound visible within these deposits. Photo credit: ESA/DLR/FU Berlin*

Much like the Moon’s airless environment has preserved its craters, Mars’ thin atmosphere has ensured that these craters are well preserved. Some of the craters in the image are filled with dark material, while others are filled with lighter sand and undulating dunes. This suggests that some of this sand was deposited on Mars by dust storms, while other material may have been thrown out by the impacts themselves. Others still show signs of collapsing crater walls and worn rims, also indicating wind-induced erosion.

To the left of Trouvelot Crater is an older, more eroded basin with a completely collapsed wall almost entirely covered in dark rock. This material was formed by the wind into the characteristic undulating structures known as “barchan” dunes, characterized by their crescent-shaped profile. Mars Express photographed these dunes at several locations in the northern lowlands and the large Tharsis volcanic region. The dark material, known as “mafic rock,” is rich in minerals and is often associated with volcanism here on Earth.

*Close-up showing the light hill at top left contrasting with the dark rock. Photo credit: ESA/DLR/FU Berlin*

This again indicates that material thrown out by impacts was blown around by the wind and eventually pulled down along the crater walls. The fact that Trouvelot intersects this crater suggests that it is the younger of the two, and the commonality of craters with dark material suggests that the mechanisms involved are ubiquitous on Mars. Amid the dark material is a light hill, about 20 km long, covered in ridges and grooves. Such mounds have been observed in other locations and suggest that other processes may be at work.

One clue is the minerals observed in these mounds, which suggest they formed in the presence of running water. Whether this is the case remains a matter of scientific debate, and there are several possibilities as to how they could have been deposited by water.

Further reading: ESA


Physicists from Illinois and UChicago are developing a new method to measure cosmic expansion

For about a century, scientists have known that the universe is in a state of constant expansion. In honor of the scientists who conclusively demonstrated this, the expansion rate became known as the Hubble constant (or Hubble-Lemaître constant). Today, scientists use two main techniques to measure it: the cosmic microwave background (CMB) and the cosmic distance ladder. The former is based on precise measurements of the CMB, the relic radiation left over from the Big Bang, while the latter is based on parallax and redshift measurements of variable stars and supernovae (also called “standard candles”).

The only problem is that the two methods do not agree, resulting in what is known as “Hubble tension.” This problem is considered one of the greatest cosmological puzzles facing scientists today. Fortunately, new methods are emerging that could help resolve this “tension” and bring order to the Standard Model of cosmology. In a recent study, a team of astrophysicists, cosmologists and physicists from the University of Illinois and the University of Chicago proposed a new method that exploits tiny ripples in spacetime known as gravitational waves (GWs).
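To put a rough number on the disagreement: using representative published values (an assumption for illustration, not figures from this study) – a CMB-based estimate of about 67.4 ± 0.5 km/s/Mpc and a distance-ladder estimate of about 73.04 ± 1.04 km/s/Mpc – a back-of-the-envelope calculation of the tension in standard deviations looks like this:

```python
import math

# Representative published values (illustrative assumption: the Planck
# CMB fit and the SH0ES distance-ladder result), in km/s/Mpc:
h0_cmb, sig_cmb = 67.4, 0.5
h0_ladder, sig_ladder = 73.04, 1.04

# Difference divided by the quadrature sum of the uncertainties:
tension_sigma = abs(h0_ladder - h0_cmb) / math.hypot(sig_cmb, sig_ladder)
print(f"Hubble tension: {tension_sigma:.1f} sigma")  # → Hubble tension: 4.9 sigma
```

A roughly five-sigma discrepancy is far too large to dismiss as a statistical fluke, which is why independent measurement methods are so sought after.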

The study was led by Bryce Cousins, an NSF Graduate Research Fellow from the Institute of Gravitation and the Cosmos (IGC) at the University of Illinois Urbana-Champaign. He was joined by several colleagues from the IGC as well as researchers from the Kavli Institute for Cosmological Physics and the Enrico Fermi Institute at the University of Chicago. Their study, “Stochastic Siren: Astrophysical gravitational-wave background measurements of the Hubble Constant,” appeared Jan. 16 in *Physical Review Letters*.

Scientists hoping to resolve the Hubble tension have proposed several solutions, ranging from early dark energy (EDE) and interactions between dark matter (DM) and neutrinos to the evolving dynamics of dark energy. In recent years, the discovery of gravitational waves has also emerged as a way to resolve the tension by offering a new way to measure cosmic expansion. Originally predicted by Einstein’s theory of general relativity, gravitational waves are ripples in the fabric of spacetime created by the mergers of massive objects (neutron stars and/or black holes).

They were first confirmed in 2016 by scientists at the Laser Interferometer Gravitational-Wave Observatory (LIGO). Thanks to improved instruments and international collaboration, the LIGO-Virgo-KAGRA (LVK) collaboration has since detected more than 300 GW events. During this time, astronomers have found ways to use these events to study cosmological phenomena, including measuring the expansion of the cosmos. In the current study, the team found a way to improve these measurements by exploiting the gravitational wave background (GWB) produced by astrophysical collisions that the LVK network is not yet sensitive enough to detect individually.

They call it the “standard stochastic siren” method because the collisions that form the gravitational wave background occur stochastically. Daniel Holz, UChicago professor and co-author of the study, explained in a UIUC press release:

It’s not every day that you come up with an entirely new tool for cosmology. We show that we can learn about the age and composition of the universe using the background hum of gravitational waves from merging black holes in distant galaxies. This is an exciting and completely new direction, and we look forward to applying our methods to future datasets to help constrain the Hubble constant as well as other important cosmological quantities.

*Artist’s impression of the electromagnetic signal from the merger of two neutron stars. Photo credit: NSF/LIGO/Sonoma State University/A. Simonnet*

As a proof of principle, the team applied their method to recent LVK collaboration data. They found that the failure to detect the GWB so far provides evidence against slow cosmic expansion rates. They then combined their method with measurements of the Hubble constant based on individual black hole collisions to get a more accurate rate. “Because we observe individual black hole collisions, we can determine the frequency of these collisions throughout the universe,” Cousins said. “Based on these rates, we expect there will be many more events that we cannot observe, called the gravitational wave background.”

This showed that at lower values of the Hubble constant, the total volume of space in which collisions occur is smaller. This means the density of colliding objects is higher, and the strength of the GWB signal rises to a level that current instruments could detect. “This result is very significant – it is important to obtain an independent measurement of the Hubble constant to resolve the current Hubble tension,” added co-author Nicolás Yunes, the founding director of the Illinois Center for Advanced Studies of the Universe (ICASU). “Our method is an innovative way to improve the accuracy of Hubble constant inferences using gravitational waves.”

With planned improvements to the LVK network, scientists believe the GWB will likely be detected within the next six years. If that happens, the team’s method could be used to further refine measurements of the Hubble constant. Until then, the stochastic siren method can use upper limits on the GWB to constrain the Hubble constant, allowing scientists to extract information even before a full detection occurs.

“This should pave the way for future application of this method as we can further increase the sensitivity, better isolate the gravitational wave background and perhaps even detect it,” says Cousins. “By incorporating this information, we expect to obtain better cosmological results and come closer to resolving the Hubble tension.”

Further reading: University of Illinois


Red dwarf stars may deprive alien plants of the “quality light” they need to breathe

Red dwarfs make up the vast majority of stars in the galaxy. Because of their ubiquity, they host most of the rocky exoplanets we have found so far – which in turn makes them interesting for astrobiological studies. However, there’s a catch: astrobiologists aren’t sure whether the light from these stars can actually support oxygen-producing life. A new paper by Giovanni Covone and Amedeo Balbi, available as a preprint on arXiv, suggests that it may not: when it comes to starlight, quality is just as important as quantity. And according to their calculations, maintaining Earth-like biospheres around red dwarfs is incredibly difficult.

Their argument is based on the concept of exergy – a measure of the maximum amount of useful work that can be extracted from a radiation field. In other words, it measures the thermodynamic quality of light and not just the raw energy it contains. When measuring the “habitable zone” of stars, astrobiologists typically look at the total number of photons, particularly in the visible light range between 400 and 700 nanometers wavelength.

So what “useful work” does light do on exoplanets? Perhaps the most important is splitting water. This process, known as “water oxidation,” represents a kinetic bottleneck in photosynthesis and produces the oxygen we expect to find in biosignatures. However, biological systems require a significant amount of energy to drive this chemical reaction. And when it comes to providing that energy, red dwarfs have two strikes against them.


Red dwarfs are cool, and their light is strongly shifted into the infrared. Most of their photons do not carry enough energy to reach the threshold required for water splitting. And even the photons that do clear the threshold have a smaller fraction of energy that can actually be converted into useful chemical work. This double penalty enormously reduces the potential for oxygen-producing life to arise around red dwarfs. By comparison, the exergy available to drive water oxidation around Sun-like stars is about five times higher.

However, astrobiologists are an optimistic bunch, so their immediate answer to this concern would be: perhaps life around these stars has simply evolved to adapt to these infrared-rich environments. Could it use longer, lower-energy infrared wavelengths under a red dwarf sky? The short answer is “no,” because of the so-called red limit. This is the longest wavelength of light that can support photosynthesis. The authors argue that it is not a fixed value, but rather an emergent property determined by a star’s spectrum, the planet’s atmosphere, and the target chemical reaction – in this case water oxidation.

They estimate that the red limit for red dwarfs is 0.95 µm, while for Sun-like stars it is closer to 1.0 µm. In practice, this means that life cannot simply shift its primary absorption bands deeper into the near-infrared to accommodate its less powerful star. Another concern involves the evolution of life on such a planet. Anoxygenic bacteria – those that do not produce oxygen – can use infrared light effectively. If they proliferated, they could outcompete oxygen-producing bacteria, and the world would never experience a “Great Oxidation Event” equivalent to what happened on Earth. Without abundant oxygen in the atmosphere, multicellular life would be severely limited, if not ruled out entirely.
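The wavelength numbers above can be sanity-checked with the Planck relation E = hc/λ. A minimal sketch (standard physical constants; the ~1.23 eV figure is the textbook thermodynamic minimum for water oxidation, not a value from this paper) shows why a red limit near 1 µm is a natural cutoff – a 1 µm photon carries almost exactly that minimum energy:

```python
# Photon energy vs. wavelength via E = h*c / lambda.
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon in eV."""
    return h * c / (wavelength_nm * 1e-9) / eV

# Thermodynamic minimum for water oxidation: ~1.23 eV per electron.
for lam in (700, 950, 1000):
    print(f"{lam} nm -> {photon_energy_ev(lam):.3f} eV")
# 700 nm -> 1.771 eV
# 950 nm -> 1.305 eV
# 1000 nm -> 1.240 eV
```

Photons much beyond 1 µm fall below the water-splitting threshold, which is why simply absorbing deeper in the infrared cannot rescue oxygenic photosynthesis.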


When all of this is taken into account, the prospects for life near red dwarfs paint a bleak picture. But let’s not rule it out completely. Earth’s biosphere currently operates about three orders of magnitude below the thermodynamic maximum – evidence that life itself is extremely inefficient. Even so, conditions around red dwarfs that would be favorable for life are likely extremely rare. The paper suggests that our time searching for an oxygen-rich alien forest would be better spent near stars like our Sun than chasing the statistical rarity of a thriving biosphere around a red dwarf.

Learn more:

G. Covone & A. Balbi – Photosynthetic exergy I. Thermodynamic limits for planets in the habitable zone

UT – Red dwarfs are too weak to generate complex life

UT – Planets in the habitable zone around red dwarfs are unlikely to host exomoons

UT – New research suggests advanced civilizations are unlikely to exist in red dwarf systems


Astronomers are developing a new method to measure cosmic expansion using lensed supernovae

Superluminous supernovae are marvelous events. They are also an important tool for astronomers to measure cosmic distances and the rate at which the universe is expanding. As part of the cosmic distance ladder, these incredibly bright stellar explosions serve as “standard candles” for objects billions of light-years away. In a rare stroke of luck, researchers in Munich used the Large Binocular Telescope (LBT) in Arizona to observe a superluminous supernova 10 billion light-years away that was far brighter than most explosions of its kind.

What was special about this supernova was that it appeared five times in the night sky due to the gravitational lensing of two foreground galaxies. These galaxies bent the supernova’s light, causing it to travel along different paths. Because these paths have different lengths, the light appeared in different places around the galaxies at different times. By measuring the time delays between the multiple images, the researchers were able to measure how quickly the universe is expanding – the Hubble-Lemaître constant.
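The logic of time-delay cosmography can be sketched in a few lines. In the standard textbook approach (this is not the team’s actual pipeline, and all input numbers below are invented for illustration), the measured delay Δt between two images and the Fermat potential difference Δφ from the lens mass model fix a “time-delay distance” D_Δt = c·Δt/Δφ; since every cosmological distance scales as 1/H0, a shorter delay implies a faster-expanding universe:

```python
# Toy time-delay cosmography sketch with hypothetical numbers.
C_KM_S = 299792.458   # speed of light, km/s
DAY_S = 86400.0       # seconds per day
MPC_KM = 3.0857e19    # kilometers per megaparsec

def time_delay_distance_mpc(delay_days, fermat_potential_diff):
    """D_dt = c * Delta_t / Delta_phi, returned in Mpc.

    fermat_potential_diff is dimensionless (angles in radians), of order
    arcsec^2 ~ 1e-11 for a galaxy-scale lens.
    """
    return C_KM_S * delay_days * DAY_S / fermat_potential_diff / MPC_KM

# Hypothetical measurement: a 30-day delay and Delta_phi = 2e-11.
d_dt = time_delay_distance_mpc(delay_days=30.0, fermat_potential_diff=2e-11)
print(f"D_dt ~ {d_dt:.0f} Mpc")  # → D_dt ~ 1259 Mpc
# H0 scales as 1/D_dt: halving the measured delay doubles the inferred H0.
```

The hard part in practice is the lens mass model that supplies Δφ, which is why the simple two-galaxy configuration of SN Winny is such an opportunity.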

The team consisted of researchers from the Technical University of Munich (TUM), the Max Planck Institute for Astrophysics (MPG), the Harvard & Smithsonian Center for Astrophysics (CfA), the EO Lawrence Berkeley National Laboratory, ETH Zurich, the Research Center for the Early Universe (RESCEU), the Cosmic Dawn Center (DAWN), the Ulugh Beg Astronomical Institute, the Chinese Academy of Sciences (CAS), the Institute of Space Sciences (ICE, CSIC), the Cluster of Excellence ORIGINS, the National Astronomical Observatory of Japan (NAOJ), the European Southern Observatory (ESO), the Space Telescope Science Institute (STScI) and several universities.

*Large binocular telescope on Mount Graham in Arizona, USA. Photo credit & ©: Dr. Christoph Saulder/MPE*

The paper describing their observations has been accepted for publication in *Astronomy & Astrophysics*.

Few such measurements have been made so far because gravitationally lensed supernovae are so rare. It is also a challenging process that requires astronomers to determine the masses of the lensing galaxies, since these determine how strongly the background object’s light is deflected. To determine the masses of the two galaxies, the team captured images with the LBT using its two 8.4-meter (27.5-foot) mirrors and an adaptive optics system. The observations revealed the two foreground lensing galaxies in the center, surrounded by five bluish images of the supernova explosion, making the system look like fireworks!

Sherry Suyu, associate professor of observational cosmology at TUM and fellow at the Max Planck Institute for Astrophysics, explained in an MPG press release:

We nicknamed this supernova SN Winny, inspired by its official name SN 2025wny. It is an extremely rare event that could play a key role in improving our understanding of the cosmos. The chance of finding a super-bright supernova perfectly aligned with a suitable gravitational lens is less than one in a million. We have been looking for such an event for six years by compiling a list of promising gravitational lenses, and in August 2025 SN Winny exactly coincided with one of them.

The image surprised the team, because galaxy-scale lens systems typically produce only two or four images. The young researchers Allan Schweinfurth (TUM) and Leon Ecker (LMU) created the first model of the lens mass distribution from the positions of all five images. Allan Schweinfurth said:

To date, most lensed supernovae have been magnified by massive galaxy clusters, whose mass distributions are complex and difficult to model. SN Winny, however, is lensed by only two individual galaxies. We find overall smooth and regular light and mass distributions for these galaxies, suggesting that they have not collided in the past despite their apparent proximity. The overall simplicity of the system provides an exciting opportunity to measure the expansion rate of the Universe with high accuracy.

This, in turn, could help astronomers and cosmologists mitigate the ongoing problem of Hubble tension. Until now, scientists have relied primarily on two methods for measuring cosmic expansion: the Cosmic Distance Ladder and Cosmic Microwave Background (CMB) measurements. The former is the local method, which combines parallax, supernovae and redshift measurements of bright objects to determine distances step by step. Since each step depends on the previous one, even small mistakes can add up and affect the final result.
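The way small errors accumulate on the distance ladder can be illustrated with a toy calculation (the per-rung uncertainties below are invented for illustration, not real survey figures): because each rung calibrates the next multiplicatively, independent fractional errors combine roughly in quadrature, and every rung added can only increase the total:

```python
import math

# Hypothetical fractional distance errors for three ladder rungs
# (e.g. parallax, Cepheids, Type Ia supernovae):
rung_errors = [0.01, 0.02, 0.03]

# Multiplicative calibration chain -> independent fractional errors
# add in quadrature.
total = math.sqrt(sum(e * e for e in rung_errors))
print(f"combined fractional error: {total:.3f}")  # ~0.037
```

A one-step method like time-delay lensing sidesteps this chain entirely, which is why its systematics are “fewer and completely different.”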

In contrast, CMB measurements look back to the beginning of cosmic time by studying the “relic radiation” left over from the Big Bang. This approach is highly precise but relies on models of the early universe to calculate the current expansion rate. It therefore depends heavily on assumptions about the evolution of the universe that are still a matter of debate. This study presents a third possible method, in which astronomers use gravitationally lensed supernovae and measure the time delays between multiple images of the same event.

By modeling the mass distribution of the lensing galaxies, scientists can directly calculate the Hubble-Lemaître constant. “In contrast to the cosmic distance ladder, this is a one-step method with fewer and completely different sources of systematic uncertainty,” said Stefan Taubenberger, a leading member of Professor Suyu’s team and first author of the study.

Astronomers around the world are now observing SN Winny in detail using ground- and space-based telescopes. Their results will provide new insights into cosmic expansion that could help resolve the Hubble tension.

Further reading: MPG


Time to stop endangering developing countries with CO2 regulation – Watts Up With That?

By Vijay Jayaraj

Imagine the irony of labeling a substance “dangerous,” only to discover that the real danger lies not in the substance but in the campaign to defame it. Such is the case with carbon dioxide (CO₂) and how it has been mischaracterized to justify suicidal policies worldwide.

In 2009, the US Environmental Protection Agency (EPA) formalized its endangerment finding, which declared CO₂ – two pounds of which each of us exhales every day – a pollutant. It laid the bureaucratic basis for far-reaching regulations aimed at eliminating the use of fossil fuels, a goal that conflicts with the social goods of reliable energy supply and prosperity.

The endangerment finding deemed CO₂ the dominant driver of a “dangerous” rise in global temperatures in recent decades.

This regulatory corruption marked the beginning of what can only be described as the weaponization of environmental administration against the coal, oil and natural gas energy systems that have lifted people out of poverty since the 19th century.

However, a US Department of Energy (DOE) study published in July, entitled “A Critical Review of Impacts of Greenhouse Gas Emissions on the US Climate,” countered this nonsense. The document, written by a team of independent scientists with diverse backgrounds, states that “CO2-induced warming could be economically less harmful than generally assumed, and overly aggressive mitigation policies could prove more harmful than beneficial.”

Following this comprehensive analysis, EPA Administrator Lee Zeldin proposed that his agency rescind the endangerment finding. For anyone following the news, it is already obvious that the current US administration has changed course on energy policy by reversing the destructive anti-fossil-fuel stance of the previous Biden administration. Rescinding the endangerment finding could be the death blow for a “green” mania that has cost the world billions of dollars for no benefit.

The question for developing countries is whether their governments will continue to tolerate a CO₂ hysteria that suffocates domestic economies like a boa constrictor. How long will poorer countries suffer under climate policies manufactured in UN offices and imposed on villages without electricity?

Green energy vehicles – such as the Paris Agreement and net-zero targets – were promoted in the name of climate action but have sabotaged growth, stalled industrial progress and punished the poor. From the reckless shelving of fossil-fuel development projects to the puppet-like behavior of legislators who recite policies written by the United Nations and the World Economic Forum, the fingerprints of the green agenda are everywhere.

Among the projects that have suffered at the hands of anti-fossil-fuel crusaders is a 1,445-kilometer pipeline to transport crude oil from Uganda to Tanzania.

The price of climate regulations is ruinous. As the DOE report notes, the exorbitant costs associated with policies such as electric vehicle mandates, emissions targets and rules for household appliances exceed even the phony “social cost of carbon” promoted by the climate-industrial complex as part of its pseudoscience. Green programs are an embarrassing failure of any rational cost-benefit analysis.

With regard to actual pollution in the Third World, the DOE’s recent climate assessment makes a long-overdue distinction that mainstream media and bureaucrats have ignored for years. It rightly points out that CO₂ is not a pollutant in the traditional, legally defined sense: “In many ways, CO₂ differs from the so-called criteria air pollutants. It does not affect local air quality and has no toxicological effects on people at ambient levels.”

Now is the time for policymakers in developing economies to stop treating plant food as public enemy number one, so that their societies can use the energy resources that make economic – and ecological – sense.

Their economies cannot afford to wait to remove CO2-driven restrictions on energy generation and use, since they lack the wealth buffer of richer nations. The negative effects of anti-fossil-fuel policies are already obvious, and change is needed to avoid further damage.

This comment was first published on August 16, 2025 by Townhall.

Vijay Jayaraj is a science and research associate at the CO₂ Coalition in Fairfax, Virginia. He holds an MS in environmental sciences from the University of East Anglia, a postgraduate degree in energy management from Robert Gordon University in the UK, and a bachelor’s degree in engineering from Anna University, India.
