Categories
ScienceDaily

Comet discovered to have its own northern lights

Data from NASA instruments aboard the ESA (European Space Agency) Rosetta mission have helped reveal that comet 67P/Churyumov-Gerasimenko has its own far-ultraviolet aurora. It is the first time such electromagnetic emissions in the far-ultraviolet have been documented on a celestial object other than a planet or moon. A paper on the findings was released today in the journal Nature Astronomy.

On Earth, auroras (also known as the northern or southern lights) are generated when electrically charged particles speeding from the Sun hit the upper atmosphere to create colorful shimmers of green, white, and red. Elsewhere in the solar system, Jupiter and some of its moons — as well as Saturn, Uranus, Neptune, and even Mars — have all exhibited their own versions of northern lights. But the phenomenon had yet to be documented in comets.

Rosetta is space exploration’s most traveled and accomplished comet hunter. Launched in 2004, it orbited comet 67P/Churyumov-Gerasimenko (67P/C-G) from Aug. 2014 until its dramatic end-of-mission comet landing in Sept. 2016. The data for this most recent study come from what mission scientists initially interpreted as “dayglow,” a process caused by photons of light interacting with the envelope of gas — known as the coma — that radiates from, and surrounds, the comet’s nucleus. But new analysis of the data paints a very different picture.

“The glow surrounding 67P/C-G is one of a kind,” said Marina Galand of Imperial College London and lead author of the study. “By linking data from numerous Rosetta instruments, we were able to get a better picture of what was going on. This enabled us to unambiguously identify how 67P/C-G’s ultraviolet atomic emissions form.”

The data indicate 67P/C-G’s emissions are actually auroral in nature. Electrons streaming out in the solar wind — the stream of charged particles flowing out from the Sun — interact with the gas in the comet’s coma, breaking apart water and other molecules. The resulting atoms give off a distinctive far-ultraviolet light. Invisible to the naked eye, far-ultraviolet has the shortest wavelengths of radiation in the ultraviolet spectrum.

Exploring the emission of 67P/C-G will enable scientists to learn how the particles in the solar wind change over time, something that is crucial for understanding space weather throughout the solar system. By providing better information on how the Sun’s radiation affects the space environment, such findings could ultimately help protect satellites and spacecraft, as well as astronauts traveling to the Moon and Mars.

“Rosetta is the gift that keeps on giving,” said Paul Feldman, an investigator on Alice at the Johns Hopkins University in Baltimore and a co-author of the paper. “The treasure trove of data it returned over its two-year visit to the comet have allowed us to rewrite the book on these most exotic inhabitants of our solar system — and by all accounts there is much more to come.”

NASA Instruments Aboard ESA’s Rosetta

NASA-supplied instruments contributed to this investigation. The Ion and Electron Sensor (IES) instrument detected the amount and energy of electrons near the spacecraft, the Alice instrument measured the ultraviolet light emitted by the aurora, and the Microwave Instrument for the Rosetta Orbiter (MIRO) measured the amount of water molecules around the comet (the MIRO instrument includes contributions from France, Germany, and Taiwan). Other instruments aboard the spacecraft used in the research were the Italian Space Agency’s Visible and InfraRed Thermal Imaging Spectrometer (VIRTIS), the Langmuir Probe (LAP) provided by Sweden, and the Rosetta Orbiter Spectrometer for Ion and Neutral Analysis (ROSINA) provided by Switzerland.

Rosetta was an ESA mission with contributions from its member states and NASA. Rosetta’s Philae lander, which successfully landed on the comet in November 2014, was provided by a consortium led by the German Aerospace Center in Cologne; the Max Planck Institute for Solar System Research in Göttingen, Germany; the French National Space Agency in Paris; and the Italian Space Agency in Rome. NASA’s Jet Propulsion Laboratory in Southern California, a division of Caltech, managed the U.S. contribution to the Rosetta mission for NASA’s Science Mission Directorate in Washington. JPL also built MIRO and hosts its principal investigator, Mark Hofstadter. The Southwest Research Institute (San Antonio and Boulder, Colorado) developed the Rosetta orbiter’s IES and Alice instruments and hosts their principal investigators, James Burch (IES) and Joel Parker (Alice).

For more information on the U.S. instruments aboard Rosetta, visit: http://rosetta.jpl.nasa.gov

More information about Rosetta is available at: http://www.esa.int/rosetta

Story Source:

Materials provided by NASA/Jet Propulsion Laboratory. Note: Content may be edited for style and length.


Hubble captures crisp new portrait of Jupiter’s storms

The latest image of Jupiter, taken by NASA’s Hubble Space Telescope on Aug. 25, 2020, was captured when the planet was 406 million miles from Earth. Hubble’s sharp view is giving researchers an updated weather report on the monster planet’s turbulent atmosphere, including a remarkable new storm brewing, and a cousin of the famous Great Red Spot region gearing up to change color — again.
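A quick check of the scale involved: at the stated distance, Hubble’s snapshot shows Jupiter as it appeared roughly 36 minutes earlier, because that is how long light takes to cross 406 million miles. A back-of-envelope sketch (the distance is from the article; the speed of light is a physical constant):

```python
# Light travel time from Jupiter at the distance stated in the article.
SPEED_OF_LIGHT_MI_S = 186_282  # speed of light, miles per second
distance_miles = 406e6         # Earth-Jupiter distance on Aug. 25, 2020

travel_seconds = distance_miles / SPEED_OF_LIGHT_MI_S
travel_minutes = travel_seconds / 60
print(f"Light travel time: {travel_minutes:.1f} minutes")  # about 36 minutes
```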

A unique and exciting detail of Hubble’s snapshot appears at mid-northern latitudes as a bright, white, stretched-out storm traveling around the planet at 350 miles per hour (560 kilometers per hour). This single plume erupted on Aug. 18, 2020 — and ground-based observers have discovered two more that appeared later at the same latitude.

While it’s common for storms to pop up in this region every six years or so, often with multiple storms at once, the timing of the Hubble observations is perfect for showing the structure in the wake of the disturbance, during the early stages of its evolution. Trailing behind the plume are small, rounded features with complex “red, white, and blue” colors in Hubble’s ultraviolet, visible, and near-infrared light image. Such discrete features typically dissipate on Jupiter, leaving behind only changes in cloud colors and wind speeds, but a similar storm on Saturn led to a long-lasting vortex. The differences in the aftermaths of Jupiter and Saturn storms may be related to the contrasting water abundances in their atmospheres, since water vapor may govern the massive amount of stored-up energy that can be released by these storm eruptions.

Hubble shows that the Great Red Spot, rolling counterclockwise in the planet’s southern hemisphere, is plowing into the clouds ahead of it, forming a cascade of white and beige ribbons. The Great Red Spot is currently an exceptionally rich red color, with its core and outermost band appearing deeper red.

Researchers say the Great Red Spot now measures about 9,800 miles across, big enough to swallow Earth. The super-storm is still shrinking as noted in telescopic observations dating back to 1930, but the reason for its dwindling size is a complete mystery.

Another feature researchers have noticed changing is Oval BA, nicknamed Red Spot Jr. by astronomers, which appears just below the Great Red Spot in this image. For the past few years, Red Spot Jr. has been fading back to its original shade of white after appearing red in 2006. However, the core of this storm now appears to be darkening slightly. This could hint that Red Spot Jr. is on its way to turning a color more similar to its cousin’s once again.

Hubble’s image shows that Jupiter is clearing out its higher altitude white clouds, especially along the planet’s equator, where an orangish hydrocarbon smog wraps around it.

The icy moon Europa, thought to hold potential ingredients for life, is visible to the left of the gas giant.

This Hubble image is part of yearly maps of the entire planet taken as part of the Outer Planets Atmospheres Legacy program, or OPAL. The program provides annual Hubble global views of the outer planets to look for changes in their storms, winds, and clouds.

The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA’s Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy in Washington, D.C.

Story Source:

Materials provided by NASA/Goddard Space Flight Center. Note: Content may be edited for style and length.


Enormous planet quickly orbiting a tiny, dying star

Thanks to a bevy of telescopes in space and on Earth — and even a pair of amateur astronomers in Arizona — a University of Wisconsin-Madison astronomer and his colleagues have discovered a Jupiter-sized planet orbiting at breakneck speed around a distant white dwarf star. The system, about 80 light years away, violates all common conventions about stars and planets.

The white dwarf is the remnant of a sun-like star, greatly shrunken down to roughly the size of Earth, yet it retains half the sun’s mass. The massive planet looms over its tiny star, which it circles every 34 hours thanks to an incredibly close orbit. In contrast, Mercury takes a comparatively lethargic 90 days to orbit the sun.

While there have been hints of large planets orbiting close to white dwarfs in the past, the new findings are the clearest evidence yet that these bizarre pairings exist. That confirmation highlights the diverse ways stellar systems can evolve and may give a glimpse at our own solar system’s fate. Such a white dwarf system could even provide a rare habitable arrangement for life to arise in the light of a dying star.
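As a rough consistency check, Kepler’s third law applied to the figures quoted here (a star with half the Sun’s mass, a 34-hour period) puts the planet only about 0.02 astronomical units from its star, some twenty times closer in than Mercury. This is a back-of-envelope sketch using the article’s numbers, not the paper’s own calculation:

```python
import math

# Kepler's third law for a circular orbit: a^3 = G * M * P^2 / (4 * pi^2)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m

M = 0.5 * M_SUN    # white dwarf retains half the Sun's mass (from the article)
P = 34 * 3600      # 34-hour orbital period, in seconds (from the article)

a = (G * M * P**2 / (4 * math.pi**2)) ** (1 / 3)  # semi-major axis, m
print(f"Semi-major axis: {a / AU:.3f} AU")        # roughly 0.02 AU
```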

“We’ve never seen evidence before of a planet coming in so close to a white dwarf and surviving. It’s a pleasant surprise,” says lead researcher Andrew Vanderburg, who recently joined the UW-Madison astronomy department as an assistant professor. Vanderburg completed the work while an independent NASA Sagan Fellow at the University of Texas at Austin.

The researchers published their findings Sept. 16 in the journal Nature. Vanderburg led a large, international collaboration of astronomers who analyzed the data. The contributing telescopes included NASA’s exoplanet-hunting telescope TESS and two large ground-based telescopes in the Canary Islands.

Vanderburg was originally drawn to studying white dwarfs — the remains of sun-sized stars after they exhaust their nuclear fuel — and their planets by accident. While in graduate school, he was reviewing data from TESS’s predecessor, the Kepler space telescope, and noticed a white dwarf with a cloud of debris around it.

“What we ended up finding was that this was a minor planet or asteroid that was being ripped apart as we watched, which was really cool,” says Vanderburg. The planet had been destroyed by the star’s gravity after its transition to a white dwarf caused the planet’s orbit to fall in toward the star.

Ever since, Vanderburg has wondered if planets, especially large ones, could survive the journey in toward an aging star.

By scanning TESS data on thousands of white dwarf systems, the researchers spotted a star whose brightness dimmed by half about every one-and-a-half days, a sign that something big was passing in front of the star on a tight, lightning-fast orbit. But the data were hard to interpret because the glare from a nearby star was interfering with TESS’s measurements. To overcome this obstacle, the astronomers supplemented the TESS data with observations from higher-resolution ground-based telescopes, including three run by amateur astronomers.

“Once the glare was under control, in one night, they got much nicer and much cleaner data than we got with a month of observations from space,” says Vanderburg. Because white dwarfs are so much smaller than normal stars, large planets passing in front of them block a lot of the star’s light, making detection by ground-based telescopes much simpler.

The data revealed that a planet roughly the size of Jupiter, perhaps a little larger, was orbiting very close to its star. Vanderburg’s team believes the gas giant started off much farther from the star and moved into its current orbit after the star evolved into a white dwarf.

The question became: how did this planet avoid being torn apart during the upheaval? Previous models of white dwarf-planet interactions didn’t seem to line up for this particular star system.

The researchers ran new simulations that provided a potential answer to the mystery. When the star ran out of fuel, it expanded into a red giant, engulfing any nearby planets and destabilizing the Jupiter-sized planet that orbited farther away. That caused the planet to take on an exaggerated, oval orbit that passed very close to the now-shrunken white dwarf but also flung the planet very far away at the orbit’s apex.

Over eons, the gravitational interaction between the white dwarf and its planet slowly dispersed energy, ultimately guiding the planet into a tight, circular orbit that takes just one-and-a-half days to complete. That process takes time — billions of years. This particular white dwarf is one of the oldest observed by the TESS telescope at almost 6 billion years old, plenty of time to slow down its massive planet partner.

While white dwarfs no longer conduct nuclear fusion, they still release light and heat as they cool down. It’s possible that a planet close enough to such a dying star would find itself in the habitable zone, the region near a star where liquid water can exist, presumed to be required for life to arise and survive.

Now that research has confirmed these systems exist, they offer a tantalizing opportunity for searching for other forms of life. The unique structure of white dwarf-planet systems provides an ideal opportunity to study the chemical signatures of orbiting planets’ atmospheres, a potential way to search for signs of life from afar.

“I think the most exciting part of this work is what it means for both habitability in general — can there be hospitable regions in these dead solar systems — and also our ability to find evidence of that habitability,” says Vanderburg.


Carbon-rich exoplanets may be made of diamonds

As missions like NASA’s Hubble Space Telescope, TESS and Kepler continue to provide insights into the properties of exoplanets (planets around other stars), scientists are increasingly able to piece together what these planets look like, what they are made of, and if they could be habitable or even inhabited.

In a new study published recently in The Planetary Science Journal, a team of researchers from Arizona State University (ASU) and the University of Chicago has determined that some carbon-rich exoplanets, given the right circumstances, could be made of diamonds and silica.

“These exoplanets are unlike anything in our solar system,” says lead author Harrison Allen-Sutter of ASU’s School of Earth and Space Exploration.

Diamond exoplanet formation

When stars and planets form, they do so from the same cloud of gas, so their bulk compositions are similar. A star with a lower carbon-to-oxygen ratio will have planets like Earth, composed of silicates and oxides with a very small diamond content (Earth’s diamond content is about 0.001%).

But exoplanets around stars with a higher carbon-to-oxygen ratio than our sun are more likely to be carbon-rich. Allen-Sutter and co-authors Emily Garhart, Kurt Leinenweber and Dan Shim of ASU, with Vitali Prakapenka and Eran Greenberg of the University of Chicago, hypothesized that these carbon-rich exoplanets could convert to diamond and silicate if water (which is abundant in the universe) were present, creating a diamond-rich composition.

Diamond-anvils and X-rays

To test this hypothesis, the research team needed to mimic the interior of carbide exoplanets using high heat and high pressure. To do so, they used high pressure diamond-anvil cells at co-author Shim’s Lab for Earth and Planetary Materials.

First, they immersed silicon carbide in water and compressed the sample between diamonds to a very high pressure. Then, to monitor the reaction between silicon carbide and water, they conducted laser heating at the Argonne National Laboratory in Illinois, taking X-ray measurements while the laser heated the sample at high pressures.

As they predicted, with high heat and pressure, the silicon carbide reacted with water and turned into diamonds and silica.

Habitability and inhabitability

So far, we have not found life on other planets, but the search continues. Planetary scientists and astrobiologists are using sophisticated instruments in space and on Earth to find planets with the right properties and the right location around their stars where life could exist.

The carbon-rich planets that are the focus of this study, however, likely do not have the properties needed for life.

While Earth is geologically active (an indicator of habitability), the results of this study show that carbon-rich planets are too hard to be geologically active, and this lack of geologic activity may render their atmospheres uninhabitable. Atmospheres are critical for life, as they provide air to breathe, protection from the harsh environment of space, and even the pressure needed for liquid water.

“Regardless of habitability, this is one additional step in helping us understand and characterize our ever-increasing and improving observations of exoplanets,” says Allen-Sutter. “The more we learn, the better we’ll be able to interpret new data from upcoming missions like the James Webb Space Telescope and the Nancy Grace Roman Space Telescope to understand the worlds beyond our own solar system.”


Ingredient missing from current dark matter theories

Observations by the NASA/ESA Hubble Space Telescope and the European Southern Observatory’s Very Large Telescope (VLT) in Chile have found that something may be missing from the theories of how dark matter behaves. This missing ingredient may explain why researchers have uncovered an unexpected discrepancy between observations of the dark matter concentrations in a sample of massive galaxy clusters and theoretical computer simulations of how dark matter should be distributed in clusters. The new findings indicate that some small-scale concentrations of dark matter produce lensing effects that are 10 times stronger than expected.

Dark matter is the invisible glue that keeps stars, dust, and gas together in a galaxy. This mysterious substance makes up the bulk of a galaxy’s mass and forms the foundation of our Universe’s large-scale structure. Because dark matter does not emit, absorb, or reflect light, its presence is only known through its gravitational pull on visible matter in space. Astronomers and physicists are still trying to pin down what it is.

Galaxy clusters, the most massive and recently assembled structures in the Universe, are also the largest repositories of dark matter. Clusters are composed of individual member galaxies that are held together largely by the gravity of dark matter.

“Galaxy clusters are ideal laboratories in which to study whether the numerical simulations of the Universe that are currently available reproduce well what we can infer from gravitational lensing,” said Massimo Meneghetti of the INAF-Observatory of Astrophysics and Space Science of Bologna in Italy, the study’s lead author.

“We have done a lot of testing of the data in this study, and we are sure that this mismatch indicates that some physical ingredient is missing either from the simulations or from our understanding of the nature of dark matter,” added Meneghetti.

“There’s a feature of the real Universe that we are simply not capturing in our current theoretical models,” added Priyamvada Natarajan of Yale University in Connecticut, USA, one of the senior theorists on the team. “This could signal a gap in our current understanding of the nature of dark matter and its properties, as these exquisite data have permitted us to probe the detailed distribution of dark matter on the smallest scales.”

The distribution of dark matter in clusters is mapped by measuring the bending of light — the gravitational lensing effect — that they produce. The gravity of dark matter concentrated in clusters magnifies and warps light from distant background objects. This effect produces distortions in the shapes of background galaxies which appear in images of the clusters. Gravitational lensing can often also produce multiple images of the same distant galaxy.

The higher the concentration of dark matter in a cluster, the more dramatic its light-bending effect. The presence of smaller-scale clumps of dark matter associated with individual cluster galaxies enhances the level of distortions. In some sense, the galaxy cluster acts as a large-scale lens that has many smaller lenses embedded within it.
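The characteristic angular scale of such a cluster-sized lens can be estimated with the standard point-mass Einstein-radius formula. The sketch below uses illustrative values (a 10^15-solar-mass cluster halfway to a source at 2 gigaparsecs, with a simple flat-space distance approximation), not figures from the study, and recovers the tens-of-arcsecond arcs typical of cluster lensing:

```python
import math

# Einstein radius of a point-mass lens: theta_E = sqrt(4GM/c^2 * D_ls / (D_l * D_s))
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
GPC = 3.086e25     # meters per gigaparsec

M = 1e15 * M_SUN            # illustrative mass of a massive galaxy cluster
D_l, D_s = 1 * GPC, 2 * GPC # lens and source distances (illustrative)
D_ls = D_s - D_l            # lens-source distance (flat-space approximation)

theta_e = math.sqrt(4 * G * M / C**2 * D_ls / (D_l * D_s))  # radians
arcsec = math.degrees(theta_e) * 3600
print(f"Einstein radius: {arcsec:.0f} arcseconds")  # tens of arcseconds
```

In a real cluster the mass is extended rather than point-like, but the order of magnitude explains why whole-cluster arcs are dramatic while the nested galaxy-scale lenses produce much smaller distortions.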

Hubble’s crisp images were taken by the telescope’s Wide Field Camera 3 and Advanced Camera for Surveys. By coupling them with spectra from the European Southern Observatory’s Very Large Telescope (VLT), the team produced an accurate, high-fidelity dark-matter map. By measuring the lensing distortions, astronomers could trace out the amount and distribution of dark matter. The three key galaxy clusters, MACS J1206.2-0847, MACS J0416.1-2403, and Abell S1063, were part of two Hubble surveys: the Frontier Fields and the Cluster Lensing And Supernova survey with Hubble (CLASH) programs.

To the team’s surprise, in addition to the dramatic arcs and elongated features of distant galaxies produced by each cluster’s gravitational lensing, the Hubble images also revealed an unexpected number of smaller-scale arcs and distorted images nested near each cluster’s core, where the most massive galaxies reside. The researchers believe the nested lenses are produced by the gravity of dense concentrations of matter inside the individual cluster galaxies. Follow-up spectroscopic observations measured the velocity of the stars orbiting inside several of the cluster galaxies to thereby pin down their masses.

“The data from Hubble and the VLT provided excellent synergy,” shared team member Piero Rosati of the Università degli Studi di Ferrara in Italy, who led the spectroscopic campaign. “We were able to associate the galaxies with each cluster and estimate their distances.”

“The speed of the stars gave us an estimate of each individual galaxy’s mass, including the amount of dark matter,” added team member Pietro Bergamini of the INAF-Observatory of Astrophysics and Space Science in Bologna, Italy.

By combining Hubble imaging and VLT spectroscopy, the astronomers were able to identify dozens of multiply imaged, lensed, background galaxies. This allowed them to assemble a well-calibrated, high-resolution map of the mass distribution of dark matter in each cluster.

The team compared the dark-matter maps with samples of simulated galaxy clusters with similar masses, located at roughly the same distances. The simulated clusters did not show the same level of dark-matter concentration on the smallest scales — the scales associated with individual cluster galaxies.

“The results of these analyses further demonstrate how observations and numerical simulations go hand in hand,” said team member Elena Rasia of the INAF-Astronomical Observatory of Trieste, Italy.

“With high-resolution simulations, we can match the quality of observations analysed in our paper, permitting detailed comparisons like never before,” added Stefano Borgani of the Università degli Studi di Trieste, Italy.

Astronomers, including those of this team, look forward to continuing to probe dark matter and its mysteries in order to finally pin down its nature.


Study rules out dark matter destruction as origin of extra radiation in galaxy center

The detection more than a decade ago by the Fermi Gamma Ray Space Telescope of an excess of high-energy radiation in the center of the Milky Way convinced some physicists that they were seeing evidence of the annihilation of dark matter particles, but a team led by researchers at the University of California, Irvine has ruled out that interpretation.

In a paper published recently in the journal Physical Review D, the UCI scientists and colleagues at Virginia Polytechnic Institute and State University and other institutions report that — through an analysis of the Fermi data and an exhaustive series of modeling exercises — they were able to determine that the observed gamma rays could not have been produced by what are called weakly interacting massive particles, most popularly theorized as the stuff of dark matter.

By eliminating these particles, whose annihilation could generate energies of up to 300 giga-electron volts, the paper’s authors say they have put the strongest constraints yet on dark matter properties.

“For 40 years or so, the leading candidate for dark matter among particle physicists was a thermal, weakly interacting and weak-scale particle, and this result for the first time rules out that candidate up to very high-mass particles,” said co-author Kevork Abazajian, UCI professor of physics & astronomy.

“In many models, this particle ranges from 10 to 1,000 times the mass of a proton, with more massive particles being less attractive theoretically as a dark matter particle,” added co-author Manoj Kaplinghat, also a UCI professor of physics & astronomy. “In this paper, we’re eliminating dark matter candidates over the favored range, which is a huge improvement in the constraints we put on the possibilities that these are representative of dark matter.”

Abazajian said that dark matter signals could be crowded out by other astrophysical phenomena in the Galactic Center that also produce excess gamma rays detectable by the Fermi space telescope, such as star formation, cosmic ray deflection off molecular gas and, most notably, neutron stars and millisecond pulsars.

“We looked at all of the different modeling that goes on in the Galactic Center, including molecular gas, stellar emissions and high-energy electrons that scatter low-energy photons,” said co-author Oscar Macias, a postdoctoral scholar in physics and astronomy at the Kavli Institute for the Physics and Mathematics of the Universe at the University of Tokyo whose visit to UCI in 2017 initiated this project. “We took over three years to pull all of these new, better models together and examine the emissions, finding that there is little room left for dark matter.”

Macias, who is also a postdoctoral researcher with the GRAPPA Centre at the University of Amsterdam, added that this result would not have been possible without data and software provided by the Fermi Large Area Telescope collaboration.

The group tested all classes of models used in the Galactic Center region for excess emission analyses, and its conclusions remained unchanged. “One would have to craft a diffuse emission model that leaves a big ‘hole’ in them to relax our constraints, and science doesn’t work that way,” Macias said.

Kaplinghat noted that physicists have predicted that radiation from dark matter annihilation would be represented in a neat spherical or elliptical shape emanating from the Galactic Center, but the gamma ray excess detected by the Fermi space telescope after its June 2008 deployment shows up as a triaxial, bar-like structure.

“If you peer at the Galactic Center, you see that the stars are distributed in a boxy way,” he said. “There’s a disk of stars, and right in the center, there’s a bulge that’s about 10 degrees on the sky, and it’s actually a very specific shape — sort of an asymmetric box — and this shape leaves very little room for additional dark matter.”

Does this research rule out the existence of dark matter in the galaxy? “No,” Kaplinghat said. “Our study constrains the kind of particle that dark matter could be. The multiple lines of evidence for dark matter in the galaxy are robust and unaffected by our work.”

Far from considering the team’s findings to be discouraging, Abazajian said they should encourage physicists to focus on concepts other than the most popular ones.

“There are a lot of alternative dark matter candidates out there,” he said. “The search is going to be more like a fishing expedition where you don’t already know where the fish are.”

Also contributing to this research project — which was supported by the National Science Foundation, the U.S. Department of Energy Office of Science and Japan’s World Premier International Research Center Initiative — were Ryan Keeley, who earned a Ph.D. in physics & astronomy at UCI in 2018 and is now at the Korea Astronomy and Space Science Institute, and Shunsaku Horiuchi, a former UCI postdoctoral scholar in physics & astronomy who is now an assistant professor of physics at Virginia Tech.


Sleep duration, efficiency and structure change in space

It’s hard to get a good night’s sleep in space. An evaluation of astronauts serving on the Mir space station found that they experienced shorter sleep durations, more wakefulness, and changes in the structure of their sleep cycles while in microgravity.

Researchers at Harvard College, Harvard Medical School, and NASA Ames Research Center studied the sleep patterns of four cosmonauts and one astronaut before, during, and after their missions aboard the space station. Preliminary results show that they slept an average of only 5.7 hours in space, compared with 6.7 hours on Earth. They also spent significantly more time awake in bed, leading to a 17.7% reduction in sleep efficiency.

In space, their time in non-REM and REM sleep decreased by 14.1% and 25.8%, respectively. On average, it also took about 90 minutes after falling asleep for astronauts to reach their first episode of REM sleep in space, nearly 1.5 times longer than on Earth. In contrast, most sleep measures were stable across the inflight phase, with the exception of a decrease in the amount of time spent in bed and an increase in the length of time it took to fall asleep after going to bed.
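The headline figures quoted above imply, for instance, a roughly 15% relative drop in nightly sleep and an Earth-side REM latency of about an hour (since 90 minutes is described as nearly 1.5 times the terrestrial value). A small sketch using only the numbers quoted in the article:

```python
# All inputs are the figures quoted in the article.
sleep_earth_hours = 6.7
sleep_space_hours = 5.7
rem_latency_space_min = 90   # minutes to first REM episode in space
rem_latency_ratio = 1.5      # "nearly 1.5 times longer than on Earth"

drop_pct = (sleep_earth_hours - sleep_space_hours) / sleep_earth_hours * 100
implied_earth_latency = rem_latency_space_min / rem_latency_ratio

print(f"Relative drop in sleep duration: {drop_pct:.1f}%")               # ~14.9%
print(f"Implied REM latency on Earth: {implied_earth_latency:.0f} min")  # ~60 min
```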

“There were marked shifts in sleep architecture compared to baseline, and some of these evolved over the course of the mission,” said lead author Oliver Piltch, an undergraduate researcher at Harvard College. “Our findings were consistent with previous studies that focus on the issue of sleep continuity. We found significant decreases in sleep efficiency during spaceflight despite similar times in bed.”

Piltch said scientists need to understand how sleep is affected by spaceflight to better equip astronauts for success on long-duration flights, like a trip to Mars or the Moon. He noted that the research also has implications for sleep on Earth.

“The significant sleep changes induced by the extreme environmental conditions of spaceflight can magnify and help reveal similar, though potentially less noticeable, changes that are induced by the more moderate conditions of Earth,” he said. “Our results support other studies indicating that sleep architecture can adapt to different environments. Also, the sleep deficits that our subjects were facing while working around the clock in a high-pressure environment provide further evidence for the danger of stress and shift-work schedules for humans anywhere.”

Statistical analyses of the research were guided by Erin Flynn-Evans, PhD, director of the NASA Fatigue Countermeasures Laboratory. The experiment was designed and led by Robert Stickgold, PhD, director of the Sleep and Cognition Lab at Beth Israel Deaconess Medical Center and professor of psychiatry at Harvard Medical School, alongside Dr. J. Allan Hobson, professor emeritus of psychiatry.

Story Source:

Materials provided by American Academy of Sleep Medicine. Note: Content may be edited for style and length.


Hubble snaps close-up of celebrity comet NEOWISE

NASA Hubble Space Telescope images of comet NEOWISE, taken on Aug. 8, zero in on the visitor’s coma, the gossamer shell of gas and dust that surrounds its nucleus as it is heated by the Sun. This is the first time Hubble has photographed a comet of this brightness at such resolution after such a close pass by the Sun.

The comet photos were taken after NEOWISE skimmed closest to the Sun on July 3, 2020, at a distance of 27 million miles (43 million kilometers). Other comets often break apart due to thermal and gravitational stresses at such close encounters, but Hubble’s view shows that apparently NEOWISE’s solid nucleus stayed intact.

“Hubble has far better resolution than we can get with any other telescope of this comet,” said lead researcher Qicheng Zhang of Caltech in Pasadena, California. “That resolution is very key for seeing details very close to the nucleus. It lets us see changes in the dust right after it’s stripped from that nucleus due to solar heat, sampling dust as close to the original properties of the comet as possible.”

The heart of the comet, its icy nucleus, is too small to be seen by Hubble. The ball of ice may be no more than 3 miles (4.8 kilometers) across. Instead, the Hubble image captures a portion of the vast cloud of gas and dust enveloping the nucleus, which measures about 11,000 miles (18,000 kilometers) across in this photo. Hubble resolves a pair of jets from the nucleus shooting out in opposite directions. They emerge from the nucleus as cones of dust and gas, and then are curved into broader fan-like structures by the rotation of the nucleus. Jets result from ice sublimating beneath the surface, with the escaping gas and dust squeezed out at high velocity.

The Hubble photos may help reveal the color of the comet’s dust and how those colors change as the comet moves away from the Sun. This, in turn, may explain how solar heat affects the composition and structure of that dust in the comet’s coma. The ultimate goal would be to determine the original properties of the dust, and thereby learn more about the conditions of the early solar system in which it formed.

Comet NEOWISE is considered the brightest comet visible from the Northern Hemisphere since 1997’s Hale-Bopp. It’s headed beyond the outer solar system, now traveling at a whopping 144,000 miles per hour. It will not return to the Sun for nearly another 7,000 years.

Researchers are continuing to analyze the Hubble data to see what more they can confirm about the comet.

NASA’s Near-Earth Object Wide-field Infrared Survey Explorer (NEOWISE) mission first discovered its namesake comet in March 2020. As the comet made its way closer to the Sun, searing heat vaporized its ices, unleashing the dust and gas that form its signature tails. Throughout the summer, ground-based sky watchers in the Northern Hemisphere were able to catch a view of the traveler moving across the sky.

The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA’s Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy in Washington, D.C.

Story Source:

Materials provided by NASA/Goddard Space Flight Center. Note: Content may be edited for style and length.


Unveiling rogue planets with NASA’s Roman Space Telescope

New simulations show that NASA’s Nancy Grace Roman Space Telescope will be able to reveal myriad rogue planets — freely floating bodies that drift through our galaxy untethered to a star. Studying these island worlds will help us understand more about how planetary systems form, evolve, and break apart.

Astronomers discovered planets beyond our solar system, known as exoplanets, in the 1990s. We quickly went from knowing of only our own planetary system to realizing that planets likely outnumber the hundreds of billions of stars in our galaxy. Now, a team of scientists is finding ways to improve our understanding of planet demographics by searching for rogue worlds.

“As our view of the universe has expanded, we’ve realized that our solar system may be unusual,” said Samson Johnson, a graduate student at Ohio State University in Columbus who led the research effort. “Roman will help us learn more about how we fit in the cosmic scheme of things by studying rogue planets.”

The findings, published in the Astronomical Journal, center on the Roman Space Telescope’s ability to locate and characterize isolated planets. Astronomers have only tentatively discovered a few of these nomad worlds so far because they are so difficult to detect.

Finding galactic nomads

Roman will find rogue planets by conducting a large microlensing survey. Gravitational lensing is an observational effect that occurs because the presence of mass warps the fabric of space-time. The effect is extreme around very massive objects, like black holes and entire galaxies. Even solitary planets warp space-time enough to produce a measurable effect; lensing by such small masses is called microlensing.

If a rogue planet aligns closely with a more distant star from our vantage point, the star’s light will bend as it travels through the curved space-time around the planet. The result is that the planet acts like a natural magnifying glass, amplifying light from the background star. Astronomers see the effect as a spike in the star’s brightness as the star and planet come into alignment. Measuring how the spike changes over time reveals clues to the rogue planet’s mass.
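The brightness spike described above is commonly modeled with the standard point-lens (Paczyński) magnification formula. The sketch below is a generic illustration of that formula, not code from the Roman study; the impact parameter and time grid are assumed values for the example:

```python
import math

def magnification(u):
    """Point-lens magnification for a lens-source separation u,
    measured in units of the lens's Einstein radius."""
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

# Separation shrinks to a closest approach u0, then grows again as the
# planet drifts past the background star (rectilinear relative motion).
u0 = 0.3  # impact parameter, assumed for illustration
for t in (-2.0, -1.0, 0.0, 1.0, 2.0):       # time in Einstein-crossing times
    u = math.sqrt(u0 * u0 + t * t)
    print(f"t={t:+.1f}  magnification={magnification(u):.2f}")
```

The printed values rise to a peak at closest approach and fall symmetrically afterward, which is the characteristic spike astronomers look for; how quickly it rises and falls encodes the lens mass.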

“The microlensing signal from a rogue planet only lasts between a few hours and a couple of days and then is gone forever,” said co-author Matthew Penny, an assistant professor of physics and astronomy at Louisiana State University in Baton Rouge. “This makes them difficult to observe from Earth, even with multiple telescopes. Roman is a game-changer for rogue planet searches.”

Microlensing offers the best way to systematically search for rogue planets — especially those with low masses. They don’t shine like stars and are often very cool objects, emitting too little heat for infrared telescopes to see. These vagabond worlds are essentially invisible, but Roman will discover them indirectly thanks to their gravitational effects on the light of more distant stars.

Lessons from cosmic castaways

Johnson and co-authors showed that Roman will be able to detect rogue planets with masses as small as Mars. Studying these planets will help narrow down competing models of planetary formation.

The planet-building process can be chaotic, since smaller objects collide with one another and sometimes stick together to form larger bodies. It’s similar to using a piece of playdough to pick up other pieces. But occasionally collisions and close encounters can be so violent that they fling a planet out of the gravitational grip of its parent star. Unless it manages to drag a moon along with it, the newly orphaned world is doomed to wander the galaxy alone.

Rogue planets may also form in isolation from clouds of gas and dust, similar to how stars grow. A small cloud of gas and dust could collapse to form a central planet instead of a star, with moons instead of planets surrounding it.

Roman will test planetary formation and evolution models that predict different numbers of these isolated worlds. Determining the abundance and masses of rogue planets will offer insight into the physics that drives their formation. The research team found that the mission will provide a rogue planet count that is at least 10 times more precise than current estimates, which range from tens of billions to trillions in our galaxy. These estimates mainly come from observations by ground-based telescopes.

Since Roman will observe above the atmosphere, nearly a million miles away from Earth in the direction opposite the Sun, it will yield far superior microlensing results. In addition to providing a sharper view, Roman’s perspective will allow it to stare at the same patch of sky continuously for months at a time. Johnson and his colleagues showed that Roman’s microlensing survey will detect hundreds of rogue planets, even though it will search only a relatively narrow strip of the galaxy.

Part of the study involved determining how to analyze the mission’s future data to obtain a more accurate census. Scientists will be able to extrapolate from Roman’s rogue planet count to estimate how common these objects are throughout the entire galaxy.

“The universe could be teeming with rogue planets and we wouldn’t even know it,” said Scott Gaudi, a professor of astronomy at Ohio State University and a co-author of the paper. “We would never find out without undertaking a thorough, space-based microlensing survey like Roman is going to do.”

The Nancy Grace Roman Space Telescope is managed at Goddard, with participation by NASA’s Jet Propulsion Laboratory and Caltech/IPAC in Pasadena, the Space Telescope Science Institute in Baltimore, and a science team comprising scientists from research institutions across the United States.


Deep learning will help future Mars rovers go farther, faster, and do more science

NASA’s Mars rovers have been one of the great scientific and space successes of the past two decades.

Four generations of rovers have traversed the red planet gathering scientific data, sending back evocative photographs, and surviving incredibly harsh conditions — all using on-board computers less powerful than an iPhone 1. The latest rover, Perseverance, was launched on July 30, 2020, and engineers are already dreaming of a future generation of rovers.

While a major achievement, these missions have only scratched the surface (literally and figuratively) of the planet and its geology, geography, and atmosphere.

“The surface area of Mars is approximately the same as the total area of the land on Earth,” said Masahiro (Hiro) Ono, group lead of the Robotic Surface Mobility Group at the NASA Jet Propulsion Laboratory (JPL) — which has led all the Mars rover missions — and one of the researchers who developed the software that allows the current rover to operate.

“Imagine, you’re an alien and you know almost nothing about Earth, and you land on seven or eight points on Earth and drive a few hundred kilometers. Does that alien species know enough about Earth?” Ono asked. “No. If we want to represent the huge diversity of Mars we’ll need more measurements on the ground, and the key is substantially extended distance, hopefully covering thousands of miles.”

Travelling across Mars’ diverse, treacherous terrain with limited computing power and a restricted energy diet — only as much sun as the rover can capture and convert to power in a single Martian day, or sol — is a huge challenge.

The first rover, Sojourner, covered 330 feet over 91 sols; the second, Spirit, travelled 4.8 miles in about five years; Opportunity travelled 28 miles over 15 years; and Curiosity has travelled more than 12 miles since it landed in 2012.

“Our team is working on Mars robot autonomy to make future rovers more intelligent, to enhance safety, to improve productivity, and in particular to drive faster and farther,” Ono said.

NEW HARDWARE, NEW POSSIBILITIES

The Perseverance rover, which launched this summer, computes using RAD 750s — radiation-hardened single board computers manufactured by BAE Systems Electronics.

Future missions, however, would potentially use new high-performance, multi-core radiation hardened processors designed through the High Performance Spaceflight Computing (HPSC) project. (Qualcomm’s Snapdragon processor is also being tested for missions.) These chips will provide about one hundred times the computational capacity of current flight processors using the same amount of power.

“All of the autonomy that you see on our latest Mars rover is largely human-in-the-loop” — meaning it requires human interaction to operate, according to Chris Mattmann, the deputy chief technology and innovation officer at JPL. “Part of the reason for that is the limits of the processors that are running on them. One of the core missions for these new chips is to do deep learning and machine learning, like we do terrestrially, on board. What are the killer apps given that new computing environment?”

The Machine Learning-based Analytics for Autonomous Rover Systems (MAARS) program — which started three years ago and will conclude this year — encompasses a range of areas where artificial intelligence could be useful. The team presented results of the MAARS project at the IEEE Aerospace Conference in March 2020. The project was a finalist for the NASA Software Award.

“Terrestrial high performance computing has enabled incredible breakthroughs in autonomous vehicle navigation, machine learning, and data analysis for Earth-based applications,” the team wrote in their IEEE paper. “The main roadblock to a Mars exploration rollout of such advances is that the best computers are on Earth, while the most valuable data is located on Mars.”

Training machine learning models on the Maverick2 supercomputer at the Texas Advanced Computing Center (TACC), as well as on Amazon Web Services and JPL clusters, Ono, Mattmann and their team have been developing two novel capabilities for future Mars rovers, which they call Drive-By Science and Energy-Optimal Autonomous Navigation.

ENERGY-OPTIMAL AUTONOMOUS NAVIGATION

Ono was part of the team that wrote the on-board pathfinding software for Perseverance. Perseverance’s software includes some machine learning abilities, but the way it does pathfinding is still fairly naïve.

“We’d like future rovers to have a human-like ability to see and understand terrain,” Ono said. “For rovers, energy is very important. There’s no paved highway on Mars. The drivability varies substantially based on the terrain — for instance, beach versus bedrock. That is not currently considered. Coming up with a path with all of these constraints is complicated, but that’s the level of computation that we can handle with the HPSC or Snapdragon chips. But to do so we’re going to need to change the paradigm a little bit.”

Ono describes that new paradigm as “commanding by policy,” a middle ground between the human-dictated: “Go from A to B and do C,” and the purely autonomous: “Go do science.”

Commanding by policy involves pre-planning for a range of scenarios, and then allowing the rover to determine what conditions it is encountering and what it should do.

“We use a supercomputer on the ground, where we have infinite computational resources like those at TACC, to develop a plan where a policy is: if X, then do this; if Y, then do that,” Ono explained. “We’ll basically make a huge to-do list and send gigabytes of data to the rover, compressed into huge tables. Then we’ll use the increased power of the rover to de-compress the policy and execute it.”

The pre-planned list is generated using machine learning-derived optimizations. The on-board chip can then use those plans to perform inference: taking the inputs from its environment and plugging them into the pre-trained model. The inference tasks are computationally much easier and can be computed on a chip like those that may accompany future rovers to Mars.
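One way to picture commanding by policy is a precomputed state-to-action table: the expensive planning happens on the ground, and on board the rover only does a cheap lookup. This is a toy sketch of the idea with invented terrain classes and actions, not JPL's actual flight software:

```python
# Toy "commanding by policy" sketch: the policy is computed offline on a
# supercomputer and uplinked; on board, the rover just looks up its state.
# Terrain classes, battery states, and actions are invented for illustration.

def build_policy():
    """Ground-side: map every anticipated (terrain, battery) state to an action."""
    policy = {}
    for terrain in ("bedrock", "sand", "rocks"):
        for battery in ("high", "low"):
            if terrain == "sand" and battery == "low":
                policy[(terrain, battery)] = "stop_and_recharge"
            elif terrain == "rocks":
                policy[(terrain, battery)] = "drive_slow"
            else:
                policy[(terrain, battery)] = "drive_fast"
    return policy

def onboard_step(policy, observed_state):
    """Rover-side: cheap inference -- classify the terrain, then look it up."""
    return policy.get(observed_state, "wait_for_ground")  # fall back if unplanned

policy = build_policy()
print(onboard_step(policy, ("sand", "low")))      # stop_and_recharge
print(onboard_step(policy, ("bedrock", "high")))  # drive_fast
```

The design point this illustrates: the on-board step is a constant-time table lookup, so even a modest flight processor can execute a plan whose construction required ground-based supercomputing.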

“The rover has the flexibility of changing the plan on board instead of just sticking to a sequence of pre-planned options,” Ono said. “This is important in case something bad happens or it finds something interesting.”

DRIVE-BY SCIENCE

Current Mars missions typically use tens of images a sol from the rover to decide what to do the next day, according to Mattmann. “But what if in the future we could use one million image captions instead? That’s the core tenet of Drive-By Science,” he said. “If the rover can return text labels and captions that were scientifically validated, our mission team would have a lot more to go on.”

Mattmann and the team adapted Google’s Show and Tell software — a neural image caption generator first launched in 2014 — for the rover missions, the first non-Google application of the technology.

The algorithm takes in images and spits out human-readable captions. These include basic but critical information, like cardinality — how many rocks, and how far away? — and properties like the vein structure in outcrops near bedrock. “The types of science knowledge that we currently use images for to decide what’s interesting,” Mattmann said.

Over the past few years, planetary geologists have labeled and curated Mars-specific image annotations to train the model.

“We use the one million captions to find 100 more important things,” Mattmann said. “Using search and information retrieval capabilities, we can prioritize targets. Humans are still in the loop, but they’re getting much more information and are able to search it a lot faster.”
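A minimal sketch of the search-and-prioritize idea described above: score machine-generated captions against keywords a science team cares about and surface the top hits. The keywords and captions here are invented for illustration; the real pipeline relies on trained captioning models and curated scientific vocabularies:

```python
# Rank rover image captions by how many science keywords they mention.
SCIENCE_KEYWORDS = {"vein", "outcrop", "layered", "bedrock"}  # assumed examples

def prioritize(captions, keywords=SCIENCE_KEYWORDS, top_n=2):
    """Return the top_n captions, ordered by number of keyword matches."""
    def score(caption):
        return len(set(caption.lower().split()) & keywords)
    return sorted(captions, key=score, reverse=True)[:top_n]

captions = [
    "flat sandy terrain with small pebbles",
    "layered outcrop with visible vein structure near bedrock",
    "distant dunes under a hazy sky",
]
print(prioritize(captions))  # the vein/outcrop caption ranks first
```

Even this crude keyword overlap shows how a million captions could be winnowed to the hundred most scientifically promising targets for humans to review.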

Results of the team’s work appear in the September 2020 issue of Planetary and Space Science.

TACC’s supercomputers proved instrumental in helping the JPL team test the system. On Maverick2, the team trained, validated, and improved their model using 6,700 labels created by experts.

The ability to travel much farther would be a necessity for future Mars rovers. An example is the Sample Fetch Rover, proposed to be developed by the European Space Agency and launched in the late 2020s, whose main task will be to pick up samples cached by the Mars 2020 rover and carry them to a rendezvous site.

“Those rovers in a period of years would have to drive 10 times further than previous rovers to collect all the samples and to get them to a rendezvous site,” Mattmann said. “We’ll need to be smarter about the way we drive and use energy.”

Before the new models and algorithms are loaded onto a rover destined for space, they are tested on a dirt training ground next to JPL that serves as an Earth-based analogue for the surface of Mars.

The team developed a demonstration that shows an overhead map, streaming images collected by the rover, and the algorithms running live, with the rover performing terrain classification and captioning on board. They had hoped to finish testing the new system this spring, but COVID-19 shuttered the lab and delayed testing.

In the meantime, Ono and his team developed a citizen science app, AI4Mars, that allows the public to annotate more than 20,000 images taken by the Curiosity rover. These will be used to further train machine learning algorithms to identify and avoid hazardous terrains.

The public have generated 170,000 labels so far in less than three months. “People are excited. It’s an opportunity for people to help,” Ono said. “The labels that people create will help us make the rover safer.”

The efforts to develop a new AI-based paradigm for future autonomous missions can be applied not just to rovers but to any autonomous space mission, from orbiters to fly-bys to interstellar probes, Ono says.

“The combination of more powerful on-board computing power, pre-planned commands computed on high performance computers like those at TACC, and new algorithms has the potential to allow future rovers to travel much further and do more science.”
