Categories
ScienceDaily

Scientists precisely measure total amount of matter in the universe

A top goal in cosmology is to precisely measure the total amount of matter in the universe, a daunting exercise for even the most mathematically proficient. A team led by scientists at the University of California, Riverside, has now done just that.

Reporting in the Astrophysical Journal, the team determined that matter makes up 31% of the total amount of matter and energy in the universe, with the remainder consisting of dark energy.

“To put that amount of matter in context, if all the matter in the universe were spread out evenly across space, it would correspond to an average mass density equal to only about six hydrogen atoms per cubic meter,” said first author Mohamed Abdullah, a graduate student in the UCR Department of Physics and Astronomy. “However, since we know 80% of matter is actually dark matter, in reality, most of this matter consists not of hydrogen atoms but rather of a type of matter which cosmologists don’t yet understand.”
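
The quoted density can be sanity-checked with the standard critical-density formula, rho_c = 3H²/(8πG). A minimal sketch, assuming a Hubble constant of about 70 km/s/Mpc (a representative value chosen here for illustration, not taken from the paper):

```python
import math

H0 = 70 * 1000 / 3.086e22   # assumed Hubble constant, converted from km/s/Mpc to 1/s
G = 6.674e-11               # gravitational constant, m^3 kg^-1 s^-2
m_H = 1.674e-27             # mass of a hydrogen atom, kg

# Critical density of the universe: rho_c = 3 H^2 / (8 pi G)
rho_crit = 3 * H0**2 / (8 * math.pi * G)   # kg/m^3

# Expressed as hydrogen atoms per cubic metre
atoms_crit = rho_crit / m_H

# The matter component alone, using the paper's Omega_m = 0.31
atoms_matter = 0.31 * atoms_crit

print(f"critical density ~ {atoms_crit:.1f} H atoms per m^3")
print(f"matter density   ~ {atoms_matter:.1f} H atoms per m^3")
```

With these assumed constants the critical density of matter plus energy comes out at roughly five to six hydrogen atoms per cubic metre, with matter contributing 31% of that total.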

Abdullah explained that one well-proven technique for determining the total amount of matter in the universe is to compare the observed number and mass of galaxy clusters per unit volume with predictions from numerical simulations. Because present-day galaxy clusters have formed from matter that has collapsed over billions of years under its own gravity, the number of clusters observed at the present time is very sensitive to cosmological conditions and, in particular, the total amount of matter.

“A higher percentage of matter would result in more clusters,” Abdullah said. “The ‘Goldilocks’ challenge for our team was to measure the number of clusters and then determine which answer was ‘just right.’ But it is difficult to measure the mass of any galaxy cluster accurately because most of the matter is dark so we can’t see it with telescopes.”
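
The "Goldilocks" comparison described above can be sketched as a simple model fit: predict the cluster count for a grid of candidate matter fractions, then keep the value whose prediction best matches the observed count. All numbers below are hypothetical toy values, not the survey's data:

```python
# Toy model: predicted cluster abundance rises steeply with the matter
# fraction Omega_m (the cubic dependence here is illustrative only).
def predicted_count(omega_m, amplitude=26000.0):
    return amplitude * omega_m**3

observed_count = predicted_count(0.31)   # pretend survey result

# Scan candidate matter fractions and keep the "just right" match
candidates = [round(0.20 + 0.01 * i, 2) for i in range(21)]   # 0.20 .. 0.40
best = min(candidates, key=lambda om: abs(predicted_count(om) - observed_count))
print(f"best-fit matter fraction: {best}")   # -> 0.31
```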

To overcome this difficulty, the UCR-led team of astronomers first developed “GalWeight,” a cosmological tool to measure the mass of a galaxy cluster using the orbits of its member galaxies. The researchers then applied their tool to observations from the Sloan Digital Sky Survey (SDSS) to create “GalWCat19,” a publicly available catalog of galaxy clusters. Finally, they compared the number of clusters in their new catalog with simulations to determine the total amount of matter in the universe.

“We have succeeded in making one of the most precise measurements ever made using the galaxy cluster technique,” said coauthor Gillian Wilson, a professor of physics and astronomy at UCR in whose lab Abdullah works. “Moreover, this is the first use of the galaxy orbit technique which has obtained a value in agreement with those obtained by teams who used noncluster techniques such as cosmic microwave background anisotropies, baryon acoustic oscillations, Type Ia supernovae, or gravitational lensing.”

“A huge advantage of using our GalWeight galaxy orbit technique was that our team was able to determine a mass for each cluster individually rather than rely on more indirect, statistical methods,” said the third coauthor Anatoly Klypin, an expert in numerical simulations and cosmology.

By combining their measurement with those from the other teams that used different techniques, the UCR-led team was able to determine a best combined value, concluding that matter makes up 31.5±1.3% of the total amount of matter and energy in the universe.
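
Combining independent measurements in this way is conventionally done with an inverse-variance weighted mean. A sketch with hypothetical inputs (the individual values and uncertainties below are made up for illustration, not the actual published inputs):

```python
import math

# Hypothetical (measurement, 1-sigma uncertainty) pairs for the matter
# fraction, one per technique
measurements = [(0.310, 0.023), (0.317, 0.020), (0.315, 0.018)]

# Weight each measurement by the inverse of its variance
weights = [1.0 / sigma**2 for _, sigma in measurements]
combined = sum(w * x for (x, _), w in zip(measurements, weights)) / sum(weights)
combined_sigma = 1.0 / math.sqrt(sum(weights))

print(f"combined: {combined:.3f} +/- {combined_sigma:.3f}")
```

The combined uncertainty is smaller than any single input, which is how a joint figure such as 31.5±1.3% gains precision over the individual measurements.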

Story Source:

Materials provided by University of California – Riverside. Original written by Iqbal Pittalwala. Note: Content may be edited for style and length.

Go to Source
Author:

Avoiding environmental losses in quantum information systems

New research published in EPJ D has revealed how robust initial states can be prepared in quantum information systems, minimising any unwanted transitions which lead to losses in quantum information.

Through new techniques for generating ‘exceptional points’ in quantum information systems, researchers have minimised the transitions through which they lose information to their surrounding environments.

Recently, researchers have begun to exploit the effects of quantum mechanics to process information in some fascinating new ways. One of the main challenges faced by these efforts is that systems can easily lose their quantum information as they interact with particles in their surrounding environments. To understand this behaviour, researchers in the past have used advanced models to observe how systems can spontaneously evolve into different states over time — losing their quantum information in the process. Through new research published in EPJ D, M. Reboiro and colleagues at the University of La Plata in Argentina have discovered how robust initial states can be prepared in quantum information systems, avoiding any unwanted transitions over extensive time periods.

The team’s findings could provide valuable insights for the rapidly advancing field of quantum computing, potentially enabling more complex operations to be carried out on cutting-edge devices. Their study considered a ‘hybrid’ quantum information system based around a specialised loop of superconducting metal, which interacted with an ensemble of imperfections within the atomic lattice of diamond. Within this system, the researchers aimed to generate sets of ‘exceptional points.’ When these are present, information states don’t decay in the usual way: instead, any gains and losses of quantum information can be perfectly balanced between states.
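
An exceptional point can be illustrated with the simplest possible non-Hermitian model: a two-level system with balanced gain and loss ±iγ, coupled with strength g. This toy Hamiltonian is an assumption chosen for illustration, not the hybrid superconductor-diamond model studied in the paper:

```python
import numpy as np

def eigenvalues(gamma, g=1.0):
    """Eigenvalues of the 2x2 balanced gain/loss Hamiltonian
    H = [[i*gamma, g], [g, -i*gamma]]; analytically +/- sqrt(g^2 - gamma^2)."""
    H = np.array([[1j * gamma, g], [g, -1j * gamma]])
    return np.linalg.eigvals(H)

# Below the exceptional point (gamma < g): two distinct real eigenvalues
print(eigenvalues(0.5))
# At the exceptional point (gamma = g): the two eigenvalues coalesce at zero
print(eigenvalues(1.0))
# Beyond it (gamma > g): a purely imaginary pair, i.e. growing and decaying modes
print(eigenvalues(1.5))
```

At the exceptional point the eigenvalues (and eigenvectors) merge, which is the regime where gain and loss balance exactly.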

By accounting for quantum effects, Reboiro and colleagues modelled how the dynamics of ensembled imperfections were affected by their surrounding environments. From these results, they combined information states which displayed large transition probabilities over long time intervals — allowing them to generate exceptional points. Since this considerably increased the survival probability of a state, the team could finally prepare initial states which were robust against the effects of their environments. Their techniques could soon be used to build quantum information systems which retain their information for far longer than was previously possible.

Story Source:

Materials provided by Springer. Note: Content may be edited for style and length.

Antiferromagnet lattice arrangements influence phase transitions

New research published in EPJ B reveals that the nature of the boundary at which an antiferromagnet transitions to a state of disorder slightly depends on the geometry of its lattice arrangement.

Calculations involving ‘imaginary’ magnetic fields show how the transitioning behaviours of antiferromagnets are subtly shaped by their lattice arrangements.

Antiferromagnets contain orderly lattices of atoms and molecules, whose magnetic moments are always pointed in exactly opposite directions to those of their neighbours.

These materials are driven to transition to other, more disorderly quantum states of matter, or ‘phases,’ by the quantum fluctuations of their atoms and molecules — but so far, the precise nature of this process hasn’t been fully explored. Through new research published in EPJ B, Yoshihiro Nishiyama at Okayama University in Japan has found that the nature of the boundary at which this transition occurs depends on the geometry of an antiferromagnet’s lattice arrangement.

Nishiyama’s discovery could enable physicists to apply antiferromagnets in a wider variety of contexts within material and quantum physics. His calculations concerned the ‘fidelity’ of the materials, which refers in this case to the degree of overlap between the ground states of their interacting lattice components. Furthermore, the fidelity ‘susceptibility’ describes the degree to which this overlap is influenced by an applied magnetic field. Since susceptibility is driven by quantum fluctuations, it can be expressed within the language of statistical mechanics — describing how macroscopic observations can arise from the combined influences of many microscopic vibrations.

This makes it a useful probe of how antiferromagnet phase transitions are driven by quantum fluctuations.
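
Fidelity susceptibility is straightforward to probe numerically on a tiny model. The sketch below uses a four-site transverse-field Ising antiferromagnet (a stand-in model chosen for brevity, not Nishiyama's honeycomb system) and measures how quickly the ground state changes when the field is nudged:

```python
import numpy as np

# Pauli matrices and identity
sx = np.array([[0, 1], [1, 0]], dtype=float)
sz = np.array([[1, 0], [0, -1]], dtype=float)
I2 = np.eye(2)

def op(single, site, n):
    """Embed a single-site operator at position `site` in an n-site chain."""
    mats = [single if i == site else I2 for i in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def hamiltonian(h, n=4):
    """Antiferromagnetic Ising chain in a transverse field h (open ends)."""
    H = np.zeros((2**n, 2**n))
    for i in range(n - 1):
        H += op(sz, i, n) @ op(sz, i + 1, n)   # +sz.sz favours anti-alignment
    for i in range(n):
        H -= h * op(sx, i, n)                  # transverse field drives quantum fluctuations
    return H

def ground_state(h):
    vals, vecs = np.linalg.eigh(hamiltonian(h))
    return vecs[:, 0]

h, delta = 1.0, 1e-3
fidelity = abs(ground_state(h) @ ground_state(h + delta))   # ground-state overlap
chi = 2 * (1 - fidelity) / delta**2                         # fidelity susceptibility
print(f"fidelity = {fidelity:.8f}, susceptibility = {chi:.3f}")
```

A peak in this susceptibility as the field is scanned signals a quantum phase transition, which is what makes it a useful probe.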

Using advanced mathematical techniques, Nishiyama calculated how the susceptibility is affected by ‘imaginary’ magnetic fields — which do not influence the physical world, but are crucial for describing the statistical mechanics of phase transitions. By applying this technique to an antiferromagnet arranged in a honeycomb lattice, he revealed that the transition between orderly, anti-aligned magnetic moments, and a state of disorder, occurs across a boundary with a different shape to that associated with the same transition in a square lattice. By clarifying how the geometric arrangement of lattice components has a subtle influence on this point of transition, Nishiyama’s work could advance physicists’ understanding of the statistical mechanics of antiferromagnets.

Story Source:

Materials provided by Springer. Note: Content may be edited for style and length.

Cement-free concrete beats corrosion and gives fatbergs the flush

Researchers from RMIT University have developed an eco-friendly zero-cement concrete, which all but eliminates corrosion.

Concrete corrosion and fatbergs plague sewage systems around the world, leading to costly and disruptive maintenance.

But now RMIT engineers have developed concrete that can withstand the corrosive acidic environment found in sewage pipes, while greatly reducing residual lime that leaches out, contributing to fatbergs.

Fatbergs are gross globs of congealed mass clogging sewers with fat, grease, oil and non-biodegradable junk like wet wipes and nappies, some growing to be 200 metres long and weighing tonnes.

Billion-dollar savings

These build-ups of fat, oil and grease in sewers and pipelines, as well as general corrosion over time, cost billions in repairs and replacement pipes.

The RMIT researchers, led by Dr Rajeev Roychand, created a concrete that eliminates free lime — a chemical compound that promotes corrosion and fatbergs.

Roychand said the solution is more durable than ordinary Portland cement, making it perfect for use in major infrastructure, such as sewage drainage pipes.

“The world’s concrete sewage pipes have suffered durability issues for too long,” Roychand said.

“Until now, there was a large research gap in developing eco-friendly material to protect sewers from corrosion and fatbergs.

“But we’ve created concrete that’s protective, strong and environmental — the perfect trio.”

The perfect blend

By-products of the manufacturing industry are key ingredients of the cement-less concrete — a zero cement composite of nano-silica, fly-ash, slag and hydrated lime.

Not only does their concrete use large volumes of industrial by-products, supporting a circular economy, it surpasses sewage pipe strength standards set by ASTM International.

“Though ordinary Portland cement is widely used in the fast-paced construction industry, it poses long term durability issues in some of its applications,” Roychand said.

“We found making concrete out of this composite blend — rather than cement — significantly improved longevity.”

Sustainable benefits

Replacing underground concrete pipes is a tedious task: ripping up the ground is expensive and often has a ripple effect of prolonged traffic delays and neighbourhood nuisances.

The Water Services Association of Australia estimates that maintaining the country’s sewage networks costs $15 million each year, with billions more spent worldwide.

The environmental cost is greater — ordinary Portland cement accounts for about 5% of the world’s greenhouse gas emissions.

However, the RMIT study has proven certain by-products are up to the job, replacing cement while withstanding the high acidity of sewage pipes.

“Our zero-cement concrete achieves multiple benefits: it’s environmentally friendly, reduces concrete corrosion by 96% and totally eliminates residual lime that is instrumental in the formation of fatbergs,” Roychand said.

“With further development, our zero-cement concrete could be made totally resistant to acid corrosion.”

Story Source:

Materials provided by RMIT University. Original written by Aeden Ratcliffe. Note: Content may be edited for style and length.

To kill a quasiparticle: A quantum whodunit

In large systems of interacting particles in quantum mechanics, an intriguing phenomenon often emerges: groups of particles begin to behave like single particles. Physicists refer to such groups of particles as quasiparticles.

Understanding the properties of quasiparticles may be key to comprehending, and eventually controlling, technologically important quantum effects like superconductivity and superfluidity.

Unfortunately, quasiparticles are only useful while they live. It is thus particularly unfortunate that many quasiparticles die young, lasting far, far less than a second.

The authors of a new Monash University-led study published today in Physical Review Letters investigate the crucial question: how do quasiparticles die?

Beyond the usual suspect — quasiparticle decay into lower energy states — the authors identify a new culprit: many-body dephasing.

Many-body dephasing

Many-body dephasing is the disordering of the constituent particles in the quasiparticle that occurs naturally over time.

As the disorder increases, the quasiparticle’s resemblance to a single particle fades. Eventually, the inescapable effect of many-body dephasing kills the quasiparticle.

Far from a negligible effect, the authors demonstrate that many-body dephasing can even dominate over other forms of quasiparticle death.

This is shown through investigations of a particularly ‘clean’ quasiparticle — an impurity in an ultracold atomic gas — where the authors find strong evidence of many-body dephasing in past experimental results.

The authors focus on the case where the ultracold atomic gas is a Fermi sea. An impurity in a Fermi sea gives rise to a quasiparticle known as the repulsive Fermi polaron.

The repulsive Fermi polaron is a highly complicated quasiparticle and has a history of eluding both experimental and theoretical studies.

Through extensive simulations and new theory, the authors show that an established experimental protocol — Rabi oscillations between impurity spin states — exhibits the effects of many-body dephasing in the repulsive Fermi polaron.
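
The experimental signature in such Rabi measurements is that dephasing damps the oscillation contrast over time. A minimal toy model (an illustrative exponential-envelope ansatz, not the paper's many-body theory):

```python
import math

def spin_population(t, rabi_freq=2 * math.pi, dephasing_rate=0.5):
    """Toy damped Rabi oscillation: the population oscillates between spin
    states, with contrast decaying toward 1/2 as coherence is lost."""
    envelope = math.exp(-dephasing_rate * t)
    return envelope * math.cos(rabi_freq * t / 2) ** 2 + (1 - envelope) / 2

# Population at successive oscillation peaks (one Rabi period = 1 time unit here)
peaks = [spin_population(t) for t in range(4)]
print(peaks)   # starts at 1.0, sinking toward 0.5 as dephasing erases the contrast
```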

These previously unrecognised results provide strong evidence that many-body dephasing is fundamental to the nature of quasiparticles.

First study with CHEOPS data describes one of the most extreme planets in the universe

Eight months after the space telescope CHEOPS started its journey into space, the first scientific publication using data from CHEOPS has been issued. CHEOPS is the first ESA mission dedicated to characterising known exoplanets. Exoplanets, i.e. planets outside the Solar system, were first found in 1995 by two Swiss astronomers, Michel Mayor and Didier Queloz, who were last year awarded the Nobel Prize for this discovery. CHEOPS was developed as part of a partnership between ESA and Switzerland. Under the leadership of the University of Bern and ESA, a consortium of more than a hundred scientists and engineers from eleven European states was involved in constructing the satellite over five years. The Science Operations Center of CHEOPS is located at the observatory of the University of Geneva.

Using data from CHEOPS, scientists have recently carried out a detailed study of the exoplanet WASP-189b. The results have just been accepted for publication in the journal Astronomy & Astrophysics. Willy Benz, professor of astrophysics at the University of Bern and head of the CHEOPS consortium, was delighted about the findings: “These observations demonstrate that CHEOPS fully meets the high expectations regarding its performance.”

One of the most extreme planets in the universe

WASP-189b, the target of the CHEOPS observations, is an exoplanet orbiting the star HD 133112, one of the hottest stars known to have a planetary system. “The WASP-189 system is 322 light years away and located in the constellation Libra (the weighing scales),” explains Monika Lendl, lead author of the study from the University of Geneva, and member of the National Centre of Competence in Research PlanetS.

“WASP-189b is especially interesting because it is a gas giant that orbits very close to its host star. It takes less than 3 days for it to circle its star, and it is 20 times closer to it than Earth is to the Sun,” Monika Lendl describes the planet, which is more than one and a half times as large as Jupiter, the largest planet of the Solar system.

Monika Lendl further explains that planetary objects like WASP-189b are very exotic: “They have a permanent day side, which is always exposed to the light of the star, and, accordingly, a permanent night side.” This means that its climate is completely different from that of the gas giants Jupiter and Saturn in our solar system. “Based on the observations using CHEOPS, we estimate the temperature of WASP-189b to be 3,200 degrees Celsius. Planets like WASP-189b are called ‘ultra-hot Jupiters.’ Iron melts at such a high temperature, and even becomes gaseous. This object is one of the most extreme planets we know so far,” says Lendl.

Highly precise brightness measurements

“We cannot see the planet itself as it is too far away and too close to its host star, so we have to rely on indirect methods,” explains Lendl. For this, CHEOPS uses highly precise brightness measurements: When a planet passes in front of its star as seen from Earth, the star seems fainter for a short time. This phenomenon is called a transit. Monika Lendl explains: “Because the exoplanet WASP-189b is so close to its star, its dayside is so bright that we can even measure the ‘missing’ light when the planet passes behind its star; this is called an occultation. We have observed several such occultations of WASP-189b with CHEOPS,” says Lendl. “It appears that the planet does not reflect a lot of starlight. Instead, most of the starlight gets absorbed by the planet, heating it up and making it shine.” The researchers believe that the planet is not very reflective because there are no clouds present on its dayside: “This is not surprising, as theoretical models tell us that clouds cannot form at such high temperatures.”
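
The basic transit arithmetic is simple: the fractional dip in the star's brightness equals the sky-projected area ratio (Rp/R*)². A sketch using the planet size from the article and an assumed stellar radius (the host-star radius below is a placeholder typical of a hot A-type star, not a value from the paper):

```python
R_sun = 6.957e8      # m
R_jupiter = 7.149e7  # m (equatorial)

R_planet = 1.6 * R_jupiter   # "more than one and a half times as large as Jupiter"
R_star = 2.4 * R_sun         # assumed stellar radius (placeholder)

# Fraction of starlight blocked while the planet crosses the stellar disc
transit_depth = (R_planet / R_star) ** 2
print(f"transit depth ~ {transit_depth * 100:.2f}% of the star's light")
```

The occultation signal, the "missing" reflected and thermal light when the planet passes behind the star, is far smaller still, which is why the photometric precision of CHEOPS matters.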

And the star is special too

“We also found that the transit of the gas giant in front of its star is asymmetrical. This happens when the star possesses brighter and darker zones on its surface,” adds Willy Benz. “Thanks to CHEOPS data, we can conclude that the star itself rotates so quickly that its shape is no longer spherical, but ellipsoidal. The star is being pulled outwards at its equator,” Benz continues.

The star around which WASP-189b orbits is very different from the sun. Monika Lendl says: “The star is considerably larger and more than two thousand degrees Celsius hotter than our sun. Because it is so hot, the star appears blue and not yellow-white like the sun.” Willy Benz adds: “Only a handful of planets are known to orbit such hot stars, and this system is the brightest by far.” As a consequence, it forms a benchmark for further studies.

In conclusion, Willy Benz explains: “We are expecting further spectacular findings on exoplanets thanks to observations with CHEOPS. The next papers are already in preparation.”

Story Source:

Materials provided by University of Bern. Note: Content may be edited for style and length.

First measurements of radiation levels on the moon

In the coming years and decades, various nations want to explore the moon, and plan to send astronauts there again for this purpose. But on our inhospitable satellite, space radiation poses a significant risk. The Apollo astronauts carried so-called dosimeters with them, which performed rudimentary measurements of the total radiation exposure during their entire expedition to the moon and back again. In the current issue (25 September) of the journal Science Advances, Chinese and German scientists report for the first time on time-resolved measurements of the radiation on the moon.

The “Lunar Lander Neutron and Dosimetry” (LND) was developed and built at Kiel University, on behalf of the Space Administration at the German Aerospace Center (DLR), with funding from the Federal Ministry for Economic Affairs and Energy (BMWi). The measurements taken by the LND allow the calculation of the so-called equivalent dose. This is important to estimate the biological effects of space radiation on humans. “The radiation exposure we have measured is a good benchmark for the radiation within an astronaut suit,” said Thomas Berger of the German Aerospace Center in Cologne, co-author of the publication.

The measurements show an equivalent dose rate of about 60 microsieverts per hour. In comparison, on a long-haul flight from Frankfurt to New York, it is about 5 to 10 times lower, and on the ground well over 200 times lower. Since astronauts would be on the moon for much longer than passengers flying to New York and back, this represents considerable exposure for humans, said Robert Wimmer-Schweingruber from Kiel University, whose team developed and built the instrument. “We humans are not really made to withstand space radiation. However, astronauts can and should shield themselves as far as possible during longer stays on the moon, for example by covering their habitat with a thick layer of lunar soil,” explained second author Wimmer-Schweingruber. “During long-term stays on the moon, the astronauts’ risk of getting cancer and other diseases could thus be reduced,” added co-author Christine Hellweg from the German Aerospace Center.
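
The comparison figures above translate into simple dose arithmetic. A sketch (the flight and ground rates are back-calculated from the stated ratios, not independent measurements):

```python
moon_rate = 60.0                   # microsieverts per hour, measured by the LND

# Stated ratios: a Frankfurt-New York flight is about 5 to 10 times lower,
# and ground level well over 200 times lower
flight_rate_high = moon_rate / 5   # ~12 uSv/h
flight_rate_low = moon_rate / 10   # ~6 uSv/h
ground_rate_max = moon_rate / 200  # < 0.3 uSv/h

# What a year-long stay on the lunar surface would accumulate
annual_dose_mSv = moon_rate * 24 * 365 / 1000
print(f"flight: {flight_rate_low:.0f}-{flight_rate_high:.0f} uSv/h, "
      f"ground: < {ground_rate_max:.1f} uSv/h")
print(f"one year on the moon: ~{annual_dose_mSv:.0f} mSv")
```

At roughly 500 mSv per unshielded year, the case for covering habitats with lunar soil is easy to see.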

The measurements were taken on board the Chinese lunar lander Chang’e-4, which landed on the far side of the moon on 3 January 2019. The device from Kiel takes measurements during the lunar “daylight,” and like all other scientific equipment, switches off during the very cold and nearly two-week-long lunar night, to conserve battery power. The device and lander were scheduled to take measurements for at least a year, and have now already exceeded this goal. The data from the device and the lander is transmitted back to Earth via the relay satellite Queqiao, which is located behind the moon.

The data obtained also has some relevance with respect to future interplanetary missions. Since the moon has neither a protective magnetic field nor an atmosphere, the radiation field on the surface of the moon is similar to that in interplanetary space, apart from the shielding by the moon itself. “This is why the measurements taken by the LND will also be used to review and further develop models that can be used for future missions. For example, if a manned mission departs to Mars, the new findings enable us to reliably estimate the anticipated radiation exposure in advance. That’s why it is important that our detector also allows us to measure the composition of the radiation,” said Wimmer-Schweingruber.

Story Source:

Materials provided by Kiel University. Note: Content may be edited for style and length.

Faced with shortages, researchers combine heat and humidity to disinfect N95 masks

As the COVID-19 pandemic swept around the world early this year, shortages of protective equipment such as N95 masks left healthcare workers little choice but to reuse the masks they had — increasing the risk of infection for both them and their patients.

Now, researchers at the Department of Energy’s SLAC National Accelerator Laboratory, Stanford University and the University of Texas Medical Branch may have a solution: Using a combination of moderate heat and high relative humidity, the team was able to disinfect N95 mask materials without hampering their ability to filter out viruses.

What’s more, it should not be too difficult to turn the new results into an automated system hospitals could use in short order — because the process is so simple, it might take just a few months to design and test a device.

“This is really an issue, so if you can find a way to recycle the masks a few dozen times, the shortage goes way down,” said Stanford physicist Steven Chu, a senior author on the new paper. “You can imagine each doctor or nurse having their own personal collection of up to a dozen masks. The ability to decontaminate several of these masks while they are having a coffee break will lessen the chance that masks contaminated with COVID viruses would expose other patients.”

The team reported their results September 25th in the journal ACS Nano.

Facing a shortage of the masks early this year, researchers considered a number of ways to disinfect them for reuse, including ultraviolet light, hydrogen peroxide vapors, autoclaves and chemical disinfectants. The problem is that many of those methods degrade N95 masks’ filtering abilities, so that at most they could be reused a few times.

In the new study, Chu, University of Texas Medical Branch virologist Scott Weaver and Stanford/SLAC professors Yi Cui and Wah Chiu and colleagues focused their attention on a combination of heat and humidity to try to decontaminate masks.

Working at the World Reference Center for Emerging Viruses and Arboviruses, which has biosafety measures in place for working with the most contagious viruses, the team first mixed up batches of SARS-CoV-2 virus in liquids designed to mimic the fluids that might spray out of our mouths when we cough, sneeze, sing or simply breathe. They next sprayed droplets of the brew on a piece of meltblown fabric, a material used in most N95 masks, and let it dry.

Finally, they heated their samples at temperatures ranging from 25 to 95 degrees Celsius for up to 30 minutes with relative humidity up to 100 percent.

Higher humidity and heat substantially reduced the amount of virus the team could detect on the mask, although they had to be careful not to go too hot, which additional tests revealed could lower the material’s ability to filter out virus-carrying droplets. The sweet spot appeared to be 85 degrees Celsius with 100 percent relative humidity — the team could find no trace of SARS-CoV-2 after cooking the masks under those conditions.

Additional results indicate masks could be decontaminated and reused upwards of 20 times and that the process works on at least two other viruses — a human coronavirus that causes the common cold and the chikungunya virus.

Weaver said that although the results are not especially surprising — researchers have known for a long time that heat and humidity are good ways to inactivate viruses — there hadn’t been an urgent need for a detailed quantitative analysis of something like mask decontamination until now. The new data, he said, “provide some quantitative guidance for the future.”

And even after the coronavirus pandemic is over, there are likely benefits, in part because of the method’s application beyond SARS-CoV-2 to other viruses, and because of the economic and environmental benefits of reusing masks. “It’s good all around,” Cui said.

The research was supported by the DOE Office of Science through the National Virtual Biotechnology Laboratory, a consortium of DOE national laboratories focused on response to COVID-19, with funding provided by the Coronavirus CARES Act, and by the World Reference Center for Emerging Viruses and Arboviruses, funded by the National Institutes of Health.

Comparing face coverings in controlling expired particles

Laboratory tests of surgical and N95 masks by researchers at the University of California, Davis, show that they do cut down the amount of aerosolized particles emitted during breathing, talking and coughing. Tests of homemade cloth face coverings, however, show that the fabric itself releases a large amount of fibers into the air, underscoring the importance of washing them. The work is published Sept. 24 in Scientific Reports.

As the COVID-19 pandemic continues, the use of masks and other face coverings has emerged as an important tool alongside contact tracing and isolation, hand-washing and social distancing to reduce the spread of coronavirus. The CDC and the World Health Organization endorse the use of face coverings, and masks or face coverings are required by many state and local governments, including the state of California.

The goal of wearing face coverings is to prevent people who are infected with COVID-19 but asymptomatic from transmitting the virus to others. But while evidence shows that face coverings generally reduce the spread of airborne particles, there is limited information on how well they compare with each other.

Sima Asadi, a graduate student working with Professor William Ristenpart in the UC Davis Department of Chemical Engineering, and colleagues at UC Davis and Icahn School of Medicine at Mount Sinai, New York, set up experiments to measure the flow of particles from volunteers wearing masks while they performed “expiratory activities” including breathing, talking, coughing and moving their jaw as if chewing gum.

Asadi and Ristenpart have previously studied how people emit small particles, or aerosols, during speech. These particles are small enough to float through the air over a considerable distance, but large enough to carry viruses such as influenza or coronavirus. They have found that a fraction of people are “superemitters” who give off many more particles than average.

The 10 volunteers sat in front of a funnel in a laminar flow cabinet. The funnel drew air from in front of their faces into a device that measured the size and number of particles exhaled. They wore either no mask, a medical-grade surgical mask, one of two types of N95 mask (vented or not), a homemade paper mask, or a homemade one- or two-layer cloth mask made from a cotton T-shirt according to CDC directions.

Up to 90 percent of particles blocked

The tests only measured outward transmission — whether the masks could block an infected person from giving off particles that might carry viruses.

Without a mask, talking (reading a passage of text) gave off about 10 times more particles than simple breathing. Forced coughing produced a variable amount of particles. One of the volunteers in the study was a superemitter who consistently produced nearly 100 times as many particles as the others when coughing.

In all the test scenarios, surgical and N95 masks blocked as much as 90 percent of particles, compared to not wearing a mask. Face coverings also reduced airborne particles from the superemitter.

Homemade cotton masks actually produced more particles than not wearing a mask. These appeared to be tiny fibers released from the fabric. Because the cotton masks produced particles themselves, it’s difficult to tell if they also blocked exhaled particles. They did seem to at least reduce the number of larger particles.

The results confirm that masks and face coverings are effective in reducing the spread of airborne particles, Ristenpart said, and also the importance of regularly washing cloth masks.

Additional co-authors on the study are Christopher Cappa, Santiago Barreda and Anthony Wexler at UC Davis; and Nicole Bouvier, Icahn School of Medicine at Mount Sinai, New York. It was supported by a grant from the National Institute of Allergy and Infectious Diseases of the National Institutes of Health.

Story Source:

Materials provided by University of California – Davis. Original written by Andy Fell. Note: Content may be edited for style and length.

Spin clean-up method brings practical quantum computers closer to reality

Quantum computers are the new frontier in advanced research technology, with potential applications such as performing critical calculations, protecting financial assets, or predicting molecular behavior in pharmaceuticals. Researchers from Osaka City University have now solved a major problem hindering large-scale quantum computers from practical use: precise and accurate predictions of atomic and molecular behavior.

They published their method to remove extraneous information from quantum chemical calculations on Sept. 17 as an advanced online article in Physical Chemistry Chemical Physics, a journal of the Royal Society of Chemistry.

“One of the most anticipated applications of quantum computers is electronic structure simulations of atoms and molecules,” said paper authors Kenji Sugisaki, Lecturer and Takeji Takui, Professor Emeritus in the Department of Chemistry and Molecular Materials Science in Osaka City University’s Graduate School of Science.

Quantum chemical calculations are ubiquitous across scientific disciplines, including pharmaceutical therapy development and materials research. All of the calculations are based on solving physicist Erwin Schrödinger’s equation, which describes the state of a quantum-mechanical system in terms of the electronic and molecular interactions that give rise to its properties.

“Schrödinger equations govern any behavior of electrons in molecules, including all chemical properties of molecules and materials, including chemical reactions,” Sugisaki and Takui said.

On classical computers, solving such equations exactly would take exponential time. On quantum computers, this precision is achievable in realistic time, but the calculations require “cleaning” along the way to obtain the true nature of the system, according to the researchers.

A quantum system at a specific moment in time, known as a wave function, has a property described as spin, which is the total of the spin of each electron in the system. Due to hardware faults or mathematical errors, there may be incorrect spins informing the system’s spin calculation. To remove these ‘spin contaminants,’ the researchers implemented an algorithm that allows them to select the desired spin quantum number. This purifies the spin, removing contaminants during each calculation — a first on quantum computers, according to them.
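
The projection idea behind spin purification can be demonstrated classically on a tiny two-electron example: build the total-spin operator S², then project an approximate wavefunction onto the eigenspace with the desired spin quantum number. This numpy sketch illustrates the projection only; it is not the authors' quantum-circuit algorithm:

```python
import numpy as np

# Single spin-1/2 operators (hbar = 1)
sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
sy = 0.5 * np.array([[0, -1j], [1j, 0]])
sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)

# Total spin S = S1 + S2 for two electrons, then S^2
S = [np.kron(s, I2) + np.kron(I2, s) for s in (sx, sy, sz)]
S2 = sum(s @ s for s in S)   # eigenvalues S(S+1): 0 (singlet) or 2 (triplet)

# Basis order: |uu>, |ud>, |du>, |dd>
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
triplet0 = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)

# A 'spin-contaminated' approximation to the singlet ground state
psi = singlet + 0.3 * triplet0
psi /= np.linalg.norm(psi)
print("before:", (psi.conj() @ S2 @ psi).real)   # > 0: contaminated

# Projector onto the S^2 = 0 (singlet) eigenspace removes the contaminant
vals, vecs = np.linalg.eigh(S2)
P = sum(np.outer(v, v.conj()) for v, w in zip(vecs.T, vals) if abs(w) < 1e-9)
purified = P @ psi
purified /= np.linalg.norm(purified)
print("after: ", (purified.conj() @ S2 @ purified).real)  # ~0: purified
```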

“Quantum chemical calculations based on exactly solving Schrödinger equations for any behavior of atoms and molecules can afford predictions of their physical-chemical properties and complete interpretations of chemical reactions and processes,” they said, noting that this is not possible with currently available classical computers and algorithms. “The present paper has given a solution by implementing a quantum algorithm on quantum computers.”

The researchers next plan to develop and implement algorithms designed to determine the state of electrons in molecules with the same accuracy for both ground- and excited-state electrons.

Story Source:

Materials provided by Osaka City University. Note: Content may be edited for style and length.
