The most sensitive instrument in the search for life beyond Earth

The question of whether life exists beyond Earth is one of humanity’s most fundamental questions. Future NASA missions, for example, aim to examine the icy moons of Jupiter and Saturn on the ground, as these moons may shelter life in the liquid oceans underneath their thick layers of ice. Proving traces of life beyond Earth is extremely challenging, however. It requires highly sensitive instruments that take measurements on the ground with the greatest possible degree of autonomy and with high precision — millions of kilometers from Earth and thus without direct support from humans.

An international group of researchers led by Andreas Riedo and Niels Ligterink at the University of Bern has now developed ORIGIN, a mass spectrometer that can detect and identify the smallest amounts of such traces of life. They describe the instrument in a recently published article in the specialist journal Nature Scientific Reports. Niels Ligterink from the Center for Space and Habitability (CSH) is the lead author of the international study, and co-author Andreas Riedo developed the instrument in the laboratories of the Space Research and Planetary Sciences Division of the Physics Institute at the University of Bern. Various international space agencies, particularly NASA, have already expressed interest in testing ORIGIN on future missions.

New instrument required

Since the first Mars mission, “Viking,” in the 1970s, humanity has been searching for traces of life on Mars using highly specialized instruments installed on landing platforms and rovers. In its early history, Mars was Earth-like, with a dense atmosphere and even liquid water. However, as Niels Ligterink explains, Mars lost its protective atmosphere over the course of time: “As a result of this, the surface of Mars is subjected to high solar and cosmic radiation which makes life on the surface impossible.” NASA’s “Curiosity” rover is currently examining Mars in detail but has found no concrete indications of traces of life to date.

Since the discovery by the Cassini and Galileo missions of global oceans beneath kilometers-thick ice layers on Jupiter’s moon Europa and Saturn’s moon Enceladus, these two bodies have increasingly become the focus of the search for extraterrestrial life. According to current knowledge, the oceans not only have all of the properties needed for life to occur, but also provide environments in which life can exist in the long term. NASA therefore plans to land a mission on Jupiter’s moon Europa around 2030 and take measurements on the ground. The goal: identification of life. Co-author Prof. Dr. Peter Wurz from the Physics Institute at the University of Bern says: “Concepts which were specially developed for Mars cannot simply be applied to other bodies in our solar system because they are very different. New instruments with higher sensitivity and simpler and more robust analysis systems must be designed and used.”

Unprecedented measurement sensitivity for proof of life in space

ORIGIN is one such new instrument, which outperforms previous space instruments many times over in terms of measurement sensitivity. Various international space agencies have expressed great interest in the instrument for future missions. Andreas Riedo says: “NASA has invited us to participate and test our instrument in the Arctic. The Arctic is the optimal test environment in the context of the EUROPA LANDER mission, which should start in 2025, and will allow us to demonstrate the performance of ORIGIN.”

Amino acids are key components of life as we know it on Earth. Detecting certain amino acids on extraterrestrial surfaces, such as that of Europa, allows conclusions to be drawn about possible life. The measurement principle developed by the Bern-based researchers is simple. Niels Ligterink explains: “Laser pulses are directed at the surface to be examined. In the process, small amounts of material are detached, the chemical composition of which is analyzed by ORIGIN in a second step.” Andreas Riedo adds: “The compelling aspect of our technology is that no complicated sample preparation techniques, which could potentially affect the result, are required. This was one of the biggest problems on Mars until now.” The amino acids which have been analyzed with ORIGIN to date have a specific chemical fingerprint which allows them to be directly identified. Niels Ligterink: “To be honest, we didn’t expect that our first measurements would already be able to identify amino acids.”
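
To illustrate the idea of such a chemical fingerprint, here is a minimal sketch of how mass peaks measured from an ablated sample might be matched against the known masses of amino acids. The peak values, tolerance and function names are illustrative assumptions, not details of ORIGIN’s actual analysis pipeline.

```python
# Hypothetical sketch: matching measured mass peaks from a laser-ablated
# sample against known amino acid masses. Not ORIGIN's actual pipeline.

# Monoisotopic masses (Da) of a few free amino acids.
REFERENCE_MASSES = {
    "glycine": 75.032,
    "alanine": 89.048,
    "serine": 105.043,
    "proline": 115.063,
}

def identify_peaks(measured_peaks, tolerance=0.02):
    """Match each measured mass peak to a reference amino acid, if any."""
    matches = {}
    for peak in measured_peaks:
        for name, mass in REFERENCE_MASSES.items():
            if abs(peak - mass) <= tolerance:
                matches[peak] = name
    return matches

# Example: two peaks match the reference list, one does not.
print(identify_peaks([75.03, 105.05, 60.00]))
# -> {75.03: 'glycine', 105.05: 'serine'}
```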

The discovery of traces of past or present life on bodies in our solar system beyond the Earth is of great importance for a better understanding of the existence of life in the universe and its genesis. Andreas Riedo says: “Our new measurement technology is a real improvement on the instruments currently used on space missions. If we are taken along on a future mission, we may be able to answer one of humanity’s most fundamental questions with ORIGIN: Is there life in space?”

Story Source:

Materials provided by University of Bern. Note: Content may be edited for style and length.


Understanding of relaxor ferroelectric properties could lead to many advances

A new fundamental understanding of polymeric relaxor ferroelectric behavior could lead to advances in flexible electronics, actuators and transducers, energy storage, piezoelectric sensors and electrocaloric cooling, according to a team of researchers at Penn State and North Carolina State.

Researchers have debated the theory behind the mechanism of relaxor ferroelectrics for more than 50 years, said Qing Wang, professor of materials science and engineering at Penn State. While relaxor ferroelectrics are well-recognized, fundamentally fascinating and technologically useful materials, a Nature article commented in 2006 that they were heterogeneous, hopeless messes.

Without a fundamental understanding of the mechanism, little progress has been made in designing new relaxor ferroelectric materials. The new understanding, which relies on both experiment and theoretical modeling, shows that relaxor ferroelectricity in polymers comes from chain conformation disorders induced by chirality. Chirality is a feature of many organic materials in which a molecule and its mirror image, while closely resembling each other, cannot be superimposed. The relaxor mechanism in polymers is thus vastly different from the mechanism proposed for ceramics, whose relaxor behavior originates from chemical disorders.

“Different from ferroelectrics, relaxors exhibit no long-range large ferroelectric domains but disordered local polar domains,” Wang explained. “The research in relaxor polymeric materials has been challenging owing to the presence of multiple phases such as crystalline, amorphous and crystalline-amorphous interfacial area in polymers.”

In energy storage capacitors, relaxors can deliver a much higher energy density than normal ferroelectrics, which have high ferroelectric loss that turns into waste heat. In addition, relaxors can generate larger strain under the applied electric fields and have a much better efficiency of energy conversion than normal ferroelectrics, which makes them preferred materials for actuators and sensors.

Penn State has a long history of discovery in ferroelectric materials. Qiming Zhang, professor of electrical engineering at Penn State, discovered the first relaxor ferroelectric polymer in 1998, when he used an electron beam to irradiate a ferroelectric polymer and found it had become a relaxor. Zhang along with Qing Wang also made seminal discoveries in the electrocaloric effect using relaxor polymers, which allows for solid state cooling without the use of noxious gases and uses much less energy than conventional refrigeration.

“The new understanding of relaxor behavior would open up unprecedented opportunities for us to design relaxor ferroelectric polymers for a range of energy storage and conversion applications,” said Wang.

Story Source:

Materials provided by Penn State. Note: Content may be edited for style and length.


Biomechanics of skin can perform useful tactile computations

As our body’s largest and most prominent organ, the skin also provides one of our most fundamental connections to the world around us. From the moment we’re born, it is intimately involved in every physical interaction we have.

Though scientists have studied the sense of touch, or haptics, for more than a century, many aspects of how it works remain a mystery.

“The sense of touch is not fully understood, even though it is at the heart of our ability to interact with the world,” said UC Santa Barbara haptics researcher Yon Visell. “Anything we do with our hands — picking up a glass, signing our name or finding keys in our bag — none of that is possible without the sense of touch. Yet we don’t fully understand the nature of the sensations captured by the skin or how they are processed in order to enable perception and action.”

We have better models for how our other senses, such as vision and hearing, work, but our understanding of how the sense of touch works is much less complete, he added.

To help fill that gap, Visell and his research team, including Yitian Shao and collaborator Vincent Hayward at the Sorbonne, have been studying the physics of touch sensation — how touching an object gives rise to signals in the skin that shape what we feel. In a study published in the journal Science Advances, the group reveals how the intrinsic elasticity of the skin aids tactile sensing. Remarkably, they show that far from being a simple sensing material, the skin can also aid the processing of tactile information.

To understand this significant but little-known aspect of touch, Visell thinks it is helpful to think about how the eye, our visual organ, processes optical information.

“Human vision relies on the optics of the eye to focus light into an image on the retina,” he said. “The retina contains light-sensitive receptors that translate this image into information that our brain uses to decompose and interpret what we’re looking at.”

An analogous process unfolds when we touch a surface with our skin, Visell continued. Just as structures such as the cornea and lens capture and focus light onto the retina, the skin’s elasticity distributes tactile signals to sensory receptors throughout the skin.

Building on previous work which used an array of tiny accelerometers worn on the hand to sense and catalog the spatial patterns of vibrations generated by actions such as tapping, sliding or grasping, the researchers here employed a similar approach to capture spatial patterns of vibration that are generated as the hand feels the environment.

“We used a custom device consisting of 30 three-axis sensors gently bonded to the skin,” explained lead author Shao. “And then we asked each participant in our experiments to perform many different touch interactions with their hands.” The research team collected a dataset of nearly 5,000 such interactions; the vibration patterns it captured arose from the elastic coupling within the skin itself.

The team then analyzed these patterns to clarify how the transmission of vibrations in the hand shaped the information in the tactile signals. “We used a mathematical model in which high-dimensional signals felt throughout the hand were represented as combinations of a small number of primitive patterns,” Shao explained. The primitive patterns provided a compact lexicon, or dictionary, that compressed the information in the signals, enabling them to be encoded more efficiently.
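
The paper’s exact decomposition method is not spelled out here; as one standard way to express high-dimensional signals as sparse combinations of a few learned primitive patterns, the sketch below uses scikit-learn’s DictionaryLearning. The array shapes, parameters and random data are assumptions for illustration only.

```python
# Illustrative sketch (not the authors' pipeline): representing
# high-dimensional vibration signals as sparse combinations of a small
# number of learned "primitive patterns" via dictionary learning.
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
# Pretend dataset: 500 touch events, each a 90-dim feature vector
# (e.g., 30 sensors x 3 axes of vibration amplitude).
X = rng.standard_normal((500, 90))

# Learn 12 primitive patterns (cf. the "dozen or fewer" in the study).
dico = DictionaryLearning(n_components=12, transform_algorithm="lasso_lars",
                          transform_alpha=0.1, random_state=0)
codes = dico.fit(X).transform(X)   # sparse weights, shape (500, 12)
patterns = dico.components_        # primitive patterns, shape (12, 90)

# Each signal is approximated by a weighted sum of the 12 patterns.
X_hat = codes @ patterns
print(codes.shape, patterns.shape, np.mean((X - X_hat) ** 2))
```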

This analysis generated a dozen or fewer primitive wave patterns — vibrations of the skin throughout the hand that could be used to capture information in the tactile signals felt by the hand. The striking feature of these primitive vibration patterns, Visell said, is that they automatically reflected the structure of the hand and the physics of wave transmission in the skin.

“Elasticity plays this very basic function in the skin of engaging thousands of sensory receptors for touch in the skin, even when contact occurs at a small skin area,” he explained. “This allows us to use far more sensory resources than would otherwise be available to interpret what it is that we’re touching.” The remarkable finding of their research is that this process also makes it possible to more efficiently capture information in the tactile signals, Visell said. Information processing of this kind is normally considered to be performed by the brain, rather than the skin.

The role played by mechanical transmission in the skin is in some respects similar to the role of the mechanics of the inner ear in hearing, Visell said. In 1961, Georg von Békésy received the Nobel Prize for his work showing how the mechanics of the inner ear facilitate auditory processing: by spreading sounds with different frequency content to different sensory receptors in the ear, they aid the encoding of sounds by the auditory system. The team’s work suggests that similar processes may underlie the sense of touch.

These findings, according to the researchers, not only contribute to our understanding of the brain, but may also suggest new approaches for the engineering of future prosthetic limbs for amputees that might be endowed with skin-like elastic materials. Similar methods also could one day be used to improve tactile sensing by next-generation robots.


Why is there any matter in the universe at all? New study sheds light

Scientists at the University of Sussex have measured a property of the neutron — a fundamental particle in the universe — more precisely than ever before. Their research is part of an investigation into why there is matter left over in the universe, that is, why all the antimatter created in the Big Bang didn’t just cancel out the matter.

The team — which included the Science and Technology Facilities Council’s (STFC) Rutherford Appleton Laboratory in the UK, the Paul Scherrer Institute (PSI) in Switzerland, and a number of other institutions — was looking into whether or not the neutron acts like an “electric compass.” Neutrons are believed to be slightly asymmetrical in shape, being slightly positive at one end and slightly negative at the other — a bit like the electrical equivalent of a bar magnet. This is the so-called “electric dipole moment” (EDM), and is what the team was looking for.

This is an important piece of the puzzle in the mystery of why matter remains in the Universe, because scientific theories about why there is matter left over also predict that neutrons have the “electric compass” property, to a greater or lesser extent. Measuring it therefore helps scientists to get closer to the truth about why matter remains.

The team of physicists found that the neutron has a significantly smaller EDM than predicted by various theories about why matter remains in the universe; this makes these theories less likely to be correct, so they have to be altered, or new theories found. In fact it’s been said in the literature that over the years, these EDM measurements, considered as a set, have probably disproved more theories than any other experiment in the history of physics. The results are reported today, Friday 28 February 2020, in the journal Physical Review Letters.

Professor Philip Harris, Head of the School of Mathematical and Physical Sciences and leader of the EDM group at the University of Sussex, said:

“After more than two decades of work by researchers at the University of Sussex and elsewhere, a final result has emerged from an experiment designed to address one of the most profound problems in cosmology for the last fifty years: namely, the question of why the Universe contains so much more matter than antimatter, and, indeed, why it now contains any matter at all. Why didn’t the antimatter cancel out all the matter? Why is there any matter left?

“The answer relates to a structural asymmetry that should appear in fundamental particles like neutrons. This is what we’ve been looking for. We’ve found that the “electric dipole moment” is smaller than previously believed. This helps us to rule out theories about why there is matter left over — because the theories governing the two things are linked.

“We have set a new international standard for the sensitivity of this experiment. What we’re searching for in the neutron — the asymmetry which shows that it is positive at one end and negative at the other — is incredibly tiny. Our experiment was able to measure this in such detail that if the asymmetry could be scaled up to the size of a football, then a football scaled up by the same amount would fill the visible Universe.”

The experiment is an upgraded version of apparatus originally designed by researchers at the University of Sussex and the Rutherford Appleton Laboratory (RAL), and which has held the world sensitivity record continuously from 1999 until now.

Dr Maurits van der Grinten, from the neutron EDM group at the Rutherford Appleton Laboratory (RAL), said:

“The experiment combines various state-of-the-art technologies that all need to perform simultaneously. We’re pleased that the equipment, technology and expertise developed by scientists from RAL has contributed to the work to push the limit on this important parameter.”

Dr Clark Griffith, Lecturer in Physics from the School of Mathematical and Physical Sciences at the University of Sussex, said:

“This experiment brings together techniques from atomic and low energy nuclear physics, including laser-based optical magnetometry and quantum-spin manipulation. By using these multi-disciplinary tools to measure the properties of the neutron extremely precisely, we are able to probe questions relevant to high-energy particle physics and the fundamental nature of the symmetries underlying the universe.”

50,000 measurements

Any electric dipole moment that a neutron may have is tiny, and so is extremely difficult to measure. Previous measurements by other researchers have borne this out. In particular, the team had to go to great lengths to keep the local magnetic field very constant during their latest measurement. For example, every truck that drove by on the road next to the institute disturbed the magnetic field on a scale that would have been significant for the experiment, so this effect had to be compensated for during the measurement.

Also, the number of neutrons observed needed to be large enough to provide a chance to measure the electric dipole moment. The measurements ran over a period of two years. So-called ultracold neutrons, that is, neutrons with a comparatively slow speed, were measured. Every 300 seconds, a bunch of more than 10,000 neutrons was directed to the experiment and examined in detail. The researchers measured a total of 50,000 such bunches.
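
Taken together, the quoted figures give a sense of the scale of the dataset. A quick back-of-the-envelope check (treating the “more than 10,000” neutrons per bunch as a round lower bound):

```python
# Back-of-the-envelope totals from the figures quoted above.
neutrons_per_bunch = 10_000   # "more than 10,000" per bunch
bunches = 50_000              # total bunches measured
cycle_seconds = 300           # one bunch every 300 seconds

total_neutrons = neutrons_per_bunch * bunches   # ~5e8 neutrons
beam_seconds = bunches * cycle_seconds          # 1.5e7 s
beam_days = beam_seconds / 86_400               # ~174 days of data-taking

print(f"{total_neutrons:.1e} neutrons over ~{beam_days:.0f} days")
# -> 5.0e+08 neutrons over ~174 days (spread across two years)
```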

A new international standard is set

The researchers’ latest results supported and enhanced those of their predecessors: a new international standard has been set. The EDM is still too small to be detected by the instruments used up until now, so some theories that attempted to explain the excess of matter have become less likely. The mystery therefore remains, for the time being.

The next, more precise, measurement is already being constructed at PSI. The PSI collaboration expects to start their next series of measurements by 2021.

Search for “new physics”

The new result was determined by a group of researchers at 18 institutes and universities in Europe and the USA on the basis of data collected at PSI’s ultracold neutron source. The researchers collected measurement data there over a period of two years, evaluated it very carefully in two separate teams, and were then able to obtain a more accurate result than ever before.

The research project is part of the search for “new physics” that would go beyond the so-called Standard Model of Physics, which sets out the properties of all known particles. This is also a major goal of experiments at larger facilities such as the Large Hadron Collider (LHC) at CERN.

The techniques originally developed for the first EDM measurement in the 1950s led to world-changing developments such as atomic clocks and MRI scanners, and this line of research retains a huge and ongoing impact in the field of particle physics.


Monitoring Your Network with Time Series

Networks play a fundamental role in the adoption and growth of Internet applications. Penetrating enterprises, homes, factories, and even cities, networks sustain modern society. In this webinar, Daniella Pontes of InfluxData will explore the flexibility and potential use cases of open source and time series databases.

In this webinar you will:

-Learn how to use a time series database platform to monitor your network (see the sketch after this list)

-Understand the value of using open source tools

-Gain insight into what key aspects of network monitoring you should focus on
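
As a minimal illustration of the first point, the sketch below measures a crude network latency metric and writes it into InfluxDB using the open source influxdb-client package for Python. The URL, token, org, bucket and measurement names are placeholder assumptions, not values from the webinar.

```python
# Minimal sketch: push a network metric into a time series database
# (InfluxDB 2.x via the open source influxdb-client package).
import socket
import time

from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

def tcp_latency_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Measure TCP connect time to a host as a crude latency metric."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0

# Placeholder connection details -- substitute your own deployment's values.
client = InfluxDBClient(url="http://localhost:8086", token="MY_TOKEN", org="my-org")
write_api = client.write_api(write_options=SYNCHRONOUS)

latency = tcp_latency_ms("example.com")
point = Point("network_probe").tag("target", "example.com").field("latency_ms", latency)
write_api.write(bucket="netmon", record=point)
```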


A milestone in ultrashort-pulse laser oscillators

Ultrafast laser sources are at the heart of an ever-expanding range of fundamental scientific studies and industrial applications, from high-field-physics experiments with attosecond temporal resolution to micrometre-precision machining of materials. In order to push the envelope even further, repetition rates of several megahertz and average output powers of hundreds of watts are required. A particularly compelling route to realizing such high-power laser pulses is to generate them directly by scaling up the power output from laser oscillators, rather than relying on multi-stage amplifier systems. The latter approach adds a high degree of complexity, whereas the former leads to robust and potentially cost-effective devices. Reporting recently in Optics Express, the group of Ursula Keller at the Institute of Quantum Electronics has now taken the ‘power-scaling’ approach to a new level. They present a source that combines the simplicity and high repetition rates of oscillators with record-high average output power from this type of laser.

The ETH team worked with a so-called thin-disk laser oscillator, where the gain medium — the material in which the quantum processes leading to lasing take place — is shaped as a disk, typically only some 100 μm thick. This geometry affords a relatively large surface area, which in turn helps cooling. Still, thermal effects remained a major bottleneck, and since 2012 the record output power stood at 275 W.

Until now. Combining several advances in thin-disk laser technology developed in the Keller group, PhD student Francesco Saltarelli, senior research scientist Christopher Phillips and colleagues took a decisive step and achieved an average output power of 350 W, with pulses that are only 940 femtoseconds long, carry an energy of 39 microjoules and repeat at an 8.88-megahertz rate — values that are of immediate interest for applications both in science and industry.
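
These figures are mutually consistent: average power is simply pulse energy times repetition rate. A quick check (the peak power here is the simple energy-over-duration estimate, ignoring the exact pulse shape):

```python
# Consistency check on the quoted figures (values from the article).
pulse_energy_j = 39e-6        # 39 microjoules
rep_rate_hz = 8.88e6          # 8.88 MHz
pulse_duration_s = 940e-15    # 940 femtoseconds

avg_power_w = pulse_energy_j * rep_rate_hz        # ~346 W, matching ~350 W
peak_power_w = pulse_energy_j / pulse_duration_s  # ~41 MW (shape factor ignored)

print(f"average ~{avg_power_w:.0f} W, peak ~{peak_power_w / 1e6:.0f} MW")
# -> average ~346 W, peak ~41 MW
```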

A key aspect of the work is that the researchers found a way to enable several passes of the pump beam through the gain medium without inflicting detrimental thermal effects, thereby reducing the stress on the relevant components. The ability to control effects due to heating opened the door to going firmly beyond the 275-W level and setting the new benchmark. The approach can be taken even further, though, and output powers beyond 500 W seem realistic. With further improvements, the ETH researchers estimate, the kilowatt level might come into sight.

Story Source:

Materials provided by ETH Zurich Department of Physics. Note: Content may be edited for style and length.


Cesium vapor aids in the search for dark matter

The hunt for dark matter is one of the most exciting challenges facing fundamental physics in the 21st century. Researchers have long known that it must exist, as many astrophysical observations would otherwise be impossible to explain. For example, stars rotate much faster in galaxies than they would if only ‘normal’ matter existed.

In total, the matter we can see only accounts for, at the most, 20 percent of the total matter in the universe — meaning that a remarkable 80 percent is dark matter. “There’s an elephant in the room but we just can’t see it,” said Professor Dmitry Budker, a researcher at the PRISMA+ Cluster of Excellence of Johannes Gutenberg University Mainz (JGU) and the Helmholtz Institute Mainz (HIM), explaining the problem he and many of his colleagues worldwide are contending with.

Dark matter could consist of extremely light particles

But so far no one knows what dark matter is made of. Scientists in the field are considering and researching a whole range of possible particles that might theoretically qualify as candidates. Among these are extremely lightweight bosonic particles, currently considered to be one of the most promising prospects. “These can also be regarded as a classical field oscillating at a specific frequency. But we can’t yet put a figure on this — and therefore the mass of the particles,” explained Budker. “Our basic assumption is that this dark matter field is coupled to visible matter and has an extremely subtle influence on certain atomic properties that would normally be constant.”

Budker and his team in Mainz have now developed a new method which they describe in the current issue of the leading specialist journal Physical Review Letters. It employs atomic spectroscopy and involves the use of cesium atom vapor. Only on exposure to laser light of a very specific wavelength do these atoms become excited. The conjecture is that minute changes in the corresponding observed wavelength would indicate coupling of the cesium vapor to a dark matter particle field.
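
Conceptually, such a search looks for a tiny periodic modulation of an atomic transition frequency at the unknown oscillation frequency of the field. The toy sketch below shows the basic signal-processing idea of hunting for a spectral peak in repeated measurements; all numbers are invented, and the modulation amplitude is enormously exaggerated so that the peak is visible.

```python
# Conceptual toy (not the Mainz analysis): detect a tiny periodic shift
# of a measured quantity by looking for a peak in its power spectrum.
import numpy as np

rng = np.random.default_rng(1)
fs = 100.0                            # measurement repetition rate (Hz), assumed
t = np.arange(0.0, 100.0, 1.0 / fs)   # 100 s of repeated measurements
f_dm = 7.3                            # hypothetical field oscillation frequency (Hz)

# Periodic shift buried in measurement noise (amplitude wildly exaggerated).
data = rng.standard_normal(t.size) + 0.1 * np.sin(2 * np.pi * f_dm * t)

power = np.abs(np.fft.rfft(data)) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
peak = freqs[1:][np.argmax(power[1:])]   # skip the DC bin
print(f"strongest candidate near {peak:.2f} Hz")   # ~7.30 Hz
```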

“In principle, our work is based on a particular theoretical model, the hypotheses of which we are experimentally testing,” added the paper’s principal author, Dr. Dionysis Antypas. “In this case, the concept underlying our work is the relaxion model developed by our colleagues and co-authors at the Weizmann Institute in Israel.” According to the relaxion theory, there must be a region in the vicinity of large masses such as the Earth in which the density of dark matter is greater, making the coupling effects easier to observe and detect.

Previously inaccessible frequency range searched

With their new technique, the scientists have now accessed a hitherto unexplored frequency range in which, as postulated in relaxion theory, the effects of certain forms of dark matter on the atomic properties of cesium should be relatively easy to spot. The results also allow the researchers to formulate new restrictions as to what the nature of dark matter is likely to be. Dmitry Budker likens this meticulous search to the hunt for a tiger in a desert. “In the frequency range that we’ve explored in our current work, we still have not pinpointed dark matter. But at least, now that we’ve searched in this range, we know we don’t have to do it again.” The researchers still don’t know where dark matter — the tiger in his metaphor — is lurking, but they now know where it is not. “We just keep closing in on the part of the desert where the tiger is most likely to be. And, at some point, we will catch him,” maintained Budker with confidence.

Story Source:

Materials provided by Johannes Gutenberg Universitaet Mainz. Note: Content may be edited for style and length.


Quantum physics: Simulating fundamental interactions with ultracold atoms

Fundamental interactions between particles are mediated by gauge bosons. Generally, these are described by gauge theories which are extremely challenging to treat theoretically in a wide range of parameters. Tackling some of these open questions in table-top experiments with specifically-designed quantum simulators constitutes an outstanding goal. Now, scientists at Ludwig-Maximilians-Universitaet (LMU) in Munich and the Max Planck Institute of Quantum Optics together with collaborators from the Technical University Munich, Harvard University and the Université Libre de Bruxelles succeeded in demonstrating the main ingredients of a specific lattice gauge theory with two-component ultracold bosons in optical superlattices. The study appears in the journal Nature Physics.

How do elementary constituents of matter interact with each other?

One of the great challenges in modern physical sciences concerns the identification of elementary constituents of matter, and the manner by which these particles interact with each other. This fundamental problem occurs in many areas of physics including high-energy, condensed matter and quantum computation. While there have been remarkable achievements, confirming the existence of a plethora of elementary particles and novel exotic phases of matter, many fundamental questions remain unanswered due to the great complexity of the problem. One of the most prominent examples in this regard is the still incomplete knowledge of the phase diagram of Quantum Chromodynamics, which describes the strong interaction between quarks and gluons.

New insights by quantum simulation

Due to the vast progress in controlling individual particles including ions, photons and atoms, it has been suggested that quantum simulations could offer new insights on open questions related to the fundamental interactions between (quasi-)particles, which are mediated by gauge fields. Originally the concept of quantum simulation was proposed by Nobel-prize winner Richard Feynman. The key idea is to engineer a quantum many-body system that is tailored to emulate the properties of a given theoretical model, hence, offering a clear view on fundamental physical phenomena of interest in a controlled laboratory environment. Engineered quantum systems made of ultracold atoms in optical lattices emerged as versatile platforms to study the properties of exotic quantum phases of matter.

Simulating gauge fields, however, is extremely demanding, since it requires the precise implementation of matter particles and gauge fields, which interact in a way that has to respect the local symmetry of the gauge theory of interest.

Simulating gauge-mediated interactions with charge-neutral atoms

A team of physicists at LMU Munich and the Max Planck Institute of Quantum Optics (MPQ), led by Professor Monika Aidelsburger and Professor Immanuel Bloch, carefully designed and successfully realized the fundamental ingredients of a specific minimal lattice gauge theory — a Z2 lattice gauge theory, which plays an important role in condensed matter physics and quantum computation. The team realized a controllable quantum simulator of ultracold bosonic particles trapped in a bichromatic optical lattice. Isolating the dynamics of two particles in a double well facilitated the controlled investigation of the basic building block of the theory, which in future experiments could be used to build extended models. The complex interactions between the particles were manipulated using laser beams, whose intensity was modulated periodically in time. The challenge consisted in implementing well-defined local interactions between “matter” particles and “gauge bosons,” the mediators of fundamental interactions. The experimentalists used two different electronic states of the atoms to emulate the different types of particles, and the local interactions were realized by addressing the atoms in a state-dependent manner. The team validated a novel approach based on periodic driving by observing the dynamics of the atoms in a state- and site-resolved manner. The excellent knowledge of the microscopic parameters of the model further allowed them to outline the path for future experiments in extended geometries and in higher dimensions.
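
To give a rough sense of what “respecting the local symmetry” means for such a building block, here is a generic numpy toy of one matter particle hopping between two sites, with the hopping mediated by a gauge spin on the link. The Hamiltonian form and couplings are textbook-style assumptions for illustration, not the specific model or parameters realized in the experiment; the point is that the building block commutes with the local Z2 symmetry generator (the Gauss law).

```python
# Generic two-site Z2 gauge building block: a matter particle hops between
# two sites, the hopping mediated by a gauge spin (tau) on the link.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli matrices
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

J, h = 1.0, 0.5          # illustrative matter-gauge and gauge-field couplings

# H = -J * (hopping between the two sites) x tau^z  -  h * identity x tau^x
H = -J * np.kron(sx, sz) - h * np.kron(I2, sx)

# Gauss-law generator on site 1: G = (-1)^{n_1} x tau^x = diag(-1, +1) x tau^x
G = np.kron(np.diag([-1.0, 1.0]), sx)

# Local Z2 gauge symmetry requires [H, G] = 0: the gauge charge is conserved.
print(np.allclose(H @ G - G @ H, 0))   # -> True
print(np.round(np.linalg.eigvalsh(H), 3))
```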

Dr. Christian Schweizer, the lead author of this study, concludes: “While it is still a long path to advance existing experimental platforms in a way that will enable us to shed new light onto fundamental open questions regarding the phase diagram of Quantum Chromodynamics, these are exciting times for quantum simulators, which develop at a remarkable rate.” The authors have taken the first steps in the long journey towards studying high-energy physics problems with table-top experiments. This study provides a novel route for solving the outstanding challenges that experimentalists face with currently available protocols to simulate the fundamental interactions between elementary constituents of nature.

Story Source:

Materials provided by Ludwig-Maximilians-Universität München. Note: Content may be edited for style and length.


An astonishing parabola trick: Unusual magnetic behavior

Future digital data storage devices will rely predominantly on novel fundamental magnetic phenomena. The better we understand these phenomena, the better and more energy-efficient the memory chips and hard drives we can build. Physicists from the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) and the Helmholtz-Zentrum Berlin (HZB) have now completed essential fundamental work for future storage devices: using a creative approach of shaping magnetic thin films into curved architectures, they validated the presence of chiral responses in a commonly used magnetic material. This facilitates the creation of magnetic systems with desired properties through simple geometrical transformations.

We all know that our left hand is different from our right — a left glove won’t fit your right hand and vice versa. Scientists use the term “chirality” to describe objects that do not align with their mirror image. Chemists, in particular, are familiar with this property in molecules, as in left- and right-rotating lactic acid. Humans metabolize the right-rotating variant more easily than its “mirror image.”

Such chiral effects are known to occur in magnetic materials, where magnetic textures also have chiral properties: the arrangement of individual magnetic moments inside the material, or, figuratively speaking, the alignment of the many tiny “compass needles” that make up a magnet, could form right- and left-handed alignments. Under certain conditions, some textures behave like image and mirror image — a left-handed texture cannot be made congruent with its right-handed version.

The interesting aspect here is that “the two textures can present different magnetic behaviors,” as HZDR physicist Dr. Denys Makarov points out. “To put it simply: a right-handed texture can be more energetically preferable than a left-handed texture. Since systems in nature tend to assume their lowest possible energetic state, the right-handed state is preferred.” Such chiral effects hold great technological promise. Among other things, they could be helpful in the future development of highly energy-efficient electronic components such as sensors, switches, and non-volatile storage devices.

Magnetic curved architectures

“Helimagnets are materials with well-defined chiral magnetic properties, due to a lack of internal magnetic symmetry,” explains the lead author of the paper, Dr. Oleksii Volkov from HZDR’s Institute of Ion Beam Physics and Materials Research. “Despite the fact that they have been known for a long time, these are rather exotic materials that are difficult to produce. Moreover, helimagnets usually exhibit their unique chiral properties at low temperatures.” That is why Makarov’s team chose a different path. They used a common magnetic material, an iron-nickel alloy known as Permalloy, to build curved objects like parabola-shaped strips. Using lithography, they formed parabolic strips several micrometers in size from thin sheets of Permalloy.

The physicists then exposed the samples to a magnetic field, thus orienting the magnetic moments in the parabola along this magnetic field. They then experimentally explored the magnetization reversal by using a highly sensitive analysis method at HZB’s synchrotron. The team was able to show that the magnetic moments in the parabolic strip remained in their original direction until a reversed magnetic field of a certain critical value was applied.

Surprisingly strong effect

This delayed response is due to chiral effects caused by the curvature at the apex area of the parabola strips. “Theorists have predicted this unusual behavior for some time, but it was actually considered more of a theoretical trick,” explains Dr. Florian Kronast of Helmholtz-Zentrum Berlin. “But now we have shown that this trick actually works in practice. We detected magnetic chiral response in a conventional soft ferromagnetic material, just through the geometric curvature of the strips we used.”

In the process, the team was faced with two more surprises: On the one hand, the effect was remarkably strong, which means it could be used to influence the magneto-electric responses of materials. On the other hand, the effect was detected in a relatively large object: micrometer-sized parabolas that can be produced using conventional lithography. Previously, experts had assumed that these curvature-induced chiral effects could only be observed in magnetic objects with dimensions of about a dozen nanometers.

“In terms of possible applications, we are looking forward to novel magnetic switches and data storage devices that utilize geometrically-induced chiral properties,” Makarov emphasizes. There are concepts that envision future digital data storage in certain magnetic objects, so-called chiral domain walls or skyrmions. The recent discovery might help to produce such objects quite easily — at room temperature, and using common materials. In addition, the newly discovered effect also paves the way for novel, highly sensitive magnetic field sensors.


Agrivoltaics proves mutually beneficial across food, water, energy nexus

Building resilience in renewable energy and food production is a fundamental challenge in today’s changing world, especially in regions susceptible to heat and drought. Agrivoltaics, the co-locating of agriculture and solar photovoltaic panels, offers a possible solution, with new University of Arizona-led research reporting positive impacts on food production, water savings and the efficiency of electricity production.

Agrivoltaics, also known as solar sharing, is an idea that has been gaining traction in recent years; however, few studies have monitored all aspects of the associated food, energy and water systems, and none have focused on dryland areas — regions that experience food production challenges and water shortages, but have an overabundance of sun energy.

“Many of us want more renewable energy, but where do you put all of those panels? As solar installations grow, they tend to be out on the edges of cities, and this is historically where we have already been growing our food,” said Greg Barron-Gafford, an associate professor in the School of Geography and Development and lead author on the paper that was published today in Nature Sustainability.

A recent high-profile study in Nature found that current croplands are the “land covers with the greatest solar PV power potential” based on an extensive analysis of incoming sunlight, air temperature and relative humidity.

“So which land use do you prefer — food or energy production? This challenge strikes right at the intersection of human-environment connections, and that is where geographers shine!” said Barron-Gafford, who is also a researcher with Biosphere 2. “We started to ask, ‘Why not produce both in the same place?’ And we have been growing crops like tomatoes, peppers, chard, kale, and herbs in the shade of solar panels ever since.”

Using solar photovoltaic, or PV, panels and regional vegetables, the team created the first agrivoltaics research site at Biosphere 2. Professors and students, both undergraduate and graduate, measured everything from when plants germinated to the amount of carbon plants were sucking out of the atmosphere and the water they were releasing, to their total food production throughout the growing season.

The study focused on chiltepin pepper, jalapeno and cherry tomato plants that were positioned under a PV array. Throughout the average three-month summer growing season, researchers continuously monitored incoming light levels, air temperature and relative humidity using sensors mounted above the soil surface, and soil surface temperature and moisture at a depth of 5 centimeters. Both the traditional planting area and the agrivoltaic system received equal irrigation rates and were tested using two irrigation scenarios — daily irrigation and irrigation every second day.

They found that the agrivoltaics system significantly impacted three factors that affect plant growth and reproduction — air temperatures, direct sunlight and atmospheric demand for water. The shade provided by the PV panels resulted in cooler daytime temperatures and warmer nighttime temperatures than the traditional, open-sky planting system. There was also a lower vapor pressure deficit in the agrivoltaics system, meaning there was more moisture in the air.
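
Vapor pressure deficit quantifies how strongly the air pulls water from plants and soil, and it can be computed from air temperature and relative humidity. A minimal sketch using the standard Tetens approximation follows; the readings are invented for illustration, not the study’s measurements.

```python
# Vapor pressure deficit (VPD) from air temperature and relative humidity,
# using the standard Tetens approximation for saturation vapor pressure.
import math

def vpd_kpa(temp_c: float, rel_humidity_pct: float) -> float:
    """VPD in kPa: saturation pressure minus actual vapor pressure."""
    e_sat = 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))  # Tetens, kPa
    return e_sat * (1.0 - rel_humidity_pct / 100.0)

# Hypothetical midday readings: open-sky plot vs. under-panel plot.
print(f"open sky:     {vpd_kpa(38.0, 20.0):.2f} kPa")   # hotter, drier air
print(f"under panels: {vpd_kpa(33.0, 30.0):.2f} kPa")   # cooler, moister -> lower VPD
```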

“We found that many of our food crops do better in the shade of solar panels because they are spared from the direct sun,” Barron-Gafford said. “In fact, total chiltepin fruit production was three times greater under the PV panels in an agrivoltaic system, and tomato production was twice as great!”

Jalapenos produced a similar amount of fruit in both the agrivoltaics system and the traditional plot, but did so with 65% less transpirational water loss.

“At the same time, we found that each irrigation event can support crop growth for days, not just hours, as in current agriculture practices. This finding suggests we could reduce our water use but still maintain levels of food production,” Barron-Gafford added, noting that soil moisture remained approximately 15% higher in the agrivoltaics system than the control plot when irrigating every other day.

In addition to the benefits to the plants, the researchers also found that the agrivoltaics system increased the efficiency of energy production. Solar panels are inherently sensitive to temperature — as they warm, their efficiency drops. By cultivating crops underneath the PV panels, researchers were able to reduce the temperature of the panels.
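
This panel-temperature effect is commonly approximated with a linear temperature-coefficient model; typical crystalline-silicon modules lose roughly 0.3 to 0.5 percent of rated power per degree Celsius above 25 °C. The rated power, temperatures and coefficient below are assumptions for illustration, not values from the paper.

```python
# Linear temperature-coefficient model for PV output (STC reference: 25 C).
def pv_power_w(rated_w: float, cell_temp_c: float,
               temp_coeff_per_c: float = -0.004) -> float:
    """Output power scaled by a linear temperature coefficient."""
    return rated_w * (1.0 + temp_coeff_per_c * (cell_temp_c - 25.0))

hot = pv_power_w(300.0, 60.0)     # uncooled panel on a summer afternoon
cooled = pv_power_w(300.0, 51.0)  # panel cooled by transpiring crops below
print(f"{hot:.0f} W vs {cooled:.0f} W (+{cooled - hot:.0f} W from cooling)")
# -> 258 W vs 269 W (+11 W from cooling)
```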

“Those overheating solar panels are actually cooled down by the fact that the crops underneath are emitting water through their natural process of transpiration — just like misters on the patio of your favorite restaurant,” Barron-Gafford said. “All told, that is a win-win-win in terms of bettering how we grow our food, utilize our precious water resources, and produce renewable energy.”

Barron-Gafford’s research into agrivoltaics has expanded to include several solar installations on Tucson Unified School District, or TUSD, land. Moses Thompson, who splits his time between the TUSD and the UA School of Geography and Development, notes that the team is also using the TUSD solar installations to engage with K-12 students.

“What draws me to this work is what happens to the K-12 learner when their involvement is consequential and the research lives in their community,” Thompson said. “That shift in dynamics creates students who feel agency in addressing grand challenges such as climate change.”

The authors say more research with additional plant species is needed. They also note the currently unexplored impact agrivoltaics could have on the physical and social well-being of farm laborers. Preliminary data show that skin temperature can be about 18 degrees Fahrenheit cooler when working in an agrivoltaics area than in traditional agriculture.

“Climate change is already disrupting food production and farm worker health in Arizona,” said Gary Nabhan, an agroecologist in the UA Southwest Center and co-author on the paper. “The Southwestern U.S. sees a lot of heat stroke and heat-related death among our farm laborers; this could have a direct impact there, too.”

Barron-Gafford and the team are now working with the U.S. Department of Energy’s National Renewable Energy Lab to assess how well an agrivoltaics approach can work in other regions of the country and how regional policies can promote adoption of novel approaches to solve these pervasive problems.

“This is UA innovation at its best — an interdisciplinary team of researchers working to address some of our most challenging environmental dilemmas,” said co-author Andrea Gerlak, a professor in the School of Geography and Development in the College of Social and Behavioral Sciences. “Imagine the impact we can have in our community — and the larger world — by more creatively thinking about agriculture and renewable energy production together.”
