Why there is no speed limit in the superfluid universe

Physicists from Lancaster University have established why objects moving through superfluid helium-3 lack a speed limit, building on earlier Lancaster research.

Helium-3 is a rare isotope of helium with one neutron fewer than common helium-4. It becomes superfluid at extremely low temperatures, enabling unusual properties such as a lack of friction for moving objects.

It was thought that the speed of objects moving through superfluid helium-3 was fundamentally limited to the critical Landau velocity, and that exceeding this speed limit would destroy the superfluid. Prior experiments in Lancaster had found that this is not a strict rule: objects can move at much greater speeds without destroying the fragile superfluid state.

Now scientists from Lancaster University have found the reason for the absence of the speed limit: exotic particles that stick to all surfaces in the superfluid.

The discovery may guide applications in quantum technology, even quantum computing, where multiple research groups already aim to make use of these unusual particles.

To shake the bound particles into sight, the researchers cooled superfluid helium-3 to within one ten-thousandth of a degree of absolute zero (0.0001 K; absolute zero is −273.15 °C). They then moved a wire through the superfluid at high speed and measured how much force was needed to move the wire. Apart from an extremely small force related to moving the bound particles around when the wire starts to move, the measured force was zero.

Lead author Dr Samuli Autti said: “Superfluid helium-3 feels like vacuum to a rod moving through it, although it is a relatively dense liquid. There is no resistance, none at all. I find this very intriguing.”

PhD student Ash Jennings added: “By making the rod change its direction of motion we were able to conclude that the rod will be hidden from the superfluid by the bound particles covering it, even when its speed is very high.”

“The bound particles initially need to move around to achieve this, and that exerts a tiny force on the rod, but once this is done, the force just completely disappears,” said Dr Dmitry Zmeev, who supervised the project.

The Lancaster researchers included Samuli Autti, Sean Ahlstrom, Richard Haley, Ash Jennings, George Pickett, Malcolm Poole, Roch Schanen, Viktor Tsepelin, Jakub Vonka, Tom Wilcox, Andrew Woods and Dmitry Zmeev. The results are published in Nature Communications.

Story Source:

Materials provided by Lancaster University. Note: Content may be edited for style and length.



A tiny instrument to measure the faintest magnetic fields

Physicists at the University of Basel have developed a minuscule instrument able to detect extremely faint magnetic fields. At the heart of the superconducting quantum interference device are two atomically thin layers of graphene, which the researchers combined with boron nitride. Instruments like this one have applications in areas such as medicine, besides being used to research new materials.

To measure very small magnetic fields, researchers often use superconducting quantum interference devices, or SQUIDs. In medicine, their uses include monitoring brain or heart activity, for example, while in the earth sciences researchers use SQUIDs to characterize the composition of rocks or detect groundwater flows. The devices also have a broad range of uses in other applied fields and basic research.

The team led by Professor Christian Schönenberger of the University of Basel’s Department of Physics and the Swiss Nanoscience Institute has now succeeded in creating one of the smallest SQUIDs ever built. The researchers described their achievement in the scientific journal Nano Letters.

A superconducting ring with weak links

A typical SQUID consists of a superconducting ring interrupted at two points by an extremely thin film with normal conducting or insulating properties. These points, known as weak links, must be so thin that the electron pairs responsible for superconductivity are able to tunnel through them. Researchers recently also began using nanomaterials such as nanotubes, nanowires or graphene to fashion the weak links connecting the two superconductors.

As a result of their configuration, SQUIDs have a critical current threshold above which the resistance-free superconductor becomes a conductor with ordinary resistance. This critical threshold is determined by the magnetic flux passing through the ring. By measuring this critical current precisely, the researchers can draw conclusions about the strength of the magnetic field.
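The flux-to-current relation behind this readout can be sketched numerically. A minimal model, assuming an idealized symmetric DC SQUID (two identical junctions and negligible loop inductance, textbook simplifications rather than details of the Basel device):

```python
import math

PHI_0 = 2.067833848e-15  # magnetic flux quantum in weber (h / 2e)

def critical_current(flux_wb: float, i0_amp: float = 1e-6) -> float:
    """Critical current of an idealized symmetric DC SQUID.

    Assumes two identical junctions (each with critical current i0_amp)
    and negligible loop inductance, so I_c(Phi) = 2*I0*|cos(pi*Phi/Phi_0)|.
    """
    return 2 * i0_amp * abs(math.cos(math.pi * flux_wb / PHI_0))

# The critical current oscillates with one period per flux quantum:
# maximal at integer multiples of Phi_0, vanishing at half-integer
# multiples, which makes the ring a sensitive flux-to-current transducer.
print(critical_current(0.0))        # maximum, 2*I0
print(critical_current(PHI_0 / 2))  # minimum, ~0
```

Because the response repeats every flux quantum, even a tiny change in the magnetic flux through the loop shifts the measured critical current appreciably, which is the basis of the device's sensitivity.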

SQUIDs with six layers

“Our novel SQUID consists of a complex, six-layer stack of individual two-dimensional materials,” explains lead author David Indolese. Inside it are two graphene monolayers separated by a very thin layer of insulating boron nitride. “If two superconducting contacts are connected to this sandwich, it behaves like a SQUID — meaning it can be used to detect extremely weak magnetic fields.”

In this setup, the graphene layers are the weak links, although in contrast to a regular SQUID they are not positioned next to each other, but one on top of the other, aligned horizontally. “As a result, our SQUID has a very small surface area, limited only by the constraints of nanofabrication technology,” explains Dr. Paritosh Karnatak from Schönenberger’s team.

The tiny device for measuring magnetic fields is only around 10 nanometers high, thousands of times thinner than a human hair. It can detect supercurrents that flow in minuscule spaces. Moreover, its sensitivity can be adjusted by changing the distance between the graphene layers. With the help of electrical fields, the researchers are also able to increase the signal strength, further enhancing the measurement accuracy.

Analyzing topological insulators

The Basel research team’s primary goal in developing the novel SQUIDs was to analyze the edge currents of topological insulators. Topological insulators are currently a focus of countless research groups all over the world. On the inside, they behave like insulators, while on the outside — or along the edges — they conduct current almost losslessly, making them possible candidates for a broad range of applications in the field of electronics.

“With the new SQUID, we can determine whether these lossless supercurrents are due to a material’s topological properties, and thereby tell topological materials apart from non-topological ones. This is very important for the study of topological insulators,” says Schönenberger. In future, SQUIDs could also be used as low-noise amplifiers for high-frequency electrical signals, or to detect local brainwaves (magnetoencephalography), as their compact design means a large number of the devices can be connected in series.

The paper is the outcome of close collaboration among groups at the University of Basel, the University of Budapest and the National Institute for Materials Science in Tsukuba, Japan.



Quantum leap for speed limit bounds

Nature’s speed limits aren’t posted on road signs, but Rice University physicists have discovered a new way to deduce them that is better — infinitely better, in some cases — than previous methods.

“The big question is, ‘How fast can anything — information, mass, energy — move in nature?'” said Kaden Hazzard, a theoretical quantum physicist at Rice. “It turns out that if somebody hands you a material, it is incredibly difficult, in general, to answer the question.”

In a study published today in the American Physical Society journal PRX Quantum, Hazzard and Rice graduate student Zhiyuan Wang describe a new method for calculating the upper bound of speed limits in quantum matter.

“At a fundamental level, these bounds are much better than what was previously available,” said Hazzard, an assistant professor of physics and astronomy and member of the Rice Center for Quantum Materials. “This method frequently produces bounds that are 10 times more accurate, and it’s not unusual for them to be 100 times more accurate. In some cases, the improvement is so dramatic that we find finite speed limits where previous approaches predicted infinite ones.”

Nature’s ultimate speed limit is the speed of light, but in nearly all matter around us, the speed of energy and information is much slower. Frequently, it is impossible to describe this speed without accounting for the large role of quantum effects.

In the 1970s, physicists proved that information must move much slower than the speed of light in quantum materials, and though they could not compute an exact solution for the speeds, physicists Elliott Lieb and Derek Robinson pioneered mathematical methods for calculating the upper bounds of those speeds.

“The idea is that even if I can’t tell you the exact top speed, can I tell you that the top speed must be less than a particular value,” Hazzard said. “If I can give a 100% guarantee that the real value is less than that upper bound, that can be extremely useful.”

Hazzard said physicists have long known that some of the bounds produced by the Lieb-Robinson method are “ridiculously imprecise.”

“It might say that information must move less than 100 miles per hour in a material when the real speed was measured at 0.01 miles per hour,” he said. “It’s not wrong, but it’s not very helpful.”

The more accurate bounds described in the PRX Quantum paper were calculated by a method Wang created.

“We invented a new graphical tool that lets us account for the microscopic interactions in the material instead of relying only on cruder properties such as its lattice structure,” Wang said.

Hazzard said Wang, a third-year graduate student, has an incredible talent for synthesizing mathematical relationships and recasting them in new terms.

“When I check his calculations, I can go step by step, churn through the calculations and see that they’re valid,” Hazzard said. “But to actually figure out how to get from point A to point B, what set of steps to take when there’s an infinite variety of things you could try at each step, the creativity is just amazing to me.”

The Wang-Hazzard method can be applied to any material made of particles moving in a discrete lattice. That includes oft-studied quantum materials like high-temperature superconductors, topological materials, heavy fermions and others. In each of these, the behavior of the materials arises from interactions of billions upon billions of particles, whose complexity is beyond direct calculation.

Hazzard said he expects the new method to be used in several ways.

“Besides the fundamental nature of this, it could be useful for understanding the performance of quantum computers, in particular in understanding how long they take to solve important problems in materials and chemistry,” he said.

Hazzard said he is certain the method will also be used to develop numerical algorithms because Wang has shown it can put rigorous bounds on the errors produced by oft-used numerical techniques that approximate the behavior of large systems.

A popular technique physicists have used for more than 60 years is to approximate a large system by a small one that can be simulated by a computer.

“We draw a small box around a finite chunk, simulate that and hope that’s enough to approximate the gigantic system,” Hazzard said. “But there has not been a rigorous way of bounding the errors in these approximations.”

The Wang-Hazzard method of calculating bounds could lead to just that.

“There is an intrinsic relationship between the error of a numerical algorithm and the speed of information propagation,” Wang explained, using the sound of his voice and the walls in his room to illustrate the link.

“The finite chunk has edges, just as my room has walls. When I speak, the sound will get reflected by the wall and echo back to me. In an infinite system, there is no edge, so there is no echo.”

In numerical algorithms, errors are the mathematical equivalent of echoes. They reverberate from the edges of the finite box, and the reflection undermines the algorithms’ ability to simulate the infinite case. The faster information moves through the finite system, the shorter the time the algorithm faithfully represents the infinite. Hazzard said he, Wang and others in his research group are using their method to craft numerical algorithms with guaranteed error bars.
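The echo picture maps onto a back-of-the-envelope bound. A toy sketch (the box size and speed bound are illustrative numbers, not values from the paper):

```python
def faithful_time(box_size: float, lr_speed: float) -> float:
    """Toy version of the echo picture: an observer at the center of a
    finite simulation box of linear size box_size only notices the
    boundary once a signal has traveled to the edge and back, and a
    Lieb-Robinson-type bound caps that signal's speed at lr_speed."""
    round_trip = box_size  # half the box out, half the box back
    return round_trip / lr_speed

# A tighter (smaller) certified speed bound directly lengthens the
# window during which the finite simulation provably matches the
# infinite system.
print(faithful_time(100.0, 10.0))  # 10.0 time units
print(faithful_time(100.0, 1.0))   # 100.0 time units
```

This is why sharper speed bounds translate into guaranteed error bars: the simulation is provably echo-free for times shorter than the round trip.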

“We don’t even have to change the existing algorithms to put strict, guaranteed error bars on the calculations,” he said. “But you can also flip it around and use this to make better numerical algorithms. We’re exploring that, and other people are interested in using these as well.”



New method to track ultrafast change of magnetic state

An international team of physicists from Bielefeld University, Uppsala University, the University of Strasbourg, the University of Shanghai for Science and Technology, the Max Planck Institute for Polymer Research, ETH Zurich and the Free University of Berlin has developed a precise method to measure the ultrafast change of a magnetic state in materials. They do this by observing the emission of terahertz radiation that necessarily accompanies such a magnetization change.

Magnetic memories are not just acquiring higher and higher capacity by shrinking the size of magnetic bits; they are also getting faster. In principle, a magnetic bit can be ‘flipped’ — that is, it can change its state from ‘one’ to ‘zero’ or vice versa — in under one picosecond, one millionth of one millionth of a second. This could allow magnetic memories to operate at terahertz switching frequencies, corresponding to extremely high data rates of terabits per second (Tbit/s).
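The headline numbers follow from simple unit arithmetic; a quick sketch:

```python
# The arithmetic behind the claim: a bit that flips in one picosecond
# supports a terahertz switching rate, and one bit per cycle then
# corresponds to a terabit-per-second data rate on a single bit line.
flip_time_s = 1e-12                    # one picosecond, in seconds
switching_rate_hz = 1.0 / flip_time_s  # ~1e12 Hz, i.e. 1 THz
data_rate_tbit_s = switching_rate_hz / 1e12

print(switching_rate_hz)
print(data_rate_tbit_s)
```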

‘The actual challenge is to be able to detect such a magnetization change quickly and sensitively enough,’ explains Dr Dmitry Turchinovich, professor of physics at Bielefeld University and the leader of this study. ‘The existing methods of ultrafast magnetometry all suffer from certain significant drawbacks such as, for example, operation only under ultrahigh vacuum conditions, the inability to measure on encapsulated materials, and so on. Our idea was to use the basic principle of electrodynamics. This states that a change in the magnetization of a material must result in the emission of electromagnetic radiation containing the full information on this magnetization change. If the magnetization in a material changes on a picosecond timescale, then the emitted radiation will belong to the terahertz frequency range. The problem is that this radiation, known as “magnetic dipole emission,” is very weak, and can be easily obscured by light emission of other origins.’

Wentao Zhang, a PhD student in the lab of Professor Dmitry Turchinovich and the first author of the published paper, says: ‘It took us time, but finally we succeeded in isolating precisely this magnetic dipole terahertz emission that allowed us to reliably reconstruct the ultrafast magnetization dynamics in our samples: encapsulated iron nanofilms.’

In their experiments, the researchers sent very short pulses of laser light onto the iron nanofilms, causing them to demagnetize very quickly. At the same time, they were collecting the terahertz light emitted during such a demagnetization process. The analysis of this terahertz emission yielded the precise temporal evolution of a magnetic state in the iron film.

‘Once our analysis was finished, we realized that we actually saw far more than what we had expected,’ continues Dmitry Turchinovich. ‘It has already been known for some time that iron can demagnetize very quickly when illuminated by laser light. But what we also saw was a reasonably small but very clear additional signal in magnetization dynamics. This got us all very excited. This signal came from the demagnetization in iron — actually driven by the propagation of a very fast pulse of sound through our sample. Where did this sound come from? Very easy: when the iron film absorbed the laser light, it not only demagnetized, it also became hot. As we know, most materials expand when they get hot — and this expansion of the iron nanofilm launched a pulse of terahertz ultrasound within our sample structure. This sound pulse was bouncing back and forth between the sample boundaries, internal and external, like the echo between the walls of a big hall. And each time this echo passed through the iron nanofilm, the pressure of sound moved the iron atoms a little bit, and this further weakened the magnetism in the material.’ This effect has never been observed before on such an ultrafast timescale.

‘We are very happy that we could see this acoustically-driven ultrafast magnetization signal so clearly, and that it was so relatively strong. It was amazing that detecting it with THz radiation, which has a sub-mm wavelength, worked so well, because the expansion in the iron film is only tens of femtometres, which is ten orders of magnitude smaller,’ says Dr Peter M. Oppeneer, a professor of physics at Uppsala University, who led the theoretical part of this study.

Dr. Pablo Maldonado, a colleague of Peter M. Oppeneer who performed the numerical calculations that were crucial for explaining the observations in this work, adds: ‘What I find extremely exciting is an almost perfect match between the experimental data and our first-principles theoretical calculations. This confirms that our experimental method of ultrafast terahertz magnetometry is indeed very accurate and also sensitive enough, because we were able to distinguish clearly between the ultrafast magnetic signals of different origins: electronic and acoustic.’

The remaining co-authors of this publication have dedicated it to the memory of their colleague and a pioneer in the field of ultrafast magnetism, Dr. Eric Beaurepaire from the University of Strasbourg. He was one of the originators of this study, but passed away during its final stages.

Story Source:

Materials provided by Bielefeld University. Note: Content may be edited for style and length.



New insights into lung tissue in COVID-19 disease

Physicists at the University of Göttingen, together with pathologists and lung specialists at the Medical University of Hannover, have developed a technique that enables high-resolution, three-dimensional imaging of damaged lung tissue following severe Covid-19. Using a special X-ray microscopy technique, they were able to image changes caused by the coronavirus in the structure of alveoli (the tiny air sacs in the lung) and the vasculature. The results of the study were published in the research journal eLife.

In severe Covid-19 disease, the researchers observed significant changes in the vasculature, inflammation, blood clots and “hyaline membranes,” which are composed of proteins and dead cells deposited on the alveolar walls and which make gas exchange difficult or impossible. With their new imaging approach, these changes can be visualized for the first time in larger tissue volumes, without cutting and staining or damaging the tissue as in conventional histology. It is particularly well suited for tracing small blood vessels and their branches in three dimensions, localizing cells of the immune system that are recruited to inflammation sites, and measuring the thickness of the alveolar walls. Due to the three-dimensional reconstruction, the data could also be used to simulate gas exchange.

“Using zoom tomography, large areas of lung tissue embedded in wax can be scanned, enabling detailed examination to locate particularly interesting areas around inflammation, blood vessels or bronchial tubes,” says lead author Professor Tim Salditt from the Institute of X-ray Physics at the University of Göttingen. Since X-rays penetrate deep into tissue, this enables scientists to understand the relation between the microscopic tissue structure and the larger functional architecture of an organ. This is important, for example, to visualize the tree of blood vessels down to the smallest capillaries.

The authors foresee that this new X-ray technique will be an extension to traditional histology and histopathology, areas of study which go back to the 19th century when optical microscopes had just become available and pathologists could thereby unravel the microscopic origins of many diseases. Even today, pathologists still follow the same basic steps to prepare and investigate tissue: chemical fixation, slicing, staining and microscopy. This traditional approach, however, is not sufficient if three-dimensional images are required or if large volumes have to be screened, digitalized or analysed with computer programmes.

Three-dimensional imaging is well known from medical computerized tomography (CT). However, the resolution and contrast of this conventional technique are not sufficient to detect the tissue structure with cellular or sub-cellular resolution. Therefore, the authors used “phase contrast,” which exploits the different propagation velocities of X-rays in tissue to generate an intensity pattern on the detector. Salditt and his research group at the Institute for X-ray Physics developed special illumination optics and algorithms to reconstruct sharp images from these patterns, an approach which they have now adapted for the study of lung tissue affected by severe progression of Covid-19. The Göttingen team could record lung tissue at scalable size and resolution, yielding both larger overviews and close-up reconstructions. Depending on the setting, their method can even yield structural details below the resolution of conventional light microscopy. To achieve this, the researchers used highly powerful X-ray radiation generated at the PETRA III storage ring of the German Electron Synchrotron (DESY) in Hamburg.
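The way free-space propagation turns an invisible phase shift into measurable intensity can be illustrated with a one-dimensional angular-spectrum calculation. This is a generic sketch with illustrative parameters, not the authors' reconstruction code:

```python
import numpy as np

# Illustrative parameters (not from the study): a 1D pure-phase object
# sampled on a 1-micron grid, 0.1 nm X-rays, 0.5 m propagation distance.
N, dx = 1024, 1e-6   # samples, grid spacing (m)
lam, z = 1e-10, 0.5  # wavelength (m), propagation distance (m)

x = (np.arange(N) - N / 2) * dx
phi = np.exp(-x**2 / (2 * (20e-6) ** 2))  # smooth 1-rad phase bump
u0 = np.exp(1j * phi)                     # unit-amplitude wave field

# A pure phase object is invisible in the contact plane...
I0 = np.abs(u0) ** 2

# ...but Fresnel propagation (quadratic-phase transfer function applied
# in the Fourier domain) converts phase gradients into intensity
# variations at the detector: edge-enhanced phase contrast.
f = np.fft.fftfreq(N, dx)
H = np.exp(-1j * np.pi * lam * z * f**2)
uz = np.fft.ifft(np.fft.fft(u0) * H)
Iz = np.abs(uz) ** 2

print(I0.std())  # ~0: no contrast without propagation
print(Iz.std())  # clearly nonzero: phase made visible
```

The reconstruction algorithms mentioned in the article solve the inverse of this forward model, recovering the phase (and hence the tissue structure) from the measured intensity pattern.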

As was the case when the modern microscope was invented 150 years ago, significant progress has resulted from collaboration between physicists and medical researchers. The interdisciplinary research team hopes that the new method will support the development of treatment methods, medicines to prevent or alleviate severe lung damage in Covid-19, or to promote regeneration and recovery. “It is only when we can clearly see and understand what is really going on, that we can develop targeted interventions and drugs,” adds Danny Jonigk (Medical University Hannover), who led the medical part of the interdisciplinary study.

Story Source:

Materials provided by University of Göttingen. Note: Content may be edited for style and length.



A quantum thermometer to measure the coldest temperatures in the universe

Physicists from Trinity College Dublin have proposed a thermometer based on quantum entanglement that can accurately measure temperatures a billion times colder than those in outer space.

These ultra-cold temperatures arise in clouds of atoms, known as Fermi gases, which are created by scientists to study how matter behaves in extreme quantum states.

The work was led by Trinity's QuSys team: postdoctoral fellows Dr Mark Mitchison and Dr Giacomo Guarnieri, and Professor John Goold. It was carried out in collaboration with Professor Steve Campbell (UCD) and with Dr Thomas Fogarty and Professor Thomas Busch at OIST in Okinawa, Japan.

Discussing the proposal, Professor Goold, head of Trinity’s QuSys group, explains what an ultra-cold gas is. He said:

“The standard way in which a physicist thinks about a gas is to use a theory known as statistical mechanics. This theory was invented by giants of physics such as Maxwell and Boltzmann in the 19th century. These guys revived an old idea from the Greek philosophers that macroscopic phenomena, such as pressure and temperature, could be understood in terms of the microscopic motion of atoms. We need to remember that at the time, the idea that matter was made of atoms was revolutionary.”

“At the dawn of the 20th century, another theory came to fruition. This is quantum mechanics and it may be the most important and accurate theory we have in physics. A famous prediction of quantum mechanics is that single atoms acquire wave-like features, which means that below a critical temperature they can combine with other atoms into a single macroscopic wave with exotic properties. This prediction led to a century-long experimental quest to reach the critical temperature. Success was finally achieved in the 90s with the creation of the first ultra-cold gases, cooled with lasers (Nobel Prize 1997) and trapped with strong magnetic fields — a feat which won the Nobel Prize in 2001.”

“Ultra-cold gases like these are now routinely created in labs worldwide and they have many uses, ranging from testing fundamental physics theories to detecting gravitational waves. But their temperatures are mind-bogglingly low, at nanokelvin and below! Just to give you an idea: one kelvin is −272.15 degrees Celsius, just one degree above absolute zero. These gases are a billion times colder than that — the coldest places in the universe, and they are created right here on Earth.”

So what exactly is a Fermi gas?

“All particles in the universe, including atoms, come in one of two types called ‘bosons’ and ‘fermions’. A Fermi gas comprises fermions, named after the physicist Enrico Fermi. At very low temperatures, bosons and fermions behave completely differently. While bosons like to clump together, fermions do the opposite. They are the ultimate social distancers! This property actually makes their temperature tricky to measure.”

Dr Mark Mitchison, the first author of the paper, explains:

“Traditionally, the temperature of an ultra-cold gas is inferred from its density: at lower temperatures the atoms do not have enough energy to spread far apart, making the gas denser. But fermions always keep far apart, even at ultra-low temperatures, so at some point the density of a Fermi gas tells you nothing about temperature.”

“Instead, we proposed using a different kind of atom as a probe. Let’s say that you have an ultra-cold gas made of lithium atoms. You now take a different atom, say potassium, and dunk it into the gas. Collisions with the surrounding atoms change the state of your potassium probe and this allows you to infer temperature. Technically speaking, our proposal involves creating a quantum superposition: a weird state where the probe atom simultaneously does and doesn’t interact with the gas. We showed that this superposition changes over time in a way that is very sensitive to temperature.”
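A heavily simplified numerical sketch of this idea: assume, purely for illustration (the coupling constant and the linear temperature dependence below are hypothetical, not the paper's actual impurity-gas response), that the probe's coherence decays exponentially at a rate proportional to temperature. A measured coherence then pins down the temperature by inverting that response:

```python
import math

# Hypothetical dephasing constant linking temperature to decay rate;
# the linear law C(t) = exp(-A * T * t) is a toy assumption chosen
# only to illustrate the inference step.
A_COUPLING = 2.0e6  # 1 / (kelvin * second)

def coherence(temp_k: float, t_s: float) -> float:
    """Simulated probe coherence after interacting with the gas for t_s."""
    return math.exp(-A_COUPLING * temp_k * t_s)

def infer_temperature(measured_c: float, t_s: float) -> float:
    """Invert the assumed response curve, just as reading a mercury
    thermometer inverts a known expansion-vs-temperature curve."""
    return -math.log(measured_c) / (A_COUPLING * t_s)

true_t = 50e-9                         # a 50-nanokelvin gas
c = coherence(true_t, t_s=1e-3)        # simulated readout after 1 ms
print(infer_temperature(c, t_s=1e-3))  # recovers ~5e-08 kelvin
```

The actual proposal replaces the assumed exponential with the exactly computed time evolution of the probe's superposition, but the logic of the readout is the same: a known temperature-dependent response curve is inverted.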

Dr Giacomo Guarnieri gives the following analogy:

“A thermometer is just a system whose physical properties change with temperature in a predictable way. For example, you can take the temperature of your body by measuring the expansion of mercury in a glass tube. Our thermometer works in an analogous way, but instead of mercury we measure the state of single atoms that are entangled (or correlated) with a quantum gas.”

Professor Steve Campbell, UCD, remarks:

“This isn’t just a far-flung idea — what we are proposing here can actually be implemented using technology available in modern atomic physics labs. That such fundamental physics can be tested is really amazing. Among the various emerging quantum technologies, quantum sensors like our thermometer are likely to make the most immediate impact, so it is a timely work and it was highlighted by the editors of Physical Review Letters for that reason.”

Professor Goold adds:

“In fact one of the reasons that this paper was highlighted was precisely because we performed calculations and numerical simulations with a particular focus on an experiment that was performed in Austria and published a few years ago in Science. Here the Fermi gas is a dilute gas of trapped lithium atoms in contact with potassium impurities. The experimentalists are able to control the quantum state with radio frequency pulses and measure out information on the gas. These are operations that are routinely used in other quantum technologies.”

“The timescales that are accessible are simply amazing and would be unprecedented in traditional condensed matter physics experiments. We are excited that our idea to use these impurities as a quantum thermometer with exquisite precision could be implemented and tested with existing technology.”

Professor Goold and his QuSys research group are supported by Science Foundation Ireland. He is the recipient of a Royal Society University Research Fellowship and a European Research Council Starting Grant, and he has recently been elected a Fellow of the Young Academy of Europe.



Physicists cast doubt on neutrino theory

University of Cincinnati physicists, as part of an international research team, are raising doubts about the existence of an exotic subatomic particle that failed to show up in twin experiments.

UC College of Arts and Sciences associate professor Alexandre Sousa and assistant professor Adam Aurisano took part in an experiment at the Fermi National Accelerator Laboratory in search of sterile neutrinos, a suspected fourth “flavor” of neutrino that would join the ranks of muon, tau and electron neutrinos as elementary particles that make up the known universe.

Finding a fourth type of neutrino would be huge, Sousa said. It would redefine our understanding of elementary particles and their interactions in what’s known as the Standard Model.

Researchers in two experiments called Daya Bay and MINOS+ collaborated on complementary projects in an intense effort to find sterile neutrinos using some of the world’s most advanced and precise tools.

“We apparently don’t see any evidence for them,” Aurisano said.

The study was published in the journal Physical Review Letters and was featured in Physics Magazine, published by the American Physical Society.

“It’s an important result for particle physics,” Sousa said. “It provides an almost definitive answer to a question that has been lingering for over 20 years.”

The research builds on previous studies that offered tantalizing possibilities for finding sterile neutrinos. But the new results suggest sterile neutrinos might not have been responsible for the anomalies researchers previously observed, Aurisano said.

“Our results are incompatible with the sterile neutrino interpretation of the anomalies,” he said. “So these experiments remove a possibility — the leading possibility — that oscillations into sterile neutrinos solely explain these anomalies.”

Neutrinos are tiny, so tiny they can’t be broken down into something smaller. They are so small that they pass through virtually everything — mountains, lead vaults, you — by the trillions every second at virtually the speed of light. They are generated by the nuclear fusion reactions powering the sun, radioactive decays in nuclear reactors or in the Earth’s crust, and in particle accelerator labs, among other sources.

And as they travel, they often transition from one type (tau, electron, muon) to another or back.
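The rate of this flavor change is governed by the standard two-flavor vacuum oscillation formula; a minimal sketch with illustrative parameters (the mixing angle, mass splitting, baseline and beam energy below are round MINOS-scale numbers, not the experiments' fitted values):

```python
import math

def oscillation_probability(theta: float, dm2_ev2: float,
                            baseline_km: float, energy_gev: float) -> float:
    """Standard two-flavor vacuum oscillation probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV])."""
    return (math.sin(2 * theta) ** 2
            * math.sin(1.27 * dm2_ev2 * baseline_km / energy_gev) ** 2)

# Illustrative, MINOS-like inputs: maximal mixing, dm2 ~ 2.4e-3 eV^2,
# a ~735 km baseline and a 3 GeV beam energy.
p = oscillation_probability(math.pi / 4, 2.4e-3, 735.0, 3.0)
print(p)  # a sizeable fraction of muon neutrinos change flavor en route

# Built-in sanity checks: no distance or no mixing means no oscillation.
print(oscillation_probability(math.pi / 4, 2.4e-3, 0.0, 3.0))  # 0.0
print(oscillation_probability(0.0, 2.4e-3, 735.0, 3.0))        # 0.0
```

Searches for a sterile neutrino look for a disappearance signal of exactly this form, but driven by an extra mass splitting and mixing angle that the three known flavors cannot supply.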

But theorists have suggested there might be a fourth neutrino that interacts only through gravity, making it far harder to detect than the other three, which also interact with matter through the weak nuclear force.

The Daya Bay experiment comprises eight detectors arrayed around six nuclear reactors outside Hong Kong. MINOS+ used a particle accelerator in Illinois to send a beam of neutrinos 456 miles through the Earth to detectors in Minnesota.

“We would all have been absolutely thrilled to find evidence for sterile neutrinos, but the data we have collected so far do not support any kind of sterile neutrino oscillation,” said Pedro Ochoa-Ricoux, associate professor at the University of California, Irvine.

Researchers expected to see muon neutrinos seemingly vanish into thin air when they transitioned into sterile neutrinos. But that’s not what happened.

“We expected to see muon neutrinos oscillating to sterile neutrinos and disappear,” Aurisano said.

Despite the findings, Aurisano said he thinks sterile neutrinos do exist, at least in some form.

“I think sterile neutrinos are more likely than not to exist at high energies. At the very beginning of the universe, you’d expect there would be sterile neutrinos,” he said. “Without them, it’s hard to explain aspects of neutrino mass.”

But Aurisano is skeptical about finding the light sterile neutrinos that many theorists expected these experiments to reveal.

“Our experiment disfavors light or lower-mass sterile neutrinos,” he said.

Sousa said some of his research was truncated somewhat by the global COVID-19 pandemic when Fermilab shut down accelerator operations months earlier than expected. But researchers continued to use massive supercomputers to examine data from the experiments, even while working from home during the quarantine.

“It’s one of the blessings of high energy physics,” Aurisano said. “Fermilab has all the data online and the computing infrastructure is spread out around the world. So as long as you have the internet you can access all the data and all the computational facilities to do the analyses.”

Still, Aurisano said it takes some adjusting to work from home.

“It was easier when I had dedicated hours at the office. It’s a challenge sometimes to work from home,” he said.



Physicists find misaligned carbon sheets yield unparalleled properties

A material composed of two one-atom-thick layers of carbon has grabbed the attention of physicists worldwide for its intriguing — and potentially exploitable — conductive properties.

Dr. Fan Zhang, assistant professor of physics in the School of Natural Sciences and Mathematics at The University of Texas at Dallas, and physics doctoral student Qiyue Wang published an article in June in Nature Photonics, together with Dr. Fengnian Xia’s group at Yale University, that describes how the ability of twisted bilayer graphene to conduct electrical current changes in response to mid-infrared light.

From One to Two Layers

Graphene is a single layer of carbon atoms arranged in a flat honeycomb pattern, where each hexagon is formed by six carbon atoms at its vertices. Since graphene’s first isolation in 2004, its unique properties have been intensely studied by scientists for potential use in advanced computers, materials and devices.

If two sheets of graphene are stacked on top of one another, and one layer is rotated so that the layers are slightly out of alignment, the resulting physical configuration, called twisted bilayer graphene, yields electronic properties that differ significantly from those exhibited by a single layer alone or by two aligned layers.

“Graphene has been of interest for about 15 years,” Zhang said. “A single layer is interesting to study, but if we have two layers, their interaction should render much richer and more interesting physics. This is why we want to study bilayer graphene systems.”

A New Field Emerges

When the graphene layers are misaligned, a new periodic pattern emerges in the combined mesh, called a moiré pattern. The moiré unit cell is also a hexagon, but it can be made up of more than 10,000 carbon atoms.

“The angle at which the two layers of graphene are misaligned — the twist angle — is critically important to the material’s electronic properties,” Wang said. “The smaller the twist angle, the larger the moiré periodicity.”
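Wang's rule of thumb follows from the standard small-angle moiré relation L = a / (2 sin(θ/2)), where a ≈ 0.246 nm is graphene's lattice constant. The formula and constant are textbook values, not taken from the article; a minimal sketch:

```python
import math

A_GRAPHENE_NM = 0.246  # graphene lattice constant, ~0.246 nm (textbook value)

def moire_period_nm(twist_deg):
    """Moiré superlattice period for two identical hexagonal lattices
    rotated by a small angle: L = a / (2 * sin(theta / 2))."""
    theta = math.radians(twist_deg)
    return A_GRAPHENE_NM / (2 * math.sin(theta / 2))

# Smaller twist angle -> larger moiré period, as Wang notes.
# Angles taken from the article: 1.8 deg, the 1.1 deg "magic angle", 0.5 deg.
for angle in (1.8, 1.1, 0.5):
    print(f"{angle} deg -> {moire_period_nm(angle):.1f} nm")
```

At the 1.1-degree magic angle this gives a period of roughly 13 nm — about 50 times the graphene lattice constant, which is why a single moiré cell can contain thousands of carbon atoms.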

The unusual effects of specific twist angles on electron behavior were first proposed in a 2011 article by Dr. Allan MacDonald, professor of physics at UT Austin, and Dr. Rafi Bistritzer. Zhang witnessed the birth of this field as a doctoral student in MacDonald’s group.

“At that time, others really paid no attention to the theory, but now it has become arguably the hottest topic in physics,” Zhang said.

In that 2011 research, MacDonald and Bistritzer predicted that electrons’ kinetic energy can vanish in a graphene bilayer misaligned by the so-called “magic angle” of 1.1 degrees. In 2018, researchers at the Massachusetts Institute of Technology confirmed the prediction, finding that offsetting two graphene layers by 1.1 degrees produced a two-dimensional superconductor, a material that conducts electrical current with no resistance and no energy loss.

In a 2019 article in Science Advances, Zhang and Wang, together with Dr. Jeanie Lau’s group at The Ohio State University, showed that when offset by 0.93 degrees, twisted bilayer graphene exhibits both superconducting and insulating states, thereby significantly widening the range of magic angles.

“In our previous work, we saw superconductivity as well as insulation. That’s what’s making the study of twisted bilayer graphene such a hot field — superconductivity. The fact that you can manipulate pure carbon to superconduct is amazing and unprecedented,” Wang said.

New UT Dallas Findings

In the most recent research, published in Nature Photonics, Zhang and his collaborators at Yale investigated whether and how twisted bilayer graphene interacts with mid-infrared light, which humans can’t see but can detect as heat. “Interactions between light and matter are useful in many devices — for example, converting sunlight into electrical power,” Wang said. “Almost every object emits infrared light, including people, and this light can be detected with devices.”

Zhang is a theoretical physicist, so he and Wang set out to determine how mid-infrared light might affect the conductance of electrons in twisted bilayer graphene. Their work involved calculating the light absorption based on the moiré pattern’s band structure, a concept that determines how electrons move in a material quantum mechanically.

“There are standard ways to calculate the band structure and light absorption in a regular crystal, but this is an artificial crystal, so we had to come up with a new method,” Wang said. Using resources of the Texas Advanced Computing Center, a supercomputer facility on the UT Austin campus, Wang calculated the band structure and showed how the material absorbs light.

The Yale group fabricated devices and ran experiments showing that the mid-infrared photoresponse — the increase in conductance due to the light shining — was unusually strong and largest at the twist angle of 1.8 degrees. The strong photoresponse vanished for a twist angle less than 0.5 degrees.

“Our theoretical results not only matched well with the experimental findings, but also pointed to a mechanism that is fundamentally connected to the period of the moiré pattern, which itself is connected to the twist angle between the two graphene layers,” Zhang said.

Next Step

“The twist angle is clearly very important in determining the properties of twisted bilayer graphene,” Zhang added. “The question arises: Can we apply this to tune other two-dimensional materials to get unprecedented features? Also, can we combine the photoresponse and the superconductivity in twisted bilayer graphene? For example, can shining a light induce or somehow modulate superconductivity? That will be very interesting to study.”

“This new breakthrough will potentially enable a new class of infrared detectors based on graphene with high sensitivity,” said Dr. Joe Qiu, program manager for solid-state electronics and electromagnetics at the U.S. Army Research Office (ARO), an element of the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory. “These new detectors will potentially impact applications such as night vision, which is of critical importance for the U.S. Army.”

In addition to the Yale researchers, other authors included scientists from the National Institute for Materials Science in Japan. The ARO, the National Science Foundation and the Office of Naval Research supported the study.



Spintronics: Faster data processing through ultrashort electric pulses

Physicists at Martin Luther University Halle-Wittenberg (MLU) and Lanzhou University in China have developed a simple concept that could significantly improve magnetic data processing. Using ultrashort electric pulses in the terahertz range, data can be written, read and erased very quickly, making data processing faster, more compact and more energy efficient. The researchers confirmed their theory by running complex simulations; the results were published in the journal NPG Asia Materials.

Magnetic data storage is indispensable for securely storing the huge amounts of data generated every day, for instance through social networks. Once stored, the information can still be retrieved after many years. Charge-based data storage, used for example in mobile phones, is much more short-lived when there is no energy supply. Traditional magnetic hard drives and components have disadvantages of their own: moving mechanical parts and the need for magnetic fields make them more power-consuming and relatively slow at reading and writing data.

“We were after a fast and energy-efficient alternative,” explains Professor Jamal Berakdar from the Institute of Physics at MLU. He and his colleagues from Lanzhou University came up with a simple idea: by using ultrashort pulses in the terahertz range, information could be written into magnetic nano-vortices and retrieved within picoseconds. Theoretically, this makes possible billions of read and write operations per second without the need for magnetic fields. “With appropriately shaped pulses, the data can be processed very quickly at low energy cost,” says Berakdar. The new concept is based on existing terahertz and magnetism technologies. “It exploits advances in electric pulse generation and nanomagnetism.”
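The arithmetic behind "billions of operations per second" is straightforward. This back-of-the-envelope sketch assumes an illustrative switching time of ~10 picoseconds, which is an assumption for the sake of the estimate, not a figure from the paper:

```python
# Back-of-the-envelope check of the claim (assumed numbers, not from
# the paper): if one write or read operation takes ~10 picoseconds,
# the theoretical ceiling is on the order of 100 billion operations
# per second — comfortably "billions per second".
switch_time_s = 10e-12                # assumed switching time: 10 ps
ops_per_second = 1.0 / switch_time_s  # ~1e11 operations per second
```

Even an order-of-magnitude slower switching time would still support the billions-per-second figure quoted above.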

So far, the method has been tested in computer simulations. “In recent years there have been fantastic advances in generating and controlling electrical pulses,” says Berakdar. Therefore, it makes sense to explore new ways to apply these pulses to data storage. The concept presented by the researchers offers a simple tool for controlling magnetic nano-vortices and can therefore be directly utilised for new storage technologies.

Story Source:

Materials provided by Martin-Luther-Universität Halle-Wittenberg.



How material defects influence the melting process

In 1972, physicists J. Michael Kosterlitz and David Thouless published a groundbreaking theory of how phase changes could occur in two-dimensional materials. Experiments soon showed that the theory correctly captured the process of a helium film transitioning from a superfluid to a normal fluid, helping to usher in a new era of research on ultra-thin materials, not to mention earning Kosterlitz, a professor at Brown University, and Thouless shares of the 2016 Nobel Prize in Physics.

But the Kosterlitz-Thouless (K-T) theory aimed to explain more than the superfluid transition. The pair also hoped it might explain how a two-dimensional solid could melt into a liquid, but experiments so far have failed to clearly validate the theory in that case. Now, new research by another group of Brown physicists could help explain the mismatch between theory and experiment.

The research, published in Proceedings of the National Academy of Sciences, shows how impurities — “extra” atoms in the crystalline structure of a material — can disrupt the order of a system and cause melting to begin before the K-T theory predicts it should. The findings are a step toward a more complete physical theory of melting, the researchers say.

“The solid-liquid transition is something we’re all familiar with, yet it’s a profound failure of modern physics that we still don’t understand exactly how it happens,” said Xinsheng Ling, a professor of physics at Brown and senior author of the new paper. “What we showed is that impurities — which are not included in K-T theory but are always found in real materials — play a major role in the melting process.”

While the details remain a major mystery, scientists have a basic understanding of how solids melt. As temperature increases, atoms in the crystalline lattice of a solid start to jiggle around. If the jiggling becomes too violent for the lattice to hold together, the solid melts into a liquid. But how exactly the melting process starts and why it starts in certain places in a solid instead of others aren’t known.
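The "jiggling" picture above is often made quantitative with the empirical Lindemann criterion, which is not part of this study: melting sets in roughly when the root-mean-square thermal displacement of atoms exceeds about 10% of the lattice spacing. A minimal sketch, with illustrative numbers:

```python
# Empirical Lindemann criterion (textbook rule of thumb, not from the
# paper): a crystal melts roughly when RMS thermal displacement
# exceeds ~10% of the lattice spacing. The threshold is material-
# dependent; 0.1 is a commonly quoted ballpark value.
LINDEMANN_FRACTION = 0.1

def is_melting(rms_displacement_nm, lattice_spacing_nm):
    """True if the jiggling is violent enough, by the Lindemann rule,
    for the lattice to lose hold of its atoms."""
    return rms_displacement_nm / lattice_spacing_nm > LINDEMANN_FRACTION

# Illustrative lattice spacing of 0.4 nm:
cold = is_melting(0.02, 0.4)  # displacement is 5% of spacing
hot = is_melting(0.05, 0.4)   # displacement is 12.5% of spacing
```

The criterion says nothing about *where* melting begins, which is exactly the gap the impurity result below addresses.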

For this new study, the researchers used tiny polystyrene particles suspended in highly deionized water. Electrical forces between the charged particles cause them to arrange themselves in a crystal-like lattice similar to the way atoms are arranged in a solid material. Using a laser beam to move individual particles, the researchers can see how lattice defects affect the order of the lattice.

Defects can come in two general forms — vacancies, where particles are missing, and interstitials, where there are more particles than there should be. This new study looked in particular at the effect of interstitials, which no previous studies had investigated.

The research found that while one interstitial in a given region made little difference in the behavior of the lattice, two interstitials made a big difference.

“What we found was that two interstitial defects break the symmetry of the structure in a way that single defects don’t,” Ling said. “That symmetry-breaking leads to local melting before K-T predicts.”

That’s because the K-T theory deals with defects that arise from thermal fluctuations, and not defects that may have already existed in the lattice.

“Real materials are messy,” Ling said. “There are always impurities. Put simply, the system cannot distinguish which are impurities and which are defects created by thermal agitation, which leads to melting before what would be predicted.”

The technique used for the study could be useful elsewhere, the researchers say. For example, it could be useful in studying the transition of hard glass to a viscous liquid, a phenomenon related to the solid-liquid transition that also lacks a complete explanation.

“We think we have accidentally discovered a new way to uncover symmetry-breaking mechanisms in materials physics,” Ling said. “The method itself, in addition to the findings, may end up being the most significant thing about this paper.”

Co-authors of the paper were former Brown Ph.D. students Sung-Cheol Kim, Lichao Yu and Alexandros Pertsinidis, who all completed their Ph.D. theses in the Ling Lab at Brown. The research was supported by the National Science Foundation (DMR-1005705).

Story Source:

Materials provided by Brown University.
