Path to quantum computing at room temperature

Army researchers predict that quantum computer circuits that no longer require extremely cold temperatures to function could become a reality in about a decade.

For years, solid-state quantum technology that operates at room temperature seemed remote. While the application of transparent crystals with optical nonlinearities had emerged as the most likely route to this milestone, the plausibility of such a system always remained in question.

Now, Army scientists have officially confirmed the validity of this approach. Dr. Kurt Jacobs, of the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory, working alongside Dr. Mikkel Heuck and Prof. Dirk Englund, of the Massachusetts Institute of Technology, became the first to demonstrate the feasibility of a quantum logic gate composed of photonic circuits and optical crystals.

“If future devices that use quantum technologies will require cooling to very cold temperatures, then this will make them expensive, bulky, and power hungry,” Heuck said. “Our research is aimed at developing future photonic circuits that will be able to manipulate the entanglement required for quantum devices at room temperature.”

Quantum technology offers a range of future advances in computing, communications and remote sensing.

In order to accomplish any kind of task, traditional classical computers work with information that is fully determined. The information is stored in many bits, each of which can be on or off. A classical computer, when given an input specified by a number of bits, can process this input to produce an answer, which is also given as a number of bits. A classical computer processes one input at a time.

In contrast, quantum computers store information in qubits that can be in a strange state where they are both on and off at the same time. This allows a quantum computer to explore the answers to many inputs at the same time. While it cannot output all the answers at once, it can output relationships between these answers, which allows it to solve some problems much faster than a classical computer.
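The "on and off at the same time" state can be made concrete in a few lines of code. The sketch below is purely illustrative and not tied to the Army/MIT work: it represents one qubit as a pair of complex amplitudes and applies a Hadamard gate, the standard operation for creating an equal superposition.

```python
import math

# One qubit as a pair of complex amplitudes (a, b): |a|^2 is the
# probability of measuring "off" (0), |b|^2 of measuring "on" (1).
ZERO = (1 + 0j, 0 + 0j)  # definitely off

def hadamard(state):
    """Apply a Hadamard gate, which turns a definite state into an
    equal superposition of on and off."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

superposed = hadamard(ZERO)
probabilities = [round(abs(amp) ** 2, 10) for amp in superposed]
print(probabilities)  # [0.5, 0.5]: both outcomes equally likely
```

Measuring such a qubit gives "on" or "off" with equal probability, which is the resource a quantum computer exploits to explore many inputs at once.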

Unfortunately, one of the major drawbacks of quantum systems is the fragility of these strange qubit states. Most prospective hardware for quantum technology must be kept at extremely cold temperatures — close to zero kelvin — to prevent the special states from being destroyed by interaction with the computer’s environment.

“Any interaction that a qubit has with anything else in its environment will start to distort its quantum state,” Jacobs said. “For example, if the environment is a gas of particles, then keeping it very cold keeps the gas molecules moving slowly, so they don’t crash into the quantum circuits as much.”

Researchers have directed various efforts at resolving this issue, but a definitive solution has yet to be found. At the moment, photonic circuits that incorporate nonlinear optical crystals have emerged as the sole feasible route to room-temperature quantum computing with solid-state systems.

“Photonic circuits are a bit like electrical circuits, except they manipulate light instead of electrical signals,” Englund said. “For example, we can make channels in a transparent material that photons will travel down, a bit like electrical signals traveling along wires.”

Unlike quantum systems that use ions or atoms to store information, quantum systems that use photons can bypass the cold temperature limitation. However, the photons must still interact with other photons to perform logic operations. This is where the nonlinear optical crystals come into play.

Researchers can engineer cavities in the crystals that temporarily trap photons inside. Through this method, the quantum system can establish two different possible states that a qubit can hold: a cavity with a photon (on) and a cavity without a photon (off). These qubits can then be acted on by quantum logic gates, which create the strange joint states on which quantum computation relies.

In other words, researchers can use the indeterminate state of whether or not a photon is in a crystal cavity to represent a qubit. The logic gates act on two qubits together, and can create “quantum entanglement” between them. This entanglement is automatically generated in a quantum computer, and is required for quantum approaches to applications in sensing.

Until now, however, the idea of making quantum logic gates from nonlinear optical crystals rested entirely on theory. While it showed immense promise, doubts remained as to whether this method could even lead to practical logic gates.

The application of nonlinear optical crystals had remained in question until researchers at the Army’s lab and MIT presented a way to realize a quantum logic gate with this approach using established photonic circuit components.

“The problem was that if one has a photon travelling in a channel, the photon has a ‘wave-packet’ with a certain shape,” Jacobs said. “For a quantum gate, you need the photon wave-packets to remain the same after the operation of the gate. Since nonlinearities distort wave-packets, the question was whether you could load the wave-packet into cavities, have them interact via a nonlinearity, and then emit the photons again so that they have the same wave-packets as they started with.”

Once they designed the quantum logic gate, the researchers performed numerous computer simulations of the operation of the gate to demonstrate that it could, in theory, function appropriately. Actual construction of a quantum logic gate with this method will first require significant improvements in the quality of certain photonic components, researchers said.

“Based on the progress made over the last decade, we expect that it will take about ten years for the necessary improvements to be realized,” Heuck said. “However, the process of loading and emitting a wave-packet without distortion is something that we should be able to realize with current experimental technology, and so that is an experiment that we will be working on next.”

Go to Source


Modelling wrinkling and buckling in materials that form the basis of flexible electronics

Flexible circuits have become a highly desirable commodity in modern technology, with applications in biotechnology, electronics, monitors and screens being of particular importance. A new paper authored by John F. Niven, Department of Physics & Astronomy, McMaster University, Hamilton, Ontario, published in EPJ E, aims to understand how materials used in flexible electronics behave under stress and strain, particularly how they wrinkle and buckle.

The design of flexible circuits generally involves a thin rigid capping layer — a metallic or polymeric film — placed upon a thick flexible substrate — a soft and stretchable elastomer. Compressing this rigid capping layer can lead to local buckling with a sinusoidal wrinkling pattern that allows its excess surface area to be accommodated by the compressed substrate.
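The wavelength that such sinusoidal wrinkles select is well captured by a textbook scaling law for a stiff film on a compliant substrate. The formula below is standard thin-film mechanics, not taken from the paper itself, and the material values are illustrative.

```python
import math

def wrinkle_wavelength(h_film, e_film, nu_film, e_sub, nu_sub):
    """Classic wrinkling wavelength for a stiff film on a compliant
    substrate: lambda = 2*pi*h * (Ef / (3*Es))**(1/3), where
    E / (1 - nu**2) is the plane-strain modulus of each layer."""
    ef = e_film / (1 - nu_film ** 2)
    es = e_sub / (1 - nu_sub ** 2)
    return 2 * math.pi * h_film * (ef / (3 * es)) ** (1 / 3)

# Illustrative numbers, not from the paper: a 100 nm glassy polymer film
# (E ~ 3 GPa) on a soft elastomer substrate (E ~ 1 MPa).
lam = wrinkle_wavelength(h_film=100e-9, e_film=3e9, nu_film=0.35,
                         e_sub=1e6, nu_sub=0.5)
print(f"wrinkle wavelength ~ {lam * 1e6:.1f} um")
```

Because the wavelength scales with film thickness and with the stiffness ratio to the one-third power, micrometre-scale wrinkles emerge naturally from nanometre-thick rigid films on soft elastomers.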

When designing biomedical devices and wearable electronics, mechanically induced buckling is the most plausible mechanism. For such applications, it is therefore vital to understand mechanical instabilities and how they depend on the geometry and material properties of the individual layers, with the ultimate aim of avoiding loss of adhesion between layers and the development of voids.

Niven and his colleagues conducted an experiment to determine the geometrical parameters that dictate whether a free-standing bilayer film undergoes global buckling or local wrinkling. The experiment also measured the effect of varying characteristics of the capping film and substrate layers, such as their relative thickness. Stress was applied to the material — Elastosil sheets — biaxially, by shifting the well-adhered layers in different directions whilst leaving the perpendicular direction of the material fixed.

The result of the team’s experiments was a force balance model that allows researchers to better understand the behaviour of such systems as the thickness ratio between the film layer and the substrate is adjusted, and quantify the amount and nature of wrinkling and buckling in materials that could form the basis of the next generation of electronics.


Story Source:

Materials provided by Springer. Note: Content may be edited for style and length.

Journal Reference:

  1. John F. Niven, Gurkaran Chowdhry, James S. Sharp, Kari Dalnoki-Veress. The emergence of local wrinkling or global buckling in thin freestanding bilayer films. The European Physical Journal E, 2020; 43 (4) DOI: 10.1140/epje/i2020-11946-y




New study allows brain and artificial neurons to link up over the web

Brain functions are made possible by circuits of spiking neurons, connected together by microscopic but highly complex links called ‘synapses’. In this new study, published in the journal Scientific Reports, the scientists created a hybrid neural network where biological and artificial neurons in different parts of the world were able to communicate with each other over the internet through a hub of artificial synapses made using cutting-edge nanotechnology. This is the first time the three components have come together in a unified network.

During the study, researchers based at the University of Padova in Italy cultivated rat neurons in their laboratory, whilst partners from the University of Zurich and ETH Zurich created artificial neurons on silicon microchips. The virtual laboratory was brought together via an elaborate setup controlling nanoelectronic synapses developed at the University of Southampton. These synaptic devices are known as memristors.

The Southampton-based researchers captured spiking events being sent over the internet from the biological neurons in Italy and then distributed them to the memristive synapses. Responses were then sent onward to the artificial neurons in Zurich, also in the form of spiking activity. The process simultaneously works in reverse too, from Zurich to Padova. Thus, artificial and biological neurons were able to communicate bidirectionally and in real time.

Themis Prodromakis, Professor of Nanotechnology and Director of the Centre for Electronics Frontiers at the University of Southampton, said, “One of the biggest challenges in conducting research of this kind, and at this level, has been integrating such distinct cutting-edge technologies and specialist expertise that are not typically found under one roof. By creating a virtual lab we have been able to achieve this.”

The researchers now anticipate that their approach will ignite interest from a range of scientific disciplines and accelerate the pace of innovation and scientific advancement in the field of neural interfaces research. In particular, the ability to seamlessly connect disparate technologies across the globe is a step towards the democratisation of these technologies, removing a significant barrier to collaboration.

Professor Prodromakis added, “We are very excited by this new development. On one hand, it sets the basis for a novel scenario that was never encountered during natural evolution, where biological and artificial neurons are linked together and communicate across global networks, laying the foundations for the Internet of Neuro-electronics. On the other hand, it brings new prospects to neuroprosthetic technologies, paving the way towards research into replacing dysfunctional parts of the brain with AI chips.”

The research was funded by the EU Future and Emerging Technologies programme as well as the Engineering and Physical Sciences Research Council in the UK. Professor Prodromakis also holds a Royal Academy of Engineering Chair in Emerging Technologies with a focus on developing energy-efficient AI Hardware solutions.

Story Source:

Materials provided by University of Southampton. Note: Content may be edited for style and length.


IEEE Spectrum

Ampacity Calculations of High-Voltage Cables Using Multiphysics Simulation

Get an introduction to modeling the thermal performance of high-voltage cables in COMSOL Multiphysics® in this webinar. High-voltage power cables are major assets and are expected to work reliably for many years. One of the factors affecting reliability is the thermal performance of the cables. This has to be determined at an early stage of the design and is achieved by calculating the thermal rating (often called ampacity).

Ampacity calculations usually involve two different physics: heat transfer (in solids and gases/liquids) and electromagnetics (Joule losses in the conductor, dielectric losses, and electromagnetic induction). Some models of cable installations are relatively easy to solve, while others require in-depth knowledge of the physics and experience in building FEM models. For example, the heat transfer from directly buried cables involves only conduction, while for cables installed in air, radiation and convection must be taken into consideration.
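At its core, an ampacity calculation balances Joule heating against the allowed temperature rise. The lumped steady-state sketch below is a rough illustration of that balance, not the finite element approach the webinar covers, and every value in it is assumed.

```python
import math

# Lumped steady-state sketch: at the rated current, conductor Joule losses
# flowing through the total thermal resistance to ambient must not push
# the conductor past its temperature limit:
#     I**2 * R_ac * T_thermal = T_conductor - T_ambient

def ampacity(t_conductor, t_ambient, r_ac, t_thermal):
    """Current (A) producing exactly the allowed temperature rise.
    r_ac: AC resistance per metre (ohm/m); t_thermal: thermal resistance
    from conductor to ambient per metre of cable (K*m/W)."""
    return math.sqrt((t_conductor - t_ambient) / (r_ac * t_thermal))

# Illustrative values for a directly buried cable (assumed, not from the
# webinar): 90 C conductor limit, 20 C soil, 30 micro-ohm/m, 1.5 K*m/W.
i_rated = ampacity(t_conductor=90.0, t_ambient=20.0, r_ac=30e-6, t_thermal=1.5)
print(f"ampacity ~ {i_rated:.0f} A")
```

The full multiphysics treatment replaces the single lumped thermal resistance with a spatially resolved model that also captures dielectric losses, induction, and, for cables in air, convection and radiation.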

This webinar will demonstrate how to build finite element models of underground cables using COMSOL Multiphysics®. The presentation concludes with a Q&A session.


Tiny devices made of DNA detect cancer with fewer false alarms

A new cancer-detecting tool uses tiny circuits made of DNA to identify cancer cells by the molecular signatures on their surface.

Duke University researchers fashioned the simple circuits from interacting strands of synthetic DNA that are tens of thousands of times finer than a human hair.

Unlike the circuits in a computer, these circuits work by attaching to the outside of a cell and analyzing it for proteins found in greater numbers on some cell types than others. If a circuit finds its targets, it labels the cell with a tiny light-up tag.

Because the devices distinguish cell types with higher specificity than previous methods, the researchers hope their work might improve diagnosis, and give cancer therapies better aim.

A team led by Duke computer scientist John Reif and his former Ph.D. student Tianqi Song described their approach in a recent issue of the Journal of the American Chemical Society.

Similar techniques have been used previously to detect cancer, but they’re more prone to false alarms — misidentifications that occur when mixtures of cells sport one or more of the proteins a DNA circuit is designed to screen for, but no single cell type has them all.

For every cancer cell that is correctly detected using current methods, some fraction of healthy cells also get mislabeled as possibly cancerous when they’re not.

Each type of cancer cell has a characteristic set of cell membrane proteins on its cell surface. To cut down on cases of mistaken identity, the Duke team designed a DNA circuit that must latch onto that specific combination of proteins on the same cell to work.

As a result, they’re much less likely to flag the wrong cells, Reif said.

The technology could be used as a screening tool to help rule out cancer, which could mean fewer unnecessary follow-ups, or to develop more targeted cancer treatments with fewer side effects.

Each basic element of their DNA circuit consists of two DNA strands. The first DNA strand folds over and partially pairs up with itself to form a hairpin shape. One end of each hairpin is bound to a second strand of DNA that acts as a lock and tether, folding in such a way as to fit a specific cell surface protein like a puzzle piece. Together, these two strands verify that that particular protein is present on the cell surface.

To look for cancer, the circuit components are mixed with a person’s cells in the lab. If any cells are studded with the right combination of proteins, the complete circuit will attach. Adding a strand of “initiator” DNA then causes one of the hairpins to open, which in turn triggers another in a chain reaction until the last hairpin in the circuit is opened and the cell lights up.
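The detection logic described above amounts to an AND over the targeted surface proteins. The toy sketch below mirrors that behaviour; the protein names are invented for illustration and do not come from the study.

```python
# The hairpin chain reaction completes (the cell "lights up") only when
# every targeted surface protein is present on the same cell AND the
# initiator strand has been added. Protein names here are invented.
TARGET_SIGNATURE = {"protein_A", "protein_B", "protein_C"}

def lights_up(cell_surface_proteins, initiator_added=True):
    """True only for the full target combination on a single cell."""
    return initiator_added and TARGET_SIGNATURE <= set(cell_surface_proteins)

cancer_cell = {"protein_A", "protein_B", "protein_C", "protein_X"}
healthy_cell = {"protein_A", "protein_C"}  # carries only part of the signature

print(lights_up(cancer_cell))   # True
print(lights_up(healthy_cell))  # False: partial matches don't trigger the tag
```

This is exactly why the approach reduces false alarms: a healthy cell carrying only some of the signature proteins never completes the circuit.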

Test runs of the device in test tubes in Reif’s lab showed it can be used to detect leukemia cells and to distinguish them from other types of cancer within a matter of hours, just by the strength of their glow.

The devices can be easily reconfigured to detect different cell surface proteins by replacing the tether strands, the researchers say. In the future, Reif plans to engineer the DNA circuits to release a small molecule that alerts the body’s immune system to attack the cancer cell.

The technology isn’t ready for prime time yet. The researchers say their DNA circuits require testing in more realistic conditions to make sure they still flag the right cells.

But it’s a promising step toward ensuring that cancer screens and therapies zero in on the right culprits.

This research was supported by grants from the National Science Foundation (CCF-1320360, CCF-1217457, CCF-1617791, CCF-1813805).

Story Source:

Materials provided by Duke University. Original written by Robin Ann Smith. Note: Content may be edited for style and length.



3D-printed plastics with high-performance electrical circuits

Rutgers engineers have embedded high-performance electrical circuits inside 3D-printed plastics, which could lead to smaller, more versatile drones and better-performing small satellites, biomedical implants and smart structures.

They used pulses of high-energy light to fuse tiny silver wires, resulting in circuits that conduct 10 times more electricity than the state of the art, according to a study in the journal Additive Manufacturing. By increasing conductivity 10-fold, the engineers can reduce energy use, extend the life of devices and increase their performance.
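The link between conductivity and energy use follows directly from Ohm's law: for a trace of fixed geometry carrying a fixed current, Joule loss scales inversely with conductivity. The numbers below are assumed for illustration, not taken from the study.

```python
# For a printed trace of fixed geometry carrying a fixed current,
# resistance is R = L / (sigma * A), so a 10x jump in conductivity sigma
# cuts both R and the Joule loss I**2 * R by the same factor of 10.

def trace_resistance(length_m, area_m2, conductivity_s_per_m):
    """Resistance of a uniform conductor, in ohms."""
    return length_m / (conductivity_s_per_m * area_m2)

L, A, I = 0.05, 1e-9, 0.01   # 5 cm trace, 1000 um^2 cross-section, 10 mA
sigma_old = 1e6              # hypothetical baseline conductivity, S/m
sigma_new = 10 * sigma_old   # ten times more conductive

p_old = I ** 2 * trace_resistance(L, A, sigma_old)
p_new = I ** 2 * trace_resistance(L, A, sigma_new)
print(f"Joule loss: {p_old * 1e3:.1f} mW -> {p_new * 1e3:.2f} mW")
```

A tenfold drop in resistive loss for the same signal is what translates into longer battery life and cooler-running embedded electronics.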

“Our innovation shows considerable promise for developing an integrated unit — using 3D printing and intense pulses of light to fuse silver nanoparticles — for electronics,” said senior author Rajiv Malhotra, an assistant professor in the Department of Mechanical and Aerospace Engineering in the School of Engineering at Rutgers University-New Brunswick.

Embedding electrical interconnections inside 3D-printed structures made of polymers, or plastics, can create new paradigms for devices that are smaller and more energy-efficient. Such devices could include CubeSats (small satellites), drones, transmitters, light and motion sensors and Global Positioning Systems. Such interconnections are also often used in antennas, pressure sensors, electrical coils and electrical grids for electromagnetic shielding.

The engineers used high-tech “intense pulsed light sintering” — featuring high-energy light from a xenon lamp — to fuse long thin rods of silver called nanowires. Nanomaterials are measured in nanometers (a nanometer is a millionth of a millimeter — about 100,000 times thinner than a human hair). Fused silver nanomaterials are already used to conduct electricity in devices such as solar cells, displays and radio-frequency identification (RFID) tags.

Next steps include making fully 3D internal circuits, enhancing their conductivity and creating flexible internal circuits inside flexible 3D structures, Malhotra said.

Story Source:

Materials provided by Rutgers University. Note: Content may be edited for style and length.



Computer model helps make sense of human memory

The brain is a maze-like network of overlapping circuits — some pathways encourage activity while others suppress it. While earlier studies focused more on excitatory circuits, inhibitory circuits are now understood to play an equally important role in brain function. Researchers at the Okinawa Institute of Science and Technology Graduate University (OIST) and the RIKEN Center for Brain Science have created an artificial network to simulate the brain, demonstrating that tinkering with inhibitory circuits leads to extended memory.

Associative memory is the ability to connect unrelated items and store them in memory — to associate co-occurring items as a single episode. In this study, published in Physical Review Letters, the team used sequentially arranged patterns to simulate a memory, and found that a computer is able to remember patterns spanning a longer episode when the model takes inhibitory circuits into account. They go on to explain how this finding could be applied to explain our own brains.

“This simple model of processing shows us how the brain handles the pieces of information given in a serial order,” explains Professor Tomoki Fukai, head of OIST’s Neural Coding and Brain Computing Unit, who led the study with RIKEN collaborator Dr. Tatsuya Haga. “By modelling neurons using computers, we can begin to understand memory processing in our own minds.”

Lower Your Inhibitions

Thinking about the brain in terms of physical, non-biological phenomena is now a widely accepted approach in neuroscience — and many ideas lifted from physics have now been validated in animal studies. One such idea is understanding the brain’s memory system as an attractor network, a group of connected nodes that display patterns of activity and tend towards certain states. This idea of attractor networks formed the basis of this study.

A tenet of neurobiology is that “cells that fire together wire together” — neurons that are active at the same time become synchronized, which partly explains how our brains change over time. In their model, the team created excitatory circuits — patterns of neurons firing together — to replicate the brain. The model included many excitatory circuits spread across a network.

More importantly, the team inserted inhibitory circuits into the model. Different inhibitory circuits act locally on a particular circuit, or globally across the network. The circuits block unwanted signals from interfering with the excitatory circuits, which are then better able to fire and wire together. These inhibitory circuits allowed the excitatory circuits to remember a pattern representing a longer episode.
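The flavour of such a model can be sketched as a small Hopfield-style attractor network: Hebbian weights store patterns, and a global inhibition term damps overall activity before each neuron's threshold is applied. This is a generic toy in the spirit of the description above, not the researchers' actual model, and all sizes and patterns are made up.

```python
import numpy as np

# Binary neurons, Hebbian "fire together, wire together" weights, and a
# global inhibition term that damps overall activity before thresholding.
rng = np.random.default_rng(0)
n = 100
patterns = rng.choice([-1, 1], size=(3, n))  # three stored memories

# Hebbian (excitatory) weight matrix with no self-connections.
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)

def recall(cue, inhibition=0.1, steps=20):
    """Synchronous updates toward an attractor; 'inhibition' subtracts a
    global activity-dependent term from every neuron's input."""
    s = cue.astype(float)
    for _ in range(steps):
        s = np.sign(W @ s - inhibition * s.mean())
        s[s == 0] = 1.0
    return s

# Corrupt 10% of the first memory, then let the network settle.
cue = patterns[0].copy()
flipped = rng.choice(n, size=10, replace=False)
cue[flipped] *= -1
recovered = recall(cue)
overlap = (recovered * patterns[0]).mean()
print(f"overlap with stored memory: {overlap:.2f}")  # near 1.0 means recovered
```

The network falls into the attractor nearest the corrupted cue, recovering the stored pattern; the inhibition term plays the suppressive role the study assigns to inhibitory circuits, keeping unwanted activity from interfering with recall.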

The finding matches what is currently known about the hippocampus, a brain region involved in associative memory. It is thought that a balance of excitatory and inhibitory activity is what allows new associations to form. Inhibitory activity could be regulated by a chemical called acetylcholine, which is known to play a role in memory within the hippocampus. This model is a digital representation of these processes.

A challenge to the approach, however, is the use of random sampling. The sheer number of possible outputs, or attractor states, in the network overworks a computer’s memory capacity. The team instead had to rely on a selection of outputs, rather than a systematic review of every possible combination. This allowed them to overcome a technical difficulty without jeopardizing the model’s predictions.

Overall, the study allowed for overarching inferences — inhibitory neurons have an important role in associative memory, and this maps to what we might expect in our own brains. Fukai says that biological studies will need to be completed to determine the exact validity of this computational work. Then, it will be possible to map the components of the simulation to their biological counterparts, building a more complete picture of the hippocampus and associative memory.

The team will next move beyond a simple model toward one with additional parameters that better represents the hippocampus, and look at the relative importance of local and global inhibitory circuits. The current model comprises neurons that are either off or on — zeros and ones. A future model will include dendrites, the branches that connect neurons in a complicated mesh. This more realistic simulation will be even better placed to make conclusions about biological brains.



Light and sound in silicon chips: The slower the better

Integrated circuits in silicon enable our digital era. The capabilities of electronic circuits have been extended even further with the introduction of photonics: components for the generation, guiding and detection of light. Together, electronics and photonics support entire systems for data communication and processing, all on a chip. However, there are certain things that even electrical and optical signals can’t do simply because they move too fast.

Sometimes moving slowly is actually better, according to Prof. Avi Zadok of Bar-Ilan University’s Faculty of Engineering and Institute of Nanotechnology and Advanced Materials. “Important signal processing tasks, such as the precise selection of frequency channels, require that data be delayed over time scales of tens of nanoseconds. Given the fast speed of light, optical waves propagate over many meters within these timeframes. One cannot accommodate such path lengths in a silicon chip. It is unrealistic. In this race, fast doesn’t necessarily win.”

The problem, in fact, is a rather old one. Analog electronic circuits have been facing similar challenges in signal processing for sixty years. An excellent solution was found in the form of acoustics: A signal of interest is converted from the electrical domain to the form of an acoustic wave. The speed of sound, of course, is slower than that of light by a factor of 100,000. Acoustic waves acquire the necessary delays over tens of micrometers instead of meters. Such path lengths are easily accommodated on-chip. Following propagation, the delayed signal can be converted back to electronics.
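Plugging in representative numbers makes the contrast concrete. The group index and surface-wave speed below are assumed, picked to be consistent with the figures quoted elsewhere in the article.

```python
# Path length needed for a 40 ns delay, as an optical wave in a silicon
# waveguide versus as a surface acoustic wave. The group index and the
# acoustic speed below are assumed, chosen to match the article's figures.
delay = 40e-9                # seconds
v_light = 3e8 / 2.4          # light in a silicon waveguide, group index ~2.4
v_sound = 3750.0             # surface acoustic wave speed, m/s

optical_path = v_light * delay
acoustic_path = v_sound * delay
print(f"optical path:  {optical_path:.1f} m")          # meters of waveguide
print(f"acoustic path: {acoustic_path * 1e6:.0f} um")  # fits on a chip
```

Five orders of magnitude in wave speed shrink the required delay line from meters of waveguide to a footprint of tens to hundreds of micrometers.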

In a new work published today (September 16, 2019) in the journal Nature Communications, Zadok and co-workers carry over this principle to silicon-photonic circuits.

“There are several difficulties with introducing acoustic waves to silicon chips,” says doctoral student Dvir Munk, of Bar-Ilan University, who participated in the study. “The standard layer structure used for silicon photonics is called silicon on insulator. While this structure guides light very effectively, it cannot confine and guide sound waves. Instead, acoustic waves just leak away.” Due to this difficulty, previous works that combine light and sound waves in silicon do not involve the standard layer structure. Alternatively, hybrid integration of additional, nonstandard materials was necessary.

“That first challenge can be overcome by using acoustic waves that propagate at the upper surface of the silicon chip,” continues Munk. “These surface acoustic waves do not leak down as quickly. Here, however, there is another issue: Generation of acoustic waves usually relies on piezo-electric crystals. These crystals expand when a voltage is applied to them. Unfortunately, this physical effect does not exist in silicon, and we much prefer to avoid introducing additional materials to the device.”

As an alternative, students Munk, Moshe Katzman and coworkers relied on the illumination of metals. “Incoming light carries the signal of interest,” explains Katzman. “It irradiates a metal pattern on the chip. The metals expand and contract, and strain the silicon surface below. With proper design, that initial strain can drive surface acoustic waves. In turn, the acoustic waves pass across standard optical waveguides in the same chip. Light in those waveguides is affected by the surface waves. In this way, the signal of interest is converted from one optical wave to another via acoustics. In the meantime, significant delay is accumulated within very short reach.”

The concept combines light and sound in standard silicon with no suspension of membranes or use of piezo-electric crystals. Acoustic frequencies up to 8 GHz are reached; however, the concept is scalable to 100 GHz. The working principle is applicable to any substrate, not only silicon. Applications are presented as well: the concept is used in narrowband filters of input radio-frequency signals. The highly selective filters make use of 40-nanosecond-long delays. “Rather than use five meters of waveguide, we achieve this delay within 150 microns,” says Munk.

Prof. Zadok summarizes: “Acoustics is a missing dimension in silicon chips because acoustics can complete specific tasks that are difficult to do with electronics and optics alone. For the first time we have added this dimension to the standard silicon photonics platform. The concept combines the communication and bandwidth offered by light with the selective processing of sound waves.”

One potential application of such devices is in future cellular networks, widely known as 5G. Digital electronics alone might not be enough to support the signal processing requirements in such networks. Light and sound devices might do the trick.

Story Source:

Materials provided by Bar-Ilan University. Note: Content may be edited for style and length.
