Categories
ScienceDaily

Discovery will allow more sophisticated work at nanoscale

The movement of fluids through small capillaries and channels is crucial for processes ranging from blood flow through the brain to power generation and electronic cooling systems, but that movement often stops when the channel is smaller than 10 nanometers.

Researchers led by a University of Houston engineer have reported a new understanding of the process and why some fluids stagnate in these tiny channels, as well as a new way to stimulate the fluid flow by using a small increase in temperature or voltage to promote mass and ion transport.

The work, published in ACS Applied Nano Materials, explores the movement of fluids with lower surface tension, in which the bonds between molecules break apart when the fluid is forced into narrow channels, halting the process of fluid transport known as capillary wicking. The research was also featured on the journal’s cover.

Hadi Ghasemi, Cullen Associate Professor of Mechanical Engineering at UH and corresponding author for the paper, said this capillary force drives liquid flow in small channels and is the critical mechanism for mass transport in nature and technology — that is, in situations ranging from blood flow in the human brain to the movement of water and nutrients from soil to plant roots and leaves, as well as in industrial processes.

But differences in the surface tension of some fluids cause the wicking process — and therefore, the movement of the fluid — to stop when those channels are smaller than 10 nanometers, he said. The researchers reported that it is possible to prompt continued flow by manipulating the surface tension through small stimuli, such as raising the temperature or applying a small voltage.

Ghasemi said raising the temperature even slightly can reactivate movement by changing the surface tension, a mechanism the team dubbed “nanogates.” Depending on the liquid, raising the temperature by just 2 to 3 degrees Celsius is enough to mobilize the fluid.

“The surface tension can be changed through different variables,” he said. “The simplest one is temperature. If you change temperature of the fluid, you can activate this fluid flow again.” The process can be fine-tuned to move the fluid, or just specific ions within it, offering promise for more sophisticated work at nanoscale.
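
As a rough illustration of why a degree or two matters here, the sketch below combines a linear surface-tension-versus-temperature approximation for water with the Young-Laplace capillary pressure in a 5-nanometer channel. It is a back-of-the-envelope estimate, not the authors’ model, and every parameter value is an assumption chosen only to show how a small temperature change feeds into the capillary driving pressure.

```python
import math

# Back-of-the-envelope sketch (not the authors' model): how a small temperature
# rise changes surface tension, and hence the capillary pressure that drives
# wicking in a nanoscale channel. All values are rough assumptions for water.

def surface_tension(t_kelvin, gamma_ref=0.072, t_ref=298.15, dgamma_dt=-1.5e-4):
    """Linear approximation gamma(T) ~ gamma_ref + dgamma_dT * (T - T_ref), in N/m."""
    return gamma_ref + dgamma_dt * (t_kelvin - t_ref)

def capillary_pressure(gamma, radius_m, contact_angle_deg=20.0):
    """Young-Laplace pressure for a cylindrical channel of the given radius, in Pa."""
    return 2.0 * gamma * math.cos(math.radians(contact_angle_deg)) / radius_m

radius = 5e-9  # a 5 nm channel, below the ~10 nm threshold discussed above
for t_celsius in (25.0, 27.0, 28.0):  # a 2-3 degree Celsius increase
    gamma = surface_tension(t_celsius + 273.15)
    print(f"T = {t_celsius:.0f} C   gamma = {gamma * 1000:.2f} mN/m   "
          f"capillary pressure = {capillary_pressure(gamma, radius) / 1e6:.2f} MPa")
```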

“The surface tension nanogates promise platforms to govern nanoscale functionality of a wide spectrum of systems, and applications can be foreseen in drug delivery, energy conversion, power generation, seawater desalination, and ionic separation,” the researchers wrote.

In addition to Ghasemi and first author Masoumeh Nazari, researchers involved with the project include Sina Nazifi, Zixu Huang, Tian Tong and Jiming Bao, all with the University of Houston, and Kausik Das and Habilou Ouro-Koura, both with the University of Maryland Eastern Shore.

Funding for the project came from the Air Force Office of Scientific Research, the National Science Foundation and the U.S. Department of Education.

Story Source:

Materials provided by University of Houston. Original written by Jeannie Kever. Note: Content may be edited for style and length.

Categories
ScienceDaily

Nanomaterial gives robots chameleon skin

A new film made of gold nanoparticles changes color in response to any type of movement. Its unprecedented qualities could allow robots to mimic chameleons and octopuses — among other futuristic applications.

Unlike other materials that try to emulate nature’s color changers, this one can respond to any type of movement, like bending or twisting. Robots coated in it could enter spaces that might be dangerous or impossible for humans, and offer information just based on the way they look.

For example, a camouflaged robot could enter tough-to-access underwater crevices. If the robot changes color, biologists could learn about the pressures facing animals that live in these environments.

Although some other color-changing materials can also respond to motion, this one can be printed and programmed to display different, complex patterns that are difficult to replicate. The UC Riverside scientists who created this nanomaterial documented their process in a Nature Communications paper published this past week.

Nanomaterials are simply materials that have been reduced to an extremely small scale — tens of nanometers in width and length, or about the size of a virus. When materials like silver or gold become smaller, their colors will change depending on their size, shape, and the direction they face.

“In our case, we reduced gold to nano-sized rods. We knew that if we could make the rods point in a particular direction, we could control their color,” said chemistry professor Yadong Yin. “Facing one way, they might appear red. Move them 45 degrees, and they change to green.”

The problem facing the research team was how to take millions of gold nanorods floating in a liquid solution and get them all to point in the same direction to display a uniform color.

Their solution was to fuse smaller magnetic nanorods onto the larger gold ones. The two different-sized rods were encapsulated in a polymer shield, so that they would remain side by side. That way, the orientation of both rods could be controlled by magnets.

“Just like if you hold a magnet over a pile of needles, they all point in the same direction. That’s how we control the color,” Yin said.

Once the nanorods are dried into a thin film, their orientation is fixed in place and they no longer respond to magnets. “But, if the film is flexible, you can bend and rotate it, and will still see different colors as the orientation changes,” Yin said.

Other materials, like butterfly wings, are shiny and colorful at certain angles, and can also change color when viewed at other angles. However, those materials rely on precisely ordered microstructures, which are difficult and expensive to make for large areas. But this new film can be made to coat the surface of any sized object just as easily as applying spray paint on a house.

Though futuristic robots are an ultimate application of this film, it can be used in many other ways. UC Riverside chemist Zhiwei Li, the first author on this paper, explained that the film can be incorporated into checks or cash as an authentication feature. Under normal lighting, the film is gray, but when you put on sunglasses and look at it through polarized lenses, elaborate patterns can be seen. In addition, the color contrast of the film may change dramatically if you twist the film.

The applications, in fact, are only limited by the imagination. “Artists could use this technology to create fascinating paintings that are wildly different depending on the angle from which they are viewed,” Li said. “It would be wonderful to see how the science in our work could be combined with the beauty of art.”

Categories
ScienceDaily

Electrical fields can throw a curveball

MIT researchers have discovered a phenomenon that could be harnessed to control the movement of tiny particles floating in suspension. This approach, which requires simply applying an external electric field, may ultimately lead to new ways of performing certain industrial or medical processes that require separation of tiny suspended materials.

The findings are based on an electrokinetic version of the phenomenon that gives curveballs their curve, known as the Magnus effect. Zachary Sherman PhD ’19, who is now a postdoc at the University of Texas at Austin, and MIT professor of chemical engineering James Swan describe the new phenomenon in a paper published in the journal Physical Review Letters.

The Magnus effect causes a spinning object to be pulled in a direction perpendicular to its motion, as in the curveball; it is based on aerodynamic forces and operates at macroscopic scales — i.e. on easily visible objects — but not on smaller particles. The new phenomenon, induced by an electric field, can propel particles down to nanometer scales, moving them along in a controlled direction without any contact or moving parts.

The discovery came as a surprise while Sherman was testing new simulation software he was developing to model the interactions of tiny nanoscale particles within magnetic and electric fields. The test case he was studying involved placing charged particles in an electrolytic liquid, meaning a liquid with ions, or charged atoms or molecules, in it.

It was known, he says, that when charged particles just a few tens to hundreds of nanometers across are placed in such a liquid, they remain suspended rather than settling, forming a colloid. Ions then cluster around the particles. The new software successfully simulated this ion clustering. Next, he simulated an electric field across the material. This would be expected to induce a process called electrophoresis, which would propel the particles along in the direction of the applied field. Again, the software correctly simulated the process.

Then Sherman decided to push it further, and gradually increased the strength of the electric field. “But then we saw this funny thing,” he says. “If the field was strong enough, you would get normal electrophoresis for a tiny bit, but then the colloids would spontaneously start spinning.” And that’s where the Magnus effect comes in.

Not only were the particles spinning in the simulations as they moved along, but “those two motions coupled together, and the spinning particle would veer off of its path,” he says. “It’s kind of strange, because you apply a force in one direction, and then the thing moves in an orthogonal [right-angle] direction to what you’ve specified.” It’s directly analogous to what happens aerodynamically with spinning balls, he says. “If you throw a curveball in baseball, it goes in the direction you threw it, but then it also veers off. So this is a kind of a microscopic version of that well-known macroscopic Magnus effect.”
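
The coupling described here can be illustrated with a toy calculation. The sketch below is not the team’s simulation software; it simply drives an overdamped particle along one axis and adds a Magnus-style force proportional to the particle’s spin crossed with its velocity, with every number made up for illustration. The steady motion then picks up a component perpendicular to the drive.

```python
import numpy as np

# Toy 2D illustration (not the MIT team's simulation): an overdamped particle is
# driven along +x by an applied force, while a Magnus-like term proportional to
# (spin x velocity) pushes it sideways. All parameter values are made up.

dt, steps = 1e-3, 5000
mobility = 1.0                    # velocity per unit force in the overdamped limit
drive = np.array([1.0, 0.0])      # applied force along x (stand-in for the field)
spin = 1.0                        # angular velocity about z; set to 0 for plain drift
magnus_coeff = 0.3                # strength of the spin-velocity coupling

pos = np.zeros(2)
vel = mobility * drive
for _ in range(steps):
    # omega x v with omega along z gives (-spin * vy, spin * vx)
    lift = magnus_coeff * np.array([-spin * vel[1], spin * vel[0]])
    vel = mobility * (drive + lift)
    pos = pos + vel * dt

print("displacement along the drive (x):", round(pos[0], 3))
print("sideways displacement       (y):", round(pos[1], 3))  # nonzero when spin != 0
```

Setting spin to zero removes the sideways displacement, which corresponds to the plain electrophoresis case described above.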

When the applied field was strong enough, the charged particles took on a strong motion in the direction perpendicular to the field. This could be useful, he says, because with electrophoresis “the particle moves toward one of the electrodes, and you run into this problem where the particle will move and then it will run into the electrode, and it’ll stop moving. So you can’t really generate a continuous motion with just electrophoresis.”

Instead, since this new effect goes at right angles to the applied field, it could be used for example to propel particles along a microchannel, simply by placing electrodes on the top and bottom. That way, he says, the particle will “just move along the channel, and it will never bump into the electrodes.” That makes it, he says, “actually a more efficient way to direct the motion of microscopic particles.”

There are two broad kinds of processes where this ability might come in handy, he says. One is to use the particle to deliver some sort of “cargo” to a specific location. For example, the particle could be attached to a therapeutic drug “and you’re trying to get it to a target site that needs that drug, but you can’t get the drug there directly,” he says. Or the particle might contain some sort of chemical reactant or catalyst that needs to be directed to a specific channel to carry out its desired reaction.

The other example is sort of the inverse of that process: picking up some kind of target material and bringing it back. For example, a chemical reaction to generate a product might also generate a lot of unwanted byproducts. “So you need a way to get a product out,” he says. These particles can be used to capture the product and then be extracted using the applied electric field. “In this way they kind of act as little vacuum cleaners,” he says. “They pick up the thing you want, and then you can move them somewhere else, and then release the product where it’s easier to collect.”

He says this effect should apply for a wide array of particle sizes and particle materials, and the team will continue to study how different material properties affect the rotation speed or the translation speed of this effect. The basic phenomenon should apply to virtually any combination of materials for the particles and the liquid they are suspended in, as long as the two differ from each other in terms of an electrical property called the dielectric constant.

The researchers looked at materials with a very high dielectric constant, such as metal particles, suspended in a much lower-conducting electrolyte, such as water or oils. “But you might also be able to see this with any two materials that have a contrast” in dielectric constant, Sherman says, for example with two oils that don’t mix and thus form suspended droplets.

Categories
ProgrammableWeb

AppBrilliance Launches MoneyMovement SDK

AppBrilliance has launched its Payments/Money Movement SDK for both iOS and Android. AppBrilliance differentiates its mobile payments platform by running on the edge of networks. In other words, the platform runs directly on a trusted app on an edge device. With payments possible directly within apps, consumers bypass the many middlemen that have historically taken a piece of every payment.

“With our Money Movement SDK, these companies can now offer direct push-payments and real-time transfers to their users, bypassing expensive debit and credit processing and moving funds instantly at a disruptively low cost,” Eric Smith, AppBrilliance Founder and CEO, commented in a press release. “Companies that rely on real-time payments can save 200-300% vs. processing the payments over debit or credit rails.”

AppBrilliance anticipates a wide range of use cases for its Money Movement SDK. From contactless retail payments and online shopping to easing the movement of money through banking systems and cryptocurrency exchanges, the company sees its platform being leveraged throughout the modern economy.

The SDK delivers full read/write functionality to any financial institution. No integration is required. No user credentials are ever shared. Because the platform lives at the edge, it is infinitely scalable. The model is fully distributed across the mobile devices of users.

AppBrilliance recently showed off the SDK at the SoFin 2020 event. While in-person shows are being canceled due to COVID-19, virtual events continue, and virtual products continue to launch. Visit the AppBrilliance site to request a demo or learn more.

Author: ecarter

Categories
ScienceDaily

The way you dance is unique, and computers can tell it’s you

Nearly everyone responds to music with movement, whether through subtle toe-tapping or an all-out boogie. A recent discovery shows that our dance style is almost always the same, regardless of the type of music, and a computer can identify the dancer with astounding accuracy.

Studying how people move to music is a powerful tool for researchers looking to understand how and why music affects us the way it does. Over the last few years, researchers at the Centre for Interdisciplinary Music Research at the University of Jyväskylä in Finland have used motion capture technology — the same kind used in Hollywood — to learn that your dance moves say a lot about you, such as how extroverted or neurotic you are, what mood you happen to be in, and even how much you empathize with other people.

Recently, however, they discovered something that surprised them. “We actually weren’t looking for this result, as we set out to study something completely different,” explains Dr. Emily Carlson, the first author of the study. “Our original idea was to see if we could use machine learning to identify which genre of music our participants were dancing to, based on their movements.”

The 73 participants in the study were motion-captured while dancing to eight different genres: Blues, Country, Dance/Electronica, Jazz, Metal, Pop, Reggae and Rap. The only instruction they received was to listen to the music and move any way that felt natural. “We think it’s important to study phenomena as they occur in the real world, which is why we employ a naturalistic research paradigm,” says Professor Petri Toiviainen, the senior author of the study.

The researchers analysed participants’ movements using machine learning, trying to distinguish between the musical genres. Unfortunately, their computer algorithm was able to identify the correct genre less than 30% of the time. They were shocked to discover, however, that the computer could correctly identify which of the 73 individuals was dancing 94% of the time. Left to chance (that is, if the computer had simply guessed without any information to go on), the expected accuracy would be less than 2%. “It seems as though a person’s dance movements are a kind of fingerprint,” says Dr. Pasi Saari, co-author of the study and data analyst. “Each person has a unique movement signature that stays the same no matter what kind of music is playing.”
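
For a sense of what such a comparison looks like in code, the sketch below is a generic stand-in rather than the authors’ pipeline or data: it trains an off-the-shelf classifier to identify which of 73 simulated “dancers” produced a clip from synthetic per-clip movement features, and compares the accuracy with the roughly 1.4% chance level.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Toy sketch (not the published pipeline): predict which of 73 dancers produced
# a clip from per-clip movement features. Synthetic data stands in for the
# motion-capture features; each dancer gets a persistent "signature" plus noise.

rng = np.random.default_rng(0)
n_dancers, clips_per_dancer, n_features = 73, 8, 20   # 8 clips (genres) per dancer

signatures = rng.normal(size=(n_dancers, n_features))          # per-dancer style
X = np.repeat(signatures, clips_per_dancer, axis=0) + 0.5 * rng.normal(
    size=(n_dancers * clips_per_dancer, n_features))           # clip-level noise
y = np.repeat(np.arange(n_dancers), clips_per_dancer)          # dancer identity labels

clf = RandomForestClassifier(n_estimators=300, random_state=0)
acc = cross_val_score(clf, X, y, cv=4).mean()
print(f"identification accuracy: {acc:.2f} (chance is about {1 / n_dancers:.3f})")
```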

Some genres, however, had more effect on individual dance movements than others. The computer was less accurate in identifying individuals when they were dancing to Metal music. “There is a strong cultural association between Metal and certain types of movement, like headbanging,” Emily Carlson says. “It’s probable that Metal caused more dancers to move in similar ways, making it harder to tell them apart.”

Does this mean that face-recognition software will soon be joined by dance-recognition software? “We’re less interested in applications like surveillance than in what these results tell us about human musicality,” Carlson explains. “We have a lot of new questions to ask, like whether our movement signatures stay the same across our lifespan, whether we can detect differences between cultures based on these movement signatures, and how well humans are able to recognize individuals from their dance movements compared to computers. Most research raises more questions than answers,” she concludes, “and this study is no exception.”

Categories
Hackster.io

Check Out These Mechanical Japanese Zen Garden Kinetic Art Pieces

Unlike paintings or sculptures, kinetic art relies on movement to capture the eye and provide meaning. How exactly that is implemented is just as subjective as any other kind of art and is up to the artist to determine. And, as with other art forms, kinetic art requires both technical skill and artistic vision. A painter needs to be capable of precise brush strokes, while a kinetic artist needs to be skilled with their fabrication tools of choice. These mechanical zen gardens, created by Jo Fairfax, are a fantastic example of what that kind of skill and vision can achieve.

These art pieces are, of course, inspired by traditional Japanese zen gardens. Those are intended to facilitate tranquility as the “gardener” carefully brushes the sand. Fairfax’s mechanical zen gardens do something similar, except that they do it all on their own. His reinterpretation of the zen garden consists of a large box with a clear cover. The box is filled with fine iron filings. As a person approaches a mechanical zen garden, it will spring to life and begin drawing patterns in the sand-like iron filings.

The mechanism used to draw the patterns is what makes this project particularly interesting to us. Inside the box, underneath a barrier separating it from the iron filings, there is a motorized arm covered in an array of electromagnets. An Arduino Uno board controls both the movement of the arm and whether each magnet is activated. By activating the magnets at specific points in the arm’s movement cycle, a variety of geometric patterns can be drawn. Fairfax has produced at least a couple of these mechanical zen gardens, though the only major differences between them appear to be their shapes and the movement patterns of the motorized arms.

Author: Cameron Coward

Categories
IEEE Spectrum

Bionic ‘Feeling’ Leg Makes Walking Easier, Reduces Phantom Limb Pain

With sensory feedback, amputees could feel the knee’s movement and the sole of the foot on the ground

Categories
ScienceDaily

New artificial compound eye could improve 3D object tracking

If you’ve ever tried to swat a fly, you know that insects react to movement extremely quickly. A newly created biologically inspired compound eye is helping scientists understand how insects use their compound eyes to sense an object and its trajectory with such speed. The compound eye could also be used with a camera to create 3D location systems for robots, self-driving cars and unmanned aerial vehicles.

In The Optical Society (OSA) journal Optics Letters, researchers from Tianjin University in China report their new bio-inspired compound eye, which not only looks like that of an insect but also works like its natural counterpart. Compound eyes consist of hundreds to thousands of repeating units known as ommatidia that each act as a separate visual receptor.

“Imitating the vision system of insects has led us to believe that they might detect the trajectory of an object based on the light intensity coming from that object rather than using precise images like human vision,” said Le Song, a member of the research team. “This motion-detection method requires less information, allowing the insect to quickly react to a threat.”

Imitating an insect eye

The researchers used a method known as single point diamond turning to create 169 microlenses on the surface of the compound eye. Each microlens had a radius of about 1 mm, creating a component measuring about 20 mm that could detect objects from a 90-degree field of view. The fields of view of adjacent microlenses overlapped in the same way that ommatidia do for most insects.

One of the challenges in making an artificial compound eye is that image detectors are flat while the surface of the compound eye is curved. Placing a light guide between the curved lens and an image detector allowed the researchers to overcome this challenge while also enabling the component to receive light from different angles uniformly.

“This uniform light receiving ability of our bio-inspired compound eye is more similar to biological compound eyes and better imitates the biological mechanism than previous attempts at replicating a compound eye,” explained Song.

To use the artificial compound eye for measuring 3D trajectory, the researchers added grids to each eyelet that help pinpoint location. They then placed LED light sources at known distances and directions from the compound eye and used an algorithm to calculate the 3D location of the LEDs based on the location and intensity of the light.
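
The paper’s algorithm is not described in detail here, but the general idea of recovering a 3D position from per-eyelet intensities can be sketched with a toy model. The geometry, the intensity falloff and the least-squares fit below are all assumptions made for illustration, not the published method.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy sketch (not the published algorithm): estimate a point source's 3D position
# from the intensities seen by many eyelets with known viewing directions.
# Intensity falls off with distance squared and with the angle between each
# eyelet's axis and the direction to the source; source brightness is taken as 1.

rng = np.random.default_rng(1)

# Eyelet axes: random unit vectors in the forward (+z) hemisphere.
axes = rng.normal(size=(169, 3))
axes[:, 2] = np.abs(axes[:, 2])
axes /= np.linalg.norm(axes, axis=1, keepdims=True)

def model_intensity(source, axes):
    d = source[None, :]                          # eye sits at the origin
    r = np.linalg.norm(d)
    unit = d / r
    cos_angle = np.clip(axes @ unit.T, 0.0, None).ravel()
    return cos_angle / r**2                      # angular and inverse-square falloff

true_source = np.array([0.3, -0.2, 1.5])
measured = model_intensity(true_source, axes) + 1e-4 * rng.normal(size=len(axes))

fit = least_squares(lambda p: model_intensity(p, axes) - measured,
                    x0=np.array([0.0, 0.0, 1.0]))
print("true:", true_source, "estimated:", np.round(fit.x, 3))
```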

The researchers found that the compound eye system was able to rapidly provide the 3D location of an object. However, the location accuracy was reduced when the light sources were farther away, which could explain why most insects are nearsighted.

How insects see the world

“This design allowed us to prove that the compound eye could identify an object’s location based on its brightness instead of a complex image process,” said Song. “This highly sensitive mechanism suits the brain processing ability of insects very well and helps them avoid predators.”

According to the researchers, the ability of the new bio-inspired compound eye to detect an object’s 3D location could be useful for small robots requiring fast detection from a very lightweight system. It also offers a new way for biologists to study the visual systems of insects.

The researchers are planning to embed the localization algorithm into platforms such as integrated circuits to allow the system to be incorporated into other devices. They are also developing ways to mass-produce the compound eye lenses to reduce the unit cost.

Story Source:

Materials provided by The Optical Society. Note: Content may be edited for style and length.

Categories
ScienceDaily

Machine learning tool improves tracking of tiny moving particles

Scientists have developed an automated tool for mapping the movement of particles inside cells that may accelerate research in many fields, a new study in eLife reports.

The movements of tiny molecules, proteins and cellular components throughout the body play an important role in health and disease. For example, they contribute to brain development and the progression of some diseases. The new tool, built with cutting-edge machine learning technology, will make tracking these movements faster, easier and less prone to bias.

Currently, scientists often analyse particle movements using images called kymographs, which represent movement in time and space. These kymographs are extracted from time-lapse videos of particle movements recorded using microscopes. The analysis needs to be done manually, which is both slow and vulnerable to the researcher’s unconscious biases.
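
As a concrete, deliberately naive illustration of what tracing a kymograph means (this is not how KymoButler works), the sketch below builds a synthetic kymograph for one particle and recovers its velocity by taking the brightest pixel in each time row and fitting a line:

```python
import numpy as np

# Minimal sketch of a kymograph and a naive trace, not KymoButler's deep-learning
# approach. A kymograph stacks the intensity along a fixed 1-D line (space) over
# successive frames (time); a moving particle appears as a sloped streak whose
# slope is its velocity.

n_frames, n_pixels = 100, 200
kymo = 0.05 * np.random.default_rng(2).random((n_frames, n_pixels))  # background
true_velocity = 0.8  # pixels per frame
for t in range(n_frames):
    x = int(10 + true_velocity * t)
    kymo[t, x] = 1.0                       # bright particle drifting to the right

# Naive trace: brightest pixel in each row (each time point), then a line fit.
trace = kymo.argmax(axis=1)
velocity = np.polyfit(np.arange(n_frames), trace, 1)[0]
print(f"estimated velocity: {velocity:.2f} px/frame (true: {true_velocity})")
```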

“We used the power of machine learning to solve this long-standing problem by automating the tracing of kymographs,” says lead author Maximilian Jakobs, a PhD student in the Department of Physiology, Development and Neuroscience at the University of Cambridge, UK.

The team developed the software, dubbed ‘KymoButler’, to automate the process. The software uses deep learning technology, which tries to mimic the networks in the brain to allow software to learn and become more proficient at a task over time and multiple attempts. They then tested KymoButler using both artificial and real data from scientists studying the movement of an array of different particles.

“We demonstrate that KymoButler performs as well as expert manual data analysis on kymographs with complex particle trajectories from a variety of biological systems,” Jakobs explains. The software could also complete in under one minute analyses that would take an expert 1.5 hours.

KymoButler is available for other researchers to download and use at kymobutler.deepmirror.ai. Senior author Kristian Franze, Reader in Neuronal Mechanics at the University of Cambridge, expects the software will continue to improve as it analyses more types of data. Researchers using the tool will be given the option of anonymously uploading their kymographs to help the team continue developing the software.

“We hope our tool will prove useful for others involved in analysing small particle movements, whichever field they may work in,” says Franze, whose lab is devoted to understanding how physical interactions between cells and their environment shape the development and regeneration of the brain.

Story Source:

Materials provided by eLife. Note: Content may be edited for style and length.

Categories
Hackster.io

Biohacker Implants a Transponder in Her Arm to Start Her Tesla

The biohacking movement is all about meshing our imperfect, biological human bodies with technology. The goal is to use technological augmentations to improve our bodies — or at least to add some convenience to our lives. Today, the most common of those augmentations is, by far, an RFID implant. With an RFID chip similar to those used for pet identification, you can use your hand to unlock doors, trigger tasks on your smartphone, and much more. Amie DD has recently taken that idea to the next level by implanting a transponder into her arm to start her Tesla.

This hack does involve a surgical operation, so the squeamish among you might want to avoid watching the video below. If this is something you want to do yourself, make sure you’re going to a professional and take all of the proper safety precautions, like Amie did. In this case, that meant removing the NFC chip and antenna from a Tesla valet key, and then encasing it in a special biopolymer that is safe for implantation. That step, which was done by Amal at VivoKey, is extremely important for ensuring that the implant is safe for the body.

Amie DD already has a fairly standard RFID implant in her hand, but was unable to get it working with her Tesla Model 3. For obvious reasons, the Tesla has additional layers of security that prevent users from simply cloning the RFID chip. Fortunately, replacement valet key cards are inexpensive — Amie DD just had to get the chip out of the plastic card. To do that, she immersed the card in acetone, allowing the plastic to be dissolved away. The chip and antenna were then sent off to VivoKey to be encased in biopolymer. After having the new implant professionally placed under the skin in her forearm, Amie DD can now start up her Tesla with her own body!

Author: Cameron Coward