Source: ScienceDaily

Future autonomous machines may build trust through emotion

Army research has extended the state of the art in autonomy by providing a more complete picture of how actions and nonverbal signals contribute to promoting cooperation. Researchers suggested guidelines for designing autonomous machines such as robots, self-driving cars, drones and personal assistants that will effectively collaborate with Soldiers.

Dr. Celso de Melo, computer scientist with the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory at CCDC ARL West in Playa Vista, California, in collaboration with Dr. Kazunori Terada from Gifu University in Japan, recently published a paper in Scientific Reports in which they show that emotion expressions can shape cooperation.

Autonomous machines that act on people’s behalf are poised to become pervasive in society, de Melo said; however, for these machines to succeed and be adopted, it is essential that people are able to trust and cooperate with them.

“Human cooperation is paradoxical,” de Melo said. “An individual is better off being a free rider, while everyone else cooperates; however, if everyone thought like that, cooperation would never happen. Yet, humans often cooperate. This research aims to understand the mechanisms that promote cooperation with a particular focus on the influence of strategy and signaling.”

Strategy defines how individuals act in one-shot or repeated interactions. For instance, tit-for-tat is a simple strategy specifying that the individual should act as his or her counterpart acted in the previous interaction.
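To make the idea concrete, here is a minimal Python sketch (not taken from the paper) of a tit-for-tat agent playing an iterated prisoner's dilemma; the payoff values and the helper names tit_for_tat and play are illustrative assumptions, not anything used in the study.

```python
# Illustrative sketch (not from the paper): tit-for-tat in an iterated
# prisoner's dilemma. The payoff values below are conventional textbook
# numbers, assumed for illustration only.

COOPERATE, DEFECT = "C", "D"

# payoffs[(my_move, their_move)] -> my payoff
PAYOFFS = {
    (COOPERATE, COOPERATE): 3,  # mutual cooperation
    (COOPERATE, DEFECT): 0,     # I am exploited
    (DEFECT, COOPERATE): 5,     # I exploit (most profitable for the self)
    (DEFECT, DEFECT): 1,        # mutual defection
}

def tit_for_tat(history):
    """Cooperate on the first round, then copy the counterpart's last move."""
    return COOPERATE if not history else history[-1]

def play(rounds, opponent_moves):
    """Play tit-for-tat against a fixed sequence of opponent moves."""
    history, total = [], 0
    for their_move in opponent_moves[:rounds]:
        my_move = tit_for_tat(history)
        total += PAYOFFS[(my_move, their_move)]
        history.append(their_move)
    return total

if __name__ == "__main__":
    # Against a cooperative counterpart, tit-for-tat settles into cooperation.
    print(play(5, [COOPERATE] * 5))  # 15
    # Against a defector, it is exploited only once, then retaliates.
    print(play(5, [DEFECT] * 5))     # 0 + 1 + 1 + 1 + 1 = 4
```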

Signaling refers to communication that may occur between individuals, which can be verbal (e.g., natural language conversation) or nonverbal (e.g., emotion expressions).

This research effort, which supports the Next Generation Combat Vehicle Army Modernization Priority and the Army Priority Research Area for Autonomy, aims to apply this insight in the development of intelligent autonomous systems that promote cooperation with Soldiers and successfully operate in hybrid teams to accomplish a mission.

“We show that emotion expressions can shape cooperation,” de Melo said. “For instance, smiling after mutual cooperation encourages more cooperation; however, smiling after exploiting others — which is the most profitable outcome for the self — hinders cooperation.”

The effect of emotion expressions is moderated by strategy, he said. People will only process and be influenced by emotion expressions if the counterpart’s actions are insufficient to reveal the counterpart’s intentions.

For example, when the counterpart acts very competitively, people simply ignore, and even mistrust, the counterpart’s emotion displays.
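As a purely illustrative toy, and not the authors' model, the following Python sketch encodes that qualitative finding: the hypothetical decide_cooperation helper consults the counterpart's emotion display only when their past actions are too mixed to reveal their intent; the thresholds are arbitrary assumptions.

```python
# Toy illustration only -- not the study's model. It encodes the qualitative
# finding that emotion displays matter only when the counterpart's actions
# are too ambiguous to reveal their intentions.

def decide_cooperation(action_history, emotion_after_last_round):
    """Return True to cooperate with the counterpart on the next round.

    action_history: list of the counterpart's past moves, "C" or "D".
    emotion_after_last_round: e.g. "smile_after_mutual_cooperation",
        "smile_after_exploiting_me", or None.
    """
    if not action_history:
        return True  # give the benefit of the doubt on the first round

    coop_rate = action_history.count("C") / len(action_history)

    if coop_rate >= 0.8:
        return True   # actions clearly cooperative: the display adds little
    if coop_rate <= 0.2:
        return False  # clearly competitive: emotion displays are ignored

    # Ambiguous actions: the emotion display tips the decision.
    if emotion_after_last_round == "smile_after_mutual_cooperation":
        return True   # signals cooperative intent
    if emotion_after_last_round == "smile_after_exploiting_me":
        return False  # gloating after exploitation hinders cooperation
    return coop_rate >= 0.5
```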

“Our research provides novel insight into the combined effects of strategy and emotion expressions on cooperation,” de Melo said. “It has important practical application for the design of autonomous systems, suggesting that a proper combination of action and emotion displays can maximize cooperation from Soldiers. Emotion expression in these systems could be implemented in a variety of ways, including via text, voice, and nonverbally through (virtual or robotic) bodies.”

According to de Melo, the team is very optimistic that future Soldiers will benefit from research such as this as it sheds light on the mechanisms of cooperation.

“This insight will be critical for the development of socially intelligent autonomous machines, capable of acting and communicating nonverbally with the Soldier,” he said. “As an Army researcher, I am excited to contribute to this research as I believe it has the potential to greatly enhance human-agent teaming in the Army of the future.”

The next steps for this research include pursuing further understanding of the role of nonverbal signaling and strategy in promoting cooperation and identifying creative ways to apply this insight on a variety of autonomous systems that have different affordances for acting and communicating with the Soldier.

Source: IEEE Spectrum

5G Small-Cell Base Station Antenna Array Design

In this eSeminar we will explore state-of-the-art simulation approaches for antenna array design, with a particular focus on 5G small-cell base station antennas.

Realizing the 5G promise of reliably providing high-data-rate connections to many users simultaneously requires new design approaches for base station antennas. In particular, antenna arrays will increasingly be used to enable agile beamforming and massive MIMO technology, both of which are required to provide good service in dynamic, complex urban environments with large numbers of users. The array design capabilities of SIMULIA CST Studio Suite have grown dramatically over the last few years and are relied on by many companies around the world.
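As a small, self-contained illustration of the beam-steering idea behind such arrays (standard antenna theory, not a CST Studio Suite workflow), the Python sketch below computes the normalized array factor of an ideal uniform linear array with half-wavelength spacing; the element count and steering angle are arbitrary example values.

```python
# Illustrative sketch: array factor of an ideal uniform linear array (ULA)
# with half-wavelength element spacing, steered by progressive phase shifts.
import numpy as np

def array_factor_db(n_elements, steer_deg, angles_deg, spacing_wavelengths=0.5):
    """Normalized array factor (in dB) of a ULA steered toward steer_deg."""
    k_d = 2 * np.pi * spacing_wavelengths            # element spacing in radians
    theta = np.radians(angles_deg)
    theta0 = np.radians(steer_deg)
    n = np.arange(n_elements)[:, None]
    # Progressive phase applied to each element to steer the main beam.
    phase = n * k_d * (np.sin(theta) - np.sin(theta0))
    af = np.abs(np.exp(1j * phase).sum(axis=0)) / n_elements
    return 20 * np.log10(np.maximum(af, 1e-6))

if __name__ == "__main__":
    angles = np.linspace(-90, 90, 721)
    pattern = array_factor_db(8, steer_deg=30, angles_deg=angles)
    peak_angle = angles[np.argmax(pattern)]
    print(f"Main beam of the 8-element array points at about {peak_angle:.1f} deg")
```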

Join us to learn more about how simulation can help you with your array design, as we answer the following questions.

  • How can antenna elements be designed and evaluated in terms of their suitability as an array element?
  • How can full arrays with real radomes be simulated accurately much more quickly than before using the new simulation-by-zones approach?
  • How can interference between multiple co-located arrays be evaluated using advanced hybrid simulation techniques?
  • Finally, how can the coverage performance of base station arrays in complex urban or indoor environments be predicted?

Source: ScienceDaily

New metallic material for flexible soft robots

‘Origami robots’ are state-of-the-art soft and flexible robots that are being tested for use in various applications including drug delivery in human bodies, search and rescue missions in disaster environments and humanoid robotic arms.

Because these robots need to be flexible, they are often made from soft materials such as paper, plastic and rubber. To be functional, sensors and electrical components are often added on top, but these add bulk to the devices.

Now, a team of NUS researchers has developed a novel method of creating a new metal-based material for use in these soft robots.

By combining metals such as platinum with burnt paper (ash), the team created a new material that has enhanced capabilities while maintaining the foldability and light weight of traditional paper and plastic. In fact, the new material is about half the weight of paper, which also makes it more power-efficient.

These characteristics make this material a strong candidate for making flexible and light prosthetic limbs which can be as much as 60 per cent lighter than their conventional counterparts. Such prosthetics can provide real-time strain sensing to give feedback on how much they are flexing, giving users finer control and immediate information — all without the need for external sensors which would otherwise add unwanted weight to the prosthetic.

This lightweight metallic backbone is at least three times lighter than conventional materials used to fabricate origami robots. It is also more power-efficient, enabling origami robots to work faster while using 30 per cent less energy. Furthermore, the novel material is fire-resistant, making it suitable for robots that work in harsh environments, as it can withstand burning at about 800°C for up to 5 minutes.

As an added advantage, the novel conductive material offers on-demand electrical (Joule) heating: sending a voltage through the material causes it to heat up, which helps to prevent icing damage when a robot works in a cold environment. These properties can be used in the creation of light, flexible search-and-rescue robots that can enter hazardous areas while providing real-time feedback and communication.

The metal-based material is produced through a new process developed by the team called ‘graphene oxide-enabled templating synthesis’. Cellulose paper is first soaked in a graphene oxide solution and then dipped into a solution of metallic ions such as platinum. The material is then burned in argon, an inert gas, at 800°C, and finally at 500°C in air.

The final product is a thin layer of metal, 90 micrometres (µm) or 0.09 mm thick, made up of 70 per cent platinum and 30 per cent amorphous carbon (ash) that is flexible enough to bend, fold, and stretch. This research breakthrough was published in the scientific journal Science Robotics on 28 August 2019. Other metals such as gold and silver can also be used.

Team leader Assistant Professor Chen Po-Yen used a cellulose template cut in the shape of a phoenix for his research. “We are inspired by the mythical creature. Just like the phoenix, it can be burnt to ash and reborn to become more powerful than before,” said Asst Prof Chen, who is from the NUS Department of Chemical and Biomolecular Engineering.

Conductive backbone for smarter origami robots

The team’s material can function as a mechanically stable, soft and conductive backbone that equips robots with strain-sensing and communication capabilities without the need for external electronics. Being conductive means the material acts as its own wireless antenna, allowing it to communicate with a remote operator or other robots without external communication modules. This expands the scope of origami robots to applications such as working in high-risk environments (e.g. chemical spills and fire disasters) as untethered, remote-controlled robots, or functioning as artificial muscles or humanoid robotic arms.

“We experimented with different electrically conductive materials to finally derive a unique combination that achieves optimal strain sensing and wireless communication capabilities. Our invention therefore expands the library of unconventional materials for the fabrication of advanced robots,” said Mr Yang Haitao, doctoral student at the Department of Chemical and Biomolecular Engineering and the first author of the study.

In the next steps of their research, Asst Prof Chen and his team are looking at adding more functions to the metallic backbone. One promising direction is to incorporate electrochemically active materials to fabricate energy storage devices such that the material itself is its own battery, allowing for the creation of self-powered robots. The team is also experimenting with other metals such as copper, which will lower the cost of the material’s production.

Source: ScienceDaily

Fingerprint test can distinguish between those who have taken or handled heroin

A state-of-the-art fingerprint detection technology can identify traces of heroin on human skin, even after someone has washed their hands — and it is also smart enough to tell whether an individual has used the drug or shaken hands with someone who has handled it.

In a paper published in the Journal of Analytical Toxicology, a team of experts from the University of Surrey details how it has built on its world-leading fingerprint drug-testing technology, based on high-resolution mass spectrometry, which can now detect heroin, its metabolite 6-monoacetylmorphine (6-AM), and other analytes associated with the class A drug.

The team took fingerprints from people seeking treatment at drug rehabilitation clinics who had testified to taking heroin or cocaine during the previous 24 hours. A fingerprint was collected from each finger of the right hand, and the participants were then asked to wash their hands thoroughly with soap and water and then wear nitrile gloves for a period of time before giving another set of fingerprints. This same process was used to collect samples from 50 drug non-users.

The researchers found that the technology was able to identify traces of heroin and 6-AM on drug non-users in every scenario the researchers devised — whether someone directly touched the drug, handled it and then thoroughly washed their hands, or had come into contact with heroin via shaking someone else’s hand.

Surrey’s system cross-referenced the information from the drug non-users with the volunteers who were being treated for drug dependency and found that compounds such as morphine, noscapine and acetylcodeine — alongside heroin and 6-AM — are essential to distinguishing those who have used the class A drug from those who have not. These analytes were only present in fingerprints from drug users.

Catia Costa from the University of Surrey said: “Our results have shown that this non-invasive and innovative technology is sensitive enough to identify class A drugs in several scenarios — even after people have washed their hands. Crucially, our study shows that the process of hand washing is important when trying to assess, from their fingerprint, whether someone has used a class A drug.”

Dr Melanie Bailey from the University of Surrey said: “Our team here at the University of Surrey believes that the technology we are developing will make our communities safer and shorten the route for those who need help to beat their addictions. We also believe the technology has scope in other areas, such as confirming whether a patient is taking their medication.”

Story Source:

Materials provided by University of Surrey.

Source: IEEE Spectrum

NASA Hiring Engineers to Develop “Next Generation Humanoid Robot”

It’s been nearly six years since NASA unveiled Valkyrie, a state-of-the-art full-size humanoid robot. Since the DARPA Robotics Challenge, NASA has continued to work with Valkyrie at Johnson Space Center, and has also provided Valkyrie robots to several universities. Although it’s not a new platform anymore (six years is a long time in robotics), Valkyrie is still very capable, with plenty of potential for robotics research.

With that in mind, we were caught by surprise when, over the last several months, Jacobs, a Dallas-based engineering company that appears to provide a wide variety of technical services to anyone who wants them, posted several openings for roboticists in the Houston, Texas, area who are interested in working with NASA on “the next generation of humanoid robot.”