Categories
ScienceDaily

AI helps scientists understand brain activity behind thoughts

A team led by researchers at Baylor College of Medicine and Rice University has developed artificial intelligence (AI) models that help them better understand the brain computations that underlie thoughts. This is a new development because, until now, there has been no method to measure thoughts. The researchers first developed a model that estimates thoughts by evaluating behavior, then tested it on a trained artificial brain, where they found neural activity associated with those estimates of thoughts. The theoretical study appears in the Proceedings of the National Academy of Sciences.

“For centuries, neuroscientists have studied how the brain works by relating brain activity to inputs and outputs. For instance, when studying the neuroscience of movement, scientists measure muscle movements as well as neuronal activity, and then relate those two measurements,” said corresponding author Dr. Xaq Pitkow, assistant professor of neuroscience at Baylor and of electrical and computer engineering at Rice. “To study cognition in the brain, however, we don’t have anything to compare the measured neural activity to.”

To understand how the brain gives rise to thought, researchers first need to measure a thought. They developed a method called “Inverse Rational Control” that looks at a behavior and infers the beliefs or thoughts that best explain that behavior.

Traditionally, researchers in this field have worked with the idea that animals solve tasks optimally, behaving in a way that maximizes their net benefits. But when scientists study animal behavior, they find that this is not always the case.

“Sometimes animals have ‘wrong’ beliefs or assumptions about what’s going on in their environment, but still they try to find the best long-term outcomes for their task, given what they believe is going on around them. This could account for why animals seem to behave suboptimally,” said Pitkow, who also is a McNair Scholar at Baylor, co-director of Baylor’s Center for Neuroscience and Artificial Intelligence and member of the Rice Neuroengineering Initiative.

For example, consider an animal that is hunting and hears many noises it associates with prey. If one potential prey is making all the noises, the optimal behavior for the hunter is to consistently target its movements to a single noise. If the hunter mistakenly believes the noises are coming from many different animals, it may choose a suboptimal behavior, like constantly scanning its surroundings to try to pinpoint one of them. By acting according to its belief or assumption that there are many potential prey nearby, the hunter is behaving in a way that is simultaneously ‘rational’ and ‘suboptimal.’

In the second part of the work, Pitkow and his colleagues developed a model to relate the thoughts that were identified using the Inverse Rational Control method to brain activity.

“We can look at the dynamics of the modeled thoughts and at the dynamics of the brain’s representations of those thoughts. If those dynamics run parallel to each other, then we have confidence that we are capturing the aspects of the brain computations involved in those thoughts,” Pitkow said. “By providing methods to estimate thoughts and interpret neural activity associated with them, this study can help scientists understand how the brain produces complex behavior and provide new perspectives on neurological conditions.”

Other contributors to this work include Zhengwei Wu, Minhae Kwon, Saurabh Daptardar and Paul Schrater. The authors are affiliated with one or more of the following institutions: Baylor College of Medicine, Rice University, Soongsil University, Google Maps, and the University of Minnesota.

This work was supported in part by BRAIN Initiative grant NIH 5U01NS094368, an award from the McNair Foundation, the Simons Collaboration on the Global Brain award 324143, the National Science Foundation award 1450923 BRAIN 43092 and NSF CAREER Award IOS-1552868.

Story Source:

Materials provided by Baylor College of Medicine.

Categories
ScienceDaily

1 in 3 who are aware of deepfakes say they have inadvertently shared them on social media

A Nanyang Technological University, Singapore (NTU Singapore) study has found that many Singaporeans, despite being aware of the existence of ‘deepfakes’ in general, report having circulated deepfake content on social media that they only later discovered was a hoax.

Deepfakes, a portmanteau of ‘deep learning’ and ‘fake’, are ultrarealistic fake videos made with artificial intelligence (AI) software to depict people doing things they have never done — not just slowing them down or changing the pitch of their voice, but also making them appear to say things that they have never said at all.

In a survey of 1,231 Singaporeans led by NTU Singapore’s Assistant Professor Saifuddin Ahmed, 54 per cent of respondents said they were aware of deepfakes, and of those, one in three reported having shared content on social media that they subsequently learnt was a deepfake.

The study also found that more than one in five of those who are aware of deepfakes said that they regularly encounter deepfakes online.

The survey findings, reported in the journal Telematics and Informatics in October, come in the wake of a rising number of deepfake videos identified online. Sensity, a deepfake detection technology firm, estimates that the number of deepfake videos identified online doubled to 49,081 over the six months to June 2020.

Deepfakes that have gone viral include one with former President Barack Obama using an expletive to describe President Donald Trump in 2018, and another last year of Facebook founder Mark Zuckerberg claiming to control the future, thanks to stolen data.

Assistant Professor Saifuddin of NTU’s Wee Kim Wee School of Communication and Information said: “Fake news refers to false information published under the guise of being authentic news to mislead people, and deepfakes are a new, far more insidious form of fake news. In some countries, we are already witnessing how such deepfakes can be used to create non-consensual porn, incite fear and violence, and influence civic mistrust. As the AI technology behind the creation of deepfakes evolves, it will be even more challenging to discern fact from fiction.”

“While tech companies like Facebook, Twitter and Google have started to label what they have identified as manipulated online content like deepfakes, more efforts will be required to educate the citizenry in effectively negating such content.”

Americans more likely than Singaporeans to share deepfakes

The study benchmarked the findings on Singaporeans’ understanding of deepfakes against a similar demographic and number of respondents in the United States.

Respondents in the US were more aware of deepfakes than those in Singapore (61% vs. 54%). They also reported being more concerned about deepfakes and encountering them more frequently. More US respondents than Singaporeans reported sharing content that they later learnt was a deepfake (39% vs. 33%).

Asst Prof Saifuddin said: “These differences are not surprising, given the more widespread relevance and public discussion surrounding deepfakes in the US. More recently, a rise in the number of deepfakes, including those of President Donald Trump, has raised anxieties regarding the destructive potential of this form of disinformation.

“On the other hand, Singapore has not witnessed direct impacts of deepfakes, and the government has introduced the Protection from Online Falsehoods and Manipulation Act (POFMA) to limit the threat posed by disinformation, including deepfakes.”

But legislation alone is not enough, he added, citing a 2018 survey by global independent market research agency Ipsos which found that while four in five Singaporeans say that they can confidently spot fake news, more than 90 per cent mistakenly identified at least one in five fake headlines as being real.

“The government’s legislation to inhibit the pervasive threat of disinformation has also been helpful, but we need to continue improving digital media literacy going forward, especially for those who are less capable of discerning facts from disinformation,” said Asst Prof Saifuddin, whose research interests include social media and public opinion.

The NTU study on deepfake awareness was funded by the University and Singapore’s Ministry of Education, and the findings are part of a longer-term study that examines citizens’ trust in AI technology.

Categories
IEEE Spectrum

Does AI in Healthcare Need More Emotion?

Categories
ProgrammableWeb

Twilio Simplifies Access to Call Events Resources

Twilio has provided a new resource that will allow developers to programmatically access individual call events. This information, which was previously only available via the Twilio Console, can now be accessed by integrating with the Call Events resource.

The new Call Events resource is part of the Twilio Voice API and will enable access to the event logs of a specific call, allowing developers to perform a detailed analysis of call data.
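For orientation, the snippet below is a minimal sketch, not Twilio's official sample code, of how a developer might fetch these events from Python. It assumes the Call Events subresource follows Twilio's usual REST conventions and is exposed under the parent call at /2010-04-01/Accounts/{AccountSid}/Calls/{CallSid}/Events.json; the exact path and the response field names should be verified against Twilio's documentation.

```python
import os
import requests

# Sketch only: fetch the event log for a single call from Twilio's REST API.
# The Events subresource path is an assumption based on Twilio's usual URL
# conventions; confirm it against the official Call Events documentation.
ACCOUNT_SID = os.environ["TWILIO_ACCOUNT_SID"]
AUTH_TOKEN = os.environ["TWILIO_AUTH_TOKEN"]
CALL_SID = "CAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"  # placeholder call SID

url = (
    f"https://api.twilio.com/2010-04-01/Accounts/{ACCOUNT_SID}"
    f"/Calls/{CALL_SID}/Events.json"
)

# Twilio's REST API uses HTTP Basic auth with the account SID and auth token.
response = requests.get(url, auth=(ACCOUNT_SID, AUTH_TOKEN))
response.raise_for_status()

# The list key ("events") is assumed here; Twilio list responses typically use
# a lowercase plural of the resource name.
for event in response.json().get("events", []):
    print(event)  # each entry describes one request/response step in the call
```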

Developers who are interested in working with the new resource can check out the documentation provided by Twilio.

Author: KevinSundstrom

Categories
ScienceDaily

Supersized wind turbines generate clean energy–and surprising physics

Twenty years ago, wind energy was mostly a niche industry that contributed less than 1% to the total electricity demand in the United States. Wind has since emerged as a serious contender in the race to develop clean, renewable energy sources that can sustain the grid and meet the ever-rising global energy demand. Last year, wind energy supplied 7% of domestic electricity demand, and across the country — both on and offshore — energy companies have been installing giant turbines that reach higher and wider than ever before.

“Wind energy is going to be a really important component of power production,” said engineer Jonathan Naughton at the University of Wyoming, in Laramie. He acknowledged that skeptics doubt the viability of renewable energy sources like wind and solar because they’re weather dependent and variable in nature, and therefore hard to control and predict. “That’s true,” he said, “but there are ways to overcome that.”

Naughton and Charles Meneveau at Johns Hopkins University in Baltimore, Maryland, organized a mini-symposium at the 73rd Annual Meeting of the American Physical Society’s Division of Fluid Dynamics, where researchers described the promise and fluid dynamics challenges of wind energy.

In order for wind energy to be useful — and accepted — researchers need to design systems that are both efficient and inexpensive, Naughton said. That means gaining a better understanding of the physical phenomena that govern wind turbines, at all scales. Three years ago, the U.S. Department of Energy’s National Renewable Energy Laboratory (NREL) brought together 70 experts from around the world to discuss the state of the science. In 2019, the group published grand scientific challenges that need to be addressed for wind energy to contribute up to half of the demand for power.

One of those challenges was to better understand the physics of the part of the atmosphere where the turbines operate. “Wind is really an atmospheric fluid mechanics problem,” said Naughton. “But how the wind behaves at the levels where the turbines operate is still an area where we need more information.”

Today’s turbines have blades that can stretch 50 to 70 meters, said Paul Veers, Chief Engineer at NREL’s National Wind Technology Center, who provided an overview of the challenges during the symposium. Their towers rise 100 meters or more above the surrounding landscape. “Offshore, they’re getting even bigger,” said Veers.

The advantage of building bigger turbines is that a wind power plant needs fewer machines to build and maintain, and taller turbines can reach the more powerful winds high above the ground. But giant power plants function at a scale that hasn’t been well-studied, said Veers.

“We have a really good ability to understand and work with the atmosphere at really large scales,” said Veers. “And scientists like Jonathan and Charles have done amazing jobs with fluid dynamics to understand small scales. But between these two, there’s an area that has not been studied all that much.”

Another challenge will be to study the structural and system dynamics of these giant rotating machines. The winds interact with the blades, which bend and twist. The spinning blades give rise to high Reynolds numbers, “and those are areas where we don’t have a lot of information,” said Naughton.

Powerful computational approaches can help reveal the physics, said Veers. “We’re really pushing the computational methods as far as possible,” he said. “It’s taking us to the fastest and biggest computers that exist right now.”

A third challenge, Naughton noted, is to study the behavior of groups of turbines. Every turbine produces a wake in the atmosphere, and as that wake propagates downstream it interacts with the wakes from other turbines. Wakes may combine, and they may also interfere with other turbines or with anything else in the area. “If there’s farmland downwind, we don’t know how the change in the atmospheric flow will affect it,” said Naughton.

He called wind energy the “ultimate scale problem.” Because it connects small-scale problems like the interactions of turbines with the air to giant-scale problems like atmospheric modeling, wind energy will require expertise and input from a variety of fields to address the challenges. “Wind is among the cheapest forms of energy,” said Naughton. “But as the technology matures, the questions get harder.”

Categories
ScienceDaily

Tracking and fighting fires on earth and beyond

Mechanical engineer Michael Gollner and his graduate student, Sriram Bharath Hariharan, from the University of California, Berkeley, recently traveled to NASA’s John H. Glenn Research Center in Cleveland, Ohio. There, they dropped burning objects down a deep shaft and studied how fire whirls form in microgravity. The Glenn Center hosts a Zero Gravity Research Facility, which includes an experimental drop tower that simulates the experience of being in space.

“You get five seconds of microgravity,” said Gollner. The researchers lit a small paraffin wick to generate fire whirls and dropped it, studying the flame all the way down.

Experiments like this, presented at the 73rd Annual Meeting of the American Physical Society’s Division of Fluid Dynamics, can help fire scientists answer two kinds of questions. First, they illuminate ways that fire can burn in the absence of gravity — and may even inform protective measures for astronauts. “If something’s burning, it could be a very dangerous situation in space,” said Gollner. Second, they can help researchers better understand gravity’s role in the growth and spread of destructive fires.

The fire burned differently without gravity, said Gollner. The flame was shorter — and wider. “We saw a real slow down of combustion,” said Gollner. “We didn’t see the same dramatic whirls that we have with ordinary gravity.”

Other researchers, including a team from Los Alamos National Laboratory in New Mexico, introduced new developments to a computational fluid dynamics model that can incorporate fuels of varying moisture content. Many existing environmental models average the moisture of all the fuels in an area, but that approach fails to capture the variations found in nature, said chemical engineer Alexander Josephson, a postdoctoral researcher who studies wildfire prediction at Los Alamos. As a result, those models may yield inaccurate predictions in wildfire behavior, he said.

“If you’re walking through the forest, you see wood here and grass there, and there’s a lot of variation,” said Josephson. Dry grasses, wet mosses, and hanging limbs don’t have the same water content and burn in different ways. A fire may be evaporating moisture from wet moss, for example, at the same time it’s consuming drier limbs. “We wanted to explore how the interaction between those fuels occurs as the fire travels through.”

Los Alamos scientists worked to improve FIRETEC, a model developed by Rod Linn, in collaboration with researchers at the University of Alberta in Canada and the Canadian Forest Service. Their new developments accommodate variations in moisture content and other characteristics of the simulated fuel types. Researcher Ginny Marshall from the Canadian Forest Service recently began comparing the model’s simulations to real-world data from boreal forests in northern Canada.

During a session on reacting flows, Matthew Bonanni, a graduate student in the lab of engineer Matthias Ihme at Stanford University in California, described a new model for wildfire spread built on a machine learning platform. Predicting where and when fires will burn is a complicated process, said Ihme, driven by a complex mix of environmental influences.

The goal of Ihme’s group was to build a tool that is both accurate and fast, suitable for risk assessment, early warning systems, and designing mitigation strategies. They built their model on TensorFlow, an open-source machine learning framework developed by Google. As the model trains on more physical data, said Ihme, its simulations of heat accumulation and fire-spreading dynamics improve — and get faster.
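As a rough illustration of this general approach, and emphatically not the Stanford group’s actual model, the sketch below trains a small TensorFlow network to act as a fast surrogate for a physics-based fire calculation. The input features (fuel moisture, wind speed, slope) and the synthetic training data are hypothetical stand-ins for the physical data the real model learns from.

```python
import numpy as np
import tensorflow as tf

# Illustrative toy surrogate: map hypothetical environmental inputs
# (fuel moisture, wind speed, slope) to a fire spread rate. The "simulation"
# data below is synthetic; a real workflow would train on outputs of a
# physics-based fire model or on observations.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(1000, 3))          # [moisture, wind, slope], normalized
y = 2.0 * X[:, 1] + 1.0 * X[:, 2] - 1.5 * X[:, 0]  # stand-in spread rate
y += rng.normal(0.0, 0.05, size=y.shape)           # noise on the synthetic data

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(3,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=32, verbose=0)

# Once trained, the surrogate scores new conditions far faster than rerunning
# the underlying physics, which is what makes this style of model attractive
# for early-warning and risk-assessment use.
print(model.predict(np.array([[0.2, 0.8, 0.5]]), verbose=0))
```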

Ihme said he’s excited to see what advanced computational tools bring to wildfire prediction. “It used to be a very empirical research area, based on physical observations, and our community works on more fundamental problems,” he said. But adding machine learning to the toolbox, he said, shows how algorithms can improve the fidelity of experiments. “This is a really exciting pathway,” he said.

Categories
ScienceDaily

The science of windy cities

Global population and urbanization have boomed over the last few decades. With them came scores of new tall buildings, drones, more energy-efficient ventilation systems, and planned air taxis by Uber and other companies. But these technological advancements must contend with a natural physical phenomenon: wind.

Scientists presented the latest findings on modeling and predicting urban airflow — in the hope of building better buildings, cities, and transportation — at the 73rd Annual Meeting of the American Physical Society’s Division of Fluid Dynamics.

The urban skies of the future could teem with autonomous aircraft: air taxis, drones, and other self-flying systems. A team from Oklahoma State University has developed techniques to model environmental hazards these vehicles might encounter so they can safely navigate cities.

“Urban environments present enormous challenges for drone and urban air mobility platforms,” said researcher Jamey Jacob, who led the team. “In addition to the challenges of traffic congestion and obstacles, critical technology gaps exist in modeling, detecting, and accommodating the dynamic urban local wind fields as well as in precision navigation through uncertain weather conditions.”

Researchers attached sensors to robotic aircraft to take more cohesive measurements of building wakes, or the disturbed airflow around buildings. They combined this data with numerical predictions to get a better picture of the complex wind patterns found in urban environments.

The work could help improve wind and weather forecasting, not only for unmanned aircraft but also for conventional airplanes.

“The potential of outfitting every drone and urban air taxi, as well as other aircraft, with sensors provides a game changing opportunity in our capability to monitor, predict, and report hazardous weather events,” said Jacob.

Another group, based at the University of Surrey, also investigated building wakes. With an eye toward enhancing air quality in cities, they looked for differences between the wake of a single tall building and that of a cluster of tall buildings.

“Understanding how to model the wake of tall buildings is the first step to enable city planners to reduce the heat-island effect as well as improve urban air quality,” said Joshua Anthony Minien, a researcher in mechanical engineering.

The team carried out experiments in a wind tunnel, varying the grouping, aspect ratio, and spacing of tall buildings. They were encouraged to see that when measured far enough downstream, a cluster of buildings and an isolated building have similar wake characteristics. Changes to wind direction also seem to significantly affect the wakes of clusters of buildings.

All buildings, tall or not, must be ventilated.

“The ability to predict ventilation flow rates, purging times and flow patterns is important for human comfort and health, as highlighted by the need to prevent the airborne spread of coronavirus,” said University of Cambridge researcher Nicholas Wise.

With engineering professor Gary Hunt, Wise found a problem in current models of passive natural ventilation systems. These often use displacement flow — where cooler night air enters a building through one opening and warmer air accumulated during the day exits through another opening.

Their mathematical modeling revealed that displacement flow does not continue during the purge of warm air, as was believed. Instead, the room experiences an “unbalanced exchange flow” which can slow down the purging process.

“Every displacement flow transitions to unbalanced exchange flow,” said Wise.

The researchers were surprised at just how much adding a small low-level opening speeds up room cooling, compared to a room with only a high-level opening. Their model will be useful for designers of natural ventilation systems.

Categories
ScienceDaily

Galaxy encounter violently disturbed Milky Way

The Milky Way’s spiral-shaped disc of stars and planets is being pulled, twisted and deformed with extreme violence by the gravitational force of a smaller galaxy — the Large Magellanic Cloud (LMC).

Scientists believe the LMC crossed the Milky Way’s boundary around 700 million years ago — recent by cosmological standards — and due to its large dark matter content it strongly upset our galaxy’s fabric and motion as it fell in.

The effects are still being witnessed today and should force a revision of how our galaxy evolved, astronomers say.

The LMC, now a satellite galaxy of the Milky Way, is visible as a faint cloud in the southern hemisphere’s night skies — as observed by its namesake, the 16th century Portuguese explorer Ferdinand Magellan.

Previous research has revealed that the LMC, like the Milky Way, is surrounded by a halo of dark matter — elusive particles which surround galaxies and do not absorb or emit light but have dramatic gravitational effects on the movement of stars and gas in the universe.

Using a sophisticated statistical model that calculated the speed of the Milky Way’s most distant stars, the University of Edinburgh team discovered how the LMC warped our galaxy’s motion. The study, published in Nature Astronomy, was funded by the UK Science and Technology Facilities Council (STFC).

The researchers found that the enormous attraction of the LMC’s dark matter halo is pulling and twisting the Milky Way disc at 32 km/s, or 115,200 kilometres per hour, towards the constellation Pegasus.

To their surprise they also found that the Milky Way was not moving towards the LMC’s current location, as previously thought, but towards a point in its past trajectory.

They believe this is because the LMC, powered by its massive gravitational force, is moving away from the Milky Way at the even faster speed of 370 km/s, around 1.3 million kilometres per hour.

Astronomers say it is as if the Milky Way is trying hard to hit a fast-moving target, but not aiming very well.

This discovery will help scientists develop new modelling techniques that capture the strong dynamic interplay between the two galaxies.

Astronomers now intend to find out the direction from which the LMC first fell in to the Milky Way and the exact time it happened. This will reveal the amount and distribution of dark matter in the Milky Way and the LMC with unprecedented detail.

Dr Michael Petersen, lead author and Postdoctoral Research Associate, School of Physics and Astronomy, said:

“Our findings beg for a new generation of Milky Way models, to describe the evolution of our galaxy.

“We were able to show that stars at incredibly large distances, up to 300,000 light-years away, retain a memory of the Milky Way structure before the LMC fell in, and form a backdrop against which we measured the stellar disc flying through space, pulled by the gravitational force of the LMC.”

Professor Jorge Peñarrubia, Personal Chair of Gravitational Dynamics, School of Physics and Astronomy, said:

“This discovery definitely breaks the spell that our galaxy is in some sort of equilibrium state. Actually, the recent infall of the LMC is causing violent perturbations onto the Milky Way.

“Understanding these may give us an unparalleled view on the distribution of dark matter in both galaxies.”

Categories
ScienceDaily

Flow physics could help forecasters predict extreme events

About 1,000 tornadoes strike the United States each year, causing billions of dollars in damage and killing about 60 people on average. Tracking data show that they’re becoming increasingly common in the Southeast and less frequent in “Tornado Alley,” which stretches across the Great Plains. Scientists lack a clear understanding of how tornadoes form, but a more urgent challenge is to develop more accurate prediction and warning systems. Doing so requires a fine balance: without warnings, people can’t shelter, but if they experience too many false alarms, they’ll become inured.

One way to improve tornado prediction tools might be to listen better, according to mechanical engineer Brian Elbing at Oklahoma State University in Stillwater, in the heart of Tornado Alley. He doesn’t mean any sounds audible to human ears, though. As long ago as the 1960s, researchers reported evidence that tornadoes emit signature sounds at frequencies that fall outside the range of human hearing. People can hear down to about 20 Hertz — which sounds like a low rumble — but a tornado’s song likely falls somewhere between 1 and 10 Hertz.

Brandon White, a graduate student in Elbing’s lab, discussed their recent analyses of the infrasound signature of tornadoes at the 73rd Annual Meeting of the American Physical Society’s Division of Fluid Dynamics.

Elbing said these infrasound signatures had seemed like a promising avenue of research, at least until radar emerged as a frontrunner technology for warning systems. Acoustic-based approaches took a back seat for decades. “Now we’ve made a lot of advances with radar systems and monitoring, but there are still limitations. Radar requires line of sight measurements.” But line of sight can be tricky in hilly places like the Southeast, where the majority of tornado deaths occur.

Maybe it’s time to revisit those acoustic approaches, said Elbing. In 2017, his research group recorded infrasound bursts from a supercell that produced a small tornado near Perkins, Oklahoma. When they analyzed the data, they found that the vibrations began before the tornado formed.

Researchers still know little about the fluid dynamics of tornadoes. “To date there have been eight trusted measurements of pressure inside a tornado, and no classical theory predicts them,” said Elbing. He doesn’t know how the sound is produced, either, but knowing the cause isn’t required for an alarm system. The idea of an acoustics-based system is straightforward.

“If I dropped a glass behind you and it shattered, you don’t need to turn around to know what happened,” said Elbing. “That sound gives you a good sense of your immediate environment.” Infrasound vibrations can travel over long distances quickly, and through different media. “We could detect tornadoes from 100 miles away.”

Members of Elbing’s research group also described a sensor array for detecting tornadoes via acoustics and presented findings from studies on how infrasound vibrations travel through the atmosphere. The work on infrasound tornado signatures was supported by a grant from NOAA.

Other sessions during the Division of Fluid Dynamics meeting similarly addressed ways to study and predict extreme events. During a session on nonlinear dynamics, MIT engineer Qiqi Wang revisited the butterfly effect, a well-known idea in fluid dynamics that asks whether a butterfly flapping its wings in Brazil could trigger a tornado in Texas.

What’s unclear is whether the flapping of the butterfly’s wings can lead to changes in the long-term statistics of the climate. By investigating the question computationally in small chaotic systems, he found that small perturbations can indeed effect long-term changes, a finding that suggests even small efforts can lead to lasting changes in the climate of a system.

During the same session, mechanical engineer Antoine Blanchard, a postdoctoral researcher at MIT, introduced a smart sampling algorithm designed to help quantify and predict extreme events — like extreme storms or cyclones, for example. Extreme events occur with low probability, he said, and therefore require large amounts of data, which can be expensive to generate, computationally or experimentally. Blanchard, whose background is in fluid dynamics, wanted to find a way to identify outliers more economically. “We’re trying to identify those dangerous states using as few simulations as possible.”

The algorithm he designed is a kind of black box: Any dynamical state can be fed as an input, and the algorithm will return a measure of the dangerousness of that state.
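To make that interface concrete, the sketch below shows a generic active-sampling loop in the same spirit; it is not Blanchard’s published algorithm. A cheap surrogate model is fit to the states evaluated so far, and each new expensive simulation is spent on the candidate state the surrogate rates as most severe or most uncertain. The one-dimensional “state” and the stand-in severity function are purely illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

# Generic sketch of searching for dangerous states with few expensive runs.
# Not the algorithm described in the talk; the severity function below is a
# made-up stand-in for a costly fluid or weather simulation.

def expensive_simulation(x):
    """Stand-in for an expensive simulation returning a 'dangerousness' score."""
    return np.exp(-10.0 * (x - 0.7) ** 2) + 0.05 * np.sin(20.0 * x)

candidates = np.linspace(0.0, 1.0, 200).reshape(-1, 1)   # candidate states
X_train = np.array([[0.1], [0.5], [0.9]])                # a few initial runs
y_train = np.array([expensive_simulation(x[0]) for x in X_train])

for _ in range(10):
    surrogate = GaussianProcessRegressor(normalize_y=True).fit(X_train, y_train)
    mean, std = surrogate.predict(candidates, return_std=True)
    # Acquisition rule: favor states predicted to be severe or poorly understood.
    idx = int(np.argmax(mean + 2.0 * std))
    x_next = candidates[idx]
    candidates = np.delete(candidates, idx, axis=0)      # don't re-pick the same state
    X_train = np.vstack([X_train, x_next.reshape(1, 1)])
    y_train = np.append(y_train, expensive_simulation(x_next[0]))

# Most dangerous state found after only ~13 expensive evaluations.
print(X_train[np.argmax(y_train)], y_train.max())
```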

“We’re trying to find the doors to danger. If you open that particular door, will the system remain quiescent, or will it go crazy?” asked Blanchard. “What are the states and conditions — like weather conditions, for example — that if you were to evolve them over time could cause a cyclone or storm?”

Blanchard said he’s still refining the algorithm but hopes to start applying it to real data and large-scale experiments soon. He also said it may have implications beyond the weather, in any system that produces extreme events. “It’s a very general algorithm.”

Categories
IEEE Spectrum

Boston Dynamics’ Spot Is Helping Chernobyl Move Towards Safe Decommissioning