Categories
ScienceDaily

If relaxed too soon, physical distancing measures might have been all for naught

If physical distancing measures in the United States are relaxed while there is still no COVID-19 vaccine or treatment and while personal protective equipment remains in short supply, the number of resulting infections could be about the same as if distancing had never been implemented to begin with, according to a UCLA-led team of mathematicians and scientists.

The researchers compared the results of three related mathematical models of disease transmission that they used to analyze data emerging from local and national governments, including one that measures the dynamic reproduction number — the average number of people infected by a single infectious person. The models all highlight the dangers of relaxing public health measures too soon.

“Distancing efforts that appear to have succeeded in the short term may have little impact on the total number of infections expected over the course of the pandemic,” said lead author Andrea Bertozzi, a distinguished professor of mathematics who holds UCLA’s Betsy Wood Knapp Chair for Innovation and Creativity. “Our mathematical models demonstrate that relaxing these measures in the absence of pharmaceutical interventions may allow the pandemic to reemerge. It’s about reducing contact with other people, and this can be done with PPE as well as distancing.”

The study is published in the journal Proceedings of the National Academy of Sciences and is applicable to both future spikes of COVID-19 and future pandemics, the researchers say.

If distancing and shelter-in-place measures had not been taken in March and April, it is very likely the number of people infected in California, New York and elsewhere would have been dramatically higher, posing a severe burden on hospitals, Bertozzi said. But the total number of infections predicted if these precautions end too soon is similar to the number that would be expected over the course of the pandemic without such measures, she said. In other words, short-term distancing can slow the spread of the disease but may not result in fewer people becoming infected.
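
To see the effect the authors describe, consider a minimal SIR-style sketch (this is not the authors' model, and the parameter values below are illustrative assumptions rather than quantities fitted to real data): a temporary reduction in the contact rate delays the epidemic but leaves the total fraction ever infected almost unchanged.

```python
# Minimal SIR sketch (not the authors' model); all parameter values are illustrative.

def total_infected(beta_at, gamma=0.1, days=600, dt=0.1, i0=1e-4):
    """Integrate a simple SIR model with a time-varying contact rate beta_at(t)."""
    s, i, r = 1.0 - i0, i0, 0.0
    for step in range(int(days / dt)):
        t = step * dt
        new_infections = beta_at(t) * s * i * dt
        new_recoveries = gamma * i * dt
        s, i, r = s - new_infections, i + new_infections - new_recoveries, r + new_recoveries
    return r  # fraction of the population ever infected

no_distancing = lambda t: 0.25                              # basic reproduction number of 2.5 throughout
temporary     = lambda t: 0.10 if 30 <= t <= 120 else 0.25  # distancing for roughly three months only

print(f"no distancing:        {total_infected(no_distancing):.2f}")
print(f"temporary distancing: {total_infected(temporary):.2f}")
```

In this toy model the temporary-distancing scenario peaks later but ends with roughly the same cumulative total, which is the qualitative point about short-term distancing.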

Mathematically modeling and forecasting the spread of COVID-19 are critical for effective public health policy, but wide differences in precautionary approaches across the country have made accurate modeling a challenge, said Bertozzi, who is also a distinguished professor of mechanical and aerospace engineering. Social distancing and wearing face masks reduce the spread of COVID-19, but people in many states are not following distancing guidelines and are not wearing masks — and the number of infections continues to rise.

What are the implications of these findings for policymakers who want to relax social distancing in an effort to revive their economies?

“Policymakers need to be careful,” Bertozzi said. “Our study predicts a surge in cases in California after distancing measures are relaxed. Alternative strategies exist that would allow the economy to ramp up without substantial new infections. Those strategies all involve significant use of PPE and increased testing.”

During the 1918 influenza pandemic, social distancing was first enforced and then relaxed in some areas. Bertozzi points to a study published in Proceedings of the National Academy of Sciences in 2007 that looked at several American cities during that pandemic where a second wave of infections occurred after public health measures were removed too early.

That study found that the timing of public health interventions had a profound influence on the pattern of the second wave of the 1918 pandemic in different cities. Cities that had introduced measures early in the pandemic achieved significant reductions in overall mortality. Larger reductions in peak mortality were achieved by those cities that extended the public health measures for longer. San Francisco, St. Louis, Milwaukee and Kansas City, for instance, had the most effective interventions, reducing transmission rates by 30% to 50%.

“Researchers Martin Bootsma and Neil Ferguson were able to analyze the effectiveness of distancing measures by comparing the data against an estimate for what might have happened had distancing measures not been introduced,” Bertozzi said of the 2007 study. “They considered data from the full pandemic, while we addressed the question of fitting models to early-time data for this pandemic. During the 1918 influenza pandemic, the early relaxation of social distancing measures led to a swift uptick in deaths in some U.S. cities. Our mathematical models help to explain why this effect might occur today.”

The COVID-19 data in the new study are from April 1, 2020, and are publicly available. The study is aimed at scientists who are not experts in epidemiology.

“Epidemiologists are in high demand during a pandemic, and public health officials from local jurisdictions may have a need for help interpreting data,” Bertozzi said. “Scientists with relevant background can be tapped to assist these people.”

Study co-authors are Elisa Franco, a UCLA associate professor of mechanical and aerospace engineering and bioengineering; George Mohler, an associate professor of computer and information science at Indiana University-Purdue University Indianapolis; Martin Short, an associate professor of mathematics at Georgia Tech; and Daniel Sledge, an associate professor of political science at the University of Texas at Arlington.


Categories
ProgrammableWeb

Why COVID-19 Makes App Security More Important than Ever

The United States is still hoping to fully reopen, but COVID-19 is more prevalent than ever, with the nation and many states reporting record daily infection rates. Even though employment has recovered somewhat, the country is still facing a more than 10% unemployment rate, and while many restaurants, stores and other physical businesses have reopened, people are not yet returning to them in the same numbers as before the pandemic.  

The pandemic will undoubtedly transform many aspects of our country, and some of these changes are already apparent. Organizations that had depended on a physical location to interact with and draw customers have had to change their business models to emphasize contactless or near contactless transactions, where goods are delivered or picked up curbside. Contactless transactions typically involve online communications.

According to the Pew Research Center, 74% of households own a computer and 84% have a smartphone. But when it comes to usage, mobile dominates. More than half of worldwide Internet traffic last year came from mobile devices, and U.S. consumers spent about 40% more time using their smartphones than they did their desktops and laptops. 

The long-term trend of growing mobile usage combined with the pressure for contactless transactions due to the pandemic has made creating and enhancing mobile apps not just a nice marketing tool for businesses, but a necessary task for survival. To compete with other businesses and draw formerly casual mobile users to their apps, development teams are under more pressure than ever to deliver new and updated apps even more quickly than before.

Features trump security … until they don’t

This does not bode well for mobile app security, especially since the situation was not very good prior to COVID-19. According to the Verizon Mobile Security Index 2020, 43% of app developers said they knew they were cutting corners on security to “get the job done,” and that survey was conducted well before the pandemic arrived.

Unless they are very technologically savvy, consumers have no real way to assess the security of the mobile apps they use, so they make decisions about which apps to install based on features, functionality and ease of use. Naturally, that’s where developers focus their attention. What’s more, implementing security is expensive and time-consuming, potentially breaking the development budget and delaying delivery schedules. Even if development teams are committed to implementing security, iOS and Android security specialists are hard to find and in high demand.

But while focusing on features at the expense of security may be a good strategy for short-term adoption, the potential long-term consequences can be devastating for consumers and developers alike. Cybercriminals are just as aware as developers are about the growing importance of mobile apps, and they are developing increasingly sophisticated attacks targeting them.

A good example is the EventBot malware that appeared in April. This Android-based trojan looks and feels like Adobe Flash or Microsoft Word, but its real purpose is to steal unprotected data in banking, bitcoin and other financial apps. The trojan is sophisticated enough to intercept two-factor authentication codes sent via SMS so it can use them to take over accounts. 

It’s a perfect example of the importance of good security. If app developers encrypt all data stored on the device, that data is not vulnerable to theft by trojans like EventBot. Likewise, it illustrates why it’s critical to obfuscate apps and shield them from reverse engineering. Not only can malicious actors create trojans from popular brands’ apps, they can also make buggy, badly performing fakes that give the genuine app a bad reputation.
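
As a concrete illustration of the encrypt-everything-at-rest advice, here is a hedged Python sketch using the cryptography library’s Fernet interface; the file name and payload are hypothetical, and a real iOS or Android app would protect the key with the platform keystore (Android Keystore, iOS Keychain) rather than generating it in memory like this.

```python
# Illustrative only: encrypt app data before writing it to local storage.
# A production mobile app would derive or store the key via the OS keystore.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # hypothetical key handling for this sketch
cipher = Fernet(key)

account_data = b'{"account": "12345678", "balance": 1042.17}'  # hypothetical payload
with open("accounts.db", "wb") as f:                           # hypothetical file name
    f.write(cipher.encrypt(account_data))

# A trojan scraping the file sees only ciphertext; reads must go through the key.
with open("accounts.db", "rb") as f:
    print(cipher.decrypt(f.read()))
```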

Additionally, because the pandemic is causing such a large increase in app usage and adoption, security flaws that had previously gone unnoticed may start causing problems for users. 

Zoom, for example, saw millions of new users sign up essentially overnight after workers were forced to stay home under lockdown orders. This rush of new users exposed security flaws that hackers used to “zoom bomb” meetings. Zoom took quick action to resolve the issues, but it had to endure significant damage to its reputation.

Solutions to the security development challenge 

If your team plans to implement security into mobile apps on its own, first make sure you have the skills required to do so. Android and iOS differ significantly, and a security expert in one OS isn’t necessarily qualified to implement security for the other. 

Assuming you have developers qualified to implement security, the next step is to plan what, specifically, your team will focus on to harden your apps’ security. It’s not a simple question — after all, a hacker only has to find a single vulnerability to exploit, and there’s an enormous number of possible weaknesses. But a good place to start is to ensure each app is protected against the Open Web Application Security Project (OWASP) Mobile Top Ten vulnerabilities, a list of the most common exploits cybercriminals use.

Other development teams may decide to integrate security software development kits (SDKs) into their apps, which is a more efficient option than manual security implementation and can be done without having to hire security specialists. That said, it’s critical to thoroughly vet SDKs before integration. Not only are rogue SDKs a serious problem in the mobile app industry, but SDKs, themselves, may contain vulnerabilities.

Organizations can also leverage AI to automate security for mobile apps. It’s fast, can secure an app without any coding, and, compared to manual coding, is inexpensive as well. But, just as you must vet SDKs, conduct thorough due diligence to ensure that the AI platform provides comprehensive security and does not, itself, introduce vulnerabilities.

Mobile apps have never been more important to businesses, and cybercriminals are responding with more advanced, targeted attacks. Developers cannot afford to deliver full-featured apps that lack proper security — in the long run, the potential damage to customers and an enterprise itself is far too great a risk. So, as you race to provide an engaging, intuitive app for customers, pay as much attention to their safety as their experience. It’s no longer necessary to implement security manually, so there’s no excuse for putting customers at risk with a vulnerable app.

Author: tomtovar

Categories
ProgrammableWeb

Should You Hire A Developer Or Use The API For Your Website’s CMS?

It doesn’t matter how powerful or well-rounded your chosen CMS happens to be: there can still come a point at which you decide that its natural state isn’t enough and something more is needed. It could be a new function, a fresh perspective, or improved performance, and you’re unwilling to settle for less. What should you do?

Your instinct might be to hire a web developer, ideally one with some expertise in that particular CMS, but is that the right way to go? Developers can be very costly, and whether you have some coding skill or you’re a total novice, you might be able to get somewhere without one — and the key could be using the API for your CMS.

In this post, we’re going to consider why you might want to hire a developer, why you should investigate APIs, and how you can choose between these options. Let’s get to it.

Why you should hire a developer

It’s fairly simple to make a case for hiring a web developer. For one thing, it’s easy. By sharing the load, you get to preserve your existing workload and collaborate with an expert, a second pair of eyes that can complete your vision and deftly deal with any issues that might arise. Additionally, it’s the best way to get quick results if you’re willing to allocate enough money to afford a top-notch developer and make your project a priority.

The ease of this option explains why it’s so popular. We so often outsource things that would be easy to do ourselves (getting store-bought sandwiches, using cleaning services, etc.) that outsourcing something as complex as a website development project seems like an obvious choice for anyone who isn’t themselves a programmer with plenty of free time.

And even if you are a programmer with enough free time to take on a personal project, you might not have the right skills for the job. Every system has its own nuances, whether it’s a powerful platform with proprietary parts (like Shopify) or an open-source foundation built around ease of use (like Ghost), so getting a CMS expert can make for a smoother experience.

Why you should use the API for your CMS

So, with such a good argument to be made for immediately consulting a developer, why should you take the time to get involved directly? Well, one of the core goals of an API — as you may well be aware — is to make system functions readily accessible to outside systems, and you can take advantage of that to extend your system through integrations.

Becoming familiar with the workings of an API doesn’t require you to have an exhaustive knowledge of the CMS itself. You need only understand the available fields and functions and how you can call them (and interact with them) from elsewhere. From there, it’s more about finding — or creating — the external systems that can give you the results you need.
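
For instance, reading content out of a CMS usually amounts to a handful of HTTP calls. The sketch below uses Python’s requests library against a hypothetical read endpoint; the URL, key parameter, and field names are placeholders for whatever your CMS’s API reference actually specifies.

```python
# A minimal sketch of consuming a CMS's read API; endpoint, key, and fields are hypothetical.
import requests

BASE_URL = "https://example-site.com/api/content"   # placeholder endpoint
API_KEY = "YOUR_CONTENT_API_KEY"                    # placeholder credential

resp = requests.get(f"{BASE_URL}/posts", params={"key": API_KEY, "limit": 5})
resp.raise_for_status()

# Print a few fields from each returned post; the field names are assumptions.
for post in resp.json().get("posts", []):
    print(post.get("title"), "-", post.get("url"))
```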

The best developer portals will have detailed API references along with getting-started guides, sample code, SDKs and everything else a developer needs to successfully consume the API. The providers behind them want as many people as possible to gravitate towards their platforms; after all, more compatible modules (along with services like Zapier) mean a stronger ecosystem and more interest overall. This means that even people with relatively meager technical understanding can get somewhere.

Additionally, getting to know the API for your CMS will help you understand what the system can and can’t do natively. It’s possible that by consuming the API you will uncover existing functionality that you otherwise wouldn’t have noticed. Overall, then, taking this step first will help you understand your CMS and either source an existing integration or build a more economical outline of a project that you can then pass to a developer.

How you can choose the right approach

In talking about building a project outline, I hinted at the natural conclusion here, which is that these options aren’t mutually exclusive. Having studied the API for your website’s CMS, you can develop something else or bring in a suitable module, but you can also continue to work with an external developer. It doesn’t subtract from your options. For that reason, then, I strongly recommend working with the API first and seeing what you can glean from it. That will allow you to make the smartest decision about how to proceed.

Author: rodneylaws

Categories
ScienceDaily

Black hole’s heart still beating

The first confirmed heartbeat of a supermassive black hole is still going strong more than ten years after first being observed.

X-ray satellite observations spotted the repeated beat after its signal had been blocked by our Sun for a number of years.

Astronomers say this is the longest-lived heartbeat ever seen in a black hole and tells us more about the size and structure of matter close to its event horizon — the space around a black hole from which nothing, including light, can escape.

The research, by the National Astronomical Observatories, Chinese Academy of Sciences, China, and Durham University, UK, appears in the journal Monthly Notices of the Royal Astronomical Society.

The black hole’s heartbeat was first detected in 2007 at the centre of a galaxy called RE J1034+396 which is approximately 600 million light years from Earth.

The signal from this galactic giant repeated every hour and this behaviour was seen in several snapshots taken before satellite observations were blocked by our Sun in 2011.

In 2018 the European Space Agency’s XMM-Newton X-ray satellite was able to finally re-observe the black hole and to scientists’ amazement the same repeated heartbeat could still be seen.

Matter falling on to a supermassive black hole as it feeds from the accretion disc of material surrounding it releases an enormous amount of power from a comparatively tiny region of space, but this is rarely seen as a specific repeatable pattern like a heartbeat.

The time between beats can tell us about the size and structure of the matter close to the black hole’s event horizon.

Professor Chris Done, of Durham University’s Centre for Extragalactic Astronomy, collaborated on the findings with colleague Professor Martin Ward, Temple Chevallier Chair of Astronomy.

Professor Done said: “The main idea for how this heartbeat is formed is that the inner parts of the accretion disc are expanding and contracting.

“The only other system we know which seems to do the same thing is a 100,000 times smaller stellar-mass black hole in our Milky Way, fed by a binary companion star, with correspondingly smaller luminosities and timescales.

“This shows us that simple scalings with black hole mass work even for the rarest types of behaviour.”

Lead author Dr Chichuan Jin of the National Astronomical Observatories, Chinese Academy of Sciences, said: “This heartbeat is amazing!

“It proves that such signals arising from a supermassive black hole can be very strong and persistent. It also provides the best opportunity for scientists to further investigate the nature and origin of this heartbeat signal.”

The next step in the research is to perform a comprehensive analysis of this intriguing signal, and compare it with the behaviour of stellar-mass black holes in our Milky Way.

The research was funded by the National Natural Science Foundation of China, the Strategic Pioneer Program on Space Science, Chinese Academy of Sciences, and the Science and Technology Facilities Council, UK.

Story Source:

Materials provided by Durham University. Note: Content may be edited for style and length.


Categories
ScienceDaily

Microbial cyborgs: Bacteria supplying power

Electronic devices are still made of lifeless materials. One day, however, “microbial cyborgs” might be used in fuel cells, biosensors, or bioreactors. Scientists of Karlsruhe Institute of Technology (KIT) have created the necessary prerequisite by developing a programmable, biohybrid system consisting of a nanocomposite and the Shewanella oneidensis bacterium that produces electrons. The material serves as a scaffold for the bacteria and, at the same time, conducts the microbially produced current. The findings are reported in ACS Applied Materials & Interfaces.

The bacterium Shewanella oneidensis belongs to the so-called exoelectrogenic bacteria. These bacteria can produce electrons in the metabolic process and transport them to the cell’s exterior. However, use of this type of electricity has always been limited by the restricted interaction between the organisms and the electrode. Unlike a conventional battery, the material of this “organic battery” must not only conduct electrons to an electrode, but also connect as many bacteria as possible to that electrode. So far, conductive materials in which bacteria can be embedded have either been inefficient or made it impossible to control the electric current.

The team of Professor Christof M. Niemeyer has now succeeded in developing a nanocomposite that supports the growth of exoelectrogenic bacteria and, at the same time, conducts current in a controlled way. “We produced a porous hydrogel that consists of carbon nanotubes and silica nanoparticles interwoven by DNA strands,” Niemeyer says. Then, the group added the bacterium Shewanella oneidensis and a liquid nutrient medium to the scaffold. And this combination of materials and microbes worked. “Cultivation of Shewanella oneidensis in conductive materials demonstrates that exoelectrogenic bacteria settle on the scaffold, while other bacteria, such as Escherichia coli, remain on the surface of the matrix,” microbiologist Professor Johannes Gescher explains. In addition, the team proved that electron flow increased with an increasing number of bacterial cells settling on the conductive, synthetic matrix. This biohybrid composite remained stable for several days and exhibited electrochemical activity, which confirms that the composite can efficiently conduct electrons produced by the bacteria to an electrode.

Such a system must not only be conductive, it must also allow the process to be controlled. This was achieved in the experiment: to switch off the current, the researchers added an enzyme that cuts the DNA strands, causing the composite to decompose.

“As far as we know, such a complex, functional biohybrid material has now been described for the first time. Altogether, our results suggest that potential applications of such materials might even extend beyond microbial biosensors, bioreactors, and fuel cell systems,” Niemeyer emphasizes.

Story Source:

Materials provided by Karlsruher Institut für Technologie (KIT). Note: Content may be edited for style and length.


Categories
ScienceDaily

How silver ions kill bacteria

The antimicrobial properties of silver have been known for centuries. While exactly how silver kills bacteria remains a mystery, University of Arkansas researchers have taken a step toward better understanding the process by looking at the dynamics of proteins in live bacteria at the molecular level.

Traditionally, the antimicrobial effects of silver have been measured through bioassays, which compare the effect of a substance on a test organism against a standard, untreated preparation. While these methods are effective, they typically produce only snapshots in time, said Yong Wang, assistant professor of physics and an author of the study, published in the journal Applied and Environmental Microbiology.

Instead, Wang and his colleagues used an advanced imaging technique, called single-particle-tracking photoactivated localization microscopy, to watch and track a particular protein found in E. coli bacteria over time. Researchers were surprised to find that silver ions actually sped up the dynamics of the protein, opposite of what they thought would happen. “It is known that silver ions can suppress and kill bacteria; we thus expected that everything slowed down in the bacteria when treated with silver. But, surprisingly, we found that the dynamics of this protein became faster.”

The researchers observed that silver ions were causing paired strands of DNA in the bacteria to separate, and the binding between the protein and the DNA to weaken. “Then the faster dynamics of the proteins caused by silver can be understood,” said Wang. “When the protein is bound to the DNA, it moves slowly together with the DNA, which is a huge molecule in the bacteria. In contrast, when treated with silver, the proteins fall off from the DNA, moving by themselves and thus faster.”

The observation of DNA separation caused by silver ions came from earlier work that Wang and colleagues had done with bent DNA. Their approach, now patent pending, was to put strain on DNA strands by bending them, thus making them more susceptible to interactions with other chemicals, including silver ions.

The National Science Foundation-funded study validated the idea of investigating the dynamics of single proteins in live bacteria, said Wang, an approach that could help researchers understand the real-time responses of bacteria to silver nanoparticles, which have been proposed for fighting against so-called “superbugs” that are resistant to commonly prescribed antibiotics.

“What we want to do eventually is to use the new knowledge generated from this project to make better antibiotics based on silver nanoparticles,” said Wang.

Story Source:

Materials provided by University of Arkansas. Original written by Bob Whitby. Note: Content may be edited for style and length.


Categories
3D Printing Industry

3D file marketplace MyMiniFactory releases ‘3DPrinted & Delivered’ service

London-based 3D file marketplace, MyMiniFactory, has released a new 3D printing service for tabletop hobbyists still hesitant about committing to a 3D printer. The 2013 start-up is doubling down on its services catering to designers, makers, and general enthusiasts by setting up a separate marketplace for physical miniatures alongside its digital STL file marketplace which […]

Author: Kubi Sertoglu

Categories
ScienceDaily

Americans perceive likelihood of nuclear weapons risk as 50/50 toss-up

It has been 30 years since the end of the Cold War, yet, on average, Americans still perceive the odds of a nuclear weapon detonating on U.S. soil to be about the same as a coin toss, according to new research from Stevens Institute of Technology.

“That’s exceptionally high,” said Kristyn Karl, a political scientist at Stevens who co-led the work with psychologist Ashley Lytle. “People don’t generally believe that highly rare events are slightly less likely than a 50/50 tossup.”

The finding, reported in the January 2020 issue of the International Journal of Communication, represents the end of a decades-long gap in the research literature on Americans’ perceptions of the nuclear weapons threat. It also provides an initial look at how younger generations, namely Millennials and Gen Z (18-37 years old), think about the topic and what influences their behavior in an era of evolving nuclear threat.

Using their combined expertise in political science and psychology, Karl and Lytle fielded two nationally diverse online surveys totaling more than 3,500 Americans to measure individual characteristics and attitudes, such as perceptions of nuclear risk, apathy toward nuclear topics, media use, and interest in following current events.

They also analyzed how these characteristics and attitudes such as perceptions of nuclear risk influence behaviors, including the likelihood of seeking information and initiating conversations about nuclear topics, as well as preparing emergency kits in the event that the worst were to happen.

The ultimate goal of the work, which is part of the larger Reinventing Civil Defense project supported by the Carnegie Corporation of New York, is to learn more about how to best develop new communication tools to increase awareness among Americans about topics related to nuclear weapons, particularly what to do in the event of a nuclear detonation.

“The overarching narrative from the Reinventing Civil Defense project is that younger Americans just don’t hear anything about nuclear weapons risk,” said Karl. “Unlike older Americans, Millennials and Gen Z didn’t grow up during the Cold War, so what they know about nuclear risk is what’s in the media, and what’s in the media isn’t necessarily reflective of the true state of affairs.”

And media use matters.

Karl and Lytle find that consuming media has a striking effect on how younger and older adults think about topics related to nuclear weapons, especially as it relates to apathy. Specifically, as younger generations report using more media, they are increasingly likely to report being apathetic about nuclear topics.

But this pattern is different for older adults, as there is no association between their media use and their willingness to think about nuclear threats or how to survive them. In terms of behavior, apathy about nuclear topics is associated with a decrease in seeking information on the issue.

Interestingly, the older Americans are, the lower they estimate the likelihood of a nuclear detonation in their lifetime. “Among lots of possibilities, they may be thinking if it didn’t happen during the Cold War, it won’t happen now; or perhaps I have fewer years to live, so it probably won’t happen in my lifetime,” said Lytle. However, older adults and those who tend to more closely follow the news tend to seek more information about nuclear topics.

Broadly, perceptions of nuclear weapons risk prove powerful, as they lead Americans to take various actions to prepare in the event of a nuclear attack. On average, city dwellers estimate the risk as 5-7% higher than their rural or suburban peers, whereas women estimate nuclear risk as 3-5% higher than men. Since men report significantly higher levels of media use and more closely follow current events, this research presents several opportunities for targeting messages based on these varying perceptions.

One pattern is clear: as perception of nuclear weapons risk increases, so too does Americans’ intent to take action, and that’s true across multiple measures, whether it’s putting forward effort to think and plan for an attack, seeking information about it, communicating with others on the topic, or taking steps to prepare.

Karl and Lytle explain that many people are fatalistic: if a nuclear weapon were to go off in New York City, then we would all be dead, ‘so why should I put any effort forward in thinking about it?’

Karl explains that the size of the weapon, the location and even the weather are important. In cities, for example, the blast from many nuclear detonations would be funneled upward by tall buildings, and modeling suggests that many people could survive. The most important thing people could do is get inside a building and stay there for three days.

“Our gut reaction is that everybody would die. But not everybody,” said Lytle. “We are trying to figure out how to educate people that this is not always true so that people feel like they have some sort of agency in a situation like this. Many people could survive the initial blast and then their subsequent behavior would determine what happens from there.”

While Lytle and Karl emphasize that they don’t wish to make claims about the actual degree of nuclear weapons risk, they maintain that perceptions of this risk are crucially important. Even if we assume the risk is low in the real world, it could be life-saving for Americans to know even a small amount about what to do.


Categories
ScienceDaily

The mysterious movement of water molecules

Water is a mysterious substance. Understanding how it behaves at the atomic level is still a challenge for experimental physicists, as light hydrogen and oxygen atoms are difficult to observe using conventional experimental methods. This is especially true for any researcher looking to study the microscopic movements of individual water molecules that run off a surface in a matter of picoseconds. As they report in their paper, entitled ‘Nanoscopic diffusion of water on a topological insulator’, researchers from the Exotic Surfaces working group at TU Graz’s Institute of Experimental Physics joined forces with counterparts from the Cavendish Laboratory at the University of Cambridge, the University of Surrey and Aarhus University. Together, they made significant advances, performing research into the behaviour of water on a material that is currently attracting particular interest: a topological insulator called bismuth telluride. This compound could be used to build quantum computers. Water vapour would be one of the environmental factors to which applications based on bismuth telluride might be exposed during operation.

In the course of their research, the team used a combination of a new experimental method called helium spin-echo spectroscopy and theoretical calculations. Helium spin-echo spectroscopy uses very low-energy helium atoms that allow isolated water molecules to be observed without influencing their motion in the process. The researchers discovered that water molecules behave completely differently on bismuth telluride compared with those on conventional metals. On such metals, attractive interactions between water molecules can be observed, leading to accumulations in the form of films. But the opposite is the case with topological insulators: the water molecules repel one another and remain isolated on the surface.

Bismuth telluride appears to be impervious to water, which is an advantage for applications exposed to typical environmental conditions. Plans are in place for further experiments on similarly structured surfaces, which are intended to clarify whether the movement of water molecules is attributable to specific features of the surface in question.

Story Source:

Materials provided by Graz University of Technology. Original written by Birgit Baustädter. Note: Content may be edited for style and length.


Journal Reference:

  1. Anton Tamtögl, Marco Sacchi, Nadav Avidor, Irene Calvo-Almazán, Peter S. M. Townsend, Martin Bremholm, Philip Hofmann, John Ellis, William Allison. Nanoscopic diffusion of water on a topological insulator. Nature Communications, 2020; 11 (1) DOI: 10.1038/s41467-019-14064-7


Categories
ProgrammableWeb

ProgrammableWeb’s Most Clicked, Shared and Talked About APIs of 2019: Big Data and Analytics

If API-related news has any correlation to what is trendy, we can conclude that Big Data was still big in 2019. ProgrammableWeb‘s coverage included articles about a variety of data services. These include data retrieval, property data, contacts data, social media data, academic data, finance & trading, data automation, databases, data streaming, and data privacy.

Along with all that API news, we published several APIs in our Big Data and Data Analytics categories over the past year. This includes APIs for Data, Open Data, Data Visualizations, Database, Analytics, Data Mining, Extraction, Classification, Charts, and others. Listed below are the Data APIs added to our directory during 2019 that might pique the interest of a data scientist or a developer.

Fauna is a cloud-first database provider for cloud and container environments. The FaunaDB GraphQL API is a database service for defining schemas and executing queries and mutations of data within FaunaDB. The FaunaDB GraphQL API focuses on transactional consistency, user authorization, data access, quality of service (QoS), and temporal storage.
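
As a rough illustration of what consuming such a service looks like, the snippet below posts a GraphQL query over HTTP with Python’s requests library. The endpoint URL, secret, and the allProducts query are assumptions for illustration only; the queries actually available depend on the GraphQL schema you define in FaunaDB.

```python
# Sketch of querying a FaunaDB GraphQL endpoint; URL, secret, and schema are assumed.
import requests

FAUNA_GRAPHQL_URL = "https://graphql.fauna.com/graphql"   # assumed endpoint
FAUNA_SECRET = "YOUR_FAUNA_SECRET"                        # hypothetical key

query = """
query {
  allProducts {                # hypothetical collection from a user-defined schema
    data { name price }
  }
}
"""

resp = requests.post(
    FAUNA_GRAPHQL_URL,
    json={"query": query},
    headers={"Authorization": f"Bearer {FAUNA_SECRET}"},
)
print(resp.json())
```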

Temporal from RTrade Technologies is an enterprise data storage solution. The Temporal IPFS API allows developers to manage functions within the storage solution with methods to create networks, add users, get information, upload files, and get object statistics.

MasterTables is a service for storing and maintaining business data and lists. Predefined choice lists or taxonomies such as countries, gender, and marital status are available in MasterTables. The MasterTables API offers data as JSON objects.

ScrapingBee (formerly Scraping Ninja) provides applications with web scraping tools. The ScrapingBee API supports JavaScript rendering, Headless Chrome, captchas, and proxy rotation. The API returns HTML or JSON formatted responses.

ScrapingBee uses rotating proxies so users can bypass rate limiting. Screenshot: ScrapingBee

data.world is a modern data catalog service for analytics and teamwork. The data.world API supports data discovery, comprehension, integration, and sharing in relation to the data catalog service. The API provides methods to manage files, users, data projects, data streams, insights and more.

Skim provides data extraction and AI consultancy services. The Skim Technologies Data API can extract data about the content of a URI. Developers can retrieve data such as title, author, date, body, keywords, summaries, images, videos, reading time, and more.

Amazon’s Get Metrics API provides calculated metrics, insights, and advanced analytics reporting for Alexa Skills usage. The Metrics API is in beta and is listed in the Analytics category.

The QuickChart API generates chart and graph images to embed in emails, SMS messages, and reports. Developers can define charts by URL, with JSON and JavaScript objects, and via the Chart.js library. Supported visualization types include line, bar, radar, doughnut, pie, polar area, bubble, and scatter charts. This service can be used as a replacement for the Google Image Charts API, which was deprecated in March 2019.


Create bar, line, donut, pie, bubble and other charts with this API. Screenshot: quickchart.io
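
To give a sense of how lightweight the service is, the sketch below builds a QuickChart image URL from a Chart.js-style configuration in Python; the data values are made up, and QuickChart’s documentation covers the full option set.

```python
# Build a QuickChart image URL from a Chart.js-style config; the data is illustrative.
import json
from urllib.parse import quote

config = {
    "type": "bar",
    "data": {
        "labels": ["Q1", "Q2", "Q3", "Q4"],
        "datasets": [{"label": "Signups", "data": [120, 190, 300, 250]}],
    },
}

chart_url = "https://quickchart.io/chart?c=" + quote(json.dumps(config))
print(chart_url)   # embed this URL in an email, SMS, or report as an image source
```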

Fivetran automates data pipelines between data repositories, cloud data warehouses, and analytics tools. The Fivetran API provides a REST architecture to access business data connection features. With the API, developers can access a data pipeline, collect event logs, and replicate business data into a cloud warehouse.

The Image-Charts API returns a chart image in response to a URL GET or POST request. The API can generate many kinds of charts, including pie, line, bar, and radar charts. Image-Charts can act as a replacement for the deprecated Google Image Charts API.

The TEXT2DATA API provides detailed reports on unstructured data using natural language processing and machine learning. TEXT2DATA can be integrated as a text analytics platform, customer experience reporting tool, and social media monitoring platform.

The Figure Eight platform aims to provide high-quality training data for machine learning models. The Figure Eight API enables users to post and retrieve data from the data annotation platform. Developers can display text, image, video, and audio in annotated data forms.

Rubrik offers unified enterprise data services with backup, instant recovery, archival, search, analytics, and compliance features. The Rubrik REST API provides support for configuring, querying, and controlling all of the allowed operations of the Rubrik cloud management platform.

GrayMeta is an AI-powered enterprise metadata platform. The GrayMeta Platform API enables users to programmatically interact with the platform with methods to view and manage jobs, items, people, roles, activity, comments, favorites, and more. The GrayMeta Platform harvests content from enterprise storage and extracts metadata via this API, which is listed in the Big Data category.

Data Drum provides data automation and organization services at the intersection of data science, journalism, and finance. The Data Drum API provides clean, automated macroeconomic and social data. The API supports live and historical figures with visualizations. Returned data can be JSON, CSV, XML, or XLSX formatted.


Get live and historical data in visualizations with Data Drum. Screenshot: Data Drum

Author: joyc