Did Apple Leave Developers in the Lurch with iOS 14 Release?

Last week at its media event, Apple announced that the final public release of iOS 14 would ship the day after the event, following a one-day Golden Master (GM) period. On the surface this seems like a move aimed at benefiting users, who got their hands on the new OS the day after it was announced. In doing so, however, Apple put its developer community in a tough spot, forcing developers to get their apps ready and submitted on very short notice.

To be fair, developers have had access to beta versions of iOS 14 since it was announced at WWDC in June. Any developer serious about supporting their app has had the ensuing months to make sure it is compatible with the new software, in addition to adding new features. The issue is that the GM release of iOS 14 wasn’t made available to developers until a day before the public release. The GM is usually the same build as the one released to the public, meaning developers can run final tests and fix any last bugs before submitting their updated apps to the App Store.

In past years, Apple has given developers a week to work with the GM. By allowing only 24 hours this time, Apple forced developers to scramble to get their apps submitted or risk not being ready for launch day.

In the end, the tangible fallout from this decision isn’t likely to be severe. But it does make you wonder what Apple hoped to gain by putting its developers in a pinch. ProgrammableWeb’s article examining great developer portal practices discussed the importance of building developer engagement through proactive communication. Building strong developer relations isn’t limited to a portal, though; it should be a key part of a provider’s overall API strategy. Apple bucked this best practice, and while it may not hurt today, chipping away at the trust of its developers is never a good move.

Go to Source
Author: wsantos


Virtual reality trains public to reverse opioid overdoses

The United States has seen a 200% increase in the rate of deaths by opioid overdose in the last 20 years. But many of these deaths were preventable. Naloxone, also called Narcan, is a prescription drug that reverses opioid overdoses, and in more than 40 states — including Pennsylvania — there is a standing order policy, which makes it available to anyone, without an individual prescription from a healthcare provider.

Members of the public can carry naloxone in case they encounter a person experiencing an opioid overdose. But how do you know if someone needs naloxone and how do you administer it? Health care providers are often trained to respond in these types of situations, and prior to the onset of COVID-19, public health organizations were offering in-person trainings to the public.

But how do we get even more people trained and motivated to save lives from opioid overdoses, especially in our current socially distanced world?

A group of interdisciplinary researchers from the University of Pennsylvania and the Philadelphia Department of Public Health developed a virtual reality immersive video training aimed at doing just that. Their new study — published recently in Drug and Alcohol Prevention — shows that the VR training is just as effective as an in-person training at giving the public both the knowledge and the confidence they need to administer naloxone and save lives.

“Overdoses aren’t happening in hospitals and doctor’s offices,” says Nicholas Giordano, former Lecturer at Penn’s School of Nursing. “They’re happening in our communities: in parks, libraries, and even in our own homes. It’s crucial that we get the ability to save lives into the hands of the people on the front lines in close proximity to individuals at risk of overdose.”

The researchers adapted a 60-minute in-person training, the educational standard for health care providers, into a 9-minute immersive virtual reality video. Then the interdisciplinary team tested the VR training on members of the public at free naloxone giveaways and training clinics hosted by the Philadelphia Department of Health at local libraries. (The clinics were held in 2019 and early 2020, before the coronavirus pandemic made such events unsafe.)

Roughly a third of the 94 participants received one-on-one in-person instruction on how to administer naloxone, while the others watched the experimental VR training. After the initial training, participants answered questions about the training to determine if they’d learned enough information to safely administer naloxone in the case of an opioid overdose.

Before leaving the library, all participants were given the opportunity to receive whichever training they didn’t receive initially. Since the VR training was still in testing mode, the researchers wanted to ensure that all participants had full access to what they came for: knowledge of how to save lives.

“We were really pleased to discover that our VR training works just as well as an in-person training,” says Natalie Herbert, a 2020 graduate of Penn’s Annenberg School for Communication. “We weren’t looking to replace the trainings public health organizations are already offering; rather, we were hoping to offer an alternative for folks who can’t get to an in-person training, but still want the knowledge. And we’re excited to be able to do that.”

In addition to continuing to test their VR training, the researchers plan to begin making it available to the general public through partnerships with libraries, public health organizations, and other local stakeholders. With grant support from the Independence Blue Cross Foundation, the team will be disseminating and promoting the VR training throughout the Greater Philadelphia Area. Now, more than ever, the portability and immersive aspects of this VR training can be leveraged to expand access to overdose training. For more information on how to experience the VR training, which can be used at home through Google Cardboard or other VR viewers, visit their website:

Go to Source


Unraveling the secrets of Tennessee whiskey

More than a century has passed since the last scientific analysis of the famed “Lincoln County [Tennessee] process” was published, but the secrets of the famous Tennessee whiskey flavor are starting to unravel at the University of Tennessee Institute of Agriculture. The latest research promises advancements in the field of flavor science as well as marketing.

Conducted by John P. Munafo, Jr., assistant professor of flavor science and natural products, and his graduate student, Trenton Kerley, the study “Changes in Tennessee Whiskey Odorants by the Lincoln County Process” was recently published in the Journal of Agricultural and Food Chemistry (JAFC).

The study incorporated a combination of advanced flavor chemistry techniques to probe the changes in flavor chemistry occurring during charcoal filtration. This type of filtration is a common step in the production of distilled beverages, including vodka and rum, but it’s a required step for a product to be labeled “Tennessee whiskey.” The step is called the Lincoln County Process (LCP), after the locale of the original Jack Daniel’s distillery. It is also referred to as “charcoal mellowing.”

The LCP step is performed by passing the fresh whiskey distillate through a bed of charcoal, usually derived from burnt sugar maple, prior to barrel-aging the product. Although no scientific studies have proved the claim, the LCP is believed to impart a “smoother” flavor to Tennessee whiskey. In addition, to carry “Tennessee whiskey” on the label, the liquor must by law be produced in the state of Tennessee from at least 51% corn and aged in Tennessee for at least two years in unused charred oak barrels.

The actual LCP differs from distiller to distiller, and, as the details are generally held as a trade secret, the process has been historically shrouded in mystery. There are no regulations as to how the process is performed, only that the step is required. In other words, all a manufacturer needs to do is pass the distillate over charcoal (an undefined amount — possibly even just one piece). Thus, depending on how it’s conducted, the LCP step may not impact the whiskey flavor at all. On the other hand, even small adjustments to the LCP can modify the flavor profile of the whiskey positively or negatively, potentially causing any number of surprises.

Munafo and Kerley describe how distillers adjust parameters empirically throughout the whiskey production process, then rely on professional tasters to sample products, blending subtly unique batches to achieve their target flavor. Munafo says, “By gaining a fundamental understanding of the changes in flavor chemistry occurring during whiskey production, our team could advise distillers about exactly what changes are needed to make their process produce their desired flavor goals. We want to give distillers levers to pull, so they are not randomly or blindly attempting to get the precise flavor they want.”

Samples used in the study were provided by the Sugarlands Distilling Company (SDC), in Gatlinburg, Tennessee, producers of the Roaming Man Whiskey. SDC invited the UTIA researchers to visit their distillery and collect in-process samples. Munafo says SDC prioritizes transparency around their craft and takes pride in sharing the research, discovery and distillation process of how their whiskey is made and what makes Tennessee whiskey unique.

Olfactory evaluations — the good ole smell test — revealed that the LCP treatment generally decreased malty, rancid, fatty and roasty aromas in the whiskey distillates. As for the odorants (i.e., molecules responsible for odor), 49 were identified in the distillate samples using an analytical technique called gas chromatography-olfactometry (GC-O). Nine of these odorants have never been reported in the scientific whiskey literature.

One of the newly found whiskey odorants, called DMPF, was originally discovered in cocoa. It is described as having a unique anise or citrus-like smell. Another of the newly discovered whiskey odorants (called MND) is described as having a pleasant dried hay-like aroma. Both odorants have remarkably low odor thresholds in the parts-per-trillion range, meaning that the smells can be detected at very low levels by people but are difficult to detect with scientific instrumentation.

The only previous investigation into how charcoal treatment affects whiskey was published in 1908 by William Dudley in the Journal of the American Chemical Society. In the new study, thirty-one whiskey odorants were measured via a technique called stable isotope dilution assay (SIDA); all showed a decrease in concentration as a result of LCP treatment, albeit to different degrees. That is to say, while the LCP appears to be selective in removing certain odorants, the process didn’t increase or add any odorants to the distillate. This new knowledge can be used to optimize Tennessee whiskey production: for instance, the process can be tuned to remove undesirable aromas while maintaining higher levels of desirable ones, thus “tailoring” the flavor profile of the finished whiskey.
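The kind of before/after comparison that SIDA enables can be illustrated with a short sketch. The concentrations below are invented for illustration only; they are not values from the study:

```python
# Hypothetical pre/post-LCP odorant concentrations (micrograms per liter);
# these numbers are invented for illustration, not values from the study.
pre_lcp = {"odorant_A": 12.0, "odorant_B": 3.5, "odorant_C": 0.8}
post_lcp = {"odorant_A": 3.0, "odorant_B": 2.8, "odorant_C": 0.79}

def percent_removed(before: float, after: float) -> float:
    """Percent of an odorant removed by the charcoal treatment."""
    return 100.0 * (before - after) / before

for name in pre_lcp:
    print(f"{name}: {percent_removed(pre_lcp[name], post_lcp[name]):.1f}% removed")
```

A table like this, computed per odorant, is what lets a distiller see which aromas a given charcoal bed strips aggressively and which it barely touches.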

“We want to provide the analytical tools needed to help enable distillers to have more control of their processes and make more consistent and flavorful whiskey,” says Dr. Munafo. “We want to help them to take out some of the guesswork involved in whiskey production.”

Additional studies are now underway at the UT Department of Food Science to characterize both the flavor chemistry of different types of whiskey and their production processes. The ultimate aim of the whiskey flavor chemistry program is to aid whiskey manufacturers in producing a consistent product with the exact flavor profile that they desire. Even with the aid of science, Munafo says, “Whiskey making will ‘still’ remain an impressive art form.” Pun intended.

The researchers acknowledge support from the USDA National Institute of Food and Agriculture (NIFA) Hatch Project #1015002 and funding through the Food Science Department and start-up funding from the University of Tennessee Institute of Agriculture.

Go to Source


New model connects respiratory droplet physics with spread of Covid-19

Respiratory droplets from a cough or sneeze travel farther and last longer in humid, cold climates than in hot, dry ones, according to a study on droplet physics by an international team of engineers. The researchers incorporated this understanding of the impact of environmental factors on droplet spread into a new mathematical model that can be used to predict the early spread of respiratory viruses including COVID-19, and the role of respiratory droplets in that spread.

The team developed this new model to better understand the role that droplet clouds play in the spread of respiratory viruses. Their model is the first to be based on collision rate theory, a fundamental approach used to study chemical reactions, which here looks at how often a droplet cloud exhaled by an infected person comes into contact with healthy people. Their work connects population-scale human interaction with micro-scale droplet physics results on how far and fast droplets spread, and how long they last.

Their results were published June 30 in the journal Physics of Fluids.

“The basic fundamental form of a chemical reaction is two molecules are colliding. How frequently they’re colliding will give you how fast the reaction progresses,” said Abhishek Saha, a professor of mechanical engineering at the University of California San Diego, and one of the authors of the paper. “It’s exactly the same here; how frequently healthy people are coming in contact with an infected droplet cloud can be a measure of how fast the disease can spread.”
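The collision-rate analogy can be sketched as a toy, well-mixed compartment model in which new infections scale with how often susceptible people encounter infectious individuals. The functional form and every parameter value below are illustrative assumptions, not the authors' actual equations:

```python
# Toy "collision rate" infection model: the rate of new infections is
# proportional to how often susceptible people encounter an infectious
# droplet cloud. All parameter values are illustrative, not from the paper.
def simulate(pop: int, infected: float, contact_rate: float,
             p_transmit: float, recovery: float, days: int) -> list:
    susceptible = pop - infected
    history = [infected]
    for _ in range(days):
        # Encounter frequency ~ contacts/day x chance a contact is infectious.
        new_cases = contact_rate * p_transmit * susceptible * infected / pop
        new_cases = min(new_cases, susceptible)
        susceptible -= new_cases
        infected += new_cases - recovery * infected
        history.append(infected)
    return history

curve = simulate(pop=100_000, infected=10, contact_rate=8,
                 p_transmit=0.02, recovery=0.1, days=30)
print(f"infected after 30 days: {curve[-1]:.0f}")
```

The point of the analogy is that `contact_rate * p_transmit` plays the same role as a chemical reaction rate constant: raise the collision frequency and the "reaction" (the epidemic) proceeds faster.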

They found that, depending on weather conditions, some respiratory droplets travel between 8 feet and 13 feet away from their source before evaporating, without even accounting for wind. This means that without masks, six feet of social distance may not be enough to keep one person’s exhaled particles from reaching someone else.

“Droplet physics are significantly dependent on weather,” said Saha. “If you’re in a colder, humid climate, droplets from a sneeze or cough are going to last longer and spread farther than if you’re in a hot dry climate, where they’ll get evaporated faster. We incorporated these parameters into our model of infection spread; they aren’t included in existing models as far as we can tell.”

The researchers hope that their more detailed model for rate of infection spread and droplet spread will help inform public health policies at a more local level, and can be used in the future to better understand the role of environmental factors in virus spread.

They found that at 35°C (95°F) and 40 percent relative humidity, a droplet can travel about 8 feet. However, at 5°C (41°F) and 80 percent humidity, a droplet can travel up to 12 feet. The team also found that droplets in the range of 14-48 microns pose a higher risk, as they take longer to evaporate and travel greater distances. Smaller droplets, on the other hand, evaporate within a fraction of a second, while droplets larger than 100 microns quickly settle to the ground due to weight.
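The reported size thresholds can be collected into a rough lookup. Note that the 48-100 micron band is not explicitly characterized in the article, so it is labeled as such here; this is a paraphrase of the reported findings, not the study's actual classification scheme:

```python
def droplet_risk(diameter_um: float) -> str:
    """Rough risk bands based on the thresholds reported in the article.
    The 48-100 micron band is not explicitly characterized in the article;
    it is labeled 'intermediate' here purely for illustration."""
    if diameter_um < 14:
        return "low (evaporates within a fraction of a second)"
    if diameter_um <= 48:
        return "higher (slow to evaporate, travels farther)"
    if diameter_um <= 100:
        return "intermediate (not characterized in the article)"
    return "low (settles quickly to the ground due to weight)"

for d in (5, 30, 70, 150):
    print(f"{d} microns -> {droplet_risk(d)}")
```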

This is further evidence of the importance of wearing masks, which would trap particles in this critical range.

The team of engineers from the UC San Diego Jacobs School of Engineering, University of Toronto and Indian Institute of Science are all experts in the aerodynamics and physics of droplets for applications including propulsion systems, combustion or thermal sprays. They turned their attention and expertise to droplets released when people sneeze, cough or talk when it became clear that COVID-19 is spread through these respiratory droplets. They applied existing models for chemical reactions and physics principles to droplets of a salt water solution — saliva is high in sodium chloride — which they studied in an ultrasonic levitator to determine the size, spread, and lifespan of these particles in various environmental conditions.

Many current pandemic models use fitting parameters to be able to apply the data to an entire population. The new model aims to change that.

“Our model is completely based on ‘first principles’ by connecting physical laws that are well understood, so there is next to no fitting involved,” said Swetaprovo Chaudhuri, professor at University of Toronto and a co-author. “Of course, we make idealized assumptions, and there are variabilities in some parameters, but as we improve each of the submodels with specific experiments and including the present best practices in epidemiology, maybe a first principles pandemic model with high predictive capability could be possible.”

There are limitations to this new model, but the team is already working to increase the model’s versatility.

“Our next step is to relax a few simplifications and to generalize the model by including different modes of transmission,” said Saptarshi Basu, professor at the Indian Institute of Science and a co-author. “A set of experiments are also underway to investigate the respiratory droplets that settle on commonly touched surfaces.”

Go to Source


Enjin Simplifies Blockchain Integration Via New Java SDK

Last week Enjin, an open-source gaming development platform, launched an all-new Java SDK. The organization hopes to make it easier for Java developers to integrate its blockchain technology into applications.

Enjin views the addition of a Java SDK as significant based on the widespread use of the programming language and notes that it is “considered a stable, reliable way to build large systems.” The announcement of the SDK highlighted implementations that include Twitter, Netflix, and (a little more to the point) Minecraft. 

The new Java SDK provides developers with access to tools for authentication, user management, wallet linking, and create requests (used to initiate transactions of Enjin Coin). Enjin designed the SDK based on previously released SDKs for Unity and Godot, a fact that may simplify adoption for users who have worked with those tools.

Go to Source
Author: KevinSundstrom


Google Cardboard Gains Unity SDK to Support Open-Source Dev

Late last year Google discontinued active development of the company’s smartphone-based VR platform: Google Cardboard. At that time, the announcement was made that Google would open-source the project, an olive branch for developers hoping to continue improving the product. Google has now released the Cardboard Unity SDK to support these efforts.

The blog post announcing the SDK noted that developers will receive support that goes beyond the SDK:

“In addition to the Unity SDK, we are also providing a sample application for iOS/Android, which will be a great aid for developers trying to debug their own creations. This release not only fulfills a promise we made to our Cardboard community, but also shows our support, as we move away from smartphone VR and leave it in the more-than-capable hands of our development community.”

Developers interested in working with Google Cardboard can check out the developer center in addition to the associated GitHub repository.

Go to Source
Author: KevinSundstrom


How Will Facebook’s Acquisition of Giphy Impact API Integrations?

Last week, Facebook announced its acquisition of Giphy, one of the most popular GIF sites on the internet. In its announcement, Facebook disclosed that the Facebook family of apps already accounts for half of Giphy’s traffic, with Instagram making up half of that Facebook traffic on its own. Accordingly, the Giphy team will become part of Instagram, which has enjoyed API integration with Giphy for quite some time. The question becomes: how will app integrations outside of the Facebook family be impacted by the acquisition?

The Giphy API has long been used by third-party apps like Apple iMessage, TikTok, and Twitter to bring GIFs to integrated apps. With Facebook now fully in control of Giphy, will easy access to GIFs through this service continue? Even if Facebook doesn’t limit API access to Giphy, third-party apps may not want to integrate with a Facebook-owned service. Whether it’s competition, privacy concerns, or something else, Facebook ownership could dissuade continued use of Giphy for GIFs.
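For context, the integrations at stake are typically simple keyed HTTP requests against Giphy's public search endpoint. A minimal sketch of constructing such a request (the `YOUR_API_KEY` value is a placeholder for a real key, and the resulting URL can be fetched with any HTTP client):

```python
from urllib.parse import urlencode

# GIPHY's public GIF search endpoint; requests are authenticated with an
# API key passed as a query parameter.
GIPHY_SEARCH = "https://api.giphy.com/v1/gifs/search"

def giphy_search_url(api_key: str, query: str, limit: int = 5) -> str:
    """Build a GIPHY search request URL for the given query."""
    params = urlencode({"api_key": api_key, "q": query, "limit": limit})
    return f"{GIPHY_SEARCH}?{params}"

url = giphy_search_url("YOUR_API_KEY", "thumbs up")
print(url)
```

Because the dependency is this thin, switching providers would be technically easy for most apps; the real switching cost Facebook inherits is Giphy's content library, not its API surface.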

While the future of Giphy integration under Facebook ownership remains to be seen, Facebook’s messaging is nothing but positive. Facebook Vice President of Product Vishal Shah commented:

“We’ve used GIPHY’s API for years, not just in Instagram, but in the Facebook app, Messenger and WhatsApp. GIPHY will continue to operate its library (including its global content collection), and we’re looking forward to investing further in its technology and relationships with content and API partners. People will still be able to upload GIFs; developers and API partners will continue to have the same access to GIPHY’s APIs; and GIPHY’s creative community will still be able to create great content.”

Go to Source
Author: ecarter

3D Printing Industry

U.S. Air Force and GE collaborate to 3D print sump cover for F110 jet engine

Last year, GE Additive and GE Aviation proposed a collaborative metal additive manufacturing program with the U.S. Air Force to accelerate the adoption of AM for spare parts. That collaboration has just hit its first technology milestone with the 3D printing of a sump cover for the F110 jet engine which is used on both […]

Go to Source
Author: Kubi Sertoglu


Pofatu: A new database for geochemical ‘fingerprints’ of artefacts

Due to the improvement and increased use of geochemical fingerprinting techniques during the last 25 years, the archaeological compositional data of stone tools has grown exponentially. The Pofatu Database is a large-scale collaborative project that enables curation and data sharing. The database also provides instrumental details, analytical procedures and reference standards used for calibration purposes or quality control. Thus, Pofatu ensures reproducibility and comparability between provenance studies.

Provenance studies (documenting where artefacts are found relative to their sources or place of manufacture) help archaeologists understand the “life-histories” of artefacts, in this case, stone tools. They show where the raw material comes from and how artefacts were manufactured and distributed between individuals and groups. Reliable data allows scientists to reconstruct technological, economic, and social behaviors of human societies over many thousands of years.

To facilitate access to this growing body of geochemical data, Aymeric Hermann and Robert Forkel of the Department for Linguistic and Cultural Evolution, Max Planck Institute for the Science of Human History, conceived and designed Pofatu, the first open-access database of geochemical compositions and contextual information for archaeological sources and artefacts in a form readily accessible to the scientific community.

Reconstructing ancient strategies of raw material and artefact procurement

Geochemical “fingerprinting” of artefacts is the most effective way to reconstruct how and where ancient peoples extracted, transformed, and exchanged stone materials and artefacts. These fingerprints also serve as clues to a number of phenomena in past human societies, such as technical and economic behaviors, as well as sociopolitical organization.

The Pofatu Database provides researchers with access to an ever-expanding dataset and facilitates comparability and reproducibility in provenance studies. Each sample is comprehensively documented for elemental and isotopic compositions, and includes detailed archaeological provenance, as well as supporting analytical metadata, such as sampling processes, analytical procedures, and quality control.

“By providing analytical data and comprehensive archaeological details in a form that can be readily accessed by the scientific community,” Hermann says, “the Pofatu Database will facilitate assigning unambiguous provenance to artefacts in future studies and will lead to more robust, large-scope modelling of long-distance voyaging and traditional exchange systems.”

Additionally, Marshall Weisler, a collaborator in the Pofatu project from the University of Queensland in Australia, stated that “By tracing the transport of artefacts carried across the wide expanse of the Pacific Ocean, we will be able to reconstruct the ancient journeys enabling the greatest maritime migration in human history.”

Pofatu — an operational framework for data sharing in archaeometry

Pofatu’s structure was designed by Forkel and Hermann. Hermann compiled and described the data with contributions and validations by colleagues and co-authors from universities and research institutions in New Zealand, Australia, and the USA. The database uses GitHub for open-source storage and version control, and common non-proprietary file formats (CSV), to enable transparency and built-in reproducibility for future studies of prehistoric exchange. The database currently contains 7759 individual samples from archaeological sites and geological sources across the Pacific Islands, but Pofatu is built to hold even more, Hermann notes.
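Because the data ships as plain CSV, working with it requires nothing beyond a standard library CSV reader. In this sketch the column names are invented stand-ins, not the dataset's actual schema:

```python
import csv
import io

# Hypothetical rows mimicking a Pofatu-style CSV layout; the column names
# here are invented stand-ins, not the dataset's actual schema.
raw = io.StringIO(
    "sample_id,sample_category,location,SiO2,Sr87_Sr86\n"
    "POF-0001,SOURCE,Tutuila,47.2,0.70451\n"
    "POF-0002,ARTEFACT,Mangareva,46.8,0.70448\n"
)

# Filter the table down to artefact samples, e.g. to compare their
# compositions against known geological sources.
artefacts = [row for row in csv.DictReader(raw)
             if row["sample_category"] == "ARTEFACT"]
print(artefacts[0]["sample_id"])
```

The choice of CSV over a proprietary database format is what makes this kind of two-line filtering possible in any language, which is exactly the transparency and reproducibility the project aims for.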

“With Pofatu we activated an operational framework for data sharing in archaeometry. The database is currently focused on sites and collections from the Pacific Islands, but we welcome all contributions of geochemical data on archaeological material, regardless of geographic or chrono-cultural boundaries. Our vision is an inclusive and collaborative data resource that will hopefully continue to develop with more datasets from the Pacific as well as from other regions. The ultimate goal is a more global project contemporary to other existing online repositories for geological materials.”

Although the Pofatu Database is meant to be used primarily by archaeologists, analyses of geological samples and raw material extracted from prehistoric quarries could also be used by geologists to gather essential information on the smaller or more remote Pacific islands, which are among the least studied places on the planet and sometimes lack geochemical documentation. In that sense, Pofatu is a tool that will facilitate interdisciplinary research.

Go to Source


How many jobs do robots really replace?

In many parts of the U.S., robots have been replacing workers over the last few decades. But to what extent, really? Some technologists have forecast that automation will lead to a future without work, while other observers have been more skeptical about such scenarios.

Now a study co-authored by an MIT professor puts firm numbers on the trend, finding a very real impact — although one that falls well short of a robot takeover. The study also finds that in the U.S., the impact of robots varies widely by industry and region, and may play a notable role in exacerbating income inequality.

“We find fairly major negative employment effects,” MIT economist Daron Acemoglu says, although he notes that the impact of the trend can be overstated.

From 1990 to 2007, the study shows, adding one additional robot per 1,000 workers reduced the national employment-to-population ratio by about 0.2 percent, with some areas of the U.S. affected far more than others.

This means each additional robot added in manufacturing replaced about 3.3 workers nationally, on average.

That increased use of robots in the workplace also lowered wages by roughly 0.4 percent during the same time period.

“We find negative wage effects, that workers are losing in terms of real wages in more affected areas, because robots are pretty good at competing against them,” Acemoglu says.

The paper, “Robots and Jobs: Evidence from U.S. Labor Markets,” appears in advance online form in the Journal of Political Economy. The authors are Acemoglu and Pascual Restrepo PhD ’16, an assistant professor of economics at Boston University.

Displaced in Detroit

To conduct the study, Acemoglu and Restrepo used data on 19 industries, compiled by the International Federation of Robotics (IFR), a Frankfurt-based industry group that keeps detailed statistics on robot deployments worldwide. The scholars combined that with U.S.-based data on population, employment, business, and wages, from the U.S. Census Bureau, the Bureau of Economic Analysis, and the Bureau of Labor Statistics, among other sources.

The researchers also compared robot deployment in the U.S. to that of other countries, finding it lags behind that of Europe. From 1993 to 2007, U.S. firms actually did introduce almost exactly one new robot per 1,000 workers; in Europe, firms introduced 1.6 new robots per 1,000 workers.

“Even though the U.S. is a technologically very advanced economy, in terms of industrial robots’ production and usage and innovation, it’s behind many other advanced economies,” Acemoglu says.

In the U.S., four manufacturing industries account for 70 percent of robots: automakers (38 percent of robots in use), electronics (15 percent), the plastics and chemical industry (10 percent), and metals manufacturers (7 percent).

The study analyzed the impact of robots in 722 commuting zones in the continental U.S. — essentially metropolitan areas — and found considerable geographic variation in how intensively robots are utilized.

Given industry trends in robot deployment, the area of the country most affected is the seat of the automobile industry. Michigan has the highest concentration of robots in the workplace, with employment in Detroit, Lansing, and Saginaw affected more than anywhere else in the country.

“Different industries have different footprints in different places in the U.S.,” Acemoglu observes. “The place where the robot issue is most apparent is Detroit. Whatever happens to automobile manufacturing has a much greater impact on the Detroit area [than elsewhere].”

In commuting zones where robots were added to the workforce, each robot replaces about 6.6 jobs locally, the researchers found. However, in a subtle twist, adding robots in manufacturing benefits people in other industries and other areas of the country — by lowering the cost of goods, among other things. These national economic benefits are the reason the researchers calculated that adding one robot replaces 3.3 jobs for the country as a whole.
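The study's headline per-robot estimates make for a simple back-of-the-envelope calculation. The helper below is illustrative arithmetic built on the two figures reported above, not code from the paper:

```python
# Back-of-the-envelope using the study's headline estimates: each robot
# displaces about 6.6 jobs in its own commuting zone but only about 3.3 jobs
# nationally, once spillover benefits (e.g., cheaper goods) are netted out.
LOCAL_JOBS_PER_ROBOT = 6.6
NATIONAL_JOBS_PER_ROBOT = 3.3

def jobs_displaced(robots_added: int, national: bool = True) -> float:
    """Estimated jobs displaced by adding robots, at national or local scope."""
    rate = NATIONAL_JOBS_PER_ROBOT if national else LOCAL_JOBS_PER_ROBOT
    return robots_added * rate

print(f"{jobs_displaced(1000):.0f} net jobs displaced nationally")
print(f"{jobs_displaced(1000, national=False):.0f} jobs displaced locally")
```

The gap between the two rates is the point of the study's "subtle twist": half of the local displacement is offset elsewhere in the economy, so the national figure is the one relevant to aggregate employment.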

The inequality issue

In conducting the study, Acemoglu and Restrepo went to considerable lengths to see if the employment trends in robot-heavy areas might have been caused by other factors, such as trade policy, but they found no complicating empirical effects.

The study does suggest, however, that robots have a direct influence on income inequality. The manufacturing jobs they replace come from parts of the workforce without many other good employment options; as a result, there is a direct connection between automation in robot-using industries and sagging incomes among blue-collar workers.

“There are major distributional implications,” Acemoglu says. When robots are added to manufacturing plants, “The burden falls on the low-skill and especially middle-skill workers. That’s really an important part of our overall research [on robots], that automation actually is a much bigger part of the technological factors that have contributed to rising inequality over the last 30 years.”

So while claims about machines wiping out human work entirely may be overstated, the research by Acemoglu and Restrepo shows that the robot effect is a very real one in manufacturing, with significant social implications.

“It certainly won’t give any support to those who think robots are going to take all of our jobs,” Acemoglu says. “But it does imply that automation is a real force to be grappled with.”

Go to Source