Categories
ScienceDaily

Who’s Tweeting about scientific research? And why?

Although Twitter is best known for its role in political and cultural discourse, it has also become an increasingly vital tool for scientific communication. A new study published in the open access journal PLOS Biology decodes the record of social media engagement left by laypeople: researchers from the University of Washington School of Medicine, Seattle, show that Twitter users can be characterized in extremely fine detail by mining a relatively untapped source of information, namely how those users’ followers describe themselves. The study reveals some exciting — and, at times, disturbing — patterns in how research is received and disseminated through social media.

Scientists candidly tweet about their unpublished research not only to one another but also to a broader audience of engaged laypeople. When consumers of cutting-edge science tweet or retweet about studies they find interesting, they leave behind a real-time record of the impact that taxpayer-funded research is having within academia and beyond.

The lead author of the study, Jedidiah Carlson at the University of Washington, explains that each user in a social network will tend to connect with other users who share similar characteristics (such as occupation, age, race, hobbies, or geographic location), a sociological concept formally known as “network homophily.” By tapping into the information embedded in the broader networks of users who tweet about a paper, Carlson and his coauthor, Kelley Harris, are able to describe the total audience of each paper as a composite of multiple interest groups that might indicate the study’s potential to produce intellectual breakthroughs as well as social, cultural, economic, or environmental impacts.

Rather than sorting people into coarse groups such as “scientists” and “non-scientists,” an approach that relies on Twitter users accurately describing themselves in their platform biographies, Carlson was able to segment “scientists” into their specific research disciplines (such as evolutionary biology or bioinformatics), regardless of whether they mentioned these sub-disciplines in their Twitter bios.

The broader category of “non-scientists” can be automatically segmented into a multitude of groups, such as mental health advocates, dog lovers, video game developers, vegans, bitcoin investors, journalists, religious groups, and political constituencies. However, Carlson cautions that these indicators of diverse public engagement may not always be in line with scientists’ intended goals.
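To illustrate the general idea behind this kind of audience segmentation, the sketch below embeds a handful of invented follower bios as text vectors and clusters similar descriptions together. It is a minimal illustration of the concept only, not the authors’ actual pipeline; the bios, cluster count, and libraries used here are assumptions.

```python
# Minimal sketch of audience segmentation from follower bios (illustrative only,
# not the pipeline used in the PLOS Biology study). Bios and cluster count are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical self-descriptions pulled from the followers of accounts
# that tweeted about a paper.
follower_bios = [
    "PhD student in evolutionary biology, runner, coffee addict",
    "Bioinformatician building pipelines for genomic data",
    "Dog mom, mental health advocate, she/her",
    "Indie video game developer and pixel artist",
    "Vegan recipes and sustainable living tips",
    "Bitcoin investor | crypto news | HODL",
    "Science journalist covering genetics and health",
    "Population geneticist interested in mutation rates",
]

# Represent each bio as a TF-IDF vector and group similar bios together.
vectors = TfidfVectorizer(stop_words="english").fit_transform(follower_bios)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(vectors)

# Each cluster approximates one audience sector (e.g. "scientists",
# "mental health advocates", "crypto enthusiasts").
for cluster, bio in sorted(zip(labels, follower_bios)):
    print(cluster, bio)
```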

Hundreds of papers were found to have Twitter audiences that were dominated by conspiracy theorists, white nationalists, or science denialists. In extreme cases, these audience sectors comprised more than half of all tweets referencing a given study, starkly illustrating the adage that science does not exist in a cultural or political vacuum.

Particularly in light of the rampant misappropriation and politicization of scientific research throughout the COVID-19 pandemic, Carlson hopes that the results of his study might motivate scientists to keep a closer watch on the social media pulse surrounding their publications and intervene accordingly to guide their audiences towards productive and well-informed engagement.

Story Source:

Materials provided by PLOS. Note: Content may be edited for style and length.

Categories
ScienceDaily

New design principles for spin-based quantum materials

As our lives become increasingly intertwined with technology — whether supporting communication while working remotely or streaming our favorite show — so too does our reliance on the data these devices create. Data centers supporting these technology ecosystems produce a significant carbon footprint — and consume 200 terawatt-hours of energy each year, more than the annual energy consumption of Iran. To balance ecological concerns yet meet growing demand, advances in microelectronic processors — the backbone of many Internet of Things (IoT) devices and data hubs — must be efficient and environmentally friendly.

Northwestern University materials scientists have developed new design principles that could help spur development of future quantum materials used to advance IoT devices and other resource-intensive technologies while limiting ecological damage.

“New path-breaking materials and computing paradigms are required to make data centers more energy-lean in the future,” said James Rondinelli, professor of materials science and engineering and the Morris E. Fine Professor in Materials and Manufacturing at the McCormick School of Engineering, who led the research.

The study marks an important step in Rondinelli’s efforts to create new materials that are non-volatile, energy efficient, and generate less heat — important aspects of future ultrafast, low-power electronics and quantum computers that can help meet the world’s growing demand for data.

Whereas conventional semiconductors use the electron’s charge in transistors to power computing, solid-state spin-based materials utilize the electron’s spin and have the potential to support low-energy memory devices. In particular, materials with a high-quality persistent spin texture (PST) can exhibit a long-lived persistent spin helix (PSH), which can be used to track or control the spin-based information in a transistor.

Although many spin-based materials already encode information using spins, that information can be corrupted as the spins propagate through the active portion of the transistor. The researchers’ novel PST protects that spin information in helix form, making it a potential platform for ultralow-energy, ultrafast spin-based logic and memory devices.
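For readers who want the textbook picture behind these terms, the two-dimensional Rashba-Dresselhaus model sketched below is the standard way the literature explains persistent spin textures; it is an illustrative sketch, not necessarily the specific effective Hamiltonian used in this study.

```latex
% Standard two-dimensional Rashba--Dresselhaus spin-orbit term (illustrative only;
% not necessarily the specific effective Hamiltonian of this study):
\begin{equation}
  H_{\mathrm{SO}}(\mathbf{k}) = \alpha \left( k_x \sigma_y - k_y \sigma_x \right)
                              + \beta  \left( k_x \sigma_x - k_y \sigma_y \right)
\end{equation}
% When |\alpha| = |\beta|, the spin-orbit field points along one fixed axis for every
% wave vector k, so spins precess about a single direction and a long-lived persistent
% spin helix with wave number Q = 4 m^{*} |\alpha| / \hbar^{2} can form. The "Rashba
% anisotropy" mentioned below is the ratio of the two parameters \alpha and \beta.
```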

The research team used quantum-mechanical models and computational methods to develop a framework to identify and assess the spin textures in a group of non-centrosymmetric crystalline materials. The ability to control and optimize the spin lifetimes and transport properties in these materials is vital to realizing the future of quantum microelectronic devices that operate with low energy consumption.

“The limiting characteristic of spin-based computing is the difficulty in attaining both long-lived and fully controllable spins from conventional semiconductor and magnetic materials,” Rondinelli said. “Our study will help future theoretical and experimental efforts aimed at controlling spins in otherwise non-magnetic materials to meet future scaling and economic demands.”

Rondinelli’s framework used microscopic effective models and group theory to identify three materials design criteria that would produce useful spin textures: carrier density (the number of electrons propagating through an effective magnetic field); Rashba anisotropy (the ratio between the intrinsic spin-orbit coupling parameters of the materials); and momentum space occupation (the PST region active in the electronic band structure). These features were then assessed using quantum-mechanical simulations to discover high-performing PSHs in a range of oxide-based materials.

The researchers used these principles and numerical solutions to a series of differential spin-diffusion equations to assess the spin texture of each material and predict the spin lifetimes for the helix in the strong spin-orbit coupling limit. They also found they could adjust and improve the PST performance using atomic distortions at the picoscale. The group determined an optimal PST material, Sr3Hf2O7, which showed a substantially longer spin lifetime for the helix than in any previously reported material.

“Our approach provides a unique chemistry-agnostic strategy to discover, identify, and assess symmetry-protected persistent spin textures in quantum materials using intrinsic and extrinsic criteria,” Rondinelli said. “We proposed a way to expand the number of space groups hosting a PST, which may serve as a reservoir from which to design future PST materials, and found yet another use for ferroelectric oxides — compounds with a spontaneous electrical polarization. Our work also will help guide experimental efforts aimed at implementing the materials in real device structures.”

Story Source:

Materials provided by Northwestern University. Original written by Alex Gerage. Note: Content may be edited for style and length.

Categories
ScienceDaily

Physicists develop basic principles for mini-labs on chips

Colloidal particles have become increasingly important for research as vehicles of biochemical agents. In future, it will be possible to study their behaviour much more efficiently than before by placing them on a magnetised chip. A research team from the University of Bayreuth reports on these new findings in the journal Nature Communications. The scientists have discovered that colloidal rods can be moved on a chip quickly, precisely, and in different directions, almost like chess pieces. A pre-programmed magnetic field even enables these controlled movements to occur simultaneously.

For the recently published study, the research team, led by Prof. Dr. Thomas Fischer, Professor of Experimental Physics at the University of Bayreuth, worked closely with partners at the University of Poznań and the University of Kassel. To begin with, individual spherical colloidal particles constituted the building blocks for rods of different lengths. These particles were assembled in such a way as to allow the rods to move in different directions on a magnetised chip like upright chess figures — as if by magic, but in fact determined by the characteristics of the magnetic field.

In a further step, the scientists succeeded in eliciting individual movements in various directions simultaneously. The critical factor here was the “programming” of the magnetic field with the aid of a mathematical code which, in encoded form, outlines all the movements to be performed by the figures. When these movements are carried out simultaneously, they take up to one tenth of the time needed if they are carried out one after the other, like the moves on a chessboard.

“The simultaneity of differently directed movements makes research into colloidal particles and their dynamics much more efficient,” says Adrian Ernst, doctoral student in the Bayreuth research team and co-author of the publication. “Miniaturised laboratories on small chips measuring just a few centimetres in size are being used more and more in basic physics research to gain insights into the properties and dynamics of materials. Our new research results reinforce this trend. Because colloidal particles are in many cases very well suited as vehicles for active substances, our research results could be of particular benefit to biomedicine and biotechnology,” says Mahla Mirzaee-Kakhki, first author and Bayreuth doctoral student.

Story Source:

Materials provided by Universität Bayreuth. Note: Content may be edited for style and length.

Categories
ScienceDaily

New machine learning-assisted method rapidly classifies quantum sources

For quantum optical technologies to become more practical, there is a need for large-scale integration of quantum photonic circuits on chips.

This integration calls for scaling up key building blocks of these circuits: sources of particles of light (photons) produced by single quantum optical emitters.

Purdue University engineers created a new machine learning-assisted method that could make quantum photonic circuit development more efficient by rapidly preselecting these solid-state quantum emitters.

The work is published in the journal Advanced Quantum Technologies.

Researchers around the world have been exploring different ways to fabricate identical quantum sources by “transplanting” nanostructures containing single quantum optical emitters into conventional photonic chips.

“With the growing interest in scalable realization and rapid prototyping of quantum devices that utilize large emitter arrays, high-speed, robust preselection of suitable emitters becomes necessary,” said Alexandra Boltasseva, Purdue’s Ron and Dotty Garvin Tonjes Professor of Electrical and Computer Engineering.

Quantum emitters produce light with unique, non-classical properties that can be used in many quantum information protocols.

The challenge is that interfacing most solid-state quantum emitters with existing scalable photonic platforms requires complex integration techniques. Before integrating, engineers need to first identify bright emitters that produce single photons rapidly, on-demand and with a specific optical frequency.

Emitter preselection based on “single-photon purity” — which is the ability to produce only one photon at a time — typically takes several minutes for each emitter. Thousands of emitters may need to be analyzed before finding a high-quality candidate suitable for quantum chip integration.

To speed up screening based on single-photon purity, Purdue researchers trained a machine to recognize promising patterns in single-photon emission within a split second.

According to the researchers, rapidly finding the purest single-photon emitters within a set of thousands would be a key step toward practical and scalable assembly of large quantum photonic circuits.

“Given a photon purity standard that emitters must meet, we have taught a machine to classify single-photon emitters as sufficiently or insufficiently ‘pure’ with 95% accuracy, based on minimal data acquired within only one second,” said Zhaxylyk Kudyshev, a Purdue postdoctoral researcher.
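The study frames this as a binary classification problem on sparse photon-correlation data. As a rough flavor of what such a classifier can look like, the sketch below trains an off-the-shelf model on simulated, noisy g²(τ) histograms and labels emitters against the commonly used g²(0) < 0.5 purity threshold; it is a minimal illustration with made-up data, not the Purdue group’s network or dataset.

```python
# Minimal sketch of classifying emitters as "pure" or "impure" single-photon
# sources from short, noisy correlation histograms. Illustrative only: the data
# are simulated and this is not the model or dataset used in the Purdue study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def simulated_g2_histogram(g2_zero, n_bins=61, counts=50):
    """Toy g2(tau) histogram: flat background with a dip at tau = 0 whose depth
    is set by g2_zero. Short acquisition means Poisson-noisy, sparse counts."""
    tau = np.arange(n_bins) - n_bins // 2
    ideal = 1.0 - (1.0 - g2_zero) * np.exp(-np.abs(tau) / 5.0)
    return rng.poisson(ideal * counts)

# Label an emitter "pure" if its underlying g2(0) lies below the common 0.5 threshold.
g2_values = rng.uniform(0.0, 1.0, size=2000)
X = np.array([simulated_g2_histogram(g) for g in g2_values])
y = (g2_values < 0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=2000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```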

The researchers found that the conventional photon purity measurement method used for the same task took 100 times longer to reach the same level of accuracy.

“The machine learning approach is such a versatile and efficient technique because it is capable of extracting the information from the dataset that the fitting procedure usually ignores,” Boltasseva said.

The researchers believe that their approach has the potential to dramatically advance most quantum optical measurements that can be formulated as binary or multiclass classification problems.

“Our technique could, for example, speed up super-resolution microscopy methods built on higher-order correlation measurements that are currently limited by long image acquisition times,” Kudyshev said.

Story Source:

Materials provided by Purdue University. Note: Content may be edited for style and length.

Categories
ProgrammableWeb

Four Methods You Can Use to Train Employees on API Basics

Investing in comprehensive digital training has become a practical necessity for businesses across countless industries. The spread of technology (particularly pertaining to automation) continues at a rapid clip, and the practice of having in-house tech support to assist workers lacking digital skills loses its efficacy when you pivot to the now-standard remote working model.

There are plenty of areas to be covered, of course, meaning you can choose your preferred route — though you’ll need to tailor your approach to each topic. When taking your employees through API basics, the goal should be to help them understand the key role that APIs play in the modern world and the value they’ll see from API adoption. For organizations undergoing a digital transformation, educating employees about APIs can help increase buy-in as they begin to see how APIs can make their jobs easier. To help your employees best understand APIs, you’ll need to choose a suitable method.

In this post, we’re going to set out four methods that you can successfully implement in your effort to train your employees on API basics. You can choose just one of them, or attempt all four. It depends on your situation. Regardless, let’s begin.

Invest In Decent Online Courses

The most common option for training of any kind (particularly in a time of remote working) is to turn to online courses. They’re extremely accessible these days. There are myriad websites that offer huge ranges of courses on broad or specific topics: some costly, some cheap, some free (Lifehack has a solid list of these). In addition, there are plenty of YouTube and Vimeo videos that have a lot of valuable info, including videos from ProgrammableWeb.

The investment here isn’t necessarily monetary, though it may well be: it’s also about time and effort, because going through a free course that has little to offer will be a waste. The good thing about standardized online courses is that they’re extremely trackable and generally have native assessments, making it much easier to tell what progress has been made.

Run Guided Training Sessions

If you already have employees familiar with how APIs work and what they’re used for, you can take advantage of this by getting them involved in training sessions for their coworkers. Sharing knowledge internally is enormously useful for companies with multiple departments (or just diverse roles), and it allows a lot more creative freedom than using a set program from outside.

This also opens up the possibility of turning to a consultancy service: finding an API expert (or set of experts) who can come in to provide instruction on particular topics. Consulting a third party is very common in the IT industry due to its sheer breadth: for instance, it’s standard practice for a value-added software reseller to task a cloud solution distributor (usually a company like intY) with stepping in to provide software training and assist on complex customer queries.

Bring Them In On API Projects

Many people learn best by doing (this is why active learning is so prominent). If there are already people in your company working on API projects, a good way to pique the interest of someone struggling for training motivation is to give them the opportunity to get involved in one or more of those projects. They can follow along with what’s being done, ask questions at their leisure, and be given limited tasks they can pursue without needing to rush themselves.

This form of training will inevitably lead to some major mistakes being made, which is why you should choose low-priority projects — but those mistakes will likely prove beneficial because dedicated professionals always learn from their errors. In all likelihood, joining a meaningful project won’t just show them the value of APIs: it’ll also inspire them to do their own research.

Offer Incentives for Upskilling

Lastly, one of the best methods for training employees on API basics doesn’t actually involve doing any training. Instead, it involves offering them meaningful incentives for upskilling in pertinent areas and leaving them to decide how they’re going to manage it. One person might want to arrange some intensive training over the course of a week. Another might prefer to learn slowly but surely over the course of several months.

In the end, you want your employees to understand API basics because it will allow them to make better use of all the software tools available to them (and see much more clearly how interconnected today’s digital world is). How they acquire that understanding shouldn’t really matter, and simply providing a path for promotion and/or salary improvement might end up being the most powerful motivator (and cost no more than a high-end training course would).

There you have it: four viable methods for training your employees on API basics. What you should do depends on the preferences and abilities of your team. Ask them about how they work, how they learn, and what would motivate them to learn — then deliver.

Author: rodneylaws

Categories
ScienceDaily

Effectiveness of cloth masks depends on type of covering

Months into the COVID-19 pandemic, wearing a mask while out in public has become the recommended practice. However, many still question the effectiveness of this practice.

To allay these doubts, Padmanabha Prasanna Simha, from the Indian Space Research Organisation, and Prasanna Simha Mohan Rao, from the Sri Jayadeva Institute of Cardiovascular Sciences and Research, experimentally visualized the flow fields of coughs under various common mouth covering scenarios. They present their findings in the journal Physics of Fluids, from AIP Publishing.

“If a person can reduce the extent of how much they contaminate the environment by mitigating the spread, it’s a far better situation for other healthy individuals who may enter places that have such contaminated areas,” Simha said.

Density and temperature are intricately related, and coughs tend to be warmer than their surroundings. Tapping into this connection, Simha and Rao utilized a technique called schlieren imaging, which visualizes changes in density, to capture images of voluntary coughs from five test subjects. By tracking the motion of a cough over successive images, the team estimated the velocity and spread of the expelled droplets.
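In essence, that estimate reduces to simple kinematics: measure how far the cough front advances between frames and divide by the time between frames. The sketch below illustrates the arithmetic with made-up positions, frame rate, and spatial calibration; it is not the study’s data or processing code.

```python
# Back-of-the-envelope sketch of estimating a cough front's speed from successive
# schlieren frames (illustrative only; positions, frame rate, and calibration are
# made up, not the study's data).
frame_rate_hz = 1000.0                      # high-speed camera frame rate (assumed)
pixels_per_meter = 2000.0                   # spatial calibration (assumed)

# Hypothetical horizontal pixel positions of the cough's leading edge in
# consecutive frames.
front_positions_px = [100, 118, 135, 151, 166]

dt = 1.0 / frame_rate_hz
velocities = [
    (b - a) / pixels_per_meter / dt         # metres per second between frames
    for a, b in zip(front_positions_px, front_positions_px[1:])
]
print([round(v, 2) for v in velocities])    # e.g. [9.0, 8.5, 8.0, 7.5]
```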

Unsurprisingly, they found N95 masks to be the most effective at reducing the horizontal spread of a cough. The N95 masks reduced a cough’s initial velocity by up to a factor of 10 and limited its spread to between 0.1 and 0.25 meters.

An uncovered cough, in contrast, can travel up to 3 meters, but even a simple disposable mask can bring this all the way down to 0.5 meters.

“Even if a mask does not filter out all the particles, if we can prevent clouds of such particles from traveling very far, it’s better than not doing anything,” said Simha. “In situations where sophisticated masks are not available, any mask is better than no mask at all for the general public in slowing the spread of infection.”

Some of the other comparisons, however, were striking.

For example, using an elbow to cover a cough is typically considered a good alternative in a pinch, but the pair’s findings contradict this. Unless covered by a sleeve, a bare arm cannot form the proper seal against the nose necessary to obstruct airflow. A cough is then able to leak through any openings and propagate in many directions.

Simha and Rao hope their findings will put to rest the argument that regular cloth masks are ineffective, but they emphasize that masks must continue to be used in conjunction with social distancing.

“Adequate distancing is something that must not be ignored, since masks are not foolproof,” Simha said.

Story Source:

Materials provided by American Institute of Physics. Note: Content may be edited for style and length.

Categories
ProgrammableWeb

10 Top Messaging APIs

In recent years, messaging has become a primary means of communication for much of the world. The asynchronous convenience of text messaging (SMS), web instant messaging, and in-app messaging has driven this rise in popularity, along with a slew of enticing features within messaging applications that keep us hooked.

Engaging features in messaging applications include cross-platform operation, artificial intelligence chatbots, anytime/anywhere usage thanks to WiFi or mobile network operators, file transfers, free international “calls,” business communications, audio messages, aggregated services, group chat, encryption, self-destructing messages, instant payments, and automatic alerts. Fun additions such as emoji, stickers, image and video support, avatar animations, “story” creation, games, cute bubbles and screen effects, contextual keyboards, and even handwritten text lure customers to use messaging applications.

It’s not unusual to see applications with built-in custom messaging services, and developers who create applications have a vast number of choices for delivering messaging technology. In order to integrate with these services, developers need APIs.

What is a Messaging API?

A Messaging API, or Application Programming Interface, is a means for developers to connect to specific messaging services programmatically.

The best place to discover APIs for adding messaging capabilities to applications is the ProgrammableWeb directory in the Messaging category. This article highlights the 10 most popular messaging APIs based on website traffic in the ProgrammableWeb directory.
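As a concrete taste of what calling one of these services looks like in code, here is a minimal example of sending a text message with Twilio’s Python helper library (Twilio appears at number 4 below). The account credentials and phone numbers are placeholders, and the snippet is a sketch rather than production code.

```python
# Minimal example of sending an SMS with Twilio's Python helper library
# (pip install twilio). Credentials and phone numbers below are placeholders.
from twilio.rest import Client

account_sid = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"   # from the Twilio console
auth_token = "your_auth_token"

client = Client(account_sid, auth_token)

message = client.messages.create(
    body="Hello from the Twilio SMS API!",
    from_="+15017122661",     # a Twilio phone number you own
    to="+15558675310",        # the recipient
)
print(message.sid)            # unique ID for tracking the sent message
```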

1. Telegram

Telegram is a cloud-based mobile and desktop messaging app that focuses on speed and security. The Telegram API allows developers to build their own customized Telegram clients and applications. API methods are provided for dealing with spam and ToS violations, logging in via QR code, registration/authorization, working with GIFs, working with 2FA login, working with VoIP calls, working with deep links, working with files, and much more.

2. Bulk SMS Gateway API

The Bulk SMS Gateway API allows developers to integrate bulk SMS services into their applications and portals. This API is suited for sending both promotional and transactional SMS to clients. API documentation is not publicly available. This service is provided by KAPSYSTEM, a company in India that provides bulk SMS and messaging solutions.

3. WhatsApp Business API

The WhatsApp Business APIs allow businesses to interact with and reach customers all over the world, connecting them using end-to-end encryption to ensure only the intended parties can read or listen to messages and calls. A REST API and a Streaming (Webhooks) API are available.

4. Twilio SMS API

Twilio is a cloud communications platform that provides tools for adding messaging, voice, and video to web and mobile applications. The Twilio SMS API allows developers to send and receive SMS messages, track sent messages, and retrieve and modify message history from their applications. This API uses a RESTful interface over HTTPS.

5. BDApps Pro SMS API

BDApps is an application development platform that provides Robi network tools for monetization and messaging. The BDApps Pro SMS API allows developers to send and receive SMS using JSON objects over HTTP. This API can also be used to check the delivery status of sent SMS, receive SMS with a short code, and more. BDApps is based in Bangladesh.

6. Verizon ThingSpace SMS API

The Verizon ThingSpace SMS API lets applications send time-sensitive information to users’ phones about devices or sensor readings, such as temperature threshold warnings, gas leakage, smoke, fires, outages, and more. The ThingSpace SMS API allows users to check the delivery status of messages and receive other notifications about messaging.

7. Telenor SMS API

Telenor is a mobile carrier based in Norway. The Telenor SMS API provides access to the company’s text messaging service for business-to-business and business-to-consumer bulk messaging needs. The company provides short and whole numbers for sending and receiving text and MMS messages. There are various options for using the API, including SOAP and XMPP protocols.

8. waboxapp API

waboxapp is an API that allows users to integrate systems and Instant Messaging (IM) accounts. The waboxapp API simplifies the integration of IM accounts such as WhatsApp in chat applications.

9. Twitter Direct Message API

The Twitter Direct Message API allows developers to create engaging customer service and marketing experiences using Twitter Direct Messages (DM). Developers can send and receive direct messages, create welcome messages, attach media to messages, prompt users for structured replies, link to websites with buttons, manage conversations across multiple applications, display custom content, and prompt users for NPS and CSAT feedback with the API.

10. Mirrorfly API

Mirrorfly is a real-time chat and messaging solution. The Mirrorfly API allows developers to integrate chat, video, and voice functionality into their mobile and web applications. This service is customizable, comes with built-in WebRTC, and can be used for enterprise communication, in-app messaging, broadcasting, streaming, customer support, team chat, social chat, and personal chat. Both cloud-based and on-premises versions of Mirrorfly are available.

Build custom chat applications with MirrorFly API and SDK. Screenshot: MirrorFly

See the Messaging category for more than 1100 Messaging APIs, 1000 SDKs, and 1000 Source Code Samples, along with How-To and news articles and other developer resources.

Author: joyc

Categories
ScienceDaily

Researchers unlock secrets of the past with new international carbon dating standard

Radiocarbon dating is set to become more accurate than ever after an international team of scientists improved the technique for assessing the age of historical objects.

The team of researchers at the Universities of Sheffield, Belfast, Bristol, Glasgow, Oxford, and St Andrews, together with Historic England and international colleagues, used measurements from almost 15,000 samples from objects dating back as far as 60,000 years, as part of a seven-year project.

They used the measurements to create new international radiocarbon calibration (IntCal) curves, which are fundamental across the scientific spectrum for accurately dating artefacts and making predictions about the future. Radiocarbon dating is vital to fields such as archaeology and geoscience to date everything from the oldest modern human bones to historic climate patterns.

Archaeologists can use that knowledge to restore historic monuments or study the demise of the Neanderthals, while geoscientists on the Intergovernmental Panel on Climate Change (IPCC), rely upon the curves to find out about what the climate was like in the past to better understand and prepare for future changes.

Professor Paula Reimer, from Queen’s University Belfast and head of the IntCal project, said: “Radiocarbon dating has revolutionised the field of archaeology and environmental science. As we improve the calibration curve, we learn more about our history. The IntCal calibration curves are key to helping answer big questions about the environment and our place within it.”

The team of researchers have developed three curves dependent upon where the object to be dated is found. The new curves, to be published in Radiocarbon, are IntCal20 for the Northern Hemisphere, SHCal20 for the Southern Hemisphere, and Marine20 for the world’s oceans.

Dr Tim Heaton, from the University of Sheffield and lead author on the Marine20 curve, said: “This is a very exciting time to be working in radiocarbon. Developments in the field have made it possible to truly advance our understanding. I look forward to seeing what new insights into our past these recalculated radiocarbon timescales provide.”

The previous radiocarbon calibration curves, developed over the past 50 years, relied heavily upon measurements taken from chunks of wood covering 10 to 20 years, which were big enough to be tested for radiocarbon.

Advances in radiocarbon testing mean the updated curves instead use tiny samples, such as tree rings covering just single years, which provide previously impossible precision and detail. Additionally, improvements in understanding of the carbon cycle mean the curves have now been extended all the way to the limit of the radiocarbon technique, 55,000 years ago.

Radiocarbon dating, first developed in 1949, is the most frequently used approach for dating the last 55,000 years and underpins archaeological and environmental science. It depends upon two isotopes of carbon: stable 12C and radioactive 14C.

While a plant or animal is alive, it takes in new carbon and so has the same ratio of these isotopes as the atmosphere at the time. But once an organism dies it stops taking in new carbon; the stable 12C remains, while the 14C decays at a known rate. By measuring the ratio of 14C to 12C left in an object, the date of its death can be estimated.
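The decay arithmetic behind that estimate is straightforward; the sketch below converts a remaining 14C fraction into a raw, uncalibrated age using the roughly 5,730-year half-life of 14C. Real laboratories follow fixed reporting conventions and then calibrate the result against curves such as IntCal20, for the reason explained next.

```python
# Sketch of the decay arithmetic behind a raw (uncalibrated) radiocarbon age.
# Uses the ~5,730-year half-life of 14C; actual laboratories follow fixed
# reporting conventions and then calibrate against curves such as IntCal20.
import math

HALF_LIFE_14C_YEARS = 5730.0

def uncalibrated_age(ratio_remaining: float) -> float:
    """Years since death, given the fraction of the original 14C/12C ratio left."""
    decay_constant = math.log(2) / HALF_LIFE_14C_YEARS
    return -math.log(ratio_remaining) / decay_constant

# An object retaining half of its original 14C is roughly one half-life old...
print(round(uncalibrated_age(0.5)))    # ~5730 years
# ...while one retaining only 1% of its 14C died roughly 38,000 years ago.
print(round(uncalibrated_age(0.01)))   # ~38069 years
```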

If the level of atmospheric 14C were constant, this would be easy. However, it has fluctuated significantly throughout history. In order to date organisms precisely, scientists need a reliable historical record of its variation to accurately transform 14C measurements into calendar ages. The new IntCal curves provide this link.

The curves are created by collecting measurements from a huge number of archives that store past radiocarbon but can also be dated using another method. Such archives include tree rings from up to 14,000 years ago, stalagmites found in caves, corals from the sea, and cores drilled from lake and ocean sediments. In total, the new curves were based upon almost 15,000 measurements of radiocarbon taken from objects as old as 60,000 years.

Alex Bayliss, Head of Scientific Dating at Historic England, said: “Accurate and high-precision radiocarbon dating underpins the public’s enjoyment of the historic environment and enables better preservation and protection.

“The new curves have internationally important implications for archaeological methodology, and for practices in conservation and understanding of wooden built heritage.”

Darrell Kaufman of the IPCC said: “The IntCal series of curves are critical for providing a perspective on past climate which is essential for our understanding of the climate system, and a baseline for modelling future changes.”

Categories
ScienceDaily

Study predicts millions of unsellable homes could upend market

Millions of American homes could become unsellable — or could be sold at significant losses to their senior-citizen owners — between now and 2040, according to new research from the University of Arizona.

The study predicts that many baby boomers and members of Generation X will struggle to sell their homes as they become empty nesters and singles. The problem is that millions of millennials and members of Generation Z may not be able to afford those homes, or they may not want them, opting for smaller homes in walkable communities instead of distant suburbs.

Baby boomers are people born between 1946 and 1964, while Gen Xers were born between 1965 and 1980. Millennials were born between 1981 and 1997 and Gen Zers between 1998 and 2015.

The study predicts that the change in home-buying behaviors by younger generations may result in a glut of homes that could grow as high as 15 million by 2040, with homeowners selling for far below what they paid — if they can sell them at all. Most seniors will be able to sell their homes, the study says, but it may become especially difficult in smaller, distant and slow- or non-growing markets.

Arthur C. Nelson, a professor of urban planning and real estate development at the UArizona College of Architecture, Planning and Landscape Architecture, calls his prediction “The Great Senior Short Sale” in a paper published this week in the Journal of Comparative Urban Land and Policy.

An expert in urban studies, public policy and land development, Nelson has spent a large part of his career studying the changing demand for suburban homes, since long before the housing market crash of the Great Recession.

His newest prediction, if it plays out, would undermine one of the “big promises” of homeownership for millions of seniors, Nelson said: that a home, after it’s paid off, can be sold for a retirement nest egg.

“What if you pay off your mortgage over 30 years,” he added, “and nobody buys the home?”

The Mismatch in the Market

Nelson’s prediction comes from synthesizing data from sources such as the U.S. Census Bureau and the Harvard Joint Center for Housing Studies. The Harvard center is a leading source of data for those in academia, government and business to make sense of housing issues to inform policy decisions.

Nelson, using those data, mapped how the ages of homeowners would change between 2018 and 2038. Looking at three age groups — over 65, 35-64 and under 35 — he came to the projection at the center of the study: that there may be fewer homeowners under 65 in 2038 than there were in 2018, even though the vast majority of people over 65 in 2038 will own their homes.

“There’s the mismatch — if those over 65 unload their homes, and those under 65 aren’t buying them, what happens to those homes?” he asks.

Nelson is careful not to overstate his findings; millions of people will buy the homes that older generations are selling, he said.

“But the vast supply is so large and the demand for them is going to be so small, in comparison, that there’s going to be a real problem starting later this decade,” he said.

Nelson said he expects the phenomenon to reveal itself not all at once, but gradually over the next couple decades, at about 500,000 to 1 million homes every year. It’s not likely to have much impact in growing metropolitan areas such as Phoenix or Dallas where “growth will solve all kinds of problems,” he said, but it will matter in thousands of suburban and rural areas — including some parts of Arizona.

“The people who own homes now in thousands of declining communities may simply have to walk away from them,” he said.

Proposed Policy Solutions

Nelson’s study urges action from lawmakers, and he offers some ideas of his own.

Among those is a program in which the federal government would buy back homes that have become, or may become, unsellable. The Federal Emergency Management Agency already does something similar with homes that have been, or are likely to be, damaged by natural disasters.

By bearing the cost of buying those homes, Nelson said, the government could help seniors avoid turning to federal social support programs after losing their homes. Those programs are costly to taxpayers, and the cost is even greater when programs need to be administered in rural or suburban areas — where homes are predicted not to sell, Nelson said.

“If you have millions of seniors spread all across the landscape costing a fortune to serve, we might be better off finding ways to induce many to sell their homes,” he said. “And we could actually then save potentially billions in public money that would otherwise be used to serve people in very distant and remote locations.”

Nelson also proposes programs at the state level that would allow younger people to live with older empty nesters, single people and others who live in homes larger than they may need, but who do not want to move.

By sharing homes, Nelson said, older people would not have to sell them, and younger housemates could act as caregivers and property managers.

The idea is already being tested in cities such as Minneapolis and Seattle and across the state of Oregon, Nelson said. There, laws were passed last year that allowed owners of single-family homes to divide them into multiple units.

Nelson completed his study just before the coronavirus outbreak became widespread. But the pandemic, he said, doesn’t make the housing issue any less urgent.

“We’re going to wake up in 2025 — give or take a few years — to realize that millions of seniors can’t get out of their homes and that it’s going to get worse into the 2030s,” he said. “We must start doing things now to reduce the coming shock of too many seniors trying to sell their homes to too few younger buyers.”

Categories
3D Printing Industry

INTAMSYS and Victrex partner to propel high-performance PAEK filament

FFF 3D printer manufacturer INTAMSYS has become the world’s first global reseller of polymer specialist Victrex’s new polyaryletherketone (PAEK) filament – VICTREX AM 200. The collaboration also marks INTAMSYS as the first company in Victrex’s proposed filament fusion network, which aims to facilitate and encourage the use of AM 200 and any future PAEK-based filaments […]

Author: Kubi Sertoglu