Modern theory from ancient impacts

Around 4 billion years ago, the solar system was far less hospitable than we find it now. Many of the large bodies we know and love were present, but probably looked considerably different, especially the Earth. We know from a range of sources, including ancient meteorites and planetary geology, that around this time there were vastly more collisions between, and impacts from, asteroids originating in the Mars-Jupiter asteroid belt.

Knowledge of these events is especially important to us, as the time period in question is not only when the surface of our planet was taking on a more recognizable form, but also when life was just getting started. More accurate details of Earth’s rocky history could help researchers answer some long-standing questions about the mechanisms responsible for life, as well as inform other areas of life science.

“Meteorites provide us with the earliest history of ourselves,” said Professor Yuji Sano from the Atmosphere and Ocean Research Institute at the University of Tokyo. “This is what fascinated me about them. By studying properties, such as radioactive decay products, of meteorites that fell to Earth, we can deduce when they came and where they came from. For this study we examined meteorites that came from Vesta, the second-largest asteroid after the dwarf planet Ceres.”

Sano and his team found evidence that Vesta was hit by multiple impacting bodies around 4.4 billion to 4.15 billion years ago. This is earlier than 3.9 billion years ago, which is when the late heavy bombardment (LHB) is thought to have occurred. Current evidence for the LHB comes from lunar rocks collected during the Apollo moon missions of the 1970s, as well as other sources. But these new studies are improving upon previous models and will pave the way for an up-to-date database of early solar impact records.

“That Vesta-origin meteorites clearly show us impacts earlier than the LHB raises the question, ‘Did the late heavy bombardment truly occur?'” said Sano. “It seems to us that early solar system impacts peaked sooner than the LHB and reduced smoothly with time. It may not have been the cataclysmic period of chaos that current models describe.”

Story Source:

Materials provided by University of Tokyo. Note: Content may be edited for style and length.

Go to Source


Going small for big solutions: Sub-nanoparticle catalysts made from coinage elements as effective catalysts

Due to their small size, nanoparticles find varied applications in fields ranging from medicine to electronics. Their small size allows them a high reactivity and semiconducting property not found in the bulk states. Sub-nanoparticles (SNPs) have an extremely small diameter of around 1 nm, making them even smaller than nanoparticles. Almost all atoms of SNPs are available and exposed for reactions, and therefore, SNPs are expected to have extraordinary functions beyond the properties of nanoparticles, particularly as catalysts for industrial reactions. However, preparation of SNPs requires fine control of the size and composition of each particle on a sub-nanometer scale, making the application of conventional production methods near impossible.

To overcome this, researchers at the Tokyo Institute of Technology led by Dr. Takamasa Tsukamoto and Prof. Kimihisa Yamamoto previously developed the atom hybridization method (AHM), which surpasses previous attempts at SNP synthesis. Using this technique, it is possible to precisely control and diversely design the size and composition of SNPs using a “macromolecular template” called a phenylazomethine dendrimer. This gives the SNPs higher catalytic activity than nanoparticle catalysts.

Now, in their latest study published in Angewandte Chemie International Edition, the team has taken their research one step further and has investigated the chemical reactivity of alloy SNPs obtained through the AHM. “We created monometallic, bimetallic, and trimetallic SNPs (containing one, combination of two, and combination of three metals respectively), all composed of coinage metal elements (copper, silver, and gold), and tested each to see how good of a catalyst each of them is,” reports Dr Tsukamoto. 

Unlike the corresponding nanoparticles, the SNPs created were found to be stable and more effective. Moreover, the SNPs showed high catalytic performance even under milder conditions, in direct contrast to conventional catalysts. Monometallic, bimetallic, and trimetallic SNPs formed different products, and the hybridization or combination of metals showed a higher turnover frequency (TOF). The trimetallic combination “Au4Ag8Cu16” showed the highest TOF because each metal element plays a unique role, and these effects work in concert to produce high catalytic activity.

Furthermore, the SNPs selectively created hydroperoxide, a high-energy compound that normally cannot be obtained because of its instability. The mild reaction conditions made possible by SNP catalysts allowed the stable formation of hydroperoxide by suppressing its decomposition.

When asked about the relevance of these findings, Prof Yamamoto states: “We demonstrate for the first time ever, that olefin hydroperoxygenation can be catalyzed under extremely mild conditions using metal particles in the quantum size range. The reactivity was significantly improved in the alloyed systems, especially for the trimetallic combinations, which had not been studied previously.”

The team emphasized that, because of the extreme miniaturization of the structures and the hybridization of different elements, the coinage metals acquired reactivity high enough to catalyze the oxidation even under mild conditions. These findings will prove to be a pioneering key in the discovery of innovative sub-nanomaterials from a wide variety of elements and could help address energy and environmental problems in the years to come.

Story Source:

Materials provided by Tokyo Institute of Technology. Note: Content may be edited for style and length.

Go to Source


Scientists propose plan to determine if Planet Nine is a primordial black hole

Scientists at Harvard University and the Black Hole Initiative (BHI) have developed a new method to find black holes in the outer solar system, and along with it, determine once and for all the true nature of the hypothesized Planet Nine. The paper, accepted to The Astrophysical Journal Letters, highlights the ability of the future Legacy Survey of Space and Time (LSST) mission to observe accretion flares, the presence of which could prove or rule out Planet Nine as a black hole.

Dr. Avi Loeb, Frank B. Baird Jr. Professor of Science at Harvard, and Amir Siraj, a Harvard undergraduate student, developed the new method to search for black holes in the outer solar system based on flares that result from the disruption of intercepted comets. The study suggests that the LSST has the capability to find black holes by observing accretion flares resulting from the impact of small Oort cloud objects.

“In the vicinity of a black hole, small bodies that approach it will melt as a result of heating from the background accretion of gas from the interstellar medium onto the black hole,” said Siraj. “Once they melt, the small bodies are subject to tidal disruption by the black hole, followed by accretion from the tidally disrupted body onto the black hole.” Loeb added, “Because black holes are intrinsically dark, the radiation that matter emits on its way to the mouth of the black hole is our only way to illuminate this dark environment.”

Future searches for primordial black holes could be informed by the new calculation. “This method can detect or rule out trapped planet-mass black holes out to the edge of the Oort cloud, or about a hundred thousand astronomical units,” said Siraj. “It could be capable of placing new limits on the fraction of dark matter contained in primordial black holes.”

The upcoming LSST is expected to have the sensitivity required to detect accretion flares, while current technology isn’t able to do so without guidance. “LSST has a wide field of view, covering the entire sky again and again, and searching for transient flares,” said Loeb. “Other telescopes are good at pointing at a known target but we do not know exactly where to look for Planet Nine. We only know the broad region in which it may reside.” Siraj added, “LSST’s ability to survey the sky twice per week is extremely valuable. In addition, its unprecedented depth will allow for the detection of flares resulting from relatively small impactors, which are more frequent than large ones.”

The new paper focuses on the famed Planet Nine as a prime first candidate for detection. The subject of much speculation, most theories suggest that Planet Nine is a previously undetected planet, but it may also flag the existence of a planet-mass black hole.

“Planet Nine is a compelling explanation for the observed clustering of some objects beyond the orbit of Neptune. If the existence of Planet Nine is confirmed through a direct electromagnetic search, it will be the first detection of a new planet in the solar system in two centuries, not counting Pluto,” said Siraj, adding that a failure to detect light from Planet Nine — or other recent proposals, such as sending probes to measure its gravitational influence — would make the black hole model intriguing. “There has been a great deal of speculation concerning alternative explanations for the anomalous orbits observed in the outer solar system. One of the ideas put forth was the possibility that Planet Nine could be a grapefruit-sized black hole with a mass of five to ten times that of the Earth.”

The focus on Planet Nine is based both on the unprecedented scientific significance that a hypothetical discovery of a planet-mass black hole in the solar system would hold and on the continued interest in understanding what’s out there. “The outskirts of the solar system are our backyard. Finding Planet Nine is like discovering a cousin living in the shed behind your home which you never knew about,” said Loeb. “It immediately raises questions: why is it there? How did it obtain its properties? Did it shape the solar system history? Are there more like it?”

The research was funded in part by a grant from the Breakthrough Prize Foundation, and by Harvard’s Black Hole Initiative (BHI), which is funded by grants from the John Templeton Foundation (JTF) and the Gordon and Betty Moore Foundation (GBMF).

Go to Source


How to Secure Your Rails API Without Being a Security Expert

Ruby on Rails is such a sweet tool to use. As a full-stack developer, I find it my framework of choice whenever I need to build a prototype/minimum-viable product. Its ease of use and quick setup allow me to move forward quickly into building the front-end of my projects.

This convenience can cause a developer to overlook the security aspects of their code. I know I’m guilty of it a lot of times. Even if one is accustomed to Test-Driven-Development, it doesn’t guarantee that the code they write is secure.

Personally, I never really paid much attention to security whenever I built apps. The reason is that I focus on building quick prototypes and minimally viable products (MVPs). But that’s not an excuse to write sloppy code that can easily open up users to attacks.

This is why, at the very least, as engineers, we have to be aware and be able to fix the most common vulnerabilities that we may accidentally put in our projects.

Open-source databases such as WhiteSource’s Vulnerability Lab provide you with updated information on most vulnerabilities.

You can learn about each vulnerability in greater detail including its severity and what fixes are available. Should you also discover new vulnerabilities, you can reach out to its maintainers for them to confirm and add it to their database.

In this article I’m going to run down the three most common vulnerabilities one may encounter in a Ruby on Rails API only app. However, some of the concepts presented apply regardless of which platform you’re using and you should investigate what similar tooling is available for your platform if you’re not on Rails.

I limit the topic to JSON-API projects since I personally would use Rails to only build APIs (and use React or any other front-end framework to build the client-side).

Session Hijacking

In a Rails API app, a common way to do authentication is by the generation of a JSON-Web Token (JWT). Every time a `POST` request is made to the authentication endpoint (usually `/sessions`), the Rails app can generate the JWT from the current user’s ID and an expiration timestamp (usually 24 hours).

An example of how to generate this JWT using Ruby’s `jwt` gem:

def self.encode(payload, exp = 24.hours.from_now)
   payload[:exp] = exp.to_i
   JWT.encode(payload, SECRET_KEY)
end

The JWT will be sent as part of the response to the front-end app.

For authenticated requests (such as getting a list of all the current user’s friends in a social networking application), the Rails app should check the request’s headers for the presence of this JWT.

If the request doesn’t contain any JWT or an expired JWT, then the API will simply respond with a 401 (Unauthorized) error.
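To make the encode/decode cycle concrete, here is a minimal sketch of what the `jwt` gem does under the hood, using only Ruby’s standard library. The `SECRET_KEY` value is a hypothetical placeholder (in Rails you would derive it from `Rails.application.secret_key_base`), and the hand-rolled token format here is for illustration only — use the real gem in production:

```ruby
require 'openssl'
require 'base64'
require 'json'

# Hypothetical secret; in Rails, use Rails.application.secret_key_base.
SECRET_KEY = 'change-me'

# Sign the payload with HMAC-SHA256, embedding an :exp timestamp.
def encode(payload, exp = Time.now.to_i + 24 * 60 * 60)
  body = Base64.strict_encode64(JSON.generate(payload.merge('exp' => exp)))
  sig  = OpenSSL::HMAC.hexdigest('SHA256', SECRET_KEY, body)
  "#{body}.#{sig}"
end

# Verify the signature, then reject the token if it has expired.
def decode(token)
  body, sig = token.split('.')
  expected = OpenSSL::HMAC.hexdigest('SHA256', SECRET_KEY, body)
  # NOTE: production code should use a constant-time comparison here.
  raise 'invalid signature' unless sig == expected
  payload = JSON.parse(Base64.strict_decode64(body))
  raise 'token expired' if payload['exp'] < Time.now.to_i
  payload
end

token = encode({ 'id' => 42 })
puts decode(token)['id']  # 42
```

The expiry check in `decode` is exactly what produces the 401 described above: a tampered signature or a stale `exp` raises, and the controller translates that into an unauthorized response.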

The issue here is that assuming the JWT hasn’t expired yet, a hacker may be able to steal the JWT and access the API as an authenticated user. API authentication tokens have been exploited like this in several real-world attacks.

The simplest measure against this is making the JWT expire sooner. So instead of 24 hours, the JWT can be made to expire in 1 hour.

The good news is that the only ways for a hacker to steal this token are the user accidentally (or foolishly) sending the token over the web, or the attacker gaining access to the machine the token is stored on (on the client side, an auth token is most commonly saved in localStorage).

Still, we want to cover as much ground as possible in regards to the security of any application we build.

So, an alternative but more cumbersome way to store our auth token is for the Rails API to save each JWT generated into a database. Here’s how that logic goes:

  1. An additional database table (and model) called `tokens` will be generated
        $ rails g model token value:string
  2. Every time a successful login has been made, the Rails API will generate a new JWT and save it as a record in the `tokens` table
        def self.encode(payload, exp = 24.hours.from_now)
           payload[:exp] = exp.to_i
           token = JWT.encode(payload, SECRET_KEY)
           Token.create(value: token)
           token
        end
  3. This same JWT will be sent to the requester as part of the response.
  4. For requests that are only allowed for authenticated users, the Rails API will check for the existence of a JWT in the headers. Automatically the response will be a 401 (Unauthorized) if the headers don’t contain any.
  5. If the JWT is present in the headers, the first thing that the Rails API will do is check for its existence in the `tokens` database table. If it cannot be found then a 401 will be sent as response.
  6. If the JWT exists in the `tokens` table, then the Rails API will try to decode this (using a custom method). If it’s invalid (i.e, expired) then again a 401 response will be sent.
  7. If the JWT is valid, then a success response will be sent along with the data requested (or appropriate actions will be taken in the API).

We can write a method in the `application_controller.rb` file to check for the existence of the token in the headers, then in the database, and subsequently for its validity.

Using Ruby’s `pundit` gem:

def pundit_user
   header = request.headers['Authorization']
   header = header.split(' ').last if header
   # Reject tokens that were never issued (or have been revoked).
   raise JWT::DecodeError unless Token.find_by(value: header)
   decoded = JsonWebToken.decode(header)
   User.find(decoded['id'])
rescue ActiveRecord::RecordNotFound
   render json: { message: 'You are not authorized' }, status: :unauthorized
rescue JWT::DecodeError
   render json: { message: 'Unauthorized access' }, status: :unauthorized
end

The only drawback for this solution is that every single login of the user will add a new record into the `tokens` table. For the lifetime of each user, this may mean thousands to millions of records depending on the frequency of login.

A possible fix is an automated database cleanup that periodically purges expired tokens, though the details of that are beyond the scope of this article.
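Such a cleanup can be sketched in plain Ruby with a hypothetical in-memory stand-in for the `tokens` table; in a real Rails app this would be a scheduled job running something like `Token.where("created_at < ?", 24.hours.ago).destroy_all`:

```ruby
# Matches the 24-hour expiry used in the encode example above.
TOKEN_TTL = 24 * 60 * 60 # seconds

# Keep only token records that are still inside their time-to-live.
def purge_expired(tokens, now = Time.now)
  tokens.select { |t| now - t[:created_at] < TOKEN_TTL }
end

tokens = [
  { value: 'fresh', created_at: Time.now },
  { value: 'stale', created_at: Time.now - 25 * 60 * 60 }
]
puts purge_expired(tokens).map { |t| t[:value] }.inspect # ["fresh"]
```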

However, the benefit of this approach is that we can create a method that allows us to log out all devices by destroying all records in the `tokens` table:

def logout_all
   Token.destroy_all
end

As a final note, there is still of course a very real threat that hackers can get inside the database and steal the tokens. The best way to ensure that the JWTs are secured in a way that makes them useless to hackers is by using digital signatures.

SQL Injection

This particular security vulnerability doesn’t only affect Rails apps. Every web (or mobile) application that runs SQL queries over the internet is susceptible to it.

In simple terms, an SQL injection attack happens whenever a malicious user manipulates request parameters in order to access database content.

For example, let’s say we have a database table called `users` (and a corresponding `User` model). Let’s also say that the way we coded a query to get a particular user’s data is as follows:

User.where("first_name = '#{params[:first_name]}'")

So a query string such as `?first_name=Michael` in the URL will return the users named “Michael”.


However, a malicious user can simply “inject” a crafted string value into this query string in order to extend the SQL statement in the first line. Doing so could allow them to access all users’ records and do whatever they want with that data.
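To make the danger concrete, here is a hypothetical parameter value an attacker might send, and the SQL produced when it is interpolated as in the vulnerable query above:

```ruby
# What a malicious client might send as the first_name parameter:
malicious = "Michael' OR '1'='1"

# The string the vulnerable query builds and hands to the database as-is:
unsafe_sql = "SELECT * FROM users WHERE first_name = '#{malicious}'"
puts unsafe_sql
# SELECT * FROM users WHERE first_name = 'Michael' OR '1'='1'
```

The injected `OR '1'='1'` condition is always true, so the query matches every row in `users` instead of just Michael’s.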

Due to its prevalence, one would think that Ruby on Rails would safeguard against this out of the box. Unfortunately, raw string conditions like the one above receive no such protection.

What’s great though is that we can secure our Rails APIs against this vulnerability through simple tweaks in our ActiveRecord queries.

We can then revise our SQL query above to make it more secure:

User.where("first_name = ?", params[:first_name])

Aside from the obvious syntactical difference, how else is this newly formed SQL query different from the pure-string one above?

In the pure string query, which is unsafe, every single bit is being passed into the database as-is.

So a hacker might pass an extended SQL query string, as previously mentioned, to get a list of all user data so they could either sell or delete (whatever they fancy).

In the revised SQL query, the second argument is passed as a bind parameter: Rails “escapes” (sanitizes) the value before it ever reaches the database. This blocks any malicious attempt to reach the database through query strings.

Authorization and Access Vulnerabilities

Authenticating a user is simply checking whether the user that is making a request is logged in. Authorization is providing another layer of data access protection.

Authorizing a user means providing only a certain level of access to features depending on what category of user is making such requests.

For instance, message exchanges between two users should be made private between them. Therefore, an authorization system should be in place that will check whether an app user trying to access conversation data is a participant in that conversation.

I listed this vulnerability last because it can be caused more by architectural decisions than code. There are certain best practices that we can employ to ensure our API has a high-quality authorization system. But probably the simplest architectural measure is providing the least access by default.

For example, the only default access all users should have is their own data.

As the user is given additional privileges (i.e when they become administrators) that’s the time they are also provided access to additional information or features.

Using the `pundit` gem

Fortunately, as Rails developers we can use `pundit`. Pundit allows us to easily create “policies” that restrict the kinds of requests users can make depending on certain model attributes.

For more advanced policies the developer can simply create new classes. The resulting policies are also very easily tested.
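As an illustration, a Pundit-style policy for the conversation example above can be written as a plain Ruby class. The `User` and `Conversation` structs here are hypothetical stand-ins for real Active Record models, and the policy class follows Pundit’s naming conventions without requiring the gem itself:

```ruby
# Hypothetical stand-ins for real models.
User         = Struct.new(:id, :admin)
Conversation = Struct.new(:participant_ids)

# Pundit convention: <Model>Policy, initialized with the user and the record.
class ConversationPolicy
  def initialize(user, conversation)
    @user = user
    @conversation = conversation
  end

  # Only participants (or admins) may read a conversation.
  def show?
    @user.admin || @conversation.participant_ids.include?(@user.id)
  end
end

alice = User.new(1, false)
eve   = User.new(2, false)
chat  = Conversation.new([1, 3])

puts ConversationPolicy.new(alice, chat).show?  # true
puts ConversationPolicy.new(eve, chat).show?    # false
```

Because each policy is an ordinary class, unit-testing a rule is just instantiating it with a user and a record and asserting on the predicate.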


There are a thousand kinds of vulnerabilities any project can have. All three of the ones I listed here are easily detected. But there are hundreds more that come from the dependencies we install into our projects.

Luckily for us, if we’re pushing our code to GitHub we will get notices should any vulnerability be detected in any of our code’s dependencies.

What’s even more fascinating is that GitHub often already provides us with the solution.

The main message I wish to impart in this article is for us developers to start becoming more security-oriented, from staying on top of GitHub security updates to learning about programming-language vulnerabilities.

The field of security may be an entirely new discipline altogether for most of us but we don’t have to be experts to be able to build secure apps.

Go to Source
Author: Anton-Lawrence


Experimentally identifying effective theories in many-body systems

One goal of science is to find physical descriptions of nature by studying how basic system components interact with one another. For complex many-body systems, effective theories are frequently used to this end. They allow describing the interactions without having to observe a system on the smallest of scales. Physicists at Heidelberg University have now developed a new method that makes it possible to identify such theories experimentally with the aid of so-called quantum simulators. The results of the research effort, led by Prof. Dr Markus Oberthaler (experimental physics) and Prof. Dr Jürgen Berges (theoretical physics), were published in the journal Nature Physics.

Deriving predictions about physical phenomena at the level of individual particles from a microscopic description is practically impossible for large systems. This applies not only to quantum mechanical many-body systems, but also to classical physics, such as when heated water in a cooking pot needs to be described at the level of the individual water molecules. But if a system is observed on large scales, like water waves in a pot, new properties can become relevant under certain preconditions. To describe such physics efficiently, effective theories are used. “Our research aimed to identify these theories in experiments with the help of quantum simulators,” explains Torsten Zache, the primary author of the theoretical portion of the study. Quantum simulators make it possible to model many-body systems more simply and to calculate their properties.

The Heidelberg physicists recently demonstrated their newly developed method in an experiment on ultracold rubidium atoms, which are captured in an optical trap and brought out of equilibrium. “In the scenario we prepared, the atoms behave like tiny magnets whose orientation we are able to precisely read out using new processes,” according to Maximilian Prüfer, the primary author on the experimental side of the study. To determine the effective interactions of these “magnets,” the experiment has to be repeated several thousand times, which requires extreme stability.

“The underlying theoretical concepts allow us to interpret the experimental results in a completely new way and thereby gain insights through experiments into areas that have thus far been inaccessible through theory,” points out Prof. Oberthaler. “In turn, this can tell us about new types of theoretical approaches to successfully describe the relevant physical laws in complex many-body systems,” states Prof. Berges. The approach used by the Heidelberg physicists is transferrable to a number of other systems, thus opening groundbreaking territory for quantum simulations. Jürgen Berges and Markus Oberthaler are confident that this new way of identifying effective theories will make it possible to answer fundamental questions in physics.

Story Source:

Materials provided by University of Heidelberg. Note: Content may be edited for style and length.

Go to Source

3D Printing Industry

How 3D printing is being used in the field of urology

Since its inception in the 80s, 3D printing has managed to find itself in more industries and fields than we can count. One such area is urology – the medical field concerned with the urinary-tract system. A recent literature review published in BJU International covers the latest developments and accomplishments of researchers employing 3D printing […]

Go to Source
Author: Kubi Sertoglu


Novel insight reveals topological tangle in unexpected corner of the universe

Scientists find a unique knotted structure — one that repeats itself throughout nature — in a ferroelectric nanoparticle, a material with promising applications in microelectronics and computing.

Just as a literature buff might explore a novel for recurring themes, physicists and mathematicians search for repeating structures present throughout nature.

For example, a certain geometrical structure of knots, which scientists call a Hopfion, manifests itself in unexpected corners of the universe, ranging from particle physics, to biology, to cosmology. Like the Fibonacci spiral and the golden ratio, the Hopfion pattern unites different scientific fields, and deeper understanding of its structure and influence will help scientists to develop transformative technologies.

In a recent theoretical study, scientists from the U.S. Department of Energy’s (DOE) Argonne National Laboratory, in collaboration with the University of Picardie in France and the Southern Federal University in Russia, discovered the presence of the Hopfion structure in nano-sized particles of ferroelectrics — materials with promising applications in microelectronics and computing.

The identification of the Hopfion structure in the nanoparticles contributes to a striking pattern in the architecture of nature across different scales, and the new insight could inform models of ferroelectric materials for technological development.

Ferroelectric materials have the unique ability to flip the direction of their internal electric polarization — the slight, relative shift of positive and negative charge in opposite directions — when influenced by electric fields. Ferroelectrics can even expand or contract in the presence of an electric field, making them useful for technologies where energy is converted between mechanical and electrical.

In this study, the scientists harnessed fundamental topological concepts with novel computer simulations to investigate the small-scale behavior of ferroelectric nanoparticles. They discovered that the polarization of the nanoparticles takes on the knotted Hopfion structure present in seemingly disparate realms of the universe.

“The polarization lines intertwining themselves into a Hopfion structure may give rise to the material’s useful electronic properties, opening new routes for the design of ferroelectric-based energy storage devices and information systems,” said Valerii Vinokur, senior scientist and Distinguished Fellow in Argonne’s Materials Science division. “The discovery also highlights a repeated tendency in many areas of science.”

What (and where) in the world are Hopfions?

Topology, a subfield of mathematics, is the study of geometric structures and their properties. A Hopfion topological structure, first proposed by Austrian mathematician Heinz Hopf in 1931, emerges in a wide range of physical constructs but is rarely explored in mainstream science. One of its defining characteristics is that any two lines within the Hopfion structure must be linked, constituting knots ranging in complexity from a few interconnected rings to a mathematical rat’s nest.

“The Hopfion is a very abstract mathematical concept,” said Vinokur, “but the structure shows up in hydrodynamics, electrodynamics and even in the packing of DNA and RNA molecules in biological systems and viruses.”

In hydrodynamics, the Hopfion appears in the trajectories of liquid particles flowing inside of a sphere. With friction neglected, the paths of the incompressible liquid particles are intertwined and connected. Cosmological theories also reflect Hopfion patterns. Some hypotheses suggest that the paths of every particle in the universe interweave themselves in the same Hopfion manner as the liquid particles in a sphere.

According to the current study, the polarization structure in a spherical ferroelectric nanoparticle takes on this same knotted swirl.

Simulating the swirl

The scientists created a computational approach that tamed polarization lines and enabled them to recognize the emerging Hopfion structures in a ferroelectric nanoparticle. The simulations, performed by researcher Yuri Tikhonov from the Southern Federal University and the University of Picardie, modeled the polarization within nanoparticles between 50 to 100 nanometers in diameter, a realistic size for ferroelectric nanoparticles in technological applications.

“When we visualized the polarization, we saw the Hopfion structure emerge,” said Igor Luk’yanchuck, a scientist from the University of Picardie. “We thought, wow, there is a whole world inside of these nanoparticles.”

Click here for related video, “Simulation of Hopfion structure in ferroelectric nanoparticle” by Yuri Tikhonov, University of Picardie and Russia’s Southern Federal University, and Anna Razumnaya, Southern Federal University, revealing the Hopfion structure of polarization lines within a ferroelectric nanoparticle

The polarization lines revealed by the simulation represent the directions of displacements between charges within atoms as they vary around the nanoparticle in a way that maximizes energy efficiency. Because the nanoparticle is confined to a sphere, the lines travel around it indefinitely, never terminating on — or escaping from — the surface. This behavior is parallel to the flow of an ideal fluid about a closed, spherical container.

The link between liquid flow and the electrodynamics displayed in these nanoparticles bolsters a long-theorized parallelism. “When Maxwell developed his famous equations to describe the behavior of electromagnetic waves, he used the analogy between hydrodynamics and electrodynamics,” said Vinokur. “Scientists have since hinted at this relationship, but we demonstrated that there is a real, quantifiable connection between these concepts that is characterized by the Hopfion structure.”

The study’s findings establish the fundamental importance of Hopfions to the electromagnetic behavior of ferroelectric nanoparticles. The new insight could result in increased control of the advanced functionalities of these materials — such as their supercapacitance — for technological applications.

“Scientists often view properties of ferroelectrics as separate concepts that are highly dependent on chemical composition and treatment,” said Luk’yanchuck, “but this discovery may help describe many of these phenomena in a unifying, general way.”

Another possible technological advantage of these small-scale topological structures is in memory for advanced computing. Scientists are exploring the potential of ferroelectric materials for computational systems. Traditionally, the switchable polarization of these materials could enable them to store information in two separate states, generally referred to as 0 and 1. However, microelectronics made of ferroelectric nanoparticles might be able to leverage their Hopfion-shaped polarization to store information in more complex ways.

“Within one nanoparticle, you may be able to write much more information because of these topological phenomena,” said Luk’yanchuck. “Our theoretical discovery could be a groundbreaking step in the development of future neuromorphic computers that store information more organically, like the synapses in our brains.”
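The capacity gain described above can be quantified with elementary information theory: a cell with N distinguishable states stores log2(N) bits. A minimal sketch, in which the number of distinguishable Hopfion configurations per nanoparticle (16 here) is purely an illustrative assumption, not a figure from the study:

```python
import math

# Conventional ferroelectric memory: two polarization states per cell (0 and 1).
binary_states = 2

# Hypothetical multi-state cell: suppose a Hopfion-bearing nanoparticle could be
# read out in N distinguishable configurations (N is illustrative only).
hopfion_states = 16

# Information capacity per cell, in bits.
bits_per_cell_binary = math.log2(binary_states)
bits_per_cell_hopfion = math.log2(hopfion_states)

print(bits_per_cell_binary, bits_per_cell_hopfion)  # 1.0 4.0
```

The point of the sketch is only that capacity grows logarithmically with the number of reliably distinguishable states; how many such states a real Hopfion texture would provide remains an open experimental question.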

Future plans

To perform deeper studies into the topological phenomena within ferroelectrics, the scientists plan to leverage Argonne’s supercomputing capabilities. The scientists also plan to test the theoretical presence of Hopfions in ferroelectric nanoparticles using Argonne’s Advanced Photon Source (APS), a DOE Office of Science User Facility.

“We view these results as a first step,” said Vinokur. “Our intention is to study the electromagnetic behavior of these particles while considering the existence of Hopfions, as well as to confirm and explore its implications. For such small particles, this work can only be performed using a synchrotron, so we are fortunate to be able to use Argonne’s APS.”

Go to Source


Investigating the dynamics of stability

The quest to find viable alternatives to fossil fuel in energy production has experienced a recent revolution as scientists search for materials that do not require precious metals to produce active and stable reactions.

Central to many of these reactions is the oxygen evolution reaction (OER), an important electrochemical part of water-splitting in electrolyzers to produce hydrogen that can power fuel cells.

Scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory used a combination of high-precision materials science and electrochemistry to provide important insight into the mechanisms that drive stability and activity of materials during the OER. This insight will guide the practical design of materials for electrochemical fuel production.

“Our explanation removes some of the fog surrounding the effects of impurities on stability of a material at both an atomic scale and a macro scale,” said Argonne Distinguished Fellow Nenad Markovic, a chemist in the lab’s Materials Science division.

The scientists studied an electrolyzer material, called a hydr(oxy)oxide, to discover that, although electrolyzers can behave as if they are wholly stable, on an atomic scale the systems are extremely dynamic. Iron atoms present in the electrode repeatedly fall away and reattach to the interface, or the surface on which the important, oxygen-producing reactions take place. This careful balance between dissolution and redeposition allows for the overall stability of the material.

“Traditionally, scientists measure how long an electrolyzer can produce oxygen, and they use that to determine stability,” said Argonne postdoctoral scientist Dongyoung Jung, first author on the study. “We decoupled the overall stability of the material on a macro scale from the stability of the material on the atomic scale, which will help us to understand and develop new materials.”

The scientists developed ultrasensitive electrochemical measurement tools to monitor the iron activity in situ during the OER and to test the system with various levels of impurities to see what variables affect the overall stability of the material. The behavior of the iron at the interface is responsible for how well the material can produce oxygen in the OER process.

“By measuring the iron content in the electrode and the electrolyte with ultrahigh sensitivity, we found unexpected discrepancies that point to a dynamic stability of the iron in the system,” said Pietro Lopes, an Argonne assistant scientist on the study.

The dynamic stability in the material — characterized by stable behavior at the macroscopic level despite high activity at the atomic level — is not necessarily a bad thing for electrolyzers. The scientists hope to take advantage of their new understanding of this phenomenon to create materials with better performance.

“Once we identify the role of iron and how its movement affects the oxygen evolution process, we can modify materials to take advantage of dynamic stability, ensuring that iron is always present at the interface, boosting oxygen production,” said Lopes.

“We are addressing a major misconception in the field,” said Vojislav Stamenkovic, Energy Conversion and Storage group leader in Argonne’s Materials Science division. “The profound implications of the decoupling of virtual stability and true stability will extend the design rules for producing active and stable interfaces.”

This research was funded by the DOE’s Office of Basic Energy Sciences. In situ X-ray analysis for the study was conducted at Argonne’s Advanced Photon Source (APS), and density functional theory (DFT) calculations were performed using computational facilities at Argonne’s Center for Nanoscale Materials (CNM). Both APS and CNM are DOE Office of Science User Facilities.

Story Source:

Materials provided by DOE/Argonne National Laboratory. Original written by Savannah Mitchem. Note: Content may be edited for style and length.



Could dark matter be hiding in existing data?

Dark matter has so far defied every type of detector designed to find it. Because of its huge gravitational footprint in space, we know dark matter must make up about 85 percent of the total mass of the universe, but we don’t yet know what it’s made of.

Several large experiments that hunt for dark matter have searched for signs of dark matter particles knocking into atomic nuclei via a process known as scattering, which can produce tiny flashes of light and other signals in these interactions.

Now a new study, led by researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley, suggests new paths for catching the signals of dark matter particles that have their energy absorbed by these nuclei.

The absorption process could give an affected atom a kick that causes it to eject a lighter, energized particle such as an electron, and it might produce other types of signals, too, depending on the nature of the dark matter particle.

The study focuses mostly on those cases where an electron or neutrino is ejected as the dark matter particle strikes an atom’s nucleus.

Published May 4 in Physical Review Letters, the study proposes that some existing experiments, including ones that search for dark matter particles and processes related to neutrinos — ghostly, detectable particles that can pass through most matter and have the ability to change into different forms — can easily be broadened to also look for these absorption-related types of telltale dark matter signals.

Also, the researchers propose that new searches in previously collected particle detector data could possibly turn up these overlooked dark matter signals.

“In this field, we’ve had a certain idea in mind about well-motivated candidates for dark matter, such as the WIMP,” or weakly interacting massive particle, said Jeff Dror, the lead author of the study who is a postdoctoral researcher in Berkeley Lab’s Theory Group and UC Berkeley’s Berkeley Center for Theoretical Physics.

Dark matter pushes at the boundaries of the known fundamental laws of physics, encapsulated in the Standard Model of particle physics, and “The WIMP paradigm is very easy to build into the Standard Model, but we haven’t found it for a long time,” Dror noted.

So, physicists are now considering other places that dark matter particles may be hiding, and other particle possibilities such as theorized “sterile neutrinos” that could also be brought into the family of particles known as fermions — which includes electrons, protons, and neutrinos.

“It’s easy, with small modifications to the WIMP paradigm, to accommodate a whole different type of signal,” Dror said. “You can make a huge amount of progress with very little cost if you step back a little bit in the way we’ve been thinking about dark matter.”

Robert McGehee, a UC Berkeley graduate student, and Gilly Elor of the University of Washington were study co-authors.

The researchers note that the range of new signals they are focusing on opens up an “ocean” of dark matter particle possibilities: namely as-yet-undiscovered fermions with masses lighter than the typical range considered for WIMPs. They could be close cousins of sterile neutrinos, for example.

The study team considered absorption processes known as “neutral current,” in which nuclei in the detector material recoil, or get jolted by their collision with dark matter particles, producing distinct energy signatures that can be picked up by the detector; and also those known as “charged current,” which can produce multiple signals as a dark matter particle strikes a nucleus, causing a recoil and the ejection of an electron.

The charged current process can also involve nuclear decay, in which other particles are ejected from a nucleus as a sort of domino effect triggered by the dark matter absorption.

Looking for the study’s suggested signatures of both the neutral current and charged current processes could open up “orders of magnitude of unexplored parameter space,” the researchers note. They focus on energy signals in the MeV range, meaning millions of electron volts. An electron volt is a measure of energy that physicists use to describe the masses of particles. Meanwhile, typical WIMP searches are now sensitive to particle interactions with energies in the keV range, or thousands of electron volts.
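The unit relationships above are simple powers of ten, which makes the gap between the two search regimes concrete. A minimal sketch (the electron-volt-to-joule factor is the exact value fixed by the 2019 SI redefinition):

```python
# 1 eV in joules (exact since the 2019 SI redefinition of the elementary charge).
EV_IN_JOULES = 1.602176634e-19

def ev_to_joules(ev):
    """Convert an energy in electron volts to joules."""
    return ev * EV_IN_JOULES

kev = 1e3  # 1 keV expressed in eV
mev = 1e6  # 1 MeV expressed in eV

# An MeV-scale signal carries a thousand times the energy of a keV-scale one.
print(mev / kev)           # 1000.0
print(ev_to_joules(mev))   # energy of a 1 MeV deposit, in joules
```

In other words, the absorption signals the study targets sit three orders of magnitude above the energies that typical WIMP scattering searches are tuned for.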

For the various particle interactions the researchers explored in the study, “You can predict what is the energy spectrum of the particle coming out or the nucleon that’s getting the ‘kick,'” Dror said. Nucleon refers to the positively charged proton or uncharged neutron that resides in an atom’s nucleus and that could absorb energy when struck by a dark matter particle. These absorption signals could possibly be more common than the other types of signals that dark matter detectors are typically designed to find, he added — we just don’t know yet.

Experiments that have large volumes of detector material, with high sensitivity and very low background “noise,” or unwanted interference from other types of particle signals, are particularly suited for this expanded search for different types of dark matter signals, Dror said.

LUX-ZEPLIN (LZ), for example, an ultrasensitive Berkeley Lab-led dark matter search project under construction in a former South Dakota mine, is a possible candidate as it will use about 10 metric tons of liquid xenon as its detector medium and is designed to be heavily shielded from other types of particle noise.

Already, the team of researchers participating in the study has worked with the team operating the Enriched Xenon Observatory (EXO), an underground experiment searching for a theorized process known as neutrino-less double beta decay using liquid xenon, to open up its search to these other types of dark matter signals.

And for similar types of experiments that are up and running, “The data is already basically sitting there. It’s just a matter of looking at it,” Dror said.

The researchers name a laundry list of candidate experiments around the world that could have relevant data and search capabilities that could be used to find their target signals, including CUORE, LZ predecessor LUX, PandaX-II, XENON1T, KamLAND-Zen, SuperKamiokande, CDMS-II, DarkSide-50, and Borexino.

As a next step, the research team is hoping to work with experiment collaborations to analyze existing data, and to find out whether search parameters of active experiments can be adjusted to search for other signals.

“I think the community is starting to become fairly aware of this,” Dror said, adding, “One of the biggest questions in the field is the nature of dark matter. We don’t know what it is made out of, but answering these questions could be within our reach in the near future. For me, that’s a huge motivation to keep pushing — there is new physics out there.”



ProgrammableWeb Launches Covid-19/Coronavirus Developer Resource Center

ProgrammableWeb has launched a special resource center to help developers find the top COVID-19 related APIs and other appdev resources. The content in this resource center is curated by the ProgrammableWeb staff and is designed to provide developers with the most up-to-date information and tools to help them build solutions related to the coronavirus pandemic. These could be tracking solutions, reporting solutions, or any type of innovation that these resources might inspire.

Since publishing an article in February about the top APIs to track the coronavirus, ProgrammableWeb has continued to cover the outbreak from a developer point of view. Now all of that information can be found in one place (pictured below).

Since that first article, the number of COVID-related APIs, data visualizations, and news stories has exploded. To help developers keep up to date on it all, the landing page is broken into the following sections:

  • Coronavirus APIs – The landing page will list some of the best COVID-19 APIs and then link to our directory where developers can find the rest of the public APIs that we know of. We welcome additions to our database. If you know of an API or SDK that belongs in our directory, you can add it yourself (just click the big blue “Add APIs & More” button near the top of any page on our site) or you can email the details to us at [email protected]
  • Dashboards/Datasources – Not all of the COVID-19 related data is available via API. Some of that data is simply available through static data sets. Developers have leveraged the vast number of these data sets from organizations worldwide to create data visualizations, pandemic dashboards, interactive maps, and analytic tools. This is where you can find the tools as well as the data sets that they are based on.
  • Coronavirus Hackathons and other events – Since the pandemic began to spread worldwide, a number of hackathons and other special events (i.e., virtual conferences) have sprung up. They have been aimed at bringing together developers to build solutions that tackle some of the new challenges society faces as a result of COVID-19. Here developers can find a listing of past, present, and future events.
  • Open Source Projects – There are also many projects that have brought together citizens using a collaborative development model seen so often in software development projects. Those projects can be found in this section.
  • Coronavirus News – We have covered the releases of COVID-19 APIs and SDKs, put the spotlight on upcoming hackathons, and offered a developer-focused angle to the COVID-19 pandemic. 
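Many of the static data sets mentioned above are distributed as plain CSV files rather than live APIs. A minimal sketch of working with such a file, using only the Python standard library; the column names and figures are invented for illustration and do not come from any particular data set:

```python
import csv
import io

# Hypothetical sample mimicking a static COVID-19 case-count CSV
# (dates, regions, and counts are made up for illustration).
raw = """date,region,confirmed
2020-03-01,RegionA,12
2020-03-02,RegionA,19
2020-03-01,RegionB,4
"""

# Aggregate confirmed cases per region, as a dashboard or map layer might.
totals = {}
for row in csv.DictReader(io.StringIO(raw)):
    totals[row["region"]] = totals.get(row["region"], 0) + int(row["confirmed"])

print(totals)  # {'RegionA': 31, 'RegionB': 4}
```

Real data sets would be read from a downloaded file or URL instead of an inline string, but the parsing and aggregation pattern is the same.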

We are not the only ones to offer such a resource to the developer community. Postman has opened its Postman COVID-19 API Resource Center, a site that contains API collections from many state governments and organizations. Similar resources can be found on GitHub as well. There is no shortage of options for developers wanting to take on the challenges and opportunities presented by this outbreak. Here at ProgrammableWeb, we will do our best to keep this information up to date so that developers can get to work making a difference.

Author: wsantos