Categories: ScienceDaily

Biologists create new genetic systems to neutralize gene drives

In the past decade, researchers have engineered an array of new tools that control the balance of genetic inheritance. Known as gene drives and based on CRISPR technology, these systems are poised to move from the laboratory into the wild, where they are being engineered to suppress devastating mosquito-borne diseases such as malaria, dengue, Zika, chikungunya, yellow fever and West Nile virus. Gene drives carry the power to immunize mosquitoes against malarial parasites, or to act as genetic insecticides that reduce mosquito populations.

Although the newest gene drives have been proven to spread efficiently as designed in laboratory settings, concerns have been raised regarding the safety of releasing such systems into wild populations. Questions have emerged about the predictability and controllability of gene drives and whether, once let loose, they can be recalled in the field if they spread beyond their intended application region.

Now, scientists at the University of California San Diego and their colleagues have developed two new active genetic systems that address such risks by halting or eliminating gene drives in the wild. In research published Sept. 18, 2020 in the journal Molecular Cell, a team led by Xiang-Ru Xu, Emily Bulger and Valentino Gantz in the Division of Biological Sciences offers two new solutions based on elements developed in the common fruit fly.

“One way to mitigate the perceived risks of gene drives is to develop approaches to halt their spread or to delete them if necessary,” said Distinguished Professor Ethan Bier, the paper’s senior author and science director for the Tata Institute for Genetics and Society. “There’s been a lot of concern that there are so many unknowns associated with gene drives. Now we have saturated the possibilities, both at the genetic and molecular levels, and developed mitigating elements.”

The first neutralizing system, called e-CHACR (erasing Constructs Hitchhiking on the Autocatalytic Chain Reaction), is designed to halt the spread of a gene drive by “shooting it with its own gun.” An e-CHACR uses the CRISPR enzyme Cas9 carried on a gene drive to copy itself, while simultaneously mutating and inactivating the Cas9 gene. Xu says an e-CHACR can be placed anywhere in the genome.

“Without a source of Cas9, it is inherited like any other normal gene,” said Xu. “However, once an e-CHACR confronts a gene drive, it inactivates the gene drive in its tracks and continues to spread across several generations ‘chasing down’ the drive element until its function is lost from the population.”
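The interplay Xu describes can be illustrated with a deliberately simplified, individual-based simulation. This is a toy sketch under invented parameters, not the authors' model: a drive allele converts its wild-type homolog in the germline, while a co-occurring e-CHACR allele uses the drive's Cas9 both to copy itself and to mutate Cas9.

import random

HOMING = 0.9       # prob. an active drive converts the wild-type homolog (illustrative)
ECHACR_COPY = 0.9  # prob. the e-CHACR copies itself when Cas9 is present (illustrative)
CAS9_KO = 0.5      # prob. the e-CHACR inactivates an active Cas9, D -> X (illustrative)

def germline_edit(drive, chacr):
    # Drive locus alleles: W wild type, D active drive, X drive with dead Cas9.
    # e-CHACR locus alleles: w wild type, E e-CHACR element.
    drive, chacr = list(drive), list(chacr)
    if 'D' in drive:                                # a source of Cas9 is present
        for i in range(2):                          # the drive homes onto the wild homolog
            if drive[i] == 'W' and random.random() < HOMING:
                drive[i] = 'D'
        if 'E' in chacr:
            for i in range(2):                      # the e-CHACR copies itself...
                if chacr[i] == 'w' and random.random() < ECHACR_COPY:
                    chacr[i] = 'E'
            for i in range(2):                      # ...and shoots the drive with its own gun
                if drive[i] == 'D' and random.random() < CAS9_KO:
                    drive[i] = 'X'
    return drive, chacr

def gamete(individual):
    drive, chacr = germline_edit(*individual)
    return random.choice(drive), random.choice(chacr)

def next_generation(pop):
    kids = []
    for _ in range(len(pop)):
        (a1, b1), (a2, b2) = gamete(random.choice(pop)), gamete(random.choice(pop))
        kids.append(((a1, a2), (b1, b2)))
    return kids

# Seed population: 10% drive carriers, 10% e-CHACR carriers, 80% wild type.
pop = ([(('D', 'W'), ('w', 'w'))] * 10 + [(('W', 'W'), ('E', 'w'))] * 10
       + [(('W', 'W'), ('w', 'w'))] * 80)
for gen in range(16):
    drive_alleles = [a for ind in pop for a in ind[0]]
    print(gen, 'active drive: %.2f' % (drive_alleles.count('D') / len(drive_alleles)),
          'inactivated: %.2f' % (drive_alleles.count('X') / len(drive_alleles)))
    pop = next_generation(pop)

Run repeatedly, the active-drive frequency first climbs and then collapses as the e-CHACR converts D alleles to X, a rough analogue of the chase-down behavior described above.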

The second neutralizing system, called ERACR (Element Reversing the Autocatalytic Chain Reaction), is designed to eliminate the gene drive altogether. ERACRs are inserted at the site of the gene drive, where they use the Cas9 from the gene drive to attack either side of the Cas9, cutting it out. Once the gene drive is deleted, the ERACR copies itself and replaces the gene drive.

“If the ERACR is also given an edge by carrying a functional copy of a gene that is disrupted by the gene drive, then it races across the finish line, completely eliminating the gene drive with unflinching resolve,” said Bier.

The researchers rigorously tested and analyzed e-CHACRs and ERACRs, as well as the resulting DNA sequences, in meticulous detail at the molecular level. Bier estimates that the research team, which includes mathematical modelers from UC Berkeley, spent a combined 15 years of effort to comprehensively develop and analyze the new systems. Still, he cautions that unforeseen scenarios could emerge, and the neutralizing systems should not be used with a false sense of security for field-implemented gene drives.

“Such braking elements should just be developed and kept in reserve in case they are needed since it is not known whether some of the rare exceptional interactions between these elements and the gene drives they are designed to corral might have unintended activities,” he said.

According to Bulger, gene drives have enormous potential to alleviate suffering, but responsibly deploying them depends on having control mechanisms in place should unforeseen consequences arise. ERACRs and e-CHACRs offer ways to stop the gene drive from spreading and, in the case of the ERACR, can potentially revert an engineered DNA sequence to a state much closer to the naturally occurring sequence.

“Because ERACRs and e-CHACRs do not possess their own source of Cas9, they will only spread as far as the gene drive itself and will not edit the wild type population,” said Bulger. “These technologies are not perfect, but we now have a much more comprehensive understanding of why and how unintended outcomes influence their function and we believe they have the potential to be powerful gene drive control mechanisms should the need arise.”


Categories: ProgrammableWeb

Facebook Begins Rollout of Data Use Checkup to Facebook Platform Developers

In an effort to further protect user privacy, and given past failures in this area, Facebook has recently simplified its platform terms and developer policies in hopes of improving adherence to guidelines. To support these goals, Facebook has announced the rollout of Data Use Checkup, an annual process that validates how developers use data.

This new process, which is supported by a self-service tool, was first announced in April 2020 and will require developers to check each application they manage for adherence to company standards. Developers will have 60 days to comply before losing access to APIs.

The rollout of this program will be gradual, and developers will begin to be notified over the next several months. The announcement of the rollout notes that developers will be notified “via a developer alert, an email to the registered contact, and in your Task List within the App Dashboard.” To simplify the process for developers that manage multiple apps, Facebook is allowing batch processing through a dedicated interface, although developers will still be required to check each app’s permissions.

Developers can check the App Dashboard to verify whether they are currently able to enroll in the program.

Author: KevinSundstrom

Categories: ScienceDaily

Deep learning will help future Mars rovers go farther, faster, and do more science

NASA’s Mars rovers have been one of the great scientific and space successes of the past two decades.

Four generations of rovers have traversed the red planet gathering scientific data, sending back evocative photographs, and surviving incredibly harsh conditions — all using on-board computers less powerful than an iPhone 1. The latest rover, Perseverance, was launched on July 30, 2020, and engineers are already dreaming of a future generation of rovers.

While a major achievement, these missions have only scratched the surface (literally and figuratively) of the planet and its geology, geography, and atmosphere.

“The surface area of Mars is approximately the same as the total area of the land on Earth,” said Masahiro (Hiro) Ono, group lead of the Robotic Surface Mobility Group at the NASA Jet Propulsion Laboratory (JPL) — which has led all the Mars rover missions — and one of the researchers who developed the software that allows the current rover to operate.

“Imagine, you’re an alien and you know almost nothing about Earth, and you land on seven or eight points on Earth and drive a few hundred kilometers. Does that alien species know enough about Earth?” Ono asked. “No. If we want to represent the huge diversity of Mars we’ll need more measurements on the ground, and the key is substantially extended distance, hopefully covering thousands of miles.”

Travelling across Mars’ diverse, treacherous terrain with limited computing power and a restricted energy diet — only as much sun as the rover can capture and convert to power in a single Martian day, or sol — is a huge challenge.

The first rover, Sojourner, covered 330 feet over 91 sols; the second, Spirit, travelled 4.8 miles in about five years; the third, Opportunity, travelled 28 miles over 15 years; and Curiosity has travelled more than 12 miles since it landed in 2012.

“Our team is working on Mars robot autonomy to make future rovers more intelligent, to enhance safety, to improve productivity, and in particular to drive faster and farther,” Ono said.

NEW HARDWARE, NEW POSSIBILITIES

The Perseverance rover, which launched this summer, computes using RAD750s — radiation-hardened single-board computers manufactured by BAE Systems Electronics.

Future missions, however, would potentially use new high-performance, multi-core radiation hardened processors designed through the High Performance Spaceflight Computing (HPSC) project. (Qualcomm’s Snapdragon processor is also being tested for missions.) These chips will provide about one hundred times the computational capacity of current flight processors using the same amount of power.

“All of the autonomy that you see on our latest Mars rover is largely human-in-the-loop” — meaning it requires human interaction to operate, according to Chris Mattmann, the deputy chief technology and innovation officer at JPL. “Part of the reason for that is the limits of the processors that are running on them. One of the core missions for these new chips is to do deep learning and machine learning, like we do terrestrially, on board. What are the killer apps given that new computing environment?”

The Machine Learning-based Analytics for Autonomous Rover Systems (MAARS) program — which started three years ago and will conclude this year — encompasses a range of areas where artificial intelligence could be useful. The team presented results of the MAARS project at the IEEE Aerospace Conference in March 2020. The project was a finalist for the NASA Software Award.

“Terrestrial high performance computing has enabled incredible breakthroughs in autonomous vehicle navigation, machine learning, and data analysis for Earth-based applications,” the team wrote in their IEEE paper. “The main roadblock to a Mars exploration rollout of such advances is that the best computers are on Earth, while the most valuable data is located on Mars.”

Training machine learning models on the Maverick2 supercomputer at the Texas Advanced Computing Center (TACC), as well as on Amazon Web Services and JPL clusters, Ono, Mattmann and their team have been developing two novel capabilities for future Mars rovers, which they call Drive-By Science and Energy-Optimal Autonomous Navigation.

ENERGY-OPTIMAL AUTONOMOUS NAVIGATION

Ono was part of the team that wrote the on-board pathfinding software for Perseverance. Perseverance’s software includes some machine learning abilities, but the way it does pathfinding is still fairly naïve.

“We’d like future rovers to have a human-like ability to see and understand terrain,” Ono said. “For rovers, energy is very important. There’s no paved highway on Mars. The drivability varies substantially based on the terrain — for instance, beach versus bedrock. That is not currently considered. Coming up with a path with all of these constraints is complicated, but that’s the level of computation that we can handle with the HPSC or Snapdragon chips. But to do so we’re going to need to change the paradigm a little bit.”
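As a rough illustration of terrain-aware planning (a sketch, not JPL's flight software), a standard shortest-path search can minimize energy rather than distance by giving each grid cell an assumed energy cost for its terrain type:

import heapq

# Toy energy-aware planner. The per-cell energy costs are illustrative, not mission data.
COST = {'bedrock': 1.0, 'sand': 8.0, 'rubble': 2.5}
GRID = [['bedrock', 'sand',    'bedrock'],
        ['bedrock', 'sand',    'bedrock'],
        ['bedrock', 'bedrock', 'bedrock']]

def cheapest_path(grid, start, goal):
    # Dijkstra over the grid, minimizing total energy instead of distance.
    frontier, seen = [(0.0, start, [start])], set()
    while frontier:
        energy, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return energy, path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]):
                step = COST[grid[nr][nc]]
                heapq.heappush(frontier, (energy + step, (nr, nc), path + [(nr, nc)]))
    return None

print(cheapest_path(GRID, (0, 0), (0, 2)))  # takes the longer bedrock detour around the sand

With the sand cells priced high, the planner trades distance for energy, which is the kind of terrain-dependent choice Ono describes.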

Ono describes that new paradigm as commanding by policy, a middle ground between the human-dictated: “Go from A to B and do C,” and the purely autonomous: “Go do science.”

Commanding by policy involves pre-planning for a range of scenarios, and then allowing the rover to determine what conditions it is encountering and what it should do.

“We use a supercomputer on the ground, where we have infinite computational resources like those at TACC, to develop a plan where a policy is: if X, then do this; if Y, then do that,” Ono explained. “We’ll basically make a huge to-do list and send gigabytes of data to the rover, compressing it in huge tables. Then we’ll use the increased power of the rover to de-compress the policy and execute it.”

The pre-planned list is generated using machine learning-derived optimizations. The on-board chip can then use those plans to perform inference: taking the inputs from its environment and plugging them into the pre-trained model. The inference tasks are computationally much easier and can be computed on a chip like those that may accompany future rovers to Mars.
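A minimal sketch of that loop, with hypothetical terrain and battery states rather than flight states: the ground side enumerates scenarios into a compressed lookup table, and the rover side only decompresses and indexes it.

import gzip, pickle

# Hypothetical state variables used to key the policy table.
TERRAINS = ['bedrock', 'sand', 'rubble']
BATTERY = ['low', 'ok', 'high']

def plan_on_ground():
    # Ground side: enumerate scenarios and store the chosen action for each,
    # standing in for the machine-learning-derived optimization described above.
    policy = {}
    for terrain in TERRAINS:
        for battery in BATTERY:
            if battery == 'low':
                action = 'park_and_recharge'
            elif terrain == 'sand':
                action = 'reroute_around'
            else:
                action = 'drive_direct'
            policy[(terrain, battery)] = action
    return gzip.compress(pickle.dumps(policy))  # the compressed "to-do list" for uplink

def act_on_rover(blob, terrain, battery):
    # Rover side: decompress the uplinked table and look up the current state.
    policy = pickle.loads(gzip.decompress(blob))
    return policy[(terrain, battery)]

blob = plan_on_ground()
print(act_on_rover(blob, 'sand', 'ok'))  # -> reroute_around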

“The rover has the flexibility of changing the plan on board instead of just sticking to a sequence of pre-planned options,” Ono said. “This is important in case something bad happens or it finds something interesting.”

DRIVE-BY SCIENCE

Current Mars missions typically use tens of images a sol from the rover to decide what to do the next day, according to Mattmann. “But what if in the future we could use one million image captions instead? That’s the core tenet of Drive-By Science,” he said. “If the rover can return text labels and captions that were scientifically validated, our mission team would have a lot more to go on.”

Mattmann and the team adapted Google’s Show and Tell software — a neural image caption generator first launched in 2014 — for the rover missions, the first non-Google application of the technology.

The algorithm takes in images and spits out human-readable captions. These include basic but critical information, like cardinality — how many rocks and how far away? — and properties like the vein structure in outcrops near bedrock. “The types of science knowledge that we currently use images for to decide what’s interesting,” Mattmann said.

Over the past few years, planetary geologists have labeled and curated Mars-specific image annotations to train the model.

“We use the one million captions to find 100 more important things,” Mattmann said. “Using search and information retrieval capabilities, we can prioritize targets. Humans are still in the loop, but they’re getting much more information and are able to search it a lot faster.”
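The prioritization step Mattmann describes can be sketched in a few lines. The captions and keyword weights below are hypothetical stand-ins for the mission's curated vocabulary:

# Toy prioritizer over rover image captions: rank frames whose captions
# mention geologically interesting terms.
captions = {
    'sol1001_img3': 'layered outcrop with thin veins near bedrock',
    'sol1001_img7': 'smooth sand ripples, no rocks visible',
    'sol1002_img2': 'cluster of angular float rocks, possible meteorite',
}
KEYWORDS = {'vein': 3.0, 'outcrop': 2.0, 'meteorite': 3.0, 'bedrock': 1.5, 'sand': 0.2}

def score(caption):
    return sum(weight for kw, weight in KEYWORDS.items() if kw in caption)

for name in sorted(captions, key=lambda k: score(captions[k]), reverse=True):
    print(f'{score(captions[name]):4.1f}  {name}: {captions[name]}')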

Results of the team’s work appear in the September 2020 issue of Planetary and Space Science.

TACC’s supercomputers proved instrumental in helping the JPL team test the system. On Maverick2, the team trained, validated, and improved their model using 6,700 labels created by experts.

The ability to travel much farther would be a necessity for future Mars rovers. An example is the Sample Fetch Rover, proposed to be developed by the European Space Agency and launched in the late 2020s, whose main task would be to collect the samples dug up by the Mars 2020 rover.

“Those rovers in a period of years would have to drive 10 times further than previous rovers to collect all the samples and to get them to a rendezvous site,” Mattmann said. “We’ll need to be smarter about the way we drive and use energy.”

Before the new models and algorithms are loaded onto a rover destined for space, they are tested on a dirt training ground next to JPL that serves as an Earth-based analogue for the surface of Mars.

The team developed a demonstration that shows an overhead map, images streaming in from the rover, and the algorithms running live on the rover, and then shows the rover performing terrain classification and captioning on board. They had hoped to finish testing the new system this spring, but COVID-19 shuttered the lab and delayed testing.

In the meantime, Ono and his team developed a citizen science app, AI4Mars, that allows the public to annotate more than 20,000 images taken by the Curiosity rover. These will be used to further train machine learning algorithms to identify and avoid hazardous terrains.

The public has generated 170,000 labels so far in less than three months. “People are excited. It’s an opportunity for people to help,” Ono said. “The labels that people create will help us make the rover safer.”

The efforts to develop a new AI-based paradigm for future autonomous missions can be applied not just to rovers but to any autonomous space mission, from orbiters to fly-bys to interstellar probes, Ono says.

“The combination of more powerful on-board computing power, pre-planned commands computed on high performance computers like those at TACC, and new algorithms has the potential to allow future rovers to travel much further and do more science.”


Categories: ProgrammableWeb

NSW Health Pathology Aided by APIs in Response to COVID-19

Over the past four years, NSW Health Pathology invested heavily in API-led connectivity. As Justin Hendry reports for ITnews.com, this investment paved the way for the agency to pivot handily to building public-facing services during the coronavirus pandemic.

At this year’s MuleSoft CONNECT digital summit, enterprise architect Tim Eckersley spoke about the agency’s rapid response early in the pandemic, crediting the agency’s “large library of healthcare microservices.” This library is one of the projects developed over the past four years, designed to allow “seamless integration between a very broad range of healthcare systems…each wave of delivery built up a groundswell of microservices…the reusable components gradually take a much more dominant posture and provide a really solid launching place to have this rapid response.”

Disclosure: MuleSoft is the parent company of ProgrammableWeb.

The NSW agency is the largest public provider of pathology in Australia. Eckersley leads the agency’s DevOps. He credits their architectural approach with allowing the agency to build out and launch a results-delivery bot in just two weeks.

Eckersley explains, “In terms of what we’ve been able to achieve with MuleSoft, we’ve used it to integrate our four laboratory information systems, which are our core systems of record in the background, with the greater health system…So that’s the eMRs [electronic medical records] or the eHRs [electronic health records], depending on if you’re in Australia or the United States, as well as the outpatients administration systems.”

Eckersley’s team developed their automated, citizen-facing service in the first weeks of the pandemic, working in partnership with AWS, Deloitte, and Microsoft. This group approach shaved off a tremendous amount of work time: Eckersley credits the strategy with returning “5,000 days of effort back to clinical frontline staff.” The service will also work to “tie those [systems] together with our federal systems, so things like the My Health Record and the national cancer screening registry.”

The service is projected to return test results in as little as 24 hours, much faster than in other parts of the world. It was initially piloted with a few regional clinics before rolling out to the rest of the state. The simple, approachable service is easy for participants, Eckersley explains. “All [patients need to do when they go to get a nasal swab taken] is scan a QR code and it immediately pops open a text message of ‘what are my results?’ to our text bot service…and then that text bot requests that [the patient] put in identifying information, as well as the date their collection was taken, and it will instantly give them the results as soon as they become available.”

A key facet of the strategy focuses on catching edge cases, using a bot that integrates with healthcare systems (such as Cerner and Auslab) and a Jira service desk that allows automated ticket creation. This enables the service to keep notification response times within a three-day window. The four-year process of building a library of microservices is the foundation that enabled the agency to hit the ground running with a two-week delivery window.

Eckersley breaks down the more technical elements of their process: “[By] taking an HL7 message, using the MuleSoft HL7 adapters and then connecting it up with cloud infrastructure like Azure service bus for messaging, we’ve been able to make a state-scaled solution really quickly which can pick up the millions of messages that we get running through the state in any given week and handle them in an API-led way…So we take that message in HL7, we convert it to XML, and then we push it through our process API layer…at that point, it is converted into a range of different FHIR [Fast Healthcare Interoperability Resources].”
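Outside of MuleSoft, the gist of that HL7-to-FHIR step can be sketched in plain Python. The sample OBX segment, the field positions, and the LOINC system URL follow HL7 v2 and FHIR conventions, but this is an illustration, not the agency's code:

# Illustrative only: convert one HL7 v2 OBX (observation) segment into a
# FHIR-Observation-shaped dictionary.
hl7_obx = 'OBX|1|ST|94500-6^SARS-CoV-2 RNA^LN||Not Detected|||N|||F'

def obx_to_fhir_observation(segment):
    f = segment.split('|')                      # HL7 v2 fields are pipe-delimited
    code, display, _ = (f[3].split('^') + ['', ''])[:3]
    return {
        'resourceType': 'Observation',
        'status': 'final' if f[11] == 'F' else 'preliminary',  # OBX-11 result status
        'code': {'coding': [{'system': 'http://loinc.org',     # assumes a LOINC code
                             'code': code, 'display': display}]},
        'valueString': f[5],                    # OBX-5 observation value
    }

print(obx_to_fhir_observation(hl7_obx))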

Converting to FHIR empowers the agency to use a NoSQL database like Cosmos at hyperscale: information can be stored, and experience APIs can be presented to agency web and mobile apps (as well as those belonging to their partners). The agency is currently shifting all MuleSoft services piecemeal to Kubernetes, with the idea that a slow shift will reduce risk and allow detailed prioritization of which apps move when.

Also presenting at the 2020 MuleSoft CONNECT digital event was NSW Health Pathology CIO James Patterson. Patterson praised the strategy of reusing as many components as possible, explaining that it reduced the creation of “technical debt.” Patterson explains:

“Even where we’ve had things like a billing project that’s using MuleSoft integration to bring data from our legacy systems into our more modern systems, we’ve been able to pick up components of that previous project and reuse them to build these new services. Where we’ve had legacy, we’ve had to build things from scratch in our modern integration environment, and obviously that takes longer and takes more effort…we’re creating a situation where we’re removing technical debt as we go through the crisis, and I think that’s been really centered around our strategy with MuleSoft.”

Patterson credits the upheaval of the pandemic with forcing adoption of agile practices, whereas pre-pandemic, agile practices made up just 10% of the work time at NSW Health Pathology. He praises the shift, musing that “I think the opportunity is now there to introduce that way of working into all of our work or most of our work, which will really enhance the experience of our customers internally.” 

Author: Katherine-Harrison-Adcock

Categories: ScienceDaily

Orb hidden in distant dust is ‘infant’ planet

Astronomers study stars and planets much younger than the Sun to learn about past events that shaped the Solar System and Earth. Most of these stars are far enough away to make observations challenging, even with the largest telescopes. But now this is changing.

University of Hawai’i at Manoa astronomers are part of an international team that recently discovered an infant planet around a nearby young star. The discovery was reported Wednesday in the international journal Nature.

The planet is about the size of Neptune, but, unlike Neptune, it is much closer to its star, taking only eight and a half days to complete one orbit. It is named “AU Mic b” after its host star, AU Microscopii, or “AU Mic” for short. The planet was discovered with the NASA TESS planet-finding satellite, which caught it periodically passing in front of AU Mic and blocking a small fraction of the star’s light. The signal was confirmed by observations with another NASA satellite, the Spitzer Space Telescope, and with the NASA Infrared Telescope Facility (IRTF) on Maunakea. The observations on Hawai’i Island used a new instrument called iSHELL that can make very precise measurements of the motion of a star like AU Mic. These measurements revealed a slight wobble of the star as it moves in response to the gravitational pull of the planet. This confirmed that AU Mic b is a planet and not a companion star, which would cause a much larger motion.
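As a back-of-envelope check on how close-in the planet is (an illustration, not a calculation from the article), Kepler's third law converts the 8.5-day period into an orbital radius. The stellar mass, roughly half the Sun's for AU Mic, is an assumption not given in the article:

import math

G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
M_star = 0.5 * 1.989e30   # assumed mass of AU Mic, kg
P = 8.5 * 86400           # orbital period, s
a = (G * M_star * P**2 / (4 * math.pi**2)) ** (1 / 3)
print(f'{a / 1.496e11:.3f} AU')   # about 0.065 AU, far inside Mercury's 0.39 AU orbit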

Discovery on Maunakea sets foundation

AU Mic and its planet are about 25 million years old, in their infancy astronomically speaking. AU Mic is also the second-closest young star to Earth. It is so young that dust and debris left over from its formation still orbit around it. The debris collides and breaks into smaller dust particles, which orbit the star in a thin disk. This disk was detected in 2003 with the UH 88-inch telescope on Maunakea. The newly discovered planet orbits within a cleared-out region inside the disk.

“This is an exciting discovery, especially as the planet is in one of the most well-known young star systems, and the second-closest to Earth. In addition to the debris disk, there is always the possibility of additional planets around this star. AU Mic could be the gift that keeps on giving,” said Michael Bottom, an Assistant Astronomer at the UH Institute for Astronomy.

“Planets, like people, change as they mature. For planets this means that their orbits can move and the compositions of their atmospheres can change. Some planets form hot and cool down, and unlike people, they would become smaller over time. But we need observations to test these ideas and planets like AU Mic b are an exceptional opportunity,” said astronomer Eric Gaidos, a professor in the Department of Earth Sciences at UH Mānoa.

Clues to the origin of Earth-like planets

AU Mic is not only much younger than the Sun, it is considerably smaller, dimmer and redder. It is a “red dwarf,” the most numerous type of star in the galaxy. The TESS satellite is also discovering Earth-sized and possibly habitable planets around older red dwarfs, and what astronomers learn from AU Mic and AU Mic b can be applied to understand the history of those planets.

“AU Mic b, and any kindred planets that are discovered in the future, will be intensely studied to understand how planets form and evolve. Fortuitously, this star and its planet are on our cosmic doorstep. We do not have to venture very far to see the show,” Gaidos explained. He is a co-author on another five forthcoming scientific publications that have used other telescopes, including several on Maunakea, to learn more about AU Mic and its planet.

AU Mic appears low in the summer skies of Hawai’i, but you’ll need binoculars to see it. Despite its proximity, it is a dim red star, too faint to be seen with the unaided eye.

Story Source:

Materials provided by University of Hawaii at Manoa. Note: Content may be edited for style and length.


Categories: ProgrammableWeb

The 5 Best Healthcare APIs for Medical App Development

The mobile apps market has made significant progress over the past several years, becoming more business-oriented and commercialized. mHealth, or mobile health, is a new way for patients to interact with doctors and receive care. It uses mobile devices and wireless technologies to make it easier for patients to obtain medical care and to reduce the routine workload of healthcare specialists, freeing more time for the diagnosis and treatment of disease.

Why Use APIs for Medical App Development?

Healthcare mobile app development with the help of APIs has many advantages:

  1. Eliminating Unnecessary Work. APIs give developers pre-built tools and functions, making new apps faster and easier to build. By using an available API, mobile app developers do not need to reinvent the wheel each time and can minimize both the budget and the time to market.
  2. Stimulating Medical Innovation. APIs facilitate international cooperation and allow immediate access to data, making medical research faster and smoother. Without them, studies would be conducted in isolation, and results would appear far more slowly. Medical APIs let scientists quickly process the results of many studies, assist in discovering new therapies, and study the development of genetic diseases.
  3. Securely Managing Medical Data. APIs provide fast data exchange through a single, simplified solution for managing secure medical information. They also let users connect instantly to existing references, such as e-records and research databases, so scientists can draw practical insights from all of the available medical data.
  4. Providing Reliable Data Storage. API security should safeguard data with multi-level, comprehensive protection and advanced threat-prevention tools that comply with stringent standards. A sound medical API should hold the necessary certificates, including ISO 27001, and meet HIPAA regulatory requirements. As a result, APIs can enhance the security of an mHealth app.
  5. Enabling Better Treatment Results. Via APIs, data points from different systems can be linked into more complete data sets, which support the stronger analytics needed to achieve better treatment outcomes.

Best APIs for Healthcare App Development

A good API eliminates the burden of creating code from scratch. Using ready-made code, developers can build a successful mobile app much faster, saving time and resources. In addition, APIs facilitate software support, code debugging, and error fixing. Below, we have listed some of the best APIs that can improve your medical app development.

Eviti

The Eviti Web API provides access to healthcare data for patients with cancer. It helps doctors choose the treatment that best suits a patient’s specific diagnosis and type of health insurance. A separate code identifies each type of cancer; for instance, a request path can look as follows:

path: 'https://connect.eviti.com/api/cancertypes'
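A hedged example of calling that endpoint with Python's requests library. The bearer-token header and the JSON response shape are assumptions; consult the Eviti Connect documentation for the actual authentication scheme:

import requests

resp = requests.get(
    'https://connect.eviti.com/api/cancertypes',
    headers={'Authorization': 'Bearer YOUR_API_TOKEN'},  # hypothetical auth scheme
    timeout=10,
)
resp.raise_for_status()
for cancer_type in resp.json():   # assumes the endpoint returns a JSON list
    print(cancer_type)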

DrChrono

DrChrono is a particularly interesting player in the field of mobile healthcare apps. It aims to provide doctors with a universal platform that covers all areas of medical practice, from portable diagnostic tools to the processing of treatment bills. With DrChrono’s web apps, mobile versions for iOS and Android, and its API, developers can easily extend their apps.

Allscripts

For thirty years, Allscripts has been developing advanced information systems to automate health care and record treatment results. The company has certified nearly 200 apps and devices, and its latest development, an EHR patient care monitoring system, received the highest quality rating. Its APIs are certified as compliant with international standards and are ready for use in third-party apps, allowing developers from around the globe to quickly realize novel ideas.

AdvancedMD

AdvancedMD covers a wide range of healthcare services. The system suits individual doctors as well as large multi-provider groups. The AdvancedMD API includes over 40 templates for mental health, behavioral health, physiotherapy, and rehabilitation. Templates can be customized or created from scratch, and developers can also build their own third-party apps with all the desired features.

Eligible

The Eligible Health Insurance API is a service that provides complete information about more than a thousand insurance companies. It standardizes and simplifies requests and supports a wide range of operations, including processing demographic data, filing applications, and making payments.

Final Word

When choosing an API for a mobile app, remember that the task of mobile healthcare is to ease communication between doctors and patients. It should help specialists carry out remote monitoring and encourage users to take care of their health, receive timely consultations, and undergo preventive examinations. Make sure you pick the API platform that offers all the features you need.

Author: Victoria-Melnychuk

Categories: ScienceDaily

An advance in molecular moviemaking shows how molecules respond to two photons of light

Over the past few years, scientists have developed amazing tools — “cameras” that use X-rays or electrons instead of ordinary light — to take rapid-fire snapshots of molecules in motion and string them into molecular movies.

Now scientists at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University have added another twist: By tuning their lasers to hit iodine molecules with two photons of light at once instead of the usual single photon, they triggered totally unexpected phenomena that were captured in slow-motion movies just trillionths of a second long.

The first movie they made with this approach, described March 17 in Physical Review X, shows how the two atoms in an iodine molecule jiggle back and forth, as if connected by a spring, and sometimes fly apart when hit by intense laser light. The action was captured by the lab’s Linac Coherent Light Source (LCLS) hard X-ray free-electron laser. Some of the molecules’ responses were surprising and others had been seen before with other techniques, the researchers said, but never in such detail or so directly, without relying on advance knowledge of what they should look like.

Preliminary looks at bigger molecules that contain a variety of atoms suggest they can also be filmed this way, the researchers added, yielding new insights into molecular behavior and filling a gap where previous methods fall short.

“The picture we got this way was very rich,” said Philip Bucksbaum, a professor at SLAC and Stanford and investigator with the Stanford PULSE Institute, who led the study with PULSE postdoctoral scientist Matthew Ware. “The molecules gave us enough information that you could actually see atoms move over distances of less than an angstrom — which is about the width of two hydrogen atoms — in less than a trillionth of a second. We need a very fast shutter speed and high resolution to see this level of detail, and right now those are only possible with a hard X-ray free-electron laser like LCLS.”

Double-barreled photons

Iodine molecules are a favorite subject for this kind of investigation because they’re simple — just two atoms connected by a springy chemical bond. Previous studies, for instance with SLAC’s “electron camera,” have probed their response to light. But until now those experiments have been set up to initiate motion in molecules using single photons, or particles of light.

In this study, researchers tuned the intensity and color of an ultrafast infrared laser so that about a tenth of the iodine molecules would interact with two photons of light — enough to set them vibrating, but not enough to strip off their electrons.

Each hit was immediately followed by an X-ray laser pulse from LCLS, which scattered off the iodine’s atomic nuclei and into a detector to record how the molecule reacted. By varying the timing between the light and X-ray pulses, scientists created a series of snapshots that were combined into a stop-action movie of the molecule’s response, with frames just 50 femtoseconds, or millionths of a billionth of a second, apart.

The researchers knew going in that hitting the iodine molecules with more than one photon at a time would provoke what’s known as a nonlinear response, which can veer off in surprising directions. “We wanted to look at something more challenging, stuff we could see that might not be what we planned,” as Bucksbaum put it. And that in fact is what they found.

Unexpected vibes

The results revealed that the light’s energy did set off vibrations, as expected, with the two iodine atoms rapidly approaching and pulling away from each other. “It’s a really big effect, and of course we saw it,” Bucksbaum said.

But another, much weaker type of vibration also showed up in the data, “a process that’s weak enough that we hadn’t expected to see it,” he said. “That confirms the discovery potential of this technique.”

They were also able to see how far apart the atoms were and which way they headed at the very start of each vibration — either compressing or extending the bond between them — as well as how long each type of vibration lasted.

In just a few percent of the molecules, the light pulses sent the iodine atoms flying apart rather than vibrating, shooting off in opposite directions at either fast or slow speeds. As with the vibrations, the fast flyoffs were expected, but the slow ones were not.

Bucksbaum said he expects that chemists and materials scientists will be able to make good use of these techniques. Meanwhile, his team and others at the lab will continue to focus on developing tools to see more and more things going on in molecules and understand how they move. “That’s the goal here,” he said. “We’re the cinematographers, not the writers, producers or actors. The value in what we do is to enable all those other things to happen, working in partnership with other scientists.”


Categories: ScienceDaily

Using deep learning to predict disease-associated mutations

During the past years, artificial intelligence (AI) — the capability of a machine to mimic human behavior — has become a key player in high-tech fields such as drug development. AI tools help scientists uncover the secrets hidden in big biological data using optimized computational algorithms. AI methods such as deep neural networks improve decision-making in biological and chemical applications, such as predicting disease-associated proteins, discovering novel biomarkers, and designing small-molecule drug leads de novo. These state-of-the-art approaches help scientists develop potential drugs more efficiently and economically.

A research team led by Professor Hongzhe Sun from the Department of Chemistry at the University of Hong Kong (HKU), in collaboration with Professor Junwen Wang from Mayo Clinic, Arizona in the United States (a former HKU colleague), implemented a robust deep learning approach to predict disease-associated mutations of the metal-binding sites in a protein. This is the first deep learning approach for predicting disease-associated mutations at metal-relevant sites in metalloproteins, providing a new platform to tackle human diseases. The research findings were recently published in the journal Nature Machine Intelligence.

Metal ions play pivotal roles, either structurally or functionally, in the (patho)physiology of human biological systems. Metals such as zinc, iron and copper are essential for all life, and their concentration in cells must be strictly regulated. A deficiency or an excess of these physiological metal ions can cause severe disease in humans. Mutations in the human genome are strongly associated with different diseases. If these mutations happen in the coding region of DNA, they might disrupt metal-binding sites of proteins and consequently initiate severe diseases. Understanding disease-associated mutations at the metal-binding sites of proteins will facilitate the discovery of new drugs.

The team first integrated omics data from different databases to build a comprehensive training dataset. Statistics from the collected data showed that different metals have different disease associations. Mutations in zinc-binding sites play a major role in breast, liver, kidney, immune system and prostate diseases. By contrast, mutations in calcium- and magnesium-binding sites are associated with muscular and immune system diseases, respectively. For iron-binding sites, mutations are more associated with metabolic diseases. Furthermore, mutations of manganese- and copper-binding sites are associated with cardiovascular diseases, with the latter also associated with nervous system disease. The team used a novel approach to extract spatial features from the metal-binding sites using an energy-based affinity grid map, and merged these spatial features with physicochemical sequence features to train the model. The final results show that using the spatial features enhanced the performance of the prediction, with an area under the curve (AUC) of 0.90 and an accuracy of 0.82. Given the limited advanced techniques and platforms in the field of metallomics and metalloproteins, the proposed deep learning approach offers a way to integrate experimental data with bioinformatics analysis. It will help scientists predict DNA mutations associated with diseases like cancer, cardiovascular diseases and genetic disorders.
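Schematically, the fusion of spatial grid features with sequence features can be sketched as below. The data is random and the dimensions are invented, so this mirrors only the shape of the pipeline, not the published model:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score, accuracy_score

rng = np.random.default_rng(0)
n = 1000
spatial = rng.normal(size=(n, 32))    # stand-in for flattened affinity-grid features
sequence = rng.normal(size=(n, 16))   # stand-in for physicochemical sequence features
X = np.hstack([spatial, sequence])    # merge the two feature families
y = (spatial[:, 0] + sequence[:, 0] + rng.normal(scale=0.5, size=n)) > 0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000,
                    random_state=0).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]
print('AUC:', roc_auc_score(y_te, proba),
      'accuracy:', accuracy_score(y_te, clf.predict(X_te)))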

Professor Sun said: “Machine learning and AI play important roles in the current biological and chemical science. In my group we worked on metals in biology and medicine using integrative omics approach including metallomics and metalloproteomics, and we already produced a large amount of valuable data using in vivo/vitro experiments. We now develop an artificial intelligence approach based on deep learning to turn these raw data to valuable knowledge, leading to uncover secrets behind the diseases and to fight with them. I believe this novel deep learning approach can be used in other projects, which is undergoing in our laboratory.”

Story Source:

Materials provided by The University of Hong Kong. Note: Content may be edited for style and length.


Categories: ScienceDaily

Cycling is safer with more cyclists on the road, but injuries are on the rise

Head and facial injuries from cycling have remained steady over the past 10 years, according to a Rutgers-led study.

The study, published in the Journal of Oral and Maxillofacial Surgery, found that despite an increase in bicyclists on the road during the study period from 2008 to 2017, the number of total facial and head injuries from bicycling did not increase.

“We believe this may be due to a safety-in-numbers phenomenon, whereby increased public safety campaigns, government and private-center funding for facilities and infrastructure, and overall awareness by cyclists and drivers appears to protect cyclists — which translates to further benefits for drivers and others,” said lead author Corina Din-Lovinescu of Rutgers New Jersey Medical School’s Department of Otolaryngology.

Still, the rise in popularity of bike riding has led to an increase in more serious injuries, particularly among cyclists aged 55 to 64. Nationwide, cyclists were treated at hospital emergency departments for traumatic brain injuries and broken facial bones more than 86,439 times from 2008 to 2017.

The incidence of these craniofacial injuries varied significantly among age groups. While patients aged 18 to 24 were injured most frequently, likely due to the popularity of bicycling among younger adults, patients aged 55 to 64 had the most significant increase in injuries, growing 54 percent over the 10-year study period.

Traumatic brain injury was the most commonly diagnosed injury, accounting for nearly 50 percent of emergency department visits. Those aged 45 to 54 were the most likely to be hospitalized with facial fractures, most commonly of the nasal bones, followed by the jaw.

Researchers say older adults need to take additional safety precautions when bicycling to help reduce injuries. Preventive behaviors such as avoiding alcohol before cycling, wearing brightly colored or reflective clothing, using lights or reflectors at night, and wearing helmets are simple measures that can prevent hospitalizations and decrease cycling-related morbidity.

Boris Paskhover, a surgeon and professor in the New Jersey Medical School’s Department of Otolaryngology, co-authored the study.

Story Source:

Materials provided by Rutgers University. Note: Content may be edited for style and length.


Categories: IEEE Spectrum

A Path Towards Reasonable Autonomous Weapons Regulation

Editor’s Note: The debate on autonomous weapons systems has been escalating over the past several years as the underlying technologies evolve to the point where their deployment in a military context seems inevitable. IEEE Spectrum has published a variety of perspectives on this issue. In summary, while there is a compelling argument to be made that autonomous weapons are inherently unethical and should be banned, there is also a compelling argument to be made that autonomous weapons could potentially make conflicts less harmful, especially to non-combatants. Despite an increasing amount of international attention (including from the United Nations), progress towards consensus, much less regulatory action, has been slow. The following workshop paper on autonomous weapons systems policy is remarkable because it was authored by a group of experts with very different (and in some cases divergent) views on the issue. Even so, they were able to reach consensus on a roadmap that all agreed was worth considering. It’s collaborations like this that could be the best way to establish a reasonable path forward on such a contentious issue, and with the permission of the authors, we’re excited to be able to share this paper (originally posted on Georgia Tech’s Mobile Robot Lab website) with you in its entirety.