After more than a decade, ChIP-seq may be quantitative after all

For more than a decade, scientists studying epigenetics have used a powerful method called ChIP-seq to map changes in proteins and other critical regulatory factors across the genome. While ChIP-seq provides invaluable insights into the underpinnings of health and disease, it also faces a frustrating challenge: its results are often viewed as qualitative rather than quantitative, making interpretation difficult.

But, it turns out, ChIP-seq may have been quantitative all along, according to a recent report selected as an Editors’ Pick and featured on the cover of the Journal of Biological Chemistry.

“ChIP-seq is the backbone of epigenetics research. Our findings challenge the belief that additional steps are required to make it quantitative,” said Brad Dickson, Ph.D., a staff scientist at Van Andel Institute and the study’s corresponding author. “Our new approach provides a way to quantify results, thereby making ChIP-seq more precise, while leaving standard protocols untouched.”

Previous attempts to quantify ChIP-seq results have led to additional steps being added to the protocol, including the use of “spike-ins,” which are additives designed to normalize ChIP-seq results and reveal histone changes that otherwise may be obscured. These extra steps increase the complexity of experiments while also adding variables that could interfere with reproducibility. Importantly, the study also identifies a sensitivity issue in spike-in normalization that has not previously been discussed.

Using a predictive physical model, Dickson and his colleagues developed a novel approach called the sans-spike-in method for Quantitative ChIP-sequencing, or siQ-ChIP. It allows researchers to follow the standard ChIP-seq protocol, eliminating the need for spike-ins, and also outlines a set of common measurements that should be reported for all ChIP-seq experiments to ensure reproducibility as well as quantification.

By leveraging the binding reaction at the immunoprecipitation step, siQ-ChIP defines a physical scale for sequencing results that allows comparison between experiments. The quantitative scale is based on the binding isotherm of the immunoprecipitation products.
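
As a rough illustration of the idea (a generic Langmuir-type isotherm, not the exact expression derived in the paper), the fraction of target chromatin captured at the immunoprecipitation step can be written as a function of antibody concentration, and knowing that fraction ties the sequenced reads to an absolute amount of input material:

```latex
% Illustrative Langmuir-type binding isotherm for the IP step (generic form, not the
% paper's exact derivation): theta is the captured fraction of target chromatin,
% [A] the free antibody concentration and K_d the dissociation constant.
\[
  \theta \;=\; \frac{[A]}{K_d + [A]},
  \qquad
  \text{captured material} \;=\; \theta \times \text{input material},
\]
% so tracking input and captured amounts (volumes, DNA masses, read depths) fixes a
% physical scale on which different ChIP-seq experiments can be compared.
```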

Story Source:

Materials provided by Van Andel Research Institute. Note: Content may be edited for style and length.


Researchers model source of eruption on Jupiter’s moon Europa

On Jupiter’s icy moon Europa, powerful eruptions may spew into space, raising questions among hopeful astrobiologists on Earth: What would blast out from miles-high plumes? Could they contain signs of extraterrestrial life? And where in Europa would they originate? A new explanation now points to a source closer to the frozen surface than might be expected.

Rather than rising from deep within Europa’s ocean, some eruptions may originate from water pockets embedded in the icy shell itself, according to new evidence from researchers at Stanford University, the University of Arizona, the University of Texas and NASA’s Jet Propulsion Laboratory.

Using images collected by the NASA spacecraft Galileo, the researchers developed a model to explain how a combination of freezing and pressurization could lead to a cryovolcanic eruption, or a burst of water. The results, published Nov. 10 in Geophysical Research Letters, have implications for the habitability of Europa’s underlying ocean — and may explain eruptions on other icy bodies in the solar system.

Harbingers of life?

Scientists have speculated that the vast ocean hidden beneath Europa’s icy crust could contain elements necessary to support life. But short of sending a submersible to the moon to explore, it’s difficult to know for sure. That’s one reason Europa’s plumes have garnered so much interest: If the eruptions are coming from the subsurface ocean, the elements could be more easily detected by a spacecraft like the one planned for NASA’s upcoming Europa Clipper mission.

But if the plumes originate in the moon’s icy shell, they may be less hospitable to life, because it is more difficult to sustain the chemical energy to power life there. In this case, the chances of detecting habitability from space are diminished.

“Understanding where these water plumes are coming from is very important for knowing whether future Europa explorers could have a chance to actually detect life from space without probing Europa’s ocean,” said lead author Gregor Steinbrügge, a postdoctoral researcher at Stanford’s School of Earth, Energy & Environmental Sciences (Stanford Earth).

The researchers focused their analyses on Manannán, an 18-mile-wide crater on Europa that was created by an impact with another celestial object some tens of millions of years ago. Reasoning that such a collision would have generated a tremendous amount of heat, they modeled how melting and subsequent freezing of a water pocket within the icy shell could have caused the water to erupt.

“The comet or asteroid hitting the ice shell was basically a big experiment which we’re using to construct hypotheses to test,” said co-author Don Blankenship, senior research scientist at the University of Texas Institute for Geophysics (UTIG) and principal investigator of the Radar for Europa Assessment and Sounding: Ocean to Near-surface (REASON) instrument that will fly on Europa Clipper. “The polar and planetary sciences team at UTIG are all currently dedicated to evaluating the ability of this instrument to test those hypotheses.”

The model indicates that as Europa’s water transformed into ice during the later stages of the impact, pockets of water with increased salinity could be created within the moon’s icy shell. These salty water pockets can then migrate sideways through Europa’s ice shell by melting adjacent regions of less brackish ice, becoming even saltier in the process.
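
A textbook relation gives the physical intuition here (illustrative background, not the study’s model): dissolved salt lowers the freezing point of water, so the brine left behind as pure ice freezes out stays liquid at lower temperatures than the surrounding ice.

```latex
% Freezing-point depression (ideal dilute-solution form, for intuition only):
\[
  \Delta T_f \;=\; K_f \, m,
\]
% where m is the molality of dissolved salt and K_f \approx 1.86\ \mathrm{K\,kg\,mol^{-1}}
% for water, so the saltier a pocket becomes, the lower the temperature at which it freezes.
```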

“We developed a way that a water pocket can move laterally — and that’s very important,” Steinbrügge said. “It can move along thermal gradients, from cold to warm, and not only in the down direction as pulled by gravity.”

A salty driver

The model predicts that when a migrating brine pocket reached the center of Manannán crater, it became stuck and began freezing, generating pressure that eventually resulted in a plume, estimated to have been over a mile high. The eruption of this plume left a distinguishing mark: a spider-shaped feature on Europa’s surface that was observed by Galileo imaging and incorporated in the researchers’ model.

“Even though plumes generated by brine pocket migration would not provide direct insight into Europa’s ocean, our findings suggest that Europa’s ice shell itself is very dynamic,” said co-lead author Joana Voigt, a graduate research assistant at the University of Arizona, Tucson.

The relatively small size of the plume that would form at Manannán indicates that impact craters probably can’t explain the source of other, larger plumes on Europa that have been hypothesized based on Hubble and Galileo data, the researchers say. But the process modeled for the Manannán eruption could happen on other icy bodies — even without an impact event.

“Brine pocket migration is not uniquely applicable to Europan craters,” Voigt said. “Instead the mechanism might provide explanations on other icy bodies where thermal gradients exist.”

The study also provides estimates of how salty Europa’s frozen surface and ocean may be, which in turn could affect the transparency of its ice shell to radar waves. The calculations, based on imaging from Galileo from 1995 to 1997, show Europa’s ocean may be about one-fifth as salty as Earth’s ocean — a factor that will improve the capacity of the Europa Clipper mission’s radar sounder to collect data from the moon’s interior.

The findings may be discouraging to astrobiologists hoping Europa’s erupting plumes might contain clues about the internal ocean’s capacity to support life, given the implication that plumes do not have to connect to Europa’s ocean. However, the new model offers insights toward untangling Europa’s complex surface features, which are subject to hydrological processes, the pull of Jupiter’s gravity and hidden tectonic forces within the icy moon.

“This makes the shallow subsurface — the ice shell itself — a much more exciting place to think about,” said co-author Dustin Schroeder, an assistant professor of geophysics at Stanford. “It opens up a whole new way of thinking about what’s happening with water near the surface.”


How to have a blast like a black hole

Researchers at the Institute of Laser Engineering at Osaka University have successfully used short but extremely powerful laser blasts to generate magnetic field reconnection inside a plasma. This work may lead to a more complete theory of X-ray emission from astronomical objects like black holes.

In addition to being subjected to extreme gravitational forces, matter being devoured by a black hole can also be pummeled by intense heat and magnetic fields. Plasmas, a fourth state of matter hotter than solids, liquids, or gases, are made of electrically charged protons and electrons that have too much energy to form neutral atoms. Instead, they bounce frantically in response to magnetic fields. Within a plasma, magnetic reconnection is a process in which twisted magnetic field lines suddenly “snap” and cancel each other, resulting in the rapid conversion of magnetic energy into particle kinetic energy. In stars, including our sun, reconnection is responsible for much of the coronal activity, such as solar flares. Owing to the strong acceleration, the charged particles in the black hole’s accretion disk emit their own light, usually in the X-ray region of the spectrum.

To better understand the process that gives rise to the observed X-rays coming from black holes, scientists at Osaka University used intense laser pulses to create similarly extreme conditions in the lab. “We were able to study the high-energy acceleration of electrons and protons as the result of relativistic magnetic reconnection,” says senior author Shinsuke Fujioka. “For example, the origin of emission from the famous black hole Cygnus X-1 can be better understood.”

This level of light intensity is not easily obtained, however. For a brief instant, the laser required two petawatts of power, equivalent to one thousand times the electricity consumption of the entire globe. With the LFEX laser, the team was able to achieve peak magnetic fields of a mind-boggling 2,000 teslas. For comparison, the magnetic fields generated by an MRI machine to produce diagnostic images are typically around 3 teslas, and Earth’s magnetic field is a paltry 0.00005 teslas. The particles of the plasma were accelerated to such an extreme degree that relativistic effects needed to be considered.
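
For a sense of scale, a quick check of the ratios implied by the figures quoted above (simple arithmetic on the numbers in this article, nothing more):

```python
# Ratios of the magnetic field strengths quoted above.
lfex_field_T = 2000.0      # peak field reported for the LFEX experiment
mri_field_T = 3.0          # typical clinical MRI magnet
earth_field_T = 0.00005    # Earth's magnetic field

print(f"LFEX vs MRI:   {lfex_field_T / mri_field_T:,.0f} times stronger")
print(f"LFEX vs Earth: {lfex_field_T / earth_field_T:,.0f} times stronger")
```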

“Previously, relativistic magnetic reconnection could only be studied via numerical simulation on a supercomputer. Now, it is an experimental reality in a laboratory with powerful lasers,” first author King Fai Farley Law says. The researchers believe that this project will help elucidate the astrophysical processes that can happen at places in the Universe that contain extreme magnetic fields.

Story Source:

Materials provided by Osaka University. Note: Content may be edited for style and length.


Safer CRISPR gene editing with fewer off-target hits

The CRISPR system is a powerful tool for the targeted editing of genomes, with significant therapeutic potential, but runs the risk of inappropriately editing “off-target” sites. However, a new study publishing July 9, 2020 in the open-access journal PLOS Biology by Feng Gu of Wenzhou Medical University, China, and colleagues, shows that mutating the enzyme at the heart of the CRISPR gene editing system can improve its fidelity. The results may provide a therapeutically safer strategy for gene editing than using the unmodified enzyme system.

The CRISPR system employs an enzyme called Cas9 to cleave DNA. Cas9 will cut almost any DNA sequence. Its specificity comes from its interaction with a “guide RNA” (gRNA) whose sequence allows it to bind with the target DNA through base-pair matching. Once it does, the enzyme is activated and the DNA is cut.

The CRISPR system is found in multiple bacterial species; among those commonly used in research, that from Staphylococcus aureus has the advantage of size — unlike some others, its gene is small enough to fit inside a versatile and harmless gene therapy vector called adeno-associated virus, making it attractive for therapeutic purposes.

A key limitation of any of the CRISPR systems, including that from S. aureus, is off-target cleavage of DNA. A guide RNA may bind weakly to a site whose sequence is a close but imperfect match; depending on how close the match is and how tightly the enzyme interacts with the paired gRNA-DNA complex, the enzyme may become activated and cut the DNA wrongly, with potentially harmful consequences.
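
As a toy illustration of that matching step (hypothetical sequences, not an assay from the study): the more positions at which a candidate site differs from the guide’s targeting sequence, the weaker the pairing, and a high-fidelity enzyme is one that refuses to cut even when only a single position differs.

```python
# Toy illustration with hypothetical sequences: count mismatches between the
# DNA sequence targeted by a guide RNA and a candidate genomic site.
def mismatches(target: str, candidate: str) -> int:
    assert len(target) == len(candidate), "sites must be the same length"
    return sum(1 for t, c in zip(target, candidate) if t != c)

intended_site   = "GAGTCCGAGCAGAAGAAGAA"   # hypothetical 20-nt target
off_target_site = "GAGTCCGAGCAGAAGAAGAG"   # same site with one mismatch

print(mismatches(intended_site, intended_site))    # 0 -> cut by wild-type and high-fidelity Cas9
print(mismatches(intended_site, off_target_site))  # 1 -> rejected by the high-fidelity mutant
```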

To explore whether the S. aureus Cas9 could be modified to cleave with higher fidelity to the intended target, the authors generated a range of novel Cas9 mutants and tested their ability to discriminate against imperfect matches while retaining high activity at the intended site. They found one such mutant, which distinguished and rejected single base-pair mismatches between gRNA and DNA, regardless of the target, increasing the fidelity up to 93-fold over the original enzyme. They showed that the mutation affected part of the recognition domain, the region of the enzyme that coordinates contacts between the enzyme and the gRNA-DNA complex. The mutation had the likely effect of weakening those contacts, thus ensuring that only the strongest pairing — which would come from a perfect sequence match — would trigger enzyme activity.

“Avoidance of off-target cleavage is a crucial challenge for development of CRISPR for medical interventions, such as correcting genetic diseases or targeting cancer cells,” Gu said. “Our results point the way to developing potentially safer gene therapy strategies.”

Story Source:

Materials provided by PLOS. Note: Content may be edited for style and length.


Should You Hire A Developer Or Use The API For Your Website’s CMS?

It doesn’t matter how powerful or well-rounded your chosen CMS happens to be: there can still come a point at which you decide that its natural state isn’t enough and something more is needed. It could be a new function, a fresh perspective, or improved performance, and you’re unwilling to settle for less. What should you do?

Your instinct might be to hire a web developer, ideally one with some expertise in that particular CMS, but is that the right way to go? Developers can be very costly, and whether you have some coding skill or you’re a total novice, you might be able to get somewhere without one — and the key could be using the API for your CMS.

In this post, we’re going to consider why you might want to hire a developer, why you should investigate APIs, and how you can choose between these options. Let’s get to it.

Why you should hire a developer

It’s fairly simple to make a case for hiring a web developer. For one thing, it’s easy. By sharing the load, you get to preserve your existing workload and collaborate with an expert, a second pair of eyes that can complete your vision and deftly deal with any issues that might arise. Additionally, it’s the best way to get quick results if you’re willing to allocate enough money to afford a top-notch developer and make your project a priority.

The ease of this option explains why it’s so popular. We so often outsource things that would be easy to do ourselves (getting store-bought sandwiches, using cleaning services, etc.) that outsourcing something as complex as a website development project seems like an obvious choice for anyone who isn’t themselves a programmer with plenty of free time.

And even if you are a programmer with enough free time to take on a personal project, you might not have the right skills for the job. Every system has its own nuances, whether it’s a powerful platform with proprietary parts (like Shopify) or an open-source foundation built around ease of use (like Ghost), so getting a CMS expert can make for a smoother experience.

Why you should use the API for your CMS

So, with such a good argument to be made for immediately consulting a developer, why should you take the time to get involved directly? Well, one of the core goals of an API — as you may well be aware — is to make system functions readily accessible to outside systems, and you can take advantage of that to extend your system through integrations.

Becoming familiar with the workings of an API doesn’t require you to have an exhaustive knowledge of the CMS itself. You need only understand the available fields and functions and how you can call them (and interact with them) from elsewhere. From there, it’s more about finding — or creating — the external systems that can give you the results you need.
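
A minimal sketch of what that looks like in practice (the endpoint, token and field names below are placeholders rather than any particular CMS’s real API, though most CMS REST APIs follow this general shape):

```python
import requests

BASE_URL = "https://example-cms.test/api/v1"   # placeholder base URL
TOKEN = "YOUR_API_TOKEN"                       # placeholder credential

# Fetch a handful of published posts and inspect the fields the API exposes.
response = requests.get(
    f"{BASE_URL}/posts",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"status": "published", "limit": 5},
    timeout=10,
)
response.raise_for_status()

for post in response.json().get("posts", []):
    print(post.get("id"), post.get("title"), post.get("updated_at"))
```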

The best developer portals will have detailed API references along with getting-started guides, sample code, SDKs and everything else a developer needs to successfully consume the API. The providers behind them want as many people as possible to gravitate towards their platforms; after all, more compatible modules (along with services like Zapier) mean a stronger ecosystem and more interest overall. This means that even people with relatively meager technical understanding can get somewhere.

Additionally, getting to know the API for your CMS will help you understand what the system can and can’t do natively. It’s possible that by consuming the API you will uncover existing functionality that you otherwise wouldn’t have noticed. Overall, then, taking this step first will help you understand your CMS and either source an existing integration or build a more economical outline of a project that you can then pass to a developer.

How you can choose the right approach

In talking about building a project outline, I hinted at the natural conclusion here, which is that these options aren’t mutually exclusive. Having studied the API for your website’s CMS, you can develop something else or bring in a suitable module, but you can also continue to work with an external developer. It doesn’t subtract from your options. For that reason, then, I strongly recommend working with the API first and seeing what you can glean from it. That will allow you to make the smartest decision about how to proceed.

Author: rodneylaws


Terahertz radiation: New material acts as an efficient frequency multiplier

Higher frequencies mean faster data transfer and more powerful processors — the formula that has been driving the IT industry for years. Technically, however, it is anything but easy to keep increasing clock rates and radio frequencies. New materials could solve the problem. Experiments at Helmholtz-Zentrum Dresden-Rossendorf (HZDR) have now produced a promising result: An international team of researchers was able to get a novel material to increase the frequency of a terahertz radiation flash by a factor of seven: a first step for potential IT applications, as the group reports in the journal Nature Communications.

When smartphones receive data and computer chips perform calculations, such processes always involve alternating electric fields that send electrons on clearly defined paths. Higher field frequencies mean that electrons can do their job faster, enabling higher data transfer rates and greater processor speeds. The current ceiling is the terahertz range, which is why researchers all over the world are keen to understand how terahertz fields interact with novel materials. “Our TELBE terahertz facility at HZDR is an outstanding source for studying these interactions in detail and identifying promising materials,” says Jan-Christoph Deinert from HZDR’s Institute of Radiation Physics. “A possible candidate is cadmium arsenide, for example.”

The physicist has studied this compound alongside researchers from Dresden, Cologne, and Shanghai. Cadmium arsenide (Cd3As2) belongs to the group of so-called three-dimensional Dirac materials, in which electrons can interact very quickly and efficiently, both with each other and with rapidly oscillating alternating electric fields. “We were particularly interested in whether the cadmium arsenide also emits terahertz radiation at new, higher frequencies,” explains TELBE beamline scientist Sergey Kovalev. “We have already observed this very successfully in graphene, a two-dimensional Dirac material.” The researchers suspected that cadmium arsenide’s three-dimensional electronic structure would help attain high efficiency in this conversion.

In order to test this, the experts used a special process to produce ultra-thin, high-purity platelets from cadmium arsenide, which they then subjected to terahertz pulses from the TELBE facility. Detectors placed behind the platelet recorded how the cadmium arsenide reacted to the radiation pulses. The result: “We were able to show that cadmium arsenide acts as a highly effective frequency multiplier and does not lose its efficiency, not even under the very strong terahertz pulses that can be generated at TELBE,” reports former HZDR researcher Zhe Wang, who now works at the University of Cologne. The experiment was the first ever to demonstrate the phenomenon of terahertz frequency multiplication up to the seventh harmonic in this still-young class of materials.

Electrons dance to their own beat

In addition to the experimental evidence, the team, together with researchers from the Max Planck Institute for the Physics of Complex Systems, also provided a detailed theoretical description of what occurred: The terahertz pulses that hit the cadmium arsenide generate a strong electric field. “This field accelerates the free electrons in the material,” Deinert describes. “Imagine a huge number of tiny steel pellets rolling around on a plate that is being tipped from side to side very fast.”

The electrons in the cadmium arsenide respond to this acceleration by emitting electromagnetic radiation. The crucial thing is that they do not exactly follow the rhythm of the terahertz field, but oscillate on rather more complicated paths, which is a consequence of the material’s unusual electronic structure. As a result, the electrons emit new terahertz pulses at odd integer multiples of the original frequency — a non-linear effect similar to a piano: When you hit the A key on the keyboard, the instrument not only sounds the key you played, but also a rich spectrum of overtones, the harmonics.
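
A toy numerical sketch makes the piano analogy concrete (a generic saturating response, not the team’s model of cadmium arsenide): drive an odd-symmetric, nonlinear “current” with a single-frequency field and the emitted spectrum contains only odd multiples of that frequency.

```python
import numpy as np

# Toy model: a saturating, odd-symmetric response to a sinusoidal drive.
n = 4096
t = np.linspace(0.0, 32.0, n, endpoint=False)   # time in units of the drive period
drive = np.sin(2 * np.pi * t)                   # incoming field at frequency 1
current = np.tanh(3.0 * drive)                  # nonlinear electron response

spectrum = np.abs(np.fft.rfft(current))
freqs = np.fft.rfftfreq(n, d=t[1] - t[0])       # in units of the drive frequency

fundamental = spectrum[np.argmin(np.abs(freqs - 1.0))]
for k in range(1, 8):
    amp = spectrum[np.argmin(np.abs(freqs - float(k)))]
    print(f"harmonic {k}: {amp / fundamental:.2e}")  # even harmonics vanish, odd ones survive
```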

For a post-5G world

The phenomenon holds promise for numerous future applications, for example in wireless communication, which trends towards ever higher radio frequencies that can transmit far more data than today’s conventional channels. The industry is currently rolling out the 5G standard. Components made of Dirac materials could one day use even higher frequencies — and thus enable even greater bandwidth than 5G. The new class of materials also seems to be of interest for future computers as Dirac-based components could, in theory, facilitate higher clock rates than today’s silicon-based technologies.

But first, the basic science behind it requires further study. “Our research result was only the first step,” stresses Zhe Wang. “Before we can envision concrete applications, we need to increase the efficiency of the new materials.” To this end, the experts want to find out how well they can control frequency multiplication by applying an electric current. And they want to dope their samples, i.e. enrich them with foreign atoms, in the hope of optimizing nonlinear frequency conversion.


First tunable, chip-based ‘vortex microlaser’ and detector

As computers get more powerful and connected, the amount of data that we send and receive is in a constant race with the technologies that we use to transmit it. Electrons are now proving insufficiently fast and are being replaced by photons as demand for fiber-optic internet cabling and data centers grows.

Though light is much faster than electricity, in modern optical systems, more information is transmitted by layering data into multiple aspects of a light wave, such as its amplitude, wavelength and polarization. Increasingly sophisticated “multiplexing” techniques like these are the only way to stay ahead of the increasing demand for data, but those too are approaching a bottleneck. We are simply running out of room to store more data in the conventional properties of light.

To break through this barrier, engineers are exploring some of light’s harder-to-control properties. Now, two studies from the University of Pennsylvania’s School of Engineering and Applied Science have shown a system that can manipulate and detect one such property known as the orbital angular momentum, or OAM, of light. Critically, they are the first to do so on small semiconductor chips and with enough precision that it can be used as a medium for transmitting information.

The matched pair of studies, published in the journal Science, was done in collaboration with researchers at Duke University, Northeastern University, the Polytechnic University of Milan, Hunan University and the U.S. National Institute of Standards and Technology.

One study, led by Liang Feng, assistant professor in the departments of Materials Science and Engineering and Electrical and Systems Engineering, demonstrates a microlaser which can be dynamically tuned to multiple distinct OAM modes. The other, led by Ritesh Agarwal, professor in the Department of Materials Science and Engineering, shows how a laser’s OAM mode can be measured by a chip-based detector. Both studies involve collaborations between the Agarwal and Feng groups at Penn.

Such “vortex” lasers, named for the way their light spirals around their axis of travel, were first demonstrated by Feng with quantum symmetry-driven designs in 2016. However, Feng and other researchers in the field have thus far been limited to transmitting a single, pre-set OAM mode, making them impractical for encoding more information. On the receiving end, existing detectors have relied on complex filtering techniques using bulky components that have prevented them from being integrated directly onto a chip, and are thus incompatible with most practical optical communications approaches.

Together, this new tunable vortex micro-transmitter and receiver represent the two most critical components of a system that could multiply the information density of optical communication, potentially shattering that looming bandwidth bottleneck.

The ability to dynamically tune OAM values would also enable a photonic update to a classic encryption technique: frequency hopping. By rapidly switching between OAM modes in a pre-defined sequence known only to the sender and receiver, optical communications could be made impossible to intercept.

“Our findings mark a large step towards launching large-capacity optical communication networks and confronting the upcoming information crunch,” says Feng.

In the most basic form of optical communication, transmitting a binary message is as simple as representing 1s and 0s by whether the light is on or off. This is effectively a measure of the light’s amplitude — how high the peak of the wave is — which we experience as brightness. As lasers and detectors become more precise, they can consistently emit and distinguish between different levels of amplitude, allowing for more bits of information to be contained in the same signal.
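
The arithmetic behind “more levels means more bits” is simple (generic signalling math, not a figure from these studies):

```python
import math

# Each detected symbol carries log2(levels) bits when the receiver can
# reliably tell the amplitude levels apart (2 levels = on/off keying = 1 bit).
for levels in (2, 4, 16, 64):
    print(f"{levels:>2} amplitude levels -> {math.log2(levels):.0f} bits per symbol")
```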

Even more sophisticated lasers and detectors can alter other properties of light, such as its wavelength, which corresponds to color, and its polarization, which is the orientation of the wave’s oscillations relative to its direction of travel. Many of these properties can be set independently of each other, allowing for increasingly dense multiplexing.

Orbital angular momentum is yet another property of light, though it is considerably harder to manipulate, given the complexity of the nanoscale features necessary to generate it from computer-chip-sized lasers. Circularly polarized light carries an electric field that rotates around its axis of travel, meaning its photons have a quality known as spin angular momentum, or SAM. Under highly controlled spin-orbit interactions, SAM can be locked or converted into another property, orbital angular momentum, or OAM.

The research on a dynamically tunable OAM laser based on this concept was led by Feng and graduate student Zhifeng Zhang.

In this new study, Feng, Zhang and their colleagues began with a “microring” laser, which consists of a ring of semiconductor, only a few microns wide, through which light can circulate indefinitely as long as power is supplied. When additional light is “pumped” into the ring from control arms on either side of the ring, the delicately designed ring emits circularly polarized laser light. Critically, asymmetry between the two control arms allows for the SAM of the resulting laser to be coupled with OAM in a particular direction.

This means that rather than merely rotating around the axis of the beam, as circularly polarized light does, the wavefront of such a laser orbits that axis and thus travels in a helical pattern. A laser’s OAM “mode” corresponds to its chirality, the direction those helices twist, and how close together its twists are.
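
In the standard notation used for OAM beams generally (not specific to this device), the helical wavefront shows up as a phase that winds around the propagation axis:

```latex
% Field of an OAM beam: the azimuthal phase e^{i \ell \phi} advances by 2\pi\ell over one
% full turn around the axis, and each photon carries orbital angular momentum \ell\hbar.
\[
  E(r, \phi, z) \;\propto\; A(r)\, e^{i \ell \phi}\, e^{i k z},
  \qquad
  L_z \;=\; \ell \hbar \ \text{per photon},
\]
% where the integer \ell = 0, \pm 1, \pm 2, \dots labels the OAM mode: its sign sets the
% chirality and its magnitude sets how tightly the helix twists.
```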

“We demonstrated a microring laser that is capable of emitting five distinct OAM modes,” Feng says. “That may increase the data channel of such lasers by up to five times.”

Being able to multiplex the OAM, SAM and wavelength of laser light is itself unprecedented, but not particularly useful without a detector that can differentiate between those states and read them out.

In concert with Feng’s work on the tunable vortex microlaser, the research on the OAM detector was led by Agarwal and Zhurun Ji, a graduate student in his lab.

“OAM modes are currently detected through bulk approaches such as mode sorters, or by filtering techniques such as modal decomposition,” Agarwal says, “but none of these methods are likely to work on a chip, or interface seamlessly with electronic signals.”

Agarwal and Ji built upon their previous work with Weyl semimetals, a class of quantum materials that have bulk quantum states whose electrical properties can be controlled using light. Their experiments showed that they could control the direction of electrons in those materials by shining light with different SAM onto it.

Along with their collaborators, Agarwal and Ji drew on this phenomenon by designing a photodetector that is similarly responsive to different OAM modes. In their new detector, light with different OAM modes produced unique photocurrent patterns, which allowed the researchers to determine the OAM of light impinging on their device.

“These results not only demonstrate a novel quantum phenomenon in the light-matter interaction,” Agarwal says, “but for the first time enable the direct read-out of the phase information of light using an on-chip photodetector. These studies hold great promise for designing highly compact systems for future optical communication systems.”

Next, Agarwal and Feng plan to collaborate on such systems. By combining their unique expertise to fabricate on-chip vortex microlasers and detectors that can uniquely detect light’s OAM, they will design integrated systems to demonstrate new concepts in optical communications, with enhanced data transmission capabilities for classical light and, by increasing the sensitivity to single photons, for quantum applications. This demonstration of a new dimension for storing information based on OAM modes can help create richer superposition quantum states to increase information capacity by a few orders of magnitude.

These two strongly tied studies were partially supported by the National Science Foundation, the U.S. Army Research Office and the Office of Naval Research. Research on the vortex microlaser was done in collaboration with Josep M. Jornet, associate professor at Northeastern University; Stefano Longhi, professor at the Polytechnic University of Milan in Italy; and Natalia M. Litchinitser, professor at Duke University. Penn’s Xingdu Qiao, Bikashkali Midya, Kevin Liu, Tianwei Wu, Wenjing Liu and Duke’s Jingbo Sun also contributed to the work. Research on the photodetector was done in collaboration with Albert Davydov from the National Institute of Standards and Technology (NIST) and Anlian Pan from Hunan University. Penn’s Wenjing Liu, Xiaopeng Fan, Zhifeng Zhang and NIST’s Sergiy Krylyuk also contributed to the work.


Powerful new AI technique detects and classifies galaxies in astronomy image data

Researchers at UC Santa Cruz have developed a powerful new computer program called Morpheus that can analyze astronomical image data pixel by pixel to identify and classify all of the galaxies and stars in large data sets from astronomy surveys.

Morpheus is a deep-learning framework that incorporates a variety of artificial intelligence technologies developed for applications such as image and speech recognition. Brant Robertson, a professor of astronomy and astrophysics who leads the Computational Astrophysics Research Group at UC Santa Cruz, said the rapidly increasing size of astronomy data sets has made it essential to automate some of the tasks traditionally done by astronomers.

“There are some things we simply cannot do as humans, so we have to find ways to use computers to deal with the huge amount of data that will be coming in over the next few years from large astronomical survey projects,” he said.

Robertson worked with Ryan Hausen, a computer science graduate student in UCSC’s Baskin School of Engineering, who developed and tested Morpheus over the past two years. With the publication of their results May 12 in the Astrophysical Journal Supplement Series, Hausen and Robertson are also releasing the Morpheus code publicly and providing online demonstrations.

The morphologies of galaxies, from rotating disk galaxies like our own Milky Way to amorphous elliptical and spheroidal galaxies, can tell astronomers about how galaxies form and evolve over time. Large-scale surveys, such as the Legacy Survey of Space and Time (LSST) to be conducted at the Vera Rubin Observatory now under construction in Chile, will generate huge amounts of image data, and Robertson has been involved in planning how to use that data to understand the formation and evolution of galaxies. LSST will take more than 800 panoramic images each night with a 3.2-billion-pixel camera, recording the entire visible sky twice each week.

“Imagine if you went to astronomers and asked them to classify billions of objects — how could they possibly do that? Now we’ll be able to automatically classify those objects and use that information to learn about galaxy evolution,” Robertson said.

Other astronomers have used deep-learning technology to classify galaxies, but previous efforts have typically involved adapting existing image recognition algorithms, and researchers have fed the algorithms curated images of galaxies to be classified. Hausen built Morpheus from the ground up specifically for astronomical image data, and the model uses as input the original image data in the standard digital file format used by astronomers.

Pixel-level classification is another important advantage of Morpheus, Robertson said. “With other models, you have to know something is there and feed the model an image, and it classifies the entire galaxy at once,” he said. “Morpheus discovers the galaxies for you, and does it pixel by pixel, so it can handle very complicated images, where you might have a spheroidal right next to a disk. For a disk with a central bulge, it classifies the bulge separately. So it’s very powerful.”

To train the deep-learning algorithm, the researchers used information from a 2015 study in which dozens of astronomers classified about 10,000 galaxies in Hubble Space Telescope images from the CANDELS survey. They then applied Morpheus to image data from the Hubble Legacy Fields, which combines observations taken by several Hubble deep-field surveys.

When Morpheus processes an image of an area of the sky, it generates a new set of images of that part of the sky in which all objects are color-coded based on their morphology, separating astronomical objects from the background and identifying point sources (stars) and different types of galaxies. The output includes a confidence level for each classification. Running on UCSC’s lux supercomputer, the program rapidly generates a pixel-by-pixel analysis for the entire data set.
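
A minimal sketch of what pixel-level output like that looks like in code (illustrative only; the released Morpheus code and its actual class labels are the authoritative reference): each pixel gets a probability per morphological class, the label map is the per-pixel argmax, and the confidence is the winning probability.

```python
import numpy as np

CLASSES = ["background", "point_source", "disk", "spheroid"]  # example labels only

# Stand-in for a model's output: per-pixel class probabilities on a tiny 4x4 cutout.
rng = np.random.default_rng(0)
probs = rng.random((len(CLASSES), 4, 4))
probs /= probs.sum(axis=0, keepdims=True)

label_map = probs.argmax(axis=0)    # which class wins at each pixel
confidence = probs.max(axis=0)      # how strongly it wins

print(label_map)
print(confidence.round(2))
```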

“Morpheus provides detection and morphological classification of astronomical objects at a level of granularity that doesn’t currently exist,” Hausen said.

An interactive visualization of the Morpheus model results for GOODS South, a deep-field survey that imaged millions of galaxies, has been publicly released. This work was supported by NASA and the National Science Foundation.

Story Source:

Materials provided by University of California – Santa Cruz. Original written by Tim Stephens. Note: Content may be edited for style and length.


Anvil Open-Sources its App Server to Accelerate Web App Development

Software startup Anvil today announced a major extension of its powerful web app development environment, which makes it simple for Python developers to quickly design, build and ship web apps in minutes.
 
By making its runtime engine open-source, any of the 8 million developers worldwide who know the Python language can now choose to deploy their apps on their own machines, or on specialized Internet of Things (IoT) devices, as well as within their employer’s or Anvil’s clouds.
 
Traditional ways of developing web apps require knowledge of multiple languages and frameworks, creating a complex ecosystem that shuts out many programmers and slows down development. Anvil removes these bottlenecks, enabling any developer who knows Python to create web apps using its integrated development environment.
 
“Anvil’s goal is to fix web development, by making it easier and faster for the world’s growing base of Python developers to create web apps,” said Meredydd Luff, Anvil’s CEO and Co-founder. “By extending our platform and embracing open source, we’re enabling developers to create their own apps in the Anvil Editor, export them and run them anywhere on their own hardware with our new App Server. This gives developers even more choice and control. It also enables apps to run without needing an internet connection, making it ideal for IoT applications, remote locations or offline enterprise deployments.”
 
Anvil’s speed and flexibility are on show as developers and organizations around the world use it to respond to the COVID-19 pandemic, including:

  • Baker Tilly, the 10th-largest accounting and consulting firm in the USA, which unlocked financial relief for its clients in record time by building a process for the just-signed CARES Act.
  • Australia’s MDU Public Health Laboratory, the first laboratory outside China to recreate and analyze the coronavirus, which uses Anvil to build and deploy essential tools for its COVID-19 response.
  • Broadcast engineer Ardian Lama, of Kosovar TV station Rrokum TV, who in two hours created and deployed a web app that enabled non-technical staff to collaborate as if they were in the same newsroom, and make changes – in real time and on-air – from the safety of their own homes.
  • Pablo Paniagua, a data scientist in Costa Rica, who quickly wrote an app to let people check when they could use their vehicle during lockdown. This went viral and was used by 2% of the country’s online population within 48 hours. 

“When the CARES Act was passed, we needed to help our clients access relief – fast. Using Anvil, we were able to deploy an all-new, secure web-based process for coordinating applicants and lenders incredibly quickly,” said Todd Bernhardt, Partner, Baker Tilly. “We opened it to clients on the 2nd of April: that’s less than six days after the law was signed, and just four business days after we started development. Deploying applications with Anvil is as fast as any alternative we have tried.”
 
Anvil’s development environment combines coding simplicity with powerful performance, allowing the creation of full stack web apps in minutes – bringing the power and speed of classic RAD tools like Visual Basic to the modern web. Previously, developers using Anvil could create and run their web apps hosted in Anvil’s cloud, or in their employer’s private cloud. The new open source App Server allows them to export an app and run it anywhere, without relying on the Anvil cloud or any internet connection.
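
For flavour, a minimal Anvil server function looks like ordinary Python (the function name and body below are illustrative; `anvil.server.callable` is the decorator Anvil uses to expose server-side code to an app’s client side), and an exported app containing code like this can be served by the open-source App Server instead of Anvil’s cloud:

```python
import anvil.server

# Illustrative server-side function for an Anvil app; the client-side UI can invoke it
# with anvil.server.call("add_booking", ...). The function name and logic are made up.
@anvil.server.callable
def add_booking(name, date):
    # A real app would persist this (e.g. to a Data Table or an external database).
    return f"Booked {name} for {date}"
```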

Author: ProgrammableWeb PR


Astronomers could spot life signs orbiting long-dead stars

The next generation of powerful Earth- and space-based telescopes will be able to hunt distant solar systems for evidence of life on Earth-like exoplanets — particularly those that chaperone burned-out stars known as white dwarfs.

The chemical properties of those far-off worlds could indicate that life exists there. To help future scientists make sense of what their telescopes are showing them, Cornell University astronomers have developed a spectral field guide for these rocky worlds.

“We show what the spectral fingerprints could be and what forthcoming space-based and large terrestrial telescopes can look out for,” said Thea Kozakis, doctoral candidate in astronomy, who conducts her research at Cornell’s Carl Sagan Institute. Kozakis is lead author of “High-resolution Spectra and Biosignatures of Earth-like Planets Transiting White Dwarfs,” published in Astrophysical Journal Letters.

In just a few years, astronomers — using tools such as the Extremely Large Telescope, currently under construction in northern Chile’s Atacama Desert, and the James Webb Space Telescope, scheduled to launch in 2021 — will be able to search for life on exoplanets.

“Rocky planets around white dwarfs are intriguing candidates to characterize because their hosts are not much bigger than Earth-size planets,” said Lisa Kaltenegger, associate professor of astronomy in the College of Arts and Sciences and director of the Carl Sagan Institute.

The trick is to catch an exoplanet’s quick crossing in front of a white dwarf, a small, dense star that has exhausted its energy.
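
Simple transit geometry explains why white dwarfs are such appealing hosts (a standard textbook relation, not a calculation from the paper): the dip in starlight during a transit scales with the square of the planet-to-star radius ratio, and a white dwarf is roughly the size of Earth.

```latex
% Approximate transit depth for a planet of radius R_p crossing a star of radius R_\star:
\[
  \frac{\Delta F}{F} \;\approx\; \left(\frac{R_p}{R_\star}\right)^{2}.
\]
% An Earth-size planet transiting a Sun-like star (R_\star \approx 109\,R_\oplus) dims it by
% only ~0.008%, whereas the same planet transiting an Earth-size white dwarf can block a
% large fraction of the star's light, making atmospheric fingerprints far easier to read.
```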

“We are hoping for and looking for that kind of transit,” Kozakis said. “If we observe a transit of that kind of planet, scientists can find out what is in its atmosphere, refer back to this paper, match it to spectral fingerprints and look for signs of life. Publishing this kind of guide allows observers to know what to look for.”

Kozakis, Kaltenegger and Zifan Lin assembled the spectral models for different atmospheres at different temperatures to create a template for possible biosignatures.

Chasing down these planets in the habitable zone of white dwarf systems is challenging, the researchers said.

“We wanted to know if light from a white dwarf — a long-dead star — would allow us to spot life in a planet’s atmosphere if it were there,” Kaltenegger said.

This paper indicates that astronomers should be able to see spectral biosignatures — such as methane in combination with ozone or nitrous oxide — “if those signs of life are present,” said Kaltenegger, who said this research expands scientific databases for finding spectral signs of life on exoplanets to forgotten star systems.

“If we would find signs of life on planets orbiting under the light of long-dead stars,” she said, “the next intriguing question would be whether life survived the star’s death or started all over again — a second genesis, if you will.”

Story Source:

Materials provided by Cornell University. Original written by Blaine Friedlander. Note: Content may be edited for style and length.
