Categories
IEEE Spectrum

Covariant Uses Simple Robot and Gigantic Neural Net to Automate Warehouse Picking

Two years ago, we wrote about an AI startup from UC Berkeley and OpenAI called Embodied Intelligence, founded by robot laundry-folding expert Pieter Abbeel. What exactly Embodied was going to do wasn’t entirely clear, and honestly, it seemed like Embodied itself didn’t really know—they talked about “building technology that enables existing robot hardware to handle a much wider range of tasks where existing solutions break down,” and gave some examples of how that might be applied (including in manufacturing and logistics), but nothing more concrete.

Since then, a few things have happened. Thing one is that Embodied is now Covariant.ai. Thing two is that Covariant.ai spent almost a year talking with literally hundreds of different companies about how smarter robots could potentially make a difference for them. These companies represent sectors that include electronics manufacturing, car manufacturing, textiles, bio labs, construction, farming, hotels, elder care—“pretty much anything you could think about where maybe a robot could be helpful,” Pieter Abbeel tells us. “Over time, it became clear to us that manufacturing and logistics are the two spaces where there’s most demand now, and logistics especially is just hurting really hard for more automation.” And the really hard part of logistics is what Covariant decided to tackle.

Categories
ScienceDaily

For cheaper solar cells, thinner really is better

Costs of solar panels have plummeted over the last several years, leading to rates of solar installations far greater than most analysts had expected. But with most of the potential areas for cost savings already pushed to the extreme, further cost reductions are becoming more challenging to find.

Now, researchers at MIT and at the National Renewable Energy Laboratory (NREL) have outlined a pathway to slashing costs further, this time by slimming down the silicon cells themselves.

Thinner silicon cells have been explored before, especially around a dozen years ago when the cost of silicon peaked because of supply shortages. But this approach suffered from some difficulties: The thin silicon wafers were too brittle and fragile, leading to unacceptable levels of losses during the manufacturing process, and they had lower efficiency. The researchers say there are now ways to begin addressing these challenges through the use of better handling equipment and some recent developments in solar cell architecture.

The new findings are detailed in a paper in the journal Energy and Environmental Science, co-authored by MIT postdoc Zhe Liu, professor of mechanical engineering Tonio Buonassisi, and five others at MIT and NREL.

The researchers describe their approach as “technoeconomic,” stressing that at this point economic considerations are as crucial as the technological ones in achieving further improvements in affordability of solar panels.

Currently, 90 percent of the world’s solar panels are made from crystalline silicon, and the industry continues to grow at a rate of about 30 percent per year, the researchers say. Today’s silicon photovoltaic cells, the heart of these solar panels, are made from wafers of silicon that are 160 micrometers thick, but with improved handling methods, the researchers propose this could be shaved down to 100 micrometers — and eventually as little as 40 micrometers or less, which would only require one-fourth as much silicon for a given size of panel.

That could not only reduce the cost of the individual panels, they say, but even more importantly it could allow for rapid expansion of solar panel manufacturing capacity. That’s because the expansion can be constrained by limits on how fast new plants can be built to produce the silicon crystal ingots that are then sliced like salami to make the wafers. These plants, which are generally separate from the solar cell manufacturing plants themselves, tend to be capital-intensive and time-consuming to build, which could lead to a bottleneck in the rate of expansion of solar panel production. Reducing wafer thickness could potentially alleviate that problem, the researchers say.
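The arithmetic behind these numbers is easy to sketch. In the snippet below, the wafer thicknesses come from the article, while the ingot length and the ~100-micrometer kerf (silicon lost to the saw with each cut) are illustrative assumptions, not figures from the study:

```python
# Back-of-envelope wafer arithmetic. Thicknesses (160, 100, 40 um) are
# from the article; ingot length and kerf are assumed for illustration.
INGOT_UM = 250 * 1000   # usable crystal length per ingot, micrometers (assumed)
KERF_UM = 100           # silicon lost to the saw per cut, micrometers (assumed)

def wafers_per_ingot(thickness_um):
    """Each wafer consumes its own thickness plus one kerf width."""
    return INGOT_UM // (thickness_um + KERF_UM)

def silicon_fraction(thickness_um, baseline_um=160):
    """Silicon volume per wafer relative to today's 160 um standard."""
    return thickness_um / baseline_um

for t_um in (160, 100, 40):
    print(t_um, wafers_per_ingot(t_um), silicon_fraction(t_um))
```

Under these assumptions, a 40-micrometer wafer yields nearly twice as many wafers per ingot as a 160-micrometer one, and each wafer contains a quarter of the silicon, which is the one-fourth figure quoted above.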

The study looked at the efficiency levels of four variations of solar cell architecture, including PERC (passivated emitter and rear contact) cells and other advanced high-efficiency technologies, comparing their outputs at different thickness levels. The team found there was in fact little decline in performance down to thicknesses as low as 40 micrometers, using today’s improved manufacturing processes.

“We see that there’s this area (of the graphs of efficiency versus thickness) where the efficiency is flat,” Liu says, “and so that’s the region where you could potentially save some money.” Because of these advances in cell architecture, he says, “we really started to see that it was time to revisit the cost benefits.”

Changing over the huge panel-manufacturing plants to adapt to the thinner wafers will be a time-consuming and expensive process, but the analysis shows the benefits can far outweigh the costs, Liu says. It will take time to develop the necessary equipment and procedures to allow for the thinner material, but with existing technology, he says, “it should be relatively simple to go down to 100 micrometers,” which would already provide some significant savings. Further improvements in technology such as better detection of microcracks before they grow could help reduce thicknesses further.

In the future, the thickness could potentially be reduced to as little as 15 micrometers, he says. New technologies that grow thin wafers of silicon crystal directly rather than slicing them from a larger cylinder could help enable such further thinning, he says.

Development of thin silicon has received little attention in recent years because the price of silicon has declined from its earlier peak. But, because of cost reductions that have already taken place in solar cell efficiency and other parts of the solar panel manufacturing process and supply chain, the cost of the silicon is once again a factor that can make a difference, he says.

“Efficiency can only go up by a few percent. So if you want to get further improvements, thickness is the way to go,” Buonassisi says. But the conversion will require large capital investments for full-scale deployment.

The purpose of this study, he says, is to provide a roadmap for those who may be planning expansion in solar manufacturing technologies. By making the path “concrete and tangible,” he says, it may help companies incorporate this in their planning. “There is a path,” he says. “It’s not easy, but there is a path. And for the first movers, the advantage is significant.”

What may be required, he says, is for the different key players in the industry to get together and lay out a specific set of steps forward and agreed-upon standards, as the integrated circuit industry did early on to enable the explosive growth of that industry. “That would be truly transformative,” he says.

Andre Augusto, an associate research scientist at Arizona State University who was not connected with this research, says “refining silicon and wafer manufacturing is the most capital-expense (capex) demanding part of the process of manufacturing solar panels. So in a scenario of fast expansion, the wafer supply can become an issue. Going thin solves this problem in part as you can manufacture more wafers per machine without increasing significantly the capex.” He adds that “thinner wafers may deliver performance advantages in certain climates,” performing better in warmer conditions.

Renewable energy analyst Gregory Wilson of Gregory Wilson Consulting, who was not associated with this work, says “The impact of reducing the amount of silicon used in mainstream cells would be very significant, as the paper points out. The most obvious gain is in the total amount of capital required to scale the PV industry to the multi-terawatt scale required by the climate change problem. Another benefit is in the amount of energy required to produce silicon PV panels. This is because the polysilicon production and ingot growth processes that are required for the production of high efficiency cells are very energy intensive.”

Wilson adds “Major PV cell and module manufacturers need to hear from credible groups like Prof. Buonassisi’s at MIT, since they will make this shift when they can clearly see the economic benefits.”

Go to Source
Author:

Categories
ScienceDaily

Electrochemical method for extracting uranium, and potentially other metal ions, from solution

Fifty years ago, scientists hit upon what they thought could be the next rocket fuel. Carboranes — molecules composed of boron, carbon and hydrogen atoms clustered together in three-dimensional shapes — were seen as the possible basis for next-generation propellants due to their ability to release massive amounts of energy when burned.

It was technology that at the time had the potential to augment or even surpass traditional hydrocarbon rocket fuel, and was the subject of heavy investment in the 1950s and 60s.

But things didn’t pan out as expected.

“It turns out that when you burn these things you actually form a lot of sediment,” said Gabriel Ménard, an assistant professor in UC Santa Barbara’s Department of Chemistry and Biochemistry. In addition to other problems found when burning this so-called “zip fuel,” its residue also gummed up the works in rocket engines, and so the project was scrapped.

“So they made these huge stockpiles of these compounds, but they actually never used them,” Ménard said.

Fast forward to today, and these compounds have come back into vogue with a wide range of applications, from medicine to nanoscale engineering. For Ménard and fellow UCSB chemistry professor Trevor Hayton, as well as Tel Aviv University chemistry professor Roman Dobrovetsky, carboranes could hold the key to more efficient uranium ion extraction. And that, in turn, could enable things like better nuclear waste reprocessing and uranium (and other metal) recovery from seawater.

Their research — the first example of applying electrochemical carborane processes to uranium extraction — is published in a paper that appears in the journal Nature.

Key to this technology is the versatility of the cluster molecule. Depending on their compositions these structures can resemble closed cages, or more open nests, due to control of the compound’s redox activity — its readiness to donate or gain electrons. This allows for the controlled capture and release of metal ions, which in this study was applied to uranium ions.

“The big advancement here is this ‘catch and release’ strategy where you can switch between two states, where one state binds the metal and another state releases the metal,” Hayton said.

Conventional processes, such as the popular PUREX process that extracts plutonium and uranium, rely heavily on solvents, extractants and extensive processing.

“Basically, you could say it’s wasteful,” Ménard said. “In our case, we can do this electrochemically — we can capture and release the uranium with the flip of a switch.

“What actually happens,” added Ménard, “is that the cage opens up.” Specifically, the formerly closed ortho-carborane becomes an opened nido- (“nest”) carborane capable of capturing the positively-charged uranium ion.

The controlled release of extracted uranium ions, however, is not as straightforward and can be somewhat messy. According to the researchers, conventional methods are “less established and can be difficult, expensive and/or destructive to the initial material.”

But here, the researchers have devised a way to reliably and efficiently flip back and forth between open and closed carboranes, using electricity. By applying an electrical potential using an electrode dipped in the organic portion of a biphasic system, the carboranes can receive and donate the electrons needed to open and close and capture and release uranium, respectively.

“Basically you can open it up, capture uranium, close it back up and then release uranium,” Ménard said. The molecules can be used multiple times, he added.

This technology could be used for several applications that require the extraction of uranium and by extension, other metal ions. One area is nuclear reprocessing, in which uranium and other radioactive “trans-uranium” elements are extracted from spent nuclear material for storage and reuse (the PUREX process).

“The problem is that these trans-uranium elements are very radioactive and we need to be able to store these for a very long time because they’re basically very dangerous,” Ménard said. This electrochemical method could allow for the separation of uranium from plutonium, similar to the PUREX process, he explained. The extracted uranium could then be enriched and put back into the reactor; the other high-level waste could be transmuted to reduce its radioactivity.

Additionally, the electrochemical process could be applied to uranium extraction from seawater, which would ease pressure on the terrestrial mines from which all uranium is currently sourced.

“There’s about a thousand times more dissolved uranium in the oceans than there are in all the land mines,” Ménard said. Similarly, lithium — another valuable metal that exists in large reserves in seawater — could be extracted this way, and the researchers plan to take this research direction in the near future.

“This gives us another tool in the toolbox for manipulating metal ions and processing nuclear waste or doing metal capture out of oceans,” Hayton said. “It’s a new strategy and new method to achieve these types of transformations.”

Research in this study was also conducted by Megan Keener (lead author), Camden Hunt and Timothy G. Carroll at UCSB, and by Vladimir Kampel at Tel Aviv University.


Categories
ScienceDaily

Americans perceive likelihood of nuclear weapons risk as 50/50 toss-up

It has been 30 years since the end of the Cold War, yet, on average, Americans still put the odds of a nuclear weapon detonating on U.S. soil at a 50/50 toss-up, according to new research from Stevens Institute of Technology.

“That’s exceptionally high,” said Kristyn Karl, a political scientist at Stevens who co-led the work with psychologist Ashley Lytle. “People don’t generally believe that highly rare events are slightly less likely than a 50/50 tossup.”

The finding, reported in the January 2020 issue of the International Journal of Communication, closes a decades-long gap in the research literature on Americans’ perceptions of the nuclear weapons threat. It also provides an initial look at how younger generations, namely Millennials and Gen Z (18-37 years old), think about the topic and what influences their behavior in an era of evolving nuclear threat.

Using their combined expertise in political science and psychology, Karl and Lytle fielded two nationally diverse online surveys totaling more than 3,500 Americans to measure individual characteristics and attitudes, such as perceptions of nuclear risk, apathy toward nuclear topics, media use, and interest in following current events.

They also analyzed how these characteristics and attitudes such as perceptions of nuclear risk influence behaviors, including the likelihood of seeking information and initiating conversations about nuclear topics, as well as preparing emergency kits in the event that the worst were to happen.

The ultimate goal of the work, which is part of the larger Reinventing Civil Defense project supported by the Carnegie Corporation of New York, is to learn more about how to best develop new communication tools to increase awareness among Americans about topics related to nuclear weapons, particularly what to do in the event of a nuclear detonation.

“The overarching narrative from the Reinventing Civil Defense project is that younger Americans just don’t hear anything about nuclear weapons risk,” said Karl. “Unlike older Americans, Millennials and Gen Z didn’t grow up during the Cold War, so what they know about nuclear risk is what’s in the media, and what’s in the media isn’t necessarily reflective of the true state of affairs.”

And media use matters.

Karl and Lytle find that consuming media has a striking effect on how younger and older adults think about topics related to nuclear weapons, especially as it relates to apathy. Specifically, as younger generations report using more media, they are increasingly likely to report being apathetic about nuclear topics.

But this pattern is different for older adults, as there is no association between their media use and their willingness to think about nuclear threats or how to survive them. In terms of behavior, apathy about nuclear topics is associated with a decrease in seeking information on the issue.

Interestingly, the older Americans are, the lower they estimate the likelihood of a nuclear detonation in their lifetime. “Among lots of possibilities, they may be thinking if it didn’t happen during the Cold War, it won’t happen now; or perhaps I have fewer years to live, so it probably won’t happen in my lifetime,” said Lytle. However, older adults and those who more closely follow the news tend to seek more information about nuclear topics.

Broadly, perceptions of nuclear weapons risk prove powerful, leading Americans to take various actions to prepare in the event of a nuclear attack. On average, city dwellers estimate the risk as 5-7% higher than their rural or suburban peers, whereas women estimate nuclear risk as 3-5% higher than men. Since men report significantly higher levels of media use and more closely following current events, this research presents several opportunities for targeting messages based on these varying perceptions.

One pattern is clear: as perception of nuclear weapons risk increases, so too does Americans’ intent to take action, and that’s true across multiple measures, whether it is putting forward effort to think and plan, seeking information about the topic, communicating with others about it, or taking steps to prepare for an attack.

Karl and Lytle explain that many people are fatalistic: if a nuclear weapon were to go off in New York City, then we would all be dead, ‘so why should I put any effort forward in thinking about it?’

Karl explains that the size of the weapon, the location, and even the weather, are important. In cities, for example, many nuclear weapons detonations would be funneled upward by tall buildings and modeling suggests that many people could survive. The most important thing people could do is get inside a building and stay there for three days.

“Our gut reaction is that everybody would die. But not everybody,” said Lytle. “We are trying to figure out how to educate people that this is not always true so that people feel like they have some sort of agency in a situation like this. Many people could survive the initial blast and then their subsequent behavior would determine what happens from there.”

While Lytle and Karl emphasize that they don’t wish to make claims about the actual degree of nuclear weapons risk, they maintain that perceptions of this risk are crucially important. Even if we assume the risk is low in the real world, it could be life-saving for Americans to know just a small amount about what they should do.


Categories
IEEE Spectrum

How Do Neural Implants Work?

It sounds like science fiction, but a neural implant could, many years from now, read and edit a person’s thoughts. Neural implants are already being used to treat disease, rehabilitate the body after injury, improve memory, communicate with prosthetic limbs, and more. 

The U.S. Department of Defense and the U.S. National Institutes of Health (NIH) have devoted hundreds of millions of dollars in funding toward this sector. Independent research papers on the topic appear in top journals almost weekly.

Here, we describe types of neural implants, explain how neural implants work, and provide examples demonstrating what these devices can do. 

Categories
ScienceDaily

When the Milky Way collided with dwarf galaxy Gaia-Enceladus

The dwarf galaxy Gaia-Enceladus collided with the Milky Way probably around 11.5 billion years ago. A team of researchers including scientists from the Max Planck Institute for Solar System Research in Germany for the first time used a single star affected by the collision as a clue for dating. Using observational data from ground-based observatories and space telescopes, the scientists led by the University of Birmingham were able to determine the age of the star and the role it played in the collision. The research group describes its results in today’s issue of Nature Astronomy.

On cosmic time scales, the colliding and merging of galaxies is not uncommon. Even if both galaxies involved are of very different sizes, such a collision leaves clear traces in the larger one. For example, the smaller galaxy introduces stars with a different chemical composition, the motion of many stars is altered, and myriads of new stars are formed.

The Milky Way has encountered several other galaxies in its 13.5 billion-year history. One of them is the dwarf galaxy Gaia-Enceladus. To understand how this event affected our galaxy and changed it permanently, it is important to reliably date the collision. To this end, the researchers led by Prof. Dr. Bill Chaplin of the University of Birmingham turned their attention to a single star: ν Indi is found in the constellation Indus; with an apparent brightness comparable to that of Uranus, it is visible even to the naked eye and can be easily studied in detail.

“The space telescope TESS collected data from ν Indi already in its first month of scientific operation,” says Dr. Saskia Hekker, head of the research group “Stellar Ages and Galactic Evolution (SAGE)” at MPS and co-author of the new study. The space telescope was launched in 2018 to perform a full-sky survey and characterize as many stars as possible. “The data from TESS allow us to determine the age of the star very accurately,” Hekker adds.

Moreover, ν Indi provided clues on the history of the collision with the dwarf galaxy Gaia-Enceladus. To reconstruct its role in the collision, the research group evaluated numerous data sets on ν Indi obtained with the help of the spectrographs HARPS (High Accuracy Radial velocity Planet Searcher) and FEROS (Fiber-fed Extended Range Optical Spectrograph) of the European Southern Observatory, the Galaxy Evolution Experiment of the Apache Point Observatory in New Mexico, and ESA’s Gaia Space Telescope. This allowed them to specify both the chemical composition of the star and its movement within the galaxy with great precision.

The cosmic detective work produced a clear picture: ν Indi had been part of the halo, the outer region of the Milky Way, and the collision changed its trajectory. “Since the motion of ν Indi was affected by the collision, it must have taken place when the star was already formed,” Chaplin explains. The age of the star therefore puts a constraint on the time of the collision.

To determine the age of a star, researchers use its natural oscillations, which can be observed as brightness fluctuations. “Similar to the way seismic waves on Earth allow conclusions about the interior of our planet, stellar oscillations help us to reveal the internal structure and composition of the star and thus its age,” explains co-author Dr. Nathalie Themessl.

The calculations carried out by MPS researchers and other research groups showed that, with a probability of 95 percent, the galaxy merger must have occurred no earlier than 13.2 billion years ago. With a probability of 68 percent, the collision took place approximately 11.5 billion years ago. “This chronological classification not only helps us to understand how the collision changed our galaxy,” says Hekker. “It also gives us a sense of how collisions and mergers impacted other galaxies and influenced their evolution.”

Story Source:

Materials provided by Max-Planck-Gesellschaft. Note: Content may be edited for style and length.


Categories
ScienceDaily

Air pollution from oil and gas production sites visible from space

Oil and gas production has doubled in some parts of the United States in the last two years, and scientists can use satellites to see impacts of that trend: a significant increase in the release of the lung-irritating air pollutant nitrogen dioxide, for example, and a more-than-doubling of the amount of gas flared into the atmosphere.

“We see the industry’s growing impact from space,” said Barbara Dix, a scientist at the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder and lead author of the new assessment published in the AGU journal Geophysical Research Letters. “We really are at the point where we can use satellite data to give feedback to companies and regulators, and see if they are successful in regulating emissions.”

Dix and a team of U.S. and Dutch researchers set out to see if a suite of satellite-based instruments could help scientists understand more about nitrogen oxides pollution (including nitrogen dioxide) coming from engines in U.S. oil and gas fields. Combustion engines produce nitrogen oxides, which are respiratory irritants and can lead to the formation of other types of harmful air pollutants, such as ground-level ozone.

On oil and gas drilling and production sites, there may be several small and large combustion engines drilling, compressing gas, separating liquids and gases, and moving gas and oil through pipes and storage containers, said co-author Joost de Gouw, a CIRES Fellow and chemistry professor at CU Boulder. The emissions of those engines are not controlled. “Cars have catalytic converters, big industrial stacks may have emissions reduction equipment…” de Gouw said. “Not so with these engines.”

Conventional “inventories” meant to account for nitrogen oxides pollution from oil and gas sites are often very uncertain, underestimating or overestimating the pollutants, de Gouw said. And there are few sustained measurements of nitrogen oxides in many of the rural areas where oil and gas development often takes place, Dix said.

So she, de Gouw and their colleagues turned to nitrogen dioxide data from the Ozone Monitoring Instrument (OMI) on board a NASA satellite and the Tropospheric Monitoring Instrument (TROPOMI) on a European Space Agency satellite. They also looked at gas flaring data from an instrument on the NOAA/NASA Suomi satellite system.

Between 2007 and 2019, across much of the United States, nitrogen dioxide pollution levels dropped because of cleaner cars and power plants, the team found, confirming findings reported previously. The clean air trend in satellite data was most obvious in urban areas of California, Washington and Oregon and in the eastern half of the continental United States. “We’ve cleaned up our act a lot,” Dix said.

However, several areas stuck out with increased emissions of nitrogen dioxide: The Permian, Bakken and Eagle Ford oil and gas basins, in Texas and New Mexico, North Dakota, and Texas, respectively.

In those areas, the scientists used a type of time-series analysis to figure out where the pollutant was coming from: Drilling of new wells vs. longer-term production. They could do this kind of analysis because drilling activity swings up and down quickly in response to market forces while production changes far more slowly (once a well is drilled, it may produce oil and natural gas for years or even decades).

Before a downturn in drilling in 2015, drilling generated about 80 percent of nitrogen dioxide from oil and gas sites, the team reported. After 2015, drilling and production produced roughly equal amounts of the pollutant. Flaring is estimated to contribute up to 10 percent in both time frames.
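The attribution idea, separating a fast-swinging drilling signal from a slowly varying production signal, can be illustrated with a toy least-squares fit. This is a sketch of the general technique on synthetic data, not the team's actual analysis; every number below is invented:

```python
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(120)

# Synthetic activity series: drilling swings quickly with market forces,
# production grows slowly as completed wells keep producing for years.
drilling = 50 + 20 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 2, months.size)
production = 100 + 0.5 * months

# Synthetic NO2 built with a known 80/20 drilling/production split,
# echoing the pre-2015 share reported above, plus measurement noise.
no2 = 0.8 * drilling + 0.2 * production + rng.normal(0, 1, months.size)

# Least-squares attribution: solve no2 ~ a*drilling + b*production
A = np.column_stack([drilling, production])
a, b = np.linalg.lstsq(A, no2, rcond=None)[0]
print(f"drilling coeff ~ {a:.2f}, production coeff ~ {b:.2f}")
```

With enough months of data the fit recovers the 80/20 split used to generate the series; the real analysis must additionally contend with satellite retrieval noise, overlapping sources and atmospheric chemistry.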

The researchers also developed a new oil and gas emissions inventory, using data on fuel use by the industry, the location of drilling rigs, and well-level production data. The inventory confirmed the satellite trends, said co-author Brian McDonald, a CIRES scientist working in NOAA’s Chemical Sciences Division, “It is a promising development that what we observe from space can be explained by expected trends in emissions from the oil and gas industry.”

“Scientifically, this is especially important: we can do source attribution by satellite,” de Gouw said. “We need to know the important sources to address these emissions in the most cost-efficient manner.”


Categories
ScienceDaily

Planet WASP-12b is on a death spiral, say scientists

Earth is doomed — but not for 5 billion years. Our planet will be roasted as our sun expands and becomes a red giant, but the exoplanet WASP-12b, located 600 light-years away in the constellation Auriga, has less than a thousandth of that time left: a comparatively paltry 3 million years.

A Princeton-led team of astrophysicists has shown that WASP-12b is spiraling in toward its host star, heading toward certain destruction. Their paper appears in the Dec. 27, 2019, issue of the Astrophysical Journal Letters.

WASP-12b is known as a “hot Jupiter,” a giant gaseous planet like our neighbor planet Jupiter, but which is very close to its own star, orbiting its sun in just 26 hours. (By contrast, we take 365 days to orbit, and even Mercury, the innermost planet of our solar system, takes 88 days.)

“Ever since the discovery of the first ‘hot Jupiter’ in 1995 — a discovery that was recognized with this year’s Nobel Prize in Physics — we have wondered how long such planets can survive,” said Joshua Winn, a professor of astrophysical sciences at Princeton and one of the authors of the paper. “We were pretty sure they could not last forever. The strong gravitational interactions between the planet and the star should cause the planet to spiral inward and be destroyed, but nobody could predict how long this takes. It might be millions of years, it might be billions or trillions. Now that we have measured the rate, for at least one system — it’s millions of years — we have a new clue about the behavior of stars as fluid bodies.”

The problem is that as WASP-12b orbits its star, the two bodies exert gravitational pulls on each other, raising “tides” like the ocean tides raised by the moon on Earth.

Within the star, these tidal waves cause the star to become slightly distorted and to oscillate. Because of friction, these waves crash and the oscillations die down, a process that gradually converts the planet’s orbital energy into heat within the star.

The friction associated with the tides also exerts a gravitational torque on the planet, causing the planet to spiral inward. Measuring how quickly the planet’s orbit is shrinking reveals how quickly the star is dissipating the orbital energy, which provides astrophysicists clues about the interior of stars.

“If we can find more planets like WASP-12b whose orbits are decaying, we’ll be able to learn about the evolution and eventual fate of exoplanetary systems,” said first author Samuel Yee, a graduate student in astrophysical sciences. “Although this phenomenon has been predicted for close-in giant planets like WASP-12b in the past, this is the first time we have caught this process in action.”

One of the first people to make that prediction was Frederic Rasio, the Joseph Cummings Professor of Physics and Astronomy at Northwestern University, who was not involved in Yee and Winn’s work. “We’ve all been waiting nearly 25 years for this effect to be detected observationally,” Rasio said. “The implications of the short timescale measured for orbital decay are also very important. In particular it means that there must be many more hot Jupiters that have already gone all the way. When they get to the Roche limit — the tidal disruption limit for an object on a circular orbit — their envelopes might get stripped, revealing a rocky core that looks just like a super-Earth (or maybe a mini-Neptune if they can retain a bit of their envelope).”

Rasio also edits Astrophysical Journal Letters, the journal in which the new paper appears. The researchers had originally submitted their paper to a less-prestigious sister journal also published by the American Astronomical Society, but Rasio redirected it to ApJ Letters because of the “especially great significance” of the research. “Part of my job is to ensure that all major new discoveries presented in manuscripts submitted to the AAS Journals are considered for publication in ApJ Letters,” he said. “In this case it was a no-brainer.”

WASP-12b was discovered in 2008 through the transit method, in which astronomers observe a small dip in a star’s brightness as the planet passes in front of it, each time it completes an orbit. Since its discovery, the interval between successive dips has shortened by 29 milliseconds per year — an observation that was first noted in 2017 by co-author Kishore Patra, then an undergraduate at the Massachusetts Institute of Technology.

That slight shortening could suggest that the planet’s orbit is shrinking, but there are other possible explanations: If WASP-12b’s orbit is more oval-shaped than circular, for example, the apparent changes in the orbital period could be caused by the changing orientation of the orbit.

The way to be sure the orbit is actually shrinking is to watch the planet disappear behind its star, an event known as an occultation. If the orbit is merely changing its orientation, the actual orbital period does not change, so if transits arrive earlier than expected, occultations should arrive later. But if the orbit is truly decaying, the timing of both transits and occultations should shift in the same direction.
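The timing signature described above can be sketched numerically. The following illustration is ours, not the authors’ analysis code; it assumes WASP-12b’s roughly 1.09-day orbital period and the 29-millisecond-per-year decay quoted above, and shows how a steadily shrinking period makes transits arrive progressively earlier than a constant-period ephemeris predicts:

```python
# Illustrative sketch (not the published analysis): how a steadily
# shrinking orbital period produces a quadratic drift in transit times.
# Assumed inputs: WASP-12b's ~1.09-day period and the article's
# 29 ms/yr period decay.

P0_DAYS = 1.09               # approximate orbital period of WASP-12b
DECAY_MS_PER_YEAR = 29.0     # period shortening quoted in the article

ORBITS_PER_YEAR = 365.25 / P0_DAYS
# Period change per orbit, converted to seconds
DP_PER_ORBIT = (DECAY_MS_PER_YEAR / 1000.0) / ORBITS_PER_YEAR

def transit_shift_seconds(years):
    """How early the transit arrives, relative to a constant-period
    ephemeris, after the given number of years of monitoring.
    With a constant per-orbit decrease dP, transit N occurs at
    t_N ~ t0 + N*P0 - (dP/2)*N**2, i.e. early by (dP/2)*N**2 seconds."""
    n_orbits = years * ORBITS_PER_YEAR
    return 0.5 * DP_PER_ORBIT * n_orbits ** 2

print(round(transit_shift_seconds(10) / 60, 1), "minutes")
# → about 8 minutes early after a decade, with these assumed numbers
```

Because the offset grows with the square of the number of elapsed orbits, even a change of well under a millisecond per orbit accumulates into a shift of minutes over a decade, which is why roughly ten years of transit monitoring was enough to reveal it.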

Over the last two years, the researchers have collected more data, including new occultation observations made with the Spitzer Space Telescope.

“These new data strongly support the orbital decay scenario, allowing us to firmly say that the planet is indeed spiraling toward its star,” said Yee. “This confirms the long-standing theoretical predictions and indirect data suggesting that hot Jupiters should eventually be destroyed through this process.”

This discovery will help theorists understand the internal workings of stars and interpret other data relating to tidal interactions, said Winn. “It also tells us about the lifetimes of hot Jupiters, a clue that might help shed light on the formation of these strange and unexpected planets.”

“The orbit of WASP-12b is decaying,” by Samuel W. Yee, Joshua N. Winn, Heather A. Knutson, Kishore C. Patra, Shreyas Vissapragada, Michael M. Zhang, Matthew J. Holman, Avi Shporer and Jason T. Wright, appears in the Dec. 27, 2019, issue of the Astrophysical Journal Letters. The research was supported by Princeton University, the Heising-Simons Foundation, NASA Solar Systems grant NNX14AD22G, the Pennsylvania State University, the Eberly College of Science and the Pennsylvania Space Grant Consortium. The authors wish to recognize and acknowledge the very significant cultural role and reverence that the summit of Maunakea has always had within the indigenous Hawaiian community: “We are most fortunate to have the opportunity to conduct observations from this mountain.”

Go to Source
Author:

Categories
IEEE Spectrum

Aerospace Companies Compete to Build Lunar Landers for NASA’s Project Artemis

After 50 years of lamenting that America had abandoned the moon, astronauts are in a rush again, trying to go back within five years—and NASA has asked aerospace companies to design the lunar landers that will get them there. The project is called Artemis, and the agency is now reviewing proposals to build what it calls the Human Landing System, or HLS. In January, it says, it will probably select finalists.

NASA had said a landing was possible by 2028. Then, the White House said to do it by 2024.

“Urgency must be our watchword,” said U.S. Vice President Mike Pence when he announced the new deadline in March 2019. “Now, let’s get to work.”

Categories
ProgrammableWeb

ProgrammableWeb’s Most Clicked, Shared and Talked About APIs of 2019: Entertainment

This segment of the year’s most clicked, shared and talked about APIs concerns the Entertainment segment. That would include TV, Movies, Music, Games, Gambling, Humor, Podcasts, Video, Animation, Books, Sports, and eSports.

API news from the video games industry during 2019 included some new Steam APIs, a Bitmoji SDK from Snap, new tools for Unity developers, a new cross-platform game development SDK from Microsoft, a live streaming API spec from AWS, and fewer tools for some multiplayer services from Google.

Other entertainment API news this past year included enhanced vision recognition for Sports from Google, updated Google Play Store rules to protect children from gambling, violence, and other adult applications, developer tools for Apple TV, Siri for Spotify users, Google’s accessibility improvements, and controlling the TV via Alexa.

Below is a list of popular APIs that were added to ProgrammableWeb during 2019 from our Entertainment categories.

RAWG is a video game database and video game discovery service. The RAWG Video Games Database API enables access to data about video games. The API allows users to search 300,000 video games on 50 platforms, retrieve data about video games such as publishers, genres, and descriptions, get links to all online stores where users can buy a game, and find similar games.

A large number of Humor APIs were very popular this year, and evidently geek jokes reign, judging by the whopping number of ProgrammableWeb page visits this API received. The Geek Jokes RESTful API lets users fetch a random geeky or programming-related joke for use in all sorts of applications. The API is provided by developer Sameer Kumar.

The Joke API returns jokes of many types in JSON, XML, and YAML formats. The API allows filters for NSFW, political, or religious jokes, and returns jokes from miscellaneous, dark, and programming categories.

The Bet365 API provides programmatic access to all markets and odds available on the Bet365 website. Developers can retrieve all markets and odds, or just prematch markets and odds.

icanhazdadjoke.com offers the largest collection of dad jokes on the internet. The icanhazdadjoke API allows developers to retrieve a random joke, fetch a specific joke, or search for jokes programmatically. Results are returned as JSON, plain text, GraphQL, an image, or a Slack-formatted message.
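As a brief sketch (our own, based on icanhazdadjoke’s public documentation rather than this listing), the API chooses its response format from the HTTP Accept header, and the service asks clients to identify themselves with a User-Agent:

```python
# Minimal sketch of fetching a random dad joke as JSON from icanhazdadjoke.
# The endpoint negotiates its response format via the Accept header.
import json
import urllib.request

API_URL = "https://icanhazdadjoke.com/"

def build_request(accept="application/json"):
    """Construct the request without sending it, so the header
    logic is easy to inspect."""
    return urllib.request.Request(
        API_URL,
        headers={
            "Accept": accept,
            # The service asks clients to send a descriptive User-Agent.
            "User-Agent": "example-dad-joke-client",
        },
    )

def random_joke():
    # Sends the request; the JSON body includes the joke text.
    with urllib.request.urlopen(build_request()) as resp:
        return json.load(resp)["joke"]
```

Requesting `text/plain` instead of `application/json` returns the joke as bare text; the same endpoint serves every format.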

The Qloo AI API detects trends in user preferences and personalizes related items within a database of culture and entertainment. The API generates matching recommendations for user-to-item or item-to-item searches on a variety of media and lifestyle categories, in addition to mapping related entertainment spots in particular geographic locations. Books, fashion, film, music, podcasts, and television are some of the API’s categories for media and lifestyle items.

The Amazon Alexa Music Skill API enables music service providers such as Spotify or Pandora to add music services to Alexa-powered devices. The API also enables devices to understand a user’s voice commands to play music. For example, a user could say, “Alexa, play ‘I Can See For Miles’ by The Who” to hear that particular song.

The Harry Potter API enables applications to return spell routes, character routes, house routes, and sorting hat routes from the popular children’s book series in JSON format. Parameters for characters include patronus, bloodStatus, school, wand, animagus, ministry of magic, and others. The API is provided by developer Kristen Spencer.

The Tronald Dump API allows access to data that returns quotes of “the dumbest things Donald Trump has ever said.” It supports JSON-formatted responses for several categories, and a Slack integration for viewing available categories, searching categories, and personalizing search preferences.

The Fortune API is a REST interface that returns random fortunes originally from the fortune-mod repository by Shlomi Fish. Developers can add new fortunes or specify a genre by adding the datafile name to the fortuneapi.heroku.com endpoint.

Open Pinball Database (OPDB) provides searchable data about pinball machines and an API to add pinball machine data into applications. All OPDB data results are JSON formatted and available via a REST API for querying data about specific machines, retrieving exports of the entire dataset, using specialized endpoints for typeahead searching, and more.

UI Faces aggregates photos from several web sources to provide avatars with real-looking photos. Developers who use the UI Faces API can specify several parameters such as age, gender, and emotion, so that users may filter and sort the photos according to their needs.

TheRundown offers real-time odds and scores from major sportsbooks. NFL, NBA, WNBA, MLB, NHL, NCAA Football, and NCAA Basketball are supported. TheRundown API returns information for specific sports by ID, along with affiliates and betting lines.


Broadage Sports is a sports data and technology company that offers API access to real-time data feeds for developers’ projects and applications. Developers will need to create an account to get started with the APIs. Individual APIs are provided for Global Data, as well as Soccer, Football, Volleyball, Baseball, Ice Hockey, Basketball, and Handball.

The Star Wars Quotes API returns random quotes from characters in the Star Wars universe. Developers can use the API to integrate The Force into applications. We have listed this API in the Movies category.

Goalserve delivers live sports data feeds for a wide range of competitive sports. We have added several of the company’s APIs, including the Goalserve eSports API and the Goalserve Sports Cricket API, for retrieving data such as live scores, game results, pre-match updates, odds, historical data, team rosters, player information, injuries, schedules, racing entries, detailed tournament and match coverage, and much more.

Fight Analytics is a data company and independent UFC statistics provider that specializes in MMA combat sports. The Fight Analytics API provides data about combat fighting for betting or other uses. The API provides real-time data for live scores, live stats by fighter, text commentaries, historical data, and more, for UFC, One FC, World Series of Fighting, KSW, Bellator, and others. It also provides tools to bookmakers for managing pricing and mathematical models.

DonorDrive and the Extra Life program allow gamers to pledge and play video games to raise funds for Children’s Miracle Network Hospitals. The DonorDrive API enables programmatic access to publicly available data from the Extra Life program. The API offers access to donations, donors, events, achievement badges, participants, and teams.

The EDM Train API retrieves information on upcoming EDM (electronic dance music) events and event locations in the US and Canada. Events can be filtered by event name, artist, venue, start date, and more. Locations can be filtered by city and state.

Captain Coaster is a community website that allows users to review, rate, and rank roller coasters. The Captain Coaster API allows developers to get roller coaster, image, park, and status resources from the Captain Coaster website. Captain Coaster is free to use and is not affiliated with any park, manufacturer, or amusement industry company.

Get ratings, height, speed, location, and other details about roller coasters with this API. Image: Captain Coaster

The CrackWatch API offers JSON-formatted news on the crack status of PC games. The API features endpoints for the newest cracks, with list sorting and NFO support.

Ovrstat provides statistics for Overwatch, a multiplayer first-person shooter video game. Developers can retrieve PC or console stats with the Ovrstat API.

Trint is a transcription platform powered by artificial intelligence. The Trint API allows developers to access transcript and export endpoints. JSON is the preferred response format, and the following audio and video formats are supported: MP3, MP4, M4A, AAC, WMA, AVI, WAV, and MOV.

Hooktheory provides developers with support for discovering the theory behind songs using music theory books, songwriting software, and TheoryTabs (see below). The Hooktheory API exposes the music chord probability data used in TheoryTab Library searches, which enables users to find the most popular chord progressions in music.


TheoryTabs show the theory behind songs. Screenshot: Hooktheory

Quantone offers in-depth data about the music industry. The Quantone API has a REST architecture and endpoints for data about artists, albums, recordings, and musical works. The API offers third-party and public domain identifiers including Spotify, Deezer, SoundCloud, Discogs, and MusicBrainz, as well as industry-standard IDs like UPCs and ISRCs.

ShotTracker provides sensors and wearable technology that deliver automatic real-time basketball stats and analytics. The ShotTracker API is a sports performance platform that provides basketball team statistics such as zone maps, shot charts, and box scores. The API can display a leaderboard for a team or a specific player, analyze stats, and show live player movement in applications.

Go to Source
Author: joyc