Categories
ProgrammableWeb

Sensibill Launches Receipt Extraction API

Sensibill, a provider of SKU-level data and financial tools like digital receipt management that help institutions better know and serve their customers, today announced the launch of its newest product: Receipt Extraction API. The machine learning-based solution automates and streamlines the transcription of receipts, allowing businesses to deepen customer engagement and loyalty at scale.

Sensibill’s Receipt Extraction API solution will benefit a wide range of businesses that need to quickly and accurately extract receipt data at scale. For example, enterprise accounting firms can use the service to reduce costs and maintain profitability, despite economic pressures. Financial services companies like accounting software and PFM providers can gain access to SKU-level data to drive personalization, using the technology to create an innovative edge and differentiate themselves from the competition. And, loyalty and reward companies that need near-perfect extraction capabilities can leverage Receipt Extraction API to help deliver rewards and value back to users more quickly, increasing efficiencies and improving product quality and accuracy.

“There is a new urgency around cost savings, efficiencies, digital engagement and innovation in otherwise mature markets,” explained Corey Gross, CEO of Sensibill. “Our Receipt Extraction API offering uses smart technology to extract receipts in bulk with speed and precision. At Sensibill, we are proven experts in SKU-level data; it’s what we’ve focused on for the past seven years and why leading institutions and digital banking and core providers across the globe have partnered with us. We are excited to help a broader range of organizations as they work to quickly and efficiently unlock the power of SKU-level data to drive deeper digital engagement and loyalty with their customers.”

Sensibill’s combination of deep SKU-level data expertise and leading AI and machine learning technology makes it uniquely positioned to deliver this solution to the market. Receipt Extraction API is powered by multi-brain processing, leveraging multiple OCR engines and machine learning models to maximize accuracy. And, the solution is intuitive and easily deployable, allowing businesses to quickly and nimbly test and implement. To best position businesses for success, Sensibill offers customers strategic account management support and white-glove service for extraction capabilities as needed.

Source: Finextra.com

Author: ProgrammableWeb PR

Categories
ProgrammableWeb

Xignite Announces Updates to its Bond Master API Increasing its Coverage

Xignite, a provider of market data cloud solutions for financial institutions and fintech companies, announced on Wednesday that it recently enhanced its Bond Master API. Xignite offers several APIs that provide real-time, delayed, and historical fixed-income pricing and reference data for corporate and agency debt bonds.

Xignite reported the Bond Master API enhancement increases the coverage from the U.S. to more than 190 countries, adds additional bond types to support more than two million active bond issues, and increases the ease of use of the API with several new endpoints.

Additional detail on the enhanced Bond Master endpoints (a brief usage sketch follows the list):

  • The List endpoint for bond type, issuer type, and domicile enables clients to slice and dice the bond universe differently based on use-case.
  • The ScreenBonds endpoint enables clients to dynamically and easily screen the bond universe by combining criteria based on the coupon rate, maturity date, callability, and issue convertibility.
  • The ListBondDataPoints and GetBondDataPoints endpoints enable clients to more easily pick and choose the reference data points they need to integrate into their systems.
  • The GetBondDataPoints endpoint enables access to additional reference data points without requiring changes to an existing implementation.
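
As a rough illustration of how such an endpoint might be called, the sketch below issues a hypothetical ScreenBonds request. The endpoint name comes from the announcement, but the base URL, parameter names, and token handling are assumptions for demonstration only; consult Xignite's documentation for the actual interface.

    import requests

    # Hypothetical request: the ScreenBonds endpoint name appears in the announcement,
    # but this base URL, these parameter names, and the token format are illustrative
    # placeholders, not Xignite's documented interface.
    BASE_URL = "https://example-bondmaster.xignite.com/xBondMaster.json"  # placeholder host
    params = {
        "CouponRateMin": "3.0",           # screen by coupon rate
        "MaturityDateMin": "2030-01-01",  # screen by maturity date
        "Callable": "false",              # screen by callability
        "_token": "YOUR_API_TOKEN",       # placeholder credential
    }
    response = requests.get(f"{BASE_URL}/ScreenBonds", params=params, timeout=10)
    response.raise_for_status()
    print(response.json())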

Vijay Choudhary, Vice President, Product Management, Market Data Solutions at Xignite, added:

“These enhancements eliminate the need to maintain an on-site bond security master, which ultimately saves our clients time and eliminates significant unnecessary expenses.”

Founded in 2006, Xignite provides cloud-based financial market data APIs to help emerging companies and established enterprises deliver real-time and reference market data to their digital assets, such as websites and apps. The company reported that unlike legacy data providers, which require on-premise infrastructure to store and process market data, its data is accessed through the public cloud.

Author: ProgrammableWeb PR

Categories
ScienceDaily

Comet discovered to have its own northern lights

Data from NASA instruments aboard the ESA (European Space Agency) Rosetta mission have helped reveal that comet 67P/Churyumov-Gerasimenko has its own far-ultraviolet aurora. It is the first time such electromagnetic emissions in the far-ultraviolet have been documented on a celestial object other than a planet or moon. A paper on the findings was released today in the journal Nature Astronomy.

On Earth, auroras (also known as the northern or southern lights) are generated when electrically charged particles speeding from the Sun hit the upper atmosphere to create colorful shimmers of green, white, and red. Elsewhere in the solar system, Jupiter and some of its moons — as well as Saturn, Uranus, Neptune, and even Mars — have all exhibited their own version of northern lights. But the phenomenon had yet to be documented in comets.

Rosetta is space exploration’s most traveled and accomplished comet hunter. Launched in 2004, it orbited comet 67P/Churyumov-Gerasimenko (67P/C-G) from Aug. 2014 until its dramatic end-of-mission comet landing in Sept. 2016. The data for this most recent study concern what mission scientists initially interpreted as “dayglow,” a process caused by photons of light interacting with the envelope of gas — known as the coma — that radiates from, and surrounds, the comet’s nucleus. But new analysis of the data paints a very different picture.

“The glow surrounding 67P/C-G is one of a kind,” said Marina Galand of Imperial College London and lead author of the study. “By linking data from numerous Rosetta instruments, we were able to get a better picture of what was going on. This enabled us to unambiguously identify how 67P/C-G’s ultraviolet atomic emissions form.”

The data indicate 67P/C-G’s emissions are actually auroral in nature. Electrons streaming out in the solar wind — the stream of charged particles flowing out from the Sun — interact with the gas in the comet’s coma, breaking apart water and other molecules. The resulting atoms give off a distinctive far-ultraviolet light. Invisible to the naked eye, far-ultraviolet has the shortest wavelengths of radiation in the ultraviolet spectrum.

Exploring the emission of 67P/C-G will enable scientists to learn how the particles in the solar wind change over time, something that is crucial for understanding space weather throughout the solar system. By providing better information on how the Sun’s radiation affects the space environment they must travel through, such findings could ultimately help protect satellites and spacecraft, as well as astronauts traveling to the Moon and Mars.

“Rosetta is the gift that keeps on giving,” said Paul Feldman, an investigator on Alice at the Johns Hopkins University in Baltimore and a co-author of the paper. “The treasure trove of data it returned over its two-year visit to the comet have allowed us to rewrite the book on these most exotic inhabitants of our solar system — and by all accounts there is much more to come.”

NASA Instruments Aboard ESA’s Rosetta

NASA-supplied instruments contributed to this investigation. The Ion and Electron Sensor (IES) instrument detected the amount and energy of electrons near the spacecraft, the Alice instrument measured the ultraviolet light emitted by the aurora, and the Microwave Instrument for the Rosetta Orbiter (MIRO) measured the amount of water molecules around the comet (the MIRO instrument includes contributions from France, Germany, and Taiwan). Other instruments aboard the spacecraft used in the research were the Italian Space Agency’s Visible and InfraRed Thermal Imaging Spectrometer (VIRTIS), the Langmuir Probe (LAP) provided by Sweden, and the Rosetta Orbiter Spectrometer for Ion and Neutral Analysis (ROSINA) provided by Switzerland.

Rosetta was an ESA mission with contributions from its member states and NASA. Rosetta’s Philae lander, which successfully landed on the comet in November 2014, was provided by a consortium led by the German Aerospace Center in Cologne; Max Planck Institute for Solar System Research in Gottingen, Germany; the French National Space Agency in Paris; and the Italian Space Agency in Rome. A division of Caltech, NASA’s Jet Propulsion Laboratory in Southern California managed the U.S. contribution of the Rosetta mission for NASA’s Science Mission Directorate in Washington. JPL also built the MIRO and hosts its principal investigator, Mark Hofstadter. The Southwest Research Institute (San Antonio and Boulder, Colorado), developed the Rosetta orbiter’s IES and Alice instruments and hosts their principal investigators, James Burch (IES) and Joel Parker (Alice).

For more information on the U.S. instruments aboard Rosetta, visit: http://rosetta.jpl.nasa.gov

More information about Rosetta is available at: http://www.esa.int/rosetta

Story Source:

Materials provided by NASA/Jet Propulsion Laboratory. Note: Content may be edited for style and length.


Categories
ScienceDaily

Researchers demonstrate record speed with advanced spectroscopy technique

Researchers have developed an advanced spectrometer that can acquire data with exceptionally high speed. The new spectrometer could be useful for a variety of applications including remote sensing, real-time biological imaging and machine vision.

Spectrometers measure the color of light absorbed or emitted from a substance. However, using such systems for complex and detailed measurement typically requires long data acquisition times.

“Our new system can measure a spectrum in mere microseconds,” said research team leader Scott B. Papp from the National Institute of Standards and Technology and the University of Colorado, Boulder. “This means it could be used for chemical studies in the dynamic environment of power plants or jet engines, for quality control of pharmaceuticals or semiconductors flying by on a production line, or for video imaging of biological samples.”

In The Optical Society (OSA) journal Optics Express, lead author David R. Carlson and colleagues Daniel D. Hickstein and Papp report the first dual-comb spectrometer with a pulse repetition rate of 10 gigahertz. They demonstrate it by carrying out spectroscopy experiments on pressurized gases and semiconductor wafers.

“Frequency combs are already known to be useful for spectroscopy,” said Carlson. “Our research is focused on building new, high-speed frequency combs that can make a spectrometer that operates hundreds of times faster than current technologies.”

Getting data faster

Dual-comb spectroscopy uses two optical sources, known as optical frequency combs, that emit a spectrum of colors — or frequencies — perfectly spaced like the teeth on a comb. Frequency combs are useful for spectroscopy because they provide access to a wide range of colors that can be used to distinguish various substances.
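
As a back-of-the-envelope sketch of why a dual-comb measurement can be so fast: the two combs run at slightly different repetition rates, which maps the widely spaced optical comb teeth into a much more narrowly spaced radio-frequency comb that electronics can record. The 10 GHz repetition rate comes from the article; the repetition-rate offset used below is an assumed value for illustration.

    # Illustrative numbers: the 10 GHz repetition rate is reported in the article,
    # but the repetition-rate offset between the two combs is an assumption here.
    f_rep = 10e9      # repetition rate of comb 1 (Hz)
    delta_f = 1e6     # assumed repetition-rate offset of comb 2 (Hz)

    acquisition_time = 1.0 / delta_f   # time needed to record one interferogram
    compression = f_rep / delta_f      # factor by which optical spacing maps down to RF

    print(f"Single-spectrum acquisition time: {acquisition_time * 1e6:.1f} microseconds")
    print(f"Optical comb spacing is compressed into RF by a factor of {compression:,.0f}")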

To create a dual-comb spectroscopy system with extremely fast acquisition and a wide range of colors, the researchers brought together techniques from several different disciplines, including nanofabrication, microwave electronics, spectroscopy and microscopy.

The frequency combs in the new system use an optical modulator driven by an electronic signal to carve a continuous laser beam into a sequence of very short pulses. These pulses of light pass through nanophotonic nonlinear waveguides on a microchip, generating many colors of light simultaneously. This multi-color output, known as a supercontinuum, can then be used to make precise spectroscopy measurements of solids, liquids and gases.

The chip-based nanophotonic nonlinear waveguides were a key component in this new system. These channels confine light within structures that are a centimeter long but only nanometers wide. Their small size and low light losses combined with the properties of the material they are made from allow them to convert light from one wavelength to another very efficiently to create the supercontinuum.

“The frequency comb source itself is also unique compared to most other dual-comb systems because it is generated by carving a continuous laser beam into pulses with an electro-optic modulator,” said Carlson. “This means the reliability and tunability of the laser can be exceptionally high across a wide range of operating conditions, an important feature when looking at future applications outside of a laboratory environment.”

Analyzing gases and solids

To demonstrate the versatility of the new dual-comb spectrometer, the researchers used it to perform linear absorption spectroscopy on gases of different pressure. They also operated it in a slightly different configuration to perform the advanced analytical technique known as nonlinear Raman spectroscopy on semiconductor materials. Nonlinear Raman spectroscopy, which uses pulses of light to characterize the vibrations of molecules in a sample, has not previously been performed using an electro-optic frequency comb.

The high data acquisition speeds that are possible with electro-optic combs operating at gigahertz pulse rates are ideal for making spectroscopy measurements of fast and non-repeatable events.

“It may be possible to analyze and capture the chemical signatures during an explosion or combustion event,” said Carlson. “Similarly, in biological imaging the ability to create images in real time of living tissues without requiring chemical labeling would be immensely valuable to biological researchers.”

The researchers are now working to improve the system’s performance to make it practical for applications like real-time biological imaging and to simplify and shrink the experimental setup so that it could be operated outside of the lab.

Story Source:

Materials provided by The Optical Society. Note: Content may be edited for style and length.


Categories
ProgrammableWeb

Amazon Introduces Data API for Redshift

Amazon has announced that Amazon Redshift (a managed cloud data warehouse) is now accessible via the built-in Redshift Data API. Such access makes it easier for developers to build web services applications that integrate with services such as AWS Lambda, AWS AppSync, and AWS Cloud9. Further, developers no longer need to manage database connections and credentials themselves.

The API is available in all AWS regions with the exception of AWS GovCloud and Asia Pacific. To use the API, developers execute SQL commands against the Amazon Redshift cluster through an HTTPS endpoint provided by the Data API.

Instead of requiring developers to manage credentials on their own, the API uses IAM user credentials or database credentials stored in AWS Secrets Manager. Credentials are not passed in API calls. Authentication is handled slightly differently depending on whether developers are working in AWS Lambda, the AWS SDK, or other environments. For more information, check out the Amazon Redshift docs.
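
A minimal sketch of that flow using the AWS SDK for Python (boto3) is shown below; the cluster identifier, database name, secret ARN, and query are placeholders.

    import time
    import boto3

    client = boto3.client("redshift-data")  # Redshift Data API client

    # Placeholder identifiers -- substitute your own cluster, database, and secret.
    submitted = client.execute_statement(
        ClusterIdentifier="my-redshift-cluster",
        Database="dev",
        SecretArn="arn:aws:secretsmanager:us-east-1:123456789012:secret:my-redshift-secret",
        Sql="SELECT venuename, venuecity FROM venue LIMIT 5;",
    )

    # The call is asynchronous: poll until the statement finishes, then fetch the rows.
    statement_id = submitted["Id"]
    while client.describe_statement(Id=statement_id)["Status"] not in ("FINISHED", "FAILED", "ABORTED"):
        time.sleep(1)

    result = client.get_statement_result(Id=statement_id)
    for record in result["Records"]:
        print([column.get("stringValue") for column in record])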

Ingest and egress are available for all languages supported by the AWS SDK, including Python, Go, Java, Node.js, PHP, Ruby, and C++. For more information on Redshift in general, visit the product page. Cost information is available on the pricing page.

Author: ecarter

Categories
ProgrammableWeb

MX Announces New APIs and Developer Portal for its Financial Services

MX, a financial data platform that operates with the expressed aim of powering modern financial experiences, has announced a new platform designed to provide partners with APIs that can be used to develop personalized digital experiences. The all-new MX Open platform comprises three core elements: MX Portal, MX Platform API, and MX Path.

The newly introduced MX Portal is a central hub that provides developers with documentation, Financial Data Exchange (FDX) guidance, and best practices to help improve onboarding. The announcement highlighted a focus on security and adherence to FDX 4.1 standards.

The MX Platform API that supports this effort is designed to provide support for integration with “customer account information and verify and authenticate identity, assets, balances, and amounts.” The idea is to provide an API that enables accurate, reliable account information at scale.

Finally, MX Path is a new API that aims to help financial institutions and fintechs simplify integration with services, apps, and systems.

Author: KevinSundstrom

Categories
ProgrammableWeb

Top 10 Countries APIs

Developers wishing to create applications supplied with data about individual countries and other international data may be interested in these APIs found in the Countries category of ProgrammableWeb.

What is a Countries API?

A Countries API, or Application Programming Interface, is an interface that connects developers to software featuring data concerning countries around the world.

APIs in the Countries category may provide data about demographics, geography, culture, flags, airport codes, postal codes, capital cities, universities, natural history, currency and financial markets, environment and any number of regional features.

This article focuses on the 10 favorite Countries APIs of ProgrammableWeb readers.

1. REST Countries API

REST Countries API provides information about the world’s nations via REST calls. These calls allow users to retrieve all available countries or to retrieve a given country’s currency, capital city, calling code, region, sub-region, ISO 639-1 language, name, or country code.
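
A minimal sketch of such a call is shown below. It assumes the publicly hosted v3.1 endpoint at restcountries.com; the hosting domain and response field names have changed between versions of the project, so treat both as assumptions.

    import requests

    # Assumed base URL and response shape (REST Countries v3.1); earlier versions differ.
    response = requests.get("https://restcountries.com/v3.1/name/norway", timeout=10)
    response.raise_for_status()
    country = response.json()[0]
    print(country["name"]["common"], country["capital"], country["region"])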

2. Nutritics API

Nutritics API supports the extraction and manipulation of nutrition and food-related data from the official national databases of countries around the world. It operates as a multilingual gateway portal to country-specific nutrition and dietetics data. The API is useful for developing analytical add-ons and applications for recipes, diets, and meal planning, in addition to customizing access to academic research and collaboration resources.

Nutritics API provides nutrition data for products from specific countries. Image: Nutritics

3. University Domains and Names Data List API

The University Domains and Names Data List API from Hipo Labs retrieves JSON files with domains, names, and countries of national and international universities.

4. Open AQ API

OpenAQ uses a combination of open data and open source tools as well as a global, grassroots community to fight air inequality in different locations across the world. Use the Open AQ API to build apps that power a variety of air quality measurement tools. The API conveys responses in JSON format.

5. Numbeo Cost of Living API

The Numbeo Cost of Living API integrates living conditions comparisons between two cities or countries. Methods include cities, price items, currency exchange, hotel prices, indices, crime, healthcare, pollution, traffic, and climate.

6. GeoDataSource Neighbouring Countries API

The GeoDataSource Neighbouring Countries Web Service enables users to get the associated land border countries (the neighboring countries) based on the input of a country code in ISO3166-2 format.

7. World Bank Country API

The World Bank Country API returns country data including region, income level, ISO codes, lending type, capital city, longitude, and latitude. Data is provided in JSON or XML formats.
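
A brief sketch against the public v2 endpoint is shown below; note that the JSON response wraps the country records in a metadata/data pair.

    import requests

    # Request country data for Brazil in JSON; the response is a [metadata, records] pair.
    response = requests.get(
        "https://api.worldbank.org/v2/country/BR",
        params={"format": "json"},
        timeout=10,
    )
    response.raise_for_status()
    metadata, countries = response.json()
    for country in countries:
        print(country["name"], country["region"]["value"],
              country["incomeLevel"]["value"], country["capitalCity"])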

8. Graph Countries GraphQL API

The Graph Countries API is a free GraphQL API to query country-related data like currencies, languages, flags, regions+subregions, bordering countries, and distance to other countries.

9. The Basetrip API

The Basetrip API offers a variety of travel information by country including currency information, electricity (sockets & plugs), ATM locations, credit and debit card information, driving data, dial codes, health-related information, and emergency numbers. Additionally, Basetrip offers GeoJSON formatted data.


Add travel information about various countries to apps via this API. Image: The Basetrip

10. Tuxx EU Country API

The Tuxx EU Country API checks whether a given country is a member of the European Union, the economic and political union of 28 member states located primarily in Europe.

Head over to the Countries category for more than 40 APIs, SDKs, and Source Code Samples.

Author: joyc

Categories
ProgrammableWeb

Automotive Titling Corporation Launches Granular Fees API

Automotive Titling Corporation (ATC), a title and registration data solution provider, has launched its ATC Granular Fees (AGF) API. The API provides users the ability to calculate tax rates, taxable values, local flat taxes, and registration fees at a granular level. By “granular,” ATC means a level of detail that mirrors DMV receipts.

“ATC leads the in-state and out-of-state titling and registration services industry,” Damon Bennett, VP, Business Development for ATC, commented in a press release. “And with the ATC Granular Fees API, we’re able to drill down to the most granular level of registration fees and taxes, so our customers can have the most accurate information possible.”

API users can dynamically choose the level of fees for the API response. That includes over a hundred fees in various jurisdictions. The fees can be broken down per line item or aggregated into a single number. The response also includes supporting data (e.g. differences from location to location). The API also covers certain qualification data (e.g. requirements for military waivers).
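
To make the line-item-versus-aggregate distinction concrete, here is a purely hypothetical response shape; ATC has not published field names in this announcement, so every key below is illustrative.

    # Purely hypothetical response shape for illustration; ATC's actual endpoint,
    # field names, and breakdown structure are not documented in this announcement.
    itemized_response = {
        "jurisdiction": "CO-DENVER",
        "fees": [
            {"name": "Registration fee", "amount": 75.00},
            {"name": "Ownership tax", "amount": 312.40},
            {"name": "Local flat tax", "amount": 10.00},
        ],
    }
    # The same information, aggregated into a single number per the article.
    aggregated_total = sum(fee["amount"] for fee in itemized_response["fees"])
    print(f"Aggregated total: {aggregated_total:.2f}")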

The API is RESTful. ATC authenticates and verifies title and registration data on a daily basis. DMV requirements are continually incorporated into the API backend. In addition to fees and rates, the API provides necessary documentation (e.g. DMV applications and forms) specific to the relevant jurisdiction. To learn more, visit the API site.

Author: ecarter

Categories
ScienceDaily

Small quake clusters can’t hide from AI

Researchers at Rice University’s Brown School of Engineering are using data gathered before a deadly 2017 landslide in Greenland to show how deep learning may someday help predict seismic events like earthquakes and volcanic eruptions.

Seismic data collected before the massive landslide at a Greenland fjord shows the subtle signals of the impending event were there, but no human analyst could possibly have put the clues together in time to make a prediction. The resulting tsunami that devastated the village of Nuugaatsiaq killed four people, injured nine, and washed 11 buildings into the sea.

A study led by former Rice visiting scholar Léonard Seydoux, now an assistant professor at the University of Grenoble-Alpes, employs techniques developed by Rice engineers and co-authors Maarten de Hoop and Richard Baraniuk. Their open-access report in Nature Communications shows how deep learning methods can process the overwhelming amount of data provided by seismic tools fast enough to predict events.

De Hoop, who specializes in mathematical analysis of inverse problems and deep learning in connection with Rice’s Department of Earth, Environmental and Planetary Sciences, said advances in artificial intelligence (AI) are well-suited to independently monitor large and growing amounts of seismic data. AI has the ability to identify clusters of events and detect background noise to make connections that human experts might not recognize due to biases in their models, not to mention sheer volume, he said.

Hours before the Nuugaatsiaq event, those small signals began to appear in data collected by a nearby seismic station. The researchers analyzed data from midnight on June 17, 2017, until one minute before the slide at 11:39 p.m. that released up to 51 million cubic meters of material.

The Rice algorithm revealed weak but repetitive rumblings — undetectable in raw seismic records — that began about nine hours before the event and accelerated over time, leading to the landslide.
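
As a generic illustration of the kind of unsupervised workflow involved (this is not the method published in the study, and every number below is an assumption), one can describe fixed-length windows of a seismic trace by their average spectra and group similar windows together; repetitive precursory rumblings would tend to collect into their own cluster.

    import numpy as np
    from scipy.signal import spectrogram
    from sklearn.cluster import KMeans

    # Assumed sampling rate and a synthetic placeholder trace standing in for
    # one hour of continuous data from a seismic station.
    fs = 100.0
    trace = np.random.randn(int(fs * 3600))

    # Split the trace into 60-second windows and describe each one by its
    # average log power per frequency bin.
    win = int(fs * 60)
    windows = trace[: len(trace) // win * win].reshape(-1, win)
    features = []
    for w in windows:
        freqs, times, Sxx = spectrogram(w, fs=fs, nperseg=256)
        features.append(np.log10(Sxx.mean(axis=1) + 1e-12))
    features = np.array(features)

    # Group windows with similar spectral signatures; a repeating low-level
    # signal would show up as a distinct, growing cluster over time.
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
    print(np.bincount(labels))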

“There was a precursor paper to this one by our co-author, Piero Poli at Grenoble, that studied the event without AI,” de Hoop said. “They discovered something in the data they thought we should look at, and because the area is isolated from a lot of other noise and tectonic activity, it was the purest data we could work with to try our ideas.”

De Hoop is continuing to test the algorithm to analyze volcanic activity in Costa Rica and is also involved with NASA’s InSight lander, which delivered a seismic detector to the surface of Mars nearly two years ago.

Constant monitoring that delivers such warnings in real time will save lives, de Hoop said.

“People ask me if this study is significant — and yes, it is a major step forward — and then if we can predict earthquakes. We’re not quite ready to do that, but this direction is, I think, one of the most promising at the moment.”

When de Hoop joined Rice five years ago, he brought expertise in solving inverse problems that involve working backwards from data to find a cause. Baraniuk is a leading expert in machine learning and compressive sensing, which help extract useful data from sparse samples. Together, they’re a formidable team.

“The most exciting thing about this work is not the current result, but the fact that the approach represents a new research direction for machine learning as applied to geophysics,” Baraniuk said.

“I come from the mathematics of deep learning and Rich comes from signal processing, which are at opposite ends of the discipline,” de Hoop said. “But here we meet in the middle. And now we have a tremendous opportunity for Rice to build upon its expertise as a hub for seismologists to gather and put these pieces together. There’s just so much data now that it’s becoming impossible to handle any other way.”

De Hoop is helping to grow Rice’s reputation for seismic expertise with the Simons Foundation Math+X Symposia, which have already featured events on space exploration and mitigating natural hazards like volcanoes and earthquakes. A third event, dates to be announced, will study deep learning applications for solar giants and exoplanets.

Story Source:

Materials provided by Rice University. Note: Content may be edited for style and length.


Categories
ScienceDaily

Warming Greenland ice sheet passes point of no return

Nearly 40 years of satellite data from Greenland shows that glaciers on the island have shrunk so much that even if global warming were to stop today, the ice sheet would continue shrinking.

The finding, published today, Aug. 13, in the journal Nature Communications Earth and Environment, means that Greenland’s glaciers have passed a tipping point of sorts, where the snowfall that replenishes the ice sheet each year cannot keep up with the ice that is flowing into the ocean from glaciers.

“We’ve been looking at these remote sensing observations to study how ice discharge and accumulation have varied,” said Michalea King, lead author of the study and a researcher at The Ohio State University’s Byrd Polar and Climate Research Center. “And what we’ve found is that the ice that’s discharging into the ocean is far surpassing the snow that’s accumulating on the surface of the ice sheet.”

King and other researchers analyzed monthly satellite data from more than 200 large glaciers draining into the ocean around Greenland. Their observations show how much ice breaks off into icebergs or melts from the glaciers into the ocean. They also show the amount of snowfall each year — the way these glaciers get replenished.

The researchers found that, throughout the 1980s and 90s, snow gained through accumulation and ice melted or calved from glaciers were mostly in balance, keeping the ice sheet intact. Through those decades, the researchers found, the ice sheets generally lost about 450 gigatons (about 450 billion tons) of ice each year from flowing outlet glaciers, which was replaced with snowfall.

“We are measuring the pulse of the ice sheet — how much ice glaciers drain at the edges of the ice sheet — which increases in the summer. And what we see is that it was relatively steady until a big increase in ice discharging to the ocean during a short five- to six-year period,” King said.

The researchers’ analysis found that the baseline of that pulse — the amount of ice being lost each year — started increasing steadily around 2000, so that the glaciers were losing about 500 gigatons each year. Snowfall did not increase at the same time, and over the last decade, the rate of ice loss from glaciers has stayed about the same — meaning the ice sheet has been losing ice more rapidly than it’s being replenished.
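
As a back-of-the-envelope illustration of that shift, the sketch below uses the roughly 450 and 500 gigaton figures from the article, plus the standard approximation that about 360 gigatons of ice raise global sea level by one millimeter (the conversion factor is not a number from the study).

    # Approximate figures from the article; the Gt-to-mm conversion is a standard
    # approximation, not a value reported by the researchers.
    accumulation_gt = 450.0        # annual snowfall replenishment (Gt)
    discharge_1990s_gt = 450.0     # annual ice discharge, 1980s-90s (Gt)
    discharge_2000s_gt = 500.0     # annual ice discharge after ~2000 (Gt)
    GT_PER_MM_SEA_LEVEL = 360.0    # approx. gigatons of ice per mm of sea-level rise

    net_1990s = accumulation_gt - discharge_1990s_gt
    net_2000s = accumulation_gt - discharge_2000s_gt

    print(f"Net balance, 1980s-90s: {net_1990s:+.0f} Gt/yr")
    print(f"Net balance, post-2000: {net_2000s:+.0f} Gt/yr "
          f"(about {abs(net_2000s) / GT_PER_MM_SEA_LEVEL:.2f} mm/yr of sea-level rise)")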

“Glaciers have been sensitive to seasonal melt for as long as we’ve been able to observe it, with spikes in ice discharge in the summer,” she said. “But starting in 2000, you start superimposing that seasonal melt on a higher baseline — so you’re going to get even more losses.”

Before 2000, the ice sheet would have about the same chance to gain or lose mass each year. In the current climate, the ice sheet will gain mass in only one out of every 100 years.

King said that large glaciers across Greenland have retreated about 3 kilometers on average since 1985 — “that’s a lot of distance,” she said. The glaciers have shrunk back enough that many of them are sitting in deeper water, meaning more ice is in contact with water. Warm ocean water melts glacier ice, and also makes it difficult for the glaciers to grow back to their previous positions.

That means that even if humans were somehow miraculously able to stop climate change in its tracks, ice lost from glaciers draining ice to the ocean would likely still exceed ice gained from snow accumulation, and the ice sheet would continue to shrink for some time.

“Glacier retreat has knocked the dynamics of the whole ice sheet into a constant state of loss,” said Ian Howat, a co-author on the paper, professor of earth sciences and distinguished university scholar at Ohio State. “Even if the climate were to stay the same or even get a little colder, the ice sheet would still be losing mass.”

Shrinking glaciers in Greenland are a problem for the entire planet. The ice that melts or breaks off from Greenland’s ice sheets ends up in the Atlantic Ocean — and, eventually, all of the world’s oceans. Ice from Greenland is a leading contributor to sea level rise — last year, enough ice melted or broke off from the Greenland ice sheet to cause the oceans to rise by 2.2 millimeters in just two months.

The new findings are bleak, but King said there are silver linings.

“It’s always a positive thing to learn more about glacier environments, because we can only improve our predictions for how rapidly things will change in the future,” she said. “And that can only help us with adaptation and mitigation strategies. The more we know, the better we can prepare.”

This work was supported by grants from NASA. Other Ohio State researchers who worked on this study are Salvatore Candela, Myoung Noh and Adelaide Negrete.
