ProgrammableWeb

7 Top Fantasy Sports APIs

Fantasy sports leagues are more popular than ever these days, with an estimated 60 million people participating in league play. Participants typically create virtual teams based on real players of various sports, then use those players’ real-world statistics to compute final scores and compete against other virtual teams. Nearly every real team sport imaginable now has a fantasy component played by fans, with many leagues requiring dues from players and paying out winnings to winners.

Several “official” fantasy sports leagues are commissioned by real leagues, including the NFL and Premier League soccer. Other leagues are created by individual or partner organizations for a plethora of sports, including basketball, baseball, soccer, college sports, UFC, golf, tennis, auto racing and eSports. Developers looking to create applications to accompany this popular pastime can start by finding the best APIs to suit their needs.

What is a Fantasy Sports API?

A Fantasy Sports API is an Application Programming Interface that enables developers to create applications that tap into Fantasy Sports data.

The best place to find these APIs is in the Fantasy Sports category in the ProgrammableWeb directory. In this article we highlight some favorites from our readers.

1. Sportradar Sports Data API

Sportradar provides real-time, accurate sports statistics and sports content. Sportradar’s data coverage includes all major U.S. sports, plus hundreds of leagues throughout the world. Data can be retrieved from Sportradar via a REST API. This data includes schedules, standings, statistics, play-by-play, live images, and more.

2. Yahoo Fantasy Sports API

Yahoo Fantasy Sports allows users to compete against each other using statistics from real-world competitions. The Yahoo Fantasy Sports API provides rich data on leagues, teams and player information. Use it to analyze draft results, review free agents, optimize current rosters, or create other applications. The Yahoo Fantasy Sports API utilizes the Yahoo Query Language (YQL) as a mechanism to access Yahoo Fantasy Sports data, returning data in XML and JSON formats.
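As a rough illustration, a resource request against the Yahoo Fantasy Sports API can be assembled as below. The base URL and resource path follow Yahoo's documented URL pattern, but the league key is a placeholder, and real calls require OAuth authorization, which is omitted here; treat this as a sketch, not a drop-in client.

```python
# Sketch: assembling a Yahoo Fantasy Sports API resource URL.
# "nfl.l.12345" is a made-up league key for illustration.
BASE = "https://fantasysports.yahooapis.com/fantasy/v2"

def league_standings_url(league_key: str, fmt: str = "json") -> str:
    """Return the request URL for a league's standings resource."""
    return f"{BASE}/league/{league_key}/standings?format={fmt}"

url = league_standings_url("nfl.l.12345")
# A real request would attach an OAuth token, e.g.:
# requests.get(url, headers={"Authorization": f"Bearer {token}"})
```

The `format` query parameter selects JSON or XML output, matching the two response formats mentioned above.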

3. Cric API

CricAPI provides data about the game of cricket. Use the API to get live cricket match data, a list of matches, the latest scores, and player batting and bowling stats. The CricAPI Fantasy API can be used before the match to help you choose players (batsmen / bowlers) for your fantasy game; once this is done, you can hit the API at regular intervals and calculate the results of your fantasy cricket game.
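The "calculate the results" step above amounts to scoring each player from their live stats. The sketch below uses invented scoring weights and an invented stats-dict shape; CricAPI's real fantasy endpoints return their own JSON schema, so adapt the field names to the actual response.

```python
# Toy fantasy-cricket scorer. Weights and dict keys are hypothetical.
def fantasy_points(stats: dict) -> int:
    """Score one player from batting/bowling/fielding stats (made-up rules)."""
    return (stats.get("runs", 0) * 1         # 1 point per run
            + stats.get("wickets", 0) * 25   # 25 points per wicket
            + stats.get("catches", 0) * 10)  # 10 points per catch

squad = [
    {"name": "Batsman A", "runs": 74, "catches": 1},
    {"name": "Bowler B", "runs": 12, "wickets": 3},
]
total = sum(fantasy_points(p) for p in squad)  # 84 + 87 = 171
```

In a live app you would refresh the squad's stats from the API at regular intervals and re-run the scorer.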

4. ProFootballAPI.com API

The ProFootballAPI NFL API provides users with access to a database of current and past NFL football statistics and game information. The database is updated every minute, even while games are being played. Data is available going back to 2009. The NFL API can answer simple queries or return large data sets for more in-depth use.

5. Goalserve MLB API

Goalserve provides live sports data feeds for multiple sports. The Goalserve Sports Data Feeds MLB API delivers fixtures, live scores, results, in-game player statistics, profiles, injuries, odds, historical data since 2010, prematch data, and more.

6. GameScorekeeper API

GameScorekeeper provides feeds of data about eSports, including League of Legends, Counter-Strike: Global Offensive, Heroes of the Storm, and Dota 2. The GameScorekeeper REST API provides JSON data related to eSports, such as upcoming matches, competitions, teams, and results. The GameScorekeeper Live API provides real-time data from eSports matches through WebSockets.

7. SportMonks Soccer API

SportMonks is a provider of data feeds for a variety of professional sports. The SportMonks Soccer API provides data feeds for live scores, full season fixtures, video highlights, and in-play odds, among other features. Users can access historical data stretching back to 2005.

Screenshot: SportMonks

Check out the Fantasy Sports category for more APIs, plus SDKs, Source Code Samples, and other resources.

Go to Source
Author: joyc

ScienceDaily

Medical robotic hand? Rubbery semiconductor makes it possible

A medical robotic hand could allow doctors to more accurately diagnose and treat people from halfway around the world, but currently available technologies aren’t good enough to match the in-person experience.

Researchers report in Science Advances that they have designed and produced a smart electronic skin and a medical robotic hand capable of assessing vital diagnostic data by using a newly invented rubbery semiconductor with high carrier mobility.

Cunjiang Yu, Bill D. Cook Associate Professor of Mechanical Engineering at the University of Houston and corresponding author for the work, said the rubbery semiconductor material also can be easily scaled for manufacturing, based upon assembly at the interface of air and water.

That interfacial assembly and the rubbery electronic devices described in the paper suggest a pathway toward soft, stretchy rubbery electronics and integrated systems that mimic the mechanical softness of biological tissues, suitable for a variety of emerging applications, said Yu, who also is a principal investigator at the Texas Center for Superconductivity at UH.

The smart skin and medical robotic hand are just two potential applications, created by the researchers to illustrate the discovery’s utility.

In addition to Yu, authors on the paper include Ying-Shi Guan, Anish Thukral, Kyoseung Sim, Xu Wang, Yongcao Zhang, Faheem Ershad, Zhoulyu Rao, Fengjiao Pan and Peng Wang, all of whom are affiliated with UH. Co-authors Jianliang Xiao and Shun Zhang are affiliated with the University of Colorado.

Traditional semiconductors are brittle, and using them in otherwise stretchable electronics has required special mechanical accommodations. Previous stretchable semiconductors have had drawbacks of their own, including low carrier mobility — the speed at which charge carriers can move through a material — and complicated fabrication requirements.

Yu and collaborators last year reported that adding minute amounts of metallic carbon nanotubes to the rubbery semiconductor, a P3HT-polydimethylsiloxane composite, improves carrier mobility, which governs the performance of semiconductor transistors.

Yu said the new scalable manufacturing method for these high performance stretchable semiconducting nanofilms and the development of fully rubbery transistors represent a significant step forward.

The production is simple, he said. A commercially available semiconductor material is dissolved in a solution and dropped on water, where it spreads; the chemical solvent evaporates from the solution, resulting in improved semiconductor properties.

It is a new way to create the high quality composite films, he said, allowing for consistent production of fully rubbery semiconductors.

Electrical performance is retained even when the semiconductor is stretched by 50%, the researchers reported. Yu said the ability to stretch the rubbery electronics by 50% without degrading the performance is a notable advance. Human skin, he said, can be stretched only about 30% without tearing.

Story Source:

Materials provided by University of Houston. Original written by Jeannie Kever. Note: Content may be edited for style and length.

Go to Source
Author:

ScienceDaily

Fast calculation dials in better batteries

A simpler and more efficient way to predict performance will lead to better batteries, according to Rice University engineers.

That their method is 100,000 times faster than current modeling techniques is a nice bonus.

The analytical model developed by materials scientist Ming Tang and graduate student Fan Wang of Rice University’s Brown School of Engineering doesn’t require complex numerical simulation to guide the selection and design of battery components and how they interact.

The simplified model developed at Rice — freely accessible online — does the heavy lifting with an accuracy within 10% of more computationally intensive algorithms. Tang said it will allow researchers to quickly evaluate the rate capability of batteries that power the planet.

The results appear in the open-access journal Cell Reports Physical Science.

There was a clear need for the updated model, Tang said.

“Almost everyone who designs and optimizes battery cells uses a well-established approach called P2D (for pseudo-two dimensional) simulations, which are expensive to run,” Tang said. “This especially becomes a problem if you want to optimize battery cells, because they have many variables and parameters that need to be carefully tuned to maximize the performance.

“What motivated this work is our realization that we need a faster, more transparent tool to accelerate the design process, and offer simple, clear insights that are not always easy to obtain from numerical simulations,” he said.

Battery optimization generally involves what the paper calls a “perpetual trade-off” between energy (the amount it can store) and power density (the rate of its release), all of which depends on the materials, their configurations and such internal structures as porosity.

“There are quite a few adjustable parameters associated with the structure that you need to optimize,” Tang said. “Typically, you need to make tens of thousands of calculations and sometimes more to search the parameter space and find the best combination. It’s not impossible, but it takes a really long time.”

He said the Rice model could be easily implemented in such common software as MATLAB and Excel, and even on calculators.
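To make the "tens of thousands of calculations" point concrete, the sketch below sweeps two of the design parameters named above (electrode thickness and porosity) against a toy objective. The objective function is an invented stand-in, not the Rice analytical model: it rewards stored energy (more solid material) and penalizes ion-transport losses in thick, dense electrodes. The point is only that a fast closed-form model makes an exhaustive sweep like this cheap.

```python
def usable_energy(thickness_um: float, porosity: float) -> float:
    """Invented stand-in objective: stored energy minus a transport penalty."""
    energy = thickness_um * (1.0 - porosity)                          # more solid -> more storage
    penalty = 0.002 * thickness_um**2 * (1.0 - porosity) / porosity   # thick, dense -> poor rate
    return energy - penalty

# Exhaustive sweep over thickness (micrometres) and porosity.
best = max(
    ((t, p) for t in range(10, 201, 5)
            for p in [x / 100 for x in range(20, 61, 5)]),
    key=lambda tp: usable_energy(*tp),
)
# For this toy objective the sweep lands on an interior optimum
# rather than an extreme of either parameter.
```

The trade-off shape (an interior optimum in both thickness and porosity) mirrors the energy-versus-power tension the article describes, even though the formula itself is fabricated for illustration.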

To test the model, the researchers let it search for the optimal porosity and thickness of an electrode in common full- and half-cell batteries. In the process, they discovered that electrodes with “uniform reaction” behavior such as nickel-manganese-cobalt and nickel-cobalt-aluminum oxide are best for applications that require thick electrodes to increase the energy density.

They also found that battery half-cells (with only one electrode) have inherently better rate capability, meaning their performance is not a reliable indicator of how electrodes will perform in the full cells used in commercial batteries.

The study is related to the Tang lab’s attempts at understanding and optimizing the relationship between microstructure and performance of battery electrodes, the topic of several recent papers that showed how defects in cathodes can speed lithium absorption and how lithium cells can be pushed too far in the quest for speed.

Story Source:

Materials provided by Rice University. Note: Content may be edited for style and length.

Go to Source
Author:

ScienceDaily

Future autonomous machines may build trust through emotion

Army research has extended the state-of-the-art in autonomy by providing a more complete picture of how actions and nonverbal signals contribute to promoting cooperation. Researchers suggested guidelines for designing autonomous machines such as robots, self-driving cars, drones and personal assistants that will effectively collaborate with Soldiers.

Dr. Celso de Melo, computer scientist with the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory at CCDC ARL West in Playa Vista, California, in collaboration with Dr. Kazunori Terada from Gifu University in Japan, recently published a paper in Scientific Reports where they show that emotion expressions can shape cooperation.

Autonomous machines that act on people’s behalf are poised to become pervasive in society, de Melo said; however, for these machines to succeed and be adopted, it is essential that people are able to trust and cooperate with them.

“Human cooperation is paradoxical,” de Melo said. “An individual is better off being a free rider, while everyone else cooperates; however, if everyone thought like that, cooperation would never happen. Yet, humans often cooperate. This research aims to understand the mechanisms that promote cooperation with a particular focus on the influence of strategy and signaling.”

Strategy defines how individuals act in one-shot or repeated interaction. For instance, tit-for-tat is a simple strategy that specifies that the individual should act as his/her counterpart acted in the previous interaction.
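The tit-for-tat strategy described above fits in a few lines: cooperate on the first move, then mirror whatever the counterpart did last.

```python
def tit_for_tat(opponent_history: list) -> str:
    """Return this round's move, given the opponent's past moves."""
    if not opponent_history:
        return "cooperate"           # open by cooperating
    return opponent_history[-1]      # otherwise, copy their last move

move = tit_for_tat(["cooperate", "defect"])  # mirrors the last move: "defect"
```

Despite its simplicity, this strategy rewards cooperation and punishes exploitation in repeated interactions, which is why it is a standard baseline in studies of cooperation like this one.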

Signaling refers to communication that may occur between individuals, which could be verbal (e.g., natural language conversation) and nonverbal (e.g., emotion expressions).

This research effort, which supports the Next Generation Combat Vehicle Army Modernization Priority and the Army Priority Research Area for Autonomy, aims to apply this insight in the development of intelligent autonomous systems that promote cooperation with Soldiers and successfully operate in hybrid teams to accomplish a mission.

“We show that emotion expressions can shape cooperation,” de Melo said. “For instance, smiling after mutual cooperation encourages more cooperation; however, smiling after exploiting others — which is the most profitable outcome for the self — hinders cooperation.”

The effect of emotion expressions is moderated by strategy, he said. People will only process and be influenced by emotion expressions if the counterpart’s actions are insufficient to reveal the counterpart’s intentions.

For example, when the counterpart acts very competitively, people simply ignore, and even mistrust, the counterpart’s emotion displays.

“Our research provides novel insight into the combined effects of strategy and emotion expressions on cooperation,” de Melo said. “It has important practical application for the design of autonomous systems, suggesting that a proper combination of action and emotion displays can maximize cooperation from Soldiers. Emotion expression in these systems could be implemented in a variety of ways, including via text, voice, and nonverbally through (virtual or robotic) bodies.”

According to de Melo, the team is very optimistic that future Soldiers will benefit from research such as this as it sheds light on the mechanisms of cooperation.

“This insight will be critical for the development of socially intelligent autonomous machines, capable of acting and communicating nonverbally with the Soldier,” he said. “As an Army researcher, I am excited to contribute to this research as I believe it has the potential to greatly enhance human-agent teaming in the Army of the future.”

The next steps for this research include pursuing further understanding of the role of nonverbal signaling and strategy in promoting cooperation and identifying creative ways to apply this insight on a variety of autonomous systems that have different affordances for acting and communicating with the Soldier.

Go to Source
Author:

ScienceDaily

Study shows difficulty in finding evidence of life on Mars

In a little more than a decade, samples of rover-scooped Martian soil will rocket to Earth.

While scientists are eager to study the red planet’s soils for signs of life, researchers must ponder a considerable new challenge: Acidic fluids — which once flowed on the Martian surface — may have destroyed biological evidence hidden within Mars’ iron-rich clays, according to researchers at Cornell University and at Spain’s Centro de Astrobiología.

The researchers conducted simulations involving clay and amino acids to draw conclusions regarding the likely degradation of biological material on Mars. Their paper, “Constraining the Preservation of Organic Compounds in Mars Analog Nontronites After Exposure to Acid and Alkaline Fluids,” published Sept. 15 in Nature Scientific Reports.

Alberto G. Fairén, a visiting scientist in the Department of Astronomy in the College of Arts and Sciences at Cornell, is a corresponding author.

NASA’s Perseverance rover, launched July 30, will land at Mars’ Jezero Crater next February; the European Space Agency’s Rosalind Franklin rover will launch in late 2022. The Perseverance mission will collect Martian soil samples and send them to Earth by the 2030s. The Rosalind Franklin rover will drill into the Martian surface, collect soil samples and analyze them in situ.

In the search for life on Mars, the red planet’s clay surface soils are a preferred collection target since the clay protects the molecular organic material inside. However, the past presence of acid on the surface may have compromised the clay’s ability to protect evidence of previous life.

“We know that acidic fluids have flowed on the surface of Mars in the past, altering the clays and its capacity to protect organics,” Fairén said.

He said the internal structure of clay is organized into layers, where the evidence of biological life — such as lipids, nucleic acids, peptides and other biopolymers — can become trapped and well preserved.

In the laboratory, the researchers simulated Martian surface conditions, aiming to preserve an amino acid called glycine in clay that had previously been exposed to acidic fluids. “We used glycine because it could rapidly degrade under the planet’s environmental conditions,” he said. “It’s a perfect informer to tell us what was going on inside our experiments.”

After a long exposure to Mars-like ultraviolet radiation, the experiments showed photodegradation of the glycine molecules embedded in the clay. Exposure to acidic fluids erases the interlayer space, turning it into a gel-like silica.

“When clays are exposed to acidic fluids, the layers collapse and the organic matter can’t be preserved. They are destroyed,” Fairén said. “Our results in this paper explain why searching for organic compounds on Mars is so sorely difficult.”

The paper’s lead author was Carolina Gil-Lozano of Centro de Astrobiología, Madrid and the Universidad de Vigo, Spain. The European Research Council funded this research.

Story Source:

Materials provided by Cornell University. Original written by Blaine Friedlander. Note: Content may be edited for style and length.

Go to Source
Author:

ScienceDaily

New machine learning-assisted method rapidly classifies quantum sources

For quantum optical technologies to become more practical, there is a need for large-scale integration of quantum photonic circuits on chips.

This integration calls for scaling up key building blocks of these circuits — sources of particles of light — produced by single quantum optical emitters.

Purdue University engineers created a new machine learning-assisted method that could make quantum photonic circuit development more efficient by rapidly preselecting these solid-state quantum emitters.

The work is published in the journal Advanced Quantum Technologies.

Researchers around the world have been exploring different ways to fabricate identical quantum sources by “transplanting” nanostructures containing single quantum optical emitters into conventional photonic chips.

“With the growing interest in scalable realization and rapid prototyping of quantum devices that utilize large emitter arrays, high-speed, robust preselection of suitable emitters becomes necessary,” said Alexandra Boltasseva, Purdue’s Ron and Dotty Garvin Tonjes Professor of Electrical and Computer Engineering.

Quantum emitters produce light with unique, non-classical properties that can be used in many quantum information protocols.

The challenge is that interfacing most solid-state quantum emitters with existing scalable photonic platforms requires complex integration techniques. Before integrating, engineers need to first identify bright emitters that produce single photons rapidly, on-demand and with a specific optical frequency.

Emitter preselection based on “single-photon purity” — which is the ability to produce only one photon at a time — typically takes several minutes for each emitter. Thousands of emitters may need to be analyzed before finding a high-quality candidate suitable for quantum chip integration.

To speed up screening based on single-photon purity, Purdue researchers trained a machine to recognize promising patterns in single-photon emission within a split second.

According to the researchers, rapidly finding the purest single-photon emitters within a set of thousands would be a key step toward practical and scalable assembly of large quantum photonic circuits.

“Given a photon purity standard that emitters must meet, we have taught a machine to classify single-photon emitters as sufficiently or insufficiently ‘pure’ with 95% accuracy, based on minimal data acquired within only one second,” said Zhaxylyk Kudyshev, a Purdue postdoctoral researcher.
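The paper's machine learning model is not reproduced here, but the conventional criterion it accelerates can be sketched. For a pulsed emitter, single-photon purity is conventionally judged by the second-order correlation g²(0): the coincidence counts in the zero-delay peak divided by the mean counts in the side peaks, with g²(0) < 0.5 the usual single-photon threshold. The histogram values below are invented for illustration; a real measurement accumulates these counts over minutes, which is exactly the bottleneck the ML preselection avoids.

```python
def g2_zero(center_counts: int, side_counts: list) -> float:
    """Estimate g2(0) as zero-delay coincidences over the mean side-peak counts."""
    return center_counts / (sum(side_counts) / len(side_counts))

def is_pure_single_photon(center: int, sides: list, threshold: float = 0.5) -> bool:
    """Binary purity classification against the conventional g2(0) < 0.5 cut."""
    return g2_zero(center, sides) < threshold

# Invented example histogram: suppressed zero-delay peak, strong side peaks,
# i.e. the signature of a good single-photon emitter.
pure = is_pure_single_photon(40, [400, 410, 390, 405])
```

Framed this way, emitter preselection is the binary classification problem the researchers trained their model to answer from one second of data.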

The researchers found that the conventional photon purity measurement method used for the same task took 100 times longer to reach the same level of accuracy.

“The machine learning approach is such a versatile and efficient technique because it is capable of extracting the information from the dataset that the fitting procedure usually ignores,” Boltasseva said.

The researchers believe that their approach has the potential to dramatically advance most quantum optical measurements that can be formulated as binary or multiclass classification problems.

“Our technique could, for example, speed up super-resolution microscopy methods built on higher-order correlation measurements that are currently limited by long image acquisition times,” Kudyshev said.

Story Source:

Materials provided by Purdue University. Note: Content may be edited for style and length.

Go to Source
Author:

Hackster.io

OpenVINO Webinar

Discover how the Intel® Distribution of OpenVINO™ toolkit enables you to deliver faster, more accurate real-world results from edge to cloud.

Learn to build high-performance, deep learning and computer vision applications that enable new and enhanced use cases in health and life sciences, retail, industrial, and more.

To join the contest visit https://www.hackster.io/contests/DLSuperheroes

ProgrammableWeb

5 Top APIs for Podcasts

Podcasts have never been more popular, and developers wanting to take advantage of this hot content trend need to find suitable Application Programming Interfaces, or APIs, to create applications.

What is a Podcast API?

A Podcast API is an interface that developers can use to connect applications to various podcast services.

ProgrammableWeb‘s Podcasts category is the best place to discover these APIs. Here are 5 popular choices.

1. Listen Notes

Listen Notes is a podcast search engine. The Listen Notes API allows applications to search the metadata of more than 1.5 million podcasts and 80 million episodes, categorized by people, places, or topics. Nearly every podcast and episode found on the Internet can be searched via this API.
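A minimal sketch of how a search request to such an API might be built is below. The endpoint path and API-key header follow Listen Notes' published pattern, but treat both as assumptions to verify against the current API docs; the key itself is your own.

```python
from urllib.parse import urlencode

# Assumed Listen Notes v2 search endpoint; confirm against the API reference.
BASE = "https://listen-api.listennotes.com/api/v2/search"

def search_url(query: str, kind: str = "episode") -> str:
    """Build a podcast/episode search URL with properly encoded parameters."""
    return f"{BASE}?{urlencode({'q': query, 'type': kind})}"

url = search_url("machine learning")
# A real request would add the key header, e.g.:
# requests.get(url, headers={"X-ListenAPI-Key": API_KEY})
```

Using `urlencode` keeps multi-word queries safe (spaces become `+`), which is easy to get wrong with hand-built query strings.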

2. Audioboom API

Audioboom provides audio content from major sports and media outlets, as well as smaller podcasters. Content hosted on Audioboom can be shared via the website, embeddable players, mobile applications, and social media websites. The Audioboom API provides users with access to almost every function the site offers. Full documentation for the API is available on GitHub.

3. Audiogum API

Audiogum provides smart audio-visual experiences for businesses and their customers. The platform offers a way to create personalized, voice-controlled podcasts, streaming services, internet radio, audio books, and video content. The Audiogum API enables audio content aggregation, intelligent personalization, analytics, and natural language understanding in business applications.

4. Voicepods API

Voicepods provides automated human-like text-to-speech (TTS) services. The Voicepods API returns JSON responses with voice and text-to-speech narration features. Projects and clips are available as resources.

5. Rev API

Rev provides transcription, caption and translation services. Rev’s captioning work is done by humans for accuracy. The Rev Human Transcription API enables developers to add human transcription services to applications.

Head over to the Podcasts category for more APIs, plus SDKs and Source Code samples.

Go to Source
Author: joyc

ScienceDaily

Unraveling the secrets of Tennessee whiskey

More than a century has passed since the last scientific analysis of the famed “Lincoln County [Tennessee] process” was published, but the secrets of the famous Tennessee whiskey flavor are starting to unravel at the University of Tennessee Institute of Agriculture. The latest research promises advancements in the field of flavor science as well as marketing.

Conducted by John P. Munafo, Jr., assistant professor of flavor science and natural products, and his graduate student, Trenton Kerley, the study “Changes in Tennessee Whiskey Odorants by the Lincoln County Process” was recently published in the Journal of Agricultural and Food Chemistry (JAFC).

The study incorporated a combination of advanced flavor chemistry techniques to probe the changes in flavor chemistry occurring during charcoal filtration. This type of filtration is a common step in the production of distilled beverages, including vodka and rum, but it’s a required step for a product to be labeled “Tennessee whiskey.” The step is called the Lincoln County Process (LCP), after the locale of the original Jack Daniel’s distillery. It is also referred to as “charcoal mellowing.”

The LCP step is performed by passing the fresh whiskey distillate through a bed of charcoal, usually derived from burnt sugar maple, prior to barrel-aging the product. Although no scientific studies have proved such a claim, it is believed that the LCP imparts a “smoother” flavor to Tennessee whiskey. In addition, to carry the “Tennessee whiskey” label, the liquor must by law be produced in the state of Tennessee from at least 51% corn and aged in Tennessee for at least two years in unused charred oak barrels.

The actual LCP differs from distiller to distiller, and, as the details are generally held as a trade secret, the process has been historically shrouded in mystery. There are no regulations as to how the process is performed, only that the step is required. In other words, all a manufacturer needs to do is pass the distillate over charcoal (an undefined amount — possibly even just one piece). Thus, depending on how it’s conducted, the LCP step may not impact the whiskey flavor at all. On the other hand, even small adjustments to the LCP can modify the flavor profile of the whiskey positively or negatively, potentially causing any number of surprises.

Munafo and Kerley describe how distillers adjust parameters empirically throughout the whiskey production process, then rely on professional tasters to sample products, blending subtly unique batches to achieve their target flavor. Munafo says, “By gaining a fundamental understanding of the changes in flavor chemistry occurring during whiskey production, our team could advise distillers about exactly what changes are needed to make their process produce their desired flavor goals. We want to give distillers levers to pull, so they are not randomly or blindly attempting to get the precise flavor they want.”

Samples used in the study were provided by the Sugarlands Distilling Company (SDC), in Gatlinburg, Tennessee, producers of the Roaming Man Whiskey. SDC invited the UTIA researchers to visit their distillery and collect in-process samples. Munafo says SDC prioritizes transparency around their craft and takes pride in sharing the research, discovery and distillation process of how their whiskey is made and what makes Tennessee whiskey unique.

Olfactory evaluations — the good ole smell test — revealed that the LCP treatment generally decreased malty, rancid, fatty and roasty aromas in the whiskey distillates. As for the odorants (i.e., molecules responsible for odor), 49 were identified in the distillate samples using an analytical technique called gas chromatography-olfactometry (GC-O). Nine of these odorants have never been reported in the scientific whiskey literature.

One of the newly found whiskey odorants, called DMPF, was originally discovered in cocoa. It is described as having a unique anise or citrus-like smell. Another of the newly discovered whiskey odorants (called MND) is described as having a pleasant dried hay-like aroma. Both odorants have remarkably low odor thresholds in the parts-per-trillion range, meaning that the smells can be detected at very low levels by people but are difficult to detect with scientific instrumentation.

The only previous investigation into how charcoal treatment affects whiskey was published in 1908 by William Dudley in the Journal of the American Chemical Society. Thirty-one whiskey odorants were measured via a technique called stable isotope dilution assay (SIDA), all showing a decrease in concentration as a result of LCP treatment, albeit to different degrees. That is to say, while the LCP appears to be selective in removing certain odorants, the process didn’t increase or add any odorants to the distillate. This new knowledge can be used to optimize Tennessee whiskey production: for instance, the process can be tuned to remove undesirable aromas while maintaining higher levels of desirable ones, thus “tailoring” the flavor profile of the finished whiskey.

“We want to provide the analytical tools needed to help enable distillers to have more control of their processes and make more consistent and flavorful whiskey,” says Munafo. “We want to help them take out some of the guesswork involved in whiskey production.”

Additional studies are now underway at the UT Department of Food Science to characterize both the flavor chemistry of different types of whiskey and their production processes. The ultimate aim of the whiskey flavor chemistry program is to aid whiskey manufacturers in producing a consistent product with the exact flavor profile that they desire. Even with the aid of science Munafo says, “Whiskey making will ‘still’ remain an impressive art form.” Pun intended.

The researchers acknowledge support from the USDA National Institute of Food and Agriculture (NIFA) Hatch Project #1015002 and funding through the Food Science Department and start-up funding from the University of Tennessee Institute of Agriculture.

Go to Source
Author:

ScienceDaily

A multinational study overturns a 130-year-old assumption about seawater chemistry

There’s more to seawater than salt. Ocean chemistry is a complex mixture of particles, ions and nutrients. And for over a century, scientists believed that certain ion ratios held relatively constant over space and time.

But now, following a decade of research, a multinational study has refuted this assumption. Debora Iglesias-Rodriguez, professor and vice chair of UC Santa Barbara’s Department of Ecology, Evolution, and Marine Biology, and her colleagues discovered that the seawater ratios of three key elements vary across the ocean, which means scientists will have to re-examine many of their hypotheses and models. The results appear in the Proceedings of the National Academy of Sciences.

Calcium, magnesium and strontium (Ca, Mg and Sr) are important elements in ocean chemistry, involved in a number of biologic and geologic processes. For instance, a host of different animals and microbes use calcium to build their skeletons and shells. These elements enter the ocean via rivers and tectonic features, such as hydrothermal vents. They’re taken up by organisms like coral and plankton, as well as by ocean sediment.

The first approximation of modern seawater composition took place over 130 years ago. The scientists who conducted the study concluded that, despite minor variations from place to place, the ratios between the major ions in the waters of the open ocean are nearly constant.

Researchers have generally accepted this idea from then on, and it made a lot of sense. Based on the slow turnover of these elements in the ocean — on the order of millions of years — scientists long thought the ratios of these ions would remain relatively stable over extended periods of time.

“The main message of this paper is that we have to revisit these ratios,” said Iglesias-Rodriguez. “We cannot just continue to make the assumptions we have made in the past essentially based on the residency time of these elements.”

Back in 2010, Iglesias-Rodriguez was participating in a research expedition over the Porcupine Abyssal Plain, a region of North Atlantic seafloor west of Europe. She had invited a former student of hers, this paper’s lead author Mario Lebrato, who was pursuing his doctorate at the time.

Their study analyzed the chemical composition of water at various depths. Lebrato found that the Ca, Mg and Sr ratios from their samples deviated significantly from what they had expected. The finding was intriguing, but the data was from only one location.

Over the next nine years, Lebrato put together a global survey of these element ratios. Scientists including Iglesias-Rodriguez collected over 1,100 water samples on 79 cruises ranging from the ocean’s surface to 6,000 meters down. The data came from 14 ecosystems across 10 countries. And to maintain consistency, all the samples were processed by a single person in one lab.

The project’s results overturned the field’s 130-year-old assumption about seawater chemistry, revealing that the ratio of these ions varies considerably across the ocean.

Scientists have long used these ratios to reconstruct past ocean conditions, like temperature. “The main implication is that the paleo-reconstructions we have been conducting have to be revisited,” Iglesias-Rodriguez explained, “because environmental conditions have a substantial impact on these ratios, which have been overlooked.”

Oceanographers can no longer assume that data they have on past ocean chemistry represent the whole ocean. It has become clear they can extrapolate only regional conditions from this information.

This revelation also has implications for modern marine science. Seawater ratios of Mg to Ca affect the composition of animal shells. For example, a higher magnesium content tends to make shells more vulnerable to dissolution, which is an ongoing issue as increasing carbon dioxide levels gradually make the ocean more acidic. “Biologically speaking, it is important to figure out these ratios with some degree of certainty,” said Iglesias-Rodriguez.

Iglesias-Rodriguez’s latest project focuses on the application of rock dissolution as a method to fight ocean acidification. She’s looking at lowering the acidity of seawater using pulverized stones like olivine and carbonate rock. This intervention will likely change the balance of ions in the water, which is something worth considering. As climate change continues unabated, this intervention could help keep acidity in check in small areas, like coral reefs.

Go to Source
Author: