Hackster.io

Nordic Thingy:91 Getting Started // Cellular IoT

The Thingy:91 is a pocket-sized, cellular-enabled IoT sensor prototyping platform built around the nRF9160 SiP and the nRF52840 SoC. Thanks to those two chips, it can send its environmental sensor data over LTE-M and NB-IoT (with built-in GPS, courtesy of the nRF9160) as well as Bluetooth 5, Thread, Zigbee, and more (via the nRF52840).
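If you just want to confirm the modem is alive before diving into the full getting-started guide, you can talk to the nRF9160 over its USB serial port with standard AT commands. The sketch below is a minimal Python/pyserial example; the serial device path, the baud rate, and the assumption that the board is running firmware that exposes the AT-command interface (such as Nordic's AT-client sample) all depend on your setup.

```python
# Minimal sketch: query the nRF9160 modem over a serial AT-command interface.
# Assumes the Thingy:91 is running AT-client style firmware and enumerates
# as /dev/ttyACM0 at 115200 baud -- both are assumptions; adjust for your setup.
import serial

def send_at(port: serial.Serial, cmd: str) -> str:
    """Send one AT command and return the raw response."""
    port.write((cmd + "\r\n").encode("ascii"))
    return port.read_until(b"OK\r\n").decode("ascii", errors="replace")

with serial.Serial("/dev/ttyACM0", 115200, timeout=5) as modem:
    print(send_at(modem, "AT+CGMR"))    # modem firmware revision
    print(send_at(modem, "AT+CFUN?"))   # current functional mode
    print(send_at(modem, "AT+CEREG?"))  # LTE network registration status
```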

We’ve talked about this one before, and now we have a ton of helpful documentation on getting started with this remarkable device:
// https://www.hackster.io/glowascii/getting-started-with-the-nordic-thingy-91-mac-8d44e5
// https://www.youtube.com/watch?v=xQ7hDzRULJQ

Learn more about the Thingy:91 at https://www.nordicsemi.com/Software-and-tools/Prototyping-platforms/Nordic-Thingy-91/

ScienceDaily

Enormous planet quickly orbiting a tiny, dying star

Thanks to a bevy of telescopes in space and on Earth — and even a pair of amateur astronomers in Arizona — a University of Wisconsin-Madison astronomer and his colleagues have discovered a Jupiter-sized planet orbiting at breakneck speed around a distant white dwarf star. The system, about 80 light years away, violates all common conventions about stars and planets. The white dwarf is the remnant of a sun-like star, greatly shrunken down to roughly the size of Earth, yet it retains half the sun’s mass. The massive planet looms over its tiny star, which it circles every 34 hours thanks to an incredibly close orbit. In contrast, Mercury takes a comparatively lethargic 90 days to orbit the sun. While there have been hints of large planets orbiting close to white dwarfs in the past, the new findings are the clearest evidence yet that these bizarre pairings exist. That confirmation highlights the diverse ways stellar systems can evolve and may give a glimpse at our own solar system’s fate. Such a white dwarf system could even provide a rare habitable arrangement for life to arise in the light of a dying star.
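To get a feel for how close a 34-hour orbit really is, a back-of-the-envelope application of Kepler's third law, using the article's round figures of half a solar mass and a 34-hour period, gives a semi-major axis of roughly 0.02 astronomical units, about 20 times closer in than Mercury. A quick Python check of that arithmetic (the inputs are the quoted approximations, not the paper's fitted values):

```python
# Back-of-the-envelope Kepler's third law: a^3 = G * M * T^2 / (4 * pi^2)
# Inputs are the rough figures quoted in the article (0.5 solar masses,
# 34-hour period); the result is an order-of-magnitude estimate only.
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # solar mass, kg
AU = 1.496e11            # astronomical unit, m

M = 0.5 * M_SUN          # white dwarf mass
T = 34 * 3600.0          # orbital period, s

a = (G * M * T**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
print(f"semi-major axis ~ {a / AU:.3f} AU")   # ~0.02 AU
```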

“We’ve never seen evidence before of a planet coming in so close to a white dwarf and surviving. It’s a pleasant surprise,” says lead researcher Andrew Vanderburg, who recently joined the UW-Madison astronomy department as an assistant professor. Vanderburg completed the work while an independent NASA Sagan Fellow at the University of Texas at Austin.

The researchers published their findings Sept. 16 in the journal Nature. Vanderburg led a large, international collaboration of astronomers who analyzed the data. The contributing telescopes included NASA’s exoplanet-hunting telescope TESS and two large ground-based telescopes in the Canary Islands.

Vanderburg was originally drawn to studying white dwarfs — the remains of sun-sized stars after they exhaust their nuclear fuel — and their planets by accident. While in graduate school, he was reviewing data from TESS’s predecessor, the Kepler space telescope, and noticed a white dwarf with a cloud of debris around it.

“What we ended up finding was that this was a minor planet or asteroid that was being ripped apart as we watched, which was really cool,” says Vanderburg. The planet had been destroyed by the star’s gravity after its transition to a white dwarf caused the planet’s orbit to fall in toward the star.

Ever since, Vanderburg has wondered if planets, especially large ones, could survive the journey in toward an aging star.

By scanning the data TESS collected for thousands of white dwarf systems, the researchers spotted a star whose brightness dimmed by half about every one-and-a-half days, a sign that something big was passing in front of the star on a tight, lightning-fast orbit. But it was hard to interpret the data because the glare from a nearby star was interfering with TESS’s measurements. To overcome this obstacle, the astronomers supplemented the TESS data with observations from higher-resolution ground-based telescopes, including three run by amateur astronomers.

“Once the glare was under control, in one night, they got much nicer and much cleaner data than we got with a month of observations from space,” says Vanderburg. Because white dwarfs are so much smaller than normal stars, large planets passing in front of them block a lot of the star’s light, making detection by ground-based telescopes much simpler.
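That argument can be made concrete with the standard transit-depth estimate, depth ≈ (R_planet / R_star)², capped at 1. With a roughly Earth-sized white dwarf and a roughly Jupiter-sized planet, the planet is larger than its star, so a central transit would block essentially all of the starlight; the roughly 50% dips seen here point to a grazing pass. A quick check using generic textbook radii (not the system's fitted values):

```python
# Rough transit-depth check: depth ~ min(1, (R_planet / R_star)^2).
# Radii are generic textbook values (Earth-sized white dwarf, Jupiter-sized
# planet), not the fitted parameters from the Nature paper.
R_EARTH = 6.371e6     # m
R_JUPITER = 6.9911e7  # m

r_star = R_EARTH      # white dwarf, roughly Earth-sized
r_planet = R_JUPITER  # planet, roughly Jupiter-sized

depth = min(1.0, (r_planet / r_star) ** 2)
print(f"expected central-transit depth: {depth:.0%}")  # 100% -- total eclipse
```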

The data revealed that a planet roughly the size of Jupiter, perhaps a little larger, was orbiting very close to its star. Vanderburg’s team believes the gas giant started off much farther from the star and moved into its current orbit after the star evolved into a white dwarf.

The question became: how did this planet avoid being torn apart during the upheaval? Previous models of white dwarf-planet interactions didn’t seem to line up for this particular star system.

The researchers ran new simulations that provided a potential answer to the mystery. When the star ran out of fuel, it expanded into a red giant, engulfing any nearby planets and destabilizing the Jupiter-sized planet that orbited farther away. That caused the planet to take on an exaggerated, oval orbit that passed very close to the now-shrunken white dwarf but also flung the planet very far away at the orbit’s apex.

Over eons, the gravitational interaction between the white dwarf and its planet slowly dispersed energy, ultimately guiding the planet into a tight, circular orbit that takes just one-and-a-half days to complete. That process takes time — billions of years. This particular white dwarf is one of the oldest observed by the TESS telescope at almost 6 billion years old, plenty of time to slow down its massive planet partner.

While white dwarfs no longer conduct nuclear fusion, they still release light and heat as they cool down. It’s possible that a planet close enough to such a dying star would find itself in the habitable zone, the region near a star where liquid water can exist, presumed to be required for life to arise and survive.

Now that research has confirmed these systems exist, they offer a tantalizing opportunity for searching for other forms of life. The unique structure of white dwarf-planet systems provides an ideal opportunity to study the chemical signatures of orbiting planets’ atmospheres, a potential way to search for signs of life from afar.

“I think the most exciting part of this work is what it means for both habitability in general — can there be hospitable regions in these dead solar systems — and also our ability to find evidence of that habitability,” says Vanderburg.


ScienceDaily

New method to determine the origin of stardust in meteorites

Scientists have made a key discovery thanks to stardust found in meteorites, shedding light on the origin of crucial chemical elements.

Meteorites are critical to understanding the beginning of our solar system and how it has evolved over time. However, some meteorites contain grains of stardust that predate the formation of our solar system and are now providing important information about how the elements in the universe formed.

In a study published in Physical Review Letters, researchers from the University of Surrey detail how they made a key discovery connected to the “pre-solar grains” found in primitive meteorites. This discovery has provided new insights into the nature of stellar explosions and the origin of the chemical elements. It has also provided a new method for astronomical research.

Dr Gavin Lotay, Nuclear Astrophysicist and Director of Learning and Teaching at the University of Surrey, said: “Tiny pre-solar grains, about one micron in size, are the residuals of stellar explosions that occurred in the distant past, long before our solar system existed. Stellar debris eventually became wedged into meteorites that, in turn, crashed into the Earth.”

One of the most frequent stellar explosions to occur in our galaxy is called a nova, which involves a binary star system consisting of a main sequence star orbiting a white dwarf star — an extremely dense star that can be the size of Earth but has the mass of our Sun. Matter from the main star is continually pulled away by the white dwarf because of its intense gravitational field. This deposited material initiates a thermonuclear explosion every 1,000 to 100,000 years and the white dwarf ejects the equivalent of the mass of more than thirty Earths into interstellar space. In contrast, a supernova involves a single collapsing star and, when it explodes, it ejects almost all of its mass.

As novae continually enrich our galaxy with chemical elements, they have been the subject of intense astronomical investigations for decades. Much has been learned from them about the origin of the heavier elements, for example. However, a number of key puzzles remain.

Dr Lotay continues: “A new way of studying these phenomena is by analysing the chemical and isotopic composition of the pre-solar grains in meteorites. Of particular importance to our research is a specific nuclear reaction that occurs in novae and supernovae — proton capture on an isotope of chlorine — which we can only indirectly study in the laboratory.”

In conducting their experiment, the team, led by Dr Lotay and Surrey PhD student Adam Kennington (also a former Surrey undergraduate), pioneered a new research approach. It involves the use of the Gamma-Ray Energy Tracking In-beam Array (GRETINA) coupled to the Fragment Mass Analyzer at the Argonne Tandem Linac Accelerator System (ATLAS), USA. GRETINA is a state-of-the-art detection system able to trace the path of gamma rays (γ rays) emitted from nuclear reactions. It is one of only two such systems in the world that utilise this novel technology.

Using GRETINA, the team completed the first detailed γ-ray spectroscopy study of an astronomically important nucleus, argon-34, and were able to calculate the expected abundance of sulfur isotopes produced in nova explosions.

Adam Kennington said: “It’s extremely exciting to think that, by studying the microscopic nuclear properties of argon-34, it may now be possible to determine whether a particular grain of stardust comes from a nova or a supernova.”

Story Source:

Materials provided by University of Surrey.


Hackster.io

Touch-responsive Robots

Alex upgraded her new companion AI, F3NR1R, with touch-responsive controls, thanks to Brown Dog Gadgets’ "Maker Tape". Learn more in today’s video!
Plus, don’t forget to take our quiz, for a chance to win one of 100 Intel Neural Compute Stick 2s! https://bit.ly/2RVVUuJ

// https://www.hackster.io/glowascii/f3nr1r-fennec-companion-ai-d71dca
// Maker Tape intro: https://www.youtube.com/watch?v=a8LSpMGoGew
// https://www.hackster.io/glowascii/paper-emotions-bot-da2b8c
// Kitty’s stream: https://twitter.com/BrownDogGadgets/status/1276247827325177856
// Nova: https://twitter.com/the_gella/status/1264307683751981056/video/1
// Dexter: https://www.hackster.io/Odd_Jayy/build-your-own-robot-monkey-companion-bot-dexter-8c0c9e
// Maker Faire panel on companion bots: https://www.youtube.com/watch?v=E38KNz2tBS8

ScienceDaily

157-day cycle in unusual cosmic radio bursts

An investigation into one of the current great mysteries of astronomy has come to the fore thanks to a four-year observing campaign conducted at the Jodrell Bank Observatory.

Using the long-term monitoring capabilities of the iconic Lovell Telescope, an international team led by Jodrell Bank astronomers has been studying an object known as a repeating Fast Radio Burst (FRB), which emits very short duration bright radio pulses.

Using the 32 bursts discovered during the campaign, in conjunction with data from previously published observations, the team has discovered that emission from the FRB known as 121102 follows a cyclic pattern, with radio bursts observed in a window lasting approximately 90 days followed by a silent period of 67 days. The same behaviour then repeats every 157 days.
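A common way to look for this kind of on/off cycle is to fold the burst arrival times on a grid of trial periods and find the period at which the bursts pile up into the narrowest range of phases. The Python sketch below illustrates that epoch-folding idea on made-up arrival times; it is a toy illustration of the general technique, not the collaboration's actual analysis, which also accounts for the non-detections.

```python
# Illustrative epoch-folding search for a periodic activity window in burst
# arrival times. The arrival times below are made up; the real analysis in
# the MNRAS paper is more sophisticated.
import numpy as np

rng = np.random.default_rng(0)
true_period = 157.0  # days
# Fake bursts: only occur in the first ~90 days of each 157-day cycle.
arrival_days = np.sort(np.concatenate(
    [cycle * true_period + rng.uniform(0, 90, size=5) for cycle in range(8)]
))

def occupied_phase_width(times: np.ndarray, period: float) -> float:
    """Fraction of the cycle needed to contain all folded burst phases
    (smaller = bursts more tightly clustered = more convincing period)."""
    phases = np.sort((times % period) / period)
    gaps = np.diff(np.concatenate([phases, [phases[0] + 1.0]]))
    return 1.0 - gaps.max()   # width of the occupied phase window

trial_periods = np.arange(50.0, 300.0, 0.5)
widths = np.array([occupied_phase_width(arrival_days, p) for p in trial_periods])
best = trial_periods[np.argmin(widths)]
print(f"best trial period: {best:.1f} days")   # expect ~157 days
```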

This discovery provides an important clue to identifying the origin of these enigmatic fast radio bursts. The presence of a regular sequence in the burst activity could imply that the powerful bursts are linked to the orbital motion of a massive star, a neutron star or a black hole.

Dr Kaustubh Rajwade of The University of Manchester, who led the new research, said: “This is an exciting result as it is only the second system where we believe we see this modulation in burst activity. Detecting a periodicity provides an important constraint on the origin of the bursts and the activity cycles could argue against a precessing neutron star.”

Repeating FRBs could be explained by the precession, like a wobbling top, of the magnetic axis of a highly magnetized neutron star, but with current data, scientists believe it may be hard to explain a 157-day precession period given the large magnetic fields expected in these stars.

FRBs were only discovered as recently as 2007, and they were initially thought to be one-off events related to a cataclysmic event such as an exploding star. This picture partly changed once FRB 121102, originally discovered with the Arecibo radio telescope on November 2, 2012, was seen to repeat in 2016. However, until now, no one had recognised that these bursts were in fact organised in a regular pattern.

Professor Benjamin Stappers, who leads the MeerTRAP project to hunt for FRBs using the MeerKAT telescope in South Africa, said: “This result relied on the regular monitoring possible with the Lovell Telescope, and non-detections were just as important as the detections.”

In a new paper published in Monthly Notices of the Royal Astronomical Society, the team confirm that FRB 121102 is only the second repeating source of FRBs to display such periodic activity. To their surprise, the timescale for this cycle is almost 10 times longer than the 16-day periodicity exhibited by the first repeating source, FRB 180916.J0158+65, which was recently discovered by the CHIME telescope in Canada.

“This exciting discovery highlights how little we know about the origin of FRBs,” says Duncan Lorimer who serves as Associate Dean for Research at West Virginia University and, along with PhD student Devansh Agarwal, helped develop the data analysis technique that led to the discovery. “Further observations of a larger number of FRBs will be needed in order to obtain a clearer picture about these periodic sources and elucidate their origin,” he added.

Story Source:

Materials provided by University of Manchester.


ScienceDaily

3D-printed system speeds up solar cell testing from hours to minutes

Tests on new designs for next-gen solar cells can now be done in hours instead of days thanks to a new system built by scientists at Australia’s Monash University, incorporating 3D-printed key components.

The machine can analyse 16 perovskite-based solar cell samples in parallel, dramatically speeding up the process.

The invention means that the performance and commercial potential of new compounds can be very rapidly evaluated, significantly speeding up the development process.

“Third generation perovskite cells have boosted performance to above 25%, which is almost identical to the efficiency level for conventional silicon-based ones,” said project leader Mr Adam Surmiak from the ARC Centre of Excellence in Exciton Science (Exciton Science).

“But those results are from laboratory tests on millimetre-sized samples in indoor conditions — and therefore don’t take into account a whole range of real-world factors such as environmental conditions, the use to which the cells are put, the manufacturing process, and possible deterioration over time.

“To make proper decisions, we need to know how each different cell design will function at large scales in the real world — and to do that we need a proper data library so we can pick the best candidates to take to that next stage. This new system lets us build that very rapidly and speed up transition from laboratory to fabrication.”

Getting the recipe right for perovskite solar cells is regarded as critically important to the transition away from fossil fuels and towards renewable energy generation. They cost about one-tenth as much as silicon cells and are much cheaper to manufacture.

Rooftop solar panels made from perovskites would pay for themselves within months, rather than the years required for present models.

To achieve the high level of precision needed to build the system, PhD candidate Surmiak and his colleagues turned to Monash University’s Instrumentation Facility and the Melbourne Centre for Nanofabrication, part of the Australian National Fabrication Facility — highly specialised machining and equipment facilities. There, the researchers’ designs were produced using ultra-detailed milling and a 16-micrometre precision 3D printer.

Alongside the development and set-up of this new testing facility, Mr Surmiak was also able to significantly speed up the actual solar cell fabrication process.

The head of the Monash University lab in which Surmiak works, Professor Udo Bach, a chief investigator with Exciton Science, described the invention as world-leading.

“Experimental high-throughput concepts will become increasingly important for the discovery of the next generation of energy materials, fueling the transition to a carbon-neutral energy economy,” he said.

“Our new set-up has the capacity to test thousands of solar cells in one single day, putting us ahead of practically all other R&D labs worldwide.”

Story Source:

Materials provided by ARC Centre of Excellence in Exciton Science.


ProgrammableWeb

Google Adjusts Android 11 Timeline, Beta Now Arrives June 3

Google has made some changes to the timeframe surrounding Android 11’s arrival, thanks to the COVID-19 situation. With the Google I/O developer conference no longer taking place May 12-14 as previously scheduled, Google has settled on June 3 to fully show off the latest version of Android.

The company will reveal the new operating system during a virtual event called #Android11: The Beta Launch Show. Google says the launch show will be developers’ opportunity to see what’s new in Android. It will be hosted by Dave Burke and will feature portions hosted by Google’s Stephanie Cuthbertson. When the presentation portion is complete, Burke and others will wrap up with a live post-show Q&A session. Questions can be posted via Twitter using the #AskAndroid hashtag.

Worried about missing all those in-depth sessions normally held during I/O? Well, Google is meeting in the middle with a number of talks that it says will range from Jetpack Compose to Android Studio and Google Play. These were originally slated to take place during the developer conference, and are instead being repurposed to accommodate everyone’s stuck-home state. Interested? Developers can sign up here

Along with the rescheduled reveal, the release of Android 11 itself has been pushed back a few weeks. Google typically seeds its first full public Android beta build during Google I/O. Instead, Google this week released a fourth developer preview build, which it said focused on stability and performance improvements in order to make life easier for developers. The first public beta will now launch June 3 after the #Android11: The Beta Launch Show.

“To help us meet the needs of the ecosystem while being mindful of the impacts on our developers and partners, we’ve decided to add a bit of extra time in the Android 11 release schedule,” said Burke in a blog post. “We’re moving out Beta 1 and all subsequent milestones by about a month, which gives everyone a bit more room but keeps us on track for final release later in Q3.”

Here are the major milestones you need to know. The fourth developer preview is available now. Beta 1 release moves to June 3, along with the final SDK and NDK APIs. Moreover, Google Play publishing for apps targeting Android 11 will open June 3. Beta 2’s release will move to July, and Google believes it will reach platform stability by that point. Beta 3 moves to August and will include release candidate builds for testing as needed. The final public release for Android 11 is slated for Q3, which means before the end of September. 

Google believes this new schedule, which includes distributing the final APIs when they were originally expected, should ensure developers have all the tools and time they need to prepare.

Developer preview four is supported by the Pixel 2 and later Pixel device families.

Author: EricZeman

ScienceDaily

Spinal cord gives bio-bots walking rhythm

Miniature biological robots are making greater strides than ever, thanks to the spinal cord directing their steps.

University of Illinois at Urbana-Champaign researchers developed the tiny walking “spinobots,” powered by rat muscle and spinal cord tissue on a soft, 3D-printed hydrogel skeleton. While previous generations of biological robots, or bio-bots, could move forward by simple muscle contraction, the integration of the spinal cord gives them a more natural walking rhythm, said study leader Martha Gillette, a professor of cell and developmental biology.

“These are the beginnings of a direction toward interactive biological devices that could have applications for neurocomputing and for restorative medicine,” Gillette said.

The researchers published their findings in the journal APL Bioengineering.

To make the spinobots, the researchers first printed the tiny skeleton: two posts for legs and a flexible “backbone,” only a few millimeters across. Then, they seeded it with muscle cells, which grew into muscle tissue. Finally, they integrated a segment of lumbar spinal cord from a rat.

“We specifically selected the lumbar spinal cord because previous work has demonstrated that it houses the circuits that control left-right alternation for lower limbs during walking,” said graduate student Collin Kaufman, the first author of the paper. “From an engineering perspective, neurons are necessary to drive ever more complex, coordinated muscle movements. The most challenging obstacle for innervation was that nobody had ever cultured an intact rodent spinal cord before.”

The researchers had to devise a method not only to extract the intact spinal cord and then culture it, but also to integrate it onto the bio-bot and culture the muscle and nerve tissue together — and do it in a way that the neurons form junctions with the muscle.

The researchers saw spontaneous muscle contractions in the spinobots, signaling that the desired neuro-muscular junctions had formed and the two cell types were communicating. To verify that the spinal cord was functioning as it should to promote walking, the researchers added glutamate, a neurotransmitter that prompts nerves to signal muscle to contract.

The glutamate caused the muscle to contract and the legs to move in a natural walking rhythm. When the glutamate was rinsed away, the spinobots stopped walking.

Next, the researchers plan to further refine the spinobots’ movement, making their gaits more natural. The researchers hope this small-scale spinal cord integration is a first step toward creating in vitro models of the peripheral nervous system, which is difficult to study in live patients or animal models.

“The development of an in vitro peripheral nervous system — spinal cord, outgrowths and innervated muscle — could allow researchers to study neurodegenerative diseases such as ALS in real time with greater ease of access to all the impacted components,” Kaufman said. “There are also a variety of ways that this technology could be used as a surgical training tool, from acting as a practice dummy made of real biological tissue to actually helping perform the surgery itself. These applications are, for now, in the fairly distant future, but the inclusion of an intact spinal cord circuit is an important step forward.”

Story Source:

Materials provided by University of Illinois at Urbana-Champaign, News Bureau. Original written by Liz Ahlberg Touchstone.


ProgrammableWeb

Apifiny Launches GlobalX for Broad Access to Crypto Markets

Institutions will now be able to trade on every single crypto market simultaneously thanks to GlobalX, an API platform launched on March 31 by San Francisco-based firm Apifiny. Apifiny hired former executives of Google X, Kraken and AlphaPoint to promote the service.

Specifically, Josh Li will act as Apifiny’s chief business officer, bringing previous experience at Google and Google X, Alphabet’s innovation arm. Michael Fertman, who joins from the security-token startup AlphaPoint, will lead the B2B marketing effort as VP of Marketing. Scott Eilbeck was brought on as VP of strategic partnerships and institutional sales; he most recently served as head of over-the-counter (OTC) markets at Kraken, with earlier stints at JP Morgan and Bear Stearns.

GlobalX works by integrating all of the world’s exchanges into one platform available to institutional traders. The firm opens business accounts with as many exchanges as possible around the world while presenting a unified interface to its clients. By distributing an order across global markets, traders tap into global crypto liquidity instead of relying on a single exchange.

The core proposition of GlobalX is “increasing the bandwidth” available for institutional trading desks. As Fertman highlighted, trading on multiple exchanges is complex: ”If you look at before, in order to execute these global strategies, an institutional investor would have to set up accounts on multiple exchanges, globally. In order to execute rapidly, they would need separate sets of APIs to different exchanges. […] On the front end, it is the equivalent of calling 17 different brokers to execute one trade, and not through one interface.”

GlobalX also provides a function that few institutional trading desks can have on their own — access to all local fiat-to-crypto markets.

GlobalX is driven by an institutional-grade REST/WebSocket API that will give traders, OTC desks, brokers, and market makers faster access to trades and easier reallocation of assets across a multitude of global exchanges. FIX API support for GlobalX is expected to be added soon.

Author: ProgrammableWeb PR

ProgrammableWeb

10 Top APIs for the Internet of Things

Homes, cities, cars, businesses, and workplaces are getting smarter thanks to the Internet of Things (IoT). Developers wishing to create IoT applications and integrate with IoT-enabled devices can look to ProgrammableWeb to find hundreds of suitable Application Programming Interfaces, or APIs, to help them get the job done.

IoT APIs allow applications to read sensors and analyze smart city or smart campus data, automate home appliances, utilize voice commands, manage proximity beacons, automate smartcars, manage edge computing, manage manufacturing and industrial equipment, and so much more.

The Internet of Things category on ProgrammableWeb has over three hundred APIs. Here we highlight ten popular ones, based on website traffic.

1. Garmin Health API

Garmin Health API enables developers to leverage health and activity data collected from Garmin wearables. There are methods available to collect data about steps, sleep, calories, heart rate, stress, intensity minutes, body composition and more. Thirty types of activity are monitored including running, cycling, paddle boarding, swimming and more.

2. Google Assistant API

Google Assistant can be embedded into devices to enable voice control, hotword detection, natural language understanding, and other intelligence services. The Google Assistant API provides a way to manage and converse with devices. Google Assistant enables voice control over phone applications, speakers, smart displays, automobiles, watches, laptops, TV, and other Google Home devices (including Nest). Users can do Google searches about weather, sports, traffic, news, flights, add reminders, manage tasks, control smart home devices, and much more with this API and SDKs.

3. Withings API

Withings is a company focusing on the development of connected measuring devices, such as scales and blood pressure monitors, that can send health information directly to the internet. The Withings Body Metrics Services API (WBS API) is a set of web services allowing developers and third parties limited access to users’ data about activity, heart ECG (or EKG), sleep cycles, and more.

4. Home Assistant API

Home Assistant is an open-source home automation platform that tracks and controls devices at home. The Home Assistant REST API provides access to data methods for the Home Assistant control service. It can return the current configuration, basic information about the Home Assistant instance, all of the data needed to bootstrap, an array of event objects, and more. The Home Assistant Server Events Streaming API allows users to consume server-sent events, and a Home Assistant WebSocket service is also available.
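As a quick illustration of that REST interface, the sketch below uses Python and the requests library to check that the API is up and to list a few entity states. The base URL and the long-lived access token are placeholders for your own Home Assistant instance.

```python
# Minimal sketch of reading from the Home Assistant REST API.
# BASE_URL and TOKEN are placeholders for your own instance and a
# long-lived access token created in your Home Assistant user profile.
import requests

BASE_URL = "http://homeassistant.local:8123"   # placeholder
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"         # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# /api/ returns a simple "API running." message if the service is reachable.
print(requests.get(f"{BASE_URL}/api/", headers=HEADERS, timeout=10).json())

# /api/states returns the current state object for every entity.
states = requests.get(f"{BASE_URL}/api/states", headers=HEADERS, timeout=10).json()
for entity in states[:5]:
    print(entity["entity_id"], entity["state"])
```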

5. Unofficial Tesla Model S API

The Tesla Model S JSON API is not an official Tesla API; it is community documentation of the endpoints used by Tesla’s own iOS and Android apps. This API can help developers in the auto industry go beyond controlling just one car, since logged-in users can add several vehicles at a time. The unofficial Tesla Model S API works like a remote control from a mobile phone, with vehicle controls to charge the car, flash the lights, honk the horn and get status reports about battery charge and open doors.
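For a flavour of what those remote controls look like, here is a hedged Python sketch against the community-documented endpoints; because the API is unofficial, both the endpoints and the JSON shapes may change without notice, and obtaining the bearer token is left out.

```python
# Hedged sketch against the community-documented (unofficial) Tesla owner API.
# Endpoints and JSON shapes follow the widely circulated third-party docs and
# may change at any time; the bearer token is assumed to be obtained already.
import requests

BASE = "https://owner-api.teslamotors.com"
HEADERS = {"Authorization": "Bearer YOUR_OWNER_API_TOKEN"}   # placeholder

# List the vehicles on the account and take the first one.
vehicles = requests.get(f"{BASE}/api/1/vehicles", headers=HEADERS, timeout=10).json()
vehicle_id = vehicles["response"][0]["id"]

# Flash the lights -- one of the simple remote-control commands.
flash = requests.post(f"{BASE}/api/1/vehicles/{vehicle_id}/command/flash_lights",
                      headers=HEADERS, timeout=10)
print(flash.json())
```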

6. Ubidots API

Ubidots offers a platform for developers that enables them to easily capture sensor data and turn it into useful information. The Ubidots platform can send data to the cloud from any Internet-enabled device. Developers can then configure actions and alerts based on real-time data and visual tools. The Ubidots REST API allows users to read and write data to the resources available, with methods for data sources, variables, statistics, events and insights.
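A minimal sketch of pushing a reading into Ubidots might look like the Python below; the v1.6 device endpoint and the X-Auth-Token header follow Ubidots’ commonly documented pattern, but treat them as assumptions to verify against the current docs, and the device label and token are placeholders.

```python
# Hedged sketch: pushing one sensor reading to Ubidots over its REST API.
# The v1.6 device endpoint and X-Auth-Token header follow Ubidots' commonly
# documented pattern; verify both against the current documentation, and
# replace DEVICE_LABEL and TOKEN with your own values.
import requests

TOKEN = "YOUR_UBIDOTS_TOKEN"     # placeholder
DEVICE_LABEL = "demo-device"     # placeholder device label

url = f"https://industrial.api.ubidots.com/api/v1.6/devices/{DEVICE_LABEL}"
payload = {"temperature": 23.5, "humidity": 41.0}   # variable_label: value

resp = requests.post(url, json=payload,
                     headers={"X-Auth-Token": TOKEN}, timeout=10)
resp.raise_for_status()
print(resp.json())
```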

7. Apple HomeKit

Apple’s HomeKit provides a platform for devices, apps, and services to communicate. Using Siri, iPhone users can control supported devices in their home: lights, thermostats, garage doors, and more can all be controlled by voice. The Apple HomeKit API is accessible via the iOS SDK (introduced with iOS 8).

8. Caret API

Caret is a service that provides automatable status sharing triggered by a device’s sensors. The Caret API lets users harness their smart device sensors and interconnect them with third party devices and applications to automate customizable status sharing services. For example, a status could automatically change when a user starts playing a game and contain a link, photo and more about that game.

[Image: Integrate with the Caret API for automated, triggered status sharing. Credit: Caret]

9. Amazon Alexa Smart Home Skills API

The Amazon Alexa Smart Home Skills API allows developers to enable Alexa voice interaction and transmit messages to cloud-enabled devices. With it, developers can build skills that let Alexa control TVs, alarms, door locks, lights, and any number of other smart home devices.
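Smart home skills receive JSON “directives” and answer with JSON events, typically from an AWS Lambda function. The Python sketch below only shows how an incoming directive might be routed by namespace and name; the discovery and power-control helpers are hypothetical stubs, and a real skill must return responses that follow Amazon’s documented schema.

```python
# Hedged sketch of an AWS Lambda handler for an Alexa Smart Home skill,
# showing only how incoming directives are routed. A real skill must also
# build spec-compliant responses (see Amazon's Smart Home Skill API docs).
# discover_devices() and set_power() are hypothetical stand-ins for your own
# device-cloud integration.

def discover_devices(event):
    """Hypothetical stub: would return an Alexa.Discovery response payload."""
    return {"discovered": []}

def set_power(endpoint_id, turn_on):
    """Hypothetical stub: would call your device cloud to switch the device."""
    return {"endpoint": endpoint_id, "power": "ON" if turn_on else "OFF"}

def lambda_handler(event, context):
    header = event["directive"]["header"]
    namespace, name = header["namespace"], header["name"]

    if namespace == "Alexa.Discovery" and name == "Discover":
        return discover_devices(event)
    if namespace == "Alexa.PowerController" and name in ("TurnOn", "TurnOff"):
        endpoint_id = event["directive"]["endpoint"]["endpointId"]
        return set_power(endpoint_id, turn_on=(name == "TurnOn"))
    raise ValueError(f"unhandled directive {namespace}.{name}")
```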

10. Wink API

Wink is an application that syncs with home automation devices to adjust lighting, window shades, climate, key locks, and more. Wink sells a Wink HUB hardware component that accepts communications from devices using the following protocols: Bluetooth LE, Wi-Fi, ZigBee, Z-Wave, Lutron ClearConnect, and Kidde. The RESTful Wink API is documented through Apiary and allows Wink devices to communicate with users, other apps, and the web in general.

The above list barely scratches the surface of IoT APIs. There are more than 390 APIs, 420 SDKs, and 370 Source Code Samples in ProgrammableWeb’s Internet of Things category.

Author: joyc