Categories
Hackster.io

Elephant Edge Webinar 1: The Software

The ElephantEdge challenge is calling on the community to build ML models using the Edge Impulse Studio and tracking dashboards using Avnet’s IoTConnect. These will be deployed onto 10 production-grade collars manufactured by our engineering partner, Institute IRNAS, and fielded by Smart Parks.

In this first of two ElephantEdge webinars, you’ll learn about the problems park rangers are facing and how to get started with IoTConnect and Edge Impulse Studio.

Contest Link: https://www.hackster.io/contests/ElephantEdge

Categories
ProgrammableWeb

Microsoft Edge Receives Spellcheck Upgrade

Microsoft’s Chromium-based Edge browser will gain an all-new mechanism for spellcheck in version 83. The new feature, which is powered by Windows Spellcheck, will be available on all systems using Windows 8.1 and above.

The Windows spellcheck functionality is being ported over via the Spellcheck API that is integrated into the Windows operating system. Microsoft notes that the benefits of this adjustment include “support for additional languages and dialects, a shared custom dictionary, and better support for URLs, acronyms, and email addresses.”

The new spellcheck feature is available immediately. 

Author: KevinSundstrom

Categories
ProgrammableWeb

Fastly Launches Developer Hub for its Cloud Platform

Fastly, a provider of an edge cloud platform, today announced the launch of its new Developer Hub, a central place for developers to easily access all the tools they need to build fast, scalable, and secure modern applications on the Fastly edge cloud platform. Housed within the Developer Hub are a testing sandbox, ready-to-deploy code snippets, and a growing repository of structured tutorials, reference materials, and documentation. By giving developers at the world’s most innovative companies the tools they need to drive digital transformation, Fastly aims to make the promise of edge computing more tangible and actionable.

“Fastly’s new Developer Hub is one of the best developer resources we have today,” says Ron Lipke, Platform Engineering Manager at Gannett. “The documentation, code snippets, and variety of use cases will save our edge team the cycles required to develop our own configuration examples. In addition, the attention given to the user experience is readily apparent. The improved navigation will allow our edge development team to discover and implement self-service solutions much faster. This is a welcome enhancement over Fastly’s already great developer library.”

As a company built by and for developers, Fastly understands how critical good documentation and resources are to the developer experience. Time spent searching for undiscoverable materials is time taken away from optimizing sites, streamlining processes, or executing on great ideas. With this in mind, the Developer Hub is designed with usability and innovation at the forefront. The developer-friendly search indexes content based on several different categorizations, such as command type or use case, allowing developers to discover additional related content.

Two key features of the Developer Hub are the highly searchable documentation and the sandboxed testing environment. Robust, easy-to-navigate changelogs and references will continuously educate developers on the extensive capabilities of the Varnish Configuration Language (VCL) and Fastly’s API. Fastly Fiddle – a powerful and flexible testing sandbox – allows developers to test configurations without putting their production environments at risk.

“We have built a highly customizable platform with a wide range of developer-focused capabilities,” said Adam Denenberg, SVP of Customer Solutions at Fastly. “Our Developer Hub puts the full power of Fastly in developers’ hands by making it simpler to find the tools they need and by helping them realize what our technology is capable of. We’re in the business of helping developers be successful by harnessing the power of edge computing. With Developer Hub, we’re excited to continue building on that goal, and opening the edge up to developers even further.”

Customers can utilize the Developer Hub to support their edge workflows via the following tools, with more resources planned for future releases:

  • Solution Library patterns and recipes: Ready-to-deploy code snippets and deployment instructions teach developers how to do basically anything on Fastly, with everything they need to implement in their own configurations
  • API and language references and changelogs: Robust reference documentation and release notes provide complete access to all of the features available through the Fastly API, web interface, and VCL (a minimal API sketch follows this list)
  • Education for all levels: Getting-started content and foundational education help developers learn about Fastly’s platform, including technical descriptions and a growing collection of fine-tuning instructions and observability tools
  • Technical blog posts: Education and thought leadership from Fastly developers and engineering experts about our edge cloud platform, learnings from our network data, and industry trends
  • Fastly Fiddle: A testing sandbox to experiment with Fastly configurations and debug custom code without impacting developers’ production services
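
As a rough flavor of what working against the Fastly API looks like, here is a minimal sketch that calls the documented purge_all endpoint from Python. It uses the generic requests library rather than any Fastly-provided SDK, and the service ID and token values are placeholders you would substitute with your own.

```python
import requests

# Placeholders -- substitute your own Fastly service ID and API token.
FASTLY_API = "https://api.fastly.com"
SERVICE_ID = "YOUR_SERVICE_ID"
API_TOKEN = "YOUR_FASTLY_API_TOKEN"

# Purge every cached object for the service via the documented purge_all endpoint.
response = requests.post(
    f"{FASTLY_API}/service/{SERVICE_ID}/purge_all",
    headers={"Fastly-Key": API_TOKEN, "Accept": "application/json"},
)
response.raise_for_status()
print(response.json())  # e.g. {"status": "ok"}
```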

The Developer Hub will soon include more tools and resources needed to learn about and build on Compute@Edge, Fastly’s in-beta serverless compute environment. Developers interested in participating in the Compute@Edge beta are currently able to sign up through the Developer Hub.

Author: ProgrammableWeb PR

Categories
Hackster.io

Introducing Edge Impulse: Free for Developers

Create new intelligent devices with embedded machine learning! With TinyML, Edge Impulse makes it easy to collect and process data, then train, test, and deploy your own models, all from one simple interface. Plus, its "unit testing" features enable you to re-train models to add features without losing accuracy.

The system includes free access for developers, with a generous chunk of server time per month, as well as enterprise options for scalability.

// Read more: https://www.hackster.io/news/tinyml-for-all-developers-with-edge-impulse-2cfbbcc14b90
// Sign up now: http://edgeimpulse.com
// Order the ST Discovery Kit: https://www.st.com/en/evaluation-tools/b-l475e-iot01a.html
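
As a rough illustration of the “collect and process data” step in an embedded ML workflow (a generic sketch, not Edge Impulse’s own API; the window size and sampling rate are illustrative), raw sensor readings are typically sliced into fixed-length windows before being fed to a classifier:

```python
import numpy as np

def window_samples(samples, window_size=125, stride=62):
    """Slice a stream of (x, y, z) accelerometer readings into fixed-length,
    overlapping windows suitable for model training. At an assumed 62.5 Hz
    sample rate, window_size=125 corresponds to a 2-second window."""
    windows = []
    for start in range(0, len(samples) - window_size + 1, stride):
        windows.append(samples[start:start + window_size])
    return np.stack(windows)

# Example: 10 seconds of synthetic 3-axis accelerometer data at 62.5 Hz.
fake_stream = np.random.randn(625, 3)
X = window_samples(fake_stream)
print(X.shape)  # (number_of_windows, 125, 3)
```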

Categories
Hackster.io

AI on Adafruit’s Edge Badge!

Time to play with the Adafruit Edge Badge! Let’s take a look at the “Micro Speech” demo — using TensorFlow with Arduino — in honor of the Arm #AIoT conference this week.

// https://www.adafruit.com/product/4400
// https://armsummit.bemyapp.com/
// https://learn.adafruit.com/tensorflow-lite-for-edgebadge-kit-quickstart/overview
// https://www.tensorflow.org/lite/microcontrollers/build_convert
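
For context on the build/convert step linked above, converting a trained model for a microcontroller target generally follows the standard TensorFlow Lite converter flow, sketched below (the model and output file names are hypothetical):

```python
import tensorflow as tf

# Load a trained Keras speech model (path is illustrative).
model = tf.keras.models.load_model("micro_speech_model.h5")

# Convert to TensorFlow Lite with default size/latency optimizations.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Write out the flatbuffer; on a microcontroller it is usually embedded
# as a C array, e.g. with `xxd -i micro_speech.tflite > model_data.cc`.
with open("micro_speech.tflite", "wb") as f:
    f.write(tflite_model)
```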

Categories
Hackster.io

DIY Intelligence with NVIDIA Jetson Nano

Learn the ins and outs of NVIDIA’s Jetson Nano Developer Kit and the new Hackster competition, the AI at the Edge Challenge. See how you can create your own AI-powered projects with deep learning and computer vision. You’ll discover new features, plus tips and tricks for using the devkit to help you get started with your project.
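
For a taste of what such projects look like in code, here is a minimal image-classification sketch in the spirit of NVIDIA’s open-source jetson-inference “Hello AI World” examples. The image file name is a placeholder, and the exact API names reflect the library as it was documented around this time and may differ in newer releases.

```python
import jetson.inference
import jetson.utils

# Load a test image into GPU memory (file name is a placeholder).
img, width, height = jetson.utils.loadImageRGBA("my_image.jpg")

# Load a pre-trained image recognition network (GoogLeNet trained on ImageNet).
net = jetson.inference.imageNet("googlenet")

# Classify the image and print the top result.
class_idx, confidence = net.Classify(img, width, height)
print("image recognized as '{}' (class #{}) with {:.2f}% confidence".format(
    net.GetClassDesc(class_idx), class_idx, confidence * 100))
```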

Categories
Hackster.io

Contest: AI at the Edge, with NVIDIA’s Jetson Nano

Join our new “AI at the Edge” challenge, in partnership with NVIDIA! Use the Jetson Nano Developer Kit (available from Seeed Studio), and take advantage of all the free educational materials to create the future you want to see.

Plus, we unbox the full Developer Kit!

// https://www.hackster.io/contests/NVIDIA
// https://www.seeedstudio.com/NVIDIA-Jetson-Nano-Development-Kit-p-2916.html
// https://courses.nvidia.com/courses/course-v1:DLI+C-RX-02+V1/about
// https://developer.nvidia.com/embedded/community/jetson-projects

Project links:
// https://medium.com/@ageitgey/build-a-hardware-based-face-recognition-system-for-150-with-the-nvidia-jetson-nano-and-python-a25cb8c891fd
// https://hackaday.com/2019/08/15/home-automation-at-a-glance-using-ai-glasses/
// https://vimeo.com/351143472
// https://www.youtube.com/watch?v=0T6u7S_gq-4

Categories
IEEE Spectrum

UAV-Based LiDAR Can Measure Shallow Water Depth

World’s first small-scale topographic and bathymetric scanning LiDAR

ASTRALiTe’s edge™ is the world’s first small-scale topographic and bathymetric scanning LiDAR that can detect small underwater objects, measure shallow water depth, and survey critical underwater infrastructure from a small UAV platform.

The edge™ can see beneath the water surface at depths of 0 to 5 meters and is completely self-contained, with its own GNSS-aided inertial navigation system, battery, and onboard computer. It weighs about 5 kg and is designed for deployment on UAV systems for faster, safer, and more accurate bathymetric surveys. This patented 2-in-1 topographic and bathymetric LiDAR offers centimeter-level depth resolution. Possible applications include coastal mapping and surveying, infrastructure inspection, and even military logistics.

Importance of geo-referencing and motion stabilization

“We needed a motion and navigation solution for our LiDAR. Our requirements included high accuracy along with low size, weight, and power,” explains Andy Gisler, Director of Lidar Systems with ASTRALiTe. In addition, the system needed to be able to apply Post-Processing Kinematic (PPK) corrections to the LiDAR data to provide higher-accuracy results to ASTRALiTe’s customers.

The LiDAR produces a comprehensive point cloud that must be motion-compensated and geo-referenced to be usable. Two methods can be used to reach the centimeter-level accuracy requested by surveyors. The first is Real-Time Kinematic (RTK) positioning, which applies corrections obtained in real time from a base station or a base-station network over a radio or GSM link. The second is applied after the mission using PPK software, which applies the same corrections as RTK but also recomputes all the inertial data and raw GNSS observables with a forward-backward-merge algorithm to correct the trajectory, fill any gaps in position, and greatly improve overall accuracy.
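
As a conceptual illustration of the forward-backward-merge idea (a toy one-dimensional sketch, not Qinertia’s or any commercial PPK implementation), the same data can be filtered in both directions and the two passes blended according to their estimated uncertainties:

```python
import numpy as np

def forward_backward_merge(measurements, process_noise=0.01, meas_noise=1.0):
    """Toy 1-D illustration: filter the data forward and backward with a
    simple recursive estimator, then blend the two passes by their variances."""
    def run_filter(z):
        x, p = z[0], 1.0
        states, variances = [], []
        for m in z:
            p += process_noise            # predict: uncertainty grows over time
            k = p / (p + meas_noise)      # gain balances prediction vs. measurement
            x += k * (m - x)              # update with the new measurement
            p *= (1.0 - k)
            states.append(x)
            variances.append(p)
        return np.array(states), np.array(variances)

    fwd_x, fwd_p = run_filter(np.asarray(measurements))
    bwd_x, bwd_p = run_filter(np.asarray(measurements)[::-1])
    bwd_x, bwd_p = bwd_x[::-1], bwd_p[::-1]

    # Weight each direction by the other's variance: the more certain pass dominates.
    w = bwd_p / (fwd_p + bwd_p)
    return w * fwd_x + (1.0 - w) * bwd_x

# Example: smooth a noisy, nominally constant position track.
noisy = 10.0 + np.random.randn(200) * 0.5
smoothed = forward_backward_merge(noisy)
```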

ASTRALiTe chose SBG Systems’ dual-antenna Ellipse2-D inertial navigation system, which provides motion data along with RTK and PPK support. The weight of the INS/GNSS solution was especially important to ASTRALiTe, since the system is designed to fly on most UAVs and therefore has to fit within light payload capacities. The ability to use two antennas was another key consideration, as a robust heading is required even during slow-speed flights. In addition to this INS, ASTRALiTe also uses Qinertia, SBG Systems’ in-house post-processing software.

This PPK software gives access to offline RTK corrections from more than 7,000 base stations in 164 countries and is designed to help UAV integrators get the most out of their GNSS or INS/GNSS solution.

About SBG Systems INS/GNSS

SBG Systems is an international company that develops inertial measurement units with embedded GNSS, from miniature to high-accuracy ranges. Combining cutting-edge calibration techniques with advanced embedded algorithms, SBG Systems manufactures inertial solutions for industrial and research projects such as unmanned vehicle control (land, marine, and aerial), antenna tracking, camera stabilization, and surveying applications.

Categories
ProgrammableWeb

Qeexo AutoML Brings Machine Learning to Edge Devices

Qeexo, a provider of machine learning solutions for the edge, recently introduced Qeexo AutoML. The solution is a one-click, automated platform that performs machine learning functions on edge devices. Devices like cameras, RFID readers, and other edge products take in sensor data, which is then analyzed by the AutoML platform directly on the edge device.

“Thousands of companies are collecting vast amounts of data at the edge. These companies want to leverage machine learning but don’t have the necessary tools or the technical staff,” Sang Won Lee, Qeexo CEO, commented in a press release. “With Qeexo AutoML, companies can iterate through prototypes and projects to produce production-ready models with a fraction of the time and resources previously required.”

The first hardware devices supported by AutoML are the Arm Cortex-M0 to Cortex-M4 class of MCUs and STMicroelectronics’ SensorTile.box. Machine learning at the edge is difficult because of limited compute power, memory size, and battery life. However, Qeexo AutoML has been able to deliver enterprise-grade machine learning in this form factor.
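
For a sense of how models are typically shrunk to fit Cortex-M-class constraints (a generic TensorFlow Lite post-training quantization sketch, not Qeexo’s proprietary pipeline; the model and dataset file names are placeholders):

```python
import numpy as np
import tensorflow as tf

# Placeholder trained model and calibration data.
model = tf.keras.models.load_model("sensor_model.h5")
calibration_samples = np.load("calibration_windows.npy").astype(np.float32)

def representative_dataset():
    # Yield a few hundred real input windows so the converter can pick int8 scales.
    for sample in calibration_samples[:200]:
        yield [sample[np.newaxis, ...]]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

# Fully integer-quantized model, small enough to run under TFLite Micro on an MCU.
with open("sensor_model_int8.tflite", "wb") as f:
    f.write(converter.convert())
```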

Qeexo based AutoML on the same machine learning technology it uses for its FingerSense, EarSense, and TouchTool products. AutoML runs fully automatically once its one-click workflow is initiated. To learn more, visit the AutoML site.

Author: ecarter

Categories
ProgrammableWeb

Microsoft Suggests Foldables Need their Own API

The Edge browser team at Microsoft says adjusting its app for devices that fold might be better handled with an API. The recommendation comes ahead of the market debut of high-visibility folding phones from Huawei and Samsung.

The Huawei Mate X features a screen that bends around to cover both sides of the phone, while the Samsung Galaxy Fold opens like a book to reveal a larger display within. Both phone companies have worked directly with developers to customize apps for this fresh form factor, though it’s unclear how successful those efforts have been to date. 

Microsoft’s team specifically announced the Window Segments Enumeration API, which will target web browser functionality on bendy screens. The base-level goals are to “effectively layout the content in a window that spans multiple displays” and to “react when areas of that window are occluded by the OS, for example when soft keyboard pops up.”

As it stands now, there are a number of hurdles to jump when developing for folding displays. For example, differences in hardware mean some phones will have no seam between screen segments while others will have a hard edge between segments. Then there’s the issue of occlusion, where parts of the phone user interface cover or otherwise interfere with the content underneath. Further, developers would like a way to future-proof their apps for this form factor in the event it becomes popular and they need to adjust more apps in rapid fashion.

Microsoft points out that there are several APIs already out there that could help, including the Presentation API, Screen Enumeration API, and Window Placement API. The proposed Window Segments Enumeration API appears to pick and choose portions of these APIs to form a separate but complementary alternative.

“We propose a new concept of Window Segments that represent the regions (and their dimensions) of the window that reside on separate (adjacent) displays,” said Microsoft. “Window Segment dimensions are expressed in CSS pixels and will be exposed via a JavaScript API that allows developers to enumerate segments, including about regions that are occluded.”

What’s not clear is how this will work with Google’s own support for folding screens. The company already has tools in Android 10 (API level 29) that address common issues with folding screens, such as handling alternative aspect ratios, window sizes, and app continuity.

Google has not responded publicly to Microsoft’s proposal.

Author: EricZeman