Categories
ProgrammableWeb

Postman’s New Schema Validation Feature Helps Encourage API Spec Literacy

Postman, an API development platform provider, has announced that its API Builder is gaining the ability to validate API schemas in real time via a new UI pane accessible in the tool’s define tab. The addition of this functionality gives developers real-time feedback and encourages API specification literacy.

At the time of the announcement, Postman’s schema validation functionality supports only OpenAPI 3.0, although Kin Lane, Postman’s Chief Evangelist, noted to ProgrammableWeb that the company intends to “support all of the leading API specifications equally when it comes to autocomplete, validation, and other design-time features.”

While editing OpenAPI definitions in Postman, users will now notice a small banner across the bottom of the define panel that either states “Schema validated” or lists the number of errors found. This information updates in real time, and users can click on the banner to expand the UI and dive into the specifics of the errors. The feature is speedy, usually displaying errors within a few seconds, and provides useful information for identifying the mistake that was made.
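To illustrate the kind of mistake such a validator flags, here is a minimal sketch in Python that checks a handful of required OpenAPI 3.0 fields. It is purely illustrative and is not Postman’s validation engine; the field list and the sample document are simplified assumptions.

```python
# Illustrative only: a tiny structural check for a few required
# OpenAPI 3.0 fields. Postman's validator is far more complete;
# this just shows the kind of error such a tool reports.
import yaml  # assumes PyYAML is installed

REQUIRED_TOP_LEVEL = ("openapi", "info", "paths")

def check_openapi_document(text):
    """Return a list of human-readable error strings for `text`."""
    doc = yaml.safe_load(text)
    if not isinstance(doc, dict):
        return ["Document is not a YAML/JSON object."]
    errors = []
    for field in REQUIRED_TOP_LEVEL:
        if field not in doc:
            errors.append(f"Missing required field: '{field}'")
    version = str(doc.get("openapi", ""))
    if version and not version.startswith("3.0"):
        errors.append(f"Unsupported version '{version}': expected 3.0.x")
    info = doc.get("info", {})
    if isinstance(info, dict):
        for field in ("title", "version"):
            if field not in info:
                errors.append(f"Missing required field: 'info.{field}'")
    return errors

sample = """
openapi: 3.0.0
info:
  title: Demo API
paths: {}
"""

for err in check_openapi_document(sample):
    print(err)  # -> Missing required field: 'info.version'
```

A real validator checks the full OpenAPI 3.0 specification (path items, schemas, references, and so on), but the feedback loop is the same: edit, see the error count change, fix, repeat.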

The most straightforward benefits of this new tool are obvious: identifying errors in real time is certain to improve development speed and accuracy on the platform. When ProgrammableWeb asked Lane about other, less obvious benefits of the feature, he pointed to:

“OpenAPI literacy to help educate developers about the finer details of the specification, as well as helping speed up their design processes.” Lane continued by noting that there is additional value in “Providing a feedback loop around not just the APIs, but how OpenAPI is being applied (or not), gathering data, and feeding back to the OAI to inform the road map for the specification.”

This new Schema Validation functionality is available now in Postman v7.29’s API Builder. 

Go to Source
Author: KevinSundstrom

Categories
ScienceDaily

Promising new research identifies novel approach for controlling defects in 3D printing

With its ability to yield parts with complex shapes and minimal waste, additive manufacturing has the potential to revolutionize the production of metallic components. That potential, however, is currently limited by one critical challenge: controlling defects in the process that can compromise the performance of 3D-printed materials.

A new paper in the journal Additive Manufacturing points to a possible breakthrough solution: Use temperature data at the time of production to predict the formation of subsurface defects so they can be addressed right then and there. A team of researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory, together with a colleague now at Texas A&M University, discovered the possibility.

“Ultimately you would be able to print something and collect temperature data at the source and you could see if there were some abnormalities, and then fix them or start over,” said Aaron Greco, group manager for Argonne’s Interfacial Mechanics & Materials group in the Applied Materials Division (AMD) and a study author. “That’s the big-picture goal.”

For their research, the scientists used the extremely bright, high-powered X-rays at beamline 32-ID-B at Argonne’s Advanced Photon Source (APS), a Department of Energy Office of Science User Facility. They designed an experimental rig that allowed them to capture temperature data from a standard infrared camera viewing the printing process from above while simultaneously using an X-ray beam, viewing from the side, to identify whether porosity was forming below the surface.

Porosity refers to tiny, often microscopic “voids” that can occur during the laser printing process and that make a component prone to cracking and other failures.

According to Noah Paulson, a computational materials scientist in the Applied Materials Division and lead author on the paper, this work showed that there is in fact a correlation between surface temperature and porosity formation below the surface.

“Having the top and side views at the same time is really powerful. With the side view, which is what is truly unique here with the APS setup, we could see that under certain processing conditions based on different time and temperature combinations porosity forms as the laser passes over,” Paulson said.

For example, the paper observed that thermal histories where the peak temperature is low and followed by a steady decline are likely to be correlated with low porosity. In contrast, thermal histories that start high, dip, and then later increase are more likely to indicate large porosity.

The scientists used machine learning algorithms to make sense out of the complex data and predict the formation of porosity from the thermal history. Paulson said that in comparison to the tools developed by tech giants that use millions of data points, this effort had to make do with a couple hundred. “This required that we develop a custom approach that made the best use of limited data,” he said.
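As a rough illustration of that idea, and emphatically not the team’s actual model, here is a hedged sketch in Python: fit a simple classifier on a couple hundred synthetic thermal-history samples, where high peak temperatures and a later re-rise in temperature are made more likely to coincide with porosity, mirroring the trend reported above. The feature definitions and data are assumptions.

```python
# Illustrative sketch only: predicting porosity from simple thermal-history
# features with a small dataset, in the spirit described above. The features
# and synthetic data are assumptions, not the Argonne team's model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # roughly the "couple hundred" data points mentioned above

# Hypothetical features per melt-pool track:
#   peak_temp   - peak surface temperature seen by the IR camera (arbitrary units)
#   reheat_rise - temperature increase after the initial dip (0 if it only decays)
peak_temp = rng.normal(1.0, 0.2, n)
reheat_rise = np.clip(rng.normal(0.0, 0.1, n), 0, None)

# Synthetic labels follow the reported trend: high peaks and a later
# re-rise in temperature are more likely to coincide with porosity.
logit = 4.0 * (peak_temp - 1.0) + 8.0 * reheat_rise - 0.5
porous = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([peak_temp, reheat_rise])
model = LogisticRegression()
scores = cross_val_score(model, X, porous, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```

With so few samples, a simple, well-regularized model and cross-validation are about all the data can support, which is the “custom approach that made the best use of limited data” trade-off Paulson describes.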

3D printers typically come equipped with infrared cameras, but cost and complexity make it impossible to equip a commercial machine with the kind of X-ray technology that exists at the APS, one of the most powerful X-ray light sources in the world. By building the methodology around the infrared systems that printers already have, however, that X-ray view wouldn’t be necessary.

“By correlating the results from the APS with the less detailed results we can already get in actual printers using infrared technology, we can make claims about the quality of the printing without having to actually see below the surface,” explained co-author Ben Gould, a materials scientist in the AMD.

The ability to identify and correct defects at the time of printing would have important ramifications for the entire additive manufacturing industry because it would eliminate the need for costly and time-consuming inspections of each mass-produced component. In traditional manufacturing, the consistency of the process makes it unnecessary to scan every metallic component coming off of the production line.

“Right now, there’s a risk associated with 3D printing errors, so that means there’s a cost. That cost is inhibiting the widespread adoption of this technology,” Greco said. “To realize its full potential, we need to lower the risk to lower the cost.”

This effort is made all the more urgent in recognizing one of the key advantages that additive manufacturing has over traditional manufacturing. “We saw with the recent pandemic response how valuable it would be to be able to quickly adapt production to new designs and needs. 3D technology is very adaptable to those kinds of changes,” added Greco.

Looking ahead, Gould said the research team was hopeful that what he called a “very, very good first step” would allow it to keep improving and expanding the model. “For machine learning, to build accurate models you need thousands and thousands of data points. For this experiment, we had 200. As we put in more data, the model will get more and more exact. But what we did find is very promising.”

Go to Source
Author:

Categories
ScienceDaily

New molecular tool precisely edits mitochondrial DNA

The genome in mitochondria — the cell’s energy-producing organelles — is involved in disease and key biological functions, and the ability to precisely alter this DNA would allow scientists to learn more about the effects of these genes and mutations. But the precision editing technologies that have revolutionized DNA editing in the cell nucleus have been unable to reach the mitochondrial genome.

Now, a team at the Broad Institute of MIT and Harvard and the University of Washington School of Medicine has broken this barrier with a new type of molecular editor that can make precise C•G-to-T•A nucleotide changes in mitochondrial DNA. The editor, engineered from a bacterial toxin, enables modeling of disease-associated mitochondrial DNA mutations, opening the door to a better understanding of genetic changes associated with cancer, aging, and more.

The work is described in Nature, with co-first authors Beverly Mok, a graduate student from the Broad Institute and Harvard University, and Marcos de Moraes, a postdoctoral fellow at the University of Washington (UW).

The work was jointly supervised by Joseph Mougous, UW professor of microbiology and an investigator at the Howard Hughes Medical Institute (HHMI), and David Liu, the Richard Merkin Professor and director of the Merkin Institute of Transformative Technologies in Healthcare at the Broad Institute, professor of chemistry and chemical biology at Harvard University, and HHMI investigator.

“The team has developed a new way of manipulating DNA and used it to precisely edit the human mitochondrial genome for the first time, to our knowledge — providing a solution to a long-standing challenge in molecular biology,” said Liu. “The work is a testament to collaboration in basic and applied research, and may have further applications beyond mitochondrial biology.”

Agent of bacterial warfare

Most current approaches to studying specific variations in mitochondrial DNA involve using patient-derived cells, or a small number of animal models, in which mutations have occurred by chance. “But these methods pose major limitations, and creating new, defined models has been impossible,” said co-author Vamsi Mootha, institute member and co-director of the Metabolism Program at Broad. Mootha is also an HHMI investigator and professor of medicine at Massachusetts General Hospital.

While CRISPR-based technologies can rapidly and precisely edit DNA in the cell nucleus, greatly facilitating model creation for many diseases, these tools haven’t been able to edit mitochondrial DNA because they rely on a guide RNA to target a location in the genome. The mitochondrial membrane allows proteins to enter the organelle, but is not known to have accessible pathways for transporting RNA.

One piece of a potential solution arose when the Mougous lab identified a toxic protein made by the pathogen Burkholderia cenocepacia. This protein can kill other bacteria by directly changing cytosine (C) to uracil (U) in double-stranded DNA.

“What is special about this protein, and what suggested to us that it might have unique editing applications, is its ability to target double-stranded DNA. All previously described deaminases that target DNA work only on the single-stranded form, which limits how they can be used as genome editors,” said Mougous. His team determined the structure and biochemical characteristics of the toxin, called DddA.

“We realized that the properties of this ‘bacterial warfare agent’ could allow it to be paired with a non-CRISPR-based DNA-targeting system, raising the possibility of making base editors that do not rely on CRISPR or on guide RNAs,” explained Liu. “It could enable us to finally perform precision genome editing in one of the last corners of biology that has remained untouchable by such technology — mitochondrial DNA.”

“Taming the beast”

The team’s first major challenge was to eliminate the toxicity of the bacterial agent — what Liu described to Mougous as “taming the beast” — so that it could edit DNA without damaging the cell. The researchers divided the protein into two inactive halves that could edit DNA only when they combined.

The researchers tethered the two halves of the tamed bacterial toxin to TALE DNA-binding proteins, which can locate and bind a target DNA sequence in both the nucleus and mitochondria without the use of a guide RNA. When these pieces bind DNA next to each other, the complex reassembles into its active form, and converts C to U at that location — ultimately resulting in a C•G-to-T•A base edit. The researchers called their tool a DddA-derived cytosine base editor (DdCBE).

The team tested DdCBE on five genes in the mitochondrial genome in human cells and found that DdCBE installed precise base edits in up to 50 percent of the mitochondrial DNA. They focused on the gene ND4, which encodes a subunit of the mitochondrial enzyme complex I, for further characterization. Mootha’s lab analyzed the mitochondrial physiology and chemistry of the edited cells and showed that the changes affected mitochondria as intended.

“This is the first time in my career that we’ve been able to engineer a precise edit in mitochondrial DNA,” said Mootha. “It’s a quantum leap forward — if we can make targeted mutations, we can develop models to study disease-associated variants, determine what role they actually play in disease, and screen the effects of drugs on the pathways involved.”

Future developments

One goal for the field now will be to develop editors that can precisely make other types of genetic changes in mitochondrial DNA.

“A mitochondrial genome editor has the long-term potential to be developed into a therapeutic to treat mitochondrial-derived diseases, and it has more immediate value as a tool that scientists can use to better model mitochondrial diseases and explore fundamental questions pertaining to mitochondrial biology and genetics,” Mougous said.

The team added that some features of DdCBE, such as its lack of RNA, may also be attractive for other gene-editing applications beyond the mitochondria.

This work was supported in part by the Merkin Institute of Transformative Technologies in Healthcare, NIH (R01AI080609, U01AI142756, RM1HG009490, R35GM122455, R35GM118062, and P30DK089507), the Defense Threat Reduction Agency (1-13-1-0014), and the University of Washington Cystic Fibrosis Foundation.

Go to Source
Author:

Categories
ScienceDaily

Untwisting plastics for charging internet-of-things devices

Untwisting chains of atoms within a plastic polymer improves its ability to conduct electricity, according to a report by researchers, led by Nagoya University applied physicist Hisaaki Tanaka, in the journal Science Advances. The insight could help accelerate the development of wearable power sources for a vast number of Internet-of-things devices.

The ‘smart’ societies of the future are expected to contain a large number of electronic devices that are interconnected through the Internet: the so-called Internet-of-things. Scientists have been looking for ways to use body heat to charge some types of micro-devices and sensors. But this requires lightweight, non-toxic, wearable, and flexible thermoelectric generators.

Plastics that can conduct electricity, called conducting polymers, could fit the bill, but their thermoelectric performance needs to be improved. Their thin films have highly disordered structures, formed of crystalline and non-crystalline parts, making it notoriously difficult to understand their properties and thus find ways to optimize their performance.

Tanaka worked with colleagues in Japan to understand the thermoelectric properties of a highly conductive thiophene-based polymer, called PBTTT. They added or ‘doped’ the polymer with a thin ion electrolyte gel, which is known to improve conductivity. The gel only infiltrates the polymer successfully when a specific electric voltage is applied.

They used a variety of measurement techniques to understand the polymer’s electronic and structural changes when doped. They found that, without the electrolyte gel, the PBTTT chain is highly twisted. Doping it with a critical amount of electrolyte untwists the chain and creates links between its crystalline parts, improving electron conductivity.

The scientists report that the formation of this interconnected conductive network is what determines the polymer’s maximum thermoelectric performance, which they were able to uniquely observe in this study.

They are now looking into ways to optimize the thermoelectric performance of thin film conducting polymers through material design and changing the fabrication conditions.

Story Source:

Materials provided by Nagoya University. Note: Content may be edited for style and length.

Go to Source
Author:

Categories
ProgrammableWeb

69 Percent of IT Leaders Keep the Lights On Instead of Innovating. Enter APIs.

Responding to a question regarding their ability to innovate on behalf of the organizations that they work for, 69 percent of IT leaders surveyed said they felt they were working just to keep the lights on versus innovating. This was the answer to one of many questions that MuleSoft posed to over 800 IT leaders, the complete results of which can be found in the 2020 edition of the company’s annual Connectivity Benchmark Report.


Full-text Transcript of Interview with MuleSoft Director of Solutions Engineering Ani Pandit

(00:29) David Berlind: Hi, I’m David Berlind, Editor in Chief of Programmable Web. Today is Tuesday, March 17th, 2020, and this is ProgrammableWeb’s Developers Rock podcast. With me today is Ani Pandit. He is the Director of solutions engineering at MuleSoft, and MuleSoft has just released their new connectivity benchmark report. We want to dig into that, but first Ani, thanks very much for being here on the show.

(00:55) Ani Pandit: Thanks David for having me over.

(00:57) David: It’s great to have you.

(00:58) Ani: Second year in a row now.

(00:59) David: Second year in a row. That’s right. We caught up on this connectivity benchmark report that you guys did last year. The first question I want to ask you, though, is for those people who don’t know what MuleSoft is, what is MuleSoft?

(01:12) Ani: All right, let’s start there. So MuleSoft provides a platform that necessarily helps companies achieve their digital transformation initiatives, and we do that by helping them easily create connected experiences for their customers, their employees, or partners. And we do that by helping them easily connect all their business systems, their data silos and devices, IoT devices, by means of APIs. And our platform helped these companies streamline API and integration life cycle and help them strategically stand up API programs that help them build these integrations and connected experiences faster.

(01:59) David: What’s an example of a connected experience?

(02:01) Ani: So I’ll use something that most people would relate to is when they go online and they search for a product or a service, right? For example, recently I was looking for a tennis ball machine online. I went to Google, said “Hey, which are the best tennis ball machines?” And then it gave me 10 different products. And from there I went to Amazon, I looked at that product, looked at the reviews, went to Yelp, looked that one up. And then went to our local store that had the device there and talked to the people over there. And then I went back online to the company’s website, and through their eCommerce platform I actually bought it. So a couple of days later I realized that the device was not shipped to me. So what I did was I called in the call center to find out where my order was and they expedited that shipment.

(03:05) I got it, and I started using it. So when you think about this journey of a consumer, in this case myself, interacting with this brand across many, many different channels from the time where I’m researching the product to experiencing the product, to going and buying and paying for the product, and then even getting after-purchase support from the vendor. That’s the experience we’re talking about. And a lot of customers, when they say “connected experiences”, they want to deliver a very linear experience in the sense that the customer doesn’t feel that they are working in isolation, or working in silos with different parts of the business. Where they have a unique experience with the brand and a consistent experience with the brand as they move through that customer journey.

(04:04) David: So when you connect these siloed systems behind the scenes, it allows you to create something that’s much more seamless and frictionless from the beginning to the end of the journey. Is that what you’re saying?

(04:15) Ani: Exactly. Like typically you would imagine most organizations on average have close to 900 to 1,000 applications. And when you think about our typical customer experience, it spans around 35 to 37 different systems where information and data are being collected. So if you think about delivering a consistent linear and a unique experience to that particular customer, you’re thinking about integrating and connecting all of these 35, 37 different systems to deliver that consistent experience to the customer. So, you’re absolutely right David, it’s all about getting the connectivity of your platforms and your data to deliver that unique customer journey for your constituents.

(05:10) David: Well you just mentioned that there are like 900 applications at these organizations, and that is the data that I also saw when I previewed this connectivity benchmark report, that there’s somewhere between 900 and a thousand applications at these companies. But something else I saw in the report was that somewhere around only 28 percent of companies actually had these applications integrated with each other (Editor’s Note: the report indicated that, across those surveyed, only 28 percent of applications were integrated). And you’re talking about how important this is to create the sort of 360 degree view of the customer. Well if only 28 percent have them integrated, that doesn’t speak well of the other 72 percent. Is that something that’s changing at all from year to year? You did this report last year and I think the data was pretty much the same.

(05:52) Ani: Yeah, that’s a great observation. It’s remained pretty flat year over year. And my assessment of this is a few things. First of all, I’ve been talking to a lot of leaders across the board in different industries, and a common thread of what they are seeing is they’ve been retiring a lot of applications — legacy ones, some that are not useful as their businesses have evolved. But on the other side they have accelerated bringing in newer business functionality that enables them to be more competitive in the business. So with that, what’s happened is there’s more and more demand in terms of building these connected experiences for their IT team. Even though there has been some progress in modernizing older systems, the whole increase in the number of applications that the business is adopting means the statistics don’t show a marked improvement in terms of integrated applications as such.

(07:06) David: Now you’re talking a little bit about making these changes and digitally transforming internally. Another data point that I saw had to do with the number of organizations that are looking at integration as a means of solving internal problems, but may not be necessarily looking outward. This idea of operating in two different modes, sometimes people call it two-speed IT. I don’t think the report refers to it that way, but what’s happening on that front? Are organizations seeing the connection between operating in those two different modes and something like improved efficiency overall better, better efficiency than their competitors, or better revenues than their competitors?

(07:47) Ani: This is a great question, because it talks to some business strategies, right? Certain organizations are primarily focused on delivering the right experience, but they are thinking about delivering that by driving internal efficiencies. And that helps them improve margins as part of their customer interaction. And there are certain set of customers who are thinking about, “Hey we need to grow our top line revenue and show growth in terms of introducing new products and services, and deliver those products and services to these new experiences.” So that those two strategies talk to where we are seeing the industry or most of our customers drive these strategies and on modernization.

(08:37) Ani: So when we think about driving internal efficiencies, we are looking at customers and companies that are taking APIs and taking these modern strategies to drive better automation of their business processes where they can integrate these systems to deliver these experiences. And this includes a legacy modernization where access to say a mainframe, or a legacy database, and custom applications on the back end are being driven through modern age APIs, and their core focus in that first phase of transformation is to drive accessibility to these legacy systems and be able to have the data that had been locked into these applications over many years in these organizations. So that’s one aspect, which is the majority of companies that were surveyed in this report, which they talked about.

(09:41) David: And of course you’re talking about, when you talk about taking a system like a mainframe and modernizing it, fronting it with APIs, we’re literally talking about that’s one of those 900 systems that needs to be integrated with something else, which may not be a legacy system. It may be a modern system, but at the end of the day, whether it’s new or old, these are those 900 systems that have to be pulled together to create those connected customer experiences you’re talking about.

(10:06) Ani: Exactly. I’ll give an example just to illustrate. Even though this is one of the systems, they become really core to some of the businesses. For example, I was working with one of the largest banks in Europe on one of their open banking initiatives, and primarily their entire credit card services and core banking system is running on a mainframe. So pretty much 90 percent of what they do in that particular line of business is driven through mainframe. And as part of their business project they were trying to build a newer customer experience with an application that would help their credit card customers become much better at spending money and driving points and loyalty with the bank. So they wanted to create a mobile app. Now think about the impedance mismatch between the experience you’re trying to drive in a modern mobile app, and the green screen data that you’re seeing in a mainframe, right? And that’s where APIs come in handy, where these types of companies are using APIs to modernize and bridge that impedance between a modern engagement layer and legacy data.

(11:30) David: Right. So getting that information that’s typically viewed on a green screen and moving it into a really slick mobile front-end that customers can use on their smartphone.

(11:39) Ani: That’s right. And that becomes a foundation for them to unlock this information from their mainframes into other applications as well, going forward. So setting that foundation has been a core focus for a lot of the companies that were surveyed in this particular benchmark report.

(11:58) David: And there, I think you’re talking about the reusability. Once you’ve got an API that’s open for access to, let’s say the new mobile app, you can reuse that API in some other innovative way.

(12:09) Ani: Yeah. So this is a good point you’re making. That is the hope, that when people are driving these strategies, one thing that came out of this benchmark report was 80 percent of IT leaders identified that APIs were super critical for them to drive this digital transformation. But also a lot of them also reported that they were not necessarily building APIs that were reusable. So, this is where I see an opportunity for a lot of organizations to start thinking more strategically in how they build building blocks, which are these APIs, that can become more reusable, that help them accelerate the speed of a newer initiatives and newer projects down the road. And this is one way where they can change the clock speed of their organization where rather than starting from scratch, by reusing some of these frameworks, they are basically accelerating their ability to deliver these new initiatives.

(13:19) David: Yeah. So I think what you’re getting at is that organizations have to include more stakeholders in the API strategy conversation. Like, “Hey, is this API going to be reusable and in multiple places? If not, what do we have to do to make it that way so that it serves the entire organization and not just one application?” Because if you end up with sort of a one-to-one relationship between the final API and some application and it doesn’t get reused, you’re losing out on a whole bunch of potential efficiency.

(13:51) Ani: Exactly. In my observation I’ve seen two or three different types of approaches just to add a little more color to what you just summarized there, which is a very important point. A lot of organizations last year did recognize that APIs were extremely critical for them to deliver these integrations and connected experiences, whether it was for an internal audience or an external audience, but how they went about doing it, the data in this report shows a very marked difference. There are certain organizations that took an API strategy much more isolated from the rest of the organization. It was very project focused, and not necessarily as a strategy that was holistic. And in some instances they were strategies that were only line of business focused, and they did see some efficiencies and benefits out of that.

(14:56) And there were very few organizations, less than 25 percent, that took this to the next level where they said, “this has to be a top down strategy where we can drive the most efficiency and effectiveness across the organization and be able to realize a consistency in doing that.” And that directly talks to exactly what you’re saying is, “How do you transform an organization?” By building these building blocks that are reusable and recomposable, so that you can drive better agility and effectiveness in delivering new products, services, or optimizations in business processes.

(15:39) David: Let’s jump on that top down approach, because I did read in the report that the organizations that were most effective were ones that had more of a top down approach here. And to me, I think, in my personal view that sounds problematic. I mean, we all know about the famous memo that Jeff Bezos sent out to everybody at Amazon that from now on we’re going to take more of this services led approach where every system has a service layer on it like an API. And if you don’t do it, you’re going to get fired. That was back in 2002. Very famous, and speaks to, it’s great when the leaders of the organization really get it, right? But in truth, all of those businesses out there all around the world don’t have leaders like Jeff Bezos who are thinking on these terms, and at some point, it seems to me like this has to come from within the organization.

(16:38) It has to be the people who are working facing customers, people who are working in the accounting department, who are the ones who, they are aware of the potential of integration getting these 900 systems connected to each other, just exactly how they can move the business forward. It’s got to be driven from within. If we just wait for all of this top down approach to happen, I think we’ll continue to be stuck. We won’t be seeing a lot of change in this data from one year to the next. I don’t know what your opinion of that is.

(17:07) Ani: I would concur with what you’re saying. There needs to be pragmatism. Let me chime in on the Jeff Bezos comment that you made. Let’s look at Amazon back in 2002. It was still an upcoming company completely built on modern technology. They did not have challenges like a 150-year-old insurance company or a banking company, or a 70-year-old retail company has, in terms of tech debt, or a mandate in terms of organizational culture where you have leaders who can actually drive such a change, right? So you cannot necessarily do that in certain organizations just because of the culture of it. So your observation is spot on that certain organizations are not built for a top down specific mandate. To be most effective, it has to be grassroots driven, but there are methods and methodologies that can be mandated as part of an organization for them to be able to deliver the right kind of bottoms-up strategy.

(18:21) One example you brought up was people who are closer to the customer. As long as you take your design cues of whatever you’re building, whatever experience you’re building, by taking feedback from your constituents or people who are going to consume your products or services or your application, and design your applications that way, it can basically drive a lot of behavior in teams and project teams and lines of businesses and how they build their experiences and their applications. What I’m seeing also in some of the organizations I’ve worked very closely with who are doing this legacy modernization as their foundational layer for that first phase of digital transformation, they are taking a design thinking approach to identify what are the core assets that they should build that are going to be reusable for the long term.

(19:21) So having setup some time to think through that and being able to build those assets early on gives you a lot more value down the road. And they’re seeing 70-80 percent productivity in their teams when they think about this bottom up strategy. So I would say in terms of my opinion, based on my observation, it depends on the organizational culture to have both a top down and a bottom up strategy. But it should always come from a place where you’re always thinking about the consumer of your application or your services or your product.

(20:00) David: It’s a huge cultural shift. And touching on that cultural shift, one other thing that I spotted in the report which really jumped out at me was, I’m going to read it here. 69 percent of the people surveyed felt like they were keeping the lights on versus innovating. So you’ve got IT directors here responding to the survey, the grand majority of them saying they’re just keeping the lights on. That just seems to me like… How is it possible that any of these organizations are going to get these cultural changes in place? And even if they do, if they’re spending so much time just keeping the lights on, they can’t innovate. What’s going to happen?

(20:45) Ani: Yeah. So this is exactly why we see not a marked change in from last year, is because of this delivery gap that we spoke about a few minutes earlier. What’s happening is the demand on IT, like IT budgets are not increasing that much. It’s less than 10 percent year over year on average. And the number of resources in those organizations aren’t growing markedly either. So, basically you have a certain finite set of resources, but the demand from business is significant, they are like, let’s bring in big data analytics, let’s do some marketing initiatives—

(21:34) David: Artificial intelligence.

(21:35) Ani: — and intelligence in AI and bot automation. And they’re bringing in all these concepts, which are completely new, to deliver these newer experiences. But what the IT teams and leaders are tasked with is not only bringing in and adopting these new technologies and delivering these new projects, but also enduring the legacy aspect and supporting those things. So the pace of innovation hasn’t accelerated. Even though IT leaders are doubling down on building the building blocks for modernization through DevOps and cloud-first and API integrations, their strategy in terms of execution is where there is a lot of opportunity for us to focus on and be able to drive efficiency, so that next year we can start seeing benefits out of that, where we start seeing more innovation coming out of these organizations.

(22:40) David: Yeah, 69-31. I mean, I have a really clear image of a guy standing by a light switch just praying that the mainframe stays on, and keeps powering all those applications. But what you’re saying, I think, is that in some ways an API led holistically thought out strategy is one that can help deal with that 69-31 split. You’re not getting more budget, but if you can get more efficient about how you’re delivering this end to end customer experience, these 360 degree views of the customer, the integration of these 900 different applications, you can essentially find a way to drive more innovation eventually.

(23:29) Ani: Exactly.

(23:29) David: You’re saying in a year. So, are you saying that one year from now when I come back and I interview you, we’re going on two years now, we’ll do it a third year in a row, when I come back there is going to be a big change here?

(23:41) Ani: Yeah, so there are two critical aspects. What I want to see is this. For any organization who’s already thinking and 80 percent of IT decision makers in this report suggested that integration and API strategies were very key to their transformation success. Well, there are two things that are very critical for this. One is building a culture of self-service, and building reuse into that strategy. As long as you do those two things, I would definitely see a lot more acceleration in terms of innovation. Because when I see just the organizations who are using these two tenets as key constituents of their API strategy, we are seeing 67 percent more productivity in these organizations. And that is just an anecdotal data coming out of this benchmark this year.

(24:53) David: So that’s where you can close that delivery gap.

(24:56) Ani: Exactly. This is where there are differences between these 26 percent organizations and the other 73 percent of organizations.

(25:05) David: Well, those 26 percent of those organizations are going to end up ruling the world if the other ones don’t get on board pretty quickly.

(25:13) Ani: I hope not. And I hope people see that this is quite essential to try and bridge the gap, this delivery gap.

(25:22) David: Ani, where can everybody go to find all the insights in this report?

(25:27) Ani: So we have published the benchmark report, the connectivity benchmark report on MuleSoft website on our resources section, and it’s free to download.

(25:42) David: Okay. Well we’re out of time. I want to thank you very much for joining us. We’ve been speaking with Ani Pandit, he’s the Director of solutions engineering at MuleSoft. Ani, thanks very much.

(25:52) Ani: Thanks, David. Thanks for having me over.

(25:54) David: We’ll see you next year. For ProgrammableWeb, I’m David Berlind, the Editor in Chief. If you want to see more videos like this one, you can just come to ProgrammableWeb.com, where we not only have the video, but we’ll have the full text transcript of this and all the other videos we’ve recorded. Also the audio only version if you want to consume it as a podcast on your smartphone, and if you want to watch the video on our YouTube channel, just go to www.youtube.com/programmableweb. All of our videos are up there. Until the next podcast. Thanks very much for joining us.

Go to Source
Author: david_berlind

Categories
ScienceDaily

New method to grow human blood vessels

A team of researchers at the University of Minnesota Medical School recently proved the ability to grow human-derived blood vessels in a pig — a novel approach that has the potential for providing unlimited human vessels for transplant purposes. Because these vessels were made with patient-derived skin cells, they are less likely to be rejected by the recipient, helping patients potentially avoid the need for life-long, anti-rejection drugs.

Daniel Garry, MD, PhD, and Mary Garry, PhD, both professors in the Department of Medicine at the U of M Medical School, co-led the research team and published their findings in Nature Biotechnology last week.

“There’s so many chronic and terminal diseases, and many people are not able to participate in organ transplantation,” said Daniel, who is also a heart failure and transplant cardiologist. “About 98 percent of people are not going to be eligible for a heart transplant, so there’s been a huge effort in trying to come up with strategies to increase the donor pool. Our approach looked at a pig.”

Because of similarities between human and pig physiology, scientists have historically studied pigs to discover treatments for health issues, including diabetes. Before researchers engineered human insulin, doctors treated patients with pig insulin.

“Our discovery has made a platform for making human blood vessels in a pig,” said Daniel. “This could allow us to make organs with human blood vessels that would be less apt to be rejected and could be used in patients in need of a transplant. That’s what typically causes rejection — the lining of the blood vessels in the organs.”

The blood vessels created by the Garry duo will avoid rejection because of the method by which they are made. The team injects human-induced pluripotent stem cells — taken from mature cells scraped from a patient’s skin and reprogrammed to a stem cell state — into a pig embryo, which is then placed into a surrogate pig. In the future, viable piglets, with blood vessels that will be an exact match to the patient, will ensure a successful transplant and the ability to live without the need for immunosuppression, or anti-rejection, drugs.

“There’s hundreds of thousands of patients that have peripheral artery disease, either because of smoking or diabetes or any number of causes, and they have limb amputations,” Mary said. “These blood vessels would be engineered and could be utilized in these patients to prevent those kinds of life-long handicaps, if you will.”

The first phase of their study, approved by the U of M’s Stem Cell Research Oversight committee, brought the first embryo to a 27-day term. Because of the success of this phase, Daniel and Mary are currently seeking the committee’s approval to advance the research further into the later gestational period.

“We’re trying to take it in a phased approach,” Daniel said. “We want to be sure we address all of the possible issues — whether human cells go where we want them to go.”

“While it is a first phase, there’s pretty solid proof of concept,” Mary said. “We believe that we’ve proven that there’s no off-target effects of these cells, so we’re ready to move forward to later gestational stages.”

Go to Source
Author:

Categories
IEEE Spectrum

Damon’s Hypersport AI Boosts Motorcycle Safety

For all its pure-electric acceleration and range and its ability to shapeshift, the Hypersport motorcycle shown off last week at CES by Vancouver, Canada-based Damon Motorcycles matters for just one thing: It’s the first chopper swathed in active safety systems.

These systems don’t take control, not even in anticipation of a crash, as they do in many advanced driver assistance systems in cars. They leave a motorcyclist fully in command while offering the benefit of an extra pair of eyes.

Why drape high tech “rubber padding” over the motorcycle world? Because that’s where the danger is: Motorcyclists are 27 times more likely to die in a crash than are passengers in cars.

“It’s not a matter of if you’ll have an accident on a motorbike, but when,” says Damon chief executive Jay Giraud. “Nobody steps into motorbiking knowing that, but they learn.”

The Hypersport’s sensor suite includes cameras, radar, GPS, solid-state gyroscopes and accelerometers. It does not include lidar (“it’s not there yet,” Giraud says), but it does open the door a crack to another way of seeing the world: wireless connectivity.

The bike’s brains note everything that happens when danger looms, including warnings issued and evasive maneuvers taken, then shunt the data to the cloud via 4G wireless. For now, that data is processed in batches to help Damon refine its algorithms, a practice common among self-driving car researchers. Someday, the bike will share such data with other vehicles in real time, a strategy known as vehicle-to-everything, or V2X.

But not today. “That whole world is 5-10 years away—at least,” Giraud grouses. “I’ve worked on this for over a decade—we’re no closer today than we were in 2008.”

The bike has an onboard neural net whose settings are fixed at any given time. When the net up in the cloud comes up with improvements, these are sent as over-the-air updates to each motorcycle. The updates have to be approved by each owner before going live onboard. 

When the AI senses danger, it gives a warning. If the car up ahead suddenly brakes, the handlebars shake, warning of a frontal collision. If a vehicle coming from behind enters the biker’s blind spot, LEDs flash. That saves the rider the trouble of constantly looking back to check the blind spot.
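As a rough illustration of how a frontal-collision warning of this kind can be derived from radar data, here is a minimal time-to-collision sketch in Python. The threshold, data fields and trigger are assumptions for illustration, not Damon’s actual algorithm.

```python
# Illustrative sketch: a simple time-to-collision (TTC) check of the kind a
# frontal-collision warning could use. Fields and threshold are assumptions;
# this is not Damon's actual system.
from dataclasses import dataclass

@dataclass
class RadarTrack:
    range_m: float            # distance to the vehicle ahead, in metres
    closing_speed_mps: float  # positive when the gap is shrinking, in m/s

WARN_TTC_S = 3.0  # illustrative threshold: warn when under ~3 s to impact

def frontal_collision_warning(track: RadarTrack) -> bool:
    """Return True if the rider should be warned (e.g., handlebar haptics)."""
    if track.closing_speed_mps <= 0:
        return False  # gap is steady or opening: no threat
    ttc = track.range_m / track.closing_speed_mps
    return ttc < WARN_TTC_S

# Example: car ahead brakes hard, a 25 m gap closing at 12 m/s -> TTC ~2.1 s
print(frontal_collision_warning(RadarTrack(range_m=25.0, closing_speed_mps=12.0)))  # True
```

A production system would fuse camera, radar and inertial data and account for lean angle and braking capability; the sketch only shows the basic headroom calculation a warning is trying to protect.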

Above all, it gives the rider time. A 2018 report by the National Highway Traffic Safety Administration found that from 75 to 90 percent of riders in accidents had less than three seconds to notice a threat and try to avert it; 10 percent had less than one second. Just an extra second or two could save a lot of lives.

The patterns the bike’s AI teases out from the data are not always comparable to those a self-driving car would care about. A motorcycle shifts from one half of a lane to the other; it leans down, sometimes getting fearsomely close to the pavement; and it is often hard for drivers in other vehicles to see.

One motorbike-centric problem is the high risk a biker takes just by entering an intersection. Some three-quarters of motorcycle accidents happen there, and of that number about two-thirds are caused by a car’s colliding from behind or from the side. The side collision, called a T-bone, is particularly bad because there’s nothing at all to shield the rider.

Certain traffic patterns increase the risk of such collisions. “Patterns that repeat allow our system to predict risk,” Giraud says. “As the cloud sees the tagged information again and again, we can use it to make predictions.”

Damon is taking pre-orders, but it expects to start shipping in mid-2021. Like Tesla, it will deliver straight to the customer, with no dealers to get in the way.

Categories
ScienceDaily

Deadly ‘superbugs’ destroyed by molecular drills

Molecular drills have gained the ability to target and destroy deadly bacteria that have evolved resistance to nearly all antibiotics. In some cases, the drills make the antibiotics effective once again.

Researchers at Rice University, Texas A&M University, Biola University and Durham (U.K.) University showed that motorized molecules developed in the Rice lab of chemist James Tour are effective at killing antibiotic-resistant microbes within minutes.

“These superbugs could kill 10 million people a year by 2050, way overtaking cancer,” Tour said. “These are nightmare bacteria; they don’t respond to anything.”

The motors target the bacteria and, once activated with light, burrow through their exteriors.

While bacteria can evolve to resist antibiotics by locking the antibiotics out, the bacteria have no defense against molecular drills. Antibiotics able to get through openings made by the drills are once again lethal to the bacteria.

The researchers reported their results in the American Chemical Society journal ACS Nano.

Tour and Robert Pal, a Royal Society University Research Fellow at Durham and co-author of the new paper, introduced the molecular drills for boring through cells in 2017. The drills are paddlelike molecules that can be prompted to spin at 3 million rotations per second when activated with light.

Tests by the Texas A&M lab of lead scientist Jeffrey Cirillo and former Rice researcher Richard Gunasekera, now at Biola, showed the motors effectively killed Klebsiella pneumoniae within minutes. Microscopic images of targeted bacteria showed where motors had drilled through cell walls.

“Bacteria don’t just have a lipid bilayer,” Tour said. “They have two bilayers and proteins with sugars that interlink them, so things don’t normally get through these very robust cell walls. That’s why these bacteria are so hard to kill. But they have no way to defend against a machine like these molecular drills, since this is a mechanical action and not a chemical effect.”

The motors also increased the susceptibility of K. pneumoniae to meropenem, an antibacterial drug to which the bacteria had developed resistance. “Sometimes, when the bacteria figures out a drug, it doesn’t let it in,” Tour said. “Other times, bacteria defeat the drug by letting it in and deactivating it.”

He said meropenem is an example of the former. “Now we can get it through the cell wall,” Tour said. “This can breathe new life into ineffective antibiotics by using them in combination with the molecular drills.”

Gunasekera said bacterial colonies targeted with a small concentration of nanomachines alone killed up to 17% of cells, but that increased to 65% with the addition of meropenem. After further balancing motors and the antibiotic, the researchers were able to kill 94% of the pneumonia-causing pathogen.

Tour said the nanomachines may see their most immediate impact in treating skin, wound, catheter or implant infections caused by bacteria — like Staphylococcus aureus (MRSA), Klebsiella or Pseudomonas — and intestinal infections. “On the skin, in the lungs or in the GI tract, wherever we can introduce a light source, we can attack these bacteria,” he said. “Or one could have the blood flow through a light-containing external box and then back into the body to kill blood-borne bacteria.”

“We are very much interested in treating wound and implant infections initially,” Cirillo said. “But we have ways to deliver these wavelengths of light to lung infections that cause numerous mortalities from pneumonia, cystic fibrosis and tuberculosis, so we will also be developing respiratory infection treatments.”

Gunasekera noted bladder-borne bacteria that cause urinary tract infections may also be targeted.

The paper is one of two published by the Tour lab this week that advance the ability of microscopic nanomachines to treat disease. In the other, which appears in ACS Applied Materials & Interfaces, researchers at Rice and the University of Texas MD Anderson Cancer Center targeted and attacked lab samples of pancreatic cancer cells with machines that respond to visible rather than the previously used ultraviolet light. “This is another big advance, since visible light will not cause as much damage to the surrounding cells,” Tour said.

Story Source:

Materials provided by Rice University. Original written by Mike Williams. Note: Content may be edited for style and length.

Go to Source
Author:

Categories
ScienceDaily

Squid camouflage may lead to next gen of bio-inspired synthetic materials

Squids, octopuses and cuttlefish are undisputed masters of deception and camouflage. Their extraordinary ability to change color, texture and shape is unrivaled, even by modern technology.

Researchers in the lab of UC Santa Barbara professor Daniel Morse have long been interested in the optical properties of color-changing animals, and they are particularly intrigued by the opalescent inshore squid. Also known as the California market squid, these animals have evolved the ability to finely and continuously tune their color and sheen to a degree unrivaled in other creatures. This enables them to communicate, as well as hide in plain sight in the bright and often featureless upper ocean.

In previous work, the researchers uncovered that specialized proteins, called reflectins, control reflective pigment cells — iridocytes — which in turn contribute to changing the overall visibility and appearance of the creature. But still a mystery was how the reflectins actually worked.

“We wanted now to understand how this remarkable molecular machine works,” said Morse, a Distinguished Emeritus Professor in the Department of Molecular, Cellular and Developmental Biology, and principal author of a paper that appears in the Journal of Biological Chemistry. Understanding this mechanism, he said, would provide insight into the tunable control of emergent properties, which could open the door to the next generation of bio-inspired synthetic materials.

Light-reflecting skin

Like most cephalopods, opalescent inshore squid practice their sorcery by way of what may be the most sophisticated skin found anywhere in nature. Tiny muscles manipulate the skin texture while pigments and iridescent cells affect its appearance. One group of cells controls their color by expanding and contracting cells in their skin that contain sacs of pigment.

Behind these pigment cells is a layer of iridescent cells — those iridocytes — that reflect light and contribute to the animals’ color across the entire visible spectrum. The squids also have leucophores, which control the reflectance of white light. Together, these layers of pigment-containing and light-reflecting cells give the squids the ability to control the brightness, color and hue of their skin over a remarkably broad palette.

Unlike the color from pigments, the highly dynamic hues of the opalescent inshore squid result from changing the iridocyte’s structure itself. Light bounces between nanometer-sized features about the same size as wavelengths in the visible part of the spectrum, producing colors. As these structures change their dimensions, the colors change. Reflectin proteins are behind these features’ ability to shapeshift, and the researchers’ task was to figure out how they do the job.

Thanks to a combination of genetic engineering and biophysical analyses, the scientists found the answer, and it turned out to be a mechanism far more elegant and powerful than previously imagined.

“The results were very surprising,” said first author Robert Levenson, a postdoctoral researcher in Morse’s lab. The group had expected to find one or two spots on the protein that controlled its activity, he said. “Instead, our evidence showed that the features of the reflectins that control its signal detection and the resulting assembly are spread across the entire protein chain.”

An Osmotic Motor

Reflectin, which is contained in closely packed layers of membrane in iridocytes, looks a bit like a series of beads on a string, the researchers found. Normally, the links between the beads are strongly positively charged, so they repel each other, straightening out the proteins like uncooked spaghetti.

Morse and his team discovered that nerve signals to the reflective cells trigger the addition of phosphate groups to the links. These negatively charged phosphate groups neutralize the links’ repulsion, allowing the proteins to fold up. The team was especially excited to discover that this folding exposed new, sticky surfaces on the bead-like portions of the reflectin, allowing them to clump together. Up to four phosphates can bind to each reflectin protein, providing the squid with a precisely tunable process: The more phosphates added, the more the proteins fold up, progressively exposing more of the emergent hydrophobic surfaces, and the larger the clumps grow.

As these clumps grow, the many, single, small proteins in solution become fewer, larger groups of multiple proteins. This changes the fluid pressure inside the membrane stacks, driving water out — a type of “osmotic motor” that responds to the slightest changes in charge generated by the neurons, to which patches of thousands of leucophores and iridocytes are connected. The resulting dehydration reduces the thickness and spacing of the membrane stacks, which shifts the wavelength of reflected light progressively from red to yellow, then to green and finally blue. The more concentrated solution also has a higher refractive index, which increases the cells’ brightness.

“We had no idea that the mechanism we would discover would turn out to be so remarkably complex yet contained and so elegantly integrated in one multifunctional molecule — the block-copolymeric reflectin — with opposing domains so delicately poised that they act like a metastable machine, continually sensing and responding to neuronal signaling by precisely adjusting the osmotic pressure of an intracellular nanostructure to precisely fine-tune the color and brightness of its reflected light,” Morse said.

What’s more, the researchers found, the whole process is reversible and cyclable, enabling the squid to continually fine-tune whatever optical properties its situation calls for.

New Design Principles

The researchers had successfully manipulated reflectin in previous experiments, but this study marks the first demonstration of the underlying mechanism. Now it could provide new ideas to scientists and engineers designing materials with tunable properties. “Our findings reveal a fundamental link between the properties of biomolecular materials produced in living systems and the highly engineered synthetic polymers that are now being developed at the frontiers of industry and technology,” Morse said.

“Because reflectin works to control osmotic pressure, I can envision applications for novel means of energy storage and conversion, pharmaceutical and industrial applications involving viscosity and other liquid properties, and medical applications,” he added.

Remarkably, some of the processes at work in these reflectin proteins are shared by the proteins that assemble pathologically in Alzheimer’s disease and other degenerative conditions, Morse observed. He plans to investigate why this mechanism is reversible, cyclable, harmless and useful in the case of reflectin, but irreversible and pathological for other proteins. Perhaps the fine-structured differences in their sequences can explain the disparity, and even point to new paths for disease prevention and treatment.

Go to Source
Author:

Categories
ScienceDaily

Could mathematics help to better treat cancer?

The development and survival of living beings are linked to the ability of their cells to perceive and respond correctly to their environment. To do this, cells communicate through chemical signal systems, called signalling pathways, which regulate and coordinate cellular activity. However, impaired information processing may prevent cells from perceiving their environment correctly; they then start acting in an uncontrolled way, and this can lead to the development of cancer. To better understand how impaired information transmission influences the activity of diseased cells, researchers at the University of Geneva (UNIGE), Switzerland, are going beyond the field of biology. They propose to examine cellular communication in the light of information theory, a mathematical theory more commonly used in computer science. This work, published in the journal Trends in Cell Biology, offers a radically new approach to oncology.

“In a way, cancer can be viewed as an information disease,” says Karolina Zielińska, a researcher at the Translational Research in Onco-haematology (CRTOH) at UNIGE Faculty of Medicine and first author of this work. “But while the oncogenic power of over- or under-activated signalling pathways is becoming well known, the exact mechanisms remain quite mysterious.” How, indeed, do cells make their decisions based on the information they perceive — or no longer perceive? “Sometimes biology alone is not enough to decipher everything,” explains Vladimir Katanaev, a professor at CRTOH, who led the research.

Measuring uncertainty

In the late 1940s, American mathematician Claude Shannon developed a probabilistic theory to quantify the information transmitted in a set of messages over a noisy communication channel. This theory has enabled the development of modern communication systems and computers. It is also the basis for a multitude of applications such as data compression and transmission, cryptography and artificial intelligence. “But curiously, Shannon’s theory has not much been applied in the field of cell signalling,” says Karolina Zielińska, who is both a mathematician and a biologist. “Our idea is to use this powerful tool to examine the decisions made by diseased cells and compare them to those made by healthy cells.”

Information theory uses probability theory as a fundamental tool. Its main concept, called entropy, aims to measure the uncertainty of random variables. “If, for example, we flip a coin, the coin may randomly fall on heads or tails, so the result is uncertain. Now imagine a coin with two identical faces: the result is certain, and the entropy is nil. Entropy thus evaluates the degree of uncertainty of a random variable. When applied to communication, entropy indicates the amount of information necessary for the receiver to determine unambiguously what the source has transmitted.”
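The coin example maps directly onto Shannon’s entropy formula; here is a minimal check in Python, purely for illustration:

```python
# Shannon entropy in bits for the coin example above: a fair coin carries
# maximum uncertainty (1 bit), a coin with two identical faces carries none.
from math import log2

def entropy_bits(probs):
    """H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # fair coin -> 1.0 bit
print(entropy_bits([1.0, 0.0]))  # two identical faces -> 0.0 bits
```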

From mathematics to biology

Applied to cell signalling, information theory makes it possible to study how cells process the information they receive from their environment. When a cell receives a stimulus — a piece of information — from its environment, what concentrations of information can the cell process without error? Knowing a cell’s response, can we distinguish between different stimuli to evaluate which one triggered this particular response? These questions are essential in the field of cancer. Indeed, cancer cells may be unable to process information from the environment as well as healthy cells do, and start to proliferate and divide when it is not necessary to do so.
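One way to make these questions concrete is the mutual information between stimulus and response, which measures in bits how well the response identifies the stimulus. The sketch below uses a toy joint distribution that is purely an assumption for illustration, not data from the paper.

```python
# Illustrative sketch: mutual information I(S;R) between a stimulus S and a
# noisy cellular response R, computed from a toy joint distribution.
import numpy as np

# joint[s, r]: probability of stimulus s and response r
# Rows: two stimulus levels; columns: "low" vs "high" response.
joint = np.array([[0.40, 0.10],   # stimulus 0 usually gives a low response
                  [0.10, 0.40]])  # stimulus 1 usually gives a high response

p_s = joint.sum(axis=1, keepdims=True)  # marginal over stimuli
p_r = joint.sum(axis=0, keepdims=True)  # marginal over responses

# I(S;R) = sum_{s,r} p(s,r) * log2( p(s,r) / (p(s) p(r)) )
mask = joint > 0
mi_bits = np.sum(joint[mask] * np.log2(joint[mask] / (p_s @ p_r)[mask]))
print(round(mi_bits, 3))  # ~0.28 bits: responses partly, not perfectly, identify the stimulus
```

A perfectly reliable channel between stimulus and response would give 1 bit here; a cell whose signalling is impaired would drift toward 0, responding in ways that no longer reflect its environment.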

Researchers will now test the validity of their approach by studying how breast and lung cancer cells process information from their environment. Indeed, current treatments generally aim to remove or completely extinguish certain signalling pathways, despite significant side effects. “The new approach we are proposing is not aimed at shutting down the signalling pathways, but rather at restoring their proper activity,” says Vladimir Katanaev. “By applying pure mathematical concepts to biology, we hope to identify information transmission failures that need to be fixed to correct them.”

Story Source:

Materials provided by Université de Genève. Note: Content may be edited for style and length.

Go to Source
Author: