Small-sized amplifier but high in performance

Texas Instruments (TI) has introduced the industry’s smallest operational amplifier (op amp) and low-power comparators, measuring just 0.64 mm². As the first amplifiers in the compact X2SON package, the new TLV9061 op amp and TLV7011 family of comparators enable engineers to reduce their system size and cost.

Notably, the new amplifiers’ small size does not come at the expense of performance. They offer high performance in a variety of Internet of Things (IoT), personal electronics and industrial applications, including mobile phones, wearables, optical modules, motor drives, smart grid and battery-powered systems.

As part of TI’s small-size amplifier portfolio, these new devices enable engineers to design smaller systems while maintaining high performance. The portfolio offers industry-leading package options and includes many of the world’s smallest op amps and comparators.

With a high gain bandwidth (GBW) of 10 MHz, a fast slew rate of 6.5 V/µs and a low noise spectral density of 10 nV/√Hz, the TLV9061 op amp is designed for use in wide-bandwidth, high-performance systems. Additionally, both devices support rail-to-rail inputs and low-voltage operation down to 1.8 V, making them easy to use in battery-powered applications.
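As a rough illustration of what that slew rate implies for large signals, the full-power bandwidth can be estimated from the standard relation below; the 1 V peak output swing is an assumption chosen purely for illustration, not a figure taken from the datasheet.

\[
f_{\text{FPBW}} = \frac{SR}{2\pi V_{\text{peak}}} = \frac{6.5\ \text{V/µs}}{2\pi \times 1\ \text{V}} \approx 1.0\ \text{MHz}
\]

Under that assumption, full-swing outputs are slew-limited to roughly 1 MHz, while small-signal behaviour is set by the 10 MHz GBW.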

In addition to its tiny size, the TLV9061 op amp also features integrated EMI filtering inputs. This helps provide resilient performance for systems prone to RF noise, while significantly reducing the need for external discrete circuitry.

Offset drift and typical input bias current that are two times lower than other small devices across the full temperature range of -40°C to 125°C make for a more precise signal chain. With supply current as low as 335 nA and propagation delay down to 260 ns, the TLV7011 family of nano-power comparators enables low-power systems to monitor signals and respond quickly.

Robots with lasers are tackling high salmon murder rates in Norway

Robots may work in teams, but their personalities can be like chalk and cheese, says Internet of Things (IoT) writer and observer, Nick Booth. Take the Scandinavian robotic cops that are cleaning up the killers on Norway’s fish farms.

With an output of 80 million tons of fish a year, aquaculture is the world’s fastest-growing food-producing sector. It has tripled in 20 years and now provides 17% of the world’s animal protein.

But the waste is growing too. Fish in confinement are more prone to pestilence and disease. With 1,200 farms, Norway is the world’s second-biggest fish farm producer – and it loses €310 million worth of salmon every year.

The sea louse doesn’t look like much, but it will eat a salmon’s skin and blood and leave open wounds. Two robots can stop the killings, but they have two distinct styles. One fights dirty, the other is clean.

Laser strike

Stingray takes the no-nonsense approach one would expect from a product of a capital city. The Oslo-bred robot shoots the killers with a laser gun. It’s an unconventional approach, but it gets results. And it has the full backing of its captain, Esben Beck, who learned his trade the hard way, in Norway’s oil industry.

As a form of pest control it sounds a bit laborious, not to mention dangerous. Shooting a fly-sized target off a moving fish sounds about as efficient as chasing a fox through fields with a pack of dogs.

But the technology is already there. In the 1990s, the Americans worked out how to hit a cruise missile with a laser from five miles away, explains Beck. That’s far more precision than Stingray needs to shoot lice.

The laser is enclosed in a watertight ‘punch bag’, which contains a camera that spots the parasites on fish. Once a louse has been sighted, an onboard computer calculates the laser’s trajectory.

Another camera provides a live feed of the action to the Oslo control room. As fish swim by, lice are detected and within milliseconds a laser beam hits the louse, coagulating it to death. Each louse kill is documented in a photo, which is displayed as a composite in the control room, like a sort of virtual trophy wall.

No salmon are harmed by the laser, as their shiny scales deflect the lethal light. So far, the lasers have treated millions of fish, and fish farmers have reported no injuries.

Salmon farming in Norway uses an open-net system, which means parasites can float freely between the wild environment and the cages holding the farmed salmon.

Killed with a Halo

A more long-term solution is offered by the Halo Robot, which operates from the more rural surroundings of the Aqua Robotics HQ. This louse buster has a completely different MO (modus operandi), one that is much more patient.

Lasers are tough on lice; the Halo tackles the causes of lice. Halo’s robots lay down the law by constantly scrubbing the pens’ infrastructure to stop anything horrible from lousing it up. To paraphrase Travis Bickle, the vigilante anti-hero of cult movie Taxi Driver: “Grease, algae, mussels, hydroids, bacteria, lice … some day […]

Enterprise AI needs high data quality to succeed

There’s no doubt that AI has usurped big data as the enterprise technology industry’s favorite new buzzword. After all, it’s on Gartner’s 2017 Hype Cycle for emerging technologies for a reason.

While progress was slow during the first few decades, AI advancement has accelerated rapidly over the last decade. Some people say AI will augment humans and maybe even make us immortal; others, more pessimistic, say AI will lead to conflict and may even automate our society out of jobs. Despite the differences in opinion, the fact is that only a few people can identify what AI really is. Today, we are surrounded by small forms of AI, like the voice assistants in our smartphones, often without noticing or appreciating the service they provide. From Siri to self-driving cars, AI has already shown a lot of promise, along with the benefits it can bring to our economy, personal lives and society at large.

The question now turns to how enterprises will benefit from AI. But before companies or people can obtain the numerous improvements AI promises to deliver, they must first start with good-quality, clean data. The success of AI relies on accurate, cleansed and verified data.

Data Quality and Intelligence Must Go Hand-in-Hand

Organizations currently use data to extract numerous informational assets that assist with strategic planning. These strategic plans dictate the future of the organization and how it fares against rising competition. Considering the importance of data, the potential impact of low-quality information is intimidating to think about. In fact, bad data costs the US about $3 trillion per year.

Recently, I had the opportunity to interview Nicholas Piette and Jean-Michel Franco from Talend, one of the leading big data and cloud integration companies. Nicholas Piette, Chief Evangelist at Talend, has been working with integration companies for nine years and has been part of Talend for over a year.

When asked about the link between both Data Quality and Artificial Intelligence, Nick Piette responded with authority that you cannot do one without the other. Both data quality and AI walk hand-in-hand, and it’s imperative for data quality to be present for AI to be not only accurate, but impactful.

The Five R’s

To better understand the concept of data quality and how it impacts AI, Nick used the five R’s method. He mentioned he learned this method from David Shrier, his professor at MIT. The five R’s mentioned by Nicholas are:

  1. Relevancy
  2. Recency
  3. Range
  4. Robustness
  5. Reliability

If the data you are using to fuel your AI-driven initiatives ticks off each one of these R’s, then you are off to the right start. All five hold particular importance, but relevancy rises above the rest. Whatever data you have should be relevant to what you do, and should serve as a guide rather than a deterrent.
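To make the checklist concrete, here is a minimal, purely illustrative sketch of what “ticking off the R’s” could look like in code. It assumes a pandas DataFrame with hypothetical columns (event_time, sensor_id, value) and arbitrary thresholds; none of this is prescribed by Shrier’s framework or by Talend, it simply maps each R to one rough, automatable check.

# Hypothetical data-quality checklist loosely mapped to the five R's.
# Column names and thresholds are illustrative assumptions only.
import pandas as pd

def five_rs_report(df, expected_columns):
    now = pd.Timestamp.now(tz="UTC")
    return {
        # Relevancy: does the data contain the fields the use case needs?
        "relevancy": set(expected_columns).issubset(df.columns),
        # Recency: fraction of records newer than 30 days.
        "recency": (now - pd.to_datetime(df["event_time"], utc=True)
                    < pd.Timedelta(days=30)).mean(),
        # Range: fraction of readings inside a plausible interval.
        "range": df["value"].between(-50, 50).mean(),
        # Robustness: fraction of rows with no missing required fields.
        "robustness": 1.0 - df[expected_columns].isna().any(axis=1).mean(),
        # Reliability: fraction of rows that are not exact duplicates.
        "reliability": 1.0 - df.duplicated().mean(),
    }

sample = pd.DataFrame({
    "event_time": ["2017-10-01T12:00:00Z", "2017-10-02T12:00:00Z"],
    "sensor_id": ["a1", "a1"],
    "value": [21.5, 150.0],  # second reading falls outside the plausible range
})
print(five_rs_report(sample, ["event_time", "sensor_id", "value"]))

A real pipeline would of course use domain-specific rules, but the point stands: each R can be turned into a measurable check before any model ever sees the data.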

We might reach a point where the large influx of data at our fingertips is too overwhelming for us to recognize which elements are really useful and which are disposable. This is where the concept of data readiness enters the fold. Having mountains of historical data can be helpful for extracting patterns, forecasting cyclical behavior or re-engineering processes that lead to undesirable outcomes. However, as businesses continue to advance toward increased use of real-time engines and applications, data readiness, or information that is the most readily or recently made available, takes on greater importance. The data you apply should be recent and should contain figures that reflect reality.

AI Use Cases: A Look at Healthcare

When asked for the best examples of AI at work today, Nick pointed to the use of AI in healthcare as a shining example of both what has been achieved with AI to date and what more companies can do with this technology. More specifically, Nick said:

“Today, healthcare professionals are using AI technology to determine the chances of a heart attack in an individual, or predict cardiac diseases. AI is now ready to assist doctors and help them diagnose patients in ways they were unable to do before.”

Despite its current accolades, the use of AI in healthcare is still dictated by our understanding and interpretation of what the algorithms produce. Thus, if an AI system comes up with new insights that seem ‘foreign’ to our current understanding, it is often difficult for the end user to ‘trust’ that analysis. According to Nick, the only way society can truly trust and comprehend the results delivered by AI algorithms is if we know that quality data sits at the very core of those analyses.

Quality-Driven Data

Nicholas Piette added that ensuring data quality is an absolutely necessary prerequisite for all companies looking to implement AI. In this regard, he said:

“100% of AI projects are subject to fail if there are no solid efforts beforehand to improve the quality of the data being used to fuel the applications. Making no effort to ensure the data you are using is absolutely accurate and trusted—in my opinion—is indicative of unclear objectives regarding what AI is expected to answer or do. I understand it can be difficult to acknowledge, but if data quality mandates aren’t addressed up front, by the time the mistake is realized, a lot of damage has already been done. So make sure it’s forefront.”

Nick also pointed out that hearing they have a data problem is not easy for organizations to digest. Adding a light touch of humor, he said “Telling a company it has a data problem is like telling someone they have an ugly child.” But the only way to solve a problem is to first realize you have one and be willing to put in the time needed to fix it.

First Step is Recognition

Referring to companies’ inability to realize that they have a problem, Nicholas pointed out that more than half of the companies he has worked with did not believe they had a data problem until it was pointed out to them. Once it was, they had their ‘aha!’ moment.

Nick Piette further voiced his opinion that it would be great if, in the future, AI could tell exactly how it reached an answer and the computations that went into that conclusion. Until that happens, data quality and AI run in parallel: success in AI will only come from the accuracy of the data fed into it.

 “If you want to be successful, you have to spend more time working on the data and less time working on the AI.”

Nicholas Piette (Talend)

About the Author

Ronald van Loon is an Advisory Board Member and Big Data & Analytics course advisor for Simplilearn. He contributes his expertise towards the rapid growth of Simplilearn’s popular Big Data & Analytics category.

If you would like to read more from Ronald van Loon on the possibilities of Big Data and the Internet of Things (IoT), please click “Follow” and connect on LinkedIn and Twitter.

Arduino IDE 1.8.5: Hotfix for macOS High Sierra Users

In case you haven’t noticed, our team has just released Arduino IDE 1.8.5. This time the changelog is fairly small, as it mainly solves a (rather important) problem encountered by macOS users who have just updated to High Sierra (10.13).

If you are not using English as your system language, any version of Arduino you launch will lack the menu in the system bar. Every Java application is experiencing the same problem, so it will probably be solved by Apple in the near future.

In the meantime, IDE 1.8.5 recognizes when the menu bar is not being displayed and replaces it with a Windows-style one. It may not be the prettiest thing, but at least it works!

If you want to recover the old menu bar while keeping the whole system in your normal language, you can issue a single command in Terminal:

defaults write cc.arduino.Arduino AppleLanguages '(en)'

Thanks to @AdrianBuza for the workaround. Issuing this command will set the Arduino IDE to English; however, you can still change the language under “Preferences” without losing the macOS integration.

Electric Imp announces Secure, High-Performance Support for AMQP

Data-Driven IoT Applications Need Enterprise-Grade Data Integration

The promise of the Internet of Things (IoT) is to deliver data-driven insights across a diverse range of use cases with millions of edge products. Capturing, delivering and integrating data at scale from the edge into the enterprise is a challenge that slows down many IoT projects. It requires not only robust security and reliable communication, but also protocol flexibility, scalability and processing capabilities to integrate efficiently into cloud architectures and business applications faster — and for less cost and effort.

The Electric Imp Platform already provides industrial-strength edge-to-cloud security, including UL 2900-2-2 Cybersecurity Certification, multi-channel managed communications, advanced device and cloud in-flight data processing and transformation capabilities, scalable device management, and a range of ready-to-use integrations with popular cloud service platforms such as AWS, IBM Watson IoT, Autodesk Fusion Connect, GE Predix, Salesforce and many others. We process more than half a billion device messages per day, enabling real-world, industrial-grade, high-volume transaction applications for our customers.

Now available from Electric Imp: High-Performance AMQP Support

Today, Electric Imp is proud to announce initial support for the Advanced Message Queuing Protocol (AMQP) 1.0. Electric Imp’s AMQP Integration delivers a proven protocol and a highly secure option for optimally matched integrations to messaging-oriented cloud applications and enterprise infrastructure, such as Microsoft Azure IoT Hub.

Electric Imp’s AMQP support is designed for secure, enterprise-scale, low-latency, bi-directional communication, making it ideally suited for a wide range of high-performance and near-realtime IoT applications with millions of connected edge products, such as large-scale sensor deployments, remote monitoring of industrial equipment, and realtime tracking of commercial assets.

AMQP is offered as an API within the Electric Imp programmable cloud virtual machines (agents) that partner every imp-enabled device, enabling a direct and secure connection to an AMQP broker. For more details, please see our AMQP API documentation.
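For readers who have not worked with AMQP 1.0 directly, the sketch below shows what a minimal publish over the protocol looks like using the Apache Qpid Proton Python client. This is a generic illustration of AMQP 1.0 messaging, not the Electric Imp agent API referenced above; the broker URL and target address are placeholders.

# Minimal AMQP 1.0 publish with Apache Qpid Proton (pip install python-qpid-proton).
# Generic illustration only; the broker URL and address below are placeholders,
# not part of the Electric Imp platform.
from proton import Message
from proton.handlers import MessagingHandler
from proton.reactor import Container

class SendOne(MessagingHandler):
    def __init__(self, url, address):
        super(SendOne, self).__init__()
        self.url = url
        self.address = address

    def on_start(self, event):
        # Open the connection and attach a sender link to the target address.
        conn = event.container.connect(self.url)
        event.container.create_sender(conn, self.address)

    def on_sendable(self, event):
        # The broker has granted credit: send one telemetry message.
        event.sender.send(Message(body={"temperature": 21.5}))
        event.sender.close()

    def on_accepted(self, event):
        # The broker accepted (settled) the message; tear down the connection.
        event.connection.close()

Container(SendOne("amqps://broker.example.com:5671", "telemetry")).run()

In the Electric Imp architecture described above, the agent plays the role of this client, holding the credentials and the AMQP session in the cloud rather than on the device itself.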

Electric Imp solutions designed for Microsoft Azure to enable demanding IoT Applications

In today’s IoT market, some vendors offer very basic Azure IoT Hub integrations with significant limitations, such as not being part of the Azure infrastructure, only supporting simple and one-way data flow, having a non-scalable approach to device onboarding, or lacking integrated in-flight data processing and transformation.

By comparison, the Electric Imp AMQP implementation provides tight integration between AMQP and Azure IoT Hub and includes ready-to-use advanced functionality required by demanding IoT deployments. The Electric Imp platform and integration will also soon be available to customers as an Azure Private Cloud offering, delivering an all-Microsoft solution.

Electric Imp’s support for Microsoft creates edge-to-enterprise device security and connectivity that help you create enterprise-grade data integrations and high-value IoT applications with confidence — and with a much faster time to market too.

Terrence Barr
Head of Solutions Engineering,
Electric Imp
