How the IoT is making cities smarter and more efficient

Introduction

IoT and sensing technologies are improving our quality of life in many ways, from intelligent transportation and smart street lighting to smart medical services and smart metering. Together they are making cities smarter and more efficient, offering new conveniences and innovative services.

Smart Metering Underway

Riding the digital tide, many public utilities have started to pilot smart metering of power, water, gas, and other resources, using integrated sensors and communication devices that enable real-time data acquisition, automatic billing, cloud-based management, and event-triggered notifications.


Cloud-monitored Energy Consumption

With smart meters, utilities can monitor energy consumption from a remote dashboard, visualizing data collected from households, companies, and other users without on-site visits or manual recording. Bills can be generated automatically and efficiently, while cumulative data can be analyzed and historical trends graphed to inform decisions on energy-saving plans in pursuit of a smarter, greener city.
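To make this concrete, here is a minimal sketch of how cumulative meter readings might be turned into monthly usage and a bill. The reading format, field names, and flat tariff are assumptions for illustration, not any particular utility’s system.

```python
from collections import defaultdict

# Hypothetical reading pushed by a smart meter:
# {"meter_id": "A-1021", "timestamp": "2017-11-01T00:00:00", "kwh": 412.7}
FLAT_TARIFF = 0.12  # assumed price per kWh; real tariffs are usually tiered

def monthly_consumption(readings):
    """Group cumulative kWh readings by month and return usage deltas."""
    by_month = defaultdict(list)
    for r in sorted(readings, key=lambda reading: reading["timestamp"]):
        month = r["timestamp"][:7]  # "YYYY-MM" slice of an ISO timestamp
        by_month[month].append(r["kwh"])
    # Usage in a month = last cumulative reading minus the first
    return {m: vals[-1] - vals[0] for m, vals in by_month.items()}

def generate_bill(readings):
    """Return (month, usage_kwh, amount_due) tuples for a single meter."""
    return [(m, usage, round(usage * FLAT_TARIFF, 2))
            for m, usage in monthly_consumption(readings).items()]
```

The same monthly deltas can be graphed over time to show the historical trends mentioned above.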

Value Added Services Can Be Built Up

Further innovative services can be developed by system developers or third-party application providers. For example, individual users may review their power, water, or gas bills on a dashboard displayed in a web browser on computers, smartphones, or iPads. Control algorithms can also be implemented at the device end to enhance safety: for instance, a digital gas meter can trigger a supply shutdown if it detects gas leakage or wildly abnormal use.
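As a simple illustration of such a device-end rule, the sketch below shuts the valve when flow is registered while no appliance should be drawing gas, or when flow far exceeds normal household demand. The thresholds and the valve-control hook are assumptions, not a vendor API.

```python
# Illustrative device-side safety rule for a digital gas meter.
LEAK_FLOW_WHEN_IDLE = 0.05   # m3/h registered while no appliance is active (assumed)
ABNORMAL_FLOW_LIMIT = 6.0    # m3/h far above normal household demand (assumed)

def check_gas_safety(flow_m3h, appliances_active, shut_valve):
    """Trigger a supply shutdown on suspected leakage or wildly abnormal use."""
    if not appliances_active and flow_m3h > LEAK_FLOW_WHEN_IDLE:
        shut_valve(reason="possible leak: flow registered while idle")
        return False
    if flow_m3h > ABNORMAL_FLOW_LIMIT:
        shut_valve(reason="abnormal flow rate")
        return False
    return True  # supply stays open
```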

Advantech-Arm Partnership Provides IoT Solutions

As a leading supplier in the embedded computing market, the Advantech Embedded-IoT Group not only delivers a wide range of embedded design-in services, but also develops integrated IoT solutions and services that assist customers in minimizing uncertainty and risk as they enter the IoT market. Advantech has initiated an IoT partnership program with ecosystem partners including Arm.

Advantech will be exhibiting at the IoT Tech Expo North America in Santa Clara, CA on November 29-30 – visit booth #430 to learn more about its IoT solutions. Gus Molina, IoT Solutions Architect at Advantech, will also be joining the panel ‘Creating a seamless connected building ecosystem: in the home, hospitality and the workplace’ on the IoT Innovations & Technologies track on November 29 at 1:30pm. This conference track is free to attend – simply register for a Free Expo Pass.


Algorithms aren’t the most important thing for building AI solutions – data is

We’re an AI company, so people always ask about our algorithms. If we could get a dollar for every time we’re asked which flavor of machine learning we use – convolutional neural nets, K-means, or whatever – we would never need another dollar of VC investment ever again.

But the truth is that algorithms are not the most important thing for building AI solutions — data is. Algorithms aren’t even #2. People in the trenches of machine learning know that once you have the data, it’s really all about “features.”

In machine learning parlance, features are the specific variables that are used as input to an algorithm.   Features can be selections of raw values from input data, or can be values derived from that data.  With the right features, almost any machine learning algorithm will find what you’re looking for.  Without good features, none will.  And that’s especially true for real world problems where data comes with lots of inherent noise and variation.
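To make the distinction concrete, here is a small sketch contrasting raw-value features (the samples themselves) with derived features computed from a window of sensor data. The statistics chosen here are common illustrations, not Reality AI’s feature set.

```python
import numpy as np

def derived_features(window):
    """Turn a window of raw sensor samples into a few derived features."""
    window = np.asarray(window, dtype=float)
    return {
        "mean": window.mean(),                        # central tendency
        "rms": np.sqrt(np.mean(window ** 2)),         # overall energy
        "peak_to_peak": window.max() - window.min(),  # signal swing
        "zero_crossings": int(np.sum(np.diff(np.sign(window)) != 0)),
    }

# A raw-value feature set might simply be the samples themselves:
raw_window = np.random.default_rng(0).normal(size=64)
print(derived_features(raw_window))
```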


My colleague Jeff (the other Reality AI co-founder) likes to use this example: Suppose I’m trying to detect when my wife comes home. I’ll take a sensor, point it at the doorway and collect data. To use machine learning on that data, I’ll need to identify a set of features that help distinguish my wife from anything else that the sensor might see. What would be the best feature to use? One that indicates, “There she is!” It would be perfect — one bit with complete predictive power. The machine learning task would be rendered trivial.

If only we could figure out how to compute better features directly from the underlying data…  Deep Learning accomplishes this trick with layers of convolutional neural nets, but that carries a great deal of computational overhead.  There are other ways.

At Reality AI, where our tools create classifiers and detectors based on high sample rate signal inputs (accelerometry, vibration, sound, electrical signals, etc.) that often have high levels of noise and natural variation, we focus on discovering features that deliver the greatest predictive power with the lowest computational overhead. Our tools follow a mathematical process for discovering optimized features from the data before worrying about the particulars of algorithms that will make decisions with those features. The closer our tools get to perfect features, the better end results become. We need less data, use less training time, are more accurate, and require less processing power. It’s a very powerful method.

For an example, let’s look at feature selection in high-sample-rate (50 Hz on up) IoT signal data, like vibration or sound. In the signal processing world, the engineer’s go-to for feature selection is usually frequency analysis. The usual approach to machine learning on this kind of data would be to take a signal input, run a Fast Fourier Transform (FFT) on it, and consider the peaks in those frequency coefficients as inputs for a neural network or some other algorithm.
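In code, that conventional baseline looks something like the sketch below: window the signal, take the FFT, and keep the strongest frequency bins as the feature vector. The synthetic test signal and the number of peaks kept are arbitrary choices for illustration.

```python
import numpy as np

def fft_peak_features(signal, fs, n_peaks=8):
    """Window -> FFT -> keep the strongest frequency bins as features."""
    signal = np.asarray(signal, dtype=float)
    windowed = signal * np.hanning(len(signal))    # taper to reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    top = np.argsort(spectrum)[-n_peaks:][::-1]    # crude stand-in for true peak-picking
    return [(freqs[i], spectrum[i]) for i in top]  # (frequency, magnitude) pairs

# Example: a 200 Hz tone buried in noise, sampled at 1 kHz
fs = 1000
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 200 * t) + 0.3 * np.random.randn(len(t))
print(fft_peak_features(x, fs)[:3])
```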

Why this approach?   Probably because it’s convenient, since all the tools these engineers use support it.   Probably because they understand it, since everyone learns the FFT in engineering school.   And probably because it’s easy to explain, since the results are easily relatable back to the underlying physics.   But the FFT rarely provides an optimal feature set, and it often blurs important time information that could be extremely useful for classification or detection in the underlying signals.

Take for example this early test comparing our optimized features to the FFT on a moderately complex, noisy group of signals.   In the first graph below we show a time-frequency plot of FFT results on this particular signal input (this type of plot is called a spectrogram).   The vertical axis is frequency, and the horizontal axis is time, over which the FFT is repeatedly computed for a specified window on the streaming signal.   The colors are a heat-map, with the warmer colors indicating more energy in that particular frequency range.
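For readers who want to reproduce this kind of plot, a spectrogram like the one described above is typically computed with off-the-shelf tools along these lines. This is the generic FFT view, not Reality AI’s optimized-feature method, and the chirp-plus-noise test signal here is only a stand-in for real data.

```python
import numpy as np
from scipy import signal
import matplotlib.pyplot as plt

fs = 8000                                     # assumed sample rate
t = np.arange(0, 2, 1 / fs)
x = signal.chirp(t, f0=200, t1=2, f1=2000) + 0.5 * np.random.randn(len(t))

# Repeated windowed FFTs over time; Sxx is the energy heat map
f, times, Sxx = signal.spectrogram(x, fs, nperseg=256, noverlap=128)
plt.pcolormesh(times, f, 10 * np.log10(Sxx + 1e-12))  # warmer = more energy (dB)
plt.ylabel("Frequency (Hz)")
plt.xlabel("Time (s)")
plt.show()
```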

Compare that chart to one showing optimized features for this particular classification problem generated using our methods.  On this plot you can see what is happening with much greater resolution, and the facts become much easier to visualize.  Looking at this chart it’s crystal clear that the underlying signal consists of a multi-tone low background hum accompanied by a series of escalating chirps, with a couple of other transient things going on.   The information is de-blurred, noise is suppressed, and you don’t need to be a signal processing engineer to understand that the detection problem has just been made a whole lot easier.

There’s another key benefit to optimizing features from the get go – the resulting classifier will be significantly more computationally efficient.  Why is that important?  It may not be if you have unlimited, free computing power at your disposal.  But if you are looking to minimize processing charges, or are trying to embed your solution on the cheapest possible hardware target, it is critical.   For embedded solutions, memory and clock cycles are likely to be your most precious resources, and spending time to get the features right is your best way to conserve them.

Deep Learning and Feature Discovery

At Reality AI, we have our own methods for discovering optimized features in signal data, but ours are not the only way.

As mentioned above, Deep Learning (DL) also discovers features, though they are rarely optimized. Still, DL approaches have been very successful with certain kinds of problems using signal data, including object recognition in images and speech recognition in sound. It can be a highly effective approach for a wide range of problems, but DL requires a great deal of training data, is not very computationally efficient, and can be difficult for a non-expert to use. There is often a sensitive dependence of classifier accuracy on a large number of configuration parameters, leading many of those who work with DL to focus heavily on tweaking previously used networks rather than on finding the best features for each new problem. Learning happens “automatically”, so why worry about it?

My co-founder Jeff (the mathematician) explains that DL is basically “a generalized non-linear function mapping – cool mathematics, but with a ridiculously slow convergence rate compared to almost any other method.” Our approach, on the other hand, is tuned to signals but delivers much faster convergence with less data. On applications for which Reality AI is a good fit, this kind of approach will be orders of magnitude more efficient than DL.

The very public successes of Deep Learning in products like Apple’s Siri, the Amazon Echo, and the image tagging features available on Google and Facebook have led the community to over-focus a little on the algorithm side of things.  There has been a tremendous amount of exciting innovation in ML algorithms in and around Deep Learning.    But let’s not forget the fundamentals. It’s really all about the features.

Reality AI are exhibiting at the IoT Tech Expo North America on November 29-30 – find them at booth 136.

Originally published on Reality AI.


IoT and Machine Learning: Why Collaboration is Key

“Internet of Things” and “Machine Learning” can seem like buzzwords. They are both in the “Peak of Inflated Expectations” stage on Gartner’s emerging technologies hype cycle, but together they will change the world in various capacities in the next couple of years.

Internet of Things (IoT) refers to the network of connected devices and the data shared from them. You know that Fitbit that you wear? That’s an IoT device. Machine Learning (ML) finds patterns in data and does something based on those patterns without being explicitly programmed. The data collected from IoT devices over time is enormous, and it would be difficult for one person or even a team to uncover all of its insights. That’s where machine learning comes in – it can scale and simplify IoT data analysis. The Internet of Things and Machine Learning complement each other. Here are some use cases to illustrate how IoT and ML can work together.

IoT and ML use cases

Analyze traffic patterns for city planning

Use machine learning to forecast traffic and peak demand within smart cities and recommend alternative transportation or travel times. You would need to collect hourly or daily traffic data so city officials can predict bottlenecks and make city planning recommendations.
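As a minimal sketch of this kind of forecast, the example below fits a simple lag model to hourly vehicle counts and predicts the next hour. The synthetic data, the lag window, and the choice of a linear model are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def forecast_next_hour(hourly_counts, n_lags=24):
    """Fit a simple lag model on hourly vehicle counts and predict the next hour."""
    y = np.asarray(hourly_counts, dtype=float)
    X = np.array([y[i:i + n_lags] for i in range(len(y) - n_lags)])
    model = LinearRegression().fit(X, y[n_lags:])
    return model.predict(y[-n_lags:].reshape(1, -1))[0]

# A week of synthetic hourly counts with a daily rush-hour pattern
hours = np.arange(24 * 7)
counts = 400 + 250 * np.sin(2 * np.pi * hours / 24) + 20 * np.random.randn(len(hours))
print(round(forecast_next_hour(counts)))
```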

Predictive maintenance for wind turbines

Predict cooling-system usage on wind turbines and schedule preventive maintenance accordingly. You would need to collect hourly or daily usage data. A manager can then dispatch a maintenance crew when the predicted aggregate usage exceeds known maintenance thresholds.
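The dispatch rule itself can be very simple, as in this sketch; the threshold value and data shapes are illustrative assumptions.

```python
MAINTENANCE_THRESHOLD_HOURS = 500  # assumed cooling-system duty limit between services

def turbines_to_service(current_usage, predicted_usage):
    """Flag turbines whose current plus predicted usage crosses the threshold.

    Both arguments are dicts of turbine_id -> hours of cooling-system use.
    """
    due = []
    for turbine_id, predicted in predicted_usage.items():
        total = current_usage.get(turbine_id, 0) + predicted
        if total >= MAINTENANCE_THRESHOLD_HOURS:
            due.append((turbine_id, total))
    return sorted(due, key=lambda item: item[1], reverse=True)

# A manager could dispatch a crew for every turbine returned here.
print(turbines_to_service({"T-07": 410, "T-12": 300}, {"T-07": 120, "T-12": 40}))
```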

Optimize device / systems efficiency

By gathering device and systems data, you can optimize efficiency by linking usage forecasts to supply chain and device operations. Developers or analysts can use machine learning to send alerts when predicted usage exceeds a known threshold.

Collaboration is key

The list of possible use cases where IoT and ML achieve greater insights together goes on and on, but collaboration will yield the best results. By collaboration we mean minimizing the silos that separate departments, companies, or industry verticals. A good example is a grocery store chain sharing its data on customer food purchases (by customer ID) with a healthcare company that holds health-history data, to uncover relationships between food and health over time. This type of partnership enables the dynamic duo of machine learning and the Internet of Things to work together effectively.

We need to start thinking of internet of things and machine learning as a dynamic duo that together will create positive social impact, especially with collaboration.

Find Nexosis at the IoT Tech Expo North America on November 29-30, 2017, in the free exhibition at booth 170.

Originally published on Nexosis.


How Oracle is using machine learning to help customers gain ROI from their IoT initiatives

The statistics around how the Internet of Things (IoT) will affect different industries are fascinating. According to MarketsandMarkets, the market for IoT in healthcare will hit $158 billion by 2022; intelligent transportation, $143.93bn by 2020; and manufacturing, $20.59bn by 2021.

The potential within individual industries is therefore high. But looking across the supply chain, it is possible to connect the dots and increase that potential further.

At the very end of August, software giant Oracle issued a major update to its IoT Cloud, featuring three primary trends: digital twin; applied machine learning and artificial intelligence (AI); and what Oracle calls ‘digital thread’.

At their heart, these capabilities help expedite the delivery process for organisations and can provide a one-stop shop with industry solutions built on Oracle’s IoT Cloud applications. Take fleet management as an example; by combining two products – in this case IoT Fleet Management Cloud and Oracle Logistics Cloud – companies can track shipments in real time, assess risk, and synchronise logistics.

This is part of digital thread, as Atul Mahamuni, VP IoT at Oracle, explains. “If you look at multiple operations in the supply chain, from product design, to procurement, to manufacturing, to transportation, to logistics, to warehousing, to service, all of these today are siloed operations,” he says. “Through IoT we can interconnect all of these together.”

Lionel Chocron, Oracle VP industry and IoT cloud solutions, who will be speaking at the IoT Tech Expo North America in Santa Clara in November, puts it this way. Manufacturing companies and logistics companies started coming to conferences and asking questions several years ago; now they have gone far beyond initial implementations.

“People call it Industry 4.0, specific to manufacturing, but it’s the full supply chain digital thread topic,” he says. “From the point of designing your product, to basically getting it to the market…but more importantly, the management, the supply chain, the logistics…all that chain is absolutely critical. We see manufacturing companies, logistics companies, all embracing the topic.”

A little bit further back, but certainly on the radar for Chocron, is healthcare, which the Oracle VP claims is ‘picking up the ball big time’. “I believe healthcare is at that stage now, having tried and having dived into the security and privacy issues which are critical – because you have to be certified and conform to some specific regulation – but now they understand the value of it and how they can use it for many kinds of applications,” he says.

“The value they get out of it is astronomical, both in terms of operating the healthcare facilities and personnel and so forth, but from the creation of the drug, to the testing of the drug, to the follow-up on how the drug is being used,” adds Chocron. “Using IoT technology to bring data live from the people using that medication as an example is something they see as a major boost in information, which mean they get to go way faster in getting the drugs to market and getting them certified.

“That information’s going to save them billions of dollars because in the past they had to go through a pretty cumbersome process.”

Alongside this is Oracle’s belief that the line of business is the most influential when it comes to making these decisions. “The value proposition is a business value proposition when you talk about connecting a factory, when you talk about enabling predictive maintenance on equipment, and driving optimised field service management when you talk about connecting your workforce,” says Chocron. “At the end of the day, the metric is a business metric.”

“There’s another subtle aspect though,” adds Mahamuni. “Once you make the buying decision…your business wants to get some more data, or some more insight. A typical example we’ve seen is they want to create a new KPI. Typically that involves not just your IT department, but you have to bring in a system integrator, do changes to your implementation, and it is an expensive and time-consuming process.

“The kind of things that we have done here are, like the KPI, a business user, literally a plant manager or a fleet operator, can change and create new KPIs,” says Mahamuni. “So that’s a new paradigm where not only the decisions are moving to OT (operational technology) but even the changes. Because they know how to run their business better, they can do things and experiment, and see what they like right through their desktop or mobile application.”

Oracle is both speaking and exhibiting at the IoT Tech Expo in Santa Clara, where the message, alongside some visual cues in the form of live demos, will be around helping businesses make the right decisions wherever they are in the IIoT journey.

“Our role is to deliver a story that shows the business outcome to our customers,” says Chocron. “We don’t want to talk about a platform per se – we want to talk about the integration of our application with our IoT play to deliver a business outcome for different kinds of industries.”


Another Brick in the Wall: Barriers to IoT Adoption

In my previous blog, I outlined the major components of the Internet of Things (IoT), giving the current state of IoT technology a grade of B-minus. Why the minus? Today, I’ll dive deeper into two major issues slowing IoT adoption: complexity and security.

Complexity Fragments Markets and Hampers Interoperability

There is no such thing as the “IoT market.” The typical vertical markets associated with industrial IoT applications range from manufacturing, transportation, oil and gas, and mining to agriculture, retail, insurance, healthcare, education, and smart cities. Each of these huge markets has many submarkets, and even within each submarket there are many overlapping, often long-standing ecosystems. Car manufacturers in Europe, for example, work within a completely different supply chain from those in the United States; each has its own vocabulary, technologies, and challenges. Adding to that complexity is the fact that, with few exceptions, IoT deployments are in “brownfield” environments, where innovations have to coexist with a plethora of incompatible legacy technologies.

Then factor in access technologies. The wide range of IoT use cases drives an equally wide range of technologies that vary according to bandwidth, reach, power, and cost. Connected vending machines may need to send a few packets whenever a brand of soda needs to be restocked. On the other extreme, the sensors deployed around an oil rig may generate terabytes of data each day. These sensors are connected within the rig using a combination of Ethernet and wireless technologies. In some cases, the data can be sent back to the central data repository using a fiber cable; but when this isn’t possible for remote sites, the data is processed locally in real-time, and just the exceptions or alerts are sent back via satellite. In other cases, you might piggyback on a municipal Wi-Fi system, or use Low Power Wide Area Network (LPWAN) technologies to connect battery-powered devices. Payment apps such as Apple Pay use near-field communications (NFC), which (thankfully) won’t work more than a few inches away. Indeed, these special needs demand specialized technology—but the result is a complex tangle of often incompatible and disparate access methods.
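To make the ‘process locally, send only the exceptions’ pattern concrete, here is a minimal edge-side sketch; the thresholds and payload fields are illustrative assumptions rather than any particular vendor’s API.

```python
# Edge-side filter: process sensor readings locally and forward only exceptions,
# so a narrow satellite backhaul carries alerts rather than raw terabytes.
PRESSURE_LIMIT_PSI = 950   # assumed alert threshold
VIBRATION_LIMIT_G = 3.5    # assumed alert threshold

def exceptions_only(readings):
    """Yield only the readings worth sending over the constrained link."""
    for r in readings:
        if r.get("pressure_psi", 0) > PRESSURE_LIMIT_PSI:
            yield {"sensor": r["sensor"], "alert": "pressure", "value": r["pressure_psi"]}
        elif r.get("vibration_g", 0) > VIBRATION_LIMIT_G:
            yield {"sensor": r["sensor"], "alert": "vibration", "value": r["vibration_g"]}

batch = [
    {"sensor": "rig-3/pump-1", "pressure_psi": 920, "vibration_g": 1.1},
    {"sensor": "rig-3/pump-2", "pressure_psi": 980, "vibration_g": 1.0},
]
print(list(exceptions_only(batch)))  # only pump-2's reading travels via satellite
```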

The IoT industry has tried to bring order to all of this with horizontal and vertical standards bodies and consortia—IEEE, IETF, ODVA, ISA, IIC, OCF, and OPC, to name a few (and to get lost in alphabet soup!). Ironically, there are so many industry organizations that it’s hard to bring them all together into a cohesive set of standards that ensure interoperability across an entire IoT deployment. The various sensors in a single production facility may run on different semi-proprietary standards that limit the free flow of information. Limited access to IoT data limits the value of your IoT deployment. For example, IoT applications such as preventive maintenance can work only if they can gather, process, and analyze all the data generated by heat, pressure, and vibration sensors on a piece of heavy equipment. Standardization and interoperability are the gateway to IoT value.

Companies considering IoT deployments also have to navigate rapidly changing organizational structures. For most of the 20th century, vertically integrated vendors strived to deliver end-to-end solutions. Today, markets move too fast for any one company to develop or deliver a single, complete solution on its own. The 21st century model is the emergence of symbiotic ecosystems of partners who complement each other in developing IoT solutions together. You might picture a big square dance, where partners come together for a time, then move off to dance with someone else. For many companies, this is unknown territory, but the sooner you embrace this model, the sooner you’ll be able to benefit from the IoT economy.

Security Concerns Can Kill an IoT Deployment

Worries about security may cause decision-makers to hesitate before investing in an IoT deployment—and last year’s IoT distributed denial of service (DDoS) attacks didn’t help matters. IoT security is in many ways unique: it is more distributed, more heterogeneous, and more dynamic than traditional IT security environments. It also introduces new scenarios that require brand-new approaches to security (think connected cars, sensor swarms, and consumer-class devices in the workplace).

Back in the day when industrial enterprises ran self-contained, proprietary systems, “security by obscurity” was standard practice—if you’re not connected to anything, no one can break in. That approach no longer works in today’s connected IoT environment (if it ever did), so businesses must rely on a policy-based architectural approach that builds security into every aspect of a deployment—not just defending the perimeter.

After years of under-investment, the security industry is finally addressing the special requirements of IoT in a way that is reminiscent of how it responded to the challenges of Wi-Fi 15 years ago—accelerating work in standards, interoperability and certifications.

On the Other Hand, Adoption Accelerators Can Help Realize IoT Value

While complexity and security remain obstacles to widespread IoT implementation, here are two technology trends that promise to accelerate adoption and multiply the value of IoT solutions:

Analytics: When we put sensors on things and then connect them, we begin collecting vast amounts of data in motion about those things. Analytics sifts through that data in real time or near real time to find what is important, and delivers insights and recommended actions for business impact. Two of the four fast paths to IoT payback I’ve identified—predictive analytics and preventive maintenance—depend on analytics to create IoT value.

Blockchain: I mentioned in my last blog that the ability to have a trusted means of transferring and tracking value online is enabling a whole new class of IoT capabilities, such as authenticating interactions among autonomous vehicles or managing and reporting mining site data. The “Internet of value” created by IoT plus blockchain will transform online processes. The industry is moving swiftly to capitalize on these capabilities starting with the formation of consortia to ensure interoperability.

So while obstacles remain, I am optimistic about the trajectory of IoT. An active community of IoT innovators is working tirelessly to reduce complexity and improve security. They know that IoT value depends on it.

Do you want to get involved?

Learn and contribute more by joining lively discussions from industry thought leaders in the new Building the Internet of Things community. More IoT insights can also be found on my web site.
