IoT second most important IT priority for 2018, says Wi-SUN Alliance

Implementing IoT comes second only to tackling security on the 2018 to-do list, according to a new survey from the Wi-SUN Alliance.

Companies that have implemented IoT are overwhelmingly positive about the benefits they’ve achieved, and they’re keen to build on these foundations in 2018, according to a survey of 350 organizations in the UK, US, Denmark and Sweden, conducted on behalf of the Wi-SUN Alliance, a global industry association that focuses on connectivity.

No wonder, then, that enabling IoT comes high on the list of IT priorities for 2018; in second place, in fact, just after tackling security. When it comes to IoT benefits, over half of respondents (54 percent) say they have experienced improved business efficiency; 49 percent point to improved customer experience and 48 percent highlight better collaboration.

Read more: Vodafone IoT Barometer: Large global IoT projects doubled last year

IoT rises on priorities list

The Wi-SUN Alliance’s report, The Rise of the Internet of Things, finds that half of organizations investing in IoT initiatives already have a fully implemented strategy in place, while more than a third (36 percent) have a partially implemented strategy. Companies in the oil and gas industry are the most advanced, with 75 percent having a fully implemented strategy, followed by technology (59 percent) and energy and utilities (57 percent).

The research identifies a number of key drivers for IoT implementation. Around half (47 percent) said it would improve “network intelligence and connectivity for citizen safety and quality of life”, while 42 percent said a key driver is “creating business efficiencies” and 41 percent cited “improving reliability of systems and services”.

Read more: IoT projects driving IT budget decisions, 451 Research finds

Not entirely rosy

Still, not everything in the IoT garden is rosy. While respondents rank enabling IoT as the second most important IT priority for the next 12 months, just behind improving security, almost all (a massive 90 percent) of those with an IoT plan at some stage of implementation have struggled to put it into practice, and over a third (36 percent) found the process “very or extremely difficult”.

Respondents highlight security as a barrier to IoT adoption. Fifty-nine percent cite security concerns, with respondents in the US (65 percent) and UK (64 percent) more concerned than those in Denmark or Sweden. Almost one in three (32 percent) see funding and a lack of commitment from leadership as barriers, while 30 percent view leadership’s lack of understanding of the benefits of IoT as a challenge.

The report also looks at the technical challenges in delivering IoT. Here, respondents identified a wide range of issues: 63 percent mentioned security and safety; 46 percent data management; 41 percent network configuration; 39 percent recruiting IoT talent; and 39 percent Wi-Fi connectivity.

Read more: Six out of ten IoT projects fail at trial stage, says survey

The post IoT second most important IT priority for 2018, says Wi-SUN Alliance appeared first on Internet of Business.

Internet of Business

New Research Shows Industrial Organizations Increasingly Focused on IoT Adoption, but Most Are Still in Early Stages

Bsquare’s 2017 Annual IIoT Maturity Study reveals 86 percent of industrial organizations have adopted IoT but fewer than half are using advanced analytics and only a quarter have taken steps to automate the application of insights.

Bsquare, a provider of Industrial Internet of Things (IIoT) solutions, today released the findings from its first annual IIoT Maturity Study, which explores the current IoT adoption progress of business buyers in Manufacturing, Transportation, and Oil and Gas (O&G).

According to the 2017 study, 86 percent of industrial organizations are currently adopting IoT solutions and 84 percent believe those solutions are very or extremely effective. In addition, 95 percent believe that IoT has a significant or tremendous impact on their industry. However, the study also shows that most IIoT investments are focused on connectivity (78 percent) and data visualization (83 percent). In addition, only 48 percent are doing advanced analytics on that data and only a small number (28 percent) are automating the application of insights derived from analytics.

Kevin Walsh, vice president of marketing at Bsquare, said:

“Our study shows that while industrial organizations have enthusiastically adopted IIoT, a majority have not yet moved to more advanced analytics-driven orchestration of data insights.”

“These later stages of IIoT maturity—analytics, orchestration and true edge computing—tend to be where most of the ROI is realized. This is especially important because, according to our study, the number one reason cited for IIoT adoption is cost reduction.”

Bsquare’s 2017 Annual IIoT Maturity Study was conducted in the United States in August 2017 and reached more than 300 respondents at companies with annual revenues in excess of $250 million. Participants were evenly divided among three industry groups (Manufacturing, Transportation, and O&G), and titles covered a wide spectrum of senior-level personnel with operational responsibilities who had spent an average of six years in their organizations.

Key highlights from the report include:

  • The vast majority (86 percent) of organizations are deploying IIoT solutions, led by Construction/Transportation (93 percent) and followed by O&G (89 percent) and Manufacturing (77 percent).
  • Nearly three-quarters (73 percent) of all businesses plan to increase their IoT investments over the next 12 months, despite almost every respondent acknowledging that IoT deployments are complex.
  • Nine out of 10 decision-makers feel it is very or somewhat important for their organization to adopt IoT solutions. And 95 percent perceive IoT as having either a significant or tremendous impact on their industry at a global level.
  • Industrial organizations are using IoT most frequently for device connectivity and data forwarding (78 percent), real-time monitoring (56 percent), and advanced data analytics (48 percent). More mature uses of IoT, such as automation and enhanced on-board intelligence, are also prevalent in industrial settings.
  • More than 90 percent of IIoT adopters cite device health as the primary reason for IoT adoption, followed by logistics (67 percent), reducing operating costs (24 percent) and increasing production volume (18 percent).
  • More than half of organizations are using annual subscription models for their IIoT solutions, and 77 percent use a cloud-based model. Amazon and Microsoft were tied (14 percent each) as the preferred cloud service provider.

The IoT Maturity Index outlines the stages commonly associated with Industrial IoT technology adoption. Each phase typically builds on the previous one, allowing organizations to drive maximum value as they progress through the index. The stages include:
1. Device Connectivity: on-board logic to collect data and transmit it to cloud databases
2. Data Monitoring: dashboard and visualization tools to monitor real-time data
3. Data Analytics: machine learning and complex analytics used to develop device models and insights
4. Automation: development and execution of logic rules that automate business activities and device configuration
5. Edge Computing: distribution of analytics and orchestration to the device level

For more information on the 2017 Annual IIoT Maturity Study, see Bsquare’s complete survey infographic.

The post New Research Shows Industrial Organizations Increasingly Focused on IoT Adoption, but Most Are Still in Early Stages appeared first on IoT Business News.

IoT Business News

Algorithms aren’t the most important thing for building AI solutions – data is

We’re an AI company, so people always ask about our algorithms. If we could get a dollar for every time we’re asked which flavor of machine learning we use (convolutional neural nets, K-means, or whatever), we would never need another dollar of VC investment ever again.

But the truth is that algorithms are not the most important thing for building AI solutions: data is. Algorithms aren’t even #2. People in the trenches of machine learning know that once you have the data, it’s really all about “features.”

In machine learning parlance, features are the specific variables that are used as input to an algorithm. Features can be selections of raw values from the input data, or values derived from that data. With the right features, almost any machine learning algorithm will find what you’re looking for. Without good features, none will. And that’s especially true for real-world problems, where data comes with lots of inherent noise and variation.
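
To make that concrete, here is a minimal Python sketch (NumPy only; the window length, sample rate and the three features chosen are illustrative, not any vendor’s actual feature set) of deriving features from raw signal values:

```python
import numpy as np

def derived_features(window: np.ndarray, fs: float) -> np.ndarray:
    """Turn one window of raw sensor samples into a small feature vector."""
    rms = np.sqrt(np.mean(window ** 2))                    # overall energy
    zcr = np.mean(np.abs(np.diff(np.sign(window)))) / 2.0  # zero-crossing rate
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)  # spectral centroid
    return np.array([rms, zcr, centroid])

# A raw one-second window at 500 Hz becomes three derived values that
# any downstream classifier can consume.
window = np.random.randn(500)
print(derived_features(window, fs=500.0))
```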


My colleague Jeff (the other Reality AI co-founder) likes to use this example: suppose I’m trying to detect when my wife comes home. I’ll take a sensor, point it at the doorway and collect data. To use machine learning on that data, I’ll need to identify a set of features that help distinguish my wife from anything else the sensor might see. What would be the best feature to use? One that indicates, “There she is!” It would be perfect: one bit with complete predictive power. The machine learning task would be rendered trivial.

If only we could figure out how to compute better features directly from the underlying data… Deep Learning accomplishes this trick with layers of convolutional neural nets, but that carries a great deal of computational overhead. There are other ways.
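
For intuition, a convolutional layer computes features by correlating a bank of learned filters with the raw signal and pooling the responses. A toy NumPy sketch of that mechanic, with random weights standing in for learned ones:

```python
import numpy as np

def conv1d_features(signal: np.ndarray, filters: np.ndarray) -> np.ndarray:
    """One convolutional 'layer': correlate each filter with the signal,
    apply ReLU, then max-pool each response down to a single feature."""
    responses = np.array([np.correlate(signal, f, mode="valid") for f in filters])
    responses = np.maximum(responses, 0.0)   # ReLU nonlinearity
    return responses.max(axis=1)             # global max pooling

# Illustrative filter bank; in a real CNN these weights are learned, and
# many such layers are stacked, which is where the overhead comes from.
filters = np.random.randn(8, 16)
signal = np.random.randn(1000)
print(conv1d_features(signal, filters))      # 8 feature values
```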

At Reality AI, where our tools create classifiers and detectors based on high-sample-rate signal inputs (accelerometry, vibration, sound, electrical signals, etc.) that often have high levels of noise and natural variation, we focus on discovering features that deliver the greatest predictive power with the lowest computational overhead. Our tools follow a mathematical process for discovering optimized features from the data before worrying about the particulars of the algorithms that will make decisions with those features. The closer our tools get to perfect features, the better the end results become: we need less data and less training time, achieve greater accuracy, and require less processing power. It’s a very powerful method.
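
Reality AI’s actual feature-discovery process is proprietary and not reproduced here; as a generic stand-in, one common way to score candidate features before committing to an algorithm is by their estimated mutual information with the class label. A sketch using scikit-learn on made-up data:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

# Candidate features: columns of X (e.g. many spectral and statistical
# measures computed per window); y holds each window's class label.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))       # 200 windows, 40 candidate features
y = rng.integers(0, 2, size=200)     # binary labels (stand-in data)

# Rank candidates by estimated mutual information with the label and
# keep only the most informative few before training any classifier.
scores = mutual_info_classif(X, y, random_state=0)
best = np.argsort(scores)[-8:]       # indices of the top 8 features
X_reduced = X[:, best]
print("kept features:", sorted(best.tolist()))
```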

For an example, let’s look at feature selection in high-sample-rate (50 Hz and up) IoT signal data, like vibration or sound. In the signal processing world, the engineer’s go-to for feature selection is usually frequency analysis. The usual approach to machine learning on this kind of data is to take a signal input, run a Fast Fourier Transform (FFT) on it, and feed the peaks in the resulting frequency coefficients into a neural network or some other algorithm.
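
That conventional pipeline might look like the following sketch (NumPy; the sample rate, window length and peak count are arbitrary placeholders):

```python
import numpy as np

FS = 1000          # sample rate in Hz (assumed)
WINDOW = 256       # samples per analysis window
N_PEAKS = 8        # number of spectral peaks kept as features

def fft_features(window: np.ndarray) -> np.ndarray:
    """The textbook approach: FFT magnitude spectrum, keep the
    frequencies of the strongest bins as the feature vector."""
    spectrum = np.abs(np.fft.rfft(window * np.hanning(window.size)))
    freqs = np.fft.rfftfreq(window.size, d=1.0 / FS)
    top = np.argsort(spectrum)[-N_PEAKS:]   # indices of the largest peaks
    return freqs[np.sort(top)]              # peak frequencies, in order

signal = np.random.randn(WINDOW)
print(fft_features(signal))   # 8 frequencies to feed a neural net
```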

Why this approach? Probably because it’s convenient, since all the tools these engineers use support it. Probably because they understand it, since everyone learns the FFT in engineering school. And probably because it’s easy to explain, since the results are easily relatable back to the underlying physics. But the FFT rarely provides an optimal feature set, and it often blurs important time information that could be extremely useful for classification or detection in the underlying signals.

Take, for example, this early test comparing our optimized features to the FFT on a moderately complex, noisy group of signals. The first graph below is a time-frequency plot of the FFT results for this particular signal input (this type of plot is called a spectrogram). The vertical axis is frequency and the horizontal axis is time, over which the FFT is repeatedly computed for a specified window on the streaming signal. The colors are a heat map, with warmer colors indicating more energy in that particular frequency range.
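
The original charts aren’t reproduced here, but a spectrogram of this kind can be generated with standard tools. A sketch using SciPy and Matplotlib on a synthetic stand-in (an escalating chirp over a low hum plus noise, loosely matching the signal described in the next paragraph):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import spectrogram, chirp

fs = 8000
t = np.linspace(0, 2.0, 2 * fs, endpoint=False)
# Synthetic stand-in: an escalating chirp over a low background hum, plus noise.
x = chirp(t, f0=200, f1=2000, t1=2.0) + 0.5 * np.sin(2 * np.pi * 60 * t)
x += 0.3 * np.random.randn(t.size)

f, times, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=192)
plt.pcolormesh(times, f, 10 * np.log10(Sxx + 1e-12), shading="gouraud")  # dB heat map
plt.ylabel("Frequency (Hz)")
plt.xlabel("Time (s)")
plt.show()
```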

Compare that chart to one showing optimized features generated using our methods for this particular classification problem. On this plot you can see what is happening with much greater resolution, and the facts become much easier to visualize. It is crystal clear that the underlying signal consists of a multi-tone low background hum accompanied by a series of escalating chirps, with a couple of other transient things going on. The information is de-blurred, noise is suppressed, and you don’t need to be a signal processing engineer to understand that the detection problem has just been made a whole lot easier.

There’s another key benefit to optimizing features from the get-go: the resulting classifier will be significantly more computationally efficient. Why does that matter? It may not, if you have unlimited, free computing power at your disposal. But if you are looking to minimize processing charges, or are trying to embed your solution on the cheapest possible hardware target, it is critical. For embedded solutions, memory and clock cycles are likely to be your most precious resources, and spending time to get the features right is your best way to conserve them.
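
As a rough illustration of why this matters on embedded targets, consider the multiply-accumulate (MAC) budget of a linear classifier; every number in this sketch is hypothetical:

```python
# Per-window multiply-accumulate (MAC) count for a linear classifier.
# All numbers are hypothetical, for scale only.
def macs(n_features: int, n_classes: int = 4) -> int:
    return n_features * n_classes

fft_feature_count = 129       # e.g. every rfft bin of a 256-sample window
optimized_feature_count = 12  # a much smaller, optimized feature set

print(f"FFT feature set:       {macs(fft_feature_count):5d} MACs per window")
print(f"Optimized feature set: {macs(optimized_feature_count):5d} MACs per window")
# On a small microcontroller, a ~10x cut in MACs (and in the weights that
# must be stored) can decide whether the model fits on the target at all.
```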

Deep Learning and Feature Discovery

At Reality AI, we have our own methods for discovering optimized features in signal data, but ours are not the only ones.

As mentioned above, Deep Learning (DL) also discovers features, though they are rarely optimized. Still, DL approaches have been very successful with certain kinds of problems using signal data, including object recognition in images and speech recognition in sound. It can be a highly effective approach for a wide range of problems, but DL requires a great deal of training data, is not very computationally efficient, and can be difficult for a non-expert to use. Classifier accuracy often depends sensitively on a large number of configuration parameters, leading many of those who work with DL to focus heavily on tweaking previously used networks rather than on finding the best features for each new problem. Learning happens “automatically”, so why worry about it?

My co-founder Jeff (the mathematician) explains that DL is basically “a generalized non-linear function mapping – cool mathematics, but with a ridiculously slow convergence rate compared to almost any other method.” Our approach, on the other hand, is tuned to signals but delivers much faster convergence with less data. On applications for which Reality AI is a good fit, this kind of approach will be orders of magnitude more efficient than DL.

The very public successes of Deep Learning in products like Apple’s Siri, the Amazon Echo, and the image-tagging features available on Google and Facebook have led the community to over-focus a little on the algorithm side of things. There has been a tremendous amount of exciting innovation in ML algorithms in and around Deep Learning. But let’s not forget the fundamentals: it’s really all about the features.

Reality AI are exhibiting at the IoT Tech Expo North America on November 29-30 – find them at booth 136.

Originally published on Reality AI.


The post Algorithms aren’t the most important thing for building AI solutions – data is appeared first on IoT Tech Expo.

IoT Tech Expo

The Most Important Participant in the Internet Ecosystem

The Internet is borderless, decentralised and indiscriminate, and it can empower people across class, colour and social status. But one question has always intrigued me: how can the universality of the Internet be ensured and sustained? I received the theoretical response to this question at the Pakistan School on Internet Governance in 2016, where I learned about the multistakeholder model and community-driven approaches to addressing the broad range of complex issues in the Internet ecosystem. For someone working at a telecom regulator in South Asia that generally follows the chain of command, the idea of inclusive policies and programmes was truly a revelation. I decided to explore further and applied for a fellowship to the 2017 Asia-Pacific Regional Internet Governance Forum (APrIGF) and the Asia-Pacific School on Internet Governance (APSIG).

APSIG kicked off on 22 July, followed by APrIGF, which ended on 29 July in the beautiful city of Bangkok, Thailand. APSIG had a fantastic line-up of speakers who touched upon advanced topics like the Internet governance ecosystem, data governance, cybersecurity, Internet of Things governance, gender equality and the digital economy. The learnings I gained from APSIG laid an ideal foundation for me to contribute to APrIGF discussions. As a first-timer at these regional events, I was in a dilemma when I had to choose from the rich and diverse line-up of APrIGF workshops. Thankfully, the APrIGF fellowship programme offered help by assigning subthemes to the fellows, and I was rapporteur for the subtheme on “the digital economy and enabling innovation”.

I learned that the digital economy is dynamic and complex, and requires a whole-of-society approach in order to reap its true benefits. It involves addressing a set of complex challenges related to access, affordability, quality of service, cybersecurity, online rights, licensing conditions, taxation and enabling innovation and a competitive online business environment. Moreover, connecting and enhancing the digital literacy of unserved and underserved communities should be a priority so that they can participate in the digital economy. More importantly, I have understood that governments need to step forward and become a part of this multistakeholder process and pursue the opportunities that the digital economy and disruptive technologies such as blockchain and the Internet of Things offer.

In one week of knowledge, networking, participation and engagement, I can confidently say that the people I met and the friends I made are easily the best part of the whole experience. We had a blend of professionals, students, experts, journalists, civil society and marginalised communities from all across the Asia-Pacific region, all welcomed with enthusiasm and warmth by the Internet community. The Asia-Pacific region has more than half of the Internet users in the world, yet to speak as a collective voice it needs those users to realise that they are the most important stakeholders in the Internet ecosystem, free from any chain of command. We, the Internet users, play a key role in shaping the Internet of the future.


The post The Most Important Participant in the Internet Ecosystem appeared first on Internet Society.

Internet Society

Five Management Strategies for Getting the Most from AI

Fueled by the buzz around powerful applications of artificial intelligence (AI), many business leaders are contemplating whether to introduce AI into their organizations. While practitioners and academics have outlined some of the strategic challenges of implementing AI, many executives are still seeking good models for how to generate competitive advantage from its application.

To find out more about what contributes to successful AI adoption, we helped lead a survey by the McKinsey Global Institute of 3,000 C-level executives across 10 countries and 14 sectors. From that research, we identified five fundamental strategies for how to get the most out of AI’s potential.

1. Plan to Grow, Not Just Cut

Executives should approach AI as an instrument to expand their businesses — creating new products or services, increasing productivity, or winning more market share — as much as a tool to cut costs. Companies with less experience in AI tend to focus on its ability to help cut costs, but the more that companies use and become familiar with AI, the more potential for growth they see in it.

Retailing executives in our survey, for example, mentioned cost cutting as often as increasing market share or market growth as their main objectives for implementing AI. But the subset of retailers who have adopted AI at scale — meaning, they deploy AI across technology groups, use AI in the most core parts of their value chains, and have the full support of their executive leadership — cited AI’s potential for business growth twice as often as its potential for cutting costs.

This same subset of retailers, the early AI adopters, reported that insight-based selling — using AI to review shoppers’ habits and suggest personalized promotions and tailored displays — increased sales by 1% to 5% in traditional stores. And they reported that personalization and AI-enabled dynamic pricing lifted online sales as much as 30%.

2. Invest in Both Technical and Managerial Talent Capabilities

In our survey, executives gave several reasons for not adopting AI. The largest share (30%) said they were uncertain of its business case. Another 21% cited the scarcity of AI-related human capabilities — and these same executives were 50% more likely to also say that AI presented an uncertain business case, suggesting that human capabilities are critically important to capture the returns from AI in new organizations.

The talent question is challenging for many organizations on two grounds. First is the need for new talent: When debating how AI may affect labor markets by automating parts of old jobs, companies have paid less attention to how AI is likely to require new technical job categories such as “DevOps Engineers” and “Next-Gen Machine-Learning Engineers.” Second is the need for managerial attention: Good return on AI will be captured only when the technology is embedded in business and workflow processes — a job that typically is complex and requires management from the highest-level leaders.

Regarding technical jobs, AI promises to be a great source of employment — but also of headaches. Filling new technical positions is expensive and time-consuming because we have not been turning out enough skilled professionals to keep up with the demand. In the United States, for instance, there were approximately 150 million workers in 2016, but only 235,000 data scientists. To circumvent the issue, companies should be using multiple paths for talent acquisition. Organizations that have been best at adopting AI are better at anticipating needs, starting with a few hires during pilots and then scaling their recruitment process just before they move from piloting to full-scale development.

The management of AI technology also involves new leadership skills, including those required to implement modern processes embedded with AI. Companies that are successfully embracing AI are committed to transformation programs, with top management embracing the change and cross-functional management teams ready to redefine their processes and activities.

3. Be Open to Revising Your Strategic Goals

In the age of digital disruption, incumbent organizations often “play defense” and protect existing business lines by cutting costs, boosting automation, or improving customer service. Often, though, they would be better off playing offense by pioneering new products and business models. We saw this with the Schibsted Media Group of Oslo, Norway, which moved its entire newspaper classified business to a free online marketplace, opening up a new revenue stream that now generates more than 80% of the group’s earnings.

Similarly, companies committed to adopting AI need to make sure their strategies are transformational and should make AI central to revising their corporate strategies. There can be a clear strategic payoff in fully embracing the use of AI: in our data, we found that for 12 of the 15 sectors studied, companies that use AI at scale and go on the offensive report profit margins 5 percentage points higher than others (18% versus 13%).

4. Rely on a Solid Digital Foundation

AI works best when it has real-time access to large amounts of high-quality data and is integrated into automated work processes. Thus, AI is not a shortcut to creating digital foundations, but a powerful extension of them instead.

Our analyses back this up. At the McKinsey Global Institute, we built a comprehensive measure of the status of digitization intensity in an enterprise. The measure looks at digital assets, including an organization’s computers, robots, digitally connected systems, and other information and communications technology (ICT) assets. It also looks at how digital assets are used, such as for digital payments, digital marketing, back-office operations, and customer relations, and at the human resources devoted to digital work.

We found that companies that are able to show a statistically significant impact from AI not only have a strong digital intensity but also have a strong AI intensity. Overall, AI intensity is relatively uncommon: Less than 5% of companies report that they are using AI as an enterprise-wide solution. Most of these are digital-native companies. But those that score high on both AI and digital dimensions report a much larger and statistically significant impact of AI on their profit development than companies with high scores in AI alone. Our conclusion: Leapfrogging digitization to adopt AI does not seem a good idea.

5. Help Nurture the Creation of AI Ecosystems

Network effects, which were so important to building global digital centers like Silicon Valley, appear to be just as important to budding AI hubs. A critical mass of researchers, developers, financiers, and customers can create a fertile, self-sustaining ecosystem in which innovation and entrepreneurialism can thrive.

Business leaders can nurture the development of AI ecosystems in their communities by encouraging supportive government policies. Thoughtful incentives to attract investment and talent are helpful — for example, tax breaks for AI entrepreneur immigrants and special tech visa quotas. Funding for leading-edge science programs is also important, including grants to universities, the creation of government laboratories, and joint research initiatives with the private sector. Our global review finds that AI investment is concentrated geographically: In 2016, the United States absorbed around 66% of external investment (defined as venture capital, private equity, and merger and acquisition activity). China was second, at 17%, but growing fast. In Europe, London was the leading city.

Governments have other important tools to foster AI ecosystems. They can act as lead customers and ensure that regulations are AI-friendly. And they can make more data available, both by opening up their own data and by establishing standards that make data readily available, while still protecting individuals’ privacy.

These AI ecosystems not only create high-skill, high-paying jobs but also, critically, produce knowledge and innovation spillovers. Indeed, our survey suggests that leaders in AI innovation — the United States and China — also lead in AI adoption. Employees become entrepreneurs, AI-savvy workers move from company to company, and innovative products can be developed for and deployed in local markets.


MIT Sloan Management Review