How to avoid asset meltdown in nuclear power plants

We can all imagine the complexity involved in running a nuclear power plant. Given the safety, reliability and regulatory requirements these plants face, every operation and process needs to be designed and managed with the utmost care. Asset management is no exception.

Asset management plays a critical role in the operations of a nuclear power plant for the following reasons:

1)    It establishes processes that improve reliability by keeping asset downtime as low as possible

2)    It helps manage asset complexity, improving visibility and optimizing maintenance. This extends asset life and raises return on investment, which is especially critical in a nuclear power plant, where equipment and infrastructure come at a high cost.

3)    It helps protect the power supply, ensuring it is always available to meet demand

Data from the IoT enables predictive maintenance

Modern nuclear power plant designs include more IoT sensors than their older counterparts. With these sensors, operators have access to raw data, which they can feed into a predictive monitoring solution and view actionable insights in real time.

Data collected and analyzed can provide a precise picture of an asset’s state of health (Good, Fair, or Poor) – enabling the discovery of failures and potential failures that otherwise would have been impossible to spot. Predictive maintenance, based on this analysis, enables nuclear power plants to be more proactive and confident in their asset maintenance – potentially avoiding disasters or outages.

With this added insight, plant operators have an advantage in terms of the continuous operation of plant assets and better scheduling of maintenance tasks – which translates into reduced costs.
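To make the idea concrete, here is a minimal sketch of how a window of sensor readings might be rolled up into the Good/Fair/Poor health states described above. The vibration and temperature limits, the sensor channels, and the scoring rule are illustrative assumptions only, not part of any particular product.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical limits -- a real plant would derive these from engineering
# specifications and historical failure data.
VIBRATION_LIMIT_MM_S = 7.1   # assumed vibration alarm level (mm/s)
TEMP_LIMIT_C = 95.0          # assumed bearing temperature limit (deg C)

@dataclass
class Reading:
    vibration_mm_s: float
    temperature_c: float

def health_state(readings: list[Reading]) -> str:
    """Map a window of recent sensor readings to a coarse health state."""
    vib = mean(r.vibration_mm_s for r in readings)
    temp = mean(r.temperature_c for r in readings)
    # Score each channel as a fraction of its limit and take the worst case.
    worst = max(vib / VIBRATION_LIMIT_MM_S, temp / TEMP_LIMIT_C)
    if worst < 0.6:
        return "Good"
    if worst < 0.9:
        return "Fair"
    return "Poor"  # flag for maintenance before an outright failure

if __name__ == "__main__":
    window = [Reading(5.8, 82.0), Reading(6.2, 85.0), Reading(6.0, 84.0)]
    print(health_state(window))  # -> "Fair": inspect at the next planned window
```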

A specialized solution for the Energy & Utilities industry

Asset and operational management for nuclear power is unique and requires specialized enterprise asset management software. IBM IoT for Energy and Utilities is an open analytics solution that includes a wide range of capabilities to meet current and future needs of nuclear power providers. IBM IoT for Energy and Utilities is built on a foundation of data integration. It can capture and aggregate all relevant sources of information required to run the most advanced analytics across a wide variety of use cases.

Key capabilities offered in IBM IoT for Energy and Utilities are:

Out-of-the-box utility industry applications. These apply a wide range of analytical capabilities to assess asset health and risk, both historically and in real time. They can verify connectivity models cost-effectively, provide situational awareness from the equipment level to the grid level, and employ predictive maintenance to proactively address impending asset degradation or failure.

A platform with a range of analytic tools. Combined with visualization, IoT data integration, and data lake capabilities, the platform can effectively handle the big data needs of the industry and provide a comprehensive view of asset performance across the asset portfolio.

An open approach for extensibility, customizability, and integration of existing utility models. It can also leverage open source analytic tools to complement and extend a provider’s existing information sources and skills.

AREVA NP and IBM team up to aid nuclear power plants with asset management and maintenance

AREVA NP has joined forces with IBM’s Watson IoT advanced analytics platform. This partnership helps utilities implement big data solutions for the nuclear industry. Utilities can use this integrated data intelligence to predict the when, where and why of component operations and performance, as well as the consequences of component issues. This enables the most cost-effective, pre-emptive deployment of maintenance and repair resources.

For more information:

1)    To learn more about the solution or to talk to a sales rep, visit our Marketplace page.

2)    Find out more on the AREVA NP-IBM partnership here

The post How to avoid asset meltdown in nuclear power plants appeared first on Internet of Things blog.


Five Robotic Process Automation Risks to Avoid

Software robots have emerged as a potential way for organizations to achieve clear cost savings. As an evolution of automated application testing and quality assurance (QA) platforms, software bots can simulate activities that humans perform via screens and applications. A software bot can be trained to perform QA tasks, such as tapping and swiping through an app like a human would, or execute repetitive service tasks such as generating bills. It does this either by recording what humans do or via scripts that are straightforward to construct. The result is the ability to automate rote tasks that do not require complex decision-making, without the need for underlying systems changes.
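As a rough illustration of how simple these scripts can be, here is a minimal sketch of a screen-driving bot in Python using the open-source pyautogui library; the screen coordinates, field order, and billing data are invented for the example.

```python
# A sketch of a screen-driving bot using the open-source pyautogui library.
# The coordinates, field order, and billing data are invented for illustration.
import time
import pyautogui

INVOICES = [
    {"customer": "ACME-0042", "amount": "129.50"},
    {"customer": "ACME-0043", "amount": "87.10"},
]

def enter_invoice(customer: str, amount: str) -> None:
    """Fill in the billing form the way a human operator would."""
    pyautogui.click(420, 310)        # assumed position of the "New invoice" button
    pyautogui.write(customer, interval=0.05)
    pyautogui.press("tab")           # move to the amount field
    pyautogui.write(amount, interval=0.05)
    pyautogui.press("enter")         # submit the form
    time.sleep(1.0)                  # crude wait for the application to refresh

for invoice in INVOICES:
    enter_invoice(invoice["customer"], invoice["amount"])
```

It is precisely this kind of quick-to-write script, multiplied across a business, that the five risks below are concerned with.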

Deploying bots seems deceptively simple. Robotic process automation (RPA) software vendors are actively pitching their platforms, and professional services organizations are talking up the possibilities that bots offer for cost savings with minimal project spending and limited transformational pain, which is resulting in significant corporate interest. Some forward-looking organizations are using bots and other rapid-process and data-automation tool sets to free up budget and resources to kick off large-scale reengineering programs. Others are simply using the tools to give themselves a bit of breathing room to figure out where to go next with their core platforms.

Given the push toward digital agility and away from legacy systems, it’s not surprising that organizations are executing pilots with bots across their operations. But there are five major risks to consider when designing a bot strategy.

1. If bot deployment is not standardized, bots could become another legacy albatross. The way in which business organizations are adopting bots brings to mind another application adoption: measures to address the Y2K software bug at the end of the 20th century. To deal with the time-clock change at the turn of the century, many organizations circumvented legacy limitations. Business users embraced the increasing power in Microsoft Excel and Access to create complex, business-critical applications on their desktops. But as those custom-made computing tools proliferated, so did the problems due to the lack of a strong controls framework, QA, release-management processes, and other formalized IT processes. Companies then had to spend large sums of money tracking down all their wayward tools and slowly eliminating them from critical functions.

Today’s explosion of bots threatens to repeat this pattern. In many cases, the configurations of underlying applications, networks, or data services may need to be changed to allow the bots to work effectively with them. Often, the real power of bots can be realized only alongside other technology tools. For example, a bot might extract information from several hard-to-access systems and push information into a database for use by data-transformation tools, calculators, and models. These integrations require IT involvement to properly design and deploy. Without such expertise, a script designer might simply push the data into an Excel file as a proxy database, which creates another custom-tool remediation exercise — a large number of scripts, running on a larger number of bots, without the necessary standards and monitored source code that is critical in any modern enterprise technology platform. That remediation will take budget and management focus away from badly needed investments in application modernization.
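To illustrate the difference between an Excel file acting as a proxy database and a properly managed data store, here is a minimal sketch using Python's built-in sqlite3 module; the table layout and the extract_records() helper are hypothetical stand-ins for whatever the bot pulls out of legacy systems.

```python
# A sketch of landing a bot's extracted records in a real, queryable store
# rather than an Excel file acting as a proxy database. Uses Python's built-in
# sqlite3; the table layout and extract_records() are hypothetical stand-ins.
import sqlite3

def extract_records():
    """Placeholder for the bot's screen-scraping or API extraction step."""
    return [
        ("PUMP-12", "2017-06-01", 42.7),
        ("PUMP-13", "2017-06-01", 39.1),
    ]

conn = sqlite3.connect("bot_output.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS readings (
           asset_id TEXT,
           reading_date TEXT,
           value REAL
       )"""
)
conn.executemany(
    "INSERT INTO readings (asset_id, reading_date, value) VALUES (?, ?, ?)",
    extract_records(),
)
conn.commit()
conn.close()
```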

The bottom line is that the scripts that program bots are software code and should be treated as such. They need to be designed using industry-standard methodologies that focus on reuse and abstraction, and they should be versioned and properly logged so that QA processes can be executed against them. It is critical that bot implementation be tightly coordinated between business users, technology teams, and, where appropriate, third-party companies hired to write the scripts. Bots should be put into production through the same tested processes that are used for all enterprise software applications.

2. Bots might make innovation more difficult — and slower. As bots are trained to interact with Windows and browser-based applications, they will become a dependency for any change to those underlying systems. If an IT team needs to roll out an upgrade, a critical patch, or any enhancement, it will need to consider how the system change will affect the bots that interact with it. Unless handled very carefully, this will potentially slow down the process of innovation.

Unlike humans, who adapt easily to small changes in the way a specific screen works or the data contained within a dropdown menu, bot scripts may not react positively to even minor changes to a user interface. When a bot “breaks,” it has the potential to cause substantial data corruption because it won’t realize that the work it is doing is wrong and won’t know that it should stop to ask questions, as a human would. Of course, some of this risk can be mitigated by good programming, but this assumes a formal software-development methodology has been used to develop the scripts — an approach that often is not taken. Even something as innocuous as changing the internal name of a screen object in application source code as part of a production release — a piece of information that is never seen by any user — can break a bot script that relies on it.
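One mitigation this implies is defensive scripting: verify that the screen still matches what the script was built against, and stop loudly rather than press on and corrupt data. Below is a minimal sketch of that pattern; screen_contains() and submit_record() are hypothetical placeholders for whatever automation toolkit is actually in use.

```python
# A sketch of defensive scripting: check that the UI still matches expectations
# before acting, and halt loudly instead of continuing and corrupting data.
# screen_contains() and submit_record() are hypothetical placeholders for the
# automation toolkit actually in use.
class BotPreconditionError(RuntimeError):
    """Raised when the UI no longer matches what the script was built against."""

def screen_contains(label: str) -> bool:
    # Assumption: some screen-inspection call (image match, accessibility API, ...)
    raise NotImplementedError("wire this up to your automation toolkit")

def submit_record(record: dict) -> None:
    raise NotImplementedError("wire this up to your automation toolkit")

def safe_submit(record: dict) -> None:
    if not screen_contains("Invoice entry"):   # the screen title the script expects
        raise BotPreconditionError(
            "Invoice screen not found; the UI may have changed -- halting this run"
        )
    submit_record(record)
```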

By introducing bots into their environments, companies have potentially created a set of dependencies that are poorly documented (or not documented at all), not designed to be adaptable to change, and most likely lack traceability. This creates further barriers to changing core systems, requiring more testing and verification to ensure that no bot scripts are broken. It also complicates QA environments, as they now need to encompass both the core application and the bots that run on it.

3. Broad deployment of bots, done too quickly, can jeopardize success. The risk of taking a broad approach to bot deployment from the start is that it can consume a significant amount of an organization’s budget to develop the overall governance framework — all before the organization has really determined how to make its bot investments effective. This will limit the ability of the organization to build momentum around its automation efforts, and potentially allow small and early failures to put the entire program in jeopardy.

A better strategy is to start small, demonstrate success, and then expand the overall automation program. While it is important to strategically approach bot systems, involving process users and IT, it’s also important to learn through the first few deployments how to best analyze and optimize bot platforms. This can be done via six- to eight-week deliverables. Then the organization can build on what it has learned and start to collect accurate measurements of efficiencies and cost savings.

4. Business-process owners have no incentive to automate themselves or their staffs out of jobs. It is unreasonable to assume that the people who own a process are the right people to automate it. A key premise underlying the process-automation programs that many organizations have underway is that bots will reduce the headcount required to execute core functions. Even if using bots will clearly improve the efficiency of the process and even if staff whose jobs are replaced by the use of bots get redeployed elsewhere in the company, it is a rare operations manager who will actively work to reduce the size of his or her group. Salaries and bonuses are often tied to the number of people who work for a specific manager, creating a disincentive to trade improved productivity for fewer workers.

On the other hand, process-owner expertise is necessary to understand the scope and behavior of the process so that it can be automated properly. A better solution might be to first do a scan of multiple processes to produce a heat map that prioritizes processes, then get the process owners to describe in detail how each of their processes works. Then bring in outsiders to automate the routine.

5. Bots don’t eliminate the need for rethinking core platforms. As organizations build bot strategies and tactical plans, they need to keep in mind the hammer-and-nail analogy: When you give someone a shiny new hammer, suddenly every problem starts looking like a nail.

It’s true that bot platforms can help automate manual processes and improve productivity. It’s also true that there are other tools that can achieve even higher levels of productivity and cost savings, often in conjunction with bots. These tools include end-to-end process digitization, rapid process reengineering, user self-service interfaces, custom-tool remediation, and machine learning. It’s important to fill out the toolbox, so to speak, with a range of efficiency solutions and not bring down the bot hammer to fix every problem.

The technology infrastructure in many companies suffers from consistent underinvestment. While bots can free up some resources, they don’t eliminate the need for organizations to take a hard look at their IT capabilities and think about how they need to be modernized. There is a risk that the success of small automation exercises results in management concluding that it can avoid the expense and risk of larger initiatives. That isn’t the case.


MIT Sloan Management Review

How to avoid losing in the competitive “future of work”


Did you get the memo? We’re living in the future. But despite being in the middle of the 4th Industrial Revolution (or Industry 4.0), we’re not seeing the increase in productivity we’re used to seeing with the previous revolutions.

Labor productivity growth is at a historic low, in defiance of the increased fetishization of productivity, the countless technological innovations we have seen in the current business cycle (which began in Q4 2007), decreasing vacation days, and hours growth outpacing its long-term historical trend.

What’s happening, argues Jeff Schwartz, Deloitte’s HR thought leader, is that while technology is advancing at an unprecedented rate and individuals are usually quick to adapt, “…business productivity is not driven by Moore’s Law.”

So what can organizations do to move the needle and close the gap between business productivity and technological advancement? Adapt with your workforce.

Schwartz and his colleagues at Deloitte Human Capital, among many others, believe HR has a unique role to play in helping leaders and organizations adapt to this rapid shift in technology — and to the expectations of a rapidly growing Millennial and Gen-Z workforce. Recently, Deloitte’s Innovation Tech Terminal teamed up with ZipRecruiter and WeFind in Tel Aviv to host the 3rd Meetup of its HR Tech Community, featuring experts from HR Tech startups, investors, and corporations to discuss 1) the future of work, 2) the biggest challenges organizations will face, and 3) what they can do to reinvent themselves for this digital age.

Catching Up

Be flexible

At the meetup, Vered Raviv-Schwarz, COO of Fiverr — the world’s largest freelance services marketplace — discussed the huge shift in the way the world views & treats work, a shift with major implications for organizations looking to hire top talent. “Almost 40% of the US workforce is freelancing. For employers, you’re competing not just with other companies, but with a way of life as well.” The “Freelancing in America: 2016” survey conducted late last year seems to validate these comments: 63% of freelancers do it by choice, and 50% say there is no amount of money that would get them to go back to a traditional job. And why would they, when, according to the survey, a majority of freelancers who left a full-time job made more money within a year?

According to Vered, “You’re not just competing on compensation, but flexibility — you need to give [employees] the feeling that they can be intrapreneurs.” Following the lead of the Amazons and Googles of the world, organizations are embracing a risk-tolerant culture and encouraging experimentation in the workplace. Others, like WordPress (with 400 employees) and Buffer, are ditching the office completely, opting to go 100% remote, with many finding that, in terms of employee retention, there’s no place like home.

Instead of trying to compete with the gig economy, companies can choose to embrace it. Working with freelancers would provide companies numerous benefits such as the flexibility of a fluid workforce that you can easily scale up or down for projects and a wider access to hyper-specialized talent. Routine process tasks and hierarchy are being phased out by automation as organizations are shifting towards being project-based networks of teams, so these off-balance sheet workers could eventually represent a sizeable chunk of the organization of the future.

Love thy employee

It’s going to take a lot more than flexible work policies to motivate this growing digital-first generation of workers. Companies will need to be more employee-centric and place as much emphasis on the employee experience as they do on the customer experience. This means, according to Schwartz, HR Tech will need to provide an end-to-end view of the entire employee experience, from recruitment to retirement, rather than just focusing on engagement and culture.

This includes reinventing how you measure and evaluate employee performance. Gone are the days of annual reviews and focusing on individual achievements. Enter continuous feedback loops and aligning rewards with an individual’s contribution to the team and the team’s contribution to overall business goals of the company.

Ronni Zehavi, CEO & Co-Founder of Hibob, an HR and employee benefits platform for SMBs that focuses on the end-to-end employee experience, argues that many HR platforms are B2B-oriented, with no focus on the “C” — “C” being the employee. With companies devoting up to 80% of their monthly expenses to people, “…they’re essentially your largest asset and resource for growth. To manage your #1 expense and resource, you need tech. It doesn’t make any sense that you pay for CRMs like Salesforce to manage your sales pipeline and marketing, but you neglect the people.”

Develop your talent

It’s no secret that Millennials are notorious for job-hopping: 60% say they are open to new job opportunities, and 21% changed jobs in the past year (3x the rate of non-Millennials). With estimates putting Millennials at 75% of the total US workforce by 2025, organizations must move fast to retain them or risk being left behind.

Millennials and Gen-Z workers can expect to live for 100 years, meaning they’ll be working for 60-70 years. And, according to Schwartz, with the average worker staying at their job for ~4 years and the half-life of a learned skill being ~5 years, everyone will have one job: to constantly be learning. This represents an opportunity for organizations to increase retention and develop the younger, more agile, digital-ready leaders of the future by providing training at the beginning of their employees’ careers and throughout.

PepsiCo is doing this by focusing on a framework of “Critical Experiences” — immersive experiences (like pioneering a new product) that force you to deal with ambiguity, take you out of your comfort zone, and help you develop new skills & knowledge. The idea is that you can draw from the insights and perspective these experiences give you no matter where you end up in your career.

Overwhelmed

Be attractive

Talent acquisition is ranked as the 3rd most important challenge businesses face in Deloitte’s 2017 Global Human Capital Trends report, and it’s apparent that organizations are scrambling to catch their talent acquisition efforts up to the dramatic shift in jobs & skills needs brought on by rapid technological innovation. In the new age of talent acquisition, companies will have to move beyond traditional systems and adapt to emerging technologies, capabilities, and needs if they want to attract the best and brightest.

Building a digital employment brand is a necessity if you want to recruit top talent now. Companies like Industry, a professional network and hiring platform for the service and hospitality industry, are breathing digital life into an entire sector that otherwise uses rather archaic hiring methods like paper resumes and Craigslist ads — not ideal when the job turnover is 72%. For instance, Industry leverages video & photos for its users to build their brands, allowing restaurants to post a “Company Culture Video” on their hiring page and professionals to post short clips and pictures to highlight their creations and personalities.

Others are leveraging cognitive systems like IBM Watson by consolidating data from the employee lifecycle and social media to predict future performance of candidates, optimize recruitment marketing, and increase the speed of hiring.

Moving forward

HR was previously viewed as a compliance-only function. But with 404 deals and $2.2 billion in funding in 2016 alone, according to CB Insights, and with changes in human capital philosophy, HR has emerged as the strategic function helping organizations navigate the 4th Industrial Revolution and the new expectations & increased competition that come with it.

The post How to avoid losing in the competitive “future of work” appeared first on ReadWrite.


Improve asset management and avoid equipment failure

The power of Maximo

Predict and prevent the failure of power plant assets and equipment, improve management and service levels, and lower operating costs.

Use case: Increase operating efficiency and avoid equipment failure

Automated, company-wide asset management is helping this power generation company increase operating efficiency, which helps provide better, more reliable power-generation services to its customers.

For energy companies to provide efficient fuel sources for consumers and businesses, their operations must be efficient. If it takes too much labor or too much money to keep power plants running, output may suffer, and customers end up paying the price. Therefore, it is imperative that power companies run the most efficient plants possible, ensuring that all parts work together cost-effectively and provide high availability for energy output at the lowest price.


Maximo infographic

An integrated solution with superior functionality

This power generation and distribution company aimed to improve asset management and reduce operating costs to respond efficiently to the company’s strategic plans, regulatory framework requirements and increased service levels required by customers.

The utility needed to replace an in-house solution for asset maintenance and management with one that would integrate with its current enterprise systems. The in-house system could not address preventive maintenance, which left the utility to react to incidents instead of preventing potential issues from growing into problems. The utility also wanted to update its current processes with industry best practices applicable to utilities, drawing on consulting services, and to add management control.

The company sought a business and technology partner to help it improve its overall asset management and business processes, replacing its current maintenance application with an enterprise asset management (EAM) solution. Ideally, the solution would integrate all processes, monitor and trace all plant components, identify risks in advance and pool all the information for integrated and transparent asset management.

An automated asset management system using advanced analytics

Power plant equipment that breaks down causes energy waste, downtime and ultimately higher energy prices for customers. Power companies need to find and fix those potentially faulty assets even before they begin to cause trouble. This power company implemented an automated asset management system that uses advanced analytics to measure, monitor and maintain its power plant assets.

Through sensors located in the company’s power plants, engineers monitor a display that reveals the state of each asset in a single, comprehensive view, which helps them spot deviations from normal performance. By analyzing this data compared with historical data, the solution helps plant managers identify anomalous patterns and abnormal performance. This enables them to perform preventive maintenance and predict where assets may start to break down in the future, improving plant performance and avoiding costly outages.
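In its simplest form, the comparison against historical data described above can be a statistical baseline check. The sketch below flags a reading that drifts several standard deviations from an asset's own history; the three-sigma threshold and the sample temperatures are illustrative assumptions, not taken from the solution described here.

```python
# A sketch of a baseline check: flag a reading as anomalous when it drifts more
# than a few standard deviations from the asset's own history. The three-sigma
# threshold and the sample temperatures are illustrative assumptions.
from statistics import mean, stdev

def is_anomalous(history: list, latest: float, sigmas: float = 3.0) -> bool:
    baseline = mean(history)
    spread = stdev(history)
    if spread == 0:
        return latest != baseline
    return abs(latest - baseline) > sigmas * spread

bearing_temps = [71.2, 70.8, 71.5, 70.9, 71.1, 71.4]  # historical readings, deg C
print(is_anomalous(bearing_temps, 78.3))  # True -> raise a preventive work order
```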

To help better manage equipment and operations, the power company worked with IBM® Global Business Services® – Application Innovation Services to implement a solution based on IBM Maximo® and IBM Maximo for Utilities.

Improved data flow, preventive and reactive maintenance capabilities

The solution provides both preventive maintenance recommendations and reactive maintenance programs. The Maximo Asset Management and Maximo for Utilities software allows the power company to anticipate and prevent failures, correcting systems and processes before problems occur. The solution also helps improve data flow between departments. Better asset management and smoother operating processes mean less downtime in power plants. The new solution provides plant safeguarding as well as planning transparency, proactive issue identification and plant asset management.

Real business results

  • Maintains 100 percent of the company’s distribution and generation assets, and 80 percent of its total assets, including preventive maintenance services
  • Reduces operating, maintenance, logistics and inventory costs
  • Improves safety, efficiency and product lifecycles with improved transparency and tracking abilities and more preventive maintenance

Learn more

IBM Maximo can help with plant safeguarding, planning transparency, proactive issue identification and plant asset management. Learn more about how IBM Maximo can be used not only for mainstream maintenance, purchasing, contracting and stores but also for routine preventive maintenance tasks. To understand the real risks for your asset-intensive business, visit our website.

The post Improve asset management and avoid equipment failure appeared first on Internet of Things blog.


Four ways to avoid IoT data fatigue

IoT technologies are already well on their way to revolutionizing every industry, from transportation and buildings to manufacturing and the environment. With the explosion of sensors, sensor data, innovative ideas for harnessing all that data, and vendors implementing solutions, there is a real danger of causing data fatigue for the organizations using these solutions, not to mention the end user or consumer.

The case of the missing elevator

The building in which my parents live in India has an old fashioned elevator with sliding grill elevator doors on each floor. The grill door acts as a safety device. If it is not slid shut manually – completing the electromagnetic circuit – the lift won’t move. However, if after exiting the elevator a passenger neglects to shut the outer grill door, the elevator will remain on the last floor serviced – creating frustration for the next passengers who inevitably send a neighbor’s son or daughter to climb the stairs in search of the lift stranded on an upper floor with its brass grill wide open.

A poor retrofit experience: the hissing elevator

After residents persisted with numerous complaints about unsatisfactory experiences, the building operations team installed sensors to detect whether the grill door was open or closed. If it is open, the system triggers a voice recording of a woman demanding, “Please shut the door” in three languages. The recording continues to cascade through speakers on every floor until at last someone steps into the hallway and slides the grill door shut, sending the errant lift on to its destination.

Not surprisingly, after the novelty of the talking woman wore off, the residents found her announcement more irritating than useful, as the recorded voice seemed to bark commands at them from the second they entered or exited the elevator. The angry elevator voice made the passenger experience even worse, prompting them to hurry out of the elevator and end their less-than-optimal experience as quickly as possible.

I can attest to the experience personally. On my last visit, as I approached the elevator, I was accosted by the voice recording of an angry woman shouting incessantly through the intercom. Rather than comply with her commands, the people on the ground floor simply ignored the message, continuing with their business, oblivious to her demands. With no lift in sight, I bounded up the stairs instead, a pattern that repeated itself throughout my visit.

The residents had grown accustomed to the harsh vocal warning. Instead of responding to its plea and helping other residents, they ignore it. As a result, the occupants of the building are now left with an even worse user experience: long waits, no elevator in sight, and the nagging voice of the recording.

There are two common causes that lead to data fatigue for users of IoT solutions:

1. Bad design for human consumption

Yes, technology can be awesome. But without carefully considering the user experience when implementing a ‘solution,’ the impact of that solution sometimes leaves individuals in an even worse situation. Thinking through the end user’s problem first, and considering how humans will react to the experience, can help avoid similar catastrophes. How we as humans interact with technology influences the success or failure of any solution; the value we ascribe to the technology affects how readily we adopt it.

2. Stovepipe solutions

Another implementation trap which can be equally detrimental to successful adoption occurs when we try to solve too many IoT problems as individual point solutions. This can potentially lead to crossing the threshold of a user’s ability to process new information. Redundancies can also cause inefficiencies in the systems themselves. Did you know hospitals reported 80 deaths and 13 severe injuries attributed to alarm hazards from January 2009 to June 2012? When it comes to the field of medicine, alarm fatigue is dangerous and, indeed, is a hot topic in the medical community.

IoT for buildings

Managing a large commercial building requires answers to questions such as:

  • How bad is the plumbing leak in the basement?
  • How many pounds of salad should I make for lunch in the cafeteria today?
  • Which windows and doors are causing heat loss?
  • How much power is the 5th floor using compared to the average number of occupants on that floor this month?
  • Which parts of the elevator are close to the wearing threshold and need replacement right away?

Thousands of sensors feed data into the building’s “IoT brain” to answer these and other questions. There are hundreds of ways to design solutions to answer such valuable questions, which can easily lead to data fatigue for the users.

Avoiding data fatigue

To avoid IoT implementations that cause data fatigue – rendering potentially good solutions ineffective – solution developers should consider following these four constructs:

1. Put humans at the center

Ultimately, it is indeed people who are the beneficiaries of data insights derived through IoT solutions. Regardless of the objective – comfort, safety or operational efficiency – using a human-centric design approach is one way to create better solutions.

First, understand the end users’ personas: how they work in their environment, what they are trying to achieve, how they interact with data and insights, and how much capacity they have for information. Pay attention to human behavior to understand how users respond when interacting with the IoT solution. Consider the user’s perspective – what frustrates them now, and what will make them happier? This recent article demonstrates that, indeed, “People are the point of IoT.” In addition, explore IBM’s Design Thinking methodology, which advocates a human-centric design point.

2. Keep the big picture in mind

Developing disconnected or individual point solutions for very targeted problems can have an inadvertent impact on something else. The same user may be responsible for watching and managing multiple systems. So, design for a comprehensive set of scenarios – for example, ensure that the audio or visual notification for a building security breach is clearly different from the one for an electrical failure.

Also, avoid collecting and managing the same data more than once. Designing and maintaining a “Digital Twin” that is comprehensive across all systems and departments can help avoid unnecessary redundancies. Having a complete virtual representation of all systems and processes in a building allows an organization to leverage valuable data across systems to deliver higher value at a reduced cost.
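As a toy illustration of the “collect it once” idea behind a comprehensive Digital Twin, the sketch below keeps a shared registry of sensor streams so that each stream is ingested a single time and then served to every consuming system. The names and structure are assumptions made for the example.

```python
# A toy registry for the "collect it once" idea: each system registers the
# sensor streams it needs, and only the first registration triggers ingestion;
# later consumers reuse the stream already in the Digital Twin. Names and
# structure are assumptions for the example.
from collections import defaultdict

class TwinRegistry:
    def __init__(self):
        self._consumers = defaultdict(set)  # sensor_id -> systems consuming it

    def register(self, sensor_id: str, consumer: str) -> bool:
        """Return True only the first time a sensor stream must be ingested."""
        first_time = sensor_id not in self._consumers
        self._consumers[sensor_id].add(consumer)
        return first_time

registry = TwinRegistry()
print(registry.register("elevator-3/door-open", "maintenance"))  # True: ingest it
print(registry.register("elevator-3/door-open", "security"))     # False: reuse it
```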

3. Automate all low risk actions

Design software solutions that can make as many decisions and take as many actions as possible without human intervention. For example, in a solution that manages bathrooms in a commercial building, the “out of paper towels” alert triggered by a sensor should automatically place a service order for replacement, rather than simply notifying the manager, who then has to place an order manually.

Automating low-risk tasks will help reduce frustration for users and also result in greater cost savings. It is important, however, to always ensure that the automated decision or action is indeed what a human would have chosen – that is, remember to always put the human being at the center of software design!
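Picking up the paper-towel example above, a minimal sketch of this pattern might look like the following, where order_supplies() and notify_manager() are hypothetical hooks into a purchasing system and a messaging channel.

```python
# A sketch of the paper-towel example: low-risk events are handled end to end
# by the system, and a human is only pulled in when automation fails.
# order_supplies() and notify_manager() are hypothetical integration points.
LOW_RISK_EVENTS = {"towel_dispenser_empty", "soap_low"}

def order_supplies(item: str, quantity: int) -> bool:
    # Assumed hook into the purchasing system; returns True on success.
    return True

def notify_manager(message: str) -> None:
    # Assumed hook into email or chat.
    print("NOTIFY:", message)

def handle_event(event: str, location: str) -> None:
    if event in LOW_RISK_EVENTS:
        if order_supplies(item=event, quantity=1):
            return  # handled silently; no human intervention needed
        notify_manager(f"Auto-order failed for {event} at {location}")
    else:
        notify_manager(f"{event} at {location} needs attention")

handle_event("towel_dispenser_empty", "Floor 5, restroom B")
```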

4. Deliver higher-order information, not just data

There are many scenarios where the system cannot make a decision and take action on its own without human intervention. In these cases, designing the system to provide the highest-order information possible will allow humans to make better, knowledge-driven decisions and take subsequent actions quickly.

For example, if the system detects a plumbing leak in the basement and the automated action to shut off the appropriate valve fails, human intervention is immediately required. But rather than simply sending an alarm for the water leak and an alarm for the valve failure to a maintenance dashboard, one should design the system to provide higher order analytical information to the user.

In this scenario, rather than simply notifying the user of the failure alarms, inform the user of the extent of the damage, a projected time for when the next major impact from the leak will occur, an early warning about any other high-risk systems that could be affected (is there electrical wiring nearby?), the reason the valve shutoff failed, and a list of the three next-best-action recommendations. Achieving this level of systems analysis requires sophisticated instrumentation and data analysis.
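A small sketch of what composing that higher-order notification could look like follows; the 120-litre damage threshold, the at-risk-systems list, and the recommended actions are all invented for illustration.

```python
# A sketch of composing one higher-order notification from two raw alarms
# (leak detected, valve shutoff failed). The 120-litre damage threshold, the
# at-risk-systems list, and the recommended actions are invented for the example.
from dataclasses import dataclass, field

@dataclass
class LeakIncident:
    leak_rate_l_per_min: float
    valve_error: str
    nearby_systems: list = field(default_factory=list)

def compose_alert(incident: LeakIncident) -> dict:
    # Assume roughly 120 litres of water can pool before it reaches the wiring duct.
    hours_to_major_impact = 120.0 / max(incident.leak_rate_l_per_min, 0.1) / 60.0
    return {
        "summary": "Basement leak: automatic valve shutoff FAILED",
        "estimated_hours_until_major_impact": round(hours_to_major_impact, 1),
        "at_risk_systems": incident.nearby_systems,
        "probable_cause": incident.valve_error,
        "next_best_actions": [
            "Dispatch plumber to the manual shutoff valve",
            "Cut power to the basement distribution panel",
            "Move stored equipment off the basement floor",
        ],
    }

alert = compose_alert(LeakIncident(4.0, "actuator timeout", ["basement wiring duct"]))
print(alert["estimated_hours_until_major_impact"])  # 0.5 hours at 4 L/min
```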

With sensors and a comprehensive Digital Twin of the building in place, develop Machine Learning algorithms to run prediction models based on historical data, events, and actions to make recommendations. The article entitled, “Watson IoT Platform Analytics – Covering all your IoT analytics needs,” explores the many capabilities available in the IBM Cloud to help build intelligent IoT systems.

How big can the data fatigue problem get?

There is a lot at stake for human beings in this IoT revolution. The risk of data fatigue is real in every IoT implementation. The negative impact of ignoring the angry voice recording in a five-story residential building in India is a simple example of a pervasive issue. While it may seem insignificant to the building’s residents, perhaps even tolerable, imagine the scale of such a problem inside an office building with 75 floors.

The instrumentation throughout an urban skyscraper covers miles of plumbing, electrical and HVAC ducts, thousands of windows and doors, countless smoke detectors, multiple banks of elevators, cafeterias that serve thousands of people a day, a gigantic parking garage underneath, and tens of thousands of computers, devices and network connections.

The sheer volume of data created and managed in the Digital Twin of such a building can easily cause data overload, and badly designed IoT systems can quickly overwhelm and cripple building operations and safety, or frustrate the tenants.

Designing IoT solutions with humans at the center, keeping the big picture in mind before creating solutions, automating as much as possible, and delivering decision-making information rather than just data are sensible constructs worthy of consideration.

The elevator saga continues

To continue my encounter with the elevator in India: on one occasion, as the elevator descended with me as its single occupant, it suddenly halted between the 2nd and 3rd floors, and I was assaulted by the voice recording barking in my ear, “Please close the door…” over and over again. Someone had managed to open the grill door on another floor while the elevator was moving, stopping it immediately – and, as usual, the people nearby ignored the voice crying wolf!

My cry for help was met by the building maintenance worker. His calm demeanor told me that this was not the first time someone had gotten stuck between floors. His actions that followed left me amazed and amused. He took out his mobile phone, called what I assume was his assistant, and said in a monotone, “So… the elevator is stuck again. Bring me my Phillips-head screwdriver.”

Learn more

IBM provides a complete workplace management solution that combines data from sensors and equipment with powerful analytics to optimize everything from core facilities maintenance to lease accounting, capital project management, space management, energy management and more.

IBM’s TRIRIGA integrated workplace management system (IWMS) delivers a single platform technology and core business applications to manage the life cycle of real estate and facilities assets. Download the solution brief to discover the power of true integration in workplace management solutions.

Want to know more about how the IoT can enable cognitive buildings? Read how the IBM Watson IoT for Buildings solutions can help you optimize your real estate space and facilities.

Find out how KONE uses Watson to keep 1 billion people moving safely in their elevators.

The post Four ways to avoid IoT data fatigue appeared first on Internet of Things blog.
