The last two and a half centuries have brought about stunning changes in industrial productivity. With each passing generation, new and more advanced equipment and methodologies have arrived on the scene to push production capabilities to new heights. And as each new methodology or technological advancement has matured, the quest for greater operational efficiency has honed its capabilities, setting the stage for the next leap.
While it would be easy to lump these achievements together over this period, the distinctions that set them apart are important in their own right. Because of this distinctiveness, we now recognize four industrial revolutions, each raising efficiency to a new level and again setting the stage for the next.
The first of these revolutions began in the 1760s and was marked by the introduction of steam power and mechanization. Moving from hand-produced goods made by skilled artisans to goods produced by unskilled workers in a factory setting disrupted economies and societies and imposed a new reality on each.
At the end of the 19th century, the second industrial revolution saw the advent of the assembly line and the electrification of factories. Equally disruptive, these advancements enabled continuous, round-the-clock operations and spurred major improvements in machine design, yielding machines that were faster and more controllable thanks to the dependability of electricity.
By the 1960s, a new player had emerged to boost productivity and efficiency yet again. With the development and introduction of computers into mainstream manufacturing, industrial equipment could be automated, and data could be leveraged to enhance the equipment’s performance as well as to understand and manage broad trends to improve operations across the board.
And as the new century bloomed, yet another host of disruptive technologies appeared on the horizon. The use of software and advanced computer technology has led to the merging of machine and computer. Accompanied by the advent of AI, machine learning, and deep analytics, these disruptive technologies have set the stage for explosive growth in efficiency through the creation of the “smart” factory and the Industrial Internet of Things (IIoT).
Challenging the Experts
Also known as Industry 4.0, the fourth industrial revolution has its own unique technologies. But it is also notable for how it is challenging industry professionals, not only by resetting their expectations of operational efficiency, but perhaps by redefining operational efficiency altogether. For if the previous revolutions moved the needle of efficiency within specific areas of expertise, the arrival of the “smart” factory and the reality of cyber-physical systems moves the needle for all areas of expertise at once and threatens to be the most disruptive of the four.
In the first industrial revolution, steam and mechanization challenged mechanical engineers. These challenges centered on harnessing steam to create machines that produced goods at scale. But regardless of the miracle of those advancements, the reality was that the machine defined what could be done.
In the second revolution, the arrival of electricity and the 24-hour factory predominantly challenged operational and electrical engineering. For operational leadership, that challenge was the creation of the modern factory culture that operated on shift work and required large organizational changes not previously possible when work was mostly “dawn to dusk”. Management systems elevated planning and scheduling, along with other functions such as industrial engineering, to a science, creating metrics that allowed the management of human assets within a new, continuously operating factory setting. And as this culture was developing, electrical engineers developed the capability to bring more and more power into factories safely to take advantage of new equipment capabilities.
With the introduction of computers to assist machine capabilities and deliver more data, the third revolution began to change the direction of operational efficiency from one where the machine defined what could be done to one where the computer defined what was possible. With less reliance on humans through automation, efficiency gains became more data driven. Trends and insights previously unavailable, coupled with better quality and precision through the use of automated manufacturing equipment, could be tapped to improve efficiency not just through better machines but through operational awareness across all levels of the factory. Decision-making for planning, logistics and software became more sophisticated as new data became available.
With the Industrial Internet of Things in the current revolution, all areas of operations are challenged. By creating a factory of cyber-physical systems, mechanical, electrical and operational engineers are forced to break out of linear expectations of improving efficiency only within their own sphere. And with the arrival of the connected factory, data has exploded to a level that would previously have been unmanageable. The new reality is that technologies such as AI, deep analytics and machine learning require that operational efficiency be viewed as a single entity affecting the entire operation, in which all functional areas contribute to improvements as part of an integrated and connected system that maximizes efficiency.
Operational Efficiency and the Limits of Lean and Six Sigma
The last few decades have also witnessed the rise of process and quality improvement methodologies aimed at increasing operational efficiency. In many ways, these methodologies were an attempt to bring together the data and metrics made available through the more sophisticated tools the third industrial revolution could deliver. Methodologies such as Lean and Six Sigma grew as companies sought ways to improve operational efficiency through process improvement. Lean grew from the Toyota Production System and was quickly adopted by thousands of companies throughout the world. Using a pull system, optimized layout, machine automation, fast setups and other controllable variables, the Lean system was an attempt to take advantage of the fruits of the previous industrial revolutions by using process improvement to improve efficiency.
Six Sigma was originally pioneered by Motorola as a quality management system in the mid-1980s. It focused on the measurement of “defects per million” and developed methodology to achieve lower defect levels that could be used throughout the factory. The philosophy and logic of Six Sigma could also be viewed as a way of doing business by managing to the metrics and processes that resulted in the lowest possible defect rate.
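The “defects per million” metric at the heart of Six Sigma is straightforward arithmetic. A minimal sketch, using hypothetical production numbers, of how DPMO and the conventional sigma level are computed:

```python
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities -- the core Six Sigma metric."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value):
    """Convert DPMO to a short-term sigma level, applying the conventional
    1.5-sigma shift assumed in standard Six Sigma tables."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + 1.5

# Hypothetical line: 120 defects over 25,000 units, 4 defect opportunities each
rate = dpmo(120, 25_000, 4)
print(rate)                        # 1200.0
print(f"{sigma_level(rate):.1f}")  # roughly 4.5 sigma
```

A process running at the famous “six sigma” level would instead show about 3.4 DPMO, which is why the metric works as a factory-wide yardstick.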
Each of these systems and their methodologies was highly successful and allowed for tremendous gains for companies that adopted them. And their comprehensive organizational structure impacted the entire organization. As such, they represent the first attempts to manage production and drive process improvement and efficiency holistically across a factory or an entire company. To be successful, a Lean or Six Sigma program had to be adopted across multiple departments and functional areas and create a culture that embraced the methodologies completely.
But while these programs were successful (and will continue to be so within Industry 4.0), there are still limits to how far they can go as a system. First, despite their reliance upon metrics and the use of computers and advanced methods of analysis, they are still human-driven systems. Programs must be led, championed, honed and enforced indefinitely. And barriers such as culture, fatigue, burnout and interdepartmental competition reinforce the reality that there is only so much human intervention can do to influence the system manually.
Second, there is the problem of the data itself. Again, while advanced analytical methods are used in both systems, the data are often still siloed, and the decisions drawn from them are still made by humans. There is a limit to human capacity for understanding the metrics and formulating actions based on what they show. As a result, despite a wealth of information, deeper data and trends often go undetected or are undetectable. And while there are attempts to coordinate metrics and data between departments and functional areas, there is only so much that can be done before the data sets become too unwieldy to be used at a deeper, more global level.
These limitations do not suggest that the programs have not been valuable, only that there is a limit to their effectiveness as one can only eliminate so much waste or reduce so many defects before reaching a point of diminishing returns where further improvements would cost more than what is saved. At that point, the system may be able to be maintained, but further gains would be either impossible, impractical or not cost-effective.
Improving operational efficiency further will require the reliance on the Industrial Internet of Things (IIoT) within “smart” connected factories. Here, process improvement is driven not by human initiatives or linear surges within specific fields, but by the complete integration of computer and machine that encompasses the entire operation. This is possible through the introduction of AI, machine learning and deep analytics software that can help realize several benefits:
- These technologies can perform at a micro level that humans cannot.
- They can work faster than humans.
- They can allow for faster, objective and more accurate decision-making.
- Systems can process and analyze data to “see” patterns and trends not readily discernable by humans.
- They can decentralize decision-making by allowing many autonomous or semi-autonomous decisions from within the platforms themselves.
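To illustrate the last two points, even a toy analytics routine can surface patterns that a human watching raw numbers would likely miss. A minimal sketch of rolling-baseline anomaly detection (the sensor stream and thresholds are hypothetical, not drawn from any particular IIoT platform):

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate sharply from the recent rolling baseline --
    a toy stand-in for the pattern detection an IIoT analytics layer performs."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Hypothetical spindle-temperature stream with one abrupt spike
temps = [71.0, 70.8, 71.2, 70.9, 71.1, 71.0, 79.5, 71.2, 71.0]
print(flag_anomalies(temps))  # [6] -- the spike at index 6 is flagged
```

In a real connected factory this kind of logic runs continuously across thousands of signals at once, which is exactly the micro-level, always-on scale no human team can match.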
Barriers to Industrial IoT Adoption
Before manufacturing companies can take full advantage of what Industrial IoT has to offer, there are a few notable barriers that must be overcome.
- Data Interoperability – One such barrier is a lack of data interoperability. In most traditional manufacturing environments, data is siloed between departments or functional areas. Production, scheduling, quality, engineering and business monitoring software may not be linked, or, if they are linked, may still not be compatible for analysis.
One study estimates that up to 60% of IIoT potential value is inhibited by poor data interoperability. As companies seek to transition to IIoT systems, older systems such as vertical closed applications that focus on single machines will have to be replaced, and data will have to be standardized across all functional areas.
- IT Skillsets – Another barrier to leveraging IIoT potential for improved operational efficiency is the lack of skills and access to skills that will be needed for deployment. As a new and evolving technology, many companies have not begun to upgrade these skills and may not fully understand that they need to or what is required to do so. One survey revealed that fully a quarter of respondents cited lack of skilled professionals as a barrier to adoption of IIoT. Just as vertical closed applications inhibit data integration as a relic of the third industrial revolution, skillsets are predominantly geared to maintaining these relics and must be upgraded as well.
- Security – Perhaps the biggest barrier to faster adoption is security. As many devices are powered by open source software (OSS), new connections leave open the possibility of a breach. And in many cases the OSS is not screened for vulnerabilities. Compounding the problem is the reality that there is no end-to-end security solution available, leaving manufacturers and service providers at odds over who should provide security and how. The issue has become critical to the point that governments are now threatening to impose regulation on IIoT unless the industry self-regulates to address security concerns.
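The data-standardization step described under the interoperability barrier above typically means mapping each silo's records onto one canonical schema so they can be joined for analysis. A hypothetical sketch of that idea (all field names and values are invented for illustration):

```python
# Two hypothetical siloed feeds: a machine controller and a quality system,
# each with its own field names and units (seconds vs. milliseconds).
controller_event = {"mach": "CNC-07", "ts": 1700000000, "spindle_rpm": 8200}
quality_event = {"machine_id": "CNC-07", "timestamp_ms": 1700000000500, "defect": "burr"}

def to_canonical(event):
    """Map a source-specific record onto one shared schema so downstream
    analytics can correlate data across departments."""
    if "mach" in event:  # controller format: seconds -> milliseconds
        return {"machine": event["mach"], "ts_ms": event["ts"] * 1000,
                "metrics": {"spindle_rpm": event["spindle_rpm"]}}
    return {"machine": event["machine_id"], "ts_ms": event["timestamp_ms"],
            "metrics": {"defect": event["defect"]}}

for e in (controller_event, quality_event):
    print(to_canonical(e))
```

Once both feeds share the `machine` and `ts_ms` keys, a defect record can be lined up against the machine state at the moment it occurred, which is precisely the cross-silo analysis vertical closed applications prevent.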
The landscape of IIoT adoption, however, is not the gloom and doom these barriers may initially suggest. They must be taken in the context of where Industrial IoT stands historically. As we are at the beginning of a new industrial revolution, these challenges and disruptions are no different from those faced at the start of previous revolutions. In each of those instances, business and technology came together to address the initial barriers once it became clear that extensive operational efficiency gains were possible if they did.
That is a path IIoT adoption will follow as well, because the potential gains are too great not to. It is estimated that the Industrial IoT market will reach $123.9 billion by 2021 and that the impact on global GDP will be more than $14 trillion by 2030. With estimates that high, barriers will fall as the impact on operational efficiency becomes apparent.
In next week’s post, we’ll continue the examination of the relationship between Operational Efficiency and Industrial IoT, diving into a deeper review of how each critical manufacturing area is impacted individually to help us understand how overall operational efficiency improves within a digitized system.
This article was written by Graham Immerman, Director of Marketing for MachineMetrics, a venture-backed manufacturing analytics platform. Graham has quickly become an authority on digital transformation and the application of IIoT technology for the manufacturing industry.
The post Improving Operational Efficiency with Industrial IoT: Part 1 appeared first on Create a culture of innovation with IIoT World!.