The Separate Benefits of Real-Time and Historical Data Visualization


It’s safe to say that everyone has some sort of concept of data visualization. We live in a very visual world, replete with pie charts, trend graphs, heat maps, and infographics. The use of graphics to display data for easy consumption and analysis is very common. In fact, it has been happening since ancient times. What is less common is the graphic display of real-time data – data that is continuously updated as new data is generated by connected devices or people. Real-time data visualization – when done correctly – can transform decision-making and lead to a completely new understanding of the people, places, and things with which we interact.

While most of us have encountered real-time data visualization at one time or another (think of the digital signage at the airport keeping you apprised of flight schedules or the sign at your local bank displaying the current temperature), most haven’t considered how this concept differs from the standard type of data visualization we encounter in reports and presentations.

This distinction between real-time data visualization and historical data visualization is key, as they serve two very different purposes and should not be – as they often are – treated in the same way.

Real-time data visualization does not use past data to plan future activities (though historical data can and should be used to plan the creation of the real-time visualization). Real-time visualization is used to provide an instantaneous look at the current conditions of a person, place, or thing.

Real-time data is not used to make plans; it is used to make decisions. This distinction requires that we approach real-time data visualization with a different philosophical and practical tack than that used to approach historical visualization.

Excerpted from the whitepaper “Real-Time Data Visualization Essentials”, downloaded at www.scada.com.


3 Ways to Use Information Modeling for Continual Improvement in Your Enterprise


The concept of continual improvement has been a regular feature of modern manufacturing enterprises. It’s gaining favor now in other circles as well, and for good reason. Continually making small, incremental improvements to business processes has proven time and again to have a positive impact on production quality, efficiency, and safety. What’s not as well known, however, is how much more effective continual improvement programs can be when used in conjunction with an information model.

An information model can be thought of essentially as a virtual representation of your enterprise, and it provides the organizational and relational structure of your enterprise’s data. Providing context and organization to the raw data is the first step in turning it into actionable information.

Data included in your information model can be drawn from nearly any source. Include data from databases, web services, sensors, PLCs, calculations, real-time user input, or data from other enterprise applications like ERP or MES systems – essentially anything of relevance that can add value and support decision-making.

By modeling your information in this way and providing context to your real-time data, you can visualize your asset management data alongside your process control data, or your maintenance data alongside procurement data. Any data relevant to your business processes is now accessible to your visualization system, and opportunities for optimization become much more apparent when data is presented in context.
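To make the modeling idea concrete, here is a minimal sketch in Python of how an information model might relate assets and attach data from different systems to them. The asset names, properties, and sources are hypothetical; a production system would populate the model from live connections rather than literal values.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Asset:
    """A node in the information model: a named asset with typed properties
    and relationships to other assets."""
    name: str
    asset_type: str
    properties: Dict[str, float] = field(default_factory=dict)
    children: List["Asset"] = field(default_factory=list)

    def add_child(self, child: "Asset") -> None:
        self.children.append(child)

# Hypothetical example: a pump whose real-time process value (from a PLC),
# maintenance status (from a CMMS), and cost data (from an ERP system)
# are all attached to the same modeled asset.
pump = Asset("Pump_101", "CentrifugalPump",
             properties={"flow_rate_gpm": 412.0,          # from PLC / SCADA
                         "hours_since_service": 1260.0,    # from CMMS
                         "replacement_cost_usd": 18500.0}) # from ERP

line = Asset("Line_A", "ProductionLine")
line.add_child(pump)

# Because the data is modeled together, one query can mix process,
# maintenance, and financial context for the same piece of equipment.
for asset in line.children:
    print(asset.name, asset.properties)
```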

For instance, you can visualize how a particular motor’s production throughput is affected by changes in the Overall Equipment Effectiveness (OEE). You can see how the OEE is affected by maintenance operations. The sort of real-time situational awareness enabled by information modeling reveals new opportunities to lower maintenance and operation costs by maximizing asset performance. By defining the relationships in your information model, the data that you visualize becomes much more understandable and actionable.

Another – and perhaps greater – benefit of information modeling is the ability to track the results of incremental changes in real-time across multiple channels.  This allows for faster analysis and greater collaboration. It also becomes much easier to establish new standards, as information entered into your information model is immediately accessible to all who use it. Also, additional media – like videos and manuals – can be included in your model to ensure that all personnel have immediate access to the latest standards and best practices.

There are many ways information modeling can help your continual improvement efforts. Here are three categories of benefits many business owners are already seeing.

1. Analytics, Reporting, and Condition-Based Task Automation

If managed through the right software system, one of the great benefits of information modeling is that your data is normalized and available in a consistent format, regardless of where the raw data was generated. This presents tremendous opportunities for data analysis, reporting, and task automation. It allows machine-to-machine communication, business-to-machine communication, and business-to-business communication. An event in one device or location can automatically trigger an action in another device or location. Automated reports can include data from multiple sources. This is the essence of the Internet of Things (IoT) – the interconnection of all of our assets and their associated data.
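As a rough illustration of condition-based task automation, the following sketch registers a rule against a modeled data point and fires an action when the condition is met. The tag name, threshold, and work-request action are hypothetical placeholders.

```python
# A minimal sketch of condition-based task automation. A real system would
# subscribe to live values from the information model rather than receive
# them through a function call.
from typing import Callable, List, Tuple

rules: List[Tuple[str, Callable[[float], bool], Callable[[float], None]]] = []

def when(tag: str,
         predicate: Callable[[float], bool],
         action: Callable[[float], None]) -> None:
    """Register a rule: when `tag` updates and `predicate` is true, run `action`."""
    rules.append((tag, predicate, action))

def on_new_value(tag: str, value: float) -> None:
    """Called whenever a modeled data point updates."""
    for rule_tag, predicate, action in rules:
        if rule_tag == tag and predicate(value):
            action(value)

# Hypothetical rule: if Motor_7 bearing temperature exceeds 85 C,
# automatically raise a maintenance work request.
when("Motor_7.bearing_temp_C",
     lambda v: v > 85.0,
     lambda v: print(f"Work request created: Motor_7 bearing temp {v} C"))

on_new_value("Motor_7.bearing_temp_C", 92.3)   # triggers the action
on_new_value("Motor_7.bearing_temp_C", 71.0)   # no action
```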

This also presents opportunities for improving maintenance operations, as machines can generate their own work requests or alert personnel to potential problems. The possibilities are truly endless when we free ourselves from the data silos many of us struggle to integrate daily.

2. Data Mining and Activity-Based Intelligence

The US Department of Defense employs a process known as Activity-Based Intelligence (ABI) to find useful details in large sets of data. The process involves creating an automated mechanism to sift through large sets of data in search of anomalies.

Today’s industrial enterprises are finding ways to employ similar techniques. Huge amounts of data are being recorded and opportunities for improvement are known to exist, but how do we know what to look for and how do we find it? The same sort of ABI employed by the DoD is finding a place in the commercial world.

If we can review our historical process data to define the circumstances surrounding certain conditions (unplanned downtime, spikes in energy consumption, etc.), we may be able to recognize repeated patterns or anomalous activity related to these specific circumstances, thereby enabling us to act to correct the situation before it happens again. By finding the data that stands out from the rest, detailing the characteristics of that data, and looking for those characteristics elsewhere, we may be able to pinpoint causal relationships that were previously obscure or misleading.
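For illustration only, the sketch below flags unusual readings in a historical series using a simple z-score threshold. It is a stand-in for the much richer, multi-source analysis that ABI-style techniques involve; the data is invented.

```python
# Illustrative only: one very simple way to flag anomalous readings in
# historical process data, using a z-score threshold.
import statistics

def find_anomalies(values, threshold=3.0):
    """Return (index, value) pairs whose z-score exceeds the threshold."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [(i, v) for i, v in enumerate(values)
            if abs(v - mean) / stdev > threshold]

# Hypothetical hourly energy readings with one obvious spike.
energy_kwh = [102, 98, 101, 97, 99, 100, 240, 103, 96, 101]
print(find_anomalies(energy_kwh, threshold=2.5))  # -> [(6, 240)]
```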

On the flipside, the same techniques can be employed to define the circumstances surrounding periods of extended productivity or energy efficiency. The same techniques used to discern the cause of deficiencies can be used to optimize asset performance and improve the quality and efficiency of our processes.

By creating analytic mechanisms aligned with the principles of ABI, we are able to create a safer, more efficient, more productive work environment.

3. Repeatable and Scalable

As you make the changes that will lead to a more efficient, more productive, and safer business, these changes become part of your information model.

Your information model not only helps you identify opportunities for improvement and publicize updated standards and procedures, but also gives you a means for endless repetition and growth. Your information model is progressive; it can always be modified or expanded. As you make successful optimizations, any changes made to your information model can be easily repeated for any other relevant processes. You are also able to expand your model by adding new locations, new assets, new process cells – whatever it is that you have optimized about your model can be repeated or extended indefinitely.

Excerpted from the whitepaper “Continual Improvement with Status”, downloaded at www.scada.com.

4 Ways Mobile Devices Have Transformed Remote Monitoring and Process Control


Mobile devices have changed many things about the way we live and work today. They’ve changed the way we interact with each other, consume new media, and purchase goods and services – they have become essential lifestyle accessories in a relatively short period of time. This is true not only for individuals; entire industries have been significantly affected as well.

With that in mind, here’s a look at 4 ways in which mobile devices are changing remote monitoring and process control.

Remote Device Monitoring

Mobile devices can be used as portable HMIs (Human Machine Interfaces) to monitor remote equipment in the same way that standard HMIs are used. Field operators can quickly and easily assess the current conditions of a process or piece of equipment without being tied to a workstation.

This can be particularly useful for checking the system-wide effects of repairs or configuration changes made to field equipment, rather than manually visiting each piece of equipment to take measurements or waiting for someone in the control room to report any potential problems or abnormalities.

There may also be situations in which a problem can be diagnosed and corrected without even visiting the site. By giving field operators and technicians the ability to access real-time data from wherever they may be, it may be possible to eliminate travel time and expense entirely, freeing the operator or technician to work on other tasks. This may also eliminate the need for the technician to call back to the control room for updated information, which means the control room operator now has more time as well.

 

Viewing Documents and Other Media

In addition to monitoring and controlling processes and equipment, mobile devices can also serve as a sort of repository for useful information, providing a handy reference for materials that would ordinarily fill several books and would be nearly impossible to carry around over the course of a work day.

New workers can reference training materials like manuals, pictures, and videos. Tablets and smartphones can be used to access safety guidelines or troubleshooting procedures, view schematics and diagrams, and review incident reports or outstanding work orders.

If you think of mobile devices as nothing more than a portable library of relevant media, this use alone is enough to justify the investment.

 

Filling out Forms or Checklists

Operators and technicians frequently have a need to add information to a database regarding certain tasks performed – or simply as part of their day-to-day responsibilities. Whether performing inspections, completing service orders, updating personnel files, or any number of other tasks, mobile devices can save employees a tremendous amount of time by allowing them to perform these tasks from anywhere at any time.

 

Field technicians can update the control system instantaneously from the field – without having to return to the control room to fill out a form or deliver the results to a control room operator over the phone.  It’s not hard to imagine a scenario where a technician in the field, several miles from any control room, can use a single device to read a procedural document, review a checklist, enter relevant information into a form, then check to confirm that the information was entered completely and accurately – without any unnecessary travel time or phone calls.

 

Collaborating

One of the most profound applications of mobile devices is as a tool for instant collaboration. By allowing continuous access to live process data, personnel from different departments can collaborate and make decisions with up-to-date and accurate information at their fingertips.

Mobile devices can be used to document best practices by uploading pictures or videos of particular procedures and allowing these items to be reviewed by workers at other locations in other facilities. Smartphones and tablets allow personnel to access rich media at any time as a means of conveying a certain set of information to relevant parties. Displays of real-time and historical data can be used in meetings or presentations. Mobile devices also allow off-site personnel to participate in real-time activities with on-site personnel. Many possibilities are introduced by mobile technology.

Excerpted from the whitepaper “The Benefits of Data Mobility”, downloaded at www.scada.com.

3 Keys to Effective Real-Time Data Visualization

There are several important factors to consider when creating your real-time data visualization, many of which will depend on your intended application. Today, we consider a few of the general factors that will play a role in every visualization you create. These three factors are clarity, consistency, and feedback.

Clarity

Real-time graphics should emphasize pertinent information and use design principles that promote ease of use and accessibility above aesthetics. Attributes like size, color, and brightness can be used to distinguish primary details from secondary and tertiary details. Special graphics can be created to emphasize different information under different conditions (e.g. a special set of graphics to be used when a certain alarm is triggered).


When planning a real-time visualization scenario, it is very important to consider who will be using the visualization and what their purpose is in viewing the data. This will obviously vary from one organization to the next, but when differentiating between primary, secondary, and tertiary information, it is important to think not in terms of what is important about the thing being monitored, but what is important to the person doing the monitoring.

 

Consistency

Consistent visualizations are standardized and consistently formatted. Interaction requires a minimum of keystrokes or pointer manipulations. In fact, whenever possible, all relevant information should be visible without the need to navigate to another screen. When navigation is necessary, be certain that elements of the user interface related to navigation are clearly distinguished from elements that relay pertinent information. Additionally, navigation and interaction of any type should be as easy and intuitive as possible.


The ergonomic needs of the user are also extremely important. Poor data visibility has been cited as a primary cause of many industrial accidents where a process was being monitored or controlled through a real-time HMI (Human Machine Interface). In fact, poorly designed HMIs have been blamed for accidents that have led to millions of dollars in damaged equipment and some very unfortunate and unnecessary deaths.

 

A study by the European Agency for Safety and Health at Work (EU-OSHA) compiled statistics on HMI-related errors in the workplace. Interestingly, the research shows that the majority of problems are caused by human error, but not entirely because of mental and physical fatigue. More often, errors are caused by poor decision-making related to the way that information is processed.

  

Feedback

An operator should be fully confident that the choices they make are having the desired effect. Screens should be designed in a way that provides information, putting relevant data in the proper context. Also, important actions that carry significant consequences should have confirmation mechanisms to ensure that they are not activated inadvertently.

Controls should function consistently in all situations. If something is not working as it should, that fact should be immediately obvious and undeniable. Again, in a well-designed system, design principles are employed to promote clarity and simplicity, and to reduce user fatigue.

Keep it simple and straightforward. Save the complex visual tools for historical data or real-time reporting. There is certainly a place for all of this, but that place is not where real-time data is being used to make real-time decisions.

Learn more in the free whitepaper “Real-Time Data Visualization Essentials”:

http://scada.com/Content/Whitepapers/Real-Time%20Data%20Visualization%20Essentials.pdf

 

 

Is That SCADA or IoT?

Clearly, SCADA (Supervisory Control and Data Acquisition) and IoT (Internet of Things) are very different things, right? We typically don’t create new terms to describe things for which we already have terms, so yes. They are different, but maybe not as far removed from one another as we may think. As revolutionary as the end results may be, the truth is that the IoT is just a new name for a bunch of old ideas. In fact, in some ways the IoT is really just a natural extension and evolution of SCADA. It is SCADA that has burst free from its industrial trappings to embrace entire cities, reaching out over our existing internet infrastructure to spread like a skin over the surface of our planet, bringing people, objects, and systems into an intelligent network of real-time communication and control.

Not entirely unlike a SCADA system – which can include PLCs (Programmable Logic Controllers), HMI (Human Machine Interface) screens, database servers, large amounts of cables and wires, and some sort of software to bring all of these things together – an IoT system is also composed of several different technologies working together. That is to say, you can’t just walk into the electronics section of your local department store, locate the box labelled “IoT”, and carry it up to the counter to check out.

It also means that your IoT solution may not resemble your neighbor’s IoT solution. It may be composed of different parts performing different tasks. There is no such thing as a ‘one-size-fits-all’ IoT solution. There are, however, some common characteristics that IoT solutions will share:

  • Data Access
    It’s obvious, but there has to be a way to get to the data we want to work with (e.g. sensors).
  • Communication
    We have to get the data from where it is to where we are using it – preferably along with the data from our other ‘things’.
  • Data Manipulation
    We have to turn that raw data into useful information. Typically, this means it will have to be manipulated in some way. This can be as simple as placing it in the right context or as complex as running it through a sophisticated algorithm.
  • Visualization
    Once we have accessed, shared, and manipulated our data, we have to make it available to the people who will use it. Even if it’s just going from one machine to another (M2M) to update a status or trigger some activity, we still need some kind of window into the process in order to make corrections or to ensure proper operation.

There could be any number of other elements to your IoT system – alarm notifications, workflow, etc. – but these four components are essential and will be recognized from one IoT system to the next. Coincidentally (or not so coincidentally), these are technologies that all cut their teeth in the world of SCADA.
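To show how the four components fit together, here is a toy pipeline in Python that acquires simulated readings, packages them as messages, turns them into summary information, and presents the result. Everything in it is simulated; only the structure is the point.

```python
# A toy end-to-end pipeline illustrating the four components named above.
import random
import statistics

def acquire():                      # 1. Data Access (e.g., a sensor reading)
    return random.gauss(72.0, 1.5)

def communicate(reading):           # 2. Communication (send it somewhere)
    return {"sensor": "TT-101", "value_f": reading}   # stand-in for a message

def transform(messages):            # 3. Data Manipulation (raw data -> information)
    values = [m["value_f"] for m in messages]
    return {"sensor": "TT-101",
            "avg_f": round(statistics.mean(values), 1),
            "max_f": round(max(values), 1)}

def visualize(summary):             # 4. Visualization (present it to a person)
    print(f"{summary['sensor']}: avg {summary['avg_f']} F, max {summary['max_f']} F")

messages = [communicate(acquire()) for _ in range(10)]
visualize(transform(messages))
```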

The IoT is the Next Generation of SCADA

Again, in many ways the IoT is a natural extension and evolution of SCADA. It is SCADA that has grown beyond industry and seeped into our daily lives. The IoT is essentially SCADA plus the new technology that has evolved since SCADA was first devised. In the late 18th century, steam power put a hook in all other industrial technology and pulled it forward into a new era; electric power did the same thing a century later. Several decades after that, with the advent of microchips and computer technology, industry was once again swept forward into a new era by the gravity of a single revolutionary technology. As we sit here today, well aware of the revolutionary power of what we call the ‘internet’, we are feeling that gravity once again pulling us toward a new era.

OPC UA: The Communication Standard for the Internet of Things?


As we prepare ourselves for the expansion of the IoT (Internet of Things), many businesses today are looking for ways to take advantage of the opportunities that are beginning to present themselves. Of course, as with anything new, there are many questions and concerns.

Many organizations are struggling with interconnectivity. How do we get existing information systems to communicate with new information systems? If leveraging the IoT requires a wholly rebuilt information infrastructure and a complete reformatting of business processes – well, that’s just not going to work for most people.

There are also organizations who will have questions about how to make use of the unstructured data coming in real time from any number of different sources. How can they create the context to translate this endless stream of raw data into useful information?

And what about the scalability and flexibility needed to deal with growth and change? After all, if the changes implemented today need to be undone in order to keep up with the future needs of your organization, then is it really worth it?

Another common concern is that of security. Are we going to push sensitive information up to the cloud, where it may be exposed to any number of potential threats ranging from cyber-terrorism to corporate espionage? And even if our sensitive data is not being broadcast over the internet, how do we protect these interconnected systems from internal threats? How can we ensure that our employees and contractors have access to all of the information they need to do their jobs and nothing more?

These and many other questions are preventing some organizations from realizing the many benefits of the IoT. Some think it will be too difficult or expensive to implement; others may question the value of it. Fortunately for us all, these questions have been asked for several years, and there are answers.

The communication protocol often cited as the best fit for IoT applications has already been developed, tested and deployed in live environments around the world since it was fully released in 2009.

OPC Unified Architecture (UA) is a platform-independent, service-oriented architecture developed and maintained by the OPC Foundation. As the interoperability standard for industrial automation, OPC has become an integral part of most SCADA (Supervisory Control and Data Acquisition) systems. As data systems expand beyond their traditional roles to include more sensor data and consolidate data from multiple systems, it makes sense that the OPC Foundation has remained at the forefront of the standardization process and has developed a communication standard embraced by proponents of Industry 4.0 and the Internet of Things – companies like Microsoft, Oracle, SAP, GE, and many others.

OPC UA is widely embraced because it directly addresses the obstacles faced by organizations involved in IoT implementation projects. The problem of interconnectivity, for example, is exactly the problem that the communication standard was developed to address. Today, OPC drivers exist for thousands of different devices, and many devices are now manufactured with embedded OPC servers to allow for exactly this type of interoperability with other devices and systems.

Concerns about the usefulness of multi-system data are addressed by information modeling. The OPC UA information modeling framework turns data into actionable information. With complete object-oriented capabilities, even the most complex multi-level structures can be modeled and extended. Information modeling also makes an OPC UA-based system significantly more customizable and extensible. As virtual representations of actual systems, information models can be modified or expanded to meet the changing needs of a modern company.
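As a small example of what browsing an OPC UA information model looks like from a client, the sketch below uses the open-source python-opcua package (an assumption: a generic OPC UA client library, not any particular vendor product) against a hypothetical server endpoint; the address-space layout will differ on any real server.

```python
# Browse a server's OPC UA address space with the open-source python-opcua
# client. The endpoint URL and node layout are hypothetical.
from opcua import Client

client = Client("opc.tcp://192.168.1.50:4840")   # hypothetical server endpoint
try:
    client.connect()
    objects = client.get_objects_node()
    # Walk the address space: each node carries a browse name, and variable
    # nodes carry current values, reflecting the server's information model.
    for device in objects.get_children():
        print("Device:", device.get_browse_name())
        for child in device.get_children():
            print("  ", child.get_browse_name())
finally:
    client.disconnect()
```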

Of course, one of the most important considerations when choosing a communication technology is security, which is one of the great benefits of OPC UA. Security is provided in a number of ways, including: Session Encryption, Message Signing, Authentication, User Control, and Auditing of User Activity.
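A sketch of how those security features are typically engaged from a client, again using the open-source python-opcua package; the security policy string, certificate paths, credentials, and node identifier are all placeholders.

```python
# Connect securely with the open-source python-opcua client: message signing
# and encryption via a client certificate, plus user authentication.
from opcua import Client

client = Client("opc.tcp://192.168.1.50:4840")   # hypothetical endpoint
client.set_security_string(
    "Basic256Sha256,SignAndEncrypt,client_cert.der,client_key.pem")
client.set_user("operator")          # placeholder credentials
client.set_password("s3cret")
client.connect()
try:
    # Read one hypothetical variable from the server's information model.
    print(client.get_node("ns=2;s=Pump_101.FlowRate").get_value())
finally:
    client.disconnect()
```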

While it is difficult to say that there is anything “standard” about the Internet of Things, OPC UA is the closest thing we have to a communication standard, and every day it is becoming more widely accepted and adopted. To learn more about the synergy between OPC UA and Industrial IoT applications, read the following whitepaper: https://opcfoundation.org/wp-content/uploads/2015/04/OPC-UA-Interoperability-For-Industrie4-and-IoT-EN.pdf

** B-Scada’s IoT software is built on OPC UA and leverages the full power of these capabilities to provide fully customizable and extensible applications that consolidate and organize data from disparate sources for secure real-time visualization on any device. Learn more at http://scada.com

From BIM to Facility Management

BIM (Building Information Modeling) has become an essential tool in building architecture and construction. Creating a logical, structured model of all information related to a building project can help the project move seamlessly from one phase to the next.

BIM helps keep building projects on schedule and on budget. It helps ensure regulatory compliance. It helps facilitate the necessary collaboration that must occur between a project’s planning and eventual construction. A quality BIM also helps keep stakeholders involved in the process, adding a kind of transparency that inspires trust and confidence.

For most people, the notion of a Building Information Model implies a detailed 3-dimensional rendering of a building. With the 3D imaging and design software technology available today, it is true that designers and architects are enjoying powerful new tools to do their jobs, and these 3D models are in fact a big part of BIM. They are not, however, what BIM is all about.

A typical BIM will include not only detailed renderings of the planned building, but also specific information related to the engineering, construction, and operation of the building. This information can include designs, architectural specifications, site information, material sheets, budgets, schedules, personnel and more. BIM is not only useful in the design and construction of a building, but can also be very helpful in the management of the building once construction is complete.

COBie

In 2007, a pilot standard was developed by Bill East of the United States Army Corps of Engineers for the delivery of building information that is essential to the operations, maintenance, and asset management of a building once construction is complete. COBie (Construction Operations Building Information Exchange) was accepted by the National Institute of Building Sciences in December 2011 as part of its National Building Information Model (NBIMS-US) standard.

COBie is used to capture and record essential project data at the point of origin, including: product data sheets, spare parts lists, warranties, and preventive maintenance schedules. COBie’s popularity is increasing, and in September 2014 it was included in a code of practice issued as a British standard (BS 1192-4:2014 “Collaborative production of information Part 4: Fulfilling employer’s information exchange requirements using COBie – Code of practice”). This standard will require contractors involved in the construction of government buildings to comply with COBie when delivering facility information to the building owner after construction is completed.
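COBie data is commonly exchanged as a spreadsheet workbook, so a facility team can start working with it using ordinary tools. The sketch below uses pandas to summarize the Component worksheet of a hypothetical handover file; the file name is invented, and the worksheet and column names shown here follow the usual COBie layout but should be verified against the actual deliverable.

```python
# Summarize installed assets from a COBie workbook (hypothetical file name;
# worksheet/column names follow the typical COBie layout).
import pandas as pd

components = pd.read_excel("handover_cobie.xlsx", sheet_name="Component")

# One row per installed asset; group by type and location so the records can
# be loaded into a CMMS or a facility management model.
for (type_name, space), group in components.groupby(["TypeName", "Space"]):
    print(f"{type_name} in {space}: {len(group)} unit(s)")
    print("   ", ", ".join(group["Name"].astype(str)))
```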

While this expectation in Britain is controversial and has been characterized as “unrealistic”, it is becoming increasingly clear that the information involved in Building Information Models can, should, and will be used to aid in the maintenance and management of the building after its construction. This is where BIM becomes facility management, and this is where some enterprising software developers are creating a new market for themselves.

Some developers of BIM software have expanded their product portfolios by including Facility Management products that transfer the information from BIMs into a useful format for operating and maintaining the constructed building. This seems to be a natural extension of BIM, and these companies will benefit greatly by placing themselves ahead of their competition in what is nearly certain to become a large and lucrative market.


What does this have to do with SCADA?

In the space between BIM and Facility Management, there is often a need for greater automation. The exchange of building information today frequently requires a tremendous amount of labor – an amount of labor described in man-years.

Often, facility managers are handed several large boxes of paper documents, from which they must manually retrieve asset information and maintenance schedules to be entered into Computerized Maintenance Management Systems (CMMS). This process usually involves pallets of boxes full of paper operations and maintenance manuals and drawings. Imagine the time required to create, review, and transcribe hundreds of pages of documents, validate the transcriptions, and manually enter data – assuming a system like a CMMS is even used.

Even if a CMMS is used, maintenance technicians often still need to search for information in these paper boxes to complete many of their jobs. As time passes, documents can be moved or lost, increasing the cost of maintenance activities and potentially increasing downtime in mission-critical facilities. A study in 2011 suggested that 8% of annual maintenance budgets could be eliminated if open-standard electronic information were made available to technicians before starting complex work orders.

This is where some BIM software developers are finding a new market by providing the tools to painlessly transfer BIM information into a facility management system. This is also where there are still many who would benefit from an open software platform that allows users to consolidate and organize disparate information, making it available for real-time visualization on any device.

An open platform like B-Scada’s Status Enterprise can provide this type of value to a number of different stakeholders:

  • BIM software developers who would like a customized, branded software solution for facility management they can use to extend their own products or to add as another product in their portfolios.
  • Facility owners who have received a BIM related to their newly-built facilities and are looking for a way to remotely monitor and manage their new assets.
  • Facility managers charged with operating and maintaining multiple facilities, and who would benefit greatly from a remote monitoring solution that allows them to automate processes and monitor real-time activity from anywhere at any time on any device with a web browser.

To learn more about how Status Enterprise can help you reach your facility management goals, visit www.scada.com.
 

The Integrated Enterprise – Are We Ready?


There are many barriers to change in a commercial enterprise, and most of them start with a dollar sign. You are comfortable with what you’re doing. Your staff is comfortable. Sure, there may be some missed opportunities, but perfection is unrealistic. Implementing enterprise-wide changes to something like your data management strategy would require cooperation across multiple departments and absorb numerous man-hours, and who can say how long it will take for all parties to get used to the new strategy and work with the level of comfort they feel today? Is it worth it? How long will it take to recover the investment?

There are many legitimate questions to ask when considering whether or not to move toward an integrated data management strategy. How do we calculate the true cost of making such a change? A question that is very rarely asked is: What is the true cost of not making such a change?

First, let’s consider some of the reasons in favor of data integration.

Inconsistent data

One of the problems addressed by data integration is inconsistency between data on the plant floor and the business data further upstairs. Depending on the type of business, different departments typically have different goals and criteria for success. The plant floor supervisor wants to know where his products are; the executive upstairs wants to know how much his products are worth. Here is a case where we have different people querying for different bits of information about the same asset. Over time, the different goals and process definitions have led to departments using the same terms to describe different things, and different terms to describe the same things. This barrier to departmental collaboration in the manufacturing industry, for example, has led to the development of standards like ISA 95 to help facilitate the integration of manufacturing systems with business systems.

Redundant data

Another common condition is the tendency for different departments or divisions to have different ways of recording information about the same things. It is not at all unusual for large organizations to have multiple records of the same asset. For instance, if we imagine a particular production unit from the perspective of the plant floor operator, he will need to have information about where it is in the production process, its quality, the personnel involved in its production and testing, and when it will be shipping. At the same time, a manager will want to have information about how much it cost to produce this unit, how many units will be produced today, and how much we will get for it. We now have a situation where we are capturing and recording separate sets of data about the same thing.

Fewer Human Resources

This one seems obvious, but it is a significant difference-maker when you analyze your bottom line. Making it easier to find needed data will allow personnel to spend more time focusing on other aspects of their jobs. It will allow for faster decisions and more immediate response to abnormal conditions. Your plant floor supervisor won’t have to make that call upstairs to find out why today’s production schedule has changed, or log in to a separate system to find out when a piece of equipment was last inspected. And the manager upstairs won’t have to call downstairs to find out why we are behind schedule today, or what happened to that shipment that was supposed to go out. Having the ability to quickly assess a situation leads to better-informed decisions made more quickly and with more immediate results.

Reduced Risk

While we are on the topic of making informed decisions more quickly, this is a good time to consider the way that decisions are currently made in many enterprises. When a decision needs to be made quickly, and the data that could support that decision is not available as quickly as the decision is needed, owners and executives are left to make decisions based on intuition. Studies have suggested that about 80% of decisions are made this way. It may work and it may not. Having the right information when and where it is needed can significantly reduce the risk involved in the decision-making process.

There are many additional benefits that can be attributed to data integration. New business opportunities can be revealed. New calculations can be used to improve efficiency and coordinate processes. Inventory management, energy consumption, and supply chain scheduling can all be improved. Whether you choose to use a system of data virtualization to integrate key data from different divisions, a system of data federation to consolidate all enterprise data, or opt for a complete data integration solution that re-engineers your entire data system, the benefits are very real – and yes, so is the cost. The cost, however, is a short-term loss for a long-term gain; a temporary pain for permanent growth.

So, to revisit the topic of this article: Are we ready for the integrated enterprise? The answer is irrelevant. Those who are ready will continue to prosper. Those who are not will lose the ability to compete, and will ultimately have to get ready or get out of the way.

For more information on how you can integrate and visualize your business’s data, visit: www.scada.com

Change Management Systems – Is There a Better Way?

There is no denying that plant floor automation can dramatically improve efficiency and increase productivity, but there is an unintended consequence of automation that can make it problematic. That consequence is the increased dependency on new technologies like PLCs, PC-based control systems, SCADA systems, and HMIs. As long as everything is working as it should, the automated workplace proceeds as a well-oiled machine, meeting every quota and price point. Of course, when something is not working as well as it should things can get complicated.

Imagine if a type of hardware used in your process has proven to be ineffective and you’ve decided to replace it with another model. Not only does the hardware change, but changes must be made to your overall control logic. This is likely to require changes to your PLCs, your SCADA system, and your HMIs. And what if the new equipment is even less efficient and you decide to roll back to the previous version? All of these control logic changes must be undone.

Change Management Systems

These concerns have become of such major importance that many companies are investing thousands of dollars and countless man-hours in software designed specifically to help manage plant-wide changes. These Change Management Systems are intended to reduce the overall cost of implementing plant-wide changes by automating as much of the process as possible. A good CMS will provide the following features:

– A backup/archive of prior revisions of programs
– Tools for documenting changes
– A historical record of what and when changes were made, and by whom
– User- or role-based permissions determining who is able to make changes
– Disaster recovery procedures to recover from hardware failures
– Notification of changes

These change management functions have been performed manually in most cases, requiring enormous investments of time. Furthermore, the updates made to PLCs and SCADA systems typically require taking the process down while changes are made. This inevitable downtime creates another enormous gap in profitability. Even when a sophisticated CMS is employed, there is no way to avoid the fact that traditional SCADA and HMI systems are inextricably linked to the hardware that they are monitoring. Any significant change will require taking the entire process down and starting it up again after the changes are fully implemented.

Is There an Alternative?

If it seems that change management is just a fancy new way for software developers to make more money on some unnecessary product designed to solve imaginary problems, just think about what would be involved in making plant-wide changes in your enterprise. Would you have to make changes to your SCADA system? How long would that take? Would you have to update your HMI screens? How many of them? And how long would you have to take the process down in order to make these changes? Consider the cost of the labor. Consider the lost production due to downtime. And imagine if the change you made does not produce the intended result, and you want to roll the process back to a previous state. How much more time and money would that cost?

The benefits of change management are various and undeniable, but is it possible to realize these benefits without introducing another management system – another system that will itself need to be managed? What if your HMI/SCADA system allowed you to manage plant-wide changes with ease, and without extravagant investments in labor or lost production? One way this is possible is through the concept of data modeling. By creating a logical model of your plant and your processes, your control logic is abstracted away from the actual hardware and becomes much more flexible and scalable. A change made to a piece of equipment in your data model will automatically be in effect for anyone who is using that model.

Data modeling also allows you to create templates of your HMI screens that can be used for all assets of the same type, so instead of making changes to dozens of different screens, a change can be made to the template and automatically applied to all instances of that template. And since graphics are bound to data in the model instead of to actual hardware, changes can be made to your HMI screens without taking the process down.

As today’s enterprises become more automated, and as more data points become measurable, a SCADA system that employs data modeling is becoming more and more of a necessity. The good news is that such a system will surely pay for itself in a short time as efficiency is increased and downtime is reduced, providing a significantly lower total cost of ownership.
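The templating idea can be illustrated with a few lines of Python. This is not any vendor's actual API, just a sketch of the concept: asset types carry the properties and the display template once, instances inherit them, and a change to the type shows up everywhere the type is used.

```python
# Simplified illustration of type/template-based data modeling.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class AssetType:
    name: str
    properties: List[str]        # e.g., ["speed_rpm", "temp_c"]
    display_template: str        # stand-in for an HMI screen template

@dataclass
class AssetInstance:
    name: str
    asset_type: AssetType
    values: Dict[str, float] = field(default_factory=dict)

    def render(self) -> str:
        # The instance's screen comes from its type's template,
        # bound to this instance's current values.
        return self.asset_type.display_template.format(name=self.name, **self.values)

motor_type = AssetType("Motor", ["speed_rpm", "temp_c"],
                       display_template="{name}: {speed_rpm} rpm / {temp_c} C")

motors = [AssetInstance(f"Motor_{i}", motor_type,
                        {"speed_rpm": 1750 + i, "temp_c": 60 + i})
          for i in range(1, 4)]

for m in motors:
    print(m.render())

# Changing the template on the type changes every bound screen at once.
motor_type.display_template = "{name} | speed={speed_rpm} rpm | temp={temp_c} C"
print(motors[0].render())
```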


Data Modeling is becoming more of a necessity in today’s data-driven enterprise

 

The need for a CMS can be eliminated in many cases by using an HMI/SCADA system that employs data modeling. And while data modeling alone will not replace the full range of features provided by a quality CMS, many of the benefits can be duplicated, and additional benefits can be derived from the ability to perform these change management tasks from inside of your SCADA system without having to deploy a separate system.

Consider the example of B-Scada’s Status Enterprise HMI/SCADA software, which takes full advantage of the data modeling concept. Status Enterprise allows you to deploy system-wide changes with ease, and allows these changes to be logged and accessed later for review. The packaged Database Utility allows you to create regular backups of your model and mimics so that they can be rolled back to an earlier version should the need arise. You can create user roles and workspaces to determine who has access to what information and what capabilities will be allowed. By combining the power and efficiency of high-quality SCADA software with the sophistication of data modeling, it is possible to incorporate capabilities that bridge the gaps between process control, maintenance management, change management, asset management and resource planning. With the dawning of the new interconnected industrial environment, industry 4.0 or the ‘Internet of Things’, there has never been a better time to change your expectations about SCADA software and what it can do to bring your enterprise into the 21st century.

 

 

U.S. Factories Manufacturing a Comeback?

Once a pillar of the American economy, the manufacturing industry suffered some major setbacks over the last couple of decades. Whether due to free trade agreements, international outsourcing, the general economic recession, or a combination of all three, fewer and fewer products are manufactured in the US. There are signs, however, that things may be changing for the better. 


Factories that have sat dormant for years are being purchased and reopened. Some are being opened by foreign companies, but those factories will be staffed by American workers. And the resurgence of U.S. manufacturing is not entirely dependent on foreign investment. Many American companies are also increasing their local manufacturing base – again, putting Americans to work.

This resurgence is particularly important as manufacturing – even at its least productive – is responsible for a significant portion of the US economy. For example, in 2012 manufacturing contributed 12.5 percent of GDP. Right now, it is estimated that every $1.00 spent on manufacturing adds $1.32 to the economy, which is the highest multiplier effect of any economic sector. It is not an exaggeration to suggest that as manufacturing goes, so goes the U.S. economy. 


Manufacturing Production has increased during 9 of the last 12 months

 

The truth is that manufacturing output is increasing worldwide, due in large part to technological advances and the proliferation of factory automation. As SCADA technology advances, moving us ever closer to the next Industrial revolution – or Industry 4.0 – not only are manufacturing processes becoming more efficient, but manufacturing employees are becoming more skilled and higher paid than in previous generations, meaning there is more disposable income for purchasing manufactured goods. Greater demand encourages greater supply, and the cycle continues.

Automation is also likely to level the international playing field somewhat, as technology is eliminating the need for cheap, unskilled labor. Manufacturing employees in the new industrial environment will be technically savvy and skilled. As we move toward a more automated future, there will be little advantage to an American company opening factories in other nations or outsourcing manufacturing projects. Americans are still among the world’s most voracious consumers, and the ability to eliminate the cost and complication of international shipping will provide incentive for local companies to keep their manufacturing base right here in the U.S.

U.S. factories continue to re-open, and new ones are being built. And the most important product issuing from these new assembly lines? Prosperity.