3 Keys to Effective Real-Time Data Visualization

There are several important factors to consider when creating your real-time data visualization, many of which will depend on your intended application. Today, we consider a few of the general factors that will play a role in every visualization you create. These three factors are clarity, consistency, and feedback.

Clarity

Real-time graphics should emphasize pertinent information and use design principles that promote ease of use and accessibility above aesthetics. Attributes like size, color, and brightness can be used to distinguish primary details from secondary and tertiary details. Special graphics can be created to emphasize different information under different conditions (e.g., a special set of graphics to be used when a certain alarm is triggered).
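To make this concrete, below is a minimal sketch in Python of how such emphasis rules might be expressed. The names and values are hypothetical rather than taken from any particular HMI toolkit; the point is that visual weight follows information priority, and a dedicated alarm style overrides the normal one when triggered.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Style:
        color: str        # color understood by the rendering toolkit
        font_size: int    # larger sizes draw the eye to primary details
        opacity: float    # dimmer rendering pushes tertiary details back

    # Normal operation: visual weight follows information priority.
    PRIORITY_STYLES = {
        "primary":   Style("#FFFFFF", 24, 1.0),
        "secondary": Style("#C0C0C0", 16, 0.8),
        "tertiary":  Style("#808080", 12, 0.6),
    }

    # A special style set that takes over when a given alarm is active.
    ALARM_STYLES = {
        "high_temperature": Style("#FF3B30", 28, 1.0),
    }

    def style_for(priority: str, active_alarm: Optional[str] = None) -> Style:
        """Alarm styling overrides normal priority styling."""
        if active_alarm in ALARM_STYLES:
            return ALARM_STYLES[active_alarm]
        return PRIORITY_STYLES[priority]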


When planning a real-time visualization scenario, it is very important to consider who will be using the visualization and why they are viewing the data. This will obviously vary from one organization to the next, but when differentiating between primary, secondary, and tertiary information, it is important to think not in terms of what is important about the thing being monitored, but what is important to the person doing the monitoring.


Consistency

Consistent visualizations are standardized and consistently formatted. Interaction requires a minimum of keystrokes or pointer manipulations. In fact, whenever possible, all relevant information should be visible without the need to navigate to another screen. When navigation is necessary, be certain that elements of the user interface related to navigation are clearly distinguished from elements that relay pertinent information. Additionally, navigation and interaction of any type should be as easy and intuitive as possible.


The ergonomic needs of the user are also extremely important. Poor data visibility has been cited as a primary cause of many industrial accidents where a process was being monitored or controlled through a real-time HMI (Human Machine Interface). In fact, poorly designed HMIs have been blamed for accidents that have led to millions of dollars in damaged equipment and some very unfortunate and unnecessary deaths.


A recent study by OSHA in Europe compiled statistics on HMI-related errors in the workplace. Interestingly, the research shows that the majority of problems are caused by human error, but not primarily by mental and physical fatigue. More often, errors are caused by poor decision-making related to the way that information is processed.


Feedback

An operator should be fully confident that the choices they make are having the desired effect. Screens should be designed in a way that provides information, putting relevant data in the proper context. Also, important actions that carry significant consequences should have confirmation mechanisms to ensure that they are not activated inadvertently.

Controls should function consistently in all situations. If something is not working as it should, that fact should be immediately obvious and undeniable. Again, in a well-designed system, design principles are employed to promote clarity and simplicity, and to reduce user fatigue.

Keep it simple and straightforward. Save the complex visual tools for historical data or real-time reporting. There is certainly a place for all of that, but that place is not where real-time data is being used to make real-time decisions.

Learn more in the free whitepaper “Real-Time Data Visualization Essentials”:

http://scada.com/Content/Whitepapers/Real-Time%20Data%20Visualization%20Essentials.pdf


To Each His Own: Creating Custom Dashboards for Operators and Analysts


It’s always very annoying when I try to perform what seems like it would be fairly routine maintenance on a home appliance or worse – my car – only to find out that this seemingly simple thing I would like to do is actually quite difficult with the tools at my disposal. A little bit of research usually reveals that it actually is quite simple; I just have to buy this proprietary tool from the manufacturer for what seems like a ridiculous price, and then I can proceed.

Of course, it’s easy to understand why the manufacturer doesn’t want to make it easy for end users to service their product. They want you to buy a new one, or at the very least buy this overpriced tool from them so they can scrape every morsel of profit afforded by their built-in obsolescence.

It really makes me appreciate the simplicity and widespread application of some of our more traditional tools. Take a hammer, for instance. If you need to drive a nail into wood, it doesn’t matter if it’s a big nail, a little nail, a long nail, or a short nail. It doesn’t matter who manufactured it or when. All that matters is that it’s a nail. Just get a hammer; you’ll be fine.

This got me thinking. What if we had a hammer for every type of nail available? What if each hammer was perfectly sized, shaped, weighted, and balanced for each particular nail? And what if that perfect hammer was always available to you every time you needed it? This isn’t realistic, obviously, but it reminds me of some of the things I hear from our customers.

One of the great benefits cited by our end users is the ability to create custom dashboards for the different work responsibilities in their organizations. The same system is used to create maintenance dashboards for technicians, control panels for operators, system overviews for managers, reports for analysts, and even special dashboards for contractors and vendors. By providing every member of the team with a real-time view of exactly the information they need to do their jobs and nothing more, each person is empowered to work with the utmost efficiency – improving the speed and accuracy of decision-making as well as increasing the capacity for planning.

In the past, so much of our data visualization was tied to the device from which the data was drawn. If you wanted to know something about a particular machine, you had to look at the same picture as everyone else, regardless of what you needed to see.

Some modern software platforms like B-Scada’s Status products eliminate this need to tie visualizations to the device from which the data is drawn. It is now possible to visualize data from multiple devices at multiple locations through the same interface. This allows for a new concept in user interface design: rather than displaying all available information about this particular thing, you can now display all information relevant to a particular task or set of tasks.
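As a rough illustration – with hypothetical tag paths, and not B-Scada’s actual API – a task-oriented interface can be thought of as nothing more than a mapping from roles to the data points each role needs, regardless of which device or site the data comes from:

    # Dashboards defined around tasks and roles, not devices.
    # Tag paths are hypothetical.
    DASHBOARDS = {
        "maintenance_tech": [
            "plant1/motor7/vibration",
            "plant1/motor7/hours_since_service",
            "plant2/pump3/seal_pressure",
        ],
        "operator": [
            "plant1/motor7/rpm",
            "plant1/line1/throughput",
        ],
        "manager": [
            "plant1/line1/oee",
            "plant2/line2/oee",
        ],
    }

    def tags_for(role):
        """Each role sees exactly the data its job requires, and nothing more."""
        return DASHBOARDS.get(role, [])

The same data point can appear on several dashboards; the model, not the device, decides who sees it.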

It’s not quite “a hammer for every nail”; it’s more like a complete tool set tailored to every job, containing exactly the tools you need and nothing more. It’s really been a transformative development for many organizations.

B-Scada recently released a case study detailing how one prominent North American electric utility used Status to create a system of customized views for their operators, managers, and analysts, providing specific insights into the real-time status of their generation resources:

Read It Now


The Four Biggest Challenges to Enterprise IoT Implementation


After endless cycles of hype and hyperbole, it seems most business executives are still excited about the potential of the Internet of Things (IoT). In fact, a recent survey of 200 IT and business leaders conducted by TEKsystems® and released in January 2016 (http://www.teksystems.com/resources/pressroom/2016/state-of-the-internet-of-things?&year=2016) determined that 22% of the organizations surveyed have already realized significant benefits from their early IoT initiatives. Additionally, a full 55% expect a high level of impact from IoT initiatives over the next 5 years. Conversely, only 2% predicted no impact at all.

Respondents also cited the key areas in which they expect to see some of the transformational benefits of their IoT efforts, including creating a better user and customer experience (64%), sparking innovation (56%), creating new and more efficient work practices and business processes (52%), and creating revenue streams through new products and services (50%).

The IoT is Expected to Impact Organizations in Numerous Ways

So, with the early returns indicating there are in fact real, measurable benefits to be won in the IoT, and the majority of executives expect these benefits to be substantial, why are some organizations still reluctant to move forward with their own IoT initiatives?

As could be expected, security is the biggest concern, cited by approximately half of respondents.

Increased exposure of data/information security – 50%

With the World Wide Web as an example, people today are well aware of the dangers inherent in transmitting data between nodes on a network. With many of these organizations working with key proprietary operational data that could prove advantageous to a competitor if exposed, the concern is very understandable.


ROI/making the business case – 43%

This is a classic example of not knowing what you don’t know. Without an established example of how similar initiatives have impacted your organization in the past – or even how similarly sized and structured organizations have been impacted – it can be very difficult to demonstrate in a tangible way exactly how these efforts will impact the bottom line. Without the ability to make the business case, it will be difficult to get executives to sign off on any new initiatives. This is likely why larger organizations ($5+ billion in annual revenue) are much more likely to have already implemented IoT initiatives, while smaller organizations are still in the planning phase.


Interoperability with current infrastructure/systems – 37%

Nobody likes to start over, and many of the executives surveyed work in organizations that have made enormous investments in the technology they are currently using. The notion of a “rip and replace” implementation is not very appealing. The cost is not only the downtime incurred in these cases, but also the waste associated with expensive equipment and software systems being cast aside. In most cases, to gain any traction at all, a proposed IoT initiative will have to work with the systems that are already in place – not replace them.

Finding the right staff/skill sets for IoT strategy and implementation – 33%

With the IoT still a fairly young concept, many organizations are concerned that they lack the technical expertise needed to properly plan and implement an IoT initiative. There are many discussions taking place about how much can be handled by internal staff and how much may need to be outsourced. Without confidence in their internal capabilities, it is also difficult for organizations to know whether they even have a valid strategy or understanding of the possibilities. Again, this is a case where larger organizations with larger pools of talent have an advantage.

The full results break down like this:

Many Organizations are Hesitant to Invest Much in IoT Initiatives at this Stage


These are valid concerns, and not all of them lend themselves to simple solutions. In truth, many of the solutions will vary from one organization to the next. In many cases, however, the solution could be as simple as choosing the right software platform. Finding a platform that eases your concerns about interoperability can also ease your concerns about whether your staff can handle the change, as there will be no need to replace equipment. Likewise, a platform that can be integrated seamlessly into your current operations to help improve efficiency and implement optimization strategies will also make it much easier to demonstrate ROI.

B-Scada has released a new whitepaper on choosing the right IoT platform for your project. If you’re thinking about taking that leap into the IoT, it’s well worth the read.

Read It Now

3 Keys to Effective Real-Time Data Visualization

Everybody appreciates the value of a good picture. Each one says a thousand words, after all, or so the saying goes. If we really dig into this metaphor, we’d probably admit that some pictures say substantially more than that – while others probably come in well under a dozen (just look at a random Facebook wall for some examples).

Ours has become a very visual culture, and one occupying a place in time defined by an overwhelming abundance of information all around us. Considering these two facts, it is not at all surprising that we see such increased interest in data visualization – that is, the process of placing a specific set of data points in a highly visual context that allows them to be quickly consumed and analyzed.

It’s not a new concept; data has been visualized in pictures for centuries. A map is a type of data visualization, for instance, as are the many charts and graphs that have been in use since the end of the 18th century. What is new is the massive quantity of data available to nearly everyone, and the wide array of tools that can be used to create compelling visualizations. Think about the cool infographic you saw the other day. Was it created painstakingly over several days of carefully reviewing ethnographic data compiled by a dogged scientist over the course of his career? Maybe, but probably not. It was more likely created by some marketing department somewhere (not that there’s anything wrong with that) using somebody else’s data and somebody else’s visualization tools.

The purpose of this post, though, is not to discuss the merits of data visualization in general, but rather the specific subset of data visualization that deals with real-time data. This is a completely separate species of data visualization and should be treated as such.

Real-time data visualization refers to visualization of data that is continuously updated as new data is generated by connected devices or people. This is the type of data that is used to make real-time decisions and, when done correctly, can truly transform business processes.
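In code, the distinguishing feature is the update loop: the display refreshes as new samples arrive, rather than rendering a fixed dataset once. The sketch below is a bare-bones Python illustration; read_sensor() and render() are hypothetical stand-ins for a real device driver and display toolkit.

    import random
    import time

    def read_sensor():
        # Stand-in for a real device read (e.g. over OPC, Modbus, or MQTT).
        return 20.0 + random.uniform(-0.5, 0.5)

    def render(value):
        # Stand-in for updating an on-screen element.
        print("temperature: {:.2f} C".format(value))

    for _ in range(5):        # a real display would loop until shutdown
        render(read_sensor())
        time.sleep(1.0)       # refresh interval short enough to act on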

There are a number of important factors to consider when attempting to visualize data in real time, but we will focus on three simple and obvious keys: clarity, consistency, and feedback.


Clarity

Real-time graphics should emphasize pertinent information and use design principles that promote ease of use and accessibility above aesthetics. Attributes like size, color, and brightness can be used to distinguish primary details from secondary and tertiary details. Special graphics can be created to emphasize different information under different conditions (e.g., a special set of graphics to be used when a certain alarm is triggered).


Hierarchical Data Makes its Relevance Obvious

Clear visualizations provide actionable information at a glance, and clearly show the current process state and conditions. Alarms and indicators of abnormal conditions are prominent and impossible to ignore.

Clarity encompasses both content and context.

Contextual Controls Allow You to Assess Current Conditions at a Glance


Consistency

Consistent visualizations are standardized and consistently formatted. Interaction requires a minimum of keystrokes or pointer manipulations.

Shapes, colors, and layouts should be used consistently throughout all screens. If the color red is used to designate an abnormally high value on one screen, that same red should be used to indicate all abnormally high values of the same type on all screens. If navigation buttons are on the left side of one screen, they should be on the left side of all screens. A consistent visualization system is arranged in a logical, hierarchical manner, allowing operators to see both a general overview of the system and more detailed information on individual components as needed. Navigation and interaction of any type should be as easy and intuitive as possible.
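One simple way to enforce this kind of consistency is to define the palette and layout rules once, in a shared module, and have every screen draw from it instead of hard-coding its own values. A minimal sketch, with made-up colors and names:

    # One shared definition, imported by every screen.
    ABNORMAL_HIGH = "#FF3B30"   # the one red used for every abnormally high value
    ABNORMAL_LOW  = "#007AFF"   # the one blue used for every abnormally low value
    NORMAL        = "#4CD964"

    NAV_PANEL_SIDE = "left"     # navigation lives on the same side of every screen

    def value_color(value, low, high):
        """Map a reading to the shared palette so every screen agrees."""
        if value > high:
            return ABNORMAL_HIGH
        if value < low:
            return ABNORMAL_LOW
        return NORMAL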

Consistency is closely related to clarity.

Color is a Great Way to Distinguish One Property from Another, As Long As it Is Consistently Applied.


Feedback

An operator should be fully confident that the choices they make are having the desired effect. Screens should be designed in a way that provides information, putting relevant data in the proper context. Also, important actions that carry significant consequences should have confirmation mechanisms to ensure that they are not activated inadvertently.
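A confirmation mechanism can be as simple as a wrapper that refuses to run a consequential action without an explicit second step. The sketch below uses a console prompt for brevity; a real HMI would present a dialog or require a deliberate two-step gesture. The pump and function names are hypothetical.

    def confirm(prompt):
        return input(prompt + " Type YES to proceed: ").strip() == "YES"

    def shutdown_pump(pump_id):
        # Stand-in for the real control write.
        print("Pump " + pump_id + " shut down.")

    def guarded(action, description):
        """Wrap an action so a single stray click cannot trigger it."""
        def wrapper(*args, **kwargs):
            if confirm("About to: " + description + "."):
                result = action(*args, **kwargs)
                print("Done.")   # immediate feedback that the action took effect
                return result
            print("Cancelled; nothing was changed.")
        return wrapper

    shutdown = guarded(shutdown_pump, "shut down pump P-101")
    # shutdown("P-101") would now prompt before acting.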

Controls should function consistently in all situations. If something is not working as it should, that fact should be immediately obvious and undeniable. In a well-designed system, design principles are employed to reduce user fatigue.

There are obviously many other important factors to consider when developing a real-time visualization system. Anyone who wants to dig deeper is encouraged to read this free whitepaper on the subject:

Click here to read it

3 Reasons You Should Consider Giving Your Process Operators Mobile Devices


That’s right. It’s time to own up to the fact that the majority of us are using phones and tablets to do business every day. We buy, sell, trade, learn, teach, and do all manner of horrible and wonderful things that we have always done (no, not everyone does horrible things, but don’t act like the things you do are always so wonderful either) – all with the aid of portable devices that allow us to move freely about our lives without being tethered to a desk chair.

Why, then, is it so difficult for some people to recognize that our industrial process operators and technicians – who are so often stuck behind a stationary HMI or calling from the field to speak with someone who is – would be far better equipped to do their jobs if only they were afforded the same conveniences they afford themselves in their lives outside of work?

I know there are concerns about security – about opening some digital wormhole through which all sorts of nefarious activity could be invited. There are concerns about ill-intentioned deviants having potential access to sensitive process data – which is not only proprietary, but often essential to our infrastructure – as well there should be. But it’s not as if these potential problems didn’t exist before mobile devices, and while some concerns are certainly valid, mobile devices provide a number of key benefits and opportunities that cannot be ignored:


  • For Remote Management of Disparate Assets
    This one seems pretty obvious, but imagine the amount of time that could be saved by not having to manually inspect field equipment or call back to the control station every time there is a simple question.
  • For Constant Access to a Portable Media Viewer
    How can you ensure that operators and techs always have access to the latest work masters, training videos, etc.? Upload or edit a document and make your changes instantly available to all relevant parties – regardless of where they are or what they’re doing.
  • For Instant Access to Forms and Form Data
    Create Purchase Orders or close Work Requests from anywhere. Assign new owners or upload a picture you just snapped and attach it to a Job. The possibilities are nearly unlimited.


Sure, there are only three benefits listed here, but without much thought I’m sure you could think of a few more. Let me know in the comments below.

And for some additional food for thought, check out this white paper on “The Benefits of Mobile HMIs” and tell me I’m not absolutely right about this:

Download White Paper

Object Virtualization: Digitizing the World

Object virtualization is the enabling technology behind the IoT (Internet of Things)

We are changing our world. With the advent of new sensing and communication technologies, we are finding ways of making everyday objects more intelligent and connected. As we connect more and more things to one another, however, we are finding a need to democratize the process. We have to make different things the same, or at least equal. We are still trying to answer the Mad Hatter’s famous riddle: How is a raven like a writing desk?

Though Alice’s time in Wonderland may have come and gone, ours is just beginning. While we may not be connecting ravens to writing desks (though nothing would surprise me at this point), we do have a need to connect seemingly unrelated objects in new ways.

One solution to this dilemma is the process of object virtualization. By creating virtual models, or representations, of the things you want to monitor and manage, you are putting ‘things’ on equal footing, creating new opportunities for analysis and task automation.

To understand object virtualization, consider the contact list in your phone. A contact can be thought of as a virtual model of an actual person. It is something like a digital identity. Imagine you have a contact named Mary Smith. Mary has a name, a phone number (or two), an email address, maybe a photo. Mary can have a Facebook profile, a Twitter alias – you can even assign Mary a special ringtone. All of these things combine to create a virtual model of Mary stored in your phone.

Now, to make your model of Mary a bit more intelligent and useful, you could add her date of birth, her hair color, her favorite book, her pet cat’s name, or any number of different properties of Mary. If we slapped a bunch of sensors on Mary, we may know things like her current location, current body temperature, her heart rate, her blood pressure. If this information is communicated to your model in real time, you have an active, living representation of Mary that tells you more about her than she may know herself.
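In software terms, such a model is just a record: a set of static properties plus a set of live properties that update as sensor data arrives. A minimal Python sketch of the idea (the class and fields are illustrative, not any product’s schema):

    from dataclasses import dataclass, field

    @dataclass
    class VirtualPerson:
        name: str
        phone: str
        email: str
        live: dict = field(default_factory=dict)   # sensor-fed properties

        def update(self, prop, value):
            """Feed a new sensor reading into the model."""
            self.live[prop] = value

    mary = VirtualPerson("Mary Smith", "555-0100", "mary@example.com")
    mary.update("location", (28.05, -82.43))
    mary.update("heart_rate", 72)
    print(mary.live)   # a living representation, not just a static record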

Imagine applying this same process to your house, your car, your toaster, or your favorite pair of socks. Now, maybe you can’t think of a good reason for your socks to talk to your toaster, but they may have a thing or two to share with your washing machine. And maybe your house and your toaster can have a nice conversation about lowering your electric bill. Of course, your things aren’t just talking to your other things. They can talk to other things anywhere. Do you think it might be helpful for your air conditioning system to know something about today’s weather forecast? Or for your car to know about that new road construction on your way to work?

Your virtualized house doesn’t care that it’s a house. It may as well be an elephant or a water balloon. The same is true of your car, your refrigerator, or your lawn sprinklers. Virtual models can share information with other virtual models without regard for where the data is coming from or how it got there. Virtualization can make every “thing” accessible to every other “thing”, and ultimately to you.

B-Scada’s VoT (Virtualization of Things) Platform allows you to create virtual models using data from multiple and disparate sources, providing a simple platform for creating powerful and intelligent IoT (Internet of Things) applications. Learn more at http://votplatform.com.

Information Modeling as a Tool for Collaboration

In the spirit of the upcoming holiday season, let’s take a moment to examine one of the greatest and most appreciable qualities of a healthy organization: collaboration. In a world so full of information, where we are all so busy and so pressed for time, it seems collaboration has become something done more out of necessity than out of a desire for quality and efficiency.

Some of this reality may be due to the fact that there simply are no good tools for collaboration in the modern workplace. Sure, we have email and teleconferencing, web meetings and text messages – but for all of our technology, our endless need to compartmentalize and segment our business processes has left us no closer to a model of organic collaboration than we were in the past.

With relevant information stored in separate silos, decision-makers are still forced to rely on reports and statistics compiled from historical data and interpreted to support a specific agenda. There has really been no truly organic means of analyzing real-time data alongside historical data. Likewise, the available tools for integrating data from separate systems are limited in their ability to create a real-time context and to display the appropriate data to decision-makers at the speed with which decisions must often be made.

While these tools may be useful for looking back and analyzing what has happened, it is another matter altogether when trying to look forward to make plans or predict outcomes.

Information Modeling

One of the ways this challenge can be overcome is by using an information model to organize and structure your organization’s data in a way that provides context and clarity in real time. Information modeling allows assets to be associated with all relevant information – regardless of where that information may reside.

For instance, a motor on your plant floor can have live data related to its RPM, temperature, throughput, or other process data – as well as a commission date, a maintenance schedule, troubleshooting documents, and training videos. Properties of this motor can also include OEE (Overall Equipment Effectiveness), Net Asset Value, or other performance and resource planning metrics. Some of this data may come from PLCs, some from databases like SQL Server, some from user input, and some from programmed calculations. In this situation, it is not important how the data is generated or where it is stored. What is important is that it can be visualized at any time in whatever way suits your collaborative needs.
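A rough sketch of what such a model might look like in code: every property carries both its value and its source, so consumers of the model never need to care where a value originated. The names below are illustrative only, not a B-Scada schema.

    from dataclasses import dataclass

    @dataclass
    class Property:
        value: object
        source: str   # "plc", "sql", "user", "calculated", ...

    motor = {
        "rpm":             Property(1780, "plc"),
        "temperature_c":   Property(64.2, "plc"),
        "commission_date": Property("2014-03-11", "sql"),
        "maintenance_due": Property("2016-09-01", "sql"),
        "manual_url":      Property("http://example.com/motor-manual.pdf", "user"),
    }

    # A calculated property derived from the others (simplified load metric).
    motor["load_factor"] = Property(motor["rpm"].value / 1800.0, "calculated")

    # Consumers read properties uniformly, whatever their origin.
    for name, prop in motor.items():
        print(name + ": " + str(prop.value) + " (from " + prop.source + ")")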

There are a number of different tools that can be used to create an information model for your organization. A few things to consider when choosing an information modeling tool:

  • Does the modeling software take into account both real-time AND historical data?
  • Does the modeling software allow you to include ALL relevant information from every source?
  • Is your modeled data logged in a relational database like SQL Server so it can be queried if additional information is needed? (See the sketch after this list.)
  • Does your modeling software provide the tools you need to visualize your data in a useful way that supports decision-making?
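On the third point, the value of relational logging is that modeled data stays queryable long after the live screen has moved on. A minimal sketch, using Python’s built-in sqlite3 as a stand-in for SQL Server:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE readings (asset TEXT, prop TEXT, ts TEXT, value REAL)")
    conn.executemany(
        "INSERT INTO readings VALUES (?, ?, ?, ?)",
        [
            ("motor-7", "temperature_c", "2016-05-01T10:00", 61.5),
            ("motor-7", "temperature_c", "2016-05-01T11:00", 64.2),
            ("motor-7", "temperature_c", "2016-05-01T12:00", 71.8),
        ],
    )

    # An analyst's ad-hoc question, answered without touching the live system.
    row = conn.execute(
        "SELECT MAX(value) FROM readings WHERE asset = ? AND prop = ?",
        ("motor-7", "temperature_c"),
    ).fetchone()
    print("peak temperature: " + str(row[0]) + " C")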
Before you jump into a new software product and a new data management system, do some homework. As with everything, there are pros and cons to the different products available.

The Integrated Enterprise – Are We Ready?


There are many barriers to change in a commercial enterprise, and most of them start with a dollar sign. You are comfortable with what you’re doing. Your staff is comfortable. Sure, there may be some missed opportunities, but perfection is unrealistic. Implementing enterprise-wide changes to something like your data management strategy would require cooperation across multiple departments, absorb numerous man-hours, and take an unknown amount of time before all parties regain the level of comfort they feel today. Is it worth it? How long will it take to recover the investment?

There are many legitimate questions to ask when considering whether or not to move toward an integrated data management strategy. How do we calculate the true cost of making such a change? A question that is very rarely asked is: What is the true cost of not making such a change?

First, let’s consider some of the reasons in favor of data integration.

Inconsistent data

One of the problems addressed by data integration is inconsistency between data on the plant floor and the business data further upstairs. Depending on the type of business, different departments typically have different goals and criteria for success. The plant floor supervisor wants to know where his products are; the executive upstairs wants to know how much his products are worth. Here is a case where we have different people querying for different bits of information about the same asset. Over time, the different goals and process definitions have led to departments using the same terms to describe different things, and different terms to describe the same things. This barrier to departmental collaboration in the manufacturing industry, for example, has led to the development of standards like ISA 95 to help facilitate the integration of manufacturing systems with business systems.

Redundant data

Another common condition is the tendency for different departments or divisions to have different ways of recording information about the same things. It is not at all unusual for large organizations to have multiple records of the same asset. For instance, if we imagine a particular production unit from the perspective of the plant floor operator, he will need to have information about where it is in the production process, its quality, the personnel involved in its production and testing, and when it will be shipping. At the same time, a manager will want to have information about how much it cost to produce this unit, how many units will be produced today, and how much we will get for it. We now have a situation where we are capturing and recording separate sets of data about the same thing.

Fewer Human Resources

This one seems obvious, but it is a significant difference-maker when you analyze your bottom line. Making it easier to find needed data will allow personnel to spend more time focusing on other aspects of their jobs. It will allow for faster decisions and more immediate response to abnormal conditions. Your plant floor supervisor won’t have to make that call upstairs to find out why today’s production schedule has changed, or log in to a separate system to find out when a piece of equipment was last inspected. And the manager upstairs won’t have to call downstairs to find out why we are behind schedule today, or what happened to that shipment that was supposed to go out. Having the ability to quickly assess a situation leads to better-informed decisions made more quickly and with more immediate results.

Reduced Risk

While we are on the topic of making informed decisions more quickly, this is a good time to consider the way that decisions are currently made in many enterprises. When a decision needs to be made quickly, and the data that could support that decision is not available as quickly as the decision is needed, owners and executives are left to make decisions based on intuition. Studies have suggested that about 80% of decisions are made this way. It may work and it may not. Having the right information when and where it is needed can significantly reduce the risk involved in the decision-making process.

There are many additional benefits that can be attributed to data integration. New business opportunities can be revealed. New calculations can be used to improve efficiency and coordinate processes. Inventory management, energy consumption, and supply chain scheduling can all be improved. Whether you choose a system of data virtualization to integrate key data from different divisions, a system of data federation to consolidate all enterprise data, or a complete data integration solution that re-engineers your entire data system, the benefits are very real – and yes, so is the cost. The cost, however, is a short-term loss for a long-term gain; a temporary pain for permanent growth.

So, to revisit the topic of this article: Are we ready for the integrated enterprise? The answer is irrelevant. Those who are ready will continue to prosper. Those who are not will lose the ability to compete, and will ultimately have to get ready or get out of the way.

For more information on how you can integrate and visualize your business’s data, visit: www.scada.com