3 Keys to Effective Real-Time Data Visualization

There are several important factors to consider when creating your real-time data visualization, many of which will depend on your intended application. Today, we consider a few of the general factors that will play a role in every visualization you create. These three factors are clarity, consistency, and feedback.

Clarity

Real-time graphics should emphasize pertinent information and use design principles that promote ease of use and accessibility above aesthetics. Things like size, color, and brightness can be used to distinguish primary details from secondary and tertiary details. Special graphics can be created to emphasize different information under different conditions (e.g. a special set of graphics to be used when a certain alarm is triggered).


When planning a real-time visualization scenario, it is very important to consider who will be using the visualization and what their purpose is in viewing this data. This will obviously vary from one organization to the next, but when differentiating between primary, secondary, and tertiary information, it is important to think not in terms of what is important about the thing being monitored, but what is important to the person doing the monitoring.

Consistency

Consistent visualizations are standardized and consistently formatted. Interaction requires a minimum of keystrokes or pointer manipulations. In fact, whenever possible, all relevant information should be visible without the need to navigate to another screen. When navigation is necessary, be certain that elements of the user interface related to navigation are clearly distinguished from elements that relay pertinent information. Additionally, navigation and interaction of any type should be as easy and intuitive as possible.


The ergonomic needs of the user are also extremely important. Poor data visibility has been cited as a primary cause of many industrial accidents where a process was being monitored or controlled through a real-time HMI (Human Machine Interface). In fact, poorly designed HMIs have been blamed for accidents that have led to millions of dollars in damaged equipment and some very unfortunate and unnecessary deaths.

 

A recent study by OSHA in Europe compiled statistics on HMI-related errors in the workplace. Interestingly, research shows that the majority of problems are caused by human error, but not entirely because of mental and physical fatigue. More often, errors are caused by poor decision-making related to the way that information is processed.

  

Feedback

An operator should be fully confident that the choices they make are having the desired effect. Screens should be designed in a way that provides information, putting relevant data in the proper context. Also, important actions that carry significant consequences should have confirmation mechanisms to ensure that they are not activated inadvertently.

Controls should function consistently in all situations. If something is not working as it should, that fact should be immediately obvious and undeniable. Again, in a well-designed system, design principles are employed to promote clarity and simplicity, and to reduce user fatigue.

Keep it simple and straight-forward. Save the complex visual tools for historical data or real-time reporting. There is certainly a place for all of this, but that place is not where real-time data is being used to make real-time decisions.

Learn more in the free whitepaper “Real-Time Data Visualization Essentials”:

http://scada.com/Content/Whitepapers/Real-Time%20Data%20Visualization%20Essentials.pdf

 

 

The Four Biggest Challenges to Enterprise IoT Implementation


After endless cycles of hype and hyperbole, it seems most business executives are still excited about the potential of the Internet of Things (IoT). In fact, a recent survey of 200 IT and business leaders conducted by TEKsystems® and released in January 2016 (http://www.teksystems.com/resources/pressroom/2016/state-of-the-internet-of-things?&year=2016) determined that 22% of the organizations surveyed have already realized significant benefits from their early IoT initiatives. Additionally, a full 55% expect a high level of impact from IoT initiatives over the next 5 years. Conversely, only 2% predicted no impact at all.

Respondents also cited the key areas in which they expect to see some of the transformational benefits of their IoT efforts, including creating a better user and customer experience (64%), sparking innovation (56%), creating new and more efficient work practices and business processes (52%), and creating revenue streams through new products and services (50%).

The IoT is Expected to Impact Organizations in Numerous Ways

So, with the early returns indicating there are in fact real, measurable benefits to be won in the IoT, and the majority of executives expecting these benefits to be substantial, why are some organizations still reluctant to move forward with their own IoT initiatives?

As could be expected, security is the biggest concern, cited by approximately half of respondents.

Increased exposure of data/information security – 50%

With the World Wide Web as an example, people today are well aware of the dangers inherent in transmitting data between nodes on a network. With many of these organizations working with key proprietary operational data that could prove advantageous to a competitor if exposed, the concern is very understandable.

 

ROI/making the business case – 43%

This is a classic example of not knowing what you don’t know. Without an established example of how similar initiatives have impacted your organization in the past – or even how similarly sized and structured organizations have been impacted – it can be very difficult to demonstrate in a tangible way exactly how these efforts will impact the bottom line. Without being able to make the business case, it will be difficult for executives to sign off on any new initiatives. This is likely why larger organizations ($5+ billion in annual revenue) are much more likely to have already implemented IoT initiatives, while smaller organizations are still in the planning phase.

 

Interoperability with current infrastructure/systems – 37%

Nobody likes to start over, and many of the executives surveyed work for organizations that have made enormous investments in the technology they are currently using. The notion of a “rip and replace” type of implementation is not very appealing. The cost is not only the downtime incurred in these cases, but also the waste associated with the expensive equipment and software systems being cast aside. In most cases, to gain any traction at all a proposed IoT initiative will have to work with the systems that are already in place – not replace them.

Finding the right staff/skill sets for IoT strategy and implementation – 33%

With the IoT still being a fairly young concept, many organizations are concerned that they lack the technical expertise needed to properly plan and implement an IoT initiative. There are many discussions taking place about how much can be handled by internal staff and how much may need to be outsourced. Without confidence in their internal capabilities, it is also difficult for organizations to know whether they even have a valid strategy or understanding of the possibilities. Again, this is a case where larger organizations with larger pools of talent have an advantage.

The full results break down like this:

Many Organizations are Hesitant to Invest Much in IoT Initiatives at this Stage

These are valid concerns, and not all of them lend themselves to simple solutions. In truth, many of the solutions will vary from one organization to the next. However, in many cases the solution could be as simple as choosing the right software platform. Finding a platform that eases your concerns about interoperability can also help ease your concerns about whether or not your staff can handle the change, as there will be no need to replace equipment. Likewise, a platform that can be integrated seamlessly into your current operations to help improve efficiency and implement optimization strategies will also make it much easier to demonstrate ROI.

B-Scada has released a new whitepaper on choosing the right IoT platform for your project. If you’re thinking about taking that leap into the IoT, it’s well worth the read.

Read It Now

Oh, The Possibilities … When the IoT Grows Up



The Internet of Things is something like a gangly, acne-covered adolescent with knobby knees and a clumsy gait.
We can see the bright eyes, the long legs and strong hands, and we know it is chock full of “potential”, but it sure is awkward right now.

Notwithstanding all of this awkwardness, however, this clumsy youngster has already made a tremendous difference in the world. The very thought of its possibilities has sent a tremor to the core of our civilization, touching every aspect of our material and intellectual lives. Just consider the fact that the sentence you just read – as blustery and over-the-top as it may seem – is not even inaccurate. Sure, a person can still live a simple life without all of the trappings of modern technology or communication media (I assume?), living only from sustenance won by his or her bare hands directly from the natural world, never interacting with another living soul. I suppose this is possible, and maybe this person could make a strong case that his/her life remains untouched in any way by the Internet of Things. This person, however, will not be reading this and need not be a part of the conversation.

So, to reiterate: the Internet of Things – or at least the thought of it – is influencing every component of our world today. This is because it is not simply an evolution of technology; it is the sort of technological/philosophical movement that transforms civilizations. On the order of agriculture, kingship, or industrialization.

Yes, it is that significant.

That is to say, the technological/philosophical movement started by the Internet itself is that significant. After all, the words, images, videos, and applications that inhabit the regular old Internet are themselves ‘things’. The concept behind what we call the Internet of Things is simply the dawning of the realization that the Internet is not just about people communicating with people; it’s about everything communicating with everything.

Consider what we already see happening to:

 

Cities

In Oskarshamn, Sweden, smart building technology has helped reduce the city’s power consumption by 350 MWh, reducing its carbon footprint by 80 tons of CO2. Houston, Texas has used new sensing technology to retrofit 40 municipal buildings for energy efficiency, delivering $3 million in yearly energy and water savings.

Entire cities are changing the way they govern their populations, the way they distribute resources, the way they police themselves. Cities are changing the way they transport goods and people, the way they measure and control their impact on their environments. Everything that defines what a city is and does is being transformed not just by new technologies, but by the new ideas inspired by the Internet of Things.

 

Agriculture

One of the foundational elements of civilization, a technological/philosophical movement that predates history itself, is being profoundly influenced by the Internet of Things. Farmers large and small are using networked data to maximize the already-known benefits of established practices (knowing what types of crops to plant when, knowing when and how much to water, etc.). Farmers have also had success safely and naturally controlling pests through the intelligent release of pheromones. Decreased resource consumption and increased yield are very tangible benefits that have the potential to solve some very serious problems related to food shortages and ever-increasing populations, while simultaneously reducing the environmental impact of farming and bringing the family-owned farm back into the global marketplace.

 

Industry

This is the realm of autonomous factories and self-healing machines. Through the convergent development of advanced computing power, sophisticated network technology, sensors, robotics, and analytic techniques, we are seeing the integration of industrial systems both vertically and horizontally. Machine-to-Machine communication, predictive maintenance, and continuous improvement programs are completely reinventing manufacturing.

Companies like Honda and ABB are using IoT technology to consolidate and organize their manufacturing and maintenance operations through systems of real-time communication and process automation. Companies are using advanced analytics to discover unknown opportunities for improved efficiency. Consider how Kennametal reduced their production cycle time by as much as 40% through simple modifications to their processes, like changing the angle of a cut in a particular machining operation.

Real-time consumer data is helping companies be more responsive to the needs and expectations of their customers, and eliminating gaps between supply and demand. Predictive analysis is helping to reduce maintenance costs and incrementally improve production processes through systems of continual improvement. A unique quality of the impact the Internet of Things is having on industry is that its benefits extend beyond the marketplace. Whereas previously profit increases were sought by increasing the scale or speed of production, the new paradigm focuses on increasing efficiency, reducing resource consumption, and eliminating waste. The new industrial landscape of smart, connected devices will incidentally lead to a cleaner, safer, more sustainable planet, which leads to the next item…

 

Environment

It is certainly possible to see new technologies as a double-edged sword in this arena. Historically, what humankind has deemed to be good for itself has quite often proven detrimental to our environment. As the Internet of Things makes it easier for us to feed and accommodate larger populations, and populations continue to grow, it is not difficult to see how this could negatively impact the environment. An interesting quality of the philosophical thrust behind most Internet of Things initiatives, though, is the tendency toward reduction and conservation. Use fewer resources. Create less waste. Do as much as possible with what is available to us. In a way that may be unprecedented, this worldwide technological evolution may actually improve our relationship with the natural world.

 

Yes, the Internet of Things is a gangly, awkward, stumbling bunch of possibilities right now, but it is already changing our world. And while we may not have reached that tipping point yet – the point where what is possible becomes what is necessary, and a movement truly transforms our civilization – I think most of us can feel the axis tilting.

There will inevitably come a time when what is happening becomes what has happened, and we will only recognize its revolutionary quality in retrospect. In the case of the Internet of Things, I think we have reason to be optimistic.

(Originally published on the B-Scada, Inc. blog.)

3 Keys to Effective Real-Time Data Visualization

Everybody appreciates the value of a good picture. Each one says a thousand words, after all, or so the saying goes. If we really dig in to this metaphor, we’d probably admit that some pictures say substantially more than that – while others probably come in well under a dozen (just look at a random Facebook wall for some examples).

Ours has become a very visual culture, and one occupying a place in time defined by an overwhelming abundance of information all around us. Considering these two facts, it is not at all surprising that we see such an increased interest in data visualization – that is to say, the process of placing a specific set of data points in a highly visual context that allows them to be quickly consumed and analyzed.

It’s not a new concept; data has been visualized in pictures for centuries. A map is a type of data visualization, for instance, as are the many charts and graphs that have been used since the end of the 18th Century. What is new is the massive quantity of data available to nearly everyone, and the wide array of tools that can be used to create compelling visualizations. Think about the cool infographic you saw the other day. Was it created painstakingly over several days of carefully reviewing ethnographic data compiled by a dogged scientist over the course of his career? Maybe, but probably not. It was more likely created by some marketing department somewhere (not that there’s anything wrong with that) using somebody else’s data and somebody else’s visualization tools.

The purpose of this post, though, is not to discuss the merits of data visualization in general, but rather the specific subset of data visualization that deals with real-time data. This is a completely separate species of data visualization and should be treated as such.

Real-time data visualization refers to visualization of data that is continuously updated as new data is generated by connected devices or people. This is the type of data that is used to make real-time decisions and, when done correctly, can truly transform business processes.
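To make “continuously updated” concrete, here is a minimal Python sketch of the idea; the class and method names are invented for illustration and are not tied to any particular product:

```python
# Illustrative sketch only: a real-time display redraws the moment a new
# reading arrives, rather than waiting for a user to refresh.

class LiveDisplay:
    def __init__(self):
        self.rendered = None  # what is currently shown on screen

    def on_new_reading(self, tag, value):
        # Called for every update pushed by a connected device or person.
        self.rendered = f"{tag}: {value}"

display = LiveDisplay()
display.on_new_reading("tank_level", 73.2)  # screen now shows "tank_level: 73.2"
```

In a real system the readings would arrive over a subscription or polling connection to the data source; the essential point is that the display tracks the source, not a snapshot.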

There are a number of important factors to consider when attempting to visualize data in real time, but we will focus on three simple and obvious keys: clarity, consistency, and feedback.

 

Clarity

Real-time graphics should emphasize pertinent information and use design principles that promote ease of use and accessibility above aesthetics. Things like size, color, and brightness can be used to distinguish primary details from secondary and tertiary details. Special graphics can be created to emphasize different information under different conditions (e.g. a special set of graphics to be used when a certain alarm is triggered).

 

Hierarchical Data Makes its Relevance Obvious

Clear visualizations provide actionable information at a glance, and clearly show the current process state and conditions. Alarms and indicators of abnormal conditions are prominent and impossible to ignore.
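As a rough Python illustration of this kind of emphasis; the thresholds, colors, and sizes here are invented assumptions, not standards:

```python
# Illustrative sketch only: map a reading's state to a display style so that
# abnormal conditions dominate the screen.

def display_style(value, low=10.0, high=90.0):
    """Return (color, font_size): louder styling for more urgent states."""
    if value < low or value > high:
        return ("red", 24)     # primary: alarm state, largest and brightest
    if value < low * 1.1 or value > high * 0.9:
        return ("amber", 18)   # secondary: approaching a limit
    return ("gray", 12)        # tertiary: normal, visually quiet
```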

Clarity encompasses both content and context.

Contextual Controls Allow You to Assess Current Conditions at a Glance

 

Consistency

Consistent visualizations are standardized and consistently formatted. Interaction requires a minimum of keystrokes or pointer manipulations.

Shapes, colors, and layouts should be used consistently throughout all screens. If the color red is used to designate an abnormally high value on one screen, that same red should be used to indicate all abnormally high values of the same type on all screens. If navigation buttons are on the left side of one screen, they should be on the left side of all screens. A consistent visualization system is arranged in a logical, hierarchical manner, allowing operators to see both a general overview of the system and more detailed information on individual components as needed. Navigation and interaction of any type should be as easy and intuitive as possible.
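That rule can be sketched in a few lines of Python; the tag names, limits, and colors here are invented for illustration:

```python
# Illustrative sketch only: alarm colors and per-tag limits live in one shared
# table, so every screen that renders these tags uses identical styling.

STYLE = {"high_alarm": "red", "low_alarm": "blue", "normal": "green"}

LIMITS = {
    "boiler_temp": (20.0, 80.0),   # (low, high) in degrees C
    "line_pressure": (1.0, 5.0),   # (low, high) in bar
}

def color_for(tag, value):
    """Every screen calls this, so red always means the same thing."""
    low, high = LIMITS[tag]
    if value > high:
        return STYLE["high_alarm"]
    if value < low:
        return STYLE["low_alarm"]
    return STYLE["normal"]
```

Centralizing the style table means a change to one color or limit propagates to every screen at once, which is exactly what consistency requires.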

Consistency is closely related to clarity.

Color is a Great Way to Distinguish One Property from Another, As Long As it Is Consistently Applied.

 

Feedback

An operator should be fully confident that the choices they make are having the desired effect. Screens should be designed in a way that provides information, putting relevant data in the proper context. Also, important actions that carry significant consequences should have confirmation mechanisms to ensure that they are not activated inadvertently.
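Here is one possible Python sketch of such a confirmation mechanism, using an invented staged-command design; real systems would add cancellation and timeouts:

```python
# Illustrative sketch only: a consequential command is staged, explicitly
# confirmed, and then reports its outcome, so the operator always gets
# feedback and cannot trigger the action inadvertently.

class ConfirmedControl:
    def __init__(self, action):
        self.action = action   # callable that actually performs the command
        self.pending = None

    def request(self, command):
        # Stage the command; nothing happens until it is confirmed.
        self.pending = command
        return f"Confirm: {command}?"

    def confirm(self):
        if self.pending is None:
            return "nothing to confirm"
        result = self.action(self.pending)
        self.pending = None
        return result
```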

Controls should function consistently in all situations. If something is not working as it should, that fact should be immediately obvious and undeniable. In a well-designed system, design principles are employed to reduce user fatigue.

There are obviously many other important factors to consider when developing a real-time visualization system. Anyone who wants to dig deeper is encouraged to read this free whitepaper on the subject:

Click here to read it

How To Improve Any Business Process


If you are responsible for managing a business or organization of any type, you have undoubtedly sought out opportunities to make things run more smoothly and efficiently. It’s only natural. Responsible owners and managers are continually looking for opportunities to optimize their business processes.

How about some free advice?

First of all, let’s be clear about what we’re referring to when we use the term ‘business process’. In short, a business process is a collection of linked tasks that culminate in the delivery of a service or product to a client. It has also been defined as a set of activities and tasks that – once completed – will accomplish an organizational goal.

Any business (regardless of how poorly it may be run) employs some type of business process. Some are clearly better than others.

What we refer to as Business Process Management (BPM) can be defined as the set of techniques employed to map the flow of information and communication between various business assets and departments, identify opportunities for improvement, and establish and enforce rules to optimize the process moving forward. These techniques can (and should) be employed continually.

A BPM system can provide any company with several measurable benefits:

  • The ability to identify otherwise unknown inefficiencies
  • Reduced downtime and cost associated with wasted time and material
  • The ability to connect processes over multiple facilities and/or operations
  • Automation of repeated and/or predictable tasks
  • Establishment of a program for continual improvement

These benefits are very attainable. Provided you use the right tools and follow a simple procedure, you can realize the improved efficiency and reduced waste that BPM systems provide. And what is the correct procedure? In very simple terms:

  1. Analyze Current Processes
    Create a business process map to paint a clear picture of the current flow of information between different business assets. Use this map to uncover inefficiencies and establish a preferred methodology.
  2. Establish and Enforce New Rules
    Define rules for how you would like information to flow, and create workflow tasks to automate tasks or send automatic notifications to people that need to be involved in enforcing the new rules.
  3. Implement, Train, Rinse and Repeat
    Once the new process is clearly defined and automated, ensure that all parties are fully trained and equipped to adhere to these new rules moving forward. You can create custom dashboards to track real-time data, create a centralized knowledge base that is shared and continually updated, and use automated real-time notifications to be sure that everyone is always aware of the current state of the process. Finally, ensure that your new process is fully repeatable and scalable to allow for continual evaluation and improvement.
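Step 2 might look something like this in Python; the rule, the task names, the recipient, and the 24-hour limit are all invented for illustration, not part of any specific BPM product:

```python
# Illustrative sketch only: a workflow rule that escalates a stalled task
# with an automatic notification.

notifications = []

def notify(recipient, message):
    # Stand-in for the email/SMS/dashboard alerts a real BPM system would send.
    notifications.append((recipient, message))

def check_task(task_name, elapsed_hours, max_hours=24):
    """Rule: any task stalled past its limit triggers an escalation."""
    if elapsed_hours > max_hours:
        notify("supervisor", f"{task_name} stalled for {elapsed_hours}h")
        return "escalated"
    return "on schedule"
```

The value of encoding the rule is that enforcement no longer depends on someone remembering to check; the process polices itself and the right people are told automatically.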

Seems pretty simple, right? It can be when you combine your innate understanding of your business process with the right tools.

 

**Learn more about some of the data acquisition and visualization technology that empowers Business Process Management at http://scada.com

Object Virtualization: Digitizing the World

Object virtualization is the enabling technology behind the IoT (Internet of Things)

We are changing our world. With the advent of new sensing and communication technologies, we are finding ways of making everyday objects more intelligent and connected. As we connect more and more things to one another, however, we are finding a need to democratize the process. We have to make different things the same, or at least equal. We are still trying to answer the Mad Hatter’s famous riddle: How is a raven like a writing desk?

Though Alice’s time in Wonderland may have come and gone, ours is just beginning. While we may not be connecting ravens to writing desks (though nothing would surprise me at this point), we do have a need to connect seemingly unrelated objects in new ways.

One solution to this dilemma is the process of object virtualization. By creating virtual models, or representations, of the things you want to monitor and manage, you are putting ‘things’ on equal footing, creating new opportunities for analysis and task automation.

To understand object virtualization, consider the contact list in your phone. A contact can be thought of as a virtual model of an actual person. It is something like a digital identity. Imagine you have a contact named Mary Smith. Mary has a name, a phone number (or two), an email address, maybe a photo. Mary can have a Facebook profile, a Twitter alias – you can even assign Mary a special ringtone. All of these things combine to create a virtual model of Mary stored in your phone.

Now, to make your model of Mary a bit more intelligent and useful, you could add her date of birth, her hair color, her favorite book, her pet cat’s name, or any number of different properties of Mary. If we slapped a bunch of sensors on Mary, we may know things like her current location, current body temperature, her heart rate, her blood pressure. If this information is communicated to your model in real time, you have an active, living representation of Mary that tells you more about her than she may know herself.
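In Python, the idea might be sketched as a simple property bag; all names and values here are invented for illustration:

```python
# Illustrative sketch only: a virtual model is a named collection of
# properties, and it does not care whether a value is a static fact or a
# live sensor reading.

class VirtualModel:
    def __init__(self, name):
        self.name = name
        self.properties = {}

    def set(self, key, value):
        # The source of the value (sensor, database, user entry) is
        # irrelevant to the model itself.
        self.properties[key] = value

    def get(self, key):
        return self.properties.get(key)

mary = VirtualModel("Mary Smith")
mary.set("phone", "555-0100")   # static contact data
mary.set("heart_rate", 72)      # live reading from a wearable sensor
```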

Imagine applying this same process to your house, your car, your toaster, or your favorite pair of socks. Now, maybe you can’t think of a good reason for your socks to talk to your toaster, but they may have a thing or two to share with your washing machine. And maybe your house and your toaster can have a nice conversation about lowering your electric bill. Of course, your things aren’t just talking to your other things. They can talk to other things anywhere. Do you think it might be helpful for your air conditioning system to know something about today’s weather forecast? Or for your car to know about that new road construction on your way to work?

Your virtualized house doesn’t care that it’s a house. It may as well be an elephant or a water balloon. The same is true of your car, your refrigerator, or your lawn sprinklers. Virtual models can share information with other virtual models without regard for where the data is coming from or how it got there. Virtualization can make every “thing” accessible to every other “thing”, and ultimately to you.

**B-Scada’s VoT (Virtualization of Things) Platform allows you to create virtual models using data from multiple and disparate sources, providing a simple platform for creating powerful and intelligent IoT (Internet of Things) applications. Learn more at http://votplatform.com.

Information Modeling as a Tool for Collaboration

In the spirit of the upcoming holiday season, let’s take a moment to examine one of the greatest and most appreciable qualities of a healthy organization: collaboration. In a world so full of information, where we are all so busy and so pressed for time, it seems collaboration has become something done more out of necessity than out of a desire for quality and efficiency.

Some of this reality may be due to the fact that there simply are no good tools for collaboration in the modern workplace. Sure, we have email and teleconferencing, web meetings and text messages – but for all of our technology, our endless need to compartmentalize and segment our business processes has left us no closer to a model of organic collaboration than we were in the past.

With relevant information stored in separate silos, decision-makers are still forced to rely on reports and statistics compiled from historical data and interpreted to support a specific agenda. There has really been no truly organic means of analyzing real-time data alongside historical data. Likewise, the available tools for integrating data from separate systems are limited in their ability to create a real-time context and to display the appropriate data to decision-makers at the speed with which decisions must often be made.

While these tools may be useful for looking back and analyzing what has happened, it is another matter altogether when trying to look forward to make plans or predict outcomes.

Information Modeling

One of the ways this challenge can be overcome is by using an information model to organize and structure your organization’s data in a way that provides context and clarity in real time. Information modeling allows assets to be associated with all relevant information – regardless of where that information may reside.

For instance, a motor on your plant floor can have live data related to its RPM, temperature, throughput, or other process variables – as well as a commission date, a maintenance schedule, troubleshooting documents, and training videos. Properties of this motor can also include OEE (Overall Equipment Efficiency), Net Asset Value, or other performance and resource planning metrics. Some of this data may come from PLCs, some from databases like SQL Server, some from user input, and some from programmed calculations. In this situation, it is not important how the data is generated or where it is stored. What is important is that it can be visualized at any time in whatever way suits your collaborative needs.
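A quick Python sketch of such a modeled asset; the values, the source comments, and the OEE factors are all invented for illustration:

```python
# Illustrative sketch only: one modeled asset mixing live process data,
# static records, and a calculated metric.

motor = {
    "rpm": 1750,                        # live, e.g. from a PLC
    "temperature_c": 62.5,              # live, e.g. from a PLC
    "commission_date": "2014-06-01",    # static, e.g. from SQL Server
    "manual": "motor_x100_manual.pdf",  # static troubleshooting document
}

def oee(availability, performance, quality):
    # OEE is conventionally the product of the three factors.
    return availability * performance * quality

motor["oee"] = oee(0.90, 0.95, 0.99)    # programmed calculation
```

The point of the model is that a dashboard, a report, or a collaborator can read any of these properties the same way, without knowing which system produced each one.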

There are a number of different tools that can be used to create an information model for your organization. A few things to consider when choosing an information modeling tool:

  • Does the modeling software take into account both real-time AND historical data?
  • Does the modeling software allow you to include ALL relevant information from every source?
  • Is your modeled data logged in a relational database like SQL Server so it can be queried if additional information is needed?
  • Does your modeling software provide the tools you need to visualize your data in a useful way that supports decision-making?

Before you jump into a new software product and a new data management system, do some homework. As with everything, there are pros and cons to the different products available.