3 Keys to Effective Real-Time Data Visualization

There are several important factors to consider when creating your real-time data visualization, many of which will depend on your intended application. Today, we consider a few of the general factors that will play a role in every visualization you create. These three factors are clarity, consistency, and feedback.

Clarity

Real-time graphics should emphasize pertinent information and use design principles that promote ease of use and accessibility above aesthetics. Attributes like size, color, and brightness can be used to distinguish primary details from secondary and tertiary details. Special graphics can be created to emphasize different information under different conditions (e.g. a special set of graphics to be used when a certain alarm is triggered).
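
As a minimal illustration (the names, colors, and thresholds below are invented for the example), visual emphasis can be driven by a single severity-to-style mapping:

```python
# Minimal sketch: drive visual emphasis from alarm severity.
# All names, colors, and thresholds here are illustrative only.
from dataclasses import dataclass

@dataclass
class Style:
    color: str
    font_size: int
    brightness: float  # 0.0 (dim) to 1.0 (full)

STYLES = {
    "normal":   Style("gray",   12, 0.6),  # secondary detail stays muted
    "warning":  Style("orange", 14, 0.8),
    "critical": Style("red",    18, 1.0),  # alarm state is impossible to miss
}

def style_for(value: float, warn: float, alarm: float) -> Style:
    """Pick visual emphasis from a tag value and its alarm thresholds."""
    if value >= alarm:
        return STYLES["critical"]
    if value >= warn:
        return STYLES["warning"]
    return STYLES["normal"]

print(style_for(98.5, warn=90.0, alarm=100.0))  # -> warning style
```

The point is not the specific values, but that emphasis is driven by one predictable rule.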


When planning a real-time visualization scenario, it is very important to consider who will be using the visualization and what their purpose is in viewing the data. This will obviously vary from one organization to the next, but when differentiating between primary, secondary, and tertiary information, it is important to think not in terms of what is important about the thing being monitored, but what is important to the person doing the monitoring.

Consistency

Consistent visualizations are standardized and uniformly formatted. Interaction should require a minimum of keystrokes or pointer manipulations. In fact, whenever possible, all relevant information should be visible without the need to navigate to another screen. When navigation is necessary, be certain that elements of the user interface related to navigation are clearly distinguished from elements that relay pertinent information. Additionally, navigation and interaction of any type should be as easy and intuitive as possible.


The ergonomic needs of the user are also extremely important. Poor data visibility has been cited as a primary cause of many industrial accidents where a process was being monitored or controlled through a real-time HMI (Human Machine Interface). In fact, poorly designed HMIs have been blamed for accidents that have led to millions of dollars in damaged equipment and some very unfortunate and unnecessary deaths.

A recent study by the European Agency for Safety and Health at Work (EU-OSHA) compiled statistics on HMI-related errors in the workplace. Interestingly, the research shows that the majority of problems are caused by human error, but not entirely because of mental and physical fatigue. More often, errors are caused by poor decision-making related to the way that information is processed.

Feedback

An operator should be fully confident that the choices they make are having the desired effect. Screens should be designed to present information, putting relevant data in the proper context. Also, important actions that carry significant consequences should have confirmation mechanisms to ensure that they are not activated inadvertently.

Controls should function consistently in all situations. If something is not working as it should, that fact should be immediately obvious and undeniable. Again, in a well-designed system, design principles are employed to promote clarity and simplicity, and to reduce user fatigue.

Keep it simple and straightforward. Save the complex visual tools for historical data or real-time reporting. There is certainly a place for all of this, but that place is not where real-time data is being used to make real-time decisions.

Learn more in the free whitepaper “Real-Time Data Visualization Essentials”:

http://scada.com/Content/Whitepapers/Real-Time%20Data%20Visualization%20Essentials.pdf


Is the Internet of Things Really Happening?

Over the last few years there has been much speculation about the inevitable growth of the Internet of Things (or Internet of Everything). Forecasts have suggested anywhere from 30 to 50 billion devices will be connected by 2020. Cisco has estimated that the global IoT ecosystem will have a value of $14.4 trillion by 2022, and IDC has projected yearly IoT market revenue to increase to $1.7 trillion by 2020.

Here we are now in 2016, a few years into the future they were talking about back then, and it may be a good time to take a look at the current state of the IoT and see how it measures up to all of these lofty expectations. Are people really embracing IoT technology at this rate? Is this money really being invested?


Connected Devices

First, let's take a look at the number of connected devices. If we flash back to 2013, we find that Gartner released a report entitled "Forecast: The Internet of Things, Worldwide, 2013". In this report, they predicted that the IoT would include 26 billion connected devices by 2020. Two years later, Gartner reported a total of 4.9 billion connected devices at the end of 2015, up from 3.8 billion in 2014. Gartner also revised their 2020 estimate, anticipating 20.7 billion connected devices by 2020, a decrease of 5.3 billion (20.4%) from their 2013 estimate. (It should be noted here that Cisco continues to anticipate as many as 50 billion connected devices by 2020.)

So, according to Gartner, IoT adoption has not proceeded at the rate they had anticipated at the end of 2013.

One reason for the slower-than-expected growth is the difficulty faced when trying to implement IoT technology. In fact, Gartner anticipates that through 2018, 75% of IoT projects will take up to twice as long as planned.

Value of the IoT

Now, let’s consider the monetary value of the IoT and how that number has progressed. Cisco initially projected a value of $14.4 trillion by 2022. Within two years Cisco had increased this number to $19 trillion.


This highlights an interesting fact: even though fewer connected devices are now expected by that date, the total value of those devices and the underlying network is projected to be higher than before. Based on this, I think it's safe to suggest that implementing IoT technology is turning out to be more expensive than originally thought.

This may be due in part to the fact that some enterprises are rushing headlong into IoT projects without the proper foresight and planning. Often it is a reaction to competitive pressure, based on a perception that a competitor is already moving forward with their IoT strategy, or simply in an effort to be the first and gain a competitive edge.


“I think it’s safe to suggest that implementing IoT technology is turning out to be more expensive than originally thought.”


Another answer may come from Gartner’s 2015 report: “Predicts 2015: The Internet of Things”, in which Gartner predicts that through 2018, there will be “no dominant IoT ecosystem platform”. They cite a lack of IoT standards and anticipate that IT leaders will be forced to compose solutions from multiple providers.


Read our White Paper on Choosing the Right IoT Software Platform

Read It Now


Even when faced with these realities, however, enterprises are still moving forward with their IoT projects. The extra expense – though unanticipated – is not nearly enough to outweigh the potential benefits. The IoT is most certainly transforming the way businesses operate, and no one wants to be the last one to this dance.

IoT Investment

This is an important category, as it will largely determine how quickly the industry moves to develop standards, and how motivated IoT solution providers will be to develop more powerful and more cost-effective solutions.

Recall IDC's projection of annual market revenue reaching $1.7 trillion by 2020. It would stand to reason that if IoT projects are coming in over budget and late, there is probably some distaste in the marketplace, and maybe IDC's projection was a bit ambitious.

At the same time, though, if people are spending more on IoT initiatives than they had originally planned, perhaps IDC’s projection was a bit conservative. Let’s examine how things are taking shape.

In 2015, IDC reported that worldwide IoT spending reached $655.8 billion in 2014 and calculated a 16.9% CAGR (Compound Annual Growth Rate).

Well, 2015 is now in the books, and we can see how IDC's projections are holding up. Their latest report indicates that spending in 2015 reached $698.6 billion, year-over-year growth of only 6.53%. Had IDC's anticipated CAGR proven accurate, 2015 revenue should have been closer to $766 billion.

Notwithstanding this fact, however, IDC continues to project a CAGR of 17% and an increase in spending to $1.3 trillion by 2019, which would equal approximately $1.5 trillion in 2020. It looks like IDC sees the IoT market cooling off a bit, though not much.
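
The arithmetic behind these figures is easy to check. A quick calculation using the numbers cited in this post:

```python
# Quick check of the revenue figures cited above ($ billions unless noted).
spend_2014 = 655.8
spend_2015 = 698.6

# Actual 2014 -> 2015 growth:
print(f"{spend_2015 / spend_2014 - 1:.2%}")      # ~6.53%

# What 2015 would have been at the originally projected 16.9% CAGR:
print(f"${spend_2014 * 1.169:.1f}B")             # ~$766.6B

# Extending IDC's revised $1.3 trillion 2019 figure one year at ~17%:
print(f"${1.3 * 1.17:.2f}T")                     # ~$1.52T
```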

So, while the earlier projection has proven to be overly optimistic, it is clear that investments in IoT initiatives are continuing to increase with no end in sight.

If there is any kind of meaningful takeaway from all of this, I think it's safe to surmise that IoT projects may be coming in late and over budget, but that doesn't seem to have had much of an impact on continued investment. It is clear that business owners and executives see the value and have no interest in letting their competitors gain an edge.

So, was the IoT hyped a bit excessively over the last couple of years? Maybe a bit. But it is also very real and happening right now.

3 Reasons Modern Farmers Are Adopting IoT Technology at an Astounding Rate

It seems like everything today is touched in some way by the Internet of Things. It is changing the way goods are produced, the way they are marketed, and the way they are consumed. A great deal of the IoT conversation has revolved around transformation in industries like manufacturing, petrochemical, and medicine, but one industry that has already seen widespread adoption of IoT technology is often overlooked: agriculture.

Of course, many of us are very familiar with some of the efforts that have been made to optimize food production. As populations continue to grow, there has been a serious and sustained drive to increase the crop yield from our available arable land. Some of these efforts have not been particularly popular with consumers (i.e. pesticides, GMOs).

With the advent of new technology and the Internet of Things, farmers are finding new ways to improve their yields. Fortunately for us, these new ways are decidedly less disturbing than toxic chemicals and genetic manipulation. Using sensors and networked communication, farmers are discovering ways to optimize already-known best practices to increase yield and reduce resource consumption.

If it seems surprising that the agricultural industry would be a technological innovator, consider that agriculture is in many ways an ideal testbed for new technology.

There are a few good reasons for this:

1. Ease of Deployment

Unlike in other industries, deploying sensors and other connected devices on a farm can be relatively easy and inexpensive. In a heavy industrial environment like a factory or refinery, new technology must replace old technology that is thoroughly embedded in the production infrastructure. There are concerns about downtime and lost revenue, as well as concerns about finding the right products or group of products to integrate into an existing technological ecosystem. On a typical farm, there is no need for downtime, and usually no concern for any existing technology that may be incompatible. Inexpensive sensors placed in various parts of a cultivated field can quickly yield very useful actionable data without disrupting a single process.

2. Instant Value

Another reason that agriculture has provided such a fertile testbed for IoT technology is the speed with which value and ROI can be realized. Pre-existing metrics of precision agriculture can be applied more easily, maximizing the already-known benefits of established practices (knowing what types of crops to plant when, knowing when and how much to water, etc.). Farmers have also had success safely and naturally controlling pests through the intelligent release of pheromones. Of course, there is the obvious and very tangible benefit of decreased resource consumption and increased yield. A modest investment can yield measurable results within a single season.

3. Continual Value

In agricultural IoT deployments, the same practices that provide instant value will continue to provide value for as long as they are employed. Conservation of water and reduction of waste provide recurring value, as does the increased yield brought on by precision farming. There are also opportunities to improve the equipment that farmers use every day. A connected combine or tractor can record useful information about its operation and maintenance. It can also allow for certain processes to be optimized and automated.

There are some real concerns about our ability to feed our ever-growing population in the future. While controversial technologies like genetically modified organisms have helped to increase food production, these techniques are not exactly popular with the general public, many of whom have voiced concerns about the long-term impact of a genetically modified diet.

The good news is that similar increases in food production are possible without the need to modify the food; we simply have to modify the processes used to produce it. And it’s not just about food production. Plants are also used for biofuels and as raw materials in manufacturing. By increasing yield and reducing resource consumption, growers are also having a positive impact on numerous other industries.

For instance, a Colorado-based company called Algae Lab Systems is helping algae farmers improve their output by introducing sensors to measure environmental factors like temperature, pH, and dissolved oxygen in their photobioreactors and algae ponds. Algae growers are now able to continuously monitor their crops from any location, which also allows for larger and geographically dispersed operations.
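
As an illustration of the kind of threshold monitoring such a system performs (the ranges below are hypothetical, not Algae Lab Systems' actual limits):

```python
# Illustrative threshold monitoring for a photobioreactor or pond.
# The acceptable ranges are hypothetical, not vendor-published limits.
RANGES = {
    "temperature_c":    (20.0, 30.0),
    "ph":               (7.0, 9.0),
    "dissolved_o2_mgl": (5.0, 12.0),
}

def check_reading(reading):
    """Return the out-of-range parameters from one sensor sweep."""
    alerts = []
    for name, value in reading.items():
        low, high = RANGES[name]
        if not low <= value <= high:
            alerts.append(f"{name}={value} outside [{low}, {high}]")
    return alerts

# One reading from a remote pond:
print(check_reading({"temperature_c": 31.2, "ph": 8.1, "dissolved_o2_mgl": 4.6}))
```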

A case study detailing Algae Lab Systems provides some insight into how they are transforming the algae farming industry, and aquaculture in general.

Read It Now

To Each His Own: Creating Custom Dashboards for Operators and Analysts


It’s always very annoying when I try to perform what seems like it would be fairly routine maintenance on a home appliance or worse – my car – only to find out that this seemingly simple thing I would like to do is actually quite difficult with the tools at my disposal. A little bit of research usually reveals that it actually is quite simple; I just have to buy this proprietary tool from the manufacturer for what seems like a ridiculous price, and then I can proceed.

Of course, it’s easy to understand why the manufacturer doesn’t want to make it easy for end users to service their product. They want you to buy a new one, or at the very least buy this overpriced tool from them so they can scrape every morsel of profit afforded by their built-in obsolescence.

It really makes me appreciate the simplicity and widespread application of some of our more traditional tools. Take a hammer, for instance. If you need to drive a nail into wood, it doesn’t matter if it’s a big nail, a little nail, a long nail, or a short nail. It doesn’t matter who manufactured it or when. All that matters is that it’s a nail. Just get a hammer; you’ll be fine.

This got me thinking. What if we had a hammer for every type of nail available? What if each hammer was perfectly sized, shaped, weighted, and balanced for each particular nail? And what if that perfect hammer was always available to you every time you needed it? This isn't realistic, obviously, but it reminds me of some of the things I hear from our customers.

One of the great benefits cited by our end users is the ability to create custom dashboards for the different work responsibilities in their organizations. The same system is used to create maintenance dashboards for technicians, control panels for operators, system overviews for managers, reports for analysts, and even special dashboards for contractors and vendors. When every member of the team has a real-time view of exactly the information they need to do their job and nothing more, each person is empowered to work with the utmost efficiency – improving the speed and accuracy of decision-making as well as increasing the capacity for planning.

In the past, so much of our data visualization was tied to the device from which the data was drawn. If you wanted to know something about a particular machine, you had to look at the same picture as everyone else, regardless of what you needed to see.

Some modern software platforms like B-Scada’s Status products eliminate this need to tie visualizations to the device from which the data is drawn. It is now possible to visualize data from multiple devices at multiple locations through the same interface. This allows for a new concept in user interface design: rather than displaying all available information about this particular thing, you can now display all information relevant to a particular task or set of tasks.
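
Conceptually, a task-oriented view is just a set of device/tag bindings rather than one device's complete tag list. A minimal sketch, with hypothetical names that do not represent B-Scada's actual API:

```python
# Hypothetical sketch of a task-oriented view: widgets bind to
# (device, tag) pairs, not to one device's full tag list.
from typing import Callable

DataSource = Callable[[str, str], float]  # (device_id, tag) -> current value

# A maintenance technician's view draws from several devices at once.
MAINTENANCE_VIEW = [
    ("pump-07",       "vibration_mm_s"),
    ("pump-12",       "vibration_mm_s"),
    ("compressor-01", "hours_since_service"),
]

def render_view(view, read: DataSource) -> None:
    for device, tag in view:
        print(f"{device}.{tag} = {read(device, tag)}")

# Demo with a fake data source:
render_view(MAINTENANCE_VIEW, lambda device, tag: 42.0)
```

The same read function can back an operator panel, a manager overview, or an analyst report; only the bindings change from one role to the next.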

It’s not quite “a hammer for every nail”; it’s more like a complete tool set tailored to every job, containing exactly the tools you need and nothing more. It’s really been a transformative development for many organizations.

B-Scada recently released a case study detailing how one prominent North American electric utility used Status to create a system of customized views for their operators, managers, and analysts, providing specific insights into the real-time status of their generation resources:

Read It Now


3 Keys to Effective Real-Time Data Visualization

Everybody appreciates the value of a good picture. Each one says a thousand words, after all, or so the saying goes. If we really dig into this metaphor, we'd probably admit that some pictures say substantially more than that – while others probably come in well under a dozen (just look at a random Facebook wall for some examples).

Ours has become a very visual culture, one occupying a place in time defined by an overwhelming abundance of information all around us. Considering these two facts, it is not at all surprising that we see such increased interest in data visualization – that is, the process of placing a specific set of data points in a highly visual context that allows them to be quickly consumed and analyzed.

It’s not a new concept; data has been visualized in pictures for centuries. A map is a type of data visualization, for instance, as are the many charts and graphs that have been used since the end of the 18th Century. What is new is the massive quantity of data available to nearly everyone, and the wide array of tools that can be used to create compelling visualizations. Think about the cool infographic you saw the other day. Was it created painstakingly over several days of carefully reviewing ethnographic data compiled by a dogged scientist over the course of his career? Maybe, but probably not. It was more likely created by some marketing department somewhere (not that there’s anything wrong with that) using somebody else’s data and somebody else’s visualization tools.

The purpose of this post, though, is not to discuss the merits of data visualization in general, but rather the specific subset of data visualization that deals with real-time data. This is a completely separate species of data visualization and should be treated as such.

Real-time data visualization refers to visualization of data that is continuously updated as new data is generated by connected devices or people. This is the type of data that is used to make real-time decisions and, when done correctly, can truly transform business processes.
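
At its core, that means a display that re-renders as each new sample arrives. A minimal polling sketch (with an invented read_sensor stand-in; production systems typically subscribe to events over protocols such as OPC UA or MQTT rather than polling):

```python
# Minimal polling sketch: re-render each time a new sample is read.
import random
import time

def read_sensor() -> float:
    """Stand-in for a real device read (OPC UA, MQTT, REST, etc.)."""
    return 20.0 + random.random() * 5.0

for _ in range(10):   # a real HMI would run until the screen is closed
    print(f"Flow rate: {read_sensor():6.2f} L/min")
    time.sleep(1)     # fixed refresh interval; production systems are
                      # often event-driven rather than polled
```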

There are a number of important factors to consider when attempting to visualize data in real time, but we will focus on three simple and obvious keys: clarity, consistency, and feedback.


Clarity

Real-time graphics should emphasize pertinent information and use design principles that promote ease of use and accessibility above aesthetics. Attributes like size, color, and brightness can be used to distinguish primary details from secondary and tertiary details. Special graphics can be created to emphasize different information under different conditions (e.g. a special set of graphics to be used when a certain alarm is triggered).


Hierarchical Data Makes its Relevance Obvious

Clear visualizations provide actionable information at a glance, and clearly show the current process state and conditions. Alarms and indicators of abnormal conditions are prominent and impossible to ignore.

Clarity encompasses both content and context.

Contextual Controls Allow You to Assess Current Conditions at a Glance


Consistency

Consistent visualizations are standardized and uniformly formatted. Interaction should require a minimum of keystrokes or pointer manipulations.

Shapes, colors, and layouts should be used consistently throughout all screens. If red is used to designate an abnormally high value on one screen, that same red should indicate all abnormally high values of the same type on every screen. If navigation buttons are on the left side of one screen, they should be on the left side of all screens. A consistent visualization system is arranged in a logical, hierarchical manner, allowing operators to visualize both a general overview of the system as well as more detailed information on different components as needed. Navigation and interaction of any type should be as easy and intuitive as possible.
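
One practical way to enforce this is to define the visual vocabulary once, in a single shared module, and reference it from every screen. A minimal sketch with illustrative values:

```python
# Define the visual vocabulary once; every screen references it.
# Colors and layout values are illustrative.
ABNORMAL_HIGH  = "#d32f2f"  # red: used ONLY for high-limit violations
ABNORMAL_LOW   = "#1976d2"  # blue: used ONLY for low-limit violations
NORMAL_TEXT    = "#333333"
NAV_PANEL_SIDE = "left"     # navigation sits on the same side everywhere

def value_color(value: float, low: float, high: float) -> str:
    """One rule shared by all screens, so red always means the same thing."""
    if value > high:
        return ABNORMAL_HIGH
    if value < low:
        return ABNORMAL_LOW
    return NORMAL_TEXT
```

Centralizing these definitions makes consistency the default rather than a per-screen discipline.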

Consistency is closely related to clarity.

Color is a Great Way to Distinguish One Property from Another, As Long As it Is Consistently Applied.


Feedback

An operator should be fully confident that the choices they make are having the desired effect. Screens should be designed to present information, putting relevant data in the proper context. Also, important actions that carry significant consequences should have confirmation mechanisms to ensure that they are not activated inadvertently.
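
A confirmation mechanism need not be elaborate. A minimal sketch, with hypothetical names, of a deliberate two-step interaction before a consequential action:

```python
# Hypothetical sketch: a deliberate second step before consequential actions.
def confirmed(prompt: str) -> bool:
    """Require an explicit, typed confirmation to prevent inadvertent activation."""
    return input(f"{prompt} Type YES to proceed: ").strip() == "YES"

def shutdown_line(line_id: str) -> None:
    if not confirmed(f"Shut down line {line_id}?"):
        print("Action cancelled.")                 # feedback even on the no-op path
        return
    print(f"Line {line_id}: shutdown initiated.")  # state the effect clearly

shutdown_line("A-3")
```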

Controls should function consistently in all situations. If something is not working as it should, that fact should be immediately obvious and undeniable. In a well-designed system, design principles are employed to reduce user fatigue.

There are obviously many other important factors to consider when developing a real-time visualization system. Anyone who wants to dig deeper is encouraged to read this free whitepaper on the subject:

Click here to read it

3 Reasons You Should Consider Giving Your Process Operators Mobile Devices


That's right. It's time to own up to the fact that the majority of us are using phones and tablets to do business every day. We buy, sell, trade, learn, teach, and do all manner of horrible and wonderful things that we have always done (no, not everyone does horrible things, but don't act like the things you do are always so wonderful either) – all with the aid of portable devices that allow us to move freely about our lives without being tethered to a desk chair.

Why, then, is it so difficult for some people to recognize that our industrial process operators and technicians – who are so often stuck behind a stationary HMI or calling from the field to speak with someone who is – would be far better equipped to do their jobs if only they were afforded the same conveniences they afford themselves outside of work?

I know there are concerns about security – about opening some digital wormhole through which all sorts of nefarious activity could be invited. There are concerns about ill-intentioned deviants having potential access to sensitive process data – which is not only proprietary, but often essential to our infrastructure – as well there should be. But it's not as though these potential problems didn't exist before mobile devices, and while some concerns are certainly valid, mobile devices provide a number of key benefits and opportunities that cannot be ignored:


  • For Remote Management of Disparate Assets
    This one seems pretty obvious, but imagine the amount of time that could be saved by not having to manually inspect field equipment or call back to the control station every time there is a simple question.
  • For Constant Access to a Portable Media Viewer
    How can you ensure that operators and techs always have access to the latest work masters, training videos, etc.? Upload or edit a document and make your changes instantly available to all relevant parties – regardless of where they are or what they're doing.
  • For Instant Access to Forms and Form Data
    Create Purchase Orders or close Work Requests from anywhere. Assign new owners or upload a picture you just snapped and attach it to a Job. The possibilities are nearly unlimited.


Sure, there are only three benefits listed here, but without much thought I'm sure you could come up with a few more. Let me know in the comments below.

And for some additional food for thought, check out this white paper on "The Benefits of Mobile HMIs" and tell me I'm not absolutely right about this:

Download White Paper


How To Improve Any Business Process


If you are responsible for managing a business or organization of any type, you have undoubtedly sought out opportunities to make things run more smoothly and efficiently. It's only natural. This means that responsible owners and managers are continually looking for opportunities to optimize their business processes.

How about some free advice?

First of all, let's be clear about what we're referring to when we use the term 'business process'. In short, a business process is a collection of linked tasks that culminate in the delivery of a service or product to a client. It has also been defined as a set of activities and tasks that – once completed – will accomplish an organizational goal.

Any business (regardless of how poorly it may be run) employs some type of business process. Some are clearly better than others.

What we refer to as Business Process Management (BPM) can be defined as the set of techniques employed to map the flow of information and communication between various business assets and departments, identify opportunities for improvement, and establish and enforce rules to optimize the process moving forward. These techniques can (and should) be employed continually.

A BPM system can provide any company with several measurable benefits:

  • The ability to identify otherwise unknown inefficiencies
  • Reduced downtime and cost associated with wasted time and material
  • The ability to connect processes across multiple facilities and/or operations
  • Automation of repeated and/or predictable tasks
  • Establishment of a program for continual improvement

These benefits are very attainable. Provided you use the right tools and follow a simple procedure, you can realize the improved efficiency and reduced waste that BPM systems provide. And what is the correct procedure? In very simple terms:

  1. Analyze Current Processes
    Create a business process map to paint a clear picture of the current flow of information between different business assets. Use this map to uncover inefficiencies and establish a preferred methodology.
  2. Establish and Enforce New Rules
    Define rules for how you would like information to flow, and create workflow tasks to automate repeatable steps or send automatic notifications to the people who need to be involved in enforcing the new rules (see the sketch after this list).
  3. Implement, Train, Rinse and Repeat
    Once the new process is clearly defined and automated, ensure that all parties are fully trained and equipped to adhere to these new rules moving forward. You can create custom dashboards to track real-time data, create a centralized knowledge base that is shared and continually updated, and use automated real-time notifications to be sure that everyone is always aware of the current state of the process. Finally, ensure that your new process is fully repeatable and scalable to allow for continual evaluation and improvement.
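
Here is the sketch referenced in step 2: a small, declarative rule that sends a notification whenever a condition is met. All names and thresholds are illustrative, not a specific product's API:

```python
# Illustrative rule engine: route a notification when a condition is met.
# Names and thresholds are invented for the example, not a product API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]
    notify: list  # who must be told when the rule fires

RULES = [
    Rule("po-over-budget",
         lambda e: e.get("amount", 0) > 10_000,
         notify=["purchasing-manager"]),
    Rule("work-request-stale",
         lambda e: e.get("days_open", 0) > 14,
         notify=["maintenance-lead", "planner"]),
]

def process_event(event: dict) -> None:
    for rule in RULES:
        if rule.condition(event):
            for person in rule.notify:
                print(f"notify {person}: {rule.name} fired for {event}")

process_event({"amount": 12_500})  # -> notifies the purchasing manager
```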

Seems pretty simple, right? It can be when you combine your innate understanding of your business process with the right tools.


Learn more about some of the data acquisition and visualization technology that empowers Business Process Management at http://scada.com