Last week was a busy week out at the CEATEC show in Makuhari Messe. This was our 7th year in a row exhibiting there, and it’s great to see that both cloud services and digital signage are becoming more widely known every year.

One person who stopped by the Vanten booth asked me about the Return On Investment (ROI) for Digital Signage. I was a bit surprised because I hadn’t heard that question in a long time, but I was reminded how ten years ago nearly everyone asked me that.

In 2006, very few companies were using digital signage yet, and for managers afraid to take a chance on new technology, that was an obvious question. If only we could prove with hard numbers that it would make them more money than it cost, they would be interested. It was very frustrating to be stuck in the typical chicken-and-egg dilemma of new technology: if nobody takes a chance, how will we ever get the data?

Now, more than 10 years later, we see companies eager to install displays without bothering to ask about ROI, probably because they see the increase in Digital Signage around them and figure if everyone else is doing it they should too.

Of course, I believe digital signage has an important role as a communication tool. I have been living and breathing it daily for almost 15 years now. But, I have to admit, I’m very disappointed that we don’t have better data by now to really show what works and doesn’t work for digital signage.

Here’s what I do know so far.

In 2010, we supplied the system for the first use of automated people counting on a digital signage medium with paid advertising in Japan. We counted a lot of people looking at the signs – over one million per day.

We also ran the same technology on displays inside Yodobashi Camera for over three years, where we counted thousands of people daily in just one spot.

In both networks, we could see that the demographics of viewers closely matched the demographics we expected to see. We could also measure the average time for viewing, which was typically under 1 second.

This was surprising to advertisers, but matched what we had heard from other people using this technology around the world. Viewers tend to scan their environment and take quick glances at advertising, only occasionally stopping to read something carefully. This means it’s best to make content that can provide the most important information at a glance.

We were even able to match the viewing data with specific pieces of content in the play loop, but it was very difficult to draw any meaningful conclusions from so much data. (Customers weren't so excited about big data seven years ago.)

In another case, we worked very closely with a customer testing different content and its effect on sales. What we found out is just how hard it is to isolate specific factors in a retail environment and measure them when the stores are running multiple promotions at the same time across different media. We would have needed a trial of sufficient scale, a very consistent methodology, and a system that allowed for easy testing.

The bottom line is that we were able to show that many people looked, but it was very difficult to provide specifics beyond that without more scale, more automation, and more focus on what we were actually testing.

That doesn’t mean it can’t be done though. It just means that people aren’t putting sufficient effort into testing and measuring on Digital Signage yet.

Direct response marketers, on the other hand, have been carefully measuring the effect of changing a word here or there in their copy for well over 100 years, since the early days of mail order.

David Ogilvy, the advertising legend who spent his early career doing direct response, knew so much about what worked and didn't work in newspapers, magazines and television, but once said this about posters:

“I had better tell you what little is known about designing them to maximum effect. There has been little or no research on the subject… Your poster should deliver your selling promise not only in words, but also pictorially. Use the largest possible type. Make your brand name visible at a long distance. Use strong, pure colors. Never use more than three elements in your design.
If you know more than that, please tell me.” (Ogilvy On Advertising)

Even though that was many years ago, I've yet to find any significant research on the topic. Posters are perhaps the closest analog equivalent of advertising on digital signage, and they have been around much longer. But with Digital Signage, you can also show video and do so much more. More possibilities, though, mean it's even harder to focus on and measure what works best.

In the world of online marketing, people are increasingly focusing on data-driven results and using those same techniques from the early 20th century, the most important one being what we now call A/B testing. A/B testing is where you try two different pieces of content and see which one gets a better response. The winner becomes your new baseline content, and you can keep testing against it to find a better one.
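As a rough sketch (in Python, with placeholder numbers rather than data from any real campaign), comparing two pieces of content comes down to comparing their response rates and checking whether the difference is big enough to trust:

import math

def ab_test(responses_a, views_a, responses_b, views_b):
    # Two-proportion z-test: is B's response rate really different from A's?
    rate_a = responses_a / views_a
    rate_b = responses_b / views_b
    pooled = (responses_a + responses_b) / (views_a + views_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    return (rate_b - rate_a) / std_err

# Placeholder numbers: 30 responses from 1,000 views of A, 48 from 1,000 views of B
z = ab_test(30, 1000, 48, 1000)
print("B looks better" if z > 1.96 else "no clear winner yet")  # 1.96 is roughly 95% confidence

The statistics themselves are simple; the hard part is making sure both versions really were shown under the same conditions, which is where automation comes in.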

With a system like Otegaru net that uses a rule engine to decide which content plays on which screen, you could easily set up a channel A and a channel B to do A/B testing. The key is automation. It's the automation that lets you do this quickly, easily and consistently.
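For illustration only (this is not the actual Otegaru net rule syntax, just a sketch of the idea), the split itself can be as simple as a deterministic rule that assigns each screen to one of the two channels:

import hashlib

def assign_channel(screen_id):
    # Hashing the screen ID keeps the assignment stable for the whole test,
    # so each display always shows the same variant.
    digest = hashlib.md5(screen_id.encode("utf-8")).hexdigest()
    return "channel A" if int(digest, 16) % 2 == 0 else "channel B"

for screen in ["entrance-01", "entrance-02", "food-court-03"]:  # hypothetical screen IDs
    print(screen, "->", assign_channel(screen))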

We are ready and willing to assist you with your testing needs, but to get good test results there are several steps that need to happen first.

1. Define what success means for your media. Make sure that the goals match the scale of the media. For example, if the goal is to sell some advertising, it might be very difficult to do this if there are only a couple of displays.

2. Create a complete content plan. Analyse the types of content you will have and who they are for. How are they going to get made? Who is going to put them on the signage? How often will each kind of content be changed or updated?

3. Start doing it! A lot of learning happens very quickly once you start running any kind of signage media. Until things settle down though, it’s too early to start any fancy measuring and testing because usually too many things are changing at once.

Without these preparation steps, you might end up with a case of GIGO.

“Garbage In. Garbage Out.”

This is a term from the early days of computers. It means that if you put bad data into a calculation or formula, you are guaranteed to get a meaningless result out.

But once you have made the big adjustments and your media is settling into a nice groove, then you are ready to get in there and test.

In part two, we will look at ROI from a completely different perspective.