Location-based advertising has grown significantly over the past two years, but the sheer increase in the volume of geo-based ads hasn’t led to a comparable improvement in the quality of the place-directed data powering those efforts.
In fact, according to Thinknear’s Q2 2016 Location Score Index (LSI), the rising level of programmatic ad volume appears to be a significant factor that’s holding back geo-data quality.
The Lowest Score
As a result, the LSI for the second quarter is 45 — the lowest it’s been since Telenav’s Thinknear initiated its first report on geo-data quality in May 2014. The Index is based on a weighted 100-point scale, with “100” being perfect geo-data accuracy. To put Q2’s LSI in context, the previous low was in Q2 2014, while the highest score was 55 in Q3 of that same year.
Seasonality could play a part in the low LSI, but Thinknear argues it’s mostly a matter of higher volume and the need for better filters to separate higher-quality from lower-quality geo-data.
Matter Of Trust
“There is a huge amount of high-quality data available to marketers, but they need a trusted partner who can help them find it,” said Thinknear President, Loren Hillberg. “Sorting through the data is the difficult part, but the good news is that high-quality data is available at scale, and marketers are using it in new ways to connect with their audiences.”
Recent evidence appears to validate that view — with a corollary. While 77 percent of 253 major brands surveyed in January 2016 by the LBMA said location data was vital, 65 percent said they didn’t trust the accuracy of the location information they’re getting.
This seeming paradox between proximity marketing’s perceived vitality and its imprecision comes down to the fact that location data accuracy is indeed more prized than ever as consumers and businesses become more mobile and more connected. The silos between digital and physical are breaking down when it comes to the marketing of e-commerce and brick-and-mortar across retail, hospitality, food services, automotive, transportation, healthcare, and the variety of on-demand services that have sprung up across those categories.
Plus, with roughly 90 percent of purchases still made in physical locations by smartphone-toting shoppers, the ability to reach those consumers whenever and wherever they’re about to make a buying decision is crucial to almost every business, from enterprise to SMB.
And that’s why scores like Thinknear’s are worthy of attention, as all industries struggle to understand the state of geo-data quality. (To get a sense of how concerns about location technology have leapt into the mainstream, check out this past week’s Last Week Tonight with John Oliver, which pointedly sought to answer why it appears easier to order a pizza via a Domino’s mobile app than it is for 911 dispatchers to find citizens in need of assistance. Location accuracy matters in ways both large and small.)
Q2 2016’s Scorecard
Thinknear’s breakdown of the volume of location data by accuracy level in Q2 versus Q1 2016 is fairly striking:
- Hyperlocal: 32 percent of location data was accurate within 100 meters (roughly the size of a football field) of the user’s “true location.” In Q1, that number was 37 percent.
- Local: 7 percent of location data was accurate within a range of 100 to 1,000 meters (1,000 meters is roughly 0.6 miles). During the previous quarter, the figure was 9 percent.
- Regional: Findings here were essentially flat quarter over quarter, with 27 percent of geo-data accurate between 1,000 and 10,000 meters of a person’s actual spot.
- Multi-Regional: 20 percent correct between 10,000 and 100,000 meters in Q2 versus 18 percent during Q1.
- National: 14 percent for Q2 versus 10 percent for Q1 2016 — the respective tallies for location fixes that landed more than 100,000 meters (roughly 62 miles) from where a person with a smartphone was standing.
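The banding above is simple enough to express directly. Here is a minimal sketch of how a distance error, in meters, would fall into the scorecard’s tiers; the band edges come from the article, while the function name and the tie-breaking at band boundaries are assumptions:

```python
def accuracy_tier(error_meters):
    """Bucket a location fix's error distance into the scorecard tiers.

    Band edges follow the article's breakdown; behavior exactly at a
    boundary (e.g. 100 meters) is an assumption, not stated in the LSI.
    """
    if error_meters < 100:
        return "Hyperlocal"       # within ~a football field of true location
    elif error_meters < 1_000:
        return "Local"
    elif error_meters < 10_000:
        return "Regional"
    elif error_meters < 100_000:
        return "Multi-Regional"
    else:
        return "National"         # off by roughly 62 miles or more
```

For example, a fix that is off by 500 meters lands in the “Local” band, while one off by 200 kilometers is merely “National” in accuracy.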
What’s Causing The Problems With Location Accuracy?
In terms of answering that question, it comes down to determining the source of the data.
Typically, a publisher has to take five steps to access a user’s location — and each successive one is less accurate than the last.
- The most reliable method of extracting precise location is with the A-GPS (assisted global positioning) chip in a connected device. This is one reason why navigation apps like Waze are so reliable.
- A wi-fi signal is the second most accurate — but since most people use networks at home or work, as opposed to walking or driving near a retailer, it’s not terribly effective.
- Cell towers are ubiquitous in most areas, but in places with many tall buildings like New York, interference with the location signal is common and therefore, the level of accuracy is dulled.
- While IP addresses, which are great for serving websites, are often used as a signal for where someone on a PC appears to be situated, they almost always miss the target when it comes to pinpointing an individual mobile phone in a large area.
- The award for the worst method of accessing location data, however, goes to apps that ask for user registration. In this case, a developer asks for a zip code when a person downloads the app. But that zip code tends to be centered on the person’s home or work, so it offers no way of automatically locating that person as they go about their day unless they’ve set the app’s Location Services permission to “always.”
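The five steps above amount to a fallback chain: prefer the most accurate signal that is actually available. A minimal sketch of that selection logic follows; the source names and the shape of the `signals` dictionary are illustrative assumptions, not a real SDK API:

```python
# Sources ordered from most to least accurate, per the hierarchy above.
SOURCE_PRIORITY = [
    "a_gps",             # on-device assisted-GPS chip
    "wifi",              # wi-fi network signal
    "cell_tower",        # cell-tower triangulation
    "ip_address",        # IP-based geolocation
    "registration_zip",  # zip code supplied at app registration
]

def best_location_source(signals):
    """Return the most accurate source with a reading present.

    `signals` maps source name -> reading (or None if unavailable).
    Returns None when no source has produced a reading at all.
    """
    for source in SOURCE_PRIORITY:
        if signals.get(source) is not None:
            return source
    return None
```

So a device reporting both a wi-fi fix and an IP-based fix would be located by wi-fi, and a user who granted nothing but a registration zip code would fall all the way to the bottom of the chain.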
Location Data: Glass Half-Full? Or Half-Empty?
Overall, what have been the biggest changes in location scoring this time out? Has anything gotten particularly better or worse?
“It’s important to look at the data as a ‘two-year story,’” says Brett Kohn, Thinknear’s VP of Marketing and Product. “And the story is that the volume of location-based data in the ecosystem has grown exponentially — but that the overall quality of that data has not changed significantly.”
This has two implications for marketers who want to use geotargeting to drive visitation to specific stores.
On the plus side, marketers now have an enormous amount of accessible, high-quality data; the past issues of scale in location-based marketing are gone, Kohn says.
“On the minus side, to find the high-quality data, marketers must filter through all the dirty data to get to the good stuff,” he adds. “Most players in the space do a really poor job of this, which means brands and agencies end up paying for one thing, and getting something entirely different. Our case study in this LSI speaks to that point.”
The Biggest Improvements — And The Biggest Hurdles
Thinknear points to two key positive trends in the state of location data quality.
First, the industry trade groups are getting involved. Both the Interactive Advertising Bureau and the MMA have taken steps to investigate the issue and have formed very active committees to look at industry standards and education in the marketplace, Kohn notes.
“These efforts build awareness among buyers, who ultimately write the checks and influence the ad-tech community,” he says. “The second trend is that app developers are paying more and more attention to location. They are finally realizing that by providing higher-quality data, their users will have better in-app experiences and their monetization rates will improve.”
As far as the challenges that remain, the biggest hurdle of all is “the black box syndrome,” as Kohn puts it.
In other words, the number of competitors claiming that their location data solutions are the most accurate, without being able to say precisely why and how their claims should be given credence, has left brands and publishers feeling more suspicious than trusting.
“Big agencies and brands are being pitched the same thing by a large number of competitors,” Kohn says.
“Vendors and suppliers all talk about the ability to cleanse data, deliver accuracy, and hit the right audience,” he adds. “Our agency partners openly acknowledge the difficulty in validating what’s real and what’s not, which causes uncertainty in the market and slows down overall growth. As more dollars flow into mobile, brands are going to demand accountability. Small players who can’t innovate and large players who can’t validate their technology promises will slowly lose ground to transparent solutions that focus on creating value for brands as opposed to just driving up clickthrough rates.”
Expectations And Prognosis For Geo-Data Quality Rates
When Thinknear started the LSI, executives expected the issue of data accuracy to be resolved rather quickly, Kohn says. But two years later, the company acknowledges a long-term problem that won’t be going away any time soon.
“Brands need tools to reach their audiences on mobile, and location is one of the best tools in their arsenal,” Kohn says. “But if the data isn’t accurate, it all falls apart. We expect that the most forward-thinking brands will continue to evolve the use of location data beyond the original geofence use case. Location can drive new forms of behavioral, demographic, and predictive targeting that haven’t been available on mobile. That’s the next wave of location.”