Gravy’s Location Data Forensics: Ensuring That You’ll Get High Quality Location Data Every Time
April 4, 2023
Do you know the difference between raw location data and processed location data? One key difference is data quality: because it often includes irrelevant or erroneous information, raw data may be less accurate than data that has been properly processed and cleansed. As the leading provider of enterprise location intelligence, we at Gravy Analytics take data processing and quality seriously. So what does our approach to processing raw location data look like?
AdmitOne™: Ensuring High Quality Foot Traffic Data
Gravy Analytics processes billions of pseudonymous mobile location signals every single day. AdmitOne™, our location data processing engine, cleanses and prepares high quality location data for our Visitations, providing verified visit data that can be used in foot traffic analysis. AdmitOne™ uses location data forensics to filter our aggregated location data signals to provide accurate, useful, and privacy-friendly data. It is a patented process that first filters out spoofed and fraudulent signals, then eliminates duplicates for a single source of truth. AdmitOne™ then matches validated signal data to points-of-interest—such as retail stores, restaurants, hotels, and events—and applies proprietary analytics to confirm that each visit occurred before creating a verified visit. AdmitOne™ helps us ensure that you’ll be working with high quality location data and highly transparent datasets for analysis.
When low quality data points are marked with our forensic flags, they can be removed to produce a more accurate dataset. The accuracy and quality of location data depend on many factors, such as location accuracy and signal origin, and raw location data can be polluted by various forms of fraud or problematic signals. Using our forensic flags, you can identify and eliminate any such data points, ensuring the highest quality of location data for your needs.
Types of Fraud
Fraudulent data is costly and can affect the way researchers glean insights. It’s crucial that fraudulent location data be identified and removed from datasets to save our customers time and money. So, how can location data be fraudulent?
Fraud is inevitable, and flagging and removing fraudulent signals from our data is part of what makes our location intelligence solutions so effective. When it comes to fraud, it’s important to understand that just as location data is ever-evolving, forms of fraud are also evolving. Fraudsters can mimic mobile location data signals in several ways, and by studying signal characteristics we can begin to identify which signals are genuine and which are fraudulent. Our AdmitOne™ processing engine helps us spot when a signal is “bad” and flag it as problematic, suspicious, and/or fraudulent. So what are some of the categories of fraudulent or inaccurate signals that we may observe? Let’s take a closer look:
- Driving: When a signal is identified as traveling at high speed, we can conclude that the device was likely in a moving vehicle, which would skew a foot traffic dataset with signals that were only near the location of interest for short periods of time. This data is often excluded from a business’ foot traffic data.
- Cell Tower Derived Signals: If, for whatever reason, a mobile device cannot obtain a GPS signal, it may defer to cell tower triangulation. These signals are not accurate enough to include in foot traffic analysis.
- IP Address Signals: Similarly, these signals are not accurate enough for foot traffic analysis as they may be in the vicinity of a location of interest, rather than actually there.
- Spoofed Signals: If there are too many signals in the same location at the same time, it may indicate spoofed signals which are not real-life signals and should not be used.
- Replay and Device Duplicate Fraud: This is the form of fraud we’ve been identifying most often recently. It occurs when duplicate fraudulent signals are detected over time, sharing the same location, the same device identifier, and/or the same timestamp.
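The categories above can be approximated by a simple flagging routine. The sketch below is purely illustrative: the field names, the walking-speed cutoff, and the spoof-crowd threshold are assumptions, not Gravy's actual forensic rules.

```python
from collections import Counter

def forensic_flags(signals, spoof_threshold=50):
    """Attach a list of forensic flags to each signal dict.
    Thresholds are illustrative; real systems tune them empirically."""
    # Cross-signal context: count signals per (location, timestamp) cell.
    crowd = Counter((s["lat"], s["lon"], s["timestamp"]) for s in signals)
    seen = set()
    for s in signals:
        flags = []
        if s.get("speed_mps", 0) > 8:            # too fast to be on foot
            flags.append("driving")
        if s.get("source") == "cell_tower":      # triangulation, imprecise
            flags.append("cell_tower_derived")
        if s.get("source") == "ip_address":      # vicinity only, not presence
            flags.append("ip_address_derived")
        cell = (s["lat"], s["lon"], s["timestamp"])
        if crowd[cell] > spoof_threshold:        # implausible crowd: spoofing
            flags.append("possible_spoof")
        key = (s["device_id"], s["lat"], s["lon"], s["timestamp"])
        if key in seen:                          # exact repeat of a signal
            flags.append("device_duplicate")
        seen.add(key)
        s["flags"] = flags
    return signals
```

Downstream, a consumer can keep or drop flagged signals as their use case requires, e.g. `[s for s in signals if "driving" not in s["flags"]]` for a foot traffic study.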
More on Location Data Fraud
Repeat and duplicative signals are some of the newest forms of fraud that our team of experts is able to detect on a broad scale. Identifying emerging forms of fraud like this takes thorough examination and constant observation of signal characteristics. For example, did you know that there may be patterns to replayed device signals? In some cases, we’ve historically seen a replayed signal reappear in three-, four-, or even seven-day multiples. As our data forensics process becomes more sophisticated, and our “lookback windows” for replay and device duplicate fraud grow, the quality of our location data only increases.
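One way to hunt for day-multiple replay patterns like those described above is to check whether a (device, location) fingerprint reappears at an exact whole-day offset inside a lookback window. This is a sketch under assumed field names and periods, not Gravy's actual detection logic.

```python
DAY = 86_400  # seconds per day

def replay_suspects(signals, periods=(3, 4, 7), lookback_days=30):
    """Flag signals whose (device, lat, lon) fingerprint reappears at a
    whole-day offset that is a multiple of one of the given periods.
    The 3/4/7-day periods are illustrative, matching the patterns
    described in the text; a wider lookback window catches more."""
    history = {}   # fingerprint -> list of earlier timestamps
    suspects = []
    for s in sorted(signals, key=lambda s: s["timestamp"]):
        fp = (s["device_id"], s["lat"], s["lon"])
        for past_ts in history.get(fp, []):
            d_days, rem = divmod(s["timestamp"] - past_ts, DAY)
            if (rem == 0 and 0 < d_days <= lookback_days
                    and any(d_days % p == 0 for p in periods)):
                suspects.append(s)
                break
        history.setdefault(fp, []).append(s["timestamp"])
    return suspects
```

Growing `lookback_days` is the code-level analogue of the widening lookback windows mentioned above: the longer the history retained per fingerprint, the more replay cycles become detectable.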
If you’re currently considering location data for your company, questions about data quality and fraud are a great place to start the conversation, along with other top questions to ask location data providers.
Empowering Your Organization to Define “High Quality”
At the end of the day, you decide what signals are valuable to your business and what signals are irrelevant. Not every inaccurate, irrelevant, or problematic signal is thrown out; flagged signals can still carry significant value. Here’s an example. The University of Florida Transportation Institute (UFTI) utilized location data forensics to identify human movement patterns during a wildfire to better inform natural disaster response and management strategies. Using Gravy’s Observations, UFTI was able to study human mobility in wildfire evacuation zones by capitalizing on the transparency and ease of use that our forensic flags offer.
When UFTI researchers analyzed location data from before, during, and after wildfires, our forensic flags offered an added layer of information. Although signals flagged as “driving” are often irrelevant for most foot traffic analyses, in this case, UFTI chose to include drivers in their analysis. This provided more information on how people travel during wildfires, and it allowed researchers the opportunity to compare the movement patterns of on-foot travelers with drivers in evacuation zones. Our forensic flags and the data transparency that they offer ultimately created a new level of value to the data that UFTI analyzed.
To learn more about how UFTI enhanced their study with location data forensics, read the full case study for a deep dive.
High Quality Location Data
Problematic, suspicious, and fraudulent signals are some of the most critical differences distinguishing low quality location data from high quality location data. Our location data forensics can flag this data and allow you to exclude what is not useful for an added level of customization and quality. Fraud will always be present as we collect raw location data from various suppliers, but it gets identified and filtered out, resulting in data transparency and insights that you can trust.
While our location data forensics are a large part of what makes our data high quality, there are plenty of other factors contributing to the high quality of our location data solutions. For example, we developed PrivacyCheck, our own privacy-enhancing technology that flags signals from sensitive locations for removal to ensure that data generated by consumer mobile devices while visiting sensitive locations is never used or shared. PrivacyCheck is available to any organization that works with location data, offering an effective and efficient way to implement privacy-enhancing technology in a dynamic regulatory environment.
Data quality is just one advantage of properly processed data, and Gravy’s location data forensics are paramount in helping your company optimize your strategies to accomplish your goals with location intelligence.
If you’re looking to learn more about the advantages of processed data, download our infographic. And, if you’d like more information about Gravy Analytics and what we do, learn more by subscribing to our email newsletter.