In 1999, NASA lost its $125 million Mars Climate Orbiter when the spacecraft burned up in the Martian atmosphere. While the engineering and software were meticulously built to NASA's high standards and operated as intended, the data that put the spacecraft on the doomed trajectory was flawed.

The navigation team at the Jet Propulsion Laboratory used the metric system for its calculations, while Lockheed Martin Astronautics in Denver, which designed and built the spacecraft, provided crucial thruster data in English units. JPL's engineers assumed the thruster impulse data (expressed in English units of pound-force seconds) was in the metric measure of newton-seconds, and that mismatch sent the spacecraft on a doomed and costly flight.
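The mismatch is easy to quantify: one pound-force second equals about 4.45 newton-seconds, so an impulse value reported in English units but read as metric is understated by a factor of roughly 4.45. Here is a minimal illustration in Python (not NASA's actual flight software; the impulse value is hypothetical):

```python
# A minimal illustration of how a mislabeled unit corrupts a calculation.
# 1 pound-force second = 4.448222 newton-seconds, so impulse values
# reported in lbf*s but read as N*s are understated by a factor of ~4.45.

LBF_S_TO_N_S = 4.448222  # conversion factor: 1 lbf*s = 4.448222 N*s

def impulse_in_newton_seconds(value: float, unit: str) -> float:
    """Convert a thruster impulse reading to newton-seconds."""
    if unit == "lbf*s":
        return value * LBF_S_TO_N_S
    if unit == "N*s":
        return value
    raise ValueError(f"unknown unit: {unit}")

reported = 100.0  # hypothetical impulse value from the ground software
correct = impulse_in_newton_seconds(reported, "lbf*s")  # 444.8 N*s
assumed = impulse_in_newton_seconds(reported, "N*s")    # 100.0 N*s
print(f"understated by a factor of {correct / assumed:.2f}")  # ~4.45
```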

While most data quality mistakes don't end in the fiery destruction of a multimillion-dollar spacecraft, misunderstood data is costly for today's businesses. Data is the lifeblood of every company, helping it work better, work smarter, and reach its target audiences.

According to Gartner, spending on modern business intelligence (BI) and analytics continues to expand more rapidly than the overall market, offsetting declines in traditional BI spending.

Data quality remains the biggest challenge. While many companies are investing in BI visualization tools, they are not necessarily applying the same effort to the data itself, only to face frustration and disappointment when the 'right' data is not delivered.

Data Isn’t Treated as a Business Asset

Even though data is at the heart of business decisions, companies don't always handle their data as an enterprise asset. Data may be handled tactically, with databases and applications created as requested by individual business units. Enterprise-wide data dictionaries are rarely applied to enforce consistency in the meaning of fields, and departmental IT teams address issues in isolation from wider business goals. The overall approach is ad hoc, producing a fractured data landscape and leaving the business to question the reliability of its data.

Data Fuels Insights… Unless It's Wrong

Companies are often more focused on simply collecting data, losing sight of how to ensure its quality. Unreliable data undermines a business's ability to perform the meaningful analytics that support smart decision-making and efficient workflows. Quality data is required across the organization: for management, operations, compliance, and interaction with external partners, vendors, and customers.

Maintaining Good Data Quality

What makes good quality data? Data quality is measured by many factors, including:

- Accuracy: the data correctly describes the real-world entity or event it represents
- Completeness: all required fields and records are present
- Consistency: the same data carries the same value and meaning across systems
- Timeliness: the data is current enough for the decision at hand

Even a dataset that seems accurate and consistent can lead to poor results when there are missing fields or outdated data.
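Completeness and timeliness, in particular, lend themselves to automated checks. Here is a minimal sketch of such checks in Python; the field names and the 90-day freshness threshold are hypothetical:

```python
# A minimal sketch of automated data quality checks for completeness
# and timeliness. Field names and the freshness threshold are hypothetical.
from datetime import datetime, timedelta, timezone

REQUIRED_FIELDS = {"customer_id", "email", "updated_at"}
MAX_AGE = timedelta(days=90)  # hypothetical freshness threshold

def quality_issues(record: dict) -> list[str]:
    """Return completeness and timeliness problems found in one record."""
    issues = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    updated_at = record.get("updated_at")
    if updated_at and datetime.now(timezone.utc) - updated_at > MAX_AGE:
        issues.append("outdated: last updated more than 90 days ago")
    return issues

record = {"customer_id": 42,
          "updated_at": datetime(2020, 1, 1, tzinfo=timezone.utc)}
print(quality_issues(record))
# ['missing field: email', 'outdated: last updated more than 90 days ago']
```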

Maintaining high quality data is a real business challenge. It is further complicated by the dynamic nature of the many sources and devices that generate data, and by the enormous scale of the data itself.

Companies need to confront their data quality challenges before trust in their data erodes. Once that trust is lost, the doubt spreads quickly, and the data is questioned at every level of the organization.

Data Quality Case Study: An E-Commerce Company Misses a Key Event

Here’s a recent example of how data issues led an e-commerce company to make some costly business decisions.

The e-commerce company collects event data through its mobile app, feeding a central data repository that drives its analytics and customer strategy. Every page and every click is collected for analysis, including when products are added to or removed from a cart, how a user searches, and other user interactions in the app. A single page can potentially generate hundreds of events.
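For illustration, one such event might look like this (the field names and values are hypothetical, not the company's actual schema):

```python
# A hypothetical add-to-cart event as it might arrive from the mobile app.
# Field names and values are illustrative, not the company's actual schema.
event = {
    "event_type": "add_to_cart",
    "timestamp": "2024-03-15T14:22:07Z",
    "user_id": "u_18c2f0",
    "session_id": "s_9ab41d",
    "product_id": "sku_55731",
    "app_version": "4.2.0",
    "os": "iOS",
    "os_version": "17.3",
    "page": "product_detail",
}
```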

When a new version of the app was deployed, a bug caused it to silently drop some event data for certain iOS versions. Because of the sheer volume of data being collected, the gap went unnoticed for several weeks. As a result, the business perceived a drop in purchases (when in reality the opposite had occurred) and reacted by increasing the marketing budget for a specific product.

Unfortunately, there was no need for that increased marketing investment, and the money would have been better spent elsewhere.
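One way to catch this class of problem is to monitor event volume per segment rather than only in aggregate: a drop confined to a single OS version stands out even when the overall total looks healthy. Here is a minimal sketch of such a check, using entirely hypothetical counts:

```python
# A minimal sketch of a per-segment volume check. It flags any OS version
# whose daily event count falls far below its own recent baseline, even
# when the aggregate total looks normal. All data here is hypothetical.
from statistics import mean

def flag_drops(history: dict[str, list[int]], today: dict[str, int],
               threshold: float = 0.5) -> list[str]:
    """Return OS versions whose count today fell below threshold * baseline."""
    alerts = []
    for os_version, counts in history.items():
        baseline = mean(counts)
        if today.get(os_version, 0) < threshold * baseline:
            alerts.append(os_version)
    return alerts

history = {"iOS 16": [9800, 10100, 9950], "iOS 17": [20300, 19800, 20100]}
today = {"iOS 16": 9900, "iOS 17": 4200}  # iOS 17 events silently dropped
print(flag_drops(history, today))  # ['iOS 17']
```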

The Neverending Story: Ensuring Quality Data

Artificial intelligence can rapidly transform vast volumes of big data into trusted business information. Problems can be addressed immediately, instead of after weeks of inaccurate reporting. Renewed trust in the quality of your data directly impacts business priorities.

Anodot's AI-powered analytics solution automatically learns the normal behavior of each data stream and flags any abnormal behavior. With Anodot, changes that can impact data quality trigger immediate alerts so they can be addressed, preventing wasted time and energy and ensuring that decisions are made based on complete and accurate data.
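Anodot's actual models are proprietary, but the underlying idea of learning a baseline and flagging deviations can be sketched in a few lines. This toy version uses a trailing mean and standard deviation with a z-score threshold; it stands in for, and vastly simplifies, what a production system does:

```python
# A minimal illustration of baseline-and-deviation anomaly flagging.
# Anodot's actual models are proprietary and far more sophisticated;
# this sketch just learns a mean/std baseline and flags large z-scores.
from statistics import mean, stdev

def anomalies(series: list[float], window: int = 30, z_max: float = 3.0):
    """Yield (index, value) pairs that deviate sharply from the trailing window."""
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(series[i] - mu) / sigma > z_max:
            yield i, series[i]

# A steady metric with normal jitter, followed by a sudden drop.
stream = [100, 102, 98, 101, 99] * 8 + [12]
print(list(anomalies(stream)))  # [(40, 12)]
```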

Written by Anodot

Anodot leads in Autonomous Business Monitoring, offering real-time incident detection and innovative cloud cost management solutions with a primary focus on partnerships and MSP collaboration. Our machine learning platform not only identifies business incidents promptly but also optimizes cloud resources, reducing waste. By reducing alert noise by up to 95 percent and slashing time to detection by as much as 80 percent, Anodot has helped customers recover millions in time and revenue.
