Anodot Resources

Videos & Podcasts 26 min read

Anodot Enables Credit Karma to Manage Anomaly Detection at Scale

Credit Karma shares the benefits of using Anodot to easily track data, get real-time alerts, and maintain fair and healthy business monitoring.
Blog Post 5 min read

Travel Businesses Should Clean Up Their (Data) Act

Big data brings endless opportunities for the travel industry, but this ever-changing field also brings with it many challenges. For direct booking travel businesses and travel aggregators, microscopic pricing and performance advantages will determine each sale in today's competitive market. The key is the data. Here are three tips to help online travel booking companies and travel aggregators steer clear of glitches and dirty data, and raise the volume of sales.

There's no need to explain the Online Travel Agency (OTA) business model, how it has revolutionized the travel industry, or how these companies (and later their offshoot travel metasearch engines) have utterly dominated online travel for nearly two decades. But recently, we've seen a tectonic shift in online travel. The travel industry, like many others, has amassed an inordinate amount of data on its consumers, flights, hotels, experiences, loyalty programs, complaints and more. With customers creating so much valuable data at every stage of their journey, how can travel companies do more to collect and connect these data points to improve the customer experience?

Airlines, hotels, travel agencies, aggregators and others in the travel business need to do what they do best: provide outstanding value and respond to rapidly changing markets. To do this, they need to ensure that the data they're getting from all parties is clean, that each of the APIs that nourish their business is glitch-free, and that they can respond agilely yet accurately. The thing is, when you're running a travel business that processes nearly 50,000 events per second and draws on data from thousands of parallel services, it's tough to tell when something goes wrong.

Dirty Data + Glitches = Lost Business

In general, research has shown that 40% of anticipated business value is lost owing to poor data quality. Gartner estimates that dirty data results in losses approaching $10 million a year for the average company. Online travel businesses operate in a space where handling an extremely high volume of data from diverse and disparate sources at tremendous velocity is all in a day's work. The sheer scale of that data magnifies the impact of glitches and dirty data, potentially cutting deep into the bottom line. It is also what makes it nearly impossible for data analytics teams to identify and rectify revenue-siphoning problems in real time. For example, a sudden drop in sales of travel packages during the pre-holiday rush could be the result of a simple pricing error, a negative trend on social media, a server glitch causing slow load times, a coding error that makes checkout impossible, a handshake problem with one of the hundreds of travel provider APIs, an error in the payment gateway, an error affecting just one browser on one operating system... the list goes on.

The Road to Regaining Sales

More than ever, travel businesses and aggregators need to overcome the challenge of spotting dirty data and glitches before they hurt the bottom line, and to maximize the value derived from big data as a whole. This is a great first step on the road to regaining sales volume lost to direct travel providers. To start down this road, make sure that you're:

Watching customers to identify problems faster – Sound intuitive? It should be. But not all online businesses can yet track in aggregate exactly what customers are doing on their sites or in their apps. This is data that tells a story. It can show you pages, page elements, or products with sudden drops or spikes in traffic. By way of example, a sudden run-up in sales offset by a drop-off in revenue for a given vacation destination could indicate page-level mis-pricing, while a drop-off in sales for customers using a particular Android version might mean a version-specific glitch in your app (see the sketch after this post).

Closely monitoring secondary data sources – For travel businesses, nearly all data is external data. Yet beyond the mission-critical core data from hotels, airlines, car rental agencies, and others, there's a world of relevant secondary-source data to monitor. Need examples? How about competitor advertising bid data, weather data (which can cause revenue-impacting power outages or discourage travel to certain destinations), fraud detection and security data, and more. This is data that can quickly and dramatically affect travel plans, and you need to be able to analyze trends in real time to understand how best to respond.

Listening to social media buzz – Social media monitoring is not a vanity metric. Used correctly, this data has real value for sales and marketing operations. Correlating social media data with changes in demand for travel packages or destinations in real time can offer valuable insights. For example, if a celebrity promotes a certain destination on social media, you need to quickly identify the actual impact on your specific business. That way, you can more effectively leverage the momentum to tailor inventory to meet expected demand, tweak pricing to further drive demand, tactically bundle products to grow overall package profitability, and more.

The Bottom Line

Identifying dirty data, catching glitches, and leveraging data assets in the hyper-complex, multifaceted travel ecosystem is a task that stretches the limits of human capability. Travel is a fiercely competitive industry where speed is always of the essence, and even marginal advantages can reap great rewards. The key to unlocking those rewards is the use of reliable data to enhance the customer experience. To meet the challenge, direct booking players and aggregators need to seek out and adopt technological solutions with the speed, accuracy and agility to make a difference in their bottom line and shape future operations.
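As a rough illustration of the first tip, here is a minimal Python sketch of watching aggregate customer metrics per dimension (a destination page, an app version) and flagging sudden drops or spikes against a trailing baseline. The metric names, sample values, and z-score threshold are hypothetical, not Anodot's method; a production system would learn baselines and seasonality automatically rather than rely on a fixed window and threshold.

```python
from statistics import mean, stdev

# Hypothetical hourly booking counts per dimension value (e.g. a destination
# page or an Android app version); the last entry is the most recent hour.
bookings_by_dimension = {
    "destination:lisbon": [120, 118, 125, 122, 119, 54],   # sudden drop
    "android:12.3.1":     [310, 305, 298, 312, 307, 303],  # steady
}

def flag_sudden_change(series, z_threshold=3.0):
    """Return a z-score if the latest value deviates from the trailing
    baseline by more than z_threshold standard deviations, else None."""
    baseline, latest = series[:-1], series[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return None
    z = (latest - mu) / sigma
    return z if abs(z) > z_threshold else None

for dimension, series in bookings_by_dimension.items():
    z = flag_sudden_change(series)
    if z is not None:
        direction = "drop" if z < 0 else "spike"
        print(f"ALERT: {dimension} shows a sudden {direction} (z = {z:.1f})")
```

In practice the same check would need to run across thousands of metric and dimension combinations at once, which is exactly where fixed manual thresholds stop scaling.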
Videos & Podcasts 24 min read

When a Spike in Sales is Not a Good Sign

In this session at the Strata Data Conference, Anodot's Inbal Tadeski presents the importance of identifying relationships between time series metrics so that they can be used for predictions, root cause diagnosis, and more.
Blog Post 6 min read

The End to a Never-Ending Story? Improve Data Quality with AI Analytics

Data is a crucial asset for most businesses. But that data is worthless if it is unreliable. While data supports major business initiatives, the level of data accuracy in many organizations is still low. Saul Judah, research director at Gartner, explained, "Poor data quality contributes to a crisis in information trust and business value, such as financial and operational performance. Unless tangible and pragmatic steps are taken to understand, address and control data quality, the situation will worsen."

In our previous two posts in this series, we discussed the challenges of and need for maintaining high levels of data quality, followed by an explanation of the different types of costs an organization can suffer due to poor data quality. Now we'll explore how an AI analytics solution can help identify data quality issues before compromised data impacts critical business decisions.

Quality of Data is Critical for Business

The quality of the data a business uses is critical for improving customer experience, enhancing business processes, finding new market opportunities, and enabling new initiatives. Organizations that put a premium on high data quality can leverage their data as a valuable competitive asset, increasing efficiency, enhancing customer service, and, ultimately, driving revenue. On the other hand, organizations with poor data quality, working from outdated, inconsistent, and flawed data, end up wasting time on contradictory reports and inaccurate business plans, and end up making misguided (and damaging) decisions. Poor data quality destroys business value, as reported by Gartner: "Recent Gartner research indicates that the average financial impact of poor data quality on organizations is $9.7 million per year. This is likely to worsen as information environments become increasingly complex." (How to Create a Business Case for Data Quality Improvement, January 9, 2017)

Make Sure Data is Reliable

Quality data can speak volumes about your business. As noted, companies collect tons of data, and in the process they usually need to translate and standardize that data in order to send it to, and catalog it in, data warehouses. With so many analytics and decisions based on this data, its accuracy is critical. As our customer Pedro Silva at Credit Karma recently shared, "We ingest over 5 terabytes of data per day. The quality of data is critical to business decisions. So we need to make sure that the quality is always there."

Event data is collected from multiple sources. In some cases, every single event that happens on mobile devices is collected and stored in data warehouses. With this volume of information being processed at a demanding pace, there is room for errors to creep in. Such errors include:

Stuck values: events that suddenly stop being reported
Inaccurate values: events are still reported, but there is a problem in translation
Wrong data: a sudden peak in the number of nulls
Missing data: empty fields
Implausible values: interpolated data that is "just too good to be true"

The result can be decisions taken based on wrong data, and such lapses may only be discovered weeks after those decisions have been implemented. This causes a lot of damage: by the time the data is located, tidied, sorted and re-applied, it may be virtually out of date and no longer relevant for decisions.
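To make the error types above concrete, here is a minimal, hypothetical Python sketch of the kind of checks an ingestion pipeline could run on each batch of events. The field names, thresholds, and sample events are invented for illustration and are not drawn from Anodot or Credit Karma.

```python
# Hypothetical batch of ingested events; in a real pipeline these would stream
# in from mobile apps and other sources on their way to the warehouse.
events = [
    {"event": "purchase", "amount": 19.99, "country": "US"},
    {"event": "purchase", "amount": None,  "country": "US"},
    {"event": "purchase", "amount": 19.99, "country": None},
    {"event": "purchase", "amount": 19.99},                  # missing field
]

def check_batch(events, required_fields=("event", "amount", "country"),
                max_null_ratio=0.1):
    """Flag common data quality problems in one batch of events."""
    issues = []
    if not events:
        # Stuck values: the source suddenly stopped reporting anything.
        issues.append("Stuck values: no events reported in this batch")
        return issues
    for field in required_fields:
        missing = sum(1 for e in events if field not in e)
        nulls = sum(1 for e in events if e.get(field) is None)
        if missing:
            issues.append(f"Missing data: '{field}' absent from {missing} of {len(events)} events")
        if nulls / len(events) > max_null_ratio:
            issues.append(f"Wrong data: null ratio for '{field}' is {nulls / len(events):.0%}")
    return issues

for issue in check_batch(events):
    print("ALERT:", issue)
```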
Innovative Solutions for Data Quality Challenges

Rather than stick with established and standard BI reporting tools, organizations should consider more innovative solutions to better address their data quality processes. Adds Forrester analyst Michele Goetz, "The data quality solutions market is growing because more enterprise architecture professionals see data quality as a way to address their top challenges. In large part, this market growth is due to the fact that EA pros increasingly trust data quality solution providers to act as strategic partners that advise them on top data management and business decisions." (The Forrester Wave™: Data Quality Solutions, Q4 2015)

A Smarter Way to Approach Data Quality Problems

Since at the end of the day everything is data, a smarter way to approach data quality problems is through AI analytics, leveraging anomaly detection. Anomaly detection flags "bad" data by identifying suspicious anomalies that can impact data quality. By tracking and evaluating data, anomaly detection gives critical insights into data quality as the data is processed.

Collecting Time Series Data

Today's fastest growing (and most data-intensive) industries use time series data sets, tracking business metrics over time. For data quality problems, you want to know about business metric shifts: changing error rates, missing data, gaps in the event data or time series, value spikes, signal noise, data that wasn't updated, or inaccurate data formats. Time series data is the basic input of an automated anomaly detection system. Anomaly detection can find actionable signals in time series data because those signals often take the form of anomalies (i.e., unexpected deviations in the data), highlighting data quality problems in real time. Missing event data can otherwise go undiscovered for weeks, so using AI analytics to preserve data quality should be on everyone's agenda.

AI Analytics Puts Trust Back in Your Data

Large data sets have large data quality issues. When you are dealing with millions of data points, it is a challenge to know where things are changing amid so many permutations of business metrics and dimensions. This needs to be processed on an analytics platform that can efficiently run detection algorithms at multiple steps of the data pipeline, identifying data quality issues and business metric shifts. An AI analytics solution can address data integrity issues at the earliest point of data processing, rapidly transforming vast volumes of data into trusted business information.

Anodot's real-time, large-scale AI analytics solution is fully automated at each step in the data collection process: detection, ranking, and grouping, issuing concise alerts about changes in key business metrics such as missing data, unexpected data types, nulls where there shouldn't be any, or malformed records. Based on these notifications, if you suspect that things are not quite right with your data, you can quickly focus on the specific issue and decide how to proceed. This level of granularity can help companies spot very specific anomalies in data quality, especially those that would be smoothed out or go unnoticed in broader metrics like averages and company-wide totals.

Data credibility is one of the most valuable assets organizations have today, and proactive alerts about the quality and integrity of that data can save resources and open up valuable opportunities.
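As a simplified illustration of anomaly detection on a single time series metric, the sketch below scores each new point against a trailing window using a rolling mean and standard deviation. This is a stand-in for the idea only, not Anodot's algorithm, which learns each metric's normal behavior automatically; the window size and threshold here are arbitrary example values.

```python
from statistics import mean, stdev

def detect_anomalies(series, window=24, z_threshold=3.0):
    """Yield (index, value, z-score) for points that deviate strongly from
    the trailing window of previous points."""
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue
        z = (series[i] - mu) / sigma
        if abs(z) > z_threshold:
            yield i, series[i], z

# Hypothetical hourly count of ingested events: steady around 1,000, then a
# sudden gap where collection silently stops (a "missing data" incident).
event_counts = [1000 + (i % 7) * 3 for i in range(48)] + [0, 0, 0]

for i, value, z in detect_anomalies(event_counts):
    print(f"hour {i}: count={value} deviates from the baseline (z = {z:.1f})")
```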
Case Studies 2 min read

Case Study: Anodot Handles the Most Pressing Printing Problems

Read this case study to see how a leading digital printing company was able to use Anodot to minimize revenue lost to operational malfunctions.
Videos & Podcasts 30 min read

The Value in Detecting Real-Time Business Incidents for Every Industry

Uri Maoz, Anodot VP of U.S. Sales & Business Development, offers a variety of use cases across industries for real-time anomaly detection in business data. Presented at the 2017 Innovation Enterprise Chief Data Officer Summit.
Blog Post 4 min read

Ensuring Data Quality is a Big Challenge for Business

In 1999, NASA lost its $125 million Mars Climate Orbiter when the spacecraft burned up in the Martian atmosphere. While the engineering and software were meticulously built to NASA's high standards and operated as intended, the data that put the spacecraft on its doomed trajectory was flawed. The navigation team at the Jet Propulsion Laboratory used the metric system for its calculations, while Lockheed Martin Astronautics in Denver, which designed and built the spacecraft, provided crucial acceleration data in the English system of inches, feet and pounds. JPL's engineers assumed the acceleration data (measured in English units of pound-force seconds) was in the metric measure (newton-seconds) and sent the spacecraft on a doomed and costly flight.

While most data quality mistakes don't end in the fiery destruction of a multi-million dollar spacecraft, misunderstood data is costly for today's businesses. Data is the lifeblood of every company, helping companies work better, work smarter, and reach their target audiences. According to Gartner, modern business intelligence (BI) and analytics continues to expand more rapidly than the overall market, offsetting declines in traditional BI spending. Yet data quality is still the biggest challenge. While many companies are investing in BI visualization tools, they are not necessarily applying the same effort to the data itself, and they face frustration and disappointment when the 'right' data is not there.

Data Isn't Treated as a Business Asset

Even though data is at the heart of business decisions, companies don't always handle their data as an enterprise asset. Data may be handled tactically, with databases and applications created as requested by a business unit. Enterprise-wide data dictionaries are rarely applied to enforce consistency in the meaning of fields, and departmental IT teams address issues in isolation from wider business goals. The overall approach is ad hoc, leading to a fractured data system and leaving the business to question the reliability of its data.

Data Fuels Insights...Unless it's Wrong

Companies are often more focused on simply collecting data, losing sight of how to ensure its quality. Unreliable data undermines a business's ability to perform meaningful analytics that support smart decision-making and efficient workflows. Quality data is required across the organization: for management, operations, compliance, and interaction with external partners, vendors, and customers.

Maintaining Good Data Quality

What makes good quality data? Data quality is measured by many factors, including accuracy, completeness, consistency, and timeliness. Even a dataset that seems accurate and consistent can lead to poor results when there are missing fields or outdated records. Maintaining high quality data is a real business challenge, further complicated by the dynamic nature of different data-generating sources and devices and by the enormous scale of the data itself. Companies need to confront their data quality challenges before trust in their data erodes. Once trust in data is lost, that doubt spreads, leading to questions at all levels of the organization.

Data Quality Case Study: An E-Commerce Company Misses a Key Event

Here's a recent example of how data issues led an e-commerce company to make some costly business decisions. The company collects event data through its mobile app, feeding a central data repository that drives its analytics and customer strategy.
Every page and every click is collected for analysis, including when products are added to or removed from a cart, how a user searches, and other user interactions on the site; there are potentially hundreds of events from each page. When a new version of the app was deployed, it had a bug that failed to collect some event data for certain iOS versions. Because of the large volume of data being collected, the missing data wasn't noticed and went unidentified for several weeks. As a result, the business perceived a drop in purchases (when in reality the opposite had occurred) and reacted by increasing the marketing budget for a specific product. In reality, there was no need for that increased marketing investment, and the money would have been better spent elsewhere (see the sketch after this post).

The Neverending Story: Ensuring Quality Data

Artificial intelligence can be used to rapidly transform vast volumes of big data into trusted business information, letting you address problems immediately and saving weeks of inaccurately reported data. Renewed trust in the quality of your data directly impacts business priorities. Anodot's AI-powered analytics solution automatically learns the normal behavior of each data stream and flags any abnormal behavior. Using Anodot, changes that can impact data quality are immediately alerted on so that they can be addressed, preventing wasted time and energy and ensuring that decisions are made based on complete and accurate data.
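For a sense of the kind of check that could have caught the missing iOS event data above, here is a small, hypothetical Python sketch that compares event volume per platform/version segment against the previous period, so a drop confined to a few segments stands out even when the overall total looks healthy. The segment names, counts, and 50 percent threshold are invented for the example and are not taken from the case itself.

```python
# Hypothetical daily "add to cart" event counts per platform/version segment,
# before and after a new app release.
last_week = {"ios_16": 48000, "ios_17": 5000, "android_14": 64000}
this_week = {"ios_16": 49500, "ios_17": 300,  "android_14": 66000}

def segment_drops(before, after, max_drop=0.5):
    """Return segments whose event count fell by more than max_drop."""
    drops = {}
    for segment, previous in before.items():
        current = after.get(segment, 0)
        if previous and (previous - current) / previous > max_drop:
            drops[segment] = (previous, current)
    return drops

# The overall total barely moves, which is why the bug could go unnoticed...
total_before, total_after = sum(last_week.values()), sum(this_week.values())
print(f"overall volume changed by {(total_after - total_before) / total_before:+.1%}")

# ...but a segment-level comparison flags the broken iOS version immediately.
for segment, (previous, current) in segment_drops(last_week, this_week).items():
    print(f"ALERT: {segment} events fell from {previous} to {current}")
```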
Documents 3 min read

Case Study: Anodot's Useful eCommerce Insights for Wix

Wix needed a real-time alert system that would flag issues in key metrics without manual threshold settings. Anodot proved to be the system required for providing the necessary insights to the company's analysts.
Videos & Podcasts 19 min read

Identify eCommerce Losses and Opportunities with Machine Learning

Anodot Sr. Director of Customer Success Nir Kalish presents how eCommerce organizations can leverage the powerful anomaly detection capabilities of Anodot's AI analytics to proactively address business incidents in real time, protect revenue, and keep customers happy.