Anodot Resources Page 42


Videos & Podcasts 24 min read

When a Spike in Sales is Not a Good Sign

In this session at the Strata Data Conference, Anodot's Inbal Tadeski explains the importance of identifying relationships between time series metrics so they can be used for prediction, root cause diagnosis, and more.
Blog Post 6 min read

The End to a Never-Ending Story? Improve Data Quality with AI Analytics

Data is a crucial asset for most businesses, but that data is worthless if it is unreliable. While data supports major business initiatives, the level of data accuracy at many organizations is still low. Saul Judah, research director at Gartner, explained, "Poor data quality contributes to a crisis in information trust and business value, such as financial and operational performance. Unless tangible and pragmatic steps are taken to understand, address and control data quality, the situation will worsen." In the previous two posts in this series, we discussed the challenges of maintaining high levels of data quality, followed by the different types of costs an organization can suffer due to poor data quality. Now we'll explore how an AI analytics solution can help identify data quality issues before compromised data impacts critical business decisions.

Quality of Data is Critical for Business

The quality of the data a business uses is critical for improving customer experience, enhancing business processes, finding new market opportunities, and enabling new initiatives. Organizations that put a premium on high data quality can leverage their data as a valuable competitive asset, increasing efficiency, enhancing customer service, and, ultimately, driving revenue. On the other hand, organizations with poor data quality, working from outdated, inconsistent, and flawed data, waste time reconciling contradictory reports and inaccurate business plans, and end up making misguided (and damaging) decisions. Poor data quality destroys business value, as Gartner reports: "Recent Gartner research indicates that the average financial impact of poor data quality on organizations is $9.7 million per year. This is likely to worsen as information environments become increasingly complex." (How to Create a Business Case for Data Quality Improvement - January 9, 2017)

Make Sure Data is Reliable

Quality data can speak volumes about your business. As noted, companies collect tons of data, and in the process they usually need to translate and standardize that data in order to send it to, and catalog it in, data warehouses. With so many analytics and decisions based on this data, its accuracy is critically important. As our customer Pedro Silva at Credit Karma recently shared, "We ingest over 5 terabytes of data per day. The quality of data is critical to business decisions. So we need to make sure that the quality is always there." Event data is collected from multiple sources; in some cases, every single event that happens on a mobile device is collected and stored in data warehouses. With this volume of information being processed at a demanding pace, there is room for errors to creep in, such as:

Stuck values: events that suddenly stop being reported
Inaccurate values: events are still reported, but there is a problem in translation
Wrong data: a spike in the number of nulls
Missing data: empty fields
Implausible values: interpolated data that is "just too good to be true"

The result can be decisions based on wrong data, and such lapses may only be discovered weeks after the decisions have been implemented. This causes a lot of damage: by the time the bad data is located, cleaned, sorted and re-applied, it may be out of date and no longer relevant for decision-making.

Innovative Solutions for Data Quality Challenges

Rather than stick with established and standard BI reporting tools, organizations should consider more innovative solutions to better address their data quality processes.
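To make the error types above concrete, here is a minimal sketch of how such checks could look in code. This is an illustration only, not Anodot's implementation; the function names, window sizes, and thresholds are our own assumptions:

```python
from typing import List, Optional

def is_stuck(values: List[float], window: int = 5) -> bool:
    """Stuck values: the last `window` readings are all identical."""
    tail = values[-window:]
    return len(tail) == window and len(set(tail)) == 1

def null_spike(values: List[Optional[float]], threshold: float = 0.2) -> bool:
    """Wrong data: the fraction of nulls exceeds a threshold."""
    if not values:
        return False
    return values.count(None) / len(values) > threshold

def has_gaps(timestamps: List[int], expected_step: int = 60) -> bool:
    """Missing data: consecutive event timestamps are further
    apart than the expected reporting interval (in seconds)."""
    return any(b - a > expected_step
               for a, b in zip(timestamps, timestamps[1:]))
```

Rule-based screens like these catch the obvious failures; the harder cases (gradual drift, implausible-but-valid values) are where learned anomaly detection earns its keep.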
Forrester analyst Michele Goetz adds, "The data quality solutions market is growing because more enterprise architecture professionals see data quality as a way to address their top challenges. In large part, this market growth is due to the fact that EA pros increasingly trust data quality solution providers to act as strategic partners that advise them on top data management and business decisions." (The Forrester Wave™: Data Quality Solutions, Q4 2015)

A Smarter Way to Approach Data Quality Problems

Since at the end of the day everything is data, a smarter way to approach data quality problems is through AI analytics, leveraging anomaly detection. Anomaly detection flags "bad" data by identifying suspicious anomalies that can impact data quality. By tracking and evaluating data as it is processed, anomaly detection provides critical insight into data quality.

Collecting Time Series Data

Today's fastest growing (and most data-intensive) industries use time series data sets, tracking business metrics over time. For data quality problems, you want to know about business metric shifts: when error rates change, data goes missing, gaps appear in the event stream, values spike, signals turn noisy, data isn't updated, or data arrives in the wrong format. Time series data is the basic input of an automated anomaly detection system. Anomaly detection can find actionable signals in time series data because those signals often take the form of anomalies (i.e., unexpected deviations in the data), highlighting data quality problems in real time. Missing event data can otherwise go undiscovered for weeks, so using AI analytics to preserve data quality should be on everyone's agenda.

AI Analytics Puts Trust Back in Your Data

Large data sets have large data quality issues. When you are dealing with millions of data points, it is a challenge to know where things are changing among so many permutations of business metrics and dimensions. This calls for an analytics platform that can efficiently run detection algorithms at multiple steps of the data pipeline, identifying data quality issues and business metric shifts. An AI analytics solution can address data integrity issues at the earliest point of data processing, rapidly transforming vast volumes of data into trusted business information. Anodot's real-time, large-scale AI analytics solution is fully automated at each step of the process: detection, ranking, and grouping, issuing concise alerts about changes in key business metrics such as missing data, unexpected data types, nulls where there shouldn't be any, or malformed records. Based on these notifications, you can quickly focus on the specific issue and decide how to proceed. This level of granularity helps companies spot very specific anomalies in data quality, especially those that would be smoothed out or left unnoticed by wider metrics like averages and company-wide totals. Data credibility is one of the most valuable assets organizations have today; proactive alerts about the quality and integrity of that data can save resources and open up valuable opportunities.
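As a rough illustration of the idea of anomaly detection on a single time series metric (this is not Anodot's algorithm, which automatically learns each metric's normal behavior), a rolling z-score can flag values that deviate sharply from their recent baseline:

```python
import statistics
from typing import List

def rolling_zscore_anomalies(series: List[float], window: int = 10,
                             threshold: float = 3.0) -> List[int]:
    """Return indices of points deviating more than `threshold`
    standard deviations from the mean of the preceding window."""
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = statistics.fmean(baseline)
        stdev = statistics.pstdev(baseline)
        if stdev == 0:
            continue  # flat baseline: z-score undefined
        if abs(series[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies
```

A fixed window and threshold like this would misfire on seasonal or trending metrics, which is exactly why production systems learn a per-metric model of normal behavior instead.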
Case Studies 2 min read

Case Study: Anodot Handles the Most Pressing Printing Problems

Read this case study to see how a leading digital printing company was able to use Anodot to minimize lost revenue from operational malfunctions.
Videos & Podcasts 30 min read

The Value in Detecting Real-Time Business Incidents for Every Industry

Uri Maoz, Anodot VP U.S. Sales & Business Development, offers a variety of use cases across industries for real-time anomaly detection in business data. Presented at the 2017 Innovation Enterprise Chief Data Officer Summit.
Blog Post 4 min read

Ensuring Data Quality is a Big Challenge for Business

In 1999, NASA lost its $125 million Mars Climate Orbiter when the spacecraft burned up in the Martian atmosphere. While the engineering and software were meticulously built to NASA's high standards and operated as intended, the data that put the spacecraft on its doomed trajectory was flawed. The navigation team at the Jet Propulsion Laboratory used the metric system for its calculations, while Lockheed Martin Astronautics in Denver, which designed and built the spacecraft, provided crucial acceleration data in English units of inches, feet and pounds. JPL's engineers assumed the acceleration data (measured in English units of pound-seconds) was in the metric measure of force and sent the spacecraft on a doomed and costly flight. While most data quality mistakes don't end in the fiery destruction of a multi-million dollar spacecraft, misunderstood data is costly for today's businesses. Data is the lifeblood of every company, helping companies work better, work smarter, and reach their target audiences. According to Gartner, modern business intelligence (BI) and analytics continues to expand more rapidly than the overall market, offsetting declines in traditional BI spending. Yet data quality is still the biggest challenge. While many companies are investing in BI visualization tools, they are not necessarily applying the same effort to the data itself, and they face frustration and disappointment when the 'right' data is not processed.

Data Isn't Treated as a Business Asset

Even though data is at the heart of business decisions, companies don't always handle their data as an enterprise asset. Data may be handled tactically, with databases and applications created as requested by a business unit. Enterprise-wide data dictionaries are rarely applied to enforce consistency in the meaning of fields, and departmental IT teams address issues in isolation from wider business goals. The overall approach is ad hoc, leading to a fractured data system and leaving the business to question the reliability of its data.

Data Fuels Insights... Unless It's Wrong

Companies are often more focused on simply collecting data, losing sight of how to ensure its quality. Unreliable data undermines a business' ability to perform meaningful analytics that support smart decision-making and efficient workflows. Quality data is required across the organization: for management, operations, compliance, and interaction with external partners, vendors, and customers.

Maintaining Good Data Quality

What makes good quality data? Data quality is measured by many factors. Even a dataset that seems accurate and consistent can lead to poor results when there are missing fields or outdated data. Maintaining high quality data is a real business challenge, further complicated by the dynamic nature of different data generation sources and devices, and by the enormous scale of the data itself. Companies need to confront their data quality challenges before trust in their data erodes; once trust in data is lost, doubt spreads, and all levels of the organization begin to question it.

Data Quality Case Study: An E-Commerce Company Misses a Key Event

Here's a recent example of how data issues led an e-commerce company to make some costly business decisions. The company collects event data through its mobile app, feeding a central data repository that drives its analytics and customer strategy. Every page and every click is collected for analysis, including when products are added to or removed from a cart, how a user searches, and other user interactions on the site; there are potentially hundreds of events from each page. When a new version of the app was deployed, it had a bug that failed to collect some event data for certain iOS versions. Because of the large volume of data collected, the missing data wasn't noticed and went unidentified for several weeks. As a result, the business perceived a drop in purchases (while in reality the opposite occurred), and in reaction it increased the marketing budget for a specific product. In reality there was no need for that increased marketing investment, which would have been better spent elsewhere.

The Neverending Story: Ensuring Quality Data

Artificial intelligence can be used to rapidly transform vast volumes of big data into trusted business information. You can immediately address problems, saving weeks of inaccurately reported data. Renewed trust in the quality of your data directly impacts business priorities. Anodot's AI-powered analytics solution automatically learns the normal behavior of each data stream and flags any abnormal behavior. With Anodot, changes that can impact data quality are alerted on immediately so they can be addressed, preventing wasted time and energy and ensuring that decisions are based on complete and accurate data.
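A collection bug like the one in the case study above could be surfaced by monitoring event volume per segment (e.g., per OS version) against its recent baseline. The sketch below is purely illustrative, not Anodot's method; the segment names, counts, and drop ratio are made up:

```python
from typing import Dict, List

def dropped_segments(history: Dict[str, List[int]],
                     drop_ratio: float = 0.5) -> List[str]:
    """Flag segments whose latest event count fell below
    `drop_ratio` of their trailing average."""
    flagged = []
    for segment, counts in history.items():
        if len(counts) < 2:
            continue  # not enough history to form a baseline
        baseline = sum(counts[:-1]) / len(counts[:-1])
        if baseline > 0 and counts[-1] < drop_ratio * baseline:
            flagged.append(segment)
    return flagged
```

Tracking each segment separately is the key point: the iOS drop in the story was invisible in the company-wide total, but obvious at the per-segment level.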
Documents 3 min read

Case Study: Anodot's Useful eCommerce Insights for Wix

Wix needed a real-time alert system that would flag issues in key metrics without manual threshold settings. Anodot proved to be the system required to provide the necessary insights to the company's analysts.
Videos & Podcasts 19 min read

Identify eCommerce Losses and Opportunities with Machine Learning

Anodot Sr. Director of Customer Success Nir Kalish presents how eCommerce organizations can leverage the powerful anomaly detection capabilities of Anodot's AI analytics to proactively address business incidents in real time, protect revenue, and keep customers happy.
Blog Post 3 min read

AWS re:INVENT – Anodot Joins the AWS Machine Learning Competency Program

AWS is one of the biggest names in technology right now, so anything related to AWS is a big deal. We just got back from the sold-out 2017 AWS re:INVENT conference, held this year in Las Vegas, NV. With record-breaking attendance of more than 40,000 people, it was filled with representatives from tech luminaries such as Accenture, Intel, Deloitte, Salesforce, VMware, and more. Re:INVENT has always been an event for learning. It gives attendees the opportunity to familiarize themselves with flagship Amazon products such as EC2, S3, Redshift, and more. The hands-on training, networking opportunities, and breakout discussions have always been invaluable for professional development, plus there's always the excitement of being privy to new AWS announcements. As such, we were thrilled to be among the first to know about what, for us, is among the most exciting new developments at AWS: the AWS Machine Learning Competency.

The AWS Machine Learning Competency: A Partner Ecosystem Focused on AI and ML

The market for artificial intelligence (AI) technologies is flourishing. While a broad set of important technologies is emerging, some are still in their early stages. The AWS Machine Learning Competency showcases industry-leading AWS Partners that provide proven technology for a variety of use cases, helping companies deliver ML at scale.

Rigorous Testing and a Tough Admissions Process

AWS set a high bar for admission to the AWS Machine Learning Competency program. To join, companies undergo a strict validation of their capabilities, demonstrating technical proficiency and proven customer success. APN Partners must also complete a technical audit of their ML solution. As the AWS Machine Learning Competency datasheet notes, the competency is meant for partners whose bread and butter is providing machine learning services to their customers, rather than partners who simply use machine learning in the background of their solutions. Of the 150 companies invited to apply for the competency, only 17 were selected across three categories: Data Services, Platform Solutions, and SaaS/API Solutions. And yes, in case you're wondering why we've spent so long talking about this, Anodot was one of just six companies selected in the SaaS/API Solutions category.

"Not every machine learning problem requires starting from scratch and building a custom solution, and not all of our customers have access to a dedicated data science team with the time and expertise to build a production workflow for large scale predictions," said Joseph Spisak, Global Lead for Artificial Intelligence and Machine Learning Partnerships, Amazon Web Services, Inc. "We are delighted to welcome Anodot to the Artificial Intelligence and Machine Learning Competency Program to provide off-the-shelf machine learning solutions that can help speed time to market and bring intelligence to any application."

An Unforgettable Event of Amazing Proportions

We got to hear Swami Sivasubramanian, VP of Amazon AI at Amazon Web Services, share his insights: "Our goal is to put machine learning capabilities in the hands of all developers and data scientists. The other thing we are excited about is the API services, where people who don't want to know anything about machine learning but want to benefit from the analytics capabilities. I am really excited about all the applications that we can use to improve our everyday life using machine learning." Finally, we rounded out our visit with a round of virtual golf and hung out with our main contact at AWS, a great guy with a custom-tailored Pac-Man suit.
Documents 1 min read

Case Study: Lyft Optimizes its Business with Unique Anomaly Detection

Explore how this ride-share leader uses Anodot to identify business risks in real time.