We’re excited to announce that Fiddler has been named one of America’s most promising artificial intelligence companies on this year’s Forbes AI 50 list. Forbes partnered with venture firms Sequoia Capital and Meritech Capital to evaluate over 400 privately held, U.S.-based companies that are applying AI in meaningful, business-oriented ways. Judges assessed each company’s technology, business model, customers, and financials, and selected the 50 most compelling and innovative companies for the second annual list. We are honored to be recognized among so many incredible peers, and we congratulate each of the other companies that made this year’s list.
At Fiddler, we believe that scaling AI effectively and responsibly requires a solution for overseeing AI and machine learning deployments: one that not only surfaces operational issues but, more importantly, helps teams understand root causes and resolve issues quickly. To address this, we designed an Explainable AI Platform that enables businesses to monitor, explain, and analyze their AI solutions. The Platform helps businesses move away from black-box AI and build greater transparency into every machine learning-generated decision.
Meeting the moment: Explainable AI in the time of COVID-19
It has become clear that the effects of COVID-19 will be felt by businesses for years to come. Businesses have already adapted at an unprecedented rate, shifting their business models, sales processes, and overarching strategies to meet the needs of a world under lockdown. In most cases, that has meant significantly advancing their digital capabilities and accelerating the adoption of AI projects. According to the Gartner 2020 CIO Agenda Survey, leading organizations expect to double the number of AI projects in place within the next year, and over 40% of them plan to deploy AI solutions by the end of 2020.
How does a new technology that infuses AI Explainability into the full AI lifecycle help your business today?
Addressing changing customer needs
As businesses (and customers) adapt to the ‘new normal’, one of the best things they can do is serve existing customers based on their changing needs. But in times of uncertainty, customers don’t tend to behave the way they have historically. Understanding the ‘how’ and ‘why’ behind the decisions that ML models make is key to learning how customer behavior has changed. The ability to track anomalies that might soon become the new normal, and to retrain models on these new inputs, ensures models adapt quickly to new customer needs. Powered by Explainable AI analytics, business leaders will be able to answer key customer-related questions such as:
- How has customer activity changed over the past few months?
- Are the behavioral patterns of customers in certain groups shifting? Why?
- Is customer loyalty in line with historical data, or is there increased churn? How can this churn be addressed? (Fiddler lets teams adjust model inputs and see how those changes affect the output, helping them understand how to reduce customer churn.)
- Are there patterns to be uncovered that can give more insight into the ‘why’ and ‘how’ behind customers’ actions?
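The what-if idea behind these questions can be illustrated with a minimal sketch. This is not Fiddler’s API; the toy churn model, its weights, and its feature names are all hypothetical, made up purely to show how perturbing one input reveals its effect on a prediction.

```python
# A minimal what-if sketch (hypothetical model, not Fiddler's API):
# perturb one input to a toy churn model and observe how the predicted
# churn probability shifts.
import math

def churn_probability(features):
    """Toy logistic churn model; weights are illustrative, not learned."""
    weights = {
        "days_since_last_login": 0.05,   # more inactivity -> higher churn risk
        "support_tickets": 0.30,         # more tickets -> higher churn risk
        "monthly_spend": -0.02,          # higher spend -> lower churn risk
    }
    score = sum(weights[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-score))

baseline = {"days_since_last_login": 30, "support_tickets": 2, "monthly_spend": 50}
# What if this customer re-engaged and logged in 5 days ago instead of 30?
what_if = dict(baseline, days_since_last_login=5)

delta = churn_probability(what_if) - churn_probability(baseline)
print(f"churn probability change: {delta:+.3f}")  # negative: churn risk drops
```

Running many such perturbations across a customer segment is one simple way to see which levers (re-engagement, support quality, pricing) most reduce predicted churn.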
Getting ahead with explainable forecasting models
Experts predict that this economic downturn could exceed any the world has seen in the past century. To keep businesses running in the face of these challenging times, forecasting is key. AI models are effective forecasters, but business leaders are often unable to understand how the models work and whether they are trustworthy and reliable. An explainable approach to AI forecasting can empower teams to make informed decisions. They get insight into areas such as readiness gaps in business structure and process, sales forecasts that identify why predictions are changing, and deep-dive analysis of changing customer needs and how best to capitalize on them.
Minimizing risk in AI systems and ensuring responsible AI deployments
Not understanding why and how decisions are made can have negative consequences, and in the current climate it can be even more dangerous. With large-scale changes in human behavior, financial markets, and other factors that affect data and models, the focus should be on understanding data to minimize risk. Teams should concentrate on ensuring transparency in outcomes, keeping humans in the loop to make decisions outside of automation when necessary, and preparing for regulatory changes as the implications of this global pandemic develop.
With Explainable AI, business and analytics leaders can not only make accurate decisions but also know the “why” behind them and how various factors influence model outputs. The result is a better-informed decision-making process and more accurate outcomes.
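One common way to surface that “why” is per-feature attribution. Below is a hedged, occlusion-style sketch (not Fiddler’s method): each input is replaced with a reference value, and the change in the model’s output estimates that feature’s influence. The toy credit model, weights, and feature names are hypothetical.

```python
# Occlusion-style attribution sketch (hypothetical model, not Fiddler's
# method): swap each feature for a reference value and measure how much
# the model's output moves.
import math

def approve_probability(x):
    """Toy logistic credit model; weights are illustrative only."""
    w = {"income": 0.04, "debt_ratio": -3.0, "tenure_years": 0.15}
    score = sum(w[name] * value for name, value in x.items()) - 1.0
    return 1 / (1 + math.exp(-score))

instance  = {"income": 60, "debt_ratio": 0.4, "tenure_years": 3}
reference = {"income": 40, "debt_ratio": 0.5, "tenure_years": 1}  # e.g. a population average

attributions = {}
for feature in instance:
    occluded = dict(instance, **{feature: reference[feature]})
    # Positive value: this feature pushed the approval probability up
    # relative to the reference.
    attributions[feature] = approve_probability(instance) - approve_probability(occluded)

for feature, influence in sorted(attributions.items(), key=lambda kv: -abs(kv[1])):
    print(f"{feature:>13}: {influence:+.3f}")
```

Ranking features by the magnitude of their influence gives a simple, human-readable answer to “which factors drove this decision?” for a single prediction.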
If you’d like to learn more about what we do or how you can use our solution today, sign up for a demo.