Trufa – AI for Enterprise Performance
An Analytics Application for the Digital Enterprise
Trufa is an analytics application that delivers insights for business optimization in the enterprise with high speed and precision. In the past, analytics meant aggregating numbers into metrics, turning them into pie and bar charts, and perhaps even calculating trend lines. The complex but value-adding work – namely, deriving insights and actions from the information – had to be done by humans.
What is special about Trufa?
Trufa is made for the business user with business know-how. Data science or data engineering knowledge may be helpful but is not required.
In contrast to traditional BI offerings, which only offer visualization of facts, Trufa delivers analysis of the content of the data: it delivers hypotheses as well as reasons. Trufa has built-in knowledge of the semantics of SAP ERP, including its customizing.
Based on this thorough understanding of business processes, Trufa works as an electronic advisor and highlights areas with improvement potential, either in economic outcome or in the simplification of processes.
Trufa - Functional Overview
Trufa helps you get full transparency about your enterprise performance and automatically identify optimization potentials based on cash and profitability.
Prescriptive Analytics: Opportunity Identification
Automatic discovery of process optimization potentials
- Let AI and advanced statistics discover concrete process optimization opportunities including respective scope, drivers and economic potentials.
- Search for optimization opportunities as easily as with Google.
Predictive Analytics: Impact Simulation
Simulate the impact of process changes to your cash and profit
- Identify what will happen to your cash and profit situation if process drivers are improved.
- Set realistic targets for your process performance.
- Get process optimization recommendations based on prior process performance in your organization.
Diagnostic Analytics: Root-cause Analysis
Identify relevant patterns in your processes
- Analyze data patterns to uncover the causes of process variants – in seconds instead of combining 500 pivot tables.
- Automatically identify data clusters to discover what makes some processes more efficient.
Descriptive Analytics: Data Visualization
Create transparency about what happened in your processes
- Analyze processes ad hoc, without any individual data modeling or preparation.
- Slice and dice in any dimension with one click.
- Visualization always highlights the economically important drivers.
- Perform deep-dive analysis down to the individual document level.
Business Alerts: The Future of Reporting
Forget about eyeball-scanning hundreds of reports every month. Just Automate.
- Monitor-based alerts for performance indicators.
- Report-based alerts to get information about defined business occurrences, e.g., if process compliance is impacted.
- TPI-based alerts to monitor changes in opportunities.
- Receive e-mails at the frequency of your choice and share with others as necessary.
- APIs to connect to other apps.
Trufa Use Case Examples
Working Capital & Cash Management
Identification of opportunities to reduce inventory, optimization of payment terms or accounts receivable & payable
Revenue & Margin Growth
Optimization of profits by analyzing customer satisfaction, product profitability, or pricing
Turnaround & Restructuring Strategy/Drivers
Identification of low performing products, plants, countries, or complexity reduction potentials
Supply Chain Optimization & Strategic Sourcing
Optimization of supplier network, inventory levels and purchase prices, analysis of customer payment history or plant profitability
Target Setting & Planning Support
Identification of realistic targets, of profit drivers and simulation of impacts of measures
S/4HANA Implementation Preparation
Identification of current ERP usage patterns, configuration complexity and their business benefits
The Science Inside - Whitepapers
About the author: Prof. Dr. Andreas Mielke is Managing Director of Deloitte Digital. He joined Deloitte in 2018, co-heading the Trufa team. Alongside his work at Deloitte Digital, Andreas Mielke is apl. Professor at the Institute for Theoretical Physics at the University of Heidelberg, Germany. He studied in Frankfurt and Heidelberg, where he received his habilitation in 1994. He spent extended research periods in Lausanne, Cambridge, Paris, Prague, Budapest, and Vienna.
Whitepaper No 1: Robust Statistics
Statistics is familiar to everybody, simply because in our daily lives we often encounter statistical analyses. But what is robust statistics, and why should statistics be robust? And why should we use robust statistics instead of the well-known classical statistics when analyzing business data? The present paper answers these questions and explains why only robust statistics guarantees reliable results when we analyze business data. The aim is to provide a very basic understanding of the why and how of robust statistics. For details of the mathematical foundations or the implementation of robust procedures, we refer to the literature mentioned at the end.
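The core idea can be illustrated with a minimal Python sketch. The invoice delays below are invented for illustration; the whitepaper itself covers the mathematical foundations:

```python
import statistics

# Hypothetical payment delays in days for 10 invoices;
# one gross data-entry error (900) plays the role of an outlier.
delays = [3, 5, 4, 6, 5, 4, 7, 5, 6, 900]

mean_delay = statistics.mean(delays)      # dragged far up by the outlier
median_delay = statistics.median(delays)  # robust: barely affected

# The median absolute deviation (MAD) is a robust scale estimate,
# unlike the standard deviation.
mad = statistics.median(abs(d - median_delay) for d in delays)

print(mean_delay, median_delay, mad)  # 94.5 5.0 1.0
```

A single corrupted record moves the mean from about 5 to 94.5, while the median and MAD are essentially unchanged; this insensitivity to gross errors is what "robust" means here.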
Whitepaper No 2: Maximum Entropy
Entropy is a quantity that appears in statistical physics. How is it connected to business data, and why should it be maximized? The present paper answers this question. We show that the maximum entropy principle ensures that the outcome of a statistical model is unbiased, robust, and actionable. The maximum entropy method is a general-purpose technique in machine learning. It has a simple and precise mathematical foundation, and a number of aspects make it well suited for modelling distributions of business data.
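The principle can be demonstrated numerically: among all distributions over a fixed set of outcomes, and with no further constraints, the uniform distribution has the largest Shannon entropy. A minimal sketch with invented example distributions:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum(p_i * log(p_i)), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Three invented distributions over the same four outcomes
uniform = [0.25, 0.25, 0.25, 0.25]
skewed  = [0.70, 0.10, 0.10, 0.10]
peaked  = [0.97, 0.01, 0.01, 0.01]

# Without constraints, the uniform distribution maximizes entropy;
# the more concentrated the distribution, the lower its entropy.
print(entropy(uniform), entropy(skewed), entropy(peaked))
```

Choosing the maximum entropy distribution consistent with known constraints means assuming nothing beyond those constraints, which is the sense in which the resulting model is unbiased.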
Whitepaper No 3: Time Series Analysis, Modelling, and Forecast
The paper explains how time series analysis can be used for the prediction of business data. We present which methods are available, which are suitable for business data, and how the prediction of business data can be implemented. As an example, we use liquidity prediction.
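As a toy illustration of forecasting a business time series, here is simple exponential smoothing in plain Python. The cash figures and the smoothing parameter are invented; the whitepaper discusses which methods are actually suitable for business data:

```python
def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing:
    level_t = alpha * y_t + (1 - alpha) * level_{t-1};
    the final level is the one-step-ahead forecast."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

# Invented monthly cash balances in million EUR
cash = [10.0, 10.4, 10.1, 10.6, 10.3, 10.8]
print(round(ses_forecast(cash), 2))  # 10.43
```

The parameter alpha controls how strongly recent observations are weighted; real liquidity prediction would also need to handle trend and seasonality, which this sketch ignores.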
Whitepaper No 4: Regression Outliers
Standard regression often yields poor results when outliers are present. There are two ways out of this dilemma. The first is to remove the outliers; the second is to use robust methods whose results do not depend on them. If the data set is large, the second option is often the only feasible one. In this paper, we point out that robust methods can and should be used for outlier detection. The reason is that outliers often contain additional information and are therefore important: they may, for instance, show that an additional factor is relevant to understanding the behavior of the full data set. Precise outlier detection is therefore mandatory.
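One concrete robust method of this kind is the Theil-Sen estimator (the median of all pairwise slopes). The sketch below uses it for outlier detection on invented data; the estimator and the residual cutoff of 3 are illustrative choices, not prescribed by the whitepaper:

```python
import statistics
from itertools import combinations

def theil_sen(xs, ys):
    """Robust line fit: slope = median of all pairwise slopes,
    intercept = median of y_i - slope * x_i."""
    slopes = [(ys[j] - ys[i]) / (xs[j] - xs[i])
              for i, j in combinations(range(len(xs)), 2) if xs[j] != xs[i]]
    slope = statistics.median(slopes)
    intercept = statistics.median([y - slope * x for x, y in zip(xs, ys)])
    return slope, intercept

# Invented data: y = 2x, with one gross outlier at x = 4
xs = [1, 2, 3, 4, 5, 6]
ys = [2, 4, 6, 50, 10, 12]

slope, intercept = theil_sen(xs, ys)
residuals = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
# The robust fit recovers the clean line, so the outlier is
# the only point with a large residual.
outliers = [i for i, r in enumerate(residuals) if abs(r) > 3.0]
print(slope, intercept, outliers)  # 2.0 0.0 [3]
```

An ordinary least-squares fit would be pulled toward the corrupted point, hiding it among moderate residuals everywhere; the robust fit isolates it cleanly, which is exactly the argument the paper makes.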
Whitepaper No 5: Neural Networks
Neural networks are a useful tool for simulations, classifications, predictions, and forecasts in various areas. In Trufa we use neural networks to determine the functional relationship between all kinds of quantities. The aim of this paper is to explain some of the main aspects of neural networks to the non-expert and to make clear where and how they are used in Trufa.
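At its simplest, a neural network is a single neuron. The sketch below trains a one-neuron perceptron on the logical AND function; this is only a didactic toy and not the networks Trufa uses, and all names and data are invented for illustration:

```python
def train_perceptron(samples, epochs=10, lr=0.1):
    """Single neuron with step activation, trained by the perceptron rule.
    samples: list of ((x1, x2), target) pairs with 0/1 targets."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), t in samples:
            y = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = t - y                 # 0 if correct, +1/-1 otherwise
            w1 += lr * err * x1         # nudge weights toward the target
            w2 += lr * err * x2
            b += lr * err
    return lambda x1, x2: 1 if w1 * x1 + w2 * x2 + b > 0 else 0

# Learn logical AND, a minimal linearly separable example
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
net = train_perceptron(and_data)
print([net(x1, x2) for (x1, x2), _ in and_data])  # [0, 0, 0, 1]
```

Stacking many such neurons in layers, and replacing the step function with smooth activations so that gradients can flow, yields the multi-layer networks used to learn functional relationships between quantities.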