My background is in economics and I spent the early part of my career building economic models to carry out tasks like exploring the impact of changes to tax rates on revenue and income inequality. These early experiences gave me a passion for making data as accurate as possible, as good data is critical for good models. Later on I worked at the UK Data Archive, collecting and distributing vital data collections for secondary use. After building some data discovery and exploration tools, I worked on a variety of major analytical projects at KPMG and later Tribal Consulting, before setting up Musgrave Analytics a few years ago.
Through these experiences I found that most analytical projects can be considered to have three stages: data collection, analysis and presentation. Time pressure often meant that project effort was divided between these stages in the ratio 80:15:5. As a consequence, much of the value invested in a project was lost through poor communication.
I realised that if we were to make better use of our time-consuming data and analysis work, more effort was needed on the presentation, whether via good reports and scorecards or interactive dashboards and scenario modelling.
We now work extensively within the housing sector, but good ideas in one sector are often useful in others, as we have found many times, having also worked in the automotive and health sectors amongst others.
A performance framework provides a summary of all the core measures used to determine how well an organisation is functioning in terms of its stated goals and objectives. It is a crucial component in the management of all types of organisations and is particularly effective in data-rich ones.
A schematic of a performance framework is shown below, and each section is summarised in the rest of this blog. Future blogs will expand on these sections, in particular our specialisms, sections 3 and 4.
1. Define metrics
A performance framework should be built around clearly defined and measurable objectives. The metrics that are defined should be the ones that link most closely to the core objectives. They can then be monitored on a regular basis and used to inform key decision makers such as the board and executive and operational leadership. In addition, they can support many other functions, such as audit reviews and stakeholder reporting (e.g. to banks and partners), as well as providing accessible information to other stakeholders such as wider staff and customers.
Defining metrics may involve detailed process mapping to identify key points in the business processes that can be measured and monitored. Alternatively, metrics can focus on significant outputs. Either way, it is important that they are readily understood and measurable. The SMART acronym (Specific, Measurable, Achievable, Realistic, Time-bound) is a useful check that metrics are effective.
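As a minimal sketch, a metric definition can be captured as a small structured record whose fields mirror the SMART checklist. The metric, field names and target below are invented for illustration, not drawn from any real framework.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    name: str           # what the metric is called
    definition: str     # Specific: precisely what is counted
    unit: str           # Measurable: the unit it is reported in
    target: float       # Achievable/Realistic: the agreed target level
    review_period: str  # Time-bound: how often it is reviewed

# Hypothetical example metric for a housing organisation
repairs = Metric(
    name="Emergency repair response",
    definition="Share of emergency repairs completed within 24 hours",
    unit="% of jobs",
    target=95.0,
    review_period="monthly",
)
```

Writing definitions down in this structured way makes gaps obvious: a metric with no unit, no target or no review period fails the SMART check before any data is collected.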
2. Collate data
Once the metrics have been defined, then the evidence to construct them can be collated. This may involve pulling together data from multiple systems including operational, financial and HR as well as customer feedback surveys or external benchmarks. It normally makes sense to bring these disparate sources into a well-structured data warehouse.
Pulling the data together is no easy task. There are many tools available to support this, from heavyweight database platforms such as SAS, Oracle, Microsoft SQL Server and SAP to more specialist tools such as Tableau and R. At Musgrave Analytics we are software neutral, but we make extensive use of Antivia’s DecisionPoint™ and Microsoft SQL Server™ to manage the data. This whole area of supporting IT is generally referred to as business intelligence.
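The essence of collation is joining records from different systems on a shared key before loading them into the warehouse. The sketch below shows the idea in miniature with plain Python; the system names, field names and property identifiers are all invented for the example.

```python
# Records keyed on a shared property identifier, from two source systems
operational = {
    "P001": {"repairs_open": 3},
    "P002": {"repairs_open": 0},
}
survey = {
    "P001": {"satisfaction": 4.2},
    "P003": {"satisfaction": 3.8},
}

def collate(*sources):
    """Outer-join several {key: record} sources on their shared key."""
    merged = {}
    for source in sources:
        for key, record in source.items():
            merged.setdefault(key, {}).update(record)
    return merged

warehouse_rows = collate(operational, survey)
# P001 now carries fields from both systems; P002 and P003 from one each
```

In practice this join happens inside the warehouse (e.g. via SQL), but the design question is the same: which key links the systems, and what happens to records that appear in only one of them.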
3. Analyse and understand
Once the data has been collected, in-depth exploration and analysis must not be overlooked. IT solutions often oversell how easy this is; in fact, it takes careful thought and extensive discussion to understand how metrics should be analysed and presented. Thorough engagement with the business is essential. Good exploratory data analysis helps us to answer questions such as:
· What elements of the current performance are important for our audience?
· How do we know if our performance is good, average or poor?
· How do we define good performance?
· Even if it is good, is it as good as it could be?
· How do we define and identify outlier performance?
· Are performance variations important and can we identify their cause?
· Can we forecast future levels of metrics?
· How do two metrics relate to each other?
· Are there any significant changes in activity or performance?
· What are the consequences of changes in performance?
The analysis should be used to build the core outputs used in the presentation. Analysis can be routine (e.g. serving monthly reporting) or in-depth, building an initial understanding of a metric’s performance. It might also spin off into detailed analysis of a particular issue. We typically use Microsoft Excel and R (an environment for statistical computing and graphics) to support this stage of the framework.
The graph below shows the operational performance of different housing areas. Staff know that an estate with a small number of properties makes its score very unreliable. A funnel plot enables the analyst to see immediately which housing estates have scores that are significantly better or worse (statistically speaking) than the average or target score. The analyst can quickly identify the areas that fall below the lower dotted line (in other words, have significantly poor scores) and then consider how best to present these results to decision makers. This graph, helpful for analysts, would normally be considered too complex for executives or other non-specialists.
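The dotted lines on a funnel plot are control limits that widen as the number of properties falls, which is why a small estate’s noisy score is not flagged as an outlier. A minimal sketch of the calculation, assuming a proportion-type metric and illustrative figures:

```python
import math

def funnel_limits(overall_rate, n, z=1.96):
    """Approximate 95% control limits for a proportion measured over
    n properties; smaller n gives wider limits."""
    se = math.sqrt(overall_rate * (1 - overall_rate) / n)
    return overall_rate - z * se, overall_rate + z * se

OVERALL = 0.80  # e.g. 80% of repairs completed on time across all estates

def flag(score, n):
    """Classify an estate's score against the funnel limits."""
    lower, upper = funnel_limits(OVERALL, n)
    if score < lower:
        return "significantly poor"
    if score > upper:
        return "significantly good"
    return "consistent with average"

# A 60% score is an outlier for a 400-property estate,
# but within normal variation for a 10-property one.
```

This is the statistical reasoning behind the plot: the same raw score can be a genuine signal for a large estate and mere noise for a small one.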
4. Present Outputs
Presenting the outputs is not a simple task and requires strong design skills, including an understanding of good graphics. A keen appreciation of what the audience needs to see is also important, informed by the analysis, as the analysis allows the author to really ‘know the data’. The main elements of effective data presentation are:
· Understand the audience. It is usually necessary to present output to different audiences at different levels of detail and granularity. For example, for a board report an overall aggregate is often adequate, whereas operational managers need to be able to drill down to individual staff performance.
· Reduce the output to the core, so that there are not too many distracting and confusing messages. Design the display to communicate simply, clearly, and accurately. Include nothing that isn’t data unless it’s needed to support the data.
· Some messages are communicated most effectively with words, others with tables or graphs, and some with a combination. Choose graphs carefully, using the simplest possible.
There are many specific components that might be included, such as performance scores where metrics are normalised (to be on comparable scales), weighted and aggregated to provide an overall score at various levels of an organisation. These could even be linked to performance incentives. We use Antivia’s DecisionPoint™ as our primary tool for building dashboards, but also use Microsoft Excel, R Shiny and SAP Dashboard Design.
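The normalise-weight-aggregate step can be sketched in a few lines. The metric names, plausible ranges and weights below are invented for illustration; a real framework would agree these with the business.

```python
def normalise(value, worst, best):
    """Rescale a raw metric onto 0-100 so different units are comparable.
    Passing worst > best handles 'lower is better' metrics naturally."""
    return 100 * (value - worst) / (best - worst)

# Hypothetical metrics: (raw value, worst plausible, best plausible, weight)
metrics = {
    "repairs_on_time_pct": (88.0, 60.0, 100.0, 0.5),
    "rent_arrears_pct":    (4.0, 10.0, 0.0, 0.3),   # lower is better
    "satisfaction_score":  (4.1, 1.0, 5.0, 0.2),
}

overall_score = sum(
    weight * normalise(value, worst, best)
    for value, worst, best, weight in metrics.values()
)
```

Because every metric is first mapped onto the same 0-100 scale, the weights directly express each metric’s importance, and the same calculation can be rolled up at team, area or organisation level.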
Some frameworks might include scenario modelling to support strategic decisions, such as major investments or disposals. This supports strategic ‘what-if’ discussions such as the example below, which allows a property developer to instantly assess project and programme profitability and funding implications under a large number of shifting assumptions relating to timing, values, costs and funding.
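At its core, a ‘what-if’ model is just a profitability calculation that can be re-run instantly under changed assumptions. The toy sketch below uses simple interest and invented figures purely to show the shape of such a model; a real development appraisal would be far richer.

```python
def project_profit(units, sale_price, build_cost_per_unit,
                   land_cost, loan, interest_rate, years):
    """Toy development appraisal: revenue minus build, land and funding
    costs (simple interest used for the sketch)."""
    revenue = units * sale_price
    costs = units * build_cost_per_unit + land_cost
    funding_cost = loan * interest_rate * years
    return revenue - costs - funding_cost

# Base assumptions (all figures hypothetical)
base = project_profit(units=50, sale_price=250_000,
                      build_cost_per_unit=150_000, land_cost=2_000_000,
                      loan=5_000_000, interest_rate=0.05, years=2)

# Re-run instantly under a changed assumption: a 10% fall in sales values
downside = project_profit(units=50, sale_price=225_000,
                          build_cost_per_unit=150_000, land_cost=2_000_000,
                          loan=5_000_000, interest_rate=0.05, years=2)
```

The value of the dashboard version is that every assumption becomes a slider, so the profit and funding implications of each scenario are visible the moment the assumption changes.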
5. Prioritise action
The presentation of performance metrics should be designed to support the user in the fifth stage of the performance framework, namely identifying action. Outputs should be focused on the different target audiences and provide the information, including contextual information, to inform the prioritisation of action.
As an example, a performance metric for response times might be displayed as a red traffic light, but this should be seen in terms of trend (e.g. is it going up or down, or just behaving erratically?) and context (e.g. are other parts of the business, or similar external organisations, behaving in the same way?).
The context will help determine the required action and so the analysis and presentation stages should have provided the necessary input to inform this decision.
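A minimal sketch of pairing a traffic-light status with the trend context described above; the thresholds and response-time figures are illustrative only.

```python
def rag_status(avg_days, green_max=5, amber_max=10):
    """Traffic-light status for an average response time, in days."""
    if avg_days <= green_max:
        return "green"
    if avg_days <= amber_max:
        return "amber"
    return "red"

def trend(history):
    """Compare the latest value with the average of earlier periods."""
    *earlier, latest = history
    baseline = sum(earlier) / len(earlier)
    if latest > baseline * 1.1:
        return "worsening"
    if latest < baseline * 0.9:
        return "improving"
    return "stable"

response_days = [6.0, 7.0, 8.5, 11.2]  # last four months, hypothetical
status = rag_status(response_days[-1])  # "red"
direction = trend(response_days)        # "worsening"
```

A red light that is also worsening calls for different action than a red light that is stable or already improving, which is exactly the contextual information the presentation stage should surface.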
Once actions have been initiated, it may be necessary to modify the choice of key metrics, which takes us back round to the start of the framework.
Fundamentals of a good performance framework, Aug 15, 2016