Preparing the VFM statement

Using Predictive Analysis

Context

Perhaps the biggest challenge in completing a VFM submission is understanding your relative performance, typically tackled by comparing key metrics against similar organisations and over time. Much analysis therefore focuses on comparator groups to provide a reference point. However, the Regulator of Social Housing (RSH) also produced analysis to calculate the score it expected to see for each metric for each registered provider, explained in its Technical Regression Report published a few years ago. Using this methodology, we have created a unique predicted social housing cost per unit (SHCU) value for every large registered provider, based on the 2023 data.

Predicted Scores

Since publishing its analysis, the RSH has focused on identifying the main differential cost drivers. These include the percentages of supported housing and housing for older people, major works expenditure and the regional wage index.

Using these major cost drivers, we can compute a unique predicted value for each registered provider and compare this with their actual reported value in the most recently published accounts, as shown in this graphic.
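
To make the mechanics concrete, here is a minimal sketch of this kind of calculation in Python. The coefficients and provider figures are illustrative placeholders, not the values from the RSH Technical Regression Report.

```python
# Illustrative only: a linear prediction of social housing cost per unit (SHCU)
# from the main cost drivers. The coefficients below are made-up placeholders,
# not the figures from the RSH Technical Regression Report.

ILLUSTRATIVE_COEFFICIENTS = {
    "intercept": 1500.0,            # base cost per unit (GBP)
    "pct_supported_housing": 25.0,  # GBP per percentage point of supported housing
    "pct_older_people": 12.0,       # GBP per percentage point of housing for older people
    "major_works_per_unit": 0.8,    # GBP per GBP of major works expenditure per unit
    "regional_wage_index": 900.0,   # GBP per unit of the regional wage index
}

def predicted_shcu(provider: dict, coeffs: dict = ILLUSTRATIVE_COEFFICIENTS) -> float:
    """Return a predicted cost per unit for one provider (illustrative model)."""
    return (
        coeffs["intercept"]
        + coeffs["pct_supported_housing"] * provider["pct_supported_housing"]
        + coeffs["pct_older_people"] * provider["pct_older_people"]
        + coeffs["major_works_per_unit"] * provider["major_works_per_unit"]
        + coeffs["regional_wage_index"] * provider["regional_wage_index"]
    )

example_provider = {
    "pct_supported_housing": 8.0,    # % of stock
    "pct_older_people": 15.0,        # % of stock
    "major_works_per_unit": 1200.0,  # GBP per unit
    "regional_wage_index": 1.1,
    "actual_shcu": 4950.0,           # GBP per unit from the latest accounts
}

predicted = predicted_shcu(example_provider)
gap = example_provider["actual_shcu"] - predicted
print(f"Predicted SHCU: {predicted:,.0f}, actual: {example_provider['actual_shcu']:,.0f}, gap: {gap:,.0f}")
```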

It is important to note that this model generates a predicted value based only on the known factors. It explains about 65% of the variation between providers; the other 35% is made up of a mixture of factors that may be historic, external or the result of differences in staff and management performance. The model tells us the expected level of social housing costs given the major explanatory factors, all other things being equal. The error bar indicates how much variation we might expect around this prediction. The closer your actual value is to these low and high values, the more extreme your result is and the more likely it is to come under extra RSH scrutiny.
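
As a rough illustration of how to read the error bar, the short sketch below places an actual value within an assumed prediction range; all of the figures and the width of the range are made up for the example.

```python
# Illustrative only: reading an actual value against a predicted value and its
# error bar. The figures and the width of the range are assumptions.

predicted = 4400.0           # GBP per unit, centre of the prediction
low, high = 3700.0, 5100.0   # GBP per unit, lower and upper ends of the error bar
actual = 4950.0              # GBP per unit from the latest accounts

half_width = (high - low) / 2
position = (actual - predicted) / half_width  # 0 = on prediction, +/-1 = at the bar ends

if abs(position) >= 1:
    verdict = "outside the expected range - more likely to attract extra scrutiny"
elif abs(position) >= 0.8:
    verdict = "close to the edge of the expected range"
else:
    verdict = "comfortably within the expected range"

print(f"Actual is {position:+.2f} of the way to the error-bar limit: {verdict}")
```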

Let’s explore the practical implications of this. We know that the percentage of supported housing has a significant impact on unit costs, so some providers might be tempted to reduce their supported housing in order to lower their unit cost. Doing so would normally have that effect, but it would also lower the predicted cost, so any gap between the predicted and actual scores might be unaffected. Instead, the focus should be on those factors not in the predictive model that can account for why a registered provider is above or below its predicted cost.
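
A small worked example, again with made-up figures, shows why: cutting supported housing lowers both the actual and the predicted cost, so the gap between them can stay exactly the same.

```python
# Illustrative only: why cutting supported housing does not necessarily close
# the gap between actual and predicted cost. All figures are invented.

COEFF_SUPPORTED = 25.0  # assumed GBP of predicted cost per percentage point of supported housing

def gap(actual_shcu: float, pct_supported: float, base_prediction: float) -> float:
    """Gap = actual cost minus predicted cost for a given supported-housing share."""
    predicted = base_prediction + COEFF_SUPPORTED * pct_supported
    return actual_shcu - predicted

# Before: 10% supported housing
print(gap(actual_shcu=5000.0, pct_supported=10.0, base_prediction=4200.0))  # 550.0

# After: supported housing cut to 5%; suppose actual cost falls by a similar amount
print(gap(actual_shcu=4875.0, pct_supported=5.0, base_prediction=4200.0))   # still 550.0
```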

Interestingly, other potential differential cost drivers, such as total stock size or the percentage of rural stock, did not turn out to have any significant impact on unit costs.

Comparative Analysis

Once you know whether your unit costs are higher or lower than might be expected, you can try to understand the source of that divergence. As the model already adjusts for the percentages of supported housing and housing for older people, as well as major repairs cost per unit and the regional wage index (a very good proxy for the impact of region), your analysis has to focus elsewhere.

At this stage, it may be helpful to explore underlying costs in more detail and specify a comparison group to identify unusual cost and expenditure categories. Re-investment, for example, is a major area of expenditure that did not register as significant in our model (in other words, there was no clear pattern), yet it is one of a number of useful areas to analyse where the SHCU is much higher or lower than predicted.
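
The sketch below illustrates one way to run this kind of check, comparing a provider's cost categories with the medians of a comparator group; the category names and figures are invented for the example.

```python
# Illustrative only: comparing one provider's cost categories against a
# comparator group. Category names and figures are hypothetical.
import statistics

# GBP per unit by cost category for a small comparator group (hypothetical data)
comparator_group = {
    "management":      [950, 1010, 880, 1040, 990],
    "service_charges": [420, 380, 450, 400, 430],
    "maintenance":     [1150, 1230, 1080, 1190, 1210],
    "major_repairs":   [900, 840, 980, 870, 910],
    "other_social":    [310, 280, 350, 300, 290],
}

our_costs = {
    "management": 1250,
    "service_charges": 410,
    "maintenance": 1160,
    "major_repairs": 1020,
    "other_social": 300,
}

# Flag categories where we sit well above or below the group median
for category, values in comparator_group.items():
    median = statistics.median(values)
    diff = our_costs[category] - median
    flag = "  <-- investigate" if abs(diff) > 0.15 * median else ""
    print(f"{category:15s} median {median:>6.0f}  ours {our_costs[category]:>6.0f}  diff {diff:+6.0f}{flag}")
```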

Power BI Report

We decided to build an accessible Power BI report that combines all this analysis in one easy-to-use environment. It lets you dig into the predicted score and the comparative analysis a little more, and has five pages:

  1. Overview of the unique predicted score for each large registered provider (assuming that they had reported a full set of data in 2023).

  2. An opportunity to create a (or use a predefined) comparison group to explore the variance in the key cost drivers used in the predictive model.

  3. A view of all the cost categories, again comparing across the comparator groups.

  4. The opportunity to view all the VFM metrics and see the average values. Pages 2–4 all contain simple bar charts; hovering over a bar shows the values for the last five years.

  5. Scenario modelling – the opportunity to adjust the values that you are now seeing (or wish to see) in order to get a good sense of what your unit costs are likely to be under different scenarios (such as more housing for older people, or reduced repair costs); a minimal sketch of this idea follows the list.
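
As a rough illustration of the page 5 idea, the sketch below recomputes a predicted cost per unit under different scenarios, reusing the same made-up linear form as the earlier sketch; it is not the model inside the report, and all figures are assumptions.

```python
# Illustrative only: a rough version of the page-5 scenario modelling, reusing
# the made-up linear form from the earlier sketch. All figures are assumptions.

COEFFS = {
    "intercept": 1500.0,
    "pct_supported_housing": 25.0,
    "pct_older_people": 12.0,
    "major_works_per_unit": 0.8,
    "regional_wage_index": 900.0,
}

baseline = {
    "pct_supported_housing": 8.0,
    "pct_older_people": 15.0,
    "major_works_per_unit": 1200.0,
    "regional_wage_index": 1.1,
}

def predicted_shcu(drivers: dict) -> float:
    """Predicted cost per unit for a given set of driver values (illustrative)."""
    return COEFFS["intercept"] + sum(COEFFS[k] * v for k, v in drivers.items())

scenarios = {
    "baseline": baseline,
    "more housing for older people": {**baseline, "pct_older_people": 25.0},
    "reduced major works spend": {**baseline, "major_works_per_unit": 900.0},
}

for name, drivers in scenarios.items():
    print(f"{name:32s} predicted SHCU {predicted_shcu(drivers):,.0f}")
```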

Many providers are investing in Power BI, and we have used that environment because it integrates easily with your wider reporting and lets you bookmark your own important observations and filters as you explore the data.

Conclusions

We have developed this Power BI tool based on our years of experience in data analysis and benchmarking services. We would be very happy to hear your thoughts and debate this approach. We would also be happy to demonstrate the Power BI tools we have developed to present this focused, time-saving analysis, which we believe is consistent with the RSH approach.

More details