
When should you question your data?

Governor dashboards always tell a story. Here, FEA’s MIS specialist Dr Ian Hadfield tells you how to separate the story from the storytelling.

For FE Governing bodies, it is the start of a new year. Accounts, achievement rates and indicators for last year are being finalised. Has there been a financial surplus? Has there been an improvement in quality?

For some boards, despite monthly dashboards showing on- or above-target performance, the improvements will not be delivered, leading to the inevitable question: “Why wasn’t this picked up earlier?”

The reason often lies in the processes behind the dashboard and the natural desire to ensure the RAG (red/amber/green) rating is Green or, at worst, Amber. The aggregation of optimism can sometimes cloud reality.

Be the KPI KGB

In recent years there has been a drive to produce dashboard data showing key performance indicators (KPIs) tracked against college improvement targets.

KPIs should be well defined and risk assessed to ensure that targets will deliver the required improvement. Key to this is deciding how each KPI is measured, and the rules it plays by.

But spotting where human interpretation has played a role requires insight.

As a Governor or Senior Leader, how do you know when to question your data?

1. Spotting spin

Most Governors’ dashboards contain a range of KPIs from a variety of sources. The data is usually copied from one data source to another, and sometimes human intervention plays a part.

Understanding how the dashboard is produced and where the data comes from is vital. If the dashboard draws on live data, drill down and check a few things out with link managers.

If the dashboard is not live, seek assurances on how it is collated and whether this is reflected in the commentary.

2. Income against allocation

Failing to hit the allocation can lead to financial difficulty in the next academic year.

  • Have enrolments hit target? If not, how will the income hit the allocation? (A simple reconciliation is sketched after this list.)
  • Has potential future income been included from learners who haven’t yet enrolled?
  • Compare the subcontractor declaration forms. Have the values been increased since the start of term to account for a shortfall in enrolments?
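
A minimal sketch of that reconciliation follows; the projected_income helper, the allocation value and the funding figures are all hypothetical, used only to show the shape of the check, not any college’s actual method.

    # Minimal sketch: reconcile income projected from actual enrolments
    # against the funding allocation. All names and figures are
    # illustrative assumptions.

    def projected_income(enrolled_funding, pipeline_funding):
        """Split projected income into confirmed (enrolled learners)
        and speculative (not-yet-enrolled learners) components."""
        return {
            "confirmed": sum(enrolled_funding),
            "speculative": sum(pipeline_funding),
        }

    allocation = 5_000_000.0                    # hypothetical allocation
    income = projected_income(
        enrolled_funding=[3_900_000.0],         # learners actually enrolled
        pipeline_funding=[800_000.0],           # learners not yet enrolled
    )

    shortfall = allocation - income["confirmed"]
    if shortfall > 0:
        print(f"Confirmed income is £{shortfall:,.0f} short of allocation; "
              f"£{income['speculative']:,.0f} rests on learners who have "
              f"not yet enrolled.")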

3. Withdrawals

Ask to see data on learners who have not been seen for more than 1, 2, 3 and 4 weeks. If managers are sitting on withdrawal forms, these numbers will be high.
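
A minimal sketch of that measure, assuming the register system can supply each learner’s last recorded attendance date (the learner names and dates here are purely illustrative):

    from datetime import date

    # Sketch: count learners not seen for more than 1, 2, 3 and 4 weeks,
    # using each learner's last recorded attendance date.

    today = date(2024, 10, 28)
    last_seen = {
        "learner_a": date(2024, 10, 24),
        "learner_b": date(2024, 10, 7),
        "learner_c": date(2024, 9, 20),
    }

    for weeks in (1, 2, 3, 4):
        count = sum(1 for seen in last_seen.values()
                    if (today - seen).days > weeks * 7)
        print(f"Not seen for more than {weeks} week(s): {count}")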

Compare attendance data for the previous month with the withdrawal rate. Low attendance combined with a low withdrawal rate is inconsistent: learners who have stopped attending should be appearing as withdrawals.

Other statistics can measure reliability: the days between a learner’s last date of attendance and the withdrawal being processed, or the number of unmarked registers, to name two.
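
A minimal sketch of the first of those statistics, assuming each withdrawal record carries both dates (all dates are illustrative):

    from datetime import date
    from statistics import mean, median

    # Sketch: the lag between a learner's last date of attendance and
    # the date the withdrawal was processed. Long lags suggest forms
    # are sitting with managers.

    withdrawals = [
        # (last date of attendance, withdrawal processed)
        (date(2024, 9, 13), date(2024, 9, 20)),
        (date(2024, 9, 27), date(2024, 10, 25)),
        (date(2024, 10, 4), date(2024, 10, 28)),
    ]

    lags = [(processed - attended).days for attended, processed in withdrawals]
    print(f"Mean lag: {mean(lags):.1f} days; median: {median(lags)} days")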

4. Attendance

The measurement of attendance is one of the most stable KPIs. After the first two or three months, the volume of data means the cumulative average changes very little going forwards.

Check how the KPI definition deals with authorised absences. Keep it simple: the number present divided by the number on the registers.

Attendance commonly declines gradually during the year, so ask to see attendance plotted month by month. If the cumulative average is only just on target, the last few months must have been below target. Any intervention to improve attendance should show as a month-on-month improvement.
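
A minimal sketch of that definition, calculated month by month rather than as a cumulative average; the register counts and the 92 per cent target are illustrative assumptions:

    # Sketch: attendance as number present divided by number on the
    # registers, per month, so the trend is visible rather than hidden
    # inside a cumulative average.

    registers = {
        # month: (sessions attended, sessions possible)
        "Sep": (9_500, 10_000),
        "Oct": (9_200, 10_000),
        "Nov": (8_900, 10_000),
    }

    target = 0.92
    previous = None
    for month, (present, possible) in registers.items():
        rate = present / possible
        status = "on target" if rate >= target else "below target"
        trend = "" if previous is None else (" (up)" if rate > previous else " (down)")
        print(f"{month}: {rate:.1%} {status}{trend}")
        previous = rate

On these illustrative figures the cumulative average is exactly on target at 92 per cent, yet the monthly view shows November already below it.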

5. Comparisons with previous years

Be aware of the complexity of the question: “Where were we this time last year?”

Do you mean where you actually were or where you thought you were?

The first is easy: data gurus can work out the exact position at the same date last year, because everything is now known. Current in-year performance will always look better than this, because this year’s data is still incomplete (late withdrawals, for example, have yet to be processed).

The second can only really be achieved by a manual comparison with the same KPI as reported the previous year. But KPI definitions change, and this becomes very unreliable.

Presenting the data as a monthly trend, where the whole line is recalculated each month, gives the best picture. Data from two months ago has firmed up, and it is easier to see the projection forwards.
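
A minimal sketch of that recalculated trend, assuming last year’s final monthly figures are now known in full (all rates are illustrative):

    # Sketch: recalculate the whole monthly line each month and compare
    # it with the true position at the same date last year, which is now
    # fully known because last year's late data has all been processed.

    last_year_final = {"Sep": 0.93, "Oct": 0.91, "Nov": 0.90}

    def recalculated_line(snapshot):
        """Print this month's recalculated line against last year's
        known position at the same points."""
        for month, rate in snapshot.items():
            delta = rate - last_year_final[month]
            print(f"{month}: {rate:.1%} vs {last_year_final[month]:.1%} "
                  f"last year ({delta:+.1%})")

    # The November dashboard recalculates September and October too, so
    # earlier months firm up as late withdrawals and enrolments land.
    recalculated_line({"Sep": 0.92, "Oct": 0.90, "Nov": 0.89})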

Never forget

Almost all data is derived from learners’ experiences.

Asking probing questions on what interventions have taken place, and what impact they have had, will lead to greater insight – far more than can be represented by numbers alone.

The result will be a more open discussion of performance and the challenge of improvement.


*FEA’s Dr Ian Hadfield has 25 years’ FE experience in quality assurance and improvement, MIS and planning.

