The Auditor-General’s Report: Beyond the Headlines

The tabling in Parliament on 7 May of the Victorian Auditor-General’s report into the functioning of Victoria’s Planning Framework for Land Use and Development was accompanied by a splash of media headlines that picked up on the most alarming of its findings. Subsequent events have only increased the timeliness of the report, since its negative findings about the performance of local government have no doubt helped to support – in popular perception, even if not by design – the government’s recent decision to strip some planning control over activity centres from councils. The popular media reports on the audit leaned heavily on the theme of serious dysfunction at local government level presented in the report’s findings. This was an important aspect of the Auditor-General’s conclusions, but the audit also turned up some other interesting nuggets.

The report can be found on the Auditor-General’s website (www.audit.vic.gov.au; interestingly, as I write there is no link from the DPCD Planning webpage). Most media reports regarding the audit were based largely on the headline findings in the two-page “In Brief” summary accompanying the audit’s release. These can be summarised as follows:

  • The architecture of the system is sound but “some elements have become overly complex, are unclear and are not adequately achieving their original intent.”
  • Existing measurement arrangements “do not allow for comprehensive monitoring and reporting of the system.”
  • DPCD codes of practice are of high quality but “under-promoted and underutilised.”
  • Statutory processes are generally appropriately followed in preparation of amendments, but timeframes were “excessive in some cases,” averaging 22 months.1
  • Processing of planning applications was an area of particular concern. “In 78 per cent of cases examined, officer reports did not give adequate consideration to matters specified in the Act, planning scheme, or both.”
  • Senior council staff need to improve quality assurance processes for the accurate processing of permit applications.

Unfortunately, however, most reports did not go past the above points and so gave no sense of the deeper findings of the audit that will be of interest to planning professionals. The following discussion attempts to draw out some particular points of interest beyond the broad brush of the “In Brief” findings.

With regard to the first finding above, on the health of the system, it is clear from the main report that the audit is essentially reciting the conclusions of the Making Local Policy Stronger ministerial working group. Brief mention is made of the 2006 Cutting Red Tape in Planning review and “discussions with relevant peak bodies and other planning professionals,” but the primary discussion of the broad structure of the VPP planning system (outlined on pages 28-34) relies explicitly on the Making Local Policy Stronger report. Indeed, the key conclusions about the functioning of the VPPs, stated at page 34 of the audit report, are close to a direct quote from the first page of Making Local Policy Stronger. This is interesting given the somewhat different brief of the Making Local Policy Stronger working group, and the widespread reporting of this conclusion as a fresh finding by the office of the Auditor-General.

The second headline point, regarding monitoring of the system, was probably greeted without surprise by many planners familiar with the day-to-day logistical difficulties of reliably collating the data underpinning the Planning Permit Activity Reports (PPAR); the audit report notes that planned upgrades to the PPAR should improve the quality of data (p. 38). Less obvious from the “In Brief” summary is that the section of the audit report on monitoring goes beyond the PPAR to include findings about the DPCD’s monitoring of the performance of the VPPs themselves, including the conclusion that there is a need for more pro-active stakeholder engagement and evidence-based review of the VPPs (p. 41).

Given the headline finding about the extensive timeframes exhibited by councils in processing amendments, it is also interesting to note that the section on monitoring of the system includes a discussion of the DPCD’s contribution to amendment timeframes. It notes (at pp. 42-43) that the establishment of target timeframes for requests for authorisation and assessment of amendments has seen the department “consistently meeting or exceeding these targets.” However, it then observes:

“…the audit disclosed that the calculation of actual performance against the above performance targets only measures the time taken by staff to make a recommendation on an authorisation/approval request in most cases, not the elapsed time to make a decision. In several cases where the department met its 15 and 30 day targets, audit observed that the elapsed time taken to make a decision was substantially greater.” (p. 43).

Oddly, given the nature of an audit report, this discrepancy is not further scrutinised, and is quantified only indirectly later in the report. Given that the 15-day and 30-day targets for authorisation and assessment (outlined in the February 2007 advisory note Reducing Amendment Timeframes) are timeframes for a decision, the internal recommendation timeframes would appear to be irrelevant, except in terms of establishing whether the delay occurred with DPCD officers or with the minister. If it is the intention of either the Auditor-General’s office or the DPCD / Minister to claim that these internal figures mean the targets are being met, then this seems a contrivance, to put it generously.

Given the phrasing of the headline findings, which very much place the responsibility for the 22-month timeframe at council level, the unexplained slippage of DPCD / ministerial timeframes raises the question of how responsibility for delays in the amendment process is actually shared between state and local government. The answer is provided in Section 5.2.2 of the report, which includes a fascinating diagram breaking down the average timeframes at various steps throughout the amendment process (p. 57). This diagram shows extensive delays at virtually every step. It is interesting to note that the authorisation process, introduced as a red-tape cutting measure, consumes an average of 2.2 months at DPCD alone. (One would hope that the 4-month period with council before authorisation is requested, and the 2.1-month period between authorisation and exhibition, would compress to a combined total of less than 6.1 months if the intermediate authorisation step were eliminated; but this is impossible to divine from these figures.)

Councils’ average timeframes on this data run to a total of 15.4 months (excluding the average 1.4 months of exhibition). The panel process, from appointment of a panel to receipt of its report by council, averaged 4.3 months. The state government’s “share” of the consideration time consumed just over 7 months,2 principally comprising 2.2 months at the authorisation stage and 4.7 months at the assessment stage. The contrast between these figures and the 15-day and 30-day target timeframes (which the report earlier noted were usually “met or exceeded” if only the department’s consideration was counted) is striking. The report notes – in strident bold text – that these figures may not reflect “recent improvements in the time taken by the department to process authorisation and amendment requests prior to forwarding to the minister for decision.” However, that improvement remains unmeasured; there is no suggestion of improvement in the ministerial component of the timeframe; and, again, the extent of the discrepancy between departmental and actual timeframes is neither quantified nor further explained.
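Tallying the report’s own averages makes the footnoted discrepancy concrete – assuming, since the diagram does not make this explicit, that the periods are sequential and non-overlapping:

  15.4 (councils) + 1.4 (exhibition) + 4.3 (panels) + 7.0 (state) ≈ 28.1 months

Even if the panel period is treated as falling within the council figure, the remaining components still sum to 23.8 months, comfortably above the 22-month average cited elsewhere in the report.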

Of all the audit’s findings, perhaps the most dramatic and widely reported was that in 78% of applications, officer reports did not adequately cover the matters outlined in the Act and schemes. Anecdotally, it seems some planners reading media commentary wondered whether this amounted to nitpicking: had planners been marked down for, say, not explicitly mentioning the objectives of planning from Section 4 of the Act?

The report’s detailed discussion makes it clear that this is not the case. A number of serious statutory failings were found in a high proportion of audited files. Lowlights included:

  • Neighbourhood and site descriptions not being provided where required (20% of audited files) or not being certified (55%).
  • In 14% of cases where further information was requested, the information was not supplied in time; in all these cases the council nevertheless processed the lapsed application despite not having any power to do so.
  • In 22% of cases the application was amended, but in only a single case was the amendment made in accordance with the Act (i.e., Sections 50, 50A and 57A).
  • Decisions about giving notice under Section 52 were often poorly made and/or poorly documented.
  • In 34% of applications that were exempt from notice requirements, notice was nevertheless given.
  • In 8% of cases no reports were prepared to justify the issue of a permit.
  • At three of the audited councils, permit triggers were not always adequately identified and assessed.
  • In one council, officers made decisions about notice and signed off delegate reports despite not having delegation to do so.

Interestingly, the audit notes the contrast between the generally procedurally sound processing of amendments and the decidedly shaky processing of permits. The audit linked this result to the greater external scrutiny of the amendment process, arguing that for amendments:

“significant external oversight is provided by DPCD and panels at key stages in the process. On the other hand, the planning permit application process has no such external oversight mechanisms, except for the Victorian Civil and Administrative Tribunal (VCAT) appeals process. We also noted that of the councils examined in this audit, the one with the highest rate of appeals to VCAT was also the best performing council in terms of demonstrating compliance with the Act and its planning scheme. This strongly suggests that external scrutiny and reporting is an important contributing factor in lifting performance.”

While few would argue with the fundamental proposition that scrutiny elevates performance, one wonders whether this can explain the whole of the difference between the two processes. It is interesting that the report does not make a similar observation with regard to the correlation between greater statutory compliance and the more leisurely timeframes associated with the amendment process. This is despite the fact that, very broadly speaking, the audit found good levels of statutory compliance but exorbitant timeframes in the amendment process, and acceptable timeframes but woeful statutory compliance in permit assessments. In the context of the finding carried forward from Making Local Policy Stronger regarding the complexity of the system, it is difficult to escape the conclusion that council statutory planners are struggling to master the intricacies of the assessment process in the limited time available to them. Reading the report, one gets a palpable sense of a local government system in crisis, lacking the time, resources and experienced staff needed to get permits out the door in a statutorily sound manner.

Originally published in Planning News 34, no. 5 (June 2008): 10-11.

1. The audited municipalities were Maribyrnong City Council, City of Boroondara, City of Casey, City of Greater Shepparton, Bass Coast Shire Council and Pyrenees Shire Council.

2. It is not immediately clear why these figures combine to a total significantly more than the 22-month figure for the entirety of the amendment process that is cited elsewhere in the report.