I had ceded the floor out of respect for Senator Cameron before, but I somewhat regret that now, given the heap of abuse he has just hurled at a Senate colleague. I also encourage him, before he leaves the chamber, to go and have a look at my speech of 3 November, which directly addresses the issue of science in the climate change debate. He may, if he is prepared to read it with an open mind, change some of his views.
I wish to speak to the 2011-12 major projects report by the Australian National Audit Office on the Defence Materiel Organisation. This report is a thorough and valuable document of great utility to the parliament, representing the interests of the Australian people. The Joint Committee on Public Accounts and Audit worked closely with the ANAO and Defence to bring the report into being back in 2007-08. The ANAO have developed an expertise in auditing Defence and Defence procurement that is a significant asset to the Australian taxpayer and to this parliament through the transparency it provides in public reporting. That expertise was particularly useful during the recent Senate inquiry into Defence procurement. Because this report is so thorough and detailed, particularly the project data summary sheets, which give us a very good insight into progress in terms of cost, schedule and capability across a number of parameters for each project, we can actually start to track the progress of these projects against the business case that was presented to government at first and second pass leading up to contract signature. That raises the question of whether, given such a useful document that provides the public, the parliament and particularly the Joint Committee on Public Accounts and Audit with a balanced insight, there is a role for this expertise, during the life cycle of a project, to feed into Defence's, the executive's and the parliament's ongoing oversight of Defence procurement.
Rather than having an audit that only looks back at events, we could have the same kind of rigour and expertise involved throughout the life of projects and at their decision points: whether to proceed, how to proceed, whether or not to make something a project of concern, and how any rectification and recovery of the project may occur. Not that I think our system ever should be like the American system, but I note that the GAO in America fulfils a similar function, reviewing the progress of projects and informing the congressional oversight committees that have that role in the American system.
The other point I wish to raise about this report is that, as well as giving us that tracking of cost, schedule and capability and where each project is up to, it makes a pretty fair effort at analysing and then listing lessons that should be learnt from each of these projects. That is where one of my significant concerns lies. During the Senate inquiry into Defence procurement we raised, on a number of occasions, the level of confidence within Defence to make adequate assessments of risk before a project was even brought to first and second pass, and certainly before contract signature. Defence's position was that the technical risk assessment was going to be conducted by DSTO, and the point was discussed on a number of occasions with both the Capability Development Group and the DMO. But most of the risk that translated into project delays and cost overruns lay not so much with the underlying technology as with issues of integration, certification and acceptance into service. The DSTO and, in fact, most of the people working within Capability Development Group, and even many within the DMO, were not competent, by strict definition of their qualification and experience, to identify, analyse and make an accurate assessment of the implications of that technical risk.
It is concerning, having highlighted that during the Senate inquiry, to go back and look at the major projects reports from 2007-08 onwards and to see some of the lessons learnt, when it was highlighted by the Audit Office that one of the things that should be learnt out of these projects was that a more robust and informed assessment of risk was needed before the Commonwealth accepted that a capability truly was non-developmental or off-the-shelf, whether that be commercial or military.
It is also a good opportunity to review, and to get an independent assessment of, how Defence views that issue of off-the-shelf acquisition. What we are consistently told, both in the Senate inquiry into Defence procurement and in estimates, is that Defence's interpretation of the Mortimer and Kinnaird reviews is that an off-the-shelf option has to be provided to government as a benchmark for making a decision about value for money. What is written down and what occurs in practice is an issue of culture. I find it interesting that in paragraph 48 of this year's report the ANAO finds, having audited the DMO again, that following the Kinnaird and Mortimer reviews government has increasingly been requiring Defence to pursue MOTS or COTS capability solutions, where they exist, that can deliver the required capability. The intention of this policy is to reduce the risk associated with the acquisition of new capability by limiting the Defence organisation's exposure to the additional risk associated with developmental projects. There is a subtle difference there. It is no longer just a benchmark; it is a preferred direction.
In the additional comments to the Senate report, I talk at some length about the medium- and longer-term implications of continually shifting our design engineering and our design acceptance as well as our test, evaluation and certification capabilities offshore, because it reduces our ability to even conduct rigorous ongoing maintenance of our capabilities. The Rizzo report into the Navy’s amphibious capabilities is a classic example that shows that, once those capabilities have atrophied, the unknown unknowns increase. People do not know what they do not know, and we start seeing a decline in the technical battle worthiness, airworthiness and seaworthiness of defence capabilities when that expertise decreases.
It is not good enough just to send people on courses. Qualification alone does not make people competent. People's competence for technical engineering roles around design assurance and ongoing battle worthiness, airworthiness or seaworthiness comes from a combination of qualification and experience, and people only gain experience if they have the opportunity to work in those roles. So the long-term consequence, and by long-term I mean within the decade, of continuing to send those functions offshore is significant. Take electronic warfare, for example. It has been recognised as a priority capability for industry, yet, if we look across a number of, or almost all of, our fixed-wing platforms, perhaps with the exception of the classic Hornet, we are seeing less and less involvement from the Australian Defence Force and its people within the Defence Science and Technology Organisation or JEWOSU, the Joint Electronic Warfare Operational Support Unit, and from Australian industry, who have the opportunity to put into practice the experience and the qualifications they have in these areas. Through our acquisition decisions, we are undermining the ability of the Defence organisation and industry to maintain that priority capability into the future.
I commend the ANAO for this report, the latest in the series. It sets a really good benchmark for what the Defence department should be providing to this parliament. There are opportunities to look at how we can use those skills as part of the ongoing process of Defence acquisition and review. But we also need to make sure that the lessons learnt are used by Defence to change behaviours, as opposed to just sitting in a book that may well gather dust on a shelf.