Monthly Archives: March 2011

On the Value of What We Do… and How to Measure It


This past week we conducted a workshop with IPCE and our stakeholders to explore approaches and proposed quantitative indicators (metrics) for measuring the value of the agency's independent review process. The proposition is simple: measure the impact independent reviews are having on the success of the agency's programs and projects. The implementation, on the other hand, is anything but straightforward. After much discussion it became clear that one of the factors complicating direct measurement is that independent review is not a separate process; it is embedded in, and synergistic with, the overall programmatic decision and approval approach implemented by the agency.

One way to illustrate this point is to examine the decision process of a project going to a Key Decision Point (KDP). The project's initial integrated baseline undergoes many adjustments on the way to the KDP as a better understanding develops of the project's technical, programmatic, and overall risk posture during the approval cycle. The approval cycle incorporates inputs and analyses from many sources, including the project, the program, the host center, the mission directorates, and the agency, of which the Standing Review Board (SRB) is one part. These activities are iterative and involve several in-process decisions to make changes and adjustments affecting the project's technical and programmatic posture leading up to approval at the KDP. It is therefore difficult to ascertain whether the resulting decision, and the project's subsequent performance, are due singularly to the SRB. Thus, the consensus at the workshop was that the emphasis needs to be placed on supporting good approval decisions at the governing boards, recognizing that the independent review process is an integral and essential part of informing those decisions.

So as we go forward, we will be generating metrics to assess our contribution to the agency's technical and programmatic decision and approval process. Many factors influence our contributions (and thus our value), such as the precision of our analyses, the thoroughness of our risk assessments, the timeliness of our results, our effectiveness in communicating our assessments, and our due diligence in ensuring that complete and accurate information is heard at the management forums (which is also a measure of our independence).

You will be participating in this key effort as it unfolds through the year. I want to encourage you to embrace it and to contribute to its development and implementation. This is another step in our quest for excellence.

As always, I welcome your comments.
James N. Ortiz, PhD
IPAO Director