Weekly Status Reports

The need to inform the project team and stakeholders of the project’s progress to date is usually fulfilled, in part, by some form of status report, typically circulated weekly. The weekly status report forms the public portion of your project communication and should be supplemented by reports, presentations, and status review meetings tailored to the audience. The weekly status report should include information on the following key areas of project progress:

  • Overall indicator (e.g. green, yellow, red)
  • Things that went well and Things that went wrong
  • Project phase
  • Performance to schedule (e.g. SPI)
  • Performance to budget
  • Project risk
  • Project issues
  • Project changes
  • Quality performance
  • Vendor performance

This is not meant to be a comprehensive list of every aspect of project performance you could be asked to report on; it covers the project areas that are most likely to attract interest from your audience. Use this list as a starting point, adding to or removing from it as you feel appropriate, then use it as a menu from which your team and stakeholders can select the items they are interested in.

The combination of several of these items is often referred to as a "dashboard" or "scorecard". Organizations which have Project Management Offices (PMOs) or Project Management Centers (PMCs) may have a standardized scorecard or dashboard, in which case this article won’t be of any help. If not, this article offers some suggestions on how to gather and report information for each of the menu items.

Overall Indicator

Busy senior executives who may have dozens of projects within their span of control will appreciate a visual reference which tells them at a glance whether this particular project is healthy, ailing, or dying. Use a color scheme to represent the status of the project, with green indicating a healthy project, yellow a project which is ailing, and red a project that is failing. These colors can stand alone or be accompanied by a one-sentence explanation of the status. These indicators are usually appropriate for the overall status and for each of: performance to schedule, performance to budget, and quality performance.

You must be clear and consistent with your color codes, especially if they aren’t accompanied by a textual explanation. Here are some standards you may find useful. Green indicates that your project is on, or ahead of, schedule and on, or under, budget. You may want to allow the project some leeway as to what "on schedule" or "on budget" means. For example, if you are using a Schedule Performance Index (SPI) as a schedule indicator, you may decide that anything greater than 0.98 constitutes "green". Use a similar approach for budget.

Yellow indicates that the project is behind schedule, over budget, or falling short of the quality goals set for the project. The difference between yellow and red is that yellow indicates problems that are being addressed by corrective actions within your control. When reporting that your project is in yellow status, you should also indicate what issues put the project in that state and the corrective action(s) that have been implemented, or will be implemented, to return the status to green.

Red is your cry for help. Red means that your project is so far behind schedule, so far over budget, or producing deliverables of such poor quality that the corrective actions implemented cannot recover it. Do not take a project directly from green status to red status, and do not report a project in red status without first implementing the corrective actions you defined to recover the project. Use a red status to elicit a response from your project’s executive sponsor, customer, or client: what you are telling them is that the project is seriously behind schedule, over budget, or has serious quality problems that the team cannot correct on their own. Don’t use this status indicator unless you are absolutely certain you and the team have done everything possible to recover the project.
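
Pulling these rules together, here is a minimal Python sketch of mapping a performance index to a color. The 0.98 green floor comes from the SPI example above; the 0.90 yellow/red boundary is an invented assumption, and in practice the move to red is a judgment about whether corrective actions can still recover the project, not a pure number:

    # A sketch of mapping an SPI or CPI value to a status color.
    # The 0.90 red boundary is an assumption for illustration only;
    # going red should also reflect whether the project is recoverable.
    def status_color(index, green_floor=0.98, red_floor=0.90):
        if index >= green_floor:
            return "green"
        if index >= red_floor:
            return "yellow"
        return "red"

    print(status_color(1.02))  # green
    print(status_color(0.95))  # yellow
    print(status_color(0.85))  # red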

Things that went well, Things that went wrong

This is a textual description of the project’s highlights for the reporting period. Things that went well will include major milestones passed, awards to project team members, activities or deliverables that are ahead of schedule, etc. Things that went wrong will include missed deadlines, unanticipated risk events, and deliverables that are behind schedule. Use point form for this section of the report and separate the two categories. You may want to deliver the bad news first and place the "Things that went well" at the end of this section.

Project Phase

Project phase is a simple description of the phase your project is currently in. This will be the one you communicated at your last gate, phase exit review, or business decision point meeting. For software development projects the phases will typically be one of: planning, development/build/implementation, QA test, or UAT. For projects using an iterative methodology, state the iteration being worked on.

Performance to Schedule

Performance to schedule is typically measured using a standard such as the Schedule Performance Index from the Earned Value Management (EVM) method. The Schedule Performance Index (SPI) measures the amount of work performed against the amount of work that was planned: SPI = BCWP/BCWS (budgeted cost of work performed divided by budgeted cost of work scheduled). Cost may be expressed in monetary terms or other terms such as hours of effort, days of effort, etc.
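
As an illustration, here is a minimal Python sketch of the SPI calculation. The task names and values are invented for the example, with effort expressed in hours:

    # Hypothetical task data: planned value (BCWS) and earned value (BCWP),
    # both expressed here in hours of effort.
    tasks = [
        {"name": "Design review",    "bcws": 40, "bcwp": 40},
        {"name": "Build module A",   "bcws": 80, "bcwp": 60},
        {"name": "Write test cases", "bcws": 30, "bcwp": 24},
    ]

    bcws = sum(t["bcws"] for t in tasks)   # work scheduled to date
    bcwp = sum(t["bcwp"] for t in tasks)   # work actually performed
    spi = bcwp / bcws
    print(f"SPI = {bcwp}/{bcws} = {spi:.2f}")   # SPI = 124/150 = 0.83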

The data you base this measurement on should come from the tool you use to control the execution of the work; for most project managers this will be MS Project. You can stick with the canned reports that most versions of MS Project offer, or export the data to a spreadsheet and work from that. Be aware that any summing you do in a spreadsheet exported from your MS Project file will add the total of a parent task to the individual costs of its child tasks, in effect double counting that work. You can avoid this by eliminating all the parent tasks from the sum.
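
One way to avoid the double counting is to drop the parent (summary) rows before summing. Here is a sketch using pandas; the file name and column names are assumptions, since MS Project export layouts vary by version and settings:

    import pandas as pd

    # Assumed export layout: one row per task, with a "Summary" column
    # marking parent tasks ("Yes"/"No") and numeric BCWS/BCWP columns.
    df = pd.read_csv("project_export.csv")

    leaf_tasks = df[df["Summary"] == "No"]   # drop parent (summary) rows
    spi = leaf_tasks["BCWP"].sum() / leaf_tasks["BCWS"].sum()
    print(f"Cumulative SPI: {spi:.2f}")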

There are several other decisions to be made regarding this metric. Should you report the cumulative SPI, or the SPI for the reporting period? Should you count partially completed tasks as BCWP? How should you show task completion? Be very careful with your answers to these questions and educate your audience in how this information is collected, extracted, and reported. An SPI of 0.75 for one week may not be cause for alarm in a project with a life cycle of 12 months, but a 0.75 SPI is likely to scare the pants off an executive sponsor expecting to see a cumulative SPI for the project!

Use a chart or graph to depict the SPI. There are two advantages to choosing this method over a simple number: first, a visual display is always more appealing to your audience; second, it allows you to show several weeks of past performance. A line chart lets you overlay a line at the 1.0 SPI level against which you can compare your project’s SPI. If you use a bar graph, you can overlay both a trendline and a line for an SPI of 1.0. Use a rolling view of the project, adding the newest week and dropping the oldest. Get creative with your colors so that the chart or graph is visually appealing. Give your audience something to brighten the picture, especially if you have bad news to report!
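
A minimal matplotlib sketch of such a rolling SPI bar graph with a 1.0 reference line follows; the weekly values are invented for illustration:

    import matplotlib.pyplot as plt

    # Hypothetical rolling window of the last eight weekly SPI values.
    weeks = [f"W{i}" for i in range(1, 9)]
    spi = [1.01, 0.99, 0.97, 0.95, 0.96, 0.93, 0.94, 0.92]

    plt.bar(weeks, spi, color="steelblue")
    plt.axhline(1.0, color="green", linestyle="--", label="On schedule (SPI = 1.0)")
    plt.ylabel("SPI")
    plt.title("Schedule Performance Index, last 8 weeks")
    plt.legend()
    plt.savefig("spi_chart.png")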

Performance to Budget

The cost of any software, hardware, rented facilities, etc. that the project incurs, in addition to labor, is all part of the project budget and must be included in the calculation of the performance indicator. You don’t have to use the EVM indicator, the Cost Performance Index (CPI), calculated as CPI = BCWP/AC (budgeted cost of work performed divided by actual cost); you may decide to use a simpler method, such as the budget set aside for the project to date divided by the money spent on the project to date.
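
Continuing the invented numbers from the SPI sketch above, a CPI calculation might look like this (both values are assumptions for illustration, in hours of effort):

    # BCWP carried over from the schedule sketch; AC is the actual cost
    # of the work performed, in the same units.
    bcwp = 124
    ac = 140
    cpi = bcwp / ac
    print(f"CPI = {bcwp}/{ac} = {cpi:.2f}")   # CPI = 124/140 = 0.89, over budget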

There are a number of questions to be answered for this metric. Should you report on this metric cumulatively or for the period reported? Should you report a cost when it is incurred, or when the invoice is paid? Should you report the cost of hardware when it is ordered? When it is received? When it is implemented? When the invoice is paid? Make certain that your audience, particularly your executive sponsor, is educated in the methods you use to collect, extract, and report this information.

If you are reporting the project Cost Performance Index (CPI), use the same chart or graph approach you used for your SPI. Alternatively, you may want to choose a different approach: try a line graph if you used a bar graph for your SPI, or vice versa. Use a rolling window approach for your graph or chart, with the same window you used for your SPI.

Project Risk

There is really no way to convert your project’s risks into a graph or chart, so you’re stuck with a textual representation of the risks. Don’t simply dump the project risk list into a page, or pages, and include these with your report; summarize the risks for your audience. Eliminate obsolete risks from your list; no one will be interested in a risk event which has a zero percent probability of happening. You may also want to eliminate risks which are one or more phases away, depending on the length of the phase. Try to limit your list to those risks which threaten current deliverables or deadlines.

One way to limit the number of risks you report is to use the "top x risk list" approach – it works for David Letterman. Use a formula to determine which risks to include in the list. For example, use your PI score (probability, impact) combined with a proximity score. Say you use a numeric scoring method, with numbers from 1 to 10 for probability, impact, and proximity: report the top 10 PIP (probability, impact, proximity) scores, as in the sketch below. If more than 10 risks exceed the PIP threshold, either report all the risks that exceed it or reduce the list based on some other criteria. Always report the criteria with the risks.
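
A minimal Python sketch of the top-10 PIP selection follows. The register entries and the choice to multiply the three scores are invented for illustration; whether you multiply or add them is part of the criteria you report:

    # Hypothetical risk register entries, each scored 1-10 for
    # probability (p), impact (i), and proximity (x).
    risks = [
        {"name": "Key developer leaves", "p": 4, "i": 9, "x": 6},
        {"name": "Vendor ships late",    "p": 7, "i": 6, "x": 8},
        {"name": "Test lab unavailable", "p": 3, "i": 5, "x": 9},
        # ... remaining register entries
    ]

    for r in risks:
        r["pip"] = r["p"] * r["i"] * r["x"]   # combined PIP score

    top_risks = sorted(risks, key=lambda r: r["pip"], reverse=True)[:10]
    for r in top_risks:
        print(f'{r["pip"]:>4}  {r["name"]}')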

Keep the status, or perceived effectiveness, of the mitigation strategy separate from the score that you give the risk. For example, if you have assessed a risk event with a large PIP score but believe you have implemented an effective mitigation strategy, don’t lower the PIP score; leave it high and indicate the effectiveness of the mitigation strategy with its own score, such as high, medium, or low.

Don’t just report the risk; also report the mitigation strategy chosen to deal with it. You may also want to report on the status of the strategy: has it been implemented? Is it still in the planning phase? If the strategy is a contingency, what is the trigger for implementing the plan? Edit the information from your risk register so that each risk event and mitigation strategy is represented by a few words or a short sentence at most. Use letters to represent things like status, and keep the information here as brief as possible. You may also want to use a color code to indicate the status of the risk: green for a mitigation strategy that is in place and working, yellow for one that is still being implemented, and red for one that is not having the desired effect.

You may also want to report on the balance in your contingency reserve for the project, or project phase. Report the total reserve amount, the amount used to date, and the amount left.

Project Issues

Project issues are neither planned tasks nor risks; they are problems or roadblocks which must be addressed and removed so that the team can meet its deadlines. Each issue should have an action assigned to a team member, with a due date. Use a "top x list" approach for project issues. Eliminate issues which have been closed – issues whose attendant actions have been completed. You may also want to eliminate issues with forecast completion dates several weeks in the future. Keep your list reasonably short; choose issues which are significant because they impact an important deliverable or deadline. Edit the issue and action descriptions so that they are expressed in a few words, or a short sentence.

Project Changes

There are several metrics that can be used to compile a report on project change: total number of change requests submitted, total number approved, total number rejected, total effort of approved change requests, and total cost of approved change requests. You can also report each of these totals for the reporting period alone.

Metrics on change requests lend themselves very well to charts and graphs. You can show the metrics for the reporting period using a rolling window approach similar to your SPI chart or graph. You may also want to include a second report showing the cumulative metrics for the rolling window. Drop the oldest reporting period and add the newest to keep the window a consistent size.

Quality Performance

Quality performance here refers to the quality of the products or deliverables under test, not the performance of the Quality Assurance team. Some of the data that can be gathered for a quality performance report includes:

  • Total number of test cases, and total per application
  • Number of test cases executed this reporting period, and total executed
  • Number of test cases passed this period, and total passed
  • Number of test cases failed this period, and total failed
  • Number of test cases closed this period, and total closed
  • Number of opened and closed test cases by severity

Most trouble tracking tools will have a reporting engine to help you create your report. The volume of data available makes charts and graphs effective for quality performance; it’s just a matter of selecting the right data to include without overwhelming your audience with details. Remember that a large number of tickets opened is not necessarily a bad thing, although it does indicate a potentially large volume of work; Quality Assurance is supposed to find all the bugs before your software system is given over to User Acceptance Testing. Also remember that a large number of severity 1 tickets has a different significance than a large number of severity 4 or 5 tickets. You may want to show different severities using different bars on your graph, or different colored lines.

You can expect the number of open trouble tickets to peak when the QA team gets to the meat of the tests, then gradually diminish as the development team works on closing them. This is a general rule, but if the open tickets on your project are accumulating faster than the development team could possibly close them, you are being alerted to a possible schedule slip. You may want to include a threshold in your report on open tickets so that when the cumulative total of open tickets surpasses the threshold, corrective action is taken.
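
A tiny sketch of such a threshold check; the numbers here are invented, and a real report would pull the counts from your tracking tool:

    # Hypothetical cumulative open-ticket count vs. an agreed threshold.
    open_tickets = 87
    threshold = 75

    if open_tickets > threshold:
        print(f"ALERT: {open_tickets} open tickets exceeds the threshold "
              f"of {threshold}; corrective action required.")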

Be inventive with your graphs and charts, and use colors as well as numbers to convey your message. For example, you can use red for severity 1 tickets, or red for open tickets and green for closed ones.

Vendor Performance

Your vendor performance report should reflect the size of the project or work outsourced to the vendor. A vendor supplying one server to be used in your project may not warrant a report at all, whereas a vendor performing a large sub-project for you may warrant a report containing all of the reports above. Make sure that any reports you expect from your vendor are described in the statement of work (SOW).

There are several pieces of information you may wish to report in addition to those described above. You may wish to include the amount of re-work required (if this does not form part of your quality performance reporting). You could also include the total budget for the sub-project, the amount invoiced, the amount paid, and the amount held back.

Conclusions

Remember that your weekly status report won’t remain the same over the course of the project. You won’t be reporting on quality performance during the planning phase of the project and you won’t be reporting many project risks during the QA phase of your project. Prepare your audience for the change in report content prior to communicating the first new report.

Constantly poll your team, starting with your executive sponsor, for their satisfaction with the report. Don’t be afraid to make changes to the report based on your audience’s shifting requirements: add new information, drop information of less interest, and change the "look and feel" of charts and graphs. If your executive sponsor’s requirements are significantly different from those of the stakeholders and project team, you may need to create two separate reports each week.

You’ll know that you are meeting your executive sponsor’s requirements for information when you start seeing your charts, graphs, and reports appearing in their presentations to other senior executives.
 

The tips and tricks described in this article are intended to help the project manager use the best practices promoted by the PMI. Project managers who are certified have already implemented those best practices. If you haven't been certified as a PMP® (Project Management Professional) by the PMI and would like to learn more about certification, visit the three O Project Solutions website at: http://threeo.ca/pmpcertifications29.php. three O Project Solutions also offers a downloadable, software-based training tool that has prepared project managers around the world to pass their certification exams. For more information about this product, AceIt, visit the three O website at: http://threeo.ca/aceit-features-c1288.php.