DoView External Program Monitoring or Review

One-page PDF summary

There is a faster way of working out: what an agency is trying to do; whether its work is focused on its priorities; what it is currently measuring and tracking; what it is proving about its impact; and how best to discuss the program/agency's work with stakeholders in the course of the review.

In many situations, an external agency, or a review team, has to review or monitor another program or agency.

Reviewing and monitoring is often inefficient - wading through long documents to work out what is going on.

The DoView External Program Monitoring or Review Process is based on using a visual outcomes model (an 'outcomes DoView') of the program/agency's outcomes and the steps it is taking to reach them. This model is used as the basis for all review/monitoring work.

The DoView approach is used to answer the key set of review/monitoring questions. These are set out below. This work can be done by a review/monitoring team. However, in an ideal world the program/agency would have done much of this work itself and it would just be critiqued by the review/monitoring team.

1. What is the program/agency being reviewed trying to do?

The Solution

Draw an 'outcomes DoView' for the program/agency based on what can be pulled out of its documentation and from discussions with the program/agency's staff.

More on how to draw a program/agency DoView

The Problem

The program/agency being reviewed will have a range of documentation; this may, or may not, clearly set out its outcomes. In some cases, the program/agency may even be confused about its outcomes. Assessing clarity about outcomes is a key review/monitoring task.

The external review/monitoring team needs a methodology to be able to work out whether the program/agency being reviewed has a clear concept of its outcomes and priorities.

The Benefits

An outcomes DoView provides a formal visual model of what a program/agency is seeking to achieve in terms of high-level outcomes and the steps that lead to them. Because this logic is set out visually in the DoView, it can easily be critiqued by the review/monitoring team itself.

In some cases, where the logic is specified at a sufficiently detailed level, it can be validated against what the research literature says about what works for similar interventions.

In addition, it can be sent for review by other peer reviewers. It can also be used in discussions with stakeholders to see if they agree with the agency's priorities.

2. What are the agency's priorities and is its activity focused on them?

The Solution

DoView Line-of-sight analysis is a specific methodology that has been designed to visualize the extent to which a program is focused on its priorities. When using it, the outcomes DoView (developed in the first step) is marked up with the priorities the agency is currently focusing on.

Secondly, the actual pieces of work the program/agency is doing are identified (at the outputs-type level) and these are visually mapped onto the boxes in the outcomes DoView.

Line-of-sight alignment can be seen by looking at the number of pieces of work focused on each box within the DoView. Obviously, more work needs to be focused on higher priorities and less work on lower priorities. DoView Line-of-sight analysis allows a clear view of whether the program/agency has achieved strategic alignment.

See here for how DoView Line-of-Sight alignment works

It should be noted that the second part of this process - the Line-of-sight analysis itself - is usually best done by the agency and then given to the review/monitoring team to critique, rather than being constructed by the review/monitoring team.
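The counting step behind a Line-of-sight analysis can be sketched in code. This is a minimal illustration only, not part of the DoView software; the box names, priority levels, and work items below are hypothetical examples.

```python
# Minimal sketch of the counting behind a Line-of-sight analysis.
# Box names, priorities, and work items are hypothetical examples.
from collections import Counter

# Outcome boxes marked up with the agency's stated priority (1 = highest).
priorities = {
    "Improve service access": 1,
    "Raise public awareness": 2,
    "Maintain legacy reporting": 3,
}

# Each current piece of work (output-level) mapped onto the box it serves.
work_items = [
    ("Open regional office", "Improve service access"),
    ("Staff training programme", "Improve service access"),
    ("Media campaign", "Raise public awareness"),
    ("Quarterly legacy reports", "Maintain legacy reporting"),
    ("Annual legacy audit", "Maintain legacy reporting"),
]

# Count how many pieces of work sit on each box.
effort = Counter(box for _, box in work_items)

# Reading the counts against the priority order shows at a glance whether
# higher-priority boxes are attracting more work than lower-priority ones.
for box, priority in sorted(priorities.items(), key=lambda kv: kv[1]):
    print(f"priority {priority}: {box}: {effort[box]} pieces of work")
```

Here the lowest-priority box attracts as much work as the highest-priority one, which is exactly the kind of misalignment the visual analysis is designed to surface.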

The Problem

Even when you are managing a program/agency it is often hard to be sure that the agency is focused on its priorities. It is much harder for an external monitoring/review team to be able to work out whether a program/agency is actually focusing its work on its priorities.

Just relying on a program/agency's documentation and discussions with agency management and staff can leave a monitoring/review team with a lack of clarity as to whether the program/agency has a tight strategic focus.

The Benefits

The DoView approach cuts through the problem of monitoring/review teams not being able to quickly 'look into' an organization to work out whether its work is focused on its priorities.

The first stage - marking priorities onto the visual DoView - is a very powerful technique in itself. It means that the program/agency has to clearly specify what it is focusing on at the moment and, equally importantly, what it regards as a lower priority.

The DoView marked up with priorities can be used as a basis for discussion with stakeholders, to see whether their view of what the program/agency's priorities should be matches the agency's current priorities.

The second stage - mapping pieces of work onto boxes in a visual Line-of-sight analysis - is the most powerful and transparent way of quickly analyzing whether a program/agency's work is focused on its priorities.

3. What is the program/agency currently measuring and how is it tracking?

The Solution

Map the program/agency's indicators onto the visual outcomes DoView and assess them for measurability, whether targets have been set and whether they are measuring important outcomes or not.

It should be noted that not all boxes within the program/agency DoView are likely to have an indicator next to them.

The Problem

The documentation produced by the program/agency being reviewed is likely to include information on indicators that are being monitored.

However, it is often hard to work out from such documentation: 1) whether the indicators have been set out in measurable terms; 2) whether targets have been set (where appropriate); and 3) whether the indicators are measuring the 'important' rather than just the 'convenient to measure'.
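The three checks above can be captured in a simple data structure for working through an indicator list systematically. A hypothetical sketch (indicator and box names invented for illustration):

```python
# Sketch of recording indicators against outcome boxes and applying the
# three checks: measurability, targets, and importance.
# All indicator and box names are hypothetical examples.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Indicator:
    box: str                # the outcomes-DoView box it sits next to
    wording: str
    measurable: bool        # 1) set out in measurable terms?
    target: Optional[str]   # 2) has a target been set (where appropriate)?
    important: bool         # 3) measuring the important, not just the convenient?

indicators = [
    Indicator("Improve service access", "Clients seen within 30 days",
              measurable=True, target="80%", important=True),
    Indicator("Raise public awareness", "Website hits",
              measurable=True, target=None, important=False),
]

# Flag any indicator that fails one of the three checks for review attention.
flagged = [i for i in indicators
           if not i.measurable or i.target is None or not i.important]
for i in flagged:
    print(f"Review: '{i.wording}' (box: {i.box})")
```

Note that, as the text says, not every box needs an indicator; the point is that every indicator that does exist is tested against all three criteria.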

The Benefits

The visual approach of putting indicators next to the boxes they are measuring makes it easier to identify what is, and is not, being measured by the program/agency. The traditional approach is to attempt to do this by looking at indicators within table formats of one type or another - this is much less efficient than the DoView visual approach.

Putting indicators onto an outcomes model

4. What is the program/agency doing to prove its impact?

The Solution

The evaluation questions which have already been identified and answered by the agency/program should be put onto the DoView.

In addition, further impact evaluation questions that the review/monitoring team is planning to address can also be added to the DoView.

In either case, the questions are put next to the boxes in the program/agency DoView that they relate to.

More on putting evaluation questions onto a DoView

More on Duignan's Impact Evaluation Feasibility Check

The Problem

Programs/agencies differ in regard to how easy it is for them to prove that they are having an impact. It is often difficult for a review/monitoring team to quickly get to grips with what it is that the program/agency has actually established in terms of how much impact it is having on high-level outcomes.

Impact evaluation is a complex technical area which is often hard for a review/monitoring team to rapidly assess.

The Benefits

The visual approach of putting evaluation questions next to the relevant boxes in the DoView makes it much easier to work out whether an impact evaluation is looking at high, or lower-level, evaluation questions.

In addition, it avoids the confusion often caused in impact evaluation discussions where the same evaluation questions are being discussed but they have different wordings.

5. How should the work of the program/agency be conceptualized in discussions with stakeholders and when reporting on strengths and weaknesses of the program/agency in the review/monitoring report?

The Solution

A program/agency DoView can be used as an efficient and transparent way to communicate with stakeholders about a program/agency when discussing the review or monitoring work with them.

If the program/agency's priorities have been marked up on the DoView, stakeholders can be asked whether they agree with the program/agency's priorities.

In addition, a review/monitoring team can 'traffic-light' a DoView of the program/agency. This shows where there are problem areas and where things are going well. This marked-up DoView can then be used by the review/monitoring team in their report on the review or monitoring work.

A related technique - a Performance Improvement DoView - is used to show both areas of strength and weakness and how these are being addressed by a program going forward.

More on how to use a Performance Improvement DoView

The Problem

What programs/agencies do can be complex. Review/monitoring teams need an accessible way of framing discussions with stakeholders about a program/agency's work and priorities.

In addition, in a busy world review/monitoring teams need a fast way of communicating the results of their review/monitoring work.

The Benefits

The visual DoView approach provides a fast way of communicating what a program/agency is doing, and how well it is doing it, within a review/monitoring process.

Additional practical ways of using DoView in monitoring and review processes

Identifying review interview questions

The DoView can be used to identify questions to ask interviewees. There may be different groups of interviewees with different types of knowledge about the program. Go through the DoView and mark which boxes will be the focus of questions for which groups. See information on how to use a DoView to work out a set of questions. Use this approach to compile a separate interview schedule for each group (if there is more than one group of interviewees). 

Alternatively, for increased efficiency, simply provide the relevant DoView pages (having marked the boxes you want the interviewee to focus on) to interviewees either before, or during, the interview. Work through each of the boxes, asking them what points they want to make about that box (e.g. whether it is happening or not, how it can be improved). At the end of the interview, ask them if there are any boxes they want changed or added. If you wish, as you go through each box, you can ask interviewees to traffic-light it on the basis of whether it has been achieved so far or not: green = 5, green/yellow = 4, yellow = 3, red/yellow = 2, red = 1. Afterwards, total the scores for each box, take the average, and use it to assign a summary traffic light showing how interviewees viewed each box.
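The traffic-light averaging described above is simple arithmetic and can be sketched in a few lines. The box names and scores below are hypothetical examples; the 5-to-1 scale is the one given in the text.

```python
# Sketch of turning interviewee traffic-light ratings into a summary
# colour per box, using the scale from the text: green = 5 ... red = 1.
# Box names and ratings are hypothetical examples.
from statistics import mean

ratings = {  # box -> list of interviewee scores (1-5)
    "Improve service access": [5, 4, 4],
    "Raise public awareness": [2, 3, 1],
}

def summary_colour(avg):
    # Map the average score back onto the five-point colour scale.
    labels = {5: "green", 4: "green/yellow", 3: "yellow",
              2: "red/yellow", 1: "red"}
    return labels[round(avg)]

for box, scores in ratings.items():
    avg = mean(scores)
    print(f"{box}: average {avg:.1f} -> {summary_colour(avg)}")
```

So a box rated 5, 4, 4 averages 4.3 and gets a green/yellow summary light, while a box rated 2, 3, 1 averages 2.0 and gets red/yellow.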

Reporting back

Show a summary 'traffic-lighted' copy of the DoView. Use the titles of the DoView boxes to structure your report. Amend the DoView to show what you now think needs to happen to achieve the program's outcomes. Additionally, if you wish, you can use DoView Line-of-Sight (mapping projects onto boxes - see here) to show what interventions should be made to improve the program being reviewed.

Taking notes within the DoView file

Within your DoView file, copy the DoView page(s) you are going to use in any particular interview and name them after the interviewee. Put a new box on the page with the interviewee's name in it. Put any general notes about the interviewee (e.g. title) into the working notes section of the details table at the bottom of the window for that box.

As you do the interview, enter your notes about what the interviewee is saying about each box into the working notes section for that box. Include the name of the interviewee at the top of the working notes entry for each box. 

When you have finished the interviews go to File > Print as PDF. Select only the pages for the interviewees you want to review. Select Include details (in separate file) and save the PDF. Open the PDF which has the details in it and you will see all of the responses from each interviewee listed sequentially under the name of each box.


 Life is short - go visual, go faster™