Debate

How should the future evaluation of development aid be designed?

Although results are increasingly emphasized, evaluation capacity in development assistance has declined due to restructuring and the shortcomings of the evaluation authority SADEV. The expected closure of SADEV raises important questions about how Swedish development assistance should be evaluated in the future, write Anders Olofsgård and Jesper Roine.

In May this year, the Swedish Agency for Public Management (Statskontoret) published the report "Evaluation of Swedish development assistance - a review of the evaluation activities". The report highlighted many shortcomings in the evaluation work and emphasized that the independent evaluation authority SADEV had not lived up to expectations and that a "restart" would be desirable. Against this background, a working group within the Ministry of Foreign Affairs recommended in June that the government decide to wind up SADEV by 31 December 2012. The working group also proposed that a comprehensive analysis be prepared with proposals for how the evaluation function within development assistance should be organized in the future. Anyone who has followed the aid debate in Sweden in recent years is probably not particularly surprised. There is much to be said about what has happened, but the most important thing in this situation is to think through carefully how development assistance evaluation as a whole should be organized. The question concerns not only SADEV's future but also the roles of the Ministry of Foreign Affairs and Sida in the chain that constitutes the evaluation process.
To begin with, one can of course ask whether special resources are needed for the evaluation of development assistance. Here the answer is a clear "yes". The importance of evaluating the results of public activities is increasingly emphasized, and research on the evaluation of development assistance has at the same time been one of the most dynamic fields in economics. Paradoxically, the capacity to carry out evaluations in Swedish development assistance has deteriorated in parallel with this, due to restructuring at Sida and the Ministry of Foreign Affairs and to SADEV's shortcomings. Strengthening evaluation capacity is thus, if anything, more important today than when SADEV was established.

Independent authority?
The next question concerns whether an independent authority is necessary. The answer here is a more qualified "yes". An independent authority with permanent employees responsible for the evaluation work may sound like a guarantee of independence from both the bodies being audited and the political principal. Experience from SADEV indicates, however, that this is not necessarily the case. There are also examples of other ways to guarantee independence (e.g. the IEG within the World Bank), so a leaner organization within, for example, Sida or the Ministry of Foreign Affairs that mainly procures external expertise should not be ruled out. In the end, actual independence usually comes down to competence and relevance. A related question concerns the extent to which evaluations should be self-initiated. The existence of this possibility is a must for credibility when examining the Government's work, but Statskontoret's report points out that the Foreign Ministry's staff often perceived SADEV's self-initiated evaluations as irrelevant. This was partly due to a lack of quality, but also to the wrong focus or the wrong timing in the decision-making process. The advantage of commissioned reports is that they are more likely to have an actual impact on decision-making. There is thus a balance to be struck here, which must be respected without compromising the credibility of the evaluation function. The important thing is that independence not be equated with being disconnected and irrelevant; meaningful evaluation requires communication. In the end, a clear mandate and competence matter more than the organizational form, which leads us to the following recommendations.
Closer connection to development research
One of the most striking points in Statskontoret's report is that SADEV has not yet carried out a single evaluation of the effects of development assistance, i.e. the consequences for those who are the actual recipients of the aid. The focus has instead been on processes and internal efficiency. This "process focus" seems to be partly justified on the grounds that evaluating the effects of development assistance falls within the area of "research", which must be distinguished from "evaluation". In our eyes this is a strange and unfortunate attitude, not least in light of the fact that evaluation methodology is currently receiving enormous attention in development economics. How can one justify not taking advantage of researchers' progress when it applies to the very areas the authority is set to monitor? The new methods that have emerged are not based on advanced theoretical models or wild assumptions, but are very concrete and data-driven. However, they require closer cooperation between implementing and evaluating units, as they are based on comparing the situation before and after an intervention, both in areas covered by the intervention and in similar areas that have not been affected. This method is not beyond criticism, and above all it is not suitable for all types of projects, but many interventions are very well suited to being evaluated in this way. Ensuring competence in modern evaluation methodology appears to be one of the most important changes that must take place, and here the connection to relevant research environments is key.
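The before-and-after comparison described above is what evaluation researchers usually call a difference-in-differences design. A minimal sketch of the underlying arithmetic, using invented numbers purely for illustration (nothing below comes from the article or from any actual evaluation):

```python
# Minimal illustration (invented numbers): a difference-in-differences
# comparison. Outcomes are average values of some indicator, e.g. household
# income, measured before and after a project.

def diff_in_diff(treated_before, treated_after, control_before, control_after):
    """Change in the intervention areas minus the change in comparable,
    unaffected areas; the latter stands in for what would have happened anyway."""
    return (treated_after - treated_before) - (control_after - control_before)

# Hypothetical data: both groups improve, but the treated areas improve more.
effect = diff_in_diff(treated_before=100, treated_after=130,
                      control_before=100, control_after=110)
print(effect)  # 20 -> the improvement not explained by the general trend
```

The point is simply that the change in the comparison areas is used to filter out what would have happened anyway, which is only possible if data is collected before the intervention starts - hence the need for close coordination between those who implement and those who evaluate.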

Participation and transparency
Good evaluations require participation and transparency throughout the process. As mentioned above, good evaluation generally requires an opportunity to compare the situation before and after an intervention. This in turn requires good coordination and communication between those who implement the intervention and those who will evaluate it; that is, the possibility of evaluation must be part of the actual design of the project. A key to better evaluations is thus that the evaluation function is involved early in the process.

In what city?
The two points above have a direct bearing on where the evaluation function is located. As Per Molander pointed out in the study he wrote when SADEV was to be formed ("An independent evaluation body for development cooperation", 2004), proximity to a city with expertise in development research is of great importance. Stockholm and Gothenburg were primarily singled out as suitable cities, with Stockholm also having the advantage of being close to the government and Sida. As is well known, this is not what happened: SADEV was placed in Karlstad when the Armed Forces were downsized and government jobs were relocated. The arguments for locating the evaluation function in Stockholm or Gothenburg appear at least as strong today, precisely because of the two previous points.

This is a debate article. The authors are responsible for the analysis and opinions in the text.

Do you also want to write a debate article for Utvecklingsmagasinet? Contact us at opinion@fuf.se