We now need to start a constructive discussion on what a truly systemic Monitoring and Results Measurement (MRM) framework could look like (since evaluation does not play a big role in the current discussions, I am adopting the term MRM and avoiding M&E). In this post, I will take up the discussions on MRM and the DCED Standard for Results Measurement from the two guest posts by Aly Miehlbradt and Daniel Ticehurst, and add points from a parallel discussion on the very active forum of the Market Facilitation Initiative (MaFI). I will also add my own perspective, suggesting that we need a new conceptual model to define causality in complex market systems. Building on that, in my next post I will try to outline a possible new conceptual model for MRM.
After submitting a long comment in reply to Aly Miehlbradt’s post, I persuaded Daniel Ticehurst to write another guest post instead. Daniel’s perspective on the DCED Standard nicely contrasts with the one put forward by Aly, and I invite you all to contribute your own experiences to the discussion. This was not originally planned as a debate with multiple guest posts, but we all adapt to changing circumstances, right?
Dear Marcus and Aly, many thanks for the interesting blog posts on monitoring and results measurement, the DCED Standard, and how it relates to the recent Synthesis Paper on monitoring and measuring changes in market systems.
This is a guest post by Aly Miehlbradt. Aly shares her thoughts and experiences on monitoring and results measurement in market systems development projects. She highlights the Donor Committee for Enterprise Development (DCED) Standard for Results Measurement and its inception as a bottom-up process, and draws parallels between the Standard, her own experiences, and the recently published Synthesis Paper of the Systemic M&E Initiative.
In one of Marcus’s recent blog posts, he cites the SEEP Value Initiative paper, “Monitoring and Results Measurement in Value Chain Development: 10 Lessons from Experience” (download the paper here), as a good example of a bottom-up perspective that focuses on making results measurement more meaningful for programme managers and staff. Indeed, the SEEP Value Initiative was a great learning experience, and it is just one example of significant and ongoing work among practitioners and donors aimed at making monitoring and results measurement (MRM) more useful and meaningful. The DCED Results Measurement Standard draws on and embodies much of this work, and also promotes it. In fact, the MRM lessons that emerged from the SEEP Value Initiative came from applying the principles in the DCED Results Measurement Standard.