It is widely acknowledged that research evidence is an important ingredient of decision-making in development efforts, whether in policy, programme or practice decisions. It is equally acknowledged that research evidence is not the sole ingredient: the context of decision-making shapes if and how research evidence influences the decisions made. Parliament is arguably a context where research evidence has to compete with many factors to inform the decisions of Members of Parliament (MPs), chief among these being politics and individual interests.
A recent workshop on evidence-informed decision-making in parliament, held for staff of the Parliamentary Research Service (PRS) in Kenya on May 9, 2016 in Nairobi, gave us an opportunity to reflect on the struggles of evidence use in the parliamentary context. We share some of these reflections in this blog.
The workshop was facilitated by two PRS staff who had recently completed a one-month internship at the UK Parliamentary Office of Science and Technology (POST) under the SECURE Health programme, implemented by AFIDEP in collaboration with other organisations. The programme seeks to strengthen individual and institutional capacity to access and use health research evidence in decision-making in Kenya and Malawi.
The purpose of the workshop was for the two staff to share lessons from the internship, and for participants to identify ways of adopting these lessons to improve the quality of the evidence they provide to MPs in Kenya to support decision-making.
From the discussions at this workshop, it was evident that efforts to promote and facilitate the use of research evidence within parliament bring up many unique challenges.
Is research evidence relevant in parliament debates?
As the workshop started, one issue that arose was the definition of the evidence required for decision-making within parliament. Some of the staff challenged the workshop's focus on research evidence, arguing that research evidence is often not relevant to the decisions made in parliament. Indeed, many noted that the evidence they use in their day-to-day work to prepare summaries for parliamentary committees is rarely research evidence, but rather performance and statistical reports from government ministries and agencies, newspaper and other media reports, reports of previous parliamentary committees, past Hansard issues, statutes, expert opinions and past budgets, among others.
This discussion raised the issue of the quality of the evidence that staff provide to committees, especially given that scientific evidence from research is seen as the most credible and yet is not the evidence commonly used by these staff and by legislators. It was argued that existing research reports are often not relevant to the debates in parliament. The staff were reminded that the ideas, opinions and editorials contained in media reports, for instance, are rarely put through the same rigour as research, and would be considered a weaker type of evidence to inform legislation. Indeed, the group agreed that as policy analysts within parliament, they had the responsibility of appraising the quality of such non-scientific evidence and advising committees on its credibility.
So then one staff member asked, “How do you assess the quality of such non-scientific evidence that we commonly use?” The group highlighted a number of factors to look out for when reviewing non-scientific evidence, including the sources and authors of a report and its date, among others. Given the lack of clear criteria for assessing non-scientific evidence, the group agreed to develop an in-house guide on appraising non-scientific reports, so as to improve the quality of the evidence summaries they provide to committees.
This discussion echoes an observation by Phil Davies (2007), a former Deputy Chief Social Researcher in the UK government, who pointed out that a policymaker’s hierarchy of evidence tends to place research evidence at the bottom, behind media reports, lay evidence, urban myths and conventional wisdom.
Should you present to parliament committees research evidence that they do not agree with?
Often, MPs are mainly concerned with how a decision will yield political mileage, rather than with ensuring that the best evidence informs the decision made. The relationship between MPs and parliament staff is also defined by power, with MPs holding the power and staff acting as subordinates. Within this context, staff struggle to balance presenting objective research evidence on an issue against political correctness, so as not to be seen as antagonising MPs.
At the workshop, one staff member asked, “Should we present objective evidence to committees even when we know that it antagonises the views of committee members?” In response, another wondered, “What is the point of presenting evidence that you know the committee does not agree with and will therefore be thrown out?” A third member of staff then reminded his colleagues: “Although we work in a very political and interest-driven setting, we cannot ‘bend’ evidence so that it speaks to the interests of legislators. We are impartial and should provide objective evidence regardless of the views and opinions of committee members.”
This discussion points to the real struggles that these staff face when preparing evidence summaries on issues that legislators are debating. It also raises the question of how well evidence-informed decision-making (EIDM) is appreciated among technical staff within parliament. While the SECURE Health programme has increased the appreciation of EIDM among the 11 research staff trained in the Kenyan parliament, this discussion shows that staff who have not yet taken part in such training could benefit from future investments in capacity-building programmes of this nature.
Sometimes legislators mistrust research evidence, or they just don’t understand it
Staff argued that legislators sometimes even mistrust them, fearing that they might be proxies for civil society, NGOs and donors who use research to push their own interests. As such, legislators sometimes challenge research findings presented to them and dismissively term them ‘compromised’ or ‘manipulated’. At other times, staff argued, legislators consider research findings complex and difficult to understand, and research is therefore not their preferred source of evidence.
Some important issues for the evidence use movement
The discussions at this workshop highlight a number of issues for the evidence use or EIPM/EIDM movement in Kenya and in Africa at large. One is the question of how to make research evidence more relevant and appealing to parliament. Two is the need to provide clear guidance on how to appraise the quality of the non-scientific evidence that parliaments use. Three is the need for sustained capacity-strengthening and sensitisation efforts on EIPM/EIDM for both staff and legislators in African parliaments. Such efforts could address some of the struggles that the Kenya parliament staff charged with providing evidence to legislators are grappling with.
The Kenya parliament staff ended the workshop by agreeing on a roadmap for tackling the challenges they face in their efforts to provide evidence to parliamentary committees. Action points included: adopting information technologies to improve the way they store and share evidence; defining better ways of disseminating PRS evidence products among legislators; and identifying ways of measuring the impact of PRS evidence products in parliament. This is in addition to the action, noted earlier, of defining criteria for assessing non-scientific evidence.
To contact the authors: Rose.firstname.lastname@example.org; Solomon.email@example.com; and Evans.firstname.lastname@example.org