Damn Statistics


February 25th 2012
Published: January 30th 2013

As a former social researcher and current monitoring and evaluation person, I should be the one defending research and statistics. Sometimes, however, I find it difficult. Statistics are undoubtedly the trump card in a wide range of arguments: “How can you say Arsenal are weak in the air when only 12% of the goals they have conceded have been headers?” or “You say Africa is catching up with the rest of the world? More people there live on under $1 a day than in the rest of the world put together.”

For charities desperate to raise funds and for government agencies desperate to show success, statistics have become the holy grail. Now, I don’t think I am boasting when I claim that the organisation I work for has one of the best monitoring and evaluation systems around: my boss is often asked to visit other organisations, including much larger ones, to offer training, and recent business consultant visitors commented that it would put many businesses to shame. Knowing the detail of how we collect all our information, and the lengths we go to to ensure it is correct, makes me very proud… but also aware of how unreliable it is. And, as I say, I think we are one of the best.

Having researched and evaluated in the UK for a while, I know that much political and social research can be picked apart fairly easily. It may be a culturally skewed view, but I don’t think it is anything compared to Uganda. I would say this is due more to a lack of transport than a lack of expertise: it is much harder to reach different populations, and the differences between isolated populations are often greater, making it harder to generalise.

A good example of problematic data collection is HIV statistics. Our strategists felt that reducing HIV rates in the areas where we work would be a good way of demonstrating the impact of our work. However, trying to collect the data proved near-impossible. It should be easy, I was told, as the district health authority has to submit figures to central government on an annual basis. The difficulty, however, is that nothing happens if they don’t. Even when they do, they submit data from health centres that only test pregnant women. Some districts with populations of 100,000 would return figures based on the results of just 100 people tested.

The sketchiness of government statistics means that development agencies such as USAID collect their own figures. They may collect from larger samples, but they only get data from the organisations they fund, not from a geographical or social cross-section of the country. What I was left with was no statistics at all for some districts and, for others, estimates ranging from 12% from one source to 7% from another. This is what the Ugandan ‘success story’ of the 90s and the recent criticism of Museveni’s ‘changed approach’ are based upon.
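To put some rough numbers on it, here is a little simulation of why a hundred clinic tests tell you very little about a district of a hundred thousand. The prevalence figures in it are made up purely for illustration, not real Ugandan data, and “pregnant women tested at clinics” standing at a different rate from the district as a whole is an assumption for the sake of the sketch.

```python
# A minimal sketch (made-up numbers, not real data) of why a sample of ~100
# antenatal-clinic attendees says little about a district of 100,000 people.
import random

random.seed(1)

TRUE_PREVALENCE = 0.07     # assumed prevalence in the district as a whole
CLINIC_PREVALENCE = 0.12   # assumed prevalence among those actually tested
SAMPLE_SIZE = 100          # the kind of sample some districts really return

def estimate(prevalence: float, n: int) -> float:
    """Share of positive results in a random sample of n people."""
    return sum(random.random() < prevalence for _ in range(n)) / n

# Repeat the 'survey' a few times: the estimates track whoever happened to be
# tested, not the district, and swing by several points through chance alone.
for trial in range(5):
    print(f"trial {trial + 1}: estimated prevalence "
          f"{estimate(CLINIC_PREVALENCE, SAMPLE_SIZE):.1%} "
          f"(assumed true district figure: {TRUE_PREVALENCE:.0%})")
```

Even before you ask who gets tested, a sample that small bounces around by several percentage points on luck alone; add the fact that clinic attendees are not a cross-section of the district, and the ‘official’ figure starts to look shaky.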

Even the simplest things can become difficult to collect. We are working with young people (up to 24 or 30, depending on the funder), but poor rural people rarely know their age. Income is no easier to pin down: someone will own acres of land, produce enough food for the whole community and have enough livestock to make them the wealthiest person for miles around, yet end up as a statistic in an Oxfam video because they earn less than $1 per day.

It is with this in mind that doubts creep in when I hear that a study has found that 48 women are raped every hour in the Congo. That is a shocking statistic, and the fact that many women are raped in Congo is obviously the main issue, rather than me being a pedant about the number. It is just the certainty with which these figures are reported that worries me.

One issue is translating any question, and by this I don’t just mean linguistically but also culturally. We have just conducted our annual survey: young people who could speak the local language went out to the communities where we work and asked the questions. We spent a day training, largely translating the survey into the local language. It was not just the technicalities of the language that the team argued over, but also how you approach questions, especially on sensitive subjects like sex. I can’t imagine how hard it would be to ask someone if they had been raped.

Before my time with the organisation, we worked on a project with the UN Food Programme. It was based in one of the camps for people displaced by the violence in the North. A baseline survey was conducted with families sheltering in the camp, and one of the questions asked what they had lost in the war. A rumour went around the camp that the UN was going to compensate people for what they had lost, and it turned out that the people in the camp had owned more land and more cows than there are in the whole of Uganda!

The other issue I have with ‘figures’ is that they are sometimes utterly meaningless. Take one I heard in a discussion the other day, used to illustrate a point about how aid is akin to a new colonialism: ‘there are more white people in Africa than in colonial times’. There are also more Africans, and Indians, and Chinese people. ‘There are more Africans in America than during slavery.’ It doesn’t mean anything! A statistic on its own is not an argument or a basis for funding or programming.

I guess that as long as the general picture is accurate, the detail doesn’t matter too much (apart from to people like me!). What worries me is that people stop at a number or a piece of data, especially if it serves their purpose, and don’t look beyond it to try to understand where it comes from or what it means.
