August 17, 2016 6:27 am

BBC’s Failure in Use of Statistics Also Applies to Reporting on Palestinians

by Hadar Sela

A BBC News headline that misled readers about a fatal stabbing attack in Jerusalem. Photo: Screenshot from BBC News via Honestreporting.com.

On August 10, the BBC Trust published the findings of a review on the impartiality of the BBC’s reporting of statistics in its news and current-affairs output; the review was commissioned in 2015. The report, together with accompanying documents, is accessible here.

Titled “Making Sense of Statistics,” the report makes for interesting reading, although it has a somewhat domestic focus. While it does not address the issue of the BBC’s presentation of casualty figures during the 2014 conflict between Israel and Hamas, some of its observations, conclusions, and recommendations are pertinent to the corporation’s portrayal of that topic both at the time and since.

On page 17, the report addresses the topic of audience expectations:

Audiences expect that numbers are accurate, factual and verified, that claims that are wrong or misleading are challenged, and that a range of evidence is used to put statistical claims into context. In other words, the BBC has to ensure that the public is informed accurately and impartially on the important issues of the day by helping audiences navigate through the statistical evidence and make sense of the numbers.

Regarding accuracy, there is a presumption of veracity – if a story contains a number, it must be true. Certainly, the audience research found that “adding statistics does increase the impression of accuracy”:

There is an assumption by the audience that figures quoted by the BBC will be accurate, factual and well verified and that the BBC sets out to be impartial in its use of statistics. 

Audience research report, Oxygen Brand Consulting

As regular readers know, the BBC did not independently verify the casualty figures and civilian/combatant casualty ratios that it presented to its audiences during the 2014 conflict. Although there is no publicly available evidence of its having carried out any such verification since the conflict ended, it continues to quote and promote unverified data sourced from interested parties and has even defended its own use of statistics provided by a terrorist organisation.

Ironically, on page 30 of the report, readers find the following:

We heard examples of the BBC choosing not to cover particular statistics which have either been sent to them in press releases or featured in other media coverage, due to concerns with the methodology behind them or the interpretations placed on them.

Page 68 of the report states:

In order to evaluate the importance or validity of a statistic, audiences often need to some extent to understand where it came from, who is using it and how it has been generated – in other words, the provenance of a statistic needs to be transparent. Good practice suggests that in order to achieve this, those descriptors should be routinely presented, although not necessarily as a full suite on every occasion. In some cases of course – such as a fleeting reference in an interview – it is not possible to give all this information. But where the story is the statistic, then transparency is vital for the audience as attribution can sometimes greatly affect the weight audiences give to particular figures. And yet, there appear to be occasions where statistics in BBC content are not clearly attributed, or where a link to the direct source (if in an online article) is not provided.

Appendix 2 of the report reproduces a hand-out given to participants at the end of BBC training sessions. Its “10 Golden Rules” include:

Taking a theory and trying to find statistics that fit it is a recipe for disaster, and one of the biggest causes of inaccuracy and misrepresentation. Make sure that whoever has provided the figures hasn’t fallen into that trap.

Check your source. Is it likely to be someone with a vested interest in interpreting findings in a particular way?

Clearly, those “golden rules” were not followed when the BBC unquestioningly promoted data provided, via a third party, by political NGOs engaged in lawfare against Israel.

On the one occasion, in August 2014, when the BBC did provide its audiences with some sound statistical analysis of the casualty figures, the article was subsequently altered and reframed to the point of being rendered meaningless.

On page 49, the report states:

Providing context aids interpretation. But it is not always enough. It is important that, as well as communicating the statistics, journalists are also able to provide interpretations around the sometimes-complex subjects in which they appear […] in order to help audiences to understand the relevance of the figures. And these interpretations need to be based on a balanced assessment of the evidence in order to provide audiences with an impartial reading.

Readers may recall that shortly after the 2014 conflict came to a close, the BBC News website published an article titled “Gaza crisis: Toll of operations in Gaza” about which we remarked:

But by far the most egregious aspect of this BBC feature is the fact that it makes no attempt whatsoever to provide BBC audiences with the crucial context of casualty ratios in the Gaza Strip as compared to those in other conflicts.

Let us assume for a moment that the UN figures quoted and promoted by the BBC are correct and that 495 children were killed during Operation Protective Edge and that none of those under 18s (as UNICEF defines child casualties) were in fact operatives for terrorist organisations. Even if we take those figures at face value, the percentage of children killed in the Gaza Strip during the summer of 2014 is, as Ben Dror Yemini has pointed out, considerably lower than the percentage of children killed by coalition forces (including British troops) in Iraq and by NATO forces (also including British troops) in Kosovo.

And even if we take the BBC’s claim that 1,462 (69%) of a total of 2,104 casualties in the Gaza Strip were civilians as being accurate (despite the fact that – as noted above – ongoing analysis suggests that the ratio of civilians to combatants may actually be lower), that would still mean that – as Col. Richard Kemp has pointed out on numerous occasions – there is nothing exceptional about that ratio.
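As a quick check on the internal consistency of those figures, the 69% civilian share does follow from the totals the BBC cited. The calculation below uses only the numbers quoted above; the rounding is ours:

$$\frac{1{,}462}{2{,}104} \approx 0.695 \approx 69\%$$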

On page 71, the report discusses an issue that will be familiar to many readers:

And yet, we received evidence that there remains concern in some quarters over the speed in which the BBC issues corrections when it gets the numbers wrong and the transparency with which they inform audiences that changes have been made (for example to online articles).

One can only hope that this review will prompt the BBC to take the verification of data originating from political NGOs and terrorist groups far more seriously than it has in the past, and that its focus will from now on be on meeting audience expectations of accurate, verified and impartial data rather than on the promotion of deliberately politicized statistics.
