
Why you can trust (some of) the research stories in the media

May 31, 2011

We’ve all seen articles like these in the media:

Eight out of 10 owners feed their pets anything from spaghetti hoops and ice cream to lobster thermidor, duck breast, veal scallopini and after-dinner mints, according to a Petplan study.  (Daily Express 11.09.10)

Popular social networking site Facebook is cited in one in five divorces in the United States, a survey by the American Academy of Matrimonial Lawyers suggests. (Daily Mail 2.12.10)

Drunk car passengers can be as dangerous as drivers who drink, according to a new survey.  As many as 76% of car owners have driven drunken passengers home from parties, the poll by insurance company swiftcover.com found.  (Daily Mirror 20.12.10)

On the face of it, research of this type appears biased and invalid because it has been commissioned by a particular organisation with the sole intention of attracting press attention. Many people are cynical about market research as a result of seeing too many stories like this.

So just how cynical should we be about market research? 

I don’t think it is market research that we should be cynical about; it is the media coverage of market research that should concern us more.

The market research that we are talking about here has two distinct phases.

  • Phase 1: Doing the research (driven by the researcher / brand) – Planning the methodology, writing the questionnaire, undertaking the fieldwork, processing and analysing the data, and reporting to the client. Organisations commission surveys all the time to help them plan what to do with their products or services.
  • Phase 2: Media coverage about the research (driven by the journalist / brand) – What you read about the research in the media. 

It is understandable that if a brand gets ‘good news’ from a survey, it will want to issue a press release and gain some positive publicity.

But doesn’t that mean it is biased?

Not necessarily. If the research was conducted for a big brand, it probably commissioned a market research agency so it could be sure the results were sound enough to base business decisions on. In the UK, credible market research agencies abide by the Market Research Society (MRS) Code of Conduct (or an equivalent). This means:

  • The survey must be impartial:  MRS members are required to write balanced and unbiased questionnaires and reports, meaning that the findings from the survey will be impartial and independent.   
  • The reporting must not be misleading:  Members of the MRS are required to ensure that the results of their research are not misrepresented to the public.  The MRS member will have checked the press release to ensure that the information given in the article is not misleading and is a true representation of the findings.

So how can I tell if research I read about is credible?

Firstly, look for details of an independent research agency in the article (for example Ipsos MORI, TNS-BMRB, Millward Brown, GfK NOP, YouGov – further list here). These organisations are required to produce impartial research. If they are named in an article, you should be able to trust the source of the research and conclude that the research was impartial and the findings have some validity.

Secondly, look out for contextual information about the research, which should be provided in the article. Does it say:

  • How many respondents completed the survey (the more the better – look for hundreds or more; see the rough sketch after this list)
  • Who they were (current users of a product or ‘convenience samples’ such as people registered on the company website would be more likely to give a positive response)
  • What the sampling method was (look for the word ‘representative’)

This allows you to judge whether the research had methodological limitations, and consider any conclusions in context. 
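
To give a rough sense of why the number of respondents matters, here is a small sketch – my own illustration, not taken from any of the surveys quoted here. Assuming a simple random sample, the standard 95% margin of error on a reported percentage shrinks roughly with the square root of the sample size, so a few hundred respondents gets you to within about ±5 percentage points, while a few dozen does not.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst-case (p = 0.5) margins for a range of sample sizes
for n in (50, 100, 500, 1000, 2000):
    print(f"n = {n:>4}: roughly +/- {margin_of_error(n) * 100:.1f} percentage points")
```

Running this gives roughly ±14 points at 50 respondents, ±10 at 100, ±4 at 500 and ±3 at 1,000 – which is why credible polls of the general public tend to quote samples of 1,000 or more.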

So all ‘credible’ research articles in the media can be trusted?

Yes and no. 

Credible research may be reported in a misleading way by journalists, or context or supporting data may be missing from a published article. Unfortunately, journalists do not always publish stories as set out in press releases; sometimes they read publicly available reports and draw their own conclusions, or ‘selectively report’ the bits that they think will be interesting to their readers. Take this story, for example:

According to a MORI poll, the fictional wizard is officially the most famous face in Britain, recognised by 96 per cent of the nation.  Second in the poll was Sean Connery as suave spy James Bond. 

The most famous face in Britain?  Really?  More famous than the Queen?

If I read this in the newspaper I would trust the source (which was Ipsos MORI), but I would think there must be more to the survey than meets the eye. This is a typical example of selective reporting, and if you compare the newspaper article to the press release you will see just how much information was missed out. Looking at the original press release gives me confidence that the methodology was extremely robust, and it puts me in a position to judge the conclusions against the clearly explained, limited set of ‘famous faces’ tested in the questionnaire.

Once something gets out it is difficult to take back.  Even if a retraction or clarification is published the next day it is really too late because the context is lost.

However, don’t let this obscure the fact that any research entering the public domain becomes subject to public scrutiny. You can choose to find out more about a survey if you want to assess its validity. With a bit of Google searching you can usually find the original press release (from the research agency or the client), which will give you much more information and a more balanced view against which to judge the research.

But why are there so many ‘Good news stories’?

If you want to put the range of ‘good news stories’ in context, you should also consider the huge number of studies that are never reported in the media at all.

Think about it – thousands of companies are undertaking research on any given day asking millions of questions, but you don’t hear about it.

  • Many companies do not publish research as it gives their competitors an advantage.
  • Some research projects are quite boring and would not be of general interest.
  • Every survey asks tens of questions, but not all of the results make an interesting news story.
  • If the research is inconclusive and the client doesn’t get any juicy headline statistics from the survey, they would not be able to ‘use’ the research in this way.
  • If a company finds out their product is failing, they won’t want to tell the world about it.

If you want to be cynical about ‘lies, damn lies and statistics’ you are probably better thinking about the stories and brands that you don’t hear about (and wondering why they are being so ‘quiet’), rather than the ones that have chosen to make themselves subject to public scrutiny through publishing their findings.
