Like many researchers around the world, at the beginning of lockdown we were faced with a dilemma: how could we collect survey data without actually visiting respondents? We recognised that collecting data on important outcomes like employment and hunger was critical; the question was simply how to do it.
Eventually the National Income Dynamics Study: Coronavirus Rapid Mobile Survey (Nids-Cram) team settled on conducting a telephone survey, something that has quickly become the norm around the world.
One of the challenges with conducting any survey, including telephone surveys, is identifying a representative sampling frame and dealing with nonresponse. The people who decide to pick up the phone and participate are usually different from the ones who don’t, and in nontrivial ways. To get around this we decided to use an existing dataset that had a rich set of background information on all participants — Nids.
Nids is a panel study that began in 2008 as a nationally representative survey with the same individuals and their descendants being re-interviewed every two to three years between 2008 and 2017.
Because we have prior data on all individuals, we know who does and doesn’t respond, allowing us to correct for nonresponse in a way that’s usually not possible in telephone surveys.
This is a key advantage of the Nids-Cram survey. A second advantage is that because Nids respondents have been surveyed before, there is an existing relationship. Where telephone surveys often obtain response rates of 5%-20%, Nids-Cram had a response rate of 40%.
To reach as many of the individuals selected for the sample as possible, interviewers called the phone numbers obtained for the selected respondents in Nids 2017, as well as those of their family members and acquaintances.
In total 17,568 randomly selected individuals from Nids were called and, of these, 7,073 were interviewed.
The existing information collected on each individual (from Nids 2017) was then used to correct for potential biases due to nonresponse in Nids-Cram.
This correction means that the Nids-Cram weighted sample resembles the Nids 2017 weighted sample quite closely. Because some of the individuals from the first wave of Nids refused to participate (or could not be found) in the subsequent waves, over time Nids became less representative of the SA population.
The Nids statisticians could adjust for this to some extent, with the result that both Nids 2017 and Nids-Cram resemble other household surveys quite closely.
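The general idea behind this kind of nonresponse correction can be sketched in a few lines of code. The sketch below shows a simple weighting-class adjustment: respondents are grouped into cells using baseline characteristics, and each respondent is up-weighted by the inverse of the response rate in their cell, so that respondents stand in for the nonrespondents who resemble them. This is only an illustration of the principle, not the actual Nids-Cram procedure (which is more sophisticated); the cell labels and data are hypothetical.

```python
# A minimal sketch of weighting-class nonresponse adjustment.
# All names, cells and data here are hypothetical illustrations,
# not the actual Nids-Cram weighting procedure.
from collections import defaultdict

def nonresponse_weights(baseline, responded):
    """baseline: list of dicts, each with an 'id' and a 'cell'
    (e.g. an age-band-by-location group from the earlier survey).
    responded: set of ids that answered the telephone survey.
    Returns, for each respondent, a weight equal to
    1 / (response rate in that respondent's cell)."""
    totals = defaultdict(int)    # people sampled, per cell
    answers = defaultdict(int)   # people who responded, per cell
    for person in baseline:
        totals[person["cell"]] += 1
        if person["id"] in responded:
            answers[person["cell"]] += 1
    weights = {}
    for person in baseline:
        if person["id"] in responded:
            cell = person["cell"]
            weights[person["id"]] = totals[cell] / answers[cell]
    return weights

# Toy example: four people in two cells; one rural person does not respond.
baseline = [
    {"id": 1, "cell": "urban"}, {"id": 2, "cell": "urban"},
    {"id": 3, "cell": "rural"}, {"id": 4, "cell": "rural"},
]
w = nonresponse_weights(baseline, responded={1, 2, 3})
# Urban cell: 2 of 2 responded, so weight 1.0.
# Rural cell: 1 of 2 responded, so the one rural respondent
# gets weight 2.0 and also represents the missing rural person.
```

The point of the reweighting is exactly what the article describes: because the baseline characteristics of nonrespondents are known from Nids 2017, respondents who resemble them can be counted more heavily, which is not possible in a cold-call telephone survey.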
For example, one can compare the weighted Nids-Cram data to Stats SA’s 2018 General Household Survey (GHS). The weighted Nids-Cram respondents are 78% black African (79% in GHS), 53% female (52% in GHS), 49% have matric (52% in GHS), 22% have medical aid (16% in GHS) and 64% are in a household that receives at least one government grant (66% in GHS).
This is encouraging, as it shows the Nids-Cram data resembles the SA population in important ways.
One area where the original sample of Nids 2017 was not truly representative of the population was on the topic of employment. The respondents in Nids 2017 had a higher rate of employment than in surveys such as the Quarterly Labour Force Survey (54% in Nids 2017 against 47% in QLFS 2017 — note this is for those aged 18 to 64).
Because the Nids data and all the procedures used to calculate weights are in the public domain, this issue is well documented and well understood. There are more than 100 peer-reviewed academic articles using Nids data, for example.
But it does mean that the average person in Nids-Cram is slightly more likely to be employed than the average person in SA.
While one should not gloss over this fact — and all the researchers have been careful to put this caveat upfront in their research — it is also true that the trends seen in Nids-Cram are indicative of the trends in the population at large.
For example, the overwhelming finding from all the employment papers in Nids-Cram is that job losses were heavily concentrated among those already disadvantaged in the labour market (the poor, women, black Africans, and so on). Furthermore, while job losses might be 2.5-million or even 2-million (rather than the 3-million estimated from the Nids-Cram survey), this does not change the overall finding that there have been unprecedented job losses in this period.
All of the above is to say that there has been a tremendous amount of work put into the Nids-Cram data collection exercise. As researchers with extensive experience in sampling, survey design and dealing with nonresponse, we believe researchers and policymakers can feel confident that this survey is speaking to the real underlying labour market and welfare dynamics in SA at this time.
*Andrew Kerr (associate professor at the University of Cape Town), Rulof Burger (associate professor at Stellenbosch University), Cally Ardington (professor at UCT) and Nic Spaull (senior researcher at Stellenbosch University). All authors are in the economics departments at their respective universities. For more information on the Nids-Cram survey, visit http://www.cramsurvey.org