
General surveys: Rethinking the design for better results


By Brent Larson

Analyzing surveys can be an insightful undertaking that yields a wealth of learning. In 2012 and 2016, we conducted general surveys to determine how national plant protection organizations (NPPOs) assessed themselves against the implementation of the IPPC and the International Standards for Phytosanitary Measures (ISPMs). We also asked about the reasons behind high and low implementation. It was interesting to see some parallels between the two surveys, so we decided to take a deeper dive and compare their results to find further insights.

In 2019, we analyzed the survey design and the corresponding responses. Both surveys showed interesting results, with similar patterns of implementation: the most highly implemented NPPO responsibilities were export certification, inspection measures and phytosanitary treatments. On the other hand, NPPOs reported the least implementation of their responsibilities in general reporting, information provision and interaction with other contracting parties.

We saw that the primary factor influencing the level of ISPM implementation was the availability or lack of qualified personnel. Financial and physical resources, as well as stakeholder cooperation and policy support, were other influencing factors.

Survey shortcomings

On closer inspection, we realized that some results were not comparable between the two surveys because of how the questionnaires had been designed. For instance, questions on the factors influencing the level of implementation were modified in the second survey, but this led to multiple possible interpretations of the answer options and to responses that varied from those of the first survey. In addition, the answer options (rating scale) for the implementation level were changed in the second survey. Such variations generated results that did not allow us to determine changes or trends in implementation.

The surveys were intended to cover all the Convention’s articles and NPPO responsibilities, but this comprehensive reach sometimes led to unnecessary duplication within the same subject area.

While there were no large sample imbalances with respect to region and NPPOs’ income levels, there was some evidence that NPPOs participating in the surveys generally implemented the IPPC to a higher degree than non-participating NPPOs. Actual IPPC implementation might therefore be slightly lower than these results suggest.

With 71 respondents in 2012 and 93 in 2016 (out of 182 contracting parties at the time), disaggregating regional differences proved to be a challenge. Moreover, only 45 NPPOs participated in both surveys, and not all of them answered all questions. This constrained our ability to detect changes, since the differences between 2012 and 2016 could in part be due to the different composition of the two samples. Restricting the analysis to the 45 NPPOs that participated twice allows detection of only very large changes, which were not found, while small differences remained hidden.

Improving the third general survey

Our key learning points boil down to developing a more purposeful questionnaire design, taking a strategic approach to inquiry that reflects day-to-day NPPO practice, and streamlining the questions to avoid overlap and make answering easier.

At the outset, data needs and desired objectives should be more clearly established. Collect only data that is necessary, useful and meets the objectives. Furthermore, avoid asking similar questions that cause duplication, and phrase questions so they are interpreted uniformly, allowing comparability between the results of different surveys. We have seen that even small changes in the wording of questions and the related answer scales can affect the comparability of the data. Practical changes to the questionnaire, such as allowing respondents to skip irrelevant questions, will also make a difference. In addition, efforts to pre-test the questionnaire should further help ensure it meets our objectives.

While the surveys have a general scope, the questionnaires do not necessarily have to cover every NPPO responsibility stated in the IPPC. Also, if the data collected will not be informative, is already available from other sources or will not be used in the final analysis, then no questions for this data should be included. Both surveys covered the full breadth of the IPPC responsibilities, but they were highly structured and ‘legalistic’ in approach, closely following the articles laid out in the IPPC and individual ISPMs. Approaching the inquiry through the lens of an NPPO could allow for more precise questions that would generate more useful responses.

It was also noted that more effort should be made to synthesize the key results and communicate them to NPPOs in a way that would help them adjust and improve their national systems. This would also help NPPOs understand the value of participating in future surveys. As we gear up for the third general survey in 2022, we will take these lessons learned to help ensure that the questions asked will yield more meaningful and useful data, presenting a more accurate picture of how NPPOs are implementing the IPPC, ISPMs and CPM Recommendations.

Brent Larson is the lead of the IPPC Observatory (formerly the Implementation Review and Support System funded by the European Commission).


Related documents:
- IPPC General Survey 2013: Findings of the general survey of the IPPC and its Standards (En)
- IPPC General Survey 2016: A Report of Findings of Contracting Party Implementation (En)
- Report: A critical assessment and analysis of the 2012 and 2016 IPPC general surveys (En)
