NatureScot Research Report 1361 - SPANS Scotland’s People and Nature Survey 2023/24 - technical report
Published: 2024
Authors: Duncan Stewart and Jim Eccleston (56 Degree Insight)
Cite as: Stewart, D. and Eccleston, J. 2024. SPANS Scotland’s People and Nature Survey 2023/24 - technical report. NatureScot Research Report 1361.
What is SPANS?
Scotland’s People and Nature Survey (SPANS) is a large-scale population survey that provides detailed data on how adults in Scotland use, value, and enjoy the outdoors and connect with nature. SPANS data allow NatureScot to monitor key trends over the long term and produce statistically robust insights.
The core research objectives for SPANS are to deliver strong quantitative evidence in relation to the following key areas:
- Visits to the outdoors for leisure and recreation;
- Recreational use of and attitudes towards urban greenspace;
- Connection to nature;
- Benefits of engagement with the natural environment;
- Environmental attitudes and behaviours.
Technical report
This report provides details of the methods used to deliver SPANS 2023-24. It is intended to provide users with sufficient technical information to support their use of the results, interpret the findings, and understand the design of the survey. A suite of other reports containing the survey findings, each with a focus on a specific area of interest, has also been produced using the data from SPANS 2023-24:
- Headline report
- Outdoor recreation
- Health and wellbeing
- Connection to nature
- Equality and diversity
- Technical report
Data collection method
Online approach
SPANS 2023-24 used a sample drawn from the Prodege online consumer panel to provide 12 monthly waves of data representative of the Scottish adult population aged 16 and over.
The adoption of an online approach for SPANS marks a change from previous waves of the survey, which used an in-home face-to-face interviewing method. With interviewer-administered face-to-face approaches suffering from declining response rates and rising costs, switching to an online platform provided NatureScot with a more cost-effective and future-proofed methodology for SPANS. More details on the impact of this change of approach on the comparability of results are provided later in this report.
Sampling
During each survey wave at least 1,000 online interviews were completed, following quota sampling approaches described below.
A total of 12,053 online interviews were completed over the full 12-month period, as illustrated in Table 1.
Table 1. Monthly fieldwork periods and sample sizes achieved
Fieldwork dates | Reference period asked about in questions regarding visits to outdoors | Sample size |
---|---|---|
3rd – 11th April | March 2023 | 1,011 |
1st – 10th May | April 2023 | 1,005 |
1st – 8th June | May 2023 | 1,008 |
3rd – 10th July | June 2023 | 1,008 |
1st – 7th August | July 2023 | 1,003 |
1st – 11th September | August 2023 | 1,002 |
1st – 10th October | September 2023 | 1,002 |
1st – 10th November | October 2023 | 1,009 |
1st – 11th December | November 2023 | 1,002 |
2nd – 10th January | December 2023 | 1,000 |
1st – 9th February | January 2024 | 1,003 |
1st – 10th March | February 2024 | 1,000 |
Total | - | 12,053 |
In each monthly wave, fieldwork took place over a period of around 7 to 11 days (see Table 1), commencing on the first working day of each month. Questions involving recall of recent visits to the outdoors asked respondents to think about visits taken during the preceding month (so, for instance, participants in May were asked about visits taken in April).
Quota controls
Each month’s fieldwork used sampling quotas to ensure a representative distribution across gender, age, socio-economic group and place of residence. The tables below illustrate these monthly sample targets and the final distribution of the sample achieved over the overall 12-month period.
Table 2a. Monthly sample targets and final sample achieved – by Gender
Gender | Scottish population | Monthly sample target | Final 12-month sample achieved | % distribution of sample |
---|---|---|---|---|
Male | 48% | 480 | 5,784 | 48% |
Female | 52% | 520 | 6,227 | 52% |
Table 2b. Monthly sample targets and final sample achieved – by Age
Age | Scottish population | Monthly sample target | Final 12-month sample achieved | % distribution of sample |
---|---|---|---|---|
16-34 | 29% | 290 | 3,361 | 28% |
35-54 | 32% | 320 | 4,097 | 34% |
55+ | 39% | 390 | 4,595 | 38% |
Table 2c. Monthly sample targets and final sample achieved – by Socio-economic group
Socio-economic group | Scottish population | Monthly sample target | Final 12-month sample achieved | % distribution of sample |
---|---|---|---|---|
ABC1 | 52% | 520 | 6,267 | 52% |
C2DE | 48% | 480 | 5,786 | 48% |
Table 2d. Monthly sample targets and final sample achieved – by Region
Region | Scottish population | Monthly sample target | Final 12-month sample achieved | % distribution of sample |
---|---|---|---|---|
West (City of Glasgow, North Lanarkshire, South Lanarkshire, Renfrewshire, North Ayrshire, East Ayrshire, South Ayrshire, East Dunbartonshire, East Renfrewshire, West Dunbartonshire, Inverclyde) | 45% | 450 | 5,327 | 44% |
South (Scottish Borders, Dumfries and Galloway) | 5% | 50 | 544 | 5% |
East (Perth & Kinross, Angus, Stirling, Fife, Falkirk, Dundee, East Lothian, Midlothian, West Lothian, Clackmannanshire, City of Edinburgh) | 35% | 350 | 4,273 | 35% |
North (Highland, Argyll and Bute, Moray, Aberdeenshire, Aberdeen City, Orkney, Shetland, Western Isles) | 15% | 150 | 1,909 | 16% |
While these quotas ensured that the sample reflected the population for these demographics, setting these controls also meant that other important demographics (e.g. working status, ethnicity, and levels of deprivation) were broadly reflective of the population.
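As an illustration of how quota targets of this kind relate to population shares, the following minimal sketch (in Python, for illustration only) derives the monthly targets shown in Tables 2a to 2d from the population percentages and the 1,000-interview monthly target.

```python
# Illustrative sketch (Python): deriving monthly quota targets from population shares.
# The shares and the 1,000-interview monthly target are taken from Tables 2a to 2d above.

MONTHLY_TARGET = 1000

population_shares = {
    "gender": {"Male": 0.48, "Female": 0.52},
    "age": {"16-34": 0.29, "35-54": 0.32, "55+": 0.39},
    "socio_economic_group": {"ABC1": 0.52, "C2DE": 0.48},
    "region": {"West": 0.45, "South": 0.05, "East": 0.35, "North": 0.15},
}

# Each group's monthly quota is its population share applied to the monthly target.
quota_targets = {
    dimension: {group: round(share * MONTHLY_TARGET) for group, share in groups.items()}
    for dimension, groups in population_shares.items()
}

print(quota_targets["age"])  # {'16-34': 290, '35-54': 320, '55+': 390}
```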
Data quality
SPANS 2023-24 was undertaken by 56 Degree Insight using the Prodege online consumer panel. Panel members go through a robust quality control process before they can join the panel and are then invited to take part in surveys in return for reward points. In the design of the approach several steps were taken to ensure data quality, as described below.
Ensuring representativeness
Using an online panel approach brings many benefits, including speed of data collection and coverage of the required volumes of respondents in the target demographic and geographic areas. However, there are some potential limitations around the representativeness of the data collected through this method, including the possible exclusion of the offline population and the risk that people who have joined the Prodege panel are not representative of the general population.
In terms of offline adults, internet penetration amongst the adult Scottish population is now over 90% (Source: Scottish Household Survey, 2022). However, levels of coverage still vary by demographic group, particularly by age, ranging from 76% among people aged 60+ to 99% among people aged 16-34. Internet access also varies, to a lesser extent, by income and housing tenure. The quota controls based on age and social grade described above ensured that those groups where internet usage is lower were represented in the SPANS data.
To minimise the risk of bias around the representativeness of panel members, Prodege use a wide variety of methods to recruit and retain their panellists. Numerous recruitment channels are used to ensure a diverse mix of types of people become and remain active panel members. These methods include promotion via social media, online and offline advertising, member referrals, and long-standing partnerships with firms providing recruitment incentives.
It is important to note that in-person survey methods require similar considerations to ensure representativeness. For instance, care must be taken to ensure that in-person surveys avoid over-sampling those demographic groups most likely to be at home and willing to take part in an interview, including older age groups, people who are less busy or not working, and people less likely to be away from home; conversely, regular outdoor recreation participants are more likely to be out and so risk being under-represented.
Sample exclusions to prevent conditioning effects
To achieve the large total sample required for SPANS 2023-24, and owing to the finite size of online panels, some panellists needed to be recontacted and invited to participate in the survey on more than one occasion over the fieldwork period.
Given the subjects covered in SPANS, allowing some participants to complete the survey more than once was considered unlikely to have any considerable impact on data quality (for example by creating a conditioning effect, whereby the act of taking part in the survey influences the behaviours or attitudes subsequently recorded).
Nevertheless, despite the low risk of conditioning or other negative effects on data quality, the sampling strategy excluded participants from the two survey waves following their participation. For example, someone who completed the survey in wave 1 could not be invited to take part again until wave 4.
With these controls in place, by the end of the 12-month fieldwork period just 8% of the 12,053 responses obtained were provided by respondents who had taken part in the survey in an earlier wave (915 respondents in total, of whom 506 took part twice, 234 took part on three occasions and 175 took part on the maximum permitted four occasions).
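As a minimal sketch of the recontact rule described above (assuming a simple three-wave gap between invitations, consistent with the wave 1 / wave 4 example), eligibility for a given wave could be checked as follows; the function and data structures are purely illustrative and are not the panel provider's actual invitation system.

```python
# Illustrative sketch of the recontact rule described above: a panellist who completed
# the survey in a given wave is not invited again until three waves later (so a wave 1
# participant becomes eligible again at wave 4). Purely illustrative, not the panel
# provider's actual system.

def eligible_for_wave(current_wave: int, waves_completed: list[int]) -> bool:
    """Return True if a panellist may be invited to take part in the given wave."""
    return all(current_wave - previous >= 3 for previous in waves_completed)

assert eligible_for_wave(4, [1]) is True          # wave 1 participant can return at wave 4
assert eligible_for_wave(3, [1]) is False         # but not at waves 2 or 3
assert eligible_for_wave(10, [1, 4, 7]) is True   # four completions possible over 12 waves
```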
Questionnaire
Modular approach
A full copy of the questionnaire is available to download in Word format here.
Given the wide range of topics addressed in SPANS 2023-24, and the need to keep the average questionnaire completion time manageable, the questionnaire was divided into sections, or question sets. Some question sets were included in every month of surveying and others were included on a rotational basis, for example quarterly, biannually or on another periodic basis, as shown in Table 3 below. Question sets could have the same frequency across the survey period (e.g. quarterly) but be included at different times (e.g. beginning in April or May), allowing the total number of questions asked each month to be managed.
Table 3. Illustrative modular questionnaire approach – question frequency and timing
Frequency | April | May | June | July | August | September | October | November | December | January | February | March |
---|---|---|---|---|---|---|---|---|---|---|---|---|
Every month | x | x | x | x | x | x | x | x | x | x | x | x |
Every other month | - | x | - | x | - | x | - | x | - | x | - | x |
Quarterly | x | - | - | x | - | - | x | - | - | x | - | - |
Twice over twelve months | - | - | x | - | - | - | - | - | x | - | - | - |
Once over twelve months | - | - | - | - | x | - | - | - | - | - | - | - |
When and how often a question was asked was determined by several factors, including the desired final sample size for each question, the need to cross-tabulate the results of different questions, and the likelihood of seasonal variation being a factor in participant responses. The frequency and timing of each of the questions included in SPANS are listed in Table 4.
Table 4. SPANS questions – frequency, timing and weights applied in analysis
Question number | Number of waves included | Months included | Weight used |
---|---|---|---|
Classif1 | 12 | All | Demographic |
Classif2 | 12 | All | Demographic |
Classif3 | 12 | All | Demographic |
Classif4 | 12 | All | Demographic |
REC19 | 6 | Apr, Jun, Aug, Oct, Dec, Feb | Demographic |
REC1 | 12 | All | Demographic |
REC2 | 2 | Jul, Jan | Demographic |
REC3 | 2 | Jul, Jan | Demographic |
REC4 | 12 | All | Demographic |
QUALITY CHECK (VISIT TAKERS) | 10 | Apr, May, Jun, Aug, Sep, Oct, Nov, Dec, Feb, Mar | Demographic |
REC5 | 12 | All | Visit |
REC5A | 12 | All | Visit |
REC5B | 12 | All | Visit |
REC5C | 12 | All | Visit |
REC5D | 12 | All | Visit |
REC5E | 12 | All | Visit |
REC5F | 12 | All | Visit |
REC6 | 6 | May, Jul, Sep, Nov, Jan, Mar | Visit |
REC7 | 6 | May, Jul, Sep, Nov, Jan, Mar | Visit |
REC8 | 6 | Apr, Jun, Aug, Oct, Dec, Feb | Visit |
REC9 | 6 | Apr, Jun, Aug, Oct, Dec, Feb | Visit |
REC9A | 6 | Apr, Jun, Aug, Oct, Dec, Feb | Visit |
NP1 | 12 | All | Visit |
REC10 | 4 | May, Aug, Nov, Feb | Visit |
T4 | 4 | May, Aug, Nov, Feb | Visit |
REC11 | 4 | Jun, Sep, Dec, Mar | Visit |
REC12 | 4 | Jun, Sep, Dec, Mar | Visit |
REC13 | 4 | Jun, Sep, Dec, Mar | Visit |
REC14 | 4 | Jun, Sep, Dec, Mar | Visit |
QUALITY CHECK (VISIT TAKERS) | 12 | All | Visit |
BEN1 | 2 | May, Nov | Visit |
BEN2 | 2 | May, Nov | Visit |
BEN3 | 2 | May, Nov | Demographic |
GREEN1 | 4 | May, Aug, Nov, Feb | Demographic |
GREEN2 | 2 | May, Nov | Demographic |
GREEN3 | 2 | May, Nov | Demographic |
GREEN4 | 2 | May, Nov | Demographic |
GREEN5 | 2 | May, Nov | Demographic |
GREEN6 | 2 | May, Nov | Demographic |
GREEN7 | 2 | May, Nov | Demographic |
T3 | 2 | May, Nov | Demographic |
ENVIR1 | 1 | Jul | Demographic |
ENVIR2 | 1 | Jul | Demographic |
Classif10 | 12 | All | Demographic |
ENVIR3 | 1 | Jul | Demographic |
ENVIR4 | 1 | Jul | Demographic |
FOR1 | 6 | Apr, May, Jun, Oct, Nov, Dec | Demographic |
FOR2 | 6 | Apr, May, Jun, Oct, Nov, Dec | Demographic |
FOR3 | 6 | Apr, May, Jun, Oct, Nov, Dec | Demographic |
FOR4 | 6 | Apr, May, Jun, Oct, Nov, Dec | Demographic |
FOR5 | 6 | Apr, May, Jun, Oct, Nov, Dec | Demographic |
T1 | 4 | Jun, Sep, Dec, Mar | Demographic |
T2 | 4 | Jun, Sep, Dec, Mar | Demographic |
HWB1 | 12 | All | Demographic |
HWB2 | 6 | May, Jul, Sep, Nov, Jan, Mar | Demographic |
HWB3 | 6 | May, Jul, Sep, Nov, Jan, Mar | Demographic |
HWB4 | 6 | May, Jul, Sep, Nov, Jan, Mar | Demographic |
NCI | 12 | All | Demographic |
Classif5 | 12 | All | Demographic |
Classif6 | 12 | All | Demographic |
Classif7 | 12 | All | Demographic |
Classif8 | 12 | All | Demographic |
Classif9 | 6 | May, Jul, Sep, Nov, Jan, Mar | Demographic |
Classif11 | 12 | All | Demographic |
Classif12 | 12 | All | Demographic |
The sample size produced for each question was determined by the frequency over the 12-month period, ranging from around 1,000 for those questions included in a single survey wave to around 12,000 for those included in all twelve waves. Given this variation, and the potential need to understand seasonal effects, the sample size for each question and months that questions were asked in are included alongside tables and figures published in the SPANS reports.
Ensuring data quality
Measures were taken to ensure that the data provided by respondents was as accurate as possible.
To minimise the potential for respondent fatigue, the modular questionnaire structure allowed each monthly set of questions to be kept as short as possible. Survey length ranged from around 10 to 15 minutes. Table 5 illustrates the average length during each month and overall annual average.
Table 5. Average survey completion time
Month | Average survey length (minutes) |
---|---|
April 2023 | 11 |
May 2023 | 15 |
June 2023 | 13 |
July 2023 | 13 |
August 2023 | 13 |
September 2023 | 12 |
October 2023 | 11 |
November 2023 | 15 |
December 2023 | 12 |
January 2024 | 11 |
February 2024 | 13 |
March 2024 | 11 |
Overall average | 13 |
Electronic scripting of the survey was designed so that the interface was as engaging and easy to use on a mobile device as on a computer screen. In total, 65% of completions were made on a mobile phone, 32% on a computer, and 4% on a tablet.
The survey questionnaire was programmed to encourage the highest quality of responses, such as by using automatic routing so that only relevant questions were shown to each participant, as well as using automated messages to alert respondents if numeric responses were outside of an expected range (e.g. the number of visits taken in preceding month).
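A minimal sketch of the kind of automated range check described above is shown below; the 31-visits-per-month threshold and the function name are assumptions made for illustration, not details taken from the survey script.

```python
# Illustrative sketch of an automated range check of the kind described above.
# The 31-visits-per-month threshold and the function name are assumptions made for
# illustration; the survey script's actual limits are not stated in this report.

def check_visit_count(value: str, max_expected: int = 31) -> str | None:
    """Return a warning message to show the respondent, or None if the value looks plausible."""
    try:
        visits = int(value)
    except ValueError:
        return "Please enter a whole number."
    if visits < 0:
        return "The number of visits cannot be negative."
    if visits > max_expected:
        return f"You entered {visits} visits - please check this is correct before continuing."
    return None

print(check_visit_count("90"))  # prompts the respondent to confirm an unusually high value
```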
Data cleaning
Prior to analysis, data cleaning was undertaken to identify and remove any respondents whose responses did not meet certain criteria, suggesting that they were not sufficiently engaged with the survey questions or were not answering accurately for other reasons. The following measures were used:
- Open-ended qualitative questions – specific questions were included that asked respondents to type in their reasons for visiting or not visiting the outdoors (depending on their answer to a previous visit frequency question). The responses to these questions were then manually reviewed to identify any problematic responses. Any respondents identified as problematic were then removed and replaced.
- Addition of opposing statement – at the question relating to nature connection (NCI), an additional statement was added to the standard set as a final rating scale. The wording of this statement was the opposite of one of the existing statements, and any respondents who provided contradictory answers across the two opposing statements were flagged and quality checked. If their responses to other questions were also confirmed to be suspicious, the respondent was removed and replaced. Results from the opposing statement question were not included in the calculation of the NCI itself.
- Identifying survey ‘speedsters’ – the time taken by respondents to complete the questionnaire was recorded. Respondents who took less time than expected were automatically flagged for manual checking and, if necessary, removed from the sample.
These data checks were undertaken while fieldwork was live to allow for any problematic respondents to be removed and replaced within the fieldwork period.
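The two automated checks above could be sketched as follows; the thresholds, rating scale and field names are assumptions made for illustration, as the report does not publish the exact cut-offs used.

```python
# Illustrative sketch of the speed and consistency checks described above. The thresholds,
# scale and field names are assumptions for illustration only.

from statistics import median

def flag_speedsters(durations_secs: dict[str, float], fraction_of_median: float = 0.4) -> set[str]:
    """Flag respondents whose completion time is well below the median for the wave."""
    cutoff = median(durations_secs.values()) * fraction_of_median
    return {rid for rid, secs in durations_secs.items() if secs < cutoff}

def flag_contradictory_nci(answers: dict[str, dict[str, int]],
                           statement: str = "NCI_1",
                           opposing: str = "NCI_OPPOSING") -> set[str]:
    """Flag respondents who agree (or disagree) strongly with both a statement and its
    opposite, on an assumed 1-7 agreement scale."""
    flagged = set()
    for rid, ratings in answers.items():
        if statement in ratings and opposing in ratings:
            both_high = ratings[statement] >= 6 and ratings[opposing] >= 6
            both_low = ratings[statement] <= 2 and ratings[opposing] <= 2
            if both_high or both_low:
                flagged.add(rid)
    return flagged
```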
Analysis
Weighting
Demographic weight
Online interviews for SPANS 2023-24 were undertaken using a quota sampling approach. This approach ensured that each monthly sample of respondents broadly reflected the Scottish adult population in terms of gender, age, socio-economic group, and region. At the analysis stage, demographic weighting was applied to correct for any variations which existed between the sample and the Scottish adult population.
A Demographic Weight was created using weighting targets based on the latest Scottish population data (see Tables 6a to 6d). The final weighted outputs from the survey can therefore be considered representative of the Scottish adult population, and the results of each wave are comparable.
Table 6a. Weighted and unweighted sample profile – by Gender
Gender | Unweighted profile | Weighted profile |
---|---|---|
Male | 48% | 48% |
Female | 52% | 52% |
Table 6b. Weighted and unweighted sample profile – by Age
Age | Unweighted profile | Weighted profile |
---|---|---|
16-34 | 28% | 29% |
35-54 | 34% | 32% |
55+ | 38% | 39% |
Table 6c. Weighted and unweighted sample profile – by Socio-economic group
Socio-economic group | Unweighted profile | Weighted profile |
---|---|---|
ABC1 | 52% | 52% |
C2DE | 48% | 48% |
Table 6d. Weighted and unweighted sample profile – by Region
Region | Unweighted profile | Weighted profile |
---|---|---|
West | 44% | 45% |
South | 5% | 5% |
East | 35% | 35% |
North | 16% | 15% |
A rim weighting (Random Iterative Method) approach was used in this process, involving the application of weights to each of the respondents included in the annual analysis to bring the sample distribution of each of these demographic variables into line with the population distribution. Rim weighting uses a mathematical algorithm to provide an even distribution of results across the entire dataset whilst balancing demographic variables to pre-determined totals. It weights the specified characteristics simultaneously and disturbs each variable as little as possible.
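The sketch below illustrates the general rim weighting (raking) algorithm on this kind of data: weights are repeatedly rescaled so that the weighted sample matches the population share for each variable in turn. It is a generic illustration in Python, not the specific software used for SPANS.

```python
# Generic illustration of rim (rake) weighting: respondent weights are adjusted iteratively
# so that the weighted sample matches the population distribution on each variable in turn.
# A sketch of the general algorithm only, not the specific software used for SPANS.

def rake(respondents, targets, iterations=20):
    """respondents: list of dicts, e.g. {"gender": "Female", "age": "55+", ...}.
    targets: population proportions per variable, e.g. {"gender": {"Male": 0.48, "Female": 0.52}, ...}.
    Returns one weight per respondent, averaging 1."""
    n = len(respondents)
    weights = [1.0] * n
    for _ in range(iterations):
        for variable, shares in targets.items():
            # Current weighted total in each category of this variable.
            totals = {}
            for weight, respondent in zip(weights, respondents):
                category = respondent[variable]
                totals[category] = totals.get(category, 0.0) + weight
            # Rescale weights so the weighted shares hit the population targets.
            for i, respondent in enumerate(respondents):
                category = respondent[variable]
                weights[i] *= (shares[category] * n) / totals[category]
    return weights
```

With targets corresponding to Tables 6a to 6d, repeated passes of this kind bring the weighted gender, age, socio-economic group and region profiles into line with the population simultaneously.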
Visit weight
In addition to demographic weighting, visit weighting was applied to those questions regarding visits taken to the outdoors during the preceding month.
All respondents who had taken at least one outdoor recreation visit in the previous month were asked to provide details of the visit they had taken most recently. This approach was used in preference to collecting details of all of the visits taken by respondents during the four-week recall period, since providing details of all visits taken would have been time-consuming and burdensome for respondents (particularly for frequent outdoor visitors) and would likely have affected the quality of the data recorded.
Using details only from the most recent visit meant that visits taken by frequent participants were under-represented within the sample while those taken by infrequent participants were over-represented. For example, someone who took a visit every day of March provided details of one of their 31 visits, while someone who took a visit once a week provided details of one of their four visits.
The purpose of the visit-weighting approach is to make the results of these questions more representative of all visits taken during the survey period. Representativeness was achieved by upweighting the data by a factor equal to the number of visits taken by the respondent in the month prior to interview (provided in response to Question REC4, “How many visits to the outdoors for leisure and recreation in Scotland did you make during [MONTH]?”). For example, the responses provided by a respondent who had taken 4 visits during the month were upweighted by a factor of 4, while the responses of a respondent who had taken 31 visits were upweighted by a factor of 31.
At an individual respondent level, this approach to weighting visit data means that the characteristics of the most recently taken visit are applied to all of the visits taken by that respondent during the previous month. This approach is considered valid as the survey results are presented at an aggregated level only, rather than for individual respondents. This approach is also consistent with previous years of SPANS, along with other comparable studies in the sector such as the People and Nature Survey undertaken in England.
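A minimal sketch of this upweighting is shown below. Whether the visit factor is applied on top of the demographic weight or used on its own is not spelled out in this report, so combining the two by multiplication is an assumption made for illustration.

```python
# Illustrative sketch of the visit weighting. Combining the demographic weight and the
# visit factor by multiplication is an assumption made for illustration only.

def visit_weight(demographic_weight: float, visits_last_month: int) -> float:
    """Upweight a respondent's most recent visit by the number of visits they reported
    taking in the previous month (Question REC4)."""
    return demographic_weight * visits_last_month

# A respondent reporting 31 visits contributes 31 times the weight of one reporting a
# single visit, so visits by frequent participants are represented proportionately.
print(visit_weight(1.0, 31), visit_weight(1.0, 4))
```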
The type of weighting used in the analysis of each of the questions included in SPANS 2023-24 is shown in Table 4, above.
Estimating the volume of visits taken over 12 months
In every wave of the survey, all respondents who stated that they had taken any outdoor recreation visits during the previous 12 months were asked how many visits they had taken during the month prior to the interview (Question REC1). To calculate an estimate of the total volume of visits taken during each month, the average result given at this question (across the whole weighted sample) was multiplied by the latest estimate of the number of adults aged 16 or over living in Scotland (4.56 million based on mid-year estimates for 2022).
By following this approach an estimate of the number of visits taken by the total population was obtained for each month from March 2023 to February 2024. Combined, these monthly figures amount to the total estimate for the full 12-month period.
This approach to estimating the number of visits taken is identical to that used in previous years of SPANS.
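The calculation can be sketched as follows; the 4.56 million adult population figure is taken from the report, while the mean monthly visit figures are placeholders for illustration only, not survey results.

```python
# Worked sketch of the visit-volume estimate described above. The adult population figure
# (4.56 million, mid-2022 estimate) is from the report; the mean monthly visit figures
# are placeholders for illustration only, not survey results.

ADULT_POPULATION_16_PLUS = 4_560_000

mean_visits_per_adult = {
    # Weighted mean number of monthly visits reported for each reference month (placeholders).
    "March 2023": 10.2,
    "April 2023": 11.5,
    # ... remaining reference months up to February 2024 would be added here ...
}

monthly_visit_estimates = {
    month: mean_visits * ADULT_POPULATION_16_PLUS
    for month, mean_visits in mean_visits_per_adult.items()
}

annual_visit_estimate = sum(monthly_visit_estimates.values())
```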
Data profiling
All respondents were asked to provide their full home postcode for the purposes of data analysis. In line with the Market Research Society Code of Conduct, respondents were given the option not to provide this information.
Over the 12-month period, a total of 9,600 valid postcodes were collected, 80% of the total sample. These postcodes have been used to profile these respondents with respect to the following two spatial classifications:
- Scottish Index of Multiple Deprivation (SIMD) – SIMD is a tool used by the Scottish Government to identify areas of high deprivation. It assesses deprivation across multiple dimensions, including income, employment, health, education, access to services, housing, and crime. The SIMD ranks geographical areas, known as data zones, from most deprived to least deprived. In reporting SPANS respondents have been classified to identify those in the 10% most deprived and 10% least deprived areas.
- Urban Rural classification - the Scottish Government's Urban Rural Classification categorises areas across Scotland based on their population size and accessibility to major cities. This classification divides areas into several categories ranging from large urban areas, which are cities with substantial populations, to remote rural areas, which are characterised by their smaller populations and greater distances from large cities. In reporting SPANS respondents have been classified using the 6-fold classification ranging from Large Urban Areas to Remote Rural.
Respondents were also profiled using the Nature Connectedness Index (NCI), a metric recording people’s connection with nature (see Richardson et al. 2019). The NCI is based on a series of attitude statements with which participants rate their agreement or disagreement. Responses to each statement are combined to produce a score of 0-100 indicating how connected individuals feel to nature.
The NCI statements were included in every wave of the survey so that all respondents could be profiled based on their NCI score. The results have been presented in the SPANS reports as an average result, across all respondents and amongst sub-sets such as particular age groups, and on the basis of population quartiles.
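The published NCI uses a specific set of statements and item weights (Richardson et al. 2019), which are not reproduced here; the sketch below only illustrates the general idea of rescaling agreement ratings to a 0-100 score and is not the published scoring algorithm.

```python
# Simplified illustration of rescaling agreement ratings to a 0-100 score. The published
# Nature Connectedness Index uses specific statements and item weights (Richardson et al.
# 2019); this sketch shows only the general rescaling idea, not the published algorithm.

def simple_connection_score(ratings: list[int], scale_min: int = 1, scale_max: int = 7) -> float:
    """Rescale the mean of a set of agreement ratings (assumed 1-7) to a 0-100 score."""
    mean_rating = sum(ratings) / len(ratings)
    return 100 * (mean_rating - scale_min) / (scale_max - scale_min)

print(round(simple_connection_score([7, 6, 7, 5, 6, 7]), 1))  # 88.9 - a respondent who feels close to nature
```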
Accuracy estimates
In any population survey using an online panel approach, respondents ‘opt in’ to the survey rather than being selected at random, with quotas set to ensure the sample is representative of the population. The approaches used to calculate confidence intervals for a survey using a random probability sampling approach therefore cannot provide a complete measure of the accuracy of the survey findings, and only indicative estimates of the accuracy of the data can be provided. This was also the case in the previous SPANS surveys, when a non-probability, quota sampling approach was used in the face-to-face fieldwork.
The estimated level of accuracy of results is primarily dependent on the size of the sample. Whilst the total analytical sample in SPANS 2023/24 was over 12,000, the rotation of questions between survey waves meant that some questions were asked of fewer respondents than others. Furthermore, the routing instructions used in some sections of the questionnaire reduced the number of potential respondents for some questions. For example, questions about the characteristics of recent outdoor visits were only asked if the respondent had visited the outdoors during the previous month.
The effects of weighting also need to be accounted for when estimating the accuracy of the results in any survey of this nature. SPANS 2023/24 involved 12,053 interviews in total, but the weighting solutions employed in the data analysis are estimated to have an efficiency of around 70% when the demographic weights are applied and 40% for the visit weights, thereby reducing the effective sample size.
Given the above design effects, confidence intervals (i.e. the percentage range within which an estimate is likely to fall) for those questions weighted using the Demographic Weighting have been estimated as being 1.5 times wider than those which would occur with an equivalent-sized simple random sample. Due to the additional design effect caused by the application of the Visit Weighting, confidence intervals for these questions are estimated as being 2.5 times wider than those which would occur with an equivalent-sized simple random sample. These design effects are the same as those estimated for previous SPANS waves, which employed a face-to-face, quota sampling approach.
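The margins shown in Tables 7 and 8 below can be reproduced approximately by applying these design factors to the standard simple random sample formula at the 95% confidence level; the short sketch below is illustrative only.

```python
# Sketch reproducing the approximate margins in Tables 7 and 8: the simple random sample
# formula at the 95% confidence level, multiplied by the design factors given above
# (1.5 for demographic weighting, 2.5 for demographic plus visit weighting).

from math import sqrt

Z_95 = 1.96

def margin_of_error(n: int, design_factor: float, p: float = 0.5) -> float:
    """Approximate 95% margin of error (percentage points) for a single result of around p."""
    return design_factor * Z_95 * sqrt(p * (1 - p) / n) * 100

def margin_of_difference(n1: int, n2: int, design_factor: float, p: float = 0.5) -> float:
    """Approximate margin (percentage points) when comparing two independent results of around p."""
    return design_factor * Z_95 * sqrt(p * (1 - p) / n1 + p * (1 - p) / n2) * 100

print(round(margin_of_error(1000, 1.5)))             # ~5 points, in line with Table 7
print(round(margin_of_difference(3000, 3000, 1.5)))  # ~4 points, in line with Table 8
```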
Table 7 provides the margins of error associated with an individual result for a range of different sample sizes. For example, where the sample size is in excess of 10,000 respondents, the data are accurate to within around +/-1% to +/-2% at the 95% confidence level, depending on the weighting applied. In other words, a hypothetical result of 50% with a margin of +/-2% would have a range from 48% to 52%.
Table 7. Estimated margins of error associated with an individual result
Sample size | Demographic weighting only | Demographic and visit weighting |
---|---|---|
10,000 or more | +/-1% | +/-2% |
6,000 | +/-2% | +/-3% |
3,000 | +/-3% | +/-4% |
2,000 | +/-3% | +/-5% |
1,000 | +/-5% | +/-8% |
500 | +/-7% | +/-11% |
Table 8 provides an indication of when the difference between two results may be considered statistically significant. For example, when comparing two percentages where both sample sizes are around 6,000, a difference of +/-2% or more can be considered statistically significant when results have been weighted using demographic weighting only.
In the reporting of findings only statistically significant differences in results (e.g. between two demographic groups) have been highlighted in the commentary.
Table 8. Margins of error when comparing two percentages
Sample size | Demographic weighting only | Demographic and visit weighting |
---|---|---|
10,000 or more | +/-2% | +/-4% |
6,000 | +/-2% | +/-4% |
3,000 | +/-4% | +/-6% |
2,000 | +/-5% | +/-8% |
1,000 | +/-7% | +/-11% |
500 | +/-9% | +/-16% |
It should be noted that these margins of error are intended to be indicative only.
The margins of error shown are all for a hypothetical result of 50% at the 95% confidence level (e.g. a margin of error of +/-4% represents a range from 46% to 54%). For results below or above 50%, the margin of error is smaller in percentage point terms.
SPANS data continuity
With the change in survey approach from in-person, at-home interviews to online surveying, any comparisons between the 2023-24 SPANS and previous SPANS surveys must be made with caution.
Wherever possible the design of SPANS 2023-24 has been consistent with the previous survey, for example in terms of the questionnaire wording, quotas used in sampling, and the analysis approaches. Nevertheless, changes in data collection method are likely to lead to some survey mode effects which result in a loss in comparability.
Survey mode effects arise from differences in how surveys are administered, impacting responses due to the method. For example, transitioning SPANS from a face-to-face to an online format may result in the following mode effects:
- Reduction of social desirability bias - Online surveys typically see reduced social desirability bias as respondents feel more anonymous compared to face-to-face settings, where the presence of an interviewer might influence them to answer in socially acceptable ways, for example by overstating their frequency of participation in an activity or their level of concern for the environment.
- Interaction changes - Without interviewer interaction in online surveys, respondents might misunderstand questions. The absence of an interviewer to explain complex questions can lead to more varied interpretations and responses in an online setting. For example, although question wording clearly asks respondents to focus only on a single outdoor recreation visit in a certain time period, some people may misunderstand this requirement and instead report more generally on the visits they have taken over a longer period of time.
- Sample composition changes – participant recruitment methods can affect the demographic composition of the sample. Online consumer panels have been shown to skew slightly towards younger individuals who are online. Conversely, a face-to-face approach can skew towards people who are more likely to be at home, such as older age groups and people who are not working. These types of variation were identified in analysis undertaken in the development of the People and Nature Survey in England (see section 2.1.1 here).
While it has not been possible to quantify the presence or impact of these mode effects for SPANS 2023/24, results from the online approach for key measures such as participation in outdoor visits are broadly consistent with those obtained in the face-to-face SPANS surveys undertaken pre-pandemic. This similarity suggests that mode effects are not significantly impacting upon the comparability of these headline measures.
SPANS 2023-24 – Questionnaire
The SPANS 2023-24 questionnaire is available to download below.
Please note that this Word document is not fully accessible. If you require an accessible format, please use the feedback form on our website.