Context
The Ontario Nonprofit Network (ONN) has conducted a State of the Sector survey for 4 consecutive years. ONN describes its data as “high quality” and provides analysis of year-over-year changes.
“High quality” data presumably means accurate data. The ONN data are not accurate. And there is no indication that any standard methods for improving the accuracy of survey data were considered, let alone implemented.
There are multiple substantive issues with the ONN State of the Sector survey that make its survey data uninterpretable and unusable.
Survey Issues
Target Population
ONN defines its survey target population as all nonprofit organizations based in Ontario, including
- nonprofits
- charities
- grassroots groups with a mission to serve a public benefit
- volunteer-run organizations
- nonprofit social enterprises
- nonprofit cooperatives
Like most organizations, ONN wishes to extrapolate from the findings of its survey sample to its target population as a whole.
This is impossible.
It is impossible to obtain a representative sample from this target population because all volunteer-run organizations and all grassroots groups with a mission to serve a public benefit cannot be identified; their numbers cannot even be approximated. At a bare minimum, we would need to know the approximate proportion that each type of organization in the ONN definition represents of all Ontario nonprofits.
In social science research, we do not expect perfect accuracy, but survey data do need to be sufficiently accurate to inform decision-making.
The survey sample includes respondents who are not in the target population
Oddly, ONN is aware that its sample includes organizations that are not part of its target population: some respondents wrote in that they work for a government, a government agency, or a for-profit business.
It is not unusual to collect data from non-qualifying respondents (those who are not in the target population), but it is unusual not to remove them during the data cleaning phase of the work.
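As a minimal sketch of what that cleaning step could look like, here is a hedged example in pandas. The file name, the org_type column, and the category labels are assumptions for illustration; ONN’s actual data layout is not public.

```python
import pandas as pd

# Organization types that belong to the target population.
# These labels are assumed for illustration; ONN's actual categories may differ.
QUALIFYING_TYPES = {
    "nonprofit",
    "charity",
    "grassroots group",
    "volunteer-run organization",
    "nonprofit social enterprise",
    "nonprofit cooperative",
}

responses = pd.read_csv("survey_responses.csv")  # hypothetical file name

# Drop respondents who self-identify as government, a government agency,
# a for-profit business, or anything else outside the target population.
clean = responses[responses["org_type"].str.lower().isin(QUALIFYING_TYPES)].copy()

print(f"Removed {len(responses) - len(clean)} non-qualifying respondents")
```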
Generic survey link creates challenges
It appears that a generic survey link was sent out and made available for anyone to take the survey. With a generic link, there is no way to know how many people from the same nonprofit organization participated in the survey. Are 300 of the 3,300 respondents from the same organization? We don’t know.
The unit of analysis in the survey report is not the same as in the data
Individuals from nonprofit organizations responded to the survey, but the data analysis treats each respondent as if they were a unique nonprofit organization. We don’t know how many individual nonprofit organizations are represented in the survey, or how many are represented multiple times by several respondents.
The confusion in the unit of analysis is evident throughout the ONN analysis:
“While more nonprofits have a reserve fund, more are also accessing it, compared to last year. 28% of respondents reported accessing their reserve funds, compared to 24% last year…” [emphasis added, p.8]
While there is a 4 percentage point increase in the proportion of respondents whose nonprofit accessed reserve funds in the past year (28% vs. 24%), there is no way to know how many nonprofit organizations accessed their reserve fund in the past year. Because the ONN sample likely includes duplicate organizations, it is possible that more respondents reported that their organization accessed reserve funds in the past year while fewer nonprofit organizations actually did.
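A toy example, with invented numbers rather than ONN’s data, shows how duplicate respondents can push the respondent-level percentage up even while the number of organizations accessing reserves goes down:

```python
# Invented numbers, purely illustrative. Each tuple is (organization_id, accessed_reserves).

# Last year: 100 respondents from 100 distinct organizations; 24 accessed reserves.
last_year = [(i, i < 24) for i in range(100)]

# This year: 95 respondents from only 90 distinct organizations; 22 organizations
# accessed reserves, and five of those organizations each contributed a second,
# duplicate respondent (e.g., two staff from the same large nonprofit answered).
this_year = [(i, i < 22) for i in range(90)] + [(i, True) for i in range(5)]

def summarize(rows):
    respondent_rate = sum(accessed for _, accessed in rows) / len(rows)
    orgs_accessing = len({org for org, accessed in rows if accessed})
    return respondent_rate, orgs_accessing

for label, rows in [("Last year", last_year), ("This year", this_year)]:
    rate, orgs = summarize(rows)
    print(f"{label}: {rate:.0%} of respondents, {orgs} organizations accessed reserves")

# Last year: 24% of respondents, 24 organizations accessed reserves
# This year: 28% of respondents, 22 organizations accessed reserves
```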
Changes in survey respondents prevent year-on-year comparability
In 2021, 20% of ONN survey respondents were from Toronto, while about one-third of respondents were from Toronto in 2023. This change in sample composition alone could account for substantial year-on-year changes in the survey findings. There are other notable changes in respondent composition by year.
Such fundamental changes in the composition of the samples cast doubt on any purported year-on-year changes.
How to improve
ONN states that it “provides stakeholders access to high quality sector-wide data.” We can assume that “high quality” data means accurate data.
To produce high quality (accurate) data going forward, ONN needs to ensure that it can reasonably extrapolate from its survey sample findings to its target population.
- Redefine the target population to include only organizations for which population information exists (e.g., registered nonprofits, registered charities), and either exclude other organizations or treat their responses as supplementary qualitative information that is not merged with the core sample.
- Invite only one person from each organization in the target population to participate, using a unique survey link for each invitee.
- Weight the sample to known features of the target population (e.g., region, number of employees, year established, organization size, revenue); see the sketch after this list.
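As an illustration of the weighting step, here is a minimal post-stratification sketch in pandas. The file name, the column names (region, accessed_reserves), and the population shares are assumptions for illustration, not real figures; in practice the population distribution would come from a registry of nonprofits or charities.

```python
import pandas as pd

# Hypothetical cleaned sample with one row per organization.
sample = pd.read_csv("clean_sample.csv")  # assumes "region" and "accessed_reserves" columns

# Known population shares of registered Ontario nonprofits by region.
# Placeholder values for illustration only.
population_share = pd.Series({
    "Toronto": 0.20,
    "Ottawa": 0.10,
    "Rest of Ontario": 0.70,
})

# Share of each region in the sample.
sample_share = sample["region"].value_counts(normalize=True)

# Post-stratification weight: population share divided by sample share,
# so over-represented regions are weighted down and under-represented ones up.
region_weight = population_share / sample_share
sample["weight"] = sample["region"].map(region_weight)

# A weighted estimate, e.g., the share of organizations that accessed reserve funds.
weighted_rate = (sample["accessed_reserves"] * sample["weight"]).sum() / sample["weight"].sum()
print(f"Weighted share accessing reserves: {weighted_rate:.0%}")
```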
Lessons
Be wary of any survey using a generic link. It is only rarely a good option.
Be wary of any survey that does not weight the sample data to known target population parameters. (Weighting is standard practice among legitimate professional survey research firms.)
Be aware of duplicate survey respondents, especially when an organization or business is your unit of analysis.
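When an organization identifier is collected (for example, a charitable registration number), a duplicate check takes only a few lines. The org_id column below is a hypothetical field, assuming such an identifier exists in the data:

```python
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # assumes a hypothetical "org_id" column

# Count respondents whose organization already appears earlier in the data.
duplicates = responses["org_id"].duplicated().sum()
print(f"{duplicates} respondents are from organizations already represented")

# Keep one response per organization so the unit of analysis matches the report.
one_per_org = responses.drop_duplicates(subset="org_id", keep="first")
```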