Your Genetics Incline You to Respond to Surveys

2010 June 24

A very interesting article from the Journal of Organizational Behavior[1] suggests that genetics play a role in predicting whether you’ll respond to surveys. Over 1000 twin pairs were contacted through the Minnesota Twin Registry, and it was found that 45% of the variance in survey response behavior could be explained by genetic differences.

For those unfamiliar with the Minnesota Twin Family Study, here’s the brief version: in 1979, Tom Bouchard began studying a pair of twins who had been separated at birth and reunited at age 39.  He believed that by examining twins (similar genetic profiles but different experiences), we could better address the psychological problem of nature vs. nurture: do genetics or environment play a stronger role in actual behavior?  This eventually evolved into several projects, including the Minnesota Study of Identical Twins Reared Apart, which examined twin pairs with identical genomes but often quite different experiences growing up.  The Minnesota Twin Registry in particular is an extension of this project, and contains many thousands of twin pairs of native Minnesotans born 1936-1955 and 1961-1964.  Some are identical; some, fraternal.

Thompson, Zhang, and Arvey (2010) used this registry to examine the genetic determinants of non-response bias.  They essentially asked the question, “When we send out a survey, who responds?” and whether patterns of non-response are random or might instead bias results systematically.  In survey research, we must explicitly consider the reason that data is missing.  There are several possibilities:

  • MCAR (Missing Completely At Random): Ahh, the safety of MCAR.  When missingness is MCAR, you have absolutely nothing to worry about.  Missingness is uncorrelated with anything else, so missing data will not bias your results.
  • MAR (Missing At Random): When data is MAR, you only have a little to worry about.  Data is not MCAR, but it is not missing in a way that will bias your results.  For example, let’s assume that you send your survey invitations via e-mail to a random sample of your population of interest, with a deadline of 1 week for participation.  Perhaps the construct Comfort with Computers predicts how often people check their e-mail – some of your sample simply won’t check their e-mail within a week and as a result, their data will be missing.  But as long as Comfort with Computers (or a construct correlated with Comfort with Computers) isn’t what you’re interested in measuring, your results will be safe.
  • NMAR (Not Missing At Random): When data is NMAR, you have a problem.  Missingness depends on the very construct you’re trying to measure, so missing data will bias your results systematically.  Consider the example above, but this time, the target of your survey is measurement of Comfort with Computers itself.  If people with low Comfort with Computers don’t respond, missingness will result in range restriction, which will in turn artificially decrease the magnitude of any correlations you might compute on it.
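To make the NMAR case concrete, here’s a small simulation (all data and effect sizes are hypothetical, not from the paper) showing how non-response that depends on the measured construct restricts range and attenuates an observed correlation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical construct scores: "Comfort with Computers" and an
# outcome moderately correlated with it
comfort = rng.normal(size=n)
outcome = 0.5 * comfort + rng.normal(size=n)

full_r = np.corrcoef(comfort, outcome)[0, 1]

# NMAR non-response: the lower someone's comfort, the less likely
# they are to respond, so respondents are range-restricted on comfort
p_respond = 1 / (1 + np.exp(-2 * comfort))
responded = rng.random(n) < p_respond

restricted_r = np.corrcoef(comfort[responded], outcome[responded])[0, 1]

print(f"correlation in full sample:    {full_r:.2f}")
print(f"correlation among respondents: {restricted_r:.2f}")
```

The correlation among respondents comes out noticeably smaller than the correlation in the full sample, even though nothing about the underlying relationship changed; only who answered did.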

The problem with understanding missingness, of course, is that there is no way to track it directly.  MCAR/MAR/NMAR is a theoretical distinction, and one that must be considered during the survey development process.  But it cannot be measured in most surveys; if people didn’t respond, there’s no easy way to discover their reasons.

Thompson, Zhang, and Arvey were able to investigate this because of a unique advantage: they already had access to genetic information before sending the survey out.  By comparing people with predictable genetic differences (or even identical genes) raised in different households, they were able to estimate the proportion of variability in survey response predictable from differences in genetics.  The value?  45% (a correlation of .67).  Likelihood of survey response (at least to a leadership survey) seems to be substantially heritable.
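As a sanity check on the numbers: 45% of variance corresponds to a correlation of the square root of .45, which is about .67. A classic rough-and-ready way to estimate heritability from twin data is Falconer’s formula, h2 = 2 * (r_MZ - r_DZ); the twin correlations below are hypothetical, chosen only to illustrate the arithmetic:

```python
import math

# The paper reports that genetics explain 45% of the variance in
# response behavior; the corresponding correlation is its square root
h2 = 0.45
print(round(math.sqrt(h2), 2))  # 0.67

# Falconer's formula estimates heritability from identical (MZ) and
# fraternal (DZ) twin correlations: h2 = 2 * (r_mz - r_dz).
# These twin correlations are hypothetical, chosen to yield h2 = .45.
r_mz, r_dz = 0.50, 0.275
print(round(2 * (r_mz - r_dz), 2))  # 0.45
```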

The implication of this?  There is at least some biological basis for non-response to surveys.  This means that the MCAR/MAR/NMAR distinction must be made even more carefully, as any characteristic with a genetic basis (and there are many) might be correlated with non-response, biasing the results from any survey on those characteristics.

  1. Thompson, L., Zhang, Z., & Arvey, R. (2010). Genetic underpinnings of survey response. Journal of Organizational Behavior. DOI: 10.1002/job.692