Gamifying Surveys to Increase Completion Rate and Data Quality

2014 October 2
by Richard N. Landers

One of the biggest challenges for research involving surveys is maintaining a high rate of completion and compliance with survey requirements. First, we want a reasonably representative sample of whomever we send the survey to. Second, we want those who do complete the survey to do so honestly and thoughtfully. One approach researchers have taken to improve these outcomes is to gamify their surveys. But does gamification actually improve data quality?

In an empirical study on this topic, Mavletova[1] examined the impact of gamification on survey completion rate and data quality among 1050 Russian children and adolescents aged 7 to 15. She compared three versions of the same survey: a traditional text-only version; a visually enhanced version incorporating graphics, background colors, interactive slider bars, and Adobe Flash-based response formats; and a gamified version.

To gamify the survey, Mavletova first developed four guidelines for effective gamified assessment: 1) narrative, 2) rules and goals, 3) challenges, and 4) rewards. To realize this vision, respondents in the gamified condition entered their name, chose an avatar, and followed a story in which they traveled in the Antarctic among friendly penguins. Within the story, respondents were asked to tell the penguins about themselves in order to travel home. They also played mini-games between sections of the survey, and regular feedback indicated their progress through it.

So did all this extra effort pay off? Here’s what happened:

  • Among respondents completing the survey on desktops and laptops (N = 1050), drop-out rates did not differ across the text, visual, and gamified conditions.
  • On mobile devices (N = 136), fewer respondents to the gamified surveys dropped out than those completing the visual surveys, who dropped out less than those completing the text-based surveys.
  • Participants took the least amount of time on the text version (13.9 min), more on the visual (15.2 min), and the most on the gamified version (19.4 min). However, the gamified version was also substantially longer, due to the extra content.
  • When asked about the amount of time they spent, participants in all three conditions reported approximately the same subjective experience of time.
  • Respondents found the gamified survey easier to complete than either the visual or text surveys (however, this was analyzed by comparing response rates to “Strongly Agree” on the scale across conditions, which is a strange way to do it).
  • More respondents were interested in further surveys when completing either the visual or gamified surveys in comparison to the text-based surveys (although this was determined with the strange analytic approach described above).
  • Item non-response was highest in the gamified survey, lower in the visual survey, and lowest in the text-based survey.
  • When omitting Flash-based questions (which required additional technology to view), there was no difference in non-response rate between conditions.
  • There were no differences in socially desirable responding by condition.
  • Straight-line responses (answering all “c” or all “b”, for example) were more common in the text survey (11.4%) than in both the visual (2.8%) and gamified (3.2%) survey.
  • Extreme responses (answering “a” or “e” on a five point scale) did not differ by condition.
  • Middle responses (answering “c”s) were more common in the text survey than in either of the other two conditions.
  • There were no differences on open-ended questions in regards to length of responses, number of examples provided, or the distribution of that number.
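These careless-responding indices are straightforward to compute on your own data. Below is a minimal sketch (my own illustration, not code from Mavletova's study; function names are hypothetical) that flags straight-lining and computes extreme- and middle-response rates for 5-point Likert responses coded 1–5:

```python
def straight_lined(responses):
    """True if every answer is identical (e.g., all 3s or all 'c's)."""
    return len(set(responses)) == 1

def extreme_rate(responses):
    """Share of answers at either scale endpoint (1 or 5)."""
    return sum(r in (1, 5) for r in responses) / len(responses)

def middle_rate(responses):
    """Share of answers at the scale midpoint (3)."""
    return sum(r == 3 for r in responses) / len(responses)

# Toy data: one straight-liner among three respondents
respondents = [
    [3, 3, 3, 3, 3],  # straight-liner
    [1, 5, 2, 4, 3],
    [2, 3, 4, 3, 2],
]
pct_straight = 100 * sum(straight_lined(r) for r in respondents) / len(respondents)
```

A percentage like `pct_straight` is what the 11.4% vs. 2.8%/3.2% comparison above reports, computed per experimental condition rather than over a pooled sample.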

So is the effort to create a gamified survey with a narrative worthwhile? Gamified surveys had lower drop-out rates for some respondents, respondents found them a little easier to complete, and some types of lazy responding decreased. However, they also had a higher item non-response rate. Most benefits were small, and most were also realized by the visually interesting survey.

Overall, this is not very good news for gamified surveys. It appears that the (rather extreme) time and cost investment to develop gamified surveys did not really help much. However, because this study design did not isolate particular gamification elements, it’s difficult to say what did or did not lead to this result. Were the gains seen because of the mini-games? Perhaps the gains appeared simply because the respondents were able to take breaks between sections of the survey? Did the narrative help? We don’t really know.

In fact, the cleanest comparison available here is between the text-based survey and the visual survey, since the visual survey was essentially the text-based survey plus graphics and interactive item responses. This is a relatively small investment compared to full gamification, and thus I can recommend it more easily.

This doesn’t mean that gamifying surveys is a bad idea – it just means that gamification designed like this – with mini-games and Flash-based response scales – is unlikely to do much. This type of gamification may simply be too shallow; fancy graphics may not convince anyone, even kids, that your survey is more interesting than it really is. More transformative gamification, such as approaches that integrate the story into the questions themselves (which wasn’t done here) or that take more innovative approaches to data collection (rather than dressing up normal Likert-type scales), is an area of much greater promise.

Footnotes:
  1. Mavletova, A. (2014). Web surveys among children and adolescents: Is there a gamification effect? Social Science Computer Review. DOI: 10.1177/0894439314545316

Grad School: Online I/O Psychology Master’s and PhD Program List

2014 September 25
by Richard N. Landers

Grad School Series: Applying to Graduate School in Industrial/Organizational Psychology
Starting Sophomore Year: Should I get a Ph.D. or Master’s? | How to Get Research Experience
Starting Junior Year: Preparing for the GRE | Getting Recommendations
Starting Senior Year: Where to Apply | Traditional vs. Online Degrees | Personal Statements
Interviews/Visits: Preparing for Interviews | Going to Interviews
In Graduate School: What to Expect First Year
Rankings/Listings: PhD Program Rankings | Online Programs Listing

A common question I get is, “What online I/O programs are worthwhile for graduate study?” As I’ve discussed elsewhere, you are generally best served these days by a brick-and-mortar program. Employability, salary of first job, and a host of other outcomes are better. If you decide that you just can’t manage brick-and-mortar though, the available choices are not equal. Some programs are better than others. When making such comparisons, most people are ultimately concerned about employability, and the best way to answer that question is to contact some current and former students in that program, asking if they are currently employed in an I/O-related field.

But before you get to that point, you might want to just get a broad overview of which online I/O programs are available, and what they offer. I found this a surprisingly difficult task when I tried to figure it out myself, so I decided to compile what information I could find into one list. Please note that I am unaffiliated with any of these programs, so these figures are “unofficial.” They are based entirely upon what I could find on university websites and via Google.

How to navigate this list? Here’s what I’d look at, in order:

  1. Check if the program has a long-standing in-person PhD program.  You’ll have the best job prospects in an online I/O program associated with a long-standing brick-and-mortar I/O program, because the reputation of the in-person program will generally carry over to the online program.
  2. Check if the program is not-for-profit (public or private in the list below). Even without a long-standing brick-and-mortar program, not-for-profit programs have little incentive to operate as diploma mills, so you have a better chance at a quality education. That’s not always true – some private not-for-profits may still be in it for the money, and some for-profits may legitimately care about their students – but it is a good general rule.
  3. Check how stringent the admission requirements are. Generally, harder-to-get-into programs provide better training: they accept only qualified students and then challenge those students, and overcoming educational challenges is what gives you the skills to get a job.


| University | Degree | Degree Area | Type | Cost/Credit-Hour | Ranked PhD Program? | Requirements |
| --- | --- | --- | --- | --- | --- | --- |
| Colorado State University | Masters of Applied I/O (MAIOP) | Applied I/O Psychology | Public | $665 | Yes | 3.0 GPA, GRE, B or higher in I/O course, B or higher in stats course |
| Kansas State University | Master of Science (MS) | I/O Psychology | Public | $304 | Yes | 3.0 GPA or GRE, 2 years managerial experience, coursework in Psych, HR, Management, and/or Statistics |
| Austin Peay State University | Master of Arts (MA) | I/O Psychology | Public | $462 | No | 2.5 GPA (above 3.0 recommended), GRE |
| Birkbeck University of London | Master of Science (MSc) | Org Psychology | Public | £12,570/program | No | Bachelor’s or work experience |
| University of Leicester | Master of Science (MSc) | Occupational Psychology | Public | £9,220/program | No | 2.2 UK degree or international equivalent |
| Adler School of Professional Psychology | Master of Arts (MA) | I/O Psychology | Private | $1040 | No | 3.0 GPA, C or higher average in Psychology, Org experience |
| Baker College | Master of Science (MS) | I/O Psychology | Private | Unlisted | No | Unlisted |
| Carlos Albizu University | Master of Science (MS) | I/O Psychology | Private | $505 | No | Unlisted |
| Chicago School of Professional Psychology | Master of Arts (MA) | I/O Psychology | Private | Unlisted | No | C or higher in Intro Psych, C or better in stats course, C or better in methods course |
| Grand Canyon University | Doctor of Philosophy (PhD) | I/O Psychology | Private | $495 | No | Unlisted |
| Southern New Hampshire University | Master of Science (MS) | I/O Psychology | Private | $627 | No | Successful completion of a stats course and a methods course |
| Touro University | Doctor of Psychology (PsyD) | Human and Org Psychology | Private | $700 | No | Master’s degree, 3.4 GPA |
| University of Southern California | Master of Science (MS) | Applied Psych, I/O conc. | Private | Unlisted | No | GRE and transcript (specific grades not specified) |
| Argosy University | Master of Arts (MA) | I/O Psychology | For-profit | Unlisted | No | 2.7 GPA or 3.0 GPA on last 60 hours |
| California Southern University | Master of Science (MS) | I/O Psychology | For-profit | Unlisted | No | Unlisted |
| Capella University | Master of Science (MS) | I/O Psychology | For-profit | $458 | No | 2.3 GPA |
| Capella University | Doctor of Philosophy (PhD) | I/O Psychology | For-profit | $510 | No | 3.0 GPA |
| Northcentral University | Master of Science (MS) | I/O Psychology | For-profit | $752 | No | Unlisted |
| Northcentral University | Doctor of Philosophy (PhD) | I/O Psychology | For-profit | $930 | No | Unlisted |
| University of Phoenix | Master of Science (MS) | I/O Psychology | For-profit | $740 | No | 2.5 GPA |
| University of Phoenix | Doctor of Philosophy (PhD) | I/O Psychology | For-profit | $740 | No | Master’s degree, 3.0 GPA, C or better in stats or methods course, work experience, access to own research library |
| University of the Rockies | Master of Arts (MA) | Business Psychology | For-profit | $824 | No | Unlisted |
| Walden University | Doctor of Philosophy (PhD) | I/O Psychology | For-profit | $555 | No | Unlisted |

Grad School: Sortable I/O Psychology Ph.D. Program Rankings

2014 September 17
by Richard N. Landers


Having written my grad school series, one of the most common questions I get is, “Which graduate programs should I apply to?” As I’ve explained on this blog, that’s a complicated question. You should evaluate which schools offer what you want as a student.

Unfortunately, SIOP does not make it easy to directly compare such information across programs. That’s understandable to a degree – much of this information, like research interests, changes frequently. However, every few years, a new set of rankings appears in SIOP’s newsletter, TIP, for some reason still chained to a text-based format, and sometimes to PDF. Why not something a little more modern?

So to fix that, I’ve combined the most recent of several rankings currently available into a searchable, sortable format: US News and World Report’s ranking of I/O psychology programs (woefully incomplete), the most recent evaluations of I/O faculty research productivity as reported by Beiler, Zimmerman, Doerr and Clark (2014), the number of I/O faculty in each program from that same source, and the most recent student satisfaction ratings of I/O PhD programs as reported by Kraiger and Abalos (2004). Those student satisfaction ratings are a bit old (collected in 2002), but they’re the most recent currently available.

These rankings shouldn’t be the only thing you look at when considering a graduate program, but they are worth paying attention to.

Table column meanings are as follows (1 = highest rank for all columns except Num Fac; NR = not ranked):

  1. Num Fac = The number of I/O faculty at the program.
  2. US News = The US News and World Report ranking.
  3. Pubs = The number of publications by I/O faculty in any peer-reviewed outlet between 2003 and 2012.
  4. IO Pubs = The number of publications by I/O faculty in the “top 10 I/O journals” between 2003 and 2012.
  5. SIOP = The number of SIOP presentations by I/O faculty between 2003 and 2012.
  6. Prod = An overall productivity index of I/O faculty between 2003 and 2012.
  7. Per Cap = The overall productivity index per capita (i.e., split per I/O faculty) between 2003 and 2012.
  8. Students = An overall weighted index of student satisfaction across 20 dimensions, from a study conducted in 2002.


ProgramNum FacUS NewsPubsIO PubsSIOPProdPer CapStudents
Michigan State University8113119NR
University of Minnesota5124433NR
University of South Florida84315216NR
University of Central Florida6NR42921625NR
Griffith University7NR532402135NR
Rice University7NR692372811
George Mason University7NR78352110
University of Georgia8NR82742920
Teacher’s College, Columbia University9NR933388387
University of Akron8NR10712632NR
University of North Carolina – Charlotte6NR1110291827NR
University of Calgary4NR123437317NR
Portland State University5NR1319131914NR
Bowling Green State University5314111110817
University of Maryland3NR151391229
University of Waterloo5NR1626363023NR
Old Dominion University5NR1738282720NR
Purdue University5NR185201713NR
The Pennsylvania State University6NR1918101424NR
Georgia Institute of Technology5NR20142713105
Texas A&M University6NR2166918NR
University of Illinois at Urbana – Champaign4NR221214116NR
Central Michigan University5NR2322182015NR
Florida Institute of Technology6NR24361935343
Wright State University5NR2520162619NR
Baruch College, CUNY7NR261624223613
North Carolina State University7NR2717212437NR
University of Western Ontario5NR2821262317NR
University of Missouri – St. Louis6NR2928342931NR
Colorado State University4NR30372236114
Florida International University5NR3127252822NR
University of Houston5NR32158151212
Clemson University6NR3339173333NR
Wayne State University8NR3423152539NR
De Paul University5NR3525313426NR
University of Albany, SUNY3NR363030374NR
University of Guelph7NR37353939402
Auburn University3NR384035385NR
Ohio University2NR392433321NR
Illinois Institute of Technology5NR40313240306