[Image: the first 23 responses to a user survey for The Coaster Crew]

Lessons Learned: Coaster Crew User Surveys

I landed a new UX redesign project in January for Coaster Crew and their network of fansites.  So far, it is going great and ahead of schedule.  Currently, I am almost done with the user surveys and about to start building personas.

The user surveys are hosted on AYTM.  The initial respondents were recruited through Facebook and Twitter.  To get more respondents, I created new rounds of similar surveys on Mechanical Turk, ultimately targeting workers in the US and Canada.

Here are some of my lessons learned:

1.  People will take a survey for free only if they are already passionately interested in your subject.  In other words, people in audiences that your organization is trying to expand into will most likely not take your surveys for free.

When the first surveys were launched, I expected a relatively even split between coaster enthusiasts and the general public.  We had announced the surveys throughout Coaster Crew’s social media channels.  Instead, the first round gave us fewer than 30 responses.  When I noticed that most of the respondents had ridden hundreds of different coasters, I realized that the survey had only reached coaster enthusiasts.  So I split the survey in two to get responses from both coaster enthusiasts and the general public (GP).

2.  Therefore, if you want a big sample size, expect to pay up.

Workers want a fair rate, and they deserve it.  Turker Nation, a large community of Mechanical Turk workers, accused me on Twitter of promoting slave labor with the rates I was paying.

I have a few rationales for my low rate:

a) I’m not getting paid for this project.  In their tweet, Turker Nation asked, “Would YOU work for that?” (capitalization theirs).  I am working on this project for free – because I love theme parks, because I love being part of Coaster Crew, because I see it as a way to give back after years of their letting me attend their events, and because I need work for my UX portfolio (particularly in amusement, my main target industry).  So when I wrote the proposal for this project, I assumed I wouldn’t have to pay my respondents – or, for that matter, my usability testers.

b) Our budget is low.  To my knowledge, Coaster Crew has no full-time employees.  All of us have outside jobs, run our own companies, or are students.  The organization runs on a lean budget and is interested in branching out: hosting events at more parks and launching fansites for more parks.  Within the past year, they have launched several new fansites.  I want this project to further these initiatives, not disrupt them in the short run.  I would rather concentrate the money on making the end result as professional as possible.  So if I had only a couple hundred dollars to work with, I’d invest in high-quality, responsive WordPress themes as the highest priority.  For the record, I do not send my Adobe receipts to my clients and tell them to pay for the tools I use to do the job.

c) Surveys need a representative pool of respondents.  For that, you need a lot of respondents.  I was trying to get as many respondents as I could within my budget.

I would use MTurk for low- or medium-budget projects such as this.  If money were no object, I would just use AYTM’s panel.  Originally, I was dreaming big: 3,000 respondents would answer my survey for free.  Doing this on AYTM’s panel would cost about $90,000 – and double that if there is even one prequalification question on the survey.  However, for a smaller-scale, still high-budget survey, AYTM’s panel would produce amazing results.  I can target not only by country but by state and county, so it would be very easy (if expensive) to recruit a set of respondents similar to what the parks would want in their own marketing surveys.  AYTM’s tools for this are truly powerful.  Their interface is much easier to figure out than MTurk’s, and (even in their free surveys) you can slice and dice response data almost any way you like to get great data to back up your personas.
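To make the panel math concrete, here is a minimal sketch.  The per-response rate is inferred from the figures above ($90,000 ÷ 3,000), not taken from AYTM’s published pricing:

```python
# Back-of-the-envelope AYTM panel cost, using the figures above.
# The $30-per-response base rate is inferred from $90,000 / 3,000
# respondents; it is an assumption, not AYTM's actual price list.
respondents = 3000
base_rate = 90_000 / respondents       # ~= $30 per response
prequal_rate = base_rate * 2           # one prequalification question doubles it

print(f"Base panel cost:       ${respondents * base_rate:,.0f}")
print(f"With prequalification: ${respondents * prequal_rate:,.0f}")
```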

3.  Target your surveys by location.

Fortunately, the free first round of our survey gave us great results for the coaster enthusiasts, because the paid round of the enthusiast survey did not.  Coaster Crew presently serves fans of parks in the US and Canada.  Only 3 of the 69 respondents to the paid “enthusiasts” survey were based in either of those two countries.  Most of the others were based in India.

The problems with India-based respondents on a survey for our coaster club are these:

a) Coaster Crew doesn’t target the parks in India, so these are the wrong users to research even if they have the necessary coaster count to qualify for taking this survey.

b) Per RCDB, there aren’t even 100 currently operating roller coasters in India.

c) Most of the respondents indicated they would not drive more than 6 hours to visit an amusement park.  So most likely, they have not ridden any coasters outside India.

So, combining b) and c), most of these respondents did not really qualify to take this survey.  But I couldn’t reject them (although I did reject some for failing to follow the instructions), because the acceptance criteria didn’t spell this out.  I paid the price for that.  Fortunately, the free survey in the first round gave me at least some data on the enthusiasts, and I’ve been part of that community for years myself.
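If I ran this again, I would screen the batch results in code before accepting any assignments.  Here is a minimal sketch, assuming a CSV export with hypothetical column names (MTurk actually prefixes answer fields with “Answer.”, so adjust to your real file):

```python
import pandas as pd

# Hypothetical column names for illustration; MTurk batch exports
# name answer columns "Answer.<field>", so rename to match yours.
results = pd.read_csv("enthusiast_survey_batch.csv")

qualified = results[
    results["country"].isin(["US", "CA"])   # Coaster Crew's target countries
    & (results["coaster_count"] >= 100)     # the enthusiast cutoff
]

print(f"{len(qualified)} of {len(results)} respondents meet the criteria")
```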

Interestingly, the GP survey got no respondents at its original reward price.  I canceled that batch and launched two new GP batches, one for the US and one for Canada.  With the tricks below, I got much better data.

4.  Clarify your qualification questions.

We had a “coaster enthusiasts’” survey intended for those who had ridden at least 100 different roller coasters.  By comparison, I’ve ridden about 80 so far.  So I fall just short of the cutoff for “enthusiasts” even though I plan whole vacations around roller coasters, have been to several ERT (exclusive ride time) sessions for members of coaster clubs, know coasters by their make and model more than by their names, and so on.

“Coaster count” is a widely understood term in the coaster enthusiast community.  The general public, however, does not think in those terms.  They’re not traveling the world to ride kiddie coasters at special club events just so their count hits 700.  Add the cross-cultural challenges of the first round (described earlier), and people read the qualification question as, “Have you ridden a roller coaster at least 100 times?”  I think that is what tripped up the respondents in the first enthusiasts’ survey on MTurk.

5.  Include written questions that require relevant answers – and reject anyone who gives a nonsensical response.

In that survey, I got some truly absurd answers.  Respondents were asked a standard question: “When you use the website of any coaster club, do you find anything frustrating that you wish was easier? If so, what?”  One person responded, “last month”.  I couldn’t reject their task because I hadn’t asked for a Worker ID.  So that was one thing I added in the next round of surveys.

Most importantly, I added two standard questions: 1) What is your favorite ride at any amusement park? and 2) What is your favorite amusement park?  These are great ways to understand both enthusiasts and the GP, and they also guard against nonsensical data showing up in the surveys.  In AYTM, I can easily reject or filter out answers that don’t make sense when I publish my survey results.
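If I had to triage those free-text answers in bulk, a rough first pass might look like the sketch below.  The filename, column names, and keyword list are all mine, purely illustrative:

```python
import pandas as pd

# Crude sanity check: a plausible "favorite ride" answer should either
# be more than a couple of words or mention some ride-related vocabulary.
RIDE_WORDS = {"coaster", "ride", "mountain", "force", "steel", "drop"}

def looks_nonsensical(answer: str) -> bool:
    words = str(answer).lower().split()
    if not words:
        return True
    return len(words) <= 2 and not RIDE_WORDS.intersection(words)

results = pd.read_csv("gp_survey_batch.csv")        # hypothetical filename
results["flagged"] = results["favorite_ride"].apply(looks_nonsensical)

# Review flagged rows by hand before rejecting anything.
print(results.loc[results["flagged"], ["WorkerId", "favorite_ride"]])
```

An answer like “last month” gets flagged (two words, no ride vocabulary), while “Millennium Force” passes because “force” is in the keyword list.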

6.  Ask MTurk’s support team for help whenever needed.

MTurk’s interface for new requesters is daunting, to put it nicely.  Commands like “Cancel this batch” leave me with more questions than answers: if a batch is canceled, do I get my money back?  Are workers who completed the survey still paid?  And so on.

I am also not sure what it means to republish a rejected assignment for other workers to complete.  I rejected 18 of the 69 respondents in the enthusiasts’ survey based on my acceptance criteria.  Several emailed me asking why, and in only one or two cases (across all of my surveys) have they actually come back and resubmitted the survey properly, allowing me to approve their previously rejected assignment.  The interface and UX for this are backwards, though the task itself does not take long once you figure it out.  Their support team was responsive in telling me how.  Between the UX and the lack of clear documentation, I had to call MTurk’s support team about three times in the first week.
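For what it’s worth, approving a previously rejected assignment can also be done through MTurk’s API instead of the web interface, which I find easier to reason about.  A minimal boto3 sketch (the assignment ID is a placeholder):

```python
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

# ApproveAssignment supports reversing a rejection directly via
# OverrideRejection; the worker is then paid as if approved normally.
mturk.approve_assignment(
    AssignmentId="EXAMPLE_ASSIGNMENT_ID",   # placeholder, not a real ID
    RequesterFeedback="Resubmission looks good; approving.",
    OverrideRejection=True,
)
```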

So those are some of the lessons I learned using MTurk for user surveys.  Next, after all of the surveys have concluded, I will share some findings from the surveys themselves – to help you understand a bit about how roller coaster enthusiasts and general-public amusement park visitors use the web and coaster club sites.