Qualtrics & Survey Design

Introduction

What is Qualtrics?

Qualtrics is predominantly a survey tool that allows you to build questionnaires for others to answer. In the People Analytics field, the most popular surveys companies build follow the employee lifecycle journey and include some of the following:

  • Onboarding surveys
  • Engagement surveys
  • Exit/Attrition surveys

Using Qualtrics

If you are a university student, there is a strong likelihood that you already have access through your institution. Otherwise, to access this web application, navigate to the Qualtrics website.

Survey Design

Survey design is closely tied to psychometrics, the theory and technique of measuring knowledge, skills, abilities, cognition, personality, and much more. Commonly used methodologies include surveys, questionnaires, assessments, and tests. Survey design is important to know in the People Analytics space because you may be asked to design or analyze a survey on employee engagement, an exit interview, or some other pulse on the employee population.

Surveys are a good tool to learn about your workforce. As a People Analytics professional, you may or may not work directly with the team that owns the employee surveys, but it is necessary to understand the science behind:

  • Writing good survey items
  • Survey fatigue and how to fix it
  • Qualitative vs. quantitative data from a survey
  • How to analyze results from the survey in a way that makes sense

The following sections will dive into each of these themes so that you begin to feel more comfortable creating a survey in the future.

Writing Good Survey Items

Have you ever taken a survey about your experience with a product, or one from an organization asking about your experience with the company? These are just two examples of the many surveys that circulate around business needs.

If you have taken one before and scratched your head, slightly confused about what the survey item was asking, you are not alone! So, what makes a survey item bad? Take a look at the three items below and see if you can identify why.

1. Please rate the speed and accuracy of the help you received. 
2. My team is able to make quick and efficient decisions to get work done timely.
3. Thinking about your entire experience within the company, how would you rate your satisfaction?

If you thought these were ambiguous or worded in a confusing way, then you are correct! These would be considered bad survey items that we would not want to include in any customer- or employee-facing survey.

Here’s a list of guiding principles to keep in mind when creating survey items (a short sketch after the list shows how a couple of these checks can be automated):

Avoid the use of:

  • Yes/No questions: Analyzing these down the line will provide less meaningful insights.
  • Jargon: If your survey takers cannot understand the slang or jargon, they will not be able to answer the question to the best of their ability.
  • Inconsistent scales: Switching scale direction can confuse survey takers who assume “agree” is always on the right, especially if they are moving through the survey quickly.
  • Sensitive questions: These dissuade survey takers who fear being identified from the results.
  • Demographics at the front/top: Leading with demographic questions often biases survey takers to be less honest than placing them at the end.
  • Ambiguity: Ambiguous items lead to ambiguous analysis and less reliable results.
  • Double-barreled questions: Survey takers cannot respond to the two parts separately, which reduces the validity of the item.
  • Too many questions: If the survey is too long, more often than not the employee will stop participating midway.
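
To make these guidelines more concrete, here is a minimal sketch in Python that flags two of the easier problems to catch automatically: yes/no phrasing and possible double-barreled wording. The items and heuristics are illustrative assumptions only, not a substitute for a human review of each item.

```python
import re

# Hypothetical survey items, including the examples from earlier on this page.
items = [
    "Please rate the speed and accuracy of the help you received.",
    "My team is able to make quick and efficient decisions to get work done timely.",
    "Did you speak with your manager on the first day?",
]

def flag_item(item: str) -> list[str]:
    """Return a list of potential wording issues found in a survey item."""
    issues = []
    # An "and"/"or" joining two concepts often signals a double-barreled item.
    if re.search(r"\b(and|or)\b", item, re.IGNORECASE):
        issues.append("possible double-barreled wording")
    # Items starting with Did/Do/Are/Is/... usually invite a yes/no answer.
    if re.match(r"^(did|do|does|are|is|was|were|have|has)\b", item.strip(), re.IGNORECASE):
        issues.append("likely yes/no question")
    return issues

for item in items:
    issues = flag_item(item) or ["no obvious issues"]
    print(f"- {item}\n  flags: {', '.join(issues)}")
```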

Survey Fatigue

In most companies, an engagement survey may only happen once or twice a year so as not to overwhelm employees. The Covid-19 pandemic disrupted this as the world moved into a remote-first working model: many companies started sending out monthly, if not weekly, surveys to keep a good pulse on what employees were feeling and experiencing. So, while there will always be exceptions to the rule, in general it is not advised to send out employee engagement surveys too frequently, as response rates will slowly decline over time with employees feeling bored or tired of answering.

If you think about the entire employee journey within a company, however, there are numerous touchpoints that typically correspond to a survey item. For example, an onboarding survey may ask, “Did you speak with your manager on the first day?”. Or, after a large redistribution of teams, management may gauge morale with an item like “I feel recognized for the work I do”. Finally, at the end of one’s employment, an exit survey could ask, “What could the company do better?”.

Therefore, it is not as simple as stating that survey fatigue happens because questions are asked too often. In fact, it is the lack of actionability that most often decreases the desire to fill out a survey. Unless it is clear what will (or can) be acted on as a result of the survey, it will be challenging to convince employees to take it in the first place. Before creating survey items, think about the goals of the survey, who the key stakeholders are, and whether you can already answer some items from other data you have access to.

Quantitative vs. Qualitative Data

Quantitative survey data consists of the actual numbers from the survey. For example, if you asked people to rate something “on a scale from 1 to 5”, then your results will contain a variety of responses from 1 to 5. These are the data points that you then analyze and connect to other data sources to tell a compelling story.
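
As an illustration, here is a minimal sketch in Python (using pandas) that summarizes a single 1-to-5 rating item; the column name and ratings below are hypothetical.

```python
import pandas as pd

# Hypothetical 1-5 ratings for a single survey item.
responses = pd.DataFrame({
    "engagement_score": [5, 4, 4, 3, 5, 2, 4, 5, 3, 4],
})

scores = responses["engagement_score"]
print(f"n = {scores.count()}")
print(f"mean = {scores.mean():.2f}")
# Share of favorable responses (4s and 5s), a common way to report Likert items.
print(f"favorable = {(scores >= 4).mean():.0%}")
# Full distribution of responses from 1 to 5.
print(scores.value_counts().sort_index())
```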

Qualitative survey data most often comes from the free-text (open-ended) responses in the survey. For example, if you provided a question at the end of your survey like, “Provide any additional details here:” with no answer choices, survey takers can add in any response they want. This makes the output less structured and a little harder to analyze than quantitative data. However, with the advancement of LLMs (large language models), getting themes from this type of data has become much easier.
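
As a simple illustration that does not rely on an LLM, the sketch below counts keyword matches per hand-defined theme. The comments and theme keywords are hypothetical, and a real theming exercise would be far more nuanced.

```python
from collections import Counter

# Hypothetical open-text comments from a survey.
comments = [
    "My manager gives great feedback but meetings take up too much time.",
    "I would like more career growth opportunities.",
    "Too many meetings and not enough time for focused work.",
]

# Hand-defined themes and the keywords that indicate them.
themes = {
    "manager": ["manager", "feedback"],
    "meetings": ["meeting", "meetings"],
    "career growth": ["career", "growth", "promotion"],
}

theme_counts = Counter()
for comment in comments:
    text = comment.lower()
    for theme, keywords in themes.items():
        if any(keyword in text for keyword in keywords):
            theme_counts[theme] += 1

for theme, count in theme_counts.most_common():
    print(f"{theme}: mentioned in {count} of {len(comments)} comments")
```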

Analyzing Results from Survey

Once the survey goes out and you get the results back, it’s time to analyze the data! As a beginner to People Analytics, you most likely do not need (or want) to know all of the complexities that make up survey design/psychometrics, as the teams that work on surveys full-time often have PhDs in Industrial and Organizational Psychology. So, being able to analyze the output from the surveys is a great starting place for you to experiment with data, understand organizational nuances, and learn the fundamentals.

With so much data, where do you even start? If you have never looked at survey results before, here are some tips on what to look for and why:

  • Know what your survey items are: It is all too common for an analyst to jump into analyzing before they even know what the survey items are. Know each one and how it relates to your research questions.
  • Know what your research questions are: If you start looking for trends without a hypothesis to anchor on, it’s easy to get lost in the data. Take a moment to understand the big problems you are trying to solve.
  • Investigate your results: Now that you know what you’re analyzing, do a variety of exploratory analyses! Pivot tables are a great way to review descriptive data like means, counts, and percentages (see the sketch after this list).
  • Explore advanced analytics after basic data questions are answered: Once you can answer the basics like participation rates, mean scores, or distributions by key cohorts, then and only then should you move on to the more advanced analyses.
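
To make those descriptive cuts concrete, here is a minimal sketch in Python (pandas) that computes a participation rate and a pivot-table-style summary of mean scores by department. The column names and values are hypothetical; the idea mirrors what you would build with an Excel pivot table.

```python
import pandas as pd

# Hypothetical survey results: one row per invited employee.
# A missing engagement_score means the employee did not respond.
results = pd.DataFrame({
    "department": ["Sales", "Sales", "Engineering", "Engineering", "HR", "HR"],
    "engagement_score": [4, 3, 5, 4, 2, None],
})

# Participation rate: share of invited employees who actually answered.
participation = results["engagement_score"].notna().mean()
print(f"Overall participation: {participation:.0%}")

# Mean score and response count by department, the pivot-table view.
pivot = results.pivot_table(
    index="department",
    values="engagement_score",
    aggfunc=["mean", "count"],
)
print(pivot)
```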

Tutorial

To illustrate how to analyze survey data, it is best to work with a real data set. Below we have a file that has been exported directly from Qualtrics, our survey system. As you progress with your skills (either as an individual or as a People Analytics team), you’ll notice that moving away from static data files is essential. For now, we will start analyzing data in a tool that should be familiar to you: Excel.
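
If you would rather peek at the export programmatically before opening it in Excel, here is a minimal sketch assuming a standard Qualtrics CSV export, which typically includes two metadata rows (question text and import IDs) directly beneath the column headers. The file name is hypothetical.

```python
import pandas as pd

# Skip the two metadata rows that Qualtrics places under the header row.
survey = pd.read_csv("qualtrics_export.csv", skiprows=[1, 2])

print(survey.shape)          # number of responses and columns
print(survey.columns[:10])   # first few question/metadata columns
print(survey.head())         # preview the first few responses
```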

Tutorial coming soon!
