Interviews#

Resources and guides for conducting research interviews in usable security and human-centered security.

  • IF it is your first time conducting interviews, consider reading through the whole page.
  • IF you are short on time or revisiting interviews, check out the Cheat Sheet section for a high-level overview.
  • IF you are looking for a specific aspect, check out the Table of Contents on the right side.
  • ELSE check out the Examples section for inspiration from real-world interview papers.

Examples#

See also the group drive for past IRB example applications and consent forms: https://drive.google.com/drive/folders/11rJkniObDSS1tXbCuHUC3euDCRDBaJPn?usp=drive_link

  • Context Matters: Qualitative Insights into Developers’ Approaches and Challenges with Software Composition Analysis, USENIX 2025, PDF, Artifacts
    • Population: Industry Devs · Area: Tooling
  • It’s like flossing your teeth: On the Importance and Challenges of Reproducible Builds for Software Supply Chain Security, IEEE S&P 2023, PDF, Artifacts
    • Population: Reproducible Builds Professionals · Area: Builds
  • Always Contribute Back: A Qualitative Study on Security Challenges of the Open Source Supply Chain, IEEE S&P 2023
    • Population: Industry Devs · Area: Dependencies
  • Committed to Trust: A Qualitative Study on Security & Trust in Open Source Software Projects, IEEE S&P 2022, PDF
    • Population: Open Source Devs · Area: OS Security

Approach#

The general approach for scientific interviews differs somewhat by type (see below), but generally, the following steps are taken (corresponding to sections on this page):

  1. Sync on Data Protection plan
  2. Create Interview Guide Draft
  3. Set up recruitment pipeline
  4. Important: Get IRB approval (generally requires the artifacts from the steps above)
  5. Do Pilot testing
  6. Somewhat overlapping: conduct the interviews, transcribe, and analyze
  7. Report Results

Data Protection: Data protection is especially relevant for interviews, as you can’t control participant answers and most collected data (including simple voice recordings) are considered personal information.

Give the Data Protection section close consideration.

Cheat Sheet#

A high-level overview for when you are short on time or revisiting interviews; see the sections below for more in-depth insights.

Interview Cheat Sheet

Structure:#

  1. Opener & Introduction
  2. Explain purpose and research context of interview
  3. Encourage natural interview flow, let participant speak, guide if necessary
  4. Ask probing questions to gain deeper insights
  5. Debrief & end the interview

Interview Types:#

  • Unstructured / Open / Exploratory:
    • Only initial question(s) might be planned
    • Interviewer develops interview based on answers
  • Structured:
    • Rigid script, branches only on pre-planned splits
    • Similar to a (guided) survey
  • Semi-Structured Interviews:
    • Guiding questions with optional follow-ups

Usable security / HCS mainly conducts semi-structured interviews.

  • Allows for some quasi-quant data (unlike unstructured)
  • Good coverage of everything we want to know (unlike unstructured)
  • Can follow-up or skip (unlike fully structured)

Good Interview Questions#

  • Build your interview guide around your research questions; use an alignment matrix to check that they match
  • Go from broad to specific (Intro, general questions, specific questions, outro)
  • Allow participants to mention concepts themselves before you ask about them. (e.g. allow them to bring up security concerns)
  • Mostly open questions (instead of questions that can be answered with yes/no)
  • Clear questions
  • Applicable (ask for thoughts / experiences that they can answer)
  • Unbiased questions
  • Pilot them multiple times so you know your guide more or less by heart & are prepared for responses
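The alignment-matrix idea can be kept as simple as a lookup from research questions to interview-question IDs, checked for coverage before piloting. A minimal sketch, with hypothetical RQ texts and question IDs:

```python
# Sketch of an alignment matrix: map each research question (RQ) to the
# interview-guide question IDs that address it, then check coverage.
# RQ texts and question IDs below are hypothetical examples.
alignment = {
    "RQ1: What challenges do developers face?": ["S1Q1", "S2Q1.1"],
    "RQ2: What guidance do projects rely on?": ["S2Q1", "S2Q1.2", "S2Q1.3"],
}

# Every RQ should be backed by at least one interview question.
uncovered = [rq for rq, questions in alignment.items() if not questions]
```

The same table can be inverted to spot interview questions that serve no research question and are candidates for cutting.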

Interview Goals#

  • The goal of the interviews is to collect data about people’s perceptions / challenges / ideas / … in their own words.

Priming and Leading#

  • It is important that the interviewer does not convey anything to the participant about what they know or believe about how the topic works/should work.
  • This means that the interviewer must pay careful attention to the language each participant uses during the interview, refer to the concepts the participant talks about using the participant’s own kinds of words, and avoid introducing vocabulary of their own.
  • The more we guide their responses, the more we will be collecting data about something they’re only thinking about because we asked them to think about it. We want to know what they think about this, not what they think about what WE think about this.

Focus Points#

  • Unprompted mentions of privacy, security, ethics, fairness considerations.
  • Company pressures such as requirements and deadlines, and things that get cut because of them.
  • Challenges they face.
  • How other departments that may be involved in security or oversight interact with them.
  • Reasons why defenses they know are not used.
  • Feelings of confusion, defeat, carelessness, etc.

Misc#

  • Video interviews are great for non-verbal communication (nodding at the participant etc.). Also keeps the transcript cleaner.
  • Co-interviewers are nice as backup, but also because it may make it easier to last through the interviews! Interview experience and subject-matter expertise can also be matched up.
    • Clarify how the co-interviewer can ask questions (raise hands, end of section, backchannel via Zoom text messages, …).
    • Clarify who asks the first question if you co-interview equally; usually, the person who asks a question also follows up on it.
    • You can debrief together!
  • Debriefing: Brain-dump on advisor, or leaving many comments on transcript. Talk about or write down what the most insightful or interesting parts of the interview were. Summary may also be helpful later in the project for onboarding other people or choosing transcripts to read together or in detail, or for “which are so different that they may be good for codebooks”.
  • What to do if interviewee is dishonest: Whatever is least painful. Usually end interview in a coordinated way, but if you interviewed them for a bit, pay them too.

Setup#

  • Be there early, have waiting room
  • Limit amount of appointments in calendar
  • Make sure the tech setup works (Zoom, mic, camera, OBS, etc.)
  • Make sure you have recently taken a bio break
  • Bring water

Interview Types#

Most of our usable security research involves semi-structured interviews:

Semi-Structured Interview#

See also: Our Interview Guide section for examples of semi-structured questionnaires.
Often structured as a few sections, each with a very general main question followed by more specific sub-questions that cover aspects not yet mentioned. E.g.,

  • Main question: “Can you tell us a bit about your project?”
  • Follow-ups, if not covered by the participant: “How many people are involved?”, “When was the project created?”, etc.

Challenges for semi-structured interviews include:

  • Higher mental load for the interviewer: you need to keep track of (and understand) participants’ answers to decide which follow-ups to ask, often even across multiple interview sections.
  • During the interview design phase, you need to weigh interview flow against what you want to report in the paper (too many follow-ups and it turns into a super long structured interview, which leaves less room for quotes and tires the participant).

Structured Interview#

Typically based on the same research logic as questionnaires: Standardized ways of asking questions are thought to lead to answers that can be compared across participants and possibly quantified.

  • Interviewers are supposed to “read questions exactly as worded to every respondent and are trained never to provide information beyond what is scripted in the questionnaire.”
  • Commonly used in CATIs (where call center agents do the interviews based on a survey template).

Unstructured Interview#

At the other end of the continuum lie interviews that have little preset structure. They may start with only a single initial question and then continue at the discretion of the interviewer.

  • Not that common in usable security research.

Other Variations#

  • Group Interviews: (Smallish) group of participants discusses high-level questions
    • Setup usually semi-structured (fixed high-level questions / discussion starters)
    • Unique group dynamics
    • Potential biases through peer interactions
  • CATI Interviews: (Computer Assisted Telephone Interviews)
    • Interviewer supported by computer system (i.e., they click through a survey script and fill in participants’ answers)
    • Less flexible than other interview types, but easier to scale
    • Usually done through service providers (call centers) that work with provided contact lists + scripts
    • Example paper: https://dwermke.com/publications/2021-conf-usenix-huaman/
  • Walkthrough Interviews: Interviewer accompanies participant through task or process
    • Asking questions about actions, thoughts, and decisions
    • Common in usability testing
    • Can also be post-task (retrospective think-aloud) by going through a video or similar
  • Self-Interviewing: Respondent conducts interview themselves with a recording device or written responses.
    • Useful for:
      • Remote locations without internet access
      • Longitudinal studies (e.g., one self-interview every day for a month)
      • Anonymity or self-reflection is important
  • Dyadic Interviews: Similar to group interviews but focused on (relationship between) two participants
    • Involves two participants interacting with each other while being interviewed by a researcher.
    • Used to explore relationships, collaborations, or conflicts between participants.
    • Example: Developers’ perceptions and challenges after using a pair programming approach for some time

Data Protection#

Data protection and privacy are very important in interview studies. Stick to the following points when doing interview studies to ensure that you do not violate data protection and privacy.

Everyone in the project (all student assistants, all PhDs, all collaborators) should be introduced to data protection and privacy in the project and all its processes during onboarding — before actually collecting the sensitive data.

Data Storage

See also article on VeraCrypt for creating encrypted folders.

  • Store contact data and similar data (e.g., data needed for payment of participants) separate from the actual interview data (recordings, transcripts, etc.).
  • Assign a unique ID on a per interview basis, e.g., use simple numbers (00, 01, 02, …) or longer identifiers (GER01, GER02, …, US01, US02, …) and maintain a mapping in a secure place with corresponding contact data.
  • Store all the data in safe encrypted places (BOTH encryption at rest and in transit). Never store the data publicly for any uninvolved people or in unencrypted form. Know where the data is stored.
  • Do not send any data in unencrypted form (e.g., clear text emails).
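The ID scheme above can be sketched in a few lines: generate per-interview IDs and write the ID-to-contact mapping to its own file, kept apart from recordings and transcripts. File names and contacts here are hypothetical:

```python
# Sketch: assign per-interview IDs and keep the ID-to-contact mapping
# separate from the interview data. File names/contacts are hypothetical.
import csv

def make_ids(prefix: str, n: int) -> list:
    # Zero-padded IDs like GER01, GER02, ...
    return [f"{prefix}{i:02d}" for i in range(1, n + 1)]

contacts = ["alice@example.com", "bob@example.com"]
ids = make_ids("GER", len(contacts))

# The mapping lives only in the (encrypted) contact-data location;
# recordings and transcripts are named and referenced by ID alone.
with open("id_mapping.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "contact"])
    writer.writerows(zip(ids, contacts))
```

Keeping the mapping in a single, separately encrypted file also makes it easy to destroy the link between contacts and transcripts once the study ends.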

GDPR requirements:

  • Do not use cloud storage by third parties (e.g., Google Drive or Dropbox).
  • Do not use any online office suite (e.g., Google Docs or Office 365).

Transcription

  • Remove all personally identifiable information (PII) of the participants, as well as information about companies or any other sensitive information/unique identifiers. Replace it with some non-sensitive information/description while keeping the important contextual information (e.g., "Robert" → [participant], "Sascha" → [Co-Author], "my colleague Jason" → my [coworker], "Facebook" → [large internet company], product names "WhatsApp" → [smartphone messenger app], etc.).
  • If some transcription service is used, ensure the transfer of interview recordings and transcripts in encrypted form.
  • After transcription: Authors read the whole transcript to ensure that all data is adequately anonymized/pseudonymized. If a spot has been missed, it is corrected.
  • When the transcript is finalized, the recordings are no longer needed and therefore should be deleted immediately.
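A first redaction pass over transcripts can be automated before the mandatory human read-through. A minimal sketch, assuming a hand-maintained lookup table (the names and placeholders below mirror the examples above and are hypothetical):

```python
# Sketch: first-pass pseudonymization of transcript text. The lookup
# table is hypothetical; a full manual read-through is still required,
# since automated replacement misses misspellings and indirect references.
import re

REPLACEMENTS = {
    r"\bRobert\b": "[participant]",
    r"\bSascha\b": "[Co-Author]",
    r"\bFacebook\b": "[large internet company]",
    r"\bWhatsApp\b": "[smartphone messenger app]",
}

def redact(text: str) -> str:
    for pattern, placeholder in REPLACEMENTS.items():
        text = re.sub(pattern, placeholder, text)
    return text

example = redact("Robert said Facebook bought WhatsApp.")
```

Word boundaries (`\b`) avoid mangling substrings; extend the table per interview as new names and products come up.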

Publication

  • Never publish the interview recordings or transcripts, as the content will probably have enough context information to unblind and deanonymize the involved participants, companies, etc. - despite anonymization/pseudonymization. (Treat transcripts the same way as recordings.)
  • It is ok to publish very small parts of the transcribed interviews, e.g., quotes, as long as they contain no PII and will not allow deanonymization/unblinding. Double check this when you are going to use a quote from the already anonymized transcripts.
  • If you want to publish information about the interviews (e.g., for replication of your study) you could publish information about recruiting (texts, emails, social media posts), consent forms, code books, example codes, example quotes.
  • Be precise when reporting how PII was handled, since anonymization is not de-identification(!)
    • Anonymization is defined as:

    The act of permanently and completely removing personal identifiers from data, such as converting personally identifiable information into aggregated data. (Source)

    • We are usually only able to de-identify, especially in the case of interviews, so make sure to report it as such!

Interview Guide#

If you are conducting an interview, using an interview guide is important for structuring your questions, planning out how you’ll pose your questions to the interviewee, and keeping your questions consistent throughout multiple interviews.

Examples:

Format#

You can write your interview guide in any text document and even print it out for interviews.

For semi-structured interviews, a good structural approach is to write down the general question, check the provided follow-ups during the answer, and then specifically ask for the missed follow-ups.

Some features that help with this approach are:

  • Checkboxes for sections, questions, and follow-up questions so that you or a shadow interviewer can keep track during interviews.
  • Question IDs for easier referencing during calls and later publication.
  • Color coded headlines for easier visibility during interviews.
  • Summary word in front of follow-ups for faster scanning of the document during the interview.
  • Page Breaks for different sections to avoid having to turn pages during a question.
  • Highlight Branching questions with keywords and visuals, e.g., IF and IF NOT.

A guide with all of those suggestions included could look somewhat like this:

Example: Interview Guide

[Check project metadata beforehand]

  • S1Q1 Project: Can you tell us about [project]?

Follow-Ups:

  • S1Q1.1 About: What is the project about? What is the project’s purpose?
  • S1Q1.2 Age: When did the project start?
  • S1Q1.3 Contributors: How many regular contributors does the project have?
  • S1Q1.4 Connection: How do contributors know each other? (Virtually, Personally)
  • S1Q1.5 Distribution: How are contributors distributed geographically?

[Check if project has guidance & update Q accordingly]

  • Quick intro to guidance
  • S2Q1 Guidance
    • [IF Guidance] Are there guides/best practices/hints available for developers/operators, etc.?
    • [IF NOT Guidance] What are your thoughts about including guides/best practices/hints available for developers/operators, etc.?

Follow-Ups:

  • S2Q1.1 Infrastructure: Does your project have security guidelines for configuring/running infrastructure, e.g. cloud, vcs, etc.?
  • S2Q1.2 Languages: Is your project using language security guidelines for all languages in the project? Yes: Can you elaborate on them?
  • S2Q1.3 Crypto: If you’re using crypto in your code: Do you have a guide on how to use crypto?

Structure#

As for general structure, including introduction and outro in the interview guide is a good practice to keep interviews more consistent between sessions.

  1. Write down the larger research questions of the study. Outline the broad areas of knowledge that are relevant to answering these questions.
  2. Develop questions within each of these major areas, shaping them to fit particular kinds of respondents. The goal here is to tap into their experiences and expertise.
  3. Ask “how” questions rather than “why” questions to get stories of process rather than acceptable “accounts” of behavior.
  4. Develop probes that will elicit more detailed and elaborate responses to key questions. The more detail, the better!
  5. Think about the logical flow of the interview. What topics should come first? What follows more or less “naturally”? This may take some adjustment after several interviews.

A guide structure for semi-structured interviews could look like follows:

  1. Preamble
    • Greeting
    • Not Judging
    • Consent (only if not gathered before)
  2. Main Part
    • Ice Breaker, Building Rapport, Encouraging Questions
    • Demographics (project etc.)
    • Main Questions (usually more general -> more specific)
    • Thoughts & Opinions
  3. Outro
    • Debrief
    • Payment

1. Preamble#

  • Greeting. Give the participant a short intro about who we are and what the interview is about.
    • Introduce yourself and any other attendees of the call
    • Provide overview and context for the conducted research
    • Ask if they have any questions
    • Ask if they are okay with being recorded
    • Only then start recording and interview
  • Not Judging. Toward the start of the interview:
    • “Before we begin, I want to emphasize that this interview is not about judging your answers or performance. We’re just interested in learning about your experiences and perspectives, so feel free to share openly. There are no right or wrong responses in this interview.”
  • Consent. Allow the participant to consent to this study: provide background on how their data will be handled and answer any possible questions. (Sometimes moved to a pre-survey)
    • If you plan on recording the interview, get clear agreement, only then start the recording, and ask again on the recording whether they are okay with being recorded, so the consent is also captured in the recording.

2. Main Part#

  • Ice Breaker. Begin the interview with a “warm-up” question — something that the respondent can answer easily and at some length (though not too long).
    • It does not have to pertain directly to what you are trying to find out (although it might), but this initial rapport-building will put you more at ease with one another and thus will make the rest of the interview flow more smoothly.
    • General demographics (role, position, experience, …) are a good candidate for easy rapport-building questions.
    • Example: “Can you tell us about yourself / your project”
  • Difficult questions should be asked toward the end of the interview, when rapport has been established.
  • The Thoughts & Opinions section should provide some closure for the interview and leave the respondent feeling empowered, listened to, or otherwise glad that they talked to you.
    • Good candidates are outlook (“Where do you see X in 5 years?”) or improvement (“What would you personally change?”) questions.
    • Example: “If you could make one change to improve the security of X, what would that be?”

3. Outro#

  • Debrief. Provide the participant with some closure. It can be a good idea to include a feedback section:
    • Anything the participant wants to mention but wasn’t asked yet?
    • Any feedback they have about the interview? (can be on the recording, else take notes).
    • Clearly state when you are turning off the recording.
    • (Off recording) Any company / project / person they think would be a good fit to interview (especially when snowball sampling)
  • Payment
    • If you plan on paying the participant, ask for the corresponding contact info, clearly state if the payment might take some processing time on our end.

Writing Questions#

Part of the challenge of conducting an effective interview is writing the right interview questions. Effective interview questions will have the following traits.

Interview questions should be:

Simple Questions#

Keep the questions simple, both in length and structure. Longer questions will be forgotten in a call, complex structure leads to confused participants.

Good: What challenges do you face when integrating security measures into your code?
Bad: Do you find the current security protocols difficult and time-consuming to implement?

Open-ended Questions#

Give the participant room to express their thoughts. Generally avoid yes/no questions unless intentionally required (follow-up, demographic, interview split).

Good: Can you outline how you use two-factor authentication in your development process?
Bad: Do you use two-factor authentication in your development process?

Clear#

Craft each question with simple, clear prose. Avoid confusion about how to understand terms in the question itself.

  • Prepare definitions (as text in guide appendix) for terms you can’t avoid.
Bad: How do you feel about the current ACL implementations with regards to RBAC and MAC?

Unbiased#

Avoid making any judgmental assumptions about the subject of research or of the respondent.

  • Don’t assume certain answers to be right
  • Don’t judge the respondent (bias potential)
Good: What do you think about current security tools for developers?
Bad: Don’t you think the current security tools are too restrictive for developers?

References for Writing Questions#

Piloting#

Before starting the actual interviews, it is a good idea to pilot your interview guide and process.

There are two main types of piloting:

  1. Internal Piloting: Conduct interviews with colleagues or friends that are somewhat familiar with the topic.
    • Can be done without IRB approval (data not used for publication)
    • Helps to get a feeling for the interview guide and process, rough time estimates
    • Allows to identify unclear questions or other issues
    • Also use these for training interview skills and testing technical setup (recording, video, transcribing etc.)
    • Generally framed as a normal interview, but you can also choose to stop and discuss things during the interview (especially for early pilots)
  2. External Piloting: Conduct interviews with actual participants from the target population.
    • Requires IRB approval (and probably payment etc.)
    • Basically the same as actual interviews, but intended to test the interview guide and process
    • Can be included in the actual study (and counted as main participants) if no major changes are required to the guide or process

Recruiting Participants#

Recruiting participants for interviews can be challenging, especially if you are looking for a specific population.

Interviews are generally not representative; still, you can combine multiple channels to reach a more diverse set of participants (in terms of the population). Mixed approaches are common, e.g., starting with direct contact and snowball sampling, and then using social media posts to reach a broader audience.

Channels#

There are multiple channels that can be used to recruit participants for interviews, depending on the target population:

  • Direct Contact: If you have access to a list of potential participants (e.g., through previous studies, conferences, or professional networks), you can reach out to them directly via email or phone.
  • Snowball Sampling: Ask connections (LinkedIn, audiences etc.) and current participants to refer other potential participants who might be interested in the study.
  • Networks: Use platforms like Twitter, LinkedIn, Reddit, or specialized forums to post about your study and invite participants.
    • Developer Watering Holes: If you are looking for developers, consider reaching out to communities on GitHub, Stack Overflow, or specific technology forums.
    • Professional Groups: Contact professional organizations or groups related to your target population.
    • Conferences and Meetups: Attend relevant conferences, meetups, or webinars to network and recruit participants.
  • Freelance Platforms: Use platforms like Upwork or Fiverr to find participants willing to take part in interviews for compensation.

Invites#

2024 Crypto Forum Post:

Subject: Interview Study regarding Cryptographic Standards and their Implementation

Dear All,

We are researchers from the CISPA Helmholtz Center for Information Security, the Max Planck Institute for Security and Privacy, and the University of Paderborn from Germany.
In coordination with the PQC Team at NIST, we would like to pitch our current research project.
We are conducting an interview study to investigate developers' experiences of implementing cryptographic standards. You can find more information about this study at https://research.teamusec.de/2023-secure-crypto-standards/.

If you have any level of experience implementing cryptography, or interesting opinions regarding the cryptographic standardization process, we would greatly appreciate being able to interview you for around an hour.

To select an interview slot, you can respond via mail to hua...@sec.uni-hannover.de or book a slot directly at https://calendly.com/2023-crypto-standard-interviews/schedule.

Best Regards

2023 LinkedIn:

Hi xyz,

Sorry for cold-messaging you about this. I work together with a team of researchers from George Washington University on studying the ethical impacts of privacy and security software.
We would like to interview you about your take on ethics, and consideration of ethics during the development of security and privacy software. You can choose to obtain a $80 gift card as a thank you from us for your participation in this study.

You can find more details on our landing page for the survey: https://gwusec.seas.gwu.edu/ethicalimpactproject/ 

We are thankful that you took the time to read through this message and sorry again for bothering you this way. 

Many thanks, 
Dominik

2023 Reproducible Builds Cold Call:

Title: "Interview: Reproducible Builds Project Approaches and Security Impacts"
Dear [FULL NAME],

We are a group of researchers and are contacting you due to your involvement in [PROJECT]. We are interested in this software project's involvement and experiences with reproducible builds and would love to interview you about it.

We are not trying to sell anything, you & the project would be treated completely anonymously. We hope your answers and our findings will help with further improving the reproducible builds effort (which might benefit your projects).

If an interview sounds interesting to you, feel free to check our landing page for this research with more information: https://research.teamusec.de/2022-interviews-reproducible/

We are very sorry to take up your valuable time with this email. Regrettably, cold emailing is an approach to reach a more diverse set of projects. If you do not want to participate, please accept our deepest apologies and simply ignore this email.

Best regards,
[AUTHOR]

[SIGNATURE]

2023 Reproducible Builds Established:

Hi [NAME],

We are contacting you due to a recommendation from [RECOMMENDER]. We are a group of researchers interested in reproducible open source software. We are interested in this software project's involvement and experiences with reproducible builds and would love to interview you about it.

If an interview sounds interesting to you, feel free to check our landing page for this research with more information: https://research.teamusec.de/2022-interviews-reproducible/
If you do not want to participate, please accept our deepest apologies for wasting your valuable time and simply ignore this email.

Best regards,
[AUTHOR]

[SIGNATURE]

2022 Open Source Email:

Dear [name],

We are a group of researchers and contacting you due to your involvement
in the [project name] project on GitHub. 
We are interested in how popular and active open source project 
communities tackle trust & security and would love to interview 
you about it.

We are not trying to sell anything, you & the project would be 
treated completely anonymously. 
We hope your answers and our findings will help with further 
improving security & trust procedures in the open-source community.

If an interview sounds interesting to you, feel free to check out 
our landing page for this research with more information: 
[landing page URL]

Simply respond to this email with your preferred time slots and 
we will work something out.
Lastly, we are very sorry to take up your valuable time with this email. 
Regrettably, cold emailing is an approach to reach a more diverse set 
of projects. If you do not want to participate, please accept our 
deepest apologies and simply ignore this email, 
we will not contact you again.

Best regards,
[researcher]

2022 Open Source channels:

Hi [project name] community,

We are a group of researchers from [affiliations...]  and 
we are interested in how open source projects tackle 
trust & security and would love to interview someone 
from the community about it.

Our Landing page: [landing page URL]

You can message me directly here on [platform] if you 
have any questions or want to schedule an interview!

Thanks for your time,
[researcher]

Landing Page#

A landing web page with some further information and a sign-up link for interviews can be a good idea to (a) keep invite emails short and to the point and (b) allow invitees to pass on a link in case they know better-suited participants (other maintainers, internal developer groups, etc.).

Examples:

Consent Form#

A consent form is a legal document that ensures an ongoing communication process between you and your study participants. The primary purpose of the informed consent process is to protect both the participant and you.

Before beginning data collection, participants need to give their consent. Consent forms are generally required by IRBs for all types of data collection that include humans, e.g., interview studies, online and lab studies, and surveys.

Every user study needs a consent form! Make sure to provide a consent form for every user study. Only begin data collection after participants gave their consent!

Common collection methods:

  • As a quick async survey before the interview (preferred by US IRBs; provides a consent list)
  • At the beginning of the interview (before recording): outline consent, answer questions, get consent, and get a “yes” again on the recording.
Note: Being able to collect consent right in the interview call is more of a German thing. A US IRB likely prefers collecting the consent before the interview. The best approach is usually sending the participant an email with a link to a short form (Google Forms, Qualtrics, …), which shows the consent form and collects the email + signature/submit of the participant.

Conducting Interviews#

Interviews are a fairly interactive research approach; probably no other method's result quality depends as much on researcher behaviour.

Best Practices#

Below you can find some hints that might help you conduct an effective interview. They are partially inspired by / taken from the listed references.

  • No Priming: It is important that the interviewer does not convey anything to the interviewee about what they know or believe about the topic. This means that the interviewer must pay careful attention to the language each interviewee uses during the interview, and refer to the same concepts the interviewee talks about using the same kinds of words as the interviewee, avoiding to indicate vocabulary of their own. The more we guide interviewees’ responses, the more we will be collecting data about something they’re only thinking about because we asked them to think about it. We want to know what they think about this, not what they think about what WE think about this.
  • Wait for Answers: It is OK to wait for people to answer when you’ve asked them a question. People may need to think for a minute before answering some of these questions, and if there is a pause in conversation while they do that it may feel a bit awkward. This is OK. The best way to give a participant space to answer is to remind yourself to PAUSE and let them think, even if the silence makes you uncomfortable. If you move on, and ask another question, they won’t answer the first question! Count to 10 in your head if you have to.
  • Never interrupt the person you are interviewing! It may feel like the person is rambling into something off topic for the interview protocol. But that doesn’t mean the data won’t be useful, and if you cut them off you will never know what they were going to say. People think out loud sometimes, and the process of talking about something is important for the process of thinking about it. Also, interrupting someone conveys that you weren’t actually that interested in what they were saying, which is absolutely the LAST thing we want interview participants to feel. We are VERY interested in what they have to say!
  • Focus: You should be trying to pay attention to everything the person is saying and thinking about how to follow up. This takes a lot of energy and focus. You shouldn’t be thinking about other stuff going on in your life during the interview – focus all of your attention on the participant, and ask good follow-up questions.
  • Get them to think! Sometimes people’s first response might be “I don’t know” or “I have no idea” or “I’ve never thought about that before”. This is because we’re showing them information they really may not have thought about much before! It is important to follow up when they say that, don’t just let it go! Some ways to follow up and get them talking about what they’re thinking are: “Tell me more about that.” or “Why do you think you haven’t thought about it?” or “What is it that makes it hard to answer this question?” or “We’re really interested in anything you can tell us about your thoughts about this.”

Interview Statements#

Some statements that can be used during interviews.

Acknowledge Meaningful Answers: Good ways to acknowledge that they have said something meaningful:

  • “That was very insightful”
  • “That is interesting to hear”
  • “Thank you for sharing, that was very interesting” (etc.)
  • “Thank you, this was very helpful”
  • “I learned something new, thanks” ← not appropriate everywhere!

Probe/Follow-Up:

  • “Tell me more about [X]”
  • “What do you mean by [X]”
  • “How do you think [X] happens”
  • “Where do you think [X] came from”
  • “Can you give me a specific example of [X]”
  • “What would [X] look like”
  • “Why do you think that is?”
  • “Can you explain this in more detail?”
  • “Can you elaborate on…”

Note that sometimes it can be helpful to move on to a different topic and maybe come back later, if they are reluctant to talk about the first topic.

Back to the Topic:

  • “Thank you, that answered my question. Could we move on to..”
  • “I would like to come back to [X]”
  • “Focusing on [X]…”

Use a brief pause or inhalation as an opening to ask a follow-up or on-topic question.

Lead New Section:

  • “Moving on to …”
  • “Next, we want to learn about …”
  • “We are about halfway through; next …”
  • “For the last section, …”

Protocol#

Below is an example protocol for a generic semi-structured (online video) interview.

1. Before#

Prepare Surroundings

  • Is the current noise level okay? Do you need to move to another office?
  • Is your headset charged? Do you have a wired alternative?
  • Dominik: Consider using a webcam; it is much easier to give positive reinforcement during an answer (by vigorously nodding, etc.) than having to steer the interview by voice alone.
    • Check your webcam, is your background neutral and not distracting?
    • Check your clothing, could it introduce bias (no hacker shirts, unless you interview hackers)?

Prepare Call

  • Give your shadow interviewers co-host privileges.
  • If not already done: do a quick mic / camera / recording check (don’t forget to check the actual recording for sound etc.)

2. Greeting#

When attendee enters:

Introduction. Greeting, introduce yourself and the group, thank them for coming e.g.: Good afternoon! Thank you for coming and participating in our interview study. I am _______ and I’m joined by my research colleagues _____________

Overview. Describe study, procedure, e.g.:

We’re going to be asking you some questions today about your experiences working in data science projects

Reassure them to give honest answers; we aren’t judging or testing:

We’re trying to gather your honest thoughts and views on this subject, so we’d like to assure you that we aren’t trying to test your knowledge or judge your opinion. We may ask you questions on subjects you aren’t knowledgeable about; it’s perfectly alright to say you are unsure or you don’t know.

Consent

  • Reference the consent form, remind them of confidentiality and opt-out e.g.:

When taking our survey you signed a consent form, I just wanted to emphasize that what you say to us during the interview will be kept confidential, and you can stop participating at any time, just let us know. If you feel uncomfortable answering any question, we can skip them with no penalty.

  • Ask if they have any more questions or doubts before starting.

Recording

  • State that with their consent, you will now start recording.
  • State that you are now recording.

3. During#

Communicate to your interviewee that you are listening to their responses by:

  • maintaining an appropriate level of eye contact (somewhat difficult in an online call)
  • periodically nodding
  • making affirmative (but not disruptive) sounds
  • offering comments like “yes”, “okay”, or “I see”

Good ways to probe/follow up for more information:

  • “Tell me more about […]”
  • “What do you mean by […]”
  • “How do you think […] happens”
  • “Where do you think […] came from”
  • “Can you give me a specific example of […]”
  • “What would […] look like”
  • Dominik: I prepare a set of increasingly long / encouraging probes (“Okay”, “I see, thanks”, “Thank you so much for this exhaustive answer”) and “reward” exhaustive (fitting) answers with the longer probes, to encourage more detailed responses from the participant.

4. Ending#

  • Turn off video recording, state that you did so.
  • Ask the participant if they have any more remarks or questions.
  • Thank them for their time and participation and wish them a nice day.

5. After#

  • Leave comments on transcript.
  • Note any changes that need to be made (e.g., to the interview guide).
  • Talk about or write down what the most insightful or interesting parts of the interview were.
  • Summarize the key points; these are helpful later for onboarding other people or choosing transcripts for more in-depth looks.
  • Pay (and review) the participant according to recruitment.

Data Collection#

  • Make sure your setup is working, i.e. test your microphone and camera.
  • Make sure the recording works, either store recordings locally or in the cloud.
  • Set up a backup recording strategy, e.g. use the Open Broadcaster Software.
  • In Zoom: set up a waiting room for the interviewee. A waiting room gives you control over when the interviewee enters the interview, e.g., to avoid an interviewee barging in while you test your setup or brief your shadow interviewer.

Tools For Recording#

  • Zoom recording
  • FFmpeg (Recommended for audio only)
  • OBS-Studio (Audio + Video)
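As a sketch of how FFmpeg can be used for an audio-only (backup) recording; the device names below are placeholders that vary by operating system, so adapt them to your machine:

```shell
# Audio-only recording with FFmpeg (sketch; adjust device names for your system).
# Linux / PulseAudio, using the default microphone:
ffmpeg -f pulse -i default -ac 1 -ar 16000 interview_backup.wav

# macOS / AVFoundation; list available devices first with:
#   ffmpeg -f avfoundation -list_devices true -i ""
ffmpeg -f avfoundation -i ":0" -ac 1 -ar 16000 interview_backup.wav
```

Mono audio at 16 kHz keeps files small and is a common input format for speech transcription tools.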

Tools For Transcription#

Older Content: The transcription content below is older. A self-hosted Whisper instance is probably the best, cheapest, and most privacy-respecting option for transcription.
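For example, with the open-source openai-whisper command-line tool (a sketch: the model choice and file names are placeholders, and the first run downloads the selected model):

```shell
# Transcribe a (de-identified) interview recording locally with Whisper.
# "medium" trades accuracy against speed and memory; "interview01.wav" is a placeholder.
whisper interview01.wav --model medium --language en --output_format txt --output_dir transcripts/
```

Running this locally means the audio never leaves your machine, which simplifies the data protection story compared to cloud transcription services.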

The group has trialed Amberscript (transcription service) and deemed it mostly appropriate for our needs. First, ask someone about the credentials for the shared account. Then, upload your file that was output by any of the methods above. Choose the spoken language (most likely English, all accents) and the number of speakers (two if there’s one interviewer and one interviewee). Select Transcription and manual, and lastly hit Order on the right. Check neither Verbatim nor Rush order.

The transcript should be completed within a few days. The transcription may not be perfect at times. Here, it might help to ALT-click text passages that were misunderstood, listen to the audio, and correct them manually. Inaudible passages are often marked with [inaudible], crosstalk with [crosstalk]. CTRL-F can help here.

For further use in, for example, MaxQDA, exporting the transcript as a TXT file with timestamps is a good option.

Data Analysis#

Without Coding: How to analyse qualitative interviews (without coding):

  1. Read transcripts
  2. Annotate transcripts
  3. Conceptualize data
  4. Segment the data
  5. Analyze segments
  6. Write results

With Coding: Creating a codebook is often an essential step in coding qualitative data, see the Codebook page on how to create and code with a codebook.

Reporting#

Reporting interviews in method sections.

Example Structure#

Example structure for the method section of an interview publication. You can copy and paste this structure into your publications as comments and fill in your own approach below each line.

Copy-Paste Version (for your LaTeX methodology file)
%% Purpose and Approach
% - Between June and August 2021, we conducted a semi-structured interview study with XX developers of background Y.
% - We invited XX participants.
% - See also Table~\ref{tab:interview} for a summary of our interview participants.
%% Participant Recruitment
% - Total number of interviewees
% - Eligibility criteria
% - How did we recruit on GitHub (?), screening questions on GitHub (if any?)
% - How did we select & why is this a good idea for our target population
% - Describe who we kicked out and why
% - Mitigations for biases if any
%% Demographics
% - Data dump of age, gender, experience, race, location, education etc. refer to appendix for detailed information, recruitment from where?
% - Experience and type of experience of participants, industries people worked in
% - See also Table~\ref{tab:demo} for participants' demographics.
%% Interview Procedure
% - Screening responses, survey for demographics
% - Semi-structured interviews
% - Explain why interviews
% - Explain how we developed interview guide and cheat sheet + source, adapted
% - Explain that research questions from prior work, addressing the problems described in previous sections, crafted questions to answer them
% - Developed and refined interview script (pilot, reiteration after first 3, minor changes -> included), explain piloting of interview guide and that we included pilots
% - Include interview-guide-flow-figure
% - Big picture content of interview guide, summary of interview script (sections etc.); walk through the figure
% - Describe what we defined in the interview and what we didn't define and why and at which point
% - Describe if/that we initially interviewed one type of person then included other people and why
% - Explain that we stopped when we reached saturation
%% Interview Data Analysis
% - Describe coding process at a high level
% - Mention volume of audio
% - Describe codebook development
% - Link to codebook in appendix
% - Coding process including conflict resolution in detail
%% Ethical Considerations and Data Protection
% - Approved by IRB
% - Consent form
% - Sensitive questions
% - Data protection stuff
% - Describe how participants were informed about the study/data collection/data usage
% - Consent form, including right to withdraw at any time, skip any questions they wished, answered all participants' data storage questions to their satisfaction
% - GDPR
% - Secure cloud collaboration
% - De-identified transcripts
% - Transcription provider was GDPR compliant
% - Fair payment
%% Limitations
% - Our work includes a number of limitations typical for this type of interview study and should be interpreted in context.
% - In general, self-report studies may suffer from several biases, including over- and under-reporting, sample bias, and social-desirability bias.
% - Convenience sample
% - Interview study
% - Sampling, Upwork, skew young and male (which is common among these sites \cite{})
% - Mitigation for location bias
% - Maybe suggest future work that expands the scope of this study

Purpose and Approach#

  • Between June and August 2021, we conducted a semi-structured interview study with XX developers of background Y.
  • We invited XX participants.
  • See also Table~\ref{tab:interview} for a summary of our interview participants.

Participant Recruitment#

  • Total number of interviewees
  • Eligibility criteria
  • Where did we recruit
  • How did we recruit (channel)
  • Screening questions (if any?)
  • How did we select & why is this a good idea for our target population
  • Describe who we kicked out and why
  • Mitigations for biases if any

Demographics#

  • Data dump of age, gender, experience, race, location, education etc. refer to appendix for detailed information, recruitment from where?
  • Experience and type of experience of participants, industries people worked in
  • See also Table~\ref{tab:demo} for participants’ demographics.

Interview Procedure#

  • Screening responses, survey for demographics
  • Semi-structured interviews
  • Explain why interviews
  • Explain how we developed interview guide and cheat sheet + source, adapted
  • Explain that research questions from prior work, addressing the problems described in previous sections, crafted questions to answer them
  • Developed and refined interview script (pilot, reiteration after first 3, minor changes -> included), explain piloting of interview guide and that we included pilots
  • Include interview-guide-flow-figure
  • Big picture content of interview guide, summary of interview script (sections etc.); walk through the figure
  • Describe what we defined in the interview and what we didn’t define and why and at which point
  • Describe if/that we initially interviewed one type of person then included other people and why
  • Explain that we stopped when we reached saturation

Interview Data Analysis#

  • Describe coding process at a high level
  • Mention volume of audio
  • Describe codebook development
  • Link to codebook in appendix
  • Coding process including conflict resolution in detail

Ethical Considerations and Data Protection#

  • Approved by IRB / ethics board
  • Consent form
  • Sensitive questions
  • Data protection stuff
  • Describe how participants were informed about the study/data collection/data usage
  • Consent form, including right to withdraw at any time, skip any questions they wished, answered all participants’ data storage questions to their satisfaction
  • GDPR
  • Secure cloud collaboration
  • De-identified transcripts
  • Transcription provider was GDPR compliant
  • Fair payment

Limitations#

  • Our work includes a number of limitations typical for this type of interview study and should be interpreted in context.
  • In general, self-report studies may suffer from several biases, including over- and under-reporting, sample bias, and social-desirability bias.
  • Convenience sample
  • Interview study
  • Sampling, Upwork, skew young and male (which is common among these sites \cite{})
  • Mitigation for location bias
  • Maybe suggest future work that expands the scope of this study

Reporting Numbers#

There has been a lot of discussion on whether or not you should report counts or percentages in a qualitative interview study; after all, you don’t want to suggest quantitative or generalizable results, as you “only” interviewed a small subset of a group about their subjective experiences.

The current group workflow in this case is to provide qualifiers: you select certain percentage ranges, assign each a qualifier term, and use only these to describe your results and the portions of participants who stated certain things. For example, if you interviewed 20 people, instead of saying “10 stated xy” or “50% said xy”, you only write “About half”, leaving open whether that is 45% or 55% of your interviewees. As a bonus, you achieve some additional anonymity by using percentage ranges.

If you choose this approach, you should also include a table or figure depicting your ranges and qualifiers in the paper. For an interview study with 20 people, you could use the following distribution:

Percent Count Qualifier
0% 0 None
>0-15% 1-3 A few
>15-30% 4-6 Some
>30-45% 7-8 Many
>45-55% 9-11 About Half
>55-70% 12-13 Majority
>70-85% 14-16 Most
>85-99% 17-19 Almost All
100% 20 All
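The mapping above can be expressed as a small helper for consistent reporting across authors. This is an illustrative sketch: the function name and the boundary handling (e.g., whether exactly 45% counts as “Many” or “About Half”) are our own choices and should be adapted to your sample size:

```python
def qualifier(count: int, total: int) -> str:
    """Map a participant count to a hedged qualifier term.

    The percentage ranges follow the example table above (for a
    20-participant study); adapt the boundaries to your own sample.
    """
    if total <= 0 or not 0 <= count <= total:
        raise ValueError("need 0 <= count <= total and total > 0")
    pct = 100 * count / total
    if pct == 0:
        return "None"
    if pct <= 15:
        return "A few"
    if pct <= 30:
        return "Some"
    if pct <= 45:
        return "Many"
    if pct <= 55:
        return "About Half"
    if pct <= 70:
        return "Majority"
    if pct <= 85:
        return "Most"
    if pct < 100:
        return "Almost All"
    return "All"
```

Using one shared helper (or the table itself) across all authors avoids inconsistent qualifier use within a paper, e.g., `qualifier(10, 20)` always yields “About Half”.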

Examples: A few publications using this approach:

References#

References and helpful resources:

  • Interviewing As Qualitative Research: A Guide for Researchers in Education And the Social Sciences by Irving Seidman, Teachers College Press (2006).
  • Qualitative Interviewing: Understanding Qualitative Research by Svend Brinkmann, Oxford University Press (2013).