How to Survey#
One key technique for understanding and improving how human decision making affects information security and privacy is the gathering of self-reported data from users. This data is typically gathered via survey and interview studies, and serves to inform the broader security and privacy community about user needs, behaviors, and beliefs. The quality of this data, and the validity of subsequent research results, depends on the choices researchers make when designing their experiments.
In the following, we provide an overview of best practices in the context of studying usability for security and privacy. They can be largely grouped into questionnaire writing, pre-testing, and sampling. After data collection, survey results need to be analyzed using qualitative and quantitative methods.
Questionnaire Writing#
A survey consists of different questions about participant beliefs, behaviors or opinions. Questions can take different forms, the following being the most common:
- Closed Answer: Participants choose one or more items from a precompiled list of answer options. The results are well structured and the questions can be answered quickly, but the list of answers might not be exhaustive. Closed answers can introduce ordering bias, so the answer order should be randomized.
- Open-Ended: Participants are not offered any answer options but can freely describe their answer in a text field. Participants have better control over their answers, but researchers need to go through all provided answers, usually using coding techniques. This takes more time and effort for both participants and researchers.
- Likert Scales: Participants are given a set of points that usually describe ranges such as “Very secure - somewhat secure - neither secure nor insecure - somewhat insecure - very insecure”. A Likert scale should consist of 5-10 points: with fewer points, the answers might not provide a clear picture as there are not enough nuances; with more points, differences begin to vanish. Likert scales should be balanced, i.e. have the same number of points on both sides of a neutral middle ground. A sketch of how such responses can be coded numerically follows this list.
- Odd number of points: Includes a true neutral option, which is desired by participants with e.g. no opinion on the topic.
- Even number of points: Has no true neutral position and generates stronger opinions, as participants are forced to choose a side.
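When analyzing Likert data, the labeled points must be mapped to numbers. Below is a minimal sketch of this coding step; the scale labels follow the example above, while the `pandas` column names are hypothetical:

```python
import pandas as pd

# Balanced 5-point scale from the example above, coded 1 (very insecure)
# to 5 (very secure); the middle point 3 is the true neutral option.
LIKERT_5 = {
    "Very insecure": 1,
    "Somewhat insecure": 2,
    "Neither secure nor insecure": 3,
    "Somewhat secure": 4,
    "Very secure": 5,
}

# Hypothetical responses to a single question.
responses = pd.DataFrame(
    {"q1": ["Very secure", "Somewhat insecure", "Neither secure nor insecure"]}
)
responses["q1_code"] = responses["q1"].map(LIKERT_5)  # unmapped labels become NaN
print(responses)
```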
A survey should be as short as possible and communicate a precise estimate of its length before participants start. Researchers should aim for a duration of at most 15-20 minutes. A longer survey will tire participants, and if a shorter duration was communicated, this can be perceived as deceit. Both might lead to increased drop-out rates.
Quality Assurance#
Participants might respond carelessly or inconsistently, or be distracted. To ensure high-quality survey data, this should be considered during the survey design. There are several ways to prevent or detect such behavior. For example, it is possible to include attention check questions in a survey and discard responses with failed attention checks, as sketched below.
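A minimal sketch of the filtering step, assuming responses were exported to a CSV and that the attention check instructed participants to pick a specific option (all file, column, and answer names here are hypothetical):

```python
import pandas as pd

df = pd.read_csv("survey_results.csv")  # hypothetical export from the survey platform

# The attention check item read e.g.: "To show that you are paying attention,
# please select 'Somewhat agree' for this statement."
passed = df["attention_check"] == "Somewhat agree"
print(f"Discarding {(~passed).sum()} of {len(df)} responses with failed attention checks.")
df = df[passed]
```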
As this is a whole subtopic in itself, the literature on careless responding can provide further guidance.
Bias#
Researchers need to be very careful when designing surveys: a multitude of biases can be introduced by the presentation of a survey, the question phrasing, or even the order of questions and answers. A few examples:
- Ordering bias: The first or last answer options are more likely to be chosen, which is why their order should be randomized (a sketch follows at the end of this section).
- Social desirability: Participants tend to answer in a way they deem socially acceptable or “the correct answer”. This is especially true for sensitive topics. Careful phrasing can help mitigate this bias.
- Framing bias: Answers can be influenced by the way they (or the survey) are presented, e.g. if they are positively or negatively connotated. Using neutral wordings can help against this bias.
- Sampling bias: Not all individuals from the target population had an equal chance of being selected, which often leads to wrong assumptions about the population. Can be reduced by adjusting the sampling strategy.
When designing a survey, it is important to keep possible sources of bias in mind and to weigh the options - sometimes, introducing a bias can be justified (e.g. if the answers are not as important as the answer behavior, if there were countermeasures against bias, or if there is previous research supporting the experiment design choices). In these cases, researchers should acknowledge and report possible limitations and be able to defend their research decisions.
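Regarding ordering bias: most survey platforms (e.g. Qualtrics) can randomize answer order as a built-in option. For custom-built surveys, a minimal sketch of per-participant randomization might look like this (keeping a catch-all option such as “Other” in a fixed last position is a common convention, not a requirement):

```python
import random

def shuffled_options(options: list[str], keep_last: str | None = "Other") -> list[str]:
    """Shuffle answer options per participant; optionally keep e.g. 'Other' last."""
    head = [o for o in options if o != keep_last]
    random.shuffle(head)  # in-place shuffle counters ordering bias
    return head + [keep_last] if keep_last in options else head

print(shuffled_options(["Very secure", "Somewhat secure", "Not secure", "Other"]))
```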
Survey Context#
Aspects outside of a survey’s questions - e.g. the question order, sponsors, introductory texts and more - can strongly influence how participants respond. For example:
- Asking questions about privacy before a privacy coding study can prime participants and cause them to behave more privacy-aware than usual.
- Asking personal questions before a privacy-related question block can lead to more privacy-concerned responses.
- Telling participants that their privacy behavior is being studied might lead to them reporting more desirable behavior.
- Asking questions on different topics in between can help diffuse these effects.
- Being sponsored by a university might seem more trustworthy and lead to more sincere answers.
- It can be beneficial to be vague or even to lie about the true purpose behind a study. Be sure to debrief participants and not to violate ethical standards.
Choice of Words#
“[…] for example, the word ‘usually’ was interpreted in 24 different ways by one study’s participants.” - [1]
As there is often no direct contact with participants, both text sections within the survey and the questions need to be phrased carefully to avoid biases and misunderstandings. There are two main problems:
- Different participants have different understandings of terms, questions or answer choices, which heavily influences the validity of the survey: If we don’t know what participants had in mind while answering, we cannot compare their answers! Besides technical terms participants might be unfamiliar with, this can also affect common words such as “usually” if there is no precise definition for them.
- Pre-testing can uncover unclear phrasing and other error sources.
- Researchers use technical or abstract terms because they are more familiar with the topic at hand. This can be harder to avoid as there are often no common alternatives, and participants often do not read provided explanations.
- Try to use more familiar terms where possible
- Use examples to clarify what you mean
- Tools such as QUAID can draw attention to common pitfalls within the survey.
Avoid double-barreled questions: A single question can ask about more than one thing, e.g. “Do you believe that your employer should require you to update your computer and change your password every six months?” asks about both updates and passwords. For participants, it might be hard to answer both at the same time (which makes the answers hard to evaluate!), so questions like this should be split into two separate questions.
Sensitive Topics#
Surveys can include topics that are regarded as sensitive - e.g. because participants are supposed to keep answers secret, because the topic is very personal and/or disclosing information can lead to serious consequences, or because there is social stigma against certain answers. This can include topics such as sexuality, religion or political views, but also security-related behavior such as password choices. As all data from surveys is self-reported, participants may choose not to answer sensitive questions truthfully, e.g. due to lack of trust or because they want to answer “correctly” with a socially acceptable answer. There are some ways we can try to mitigate this:
- Phrase questions indirectly and not about the participants themselves (“Do your friends behave securely?” vs. “Do you behave securely?”).
- Use forgiving phrasing that signals that every answer is valid (“People have various reasons to choose not to update their phone. Which of these best describes your behavior?”)
- Balance questions by including both a positive and negative answer option (“If there is a serious fuel shortage this winter, do you think there should or should not be a law requiring people to lower the heat in their homes?”)
- It’s considered unethical to not include “a way out” for participants who don’t want to answer a question. This can be achieved by e.g. adding a “I don’t know” or “Prefer not to disclose” option.
A special case are demographic questions. As they are not only sensitive but can also highly influence the following questions, they should be placed towards the end of the survey. Since demographic questions are usually easy to answer, they also fit well at the end, when participants may already be tired from previous questions.
Established Questions: It is advisable to use already established questions, especially for topics such as demographics - many questions have already been phrased and tested. A pre-compiled list of questions can be found in the Question Blocks section below.
Further Reading#
- Self-Reports: How the Questions Shape the Answers
- Rating Scales: Numeric Values May Change the Meaning of Scale Labels
Questionnaire Pre-Testing#
In an ideal survey, each respondent interprets the questions in the way we intended. But in reality, survey questions are misunderstood. Participants may find the answers hard to recall, difficult to estimate, and struggle to map their answer to the choices we provide. They frequently interpret words in different ways, and may hesitate to report answers due to social-desirability bias. Additionally, while writing questionnaires we may inadvertently miss key answer choices or accidentally include technical words that our respondents do not understand. Pre-testing surveys and interview protocols can help prevent these and other measurement errors and ensure that self-report survey and interview measurements are as accurate as possible. The main goals of pre-testing are to make sure that respondents correctly understand and interpret questions.
Cognitive Interviews#
Based on: Link, Link2, Link3, Link4, Link5, Link6
Cognitive interviewing is meant to identify and analyze sources of response error in survey questionnaires by focusing on the cognitive processes respondents use to answer questions on a survey or questionnaire.
- Conventionally, cognitive interviewing involves conducting face-to-face (f2f) interviews with small sample sizes of five to 30 respondents.
- Semi-structured, in-depth interviews are conducted on the basis of an interview protocol which contains the questions to be tested in the cognitive interview and the techniques to be adopted, in particular think-aloud and follow-up questions (probing).
- Probing is used to elicit information about how respondents interpret questions or define specific terms and how respondents arrive at their answers.
- Probing questions are administered either immediately after the subject has answered the survey question (concurrent) or at the end of the cognitive interview (retrospective).
Think Aloud#
Based on:
By using thinking aloud, you ask survey respondents to answer questions while continuously thinking out loud — that is, simply verbalizing their thoughts as they move through the survey questionnaire.
- “Simply” ought to be in quotes, because it’s not that simple for most people to keep up a running monologue. You typically have to prompt respondents to keep them talking.
Probing Questions#
Based on:
Probes can be grouped into comprehension, elaborative and specific probes. Picking the right questions is important. Examples:
- Comprehension Probes
- What does the term X mean to you?
- What, to you, is X?
- Elaborative Probes
- Could you please explain your answer a little further?
- Why did you answer X?
- Specific Probes
- What kinds of implementation challenges did you think of when answering this question?
- Why do you say that you think it is very important that developers write secure code?
- Paraphrasing
- Can you repeat the question I just asked in your own words?
- Confidence Judgement
- How sure are you that you have 19 years of experience in writing Python code?
- Recall
- How do you remember that you have 19 years of experience in writing Python code?
- How did you come up with your answer?
- General
- How did you arrive at the answer?
- Was that easy or hard to answer?
- I noticed that you hesitated. Tell me what you are thinking.
- Tell me more about that.
Web Probing#
Based on: Link5
An alternative to conducting f2f cognitive interviews in the lab is to transfer the probing procedure into an online questionnaire, a method called online or web probing. Here, for the questions to be tested, open and closed probing questions are developed and then implemented into an online questionnaire. In the concurrent probing format:
- respondents first answer a survey question
- and after clicking on the next button receive one or more probes on the next survey page.
Advantages over f2f Cognitive Interviews
- No cognitive interviewer required
- Recruiting respondents is quicker and more cost-effective
- Realization of larger sample sizes
- Allows researchers to quantify their pretest findings (Ref)
- Extends the geographic reach of participant recruitment
- Rules out any interviewer effects; increases the reliability and comparability of the results (Ref)
Disadvantages compared to f2f Cognitive Interviews
- Due to the absence of the interviewer, no one can probe for more information, follow up on incomplete answers or provide clarification of the tasks
- Probing is restricted to the scripted questions previously programmed and implemented into the Web survey.
- No one can motivate the respondents during completion of the Web survey to answer the (open) probing questions thoughtfully and elaborately
Sampling#
Survey results depend heavily on who responds to them - and therefore, the way participants are chosen or invited is very important to the whole process. It is advisable to first specify the target population (i.e. who we want to know more about), then think about how these persons can be reached best: If the target population is computer science students, a convenience sample at the local university is usually sufficient. If we want to draw conclusions about the general public, sampling methods such as crowdsourcing are a better fit. We describe a few of the most common sampling methods below - given that none of these are perfect, researchers must make tradeoffs to balance data quality with available resources (e.g. money to pay participants).
Sampling Methods#
- Convenience Sampling: Try to acquire participants from easy sources such as university students (if you’re researching at a university) or personal contacts. This is usually easy and cheap, but results in non-representative participants from similar backgrounds.
- Snowball Sampling: Similar to convenience sampling, but includes participants recruiting their friends who recruit their friends and so on. Can yield a slightly more representative sample and a larger participant pool, but will usually still result in participants that resemble each other.
- Crowdsourcing: Use crowdsourcing platforms such as Amazon MTurk or Prolific. These cover a more representative portion of the general public that can also be recruited for behavioral experiments, but the costs are higher. Statistics on platform users or previous research should be consulted to get insights into the available user base, as there might still be some bias (e.g. MTurk workers tend to be younger, better educated and more tech-savvy than the general public).
- Social Media: Use social media platforms such as Twitter to reach participants. This yields higher representativeness than snowball sampling, and depending on the topic, data can already be accessible without directly contacting users (as profiles are often public and can already include e.g. demographic information). Possible biases include that only a small portion of the general public actively uses e.g. Twitter, and that information on these profiles is self-reported and can be wrong.
- Online Census Platforms: Online platforms that specialize in putting together representative samples (or samples based on demographic criteria of the researcher’s choosing). This should yield much better representativeness than other methods, and researchers can define their sample very precisely. However, this is more expensive than crowdsourcing, and as over 90% of invitees do not respond to surveys, there is still a response bias present.
- Probabilistic Samples: Sampling method where every person in the population has a known, non-zero chance of being drawn at random. This is the sampling method with the best representativeness, but it is also very hard to achieve. Tools such as Google Consumer Surveys (GCS) or KnowledgePanel can be used, but they are even more expensive than online census platforms. They are also often very limited; for example, GCS only allows up to 10 question items per survey, including demographics.
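None of the above yields a perfectly representative sample, so results are sometimes reweighted against known population margins (post-stratification). A minimal sketch of this idea; the age groups and population shares are hypothetical placeholders, not real census figures:

```python
import pandas as pd

sample = pd.DataFrame({"age_group": ["18-29", "18-29", "18-29", "30-49", "50+"]})

# Hypothetical population shares, e.g. taken from census data.
population_share = {"18-29": 0.20, "30-49": 0.35, "50+": 0.45}

sample_share = sample["age_group"].value_counts(normalize=True)
# Weight = population share / sample share: overrepresented groups are
# down-weighted, underrepresented groups are up-weighted.
sample["weight"] = sample["age_group"].map(lambda g: population_share[g] / sample_share[g])
print(sample)
```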
Consent Forms#
The primary purpose of the informed consent process is to protect both the participant and you. A consent form is a legal document that ensures an ongoing communication process between you and your study participants. Before beginning data collection, participants need to give their consent. We require consent forms for all types of data collection that include humans, e.g., interview studies, online and lab studies, and surveys. Below you will find a consent form template and examples from previous studies.
Every user study needs a consent form! Only begin data collection after participants have given their consent!
For Interviews: For (oral) interviews, the requirements are somewhat less strict, as you typically obtain the participant’s consent orally. Take a look at the consent form examples below as a general structure for your interview intro.
Some ideas that should appear in the intro:
- “I am going to interview you about X, this will take about Y minutes”
- " This is completly anonymous, and your answers will only be used for X"
- “Are you okay with me recording this interview? This recording will only be accessed by X and deleted after Y.”
- “Are you ready to start the interview?”
Consent Form Protocol#
Please follow the protocol below whenever you need to create a consent form.
| Project Title | Project Title goes here |
|---|---|
| Principal Investigator | PI goes here, usually Sascha |
| Student Researchers | List all involved students, PhDs, PostDocs. |
| Project Description | Add a brief project description. |
| Eligibility | Tell the participants what criteria they have to fulfill in order to participate. |
| Procedure | Tell the participants (roughly) the study’s procedure. |
| Risks & Benefits | List potential risks and benefits for participants. |
| Duration | Piloting should have told you how long the study is going to take. Include this info here. |
| Compensation | Do we pay participants? Tell them here. |
| Confidentiality | Tell the participants how you store and process their data and care for their privacy. |
| Subjects’ Rights | Tell the participants that their participation is voluntary and that they can stop participating at any time. |
| Future use of research data | Tell the participants how their data is stored and might be used in the future. |
| Contact | Provide contact information for the PI |
TODO: NC State form
Special Blocks#
- Disclaimer if we use IP filtering:
  Warning! This survey uses a protocol to check that you are responding from inside the U.S. and not using a proxy or Virtual Private Network (VPN) to hide your country. In order to take this survey, please turn off your proxy/VPN if you are using one. Failure to do this might prevent you from completing the HIT.
- If we want to use the data for teaching purposes:
  We would like to use your responses (in an anonymized version) for teaching purposes. This option is voluntary.
  - I agree that my anonymized responses may be used for teaching purposes.
Consent Form Examples#
Socioeconomic Faculty (UoC)
The impact of socioeconomic heterogeneity on knowledge#
- Title of research study: The impact of socioeconomic heterogeneity on science and innovation
- IRB Protocol Number: 24-0430
- Investigator: Aaron Clauset
- National Science Foundation, (NSF) award #2420950.
Key Information#
Our study aims to understand how the educational, socioeconomic, and geographic backgrounds of faculty have shaped their scholarly interests and academic career trajectories. For instance, how does being a first-generation college graduate influence what type of scholarship they pursue as faculty? We aim to analyze how the composition of the academic workforce shapes the pace and direction of knowledge.
As comprehensive social and demographic data on faculty backgrounds is not available and is critical to our research, we need your help to construct it via this quick, one-time online survey. Contributing involves minimal risk, if any. Thank you for your participation!
To respect your time, we have made this survey brief (4 to 6 minutes), and to respect your privacy, participation in each question is voluntary.
Purpose of the Study#
- This research project will develop a new understanding of the factors that shape the scholarly interests of scientists, and the pace of academic scholarship.
- Past work has primarily focused on a few easily measurable characteristics of the scientific workforce, or on specific fields or institutions, which has limited our ability to develop theories and robustly inform policy.
- This brief online survey includes all tenured or tenure-track faculty in every field at every PhD-granting institution in the U.S. (about 300,000 faculty) and will provide unprecedented insights into across- and within-disciplinary trends and patterns.
Explanation of Procedures#
The survey has 2 overall sections:
- Parental information (approximately, 2-3 minutes)
- Sociodemographic information (2-3 minutes).
All questions are optional; you may select ‘Prefer not to answer’ or skip them.
If you answer our survey, we will use your name and institutional affiliation only to algorithmically link you with public bibliographic information such as academic publications. This linkage is required to answer our research question because we seek to measure the statistical relationship between academic scholarship and socioeconomic backgrounds across academic disciplines.
Voluntary Participation and Withdrawal#
Your participation is voluntary, and you may withdraw from the survey at any time. If you choose to stop at any point, you may request deletion of the data you have entered so far.
Confidentiality#
Information obtained about you for this study will be kept confidential to the extent allowed by law. Only the research team will have access to the survey records. For managing the data:
- We assign each participant a unique and anonymous identifier and securely store your name and institutional affiliation separately from all your survey responses.
- After the study is completed, we will deidentify the data by removing the identifiers that link it to you.
- Any data from this study made available to other researchers, e.g., as part of a published paper, will either be only statistical aggregates like regression coefficients or will be de-identified strictly following the NIST IR 8053 standard for data anonymization.
We take very seriously the matter of protecting your individual responses from re-identification so that you can feel comfortable participating fully in our study. We are happy to answer any questions you may have on the anonymization process, as well as the handling or removal of your data. By participating, you agree to allow us to analyze this data for this and future related studies.
Payment for Participation#
As a token of our appreciation, respondents who complete at least half of the required section of survey will be entered into a drawing for one of ten $100 honorariums.
Questions#
If you have questions, comments, concerns, or complaints, please contact the research team at facultystudy@colorado.edu or our personal email addresses. This research has been reviewed and approved by an IRB. You may contact them at (303) 735-3702 or irbadmin@colorado.edu if:
- Your questions, concerns, or complaints are not being answered by the research team.
- You cannot reach the research team.
- You want to talk to someone besides the research team.
- You have questions about your rights as a research subject.
- You want to get information or provide input about this research.
Signatures#
By clicking and proceeding into the survey, you acknowledge that you have read and understood the consent form above, and that you give your consent to participate in this research study.
- I have read and understood the consent form above and give my permission to participate in this study.
- Next Page >
- This survey includes questions about your identity and background.
- All questions are critical to our research objectives, but due to their personal nature, all questions are optional. Please respond only to those you feel comfortable answering.
- To invite you to the survey, we pre-collected some information from public sources and created a unique ID for you. Is the below information correct? [snip]
- Yes, that is me
- Yes, but I’d like to correct this information
- No
- Prefer not to proceed to the survey
USEC Web Auth Advice (CISPA)
By signing this consent form, I am affirming that…
- I am age 18 or older.
- I am comfortable using the English language to participate in this study.
- I have read and understand the above information. All of the questions that I had about this research have been answered.
- I have chosen to participate in and continue this study with the understanding that I may stop participating at any time without penalty or loss of benefits to which I am otherwise entitled.
- I am aware that I may revoke my consent at any time.
[Submit]
Cloud Privacy (LUH)
| Project Title | Exploring Users’ Perceptions of Cloud Office Applications |
|---|---|
| Principal Investigator | Sascha Fahl |
| Student Researchers | Nicolas Huaman, Christian Stransky, Dominik Wermke |
| Description | We are researchers at Leibniz University Hannover in Germany doing a research study about cloud office applications such as Google Docs and Office 365. You will be asked to answer a series of questions about your interaction with cloud office documents and your understanding of how these applications work. Mechanical Turk workers who are age 18+, live in the United States, have interacted with cloud office software before, and have completed 1000+ HITs with a 95%+ approval rating are eligible. |
| Risks & Benefits | The risks to your participation in this online study are those associated with basic computer tasks, including boredom, fatigue, mild stress, or breach of confidentiality. The only benefit to you is the learning experience from participating in a research study. The benefit to society is the contribution to scientific knowledge. |
| Duration | Participation should take about 10 minutes. |
| Compensation | All participants who complete all tasks will be compensated $1.70 through Mechanical Turk. Participants who do not complete all tasks will not be paid. |
| Confidentiality | Your Mechanical Turk Worker ID will be used to distribute payment to you but will not be stored with the research data we collect from you. Please be aware that your MTurk Worker ID can potentially be linked to information about you on your Amazon public profile page, depending on the settings you have for your Amazon profile. We will not be accessing any personally identifying information about you that you may have put on your Amazon public profile page. Any reports and presentations about the findings from this study will not include your name or any other information that could identify you. We may share the data we collect in this study with other researchers doing future studies – if we share your data, we will not include information that could identify you. |
| Subjects’ Rights | Your participation is voluntary. You may stop participating at any time by closing the browser window or the program to withdraw from the study. Partial data will not be analyzed. |
| Contact | For additional questions about this research, you may contact: Sascha Fahl, Institute for Applied, Informatics, Leibniz University Hannover, E-Mail: fahl@sec.uni-hannover.de |
- I am age 18 or older.
- I am comfortable using the English language to participate in this study.
- I have read this consent form or had it read to me.
- I agree to participate in this research and I want to continue with the study.
[Submit]
GitHub Convenience
| Project Title | Python study |
|---|---|
| Principal Investigators | Michelle Mazurek and Sascha Fahl |
| Student Researchers | Yasemin Acar, Christian Stransky |
| Description | This research is being conducted by Michelle L. Mazurek (University of Maryland, College Park) in cooperation with Sascha Fahl at Saarland University (Germany). The purpose of this project is to better understand how Python programmers use Python. You will be asked to complete several short programming tasks. Immediately after finishing the short programming tasks, you will be given an exit survey. |
| Risks & Benefits | There is only minimal risk to participants. All collected data including Python source code and exit survey answers will be stored in a secure cloud-storage system that only investigators listed in the IRB can access. There are no direct benefits for participants in this study. However, we believe that this study will improve understanding of how Python programmers use Python. |
| Duration | Participation should take about 60 minutes. |
| Compensation | You are not compensated for the study. |
| Confidentiality | No personally identifiable information will be collected. However, we may use cookies and IP addresses to prevent duplicate participation. If we contacted you through github, we may link your participation with pseudonymized data from your github profile, not including your email address or github ID. All exit survey answers, along with Python source code, will be collected and analyzed pseudonymously. Moreover, the data will be stored in a secure cloud-storage system accessible only to investigators listed in the IRB application. After this study is completed, the data used in this study will be archived for three years and then destroyed. |
| Subjects’ Rights | You may choose not to take part at all. If you decide to participate in this research, you may stop participating at any time. If you decide not to participate in this study or if you stop participating at any time, you will not be penalized or lose any benefits to which you otherwise qualify. If you are a student of UMD or Saarland University, neither your grades, standing nor your employability will be positively or negatively affected by deciding to participate in this study, deciding to not participate in this study, or starting the study and withdrawing from it. |
| Contact | For additional questions about this research, you may contact: Michelle Mazurek, University of Maryland College Park, E-Mail: mmazurek@umd.edu |
- I am age 18 or older.
- I am comfortable using the English language to participate in this study.
- I have read this consent form or had it read to me.
- I agree to participate in this research and I want to continue with the study.
[Submit]
E2E (HTML Source, LUH)
(For Qualtrics, you can just copy & modify the HTML source of the block below):
Please indicate, in the box below, that you are at least 18 years old, have read and understood this consent form, and you agree to participate in this online research study.
- I am age 18 or older.
- I have read this consent form or had it read to me.
- I am comfortable using the English language to participate in this study.
- I agree to participate in this research and I want to continue with the study.
[Submit]
Question Blocks#
Example question blocks from established scales and our previous surveys.
Demographics#
List of potential demographics questions (you probably don’t want all at the same time):
- Age: What is your age? _______ [Integer; Dropdown works, too]
- Gender: What is your gender? [Check box; alternatively only free text]
- Place of residence: In which city / state / country do you live? ______ [Free text; Dropdown works, too]
- Education: What is the highest level of school you have completed or the highest degree you have received? [Radio box]
  Note: This question is a little weird and we got questions in reviews asking why we include “currently enrolled in college or graduate school” here. This seems to be highly confusing, as the question is about the highest obtained degree. Be careful when using it that way.
- Employment: What is your current employment status? [Check box]
- Company: What is the approximate number of employees your company has? [Radio box]
- Income: Information about income is important to understand. Would you give your best guess? Please indicate the answer that includes your entire household income in [year] before taxes. [Radio box]

Demographics Long
The above gender question is taken from How to do better with gender on surveys: a guide for HCI researchers by Spiel et al. and could be referenced in our papers.
Highly sensitive questions, talk to somebody who knows what they are doing beforehand:
Demographics Sensitive
- Trans: Are you transgender? [Radio box]
- Yes
- No
- Prefer not to disclose
- Sexual orientation: What is your sexual orientation? [Check box]
- Straight
- Gay
- Lesbian
- Bisexual
- Pansexual
- Queer
- Asexual
- Prefer not to disclose
- Prefer to self-describe ______ [Free text]
- Ethnicity: Choose one or more races/ethnicities that you consider yourself to be: [Check box]
- American Indian or Alaska Native
- Asian
- Black or African American
- Hispanic or Latino
- Middle Eastern
- Native Hawaiian or Pacific Islander
- White
- Prefer not to disclose
- Prefer to self-describe ______ [Free text]
- Political views: In general, how would you describe your political views? [Radio box]
- Very conservative
- Conservative
- Moderate
- Liberal
- Very liberal
- Prefer not to disclose
Experts#
Coding
- Coding [specific language]: How long have you been coding in [programming language]? [Radio box]
- Less than 1 year
- 1-2 years
- 2-5 years
- More than 5 years
- Coding [general]: How long have you been coding in general? [Radio box]
- Less than 1 year
- 1-2 years
- 2-5 years
- More than 5 years
- Coding learning: How did you learn to code? [Check box]
- self-taught
- online class
- college
- on-the-job training
- coding camp
- other: ______ [Free text]
- Coding primary job [general]: Is coding your primary job? [Radio box]
- Yes
- No
- Prefer not to disclose
- Coding primary job [specific language]: Is writing [programming language] code (part of) your primary job? [Radio box]
- Yes
- No
- Prefer not to disclose
- IDE/editor: Which IDE/editor do you use to write [programming language] code? [Check box]
- [provide list of potentially relevant IDEs/editors]
Tasks#
Task-specific questions, ask for each task:
Task-specific
Usability#
System Usability Scale (SUS)#
Taken from: usability.gov
The System Usability Scale (SUS) provides a reliable tool for measuring usability. It consists of a 10-item questionnaire with five response options for respondents, from Strongly agree to Strongly disagree. It allows you to evaluate a wide variety of products and services, including hardware, software, mobile devices, websites, and applications.
SUS Benefits#
- has become an industry standard
- very easy scale to administer to participants
- can be used on small sample sizes with reliable results
- is valid: it can effectively differentiate between usable and unusable systems
Considerations when using SUS#
- the scoring system is somewhat complex
- there is a temptation, since the scores are on a scale of 0-100, to interpret them as percentages; they are not percentages
- the best way to interpret your results involves “normalizing” the scores to produce a percentile ranking
- SUS is not diagnostic - its use is in classifying the ease of use of the site, application or environment being tested
The Scale#
SUS Scale
Participants are asked to score the following 10 items with one of five responses that range from Strongly Agree to Strongly disagree:
- Statement: Please rate your agreement to the following statements: [Likert items: Strongly agree; agree; neutral; disagree; strongly disagree.]
- I think that I would like to use this system frequently.
- I found the system unnecessarily complex.
- I thought the system was easy to use.
- I think that I would need the support of a technical person to be able to use this system.
- I found the various functions in this system were well integrated.
- I thought there was too much inconsistency in this system.
- I would imagine that most people would learn to use this system very quickly.
- I found the system very cumbersome to use.
- I felt very confident using the system.
- I needed to learn a lot of things before I could get going with this system.
Note: “system” can be replaced with the actual artifact you evaluate, e.g. a mobile app, web browser extension, or API.
Interpreting Scores#
Interpreting SUS scores can be complex. Each participant’s response to each question is converted to a new number; these are added together and then multiplied by 2.5 to convert the original range of 0-40 to 0-100. Though the scores are on a 0-100 scale, they are not percentages and should be considered only in terms of their percentile ranking.
Based on research, a SUS score above 68 is considered above average and anything below 68 is below average; however, the best way to interpret your results involves “normalizing” the scores to produce a percentile ranking.
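The conversion described above, in code form. This follows the standard SUS scoring rule (odd items contribute response − 1, even items 5 − response), assuming responses are coded 1 (Strongly disagree) to 5 (Strongly agree):

```python
def sus_score(responses: list[int]) -> float:
    """SUS score from the 10 item responses, in questionnaire order (coded 1-5)."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs exactly 10 responses, each coded 1-5.")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1,3,5,7,9 are positively worded
        for i, r in enumerate(responses)
    )
    return total * 2.5  # scale the 0-40 sum to 0-100


print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible ratings -> 100.0
```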
UMUX#
TODO
Cognitive Dimensions (API Usability)#
TODO
Security#
SeBis#
Based on:
- Scaling the Security Wall - Developing a Security Behavior Intentions Scale (SeBIS) (original paper)
- Behavior Ever Follows Intention? A Validation of the Security Behavior Intentions Scale (SeBIS) (validation study)
Secure Software-Development Self-Efficacy SSD-SES#
Based on:
The Questions#
SSD-SES
Participants are asked to score the following 15 items with one of five responses: I am not confident at all (1), I am slightly confident (2), I am somewhat confident (3), I am moderately confident (4), or I am absolutely confident (5):
- Statement: Please rate your agreement to the following statements: [Likert items: I am not confident at all; I am slightly confident; I am somewhat confident; I am moderately confident; I am absolutely confident.]
- I can perform a threat risk analysis (e.g., likelihood of vulnerability, impact of exploitation, etc.)
- I can identify potential security threats to the system
- I can identify the common attack techniques used by attackers
- I can identify potential attack vectors in the environment the system interacts with (e.g., hardware, libraries, etc.)
- I can identify common vulnerabilities of a programming language
- I can design software to quarantine an attacker if a vulnerability is exploited
- I can mimic potential threats to the system
- I can evaluate security controls on the system’s interfaces/interactions with other software systems
- I can evaluate security controls on the system’s interfaces/interactions with hardware systems
- I can communicate security assumptions and requirements to other developers on the team to ensure vulnerabilities are not introduced due to misunderstandings
- I can communicate system details with other developers to ensure a thorough security review of the code
- I can discuss lessons learned from internal and external security incidents to ensure all development team members are aware of potential threats
- I can effectively communicate to company leadership identified security issues and the cost/risk trade-off associated with deciding whether or not to fix the problem
- I can communicate functionality needs to security experts to get recommendations for secure solutions (e.g., secure libraries, languages, design patterns, and platforms)
- I know the appropriate point of contact/response team in my organization to contact if a vulnerability in production code is identified
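How to aggregate the items depends on the analysis plan; averaging the 1-5 confidence codes is one plausible summary (an assumption here, check the original paper for the instrument’s official scoring). A minimal sketch:

```python
from statistics import mean

CONFIDENCE = {
    "I am not confident at all": 1,
    "I am slightly confident": 2,
    "I am somewhat confident": 3,
    "I am moderately confident": 4,
    "I am absolutely confident": 5,
}

def ssd_ses_mean(labels: list[str]) -> float:
    """Mean confidence over the 15 SSD-SES items (higher = higher self-efficacy).

    Note: simple averaging is an assumption, not necessarily the official scoring.
    """
    if len(labels) != 15:
        raise ValueError("SSD-SES has 15 items.")
    return mean(CONFIDENCE[label] for label in labels)
```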
Privacy#
IUIPC#
Based on:
The Questions#
IUIPC
Please rate your agreement or disagreement with the following statements. Options: {Strongly agree, Agree, Somewhat agree, Neutral, Somewhat disagree, Disagree, Strongly disagree}
- Control
- Consumer online privacy is really a matter of consumers’ right to exercise control and autonomy over decisions about how their information is collected, used, and shared.
- Consumer control of personal information lies at the heart of consumer privacy.
- I believe that online privacy is invaded when control is lost or unwillingly reduced as a result of a marketing transaction.
- Awareness
- Companies seeking information online should disclose the way the data are collected, processed, and used.
- A good consumer online privacy policy should have a clear and conspicuous disclosure.
- It is very important to me that I am aware and knowledgeable about how my personal information will be used.
- Collection
- It usually bothers me when online companies ask me for personal information.
- When online companies ask me for personal information, I sometimes think twice before providing it.
- It bothers me to give personal information to so many online companies.
- I’m concerned that online companies are collecting too much personal information about me.
Interpretation#
TODO
Westin#
Based on:
- Privacy Indexes: A Survey of Westin’s Studies
- Would a privacy fundamentalist sell their DNA for $1000…if nothing bad happened as a result? The Westin categories, behavioral intentions, and consequences
The Questions#
Westin
Participants are asked to score the following 3 items with one of five responses that range from Strongly Agree to Strongly disagree:
- Statement: Please rate your agreement to the following statements: [Likert items: Strongly agree; agree; neutral; disagree; strongly disagree.]
- Consumers have lost all control over how personal information is collected and used by companies.
- Most businesses handle the personal information they collect about consumers in a proper and confidential way.
- Existing laws and organizational practices provide a reasonable level of protection for consumer privacy today.
Interpretation#
TODO
- Fundamentalist:
Fundamentalists are generally distrustful of organizations that ask for their personal information, worried about the accuracy of computerized information and additional uses made of it, and are in favor of new laws and regulatory actions to spell out privacy rights and provide enforceable remedies.
They generally choose privacy controls over consumer-service benefits when these compete with each other.
- Agree (strongly or somewhat) with “Consumers…”; AND
- Disagree (strongly or somewhat) with “Most businesses…”; AND
- Disagree (strongly or somewhat) with “Existing laws…”.
- Pragmatist:
They weigh the benefits to them of various consumer opportunities and services, protections of public safety or enforcement of personal morality against the degree of intrusiveness of personal information sought and the increase in government power involved. They look to see what practical procedures for accuracy, challenge and correction of errors the business organization or government agency follows when consumer or citizen evaluations are involved. They believe that business organizations or government should “earn” the public’s trust rather than assume automatically that they have it. And, where consumer matters are involved, they want the opportunity to decide whether to opt out of even non-evaluative uses of their personal information as in compilations of mailing lists.
- Anyone who does not meet the criteria for Fundamentalist or Unconcerned (see below).
- Unconcerned:
The Unconcerned are generally trustful of organizations collecting their personal information, comfortable with existing organizational procedures and uses, ready to forego privacy claims to secure consumer-service benefits or public-order values, and not in favor of the enactment of new privacy laws or regulations.
- Disagree (strongly or somewhat) with “Consumers…”; AND
- Agree (strongly or somewhat) with “Most businesses…”; AND
- Agree (strongly or somewhat) with “Existing laws…”.
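The segmentation above, expressed as code. The answer label strings are assumptions and must be matched to the survey’s actual options:

```python
# Agreement buckets; adjust the label strings to the survey's actual options.
AGREE = {"Strongly agree", "Somewhat agree"}
DISAGREE = {"Strongly disagree", "Somewhat disagree"}

def westin_category(consumers: str, businesses: str, laws: str) -> str:
    """Classify one respondent from their answers to the three Westin items."""
    if consumers in AGREE and businesses in DISAGREE and laws in DISAGREE:
        return "Fundamentalist"
    if consumers in DISAGREE and businesses in AGREE and laws in AGREE:
        return "Unconcerned"
    return "Pragmatist"  # everyone else

print(westin_category("Strongly agree", "Somewhat disagree", "Strongly disagree"))
```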
References#
- Based on: Link