Chapter 7

Although not often used for evaluation, in-depth qualitative interviews are at times conducted as part of the evaluation process. Therefore, I want to make sure you have some exposure to qualitative evaluation techniques.

Qualitative interviews are conducted in person and are not the same as having people write down their answers to open-ended survey questions. There are many different qualitative research traditions, but this chapter focuses on the mechanics of gathering qualitative data across those traditions (Creswell, 1998).

This chapter is written as if an entire practice evaluation is designed to be qualitative, but the basic rules for qualitative interviewing presented here would apply equally to the use of a few qualitative questions in a “mixed-methods” approach to evaluation.

While quantitative evaluations tend to focus on process and outcome measures, qualitative evaluations have a broader focus that does not center on measurement. Rather, the focus of a qualitative evaluation would likely be on the experiences clients had during the intervention being studied.

For example, let’s go back to our macro social work example about community organizing efforts. Given that a community organizing project usually involves a range of efforts towards a goal, community members, the targets of those efforts, would be able to reflect on their experiences during the project.

The goal of the project is to improve community pride. Over the course of a year, the social worker leading the community organizing effort facilitated improvements to the local park and tracked what would help people want to stay in the community. Community members experienced the social worker’s efforts over time and would have opinions about their experience of the community organizing intervention. Those opinions would be recorded through qualitative interviewing.

Let’s move to a discussion of the more technical aspects of qualitative interviewing. People conducting qualitative interviews need to understand that research or evaluation interviewing may seem similar to clinical interviewing, but is actually different (Padgett, 1998).

One clinician-evaluator describes the difference this way: “Research interviews require a fact-based, neutral inquiry style that contrasts markedly from the empathic style of clinical interviews in psychiatric practice. In fact, the research interview generally seeks to gather information and specifically avoid any therapeutic benefit” (Targum, 2011, 40).

Making this shift can be challenging for new evaluators. In any case, it is often not advisable for a social worker to conduct qualitative interviews for a practice evaluation focused on their own work; ideally, a neutral party would conduct those interviews.

As you begin the process of developing interview questions for your clients, also known as an “interview guide” or “interview protocol,” you will want to set aside most of the quantitative survey design tips in your head. Interview guides usually contain between 7 and 10 open-ended questions, often with accompanying “prompts” for follow-up to each one.

You may be saying, “Gee, 7-10 questions do not sound like very much.” In fact, this is just right. You should expect to interview your clients (or have them interviewed by someone else) for between 45 minutes and an hour.

With only 7-10 questions in a 45- to 60-minute interview, your clients will be doing most of the talking. Therefore, your questions need to be juicy ones that will elicit stories from clients about the phenomena you are evaluating. This means that your questions will be open-ended, as opposed to yes-or-no questions. As with quantitative survey design, you should avoid double-barreled questions, and it is best to ask about the present before the past or the future.
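If it helps to keep a draft guide in a structured, reusable form, a small sketch like the one below makes the anatomy of a guide concrete: each open-ended question carries its own follow-up prompts. This is only an illustrative sketch in Python; the questions, the prompts, and the `GuideQuestion` structure are hypothetical examples, not a published instrument.

```python
# A minimal sketch of an interview guide as structured data, using the
# community organizing example. All questions and prompts are hypothetical.
from dataclasses import dataclass, field

@dataclass
class GuideQuestion:
    text: str                                         # one open-ended question
    prompts: list[str] = field(default_factory=list)  # follow-up prompts

interview_guide = [
    GuideQuestion(
        "Describe your experience of the park improvement project this past year.",
        prompts=["What stands out most to you?", "Tell me more about that."],
    ),
    GuideQuestion(
        "Describe how, if at all, the project changed your sense of pride in the community.",
        prompts=["Can you give me an example?"],
    ),
    # ...a complete guide would hold roughly 7-10 such questions...
]

print(f"Guide holds {len(interview_guide)} questions; aim for 7-10 open-ended ones.")
```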

In qualitative evaluation, designing the right set of questions is of paramount importance. Let’s spend some time thinking about the following interview questions that could be used for a range of different practice evaluations. Do you feel these questions follow the basic rules of qualitative interview question design?

What are your feelings about social workers and nurses at this agency?

What do you think about your social worker’s job performance on your case?

After receiving services at this agency, do you engage in any criminal activities?

When was the last time you had serious mental health trouble?

Let’s take each question one by one and think through them.

What are your feelings about social workers and nurses at this agency?

Because this is a double-barreled question, evaluation participants wouldn’t know which group they were being asked about. Additionally, the wording is very vague: what does “feelings” really mean? Perhaps the evaluators wanted to know about clients’ confidence in their social workers, for example. If that were the case, would this question only lead to someone saying they were or were not confident? That is akin to a yes-or-no question, which is not what we are aiming for in qualitative evaluation.

What do you think about your social worker’s job performance on your case?

Here, we can imagine that the client being interviewed might feel a lot of pressure to say something positive, depending on who was doing the interviewing. Also, the term “job performance” might encompass many items and could perhaps be broken down into more specific items such as “provision of therapy,” “explanation of services,” and the like.

After receiving services at this agency, do you engage in any criminal activities?

Asking people about what are often taboo behaviors is always a challenge when done in person. A question like this, if really necessary, might be better asked through a paper or computer survey.

When was the last time you had serious mental health trouble?

This question leaves a lot to the imagination. What does ‘mental health’ mean to a client? What does ‘trouble’ mean, and could it mean various things across a spectrum? But beyond the general terms used in this question, the answer would likely be very short. A client might answer ‘last week’ or ‘last year.’ A qualitative interview is best used to elicit broader answers. This question might be revised to ask, “Describe the last time you faced challenges as a result of your personal mental health problems,” for example.

Once you have designed and tested your qualitative interview questions, you will begin the data collection process. Interviews should be conducted in a quiet place that is comfortable for your client. The interview should be audio- or video-recorded, depending on what best fits your client population’s comfort. The interviewer will manage the recording and may also wish to take a few notes by hand on the interview guide.

After collecting data, you will need to analyze your results. There are many writings on how to approach qualitative data analysis. However, some of the simplest points of guidance can be summed up as follows.

First, get familiar with your data. Listen to the recordings. Read the transcripts several times before starting the formal data analysis process. Second, whereas quantitative data analysis is about counting, qualitative data analysis is not. Rather, it is about identifying common themes within and across interviews.

Third, as you start the formal analysis process, create initial ‘codes’ about what you see in each part of the video or transcript. Later, you can review those codes and revise them into larger themes. You can identify codes and themes for individual questions or across the interview data. Your process must be replicable in order to be of high quality.
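To make this codes-to-themes progression concrete, here is a minimal, hypothetical sketch in Python. The participants, excerpts, codes, and themes are invented for illustration; in practice, evaluators often do this work with qualitative analysis software or by hand, and the real input would be full transcripts.

```python
# Illustrative sketch: grouping coded interview excerpts into larger themes.
# All participants, excerpts, codes, and themes below are hypothetical.
from collections import defaultdict

# Step 1: initial codes assigned to transcript excerpts during close reading.
coded_excerpts = [
    ("P01", "The park feels safer now that the lights work.", "safety"),
    ("P02", "I finally met my neighbors at the clean-up day.", "social connection"),
    ("P03", "My kids play outside again.", "safety"),
    ("P04", "People stop and talk on the new benches.", "social connection"),
]

# Step 2: revise the initial codes into larger themes.
code_to_theme = {
    "safety": "Renewed sense of security",
    "social connection": "Strengthened neighborhood ties",
}

# Step 3: review themes within and across interviews.
themes = defaultdict(list)
for participant, excerpt, code in coded_excerpts:
    themes[code_to_theme[code]].append((participant, excerpt))

for theme, excerpts in themes.items():
    print(theme)
    for participant, excerpt in excerpts:
        print(f"  [{participant}] {excerpt}")
```

Whatever tools you use, the point is the documented, repeatable path from raw excerpts to codes to themes; that documentation also feeds the audit trail discussed below.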

In order to do a good job on your practice evaluation’s qualitative data analysis, you will need your results to be of as high a quality as possible. Social workers Cynthia Lietz and Luis Zayas discuss the four different aspects of quality that you will need to think about as you analyze and report upon your qualitative practice evaluation data (Lietz & Zayas, 2010; Lincoln & Guba, 1985). These include credibility, transferability, auditability and confirmability.

First, credibility refers to how much “a study’s findings represent the meanings of the research participants” during the interview process (Lietz & Zayas, 2010, 191). Key in this endeavor is the need for the evaluator to manage their personal reactivity and bias in analyzing the data. Evaluators may wish to see a certain result but need to be true to what the data present. Questions one might ask about an evaluation’s credibility might be “How was research reactivity and bias managed in the study?” or “What strategies were used to establish credibility?” (Lietz & Zayas, 2010, 199).

Second, transferability is about whether “the findings are applicable or useful to theory, practice or future research” (Lietz & Zayas, 2010, 195). At first glance, this may seem akin to the concept of generalizability that we reviewed in previous chapters. However, qualitative evaluations are not designed to have external validity. Rather, particular aspects of findings may be applicable to other similar settings. A question one might ask about an evaluation’s transferability could be “Are the findings applicable or useful for your population, setting or area of practice?” (Lietz & Zayas, 2010, 199).

Third, auditability refers to “the degree to which research procedures are documented allowing someone outside the project to follow and critique the research process” (Lietz & Zayas, 2010, 195). A question one might ask about an evaluation’s auditability could be “Was there evidence of an audit trail and/or peer consultation on the project?” (Lietz & Zayas, 2010, 199).

Fourth, confirmability addresses the “ability of others to confirm or corroborate the findings” (Lietz & Zayas, 2010, 197). To demonstrate confirmability, a study should show that the findings and the data are clearly connected. A question one might ask about an evaluation’s confirmability would be “How did the researchers corroborate their conclusions?” (Lietz & Zayas, 2010).

Taken together, these items lead us to think about the ‘trustworthiness’ of an evaluation study that uses qualitative interviewing and data analysis techniques. Lietz and Zayas (2010) present specific strategies for increasing trustworthiness, shown in Table 7.1. Including these strategies in your evaluation’s methods is what an outside reader would look for when asking, “To what degree do you find the research procedures increased the trustworthiness of the findings?” (Lietz & Zayas, 2010, 199).

Table 7.1. Strategies for increasing trustworthiness

  • Reflexivity: Thoughtful consideration of how an evaluator’s standpoint can influence the evaluation’s data collection and analysis
  • Observer triangulation: Using more than one analyst to analyze data
  • Data triangulation: Collecting data from multiple sources (e.g., interviews, focus groups)
  • Prolonged engagement: Spending extended time with participants to get an exhaustive look at their experience
  • Member checking: Including participants in analysis or returning to a small group of them to corroborate findings
  • Thick description: Thorough representation of the phenomenon and its context as perceived by participants
  • Audit trail: Maintaining a detailed written account of all research procedures utilized
  • Peer debriefing: Meeting with mentors or colleagues to discuss analytical research decisions
  • Negative case analysis: Seeking contrasting evidence through sampling and analysis

Sources: Lietz & Zayas (2010); Lincoln & Guba (1985); Padgett (2008); Shenton (2004)

Thinking about the quality of your evaluation links back to the fact that social workers want to engage in practice that is informed by the best available evidence. As a practice evaluator, you want to create the best available evidence about your intervention!

Now that you have been introduced to best practices in qualitative interviewing, let’s consider how a social work practitioner would prepare for and conduct a practice evaluation project that used this technique.

  • Always explain to your client(s) that you will be evaluating your practice during, and possibly after, the time you will be working together in order to determine whether the intervention is effective.
  • Explain that practice evaluation data from qualitative interviews will be stored in the lead evaluator’s filing system, kept confidential, and not reported outside of the agency, though they may be used in supervision.
  • Commit to sharing your practice evaluation results with your client(s) as part of the member checking process and for accountability.
  • Share your final report with your supervisor and your client in order to consider the intervention process more deeply as a reflexive and reflective practitioner.

In summary, practice evaluation occasionally involves the use of qualitative interviewing. This approach moves beyond the laser-like focus of process and outcome measurement towards understanding clients’ lived experience of the intervention. Key to the success of qualitative evaluations is the careful crafting of interview questions, for which there is a set of guidelines. In order to produce quality evaluation results, social workers must focus on building the trustworthiness of their work through fostering credibility, transferability, auditability and confirmability.

Discussion questions for chapter 7

  • What types of questions would be most appropriate for a qualitative practice evaluation in your current work or field placement setting?
  • When would you want to choose qualitative methods for your practice evaluation over quantitative methods?
  • In what situation would a mixed-methods practice evaluation make sense in your current setting?
  • Describe the ways in which your qualitative data analysis can be of the highest quality.

References for chapter 7

Creswell, J. (1998). Qualitative inquiry and research design: Choosing among five traditions. Thousand Oaks, CA: Sage Publications.

Lietz, C., & Zayas, L. (2010). Evaluating qualitative research for social work practitioners. Advances in Social Work, 11(2), 189-202.

Lincoln, Y., & Guba, E. (1985). Naturalistic inquiry. Beverly Hills, CA: Sage Publications.

Padgett, D. (2008). Qualitative methods in social work research. Los Angeles, CA: Sage Publications.

Padgett, D. (1998). Does the glove really fit? Qualitative research and social work practice. Social Work, 43(4), 373-381.

Shenton, A. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22, 63-75.

Targum, S. (2011). The distinction between clinical and research interviews in psychiatry. Innovations in Clinical Neuroscience, 8(3), 40-44.