
5.2 Case Study Design

5.2.6 Interviews

To analyze each project's solution design and to derive the lessons learned, interviews were chosen as the main research instrument. They are a well-established tool for collecting qualitative data and are frequently used in case study research (see, for example, Yin, 2003; Merriam, 1988; Eisenhardt, 1989).

Interviews as a research tool are not new. As early as 1957, Kahn and Cannell (1957) described “The dynamics of interviewing; theory, technique, and cases.” The described techniques have, of course, evolved over time. Comprehensive guidelines are provided, for example, by US General Accounting Office (2006), Dick (2002), and McNamara (2009). Based on these guidelines, the following sections look at interview types, interview design, interviewee selection, and the conduct of the interviews.

5.2.6.1 Interview Types

McNamara (2009) describes four main interview types:

• Informal, conversational interview: An open interview without prepared questions. This allows the interviewer to adapt to the interviewee’s nature and priorities.

• General interview guide approach: This approach ensures that the same general areas are covered in each interview. It is a more focused approach but still with a relatively high degree of flexibility.

• Standardized, open-ended interview: A set of open-ended questions is prepared and used in all the interviews. Open-ended means that there is no set of predefined answers or options to choose from. According to McNamara (2009), “this approach facilitates faster interviews that can be more easily analyzed and compared.”

• Closed, fixed-response interview: All interviewees are asked the same questions and given a set of predefined answers to choose from.

For this case study, the standardized, open-ended interview style was chosen because it seems best suited for capturing feedback on the research subject—the design of IgniteWorx—without limiting the interviewees too much in sharing their personal experiences.

5.2.6.2 Interview Design

US General Accounting Office (2006) describes a process for the identification, development, and selection of interview questions, as well as a standard procedure for composing and formatting them. The report differentiates among descriptive, normative, and cause-and-effect questions. After the development of the broad overall questions, US General Accounting Office (2006) describes how they are translated into measurable elements in the form of hypotheses or questions. The composition of the appropriate questions must consider the relevance, the selection of the respondents, and the ease of response. Again, different types of questions can be used, including open and closed.

Finally, the organization of the questions is another main task in the interview design:

“Present the questions in a logical manner, keeping the flow of questions in chronological or reverse order, as appropriate. Avoid haphazardly jumping from one topic to another.”

In addition, the right layout of the questions must be considered.

Another important part of interview design relates to avoiding potential problems, including the selection of the right language (appropriateness, level, use of qualifying language, clarity, avoidance of double negatives, etc.) and the avoidance of bias within questions; see US General Accounting Office (2006) and McNamara (2009).

For this case study, the interview guide was designed to match the main artifacts of IgniteWorx, namely, the Ignite project assessment dimensions, the IgniteWorx result sets, and the matching rules. Figure 81 shows the three main areas of the interview guide.

Figure 81: Structure of interview guide

These three main interview areas are presented sequentially. The order was defined as follows:

1. Ignite Dimensions: This is the first part of the interview guide since this is also the starting point from an end-user’s point of view (he or she starts with the project assessment).

2. IgniteWorx Result Sets: This is the second part of the interview guide since this is the result that the end-user sees after doing the assessment.

3. Matching Rules: This is the third and last part because it requires some explanation of how the rules help in mapping the dimensions to the result sets. That would be difficult to explain without having introduced the result sets first.

Each interview area has the following parts:

• Feedback on the respective elements, e.g., concrete dimensions or result sets

• Key aspects from the perspective of the concrete project in the scope of the interview

• General observations

This is repeated for all elements of each area. At the end of each section, the interview guide asks if there are elements missing (e.g., missing dimensions, missing result sets).

For the Ignite dimensions, the five main areas are included. For the result sets, all ten are included. For the matching rules, two concrete examples are provided.

Each area is mapped onto a matrix, with the elements on one axis and project-specific/general observations on the other. Each cell in each of these matrices is an interview question.

The number of questions in the interview guide can be derived as follows:

Area                     Number of Questions   Total
Ignite Dimensions        5×2+2                 12
IgniteWorx Result Sets   10×2+2                22
IgniteWorx Rules         2×2                   4
Total                                          38

Table 27: Number of questions in the interview guide

Thirty-eight is a relatively high number of questions, given that the planned interview duration is between 90 and 120 minutes. However, since the second column of each matrix (the general observations) is optional, only 19 questions are mandatory (meaning that they should be answered if at all possible, which may not always be the case).
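The question counts in Table 27 can be reproduced with a short sketch. The split into "two questions per element plus two additional questions per area" is an interpretation of the formulas in the table (the two extras are assumed here to be the closing questions per area, e.g., on missing elements); the area names and element counts are taken from Table 27.

```python
# Per-area question count: each element yields two questions
# (project-specific feedback and general observations); the "+2"
# per area is assumed to cover the additional closing questions.
areas = {
    "Ignite Dimensions": (5, 2),        # 5 dimension areas, 2 extra questions
    "IgniteWorx Result Sets": (10, 2),  # 10 result sets, 2 extra questions
    "IgniteWorx Rules": (2, 0),         # 2 example rules, no extras
}

def question_count(elements: int, extras: int) -> int:
    """Two questions per element plus the area's extra questions."""
    return elements * 2 + extras

totals = {name: question_count(e, x) for name, (e, x) in areas.items()}
for name, total in totals.items():
    print(f"{name}: {total}")
print("Total:", sum(totals.values()))  # 38, matching Table 27
```

Counting the optional "general observations" column as skippable halves the per-element count, which yields the 19 mandatory questions mentioned above.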

5.2.6.3 Interviewee Selection

Interviewee selection is another key part of the interview research design, especially for interviews with a broad reach. However, US General Accounting Office (2006) also notes, “For some structured interviews, because there is only one person who fits the category of interviewee […], no selection process is needed.” In the case of this project, the most likely candidates for the interviews are the project managers or product managers, since they typically have the most significant insights and a broad overview of the entire project. In some cases (as in project B here), it can make sense to interview experts from both the IT and the OT sides.

Project                                   Interviewee                                        Reason for Selection
A: Automated Optical Inspection           Project Manager                                    Has deep insights into both product design and team structure and performance
B: Tightening Quality Assurance Systems   Interviewee B1: Product Manager Backend Solution   Represents the backend part of the solution with many years of experience
                                          Interviewee B2: Product Manager Tightening Tools   Represents the asset perspective in this project (in this case, asset means tightening tool)
C: Remote Maintenance                     Product Manager                                    Has the best overview of all project aspects
D: OTA in Automotive                      Solution Architect                                 Is involved in solution design and implementation aspects

Table 28: Interviewee selection for integrated case study

5.2.6.4 Conducting the Interview

McNamara (2009) provides the following advice for conducting the interview:

• Occasionally verify that the tape recorder (if used) is working.

• Ask one question at a time.

• Attempt to remain as neutral as possible.

• Encourage responses with occasional nods of the head, “uh-huh,” etc.

• Be careful about the appearance when note taking.

• Provide transitions between major topics.

• Do not lose control of the interview.

Most interview guidelines recommend recording and then transcribing the interview. In the experience of the author, this approach is very difficult to implement with the interviewees targeted here: project managers and product managers of large industrial firms deal with many different stakeholders and tasks and are constantly under time pressure. Getting them to do an interview and then also approve a lengthy transcript is extremely difficult. Moreover, they might not openly address more difficult topics if the interview is recorded.

To address these issues, a different approach was selected for the interviews here. During the interview, the interview guide is shared between the interviewer and the interviewee on a laptop or a projector. The answers to the questions are transcribed in real time so that approval can be given directly at the end of the interview. This should ideally also increase the validity of the interviews, since the approval is given instantly and there will be no discussions about details and semantics afterwards. Of course, there are also potential limitations, especially the fact that important details might not be recorded or that contextual information such as intonation is lost. However, for the purpose of this thesis, it is assumed that the advantages of this approach outweigh the disadvantages.

5.2.6.5 Interview Evaluation

US General Accounting Office (2006) states that “the answers to open-ended questions are often left unanalyzed. The evaluator or auditor in reporting may quote from one or a few selected responses, but open-ended interviews generally do not produce uniform data that can be compared, summed, or further analyzed to answer the evaluation or audit questions.”

Consequently, the interview evaluation in this case aims to gather as much circumstantial evidence as possible for the validation of the different elements of IgniteWorx, with all the limitations of this approach, as described, for example, in “The Nature of ‘Evidence’ in Qualitative Research Methods” (Miller and Fredericks, 2003), e.g., with respect to the application of the “hypothetico-deductive model” to qualitative research cases.

The interview evaluation is done on two levels in this case study:

• Individual project analysis: For each individual project analysis, the interview input is used for the general “lessons learned” part, as well as the “findings for IgniteWorx” part.

• Cross-study findings: Across all projects, the findings for IgniteWorx are synthesized and summarized.