Qualitative data analysis: A sourcebook of new methods

BOOK REVIEWS

methodological, theoretical, and ethical issues that accompany this innovation in evaluation. In essence, the collection addresses the most salient factors in this adaptation process. The text demands not “immersion” (to use Bank’s terminology), but careful study and attention to the issues and experiences discussed by these authors. Bank attempts a “personal reorganization” of the text and proceeds to evaluate the book within the confines of this new, but undefined, format. Bank asks and answers her own questions. These questions spring from “long conversations” with “a bona fide and well trained ethnographer,” as well as from Bank’s own personal search for “tips and techniques.” I was delighted to note that despite her distortion of the text’s structure, Bank found much material to answer her questions. She offers a “distilled” list of “insights, suggestions, examples, and aphorisms . . . for the novice in fieldwork” at the end of her discussion. I am indebted to her for culling these “how-to-do-it” tips from the text. Her efforts, however, will not speak to those scholars, students, and policymakers who have more than a novice’s interest or experience in fieldwork and evaluation. Bank would “like to see a more systematic comparison between the various perspectives in evaluation and ethnography,” an interesting topic, but not the focus of this book. I might recommend a new collection, which may be closer to the model she envisions: Educational Evaluation: Ethnography in Theory, Practice, and Politics, Fetterman, D. M. and M. A. Pitman (Beverly Hills, CA: Sage, 1986). This second collection presents the next stage in the evolution of this disciplinary endeavor and may anticipate her interests.

The collection in hand, however, presents the art and science of what ethnographers actually do in practice. A thorough review would have addressed the significance of using a Geertzian approach to discuss ethnography’s contribution to evaluation and noted the value of using a methods chapter as a lens through which to view the chapters. Moreover, the multilevel and multidimensional model used in one of the collection’s more significant national ethnographic evaluations, as well as the subject of programmatic and policy recommendations, would have been discussed. A critical reviewer would ask about the validity and reliability of the techniques used, the utility of the concept of culture on the program and evaluation project levels, the potentially contradictory nature of the term ethnographic evaluation, the conceptual crossroad of methodology and ethics, including “guilty knowledge and dirty hands,” the reflective nature of the endeavor for fieldworkers (across disciplines), and the issue of whether formalization negates ethnographicness. A technically informed reviewer would penetrate to more complex questions such as the role of cognitive theory bias in ethnographic evaluation today or the relative merits of continuous versus non-continuous fieldwork in one’s own culture. Bank’s discussion leaves one with the sense that the reviewer is not in touch with either the issues or the people involved in this enterprise. She left unasked both surface and substantial questions. The review provided insufficient insight into context or sensitivity to central issues. An informed book review, however, is itself a scholarly contribution that should be evaluated as thoroughly and responsibly as any other analytical endeavor.

Qualitative Data Analysis: A Sourcebook of New Methods, by Matthew B. Miles and A. Michael Huberman. Beverly Hills, CA: Sage, 1984. 263 pages. Reviewer: Thomas A. Schwandt

“You know what you display” (p. 22) is the theme adopted by Miles and Huberman in their educating and sometimes irritating discussion of analyzing qualitative data. “Beware of Greeks bearing gifts” is an adage I think applies equally well to this book, but more on that later. The five chapters which constitute the meat of the book are bounded by an interesting introduction, in which the authors explain their methodological point of view and the nature of the book, and by a very brief (2 pages) set of concluding remarks where they share some general pointers on data analysis. Within these boundaries Miles and Huberman discuss the results of their efforts of “casting about” for “manageable and straightforward methods for analyzing qualitative data.” They present 49 specific methods in a standardized format: analysis problem (the problem for which the method is a proposed solution), description, illustration (including how to build the display and enter and analyze the data), variations on the method, advice on use, and time required. One hundred charts and figures illustrate the methods. A field study of school improvement supplies the qualitative data used in the illustrations and discussions.

Chapter 2 begins by explaining the need for a prestructured research design before starting a field study. The authors briefly discuss the merits of “inductive,” “loosely designed,” “emergent” approaches versus “tight,” “prestructured” designs. They state that their “stance lies off center, toward the structured end” of the continuum of designs, and they build a case accordingly. Their method of focusing and bounding the collection of qualitative data is a four-stage process which includes building a conceptual framework, formulating research questions, sampling, and developing instrumentation. The conceptual framework is a graphic illustration of the “key factors or variables” to be studied. In this way the researcher labels probable “bins” of discrete events and behaviors and achieves “some clarity of thought about their interrelationships” (p. 28). Research questions arise from the relationships stipulated in the conceptual framework. Through an iterative process the researcher generates general research questions as a first step toward “operationalizing” the conceptual framework. These first two stages aid in focusing data collection. The third stage, sampling, actually sets boundaries for data gathering. Here, the researcher decides which settings, actors, events, and processes to sample in order to answer the research questions. Explicit sampling decisions are necessary to avoid the pitfalls of “indiscriminate, vacuum-cleanerlike collection of every datum; accumulation of far more information than there will be time to analyze; and detours into . . . blind alleys” (p. 37). Given a sampling frame, decisions must be made about instrumentation. Miles and Huberman briefly review arguments for and against the development of well-structured instruments prior to entering the field. They adopt an “it depends” stance based on choices the researcher makes between the following sets of parameters: exploratory versus confirmatory studies, single-site versus multiple-site studies, and site-specific versus cross-site studies. They maintain that in each pair of choices listed above, the former option typically calls for less front-end preparation of instrumentation than the latter. An excerpt from an interview guide is provided as an illustration of prior instrumentation. Throughout this discussion of their four-stage research planning process, the authors emphasize the need for revision and iteration.

Chapter 3 presents a set of 12 methods useful for analyzing data during the process of collection. The authors assume that the data to be analyzed (e.g., field notes) have already been converted to “write-ups” that “are intelligible to anyone, not just the fieldworker” (p. 50). The methods are presented in order of early to late in data collection and from simple to complex. They range from forms for recording a field contact and summarizing a document to procedures for preparing codes and doing coding. The authors discuss descriptive codes for summarizing data and “pattern codes” for identifying patterns, themes, or overarching constructs. Creating and revising codes and double-coding to achieve reliability are explained. This chapter also discusses the use of “reflexive” and “marginal” remarks in writing up field notes and conceptual memos for theorizing about variables and their relationships.

Finally, forms for a site-analysis meeting (a meeting of fieldworkers to summarize the current status of events at the site) and an interim site summary (a synthesis of what the researcher knows about the site) are illustrated and explained. The latter is described as “the first attempt to derive a coherent account of the site” (p. 75).

Chapters 4 and 5 (about half of the book) discuss methods for analyzing data from a single site and from multiple sites, respectively. All of the methods involve various strategies for displaying data in charts, tables, checklists, matrices, or figures. The authors define “display” as a “spatial format that presents information systematically to the user.” Data entered into displays may be “short blocks of texts, quotes, phrases, ratings, abbreviations, symbolic figures” derived from the coded field note write-ups. The authors claim that displays enhance the chances of drawing and verifying valid conclusions from qualitative data. They argue that displays are superior to narrative text as a tool of analysis. In their view, narrative text “is an extremely weak and cumbersome form of display,” and “it is hard on analysts” because it is “dispersed,” “sequential rather than simultaneous,” “usually only vaguely ordered, and it can get monotonous and overloading” (p. 79). According to the authors, narrative text is not only hard on analysts, but also cumbersome for readers of case studies. Taking their cue from data displays produced by statistical packages such as SPSS, the authors generate various set-ups for presenting qualitative data.

In Chapter 4, 19 methods are presented in order of simple to complex and from descriptive to explanatory. A context chart (somewhat akin to an organization chart) and a checklist matrix are simple devices used for description. Ordering or arranging data in some systematic way is made possible by time-ordered, role-ordered (roles of key players at the field site), and conceptually clustered matrices.
To facilitate explanation the analyst might prepare an effects or site-dynamics matrix, an event listing, or, ultimately, a causal network. Causal networks are defined as “a visual rendering of the most important independent and dependent variables in a field study and of the relationships between them” (p. 132).

Chapter 5 builds upon the previous chapter to show how displays can be constructed using data from multiple field sites. Eighteen methods for conducting cross-site synthesis are illustrated. Meta-matrices (called “monster dogs” by the authors) and various types of site-ordered, variable-ordered, and effect-ordered matrices are examined. The chapter concludes with a relatively lengthy (20 pages) discussion and illustration of causal modeling and cross-site causal networking of critical variables.

Chapter 6, only four pages long, summarizes key points for building displays, entering data into display cells, and analyzing that data. Chapter 7 discusses specific tactics for drawing and verifying conclusions from qualitative data displays. The authors first discuss 12 tactics for generating meaning including, among others, counting, noting patterns and themes, making metaphors, factoring, finding intervening variables, and building a logical chain of evidence. These are followed by 12 tactics for confirming conclusions, including checking for representativeness and researcher effects, ruling out rival hypotheses and spurious relations, triangulating, and getting feedback from informants. The chapter concludes with a brief discussion of documenting and auditing the procedures of a field study.

This book is likely to be well received by quantitative researchers who find themselves having to cope with qualitative data. The discussion of the four-stage research process will feel as comfortable as an old shoe, and the attempt to find analogies to computer-generated data displays that will work with qualitative data will be gratifying. Putting words and coded symbols rather than numbers into display cells won’t appear to be such an odd idea. Likewise, researchers and evaluators schooled in the positivist/experimentalist tradition will find the idiom of the book consoling: the language of their paradigm (“independent and dependent variables,” “intervening variables,” “replicating,” “outliers,” “rival explanations,” “sampling frame,” “representativeness,” etc.) pervades the book. The authors admit that “we think of ourselves as logical positivists who recognize and try to atone for the limitations of that approach” (p. 19). Clearly this is a book written about qualitative analysis from the perspective of the quantitative tradition. It should also be noted that this book promotes the pragmatic “primacy of method” solution to the debate between quantitative and qualitative research traditions.
It reflects a viewpoint which the authors previously elaborated (Miles & Huberman, 1984), namely, that the debate between different epistemological positions is best left to philosophers of science and to those methodologists with a philosophical bent. In this sourcebook the authors foster this point of view by arguing that “any method that works is grist for our mill regardless of its antecedents” (p. 17). The language and approach that pleases proponents of one tradition will no doubt peeve supporters and believers of another. The authors admit as much, although I believe they think of themselves as being a bit more ecumenical than they actually are. For example, one of the cardinal principles distinguishing the two traditions is the primacy of subject matter versus the primacy of method (Diesing, 1971). In the qualitative/ethnographic/naturalistic tradition, researchers are admonished to first be acted upon by the subject matter. Further, methods are to be chosen in view of their compatibility with the subject matter under investigation and not in view of their capacity to meet requirements of scientific investigation. Qualitative Data Analysis reveals the primacy of method approach on virtually every page. Despite these and similar errors which caution the reader to beware of Greeks bearing gifts, I believe that the sourcebook itself provides much grist for the qualitative researcher’s methodological mill. If qualitative researchers looking for better ways of displaying information can hold their displeasure with the book’s language and philosophy in abeyance long enough to inspect the suggestions that Miles and Huberman offer, I believe that they also will find much of value in this sourcebook. I found myself in the latter category, so I know whereof I speak.

These caveats notwithstanding, the book is not without its irritants, and I will devote the remainder of this review to discussing the major and minor annoyances. Under major irritants I would classify the following observations. First, Miles and Huberman frequently adopt a very patronizing tone toward practitioners of qualitative/naturalistic approaches and their methodologies. They refer to themselves as “savvy practitioners” (p. 48) and leave the reader with the implication that many who practice qualitative research are not. Consider the following two passages:

So, unlike some schools within social phenomenology, we consider it important to evolve a set of valid and verifiable methods for capturing these social relationships with their causes. (p. 20)

Our stance involves orderliness. There are many researchers who prefer intuitive, relaxed, nonobsessive voyages through their data, and we wish them well. But for us . . . thoroughness and explicitness are quite paramount. (p. 20)

In the passages above, I put the emphases where I felt it when reading. I found Miles and Huberman’s argument that we need a clearer understanding of the process of qualitative data analysis appealing.
But I think that the case for clarity is set back by this sort of us-versus-them tactic. Second, the authors tend to bear false witness against their colleagues. Consider the following passage:

Methodologically, our beef is with the somewhat magical approach to the analysis of qualitative data advocated on the grounds that such an approach is idiosyncratic, incommunicable and artistic and that only those who have been fully socialized and apprenticed in its practice can claim to comment upon it. (p. 20)

“Wizards and magicians we ain’t!” is the response this passage evoked from me. I resent being cast as an alchemist, and I know of no self-respecting qualitative researcher who would make a claim such as is presented here. A third major irritant of this sourcebook was the labeling scheme employed to identify illustrations.

Often I was bewildered by the scheme, searching for references in the text to Chart 13b or Box III.A.a. Minor irritations included a significant number of proofreading errors, poorly drawn charts, charts which weren’t typeset, and charts with print so small that it discouraged perusal. Often, I thought that if the chart in question was being proposed as a more palatable alternative to narrative displays, then the authors must be pulling my leg. Since displays were the theme of the book, one would expect that more attention would be paid to their display! Lastly, there were annoying errors in the reference list. For example, a reference to one of the authors’ previous publications was nowhere to be found in the bibliography.

Finally, I would recommend this sourcebook as an addition to educational research courses, with the caveat that it be an advanced course. Prior knowledge of epistemological traditions and research approaches is necessary to fully appreciate its contribution to the methodological literature.

REFERENCES

DIESING, P. (1971). Patterns of discovery in the social sciences. Chicago: Aldine.

MILES, M. B., & HUBERMAN, A. M. (1984). Drawing valid meaning from qualitative data: Toward a shared craft. Educational Researcher, 13, 20-30.
