Qualitative research methodology
Transcript of Qualitative research methodology
Why use qualitative methods?
- To describe
- To evaluate
- To test theory
- To probe (areas of future study)

Make "analytical" generalizations and not "statistical" ones - and be honest about that.

The process: Question - Design - Implementation - Analysis - Answer

Design: "construction and demarcation" - to reach the goal and secure validity!
1. Specification (what is the purpose?)
2. Variables of interest
3. Case selection and unit of analysis
4. Formulation of questions
5. Criteria for linking observations to propositions
The design establishes "construct validity". Simplified, it is the question of whether you will in fact study what you intend to study - and how you do it.

Sampling
- What is the sample frame?
- What is the sample unit?
- Purposive sampling?

Doing (implementation)
- Reviewing documents
- Participation
Already the implementation determines our "external validity" - i.e. is our study set up and performed so that it can answer what we intend it to?

Analyzing
- Cleaning and organizing data
- Comprehension
- Re-contextualization
Specify the analytical strategy from the start! For instance:
- Pattern matching
- Explanation building
- Time series analysis
- Etc.
The theoretical methods for analysis can be operationalised by:
- relating observations to theoretical propositions
- asking "how and why" questions of the data
- just making case descriptions...
"Internal validity" is how we make the link from the observations to the predictions we make.

Always be "rich" in case descriptions, so those wanting to transfer the results know the "frame".

There are many threats to external validity, for instance:
- We are not asking the right questions.
- We are not talking to the right persons.
- Respondents might try to give us what we want (or what they think we want).
- They might try to be nice.
- They might try to act strategically (to gain funding, prestige, power, ...).

We can improve the analysis by, for instance:
- letting respondents validate our findings
- letting experts validate our findings
- triangulation (using more methods)

Threats to validity
- Have we collected enough information to really say something about X?
- Did we talk to the right people?
- Did we do our homework, so that we asked the right questions of our "data"?

"Conclusion validity" is all about reliability.

Conclusions
One of the major problems with construct validity is mono-operation and mono-method bias - that we make too few observations and use too few "tools/ways" to collect them. If we can, we should push further and use more operations and more methods.

References
*Creswell (2003) "Research design: qualitative, quantitative and mixed method approaches"
*Guba and Lincoln (1989) "Fourth generation evaluation"
*Morse (1994) "Emerging from the data: the cognitive process of analysis in qualitative inquiry"
*Yin (many editions) "Case study research: design and methods"
*Trochim (online) "socialresearchmethods.net"