Full Transcript


Research Designs & Methods
Prof. dr. Bart Cambré, Antwerp Management School

Leg 3: Operationalization
(Also check the short video "Making concepts measurable".)

LEG1: research strategy
RESEARCH QUESTION: To what extent does the level of organizational decentralization impact the job satisfaction of employees, moderated by company size?

Conceptualization
› Giving concepts theoretical meaning
› Based on theory and literature

LEG2: conceptual model
KEY CONCEPTS
› Level of organizational decentralization = definition
› Job satisfaction of employees = definition
› Company size = definition

LEG3: operationalization (from the abstract level to the empirical level)
The operationalization table has four columns, filled in stepwise on the slides:
› CONCEPTS: the concepts taken from the 'key concepts'
› DIMENSIONS: homogeneous or heterogeneous?
› INDICATORS: making the concepts measurable; single or multiple?
› MEASURES: coding, calculation of scores

Decentralization
› Dimensions: 1. vertical; 2. horizontal
› Indicators: 1. number of hierarchical levels; 2.1. number of units; 2.2. parallel or sequential structure
› Measures: 1. number of levels, including top and floor; 2.1. number of units outside production; 2.2. true/false statements

Job satisfaction
› Dimensions: 1. with content; 2. with conditions; 3. with circumstances; 4. with support
› Indicators: Likert scale (15 items; reference to the literature or to the location of the scale)
› Measures: 5-point scale, completely agree to completely disagree (see the scoring sketch below)

Size
› Dimension: size
› Indicator: number of employees
› Measure: in FTE
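To make the measures column concrete, here is a minimal Python sketch of the kind of coding and score calculation it implies: fifteen 5-point Likert items are combined into dimension scores and an overall job-satisfaction score. The item wording, the assignment of items to the four dimensions and the reverse-coded items are invented for illustration; they are not the actual scale referred to in the slides.

```python
# Minimal scoring sketch for a hypothetical 15-item, 5-point Likert scale.
# Item names, the item-to-dimension mapping and the reverse-coded items are
# illustrative assumptions, not the instrument referenced in the lecture.

# One respondent's raw answers, coded 1 (completely disagree) to 5 (completely agree)
answers = {f"item_{i:02d}": score for i, score in enumerate(
    [4, 5, 3, 4, 2, 5, 4, 3, 4, 4, 2, 5, 3, 4, 5], start=1)}

# Hypothetical assignment of the 15 items to the four dimensions
dimensions = {
    "content":       ["item_01", "item_02", "item_03", "item_04"],
    "conditions":    ["item_05", "item_06", "item_07", "item_08"],
    "circumstances": ["item_09", "item_10", "item_11"],
    "support":       ["item_12", "item_13", "item_14", "item_15"],
}

# Negatively worded items are reverse-coded before scoring (assumption)
reverse_coded = {"item_05", "item_11"}

def coded(item: str) -> int:
    """Return the item score, flipping the 1-5 scale for reverse-coded items."""
    raw = answers[item]
    return 6 - raw if item in reverse_coded else raw

# Dimension scores are the mean of their items; the overall score averages all items
dimension_scores = {dim: sum(coded(i) for i in items) / len(items)
                    for dim, items in dimensions.items()}
overall = sum(coded(i) for i in answers) / len(answers)

print(dimension_scores)
print(f"overall job satisfaction: {overall:.2f}")
```

In a real study the item-to-dimension mapping would come from the published scale, and the reliability of each dimension (e.g. Cronbach's alpha) would be checked before averaging.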
Measurement error
› Measurement invalidity: the measure does not capture the concept (systematic error/bias)
› Measurement unreliability: inconsistency under repeated uses (random measurement error)

NOIR scale (nominal, ordinal, interval, ratio)

For more qualitative research the operationalization can stay more open (LEG3, abstract level / empirical level):
› Concepts and dimensions: 'open', new concepts are possible during the research
› Indicators: topics to use in e.g. in-depth interviews
› Measures: can be left out

The process of operationalization is difficult, but extremely important!

Leg 4: Research Design and Methods

Research Designs and Methods
› A research design provides a framework for the collection and analysis of data. The choice of research design reflects decisions about the priorities given to the dimensions of the research process.
› A research method is simply a procedure for collecting data. The choice of research method reflects decisions about the type of instruments or techniques to be used.

Research Quality
› Bamberger (2017, p. 237): "after all, no matter how interesting a phenomenon may be, until it can be accurately and reliably measured, our ability as scholars to understand such phenomena, explain their origins and demonstrate their implications for management is extremely limited."
› Bagozzi et al. (1991, p. 421) remind us: "To bring rigor in research, it is therefore essential for the researcher to first establish an evidence of construct validity before testing the theory."

Quality indicators in quantitative business research
› Reliability: are measures consistent?
› Replication/replicability: is the study repeatable?
› Validity: are conclusions well founded?
  · Measurement (or construct) validity: do measures reflect concepts?
  · Internal validity: are causal relations between variables real?
  · External validity: can results be generalized beyond the research setting?
  · Ecological validity: are findings applicable to everyday, natural settings?

Quality indicators in qualitative business research
› Credibility (~internal validity): how believable are the findings?
› Transferability (~external validity): do the findings apply to other contexts?
› Dependability (~reliability): are the findings likely to apply at other times?
› Confirmability (~objectivity): has the investigator allowed his or her values to intrude to a high degree?
› Ecological validity: relates to the naturalness of the research approach
And above all: relevance.

Research Designs: six major designs
1. Experiment
2. Cross-sectional design
3. Longitudinal design
4. Case design
5. Comparative design
6. Design science
The choice follows from the research question and from practical decisions: finance, law, ethics.

Experimental design (measurement at T0 and T1)
› Control group, no treatment: measurement of the dependent variable at T0 and again at T1
› Experimental group, treatment: measurement of the dependent variable at T0 and again at T1; it is necessary to manipulate the independent variable
› Field experiments versus laboratory experiments
› Effect + time = causality (see the numerical sketch below); not all designs are useful for causal relationships
› Quasi-experiment: a control group is compared to a treatment group, but there is no random assignment

Quantitative and qualitative use of the experiment
› Quantitative, typical form: most researchers using an experimental design employ quantitative comparisons between experimental and control groups with regard to the dependent variable.
› Qualitative, no typical form: the Hawthorne experiments provide an example of an experimental research design that gradually moved away from the 'test room method' towards the use of more qualitative methods.
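As a purely numerical illustration of the T0/T1 logic above (all scores invented), the sketch below compares the change in the experimental group with the change in the control group, so that whatever happens merely because time passes, or because people know they are being studied, is subtracted out. A real analysis would add random assignment checks and a significance test; this only shows the comparison logic of the design.

```python
# Illustrative sketch of the pretest/post-test control-group comparison.
# The numbers are invented; they are not data from the lecture.

def mean(values):
    return sum(values) / len(values)

# Hypothetical measurements of the dependent variable (e.g. a productivity score)
control_t0      = [52, 49, 51, 50, 48]
control_t1      = [53, 50, 52, 51, 49]   # no treatment between T0 and T1
experimental_t0 = [50, 51, 49, 52, 48]
experimental_t1 = [58, 60, 57, 61, 55]   # treatment applied between T0 and T1

change_control      = mean(control_t1) - mean(control_t0)
change_experimental = mean(experimental_t1) - mean(experimental_t0)

# The estimated treatment effect is the change in the treated group minus the
# change that occurred anyway (time, Hawthorne-like effects) in the control group.
treatment_effect = change_experimental - change_control

print(f"control change:      {change_control:+.1f}")
print(f"experimental change: {change_experimental:+.1f}")
print(f"estimated effect:    {treatment_effect:+.1f}")
```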
Elaboration: the Hawthorne effect
› The effect of the experimenter, or of the fact of being studied
› Research at the Hawthorne works of the Western Electric Company (USA, 1920s-1930s) (Roethlisberger and Dickson, 1939)
› The aim of the study was to discover how changes in the number and duration of rest pauses, in the length of the working day, and in heating and lighting affected productivity
› Result: productivity increased, irrespective of the changes that were being introduced
› Conclusion: 'human relations', psycho-social support in the workplace
› Experimenter effect: the researcher's activities will have an influence on the research setting

Cross-sectional design
› Data collection on more than one unit (usually many more: Obs1 ... Obsn), at a single point in time (T1)
› To collect quantitative data on two (or more) variables and to check for patterns
› Typical methods: survey design, structured observation, content analysis, official statistics, diary research, ...

Longitudinal design
› The same units (Obs1 ... Obsn) are measured at T1 and again at later points up to Tn
› Time consuming and high cost, but it supports claims about causality
› Panel research: (random) sampling of units followed over time
› Cohort research: units that share certain characteristics (e.g. year of birth)

Case study design
› Detailed and in-depth analysis of one case, to highlight complexity and specificity
› What can be a case? An organization; a single location (production site, office building, ...); a person (life history/biography); a single event (a merger, a disaster, ...)
› Validity: the problem of generalization (external validity)
  · A case is not a sample of 1
  · Therefore no aim at statistical generalizability, though theoretical generalization remains possible
  · The aim is to generate concrete, context-dependent knowledge
› Methods: often mixed methods

Comparative design
› Two (or more) cases studied with (more or less) identical methods
› Understand social phenomena through comparison
› Cross-cultural, cross-national (e.g. Hofstede (1984) on organizational culture)
› Comparability requires theoretical reflection
› Hybrid forms:
  · Quantitative: e.g. an extension of the cross-sectional design
  · Qualitative: e.g. an extension of the case study design

Design science
› Learning by doing ... and checking (Lewin, 1946)
› Study the problem systematically and ensure the intervention is informed by theoretical considerations
› Special characteristics: involve the problem stakeholders; a real-world situation to solve a real problem; no attempt to remain objective
› Cycle: diagnose, plan, act, observe, reflect
› Holistic approach: document analysis, observation, interview, survey, case, search conference
› Ethical issues: involve all? Open? Collective? Equal?

Examples of design science (slides added by the lecturer himself)
› Management information systems (Hevner et al., 2004; Peffers et al., 2007): requirements of an information system; process charts, flow charts, data models; the 'to be' should be better than the 'as is'
› Artefacts: organizational structure; organizational change, workplace innovation

Poll: What major design do you prefer for your research (first idea)?
1. Experiment
2. Cross-sectional
3. Longitudinal
4. Case
5. Comparative
6. Design science

Unit of analysis & unit of observation
› Unit of analysis: what is the primary unit of analysis? Individuals, groups (e.g. a team), organizations, societies, or multiple levels. Make it explicit!
› Unit of observation: what is the primary unit of measurement? It can be the same as the unit of analysis, or it can be different.
› Example: unit of analysis = a network of networks; units of observation = network coordinators, meeting minutes.
› Make it explicit! Otherwise a potential bias is created.
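The distinction between unit of observation and unit of analysis can be made concrete in a few lines of code (organization names and scores are hypothetical): responses are collected from individual employees, but the analysis is done at the level of the organization, which is exactly where bias can creep in if the two units are not made explicit.

```python
# Minimal sketch of unit of observation vs. unit of analysis (hypothetical data).
from collections import defaultdict

# (organization, employee_id, job_satisfaction_score): the unit of observation
observations = [
    ("OrgA", "e1", 4.2), ("OrgA", "e2", 3.8), ("OrgA", "e3", 4.0),
    ("OrgB", "e4", 2.9), ("OrgB", "e5", 3.1),
]

by_org = defaultdict(list)
for org, _employee, score in observations:
    by_org[org].append(score)

# Unit of analysis: the organization, described by aggregated employee data.
org_level = {org: sum(scores) / len(scores) for org, scores in by_org.items()}
print(org_level)   # {'OrgA': 4.0, 'OrgB': 3.0}
```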
Data & Sampling
› Data: what data? Why these data? Access? Primary or secondary data? Database issues? Missing data? Cleaning data? A small versus a large amount of data? Boundaries and frame?
› Sampling: random versus non-random (purposive)? Boundaries and frame? Size? Period? Interval? (See the sketch below.)
› Methods: what data collection methods? In what sequence (if applicable)? Availability of software? How will you do the analysis? Do you have the appropriate knowledge (e.g. statistical skills)?
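The random-versus-purposive distinction in the sampling bullets above can be sketched as follows (the sampling frame and the selection criterion are invented): both approaches draw twenty units from the same frame, but only the random draw supports statistical generalization to that frame, while the purposive draw deliberately picks information-rich cases.

```python
# Illustrative sketch of random vs. purposive sampling from a sampling frame.
import random

# Sampling frame: the bounded list of units we could select from (invented firms)
frame = [{"firm": f"firm_{i:03d}", "size_fte": random.randint(5, 500)}
         for i in range(1, 201)]

# Random (probability) sample: every unit has a known, equal chance of selection
random_sample = random.sample(frame, k=20)

# Purposive (non-random) sample: units chosen deliberately against a criterion,
# here simply the 20 largest firms by FTE
purposive_sample = sorted(frame, key=lambda f: f["size_fte"], reverse=True)[:20]

print(len(random_sample), len(purposive_sample))
```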
Linking qualitative and quantitative research
› Most research questions can be answered by multiple designs, both quantitative and qualitative
› Sequential and parallel combinations
› Triangulation: in strategy, in design, in method (mixed methods research)
› Becoming more important

Poll: Do you consider triangulation for your research?
1. No
2. Yes, in strategy
3. Yes, in design
4. Yes, in methods

Turtle Tail: ethical issues
› Gaining access to the field: gatekeepers; personal position (contacts); win-win?
› Use of appropriate language
› Open versus closed research
› Privacy and anonymity
› The possibility not to participate
› Effects of the research for participants and researcher
› Behavior and objectivity of the researcher

Template

INFO
› AUTHOR(S): name(s)
› WORKING TITLE: title
› FIELD OF RESEARCH: define the subfield

HEAD: research orientation
1. Ontology: objectivism, constructionism?
2. Epistemology: positivism, post-positivism, interpretivism?

RESEARCH STRATEGY
3. More deductive or inductive research?
4. Qualitative and/or quantitative research?

PROBLEM STATEMENT & RESEARCH QUESTION
› What is the problem? What is the overall research question? Are there (related) sub-questions?

BODY OF KNOWLEDGE
› KEY PAPERS: what are the top articles (gurus; high-impact journal articles; high citations)?

LEG1: 5C framework
› COMMON GROUND: what is already known? What is the 'common ground'?
› COMPLICATION: what is the managerial 'puzzle'? What is currently missing in this common ground?
› CONCERN: why should we be concerned about this puzzle? Why is it relevant?
› COURSE OF ACTION: what is your main action? Will it be theory testing or theory developing? Will it be an application, a replication, an evaluation, a test, a combination, etc.?
› CONTRIBUTION: what is the contribution to theory and to the conceptual understanding of the phenomenon? Is there a methodological novelty (in data, tools, analysis)? What is the contribution to managerial practice? Is there societal impact?

LEG2: conceptual model
› KEY CONCEPTS: is there enough focus? What are the definitions of the key concepts? How many concepts are there?
› RELATIONS & CAUSALITY: (if applicable) what are the dependent and independent variables? What are the other variables and relations? Is the causal direction clear? Include a visual 'conceptual model'.
› HYPOTHESES: what are the central hypotheses (for deductive research)?

LEG3: operationalization
› OPERATIONALIZATION: concepts; dimensions (homogeneous or heterogeneous?); indicators (making concepts measurable; single or multiple?); measures (coding, calculation of scores)

LEG4: design & methods
› DEFINING THE CASE: what is a case? Where do you conduct your research?
› UNIT OF ANALYSIS & UNIT OF OBSERVATION: what is the unit of analysis? What is the unit where you collect data?
› DESIGN: major design? Combination of designs? Why is this the best possible design to answer your RQ?
› DATA: what data? Why these data? Access? Primary or secondary data? Database issues? Missing data?
› SAMPLING: random or purposive? Boundaries and frame? Size? Period? Interval? Representativeness?
› METHODS & TOOLS: what data collection methods? In what sequence (if applicable)? Availability of software? How will you do the analysis? Do you have the appropriate knowledge (e.g. statistical skills)?
› VALIDITY & RELIABILITY: meaningful data? Quality? What will you do to increase validity and reliability?

TAIL: reflective stuff
› ETHICS: are there ethical considerations in the research design, access to data, analysis, reporting of data?
› CHALLENGES: is collaboration needed? Risks? Time frame? Personal advantages? What will make it happen?

Thank you!
Bart Cambré, [email protected]
Antwerp Management School
Opening minds to impact the world
