1ZV60 Methodology of IE Research '24-'25 Lecture Notes
Summary
These lecture notes cover the methodology of IE research, focusing on topics such as the role of theory, philosophical foundations, qualitative and quantitative research methodologies, data analysis, research designs, and research proposals. The document includes sections for each week of the course.
1ZV60 - Methodology of IE research - Lecture notes

Contents
Week 1 (1.1): Role of theory in research; Philosophical foundations of research methods (objectivism, constructivism, positivism, interpretivism); The empirical cycle and the problem-solving cycle
Week 1 (1.4): Developing good research questions (What? Why? How, conceptually? How, practically?); Reviewing literature (systematic review, narrative/integrative review); Writing up research
Week 2 (2.1): When to use qualitative research; Qualitative study designs & sampling cases; Data collection; Interviewing
Week 3 (3.1): Coding; Building a model; Saturation; Memos (or analytic memos); Gioia Method for inductive research; Mistakes; Ensuring rigor, validity and reliability; Reporting
Week 4 (4.4): Research styles; Research philosophies; Induction & deduction; Hierarchy of scientific knowledge; What is a theory? What is a model? What is a variable? What is a hypothesis?; Project planning and research questions
Week 5 (5.1): Assignment 2; Literature review; Research designs
Week 5 (5.4): Causality; Experimental design; Field and lab experiments; Validity; Experimental design types; Self-completion questionnaires
Week 6 (6.4): Asking questions; Nature of quantitative research; Theoretical and empirical plane of research; Why should we measure?; Types of validity; Sampling; Secondary analysis
Week 7 (7.1): Empirical cycle; What is a business process?; Deduction and induction
Week 7 (7.3/7.4): Demarcating the problem; Research objectives and research questions
Week 8 (8.1/8.2): Research proposal (contractual perspective, conceptual design, technical design); Project organization and costs; Bridge between problem analysis and diagnostic phase; Step 2: Analysis and diagnostic phase; Step 3: Solution design; Step 4: Intervention; Step 5: Evaluation and learning

Week 1 (1.1)

Role of theory in research
- Empirical evidence is the basis for theory
  - Example: the Technology Acceptance Model (TAM)
- Key theory-related terminology
  - Domain: all instances to which the theory is expected to apply
    - TAM: all potential users of new computer technologies
  - Population: a subset of the domain where the theory can be developed and/or tested
    - TAM: e.g. all potential users of new computer technologies
  - Sample: the instances from the population that are studied
    - TAM: e.g. potential smartwatch users reached via email
  - Unit of analysis: the instance to which the theory applies
    - TAM: a potential user of a new computer technology
- Theoretical reasoning
  - Induction: given a cause and an effect, induce the rule
  - Deduction: given the rule and the cause, deduce the effect
  - Abduction (not often used): given the rule and the effect, abduce the cause

Philosophical foundations of research methods
- How can theories be supported?
1. Ontology: what kinds of objects exist in the "social world"?
   - Objectivism: phenomena and their meanings exist independently of actors
     - Implication: phenomena are objectively observable (e.g. organisational hierarchies, rules & regulations, procedures)
   - Constructivism: phenomena and their meanings are constantly accomplished by actors
     - Implications: phenomena are produced through interaction and constantly revised; researchers' accounts of the world are constructions; knowledge is subject to change
2. Epistemology: what can be considered "acceptable knowledge"?
   - Positivism: apply a "natural science approach" to explain social phenomena
     - Phenomena can be observed and/or measured objectively
     - Theory provides generally applicable "laws" that can be tested deductively
     - New knowledge is derived from induction based on facts
   - Interpretivism: attempt to understand social phenomena
     - Phenomena have subjective meaning (both to researchers and social actors)
     - Researchers attempt to understand this meaning for the actors they study
     - In doing so, the researcher's interpretation is influenced by their own experience
     - Being objective and value-free is neither possible nor desirable
- Implications for methods:
  - Appropriate research methods depend on ontological and epistemological foundations
  - Awareness of these topics helps you understand research that you encounter

The empirical cycle and the problem-solving cycle
- Empirical cycle
  - Used for generic business problems that many businesses encounter
  - Relationships between variables
  - Develops and empirically tests theories; aim: generalisable insights, usable across contexts
  - Observation → induction → deduction → testing → evaluation → observation → …
  - Provides relevance for theory development
- Problem-solving cycle
  - Used for performance-related business issues (context-specific)
  - A business process that does not meet a defined performance level (e.g. costs, quality, timeliness)
  - Problem definition → analysis and diagnosis → solution design → intervention → evaluation and learning → problem definition → …
  - Provides theoretical foundations for a solution

Week 1 (1.4)

Developing good research questions
- What?
  - What puzzles/intrigues me?
  - What do I want to know more about/understand better?
  - What are my key research questions?
- Why?
  - Why will this be of enough interest to others to be published?
  - Can the research be justified as a 'contribution to knowledge'?
- How, conceptually?
  - What models, concepts and theories can I draw on to answer my RQs?
  - How can these be brought together into a basic conceptual framework to guide my investigation?
- How, practically?
  - What investigative styles and techniques shall I use to apply my conceptual framework?
  - How shall I gain and maintain access to information sources?

Reviewing literature
- Keyword search
  - Web of Science, Scopus, Google Scholar
  - Keep track of your keywords!
- Reading relevant journals
  - Differs per field & RQ
  - Look for journals that frequently appear in the results of your keyword search
- Following citations
  - Backward citations: look at what gets cited in sources
  - Forward citations (!): use Web of Science or Google Scholar
- Assessing quality of journal articles
  - Avoid predatory journals
  - Is the journal reputable? ISI Journal Citation Reports, FT50 list, ERIM list
  - Has the article been cited a lot? (not necessarily indicative)
  - Use your own judgement!
- Other literature
  - Books, professional journals, working papers, press articles, conference papers, websites
- Writing up your review
  - Systematic review: positivist; quantity focus; theory informs the search; synthesis is the research goal
  - Narrative/integrative review: interpretivist; quality focus; theory emerges from the search; identifying the discourse is the research goal
  - Alternative structures: 1. concept-centric 2. author-centric 3. paper-centric

Writing up research
- Follow the requirements and instructions
- Start early
- Structure your writing
- Be persuasive
- Avoid discriminatory language
- Make use of your resources and frequently ask for feedback

Week 2 (2.1)

When to use qualitative research
- Types of qualitative research
  - Narrative research
  - Phenomenological research
  - Grounded theory
  - Ethnography
  - Case studies
- Case study
  - Aim: develop an in-depth analysis of the case(s); theory building (or testing)
  - Positivist & interpretivist approaches exist
- Grounded theory
  - Aim: discover a theory from the data, based on the analysis
  - Commonly used methods suit the positivist paradigm
- Similar approaches to case selection & data collection (but beware of different vocabularies); different approaches to data analysis

Qualitative study designs & sampling cases
- Unit of analysis
  - Should follow logically from your research question
  - Examples: company, project, team, individual, process, product
- Single- and multiple-case designs, each either holistic (single unit of analysis) or embedded (multiple units)
  - Grounded-theory approaches often focus on a single case
- Rationales for single-case designs
  - Critical case
  - Extreme/unique case
  - Representative/typical case
  - Revelatory case
  - Longitudinal case
- Multiple-case design
  - Enables replication
  - Literal replication: when you expect to find similar things
  - Theoretical replication: sample cases where you expect different things
- Sampling cases
  1. Selecting case(s) for your study
  2. Selecting data sources within your case(s)

Data collection
- Maximise diversity in the sample; try to capture all perspectives on a topic
- Use purposive sampling
  - Start with a small sample
  - Perform initial analysis
  - Decide on additional data sources based on the initial analysis
  - Define a rule about when to stop
- Principles for validity & reliability
  - Are the conclusions correct?
  - Saturation:
    - Make a best attempt at covering the diversity of the population
    - Establish and document rules of thumb
  - Research design protocol
    - Record all procedures & the reasoning behind them in one document
    - Update the protocol regularly
    - Aim: readers should be able to perform the same study

Interviewing
- Before
  - Type of interview: unstructured, semi-structured, structured
  - Preparing the interview
    - Avoid jargon; phrase things understandably
    - Avoid leading questions
    - Consider what you already know
    - Logical question order
    - Prioritize questions and have back-up questions
  - Preparing practical aspects
    - Choose a setting with little distraction
    - Explain the purpose of the interview
    - Address terms of confidentiality
    - Explain the format and duration of the interview
    - Ask if they have questions and provide contact details
    - Arrange a recording device; ask whether you are allowed to record
    - Bring print-outs of the interview guidelines
- During
  - State the research's purpose and explain what will be done with the information from the interview
  - Inform the interviewee that they can stop at any time or skip any question
  - Make clear agreements about confidentiality and anonymity
  - Get explicit permission to record the interview; start recording only after permission has been granted
- After
  - Take notes as soon as possible after the interview
  - Transcribe the interview
  - Validate responses

Week 3 (3.1)

Qualitative data analysis
- Challenging & rewarding:
  - Fewer guidelines, so more uncertainty but more room for agency & creativity
  - Software is less helpful for analysis, but can still be used for organizing
  - Analysis must be done by hand
- Quantitative vs qualitative research:
  - Numbers vs words
  - Point of view of the researcher vs point of view of the participants
  - Researcher is distant vs researcher is close
  - Theory testing vs theory emergent
  - Static vs process
  - Structured vs unstructured
  - Generalisation vs contextual understanding
  - Hard, reliable data vs rich, deep data
  - Macro vs micro
  - Behavior vs meanings
  - Artificial settings vs natural settings
- Key question in qualitative data analysis: how to make sense of large amounts of qualitative information

Coding
- Codes tag/label the data, assigning common meaning to units of data
  - Meaning: what is happening, what is someone (not) doing, what is this an example of?
- Inductive coding (grounded theory approach) vs deductive coding (thematic analysis, template approach)
  - Focus: data-driven vs theory-driven
  - Interpretation: searching for the 'insiders' perspective' vs applying a theoretical perspective
  - Coding: start with 'open coding' and let 'local theory' emerge vs coding using a fixed, pre-defined coding scheme
  - Relations: explanation building, uncovering concepts and the relationships between them vs pattern matching
- Deductive coding:
  - Uses predefined codes instead of open coding
  - No axial and selective coding
  - Other data analysis approaches (e.g. counting occurrences of codes) may be used
- When is something useful/relevant to code?
  - When it is related to the research question
  - When it is mentioned frequently
  - When it surprises you
  - When the interviewee mentions that it is important
  - When you have read something similar in the literature
  - When it reminds you of a theory or concept
- Three stages of coding (ascending levels of abstraction)
  1. Open coding (yielding concepts)
     a. Breaking down, examining, comparing, conceptualizing, and categorizing data: this yields concepts
     b. Their value depends on usefulness and frequency of occurrence
  2. Axial coding (connections between categories)
     a. Move from a large number of open codes to more manageable categories that give insight into the phenomenon of interest
     b. Data are put back together in new ways after open coding, by making connections between categories
  3. Selective coding (core categories and relationships)
     a. Discover core categories
     b. Identify causal links between categories
     c. Develop the story line

Building a model
- Conceptually summarize how your subcategories make up your core category
- Develop a model or explanation of the phenomena under study
- Coding scheme or codebook
  - Includes: code, description & definition, illustrative quotes, rules & guidelines
  - Aim for codes that are mutually exclusive and exhaustive
- Coding process and constant comparison

Saturation
- Data collection (primary): collecting data until new data are no longer illuminating for the phenomenon under study
- Data analysis (theoretical saturation): coding until there is no further point in reviewing your data to see how well they fit with your concepts or categories
- Data collection, follow-up (theoretical saturation): collecting data until new data are no longer illuminating the concepts or categories

Memos (or analytic memos)
- Useful and powerful tool for capturing thoughts that occur throughout data collection, analysis and coding, the sensemaking process and conclusion drawing
- Not just descriptive: attempt to synthesize data into higher-level analytical meanings
- Keep them in a separate document from field notes, transcripts or other data
- Assign a clear title and date, for reference to the analytical history and progress of your study

Gioia Method for inductive research
1. First-order analysis
   a. Using informant-centric, in-vivo terms (exact words and phrases of participants) as codes
   b. Somewhat similar to open coding
2. Second-order analysis
   a. Using researcher-centric concepts, themes, and dimensions
   b. Somewhat similar to axial coding and selective coding
3. Data structure
   a. Illustrates the progression from raw data to themes
   b. Demonstrates rigor in the analysis process
4. Build theory, (process) model
   a. Specifying temporal connections between theoretical concepts or themes
   b. Building up a grounded theory, process model or propositions
   c. Example process model: an emergent model of identity change, revolving around a collective state of identity ambiguity, that provides insight into processes whereby organizational identity change can occur

Mistakes
- Lack of focus, overcoding
- Uncalled-for quantitative sensemaking

Ensuring rigor, validity and reliability
- Stay close to the data; establish a chain of evidence from raw data to more abstract concepts and themes
- Use a codebook (coding scheme)
  - Whenever you create a new code, write down a definition and instructions for when to use the code
  - Whenever you use a code, refer to the codebook
- Use memos
  - Write down ideas for later
- Involve/interact with other coders
- Triangulation (ensuring validity and reliability)
  - Use multiple sources of data to triangulate your findings

Reporting
- Justify using a qualitative approach
- Description and justification of
  - Research design
  - Data collection process
  - Data sources
  - Data analysis process/methods
  - The sensemaking process leading to the main findings
- Describing your data (sources)
  - Data collection process
  - Type, amount and source of data
  - Dates (when relevant)
  - Use in analysis
- Data analysis process and structure; describe
  - Coding procedures or sequential coding steps
  - Sensemaking steps in relation to coding procedures
  - Coding schemes (data structure)
- Coding scheme or codebook; describe
  - Codes
  - Description or definition
  - Illustrative quotes
- Writing up your findings
  - Findings are insights from your data; they typically do not include any reference to literature
  - Make clear statements, include visuals, present the main findings, provide triangulated evidence
  - Explain the workings of the model and key relationships; explain and illustrate key concepts, themes and causal relationships with quotes; triangulate (back up your sources)
- Discussion
  - Reflect on findings and compare them to existing literature; have you answered the research question?
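The codebook described above (code, description & definition, illustrative quotes, rules & guidelines) maps naturally onto a small data structure. A minimal sketch in Python, with made-up codes and quotes purely for illustration; `tag` is a hypothetical helper, not part of any coding software:

```python
# A codebook entry per the reporting guidelines: code, definition,
# illustrative quotes, and rules/guidelines for when to apply it.
codebook = {
    "adoption_barrier": {
        "definition": "A factor the interviewee names as hindering technology adoption.",
        "quotes": ["We never got training on the new system."],
        "rules": "Apply only when the interviewee, not the researcher, frames it as a barrier.",
    },
    "perceived_usefulness": {
        "definition": "Statements that the technology helps job performance (cf. TAM).",
        "quotes": ["It cuts my reporting time in half."],
        "rules": "Do not double-code with adoption_barrier (codes should be mutually exclusive).",
    },
}

def tag(segment, codes, codebook):
    """Attach codes to a data segment, enforcing 'whenever you create a
    new code, write down a definition first'."""
    undefined = [c for c in codes if c not in codebook]
    if undefined:
        raise KeyError(f"Add these codes to the codebook first: {undefined}")
    return {"segment": segment, "codes": list(codes)}

tagged = tag("It cuts my reporting time in half.", ["perceived_usefulness"], codebook)
```

Keeping the definition and rules next to each code makes the "refer to the codebook whenever you use a code" discipline mechanical rather than a matter of memory.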
Week 4 (4.4)

Research project and writing up - quantitative research methods
- Qualitative research usually deals with words and meanings, while quantitative research deals with numbers

Research styles:
- Basic (theoretical) research
  - Pure/fundamental research
  - Problem solving of a theoretical nature
  - Little impact on action, performance or policy decisions
  - Mainly conducted at universities and research institutes, but increasingly at companies
  - Desire to expand knowledge; curiosity-driven
  - Why, what, how?
  - Increases understanding of fundamental principles
  - Does not have immediate commercial objectives
  - May not result in an invention or a solution to a practical problem
- Applied (practical) research
  - Practical problem-solving for a business or management issue
  - Application of theoretical notions
  - Mainly performed at companies, but also at universities
  - Specific commercial objectives
  - Answers specific questions

Research philosophies
1. Research is based on reasoning (theory) and observations (data); quantitative research is usually based on a fair amount of data
2. Can we know the world objectively, or is knowledge always a subjective representation of reality?
3. How objective is our world? Or is humanity unknowingly trapped inside a computer-simulated reality?
- Two research cycles are important in business research
  - Problem-solving cycle: the way things are done needs improvement
  - Empirical cycle: how things work is not well understood
- The empirical cycle

Induction & deduction
- Induction:
  - The process in which a general conclusion is formed from a limited set of specific cases (observations). The conclusion is true (until this moment).
  - Example: the first swan is white, the second swan is white, the third swan is white, so all swans are white
- Deduction:
  - Form of reasoning in which the conclusion necessarily follows from the reasons given
  - Example: swans are white; there are two swans in the park today; so the two swans in the park are white
- Induction and deduction alone are not the most efficient ways of advancing scientific knowledge

Hierarchy of scientific knowledge:
- Meta-paradigm: global perspective on how a discipline views reality
- Paradigm or grand theory: general knowledge system of concepts and propositions

What is a theory?
- Narrows the range of facts under study
- Summarizes what is already known
- Suggests types of research approaches
- Can be empirically tested
- Can be used to predict further (new) facts

What is a model?
- Schematic representation of reality
- Represents phenomena through the use of analogy
- Model functions: visualization, simplification, explanation, representation, heuristic

What is a variable?
- Characteristic, trait, or attribute that is measured and hence is able to vary
- Operationalization of a concept at the empirical level
- Its numerical value is based on the variable's properties
- Can be dichotomous or continuous; can also reflect categories with discrete values
- Types of variables:
  - Independent variable: cause, stimulus, predictor, input
  - Dependent variable: effect, response, criterion, output/outcome
    - The value of the dependent variable depends on the value of the independent variable; the dependent variable represents the outcome whose variation is being studied
  - Mediating variable: the relation between two variables is contingent on a third variable
  - Confounding variable

What is a hypothesis?
- A statement formulated for empirical testing
- Usually an expectation or assumption which follows from a theory (or previous observations)
- Provides the basis for investigation (to be proved or disproved) and ensures the proper direction in which the study should proceed
- Types of hypotheses:
  - Descriptive: state the existence, size, form or distribution of a variable
  - Relational: statements that describe a relation between two or more variables
  - Categorical: statements that describe differences between groups or classes with regard to each other or to another variable
- A good hypothesis:
  - Is adequate for its purpose
  - Is uniform and concrete
  - Is testable
  - Is better than its rivals

Project planning and research questions
- Advice for a project
  - Always double-check and follow the requirements and instructions
  - Just 'thinking about your project' is always important
  - Try to start early
  - Be transparent, clear and cooperative during the whole process, towards all stakeholders
  - Make use of resources
  - Ask for feedback during projects
  - Check what remaining resources are available, such as software and reimbursements
  - Good time management
- Research questions
  - Focus on what you want to know
  - The type of research question largely depends on your research strategy
  - Research questions can be global at the beginning and become more concrete during the research process
  - Will guide: literature study, hypothesis formulation, type of research design, data collection methods, data analysis, write-up of the project
- Research question hierarchy
  1. Management dilemma
  2. Management questions
  3. Research questions
  4. Investigative questions
  5. Measurement questions
  6. Decision

Week 5 (5.1)

Literature review and quantitative research designs

Assignment 2
- What is fit?
  - Fit research in general entails supplementary fit and complementary fit
  - Supplementary fit: one could replace one particular characteristic by another characteristic (i.e. they are identical in function and/or goal)
  - Complementary fit: one could add a particular characteristic to another characteristic
  - With complementary fit, we are dealing with person-job fit and job-job fit
    - Person-job fit: the compatibility between personal characteristics and characteristics of the job
    - Job-job fit: the compatibility between different aspects of job redesign
- Scientific report
  - Title page, abstract & keywords, introduction, methods, results, discussion, references, appendices
  - Introduction
    - Indicates the purpose and importance of the research
    - Briefly reviews prior research and theory
    - 'Funnel reasoning': typically concludes with specific hypotheses that follow from the information above
  - Methods
    - Provides in-depth information about the research design and procedure, participants, sampling, measures and statistical analysis
    - Interested readers should be able to replicate the research study
    - Mention ethical approval of the study
  - Results
    - Provides in-depth information about the findings, in chronological order
    - Includes the results of statistical analyses
    - Visualizes findings with tables and figures
    - Reports facts rather than interpretations
  - Discussion
    - Summarizes important findings and (non-)confirmation of hypotheses
    - Discusses how these findings are in line with previous findings from earlier studies
    - Describes the added value of the report
    - Elaborates on strengths, limitations, future research and practical implications

Literature review
- Overview of the current state of the art regarding a research problem
  - To know what is already known
  - To revise and refine your research questions and/or hypotheses
  - To learn how to collect data, which measures to use and how to analyze data
  - To increase your credibility as someone who is knowledgeable in the area
- Literature and progress in science
  - Progress in science happens by means of continuous accumulation of knowledge
  - Every contribution builds on earlier work
- Searching for literature
  - Google Scholar, ResearchGate, working/discussion paper databases, publishers' databases, journals' websites, authors' websites or databases
  - Do a basic search first
  - Refine the search by time period, journal quality, article type, language, etcetera
  - Assess the information obtained
  - Finally, synthesize the assessment of information
  - Primary sources: full-text publications of theoretical or empirical studies (original work)
  - Secondary sources: compilations of information (for example books or systematic reviews)
- Journal impact factors
  - A measure of the frequency with which the 'average article' in a journal has been cited in a particular period
  - Calculated by dividing the number of current-year citations to items published in that journal during the previous two years by the number of items published in those two years
- Literature review types:
  - Narrative review
    - Aka traditional literature review: a comprehensive and critical literature search of the current knowledge on a topic
    - Helps you identify patterns and trends in the literature: 'reading and summarizing'
  - Systematic review
    - A replicable, scientific and transparent process that aims to minimize bias through (almost) exhaustive literature searches
    - Systematic reviews often, but not always, use statistical analysis
  - Integrative review
    - A review that critiques and synthesizes representative literature on a particular topic in an integrated way, such that new frameworks and perspectives on that topic are generated
    - Key differences with other types of literature reviews:
      - More emphasis is placed on generating new knowledge, frameworks, perspectives and understandings by resolving inconsistencies in the literature
      - An integrative review is more methodological than a narrative review, but less procedural/statistical than a systematic review
  - In short: narrative review = just reading and summarizing; systematic review = reading, analyzing and summarizing; integrative review = reading, synthesizing and generating
- Meta-analysis
  - 'Analysis of analyses'
  - A study that combines the results of several related studies for the purpose of integrating all findings
  - Estimates an average or common effect
  - Uses effect size statistics
    - A statistical concept that measures the strength of the relation between two variables on a numeric scale
    - Measured by standardized mean difference, correlation coefficient, odds ratio
  - Part of a systematic review
- Critical reviews:
  - Book reviews
  - Peer reviews for academic journals

Research designs
- A research design is a framework or blueprint; it generalizes from sample to population
- The heart of a research study
- A framework created to answer the research questions
- An activity- and time-based plan
- Defines the study type and data collection method; describes the statistical analyses to be done
- Types of research designs:
  - Cross-sectional survey design
    - Also called a social survey design
    - Collection of quantitative data on more than one case
    - Single point in time
    - At least two variables
    - Tests relations between variables
    - Does not involve any manipulation
    - Individual differences are recorded
  - Longitudinal survey design
    - Same characteristics as a cross-sectional design, but for more than one point in time
    - Intra- and inter-individual differences can be recorded
    - Tests for change in variables
    - Allows for causal inferences
    - Most of the time makes use of panel studies and prospective cohort studies
    - Two key types:
      - Panel study: collects repeated measures from the same sample at different points in time
      - Prospective cohort study: collects repeated measures from a cohort
    - Possible relations:
      - Symmetrical relation: two variables fluctuate at the same time, but change in one variable is not caused by change in the other
      - Asymmetrical relation: a change in X is responsible for a change in Y
      - Reciprocal relation: two variables X and Y mutually influence or reinforce each other
  - Case study design
    - Detailed and intensive analysis of a single case
    - Deals with the complexity and nature of the case in question
    - Focus on a bounded system with a purpose and functioning parts
    - Types of cases: person, organization, single event
    - Multiple case study designs & longitudinal case study designs are also possible
  - Comparative design
    - Using more or less identical methods on two or more contrasting cases
    - Can also make comparisons across different countries or cultures
    - The aim is to seek explanations for similarities/differences
    - Another aim is to gain greater awareness and deeper understanding of social reality in different contexts
  - Experimental design
- A research method is a technique for collecting data; it involves a specific instrument

Week 5 (5.4)

- Correlation ≠ causation

Causality
1. Theoretical support
2. Systematic covariation
3. Temporal sequence
4. Nonspurious covariation
- You can never prove causality

Experimental design
- Random assignment of people to experimental or control conditions
- Manipulation of the independent variable
- Using a control group or comparison group
- Matching: equal distribution of different features in the population
- Weighting: if one group is under/over-represented
- Stratification: divide groups into strata, then randomly assign
- Notation:
  - R indicates random assignment
  - O is an observation; a pretest and posttest at the same position are measured at the same time
  - X is the treatment
  - Multiple Os indicate different waves of measurement
  - One line per group

Field and lab experiments
- Field:
  - Conducted in natural/real-life settings
  - Participants do not know they are being studied
- Lab:
  - Participants know they are being studied

Validity
- Can the study scientifically answer the questions it intends to solve?
- Validity = the degree to which a measurement measures what it is supposed to measure
- Reliability = the extent to which a measurement gives consistent results
- Internal/external validity
  - Internal: do the conclusions drawn about a demonstrated experimental relation truly imply cause?
- External: does an observed causal relation generalise across people, settings, environments and times? - Controlling internal validity decreases external validity (e.g. demonstrating an effect in a highly controlled environment) - Controlling external validity decreases internal validity (e.g. replicating the study in a more realistic, natural setting) - Measurement and ecological validity - Measurement: does a measure capture the phenomenon it is intended to measure? - Ecological: do our instruments capture the daily-life conditions, opinions, values and attitudes of those we study in their natural habitat? - Ecological validity is a subtype of external validity: ecological is more about the setting, external more about the population - Threats to internal validity: - History: events may occur during the experiment that confuse the relation being studied - Maturation: subjects may change during the experiment - Testing: the process of taking a test can affect the scores on a second test - Instrumentation: changes in the measuring instrument or the observer - Selection: differential selection of subjects for the experimental and control groups - Statistical regression: extreme scores tend to move toward the mean on remeasurement - Experimental mortality: the composition of the groups changes during the experiment (drop-out) Experimental design types - Quasi-experimental: if researchers cannot meet the requirements of a true experiment, they conduct quasi-experiments - For example when no randomization is possible, no controlled manipulation is possible, or there is no (real) control group - Often conducted in evaluation and intervention research - A comparison group can serve as a pseudo-control group - Other interesting experimental designs - Randomized block design: variability within each block is minimized, variability between blocks is maximized; treatments are then randomly assigned to the experimental units in each block - Latin square design: two major factors, the type of treatments and their order

                   Treatment order
                   1    2    3    4
   Participant 1   A    B    D    C
   Participant 2   B    C    A    D
   Participant 3   C    D    B    A
   Participant 4   D    A    C    B

- Ex-post facto design: an after-the-fact design; groups are compared on a dependent variable based on qualities that existed before the research, without random assignment - Factorial design: more than one independent variable - Solomon four-group design (the "platinum" design): a true experimental design with random assignment Self-completion questionnaires - Aim: a quick and cheap way of collecting data - Confidentiality means the researchers know who the respondents are; anonymity means that no one knows
- Pros:
  - Relatively cheap
  - Quick to administer
  - Absence of interviewer effects
  - Standardized
  - Convenient for respondents to fill out
- Cons:
  - No assistance/elaboration possible
  - Cannot ask too many or too difficult questions
  - Not knowing exactly who responded
  - Risk of missing data and low response rates
Questionnaire bias: - Tendency of a sample statistic to systematically over-estimate or under-estimate a population parameter - Causes: - Non-response bias and voluntary response bias, caused by an under/over-representative sample - Social desirability bias, extreme-scoring tendency, and positive and negative affect, caused by measurement error - Sampling bias, caused by sampling procedures - Positive and negative affect can distort the relation being studied (confounding), which is why studies control for them - Diaries: a set of questionnaires filled in by participants over the course of the research - Experience/event sampling method: a variation of diaries in which participants are prompted at specific points and after specific events Week 6 (6.4) Measurement, sampling and secondary analysis Asking questions
- Open-ended vs. closed questions:
  - Reliability of data: low (open) vs. high (closed)
  - Efficient use of time: low (open) vs. high (closed)
  - Precision of data: low (open) vs. high (closed)
  - Breadth and depth: much (open) vs. little (closed)
  - Interviewer skill required: much (open) vs. little (closed)
  - Ease of analysis: difficult (open) vs. easy (closed)
- Scaling: a procedure for the assignment of numbers or symbols to an object 1. Nominal: names, i.e. categories with no difference and no order
2. Ordinal: categories with an indicator of order, without stating how much greater or less 3. Interval: numerical values are assigned to observations and the intervals are equal, but the zero point is arbitrary 4. Ratio: numerical values are assigned, equal differences can be interpreted, and there is an absolute zero point
- Data characteristics per type of data:
  - Nominal: classification; statistic: mode; example: gender
  - Ordinal: classification, order; statistics: mode, median; example: education
  - Interval: classification, order, distance; statistics: mode, median, mean; example: temperature
  - Ratio: classification, order, distance, origin; statistics: mode, median, mean; example: age in years
- Types of tests for particular scales:
  - Parametric tests: normally distributed data; continuous data; different variability (dispersion); less tolerant of small samples
  - Non-parametric tests: non-normally distributed data; ordinal and ranked data; same variability (dispersion); more tolerant of small samples
- Response categories - Simple category scale - Multiple choice scale - Multiple rating scale - Guttman scale - Items are considered to form a continuum - All persons who answered no to the first question would also answer no to the next - Likert scale - 1-5 rating of e.g. satisfaction, agreement, approval - Semantic differential scale - 1-5 rating with two opposite words representing 1 and 5 - Graphic rating scale - Close to semantic differential, but with pictograms Nature of quantitative research
- Concept:
  - Building block of a theory with a generally accepted collection of meanings/characteristics
  - Represents a label that we give to elements of the (social) world that seem to have common features we deem significant
  - A category for the organization of ideas and observations
- Construct:
  - A concept, but with the added meaning of having been deliberately and consciously invented or adopted for a special scientific purpose
  - It is used in 2 ways: 1. Entered into theoretical schemes 2. Defined and specified so that it can be operationalized and measured
- Variables:
  - A characteristic, trait or attribute that is measured
  - Represent the points around which business research is conducted
- Concept vs. construct:
  - Concept: a general idea or understanding of phenomena; broader in scope; applies to both actual and possible cases; used for making theoretical claims
  - Construct: a specific operationalization of a concept; more specific in scope; applies to actual cases only; used to measure concepts
Theoretical and empirical plane of research Why should we measure? - To discover the extent, dimensions, quantity or capacity of something (especially by benchmarking, i.e. comparing with a standard) - To provide the highest-quality, lowest-error data for testing hypotheses - Before measuring: 1. Select the empirical event 2. Develop mapping rules 3. Apply the mapping rules - What is measured? Operationalized indicators that stand for the variables - Measurement errors - Systematic error: e.g. a mis-calibrated instrument that affects all measurements - Random error: naturally occurring errors that are to be expected with any measure - Sources: participants, situational factors, the measurer, data collection instruments - Measurement errors in rating scales: - Leniency and non-leniency, central tendency, halo effect - Sound measurement therefore needs to be valid, reliable and practical - Cronbach's alpha: measures internal consistency
  - α = (n / (n − 1)) · (1 − Σ s²(Xᵢ) / s²(Y)), where n is the number of items, s²(Xᵢ) the variance of item i, and s²(Y) the variance of the total score
  - Interpretation:
    - α ≥ 0.9: excellent
    - 0.8 ≤ α < 0.9: good
    - 0.7 ≤ α < 0.8: acceptable
    - 0.6 ≤ α < 0.7: questionable
    - 0.5 ≤ α < 0.6: poor
    - α < 0.5: unacceptable
Types of validity
- Face validity: ask other experts whether they think the measure is valid
- Convergent validity: do other research methods provide similar measurements of the concepts?
- Construct validity: how does the measure relate to measures of other constructs, as specified by theory?
- Predictive validity: can you use the measure to make accurate predictions?
- Concurrent validity: compare results against another measure known to be valid
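The alpha formula above takes only a few lines of code. A minimal sketch in Python (the item scores are invented for illustration; `cronbach_alpha` is our own helper, not a library function):

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of respondent rows (one score per item).

    alpha = n/(n-1) * (1 - sum of item variances / variance of total score)
    """
    n = len(items[0])                                   # number of items
    cols = list(zip(*items))                            # item-wise columns
    item_vars = sum(variance(col) for col in cols)      # sum of s^2(X_i)
    total_var = variance([sum(row) for row in items])   # s^2(Y)
    return n / (n - 1) * (1 - item_vars / total_var)

# Hypothetical 5-point Likert responses: 4 respondents x 3 items
scores = [(4, 5, 4), (2, 3, 2), (5, 4, 5), (3, 3, 3)]
print(f"alpha = {cronbach_alpha(scores):.2f}")  # ~0.92, 'excellent' per the bands above
```

Note that alpha only makes sense for items that are intended to measure the same construct; a high value for unrelated items does not indicate a good scale.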
- Factor analysis - A technique used to reduce a large number of items to a smaller number of factors - Extracts the maximum common variance from all items and puts it into a common factor - As an index of all items, we can use the factor score for further analyses - Two types: 1. Principal component analysis: most common; extracts the maximum variance and puts it into the first factor, then removes that variance and extracts the maximum remaining variance for the second factor, etc. 2. Principal axis factoring: extracts only the common variance and puts it into factors; does not include the unique variance of the variables Sampling - Population: the total collection of elements we want to make an inference about - Unit of analysis: at what level are we analyzing? Country? Organization? Individual? - A sample needs to be accurate/unbiased and precise
- Probability sampling (random selection, equal chance):
  - Simple random sampling: every element has an equal chance of selection
  - Cluster sampling: the population is divided into groups of elements, and groups are then randomly selected
  - Systematic sampling: every nth person
  - Stratified random sampling: must include elements from each segment (gender, class, occupation)
- Non-probability sampling (non-random selection):
  - Convenience sampling: based on availability
  - Judgemental or purposive sampling
  - Snowball sampling: new participants via previous participants
  - Quota sampling: selecting people who fit categories
- Confidence interval: 95% of all sample means will lie between +/− 1.96 standard errors from the population mean - The standard error will be smaller if a stratified sample is selected - The standard error will be larger if a cluster sample is selected - Sampling error: the difference between sample and population; an increased sample size reduces this kind of error - Sampling bias: a distortion of the representativeness of the sample; it does not change when the sample size increases Secondary analysis - Data mining: uncovering knowledge from databases stored in data warehouses - Pattern recognition, prediction, risk models Week 7 (7.1) Problem-solving cycle revisited, business processes, deduction and induction Empirical cycle - Empirical cycle - Used for generic business problems that many businesses encounter - Concerns relationships between variables - Develop and empirically test theories: the aim is generalisable insights, usable across contexts - Problem-solving cycle - Used for performance-related business issues; context-specific - Concerns a business process that does not meet a defined performance level (costs, quality, timeliness) What is a business process? - An activity or bounded group of interrelated work activities that adds value to one or more inputs and produces an output for an internal or external customer Deduction and induction - Ways of crossing the line between theory and empirical evidence: 1. Induction 2. Deduction 3. Abduction - Deduction: the conclusion necessarily follows from the reasons given (premises) - Induction: the conclusion is formed from a limited set of specific cases; the conclusion is true for now Week 7 (7.3/7.4) Cause & effect diagrams - Components: - External triggers - PESTEL: political, economic, social, technological, environmental, legal - Porter's microenvironmental forces: customers, distributors, suppliers, competitors - Root causes - Organization structure - Organization culture - Organization strategy - Procedures/systems - Staff (competencies/motivation) - Leadership (style) - Innovation/production/networking capabilities - Intermediate causes - Symptom - System boundary - Keep it simple: from complex to essential, converge to one dependent variable Demarcating the problem - A business problem means making a choice in the cause-effect diagram - Open-ended: multiple solutions are possible - Embedded in a social system - Several stakeholders with different, potentially contradictory perspectives and interests - Often bound to limited time and resources - Focus is required, from which the work is then done - Criteria to choose from: relevance, feasibility, corporate supervisor's responsibility area, researchability, scientific news value - Be specific about structure, time, content, place Research objectives and research questions - Bachelor thesis: investigate the factual causes and consequences of the faulty process and investigate possible solutions - Master thesis: (re)design the faulty process - RQs direct the search for the required information: what is the relevant theoretical framework, what are the bottlenecks and solutions? - RQs often structure/title the next chapters in the thesis report - Methodology must be discussed per RQ Week 8 (8.1/8.2) Designing your research project, choosing your methods Research proposal - contractual perspective Format: - Title page and contents list - Ch1: context, brief description of the company and situation - Ch2: detailed problem analysis - Use a cause-effect reasoning diagram - Accountability: make your sources/reasoning explicit - Theoretical lens - Explicit problem statement - Ch3: research objective and research questions - Ch4: methodology (justification of your research approach) - Research design - Data collection (method) and analysis (discuss in terms of validity and reliability) - Ch5: expected contributions (academic, managerial) - Ch6: scope/delimitation - Ch7: organization and costs - Reference list - Appendices Size and function - BEP: 10-15 pages - A contract between problem owner, TU/e supervisor and student - Many choices are being made - Consult all parties on a regular basis - Consider it a contract between all parties, which is used as a point of departure Research proposal - conceptual design - Steps to design a business problem - See "Demarcating the problem" from 7.3/7.4 Research proposal - technical design Research design - Input: operationalization of the required information, i.e. the research questions = what information do you need? - How are you going to collect the information? - How are you going to analyze the information (chain of evidence)?
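The sampling notes above claim that a stratified sample yields a smaller standard error than a simple random sample of the same size. That claim can be illustrated with a small simulation sketch; the population, strata and all numbers below are invented:

```python
import random
from statistics import mean, stdev

random.seed(7)

# Hypothetical population of 10,000 salaries (k-euro) in three strata (job
# levels); the strata differ in mean, which is what stratification exploits.
strata = {
    "junior": [random.gauss(30, 5) for _ in range(5000)],
    "medior": [random.gauss(45, 5) for _ in range(3000)],
    "senior": [random.gauss(70, 5) for _ in range(2000)],
}
population = [x for group in strata.values() for x in group]

def simple_random_mean(n):
    """Estimate the population mean from one simple random sample of size n."""
    return mean(random.sample(population, n))

def stratified_mean(n):
    """Proportional allocation: sample each stratum in proportion to its size."""
    est = 0.0
    for group in strata.values():
        weight = len(group) / len(population)
        est += weight * mean(random.sample(group, round(n * weight)))
    return est

# The standard error is the spread of the estimator over repeated samples
srs = [simple_random_mean(100) for _ in range(500)]
strat = [stratified_mean(100) for _ in range(500)]
print(f"SE simple random: {stdev(srs):.2f}")
print(f"SE stratified:    {stdev(strat):.2f}")  # smaller, as the notes state
```

The gain comes from the between-strata variance: stratification removes it from the estimator, so the more the strata differ, the larger the advantage (and, conversely, cluster sampling tends to increase the standard error).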
Method selection
- Qualitative vs. quantitative (they can be combined or follow each other)
  - Qualitative methods: observation; in-depth interviews; focus groups; projective techniques
  - Quantitative methods: experiments; surveys; observation; primary or secondary sources
- Explain why you make your choices and what your benchmarks are, and triangulate your data - Triangulation can be - Investigator: multiple researchers - Theoretical: multiple theoretical schemes - Methodological: multiple methods Validity and reliability
- Validity:
  - Construct validity: embed in the academic literature (defining concepts, using validated questionnaires, etc.)
  - Internal validity: are alternative explanations possible?
  - Face validity: do your findings and solutions make sense to others in the company?
  - External validity: do your findings/solutions apply at a more generic level?
- Reliability:
  - Number of respondents (more is better)
  - Transparency of the protocol (how many respondents, who, why)
  - Use of a jury for interpreting qualitative data (analysis protocol)
  - Use of triangulation
Project organization and costs - Persons involved: client, person with decision-making power, company supervisor, daily support, possibly a sounding-board group, supervisor from the university - Costs: internship allowance for the student, staff time for information, consultation and guidance, implementation costs of the improvement proposal, travel and other out-of-pocket expenses Bridge between problem analysis and diagnostic phase - Theory will help find support for identified causal relationships, conceptualize observed facts, suggest alternative causes, and provide a guiding framework Step 2: Analysis and diagnostic phase - Purposes: 1. Substantiated validation of the business problem 2. In-depth exploration and validation of causes and consequences of the problem 3. Develop preliminary ideas for alternatives to solve the business problem 4. Create support for the project - Diagnostic research characteristics: - Focused on the business process - Theory-informed - Research is customized - Supported by factual evidence - Empirical analysis: validating the business problem - Collect factual information - Bundle dispersed information - Collect 'stories' as examples of problems - Collect opinions/perceptions by doing interviews - Immersion into the diagnostics 1. Exploring causes a. Initial cause-and-effect diagram, serves as input b. Quantitative exploration methods c. Qualitative methods 2. Validating causes and determining their relative importance a. Collect 'hard' objective data b. Qualitative collection and analysis c. Determine relative importance via expert opinions, a survey, or simulation Step 3: Solution design - The most important question in the design phase is how the proposed solution solves (part of) the problem, not what the right solution is - What to design? - Design as an iterative process: 1. Formulate requirements a. Functional & user requirements b. Boundary conditions c. Design restrictions and limitations 2. Determine solution directions and design principles 3. Determine design parameters 4. Choose design parameters 5. Detail design 6. Integral design 7. Iterative testing Step 4: Intervention - Input for the change plan: 1. Delta analysis a. Analysis of the most important changes, old vs. new 2. Stakeholder analysis (e.g. with a RASCI matrix) a. RASCI: responsible, approval, supportive, consulted, informed b. Each stakeholder gets a letter for each task; only one R per task, no task can be without an A, and the C's and I's are also important 3. Resistance analysis a. Lack of understanding b. Difference of opinion c. Lack of confidence d. Low willingness to change e. Conflict of interest - Intervention: - The goal of implementation is to execute the realization process in the company - Organizational change is a complex, multi-dimensional process with strategic, political, technical and human aspects - Planning and communication are vital elements of any implementation process - Participation can overcome potential resistance to change Step 5: Evaluation and learning - Functional requirements met? - User requirements met? - Pre- or boundary conditions met? - Design restrictions or limitations met? - Evaluation based on performance:
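The RASCI rules mentioned under the stakeholder analysis (exactly one R per task, no task without an A) lend themselves to a quick consistency check. A sketch, where the task names, stakeholders and the `check_rasci` helper are all invented:

```python
# RASCI: Responsible, Approval, Supportive, Consulted, Informed.
def check_rasci(matrix):
    """Return rule violations for a {task: {stakeholder: letter}} matrix.

    Rules from the notes: exactly one R per task, and no task without an A.
    """
    problems = []
    for task, roles in matrix.items():
        letters = list(roles.values())
        if letters.count("R") != 1:
            problems.append(f"{task}: needs exactly one R, found {letters.count('R')}")
        if "A" not in letters:
            problems.append(f"{task}: no A assigned")
    return problems

# Invented example: implementing a redesigned order-handling process
matrix = {
    "redesign process": {"project lead": "R", "plant manager": "A", "operators": "C"},
    "train operators":  {"team lead": "R", "plant manager": "A", "operators": "I"},
}
print(check_rasci(matrix))  # [] -> both rules are satisfied
```

A check like this is only a formality, of course; whether the right stakeholders hold the R and A for each task is the substantive part of the stakeholder analysis.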