Research Methodology


    • Scientific Methodology:
      • Science is a systematic, critical, controlled, reproducible path to knowledge
      • Involves theory and empiricism
      • Scientific method: set of assumptions, rules, and procedures used by scientists to conduct research
      • Results in an accumulation of scientific knowledge through reporting and modification of findings
      • Research must be objective, build on previous work, and be reported so that others can replicate it
      • Research reports represent scientific findings in a standardized written format (APA)
    • Values vs. Facts:
      • Values are personal statements
      • Facts are objective statements determined to be accurate through empirical study
    • Types of Research:
      1. Basic research: answers fundamental questions about behavior
      2. Applied research: investigates issues with implications for everyday life and provides solutions (e.g., program evaluation research)
    • Research Design:
      • Specific method used to collect, analyze, and interpret data
      • Three basic types: Descriptive, Correlational, Experimental
    • Empiricism:
      • Relies on empirical evidence: facts gathered through observation and research
      • Origin of knowledge is sense experience, emphasizing sensory perception
    • Scientific Reasoning:
      • Involves inductive and deductive reasoning
      • Systematic hypothesis-testing is crucial
    • Deductive and Inductive Reasoning:
      • Deduction: deriving conclusions from general premises
      • Induction: forming generalizations based on observations
      • In research, inductive approach is used when there is little existing literature, while deductive approach tests existing theories
    • Data:
      • Raw observations or measurements, often spanning many variables
      • Primary vs. secondary data
      • Used for analysis and statistics
    • Facts:
      • True statements verified through experience
      • Scientific facts result from repeatable observations or measurements
      • Central to building scientific theories
    • Nomothetic Paradigms:
      • Involves making generalizations and understanding large-scale patterns
      • Uses scientific methods to obtain quantitative data
      • Examples include classifying people into groups or establishing dimensions
    • Nomothetic paradigms involve developing laws and theories that can be empirically tested
    • Limitations of nomothetic paradigms:
      • Predictions can be made about groups but may not apply to individuals
      • Accused of losing sight of the 'whole person'
    • Idiographic paradigms focus on uncovering detailed information about a narrower subject of study
    • The term "idiographic" comes from the Greek word "idios" meaning "own" or "private"
    • Psychologists interested in idiographic approaches aim to discover what makes each individual unique
    • Idiographic approaches do not allow for general laws due to chance, free will, and the uniqueness of individuals
    • Idiographic approaches tend to include qualitative data and investigate individuals in a personal and detailed way
    • Methods used in idiographic research include case studies, unstructured interviews, self-reports, autobiographies, and personal documents
    • Strengths of the idiographic approach:
      • Focuses on the individual
      • Findings can serve as a source of ideas or hypotheses for later study
    • Limitations of the idiographic approach:
      • Time-consuming and costly to study individuals in depth
      • Few people are willing to participate, owing to concerns about invasion of privacy and access to their data
    • Ethical considerations in psychological research are guided by principles to protect participants' rights, enhance research validity, and maintain scientific integrity
    • Types of ethical issues in research:
      1. Voluntary participation
      2. Informed consent
      3. Anonymity
      4. Confidentiality
      5. Potential for harm
      6. Results communication
      7. Right to withdraw
      8. Debriefing
      9. Deception
    • Institutional Review Board (IRB) ensures that research aims and design are ethically acceptable and follow institutional codes of conduct
    • Verification of theories involves testing the truth-value of statements through evidence, often of an empirical nature
    • Hypothesis testing is a formal procedure for investigating ideas using statistics, often used by scientists to test predictions arising from theories
    • Steps in hypothesis testing:
      1. State research hypothesis as null and alternate hypotheses
      2. Collect data designed to test the hypothesis
      3. Perform a statistical test
      4. Decide whether to reject or fail to reject the null hypothesis
      5. Present findings in the results and discussion section
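The decision in steps 3–4 above can be sketched with a permutation test, which needs only the standard library. The data, seed, and alpha level here are invented for illustration:

```python
import random
import statistics

def permutation_test(group_a, group_b, n_perms=10_000, seed=0):
    """Two-sided permutation test for a difference in group means.

    H0: the two groups come from the same distribution.
    Returns the proportion of label shuffles whose mean difference
    is at least as extreme as the observed one (the p-value).
    """
    rng = random.Random(seed)
    observed = abs(statistics.mean(group_a) - statistics.mean(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_perms):
        rng.shuffle(pooled)  # reassign group labels at random
        diff = abs(statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:]))
        if diff >= observed:
            extreme += 1
    return extreme / n_perms

# Hypothetical test scores under two study conditions
control = [72, 75, 68, 71, 74, 70, 69, 73]
treatment = [78, 82, 75, 80, 77, 79, 81, 76]

p = permutation_test(control, treatment)
alpha = 0.05
decision = "reject H0" if p < alpha else "fail to reject H0"
```

A permutation test stands in here for whatever statistical test the design calls for (t-test, chi-square, etc.); the reject/fail-to-reject logic against a preset alpha is the same.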
    • Operational definitions describe something in terms of the operations by which it could be observed and measured
    • Operational definitions help researchers decide how to measure variables in a study
    • Example: Happiness can be operationalized by counting the number of smiles a person emits during an observation period
    • Operational definitions are concrete and measurable, allowing others to see if the research has validity
    • In a research study, age can be defined as a participant's age measured in years, while addiction can be defined as meeting the DSM-5 diagnostic criteria for any substance use disorder
    • Operational definitions are crucial for the validity, replicability, generalizability, and dissemination of research findings
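The smile-counting example above can be written as a concrete measurement procedure; the function name and rate-per-minute scoring are illustrative assumptions:

```python
def happiness_score(smile_count, observation_minutes):
    """Operationalize 'happiness' as smiles per minute over an observation period.

    Note: the operational definition is the scoring rule itself, which makes
    the construct concrete, measurable, and replicable by other researchers.
    """
    if observation_minutes <= 0:
        raise ValueError("observation period must be positive")
    return smile_count / observation_minutes

# A participant who smiled 12 times during a 30-minute observation
score = happiness_score(12, 30)
```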
    • Types of variables:
      • Categorical variables: Nominal (describes a name or category without order) and Ordinal (values defined by an order relation)
      • Binary variables: Yes/no outcomes
      • Numeric variables: Continuous (can take any real value within an interval) and Discrete (can take only countable values, e.g., counts)
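The taxonomy above can be made concrete as a participant record whose fields each illustrate one variable type (the field names and values are invented):

```python
from dataclasses import dataclass

@dataclass
class Participant:
    condition: str        # nominal: an unordered category ("control" / "treatment")
    education: int        # ordinal: a ranked level (1 = primary ... 4 = postgraduate)
    smoker: bool          # binary: a yes/no outcome
    reaction_time: float  # continuous: any real value within an interval (seconds)
    error_count: int      # discrete: a countable value

p1 = Participant(condition="control", education=3, smoker=False,
                 reaction_time=0.482, error_count=2)
```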
    • Variables in the context of establishing causation:
      • Independent variables: Manipulated to affect the outcome
      • Dependent variables: Represent the outcome
      • Control variables: Held constant throughout the experiment
    • Other types of variables:
      • Confounding variables: Influence both the independent and dependent variables, obscuring the true effect of one variable on another
      • Latent variables: Represented via a proxy
      • Composite variables: Made by combining multiple variables
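As a sketch of the last bullet, a composite variable can be built by standardizing related measures (z-scores) and averaging them; the measures and scores below are invented:

```python
import statistics

def zscores(values):
    """Standardize a list of scores to mean 0 and standard deviation 1."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mu) / sd for v in values]

# Hypothetical: combine two related measures into one composite "well-being" score
life_satisfaction = [6, 7, 5, 8, 6]
positive_affect = [20, 24, 18, 26, 22]

composite = [(a + b) / 2
             for a, b in zip(zscores(life_satisfaction), zscores(positive_affect))]
```

Standardizing first keeps a measure with a larger numeric range from dominating the composite.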
    • Reliability and validity are essential in research design and analysis:
      • Reliability: Consistency of a measure
      • Validity: Accuracy of a measure
    • Types of reliability:
      • Test-retest: Consistency across time
      • Interrater: Consistency across raters
      • Internal consistency: Consistency within the measurement itself
    • Types of validity:
      • Construct: Adherence of the measure to existing theory and knowledge of the concept being measured
      • Discriminant: Divergence from unrelated measures
      • Content: Coverage of all aspects of the concept being measured
      • Criterion: Extent to which the result of a measure corresponds to other valid measures of the same concept
      • Face: measurement method appears “on its face” to measure the construct of interest
    • Improving reliability:
      1. Use enough items to assess the construct reliably
      2. Maintain a consistent environment for participants
      3. Ensure participants are familiar with the assessment user interface
      4. Train human raters effectively
      5. Measure reliability
      6. Conduct regular item analysis (e.g., Cronbach's alpha, McDonald's omega)
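Cronbach's alpha, mentioned in the last step, can be computed directly from item scores as an index of internal consistency. This minimal sketch uses invented ratings for three items from five respondents:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a list of items.

    items: one list of scores per item, each of equal length
    (one score per respondent).
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var = sum(statistics.variance(item) for item in items)
    total_var = statistics.variance(totals)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical 1-5 ratings: three scale items, five respondents
item1 = [4, 5, 3, 2, 4]
item2 = [4, 4, 3, 2, 5]
item3 = [5, 5, 2, 3, 4]

alpha = cronbach_alpha([item1, item2, item3])
```

Values around 0.7 or higher are conventionally read as acceptable internal consistency, though the threshold depends on the purpose of the measure.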
    • Qualitative research involves collecting and analyzing non-numerical data to understand concepts, opinions, or experiences