Designs Unit 0

Cards (66)

  • Scientific Method:
    • A systematic approach used by scientists to investigate natural phenomena, acquire knowledge, and formulate and test hypotheses
    • In psychology, it is a general approach to understanding human behavior
  • Scientific Research:
    • The researcher formulates a research question, conducts a study designed to answer the question, analyzes the resulting data, draws conclusions, and publishes the results
    • The research literature is a primary source of new research questions, making the research process a systematic and dynamic cycle
  • Conceptual clarification:
    • Research methods: procedures used to collect and analyze data
    • Research Designs: the overall strategy chosen to integrate different components of the study in a coherent and logical way
    • Techniques: specific procedures or tools used within a broader method
  • Qualitative Research:
    • Research method to explore and understand the meaning that individuals or groups of people attribute to social or human problems
    • Characteristics include collecting data in the field, the researcher as a key instrument, multiple sources of data, inductive data analysis, and a focus on participant meanings
  • Quantitative Research:
    • The process of collecting and analyzing numerical data to find patterns, make predictions, test causal relationships, and generalize results
    • Can be used for descriptive, correlational, or experimental research
  • Hypothetico-deductive (HD) method:
    • A cyclic pattern of reasoning and observation used to generate and test proposed explanations of puzzling observations in nature
    • Based on philosophical principles of positivism where only observable and measurable objects are studied
  • Experimental Design:
    • Used to study causal relationships by manipulating independent variables and measuring their effects on dependent variables
    • Involves defining variables, writing hypotheses, designing experimental treatments, assigning subjects to treatment groups (a random-assignment sketch follows this card), and measuring dependent variables
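A minimal sketch of the random-assignment step in Python. This is an illustration only, not part of the original notes; the participant IDs and group labels are invented for the example.

```python
import random

# Hypothetical participant pool; in a real study these would be recruited subjects.
participants = [f"P{i:02d}" for i in range(1, 21)]

# Random assignment: shuffling before splitting gives every participant the
# same chance of ending up in either condition, which is what separates a
# true experiment from a quasi-experiment.
random.shuffle(participants)
half = len(participants) // 2
treatment_group = participants[:half]   # receives the manipulated level of the IV
control_group = participants[half:]     # receives the baseline level of the IV

print("Treatment group:", treatment_group)
print("Control group:  ", control_group)
```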
  • Quasi-experimental Design:
    • Aims to establish cause-and-effect relationships between independent and dependent variables without relying on random assignment
    • Used for ethical or practical reasons when random assignment is not feasible
  • Non-Experimental Design:
    • Descriptive or correlational research that does not manipulate variables or randomly assign participants to control or treatment groups
    • Correlational research investigates relationships between variables without the researcher controlling or manipulating them
  • Quantitative research is ideal for gathering data quickly from natural settings
  • It helps generalize findings to real-life situations in an externally valid way
  • Types of research objectives:
    • General objectives state the main goal of the research
    • Specific objectives break down the main goal into smaller, logically connected parts
  • A hypothesis states predictions about what the research will find
    • It is a tentative answer to the research question that has not yet been tested
    • Hypotheses propose a relationship between variables
  • Variables in research:
    • Independent variable: Manipulated or varied in an experimental study
    • Dependent variable: Changes as a result of the independent variable manipulation
  • Data analysis process involves data organization, data reduction, and data analysis
    • Types of data in research: Qualitative data and Quantitative data
  • Inferential statistics are used to make predictions about a larger population after research and data analysis
    • Estimating parameters and hypothesis testing are significant areas of inferential statistics
    • Correlation and cross-tabulation are used to analyze relationships between variables (see the analysis sketch after this card)
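As an illustration of these analyses, the sketch below runs an independent-samples t-test (hypothesis testing), a Pearson correlation, and a cross-tabulation on simulated data. The variable names and values are invented for the example; it assumes NumPy, pandas, and SciPy are available.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Simulated data standing in for real study measurements.
rng = np.random.default_rng(42)
treatment = rng.normal(loc=5.5, scale=1.0, size=30)   # DV scores, treatment group
control = rng.normal(loc=5.0, scale=1.0, size=30)     # DV scores, control group

# Hypothesis testing: independent-samples t-test comparing the two groups.
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Correlation between two continuous variables.
hours_studied = rng.uniform(0, 10, size=30)
exam_score = 50 + 4 * hours_studied + rng.normal(0, 5, size=30)
r, p = stats.pearsonr(hours_studied, exam_score)
print(f"r = {r:.2f}, p = {p:.3f}")

# Cross-tabulation of two categorical variables.
df = pd.DataFrame({
    "group": ["treatment"] * 30 + ["control"] * 30,
    "improved": rng.choice(["yes", "no"], size=60),
})
print(pd.crosstab(df["group"], df["improved"]))
```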
  • Interpretation of results explains the meaning and implications of findings in relation to research objectives, questions, and hypotheses
  • Conclusion validity is the degree to which the conclusion reached is credible or believable
  • Types of conclusion validity:
    • Internal
    • External
    • Construct
    • Statistical conclusion validity
  • Constructs are abstract concepts that are not directly observable but can be inferred indirectly through situations; examples include anxiety, intelligence, and stress
  • Construct validity is about how well a test measures the concept it was designed to evaluate
  • Assessing construct validity is crucial for establishing the overall validity of a method
  • Internal validity is the extent to which a cause-and-effect relationship established in a study cannot be explained by other factors
  • High internal validity makes the conclusions of a causal relationship credible and trustworthy
  • Conditions for establishing causality between independent variable A (the treatment variable) and dependent variable B (the response variable):
    • Changes in the treatment and response variables must occur together (covariation)
    • Changes in the treatment variable must precede changes in the response variable (temporal precedence)
    • No confounding or extraneous factors can explain the results
  • Threats to internal validity are confounding or extraneous factors that provide alternative explanations for the results
  • Statistical conclusion validity holds when the conclusions of a research study are founded on an adequate analysis of the data
  • Strategies to improve statistical conclusion validity:
    • Appropriate sample size (see the power-analysis sketch after this list)
    • Random sampling
    • Reliable measurements
    • Transparent reporting
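A hedged sketch of an a priori power analysis, one common way to choose an appropriate sample size before data collection. The effect size, alpha, and power values are assumptions picked for the example, not values from the notes; it uses statsmodels.

```python
from statsmodels.stats.power import TTestIndPower

# How many participants per group are needed to detect a medium effect
# (Cohen's d = 0.5) with alpha = .05 and 80% power in a two-group design?
# These inputs are illustrative assumptions.
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80,
                                   alternative="two-sided")
print(f"Participants needed per group: {n_per_group:.0f}")  # roughly 64
```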
  • External validity is the extent to which findings of a study can be generalized to other situations, people, settings, and measures
  • Without high external validity, results from the laboratory cannot be applied to other people or the real world
  • In qualitative studies, external validity is referred to as transferability
  • Ways to counter threats to external validity:
    • Replications
    • Field experiments
    • Probability sampling (see the sampling sketch after this list)
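For probability sampling, the sketch below draws a simple random sample from a hypothetical sampling frame; the population and sample sizes are made up for illustration.

```python
import numpy as np

# Hypothetical sampling frame of 1,000 population members, identified by ID.
rng = np.random.default_rng(seed=7)
frame = np.arange(1, 1001)

# Simple random sampling without replacement: every member of the frame has
# the same probability of selection, which supports generalizing to the population.
sample = rng.choice(frame, size=100, replace=False)
print(np.sort(sample)[:10], "...")
```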