research methods

    • What are the types of research methods mentioned in the study material?

      Case Studies, Experiments, Observational Designs, Self-report Techniques, Correlations, Content Analysis
    • What are the types of experiments and their characteristics?

      • Laboratory: Manipulated IV, artificial setting, strong causation, lacks ecological validity.
      • Field: Manipulated IV, natural setting, high ecological validity, good causation.
      • Natural: Naturally occurring IV, natural setting, high ecological validity, low causation.
      • Quasi: IV is a pre-existing difference between participants (e.g., age or gender) rather than manipulated, can run in lab or field settings, more ethical/practical, reduced causation.
    • What must behavioral categories in observational designs be?

      They must be operationalized, specific, and measurable.
    • What is the purpose of a 5- or 7-point scale in self-report techniques?

      It allows for easy comparisons of views and is less limiting than closed questions.
    • What are the strengths and limitations of questionnaires in self-report techniques?

      Strengths:
      • Larger sample size
      • Easy distribution
      • Fewer socially desirable responses if anonymous

      Limitations:
      • Poor internal validity
      • Misunderstanding of questions
    • What are the differences between interviews and questionnaires?

      • Interviews allow clarification and observation of behavior.
      • Questionnaires are non-face-to-face and may lead to misunderstandings.
      • Interviews may have more social desirability bias.
    • What are the types of interviews and their evaluations?
      • Structured: High reliability, less rich data.
      • Unstructured: Low reliability, richer data but harder to analyse.
    • What is the difference between an experiment and a correlation?

      An experiment looks for a difference between groups, while a correlation looks for a relationship between variables.
    • What is content analysis?

      • A method for analyzing qualitative data.
      • Involves coding and categorizing data for analysis.
    • What are the aims and hypotheses in scientific processes?

      • The aim states the study's purpose; the hypothesis is a testable prediction.
      • The IV and DV must be operationalized: given specific, measurable definitions.
    • What are the controls in scientific processes?

      • Randomisation: Reduces bias, increases generalizability.
      • Counterbalancing: Removes order effects.
      • Standardisation: Controls extraneous variables, increases reliability.
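A minimal sketch of one of the controls above, counterbalancing (the participant names and two-condition setup are assumed for illustration): half the participants complete the conditions in one order and half in the reverse order, so practice and fatigue effects cancel out across the sample.

```python
def counterbalance(participants, conditions=("A", "B")):
    """Assign half the participants order A->B and the other half B->A,
    so that order effects are balanced across conditions."""
    orders = {}
    for i, participant in enumerate(participants):
        # Alternate the condition order for every other participant.
        order = conditions if i % 2 == 0 else conditions[::-1]
        orders[participant] = order
    return orders

assignments = counterbalance(["p1", "p2", "p3", "p4"])
print(assignments)
# p1 and p3 complete A then B; p2 and p4 complete B then A.
```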
    • What are demand characteristics and investigator bias?

      • Demand characteristics: Participants alter behavior based on perceived study aims.
      • Investigator bias: Researcher unintentionally influences outcomes.
    • What are the sampling methods and their evaluations?
      • Volunteer: Large sample, unrepresentative.
      • Opportunity: Easy, unrepresentative.
      • Random: More representative, still potential bias.
      • Systematic: Representative, still potential bias.
      • Stratified: Most representative, practically harder.
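Three of the sampling methods above can be sketched in Python (the population of 100 people and the two age strata are assumed purely for illustration):

```python
import random

population = [f"person{i}" for i in range(100)]

# Random: every member has an equal chance of selection.
random_sample = random.sample(population, 10)

# Systematic: take every nth member from an ordered list.
n = len(population) // 10
systematic_sample = population[::n]

# Stratified: sample each subgroup in proportion to its size in the population.
strata = {"under_30": population[:40], "over_30": population[40:]}
stratified_sample = [
    member
    for group in strata.values()
    for member in random.sample(group, len(group) * 10 // len(population))
]

print(len(random_sample), len(systematic_sample), len(stratified_sample))
```

The stratified sample here draws 4 from the 40-person stratum and 6 from the 60-person stratum, mirroring the deck's point that stratification is the most representative method but requires more work (you must know the strata in advance).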
    • What are the experimental designs and their evaluations?

      • Repeated Measures: No individual differences, order effects.
      • Independent Measures: No order effects, individual differences.
      • Matched Pairs: Fewer individual differences, still some.
    • What are the BPS ethical guidelines?

      • Right to Withdraw: Participants can leave or remove data.
      • Informed Consent: Participants must be informed about the study.
      • Confidentiality: Anonymity of participants must be maintained.
      • Deception: Allowed if it does not harm participants and is necessary for validity.
    • What is the purpose of pilot studies in scientific processes?

      To test the feasibility and design of a study before the main research.
    • What are the four decisions a reviewer can make during the peer review process?

      Accept, accept with amendments, reject but with resubmission, outright reject.
    • What does a reviewer critique during the peer review process?

      Validity of the science, scientific errors, design issues, and importance of findings.
    • What is the definition of reliability in research?

      Reliability refers to consistent results obtained from a study.
    • How can reliability be improved in experiments?
      By standardising procedures, then using test-retest methods to check that results are consistent across different occasions.
    • What is the difference between interval data and ordinal data?

      Interval data has fixed, equal intervals between scores, while ordinal data is ranked with no guarantee of equal intervals between ranks.
    • What does validity refer to in research?
      Validity refers to whether the results measure what they intend to measure.
    • How can validity be improved in experiments?

      By using double-blind procedures to reduce bias and demand characteristics.
    • What are the key features of a scientific process?
      • Theory Construction: Developing theories based on observations.
      • Peer Review: Validating research through external critique.
      • Ethical Guidelines: Ensuring participant rights and welfare.
      • Reliability and Validity: Ensuring consistent and accurate results.