Intro to Research Methods-Criminology – Flashcards

question
1. at least two comparison groups (experimental and control) 2. random assignment 3. stimulus/change (apply a variable that you change or implement a program)
answer
What are the common elements of experimental design?
question
1. empirical association 2. appropriate time order 3. non-spuriousness 4. identifying a causal mechanism 5. Specifying the context in which the effect occurs
answer
What are the 5 criteria of causation?
question
an explanation for some characteristic, attitude, or behavior of groups, individuals, other entities or events. (families, organizations, or cities)
answer
Cause
question
when variation in one phenomenon, an independent variable, leads to or results, on average, in variation in another phenomenon, the dependent variable.
answer
causal effect
question
a relationship between 2 variables that is not due to variation in a 3rd variable; variation in IV causes variation in DV. This is a criterion for determining a causal relationship between two variables.
answer
non-spuriousness
question
a relationship between two variables that is due to variation in a third variable
answer
spurious relationship
question
many social scientists believe that a causal explanation is not adequate until a causal mechanism is identified; they want to know what process or mechanism is responsible for the relationship between the IV and DV
answer
identifying a causal mechanism
question
some discernible means of creating a connection; A discernible process that creates a causal connection between two variables
answer
Mechanism
question
a criterion for establishing a causal relationship between two variables; the variation in the independent variable must occur before the variation in the dependent variable
answer
time order
question
a focus on idiographic causal explanation. a particular outcome is understood as a part of a larger set of interrelated circumstances
answer
context
question
relationships among variables that vary among geographic units or other social settings
answer
contextual effect
question
study in which data are collected at only one point in time; "snapshot"; most common form of analysis; simplest and cheapest, but cannot fully capture change (implying causal direction, but can't say for sure findings are consistent over time)
answer
cross-sectional design
question
A study in which data are collected that can be ordered in time; also defined as research in which data are collected at two or more points in time; measures characteristics at multiple time points; more powerful for measuring change; more costly & difficult
answer
longitudinal
question
Necessary criterion for establishing a causal effect; A change in the independent variable is correlated with a change in the dependent variable; if there is no association between the 2 variables, there cannot be a causal relationship
answer
association
question
People who all have experienced a similar event or common starting point
answer
cohort
question
(cohort study)- follow up samples are collected (at one or more times) from the same cohort
answer
event based design
question
Those who share a common period of birth (ex. Those born in the 1950s)
answer
birth cohorts
question
Those who have worked at the same place for about 5 yrs., about 10 yrs., and so on
answer
seniority cohorts
question
Freshman, Sophomores, Juniors, & Seniors
answer
school cohorts
question
aka event-based design (type of panel study in criminology)- A type of longitudinal study in which data are collected at two or more points in time from individuals in a cohort Panel study but w/ a group of people that shared some experience at the same time
answer
cohort study
question
following one single person over time (or very few cases); qualitative research; direct observation; spending a lot of time w/ one person; holistic view of each case; incorporates time but not in the systematic, precisely timed way of quantitative research
answer
case study
question
An experiment in which subjects are assigned randomly to an experimental group that receives the treatment or other manipulation of the independent variable and a comparison group that does not receive the treatment. Outcomes are measured in a post-test
answer
true experiments
question
Has an experimental group, control group, & is done over time (compare both post-tests; then if we have a pre-test, we look at whether there was a change in both groups & why); without a pretest, no way of knowing what caused the change; rate of change (sees which group changed more)
answer
elements of a true experiment
question
Same as the post-test but administered at an earlier time; a true experiment does not require a pre-test. In experimental research, this is the measurement of an outcome (dependent) variable prior to an experimental intervention or treatment (independent variable) • Examines differences in groups; not necessary but desirable (groups should be equal to start w/ if correctly randomly assigned), but we use pre-tests to measure change, test randomness, and examine conditions under which the stimulus had an impact (whether certain kinds of people responded differently to the stimulus)
answer
pre-test
question
In experimental research, this is the measurement of an outcome (dependent) variable after an experimental intervention or treatment (independent variable) • after intervention, measures differences
answer
post-test
question
In an experiment, the group of subjects that receives the treatment or experimental manipulation
answer
experimental group
question
In an experiment or study, a comparison group that receives no treatment
answer
control group
question
A procedure by which each experimental subject is placed in a group randomly. Useful for ensuring internal validity, not generalizability -matching/ceteris paribus
answer
random assignment
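The procedure this card describes can be sketched in a few lines of Python; the `random_assign` helper, the seed, and the group sizes are illustrative assumptions, not part of the source.

```python
import random

def random_assign(subjects, seed=None):
    """Place each subject in the experimental or control group at random,
    so the two groups are equivalent on average (ceteris paribus)."""
    rng = random.Random(seed)
    pool = list(subjects)
    rng.shuffle(pool)                  # every ordering equally likely
    half = len(pool) // 2
    return pool[:half], pool[half:]    # (experimental, control)

experimental, control = random_assign(range(20), seed=42)
print(len(experimental), len(control))  # -> 10 10
```

Because assignment ignores every subject characteristic, any pre-existing differences between the groups are due to chance alone — which is what supports internal validity, not generalizability.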
question
A procedure for equating the characteristics of individuals in different comparison groups in an experiment • Problem- poor replacement for randomization; individuals can be matched only on a few characteristics so unmatched & unknown differences between the experimental & control group may influence outcomes
answer
matching
question
Latin phrase meaning "other things being equal"
answer
ceteris paribus
question
A research design in which there is a comparison group that is comparable to the experimental group in critical ways, but subjects are not randomly assigned to the comparison and experimental groups; Done when we can't do a true experiment -field experiments, non-equivalent control group designs, before and after designs, panel designs, Ex post facto control group designs
answer
quasi-experimental designs
question
A study conducted in a real-world setting
answer
field experiments
question
A quasi-experimental design in which there are experimental and comparison groups that are designated before the treatment occurs but are not created by random assignment. Two approaches: individual matching, aggregate matching
answer
non-equivalent control group designs
question
A quasi-experimental design consisting of before-after comparisons involving the same variables and sometimes the same groups, but sometimes these designs may even include different groups in the pretests and posttests; no control group & requires many observations • Ex.- rape incident rates (measured by how often they are reported) with time as the predictor variable, so basically measuring rates across time; somewhere in time we implement some change (like creating a rape crisis hotline or a series of help centers), then we look at rates beforehand and shortly after the intervention (hopefully we would see a decline); sort of a pre-test and post-test (like an interrupted time series)
answer
before-and-after designs
question
matching individuals characteristics in each group
answer
individual matching
question
matching based on group characteristics
answer
aggregate matching
question
opposite of time series; a lot of cases but small # of time points; exact same cases over time (ex.- same cities over time)
answer
panel designs
question
does not qualify as a quasi-experimental design because comparing subjects with themselves at just one earlier point in time does not provide an adequate comparison group; A type of longitudinal study in which data are collected from the same individuals—the panel—at two or more points in time • Sample (panel) is drawn from a population at time 1 & data are collected; as time passes, some panel members become unavailable for follow-up & the population changes. At time 2, data are collected from the same people as time 1 (the panel), except for those who cannot be located
answer
fixed sample panel designs
question
A quasi-experimental design consisting of several pretest and post-test observations of the same group • Repeated cross-sectional design
answer
repeated measures panel designs
question
(trend studies)- A longitudinal study in which data are collected at two or more points in time from different samples of the same population
answer
repeated cross-sectional design
question
1. Expense & attrition 2. Subject fatigue
answer
why are panel designs a challenge?
question
(aka trend studies)- have small # of cases to see how closely 2 matched things are; how closely they trend together; has a lot of different time points • Projected trend is compared with the actual trend of the DV after the intervention; A substantial disparity b/t the actual & projected trend is evidence that the intervention had an impact; A quasi-experimental design consisting of many pretest and post-test observations of the same group
answer
time series designs
question
A non-experimental design in which comparison groups are selected after the treatment or program has occurred Similar to nonequivalent control group designs but does NOT meet the criteria for quasi-experimental designs
answer
ex post facto control group designs
question
(are we measuring what we think we are measuring, or are things in our study affecting the outcome?); a lot of these are solved by random assignment
answer
what are "threats to validity"?
question
aka causal validity. Threats: selection bias, endogenous change, testing, maturation, regression, attrition, regression to the mean, external events/history effects, contamination, compensatory rivalry, demoralization, diffusion, Hawthorne effect, treatment misidentification, staff expectancies, generalizability
answer
internal validity
question
When the subjects develop or change during the experiment as part of an ongoing process independent of the experimental treatment; A source of causal invalidity that occurs when natural developments or changes in the subjects, independent of the experimental treatment itself, account for some or all of the observed change from the pretest to the post-test
answer
endogenous change
question
When characteristics of the experimental & comparison group subjects differ; a source of internal (causal) invalidity that occurs when the characteristics of experimental and comparison groups are not equivalent • Groups differ from the beginning; random assignment helps this
answer
selection bias
question
Subjects may learn something or be sensitized to an issue in the pretest &, as a result, respond differently when asked the same questions on a post-test o The pre-test can have an effect; if the test is highly controlled, it may not be found in real-life outcomes (may operate differently in an experiment vs. real life)
answer
testing
question
Subjects may age or gain experience or grow in knowledge, as a part of natural maturational experience & thus respond differently on the post-test; subjects change over time o Could be helped by having a control group & random assignment
answer
maturation
question
Cyclical or Episodic changes that result in different post-test scores; change occurs because of naturally occurring fluctuations o Should be considered whenever subjects are selected b/c of their initial extremely high or low values on the outcome variable
answer
regression
question
When people drop out of a study; can make groups uneven • Could be because of boredom, moving, change their mind, they die, have health problems, can't find (may be a transient population), etc.
answer
attrition
question
A problem that occurs in experiments when comparison groups become different because subjects are more likely to drop out of one of the groups for various reasons; varies by group; attrition is caused by which group the person is in o Not helped by random assignment
answer
differential attrition
question
Extreme cases move to the middle over time • Ex.- binge drinking; the ones with the most extreme behavior could only decline b/c they were at such a high rate; appears as change but didn't happen because of the study o Solution- don't only select extreme cases, or make sure you have equal amounts of extremes by random assignment or matching
answer
regression to the mean
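The artifact this card describes can be demonstrated with a small simulation; the trait-plus-noise model and all of the numbers below are illustrative assumptions, not from the source.

```python
import random

rng = random.Random(1)

# Model each observed score as a stable trait plus transient noise.
trait = [rng.gauss(50, 10) for _ in range(10_000)]
time1 = [t + rng.gauss(0, 10) for t in trait]   # pre-test scores
time2 = [t + rng.gauss(0, 10) for t in trait]   # post-test, NO treatment

# Select only the extreme cases: the top 10% at time 1.
cut = sorted(time1)[-1000]
extreme = [i for i, s in enumerate(time1) if s >= cut]

mean1 = sum(time1[i] for i in extreme) / len(extreme)
mean2 = sum(time2[i] for i in extreme) / len(extreme)
print(mean1 > mean2)  # the extreme group drifts back toward the mean
```

Even with no intervention at all, the group picked for its extreme pre-test scores looks "changed" at time 2 — exactly why the card warns against selecting only extreme cases.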
question
When something occurs during the experiment, other than the treatment that influences outcome scores; A source of causal invalidity that occurs when something other than the treatment influences scores on the post-test. Also called external events
answer
external events/history effects
question
When either the experimental group or the comparison group is aware of the other group & is influenced in the post-test as a result; A source of causal invalidity that occurs when either the experimental or the comparison group is aware of the other group and is influenced in the post-test as a result
answer
contamination
question
aka John Henry effect; A type of contamination in experimental designs that occurs when control group members perceive that they are being denied some advantage that the experimental group members are getting and increase their efforts by way of compensation
answer
compensatory rivalry
question
Control group doesn't think they should try now b/c they know they are being compared
answer
demoralization
question
groups share info
answer
diffusion
question
Named after a famous productivity experiment at the Hawthorne electric plant; A type of contamination in experimental designs that occurs when members of the treatment group change in terms of the dependent variable because their participation in the study makes them feel special; when people behave differently because they know they are being watched (change due to observation, simply due to being in an experiment) o Having a control group, large sample size, & random assignment helps
answer
hawthorne effect
question
When variation in the independent variable (the treatment) is associated w/ variation in the observed outcome, but the change occurs through a process that the researcher has not identified; problem that occurs in an experiment when the treatment itself is not what causes the outcome, but rather the outcome is caused by some intervening process that the research has not identified and is not aware of
answer
treatment misidentification
question
idea that when people administering program (knowing who is in which group) may treat members differently through expectations (maybe not consciously though); Convey expectations, self-fulfilling prophecy
answer
staff expectancies
question
An experimental method in which neither the subjects nor the staff delivering experimental treatments know which subjects are getting the treatment and which are receiving the placebo -this is how staff expectancies are corrected
answer
double blind procedures
question
source of treatment mis-identification that can occur when subjects receive a treatment that they consider likely to be beneficial and improve because of the expectation rather than because of the treatment
answer
placebo effect
question
The design components that are essential for a true experiment and minimize the threats to causal (internal) validity also make it more difficult to achieve sample generalizability, or the ability to apply the findings to a clearly defined, larger population.
answer
generalizability
question
Subjects who can be recruited for a laboratory experiment, randomly assigned to a group, and kept under carefully controlled conditions for the study's duration are often not a representative sample of any large population of interest. In fact, most are recruited from college populations. o Not only do the characteristics of the subjects determine the generalizability of the treatment; the setting for the experiment must also be considered. Field experiments are likely to yield findings that are applicable to broader populations than lab experiments
answer
sample generalizability
question
(Cross-population Generalizability)- The applicability of a treatment effect (or non-effect) across subgroups within an experiment or across different populations, times, or settings; Can results be generalized across places, groups, programs, etc. o Interaction of testing & treatment -solved by replication
answer
external validity
question
A variation of the problem of external validity occurs when the experimental treatment is effective only when particular conditions created by the experiment occur
answer
interaction of testing and treatment
question
A type of experimental design that combines a randomized pretest-posttest control group design with a randomized posttest-only design, resulting in two experimental groups and two comparison groups
answer
solomon four-group design
question
because the environment is real
answer
why do field experiments have high external validity?
question
o Statistical control- A technique used in non-experimental research to reduce the risk of spuriousness. One variable is held constant so the relationship between two or more other variables can be assessed without the influence of variation in the control variable
answer
causality in non-experimental designs
question
o Deception- used in social experiments to create more "realistic" treatments, often within the confines of a laboratory. o Distribution of benefits- An ethical issue about how much researchers can influence the benefits subjects receive as part of the treatment being studied in field experiments
answer
protection of subjects
question
o Does the indicator produce the same results every time? (if it does change over time, is it b/c it really changed or b/c they don't understand how to answer, so there are 2 different answers over time that just look like change) o 4 types- stability, representative, internal consistency, equivalence reliability
answer
reliability
question
do research again in different times, treatments, or groups
answer
what does replication mean?
question
high internal validity and low external validity; sometimes too controlled and not applicable to real world
answer
controlled experiments have what?
question
reliable across time • Test-retest: making multiple measurements to make sure things didn't change over time; problem- people may remember answers from last time o Parallel forms method to check this (it assumes we have multiple indicators of the same thing but can be separated so that they can be randomized)
answer
stability
question
reliability across subgroups (different classes, racial groups, genders, etc.) • Guarding against this by using a pilot study
answer
representative
question
reliability w/in multiple indicators to see how closely related they are • Ex.- if you have 2 measures of self-control, you ask both of them to see if they are correlated • Split-halves method- you have items & you ask all of them, then randomly choose ½ & create a new score for each ½ • Tests of consistency (inter-item correlation)- tests correlations b/t multiple indicators (a bunch of correlations, basically) o Cronbach's Alpha
answer
internal consistency
question
another summary measure of internal consistency, in some ways related to the avg. correlation of all possible item pairings but a little different; it basically gives you 1 number for all items (greater than .7 is reliable)
answer
Cronbach's Alpha
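The "one number for all items" this card mentions can be computed directly. A minimal sketch in Python, assuming scores are arranged one list per item; the `cronbach_alpha` helper and the sample data are illustrative, not from the source.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list of scores per item (same respondents in each list).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))"""
    k = len(items)
    respondents = list(zip(*items))              # one tuple per respondent
    totals = [sum(r) for r in respondents]       # each respondent's sum score
    item_var = sum(pvariance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

# Three items answered by four respondents on a 1-5 scale
scale = [[4, 5, 3, 5], [4, 4, 3, 5], [3, 5, 4, 5]]
print(round(cronbach_alpha(scale), 2))  # -> 0.84
```

By the card's rule of thumb, 0.84 is greater than .7, so this three-item scale would count as reliable.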
question
about the people making measurements; Inter-coder/inter-rater reliability
answer
equivalence reliability
question
o Conceptualize clearly- what do we mean by what we are measuring or our terms o Increase the level of measurement o Use multiple indicators (most common way to enhance reliability); recognize that there are multiple dimensions of a concept & make sure to include all of them, or at least most o Use pretests & pilot studies o Use established measures • Problem- things change; some measures may not apply anymore o Continuously train staff
answer
how do we improve reliability?
question
are people coding results equally; did the people who made observations or did scoring have consistent answers
answer
inter-coder/inter-rater reliability
question
comparison. The value of cases on the dependent variable is measured after they have been exposed to variation on an independent variable. This measurement is compared with what the value of cases on the dependent variable would have been if they had not been exposed to the variation in the independent variable. The validity of causal conclusions rests on how closely the comparison group comes to the ideal counter-factual.
answer
what does causal explanation rely on?
question
association between the variables, proper time order, and non-spuriousness of the association. In addition, the basis for concluding that a causal relationship exists is strengthened by identification of a causal mechanism and the context for the relationship.
answer
what three criteria generally viewed as necessary for identifying a causal relationship?
question
no; "Correlation does not prove causation."
answer
Is association between two variables in itself sufficient evidence of a causal relationship?
question
to make comparison groups as similar as possible at the outset of an experiment to reduce the risk of spurious effects due to extraneous variables.
answer
Why do experiments use random assignment?
question
they use statistical controls. A variable is controlled when it is held constant so that the association between the independent and dependent variables can be assessed without being influenced by the control variable.
answer
How do non-experimental designs control for spuriousness?
question
longitudinal designs are usually preferred over cross-sectional designs for establishing the time order of effects. Longitudinal designs vary in terms of whether the same people are measured at different times, how the population of interest is defined, and how frequently follow-up measurements are taken. Fixed sample panel is best
answer
Why are longitudinal designs preferred?
question
the strongest test for the time order of effects, but they can be difficult to carry out successfully because of their expense as well as subject attrition and fatigue.
answer
Fixed sample panel designs provide what?
question
ethical and practical constraints
answer
what often prevents the use of experimental designs?
question
the most widely used data-gathering technique.
answer
survey methodology
question
behavior, attitudes, beliefs, opinions, characteristics, expectations, and self-classification
answer
what type of question are asked in surveys?
question
open ended or close ended questions
answer
What is question construction?
question
A- allows more in-depth analysis; covers all possible responses D- requires coding; relies on interpretation; irrelevant answers sometimes occur
answer
what are the advantages and disadvantages of open ended questions?
question
A-easily coded; provides uniformity of responses D- may omit important responses; must be mutually exclusive and exhaustive
answer
What are the advantages and disadvantages of close ended questions?
question
-jargon, slang, and abbreviations- use language found on TV or in newspapers--- use an 8th-grade vocabulary
-confusing phrases and vagueness- imagine ways of misunderstanding; use set values such as income and education; no indefinite words like 'regularly'
-emotional language- use neutral language; avoid prestige bias (ex- are you receiving government assistance vs. are you enrolled in the Welfare system?)
-loaded questions, such as using language to stimulate a response (ex- ...spend ever more money...)
-questions beyond the respondent's capabilities- don't use false premises, negative items or double negatives, and double-barreled questions (two questions in one but only one answer allowed)
answer
What should you avoid during survey construction?
question
attaching power based on the position of a person, thus swaying your answer
answer
Prestige Bias
question
-keep the questions short--- if the subjects have to study the questions, their stamina will be reduced
-reference a period for referenced behaviors ex: in the past 6 months
-make the questions either agreeable or disagreeable ex: to what extent do you support or oppose the health care plan vs. to what extent do you disagree...
answer
What should you do while constructing surveys?
question
fixed choice-close ended. this survey provides pre-formulated response choices that are circled or checked.
-make each response mutually exclusive-- no overlapping categories
-make each choice category exhaustive--- must allow all respondents to select an option
-use Likert-type response categories
answer
What to do in Fixed-Choice questionnaires?
question
a variable's attributes or values in which every case can be classified as having one attribute
answer
Exhaustive response
question
survey responses in which respondents indicate the extent to which they agree or disagree with statements
answer
Likert-Type response
question
fence sitting and floating. use filter questions
answer
What needs to be minimized during survey construction?
question
survey respondents who see themselves as being neutral on an issue and choosing the neutral response that is offered
answer
fence sitters
question
survey respondents who provide an opinion on a topic in response to a close-ended question that does not include a 'don't know' option but will choose 'don't know' if available
answer
floaters
question
a survey question used to identify a subset of respondents who then are asked other questions
answer
filter question
question
the unique combination of questions created in a survey by filter questions and contingent questions.
answer
skip patterns
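The interplay of filter and contingent questions that creates a skip pattern can be sketched as simple branching logic; the question names below are hypothetical, not from any real instrument.

```python
def administer(respondent):
    """Walk one respondent through a (hypothetical) two-question survey."""
    answers = {}
    answers["employed"] = respondent["employed"]   # filter question
    if answers["employed"] == "yes":
        # contingent question -- only the subset identified by the
        # filter question is ever asked this item
        answers["hours_per_week"] = respondent["hours_per_week"]
    return answers

print(administer({"employed": "no"}))
print(administer({"employed": "yes", "hours_per_week": 40}))
```

Respondents who answer "no" to the filter question skip the contingent item entirely; that branch structure is the skip pattern.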
question
identical response categories are assigned to multiple questions. the questions are placed one under the other forming a matrix with response categories along the top and a list of questions down the side. effective use of page space and time
answer
Matrix questions
question
depends on the type of study, always aim for 100%. anything below 70% is bad (according to the book). there is no hard and fast rule about what is acceptable because it varies. we want to make sure we get a large sample and avoid systematic bias. 70% or above is ideal but in some cases 50% is acceptable (in populations that share the majority of characteristics or with populations that are hard to reach)
answer
acceptable response rates
question
aka questionnaire
answer
interview schedule
question
should be guided by a well-defined inquiry and a definitively targeted population
answer
how should we maintain focus in survey construction?
question
if evidence from previous surveys indicates that these already-formulated questions provide a good measure of the concept, then why reinvent the wheel?
answer
What does it mean to build on existing instruments?
question
translation
answer
What should you consider during survey construction?
question
getting respondents' interpretation in greater detail
answer
Open ended questions are tools for what?
question
questions included in a questionnaire or interview schedule to help explain answers to other important questions
answer
interpretive questions
question
-each survey must be used with each person, not tailored to the specifics of a given conversation
-survey questions must be understood in the same way by people who are different
-you can't rephrase a survey question (it would technically be a different question from the others)
-remember survey respondents do not know you, so they cannot be expected to share your nuances of expression
answer
What are the rules to writing questions?
question
demographics
answer
What else should you ask on surveys?
question
pretest it with yourself or colleagues
answer
What should you do before giving out the questionnaire?
question
-have a descriptive title to indicate the overall topic and set the context of the survey
-order of items may affect answers, so randomize question order (may vary from researcher to researcher)
-in self-administered surveys, put interesting questions first and demographic ones later
-face-to-face- put non-threatening questions first and sensitive items later
answer
How should you organize surveys?
question
questions that are asked of only a subset of survey respondents
answer
contingency questions
question
matrix questions
answer
What form allows a systematic recording of particular features or qualitative data?
question
Variation in responses to a question that is caused by individuals' reactions to particular words or ideas in the question instead of by variation in the concept that the question is intended to measure
answer
idiosyncratic variation
question
Recoding response choices that were originally coded to reflect both favorable and unfavorable attitudes toward a phenomenon as indicative of either all favorable or all unfavorable, so the index is measuring the same thing
answer
reverse code
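The recoding this card describes is a one-line transformation; a minimal sketch, assuming a 1-5 Likert scale (the `reverse_code` helper name is illustrative).

```python
def reverse_code(score, low=1, high=5):
    """Flip a Likert score so unfavorably-worded items point in the
    same direction as the rest of the index before scores are summed."""
    return low + high - score

# An item worded in the opposite direction gets flipped first:
print([reverse_code(s) for s in [1, 2, 3, 4, 5]])  # -> [5, 4, 3, 2, 1]
```

After reverse coding, a high total on the index always means the same (all-favorable or all-unfavorable) attitude, so the index measures one thing.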
question
Ex. people may answer all strongly agree or all strongly disagree
answer
be aware of response sets
question
A composite measure of one concept created from a series of two or more questions
answer
scale
question
survey research, self-administered/mail, group administered surveys, telephone, face-to-face, electronic, mix-mode, and omnibus.
answer
types of surveys
question
Research in which information is obtained from a sample of individuals through their responses to questions about themselves or others
answer
survey research
question
A survey involving a mailed questionnaire to be completed by the respondent (you have no control this way)
Advantages- cheapest; possible w/ a single researcher; can offer anonymity; avoids interviewer bias; response rates can be high
Disadvantages- can also have a low response rate; can take a long time (especially w/ mail surveys; if people take their time and an outside event influences them, their answers may differ from people who responded more quickly); cannot control conditions of completion; no clarification; respondents can answer weeks apart
Done in person, individually, or in groups; can be done electronically (ex.- Qualtrics or SurveyMonkey)
Computer-assisted self-interviewing (CASI)- a system within which respondents interact with a computer-administered questionnaire by using a mouse and following audio instructions delivered via headphones
answer
self-administered/mail
question
A survey that is completed by individual respondents who are assembled in a group
answer
group administered
question
A survey in which interviewers question respondents over the phone and then record their answers
Advantages- 95% of people have phones; quick & efficient; response rates can be high; cheaper than face-to-face; can control for question order; can clarify questions • Random digit dialing
Disadvantages- higher cost than mail; limited interview length; can't reach respondents w/out phones; reduces anonymity; potential for interviewer bias; difficult to use open-ended questions (sensitivity issue too)
-computer assisted telephone interviews
-computerized interactive voice response
answer
telephone
question
A survey in which an interviewer questions respondents and records their answers
Advantages- highest response rates; longest questionnaires; non-verbal communication (important for knowing when to prompt for an answer or back off); visual aids; can ask all types of questions; can evaluate the environment
Disadvantages- most expensive; geographically spread out; interviewer bias; less supervision of interviewers (they may have different attitudes or answer surveys themselves)
Interview schedule
answer
face-to-face
question
(CATI)- An interview in which data collection and data entry can occur concurrently and data entry error is minimized. Most large surveys are performed in this way.
answer
computer assisted telephone interview
question
(IVR)- Software that uses a touch-tone telephone to interact with people to acquire information from or enter data into the database
answer
computerized interactive voice response
question
- The survey instrument containing the questions asked by the interviewer for an in-person interview or phone survey
answer
what is an interview schedule?
question
- A survey that is sent and answered by computer, either through e-mail or on the web.
Web-based survey- A survey designed on a server; respondents are asked to visit a website and respond to the web questionnaire by checking answers.
• Advantages- relatively cheap; controlled by the researcher; flexible format; can target specific populations
• Disadvantages- no available method for drawing a random sample of e-mail addresses from any general population; excludes individuals who do not have access to the internet (or don't know how to access it); not generalizable to the population
answer
electronic survey
question
Surveys that are conducted by more than one method, allowing the strengths of one survey design to compensate for the weaknesses of another and maximizing the likelihood of securing data from different types of respondents; for example, non-respondents in a mailed survey may be interviewed in person or over the phone
answer
mixed-mode survey
question
versatility, efficiency and generalizability
answer
what are the attractive features of survey research?
question
A survey that covers a range of topics of interest to different social scientists cons- limited depth
answer
omnibus survey
question
one of the most successful omnibus surveys; includes more than 500 questions about background characteristics and opinions, with an emphasis on social stratification, race relations, family issues, law and social control, and morale.
answer
General Social Survey (GSS)
question
Important b/c it can influence responses.
o Split-ballot design- may help identify problems in question order; unique questions or other modifications in a survey administered to randomly selected subsets of the total survey sample, so that more questions can be included in the entire survey or so that responses to different question versions can be compared
answer
Why is question order important and how can it be helped?
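A split-ballot design boils down to random assignment of respondents to questionnaire versions. The sketch below (my illustration, not from the cards; respondent IDs and version labels are hypothetical) shows the assignment step; comparing answer distributions across versions afterward is what reveals order or wording effects.

```python
import random

def split_ballot_assign(respondent_ids, versions=("A", "B"), seed=42):
    """Randomly assign each respondent to one questionnaire version.

    Because assignment is random, any later difference in responses
    between versions can be attributed to the question order/wording
    rather than to who got which ballot."""
    rng = random.Random(seed)  # seeded for reproducibility
    return {rid: rng.choice(versions) for rid in respondent_ids}

assignments = split_ballot_assign(range(10))
print(assignments)
```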
question
o Major topic divisions within the questionnaire should be organized in separate sections, and each section should be introduced with a brief statement.
o Instructions should be used liberally to minimize respondent confusion. Instructions should explain how each type of question is to be answered (such as circling a number or writing a response) in a neutral way that is not likely to influence responses. Instructions also should guide respondents through skip patterns.
o The questionnaire should look attractive, be easy to complete, and have substantial white space. Resist the temptation to cram as many questions as possible onto one page.
o Response choices should be printed in a different font or format from the questions and should be set off from them.
o Response choices should be designated by numbers to facilitate coding and data entry after the questionnaire is completed
answer
What are the guidelines for questionnaire organization and layout?
question
The letter sent with a mailed questionnaire; it, along with the introductory statement read by interviewers in telephone or in-person interviews, is critical to the survey's success.
answer
What is a cover letter?
question
o Protection of respondents- any potentially harmful effects should be disclosed (in cover letter); confidentiality & anonymity; make effort to reduce emotional trauma Anonymity- Provided by research in which no identifying information is recorded that could be used to link respondents to their responses Confidentiality- must be kept in confidence what is discussed, can do this by using ID #'s for respondents
answer
Ethics in Survey research
question
problems can lie in sampling, measurement and overall survey design
answer
Why can surveys fail to produce useful results?
question
as an integrated whole, with each question and section serving some clear purpose complementing the others
answer
how should survey questionnaires or interview schedules be designed?
question
may help, but the presence of such options also affects the distribution of answers. Open-ended questions can be used to determine the meaning that respondents attach to their answers. The answers to any survey questions may be affected by the questions that precede them in a questionnaire or interview schedule.
answer
What happens when we include "don't know" choices?
question
credible, personalized, interesting, and responsible
answer
What should the cover letter be like for surveys?
question
fast turn-around and efficient sampling
answer
What do phone interviews using random digit dialing allow?
question
They allow longer and more complex interview schedules, monitoring of the conditions when the questions are answered, probing for respondents' understanding of the questions, and high response rates.
answer
what kind of advantages do in-person interviews have?
question
the strengths of one survey design to compensate for the weaknesses of another
answer
what do mixed-mode surveys allow for?
question
respondents are able to decline to participate. This option should be stated clearly in the cover letter or introductory statement. When anonymity cannot be guaranteed, confidentiality must be ensured.
answer
Why do most surveys pose few ethical problems?
question
- Not really formal research methods at all; field research derives from the accounts of European explorers & missionaries (from as early as the 1200s) traveling to new worlds and writing highly detailed descriptions
answer
What were the early beginnings of history in field research like?
question
Late 19th century anthropology mostly relied on travelers' accounts; these were often racist & ethnocentric b/c the authors were writing from their own viewpoint
answer
Academic field research
question
Usually credited w/earliest academic writing on ethnographic narrative; popularized field work; first writings on ethnography (the method); argued for immersion
answer
Bronislaw Malinowski (1914)
question
1910s-1930s: Life history approach; direct observation, informal interviews, documents & official records; largely descriptive; led by Robert Park. The theory most closely tied to this era is social disorganization theory.
1940s-1960s: Participant observation; less strictly descriptive; more theoretical analysis (not just observing, but starting to ask why the observations differ in different areas); involvement of the researcher in the field. Flourished until survey research took over; rejuvenated in the 70s & 80s b/c of new techniques (like ethnography & methodological writing)
answer
Chicago School
question
1st chair of the 1st dept. of sociology; he had been a journalist, which explains why the early work was largely descriptive
answer
Robert Park
question
Subjects' point of view; moving from observation to meanings; study of "common sense knowledge" (of the culture, group, etc. being observed); one way of doing ethnomethodological research in which the researcher engages with the culture
Ethnography- The study of a culture or cultures that some group of people share, using participant observation over an extended period of time
answer
Modern ethnography
question
grounded theory, naturalism, and ethnomethodology
answer
Qualitative paradigms
question
Systematic theory developed inductively, based on observations that are summarized into conceptual categories, reevaluated in the research setting, and gradually refined and linked to other conceptual categories Starts w/general research questions, open to change, theory evolves through observations
answer
grounded theory
question
Similar to the quantitative positivist approach: social reality is "out there" (one reality & we have to study it). The Chicago School of the 1940s is an example. Mostly descriptive (describing patterns & characteristics, not trying to explain relationships; very vivid descriptions); follows the naturalistic perspective in qualitative research
answer
naturalism
question
reality is socially constructed (attitudes of society, social constructions & why they exist & differ across groups); must "make sense" of interpretations; follows the interpretivist perspective
(These three approaches are not sharply distinct; they can all overlap.)
answer
ethnomethodology
question
o 1. Role of observer: full participant (takes more time & effort) or outsider
o 2. Portrayal of role: participants know you're observing them (overt observation); some participants know (sometimes you don't want the subjects to know, but you need to inform others, called gatekeepers); or nobody knows (covert observation). Ex. of gatekeeper- when studying students w/out their knowledge, you must inform the parents; the parents are the gatekeepers. Advantage of covert observation- less likely to influence the outcome. Disadvantage of covert observation- raises ethical questions b/c of informed consent
o 3. Portrayal of purpose: full explanation (what you're studying/interested in/who you are); partial explanation; no explanation (hard to do if not a participant observer); false explanation (could be unethical). Full explanation advantage- full disclosure; disadvantage- participants may not act normally
o 4. Duration of observations: single observation; several observations of limited duration; long-term multiple observations (ex. an ethnographic study where you are a part of the setting)
o 5. Focus of observations: narrow focus (topic of research is well defined); expanded focus (general topic as a whole); broad focus (think ethnographic research: little or no predetermined focus on a particular aspect)
answer
What are the 5 dimensions of observation?
question
participant observation, intensive interviewing, focus groups, computer assisted analysis
answer
what are the qualitative methods?
question
A qualitative method for gathering data that involves developing a sustained relationship with people while they go about their normal activities
answer
participant observation
question
A qualitative method that involves open-ended, relatively unstructured questioning in which the interviewer seeks in-depth information on the interviewee's feelings, experiences, and perceptions
answer
intensive interviewing
question
A qualitative method that involves unstructured group interviews in which the focus group leader actively encourages discussion among participants on the topics of interest
answer
focus groups
question
Uses special computer software to assist qualitative analyses through creation, application, and refinement of categories; tracing linkages between concepts; and making comparisons between cases and events
answer
computer-assisted qualitative data analysis
question
o Case studies- in-depth analysis; the case can be anything (a person, organization/group, law/policy, neighborhood)
o Field study- really a type of case study conducted "in the field" (natural environment); can be done on an organization, community, or gang; observation of interaction- interpreting interactions
o Ethnography- a refined type of case study; a more involved field study (takes the ethnomethodological view that we have to take on the view of others); cultural immersion (sometimes requires becoming part of what you're studying); can occur over years
o Case history- reconstruction of past events; sources of data include archived documents, personal accounts, and interviews; not done a lot in criminology, but when it is, usually done on policies or laws and how they came about (or were changed or done away with, and why, and their impact)
answer
What are the research strategies and definitions?
question
o Similarities- Systematically collect & analyze evidence (data is different but still systematic), importance of theory (qualitative are more likely to be inductive & quantitative are deductive but both involve theory) o Differences- Interpretive or conflict approaches, data intrinsically meaningful (observations made are meaningful not just a means to an end); subjective meanings (meaning can vary so interested in making sense of that instead of trying to code it into something); definitions, symbols, and descriptions
answer
What are the similarities and differences in qualitative and quantitative methods?
question
observations about natural behavior and artifacts that capture social life as it is experienced by the participants rather than in categories predetermined by the researcher.
answer
what does the collection of qualitative data emphasize?
question
a commitment to inductive reasoning; seeking not to test pre-formulated hypotheses, but to discover how & why people think/act in certain social settings
o Focus on primarily unstudied processes & unanticipated phenomena; study things that cannot adequately be understood w/a structured set of questions or within a highly controlled experiment
o An orientation to social context, to the interconnections b/t social phenomena rather than to their discrete features; the context of concern may be a program, an organization, a neighborhood, or a broader social context
o Focus on human subjectivity, on the meanings that participants attach to events & that people give to their lives
o Focus on the events leading up to a particular event or outcome instead of general causal explanations
o Reflexive research design; the design develops as the research progresses
o Sensitivity to the subjective role of the researcher
answer
what does exploratory research do?
question
o Progressive focusing- The process by which a qualitative analyst interacts with the data and gradually refines his or her focus
answer
inductive reasoning
question
o Reflexivity- A narrative provided by the researcher that offers a reflection on the process of research, including any obstacles encountered
answer
reflexive research design
question
A method used in case reports that clarifies the context and makes it possible for the reader vicariously to experience it
answer
thick description
question
A sampling method recommended for field researchers by Glaser and Strauss (1967). A theoretical sample is drawn in a sequential fashion, with settings or individuals selected for study as earlier observations or interviews indicate that these settings or individuals are influential.
answer
theoretical sampling
question
(ESM)- A technique for drawing a representative sample of everyday activities, thoughts, and experiences. Participants carry a pager and are beeped at random times over several days or weeks; upon hearing the beep, participants complete a report designed by the researcher.
answer
experience sampling method
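The experience sampling method's random beeps can be simulated as random draws of clock times across the study days. The sketch below is my illustration (the waking-hours window and the counts are assumptions, not part of the method's definition):

```python
import random
from datetime import datetime, timedelta

def schedule_beeps(start, days, beeps_per_day, seed=7):
    """Draw random beep times for each study day, as in ESM.

    Assumes (hypothetically) that beeps fall in waking hours,
    9:00-21:00; each beep is a random minute in that window."""
    rng = random.Random(seed)  # seeded so the schedule is reproducible
    beeps = []
    for day in range(days):
        base = start + timedelta(days=day)
        for _ in range(beeps_per_day):
            minute = rng.randint(9 * 60, 21 * 60)  # minute of the day
            beeps.append(base + timedelta(minutes=minute))
    return sorted(beeps)

schedule = schedule_beeps(datetime(2024, 3, 1), days=3, beeps_per_day=4)
for beep in schedule:
    print(beep.strftime("%Y-%m-%d %H:%M"))
```

Because the times are random rather than fixed, participants cannot anticipate a beep, which is what makes the resulting sample of moments representative of everyday activity.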
question
A role in participant observation in which the researcher does not participate in group activities and is publicly defined as a researcher;
answer
complete observer
question
The changes in individual or group behavior that are due to being observed or otherwise studied
answer
reactive effect
question
A strategy that increases the reliability of observational data by using explicit rules that standardize coding practices across observers
answer
systematic observation
question
goal is to develop a comprehensive picture of the interviewees' background, attitudes, and actions, in their own terms— to "listen to people as they describe how they understand the worlds in which they live and work"
answer
intensive interview
question
The point at which subject selection is ended in intensive interviewing, when new interviews seem to yield little additional information
answer
saturation point
question
• Develop a plausible (and honest) explanation for yourself and your study.
• Maintain the support of key individuals in groups or organizations under study.
• Don't be too aggressive in questioning others. Being a researcher requires that you do not simultaneously try to be the guardian of law and order.
• Don't fake social similarity with your subjects. Taking a friendly interest in them should be an adequate basis for developing trust.
• Avoid giving and receiving monetary or other tangible gifts, but do not violate norms of reciprocity. Living with other people, taking others' time for conversations, and going out for a social evening all create expectations and incur social obligations. Such small forms of assistance as an occasional ride to the store or advice on applying to college may strike the right balance.
• Be prepared for special difficulties and tensions if multiple groups are involved. It is hard to avoid taking sides or being used in situations of intergroup conflict
answer
maintaining relationships
question
Covert participation (complete participation)- A role in field research in which the researcher does not reveal his or her identity as a researcher to those who are observed. The covert participant has adopted the role of a "complete participant."
Cannot take notes or use any obvious recording devices
Cannot ask questions that will arouse suspicion
The role of covert participant is difficult to play successfully
Covert participants must keep up the act at all times while in the setting under study
answer
covert participation
question
- Research in which natural social processes are studied as they happen and left relatively undisturbed
answer
field research
question
Notes that describe what has been observed, heard, or otherwise experienced in a participant observation study; these notes usually are written after the observational session
answer
field notes
question
Brief notes that serve as memory joggers when writing actual field notes at a later time
answer
jottings
question
In field research, a credible sense of understanding of social processes that reflects the researcher's awareness of participants' actions as well as their words, and of what they fail to state, feel deeply, and take for granted
answer
tacit knowledge
question
o Social desirability- the participant responds in the way that is socially acceptable even if it is not how they feel/think
o Hawthorne effect- participants try to please/sabotage; people behave differently b/c they know they are being watched
o Role selection- sometimes participants start to take on the role you expect of them
answer
threats to validity
question
o Credibility (tacit knowledge)
o Transferability
o Dependability
o Confirmability
answer
what criteria establish the trustworthiness of qualitative research?
question
o Voluntary participation, subject well-being, identity disclosure, confidentiality, appropriate boundaries, and researcher safety are all necessary
answer
ethics in qualitative research
question
inductively, try to understand the social context and sequential nature of attitudes and actions, and explore the subjective meanings that participants attach to events. They rely primarily on participant observation, intensive interviewing, and, in recent years, focus groups.
answer
how do qualitative researchers tend to develop ideas?
question
Each role represents a different balance between observing and participating, which may or may not include public acknowledgment of the researcher's real identity. Many field researchers prefer a moderate role, participating as well as observing in a group but publicly acknowledging the researcher role.
answer
what roles may participant observers adopt?
question
developing and maintaining relations in the field, sampling, and recording and analyzing data.
answer
what strategies must field researchers develop?
question
. Detailed notes should be recorded and analyzed daily to refine methods and to develop concepts, indicators, and models of the social system observed.
answer
Why should notes be recorded and analyzed daily?
question
involve open-ended questions and follow-up probes, with the content and order of specific questions varying from one interview to the next.
answer
What do intensive interviews involve?
question
elements of participant observation and intensive interviewing. They can increase the validity of attitude measurement by revealing what people say when presenting their opinions in a group context.
answer
what do focus groups combine?
question
thick description and other qualitative techniques to provide a holistic picture of a setting or group
answer
what do case studies use?
question
a general explanation that develops in interaction with the data and is continually tested and refined as data collection continues.
answer
what does grounded theory connote?
question
concern voluntary participation, subject well- being, identity disclosure, confidentiality, appropriate boundaries, and researcher safety.
answer
what are the main ethical issues in field research?
question
ICPSR/NACJD, historical record analysis, content analysis, crime mapping, units of analysis, reliability and validity, social production of data, comparative research and triangulation
answer
What are the types of secondary data?
question
o ICPSR- Inter-University Consortium for Political & Social Research; data stored here primarily include surveys, official records, & official statistics
answer
ICPSR/NACJD
question
o Historical event research- Research in which social events of only one time period in the past are studied
answer
historical record analysis
question
o Identify a population of documents or other textual sources for study
o Determine the units of analysis
o Select a sample of units from the population
o Design coding procedures for the variables to be measured
o Test & refine the coding procedures
o Base statistical analyses on counting occurrences of particular items
answer
content analysis
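The final step above — counting occurrences of coded items — can be sketched in a few lines. This is an illustrative toy, not a real coding scheme: the documents, categories, and keywords below are hypothetical, and real content analysis would also test inter-coder reliability.

```python
import re
from collections import Counter

def code_documents(documents, categories):
    """Count keyword occurrences per coding category across a sample of
    documents -- the counting step of a simple content analysis.

    `categories` maps each category name to a list of lowercase keywords."""
    totals = Counter()
    for doc in documents:
        words = re.findall(r"[a-z']+", doc.lower())  # crude tokenizer
        for category, keywords in categories.items():
            totals[category] += sum(words.count(k) for k in keywords)
    return totals

# Hypothetical document sample and coding scheme
docs = [
    "Police reported a rise in violent crime downtown.",
    "The new program reduced fear of crime among residents.",
]
categories = {"crime": ["crime", "violent"], "policy": ["program"]}
print(code_documents(docs, categories))  # Counter({'crime': 3, 'policy': 1})
```

The resulting counts per category are what the statistical analysis is then based on.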
question
o Geographic information system (GIS)- The software tool that has made crime mapping increasingly available to researchers since the 1990s
o Hot spots- geographic areas with unusually high concentrations of crime, identified through crime mapping
answer
crime mapping
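Hot-spot detection in real GIS software is more sophisticated, but the core idea — binning incident coordinates into areas and flagging areas with high counts — can be sketched without any GIS library. The coordinates, cell size, and threshold below are all hypothetical illustration values.

```python
from collections import Counter

def hot_spots(incidents, cell_size=0.01, threshold=3):
    """Bin (lat, lon) incident points into square grid cells and flag
    cells whose incident count meets the threshold -- a crude stand-in
    for GIS hot-spot detection (real tools use kernel density, etc.)."""
    counts = Counter(
        (int(lat / cell_size), int(lon / cell_size))  # truncating bin
        for lat, lon in incidents
    )
    return {cell: n for cell, n in counts.items() if n >= threshold}

# Hypothetical incident coordinates: three clustered, one isolated
incidents = [(41.881, -87.623), (41.882, -87.624), (41.8815, -87.6235),
             (41.95, -87.70)]
print(hot_spots(incidents))
```

Only the cluster of three incidents survives the threshold; the isolated incident's cell is dropped.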
question
Research comparing data from more than one time period or more than one culture or country o Descriptive- Research in which social phenomena are defined and described o Analytic- Research that seeks to understand how national systems work and the factors related to their operations o Transnational- Explores how cultures and nations deal with crime that transcends their borders
answer
comparative research
question
surveys, official statistics, official records, and other historical documents, including written text or media representations (e.g., trial transcripts, newspaper articles, television shows).
answer
what are the four main types of secondary data?
question
we can improve our understanding of social processes when we make comparisons with other times and places.
answer
what is the central insight behind historical and comparative methods?
question
generally used to identify the spatial distribution of crime along with the social indicators such as poverty and social disorganization that are similarly distributed across areas (e.g., neighborhoods, census tracts).
answer
what is crime mapping usually used for?
question
Resources, raw materials, clients, and staff that go into a program
answer
inputs
question
The complete treatment or service delivered by the program
answer
program process
question
A descriptive or prescriptive model of how a program operates and produces effects
answer
program theory
question
The services delivered or new products produced by the program process
answer
outputs
question
The impact of the program process on the cases processed
answer
outcomes
question
Information about service delivery system outputs, outcomes, or operations that is fed back into the program's inputs
answer
feedback
question
Individuals and groups who have some basis of concern with the program
answer
stakeholders
question
o Evidence-based policies- Programs and policies that are based on a systematic review of all available evidence that assesses what works and what doesn't
Systematic review- A summary of prior research on a particular question that uses thorough, transparent procedures for locating, evaluating, and synthesizing the available evidence
Campbell Collaboration- An international research network that prepares and disseminates systematic reviews of social science evidence in crime and justice, education, and social welfare
answer
evidence-based policies
question
o Needs assessment (is the program needed?)- A type of evaluation research that attempts to determine the needs of some population that might be met with a social program
o Evaluability assessment (can the program be evaluated?)- A method to determine the possibility of a study being able to specifically identify the effects of a particular program within the available time and with the available resources
o Impact assessment (what's the program's impact?)- aka impact analysis; analysis of the extent to which a treatment or other service has an effect
o Process evaluation (how does the program operate?)- aka program monitoring; evaluators are often called on to document the extent to which implementation has taken place, whether the program is reaching the target individuals or groups, whether the program is actually operating as expected, and what resources are being expended in the conduct of the program
o Efficiency analysis (how efficient is the program?)- A type of evaluation research that compares program costs with program effects. It can be either a cost-benefit analysis or a cost-effectiveness analysis.
Cost-benefit analysis- A type of evaluation research that compares program costs with the economic value of program benefits
Cost-effectiveness analysis- A type of evaluation research that compares program costs with actual program outcomes
answer
program evaluation
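The distinction between cost-benefit and cost-effectiveness analysis is simple arithmetic, sketched below with hypothetical dollar figures (the numbers are invented for illustration only):

```python
def cost_benefit_ratio(program_cost, monetized_benefits):
    """Cost-benefit analysis: compare program costs with the economic
    (dollar) value of program benefits. A ratio > 1 means the monetized
    benefits exceed the costs."""
    return monetized_benefits / program_cost

def cost_per_outcome(program_cost, outcome_units):
    """Cost-effectiveness analysis: cost per unit of actual program
    outcome (e.g., dollars per offense prevented) -- the benefits are
    never converted to dollars."""
    return program_cost / outcome_units

# Hypothetical figures: a $200,000 program with $340,000 in monetized
# benefits that prevented 50 offenses
print(cost_benefit_ratio(200_000, 340_000))  # 1.7
print(cost_per_outcome(200_000, 50))         # 4000.0 dollars per offense
```

The hard part in practice is not the division but deciding what counts as a benefit and how to monetize it; cost-effectiveness analysis sidesteps monetization by leaving outcomes in their natural units.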
question
Do we care how the program gets results? Black box- A type of evaluation that occurs when an evaluation of program outcomes ignores, and does not identify, the process by which the program produced the effect
answer
Black box or program theory
question
encourage researchers to be responsive to program stakeholders
answer
stakeholder approach
question
evaluator forms a task force of program stakeholders who help to shape the evaluation project so that they are most likely to use its results
answer
utilization-focused evaluation
question
program participants are engaged with the researchers as co-researchers & help to design, conduct, & report the research
answer
participatory action research
question
- eliminates the professional researcher altogether in favor of a structured dialogue about needed changes among program participants themselves
answer
appreciative inquiry
question
emphasize the importance of researcher expertise & maintenance of some autonomy to develop the most unbiased & trustworthy program evaluation
answer
social science approach
question
attempt to cover concerns of both stakeholders & evaluators, as well as include stakeholders in the group from which guidance is routinely sought
answer
integrative approaches
question
descriptive or prescriptive and developed before or after an investigation of program process is completed
answer
a program theory can be either...
question
needs assessment, evaluability assessment, process evaluation (including formative evaluation), impact evaluation, and efficiency (cost-benefit) analysis.
answer
what are the 5 primary types of program evaluations?
question
because it may involve withholding desired social benefits.
answer
Why does evaluation research raise complex ethical issues?