Validity Flashcards

(39 cards)

1
Q

what questions does validity answer

A

-is the test or measure meaningful?
-does it measure what it was supposed to measure?

2
Q

reliable vs valid on dart board

A

-reliable: not on the bullseye but all darts are in the same place
-valid (+ technically reliable): all darts are on the bullseye, + also in the same place

3
Q

types of research design validity

A

-internal validity
-external validity

4
Q

internal validity

A

the degree to which a study’s design and protocols rule out alternative explanations for the results
* high rigor
* high repeatability
* highly controlled
* low risk of bias

5
Q

external validity

A

generalizability
* applicability
* transferability
* less controlled

6
Q

measurement validity

A

accuracy
-does the instrument measure what it is supposed to?

7
Q

face validity

A

subjective assessment of degree to which a test appears to measure what it is intended to
* most basic test of validity, not many papers use just this type anymore
* lacks operational definitions + comparison to reference standard
* does it pass the “eye test”?
* RED FLAG if something doesn’t pass this simple form of validity

8
Q

red flag if something doesn’t pass which type of validity

A

face validity
-most basic test of validity

9
Q

content validity

A

degree to which test items represent all relevant facets of the variable being measured
* theoretical concept, based on subjective opinion, so it can be difficult to measure directly (not as applicable)
* usually determined by a group of experts

10
Q

construct validity

A

degree to which a test measures the theoretical construct it is intended to measure
-different from content validity because it is more SPECIFIC + emphasizes an operational definition

11
Q

how do you assess construct validity?

A

-statistical analysis (simplest): comparison of means
-analysis (simple): correlation
-analysis (complex): factor analysis

12
Q

how to assess construct validity-
STATISTICAL ANALYSIS

A

-simplest
-compare means
-comparison of means via t-tests (p-value)
* should expect no significant difference between “like” groups (p > 0.05)
* should expect significant differences between “unlike” groups (p < 0.05)
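
A minimal Python sketch of this idea, using SciPy’s independent-samples t-test; all score values below are invented for illustration:

```python
from scipy.stats import ttest_ind

# Hypothetical questionnaire scores (illustrative numbers only)
group_a = [10, 11, 12, 13, 14]       # reference group
group_like = [9, 11, 12, 13, 15]     # "like" group: same mean, similar spread
group_unlike = [20, 21, 22, 23, 24]  # "unlike" group: clearly different scores

# Construct validity expectation: no difference between "like" groups...
p_like = ttest_ind(group_a, group_like).pvalue
# ...but a significant difference between "unlike" groups
p_unlike = ttest_ind(group_a, group_unlike).pvalue

print(f"like groups:   p = {p_like:.3f}")    # expect p > 0.05
print(f"unlike groups: p = {p_unlike:.3f}")  # expect p < 0.05
```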

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
13
Q

how to assess construct validity-
ANALYSIS (simple)

A

correlation
* convergent: strong association between “like” items or groups
* divergent (discriminant): weak or no association between “unlike” items or groups
* best when single measure exhibits both convergent + divergent

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
14
Q

how to assess construct validity-
ANALYSIS (complex)

A

factor analysis
* used to identify which items on a questionnaire are highly correlated
* convergent items are grouped together (loaded), while divergent factors stay separate from one another (ex: function + symptoms factors contain different items)
* cross-loading is a problem where one item on a questionnaire loads onto multiple factors, so we don’t know what is actually being measured
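
A simplified Python simulation of why items group onto factors (a real factor analysis would estimate loadings, but the grouping intuition is the same; the factor names and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # hypothetical respondents

# Two independent latent factors (e.g., "function" and "symptoms")
function = rng.normal(size=n)
symptoms = rng.normal(size=n)

# Items are noisy reflections of their underlying factor
func_item_1 = function + 0.4 * rng.normal(size=n)
func_item_2 = function + 0.4 * rng.normal(size=n)
symp_item_1 = symptoms + 0.4 * rng.normal(size=n)

# Items on the same factor correlate strongly (they "load together")...
r_within = np.corrcoef(func_item_1, func_item_2)[0, 1]
# ...items on different factors stay weakly correlated (separate factors)
r_across = np.corrcoef(func_item_1, symp_item_1)[0, 1]

print(f"same-factor items:      r = {r_within:.2f}")
print(f"different-factor items: r = {r_across:.2f}")
```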

15
Q

cross-loading

A

a problem where one item on a questionnaire loads onto multiple factors, so we don’t know what is actually being measured

16
Q

criterion-related validity

A

degree to which a measure of interest relates to a measure with established validity

17
Q

2 types of criterion-related validity

A

-concurrent validity
-predictive validity

18
Q

concurrent validity

A

comparing 2 measures that are taken at the same time

19
Q

predictive validity

A

reflects the degree to which the results of a specific test can predict a future outcome

19
Q

responsiveness

A

ability of a measure to detect change; clinical meaningfulness

20
Q

example of responsiveness

A

-scores for a certain measure decrease for those that worsen
-scores for that same measure increase for those that improve

21
Q

what 2 things are clinically practical

A

-criterion-related validity
-responsiveness

22
Q

quantifying validity-
we normally report “r” for

A

association (relationship)

23
Q

general guideline for validity with “r”

A

it is considered valid if r ≥ 0.35

24
Q

quantifying validity- it is common to do what to “r” for validity

A

convert to r^2 for validity (shared variance)

25
Q

general guideline for validity with “r^2”

A

it is valid if r^2 = 10% or more

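
A quick Python check of the two rules of thumb together, using a hypothetical correlation value:

```python
# Rules of thumb: r >= 0.35 and shared variance r^2 >= 10%
r = 0.35  # hypothetical validity coefficient
r_squared = r ** 2  # shared variance

print(f"r^2 = {r_squared:.4f} ({r_squared:.1%} shared variance)")  # 0.1225 (12.2%)
print("meets r guideline:  ", r >= 0.35)          # True
print("meets r^2 guideline:", r_squared >= 0.10)  # True
```

Note that an r of exactly 0.35 already clears the 10% shared-variance guideline (0.35² = 12.25%).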
26
Q

in simple correlation/regression, describe r and r^2

A

r and r^2 are EQUIVALENT
-r^2 is simply the square of r, so they carry the same information

27
Q

in multiple regression, describe r and r^2

A

r and r^2 are NOT equivalent
-R^2 reflects the combined contribution of several predictors, not the square of a single correlation

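
A Python sketch of this distinction using simulated data (predictor names and effect sizes are invented): in a simple regression R² equals r², but once a second predictor is added R² is no longer the square of any single correlation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 * x1 + 1.0 * x2 + rng.normal(size=n)  # y depends on both predictors

def r_squared(predictors, y):
    """R^2 from an ordinary least-squares fit (with intercept)."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r = np.corrcoef(x1, y)[0, 1]
# Simple regression: R^2 is just the square of r
print(np.isclose(r_squared([x1], y), r ** 2))  # True
# Multiple regression: R^2 exceeds the square of x1's correlation alone
print(r_squared([x1, x2], y) > r ** 2)         # True
```
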
28
Q

quantifying validity- comparing means effect size

A

* between groups: construct validity
* change over time: responsiveness

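
A between-groups effect size is commonly quantified with Cohen’s d (mean difference divided by the pooled standard deviation); a minimal Python sketch with hypothetical scores:

```python
from statistics import mean, stdev
from math import sqrt

def cohens_d(a, b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = sqrt(((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                     / (na + nb - 2))
    return (mean(b) - mean(a)) / pooled_sd

# Hypothetical scores for two "unlike" groups (construct validity check)
controls = [10, 11, 12, 13, 14]
patients = [13, 14, 15, 16, 17]

print(f"d = {cohens_d(controls, patients):.2f}")  # d = 1.90 (a large effect)
```
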
29
Q

between groups =

A

construct validity

30
Q

change over time =

A

responsiveness

31
Q

factor analysis

A

name factors based on items that load on the factor
* basically running an analysis to see if all of the items on one questionnaire load onto the same factor
* ex: all the items on the PCS load onto a Pain Catastrophizing factor

32
Q

can you have validity without reliability

A

NO- you cannot have validity without some degree of reliability

33
Q

something that is reliable is ____

A

consistent
-but not necessarily accurate

34
Q

cautions of validity

A

* even a well-designed tool could produce invalid data if it is used incorrectly
* criterion-related validity + responsiveness are objective, but they are only as strong as the comparison, or gold standard, that is used
* no cut-off for a coefficient to determine if it is “valid”; it’s more of a spectrum
* validity is multidimensional; there are multiple components going into validity

35
Q

can a well-designed tool produce invalid data

A

yes- if used incorrectly

36
Q

criterion-related validity + responsiveness are subjective/objective

A

objective
-but they are only as strong as the comparison, or gold standard, that is used

37
Q

is there a cut-off for a coefficient to determine if it is valid

A

no
-it’s more of a spectrum

38
Q

validity is ____

A

multidimensional
-there are multiple components going into validity