ACE behaviors (Attention, Comprehension, and Effort)
administration behaviors
-define how an assessment should be administered and scored.
alternate form
- two equivalent forms of the same test are administered to the same group, and the two sets of scores are correlated.
Split half
- a single test is administered to one group of children, then the items are divided into two halves (forms) and the half-scores are correlated.
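The split-half procedure can be sketched as follows, using made-up 0/1 item scores and an odd/even item split (the split rule and the data are assumptions). The final Spearman-Brown step is a standard companion to split-half correlation that estimates full-length-test reliability, though it is not mentioned in the notes above:

```python
import numpy as np

# rows = students, columns = test items (hypothetical 0/1 scores)
scores = np.array([
    [1, 1, 0, 1, 1, 0, 1, 1, 1, 0],
    [1, 0, 0, 1, 0, 0, 1, 0, 1, 0],
    [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],
    [0, 1, 0, 0, 1, 0, 0, 1, 0, 0],
    [1, 1, 1, 0, 1, 1, 1, 0, 1, 1],
])

# Divide the single test into two forms: odd items vs. even items
odd_half = scores[:, 0::2].sum(axis=1)
even_half = scores[:, 1::2].sum(axis=1)

# Correlate the two half-test scores
r_half = np.corrcoef(odd_half, even_half)[0, 1]

# Spearman-Brown correction: estimated reliability of the full-length test
r_full = (2 * r_half) / (1 + r_half)
print(round(r_half, 3), round(r_full, 3))
```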
concurrent validity
- statistically measures the relationship between a new instrument and an existing instrument with established validity that assesses the same attribute.
construct validity
- concerns the degree to which a test measures the theoretical construct (e.g., an ability or trait) it is intended to measure.
Criterion (predictive) Validity
- examines the degree to which one or more test scores can be used to predict performance on a future related event.
Content Validity
- concerns collecting evidence to support that what the instrument measures reflects what it is designed to measure.
Face (Logical) Validity
- concerns whether a test appears, on the surface, to measure what it claims to measure.
validity
- the degree to which evidence supports the interpretation and use of the test results.
Reliability
- refers to the consistency of the results an instrument obtains over multiple administrations.
Norms
- help teachers interpret how their students are performing compared with other students who have taken the same assessment.
Objectivity
- focuses on the consistency of the results obtained by the same assessor across multiple administrations.
intra-class correlation
- uses all the data that have been collected and allows the variance to be examined from three sources:
1- students
2- trials
3- the interaction of students by trials
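A minimal sketch of that three-source variance breakdown, using a made-up students-by-trials score matrix. The final ICC formula is one common two-way, single-score form; both the data and that particular formula are assumptions, not necessarily the version used in the notes:

```python
import numpy as np

# rows = students, columns = trials (hypothetical scores)
x = np.array([
    [12.0, 13.0, 12.5],
    [15.0, 15.5, 15.0],
    [10.0, 11.0, 10.5],
    [14.0, 13.5, 14.5],
])
n, k = x.shape
grand = x.mean()

# Sums of squares for the three sources of variance
ss_students = k * ((x.mean(axis=1) - grand) ** 2).sum()
ss_trials = n * ((x.mean(axis=0) - grand) ** 2).sum()
ss_total = ((x - grand) ** 2).sum()
ss_interaction = ss_total - ss_students - ss_trials  # students-by-trials

# Mean squares (sums of squares divided by degrees of freedom)
ms_students = ss_students / (n - 1)
ms_interaction = ss_interaction / ((n - 1) * (k - 1))

# One common two-way, single-score ICC (consistency) form
icc = (ms_students - ms_interaction) / (ms_students + (k - 1) * ms_interaction)
print(round(icc, 3))
```

An ICC near 1.0 here would indicate that differences between students dwarf trial-to-trial noise, i.e., highly consistent scores.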
Pearson product-moment
- a correlation coefficient that quantifies the strength and direction of the linear relationship between two sets of scores (e.g., two trials of the same test).
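The product-moment computation can be written out directly, using hypothetical scores from two trials; the explicit formula (sum of products of deviations over the product of the deviation norms) shows where the name comes from:

```python
import numpy as np

# trial scores for five students (hypothetical data)
x = np.array([12.0, 15.0, 10.0, 14.0, 11.5])  # trial 1
y = np.array([12.5, 15.5, 10.5, 13.5, 12.0])  # trial 2

# product-moment form: sum of products of deviations from the means,
# normalized by the square roots of the summed squared deviations
r = ((x - x.mean()) * (y - y.mean())).sum() / np.sqrt(
    ((x - x.mean()) ** 2).sum() * ((y - y.mean()) ** 2).sum()
)
print(round(r, 3))
```

The same value comes from `np.corrcoef(x, y)[0, 1]`; the explicit form is shown only to make the "product of moments" structure visible.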
Scoring precision
Student performance variability
??
Test-retest reliability
- the same test is administered to the same group on two separate occasions, and the two sets of scores are correlated.
Intra- and inter-rater reliability
- intra-rater: consistency of the scores recorded by one tester across administrations
- inter-rater: consistency of the scores recorded by multiple testers
PURPOSE OF CONDUCTING ANY ASSESSMENT
- to make informed decisions and to collect accurate data, among other purposes.
TYPES OF VALIDITY
- content, construct, criterion (concurrent and predictive), and face (logical) validity.
THREE MAJOR FACTORS THAT AFFECT THE RELIABILITY OF AN ASSESSMENT: