RESEARCH FOR PRACTICE Flashcards

(192 cards)

1
Q

define research

A

a systematic, focused enquiry to produce new knowledge and improve patient care

2
Q

define EVIDENCE BASED PRACTICE

A

the continuous, explicit and judicious use of current best evidence in patient care

3
Q

what are the components of the EBP triad approach?

A

best available research evidence
clinical expertise
patient values and preferences

4
Q

what are the 5 steps of EBP (AAAAE)?

A

ASK
ACQUIRE
APPRAISE
APPLY
EVALUATE

5
Q

how is EBP relevant to nursing?

A

meets NURSING COMPETENCY ASSESSMENT SCHEDULE (NCAS)
supports critical thinking and safe, high-quality practice

6
Q

what are the 3 components of a focus question?

A

who? - population
what? - intervention and comparison
what outcome?

7
Q

what is the reason to search for evidence?

A

to answer clinical questions, improve care and stay updated

8
Q

what are 3 types of searching (researching)?

A

informal
focussed
literature search

9
Q

what is informal researching?

A

alerts, social media

10
Q

what is focussed researching?

A

synthesised info sources for specific clinical issues

11
Q

what is literature search researching?

A

using databases and systematic searching

12
Q

define primary research

A

original studies
(interviews, observational studies, randomised control trials)

13
Q

define secondary research

A

synthesised from a range / multiple primary studies
(systematic reviews, meta-analysis)

14
Q

what are some examples of effective search sources?

A

PubMed, Google Scholar (databases)
federated search engines (TRIP Database)

15
Q

what are the 3 purposes of critiquing literature?

A
  1. worth reading and useful for practice?
  2. appraise primary research and review secondary research
  3. ensuring validity, reliability and applicability to context
16
Q

define the IMRaD format and what it is used for

A

research papers
I - INTRODUCTION
M - METHOD
R - RESULTS
D - DISCUSSION

17
Q

what are the 4 components of appraising primary research?

A

sample and setting
data collection
quantitative analysis
qualitative analysis

18
Q

what are the 4 components of appraising secondary research?

A

systematic reviews
meta-analysis
meta-synthesis
use of PRISMA guidelines

19
Q

what is a systematic review ?

A

a structured summary and appraisal of multiple primary studies on a focused question

20
Q

what is a meta-analysis?

A

quantitative synthesis
Combines numerical results from many studies to find an overall effect or result

21
Q

what is meta-synthesis?

A

qualitative synthesis
Combines findings from qualitative studies to build deeper understanding or new insights

22
Q

how does meta-analysis increase statistical power?

A

by combining the results of multiple studies

23
Q

what do secondary studies require to be effective?

A

structure

24
Q

what is a pro of using meta-analysis?

A

strengthens evidence of the study

25
what is the purpose of gaining evidence and researching?
answers clinical questions
26
what are the 6 NHMRC levels of evidence?
Level I = systematic reviews
Level II = randomised controlled trials
Level III-1 = cohort studies (non-randomised controlled trials)
Level III-2 = case-control studies
Level III-3 = case series, case reports
Level IV = expert opinions
27
what is the highest level of the NHMRC 'Levels of Evidence' hierarchy?
systematic reviews of randomised controlled trials, synthesising multiple high-quality trials
28
what is an RCT investigation?
participants randomly allocated to intervention or control group, minimising allocation bias and measuring intervention effectiveness
29
what are non-randomised control trial investigations?
participants are not strictly randomly allocated, causing a greater risk of bias
30
what is the lowest level of the NHMRC "Levels of Evidence Hierarchy" and why is it the lowest?
case series; no control group for comparison; testing before and after treatment means changes cannot be linked to the intervention; largely biased
31
define bias
consistent error in results and conclusions drawn
32
define validity
closeness to the truth; accuracy of measurement
33
define reliability
repeatability of results; consistency of measurement across time
34
what does bias threaten?
accuracy of results
35
how does random allocation affect bias?
reduces bias
36
how can systematic reviews be biased?
through flawed original studies
37
what does the NHMRC use a grade system to assess?
quality of evidence; strength of recommendations
38
what are 5 components of evaluating evidence for guidelines?
study design strengths and limitations; levels of evidence; consistency across studies; effect size; potential biases
39
which are more biased: primary or secondary sources?
primary
40
which of primary and secondary sources provide stronger evidence?
secondary
41
what is the purpose of qualitative data usage in research?
focused on exploring and understanding human experience; concerned with why (qualifying), rather than how many (quantifying); deeper understanding behind responses
42
describe 3 distinct differences of qualitative data compared to quantitative data
exploring, in-depth understanding; inductive reasoning (specific to general); strength: VALIDITY
43
describe 3 distinct differences of quantitative data compared to qualitative data
measurement, statistical analysis; deductive reasoning (general to specific); strength: RELIABILITY
44
what is the strength of qualitative data?
VALIDITY
45
what is the strength of quantitative data?
RELIABILITY
46
what are 3 ways to ensure validity in qualitative data?
1. triangulation 2. reflexivity 3. inter-rater reliability
47
define triangulation
using multiple data collection methods
48
define reflexivity
researchers critically reflecting on their influences
49
define inter-rater reliability
multiple analysis checks for consistency
50
what are the 3 techniques of non-probability sampling ?
purposive snowball convenience
51
define purposive sampling
participants targeted for relevant experience and characteristics
52
define snowball sampling
participants refer others to participate in study
53
define convenience sampling
participants that are easily accessible to the researcher
54
what are 2 data collection methods of qualitative data?
interviews observational studies
55
what are 2 types of interviews
semi-structured - the interviewer has a list of prepared questions
unstructured - the interviewer has only a general topic or a few guiding questions and lets the conversation flow naturally
56
what are 2 types of observation data collection methods?
active - the researcher actively takes part in the situation or activity being studied while observing what happens
passive - the researcher watches without participating, simply observing what happens
57
what are 2 types of data analysis methods for qualitative research?
content analysis thematic analysis
58
define content analysis
To count and categorise words, phrases, or concepts in the data. What is said and how often. More structured and sometimes quantitative (involves counting).
59
define thematic analysis
To identify and interpret patterns or themes in the data. What is said and what it means. More interpretive and qualitative (focuses on meaning).
60
what method is used in appraising qualitative research?
CASP 9 pt checklist
61
what is the CASP 9 pt checklist in appraising qualitative research?
1. problem important and well defined
2. qualitative approach appropriate
3. setting and participants suitably chosen
4. researcher bias addressed
5. data collection methods appropriate and detailed
6. analysis systematic with quality control
7. results credible and important
8. conclusions justified by results
9. findings transferable to other studies
62
what does qualitative research focus on?
focus on depth, meaning and validity
63
what does quantitative research focus on?
focus on measurement, numbers and reliability
64
what does quantitative data provide evidence for?
clinical decision making
65
what does quantitative research help to test?
cause, effect and relationships
66
what strength does quantitative research have over qualitative research?
results are stronger and more generalisable
67
what are 4 different research design methods of quantitative research?
experimental designs, intervention studies, observational studies, diagnostic accuracy studies
68
what are experimental designs best used for?
A research design or plan used to test cause-and-effect relationships by controlling and manipulating variables, best used with clinical trials
69
what technique is used in experimental design research?
P - population of interest
I - intervention that causes change
C - comparison (control group or placebo)
O - outcome (measurement of change with intervention)
70
what are intervention studies?
a type of experimental design where researchers introduce an intervention (a treatment, program, or change) to see its effect; uses randomised and non-randomised controlled trials
71
what are observational studies ?
To observe and describe what happens naturally — without changing anything.
72
what are diagnostic accuracy studies?
To test how well a diagnostic test identifies or rules out a disease or condition. The researcher compares test results with a “gold standard” (the best available method to confirm diagnosis).
73
what are the 4 levels of measurement for quantitative research?
nominal ordinal interval ratio
74
what is nominal ?
Data are put into categories that have no order. ex. gender, blood type
75
what is ordinal ?
ordered categories Data have a rank or order ex. Education level (primary, secondary, tertiary)
76
what is interval?
Data have equal intervals between values, but no true zero. ex. temperature (0 degrees doesn't mean no temperature)
77
what is ratio?
Data have equal intervals and a true zero (zero means “none”). ex. height, weight, age, test score
78
what is categorical data ?
Data that represent groups or categories, not numbers with real meaning. no averages, no fractions; discrete data; includes nominal and ordinal
79
what is continuous data?
Data that can take any numerical value within a range; numbers that can be measured, not just counted. more detailed, supports summary stats and fractions (ex. height, weight, age, temperature, time); includes interval and ratio
80
what is an example of categorical nominal?
yes or no
81
what is an example of categorical ordinal?
education level (primary, secondary, tertiary)
82
what is an example of continuous interval?
Temperature in °C or °F → The difference between 20°C and 30°C is the same as between 30°C and 40°C, but 0°C doesn’t mean “no temperature.”
83
what is an example of continuous ratio?
weight 0 kg = no weight.
84
define independent variable
intervention being changed
85
define dependent variable
outcome being measured
86
what are 4 types of reliability?
intra-rater, inter-rater, test-retest, alternative / parallel forms
87
what is intra-rater reliability?
The same person measures or rates the same thing more than once, and results are consistent.
88
what is inter-rater reliability?
Different people measure or rate the same thing and get similar results.
89
what is test-retest reliability?
The same test is given to the same group at two different times, and results are stable. repeated measurements over time
90
what is alternative / parallel forms reliability?
Two different versions of a test that measure the same concept give similar results. using different equipment rather than people, producing same results
91
what are 2 types of quantitative study types?
descriptive studies analytical studies
92
what are some examples of descriptive studies?
cross-sectional surveys (that only describe), case series, case report, case study
93
what is a descriptive study?
To describe what is happening: the characteristics of a population or phenomenon. Observe and report facts, with no comparison or testing of hypotheses. Usually summarises data (frequencies, percentages, averages).
94
what is an analytical study?
To analyse and find relationships or causes between variables; testing hypotheses and relationships. Compare groups and test hypotheses about causes or risk factors. Looks for associations (risk ratios, odds ratios, correlations).
95
what are some examples of analytical studies?
experimental studies, case-control studies, cohort studies, cross-sectional studies
96
what is mixed methods research?
the combination of both qualitative and quantitative measures together
97
what are 4 types of mixed methods designs?
convergent designs sequential designs multi-phase designs embedded designs
98
what are convergent mixed method designs? give example.
both qualitative and quantitative data is collected at the same time two data sets are analysed separately and then compared or merged ex. Survey + interviews on stress
99
what are sequential mixed method studies? (2) give example.
data collected in 2 phases, one after the other
explanatory - quantitative first, qualitative second
exploratory - qualitative first, quantitative second
ex. survey → interviews (explanatory); interviews → survey (exploratory)
100
what are multi-phase mixed methods designs? give example.
Combines multiple sequential or convergent studies over time; used for large populations and complex problems; several phases, each phase informing the next. ex. qualitative interviews to explore community needs: interviews → survey → intervention → follow-up
101
what are embedded mixed method designs?
One data type is primary and the other secondary; the secondary enhances or explains the primary and provides a supporting role for further analysis; interpretation rests on the primary, not the secondary. ex. A clinical trial testing a new drug (quantitative) includes patient interviews (qualitative) to explore experiences of side effects.
102
what is quantitising data transformation?
turning qualitative data into numerical form
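The card above can be sketched in code. A minimal Python example, assuming hypothetical interview data already coded into themes by the researcher (the theme labels are invented for illustration):

```python
from collections import Counter

# Hypothetical interview responses, already coded into themes by the researcher
coded_responses = [
    "anxiety", "pain", "anxiety", "fatigue",
    "pain", "anxiety", "pain", "fatigue", "anxiety",
]

# Quantitising: turn qualitative theme codes into numerical frequencies
theme_counts = Counter(coded_responses)
print(theme_counts["anxiety"])  # 4
```

Counting theme frequencies is the simplest form of quantitising; the resulting numbers can then feed descriptive statistics.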
103
what is qualitising data transformation?
turning numerical values into themes
104
define convergence in data
When the quantitative and qualitative results agree or tell the same story, showing consistency (confirming each other)
105
define divergence in data
When the quantitative and qualitative results do NOT agree or show different patterns; contradictions lead to deeper investigation
106
what are 5 things needing to be considered when designing research?
1 - dominance and weighting - deciding which method is primary
2 - equal quality - avoiding weak design in either method
3 - researcher expertise - being skilled in both quantitative and qualitative methods
4 - collaboration - different perspectives and disagreements
5 - resources - more time, effort, money required
107
define population
entire group to generalise results to
108
define target / theoretical population
larger group of interest
109
define study / accessible population
group researcher actually accesses
110
define sample
subset selected from population
111
define participant
individual chosen for study
112
define sampling frame
detailed list of all members of the accessible population from which the sample is drawn
113
what are 2 types of sampling ?
1 - probability sampling 2 - non-probability sampling
114
define probability sampling
random selection; more representative; used in quantitative research
115
what are 5 types of probability sampling?
simple random, systematic random, stratified random, cluster sampling, multi-stage sampling
116
what is simple random sampling?
Every member of the population has an equal chance of being selected.
117
what is systematic random sampling?
Select members at regular intervals from a list after a random start. Decide a sampling interval (k) and pick every kth person.
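The two methods above can be sketched with Python's standard library. The population and sample size are hypothetical; `random.sample` draws without replacement, and slicing with step `k` picks every kth member after a random start:

```python
import random

population = list(range(1, 101))  # hypothetical sampling frame of 100 people
n = 10                            # desired sample size

rng = random.Random(42)  # seeded so the sketch is reproducible

# Simple random sampling: every member has an equal chance of selection
simple_sample = rng.sample(population, n)

# Systematic random sampling: random start, then every kth member (k = N / n)
k = len(population) // n        # sampling interval k = 10
start = rng.randrange(k)        # random start position between 0 and k - 1
systematic_sample = population[start::k]

print(len(simple_sample), len(systematic_sample))  # 10 10
```

Both yield samples of size n; the systematic sample is evenly spaced through the frame, which is only safe when the list has no periodic ordering.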
118
what is stratified random sampling?
Divide the population into subgroups (strata) based on a characteristic, then randomly sample from each stratum. ensuring representation of key subgroups.
119
what is cluster sampling?
Divide the population into clusters (groups), randomly select some clusters, and include all members of chosen clusters.
120
what is multi-stage sampling?
Combines two or more sampling methods in stages. Often, clusters are selected first, then individuals within clusters. narrowing from larger to smaller groups
121
what is non-probability sampling?
non-random; less generalisable; used in qualitative research
122
what are 4 types of non-probability sampling?
convenience sampling snowball sampling purposive sampling quota sampling
123
what is convenience sampling?
Select participants who are easily available or accessible. may not be representative of population
124
what is purposive sampling?
Researcher selects participants based on specific characteristics or purpose of the study.
125
what is snowball sampling?
Existing participants refer or recruit others for the study. useful for hidden populations
126
what is quota sampling?
population divided into subgroups and selected based on specific characteristics (not random) Similar to stratified sampling but non-random
127
what is sampling error?
difference between sample results and true population value
128
what are 2 types of errors in sampling?
sampling error non-sampling error
129
what is a sampling error?
Happens because you only study a sample, not the whole population.
130
what is a non-sampling error?
Happens because of mistakes in the study, not because of sampling (data error/measurement error)
131
what happens if a sample size is too small?
Unreliable results, low power, not representative, Type II errors
132
what happens if a sample size is too big?
Waste of resources, ethical concerns, harder analysis
133
what influences choosing a sample size ?
population size, variability, margin of error, confidence level
134
what is a type I (α) probability error?
false positive (finding an effect which doesn't exist) ex. Concluding a new drug works when it actually doesn’t
135
what is a type II (β) probability error?
false negative (missing the real effect) ex. Concluding a new drug doesn’t work when it actually does.
136
what is power, in hypothesis testing?
The chance of detecting a real effect if it exists ex. A clinical trial tests a new drug. If the power is 0.8 (80%), there’s an 80% chance the study will detect a real effect of the drug if it exists.
137
what is power analysis?
determine the sample size needed to achieve a desired power. planning tool to decide sample size so your study is likely to detect a real effect. ex. You want to detect a medium effect of a new teaching method on test scores with 80% power at α = 0.05 → power analysis tells you you need 100 students in the study.
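Power can be made concrete with a worked example. A minimal sketch, assuming a hypothetical study of 20 coin flips where the null hypothesis is a fair coin (p = 0.5) and it is rejected when 15 or more heads appear; power is then the exact binomial probability of landing in that rejection region when the true probability is 0.7:

```python
from math import comb

# Hypothetical study: 20 coin flips, null hypothesis H0: p = 0.5 (fair coin).
# Decision rule: reject H0 if we observe 15 or more heads.
n, true_p, critical = 20, 0.7, 15

# Power = P(X >= 15) when the TRUE probability of heads is 0.7,
# i.e. the chance the study detects the real effect
power = sum(
    comb(n, k) * true_p**k * (1 - true_p)**(n - k)
    for k in range(critical, n + 1)
)
print(round(power, 2))  # 0.42
```

A power of about 0.42 is well below the conventional 0.8 target, which is exactly what a power analysis is meant to catch before the study runs: it would recommend a larger sample size.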
138
what is the aim of qualitative data sampling?
To select participants, cases, or sources that can provide detailed and meaningful information to get a deeper understanding to answer questions
139
what is saturation in qualitative data sampling?
The point in data collection when no new information, themes, or insights are emerging from additional data. You keep collecting data until everything starts repeating
140
define statistics
numbers, quantitative data
141
define dataset
original raw data, unanalysed
142
define descriptive statistics
A branch of statistics used to summarise, organise, and describe the main features of a sample dataset
143
define inferential statistics
make conclusions, predictions, or generalisations about a population based on data from a sample.
144
what are measures of dispersion?
how spread out the scores are - SD - min/max - range
145
what are the types of descriptive analysis?
measures of central tendency, measures of dispersion
146
what are measures of central tendency?
the central or typical value of a dataset - mean - median - mode
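Both groups of descriptive measures above can be computed with Python's standard `statistics` module. The pain scores below are hypothetical:

```python
import statistics

# Hypothetical pain scores (0-10 scale) from a sample of 9 patients
scores = [2, 4, 4, 5, 5, 5, 6, 7, 7]

# Measures of central tendency
print(statistics.mean(scores))    # 5
print(statistics.median(scores))  # 5
print(statistics.mode(scores))    # 5

# Measures of dispersion
print(min(scores), max(scores))            # min/max: 2 7
print(max(scores) - min(scores))           # range: 5
print(round(statistics.stdev(scores), 2))  # sample SD: 1.58
```

These summarise the sample only (descriptive statistics); drawing conclusions about the wider population would need inferential statistics.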
147
what does descriptive statistics only describe ?
sample
148
what does inferential statistics only describe?
applicability to population
149
what are 3 assumptions in statistics ?
equality normality linearity
150
define equality as an assumption of statistics
The spread (variance) of scores in different groups should be approximately equal (comparable) ex. Comparing test scores between two classes — the variation in scores should be similar in both classes.
151
define normality as an assumption of statistics
The data should be approximately normally distributed — forming a bell-shaped curve when plotted. ex. Heights of adult men are roughly normally distributed; most are around the average, few are extremely tall or short.
152
define linearity as an assumption of statistics
There should be a straight-line relationship between two variables when using correlation or relationship analysis. ex. The relationship between hours studied and exam scores — as study hours increase, exam scores increase roughly in a straight-line pattern.
153
what are two types of hypothesis testing?
null hypothesis alternate hypothesis
154
define null hypothesis in hypothesis testing
A statement that there is no effect, no difference, or no relationship between variables in the population. ex. A new drug has no effect on blood pressure.
155
define alternate hypothesis in hypothesis testing
A statement that there is an effect, difference, or relationship between variables in the population. ex. A new drug lowers blood pressure
156
what is a P value?
Probability of observing data if the null hypothesis (ex. drug has no effect) is true
157
if a p value is less than 0.05 ...
the result is statistically significant; the null hypothesis of no effect is rejected; a true effect is likely
158
if a p value is more than or equal to 0.05 ...
the result is not statistically significant; the null hypothesis is not rejected; the effect is not reliably demonstrated (this does not prove there is no effect)
159
if the null hypothesis is: H₀ = “drug has no effect.” and the p value = 0.02.. what does this indicate?
Only 2% chance of seeing this result if the drug had no effect drug likely works
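The p value definition above can be made concrete with an exact calculation. A minimal sketch, assuming a hypothetical study of 20 coin flips that observes 14 heads and tests the null hypothesis that the coin is fair; the one-sided p value is the probability of 14 or more heads under that null:

```python
from math import comb

# Hypothetical data: 20 coin flips, 14 heads observed.
# H0 (null hypothesis): the coin is fair, p = 0.5.
n, observed = 20, 14

# One-sided p value: probability of a result at least this extreme IF H0 is true
p_value = sum(comb(n, k) for k in range(observed, n + 1)) / 2**n
print(round(p_value, 3))  # 0.058
```

Here p ≈ 0.058 ≥ 0.05, so despite 14/20 heads the null hypothesis of a fair coin is not rejected.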
160
define confidence interval
A range of values calculated from the sample data that is likely to contain the parameter with a certain level of confidence.
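A minimal sketch of computing such an interval, using hypothetical measurements and the common normal approximation (mean ± 1.96 standard errors for 95% confidence):

```python
import statistics
from math import sqrt

# Hypothetical sample of 10 measurements
data = [5, 7, 8, 6, 9, 7, 8, 6, 7, 7]

mean = statistics.mean(data)                   # sample mean = 7
se = statistics.stdev(data) / sqrt(len(data))  # standard error of the mean

# Approximate 95% CI: mean +/- 1.96 standard errors
# (a t critical value would be slightly wider and more exact for n = 10)
lower, upper = mean - 1.96 * se, mean + 1.96 * se
print(round(lower, 2), round(upper, 2))  # 6.28 7.72
```

A larger sample or a less variable one shrinks the standard error and narrows the interval, giving greater precision.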
161
what does a narrow confidence interval explain? (shorter lines)
greater precision (the estimate does not range over a large interval)
162
define minimum important difference
The smallest change in a treatment outcome that patients, clinicians, or researchers consider meaningful or important; what makes treatment worthwhile
163
what does the MID actually help with?
designing clinical trials: determining sample size and interpreting results.
164
if a pain scale ranges from 0-10, the MID is 2 (considered clinically meaningful by patients), and a new drug reduces pain by 1, what does this imply?
might be statistically significant but not meaningful in practice
165
define heterogeneity
The variation or differences in results, characteristics, or effects among studies, participants, or data points; inconsistency among studies or data
166
what does High heterogeneity mean
studies give different results
167
what does low heterogeneity mean
studies give similar results
168
what is ethics?
ideal human behaviour, helping to distinguish right from wrong
169
what is the role of the NHMRC?
provides ethical framework
170
what counts as human research?
any study involving people, data and biological material
171
define autonomy
respecting individuals' right to make their own decisions and actions; informed consent and voluntary participation
172
define beneficence
benefit humankind; risk must always be minimised and justified; act in ways that benefit participants, maximise good, and minimise harm
173
define justice
treated fairly; fair recruitment and distribution of benefits; avoid overburdening groups and exploiting vulnerable populations
174
define non-maleficence
to avoid and minimise harm; preventing physical, psychological, social and economic harm; approval depends on level of risk
175
define respect for human dignity
every participant should be treated with honour, worth, and autonomy, acknowledging their rights, values, and choices. special protection measures for vulnerable populations
176
define confidentiality
the researcher's responsibility to participants; only collecting relevant information; ensuring that information provided by participants is kept secret and only accessible to authorised researchers
177
define privacy
right for participant to choose what happens with the data collected The participant’s right to control access to their personal information and body.
178
define anonymity (confidentiality)
Participants’ identities are not collected or cannot be linked to their data. no identity collected, de-identifying documents
179
what is the role of the human ethics committee
ensuring compliance with the National Statement; responsibility for reviewing research applications; membership - academics, lawyers, health professionals etc.
180
what are the things that we need to be aware of in evaluating ethical research
risk-benefit balance, justice in recruitment, informed consent adequacy, confidentiality protection
181
what are the 7 ethical principles of research?
autonomy, beneficence, confidentiality, veracity, justice, non-maleficence, respect for human dignity
182
define veracity
telling the truth; obligation to tell the truth in research; reporting findings honestly in publications; if deception is necessary, a debrief must be done
183
what are the 2 types of barriers to finding and using evidence
individual barriers, organisational barriers
184
what are 4 individual barriers in finding and using evidence?
lack of physical ability, lack of confidence, lack of time, personal resistance (dislike)
185
what are 3 organisational barriers in finding and using evidence?
lack of resources and support, workplace culture devaluing research, lack of time
186
how can a workplace encourage implementation of EBP?
ensuring good evidence is made available; motivation - clinicians will adopt evidence if they see benefit
187
how can an organisation support EBP?
leadership fostering a culture of EBP; supportive resources, policy and decision making; clinical guidelines and integrated pathways
188
what must be included in EBP?
patients' preferences and values included in the EBP approach and decision making
189
when is evidence based practice used badly? (3)
using evidence without questioning the integrity and quality of the study; assuming population-wide evidence applies to individual patient needs; ignoring patient perspective and the significance of clinical judgement
190
what may be a challenge of guidelines in EBP? (3)
guidelines too narrow or broad; may be out of date; ignoring patient perspectives
191
what may be a practical / logistic issue with EBP?
too many guidelines; limited / expensive access; confusing and conflicting evidence
192
what are some solutions in overcoming barriers to EBP?
access, education, supportive leadership, motivation