Analytical Aptitude Competency Flashcards

(98 cards)

1
Q

What is the fundamental challenge of the analytical aptitude competency for HR professionals?

A

To change the basic way they interpret or assess events.

2
Q

Analytical Aptitude

A

Refers to examining an idea, a process, or an event with an open, objective, and inquiring mind. It is a critical skill in evidence-based decision making (EBDM), using sound data to hypothesize, assess, and select solutions.

3
Q

Data advocacy

A

Developing an inquiring mindset, learning what data drives the business and where it can be found, developing partnerships across the organization to promote EBDM, and modeling the skill of EBDM to the entire organization through the decisions HR makes and the plans of action it undertakes

4
Q

Data Gathering

A

Knowing what constitutes sufficient, credible, and objective evidence and being able to find it.

5
Q

Data analysis

A

Being able to organize data so that it reveals patterns and to analyze it to detect logical relationships.

6
Q

Evidence-Based Decision Making (EBDM)

A

Being able to apply the results of data gathering and analysis to make better business decisions.

7
Q

What do effective data advocates show?

A

That analysis does not exist for the sake of analysis - it is focused on making more informed decisions that minimize risk and maximize opportunities. Data advocates also help build a data-driven culture, encouraging EBDM throughout the organization from the bottom up.

8
Q

HR professionals should consider the six steps described in Ben Eubanks’ research for evidence-based decision making:

A
  1. Ask - When faced with a problem, translate the situation into a question that can then be answered through information gathering.
  2. Acquire - Gather information from varied sources.
  3. Appraise - Determine whether the evidence gathered is relevant, valid, reliable, accurate, complete, and unbiased.
  4. Aggregate - Combine and organize the data to prepare it for analysis. Determine the priority to be given to different types of information.
  5. Apply - See the logical connections within the data and with the issue. Use the data to draw conclusions, develop possible solutions, win sponsor support for a decision, and take action.
  6. Assess - Monitor the solution that has been implemented and objectively measure the extent to which the objectives have been attained.
9
Q

What are the ways Denise Rousseau and Eric Barends mention to become a better data-based HR practitioner and improve the competence of the HR function in evidence-based decision making?

A
  • Develop a questioning mind
  • Build fluency in the scientific literature for HR
  • Gather data on a continuous basis
  • Use evidence when communicating with stakeholders
  • Institutionalize the competency in the HR function
10
Q

Data sources

A

Comprise quantifiable data and qualitative data. HR professionals use different techniques to obtain the data they need: interviews, focus groups, surveys/questionnaires, and observation. Existing data can also be used.

11
Q

Quantitative Data

A

Consists of objective measurements that can be verified and used in statistical analysis

12
Q

Qualitative Data

A

Involves a subjective evaluation of actions, feelings, or behaviors. Measurements can be made by a third-party observer; can also be self-assessments; the data can be assigned numerical values (but these values don’t carry significance)

Qualitative data can include observations of steps in a process or of traits that are present in people who are rated as successful in a job. May be assessments of value or significance.

13
Q

Which is more important - qualitative or quantitative data?

A

Both types of data are important to HR professionals. The purpose of the research usually determines the type of data collected.

14
Q

Several questions you should consider before accepting data in a print or online publication:

A

Does the source have authority? A government agency, such as a labor department or ministry, or an academic institution is more credible than a blogger.

What are the source’s possible biases? Bias can lead to “cherry picking” data—including only the facts that support a certain position.

Are the sources for data used in a publication clearly cited? And are those sources reliable and accurate?

Are the facts relevant? For example, data about trends in one industry may not apply to all industries.

Is the data current? Some concepts are classic, but data is subject to variables that can weaken its relevance over time. For example, data from a recessionary period can be highly inaccurate in describing conditions in an expanding economy.

If the data is being offered as proof of an argument, is the argument itself sound? Are its deductions from the data logical?

15
Q

Interviews as a data source - Interviews may be used for different purposes:

A
  • Sometimes they are useful in identifying topics that can be explored in focus groups or surveys.
  • They may focus on specific, high-value employees and uncover targeted retention information - or in the case of exit interviews, engagement failures.
  • Organizational “heroes”, people who are recognized and respected throughout the organization, may add a cultural perspective.
16
Q

Interviews are rarely the sole form of gathering data from those involved because:

A

of the amount of time and labor required to conduct them. If multiple interviewers are used, they must be carefully trained and prepared so that all interviews are conducted in the same manner without bias.

17
Q

Advantages and challenges of interviews:

A

Advantages:
-Safer, confidential environment may generate significant information.
-Comments can suggest direction for further group research (focus groups and surveys).

Challenges:
- Can be time-intensive.
- Requires strong relationship-building skills.
- Requires vigilance to avoid bias from influencing questions and interpretation of answers.

18
Q

Individual interviews are more effective if:

A

areas of discussion and specific questions are planned.

19
Q

A key interview tool is:

A

The interview guide or instrument. It should be drafted and reviewed by other team members and perhaps by the client.

All means should be taken to establish a positive and trusting relationship with the interviewees.

20
Q

Focus Group:

A

A small group (normally 6 - 12) invited to actively participate in a structured discussion with a facilitator.

Usually lasts from one to three hours, depending on the topic and the purpose.

Focus groups serve a variety of purposes. They are often used to follow up on a survey, providing a more in-depth look at specific issues raised; in this respect, focus groups collect qualitative data that enriches quantitative survey results.

21
Q

The following are important considerations regarding focus groups:

A
  • A focus group is intended to provide a microcosm of the population being studied. Participants must, therefore, adequately represent that population to ensure representative information. Ideally, random selection should be used so that every employee has an equal chance of being selected and the diversity of the employee population is represented.
  • Along with random selection, voluntary participation is another important consideration. Voluntary participation can help to ensure that the focus group will be a productive session with employees who are willing to share their views and opinions.
22
Q

Focus Groups Advantages and Challenges:

A

Advantages:
- Provides a format that is flexible and relatively comfortable for discussion
- Allows for group brainstorming, decision making, and prioritization
- Can provide group consensus
- Enables HR to learn about employee needs, attitudes, and opinions in a direct format
- Gives employees direct input

Challenges:
- Tends to foster “group think” conformity
- May be difficult to control; can become a forum where participants go off on tangents
- Generally don’t allow for deep discussions, depending on time constraints and the number of participants
- Can provide skewed or biased results if participants are not representative

23
Q

When planning a focus group, HR should consider the following:

A
  • The importance of planning
  • The context in which a focus group might occur
  • The importance of the facilitator
  • The importance of the recorder
24
Q

Focus group tools:

A

Mind Mapping and affinity diagramming
Nominal group technique (NGT)
Delphi Technique

25
Mind Mapping and Affinity Diagramming:
Mind mapping begins the discussion with core ideas. The group members add related ideas and indicate logical connections, eventually grouping similar ideas. This can be done on paper or a whiteboard or with sticky notes. Affinity diagramming is a way of sorting a large amount of data that has already been collected. The group categorizes and subcategorizes data until relationships are clearly drawn.
26
Nominal Group Technique (NGT)
This technique proceeds through rounds in which participants each suggest ideas. The rounds continue until no further ideas are proposed. Then the group discusses the items, eliminates redundancies and items considered irrelevant, and agrees on the importance of the remaining items. NGT can be practiced with idea generation by individuals, subgroups, or the entire group. Similarly, the initial sorting of ideas can be done in subgroups before returning to the main group to reach consensus.
27
Delphi Technique
This technique progressively collects information from a group on a preselected issue. The first respondent proposes information, the next respondent adds something different, and so on, until a list can be compiled. The respondents are anonymous. In the second round, the researcher circulates the list and asks each respondent in turn to refine previous ideas, to comment on each idea’s strengths and weaknesses for addressing the issue, and to identify new ideas. This technique is designed to facilitate group involvement, problem solving, and individual thinking while avoiding “group think,” where participants can be influenced by what others say.
28
Surveys and Questionnaires as a data source:
Relatively inexpensive ways to gather a large amount of data from a large and dispersed group of subjects, such as employees or customers.
29
Serious challenges with surveys and questionnaires:
- Obtaining a valid sample
- Designing the survey with analysis in mind
- Asking the right questions
30
Obtaining a valid sample
Researchers must make sure that survey results are truly representative—that the number of returned surveys is sufficiently large to be representative and that the group responding accurately reflects the attributes of the entire group. Explaining the purpose and importance of the survey may improve the response rate, as will making it easier to complete—shorter and easier to understand. Researchers should be aware of survey approaches that affect who can respond—for example, using an online survey in a workplace where not all workers have equal online access. This affects the sample size and the ethical impact of the survey.
31
Designing the survey with analysis in mind
Questions should be asked in a way that makes compiling and comparing responses easier. This usually means relying heavily on quantifiable responses (for example, the Likert scale, which asks respondents to choose ratings, usually from 1 to 5 points). Freeform feedback (narrative comments, examples, suggestions) can be included as well and will enrich the research report.
32
Asking the right questions
To understand an entire organization, experts often turn to questions based on organizational models that map various internal and external environmental factors that can affect attitudes and work. Internal factors generally include strategy and purpose, leadership, rewards, and relationships; external factors include opportunities and threats as in SWOT and PESTLE analyses. In global or diverse organizations, researchers must also be mindful of language and cultural differences that could complicate the communication task.
33
Survey Questionnaire Advantages/Challenges
Advantages:
- Efficient way to gather a lot of data from a large and dispersed group
- Easier to quantify data for analysis and reporting

Challenges:
- Can be difficult to obtain an acceptable response rate
- Difficult to follow up on data from anonymous sources
- Relies on self-reporting, which can be biased
- Requires time and statistical expertise to assess sample and compile and analyze data

Some of these challenges and advantages will be greater or lesser depending on the makeup of the surveyed group.
34
Observation as a data source:
HR professionals can also gather data by observing the workplace and work processes. This removes the self-reporting filter that is present in interviews, surveys, and focus groups. Observers can note factors that participants are unaware of or have become accustomed to as well as any that participants are reluctant to mention for personal reasons. Observation can also strengthen the HR professional's understanding of the work at hand and the culture of the workplace.
35
Advantages and challenges of observation to gather information
Advantages:
- Provides firsthand and immediate data rather than self-reported data, which can be affected by memory and selectivity.
- Is time-efficient for subjects.

Challenges:
- Requires skill to be unseen. When the group is very aware of the observer, the data becomes less reliable.
- Requires vigilance to remove personal bias from observations.
- Requires experience to note significant behaviors.
- Observations may not be representative of the entire body of data (i.e., the totality of every meeting, every work process, every transaction).
36
Existing Data and Documents as a Data Source:
Existing data and documents can include information from the organization itself, from public information sources, or from industry/professional associations.
37
Sources of Existing data include:
- Official documents, such as organization histories and vision and strategy statements, which can help the team understand the organization’s business and culture.
- Performance data over multiple periods from the organization’s financial records as well as data from other organizational databases.
- Performance data from the organization’s HR information system (for example, turnover rates, employee complaints, incident reports).
- Correspondence and analytical reports.
- Industry data that can provide information about external environments and performance benchmarks.
38
Advantages and challenges of using existing data:
Advantages:
- Eliminates the effects of observation and involvement and possible bias of facilitator/interviewer/observer
- Rich, multi-perspective source of data

Challenges:
- Can be time-intensive
- Requires experience to extract key data
- May require ingenuity to find data
39
Artifacts as a data source:
Artifacts are objects created by members of a culture that convey a sense of that culture's values and priorities, beliefs, habits and rituals, or perspectives. They can provide insights into aspects of an organization's culture that its members may not be able to or may not want to articulate to an outsider.
40
Artifacts could include:
- Physical workplaces that can suggest characteristics of organizational culture (for example, the presence of personal artifacts, an emphasis on diplomas and certificates, environments that convey key values such as accessibility of top leaders to all employees).
- Virtual environments, for example, social media providing clues about how the organization is perceived by outsiders and by employees.
41
Artifacts may be best used when they...
Confirm or conflict with findings gathered by other means. Without enough context, a researcher can misinterpret the meaning and importance of artifacts.
42
Advantages and Disadvantages of using artifacts in research:
Advantages:
- Provides additional insight into cultural issues
- Can be examined without the help of those being observed

Disadvantages:
- Requires researcher to understand the principles of culture
- Can create misunderstandings if the researcher is not familiar with the culture
43
Marketing data as a data source:
HR professionals may need marketing data to aid in workforce planning or establishing a business case. This data can be gathered from a variety of sources, both internal and external to the organization.
44
(Generalized) Market data as a data source:
In addition to industry and business marketing data, more generalized market data could also be used. The market research industry is dynamic and well populated by both public and private organizations dedicated to collecting and disseminating research and information about consumers, customers, populations, industries, and more. The data can be used to better understand target audiences or certain segments of society and/or business.
45
Statistics
Refers to the collection, organization, and analysis of large amounts of numerical data. The value of analysis and evidence-based decision making is in direct relationship to the quality of the statistics used in the analysis. When data is wrong or insufficient, the analysis is likely to be wrong or weak.
46
Reliability of Data:
Reliability reflects the ability of a data-gathering instrument or tool, such as a survey, a rater's observation, or a physical measurement, to provide results that are consistent.
47
Perfect reliability is rarely achieved. A variety of errors can occur that may create inconsistent results, including:
- A failure to maintain the same conditions or correct for differences.
- Cultural differences that create different interpretations of questions. Tools require thorough review and testing.
- Bias in using the tool to gather data. Rater reliability can be checked by testing a tool with different raters.
48
Validity of Data:
Validity is the ability of an instrument to measure what it is intended to measure. Validity reflects the degree to which a tool measures attributes that are relevant to the measurement's intention.
49
Validation answers two questions:
- What does the instrument measure?
- How well does the instrument measure it?
50
A tool's validity may be damaged by:
Using irrelevant criteria to develop measures
51
Statistical Sampling:
Sampling is often used when the population to be analyzed is very large or when data cannot be obtained from the entire population. The sample must be representative - it must accurately reflect the key characteristics of the entire population being studied. Samples of data must also be sufficiently large to include all the possible variations within the population being sampled. The smaller a sample is, the more likely analysis results will be affected by statistical outliers, values that differ greatly from the average. This is a common problem with surveys with low response rates.
52
Errors may be introduced into a statistical study when:
- Incorrect data is used. A measurement may be taken incorrectly, or a number may be entered incorrectly.
- A study's design includes, intentionally or unintentionally, different types of biases that affect outcomes.
53
Biases could involve:
- Sampling
- Selection
- Response
- Performance
- Measurement
54
Sampling Bias:
A sample may not represent the general population. For example, HR is studying the effects of an engagement strategy on retention. The sample, however, contains a higher proportion of older workers than are in the organization’s workforce.
55
Selection Bias:
Selection bias can occur in a controlled study when participants are not randomly assigned to control and experimental groups. (Controlled studies assign participants to a control group that does not experience the intervention or condition being tested and one or more experimental groups that do experience the intervention or condition.) Bias can also occur when researchers choose to enroll only certain types of participants. In a study testing the effects of a remote working policy, for example, the researchers enroll only employees who have been rated highly by their supervisors.
56
Response Bias:
This is the inverse of selection bias. The researchers invite a representative sample to join a study, but the group that accepts or responds is not representative. For example, HR invites all employees to participate in a survey to determine new benefits. The group that responds is composed disproportionately of young parents.
57
Performance Bias:
Participants in a controlled study behave differently because they are being studied. A famous example of this is the Hawthorne Works experiment. The experiment was designed to measure the effect of improvement in various factory conditions on worker productivity, but the increases in productivity were only temporary and appeared to be related primarily to the experiment’s design. The workers appreciated the increased attention to their welfare.
58
Measurement Bias:
Raters are measuring incorrectly, either unintentionally (because of lack of training or difficult measurement procedures) or intentionally (the result of some type of personal bias).
59
When using study results, HR professionals should be mindful of:
the information-gathering and analysis approaches used in the study, which are described in the study's methodology section. These methods may reveal errors or the potential for error. If they are creating their own studies, HR professionals may want to consult with statistical experts and have them review their studies' methodology.
60
Data Cleansing:
In order to compile and share the different, large, and growing amounts of data across an organization in a usable format, data often needs to be cleansed (a process also referred to as data wrangling), whereby incomplete sets, anomalies, errors, and gaps in the data are identified and addressed. Incorrect or inconsistent data can lead to incorrect conclusions, poor decision making, and misdirected efforts and resources.
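As a minimal sketch of what cleansing can look like in practice - the record fields, casing rule, and flagging thresholds below are illustrative assumptions, not a standard:

```python
# Hypothetical HR records with typical quality problems (illustrative data)
records = [
    {"id": 1, "dept": "Sales",   "tenure": 4},
    {"id": 2, "dept": "sales",   "tenure": 2},     # inconsistent casing
    {"id": 3, "dept": "Finance", "tenure": None},  # gap in the data
    {"id": 4, "dept": "Finance", "tenure": -1},    # anomalous value
]

def cleanse(rows):
    """Standardize values and separate usable rows from flagged ones."""
    cleaned, flagged = [], []
    for row in rows:
        row = dict(row, dept=row["dept"].title())       # standardize casing
        if row["tenure"] is None or row["tenure"] < 0:  # flag gaps/anomalies
            flagged.append(row)
        else:
            cleaned.append(row)
    return cleaned, flagged

cleaned, flagged = cleanse(records)
```

Flagged rows would then be investigated and corrected at the source, in line with the idea that cleansing should be a standardized, ongoing practice rather than ad hoc.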
61
Cleansing data is connected to:
Validation and identifying bias, as these processes assess how useful and correct the data is. By cleansing the data collected, you can ensure that decisions are made based on better-quality data. Data cleansing is not an ad hoc process; HR should try to embed certain practices into the culture of the organization. Pushing for standardized data collection is another valuable step to take - and one that should be taken early - despite the perceived hurdles and potential disruption. The cost (in time, errors, and so forth) of not doing this will likely grow over time.
62
Descriptive Statistics:
The process of sorting data in different ways to provide a more accurate and in-depth understanding of what the data is showing. Descriptive statistics enables analytical statistics, the process of inferring the meanings behind data descriptions. Data measurement tools are used in descriptive statistics to understand the distribution patterns and characteristics of the dataset.
63
Common measures of central tendency are:
Median, mode, and mean (average)
64
Median:
Or 50th percentile, the middle value in a range of values. The median is the preferred measure of central tendency when the distribution in a dataset is skewed - when it contains a few excessively high or low values. It is also used in frequency distributions.
65
Mode:
The most frequently occurring value in a set of data.
66
Mean:
The arithmetic average - the average score or value. It is typically calculated in one of two ways: weighted or unweighted.
67
Unweighted Mean:
Or raw average, is the sum of all the values in the sample divided by the number of values. Useful when all the values are relatively close together and when they represent volume as opposed to numerical order or numerical preference.
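The three central-tendency measures above can be computed with Python's standard library; the tenure figures are made up for illustration:

```python
import statistics

# Hypothetical employee tenures in years; 30 is an outlier
tenures = [2, 3, 3, 4, 5, 6, 30]

mean = statistics.mean(tenures)      # raw average, pulled upward by the outlier
median = statistics.median(tenures)  # middle value, robust to the outlier
mode = statistics.mode(tenures)      # most frequently occurring value
```

Here the mean (about 7.6) sits well above the median (4), illustrating why the median is preferred for skewed distributions.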
68
Weighted Mean/Average:
Is used when some data in the dataset have more significance or effect than other data.
69
Weighted means are useful in situations where:
there are significant outliers in the spread of data - points that are relatively much higher or lower than the other data points - or when the values are not considered equally impactful.
70
Weighted mean is determined by:
Multiplying individual values by a factor that adjusts the value. The results are then summed.
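The calculation described above - multiply each value by its weighting factor, then sum - can be sketched with made-up ratings and weights:

```python
# Hypothetical performance ratings and importance weights (illustrative data)
ratings = [4.0, 3.0, 5.0]   # scores for three evaluation criteria
weights = [0.5, 0.3, 0.2]   # weighting factors, summing to 1.0

# Weighted mean: multiply each value by its factor, then sum the results
weighted_mean = sum(r * w for r, w in zip(ratings, weights))

# Unweighted (raw) average for comparison
unweighted_mean = sum(ratings) / len(ratings)
```

Because the first criterion carries half the weight, the weighted mean (3.9) lands below the raw average (4.0).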
71
It is important to note when working with the weighting factor values:
how that number is calculated can change the final result.
72
It is important to understand the level of precision that is necessary...
when determining if rounding is appropriate (and at which decimal place it should be done) and to remain consistent over time so that comparisons between different functions or from one year to the next are valid.
73
Weighting is often used in:
creating scales for evaluation
74
Frequency Distributions
Used to sort data into groups according to some factor. This allows analysts to understand the distribution of the data they are working with - whether the data is focused in a normal pattern around a central value or is more broadly/narrowly dispersed over the data range. Frequency tables help locate peaks within the data range.
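A frequency distribution like the one described can be tabulated with `collections.Counter`; the Likert-scale responses are invented for illustration:

```python
from collections import Counter

# Hypothetical engagement survey responses on a 1-5 Likert scale
responses = [3, 4, 4, 5, 2, 4, 3, 5, 4, 1, 3, 4]

freq = Counter(responses)         # counts per rating: the frequency table
peak = freq.most_common(1)[0][0]  # the rating where the data peaks
```

The table shows the data clustering around a rating of 4, the kind of peak a frequency distribution is designed to surface.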
75
Quartiles and percentiles
Quartiles and percentiles describe dispersion across a group of ranked data. Quartiles divide a dataset into quarters. A percentile indicates the proportion of the dataset at or below a certain percentage. (The 90th percentile is greater than 90% of the values in the dataset.)
76
A quartile ends at a...
certain percentile. The first quartile (Q1) contains all the data at or below the 25th percentile. The second quartile (Q2) ends at the value in the center of the distribution, or the 50th percentile - half the values lie above and half below.
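The quartile cut points can be computed with the standard library's `statistics.quantiles`; the salary figures below are hypothetical:

```python
import statistics

# Hypothetical salaries in thousands, ranked for benchmarking
salaries = [40, 42, 45, 48, 50, 52, 55, 60, 65, 70, 90]

# n=4 returns the three cut points Q1, Q2, Q3 that divide the data into quarters
q1, q2, q3 = statistics.quantiles(salaries, n=4)
```

Note that Q2 coincides with the median, the value at the 50th percentile.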
77
Quartiles and percentiles are frequently used in:
Benchmarking
78
Standard deviation:
Represents the distance of any data point from the center of a distribution when the data is distributed in a "normal" or expected pattern, often shown as a bell curve. In a normal distribution, 68% of data lies within one standard deviation (expressed as SD or the Greek letter sigma, σ), 95% of data lies within two SDs, and 99% lies within three SDs.
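The contrast between low and high standard deviation can be seen with two invented datasets that share the same mean:

```python
import statistics

# Two hypothetical datasets, both centered on 50 (illustrative data)
tight = [48, 49, 50, 51, 52]    # low SD: tall, narrow curve
spread = [30, 40, 50, 60, 70]   # high SD: flat, wide curve

sd_tight = statistics.pstdev(tight)    # population standard deviation
sd_spread = statistics.pstdev(spread)
```

Both sets have a mean of 50, yet the second set's deviations are ten times larger, producing a standard deviation ten times as high.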
79
When standard deviation in a dataset is low...
the data curve is high and narrow, with data points tightly grouped around a center value.
80
When the standard deviation is high...
The data curve is flatter, longer, and more spread out. There are more outliers in the dataset - measures that differ significantly from the central value.
81
Data is subjected to different types of analysis in order to:
Sort it into categories, compare the categories, compare the relationships between variables, see patterns and trends in the data, and make more reliable predictions.
82
Metrics
Measure single data points in relative comparisons and provide tabular outputs of counts and rates. Metrics provide information that guides tactics and operations through quantitative analysis. What metrics do not provide are insights regarding the "why" behind the data - assessing the why is what allows HR and organizational leaders to make important, competitively differentiating business decisions.
83
Analytics can expose the important connections and patterns in data to make better workforce decisions because they:
- Consider the past and present and forecast the future - Connect multiple data items - Provide computational analysis of data or statistics - Provide visual outputs of patterns and trends - Provide insights that can drive strategy
84
Variance analysis
Identifies the degree of difference between planned and actual performance. The term is usually applied to analysis against objective baselines, such as schedules and budgets. Once identified, different analytical tools can be applied to understand the variance.
85
Ratio Analysis:
Compares the relative size of two variables and yields a percentage. Many commonly used HR metrics are ratios.
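A common example of such an HR ratio is an annual turnover rate; the headcount figures here are assumed for illustration:

```python
# Hypothetical annual figures for a turnover-rate ratio
separations = 24      # employees who left during the year
avg_headcount = 300   # average number of employees over the year

# Ratio analysis: relative size of two variables, expressed as a percentage
turnover_rate = separations / avg_headcount * 100
```

The resulting 8% becomes meaningful when compared against prior periods or industry benchmarks, which is where trend analysis takes over.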
86
Trend Analysis:
Examines data from different points in time to determine if a variance is an isolated event or if it is part of a longer trend. By establishing the direction and degree of change over time, trend analyses can also be used to forecast future conditions, such as the ability of an initiative to meet its objectives. Important tools in discovering recurring peaks or troughs in an activity.
87
The ability to predict variables such as level of sales, production, or service within workable limits or ranges often becomes...
The foundation for making the HR forecast. Therefore, in trend and ratio analyses, when figures are projected into the future, the key determinant is whether relationships will remain the same. In this case, the clear identification of assumptions underlying the forecast is critical to interpretation and acceptance of the information.
88
Regression analysis
Refers to a statistical method used to determine whether a relationship exists between variables and the strength of that relationship. The shape of the line formed by the data suggests whether there is a likely correlation, whether that correlation is positive or negative, and how strong or weak the correlation is. Regression can use multiple variables.
89
Root-Cause Analysis
Starts with a result and then works backward. Each cause is queried to identify a preceding cause - conditions or actions that might have led to this effect. This questioning may proceed backward in rounds until the fundamental or root cause is identified (the point where no further causes can be identified). Sometimes referred to as the "five whys" method.
90
Scenario/What-if Analysis
For the purpose of evaluation, these analyses can be used to test the possible effects of altering the details of a situation to see how the outcomes vary under different conditions. The outcome of a particular situation is projected, using different inputs to see which changes have the most profound effects.
91
The ability to tell a facts-based story is...
Data's greatest contribution to a business case. Understanding the priorities of the organization will guide you toward compiling relevant data to tell the correct story to the correct audience.
92
Is data on its own the route to a more successful business case?
No. HR professionals need to know what to do with the data and how to turn it into a narrative, and that's where interpretation, visualization, and a graphical representation come in. Creating graphical representations of the data can present your findings in a clearer, more efficient and accessible manner. This can also put you in a position to counter firmly held, but incorrect, organizational or leadership beliefs.
93
Graphic data analysis tools are often used to:
Support analysis and visualize results.
94
Pie Chart
Description: Graphically depicts as portions or slices of a circle the constituents that comprise 100% of a data group. Textual data information can be included in callouts or in an attached table for more precise communication. Application: To present a high-level impression of the data distribution as a percentage of a whole—for example, the workforce. This information may be helpful context for a deeper discussion.
95
Histogram
Description: Graphically depicts the sorting of data into groups arranged in the shape of a statistical distribution, showing a central tendency and dispersion around that tendency. This appears as columns of varying heights or lengths. Histograms can include a comparative referent, such as a target or range of values. They can also be designed to show comparisons over time (usually through multiple columns for each category). Application: To sort data and to support rapid comparison of categories of data.
96
Trend Diagram
Description: Plots data points on two axes. The horizontal axis usually represents time, while the vertical axis represents volume. Application: Can be used to test for presence of cycles or developing trends.
97
Pareto Chart
Description: Applies the Pareto principle (that 80% of effects come from 20% of causes) in the form of a histogram. Categories of data are ranked, usually from left to right along the x axis, according to size. The y axis shows reference ranges (for example, number or percentage of occurrences). A cumulative percentage line plots the category contributions to the whole, making it easier to identify the 80/20 cut—the 20% of factors that are creating 80% of the problems. Application: Distinguishes between the “vital few” categories that contribute most of the issues and the “trivial many” categories of infrequent occurrence to support more-focused quality improvement activities.
98
Scatter Diagram
Description: Plots data points against two variables that form the chart’s x and y axes. Each axis is scaled. The pattern formed by the plotted data describes the correlation between the two variables: The tightness of clustering indicates the probable strength of the correlation. A line rising from the lower left to the upper right quadrant indicates a positive correlation. (As x increases, y increases.) A line falling from the upper left to the lower right quadrant indicates a negative correlation. (As x increases, y decreases.) Application: Can be used to test possible causal relationships and narrow focus on subsequent tests.