Midterm_concepts Flashcards

(267 cards)

1
Q

Knowledge-Based AI (KBAI)

A

An approach to AI that emphasizes explicit knowledge representations and reasoning methods to solve problems, often inspired by human cognition.

2
Q

Cognitive System

A

A system that exhibits human-like intelligence through interacting processes such as learning, reasoning, and memory.

3
Q

Cognitive System Components

A

Common components include input/output interfaces, reaction, deliberation, reasoning, learning, memory, and metacognition (monitoring/control).

4
Q

Deliberation

A

A slower, goal-directed mode of processing that uses reasoning, learning, and memory to choose actions.

5
Q

Reaction

A

A faster, stimulus-driven mode of processing that can produce quick responses without extensive deliberation.

6
Q

Metacognition

A

The system’s ability to monitor and regulate its own thinking (e.g., detect impasses, choose strategies).

7
Q

Fundamental Conundrum: Limited Resources

A

Intelligent agents have limited time, memory, and compute, so they must reason efficiently and strategically.

8
Q

Fundamental Conundrum: Local Computation vs Global Constraints

A

Agents make local computations, but problems often have constraints that apply globally across the whole situation.

9
Q

Fundamental Conundrum: Deduction vs Non-Deductive Problems

A

Logic is deductive, but many real problems involve uncertainty, incomplete data, or abductive/inductive reasoning.

10
Q

Fundamental Conundrum: Dynamic World vs Limited Knowledge

A

The world changes, but the agent’s knowledge may be incomplete or outdated.

11
Q

Explanation and Justification

A

Beyond solving problems, cognitive systems often need to explain and justify their reasoning and actions.

Intelligent systems must not only act intelligently; they must also be able to account for their decisions.

12
Q

Knowledge Representation

A

How an agent encodes information so it can be used for reasoning and decision-making (e.g., graphs, rules, frames).

13
Q

Represent and Reason

A

A general strategy: represent a problem in a structured form, then perform reasoning operations over that representation.

Different representations enable different reasoning strategies.

14
Q

Semantic Network

A

A graph-based knowledge representation with nodes (concepts/objects) and labeled, directed links (relations).
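A semantic network can be sketched in a few lines of Python as a set of labeled, directed triples. This is a minimal illustration, not a representation from the course; the node and relation names are made up.

```python
# Minimal sketch of a semantic network: nodes are strings, links are
# (subject, relation, object) triples. All names here are illustrative.

class SemanticNetwork:
    def __init__(self):
        self.triples = set()

    def add(self, subject, relation, obj):
        self.triples.add((subject, relation, obj))

    def related(self, subject, relation):
        """Return all objects linked from `subject` by `relation`."""
        return {o for s, r, o in self.triples if s == subject and r == relation}

net = SemanticNetwork()
net.add("Eagle", "is-a", "Bird")
net.add("Bird", "has", "Feathers")
net.add("BlockA", "left-of", "BlockB")

print(net.related("Eagle", "is-a"))  # {'Bird'}
```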

15
Q

Nodes (Semantic Networks)

A

The entities/concepts in a semantic network (e.g., ‘Bird’, ‘Eagle’, ‘Block’).

16
Q

Links (Semantic Networks)

A

Directed, labeled relations between nodes (e.g., is-a, part-of, supports, left-of).

17
Q

Good Representations

A

Make relationships explicit, expose constraints, combine objects with relations, exclude irrelevant detail, and remain computable.

18
Q

State Space

A

A set of possible configurations (states) of a problem and the transitions (operators) between them.

19
Q

Operator

A

An action that moves from one state to another in a state space (e.g., Move(C, Table) in blocks world).

20
Q

Generate and Test

A

A problem-solving method that generates candidate solutions and tests them until one passes.
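The generate-and-test loop can be written directly. This is a toy sketch: the problem (find an integer whose square is 49) and the naive range generator are illustrative choices, not from the cards.

```python
# Generate and test: the generator proposes candidates, the tester checks
# each one, and the loop stops at the first candidate that passes.

def generate_and_test(generator, tester):
    for candidate in generator:
        if tester(candidate):
            return candidate
    return None  # generator exhausted without a passing candidate

candidates = range(-10, 11)        # naive generator: enumerate a range
passes = lambda x: x * x == 49     # tester: does the candidate satisfy the goal?

print(generate_and_test(candidates, passes))  # -7 (first match in generation order)
```

A smarter generator would enumerate fewer candidates (e.g., only non-negative integers), and a smarter tester would reject candidates early, which is the generator/tester tradeoff covered in the later cards.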

21
Q

Generator (Generate and Test)

A

The component that proposes candidate solutions (can be naive or ‘smart’).

22
Q

Tester (Generate and Test)

A

The component that evaluates candidates against constraints or goals (can be naive or ‘smart’).

23
Q

Smart Tester

A

A tester that detects failure early or uses structure to prune candidates, reducing search.

24
Q

Smart Generator

A

A generator that uses domain knowledge/constraints to generate better candidates and avoid obviously bad ones.

25
Generate and Test for Raven's Matrices
Generate candidate answers for the missing cell and test them against inferred patterns/relations in the matrix.
26
Means-Ends Analysis
A search strategy that selects actions to reduce the difference between the current state and the goal state.
27
Difference (Delta) in Means-Ends
A measure of distance between a current state and a goal; prefer operators that reduce this difference.
28
Heuristic
A rule-of-thumb that guides search toward promising options (not guaranteed optimal but often faster).
29
Problem Reduction
Reformulating a problem into smaller subproblems whose solutions combine to solve the original problem.
30
Subgoal
An intermediate goal that must be achieved on the way to the final goal (common in problem reduction).
31
Production System
A rule-based architecture using IF-THEN rules (productions) that match conditions to trigger actions.
32
Production Rule
A condition-action rule: IF (pattern matches working memory) THEN (fire an action or add knowledge).
33
Working Memory
A short-term store of the current situation/state used for matching rules in a production system.
34
Conflict Set
The set of all rules whose conditions match the current working memory at a given moment; when more than one rule wants to fire, they compete for control.
35
Conflict Resolution
A strategy for choosing which rule to fire when multiple rules are applicable.
36
Chunking
Learning in production systems by compiling a sequence of reasoning steps into a new rule for faster future use.
37
Cognitive Architecture
A general computational framework (architecture) that, combined with knowledge content, yields behavior.
38
Architecture + Content = Behavior
Behavior emerges from how the architecture operates on stored knowledge and current percepts.
39
Levels of Analysis (High/Low)
Hardware level, e.g., memory (lowest); algorithm/symbol level, e.g., search (middle); task/knowledge level, e.g., decisions (highest).
40
Frames
Structured representations for stereotyped situations, using slots (attributes) and values (often with defaults).
41
Slots (Frames)
Named attributes in a frame (e.g., 'location', 'time', 'utensils' in an 'Ate' frame).
42
Default Values (Frames)
Typical assumptions filled in when information is missing, which can be overridden by specific facts.
43
Stereotypes (Frames)
Frames capture typical patterns of situations (e.g., what usually happens when someone 'ate' something). It's essentially the default configuration of a frame.
44
Inheritance (Frames)
Frames/classes can inherit slot values from more general frames (e.g., 'Human' inherits from 'Animal').
45
Class vs Instance (Frames)
A class frame defines general properties; an instance frame represents a specific individual case.
46
Learning by Recording Cases
A learning approach that stores solved examples (cases) and uses similarity to solve new problems. Builds a case base.
47
Case
A stored problem-solution pair (often including context) used as memory for future reasoning.
48
Nearest Neighbor
Retrieve the single most similar stored case to a new problem and reuse/adapt its solution.
49
k-Nearest Neighbors (kNN)
Retrieve the k most similar cases and combine their labels/solutions (e.g., by voting).
50
Similarity Metric
A function that measures how close two cases are (e.g., Euclidean distance in feature space).
51
Feature Space
A representation where each case is a point defined by features (e.g., width and height for blocks).
52
Case-Based Reasoning (CBR)
Reasoning by remembering prior cases and adapting them to fit new problems.
53
CBR Cycle: Retrieval
Find a past case similar to the current problem.
54
CBR Cycle: Adaptation
Modify the retrieved solution so it fits the new problem context.
55
CBR Cycle: Evaluation
Check how well the adapted solution works; revise if needed.
56
CBR Cycle: Storage
Store the new problem and its (validated) solution as a new case for future use.
57
Assumptions of CBR
Patterns exist in the world, and similar problems tend to have similar solutions.
58
Incremental Concept Learning
Learning a concept over time from examples by updating the concept definition as new data arrives.
59
Variabilization
Making a concept definition more flexible by replacing specifics with variables or more general descriptions.
60
Specialization
Narrowing a concept definition to exclude negative examples (make it more specific).
61
Generalization
Broadening a concept definition to include positive examples (make it more general).
62
Heuristics for Spec/Gen
Rules that decide when to specialize vs generalize based on new examples and current concept definition.
63
Classification
Mapping observations to concepts (categories) to reduce the complexity of deciding actions from many percepts.
64
Percepts to Concepts
Explosion problem: n perceptual features (each can be on/off) yield 2^n percept combinations. Instead: Percepts → Concepts → Actions. We compress (map) percept combinations into higher-level abstractions.
65
Equivalence Class
A set of different observations treated as the same at the concept level because they lead to the same category/decision (e.g., a small black bird and a large white bird are both in the 'Bird' class).
66
Concept Hierarchy
An organized structure of concepts from general to specific (e.g., Vertebrate -> Bird -> Eagle).
67
Bottom-Up Search (Concept Learning)
Building higher-level concepts from specific examples/subclasses by aggregating shared properties, e.g., incremental concept learning builds concept descriptions from observed instances.
68
Types of Concepts (Formality)
Concepts vary in formality from axiomatic (most formal) to prototype to exemplar (least formal).
69
Axiomatic Concept
Defined by necessary and sufficient conditions in a formal specification (e.g., circle definition).
70
Prototype Concept
Defined by a typical example with default properties that can be overridden (e.g., 'chair' with typical slots).
71
Exemplar Concept
Defined by remembered instances and implicit abstractions across them (e.g., 'beauty').
72
Logic (Formal Logic)
A formal language for representing knowledge and deriving conclusions with sound inference.
73
Soundness
If a conclusion is provable, it is valid (no false conclusions can be proven).
74
Completeness
If a conclusion is valid, it is provable in the system (all valid conclusions can be proven).
75
Predicate
A function that maps objects to truth values (True/False), e.g., Feathers(bluebird).
76
Conjunction (AND)
A and B is true only when both A and B are true (A AND B).
77
Disjunction (OR)
A or B is true when at least one of A or B is true (A OR B).
78
Negation (NOT)
NOT A flips the truth value of A.
79
Implication (IMPLIES)
A IMPLIES B is false only when A is true and B is false; otherwise true.
80
Truth Table
A table enumerating truth values of a logical expression for all combinations of its variables.
81
Implication Elimination
Rewrite A IMPLIES B as NOT A OR B to simplify reasoning/proofs.
82
de Morgan's Laws
NOT (A AND B) = NOT A OR NOT B and NOT (A OR B) = NOT A AND NOT B.
83
Commutative Property
A AND B = B AND A and A OR B = B OR A.
84
Associative Property
(A OR B) OR C = A OR (B OR C) and similarly for AND.
85
Distributive Property
A AND (B OR C) = (A AND B) OR (A AND C) (and dual for OR over AND).
86
Rule of Inference
A pattern of reasoning that allows deriving a conclusion from premises (e.g., Modus Ponens).
87
Modus Ponens
From p IMPLIES q and p, infer q.
88
Modus Tollens
From p IMPLIES q and NOT q, infer NOT p.
89
Universal Quantifier
FORALL x means 'for all x' (a statement holds for every object in the domain).
90
Existential Quantifier
EXISTS x means 'there exists an x' (a statement holds for at least one object).
91
Conjunctive Normal Form (CNF)
A logical formula expressed as an AND of OR-clauses (useful for resolution).
92
Proof by Refutation
To prove a claim, assume its negation and derive a contradiction.
93
Resolution (Theorem Proving)
An inference method used with CNF clauses to derive contradictions and prove entailment.
94
Planning vs Logic (Connection)
Logic can represent actions, conditions, and goals; planning searches for action sequences to achieve goals.
95
Raven's Matrices (Task)
A visual analogy task where an agent infers patterns/relations in a matrix and predicts the missing cell.
96
Knowledge Arrives Incrementally
Many AI tasks provide information over time, requiring agents to update memory and reasoning dynamically.
97
Recurring Patterns
AI problems often have repeated structures, enabling reuse via cases, rules, or learned concepts.
98
Multiple Levels of Granularity
Problems can be represented at different detail levels; good agents shift granularity as needed.
99
Computational Intractability
Many problems have huge search spaces, making exhaustive search impractical without heuristics/knowledge.
100
What are the architectural approaches discussed?
Learning architectures: • Incremental Concept Learning • Generalization/Specialization • Chunking (Production Learning). Grouped at a higher level: • Rule-Based (Production) • Graph-Based (Semantic Networks) • Structured Object-Based (Frames) • Logic-Based (Formal Deduction) • Memory-Based (CBR) • Search-Based (Problem Solving) • Learning-Integrated Architectures
101
What is the framework of a Production System?
• Rule base (IF–THEN rules) • Working memory (facts/state) • Inference engine
102
How does a Production System operate step-by-step?
1. Match rules to working memory 2. Build conflict set 3. Select rule (conflict resolution) 4. Fire rule 5. Update working memory 6. Repeat. This process is purely: • Reactive • Local • Mechanical
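The six-step cycle above can be sketched in Python. This is a toy illustration: the rules, facts, and "first matching rule" conflict-resolution strategy are illustrative assumptions, not the course's examples.

```python
# Sketch of one production-system cycle: match, build the conflict set,
# resolve the conflict, fire the chosen rule, and update working memory.

working_memory = {"hungry", "has-food"}

rules = [
    # (name, conditions that must all hold, fact the rule's action adds)
    ("eat",   {"hungry", "has-food"}, "eating"),
    ("sleep", {"tired"},              "sleeping"),
]

def run_cycle(memory, rules):
    # 1-2. Match rules against working memory and build the conflict set
    conflict_set = [r for r in rules if r[1] <= memory and r[2] not in memory]
    if not conflict_set:
        return None
    # 3. Conflict resolution: here, simply pick the first applicable rule
    name, _, action = conflict_set[0]
    # 4-5. Fire the rule and update working memory
    memory.add(action)
    return name

print(run_cycle(working_memory, rules))  # 'eat'
print(sorted(working_memory))            # ['eating', 'has-food', 'hungry']
```

Real production systems use richer conflict-resolution strategies (recency, specificity), which is what the conflict-resolution card above refers to.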
103
What is the framework of a Semantic Network?
• Nodes (concepts/objects) • Directional labeled links (relationships) • Structured representation of transformations
104
What is the purpose of a Semantic Network?
• Make relationships explicit • Expose constraints • Organize structured knowledge for reasoning
105
What are the reasoning methods presented?
• Generate and Test • Problem Reduction • Means–Ends Analysis • Represent and Reason • Similarity-Based (Weighted) Reasoning • Constraint-Based Reasoning
106
How does Generate and Test compute solutions?
• Generate possible solutions • Test each against constraints • Stop when valid solution found
107
How does Problem Reduction compute solutions?
• Break problem into subproblems • Solve each subproblem • Combine partial solutions
108
How does Means–Ends Analysis compute solutions?
• Compare current state to goal state • Identify differences • Select operator to reduce differences • Repeat until goal reached
109
How does Represent and Reason compute solutions?
• Build explicit representation • Encode relationships/transformations • Reason over structured representation
110
How does Similarity-Based (Weighted) Reasoning compute solutions?
• Identify transformations • Assign weights/costs • Compute similarity score • Choose lowest-cost/highest-similarity option
111
How does Constraint-Based Reasoning compute solutions?
• Represent constraints explicitly • Eliminate invalid states • Narrow search to valid solutions
112
What is the Generate and Test method?
• Generate candidate solutions • Test each candidate against constraints • Stop when a valid solution is found
113
Why is naive Generate and Test inefficient?
• May generate too many candidates • Large search space • High computational cost • No guidance toward goal
114
What makes a generator complete?
• Generates all possible valid candidates • Does not miss the correct solution
115
What makes a generator efficient?
• Limits unnecessary candidates • Reduces search space • Uses domain knowledge
116
What makes a generator smart?
• Uses structural knowledge • Applies constraints early • Avoids impossible states
117
What is the role of the tester in Generate and Test?
• Evaluates generated candidates • Applies constraints • Selects best match • Eliminates invalid states
118
How can a tester be made smarter?
• Use similarity scoring • Use transformation logic • Apply constraints early • Compare structured representations
119
In Generate and Test, where does intelligence primarily come from?
• Smart generation • Smart testing • Domain knowledge • Constraint use
120
How does Generate and Test apply to Raven’s Progressive Matrices?
• Generate transformations from A→B • Apply transformation to C • Generate predicted answer • Test candidate answers • Select best match
121
What is the difference between brute-force and smart Generate and Test?
• Brute force generates everything blindly • Smart version uses constraints and structure to reduce search
122
What happens to Generate and Test in an unconstrained domain?
• Search space explodes • Becomes inefficient • Hard to scale
123
Why are constraints important in Generate and Test?
• Reduce candidate space • Eliminate invalid states early • Improve efficiency
124
What is the tradeoff between generator and tester?
• Strong generator reduces tester burden • Strong tester compensates for weaker generator • Balance improves performance
125
What is the main design question when building a Generate and Test agent?
• What should be generated? • How should candidates be tested? • How to reduce search space?
126
Why is Generate and Test considered a reasoning method?
• Defines how solutions are computed • Specifies a search and evaluation loop • Operates over representations
127
How does Generate and Test differ from Means–Ends Analysis?
• Generate & Test tries candidates and evaluates • Means–Ends reduces differences between current and goal states
128
What failure occurs if the generator is incomplete?
• The correct answer may never be considered
129
What failure occurs if the tester is weak?
• Incorrect candidates may be selected • Poor ranking of solutions
130
Why is domain knowledge critical in Generate and Test for Raven’s problems?
• Limits plausible transformations • Guides generation • Improves similarity evaluation
131
What is the core loop of Generate and Test?
• Generate • Test • Accept or Reject • Repeat
132
What is Generate and Test?
• Generate candidate solutions • Test each candidate • Accept solution that satisfies constraints
133
What is Means–Ends Analysis?
• Compare current state to goal • Measure difference (Δ) • Choose operator that reduces Δ • Repeat until goal reached
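The loop in this card can be sketched on a toy numeric problem, where Δ is the absolute difference to a goal number and the operators are simple arithmetic moves. The problem and operators are illustrative assumptions, not from the cards.

```python
# Means-ends analysis sketch: repeatedly pick the operator that most
# reduces the difference (delta) between the current state and the goal.

def means_ends(start, goal, operators, max_steps=50):
    state, path = start, []
    for _ in range(max_steps):
        if state == goal:
            return path
        delta = abs(goal - state)
        # Greedily choose the operator whose result most reduces delta
        best = min(operators, key=lambda op: abs(goal - op(state)))
        if abs(goal - best(state)) >= delta:
            return None  # stuck: no operator reduces delta (local minimum)
        state = best(state)
        path.append(state)
    return None

ops = [lambda x: x + 1, lambda x: x - 1, lambda x: x * 2]
print(means_ends(3, 10, ops))  # [6, 12, 11, 10]
```

Note the greedy overshoot to 12 before stepping back down: this illustrates why means-ends analysis is heuristic and can get stuck in local minima, as the later cards point out.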
134
What is Problem Reduction?
• Decompose hard problem into subproblems • Solve subgoals • Combine solutions to solve original problem
135
What is a state space?
• Set of all possible states • Defined by initial state, operators, and goal • Explored during problem solving
136
What are operators in problem solving?
• Actions that transform one state into another • Must obey problem constraints
137
What is Δ in Means–Ends Analysis?
• Measure of difference between current and goal state • Guides operator selection • Heuristic value
138
Why can Means–Ends get stuck?
• Greedy strategy • May reach local minimum • No operator reduces Δ
139
What role do subgoals play in Means–Ends Analysis?
• Created when direct reduction fails • Break problem into manageable steps • Enable hierarchical planning
140
Why is Generate and Test inefficient in large domains?
• Large search space • Many unnecessary candidates • No goal-directed guidance
141
How does domain knowledge improve Generate and Test?
• Narrows generation • Improves evaluation • Reduces search space
142
How does Generate & Test differ from Means–Ends Analysis?
• Generate & Test tries candidates then evaluates • Means–Ends directly reduces difference toward goal
143
How does Means–Ends differ from Problem Reduction?
• Means–Ends reduces Δ in state space • Problem Reduction decomposes goal into subgoals
144
Which approach is greedy difference reduction?
• Means–Ends Analysis
145
Which approach relies on candidate generation and evaluation?
• Generate and Test
146
Which approach uses hierarchical decomposition?
• Problem Reduction
147
Which method is most vulnerable to combinatorial explosion?
• Generate and Test
148
Which method depends on a difference metric (Δ)?
• Means–Ends Analysis
149
Which method organizes problems as AND/OR structures?
• Problem Reduction
150
How would each approach solve Raven’s Progressive Matrices?
• Generate & Test: Generate transformations → Test answers • Means–Ends: Reduce difference between matrix patterns • Problem Reduction: Break problem into transformation subgoals
151
Which approach is most structured and hierarchical?
• Problem Reduction
152
Which approach is most brute-force by default?
• Generate and Test
153
Which approach is heuristic but not guaranteed optimal?
• Means–Ends Analysis
154
Why do we use formal logic in AI?
• To ensure soundness (only valid conclusions proven) • To ensure completeness (all valid conclusions provable) • To guarantee correctness of reasoning
155
What is a predicate in logic?
• A function mapping objects to true/false values • Represents properties or relations • Example: Feathers(x)
156
How do you translate 'If P then Q' into formal logic?
• P ⇒ Q • Can be rewritten as ¬P ∨ Q (implication elimination)
157
What is implication elimination?
• Rewrite P ⇒ Q as ¬P ∨ Q • Required for resolution theorem proving
158
What are the four core logical operators?
• ∧ (AND) • ∨ (OR) • ¬ (NOT) • ⇒ (IMPLIES)
159
What is Modus Ponens?
• P ⇒ Q • P • Therefore Q • Direct forward inference
160
What is Modus Tollens?
• P ⇒ Q • ¬Q • Therefore ¬P • Reasoning by contrapositive
161
What is a truth table used for?
• Evaluate logical expressions • Determine equivalence • Verify validity of formulas
162
State De Morgan’s Laws.
• ¬(A ∧ B) = ¬A ∨ ¬B • ¬(A ∨ B) = ¬A ∧ ¬B
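Both laws can be verified mechanically with a truth table, i.e., by enumerating every assignment of A and B, which is exactly what the truth-table card above describes:

```python
# Verify De Morgan's laws by truth table: check both equivalences
# under every possible truth assignment of A and B.
from itertools import product

for A, B in product([True, False], repeat=2):
    assert (not (A and B)) == ((not A) or (not B))
    assert (not (A or B)) == ((not A) and (not B))

print("De Morgan's laws hold for all truth assignments")
```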
163
What is the commutative property in logic?
• A ∧ B = B ∧ A • A ∨ B = B ∨ A
164
What is the distributive property in logic?
• A ∧ (B ∨ C) = (A ∧ B) ∨ (A ∧ C)
165
What is a universal quantifier?
• ∀x • Statement holds for all elements • Example: ∀x Bird(x) ⇒ LaysEggs(x)
166
What is an existential quantifier?
• ∃x • Statement holds for at least one element
167
What is Conjunctive Normal Form (CNF)?
• Conjunction of disjunctions • Required for resolution • Example: (A ∨ B) ∧ (¬C ∨ D)
168
What is resolution theorem proving?
• Rule for deriving contradictions • Combine clauses with complementary literals • Used in automated theorem proving
169
What is proof by refutation?
• Assume negation of goal • Derive contradiction • Conclude original statement is true
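Refutation plus resolution can be shown end-to-end on a tiny propositional example. This is a minimal sketch, not an industrial prover: clauses are frozensets of string literals ("P" or "~P"), and the knowledge base below (from P ⇒ Q and P, prove Q) is an illustrative choice.

```python
# Proof by refutation with resolution: add the negated goal to the clause
# set and search for the empty clause (a contradiction).

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses on complementary literals."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

def refutes(clauses):
    """Return True if the empty clause is derivable (contradiction found)."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:
                        return True  # empty clause: contradiction
                    new.add(frozenset(r))
        if new <= clauses:
            return False  # no new clauses: goal not entailed
        clauses |= new

# KB in CNF: P => Q becomes {~P, Q}; fact P; negated goal ~Q
kb = [frozenset({"~P", "Q"}), frozenset({"P"}), frozenset({"~Q"})]
print(refutes(kb))  # True: contradiction found, so Q is entailed
```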
170
What is the first step in resolution-based proof?
• Convert all implications to disjunctions • Eliminate ⇒ using ¬P ∨ Q
171
What is the purpose of converting to CNF?
• Standardize logical expressions • Enable resolution rule application
172
How does logic differ from heuristic reasoning?
• Logic is deductive and exact • Guarantees correctness • Heuristics are approximate and may fail
173
How would you represent Raven’s Matrices in logic?
• Define predicates for shapes and relations • Encode transformation rules as implications • Use inference to deduce missing panel
174
What problem does classification solve in cognitive systems?
• Reduces combinatorial explosion • Groups percept combinations into concepts • Maps many inputs to fewer abstract categories
175
What is the combinatorial explosion problem?
• n binary percepts produce 2^n combinations • Percept-to-action mapping becomes infeasible • Search space grows exponentially
176
Why is direct percept-to-action mapping impractical?
• Too many percept combinations • Cannot store all mappings • Computationally infeasible
177
What is the purpose of introducing concepts between percepts and actions?
• Compress percept space • Form equivalence classes • Reduce dimensionality • Enable abstraction
178
What are equivalence classes in classification?
• Groups of percept combinations • All members map to same concept • Treated identically for decision-making
179
What is a concept hierarchy?
• Structured organization of concepts • General-to-specific relationships • Supports inheritance
180
What is inheritance in concept hierarchies?
• Subclasses inherit properties of superclasses • Reduces redundancy • Enables structured reasoning
181
What is an axiomatic concept?
• Defined by necessary and sufficient conditions • Strict membership criteria • Most formal concept type
182
Provide an example of an axiomatic concept.
• Circle • Defined as all points equidistant from a center
183
What is a prototype concept?
• Defined by a typical example • Has default properties • Properties can be overridden
184
Provide an example of a prototype concept.
• Chair • Typical properties: four legs, has back • Exceptions allowed
185
What is an exemplar concept?
• Defined by stored instances • Membership based on similarity • Least formal concept type
186
Which concept type is most formal?
• Axiomatic concepts
187
Which concept type is least formal?
• Exemplar concepts
188
Order the concept types from most to least formal.
• Axiomatic • Prototype • Exemplar
189
How does classification relate to case-based learning?
• Exemplar concepts rely on stored examples • Similarity determines category membership
190
How does classification relate to incremental concept learning?
• Concepts can be refined over time • Definitions can generalize or specialize
191
What is bottom-up search in classification?
• Start from percept features • Move upward in hierarchy • Identify appropriate abstract category
192
How does classification help in solving Raven’s Progressive Matrices?
• Extract features from panels • Classify transformation pattern • Map to learned concept category • Select answer from correct class
193
What is the key equation illustrating classification’s benefit?
• 2^n percepts → k concepts → 2^m actions • Concepts reduce complexity between perception and action
194
What is Incremental Concept Learning?
• Learn a concept definition over time • Update concept after each new example • Maintain structured symbolic representation • Use heuristics to refine concept
195
What triggers generalization in Incremental Concept Learning?
• A positive example does NOT fit current concept • Concept must be broadened to include it
196
What triggers specialization in Incremental Concept Learning?
• A negative example DOES fit current concept • Concept must be restricted to exclude it
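The two triggers above can be sketched with a concept represented as a set of required features. This is a toy illustration of the drop-link (generalize) and forbid-link (specialize) heuristics from the cards below; the features ('brick', 'wedge', etc.) are illustrative.

```python
# Toy incremental concept learning: generalize when a positive example
# doesn't fit; specialize when a negative example does.

def fits(example, required, forbidden):
    return required <= example and not (forbidden & example)

def update(required, forbidden, example, positive):
    if positive and not fits(example, required, forbidden):
        required = required & example            # generalize: drop unmet requirements
    elif not positive and fits(example, required, forbidden):
        extra = example - required
        if extra:
            forbidden = forbidden | {min(extra)}  # specialize: forbid one feature
    return required, forbidden

req, forb = {"brick", "standing"}, set()
# Positive example made of a wedge: drop the 'brick' requirement (generalize)
req, forb = update(req, forb, {"wedge", "standing"}, positive=True)
print(req)   # {'standing'}
# Negative example that currently fits: forbid its extra feature (specialize)
req, forb = update(req, forb, {"standing", "lying"}, positive=False)
print(forb)  # {'lying'}
```

As the weakness card below notes, the result depends on example order and on which heuristic fires, which this sketch makes easy to see.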
197
What is variabilization?
• Replace specific objects with variables • Convert instances into general roles • Enable general concept representation
198
What happens if a positive example fits the current concept?
• Do nothing • Concept already consistent
199
What happens if a negative example does not fit the current concept?
• Do nothing • Concept already excludes it
200
What is the require-link heuristic?
• A relationship must be present for positive examples • Add constraint to specialize concept
201
What is the forbid-link heuristic?
• A relationship must be absent for positive examples • Add negative constraint (¬link)
202
What is the drop-link heuristic?
• Remove unnecessary relationship • Generalize concept
203
What is the enlarge-set heuristic?
• Allow multiple object types in same role • Replace specific object with set (e.g., Brick OR Wedge)
204
What is the climb-tree heuristic?
• Generalize object using hierarchy • Replace specific type with parent class (e.g., Brick → Block)
205
What is the close-interval heuristic?
• Expand acceptable value range • Generalize numerical constraints
206
How is a concept represented in this approach?
• As structured graph • Objects and relationships • Constraints on links
207
How does Incremental Concept Learning differ from Case-Based Learning?
• Incremental builds symbolic definition • Case-based retrieves similar stored examples • Incremental modifies structure; case-based reuses instances
208
How does Incremental Concept Learning differ from kNN?
• Incremental maintains evolving rule-like concept • kNN stores raw examples • Incremental modifies definition; kNN compares distances
209
What are advantages of Incremental Concept Learning?
• Produces interpretable concept • Uses structured knowledge • Incorporates background knowledge
210
What are weaknesses of Incremental Concept Learning?
• Depends on heuristics • May overgeneralize or overspecialize • Sensitive to example order
211
How would you use Incremental Concept Learning for Raven’s Matrices?
• Represent transformation pattern structurally • Update concept after each matrix example • Generalize across positive examples • Specialize when pattern fails
212
What is Learning by Recording Cases?
• Store solved examples (cases) • Represent each case as feature vector • Retrieve most similar past case • Apply its solution to new problem
213
What is Case-Based Reasoning (CBR)?
• Solve new problems using similar past cases • Retrieve → Reuse → (Revise) → Retain • Learning occurs by storing new cases
214
What is the Nearest Neighbor method?
• Represent problems as points in feature space • Compute distance to stored cases • Choose case with smallest distance
215
What is the Euclidean distance formula in 2D?
• d = sqrt((x_c − x_n)^2 + (y_c − y_n)^2) • Measures similarity between two points
216
What is the general k-dimensional distance formula?
• d = sqrt(sum_i (c_i − p_i)^2) • Computes distance across k features
217
What is a feature vector?
• Numerical representation of a case • Each dimension represents a feature • Used to compute similarity
218
What is k-Nearest Neighbor (kNN)?
• Retrieve k closest cases • Use majority vote (classification) • More robust than single nearest neighbor
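A minimal kNN classifier using the Euclidean distance from the cards above fits in a few lines. The cases and labels here are illustrative, not from the course.

```python
# kNN sketch: sort stored cases by Euclidean distance to the query,
# take the k nearest, and return the majority label.
from collections import Counter
from math import dist  # Euclidean distance (Python 3.8+)

def knn_classify(cases, query, k=3):
    """cases: list of (feature_vector, label) pairs."""
    nearest = sorted(cases, key=lambda c: dist(c[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

cases = [((1, 1), "block"), ((1, 2), "block"),
         ((8, 8), "pillar"), ((9, 7), "pillar")]
print(knn_classify(cases, (2, 1), k=3))  # 'block' (2 of the 3 nearest cases)
```

With k=1 this degenerates to the nearest-neighbor method of the earlier card; larger k trades sensitivity to noise for blurrier class boundaries.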
219
Why is kNN more robust than 1-NN?
• Reduces impact of noise • Avoids overfitting to one case • Uses consensus among neighbors
220
What is lazy learning?
• Learning occurs at query time • No explicit model built beforehand • Stores raw examples
221
How does Learning by Recording Cases differ from rule-based systems?
• No IF–THEN rule derivation • No explicit planning • Uses similarity instead of logical inference
222
How does this approach differ from Means–Ends Analysis?
• Means–Ends reduces difference step-by-step • Case-based retrieves similar past solution
223
What are advantages of Case-Based Reasoning?
• Simple implementation • Adapts easily to new data • No explicit rule engineering required
224
What are weaknesses of Nearest Neighbor methods?
• Requires large memory • Sensitive to irrelevant features • Curse of dimensionality • Slow retrieval for large datasets
225
What is the curse of dimensionality?
• Distance measures become less meaningful in high dimensions • All points become similarly distant • Performance degrades
226
How would you use case recording to solve Raven’s Matrices?
• Represent each matrix as feature vector • Store solved matrices as cases • Retrieve most similar matrix • Apply its transformation pattern
227
Why is similarity critical in Case-Based Learning?
• Determines which past solution is reused • Drives classification and retrieval
228
What is retained after solving a new problem?
• The new problem and its solution • Added to case memory for future use
229
How does nearest neighbor perform classification?
• Compute distances to labeled cases • Assign label of closest case (or majority among k)
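The retrieve-and-vote steps above can be sketched in a few lines of Python (the case base and function names are illustrative, not from the lectures):

```python
from collections import Counter

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def knn_classify(query, cases, k=3):
    """cases: list of (feature_vector, label) pairs.
    Retrieve the k closest cases, then take a majority vote."""
    nearest = sorted(cases, key=lambda c: euclidean(query, c[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

cases = [([1, 1], "A"), ([1, 2], "A"), ([2, 1], "A"),
         ([5, 5], "B"), ([6, 5], "B")]
knn_classify([1.5, 1.5], cases)  # → "A" (all 3 nearest neighbors are A)
```

With k=1 this degrades to plain nearest neighbor; raising k is what buys the noise robustness described in card 219.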
230
What is a frame in knowledge representation?
• Structured representation of a concept or situation • Contains slots (attributes) • Slots contain values or defaults • Used for common-sense reasoning
231
What are slots in a frame?
• Attributes of a concept • Define properties or relationships • Can have explicit or default values
232
What are default values in frames?
• Typical assumptions • Used when information is missing • Can be overridden by explicit information
233
What does it mean that frames represent stereotypes?
• Encode typical structure of events or objects • Capture common patterns • Support inference beyond explicit data
234
What is inheritance in frames?
• Lower-level frames inherit slots from higher-level frames • Supports class-instance hierarchy • Reduces redundancy
235
What is the difference between a class frame and an instance frame?
• Class frame defines general properties • Instance frame specifies particular values • Instance inherits from class
236
Why are frames useful for common-sense reasoning?
• Fill in missing information • Use defaults • Support structured inference
237
How can defaults be overridden in frames?
• Explicit slot values replace defaults • Supports non-monotonic reasoning
238
How are frames related to semantic networks?
• Frames extend nodes with slot structure • Both represent structured relationships • Frames are richer object-centered representations
239
How would frames represent 'Ashok ate a frog'?
• Event frame: Ate • subject = Ashok • object = frog • object-alive = false (default) • object-is = in-subject
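The slot-filling and default-override behavior in this card can be sketched with a frame as a plain dict (slot names follow the card; the helper is illustrative):

```python
# Defaults encode the "Ate" stereotype: after eating, the object is
# typically dead and inside the subject
ate_defaults = {"action": "ate", "object-alive": False,
                "object-is": "in-subject"}

def instantiate(defaults, explicit):
    frame = dict(defaults)   # inherit the stereotype's default slot values
    frame.update(explicit)   # explicit values override defaults
    return frame

event = instantiate(ate_defaults, {"subject": "Ashok", "object": "frog"})
# event["object-alive"] is False: filled in by default,
# even though the sentence never said the frog died

revised = instantiate(ate_defaults, {"subject": "Ashok", "object": "frog",
                                     "object-alive": True})
# new explicit information retracts the default (non-monotonic reasoning)
```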
240
How can frames represent spatial relationships in Raven’s matrices?
• Each object represented as a frame • Slots include shape, size, fill • Relationship slots include inside, above • Compare transformations across frames
241
What is non-monotonic reasoning in frames?
• Conclusions can change when new information appears • Defaults replaced by specific evidence
242
How do frames support story understanding?
• Represent structured events • Distinguish literal vs metaphorical meaning • Organize narrative elements into slots
243
What is the purpose of slot constraints?
• Restrict valid values • Maintain consistency • Guide inference
244
How do frames fit into a cognitive architecture?
• Stored in working memory • Operated on by production rules • Provide structured knowledge content
245
What distinguishes frames from production systems?
• Frames are knowledge representations • Production systems are rule-based architectures • Frames store knowledge; production systems use it
246
What distinguishes frames from Generate & Test?
• Frames represent structure • Generate & Test is a reasoning strategy
247
What distinguishes frames from Means–Ends Analysis?
• Frames encode knowledge structure • Means–Ends reduces difference between states
248
Why are frames important for advanced sense-making?
• Support abstraction • Handle incomplete information • Enable structured reasoning over complex events
249
What is a Production System?
• Rule-based cognitive architecture • Uses IF–THEN rules • Maps percepts to actions • Uses working memory and inference engine
250
What are the three core components of a Production System?
• Working Memory (current state) • Rule Base (production rules) • Inference Engine (match-select-fire cycle)
251
What is Working Memory?
• Stores current percepts and state • Updated after rule fires • Represents dynamic environment
252
What is Procedural Memory in SOAR?
• Stores production rules • Encodes action knowledge • Controls behavior
253
What is Semantic Memory in SOAR?
• Stores factual knowledge • General world knowledge • Not tied to specific episodes
254
What is Episodic Memory in SOAR?
• Stores past experiences • Records prior states and outcomes • Used for learning
255
What does the inference engine do?
• Matches rules to working memory • Selects applicable rules • Fires selected rule • Updates working memory
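The match-select-fire cycle can be sketched as a loop over rules against working memory. A minimal sketch assuming rules are (condition, action) pairs and conflict resolution is simply "first match wins":

```python
working_memory = {"light": "red"}

# Each rule: (condition tested on WM, action that updates WM)
rules = [
    (lambda wm: wm.get("light") == "red",   lambda wm: wm.update(action="stop")),
    (lambda wm: wm.get("light") == "green", lambda wm: wm.update(action="go")),
]

def cycle(wm, rules):
    matched = [r for r in rules if r[0](wm)]  # match: find applicable rules
    if matched:
        matched[0][1](wm)  # select (trivial conflict resolution) and fire
    return wm              # working memory is updated as a side effect

cycle(working_memory, rules)
# working_memory now contains action = "stop"
```

Real architectures like SOAR use far richer conflict resolution (card 256) than taking the first match.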
256
What is conflict resolution?
• Process of choosing among multiple applicable rules • Determines which rule fires
257
What is action selection in a Production System?
• Choosing an operator based on rule matches • Sending selected action to motor system
258
What does f: P* → A represent?
• Function mapping percept history to action • Core purpose of cognitive architecture
259
What does 'Architecture + Content = Behavior' mean?
• Architecture provides reasoning structure • Content provides knowledge (rules) • Together they produce intelligent behavior
260
What is chunking in Production Systems?
• Learning mechanism • Creates new rules from experience • Compiles reasoning into faster rules
261
How does chunking improve performance?
• Reduces future search • Converts experience into rules • Speeds decision making
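Chunking can be sketched as compiling the result of slow deliberation into a fast lookup "rule" so the work is never repeated. A toy sketch (all names are illustrative, and real SOAR chunks are production rules, not dict entries):

```python
chunks = {}  # learned rules: state -> action

def deliberate(state):
    # stand-in for expensive reasoning (e.g. subgoal search)
    return "stop" if state == "red" else "go"

def act(state):
    if state not in chunks:                # no learned rule matches:
        chunks[state] = deliberate(state)  # deliberate, then chunk the result
    return chunks[state]                   # later calls fire the rule directly

act("red")  # first call: deliberation + chunking
act("red")  # second call: direct rule firing, no search
```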
262
Why is a Production System considered an architectural approach?
• Defines full system structure • Includes memory, reasoning, and learning • Not just a single reasoning method
263
How would a Production System solve Raven’s Progressive Matrices?
• Encode transformation rules • Store current matrix state in working memory • Apply rules to infer transformation • Select answer via rule firing
264
What happens if multiple operators are suggested?
• Conflict resolution required • System selects one based on priorities or learned rules
265
How does learning occur in Production Systems?
• Through chunking • Using episodic outcomes • Creating new production rules
266
What distinguishes Production Systems from Generate & Test?
• Production Systems are rule-based architectures • Generate & Test is a reasoning strategy
267
What distinguishes Production Systems from Means–Ends Analysis?
• Production Systems define architecture • Means–Ends is a heuristic reasoning method