Nature-Inspired Computing Flashcards

(222 cards)

1
Q

What does Nature-Inspired Computation (NIC) refer to?

A

Learning from nature to solve computing problems

2
Q

What are some sources of inspiration for NIC?

A

Evolution, brains, collective behaviours (ants, swarms, etc.)

3
Q

What types of tasks can NIC methods be applied to?

A
  • Pattern recognition
  • Finding shortest paths
  • Searching/optimising
4
Q

Why is nature considered a good model for problem-solving?

A

It has evolved complex problem-solving strategies over time

5
Q

What is an example of collective behaviour in nature?

A

Ant colonies showing intelligence as a group

6
Q

What are the advantages of NIC?

A
  • Produces good results quickly
  • Optimises complex systems better than traditional methods
  • Great at pattern recognition
7
Q

What is an example of an evolutionary algorithm?

A

Genetic Algorithms (GAs)

8
Q

What is the main idea behind natural evolution as proposed by Darwin?

A

More are born than survive, leading to a struggle for existence

9
Q

What happens to useful variations in evolution?

A

They are passed on and improve survival chances

10
Q

What does the Infinite Monkey Theorem illustrate?

A

Given infinite time, random typing would eventually produce any text, like Shakespeare’s plays

11
Q

What are the ingredients of an Evolutionary Algorithm?

A
  • Population competing for resources
  • Selection of fitter individuals
  • Variation through mutation and recombination
12
Q

List some applications of Evolutionary Algorithms.

A
  • Planning
  • Design
  • Simulation
  • Identification
  • Control
  • Classification
13
Q

Fill in the blank: NIC uses _______ to inspire computing.

A

[nature]

14
Q

True or False: Evolutionary algorithms only apply to biological systems.

A

False

15
Q

What is the purpose of evolutionary algorithms?

A

To mimic survival and mutation to solve problems

16
Q

What is an example of a task in planning that evolutionary algorithms can help with?

A

Routing, scheduling, packing

17
Q

What book is recommended for further reading on genetic algorithms?

A

M. Mitchell – An Introduction to Genetic Algorithms (Ch.1)

18
Q

What is the goal of a Generic Evolutionary Algorithm (EA)?

A

Find best solution s to a problem.

19
Q

What does the fitness function f(s) represent?

A

How good solution s is.

20
Q

What is the first step in the Generic Evolutionary Algorithm process?

A

Start with random population (100–500 solutions).

21
Q

What are the three main steps repeated in the EA cycle?

A

Selection, Variation, Population update.

22
Q

How are parents chosen in the selection methods of EAs?

A

Based on fitness.

23
Q

What is mutation in the context of EAs?

A

Small changes to create new solutions.

24
Q

What does recombination do in EAs?

A

Mixes parents to create new solutions.

25
What are two ways to update the population in EAs?
* Replace all with new * Merge old + new, keep best.
26
What is a potential downside of always selecting the best individuals in EAs?
Premature convergence (bad).
27
What is the impact of representation in EAs?
Representation matters a lot: it shapes the fitness landscape and determines which operators can be used.
28
What type of problems are EAs particularly good for?
Optimisation & machine learning.
29
Give an example of a problem EAs can solve.
  • Lecture timetables
  • Pipe networks for water
  • Antenna design
  • Satellite orbits
  • Fighter aircraft strategies
  • Neural networks
  • Cancer radiotherapy treatment planning
30
What is the definition of optimisation?
Finding best solution in a set S.
31
What can the set S consist of in optimisation problems?
* Small (easy) * Huge (10³⁰ possible solutions) * Infinite (real numbers).
32
What is exhaustive search?
Try every possible solution (Enumeration).
33
When is exhaustive search feasible?
When S is small.
34
What characterizes easy problems in terms of complexity?
Tractable, in P, solvable in polynomial time.
35
What characterizes hard problems in terms of complexity?
Intractable, not known to be in P; best known algorithms take exponential time.
36
What is the Minimum Spanning Tree (MST) problem?
Connect all nodes in a graph with cheapest tree (no cycles).
37
What algorithm can solve the MST problem efficiently?
Prim’s algorithm (polynomial time).
38
What happens to the MST problem when constraints are added?
It becomes hard.
39
What is an example of a real-world application of optimisation?
NY Tunnels water distribution optimisation.
40
What is the estimated number of possible solutions for the NY Tunnels problem?
1.9 × 10²⁵ possible solutions.
41
What distinguishes exact algorithms from approximate algorithms?
Exact algorithms guarantee best solution, approximate algorithms provide 'good enough' solutions quickly.
42
What is a key characteristic of EAs?
Deliver good solutions in reasonable time.
43
What is a trade-off when using sophisticated methods like EAs?
Time vs quality.
44
What are the main components of EAs?
* Selection * Variation * Population update.
45
What is the conclusion about real-world problems in the context of EAs?
Nearly always hard.
46
Fill in the blank: Exact algorithms only work for _______.
[small/easy cases].
47
Fill in the blank: Approximate algorithms like EAs are a _______ choice.
[practical].
48
What is the basic principle of Evolutionary Algorithms (EAs)?
1. Define fitness function (objective measure).
2. Generate candidate solutions.
3. Apply:
  • Selection (choose parents)
  • Mutation (small changes)
  • Recombination/crossover (combine parents)
  • Replacement (decide which solutions survive)
Repeat until a stopping condition is met.
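The loop in this card can be sketched in Python. A minimal sketch under assumed settings: a toy one-max fitness, 2-way tournament selection, and steady-state replacement; none of these specifics come from the card.

```python
import random

def one_max(bits):
    """Toy fitness: number of 1s in the bit string (higher = better)."""
    return sum(bits)

def mutate(bits, rate=0.1):
    """Flip each bit independently with probability `rate`."""
    return [1 - b if random.random() < rate else b for b in bits]

def crossover(a, b):
    """Single-point crossover: a's head joined to b's tail."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def evolve(pop_size=30, length=20, iterations=200):
    # 1-2. Fitness defined above; generate random candidate solutions.
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(iterations):
        # 3a. Selection: each parent wins a 2-way tournament.
        p1 = max(random.sample(pop, 2), key=one_max)
        p2 = max(random.sample(pop, 2), key=one_max)
        # 3b-c. Recombination then mutation produce one child.
        child = mutate(crossover(p1, p2))
        # 3d. Replacement: child replaces the worst if at least as good.
        worst = min(pop, key=one_max)
        if one_max(child) >= one_max(worst):
            pop[pop.index(worst)] = child
    return max(pop, key=one_max)

best = evolve()
```

The `>=` acceptance in the replacement step means the population's best fitness never decreases, a property the later steady-state cards rely on.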
49
What are the types of algorithms in Evolutionary Computation?
  • Generational: replace whole population each cycle
  • Steady-state: replace only a few at a time
  • Elitist: best always survives
50
What are the methods of recombination (crossover) in EAs?
* Single-point * Multi-point * Uniform
51
What are some selection methods used in Evolutionary Algorithms?
* Rank-based * Roulette Wheel * Tournament
52
What are the replacement strategies in Evolutionary Algorithms?
* Replace weakest * Replace first weaker solution
53
What types of mutations are there in Evolutionary Algorithms?
* Single gene * Multiple gene
54
What parameters are important in Evolutionary Algorithms?
  • Population size
  • Number of generations
  • Mutation rate
  • Crossover rate
  • Tournament size
55
What is the goal of the Travelling Salesperson Problem (TSP)?
Must visit all cities once, return to start. Goal = shortest possible tour.
56
What are the steps involved in the Hill Climbing (HC) algorithm?
1. Start with random solution c.
2. Mutate → new solution m.
3. If m is as good or better → replace c with m.
4. Repeat until stopping condition.
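The four steps can be sketched in Python; the bit-string objective here (count of 1s) and the single-bit-flip neighbourhood are illustrative assumptions, not from the card.

```python
import random

def fitness(bits):
    """Toy objective: count of 1s (higher = better)."""
    return sum(bits)

def mutate(bits):
    """One-step neighbourhood: flip a single randomly chosen bit."""
    i = random.randrange(len(bits))
    return bits[:i] + [1 - bits[i]] + bits[i + 1:]

def hill_climb(length=16, steps=500):
    c = [random.randint(0, 1) for _ in range(length)]  # 1. random start
    for _ in range(steps):
        m = mutate(c)                  # 2. mutate c -> new solution m
        if fitness(m) >= fitness(c):   # 3. keep m if as good or better
            c = m
    return c                           # 4. stop after a fixed budget

solution = hill_climb()
```

On this unimodal toy landscape the climber reaches the all-ones optimum; the later cards explain why the same loop gets stuck on multimodal landscapes.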
57
Why is the algorithm called 'Hill Climbing'?
Imagine fitness as height on a landscape. HC climbs 'uphill' step by step, keeping improvements.
58
How is a solution encoded in Hill Climbing for TSP?
As a permutation of cities (e.g., ABDEC).
59
What is a search landscape in Evolutionary Algorithms?
Landscape = all solutions (S) plotted against fitness f(s).
60
What is a neighbourhood in the context of search landscapes?
All solutions reachable by one mutation.
61
What types of landscapes exist in search problems?
* Unimodal * Plateau * Multimodal (many peaks) * Deceptive (look good but lead to poor results)
62
What is a major problem with Hill Climbing?
It gets stuck in local optima.
63
What are two solutions to the problem of Hill Climbing getting stuck?
* Allow downhill moves (Local Search) * Use a population (multiple solutions explored at once)
64
What is the algorithm structure of Local Search?
Keep track of current solution c and best so far b. Explore neighbourhood → choose candidate m. Update c and b depending on policy.
65
What are the advantages of Local Search over Hill Climbing?
Allowing occasional downhill moves lets it escape some local optima that trap hill climbing.
66
What are the limitations of Local Search?
Still often stuck in local optima.
67
What is the main idea of Population-Based Search (PBS)?
Instead of one solution, keep a population.
68
What are the key differences between Local Search and Population-Based Search?
1. Need selection to choose which solution(s) to mutate. 2. Can recombine multiple solutions.
69
What are the benefits of Population-Based Search?
* Avoids local optima more effectively. * Explores multiple areas of search space in parallel.
70
How do EAs compare to traditional search methods like gradient ascent or Hill Climbing?
EAs with populations: * Mutations = local moves. * Crossovers = combine solutions → explore new areas.
71
What is the process of the EA on TSP using a Steady-State approach?
Start with small population of random solutions. Mutate selected parents → new solutions. Evaluate new solution:
  • If better than worst → replace
  • If worse → discard
72
What conclusions can be drawn about Evolutionary Algorithms?
  • Many variants of EAs (different selection, mutation, replacement)
  • Hill climbing & local search = simple but limited (get stuck)
  • Population-based search = much stronger, avoids local optima
73
What is a Generational GA?
A method that applies selection and genetic operators to create a whole new population. ## Footnote An elitist version copies the top n best solutions unchanged to the next generation.
74
What characterizes a Steady-State GA?
Applies operators a few times, replacing the weakest solutions immediately, leading to gradual population evolution.
75
What is the Replace Weakest strategy?
A strategy where a new solution replaces the overall worst solution.
76
What does low selection pressure lead to?
Almost random progress with no significant improvements.
77
What is the consequence of high selection pressure?
Premature convergence, where the population gets stuck.
78
Name a selection method that uses fitness proportionate probability.
Roulette Wheel selection.
79
What are the problems associated with Roulette Wheel selection?
* Minimisation/negative values tricky * Very fit individuals can dominate.
80
What is Rank-Based Selection?
A method where the population is sorted and probabilities are assigned by rank.
81
What does Tournament Selection involve?
Randomly picking t individuals and choosing the best among them.
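Tournament selection translates to very little code. A minimal Python sketch; the toy population and identity fitness function are illustrative.

```python
import random

def tournament_select(population, fitness, t=3):
    """Pick t individuals uniformly at random; return the fittest of them."""
    return max(random.sample(population, t), key=fitness)

pop = [1, 5, 9, 2, 7]
# With t equal to the population size, the tournament always returns the best.
winner = tournament_select(pop, fitness=lambda x: x, t=len(pop))
```

The tournament size t sets the selection pressure: t = 1 is uniform random selection, while t = len(population) always picks the best.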
82
What is the purpose of selection in genetic algorithms?
To determine where to focus the search in the solution space.
83
What are the two main purposes of genetic operators?
* Exploitation: refine good solutions * Exploration: search new areas.
84
List different types of representations (encodings) used in genetic algorithms.
* Binary strings * Integer vectors * Real values * Permutations * Trees.
85
How does the choice of encoding affect genetic algorithms?
* It shapes the fitness landscape * It determines which operators work best.
86
What is a single-gene mutation in k-ary encoding?
Changing one value randomly.
87
What is the purpose of swap mutation?
To swap two values, useful for permutations.
88
Describe the 1-point crossover method.
Split two parents at one point and swap parts.
89
What is a k-point crossover?
A general version of crossover that can involve multiple points.
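The 1-point case from the previous card can be sketched in Python (the k-point version generalises by cutting at several points); the parents shown are illustrative.

```python
import random

def one_point_crossover(p1, p2):
    """Split both parents at one random point and swap the tails."""
    point = random.randrange(1, len(p1))
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

c1, c2 = one_point_crossover([0, 0, 0, 0], [1, 1, 1, 1])
```

Whatever the cut point, every parent gene appears exactly once across the two children.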
90
What is the structure of a Steady-State, Mutation-Only EA?
  • Initialize random population
  • Select parent (tournament)
  • Mutate with probability
  • Replace worst if better
  • Repeat until termination
91
What distinguishes an Elitist, Generational EA?
  • Initialize random population
  • Select parents (rank-based)
  • Pair up for crossover & mutation
  • Keep best old solution (elitism)
  • Replace rest with children
  • Repeat for set number of generations
92
True or False: Selection must bias towards fitter solutions, but not too strongly.
True.
93
What does the design choice of representation in genetic algorithms affect?
It shapes the search landscape, necessitating careful selection of encoding and operators.
94
What is **Swarm Intelligence (SI)**?
Collective smart behaviour from simple individuals ## Footnote Key ideas include no central control and emergent behaviour through local interactions.
95
Name the **two key types** of swarm algorithms.
* Ant Colony Optimisation (ACO) * Particle Swarm Optimisation (PSO) ## Footnote These algorithms mimic natural systems to solve complex problems.
96
What can real ants (e.g., Lasius niger) do? List at least three abilities.
* Control nest temperature * Build bridges * Find shortest path to food ## Footnote Ants exhibit complex problem-solving abilities through simple individual actions.
97
Describe the **Double Bridge Experiment**.
Ants choose between two bridges; over time, they prefer the shorter one due to pheromone trails ## Footnote This experiment demonstrates how pheromone accumulation leads to emergent shortest-path finding.
98
What is **Stigmergy**?
Indirect communication through the environment ## Footnote It involves actions that trigger others or leave signals influencing future behaviour.
99
What are the two types of **stigmergy**?
* Sematonic stigmergy * Sign-based stigmergy ## Footnote Ants primarily use sign-based stigmergy by leaving pheromone trails.
100
Fill in the blank: The **basic ACO process** involves ants working on a graph with nodes representing _______.
cities/locations ## Footnote Edges represent paths between these nodes.
101
What happens after an ant completes a tour in the **ACO process**?
* Leaves pheromone on edges used * Evaporation reduces pheromone on all edges ## Footnote This helps in reinforcing good paths and prevents early domination.
102
What are the **key components** of ACO?
* Ants * Pheromone * Evaporation * Positive feedback * Exploration vs Exploitation ## Footnote These components work together to find optimal solutions.
103
Why does **ACO work** effectively?
* Mimics natural self-organisation * Uses distributed problem solving * Relies on pheromone feedback ## Footnote It is particularly effective for combinatorial optimisation problems.
104
List three **complex discrete optimisation problems** ACO can solve.
* Route planning * Scheduling * Network design ## Footnote ACO is well-suited for these types of problems due to its iterative improvement approach.
105
What problem does the **Travelling Salesperson Problem (TSP)** address?
* Visits each city exactly once * Minimises total distance ## Footnote TSP is NP-complete and serves as a classic testbed for optimisation algorithms.
106
Describe the core structure of the **Ant System (AS)** algorithm.
  • Randomly place num_ant ants on num_city cities
  • Each ant chooses next city using probabilistic state transition rule
  • Update pheromone trails after all ants complete tours
  • Repeat until stopping condition
## Footnote ACO involves ants constructing solutions step-by-step while communicating via pheromones.
107
What factors influence the **transition rule** in the Ant System?
* Visited or not? * Local heuristic value ηᵢⱼ = 1/dᵢⱼ * Pheromone value τᵢⱼ(t) ## Footnote The transition from city i to city j depends on these factors.
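In the standard Ant System these factors combine into a choice probability proportional to τᵢⱼ^α · ηᵢⱼ^β over the unvisited cities. A Python sketch; the α and β weights and the toy pheromone/heuristic values are illustrative.

```python
def transition_probs(tau, eta, allowed, alpha=1.0, beta=2.0):
    """Standard Ant System rule: probability of choosing city j is
    proportional to tau[j]**alpha * eta[j]**beta over the allowed cities."""
    weights = {j: (tau[j] ** alpha) * (eta[j] ** beta) for j in allowed}
    total = sum(weights.values())
    return {j: w / total for j, w in weights.items()}

# Equal pheromone on both options; the nearer city (larger eta = 1/d)
# gets the higher probability.
probs = transition_probs(tau={1: 0.5, 2: 0.5}, eta={1: 1.0, 2: 0.5}, allowed=[1, 2])
```

α and β trade off learned pheromone against the greedy distance heuristic: β = 0 ignores distance entirely, α = 0 reduces the ant to a randomised nearest-neighbour walk.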
108
What does the **update rule** in ACO do?
* Each ant deposits pheromone on edges used * Amount deposited is proportional to solution quality * Pheromone evaporates ## Footnote Evaporation prevents early convergence and encourages exploration.
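A sketch of this update in Python; the evaporation rate `rho` and deposit constant `q` are illustrative assumptions, and edges are keyed by city pairs.

```python
def update_pheromone(tau, tours, rho=0.5, q=1.0):
    """Evaporate every edge by factor (1 - rho), then let each ant deposit
    q / tour_length on the edges of its tour (shorter tour = more pheromone)."""
    new_tau = {edge: (1 - rho) * level for edge, level in tau.items()}
    for edges, length in tours:
        for e in edges:
            new_tau[e] += q / length
    return new_tau

tau = update_pheromone(
    {("A", "B"): 1.0, ("B", "C"): 1.0},
    tours=[([("A", "B")], 2.0)],  # one ant used edge A-B on a tour of length 2
)
```

Edges on no ant's tour only evaporate, so unused paths fade while good paths are reinforced, which is the positive-feedback loop the cards describe.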
109
What were the early results from the **Ant System**?
* Found optimal solutions for small problems (~30 cities) * Good results on larger problems, but not best-known * Combining ACO with local search produced world-class results ## Footnote TSP is highly static, and specialist algorithms can outperform basic ACO.
110
What are some **ACO variants** mentioned?
* Basic Ant System (AS) * Max–Min Ant System (MMAS) * Elitist Rank Ant System ## Footnote Variants differ in how pheromone updates are done.
111
True or false: ACO can only solve the **Travelling Salesperson Problem (TSP)**.
FALSE ## Footnote ACO can solve any problem representable as a path in a graph.
112
In the context of ACO, what is an example of **Single Machine Scheduling**?
* Jobs A–E with durations & deadlines * Only one machine → jobs scheduled one by one * Fitness options: average lateness, or max lateness ## Footnote Each job is a node, and ants build a sequence as a valid schedule.
113
What is the objective of the **Bin Packing** problem in ACO?
* Minimise weight difference across bins ## Footnote Ants choose bins for each item, building the packing incrementally like a path.
114
List the **key ACO components**.
* Transition Rule * Update Rule * Construction Graph ## Footnote ACO is competitive with evolutionary algorithms, especially on discrete optimisation.
115
Summarize the **high-level summary** of ACO.
  • ACO = ants building solutions + pheromone learning
  • Transition rule = how ants choose next step
  • Update rule = how good solutions reinforce trails
  • Works great on routing, scheduling, bin packing, sequencing
  • Variants help avoid stagnation & improve performance
  • Combine with local optimisers for best results
## Footnote This summary is designed to be ADHD-friendly.
116
What is **Swarm Intelligence**?
* Emergent intelligent behaviour * Arises from simple interactions between individuals ## Footnote Example: Slime-mold behaves like a single amoeba when food is plentiful, and merges into a multicellular “slug” when food is scarce.
117
Name the different types of **coordinated movement**.
* Flocking * Shoaling * Swarming * Formation travelling ## Footnote Flocking is the coordinated movement of animals in groups, such as birds and fish.
118
What are the typical **characteristics of flocking**?
  • Rapid movement of the entire group
  • Collective reactions to predators
  • Avoiding collisions
  • Flocks may merge or split
  • Tolerant of individuals joining/leaving
  • No leader — coordination is decentralised
  • Found across many species and environments
## Footnote Flocks can range from tiny groups to massive shoals up to 17 miles long.
119
What drives **flocking**?
* Attraction (aggregation) * Repulsion (segregation) * Neighbour mimetism * Environmental guidance ## Footnote These forces help in take-off, landing, turning, and route-keeping.
120
List the **performance advantages** of flocking.
* Accuracy & endurance * Energy savings * Anti-predator benefits ## Footnote Example: Geese flying in a V-formation can increase range by 70%.
121
Who created the **Boids** simulation and in what year?
Craig Reynolds, 1987 ## Footnote Boids are simple digital “birds” that simulate flocking behaviour.
122
What are the **three rules** of Reynolds’ Boids?
* Separation * Alignment * Cohesion ## Footnote These rules generate complex flocking behaviour.
123
What characteristics do **Boids’ flocks** exhibit?
* Spontaneously polarise * Synchronise turning * Merge if two flocks meet * Show flash expansion * Form mini-flocks if far apart ## Footnote These behaviours are demonstrated in simulations.
124
Define an **agent**.
An agent perceives its environment and acts on it through sensors and actuators ## Footnote Key concepts include percept and percept sequence.
125
What are examples of **agents**?
* Amazon Echo * Robot assistant * Self-driving car * Cybersecurity agent ## Footnote Agents can be software or physical systems.
126
What is meant by **environment** in the context of agents?
Where the agent works ## Footnote Characteristics include observability, determinism, dynamism, and task types.
127
What are the benefits of using **multi-agent systems (MAS)**?
* Solve complex problems collaboratively * Learn independently * Can be cooperative or competitive * Scalable ## Footnote Used in autonomous cars, robot factories, and automated trading.
128
What are the **challenges** of multi-agent systems?
* Complexity * Communication overhead * Fault detection ## Footnote These challenges arise from the need for agents to work together effectively.
129
What are the different **agent organisation structures**?
* Flat * Hierarchical * Teams * Coalitions ## Footnote These structures illustrate how agents coordinate and communicate.
130
What are the key challenges in **fault detection** in multi-agent systems?
* Centralised fault detection * Monitoring communication patterns * Identifying faulty agents ## Footnote Most research assumes homogeneous agents, which is often not the case in real systems.
131
What is **Particle Swarm Optimisation (PSO)**?
An optimisation algorithm inspired by flocking behaviour ## Footnote Invented by Kennedy & Eberhart in 1995, covered in detail next week.
132
Summarise the key points about **flocking**.
  • Collective movement with no leader
  • Driven by attraction vs repulsion
  • Benefits include energy savings and predator avoidance
  • Uses Reynolds' three rules: separation, alignment, cohesion
## Footnote Sets foundation for understanding multi-agent systems and PSO.
133
What is **Particle Swarm Optimisation (PSO)**?
An optimisation algorithm inspired by bird flocking and swarm behaviour ## Footnote PSO was introduced by Kennedy & Eberhart in 1995.
134
In PSO, what does a **particle** represent?
A candidate solution with position and velocity (but no mass) ## Footnote Particles move through the search space to find optimal solutions.
135
What influences a particle's movement in PSO?
* Its own best position found so far (pbest) * The swarm’s best position (gbest) ## Footnote This collective knowledge helps in quicker discovery of optimal solutions.
136
PSO works best on **continuous problems**. Name examples of continuous solutions.
* x = (-1.3220, 0.1302, 2.3125, -0.6435, -0.6435) ## Footnote Continuous problems are common in engineering optimisation tasks.
137
List some applications of PSO.
* Antenna design * Aircraft wings * Amplifiers * Controllers * Circuits ## Footnote PSO is widely used in various engineering optimisation tasks.
138
What are the basic steps of the **PSO algorithm**?
  • Create & initialise N particles with random positions and velocities
  • For each timestep:
    • For each particle:
      • Update position using velocity
      • Update velocity
## Footnote The algorithm iteratively improves the positions of particles.
139
What are the three parts of the **Velocity Update Rule** in PSO?
* Inertia * Cognitive term (personal best) * Social term (global best) ## Footnote These components guide the movement of particles in the search space.
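Per dimension the rule is v ← w·v + c1·r1·(pbest − x) + c2·r2·(gbest − x), with r1, r2 random in [0, 1]. A minimal Python version; the w, c1, c2 values are common defaults, not prescribed by the cards.

```python
import random

def update_velocity(v, x, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One PSO velocity step per dimension: inertia keeps the current
    heading, the cognitive term pulls towards the particle's own best,
    and the social term pulls towards the swarm's best."""
    return [
        w * vi
        + c1 * random.random() * (pb - xi)  # cognitive term (personal best)
        + c2 * random.random() * (gb - xi)  # social term (global best)
        for vi, xi, pb, gb in zip(v, x, pbest, gbest)
    ]

# Particle sits at its personal best, so only the social term pulls it
# towards the swarm best at (2, 0).
new_v = update_velocity(v=[0.0, 0.0], x=[1.0, 1.0], pbest=[1.0, 1.0], gbest=[2.0, 0.0])
```

With zero starting velocity and x = pbest, the new velocity must point towards gbest (non-negative in dimension 0, non-positive in dimension 1).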
140
What do the parameters **c1** and **c2** represent in PSO?
Acceleration coefficients controlling the importance of personal vs global experience ## Footnote Their values affect the convergence speed and exploration of the search space.
141
What is the purpose of **termination conditions** in PSO?
  • Max iterations
  • “Good enough” solution found
  • No improvement for N iterations
  • Swarm has converged (normalised radius ≈ 0)
## Footnote These conditions determine when the algorithm should stop running.
142
What is **Binary PSO (BPSO)**?
An adaptation of PSO to work with binary strings ## Footnote BPSO can be implemented using traditional or geometric approaches.
143
In **Traditional BPSO**, how is velocity interpreted?
As probability ## Footnote The update rule remains the same, but positions are updated based on this probability.
144
What is the key difference in **Geometric BPSO**?
No explicit velocity; position is updated in discrete Hamming space ## Footnote This approach behaves similarly to multi-parent crossover in genetic algorithms.
145
Summarise the key features of **PSO**.
  • Based on swarm behaviour (no leaders)
  • New solutions created using velocity update rule
  • Intuitive parameters (c1, c2, swarm size)
  • Can be extended to binary problems, dynamic optimisation, discrete search spaces
## Footnote PSO harnesses collective emergent intelligence from simple agents.
146
True or false: PSO uses **pheromone trails** like ACO.
FALSE ## Footnote PSO uses personal best and global best memory, while ACO relies on pheromone trails.
147
What analogy is used for **ADHD-Friendly Memory Anchors** in PSO?
* PSO = birds searching for food → everyone moves towards the best smell * Velocity = direction + memory (own best + swarm best) ## Footnote These anchors help in understanding PSO concepts intuitively.
148
What is the **main motivation** for using Multi-Objective Evolutionary Algorithms (MOEAs)?
* Maximise power output * Minimise cost ## Footnote These goals conflict, as a more expensive layout may yield more energy, while a cheaper layout may result in lower energy output.
149
Define a **Multi-objective problem**.
Each solution has M fitness values ## Footnote EAs must modify selection, objective handling, and visualisation to deal with multiple objectives.
150
What is the **dominance criterion** in MOEAs?
Solution a dominates b if: * a is at least as good as b on all objectives * a is better on at least one objective ## Footnote This concept helps in comparing solutions based on multiple objectives.
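The criterion translates directly to code (assuming minimisation on every objective; for maximisation the comparisons flip).

```python
def dominates(a, b):
    """True if objective vector a dominates b (minimisation): a is at
    least as good on every objective and strictly better on at least one."""
    return all(ai <= bi for ai, bi in zip(a, b)) and \
           any(ai < bi for ai, bi in zip(a, b))

r1 = dominates((1, 2), (2, 2))  # better on one objective, equal on the other
r2 = dominates((1, 3), (3, 1))  # trade-off pair: neither dominates the other
```

Note that dominance is only a partial order: trade-off pairs like (1, 3) and (3, 1) are incomparable, which is exactly why a Pareto front of non-dominated solutions exists.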
151
What is the **Pareto front**?
Set of all non-dominated solutions ## Footnote The front shape indicates the trade-off curve between conflicting objectives.
152
What are the characteristics of a **good Pareto front**?
* Even coverage * Wide spread ## Footnote Bad characteristics include big gaps and clustering in only part of the front.
153
How do you **rank solutions** with multiple objectives?
Use Non-dominated Sorting:
1. Find all Rank-0 individuals (non-dominated)
2. Remove them
3. Find next non-dominated group → Rank-1
4. Continue until all ranked
## Footnote This creates a hierarchy of solution sets.
154
What is the challenge in **crossover and mutation** in MOEAs?
Selection of non-dominated solutions ## Footnote A new selection operator is needed to decide between non-dominated solutions.
155
Describe the **dominance tournament selection** method.
Basic method:
1. Pick two individuals at random
2. If one dominates the other → choose it
3. If neither dominates → choose randomly
## Footnote Improved method introduces a comparison set to increase selection pressure.
156
What is the purpose of **tiebreaking** in MOEAs?
Ensure good spread of solutions ## Footnote When two solutions are non-dominated, the one in a less crowded region is preferred.
157
What is the method of **niching** in MOEAs?
Divide the solution space into niches ## Footnote Prefer solutions in less populated niches to maintain diversity on the Pareto front.
158
What does **NSGA-II** stand for?
Non-Dominated Sorting Genetic Algorithm II ## Footnote Key features include elitism, fast non-dominated sorting, and crowding distance for tie-breaking.
159
How does **crowding distance** work in NSGA-II?
Calculates how 'squeezed' a solution is: * Boundary solutions get infinite crowding distance * Others get distance = normalised difference between neighbours ## Footnote Higher distance indicates a more isolated solution, which is preferred.
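A minimal Python sketch of the crowding-distance calculation over a list of objective vectors; the example front is illustrative.

```python
def crowding_distance(front):
    """NSGA-II crowding distance for a front of objective vectors:
    boundary solutions get infinity; every other solution accumulates,
    per objective, the normalised gap between its two neighbours."""
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for obj in range(m):
        order = sorted(range(n), key=lambda i: front[i][obj])
        lo, hi = front[order[0]][obj], front[order[-1]][obj]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue  # degenerate objective: all values equal
        for k in range(1, n - 1):
            i = order[k]
            dist[i] += (front[order[k + 1]][obj] - front[order[k - 1]][obj]) / (hi - lo)
    return dist

d = crowding_distance([(0.0, 4.0), (1.0, 2.0), (3.0, 1.0), (4.0, 0.0)])
```

Here the two boundary points get infinite distance, and of the two interior points the more isolated one scores higher, so it would survive a crowding tie-break.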
160
Outline the **workflow of NSGA-II**.
Steps:
1. Create population (size N)
2. Generate offspring → size 2N
3. Apply fast non-dominated sort
4. Select best ranks until reaching N
5. Sort by crowding distance if last front doesn’t fit
6. Repeat
## Footnote This process ensures the maintenance of diversity and quality in solutions.
161
What does the **NSGA-II example** of offshore wind illustrate?
Graph interpretation: * Right/top = best power but most expensive * Bottom/left = cheapest but worst power ## Footnote NSGA-II generates a smooth Pareto front of trade-offs.
162
What are the advantages of **elitist algorithms** in MOEAs?
* No extra parameters * Faster convergence * Maintain best solutions ## Footnote However, they may lead to premature convergence on some problems.
163
What is a **weighted sum** approach in MOEAs?
Single-objective EA by weighting objectives ## Footnote Problems include needing multiple runs for different weights and inability to capture non-convex regions of the Pareto front.
164
What are the **three major issues** in many-objective optimisation (≥4 objectives)?
* Hard to visualise high-dimensional fronts * Dominance loses discriminative power * Number of solutions needed grows exponentially ## Footnote These challenges complicate the optimisation process.
165
List the **six objectives** in the many-objective offshore wind example.
  • Minimise turbines
  • Minimise area
  • Minimise annual energy prediction error
  • Minimise levelized cost of energy
  • Maximise annual energy output
  • Maximise efficiency
## Footnote Solved using NSGA-III and visualised using a parallel coordinate plot.
166
Summarise the **key points of MOEAs**.
  • Used for multiple conflicting objectives
  • Use Pareto dominance for comparison
  • Goal: approximate the Pareto front with good diversity
  • NSGA-II is the most famous and widely used
## Footnote Many-objective optimisation adds significant challenges.
167
What is a memory hook for **multi-objective** problems?
Multiple goals, no single 'best' ## Footnote This helps in remembering the nature of multi-objective optimisation.
168
What does **PSO** stand for in the context of optimization?
Particle Swarm Optimization ## Footnote PSO is a computational method used for optimizing a problem by iteratively improving candidate solutions.
169
In basic PSO, what are the two main updates for particles?
* Position update * Velocity update ## Footnote These updates are based on the best positions found by individual particles and the swarm.
170
In **Multi-Objective PSO**, what replaces **gbest**?
An archive ## Footnote The archive stores non-dominated solutions instead of a single global best solution.
171
How is **pbest** updated in Multi-Objective PSO?
  • If current solution dominates pbest → update
  • If pbest dominates current → keep old one
  • If neither dominates → two options exist: keep oldest or newest
## Footnote This update mechanism allows for better exploration and exploitation of solutions.
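The three cases map directly to code. A sketch assuming minimisation; the `keep_newest` flag chooses between the two tie options named in the card.

```python
def dominates(a, b):
    """a dominates b (minimisation): no worse everywhere, better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_pbest(pbest, current, keep_newest=True):
    """Pareto-based pbest update: take the dominating vector, otherwise
    fall back to the chosen tie-breaking policy (keep oldest or newest)."""
    if dominates(current, pbest):
        return current
    if dominates(pbest, current):
        return pbest
    return current if keep_newest else pbest

kept = update_pbest(pbest=(2.0, 2.0), current=(1.0, 2.0))  # current dominates
tied = update_pbest(pbest=(1.0, 3.0), current=(3.0, 1.0))  # neither dominates
```

Preferring the newest solution on ties biases the particle towards recently explored regions; keeping the oldest makes its memory more conservative.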
172
What is the purpose of the **archive** in Multi-Objective PSO?
* Stores non-dominated solutions * Represents the current Pareto front * Must be updated with new solutions ## Footnote The archive helps in maintaining a diverse set of solutions for decision-making.
173
Why is it important to **limit the size** of the archive?
* Large archive slows finding leaders * Too many solutions can be unhelpful for decision-makers ## Footnote Managing archive size is crucial for maintaining efficiency in the optimization process.
174
Name the two main **diversity mechanisms** used in Multi-Objective PSO.
* Clustering * Crowding Distance ## Footnote These mechanisms help maintain diversity within the archive by reducing redundancy.
175
What is the **clustering** method in diversity mechanisms?
* Use hierarchical clustering on archive solutions * Merge clusters until required number remains * Keep the solution closest to cluster centre ## Footnote This method helps in reducing the number of solutions while retaining representative points.
176
What does **crowding distance** measure in Multi-Objective PSO?
* Scores solutions by how crowded their region is * Isolated solutions → high score * Crowded solutions → low score ## Footnote This scoring encourages exploration of less populated areas in the solution space.
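A minimal sketch of the crowding-distance calculation (the standard NSGA-II-style formulation, assuming minimisation and objective vectors as tuples):

```python
def crowding_distance(front):
    """Score each solution in a front by how isolated it is.
    Boundary solutions get infinity so they are always preserved."""
    n = len(front)
    dist = [0.0] * n
    n_obj = len(front[0])
    for m in range(n_obj):
        order = sorted(range(n), key=lambda i: front[i][m])
        dist[order[0]] = dist[order[-1]] = float("inf")   # boundary points
        span = front[order[-1]][m] - front[order[0]][m] or 1.0
        for k in range(1, n - 1):
            # distance between the two neighbours along this objective
            dist[order[k]] += (front[order[k + 1]][m] - front[order[k - 1]][m]) / span
    return dist
```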
177
In Multi-Objective PSO, how are **leaders selected** from the archive?
* Random Leader Selection * Dominated Solution Count * Hypervolume Contribution ## Footnote These methods ensure that leaders are chosen based on their effectiveness and diversity.
178
What is the **hypervolume** in the context of leader selection?
Area/volume between solution and a reference point it dominates ## Footnote The leader is the archive solution with the maximum hypervolume contribution.
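For two objectives this area is just a rectangle. The sketch below uses each solution's own dominated rectangle as a simplified stand-in for its exclusive hypervolume contribution; the archive and reference point are illustrative.

```python
def hypervolume_2d(point, reference):
    """Rectangle area between a (minimisation) solution and a reference
    point that the solution dominates."""
    return max(0.0, reference[0] - point[0]) * max(0.0, reference[1] - point[1])

archive = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
ref = (5.0, 5.0)
# The leader is the archive solution with the maximum hypervolume
leader = max(archive, key=lambda p: hypervolume_2d(p, ref))
```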
179
What are the **three major issues** in Many-Objective PSO?
* Hard to visualize high-dimensional fronts * Reduction in search capability * Exponential growth in needed archive solutions ## Footnote These challenges complicate the optimization process when dealing with four or more objectives.
180
What is the **average rank** method in Many-Objective PSO?
* Rank archive solutions separately for each objective * Compute average rank ## Footnote Lower average ranks indicate better solutions, facilitating comparison among multiple objectives.
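The average-rank method can be sketched as below (minimisation assumed, rank 0 = best per objective):

```python
def average_rank(solutions):
    """Rank solutions separately per objective, then average the ranks.
    Lower average rank indicates a better solution overall."""
    n, n_obj = len(solutions), len(solutions[0])
    totals = [0.0] * n
    for m in range(n_obj):
        order = sorted(range(n), key=lambda i: solutions[i][m])
        for rank, i in enumerate(order):
            totals[i] += rank
    return [t / n_obj for t in totals]
```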
181
What does **distance ranking** encourage in Many-Objective PSO?
Exploration of sparse regions ## Footnote Solutions in less crowded areas are preferred, promoting diversity in the search space.
182
True or false: In Multi-Objective PSO, **dominance** is always effective.
FALSE ## Footnote Dominance becomes less effective in many-objective cases, necessitating alternative ranking methods.
183
What is the key takeaway about **MOPSO**?
* PSO can be adapted to multi-objective problems * Dominance is central but less effective in many-objective cases * Archive stores non-dominated solutions ## Footnote Mechanisms are needed to restrict archive size, maintain diversity, and select leaders.
184
What are the **two strengths** of computers compared to humans?
* Huge calculations instantly * Memory tasks ## Footnote Computers excel at tasks that require speed and accuracy.
185
What are the **two strengths** of humans compared to computers?
* Perception tasks * Learning and generalisation ## Footnote Humans are better at tasks like recognizing faces and finding objects.
186
List the characteristics of **computers**.
* Very fast serial processors * Almost no mistakes * Separate memory and processing * Store information indefinitely ## Footnote These features highlight the efficiency of computers in processing data.
187
List the characteristics of **humans**.
* Massively parallel brain architecture * Can learn, generalise * Processing + memory intertwined * Forget unimportant info * Gracefully degrade ## Footnote These traits allow humans to adapt and function even with damage.
188
How many **neurons** are estimated to be in the human brain?
≈ 100 billion ## Footnote This number is comparable to raindrops needed to fill an Olympic swimming pool.
189
What are the three parts of a **neuron**?
* Dendrites * Cell body * Axon ## Footnote These parts are essential for signal processing and memory in neurons.
190
What is the main difference between **Symbolic AI** and **Connectionism**?
* Symbolic AI: Explicit symbols & rules * Connectionism: Implicit numeric representations ## Footnote Symbolic AI is good for reasoning, while Connectionism excels in perception.
191
What happens at the **synapse** in a neuron?
Learning occurs through adjusting strength of connections ## Footnote Synapses can strengthen (excite) or weaken (inhibit) connections.
192
Who introduced the **Multi-Layer Perceptrons (MLPs)**?
Rumelhart & McClelland (1986) ## Footnote Their work popularised training MLPs with backpropagation, allowing networks to solve problems like XOR, which are not linearly separable.
193
What are two key properties of **Neural Networks (NNs)**?
* Generalisation * Graceful degradation ## Footnote Generalisation allows NNs to handle new data, while graceful degradation means performance reduces but does not fail completely.
194
In the **Rainfall Runoff Modelling** case study, what were the inputs used?
* Rainfall * Air temperature ## Footnote These inputs help predict sewer flow in urban areas.
195
What is the **fitness** measure used in ANN training?
Mean Absolute Error (MAE) ## Footnote This measure evaluates the accuracy of the model's predictions.
196
What does a **Nash–Sutcliffe Efficiency (NSE)** of 1 indicate?
Best performance ## Footnote NSE values range from 1 (best) to -∞ (worst).
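The two performance measures from this case study can be sketched directly from their standard definitions (the example series are illustrative):

```python
def mae(observed, simulated):
    """Mean Absolute Error: average magnitude of prediction errors."""
    return sum(abs(o - s) for o, s in zip(observed, simulated)) / len(observed)

def nse(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit, 0 means the model
    predicts no better than the observed mean, and negative is worse."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den
```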
197
Fill in the blank: **Neural computation is ideal for tasks where computers struggle, such as __________ and __________.**
perception, learning ## Footnote These tasks often require human-like capabilities.
198
What does **Connectionism** model?
Cognition via networks ## Footnote This approach mimics the structure and function of the brain.
199
What are **Spiking Neural Networks (SNNs)** designed to address?
* Process high-dimensional sensor data * React with low latency * Use little energy * Adapt to dynamic, uncertain environments ## Footnote Traditional ANNs struggle with these requirements.
200
What are the advantages of **Spiking Neural Networks** over traditional ANNs?
* More biologically realistic * Much more energy-efficient * Great for time-varying / sequential data * Used in robotics, speech recognition, sensory processing ## Footnote SNNs send information as spikes over time.
201
How do SNNs encode information?
* Rate coding * Temporal coding * Population coding ## Footnote Each method has a different approach to representing information through spikes.
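As a small illustration of the first scheme, rate coding can treat a normalised input value as a per-step spike probability, so stronger inputs fire more often (a common sketch, not a prescribed encoding):

```python
import random

def rate_encode(value, n_steps=20):
    """Rate coding: a value in [0, 1] becomes the probability of a spike
    at each time step, so larger values produce higher firing rates."""
    return [1 if random.random() < value else 0 for _ in range(n_steps)]
```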
202
What is a **Spike Train**?
A sequence of spikes over time ## Footnote Each neuron receives spikes, accumulates charge, and fires when a threshold is reached.
203
Describe the **neuron dynamics** in SNNs.
* Inputs drip in → water level rises * When full → bucket spills (fires a spike) * Water slowly leaks out if unused ## Footnote This creates time-dependent behaviour essential for temporal learning.
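The leaky-bucket analogy corresponds to a leaky integrate-and-fire neuron, sketched below; the threshold and leak factor are illustrative values.

```python
def lif_neuron(input_currents, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire: charge accumulates ('water drips in'),
    leaks each step, and a spike fires when the threshold is crossed."""
    membrane, spikes = 0.0, []
    for current in input_currents:
        membrane = membrane * leak + current   # leak, then integrate input
        if membrane >= threshold:
            spikes.append(1)                   # bucket spills: emit a spike
            membrane = 0.0                     # reset after firing
        else:
            spikes.append(0)
    return spikes
```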
204
What is **Spike-Timing-Dependent Plasticity (STDP)**?
* Potentiation: pre-synaptic fires before post-synaptic → connection strengthens * Depression: pre-synaptic fires after post-synaptic → connection weakens ## Footnote STDP is a form of unsupervised learning that matches biological processes.
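A common exponential form of the STDP rule can be sketched as follows; the learning-rate and time-constant values are illustrative, not taken from the course.

```python
import math

def stdp_delta_w(t_pre, t_post, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms).
    Pre before post -> potentiation; pre after post -> depression."""
    dt = t_post - t_pre
    if dt > 0:                                  # pre fired first: strengthen
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)        # pre fired after: weaken
```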
205
Why does **backpropagation** not work normally in SNNs?
Spiking function derivative = Dirac delta function: * Zero everywhere except at the threshold * Infinite at the threshold * Not differentiable in the usual sense ## Footnote This makes standard backprop impossible for true spikes.
206
What are **surrogate spiking functions** used for?
To train SNNs with backprop-style approaches ## Footnote They replace the spike with a smooth approximation, allowing gradient flow.
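One common surrogate is the derivative of a fast sigmoid: the forward pass keeps the hard threshold, while the backward pass pretends the derivative is smooth. The sharpness parameter `beta` below is an illustrative choice.

```python
def spike(v, threshold=1.0):
    """Forward pass: the true hard-threshold spike (non-differentiable step)."""
    return 1.0 if v >= threshold else 0.0

def surrogate_grad(v, threshold=1.0, beta=5.0):
    """Backward pass stand-in: derivative of a fast sigmoid, a smooth
    approximation used in place of the step's Dirac-delta derivative."""
    x = beta * (v - threshold)
    return beta / (1.0 + abs(x)) ** 2
```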
207
What can **Evolutionary Algorithms (EAs)** optimize in SNNs?
* Network structure * Weights * Delays * Thresholds * Encoding schemes ## Footnote EAs can evolve multiple components of SNNs.
208
What are the challenges of using **Neural Architecture Search (NAS)**?
* Very expensive * Co-evolution complexity ## Footnote SNN evaluation is more costly than ANN evaluation, and multiple components influence each other.
209
Summarize the key points about **SNNs**.
* Biologically realistic, event-driven neural networks * Encode info using rate, timing, or population coding * Neurons spike when charge hits threshold and leak over time * STDP = core unsupervised learning rule * Backprop cannot be used directly (spike derivative is a delta) * Surrogate functions make gradient-based training possible * EAs can optimise SNNs, but expensive * Energy-efficient and ideal for temporal tasks ## Footnote This summary encapsulates the main features and challenges of SNNs.
210
What is the **original meaning** of **neuromorphic computing** from the 1980s?
Hardware inspired directly by the brain, using mixed analogue–digital circuits ## Footnote Modern meaning includes any brain-inspired computing, especially spiking neural networks (SNNs) and specialised processors.
211
Key differences between **neuromorphic systems** and **classical computing** include:
* Memory + processing combined in synapses * Massively parallel * Event-driven * Information stored in spike patterns/timing ## Footnote Neuromorphic systems are more like a living brain than a digital CPU.
212
List the **core characteristics** of neuromorphic computing.
* Massive Parallelism * Memory + Processing in One Place * Event-Driven * Scalable ## Footnote These characteristics contribute to its efficiency and speed for specific tasks.
213
True or false: Neuromorphic hardware can be faster and more power-efficient than **GPUs** for certain tasks.
TRUE ## Footnote This is particularly true for tasks like perception, spiking data, or optimisation.
214
Applications of neuromorphic computing include **graph algorithms** such as:
* Routing * Network analysis * Random walks ## Footnote Mapping graph nodes to neurons allows for natural parallelism.
215
What types of **optimisation problems** can neuromorphic systems efficiently solve?
* Integer optimisation * Continuous optimisation * QUBO problems * LASSO regression ## Footnote Spiking systems can solve these problems efficiently due to their energy-based optimisation dynamics.
216
In **signal processing**, neuromorphic systems are especially good for:
* Autonomous vehicles * Seizure detection from high-frequency oscillations * Edge computing sensors ## Footnote Spiking is advantageous for time-critical tasks.
217
Describe the **SpiNNaker** neuromorphic hardware.
* Each chip has 18 ARM cores * 16,000 neurons & 8 million synapses simulated in real time * Uses only 1 watt ## Footnote Designed for large-scale brain simulations & SNN research.
218
What are the key features of **IBM TrueNorth**?
* 1,000,000 neurons * 256 million synapses * Event-driven * Uses the Corelet programming language ## Footnote TrueNorth is designed for brain-like processing.
219
What are the **challenges** faced by neuromorphic computing?
* Traditional ML outperforms neuromorphic * Training difficulty * Limited accessibility * Lack of benchmarks * Hard programming model ## Footnote These challenges hinder widespread adoption and development.
220
How do **neuromorphic cameras** operate?
* Each pixel emits a spike when brightness changes * Results in high temporal resolution * Very low power use ## Footnote They are ideal for robotics, drones, and AR/VR applications.
221
Fill in the blank: Neuromorphic computing is **________**-based, brain-inspired hardware.
spike ## Footnote It is much more energy-efficient than classical systems.
222
List the **major challenges** of neuromorphic computing.
* Training difficulty * Usability * Lack of benchmarks ## Footnote These challenges make it still experimental compared to mainstream deep learning.