Generative AI Flashcards

(22 cards)

1
Q

Generative AI Purpose

A

to learn how to replicate the features of data, not merely to understand those features

2
Q

Large Language Models (LLM)

A

a probabilistic model of text that determines the probability of the next word in a sequence occurring, based on the previous words
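A minimal sketch of this idea, using a toy bigram model (the tiny corpus and function names are made up for illustration; real LLMs use neural networks, not counts):

```python
from collections import Counter, defaultdict

# toy corpus standing in for real training text
corpus = "the cat sat on the mat the cat ate".split()

# count bigrams: P(next | previous) ~ count(prev, next) / count(prev)
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word_prob(prev, nxt):
    """Probability of `nxt` given the previous word `prev`."""
    total = sum(counts[prev].values())
    return counts[prev][nxt] / total if total else 0.0

# 2 of the 3 occurrences of "the" are followed by "cat" -> 2/3
print(next_word_prob("the", "cat"))
```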

3
Q

Transformers

A

instead of handling information sequentially, a transformer handles all information in parallel and can determine probabilities based on relationships between words; this means it needs to see ALL of the information at once

4
Q

Self-Attention

A

the process that determines the relevance and importance of each word relative to the others
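A pure-Python sketch of scaled dot-product self-attention, the standard formulation (the toy vectors are made up; real models use learned query/key/value projections and matrix libraries):

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(q, k, v):
    """Scaled dot-product attention.
    q, k, v: lists of vectors, one per token."""
    d = len(q[0])
    out = []
    for qi in q:
        # similarity of this token's query to every token's key
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d)
                  for kj in k]
        # weights = relevance/importance of each word (sums to 1)
        weights = softmax(scores)
        # output is a weighted mix of the value vectors
        out.append([sum(w * vj[t] for w, vj in zip(weights, v))
                    for t in range(len(v[0]))])
    return out
```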

5
Q

Transformers are En___ and De___ based

A

Encoder; Decoder

6
Q

Transformer Encoder

A

takes in the input and builds a representation (understanding) of it

7
Q

Transformer Decoder

A

takes a representation and generates an output

8
Q

LLM are ___-based

A

decoder

9
Q

Semantic Search

A

where documents are converted to embeddings and stored in a database, so that the user's input can be embedded the same way and evaluated by similarity in order to return the most similar information
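A minimal sketch of the similarity step, assuming embeddings have already been computed (the document names and vectors here are invented; real systems use an embedding model and a vector database):

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# toy "vector database": documents with made-up embeddings
db = {
    "doc_returns_policy": [0.9, 0.1, 0.0],
    "doc_shipping_times": [0.1, 0.9, 0.1],
}

def semantic_search(query_vec, k=1):
    """Return the k documents most similar to the query embedding."""
    ranked = sorted(db, key=lambda d: cosine(query_vec, db[d]), reverse=True)
    return ranked[:k]
```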

10
Q

Retrieval-Augmented Generation (RAG)

A

makes semantic search generative by feeding the retrieved results to an LLM, which generates output from them
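The retrieve-then-generate flow can be sketched as follows (`llm_generate` and `retrieve` are hypothetical stand-ins for a real LLM API and a semantic-search function):

```python
def llm_generate(prompt):
    """Hypothetical stand-in for any LLM completion API."""
    return f"(answer grounded in: {prompt!r})"

def rag_answer(question, retrieve):
    # 1. retrieve: semantic search returns the most similar documents
    docs = retrieve(question)
    # 2. augment: stuff the retrieved text into the prompt as context
    context = "\n".join(docs)
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    # 3. generate: the LLM answers from the supplied context
    return llm_generate(prompt)
```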

11
Q

Encoder-Only Transformer

A

transforms input data into a sequence of vectors

12
Q

Decoder-Only Transformer

A

generates sequences, such as text, based on the input

13
Q

Encoder-Decoder Based Transformer

A

combines an encoder for input processing and a decoder for sequence generation (Used for Machine Translation; Sequence to Sequence Task)

14
Q

LLM Prompt Engineering: Prompt

A

the input or initial text provided to the model

15
Q

LLM Prompt Engineering

A

the process of iteratively refining a prompt for the purpose of eliciting a particular style/response

16
Q

LLM In-Context Learning

A

conditioning (prompting) an LLM with instructions and/or demonstrations of the tasks it's meant to complete

17
Q

LLM k-shot Prompting

A

explicitly providing k examples of the intended task in the prompt
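For example, a 2-shot (k = 2) sentiment-classification prompt might be built like this (the reviews and labels are invented for illustration):

```python
# build a 2-shot prompt: k explicit examples precede the new input
examples = [
    ("great film, loved it", "positive"),
    ("boring and too long", "negative"),
]
query = "an instant classic"

prompt = "Classify the sentiment of each review.\n\n"
for text, label in examples:
    prompt += f"Review: {text}\nSentiment: {label}\n\n"
# the model is expected to continue from the trailing "Sentiment:"
prompt += f"Review: {query}\nSentiment:"
```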

18
Q

LLM Chain-of-Thought Prompting

A

providing examples in a prompt whose responses include a reasoning step
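A sketch of such a prompt (the arithmetic word problems are made up): the worked example shows its intermediate reasoning before the answer, so the model is nudged to reason the same way on the new question.

```python
# chain-of-thought: the in-prompt example demonstrates intermediate reasoning
cot_prompt = (
    "Q: A cafe sold 3 coffees at $4 each. How much money did it make?\n"
    "A: Each coffee is $4 and 3 were sold, so 3 * 4 = $12. The answer is 12.\n\n"
    "Q: A shop sold 5 books at $2 each. How much money did it make?\n"
    "A:"
)
```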

19
Q

Hallucination

A

model generated text that is non-factual and/or ungrounded

20
Q

When to use Prompt Engineering

A

when the LLM already understands the topic but you want better responses, elicited by feeding it examples

21
Q

When to use RAG (Retrieval-Augmented Generation)

A

when the data it's built on changes rapidly, or you want to mitigate hallucinations

22
Q

When to use LLM Fine-Tuning

A

when the LLM doesn't perform well on a specific task, the data you'd need to supply via prompt engineering is too large, and/or latency is too high