Feature Engineering Flashcards

(46 cards)

1
Q

What is feature engineering?

A

The process of creating or transforming features to make data better suited for machine learning.

2
Q

What is the main goal of feature engineering?

A

To improve model performance, reduce computational needs, or enhance interpretability.

3
Q

What does mutual information measure?

A

The reduction in uncertainty about the target given a feature; detects any kind of relationship.

4
Q

How does mutual information differ from correlation?

A

MI detects any kind of dependence, while (Pearson) correlation only measures linear relationships.

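To make the contrast concrete, here is a small sketch (assuming scikit-learn and NumPy are available; the data is synthetic) where a quadratic relationship has near-zero Pearson correlation but high mutual information:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=2000)
y = x**2 + 0.01 * rng.normal(size=2000)  # nonlinear relationship, tiny noise

corr = abs(np.corrcoef(x, y)[0, 1])  # near zero: no linear trend
mi = mutual_info_regression(x.reshape(-1, 1), y, random_state=0)[0]  # clearly positive
```

Correlation misses the relationship entirely, while the MI score flags `x` as highly informative about `y`.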
5
Q

What is a common method to encode high-cardinality categorical features?

A

Target encoding (e.g., mean encoding with smoothing).

6
Q

What is target encoding?

A

Replacing categories with a number derived from the target (e.g., mean of target per category).

7
Q

Why is smoothing used in target encoding?

A

To prevent overfitting and handle rare or missing categories by blending category mean with overall mean.

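A minimal sketch of smoothed (m-estimate) target encoding in pandas; the column names and data are illustrative:

```python
import pandas as pd

df = pd.DataFrame({
    "city":  ["a", "a", "a", "b", "b", "c"],
    "price": [10.0, 12.0, 11.0, 20.0, 22.0, 5.0],
})

m = 2.0  # smoothing strength: larger m pulls harder toward the overall mean
overall = df["price"].mean()
stats = df.groupby("city")["price"].agg(["mean", "count"])

# blend each category mean with the overall mean, weighted by category size
encoding = (stats["count"] * stats["mean"] + m * overall) / (stats["count"] + m)
df["city_encoded"] = df["city"].map(encoding)
```

The rare category "c" (one row) is pulled strongly toward the overall mean, which is exactly the overfitting protection smoothing provides; in practice the encoding should also be fit on a split not used for training.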
8
Q

What is PCA (Principal Component Analysis) used for?

A

To reduce dimensionality and capture the main axes of variation in the data.

9
Q

What does PCA produce?

A

Principal components: new features that are linear combinations of original features.

10
Q

When should you scale data before PCA?

A

Almost always, and certainly when features have different units or scales: PCA is variance-driven, so features with larger ranges dominate the components.

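As a sketch of both points (assuming NumPy; data is synthetic), PCA can be computed from the SVD of the standardized data; on two highly correlated features, the first component captures nearly all the variance:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
X = np.hstack([x, 2 * x + 0.1 * rng.normal(size=(200, 1))])  # two correlated features

Xs = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize first: PCA is scale-sensitive
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)

scores = Xs @ Vt.T                # principal-component features (linear combinations)
explained = S**2 / np.sum(S**2)   # fraction of variance per component
```

The component scores are uncorrelated by construction, which is how PCA removes multicollinearity.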
11
Q

What is K-means clustering used for in feature engineering?

A

To create cluster labels as new categorical features based on similarity.

12
Q

What does the ‘k’ in K-means represent?

A

The number of clusters (centroids) to create.

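A sketch of cluster labels as a feature (assuming scikit-learn; two synthetic, well-separated groups):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
group1 = rng.normal(loc=[0.0, 0.0], scale=0.1, size=(20, 2))
group2 = rng.normal(loc=[5.0, 5.0], scale=0.1, size=(20, 2))
X = np.vstack([group1, group2])

Xs = StandardScaler().fit_transform(X)  # scale so both features weigh equally in distances
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)
# `labels` can now be appended to the dataset as a categorical feature
```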
13
Q

What is a group transform?

A

Aggregating information across rows grouped by a category (e.g., average income by state).

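In pandas, a group transform is one line with `groupby(...).transform(...)` (illustrative column names):

```python
import pandas as pd

df = pd.DataFrame({
    "state":  ["CA", "CA", "NY", "NY", "NY"],
    "income": [50.0, 70.0, 60.0, 80.0, 100.0],
})

# broadcast the per-state mean back onto every row as a new feature
df["avg_income_by_state"] = df.groupby("state")["income"].transform("mean")
```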
14
Q

How can you create interaction features?

A

By combining two or more features (e.g., multiplying, adding, or concatenating).

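Two common interaction patterns sketched in pandas (hypothetical columns): a numeric product and a categorical concatenation.

```python
import pandas as pd

df = pd.DataFrame({
    "width":  [2.0, 3.0],
    "height": [4.0, 5.0],
    "color":  ["red", "blue"],
    "shape":  ["circle", "square"],
})

df["area"] = df["width"] * df["height"]              # numeric interaction
df["color_shape"] = df["color"] + "_" + df["shape"]  # categorical interaction
```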
15
Q

What is a mathematical transform in feature engineering?

A

Applying arithmetic operations or functions (e.g., log, square, ratio) to features.

16
Q

What is a count feature?

A

A feature created by counting how many of several binary/boolean features are true for each row.

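A sketch of a count feature in pandas (the amenity columns are illustrative):

```python
import pandas as pd

df = pd.DataFrame({
    "has_pool":   [1, 0, 1],
    "has_garage": [1, 1, 0],
    "has_porch":  [0, 1, 0],
})

# count how many of the boolean amenity flags are set on each row
flags = ["has_pool", "has_garage", "has_porch"]
df["amenity_count"] = df[flags].gt(0).sum(axis=1)
```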
17
Q

What is a useful way to handle skewed numerical features?

A

Applying a log transformation to reduce skew and make the distribution more symmetric.

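A NumPy sketch on synthetic lognormal data; the `skewness` helper is defined here for illustration:

```python
import numpy as np

def skewness(a):
    # standardized third moment: positive for a long right tail
    z = (a - a.mean()) / a.std()
    return float((z ** 3).mean())

rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # strongly right-skewed

x_log = np.log1p(x)  # log(1 + x) is safe at zero and compresses the long right tail
```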
18
Q

What is the purpose of feature utility metrics?

A

To rank features by their potential usefulness for predicting the target.

19
Q

What is an example of a domain-motivated feature?

A

Creating a feature like ‘apparent temperature’ from temperature, humidity, and wind speed.

20
Q

Why is feature engineering important for linear models?

A

Linear models can only learn linear relationships; transformations can make relationships linear.

21
Q

What is a common pitfall when using target encoding?

A

Overfitting, especially with rare categories or without a separate encoding split.

22
Q

How can PCA help with multicollinearity?

A

It creates uncorrelated components, reducing redundancy among features.

23
Q

What is a Voronoi tessellation in K-means?

A

The partition of feature space into regions where each point belongs to the nearest centroid.

24
Q

What is the difference between overfitting and underfitting in feature engineering?

A

Overfitting: capturing noise; Underfitting: missing important patterns.

25
Q

How can you validate engineered features?

A

By measuring model performance on a validation set not used during feature creation.

26
Q

What is a ratio feature?

A

A new feature created by dividing one numerical feature by another.

27
Q

What is a segmentation feature?

A

A feature created by clustering to group similar data points (e.g., customer segments).

28
Q

What is the benefit of using cluster labels as features?

A

They simplify complex relationships by breaking data into homogeneous groups.

29
Q

What is the role of domain knowledge in feature engineering?

A

It helps identify meaningful transformations and interactions.

30
Q

What is a frequency encoding?

A

Replacing categories with their frequency or proportion in the dataset.
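A one-line sketch in pandas (illustrative data):

```python
import pandas as pd

s = pd.Series(["a", "a", "a", "b", "b", "c"])

# replace each category with its share of the rows
freq = s.value_counts(normalize=True)
encoded = s.map(freq)
```
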
31
Q

How can you handle missing values before feature engineering?

A

Imputation or dropping rows/columns, depending on context.

32
Q

What is the purpose of feature scaling before clustering?

A

To ensure all features contribute equally to distance calculations.

33
Q

What is a one-hot encoding?

A

Creating binary columns for each category of a categorical feature.

34
Q

What is label encoding?

A

Replacing categories with integer labels (ordinal encoding).
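Both encodings sketched in pandas (illustrative data; `get_dummies` for one-hot, category codes for label encoding):

```python
import pandas as pd

s = pd.Series(["red", "blue", "red"], dtype="category")

one_hot = pd.get_dummies(s, prefix="color")  # one binary column per category
codes = s.cat.codes                          # integer labels (categories sorted: blue=0, red=1)
```

Label encoding imposes an arbitrary order, so it suits tree models better than linear ones.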
35
Q

What is a synthetic feature?

A

A new feature created from existing ones (e.g., ratios, sums, products).

36
Q

What is the advantage of using tree-based models with count features?

A

Tree models can naturally aggregate counts across many binary features.

37
Q

What is the effect of normalization on neural networks?

A

It helps convergence by scaling features to a similar range (often 0 to 1 or -1 to 1).

38
Q

How can you use PCA for anomaly detection?

A

Low-variance components may capture unusual patterns not visible in original features.

39
Q

What is the 'm' parameter in m-estimate smoothing?

A

A smoothing factor controlling the blend between the category mean and the overall mean.

40
Q

What is a high-cardinality feature?

A

A categorical feature with many unique categories (e.g., zip codes).

41
Q

How can you create geographic segment features?

A

By clustering latitude and longitude coordinates with K-means.

42
Q

What is the purpose of mutual information scores?

A

To identify which features have the strongest relationship with the target.

43
Q

What is a built-in Pandas method for creating boolean features?

A

Using .gt(0) or similar comparison methods to create binary columns.
44
Q

How can you encode cyclical features (like hours or months)?

A

Using sine/cosine transformations to preserve cyclical nature.
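A NumPy sketch for hours of the day; the `gap` helper is defined here just to show that the encoding makes hour 23 and hour 0 neighbors:

```python
import numpy as np

hours = np.arange(24)
angle = 2 * np.pi * hours / 24          # map each hour onto the unit circle
hour_sin, hour_cos = np.sin(angle), np.cos(angle)

def gap(i, j):
    # distance between two hours in the encoded (sin, cos) plane
    return float(np.hypot(hour_sin[i] - hour_sin[j], hour_cos[i] - hour_cos[j]))
```
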
45
Q

What is the benefit of using PCA before modeling?

A

Reducing noise, decorrelating features, and improving model efficiency.

46
Q

What is the risk of using in-sample scores for feature selection?

A

It can lead to overfitting and overly optimistic performance estimates.