TSQL Flashcards

(20 cards)

1
Q

What is the purpose of the DP-700 TestBank Setup?

A
  1. Stage raw DP-700 skills Excel in OneLake
  2. Register raw & prepared tables in Lakehouse
  3. Create SQL database, schema & target table
  4. Copy data into SQL DB and validate
2
Q

What is the first step in Imports & Config?

A

Load libraries and set any paths/connection strings.
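As a hedged sketch of this config step: the two OneLake paths below come from later cards, while the connection string is a placeholder (the deck never gives one) to be replaced with your Fabric SQL endpoint.

```python
# Imports & config for the TestBank setup (pandas + SQLAlchemy per the deck).
import pandas as pd
from sqlalchemy import create_engine  # engine creation deferred until credentials exist

# Paths taken from the cards in this deck:
RAW_XLSX = "/OneLake/TestBankLakehouse/Raw/DP700_SkillsMeasured.xlsx"
PREPARED_CSV = "/OneLake/TestBankLakehouse/Prepared/TestBankEntries.csv"

# Placeholder connection string -- substitute your real Fabric SQL endpoint:
SQL_CONN = "mssql+pyodbc://<server>/TestBankDB?driver=ODBC+Driver+18+for+SQL+Server"
# engine = create_engine(SQL_CONN)  # uncomment once the endpoint is configured
```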

3
Q

What libraries are imported in the setup?

A

Pandas and SQLAlchemy.

4
Q

What is the path for the raw Excel file in OneLake?

A

"/OneLake/TestBankLakehouse/Raw/DP700_SkillsMeasured.xlsx"

5
Q

What is the purpose of staging raw Excel in OneLake?

A

Upload DP700_SkillsMeasured.xlsx to /OneLake/TestBankLakehouse/Raw/ via the Fabric UI.

6
Q

How do you register a raw Lakehouse table?

A

Use a CREATE TABLE statement to map the staged Excel file as a queryable Lakehouse table.

7
Q

What is the SQL command to create the raw DP700 skills table?

A

CREATE TABLE TestBankLakehouse.Raw_DP700_Skills
USING CSV
OPTIONS (path '/OneLake/TestBankLakehouse/Raw/DP700_SkillsMeasured.xlsx', header 'true');

8
Q

What is the purpose of preparing Q&A in Python?

A

Enrich each row with Question/Answer/Notes/Resources, then save as CSV.

9
Q

What is the first step in preparing Q&A in Python?

A

Read the raw data using pandas.
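A minimal sketch of this read step. The real call would be pd.read_excel against the staged workbook; since the workbook's layout is not given in the deck, a small stand-in frame with an assumed two-column shape is used here so the sketch is runnable.

```python
import pandas as pd

# In the real setup this reads the staged workbook from OneLake:
#   df_raw = pd.read_excel("/OneLake/TestBankLakehouse/Raw/DP700_SkillsMeasured.xlsx")
# Stand-in frame with an assumed Category/Subcategory layout:
df_raw = pd.DataFrame({
    "Category": ["Skill area A"],
    "Subcategory": ["Topic 1"],
})
print(df_raw.shape)
```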

10
Q

What columns are added to the DataFrame for Q&A preparation?

A

Question, Answer, Notes, Resources.
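A sketch of the enrichment step: the four column names come from this card, while the starting frame is a stand-in for the raw data.

```python
import pandas as pd

# Stand-in for the frame read from the raw skills table:
df = pd.DataFrame({"Category": ["Skill area A"], "Subcategory": ["Topic 1"]})

# Add the four Q&A columns, empty until authored:
for col in ["Question", "Answer", "Notes", "Resources"]:
    df[col] = ""

print(list(df.columns))
```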

11
Q

What is the path for the prepared CSV file in OneLake?

A

"/OneLake/TestBankLakehouse/Prepared/TestBankEntries.csv"
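Writing the enriched frame out might look like this; a local temp directory stands in for the OneLake path above so the sketch runs anywhere, and the sample row is illustrative only.

```python
import tempfile
from pathlib import Path

import pandas as pd

# Illustrative enriched frame (one sample row, not real deck content):
df = pd.DataFrame({
    "Category": ["Skill area A"],
    "Subcategory": ["Topic 1"],
    "Question": ["Sample question?"],
    "Answer": ["Sample answer."],
    "Notes": [""],
    "Resources": [""],
})

# In Fabric the target is "/OneLake/TestBankLakehouse/Prepared/TestBankEntries.csv";
# a temp dir stands in here:
out = Path(tempfile.mkdtemp()) / "TestBankEntries.csv"
df.to_csv(out, index=False)
print(out.exists())
```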

12
Q

How do you register the prepared Lakehouse table?

A

Map your enriched CSV as a second Lakehouse table.

13
Q

What is the SQL command to create the TestBankEntries table?

A

CREATE TABLE TestBankLakehouse.TestBankEntries
USING CSV
OPTIONS (path '/OneLake/TestBankLakehouse/Prepared/TestBankEntries.csv', header 'true');

14
Q

What is the purpose of creating a Fabric SQL Database & Schema?

A

Provision a SQL DB and define your schema/table.

15
Q

What is the SQL command to create the schema and Questions table?

A

CREATE SCHEMA testbank;
CREATE TABLE testbank.Questions (
    Category NVARCHAR(100),
    Subcategory NVARCHAR(100),
    Question NVARCHAR(MAX),
    Answer NVARCHAR(MAX),
    Notes NVARCHAR(MAX),
    Resources NVARCHAR(MAX)
);

16
Q

What is the purpose of ingesting into SQL DB via Pipeline?

A

Build a Copy Data pipeline in Data Engineering.

17
Q

What are the source and sink for the Copy Data pipeline?

A

Source: Lakehouse table TestBankLakehouse.TestBankEntries
Sink: SQL DB TestBankDB, Schema testbank, Table Questions.

18
Q

What is the purpose of validating the load?

A

Quick check that everything landed correctly.

19
Q

What is the SQL command to validate the load?

A

SELECT TOP 10 * FROM testbank.Questions;
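The same validation can be run from Python. This sketch uses an in-memory SQLite engine as a stand-in for TestBankDB (SQLite supports neither TOP nor the testbank schema, so it counts rows in an unqualified table instead); against Fabric you would point create_engine at the real SQL endpoint.

```python
import pandas as pd
from sqlalchemy import create_engine, text

# SQLite stands in for the Fabric SQL DB in this runnable sketch:
engine = create_engine("sqlite://")
pd.DataFrame({"Question": ["Sample?"], "Answer": ["Sample."]}).to_sql(
    "Questions", engine, index=False
)

# Validate: row count in the sink should match the source table.
with engine.connect() as conn:
    n = conn.execute(text("SELECT COUNT(*) FROM Questions")).scalar()
print(n)
```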

20
Q

What are the next steps and tips after the setup?

A
  • Fill out all Q&A in the prepared CSV.
  • Automate pipeline on a schedule (e.g., daily).
  • Build Power BI or Notebook quiz UI on top of testbank.Questions.
  • Celebrate progress! 🎉