Commit and master SQL in just 30 days!
I've outlined a simple, actionable plan for you to follow…
Week 1: Basics of SQL
✅ Day 1-2: Introduction to SQL, setting up your environment (MySQL/PostgreSQL/SQL Server).
✅ Day 3-4: Understanding databases, tables, and basic SQL syntax.
✅ Day 5-7: Working with SELECT, WHERE, and filtering data.
Week 2: Core Queries
✅ Day 8-10: Using JOINs - INNER, LEFT, RIGHT, FULL.
✅ Day 11-13: GROUP BY, HAVING, and aggregate functions (SUM, COUNT, AVG).
✅ Day 14: Practice session - write complex queries.
Week 3: Modifying Data
✅ Day 15-17: INSERT, UPDATE, DELETE - altering your data.
✅ Day 18-20: Subqueries, nested queries, and derived tables.
✅ Day 21: Practice session - work on a mini-project.
Week 4: Advanced SQL Topics and Projects
✅ Day 22-24: Window functions - RANK, DENSE_RANK, ROW_NUMBER.
✅ Day 25-27: Creating and managing indexes, views, and stored procedures.
✅ Day 28-30: Capstone project - design and query a database using real-world data.
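By the end of Week 2, a learner following this plan should be able to write a query like the one below (table and column names are illustrative):

```sql
-- Orders shipped per region, for regions with more than 10 shipped orders
SELECT region,
       COUNT(*) AS order_count,
       AVG(total) AS avg_order_value
FROM orders
WHERE status = 'shipped'
GROUP BY region
HAVING COUNT(*) > 10
ORDER BY order_count DESC;
```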
Here you can find SQL Interview Resources👇
https://whatsapp.com/channel/0029VanC5rODzgT6TiTGoa1v
Like this post if you need more 👍❤️
Share with credits: https://t.iss.one/sqlspecialist
Hope it helps :)
SQL books won't teach you this.
Natural Keys vs. Autoincrement IDs vs. Public IDs (or maybe all together).
Natural Keys
Natural keys carry intrinsic meaning because they are part of the domain.
They are directly related to the data, making them intuitive and easy to understand. Examples include email addresses or employee IDs.
The problem is that they are usually not good for performance, and they can also be a security risk if you expose them.
Autoincrement IDs
Autoincrement IDs automatically generate unique integers to identify rows within a table.
They are often used as primary keys.
Simple integers are fast for the database to index and query. They provide optimal performance.
However, they are vulnerable to enumeration attacks since predicting the next or previous record is easy.
Public IDs (UUIDs)
UUIDs (Universally Unique Identifiers) are 128-bit identifiers used to uniquely identify information without relying on a centralized authority.
They are difficult to guess, making them suitable for public exposure in APIs.
The problem is they are larger and more complex than integers. This can impact performance, particularly in indexing and storage.
Finding the Sweet Spot: A Mixed Approach
Combining different types of keys can offer a balanced solution:
• InternalID: used for internal operations and relationships between tables.
• PublicID: used in API responses and endpoints to securely reference user records.
• Email (natural key): used to ensure unique identification of users within the business logic.
The mixed approach keeps your system fast, secure, and easy to understand.
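One possible sketch of the mixed approach, in PostgreSQL-flavored SQL (table and column names are illustrative; `gen_random_uuid()` assumes PostgreSQL 13+):

```sql
CREATE TABLE users (
    internal_id BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY, -- fast joins inside the database
    public_id   UUID NOT NULL DEFAULT gen_random_uuid() UNIQUE,  -- safe to expose in API responses
    email       TEXT NOT NULL UNIQUE                             -- natural key for business logic
);
```

Internally you join on internal_id; externally you hand out public_id; uniqueness of users is still enforced through email.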
Here you can find essential SQL Interview Resources👇
https://whatsapp.com/channel/0029VanC5rODzgT6TiTGoa1v
Like this post if you need more 👍❤️
Hope it helps :)
How a SQL query gets executed internally - let's see it step by step!
We all know SQL, but most of us do not understand the internals of it.
Let me take an example to explain this better.
SELECT p.plan_name, COUNT(s.plan_id) AS total_count
FROM plans p
JOIN subscriptions s ON s.plan_id = p.plan_id
WHERE p.plan_name != 'premium'
GROUP BY p.plan_name
HAVING COUNT(s.plan_id) > 100
ORDER BY p.plan_name
LIMIT 10;
Step 01: Get the table data required to run the SQL query
Operations: FROM, JOIN (FROM plans p JOIN subscriptions s)
Step 02: Filter the data rows
Operations: WHERE (WHERE p.plan_name != 'premium')
Step 03: Group the data
Operations: GROUP BY (GROUP BY p.plan_name)
Step 04: Filter the grouped data
Operations: HAVING (HAVING COUNT(s.plan_id) > 100)
Step 05: Select the data columns
Operations: SELECT (SELECT p.plan_name, COUNT(s.plan_id))
Step 06: Order the data
Operations: ORDER BY (ORDER BY p.plan_name)
Step 07: Limit the data rows
Operations: LIMIT (LIMIT 10)
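If you want to see how your own database actually plans these steps, most engines offer an EXPLAIN command (output format varies by engine; EXPLAIN ANALYZE as shown is PostgreSQL syntax):

```sql
EXPLAIN ANALYZE
SELECT p.plan_name, COUNT(s.plan_id) AS total_count
FROM plans p
JOIN subscriptions s ON s.plan_id = p.plan_id
WHERE p.plan_name != 'premium'
GROUP BY p.plan_name
HAVING COUNT(s.plan_id) > 100
ORDER BY p.plan_name
LIMIT 10;
```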
Knowing the internals really helps.
Here you can find essential SQL Interview Resources👇
https://whatsapp.com/channel/0029VanC5rODzgT6TiTGoa1v
Like this post if you need more 👍❤️
Hope it helps :)
SQL for Data Analysis
Most people learn SQL just enough to pull some data. But if you really understand it, you can analyze massive datasets without touching Excel or Python.
Here are 7 game-changing SQL concepts that will make you a data pro:
👇
1. Stop pulling raw data. Start pulling insights.
The biggest mistake? Running a query that gives you everything and then filtering it later.
Good analysts don't pull raw data. They shape the data before it even reaches them.
"SELECT *" is a rookie move.
Pulling all columns is lazy and slow.
A pro only selects what they need.
✔️ Fewer columns = Faster queries
✔️ Less noise = Clearer insights
The more precise your query, the less time you waste cleaning data.
3. GROUP BY is your best friend.
You don't need 100,000 rows of transactions. What you need is:
✔️ Sales per region
✔️ Average order size per customer
✔️ Number of signups per month
Grouping turns chaotic data into useful summaries.
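For example, collapsing a raw transactions table into sales per region might look like this (table and column names are illustrative):

```sql
SELECT region, SUM(amount) AS total_sales
FROM transactions
GROUP BY region
ORDER BY total_sales DESC;
```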
4. Joins = Connecting the dots.
Your most important data is split across multiple tables.
Want to know how much each customer spent? You need to join:
✔️ Customer info
✔️ Order history
✔️ Payments
Joins = unlocking hidden insights.
5. Window functions will blow your mind.
They let you:
✔️ Rank customers by total purchases
✔️ Calculate rolling averages
✔️ Compare each row to the overall trend
It's like pivot tables, but way more powerful.
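As a sketch, ranking customers by total purchases could look like this (table and column names are hypothetical):

```sql
SELECT customer_id,
       SUM(amount) AS total_spent,
       RANK() OVER (ORDER BY SUM(amount) DESC) AS spend_rank
FROM orders
GROUP BY customer_id;
```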
6. CTEs will save you from spaghetti SQL.
Instead of writing a 50-line nested query, break it into steps.
CTEs (Common Table Expressions) make your SQL:
✔️ Easier to read
✔️ Easier to debug
✔️ Reusable
Good SQL is clean SQL.
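A small illustration of the idea (names are hypothetical; DATE_TRUNC assumes PostgreSQL): the intermediate step gets a readable name instead of being buried in a subquery.

```sql
WITH monthly_signups AS (
    SELECT DATE_TRUNC('month', created_at) AS signup_month,
           COUNT(*) AS signups
    FROM users
    GROUP BY 1
)
SELECT signup_month, signups
FROM monthly_signups
WHERE signups > 100
ORDER BY signup_month;
```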
7. Indexes = Speed.
If your queries take forever, your database is probably doing unnecessary work.
Indexes help databases find data faster.
If you work with large datasets, this is a game changer.
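Creating an index is usually a one-liner; here is an illustrative example for a column that gets filtered often (names are hypothetical):

```sql
-- Speeds up queries that filter or join on customer_id
CREATE INDEX idx_orders_customer_id ON orders (customer_id);
```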
SQL isn't just about pulling data. It's about analyzing, transforming, and optimizing it.
Master these 7 concepts, and you'll never look at SQL the same way again.
Join us on WhatsApp: https://whatsapp.com/channel/0029VanC5rODzgT6TiTGoa1v
Top Tools to learn Data Engineering
Join for more data engineering resources: https://t.iss.one/sql_engineer
The way you WRITE an SQL query differs from the way SQL PROCESSES it.
Your query might look like this:
1. SELECT
2. FROM
3. WHERE
4. GROUP BY
5. HAVING
6. ORDER BY
But SQL executes queries in a different order:
FROM (identifies table)
WHERE (filters rows)
GROUP BY (groups rows)
HAVING (filters grouped rows)
SELECT (chooses columns to display)
ORDER BY (sorts final output)
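Annotating a query with its logical processing order makes this concrete (table and column names are hypothetical):

```sql
SELECT department, AVG(salary) AS avg_salary   -- 5. choose columns
FROM employees                                 -- 1. identify table
WHERE active = 1                               -- 2. filter rows
GROUP BY department                            -- 3. group rows
HAVING AVG(salary) > 50000                     -- 4. filter groups
ORDER BY avg_salary DESC;                      -- 6. sort output
```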
Why SQL is a Must-Have Skill?
If you're working with data, mastering SQL is non-negotiable! It's the backbone of handling and making sense of vast datasets in any industry.
✅ Data at Your Fingertips
Effortlessly organize, retrieve, and manage large datasets to make informed decisions faster.
✅ Stay Organized
Use primary and foreign keys to keep your data accurate and connected across tables.
✅ Unlock Insights
Combine data from multiple sources and uncover trends using SQL's powerful query capabilities.
✅ Efficiency Matters
Optimize your databases with normalization and avoid unnecessary redundancy.
✅ Advanced Tools
From ACID transactions to choosing between DELETE and TRUNCATE, SQL makes sure your data is consistent and secure.
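As a quick illustration of that last point (table names are hypothetical; exact logging and rollback behavior varies by engine):

```sql
-- Removes only matching rows; row-by-row, logged, can usually be rolled back
DELETE FROM orders WHERE status = 'cancelled';

-- Removes ALL rows at once; much faster, minimally logged
TRUNCATE TABLE staging_orders;
```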
Here you can find essential SQL Interview Resources👇
https://whatsapp.com/channel/0029VanC5rODzgT6TiTGoa1v
Like this post if you need more 👍❤️
Hope it helps :)
SQL From Basic to Advanced level
Basic SQL is ONLY 7 commands:
- SELECT
- FROM
- WHERE (also use SQL comparison operators such as =, <=, >=, <> etc.)
- ORDER BY
- Aggregate functions such as SUM, AVG, COUNT etc.
- GROUP BY
- CREATE, INSERT, DELETE, etc.
You can do all this in just one morning.
Once you know these, take the next step and learn commands like:
- LEFT JOIN
- INNER JOIN
- LIKE
- IN
- CASE WHEN
- HAVING (understand how it's different from WHERE)
- UNION ALL
This should take another day.
Once both basic and intermediate are done, start learning more advanced SQL concepts such as:
- Subqueries (when to use subqueries vs CTE?)
- CTEs (WITH AS)
- Stored Procedures
- Triggers
- Window functions (LEAD, LAG, PARTITION BY, RANK, DENSE_RANK)
These can be done in a couple of days.
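To give a taste of the advanced tier, here is a CTE combined with a LAG window function (table and column names are hypothetical):

```sql
WITH daily_sales AS (
    SELECT sale_date, SUM(amount) AS total
    FROM sales
    GROUP BY sale_date
)
SELECT sale_date,
       total,
       LAG(total) OVER (ORDER BY sale_date) AS prev_day_total
FROM daily_sales
ORDER BY sale_date;
```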
Learning these concepts is NOT hard at all - what takes time is practice and knowing which command to use when. How do you master that?
- First, create a basic SQL project
- Then, work on an intermediate SQL project (search online)
- Lastly, create something advanced in SQL with many CTEs, subqueries, stored procedures, triggers, etc.
This is ALL you need to become a badass in SQL, and trust me when I say this, it is not rocket science. It's just logic.
Remember that practice is the key here. It will all become clearer with continuous practice.
SQL Interview Resources👇
https://whatsapp.com/channel/0029VanC5rODzgT6TiTGoa1v
Like this post if you need more 👍❤️
Hope it helps :)
5 Frequently Asked SQL Interview Questions with Answers in Data Engineering interviews:
Difficulty - Medium
▪️ Determine the Top 5 Products with the Highest Revenue in Each Category.
Schema: Products (ProductID, Name, CategoryID), Sales (SaleID, ProductID, Amount)
WITH ProductRevenue AS (
SELECT p.ProductID,
p.Name,
p.CategoryID,
SUM(s.Amount) AS TotalRevenue,
RANK() OVER (PARTITION BY p.CategoryID ORDER BY SUM(s.Amount) DESC) AS RevenueRank
FROM Products p
JOIN Sales s ON p.ProductID = s.ProductID
GROUP BY p.ProductID, p.Name, p.CategoryID
)
SELECT ProductID, Name, CategoryID, TotalRevenue
FROM ProductRevenue
WHERE RevenueRank <= 5;
▪️ Identify Employees with Increasing Sales for Four Consecutive Quarters.
Schema: Sales (EmployeeID, SaleDate, Amount)
WITH QuarterlySales AS (
SELECT EmployeeID,
DATE_TRUNC('quarter', SaleDate) AS Quarter,
SUM(Amount) AS QuarterlyAmount
FROM Sales
GROUP BY EmployeeID, DATE_TRUNC('quarter', SaleDate)
),
SalesTrend AS (
SELECT EmployeeID,
Quarter,
QuarterlyAmount,
LAG(QuarterlyAmount, 1) OVER (PARTITION BY EmployeeID ORDER BY Quarter) AS PrevQuarter1,
LAG(QuarterlyAmount, 2) OVER (PARTITION BY EmployeeID ORDER BY Quarter) AS PrevQuarter2,
LAG(QuarterlyAmount, 3) OVER (PARTITION BY EmployeeID ORDER BY Quarter) AS PrevQuarter3
FROM QuarterlySales
)
SELECT EmployeeID, Quarter, QuarterlyAmount
FROM SalesTrend
WHERE QuarterlyAmount > PrevQuarter1 AND PrevQuarter1 > PrevQuarter2 AND PrevQuarter2 > PrevQuarter3;
▪️ List Customers Who Made Purchases in Each of the Last Three Years.
Schema: Orders (OrderID, CustomerID, OrderDate)
WITH YearlyOrders AS (
SELECT CustomerID,
EXTRACT(YEAR FROM OrderDate) AS OrderYear
FROM Orders
GROUP BY CustomerID, EXTRACT(YEAR FROM OrderDate)
),
RecentYears AS (
SELECT DISTINCT EXTRACT(YEAR FROM OrderDate) AS OrderYear
FROM Orders
WHERE OrderDate >= CURRENT_DATE - INTERVAL '3 years'
),
CustomerYearlyOrders AS (
SELECT CustomerID,
COUNT(DISTINCT OrderYear) AS YearCount
FROM YearlyOrders
WHERE OrderYear IN (SELECT OrderYear FROM RecentYears)
GROUP BY CustomerID
)
SELECT CustomerID
FROM CustomerYearlyOrders
WHERE YearCount = 3;
▪️ Find the Third Lowest Price for Each Product Category.
Schema: Products (ProductID, Name, CategoryID, Price)
WITH RankedPrices AS (
SELECT CategoryID,
Price,
DENSE_RANK() OVER (PARTITION BY CategoryID ORDER BY Price ASC) AS PriceRank
FROM Products
)
SELECT CategoryID, Price
FROM RankedPrices
WHERE PriceRank = 3;
▪️ Identify Products with Total Sales Exceeding a Specified Threshold Over the Last 30 Days.
Schema: Sales (SaleID, ProductID, SaleDate, Amount)
WITH RecentSales AS (
SELECT ProductID,
SUM(Amount) AS TotalSales
FROM Sales
WHERE SaleDate >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY ProductID
)
SELECT ProductID, TotalSales
FROM RecentSales
WHERE TotalSales > 200;
Here you can find essential SQL Interview Resources👇
https://whatsapp.com/channel/0029VanC5rODzgT6TiTGoa1v
Like this post if you need more 👍❤️
Hope it helps :)
Why SQL Query Optimization Matters More Than You Think
Have you ever noticed your SQL queries dragging their feet, taking forever to return results? It's frustrating, right? The truth is that optimizing your SQL queries is essential if you want your applications to run smoothly and efficiently.
Here's why SQL optimization should be on your radar:
✅ Speed Matters: Did you know that optimized SQL queries can run up to 10 times faster? That means getting the data you need in seconds instead of minutes. Imagine how much more you could get done with that kind of speed!
✅ Save on Costs: Every second your query runs, it's using up resources like CPU and memory. By optimizing your queries, you can cut down on resource usage, which can save you a lot of money - especially if you're working in the cloud, where every bit of processing power is billed.
✅ Stay Scalable: Your data isn't getting any smaller, and as it grows, so does the load on your database. Optimization ensures that as your datasets grow, your queries can handle the increased workload without slowing down. This is crucial if you want to keep everything running smoothly as your business scales.
✅ Happy Users: No one likes waiting, especially not your users. Faster queries mean a better experience for everyone. When your application is responsive, it keeps users engaged and happy, which is a big win.
Here's the impact of getting it right:
✅ Businesses that focus on optimizing their SQL queries often see a 30-50% improvement in application performance.
✅ By tweaking just the slowest 10% of your queries, you can slash your database resource usage by up to 50%.
✅ Proper indexing and smart query design can turn minutes into seconds, which is a game-changer for user satisfaction and productivity.
Here you can find essential SQL Interview Resources👇
https://whatsapp.com/channel/0029VanC5rODzgT6TiTGoa1v
Like this post if you need more 👍❤️
Hope it helps :)
How to optimize SQL code:
1) Only select the necessary columns
This tip is especially important when you work with large tables, running window functions, or multiple joins.
DON'T: Select all the columns from the table.
DO: Select only the columns you need!
2) Always filter as soon as you can
If you only need certain rows from a table, apply the filters as early as you can.
Please don't wait until you've written all the CTEs and all the joins to realize you need to apply some filters… Your company will thank you for that.
3) Join your largest table at the end (if you can!)
We've all experienced it: when your biggest table is the base and you join other tables onto it, the result is a lot of scanning.
Look at these 2 queries:
1) SELECT * FROM big_table INNER JOIN table_1 ON ...
vs.
2) SELECT * FROM table_1 INNER JOIN big_table ON ...
The second query will be significantly cheaper in computing power, so make sure to always start with the smallest table as your base (as best you can!)
4) Be gentle with window functions
I've seen many people say "use window functions for your queries." Well, I don't think that's always a good idea.
Yes, they are powerful, but they are also resource-intensive, so use them only if necessary.
And ask yourself how much data they need to process!
5) More CTEs, fewer subqueries
Think about your audience. When you are writing a huge query with subqueries, it is (very!) hard for people to navigate.
Think of replacing them with CTEs - they will thank you for that!
And it will also be easier for you to debug and optimize (if needed)
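To illustrate tip 5, here is the same logic written both ways (names are hypothetical):

```sql
-- Harder to read: nested subquery
SELECT t.region, t.total
FROM (SELECT region, SUM(amount) AS total
      FROM sales
      GROUP BY region) AS t
WHERE t.total > 1000;

-- Easier to read and debug: CTE
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region, total
FROM region_totals
WHERE total > 1000;
```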
Here you can find essential SQL Interview Resources👇
https://whatsapp.com/channel/0029VanC5rODzgT6TiTGoa1v
Like this post if you need more 👍❤️
Hope it helps :)