Top DSA Interview Questions with Answers: Part-2 🧠
11. What is the difference between BFS and DFS?
- BFS (Breadth-First Search): Explores neighbors first (level by level). Uses a queue.
- DFS (Depth-First Search): Explores depth (child nodes) first. Uses a stack or recursion.
Used in graph/tree traversals, pathfinding, cycle detection.
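A minimal Python sketch of both traversals on an adjacency-list graph (the `graph` dict in the usage note is a made-up example):

```python
from collections import deque

def bfs(graph, start):
    """Level-by-level traversal using a queue."""
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nbr in graph[node]:
            if nbr not in visited:
                visited.add(nbr)
                queue.append(nbr)
    return order

def dfs(graph, start, visited=None):
    """Depth-first traversal using recursion (an implicit stack)."""
    if visited is None:
        visited = set()
    visited.add(start)
    order = [start]
    for nbr in graph[start]:
        if nbr not in visited:
            order.extend(dfs(graph, nbr, visited))
    return order
```

For `graph = {'A': ['B', 'C'], 'B': ['D'], 'C': [], 'D': []}`, `bfs` visits A, B, C, D while `dfs` visits A, B, D, C.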
12. What is a Heap?
A binary tree with heap properties:
- Max-Heap: Parent ≥ children
- Min-Heap: Parent ≤ children
Used in priority queues, heap sort, scheduling algorithms.
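A quick illustration of the priority-queue use case with Python's built-in `heapq` (a min-heap); the task list is invented:

```python
import heapq

tasks = [(3, "write report"), (1, "fix outage"), (2, "review PR")]
heap = []
for priority, name in tasks:
    heapq.heappush(heap, (priority, name))  # O(log n) insert

# A min-heap always yields the smallest key first,
# so popping gives the tasks in priority order.
order = [heapq.heappop(heap)[1] for _ in range(len(heap))]
```

Here `order` comes out as `["fix outage", "review PR", "write report"]`.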
13. What is a Trie?
A tree-like data structure used to store strings.
Each node represents a character.
Used in: autocomplete, spell-checkers, prefix search.
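A bare-bones Trie supporting insert and prefix search, sketched in Python (class and method names are our own):

```python
class TrieNode:
    def __init__(self):
        self.children = {}    # one edge per character
        self.is_word = False  # marks the end of a stored word

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def starts_with(self, prefix):
        """True if any stored word begins with `prefix` (prefix search)."""
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return True
```

After inserting "cat" and "car", `starts_with("ca")` is True and `starts_with("dog")` is False.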
14. What is a Graph?
A graph is a collection of nodes (vertices) and edges.
- Can be directed/undirected, weighted/unweighted.
Used in: networks, maps, recommendation systems.
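In code, a graph is most often represented as an adjacency list; a small hand-made example of both an unweighted and a weighted variant:

```python
# Undirected, unweighted graph: each vertex maps to its neighbors.
graph = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A"],
    "D": ["B"],
}

# A weighted, directed graph can instead map to (neighbor, weight) pairs.
weighted = {"A": [("B", 5)], "B": [("C", 2)], "C": []}
```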
15. Difference between Directed and Undirected Graph?
- Directed: Edges have direction (A → B ≠ B → A)
- Undirected: Edges are bidirectional (A ↔ B)
Used differently based on relationships (e.g., social networks vs. web links).
16. What is the time complexity of common operations in arrays and linked lists?
- Array:
  - Access: O(1)
  - Insert/Delete: O(n)
- Linked List:
  - Access: O(n)
  - Insert/Delete: O(1) at head
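The O(1) head insertion and O(n) access trade-off can be seen in a minimal singly linked list sketch (the `Node`/`push_front` names are ours):

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def push_front(head, value):
    """O(1) insertion at the head -- no element shifting, unlike an array."""
    return Node(value, head)

head = None
for v in [3, 2, 1]:
    head = push_front(head, v)

# Access is O(n): walk the chain node by node to collect the values.
values = []
node = head
while node:
    values.append(node.value)
    node = node.next
```

Here `values` ends up as `[1, 2, 3]`.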
17. What is recursion?
When a function calls itself to solve a smaller subproblem.
Requires a base case to stop infinite calls.
Used in: tree traversals, backtracking, divide & conquer.
18. What are base case and recursive case?
- Base Case: Condition that ends recursion
- Recursive Case: Part where the function calls itself
Example:
def fact(n):
    if n == 0: return 1   # base case
    return n * fact(n-1)  # recursive case
19. What is dynamic programming?
An optimization technique that solves problems by breaking them into overlapping subproblems and storing their results (memoization).
Used in: Fibonacci, knapsack, LCS.
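The Fibonacci example with memoization, using Python's standard `functools.lru_cache` as the cache:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Top-down DP: each subproblem is computed once, then served from cache."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```

Without the cache this is O(2^n); with it, `fib(50)` returns instantly.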
20. Difference between Memoization and Tabulation?
- Memoization (Top-down): Uses recursion + caching
- Tabulation (Bottom-up): Uses iteration + table
Both store solutions to avoid redundant calculations.
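For contrast with the memoized version, the same Fibonacci computed bottom-up with a table:

```python
def fib_tab(n):
    """Bottom-up DP: fill a table from the base cases upward, no recursion."""
    if n < 2:
        return n
    table = [0] * (n + 1)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
```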
💬 Double Tap ♥️ For Part-3
15-Day Winter Training by GeeksforGeeks
Build 1 Industry-Level Project
IBM Certification Included
Mentor-Led Classroom Learning
Offline in: Noida | Bengaluru | Hyderabad | Pune | Kolkata
Perfect for Minor/Major Projects Portfolio
MERN Stack:
https://gfgcdn.com/tu/WC6/
Data Science:
https://gfgcdn.com/tu/WC7/
What You'll Build:
• MERN: Full LMS with auth, roles, payments, AWS deploy
• Data Science: End-to-end GenAI apps (chatbots, RAG, recsys)
Limited Seats – Register Now!
Top DSA Interview Questions with Answers: Part-3 🧠
21. What is the Sliding Window technique?
It's an optimization method used to reduce time complexity in problems involving arrays or strings. You create a "window" over a subset of data and slide it as needed, updating results on the go.
Example use case: Find the maximum sum of any k consecutive elements in an array.
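The example use case above can be sketched as a fixed-size window that adds the incoming element and drops the outgoing one:

```python
def max_sum_k(nums, k):
    """Maximum sum of any k consecutive elements in O(n)."""
    window = sum(nums[:k])               # sum of the first window
    best = window
    for i in range(k, len(nums)):
        window += nums[i] - nums[i - k]  # slide: add new element, drop old
        best = max(best, window)
    return best
```

For `[2, 1, 5, 1, 3, 2]` with `k = 3`, the answer is 9 (the window `5 + 1 + 3`).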
22. Explain the Two-Pointer technique.
This involves using two indices (pointers) to traverse a data structure, usually from opposite ends or the same direction. It's helpful for searching pairs or reversing sequences efficiently.
Common problems: Two-sum, palindrome check, sorted array partitioning.
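A sketch of the opposite-ends variant, finding a pair summing to a target in a sorted array (the two-sum problem listed above):

```python
def two_sum_sorted(nums, target):
    """Return indices of two values summing to target in a sorted list, O(n)."""
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        s = nums[lo] + nums[hi]
        if s == target:
            return (lo, hi)
        if s < target:
            lo += 1   # need a bigger sum: move the left pointer right
        else:
            hi -= 1   # need a smaller sum: move the right pointer left
    return None
```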
23. What is the Binary Search algorithm?
It's an efficient algorithm to find an element in a sorted array by repeatedly dividing the search range in half.
Time Complexity: O(log n)
Key idea: Compare the target with the middle element and eliminate half the array each step.
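The key idea above in code, halving the search range on every comparison:

```python
def binary_search(nums, target):
    """Return the index of target in a sorted list, or -1 if absent. O(log n)."""
    lo, hi = 0, len(nums) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if nums[mid] == target:
            return mid
        if nums[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1
```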
24. What is the Merge Sort algorithm?
A divide-and-conquer sorting algorithm that splits the array into halves, sorts them recursively, and then merges them.
Time Complexity: O(n log n)
Stable? Yes
Extra space? Yes, due to merging.
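A compact (not in-place) Python sketch of the split-sort-merge steps described above:

```python
def merge_sort(nums):
    """Divide, sort each half recursively, then merge. O(n log n), O(n) extra."""
    if len(nums) <= 1:
        return nums
    mid = len(nums) // 2
    left = merge_sort(nums[:mid])
    right = merge_sort(nums[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps equal elements in order (stable)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]   # append whichever half remains
```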
25. What is the Quick Sort algorithm?
It chooses a pivot, partitions the array so elements < pivot are left, and > pivot are right, then recursively sorts both sides.
Time Complexity: Avg: O(n log n), Worst: O(n²)
Fast in practice, but not stable.
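A short sketch of the pivot-and-partition idea; note this list-comprehension version uses extra space for clarity, whereas classic Quick Sort partitions in place:

```python
def quick_sort(nums):
    """Pick a pivot, partition around it, recurse on each side. Avg O(n log n)."""
    if len(nums) <= 1:
        return nums
    pivot = nums[len(nums) // 2]
    smaller = [x for x in nums if x < pivot]
    equal = [x for x in nums if x == pivot]
    larger = [x for x in nums if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)
```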
26. Difference between Merge Sort and Quick Sort
β’ Merge Sort is stable, consistent in performance (O(n log n)), but uses extra space.
β’ Quick Sort is faster in practice and works in-place, but may degrade to O(nΒ²) if pivot is poorly chosen.
27. What is Insertion Sort and how does it work?
It builds the sorted list one item at a time by comparing and inserting items into their correct position.
Time Complexity: O(n²)
Best Case (nearly sorted): O(n)
Stable? Yes
Space: O(1)
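The shift-and-insert mechanism in code, sorting in place:

```python
def insertion_sort(nums):
    """Grow a sorted prefix by inserting each element into position. Stable, O(1) space."""
    for i in range(1, len(nums)):
        key = nums[i]
        j = i - 1
        while j >= 0 and nums[j] > key:   # shift larger elements one slot right
            nums[j + 1] = nums[j]
            j -= 1
        nums[j + 1] = key                 # drop key into its correct position
    return nums
```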
28. What is Selection Sort?
It finds the smallest element from the unsorted part and swaps it with the beginning.
Time Complexity: O(n²)
Space: O(1)
Stable? No
Rarely used due to inefficiency.
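The find-minimum-and-swap step as a sketch:

```python
def selection_sort(nums):
    """Repeatedly swap the minimum of the unsorted suffix to the front. O(n^2)."""
    for i in range(len(nums)):
        # Index of the smallest element in the unsorted part nums[i:].
        min_idx = min(range(i, len(nums)), key=nums.__getitem__)
        nums[i], nums[min_idx] = nums[min_idx], nums[i]
    return nums
```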
29. What is Bubble Sort and its drawbacks?
It repeatedly compares and swaps adjacent elements if out of order.
Time Complexity: O(n²)
Space: O(1)
Drawback: Extremely slow for large data. Educational, not practical.
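The compare-and-swap passes in code, with the common early-exit optimization when a pass makes no swaps:

```python
def bubble_sort(nums):
    """Swap out-of-order neighbors on each pass; stop early if none swapped."""
    for n in range(len(nums), 1, -1):
        swapped = False
        for i in range(n - 1):
            if nums[i] > nums[i + 1]:
                nums[i], nums[i + 1] = nums[i + 1], nums[i]
                swapped = True
        if not swapped:   # already sorted: best case becomes O(n)
            break
    return nums
```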
30. What is the time and space complexity of common sorting algorithms?
• Bubble Sort – Time: O(n²), Space: O(1), Stable: Yes
• Selection Sort – Time: O(n²), Space: O(1), Stable: No
• Insertion Sort – Time: O(n²), Space: O(1), Stable: Yes
• Merge Sort – Time: O(n log n), Space: O(n), Stable: Yes
• Quick Sort – Avg Time: O(n log n), Worst: O(n²), Space: O(log n), Stable: No
Double Tap ♥️ For Part-4