Coding Projects
Channel specialized for advanced concepts and projects to master:
* Python programming
* Web development
* Java programming
* Artificial Intelligence
* Machine Learning

Managed by: @love_data
✅ Top DSA Interview Questions with Answers: Part-2 🧠

11. What is the difference between BFS and DFS?
- BFS (Breadth-First Search): Explores neighbors first, level by level. Uses a queue. ➡️
- DFS (Depth-First Search): Explores depth (child nodes) first. Uses a stack or recursion. ⬇️
Used in graph/tree traversals, pathfinding, cycle detection. 🌳🔎
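The two traversals above can be sketched in a few lines of Python; the graph here is just an illustrative adjacency list, not from the original post:

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first search: visit level by level using a queue."""
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nb in graph.get(node, []):
            if nb not in visited:
                visited.add(nb)
                queue.append(nb)
    return order

def dfs(graph, start, visited=None):
    """Depth-first search: follow each branch to the end via recursion."""
    if visited is None:
        visited = set()
    visited.add(start)
    order = [start]
    for nb in graph.get(start, []):
        if nb not in visited:
            order.extend(dfs(graph, nb, visited))
    return order

g = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs(g, "A"))  # ['A', 'B', 'C', 'D']  -- level by level
print(dfs(g, "A"))  # ['A', 'B', 'D', 'C']  -- branch by branch
```

Note how the only real difference is the container: a FIFO queue gives BFS, a stack (here, the call stack) gives DFS.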

12. What is a Heap?
A complete binary tree that satisfies the heap property:
- Max-Heap: Parent ≥ children 🔼
- Min-Heap: Parent ≤ children 🔽
Used in priority queues, heap sort, scheduling algorithms. ⏰
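A quick sketch of the priority-queue use case with Python's standard `heapq` module (which implements a min-heap on a plain list); the task names are made up for illustration:

```python
import heapq

tasks = []
heapq.heappush(tasks, (2, "write report"))  # (priority, task)
heapq.heappush(tasks, (1, "fix outage"))
heapq.heappush(tasks, (3, "refactor"))

print(heapq.heappop(tasks))  # (1, 'fix outage') -- smallest key comes out first

# A max-heap can be simulated by negating the keys:
max_heap = [-n for n in [5, 1, 9]]
heapq.heapify(max_heap)
print(-heapq.heappop(max_heap))  # 9
```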

13. What is a Trie?
A tree-like data structure used to store strings. 🌲
Each node represents a character, so strings with a common prefix share a path.
Used in: autocomplete, spell-checkers, prefix search. 🔑
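A minimal trie sketch in Python showing insert and prefix search (the words are illustrative):

```python
class TrieNode:
    def __init__(self):
        self.children = {}   # char -> TrieNode
        self.is_word = False

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def starts_with(self, prefix):
        """Return True if any stored word begins with `prefix`."""
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return True

t = Trie()
for w in ["car", "card", "care"]:
    t.insert(w)
print(t.starts_with("car"))  # True
print(t.starts_with("cat"))  # False
```

Because "car", "card", and "care" share the path c → a → r, the trie stores that prefix only once, which is exactly what makes autocomplete fast.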

14. What is a Graph?
A graph is a collection of nodes (vertices) and edges. 🔗
- Can be directed/undirected, weighted/unweighted.
Used in: networks, maps, recommendation systems. 🗺️

15. Difference between Directed and Undirected Graph?
- Directed: Edges have direction (A → B ≠ B → A) ➡️
- Undirected: Edges are bidirectional (A – B) ↔️
The choice depends on the relationship being modeled (e.g., one-way "follows" vs. mutual friendships in social networks, or hyperlinks on the web).
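The directed/undirected distinction shows up directly in an adjacency-list representation; a small sketch (names are illustrative):

```python
# For an undirected edge we record it in both directions;
# for a directed edge, only one.
def add_edge(graph, u, v, directed=False):
    graph.setdefault(u, []).append(v)
    if directed:
        graph.setdefault(v, [])      # make sure v exists as a node
    else:
        graph.setdefault(v, []).append(u)

friends = {}                         # undirected: friendship is mutual
add_edge(friends, "Alice", "Bob")
print(friends)  # {'Alice': ['Bob'], 'Bob': ['Alice']}

follows = {}                         # directed: following is one-way
add_edge(follows, "Alice", "Bob", directed=True)
print(follows)  # {'Alice': ['Bob'], 'Bob': []}
```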

16. What is the time complexity of common operations in arrays and linked lists?
- Array: 🔢
  - Access: O(1)
  - Insert/Delete: O(n)
- Linked List: 🔗
  - Access: O(n)
  - Insert/Delete: O(1) at the head

17. What is recursion?
When a function calls itself to solve a smaller subproblem. 🔄
Requires a base case to stop infinite calls.
Used in: tree traversals, backtracking, divide & conquer. 🌳🧩

18. What are base case and recursive case?
- Base Case: Condition that ends recursion 🛑
- Recursive Case: Part where the function calls itself ➡️
Example:
def fact(n):
    if n == 0:
        return 1                # base case
    return n * fact(n - 1)      # recursive case


19. What is dynamic programming?
An optimization technique that solves problems by breaking them into overlapping subproblems and storing their results (memoization). 💾
Used in: Fibonacci, knapsack, LCS. 📈

20. Difference between Memoization and Tabulation?
- Memoization (Top-down): Uses recursion + caching 🧠
- Tabulation (Bottom-up): Uses iteration + a table 📊
Both store subproblem solutions to avoid redundant calculations.
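Both styles can be sketched on Fibonacci, the classic example from Q19; `fib_memo` is top-down, `fib_tab` is bottom-up:

```python
from functools import lru_cache

# Top-down memoization: recursion + a cache of already-computed results.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up tabulation: fill a table from the smallest subproblem up.
def fib_tab(n):
    if n < 2:
        return n
    table = [0] * (n + 1)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib_memo(30), fib_tab(30))  # 832040 832040
```

Either way the exponential naive recursion drops to O(n), because each subproblem is computed exactly once.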

💬 Double Tap ♥️ For Part-3
❀7πŸ‘1
✅ 15-Day Winter Training by GeeksforGeeks ❄️💻

🎯 Build 1 Industry-Level Project
🏅 IBM Certification Included
👨‍🏫 Mentor-Led Classroom Learning
📍 Offline in: Noida | Bengaluru | Hyderabad | Pune | Kolkata
🧳 Perfect for Minor/Major Projects Portfolio

🔧 MERN Stack:
https://gfgcdn.com/tu/WC6/

📊 Data Science:
https://gfgcdn.com/tu/WC7/

🔥 What You'll Build:
• MERN: Full LMS with auth, roles, payments, AWS deploy
• Data Science: End-to-end GenAI apps (chatbots, RAG, recsys)

📢 Limited Seats – Register Now!
✅ Top DSA Interview Questions with Answers: Part-3 🧠

21. What is the Sliding Window technique?
It's an optimization method used to reduce time complexity in problems involving arrays or strings. You maintain a "window" over a contiguous subset of the data and slide it forward, updating the result incrementally instead of recomputing it from scratch.
Example use case: Find the maximum sum of any k consecutive elements in an array.
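The example use case above can be sketched like this; instead of re-summing each window (O(n·k)), we add the entering element and subtract the leaving one (O(n)):

```python
def max_sum_k(nums, k):
    """Maximum sum of any k consecutive elements, in O(n) time."""
    window = sum(nums[:k])               # sum of the first window
    best = window
    for i in range(k, len(nums)):
        window += nums[i] - nums[i - k]  # slide: add new, drop old
        best = max(best, window)
    return best

print(max_sum_k([2, 1, 5, 1, 3, 2], k=3))  # 9  (the window 5 + 1 + 3)
```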

22. Explain the Two-Pointer technique.
This involves using two indices (pointers) to traverse a data structure, usually from opposite ends or the same direction. It's helpful for searching pairs or reversing sequences efficiently.
Common problems: Two-sum, palindrome check, sorted array partitioning.
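A sketch of the pairs use case, two-sum on a sorted array, with pointers starting from opposite ends:

```python
def two_sum_sorted(nums, target):
    """Return indices of two values in sorted nums summing to target."""
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        s = nums[lo] + nums[hi]
        if s == target:
            return lo, hi
        if s < target:
            lo += 1   # need a bigger sum: move the left pointer right
        else:
            hi -= 1   # need a smaller sum: move the right pointer left
    return None

print(two_sum_sorted([1, 3, 4, 6, 10], 9))  # (1, 3) -> 3 + 6
```

Because each step discards one element for good, the scan is O(n), versus O(n²) for checking every pair.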

23. What is the Binary Search algorithm?
It's an efficient algorithm to find an element in a sorted array by repeatedly dividing the search range in half.
Time Complexity: O(log n)
Key idea: Compare the target with the middle element and eliminate half the array each step.
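The key idea above, in a short iterative sketch:

```python
def binary_search(nums, target):
    """Return the index of target in sorted nums, or -1 if absent."""
    lo, hi = 0, len(nums) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if nums[mid] == target:
            return mid
        if nums[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return -1

print(binary_search([2, 5, 8, 12, 16], 12))  # 3
print(binary_search([2, 5, 8, 12, 16], 7))   # -1
```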

24. What is the Merge Sort algorithm?
A divide-and-conquer sorting algorithm that splits the array into halves, sorts them recursively, and then merges them.
Time Complexity: O(n log n)
Stable? Yes
Extra space? Yes, due to merging.
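A compact sketch of the split-sort-merge steps; the merge loop is where the O(n) extra space comes from, and the `<=` comparison is what keeps it stable:

```python
def merge_sort(nums):
    if len(nums) <= 1:
        return nums
    mid = len(nums) // 2
    left = merge_sort(nums[:mid])        # sort each half recursively
    right = merge_sort(nums[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:          # <= preserves stability on ties
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])              # one of these two is empty
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5]))  # [1, 2, 5, 5, 9]
```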

25. What is the Quick Sort algorithm?
It chooses a pivot, partitions the array so elements < pivot are on the left and elements > pivot are on the right, then recursively sorts both sides.
Time Complexity: Avg – O(n log n), Worst – O(n²)
Fast in practice, but not stable.
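A simplified sketch of the pivot-and-partition idea. Note this version builds new lists for clarity; production quick sort partitions in place (e.g., Lomuto or Hoare partitioning). The random pivot is one common way to avoid the O(n²) worst case on already-sorted input:

```python
import random

def quick_sort(nums):
    if len(nums) <= 1:
        return nums
    pivot = random.choice(nums)
    less    = [x for x in nums if x < pivot]
    equal   = [x for x in nums if x == pivot]
    greater = [x for x in nums if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)

print(quick_sort([5, 2, 9, 1, 5]))  # [1, 2, 5, 5, 9]
```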

26. Difference between Merge Sort and Quick Sort
• Merge Sort is stable, consistent in performance (O(n log n)), but uses extra space.
• Quick Sort is faster in practice and works in-place, but may degrade to O(n²) if the pivot is poorly chosen.

27. What is Insertion Sort and how does it work?
It builds the sorted list one item at a time by comparing and inserting items into their correct position.
Time Complexity: O(n²)
Best Case (nearly sorted): O(n)
Stable? Yes
Space: O(1)
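The compare-and-insert step above, sketched in place; the inner while loop shifts larger elements right to open a slot, which is why a nearly sorted input finishes in O(n):

```python
def insertion_sort(nums):
    """In-place insertion sort: O(n^2) worst case, O(1) extra space."""
    for i in range(1, len(nums)):
        key = nums[i]            # next item to place
        j = i - 1
        while j >= 0 and nums[j] > key:
            nums[j + 1] = nums[j]   # shift larger elements right
            j -= 1
        nums[j + 1] = key        # insert into the opened slot
    return nums

print(insertion_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]
```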

28. What is Selection Sort?
It repeatedly finds the smallest element in the unsorted part and swaps it with the first unsorted position.
Time Complexity: O(n²)
Space: O(1)
Stable? No (in its usual in-place form)
Rarely used due to inefficiency.

29. What is Bubble Sort and its drawbacks?
It repeatedly compares adjacent elements and swaps them if they are out of order, until a full pass makes no swaps.
Time Complexity: O(n²)
Space: O(1)
Drawback: Extremely slow for large data. Educational, not practical.

30. What is the time and space complexity of common sorting algorithms?
• Bubble Sort → Time: O(n²), Space: O(1), Stable: Yes
• Selection Sort → Time: O(n²), Space: O(1), Stable: No
• Insertion Sort → Time: O(n²), Space: O(1), Stable: Yes
• Merge Sort → Time: O(n log n), Space: O(n), Stable: Yes
• Quick Sort → Avg Time: O(n log n), Worst: O(n²), Space: O(log n), Stable: No

Double Tap ♥️ For Part-4