
Commit 8659b03: "Added all the changes"
2 parents: 20945e1 + 408c892

File tree

18 files changed: +331 −213 lines


Hash/Chaining/Readme.md

+27
# Chaining in Hashing

Chaining is a method used to resolve hash collisions in hash tables. Each bucket (or slot) of the hash table stores a linked list of all elements that hash to the same index. When a collision occurs, the new element is simply appended to the linked list at that index.

<p align="center">
  <img src="https://media.geeksforgeeks.org/wp-content/cdn-uploads/gq/2015/07/hashChaining1.png" alt="Chaining in Hashing">
</p>

### Time and Space Complexity

| **Operation** | **Average Case** | **Worst Case** |
|---------------|------------------|----------------|
| **Search**    | O(1)             | O(n)           |
| **Insert**    | O(1)             | O(n)           |
| **Delete**    | O(1)             | O(n)           |

### Explanation

- **Search**: On average, searching takes constant time because each bucket's linked list stays short when the hash function distributes keys evenly. In the worst case, all elements hash to the same bucket and the search degrades to a linear scan.

- **Insert**: Insertion takes constant time on average. When a collision occurs, the element is simply added to the linked list at the corresponding bucket.

- **Delete**: Deletion follows the same pattern as search: O(1) on average, degrading to O(n) when many elements share one bucket.

### Space Complexity

The space complexity of chaining is O(n + m), where n is the number of stored elements and m is the number of buckets: the table itself takes O(m), and the linked-list nodes take O(n) in total.

Hash/Linear Probing/Readme.md

+27
# Linear Probing in Hashing

Linear Probing is a collision resolution technique used in open addressing. When a collision occurs, linear probing checks the following slots one by one, moving linearly through the hash table until an empty slot is found.

<p align="center">
  <img src="https://media.geeksforgeeks.org/wp-content/uploads/Linear-Probing-1-1.jpg" alt="Linear Probing in Hashing">
</p>

### Time and Space Complexity

| **Operation** | **Average Case** | **Worst Case** |
|---------------|------------------|----------------|
| **Search**    | O(1)             | O(n)           |
| **Insert**    | O(1)             | O(n)           |
| **Delete**    | O(1)             | O(n)           |

### Explanation

- **Search**: Search is very efficient while the load factor is low. As more slots fill up, probe sequences grow longer, and in the worst case the search degrades to O(n).

- **Insert**: Insertion is generally efficient, but as the table nears capacity the algorithm may need to probe many slots before finding an available one, degrading to O(n).

- **Delete**: Deletion first performs a search, so it has the same complexity. Note that a removed slot is usually marked with a special "deleted" marker (a tombstone) rather than simply emptied, so that later searches do not stop short partway along a probe sequence.

### Space Complexity

Open addressing stores every element directly in the table, so the space used is O(n), where n is the size of the hash table; no additional space beyond the table itself is required.

README.md

+6-2
```diff
@@ -40,13 +40,17 @@ This repository contains implementations of various data structures and algorith
   - Min Heap
   - Max Heap
   - Heap Operations (Insert, Delete, Heapify)
-- [Hashing](./Hash/Chaining)
-  - Hashing with Chaining
+- [Hashing](./Hash)
+  - [Hashing with Chaining](./Hash/Chaining)
+  - [Hashing with Linear Probing](./Hash/Linear%20Probing)
+  - [Hashing with Quadratic Probing](./Hash/Quadratic%20Probing)
+  - [Hashing with Double Hashing](./Hash/Double%20Hashing)
 
 ## Algorithms
 
 - [Sorting Algorithms](./sorting%20algo)
 - [Recursion](./recursion)
+- [Searching Algorithms](./search%20algo)
 
 ## Notes
 
```

Search/BinarySearch/readme.md

+13
# Binary Search

Binary Search is a more efficient alternative to linear search for finding an element in a **sorted** list: it repeatedly divides the search interval in half. If the search key is less than the item in the middle of the interval, the search continues in the left half; otherwise, it continues in the right half.

<p align="center">
  <img src="https://media.geeksforgeeks.org/wp-content/uploads/20240506155201/binnary-search-.webp" alt="Binary Search">
</p>

### Time and Space Complexity

| **Algorithm**     | **Time Complexity** | **Space Complexity** |
|-------------------|---------------------|----------------------|
| **Binary Search** | O(log n)            | O(1)                 |

Search/LinearSearch/Readme.md

+13
# Linear Search

Linear Search is a simple search algorithm that checks each element of a list sequentially until it finds the target or reaches the end of the list.

<p align="center">
  <img src="https://media.geeksforgeeks.org/wp-content/uploads/20240506105158/Linear-Search-algorithm-banner-(1).webp" alt="Linear Search">
</p>

### Time and Space Complexity

| **Algorithm**     | **Time Complexity** | **Space Complexity** |
|-------------------|---------------------|----------------------|
| **Linear Search** | O(n)                | O(1)                 |

Search/Readme.md

+25
# Searching Algorithms

Searching algorithms check whether a specific element exists within a data structure and, if it does, retrieve its position.

<p align="center">
  <img src="https://media.geeksforgeeks.org/wp-content/cdn-uploads/20230726172447/Searching-algorithm.png" alt="Searching Algorithms">
</p>

## Linear Search

Linear Search is a simple algorithm that checks every element in the list sequentially until it finds the target or reaches the end.

### Time and Space Complexity

| **Algorithm**     | **Time Complexity** | **Space Complexity** |
|-------------------|---------------------|----------------------|
| **Linear Search** | O(n)                | O(1)                 |

## Binary Search

Binary Search is more efficient than Linear Search but requires a sorted array; it repeatedly divides the search interval in half.

### Time and Space Complexity

| **Algorithm**     | **Time Complexity** | **Space Complexity** |
|-------------------|---------------------|----------------------|
| **Binary Search** | O(log n)            | O(1)                 |

sorting algo/Insertion/Readme.md

+74
# Insertion Sort

![Insertion Sort Banner](https://media.geeksforgeeks.org/wp-content/uploads/20240408140301/Insertion-Sort.webp)

This repository contains the implementation of **Insertion Sort**, a simple and intuitive sorting algorithm that builds the final sorted array one element at a time. It is particularly efficient for small or partially sorted datasets.

---

## Table of Contents
1. [Introduction](#introduction)
2. [Algorithm Steps](#algorithm-steps)
3. [Time and Space Complexity](#time-and-space-complexity)
4. [Advantages and Disadvantages](#advantages-and-disadvantages)
5. [Usage](#usage)
6. [Contact](#contact)

---

## Introduction

Insertion Sort is a comparison-based sorting algorithm that divides the input into a sorted and an unsorted part. Elements from the unsorted part are picked one at a time and inserted into their correct position in the sorted part.

---

## Algorithm Steps

1. Start with the second element (index 1), treating the first element as an already sorted prefix.
2. Compare the current element with the elements in the sorted part of the array.
3. Shift all larger elements in the sorted part one position to the right to make room.
4. Insert the current element into its correct position.
5. Repeat until all elements are sorted.

**Example:**
For the array `[5, 3, 4, 1, 2]`, the algorithm proceeds as follows:
- Pass 1: `[3, 5, 4, 1, 2]`
- Pass 2: `[3, 4, 5, 1, 2]`
- Pass 3: `[1, 3, 4, 5, 2]`
- Pass 4: `[1, 2, 3, 4, 5]`

---

## Time and Space Complexity

| Case                 | Complexity | Explanation                                                          |
|----------------------|------------|----------------------------------------------------------------------|
| **Best Case**        | O(n)       | Array is already sorted; only one comparison per element.            |
| **Average Case**     | O(n²)      | On average, each element is compared with about half of the elements before it. |
| **Worst Case**       | O(n²)      | Array is sorted in reverse order; the maximum number of shifts is needed. |
| **Space Complexity** | O(1)       | In-place sorting algorithm; no additional space is required.         |

---

## Advantages and Disadvantages

### Advantages:
- Simple and easy to implement.
- Efficient for small datasets or nearly sorted data.
- Stable sorting algorithm (preserves the order of equal elements).

### Disadvantages:
- Inefficient for large datasets due to O(n²) time complexity.

---

## Usage

To compile and run the code for **Insertion Sort**, use the following commands:

```bash
# Compile the code
gcc insertion_sort.c -o insertion_sort

# Run the executable
./insertion_sort
```

sorting algo/InsertionSort.c

-30
This file was deleted.

sorting algo/Merge/Readme.md

+25
# Merge Sort

Merge Sort is a divide-and-conquer algorithm that splits an array into halves, recursively sorts each half, and then merges the sorted halves back together.

<p align="center">
  <img src="https://startutorial.com/img/merge-sort-split.jpg" alt="Merge Sort Illustration" width="500" height="500">
</p>

## How Merge Sort Works
1. **Divide**: Split the array into two halves.
2. **Conquer**: Recursively sort each half.
3. **Merge**: Combine the sorted halves into a single sorted array.

## Example
Consider the array `[38, 27, 43, 3, 9, 82, 10]`. Using merge sort:
1. Split into `[38, 27, 43, 3]` and `[9, 82, 10]`.
2. Continue splitting until single-element arrays are obtained.
3. Merge back step by step until the array is fully sorted: `[3, 9, 10, 27, 38, 43, 82]`.

## Time and Space Complexity

| **Complexity** | **Best Case** | **Average Case** | **Worst Case** |
|----------------|---------------|------------------|----------------|
| **Time**       | O(n log n)    | O(n log n)       | O(n log n)     |
| **Space**      | O(n)          | O(n)             | O(n)           |

sorting algo/MergeSort.c

-54
This file was deleted.

sorting algo/MergeSort.exe

-42 KB
Binary file not shown.

sorting algo/Quick/Readme.md

+25
# Quick Sort

Quick Sort is a divide-and-conquer algorithm. It picks an element as a pivot and partitions the given array around the pivot; the subarrays on either side of the pivot are then sorted recursively.

<p align="center">
  <img src="https://www.geeksforgeeks.org/wp-content/uploads/gq/2014/01/QuickSort2.png" alt="Quick Sort Illustration">
</p>

## How Quick Sort Works
1. **Choose a pivot**: Select an element from the array (commonly the last element) as the pivot.
2. **Partition**: Rearrange the array so that elements smaller than the pivot are on its left and larger elements are on its right.
3. **Recursively sort**: Apply the same process to the left and right subarrays.

## Example
Consider the array `[10, 80, 30, 90, 40, 50, 70]` with 70 chosen as the pivot:
1. Partition the array around 70: `[10, 30, 40, 50, 70, 90, 80]`.
2. Recursively sort the left and right subarrays: `[10, 30, 40, 50]` and `[90, 80]`.
3. Final sorted array: `[10, 30, 40, 50, 70, 80, 90]`.

## Time and Space Complexity

| **Complexity** | **Best Case** | **Average Case** | **Worst Case** |
|----------------|---------------|------------------|----------------|
| **Time**       | O(n log n)    | O(n log n)       | O(n²)          |
| **Space**      | O(log n)      | O(log n)         | O(n)           |
