Curious why we are using
int midpoint = (start + end) / 2; // overflow issue when end is Integer.MAX_VALUE
instead of
int midpoint = start + (end - start)/2;
I think we should also mention that the maximum recursion depth is O(n) in the worst case (when the list is already sorted), so the worst-case space complexity is also O(n).
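If this refers to the quick sort example (first-element pivot), here is a minimal sketch of my own that counts the recursion depth rather than sorting, to show the depth reaching n - 1 on an already-sorted input:

def quick_sort_depth(array: list[int], depth: int = 0) -> int:
    # Returns the deepest recursion level reached, not the sorted list.
    if len(array) < 2:
        return depth
    pivot = array[0]  # first-element pivot, as in the simple quick sort below
    less = [x for x in array[1:] if x <= pivot]
    greater = [x for x in array[1:] if x > pivot]
    return max(quick_sort_depth(less, depth + 1),
               quick_sort_depth(greater, depth + 1))

print(quick_sort_depth(list(range(100))))  # 99: one recursion level per element on sorted input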
Thank you for pointing this out
Quick sort: In Python, do we need the extra start_ptr < end_ptr check for line 10 and line 12…?
Yes, it is a break condition for the inner while loop
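To illustrate the answer, here is a rough sketch of the kind of two-pointer partition being discussed (not the article's exact code; the Hoare-style layout and last-element pivot are my assumptions). The bound checks on start_ptr and end_ptr are what stop the inner loops from running past each other or off the slice:

def partition(arr: list[int], start: int, end: int) -> int:
    pivot = arr[end]
    start_ptr, end_ptr = start, end - 1
    while start_ptr <= end_ptr:
        # Without the pointer-bound checks, these inner loops could walk
        # past each other (or past the ends of the slice) on runs of equal values.
        while start_ptr <= end_ptr and arr[start_ptr] < pivot:
            start_ptr += 1
        while start_ptr <= end_ptr and arr[end_ptr] > pivot:
            end_ptr -= 1
        if start_ptr <= end_ptr:
            arr[start_ptr], arr[end_ptr] = arr[end_ptr], arr[start_ptr]
            start_ptr += 1
            end_ptr -= 1
    arr[start_ptr], arr[end] = arr[end], arr[start_ptr]  # put the pivot in its final place
    return start_ptr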
Pythonic merge sort version:
def merge_sort(array: list[int]) -> list[int]:
    if len(array) < 2:
        return array
    else:
        mid = len(array) // 2
        left = merge_sort(array[:mid])
        right = merge_sort(array[mid:])
        return merge(left, right)

def merge(left: list[int], right: list[int]) -> list[int]:
    result = []
    while len(left) > 0 and len(right) > 0:
        if left[0] <= right[0]:
            result.append(left.pop(0))
        else:
            result.append(right.pop(0))
    result += left
    result += right
    return result
Pythonic quick sort version:
def quick_sort(array: list[int]) -> list[int]:
    if len(array) < 2:
        return array
    else:
        pivot = array[0]
        less = [i for i in array[1:] if i <= pivot]
        greater = [i for i in array[1:] if i > pivot]
        return quick_sort(less) + [pivot] + quick_sort(greater)
Btw, here is ChatGPT's answer in JavaScript… You can't beat this! Concise and beautiful.
function quicksort(arr) {
    if (arr.length <= 1) return arr;
    let pivot = arr[arr.length - 1];
    let left = [];
    let right = [];
    for (let i = 0; i < arr.length - 1; i++) {
        if (arr[i] < pivot) {
            left.push(arr[i]);
        } else {
            right.push(arr[i]);
        }
    }
    return quicksort(left).concat([pivot], quicksort(right));
}
But sadly this example code is not an in-place sort, which is the main selling point of quick sort over merge sort despite quick sort's O(n^2) worst case. With this code, you get the same extra space usage as merge sort plus the poor worst-case complexity.
Again… the example code is terrible! It's as if the authors had never written code meant to be read by other humans.
Try this for merge sort:
from typing import List

class Solution:
    def merge(self, left_half, right_half):
        left_idx, right_idx = 0, 0
        result = []
        while left_idx < len(left_half) and right_idx < len(right_half):
            if left_half[left_idx] < right_half[right_idx]:
                result.append(left_half[left_idx])
                left_idx += 1
            else:
                result.append(right_half[right_idx])
                right_idx += 1
        while left_idx < len(left_half):
            result.append(left_half[left_idx])
            left_idx += 1
        while right_idx < len(right_half):
            result.append(right_half[right_idx])
            right_idx += 1
        return result

    def sortArray(self, nums: List[int]) -> List[int]:
        if len(nums) <= 1:
            return nums
        middle = len(nums) // 2
        left_half = self.sortArray(nums[:middle])
        right_half = self.sortArray(nums[middle:])
        return self.merge(left_half, right_half)
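For instance, a quick (hypothetical) sanity check of the class above:

print(Solution().sortArray([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]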
I need help with merge sort
function sortList(unsortedList) {
    const n = unsortedList.length;
    if (n <= 1) return unsortedList;
    const midpoint = Math.floor(n / 2);
    const leftList = sortList(unsortedList.slice(0, midpoint));
    const rightList = sortList(unsortedList.slice(midpoint));
    const res = [];
    let leftPtr = 0;
    let rightPtr = 0;
    while (leftPtr < midpoint || rightPtr < n - midpoint) {
        if (leftPtr === midpoint) {
            res.push(rightList[rightPtr]);
            rightPtr++;
        } else if (rightPtr === n - midpoint) {
            res.push(leftList[leftPtr]);
            leftPtr++;
        } else if (leftList[leftPtr] <= rightList[rightPtr]) {
            res.push(leftList[leftPtr]);
            leftPtr++;
        } else {
            res.push(rightList[rightPtr]);
            rightPtr++;
        }
    }
    return res;
}
Why are we doing rightPtr < n - midpoint? What does it even mean?
The only thing about this quicksort example that differs from the one provided is that it's not in place, since it's using new lists for less and greater. I think this would be a comparable in-place version:
def quick_sort(arr, start, end):
    if start < end:
        pivot = partition(arr, start, end)
        quick_sort(arr, start, pivot - 1)
        quick_sort(arr, pivot + 1, end)
    return arr

def partition(arr, start, end):
    pivot = arr[end]
    i = start
    for j in range(start, end):
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[end] = arr[end], arr[i]
    return i
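A quick usage check of the in-place version above (my own example; start and end are inclusive indices):

arr = [10, 80, 30, 90, 40, 50, 70]
quick_sort(arr, 0, len(arr) - 1)
print(arr)  # [10, 30, 40, 50, 70, 80, 90]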
if end - start <= 1:
for quick sort…
^^ Shouldn't that be end - start <= 0:, since if the difference is 1 that means there are 2 elements and they could be unsorted?
I was confused about how recursion was used in the quick sort example. This video uses an example of [10, 80, 30, 90, 40, 50, 70], which helped clear things up for me.
The n - midpoint gives the length of the right list. So effectively, it's saying "while rightPtr has not reached the end of the right list." TBH this whole condition could have been more cleanly written as:
while (leftPtr < leftList.length || rightPtr < rightList.length) {
Although this implementation is helpful to visualize how the algorithm works more concisely, the pop(0) operation is inefficient, as we're removing elements from the beginning of the arrays. I made an analysis using the profile module on my machine with a reversed list containing 1 million elements, and it took 78 seconds to finish, where ~45 seconds were spent on the pop() operations. Using the timeit module with three executions, it took 41.6 seconds on avg.
The implementation below is also Pythonic but took only 0.93 seconds on avg using three executions with timeit, and 29.987 seconds on profile, where most of the time was spent on the merge operation itself:
from typing import List

def merge(left: List[int], right: List[int]) -> List[int]:
    result = []
    left_idx, right_idx = 0, 0
    while left_idx < len(left) and right_idx < len(right):
        if left[left_idx] < right[right_idx]:
            result.append(left[left_idx])
            left_idx += 1
        else:
            result.append(right[right_idx])
            right_idx += 1
    result.extend(left[left_idx:])
    result.extend(right[right_idx:])
    return result

def merge_sort(arr: List[int]) -> List[int]:
    if len(arr) < 2:
        return arr
    mid = len(arr) // 2
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    return merge(left, right)
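For reference, a rough sketch of how such a timing run might be set up (my own setup; the numbers above are from the commenter's machine and will vary):

import timeit

setup = "data = list(range(1_000_000, 0, -1))"  # reversed list of 1M elements, as in the analysis above
avg = timeit.timeit("merge_sort(data)", setup=setup, globals=globals(), number=3) / 3
print(f"{avg:.2f} s per call")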