In lecture we have mostly focused on the worst case for each sorting method. The best case occurs when the input list is already sorted, so the method has the least possible work to do. Time complexity describes how the running time of each sorting method grows with the number of elements n in the list. For comparison-based sorting, O(n log n) is generally considered optimal, while O(n^2) is considered poor. Some sorting methods we've seen are insertion sort, merge sort, selection sort, bubble sort, and quicksort. During our week 10 lab, my partner and I found that bubble sort took the longest to sort a randomly generated list of n elements. No matter what n was, bubble sort always used the most time, while quicksort, merge sort, and Python's built-in sorted did best, using the least time of all the methods we compared.
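A small sketch of the kind of timing experiment described above: it implements bubble sort (the O(n^2) method) and times it against Python's built-in sorted on the same random list. The function names, list size, and value range here are illustrative choices, not the exact setup from the lab.

```python
import random
import time

def bubble_sort(items):
    """Classic bubble sort: repeatedly swap adjacent out-of-order pairs (O(n^2))."""
    a = list(items)          # sort a copy so the input is untouched
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:      # already sorted: this gives the O(n) best case
            break
    return a

def time_sort(sort_fn, data):
    """Return the seconds sort_fn takes on a copy of data."""
    start = time.perf_counter()
    sort_fn(list(data))
    return time.perf_counter() - start

data = [random.randint(0, 10_000) for _ in range(2_000)]
print(f"bubble sort:     {time_sort(bubble_sort, data):.4f}s")
print(f"built-in sorted: {time_sort(sorted, data):.4f}s")
```

On lists of even a few thousand elements, the quadratic method's disadvantage is already visible in output like this.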
The notation we learned for time complexity is big O notation, which characterizes each sorting method by the growth rate of its running time as a function of the number of elements n in the list. Big O gives only an upper bound on that growth rate.
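To illustrate what these growth rates mean concretely, the sketch below compares the worst-case comparison count of bubble sort, n(n-1)/2, which is O(n^2), against n log2(n) as n doubles. The formula and the sample sizes are standard textbook values, not measurements from the lab.

```python
import math

def bubble_comparisons(n):
    # Worst-case number of comparisons bubble sort makes: n(n-1)/2, i.e. O(n^2).
    return n * (n - 1) // 2

# Doubling n roughly quadruples the O(n^2) count,
# but only a bit more than doubles the O(n log n) count.
for n in (100, 200, 400):
    print(f"n={n:4d}  n^2-style: {bubble_comparisons(n):6d}  "
          f"n log n-style: {round(n * math.log2(n)):5d}")
```

This is why the O(n^2) methods fall further and further behind as n grows, matching what we observed in the lab.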