Sunday, March 30, 2014

Week 11 - Sorting and Efficiency

These past few lectures have been focused on different sorting algorithms and their efficiency and time complexity. In CSC108 we already learned a few basic sorting techniques such as quick sort, selection sort and bubble sort. A sorting algorithm puts the elements of a list into a certain order; we've mostly been looking at sorting lists numerically from min to max. The efficiency of a sorting algorithm is important because it determines how long a sort takes to run, and how fast other operations that depend on sorted data, like searching, can be. In sorting, the user passes in a list of elements, sorted or unsorted, and the sorted result contains no new elements: it is simply a permutation of the input list (and for an in-place sort, the very same list rearranged).
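To illustrate the "permutation, not new elements" point, Python actually shows both styles of sorting (a small sketch of my own, not course code):

```python
# list.sort() permutes the list in place; sorted() builds a new sorted list.
# Either way, the output is a rearrangement of exactly the input's elements.
nums = [3, 1, 2]
new_list = sorted(nums)   # nums is untouched; new_list is a fresh sorted copy
nums.sort()               # nums itself is now rearranged into sorted order
```

Both end up as [1, 2, 3]; the difference is just whether the original list object gets mutated.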

In lecture we have mostly been focusing on the worst case for each sorting method. The best case is usually an input list that is already sorted, so little or nothing has to be done to it. Time complexity describes how the running time of each sorting method grows with the number of elements n in the list. O(n log n) is generally considered an optimal complexity for a comparison sort, while O(n^2) is considered a bad time complexity. Some examples of sorting methods we've seen are insertion sort, merge sort, selection sort, bubble sort and quicksort. During our week 10 lab, my partner and I found that bubble sort took the longest to sort a randomly generated list of n elements. No matter what n was, bubble sort always used the most time, while quicksort, merge sort and the built-in Python sorted did the best, using the least time compared to the other methods.

The notation of time complexity we learned was big-O notation, which characterizes each sorting method by the growth rate of its running time as a function of the number of elements n in the list to be sorted. It only provides an upper bound on that growth rate.

Week 10

This week I mostly focused on working on and finishing A2: Part 2. Writing the basic code and figuring out how to approach it was pretty quick and easy; however, as we began testing and kept adding conditions, we ran into a ton of unforeseen errors. It took us a little while to finally fix all the bugs we were finding, but we managed. Overall, I actually enjoyed working on this assignment.

This week in lecture we continued to look at sorting and big-O. We took a much closer look at quick sort and merge sort and compared their performance. This also led us to look at scaling, which describes how well the various sorting methods perform as the size of the input increases. I'm still having a little trouble getting it, but I will definitely look up more reading material to get a firmer grasp on this.
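The divide-and-conquer idea behind merge sort's n log n scaling can be sketched in a few lines (my own sketch, not the lecture code):

```python
def merge_sort(lst):
    """Return a new sorted list: split in half, sort each half, merge."""
    if len(lst) <= 1:                 # base case: nothing left to split
        return lst[:]
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])
    right = merge_sort(lst[mid:])
    # Merge the two sorted halves in linear time.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```

There are about log n levels of splitting, and each level does O(n) merging work, which is where the n log n comes from.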

Week 9

This week in lecture we began looking at BSTs and big-O complexity. Binary Search Trees are a type of tree where the data in the left subtree is less than the data at the root and the data in the right subtree is greater than the data at the root. This ordering makes stored data efficient to search. I found working with BSTs much easier than regular trees.
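Here's a minimal BST sketch of my own (the class and function names are mine, not the course's code) showing how the ordering property makes search fast: at every node you can ignore half of the remaining tree.

```python
class BSTNode:
    """A binary search tree node: smaller data to the left, larger to the right."""
    def __init__(self, data, left=None, right=None):
        self.data, self.left, self.right = data, left, right

def insert(node, data):
    """Insert data into the subtree rooted at node, returning the new root."""
    if node is None:
        return BSTNode(data)
    if data < node.data:
        node.left = insert(node.left, data)
    else:
        node.right = insert(node.right, data)
    return node

def contains(node, data):
    """Search only the subtree that could hold data, by the BST property."""
    if node is None:
        return False
    if data == node.data:
        return True
    return contains(node.left, data) if data < node.data else contains(node.right, data)

root = None
for x in [5, 3, 8, 1]:
    root = insert(root, x)
```

For a reasonably balanced tree this makes search O(log n), since each comparison halves the search space.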

We later recalled list searching from CSC108 and looked at its performance. Performance here is a property of the algorithm itself, independent of hardware, language or random events; it focuses on how runtime grows with the size of the input. We looked at runtime analysis to compare the efficiency of different sorting methods such as bubble sort, quicksort, insertion sort, etc. This is something I found very challenging. I took CSC165 last semester and struggled with the big-O unit, and seeing it again in CSC148 I guess made me approach it with fear. I did the readings and looked at a lot of performance comparisons of different methods and got a somewhat better understanding of the material. I'll definitely need to go over that in detail before the exam.

Week 8

This week my partner and I worked on creating a regex tree class for assignment 2 part 1. At first we decided to create one big class that would take in the different characteristics of each regex, but we later heard Danny say how that could be inefficient, so we decided to change it. We spent maybe an hour reworking our code to split it into several smaller tree classes using inheritance, but overall we found this assignment pretty easy.

In lecture, we began taking a closer look at linked lists. A linked list is made up of a first item and the remaining list of elements. So, for example, a new empty list would be LinkedList(); if you added an element, it would be LinkedList(5, LinkedList()); and after adding another element it would look like LinkedList(3, LinkedList(5, LinkedList())). This took me a little while to get used to, as once you get into a list with 10 elements it gets harder to keep track of all the trailing lists. At first I was a little uncomfortable with the idea of working with linked lists, but after working through the lab in week 8 I definitely became much more comfortable with the concept. In later lectures we looked at various functions included in linked lists such as delete_front, reverse, insert, etc.
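A bare-bones version of that nested structure might look like this (my own sketch; the course's LinkedList class has more to it):

```python
class LinkedList:
    """A list made of a first item and the rest of the list."""
    def __init__(self, item=None, rest=None):
        self.item = item   # None marks the empty list in this sketch
        self.rest = rest

    def __repr__(self):
        if self.item is None:
            return 'LinkedList()'
        return 'LinkedList({}, {})'.format(self.item, self.rest)

# Build the example from above: 3, then 5, then the empty list.
lst = LinkedList(3, LinkedList(5, LinkedList()))
```

Printing lst reproduces exactly the nested form described above, which is why long linked lists get hard to read by eye.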

Tuesday, March 4, 2014

Week 7 - Recursion

This week we had our first midterm. I found it quite simple, but after getting the test back I realized I had made silly little mistakes, maybe from rushing. That has always been my problem with programming, so I realize I need to work on it in the future.

Recursion

In class, we've begun to explore deeper into the use and function of recursion in Python. Recursion is basically the idea of a function repeating itself by using itself in the process. My dad, being a computer scientist, would teach me recursion by explaining how, if you placed 2 mirrors in front of each other, they would create an infinite tunnel of reflections, because both mirrors would be reflecting their image, and then their image reflecting their image, and so on. In programming, however, recursion means defining a function that calls itself to return a value or perform an action.

For recursion to work, there must always be a base case or cases. Without a base case, a recursive function would recurse forever and never return a value or perform an action, because it would never know when to stop. The rest of the function is a set of rules and recursive sub-calls, which are called the general case.
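The classic tiny example of a base case and a general case (my own illustration, not from lecture):

```python
def factorial(n):
    """Compute n! recursively."""
    if n == 0:                        # base case: stops the recursion
        return 1
    return n * factorial(n - 1)       # general case: a smaller sub-call
```

Delete the `if n == 0` check and factorial(5) would keep calling itself with smaller and smaller arguments until Python gives up with a RecursionError; the base case is what lets the chain of calls finally unwind.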

Week 6

Right before the break, we were introduced to trees. I was actually sick this week and could not go to lecture, so I was terrified I wouldn't catch up or understand the material as well by having to learn it on my own. Fortunately, I found it very easy and self-explanatory. I definitely find this area of computer science very interesting.

We focused on tree terminology and explored trees based on a pictorial representation rather than actual Tree class code, which I found a little more challenging when trying to see what the code was doing and how to trace it. We looked at pre-order, in-order and post-order traversal and their corresponding code. This I found much easier to understand, and I actually hope to do more of it in the future. We also looked over different methods commonly found in Tree classes, such as height, arity and number of leaves.
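The three traversal orders can be summed up in a short sketch of my own (a minimal binary tree, not the course's Tree class):

```python
class BinaryTree:
    """A binary tree node with a value and optional left/right children."""
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def preorder(t):
    """Root first, then left subtree, then right subtree."""
    if t is None:
        return []
    return [t.value] + preorder(t.left) + preorder(t.right)

def inorder(t):
    """Left subtree, then root, then right subtree."""
    if t is None:
        return []
    return inorder(t.left) + [t.value] + inorder(t.right)

def postorder(t):
    """Left subtree, then right subtree, then root last."""
    if t is None:
        return []
    return postorder(t.left) + postorder(t.right) + [t.value]

t = BinaryTree(1, BinaryTree(2), BinaryTree(3))
```

For this three-node tree, preorder gives [1, 2, 3], inorder gives [2, 1, 3] and postorder gives [2, 3, 1]; the only thing that changes is where the root's value is placed relative to the recursive calls.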

Week 5

This week I was mostly preoccupied with finishing and polishing up the first assignment. What my partner and I struggled with most was transforming the recursive part of the Tower of Hanoi to work with more than 3 stools. We worked through a lot of different variations, and in the end we managed to get it working using the least number of moves necessary. I'm feeling confident about our assignment, but I'm still worried about possible little mistakes made here and there.
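For context, this is the classic three-stool recursion we had to generalize (my own sketch with made-up stool names, not our assignment code, which handled the harder four-stool case):

```python
def hanoi(n, source, spare, target, moves):
    """Append the moves that shift n cheeses from source to target (3 stools)."""
    if n == 1:
        moves.append((source, target))              # base case: one cheese
        return
    hanoi(n - 1, source, target, spare, moves)      # clear the top n-1 out of the way
    moves.append((source, target))                  # move the biggest cheese
    hanoi(n - 1, spare, source, target, moves)      # restack the n-1 on top

moves = []
hanoi(3, 'A', 'B', 'C', moves)
# len(moves) is 2**3 - 1 == 7, the minimum for three stools
```

With three stools the minimum is always 2^n - 1 moves; with a fourth stool you can do better, which is exactly what made the assignment's recursion trickier.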

In class, we continued to look at recursion. We especially went over the assignment and different methods of approaching the problem. I found this extremely helpful in figuring out how to approach the recursive part of the assignment.

We also looked at tracing, proper naming and the order in which Python searches for a method.