Wednesday, April 2, 2014

Week 11: Sorting and Efficiency

In this blog post I'm going to talk about a bunch of sorting algorithms and their time complexities. Sorting algorithms rearrange the elements of a list in ascending or descending order. There are many different sorting algorithms, such as selection sort, insertion sort, bubble sort, and quick sort. Let's look at them separately.

Selection sort:
This algorithm repeatedly finds the smallest element in the unsorted part of the list and swaps it to the front of that part, so the sorted part grows by one element each pass. The time complexity of this algorithm is O(n^2) in every case, which is not the best you can do.
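A short Python sketch of selection sort as described above (the function name is just my choice):

```python
def selection_sort(lst):
    """Sort lst in place by repeatedly selecting the minimum."""
    for i in range(len(lst)):
        # Find the index of the smallest element in the unsorted part.
        min_index = i
        for j in range(i + 1, len(lst)):
            if lst[j] < lst[min_index]:
                min_index = j
        # Swap it to the front of the unsorted part.
        lst[i], lst[min_index] = lst[min_index], lst[i]
    return lst
```

The two nested loops over the list are where the O(n^2) comes from.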

Insertion sort:
The insertion sort algorithm takes each element in the list and inserts it into a sorted pile in a way that keeps that pile sorted. It is similar to the method people use when they sort a hand of playing cards. The complexity of this algorithm is also O(n^2) in the worst case, although it only takes O(n) on a list that is already sorted.
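Here is a sketch of insertion sort in Python. Instead of a literal second pile, this version keeps the sorted part at the front of the same list, which is the usual in-place way to write it:

```python
def insertion_sort(lst):
    """Sort lst in place by growing a sorted prefix."""
    for i in range(1, len(lst)):
        current = lst[i]
        j = i - 1
        # Shift larger elements one slot to the right.
        while j >= 0 and lst[j] > current:
            lst[j + 1] = lst[j]
            j -= 1
        # Drop the current element into its place.
        lst[j + 1] = current
    return lst
```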

Bubble sort:
This one compares every two adjacent elements in the list and puts the smaller one at the smaller index. Then it repeats the same pass until the list is sorted. The time complexity varies in this algorithm: in the best case, when the list is already sorted, a single pass with no swaps is enough, so it only takes O(n) time. But in other cases, bubble sort has a time complexity of O(n^2).
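A sketch of bubble sort in Python; the `swapped` flag is what gives the O(n) best case, since a pass with no swaps means the list is already sorted:

```python
def bubble_sort(lst):
    """Sort lst in place by repeatedly swapping adjacent elements."""
    for end in range(len(lst) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if lst[i] > lst[i + 1]:
                lst[i], lst[i + 1] = lst[i + 1], lst[i]
                swapped = True
        if not swapped:
            # No swaps: the list is sorted, so stop early (O(n) best case).
            break
    return lst
```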

Quick sort:
This algorithm is the one I use a lot. It takes a pivot and gathers all elements smaller than the pivot into one new list, and all elements bigger than the pivot into another. Then it sorts the sublists recursively and joins them together. The average time complexity for this algorithm is O(n log n), which makes quicksort one of the best sorting algorithms, although a badly chosen pivot can make it degrade to O(n^2).
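This two-sublist version of quicksort translates almost directly into Python (this sketch takes the first element as the pivot, which is one common choice):

```python
def quick_sort(lst):
    """Return a sorted copy of lst using the pivot-and-partition idea."""
    if len(lst) <= 1:
        return lst  # base case: nothing to sort
    pivot = lst[0]
    smaller = [x for x in lst[1:] if x <= pivot]
    bigger = [x for x in lst[1:] if x > pivot]
    # Sort each side recursively and join around the pivot.
    return quick_sort(smaller) + [pivot] + quick_sort(bigger)
```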

There are a lot of algorithms other than the ones here, and each of them has best case and worst case runtimes. But on average, the algorithms with O(n log n) complexity are considered the fastest general-purpose sorts. If the values in the list fall in a small known range, bucket sort can be used to get O(n) complexity, but it only works in that special case.
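The simplest version of this range-based idea is counting sort, which is essentially bucket sort with one bucket per value. A sketch, assuming the list holds integers from 0 to `max_value`:

```python
def counting_sort(lst, max_value):
    """Return a sorted copy of lst; assumes 0 <= x <= max_value for all x."""
    counts = [0] * (max_value + 1)
    for x in lst:
        counts[x] += 1          # one pass to count each value
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)  # emit values in order
    return result
```

Both passes are linear in the list length plus the range size, which is why the range has to stay small.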

Thursday, March 6, 2014

Week 7: Recursion

Recursion is a technique programmers use to make their code short and clear, and it happens when a function calls itself. There is always a base case to stop infinite recursion, which is the simplest case and can be solved directly. Other than the base case, there are recursive calls and a return statement built from the results of those calls. Usually, divide and conquer algorithms such as quick sort, merge sort, and binary search are implemented using recursion, because they divide a problem into smaller subproblems, solve them with recursion (which divides them again until the base case) and combine the results to solve the original problem. The first recursive algorithm I wrote was finding the Nth Fibonacci number. Fibonacci numbers are generated with the formula F[i] = F[i-1] + F[i-2], and the base cases are F[0] = 0 and F[1] = 1. So the Python code for this algorithm would be:

def fib(n):
    if n == 0:
        return 0
    if n == 1:
        return 1
    return fib(n-1) + fib(n-2)

Now, in simple problems like this the recursive version isn't much shorter than using a list and a loop, but in general it is simpler to write recursion because it works from the top down, and you know the top. In contrast, if you use loops and dynamic programming, although it is very fast, it is hard to think from the base cases up to the top. Also, we can use memoization to improve the time efficiency of recursion.
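For example, the fib function above recomputes the same values over and over, which makes it exponential. A memoized sketch (the cache dictionary is my own addition) keeps the top-down style but stores each result the first time it is computed:

```python
def fib_memo(n, cache=None):
    """Nth Fibonacci number, caching results so each n is computed once."""
    if cache is None:
        cache = {0: 0, 1: 1}  # base cases seed the cache
    if n not in cache:
        cache[n] = fib_memo(n - 1, cache) + fib_memo(n - 2, cache)
    return cache[n]
```

With the cache, fib_memo(50) returns instantly, while the plain recursive version would take billions of calls.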

Wednesday, January 22, 2014

SLOG 1 - Week 3


- What's something new you learned this week in class?
In this class I learned about inheritance and exceptions in Python. Although I had a little background in programming, I had never used inheritance in my programs. Using inheritance will make it easier for me to modify or extend Python's built-in classes. Exceptions are also a useful feature I can use to give the user of a program detailed information about the errors they encounter.

- What's something you enjoyed this week in class?
I enjoyed this week's lab the most. I reviewed the stack and queue data structures and implemented them with my partner. Then I found a way to optimize my queue ADT to make it time efficient.
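I won't claim this is exactly the optimization from the lab, but one standard way to make a queue time efficient in Python is to back it with collections.deque, since popping the front of a plain list is O(n) while deque supports O(1) at both ends. A sketch:

```python
from collections import deque

class Queue:
    """A FIFO queue backed by a deque, so enqueue and dequeue are O(1)."""

    def __init__(self):
        self._items = deque()

    def enqueue(self, item):
        self._items.append(item)       # add to the back

    def dequeue(self):
        return self._items.popleft()   # remove from the front in O(1)

    def is_empty(self):
        return len(self._items) == 0
```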

- What's something that challenged or frustrated you this week?
Honestly, this slog is the hardest work for me. I've never been good at writing, plus this is the first blog I've ever written, and I have no idea how it is going to turn out.

- How confident do you feel about material covered this week?
I'm sure I can use everything I learned in class. At first, when I saw the inheritance example, I was a little confused, but once the professor walked through it everything became clear.

- How does course material relate to other classes or interests?
CSC148 and CSC165 have some logic in common. For example, Python's bitwise operators & (and), | (or), and ^ (xor) correspond to logical connectives we use in CSC165. If we look at a truth table from CSC165, we can see that the results of those connectives are the same as the results of the bitwise operations on booleans in CSC148.
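We can actually check this correspondence in a few lines of Python, since applying the bitwise operators to booleans reproduces the CSC165 truth tables:

```python
# Verify that bitwise operators on booleans match the logical
# connectives' truth tables for every combination of p and q.
for p in (True, False):
    for q in (True, False):
        assert (p & q) == (p and q)  # conjunction
        assert (p | q) == (p or q)   # disjunction
        assert (p ^ q) == (p != q)   # exclusive or
```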

-Object Oriented Programming?
Object oriented programming keeps similar objects and the functions that operate on them together in a class. Sometimes, OOP saves us from a lot of problems. For example, to create a stack without OOP I have to create a list and write all of its functions, and when I want to create a second stack I have to write all that code again. With OOP, however, I can use my class to create the second stack in one line of code. Also, when I need to modify these objects, I only write one method that works on every instance of the class, instead of writing a function for each object. Thus, OOP makes our code neat, and easier to write and read.
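The stack example from the paragraph above might be sketched like this (method names are just my choice):

```python
class Stack:
    """A LIFO stack wrapping a list, so new stacks take one line to make."""

    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)   # add to the top

    def pop(self):
        return self._items.pop()   # remove and return the top

    def is_empty(self):
        return self._items == []

# Creating a second stack is now a single line, with no code repeated.
s1 = Stack()
s2 = Stack()
```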