Wednesday 2 April 2014

Sorting and Efficiency

Something new I learned this week?

This week I learned the new sorting algorithms quick sort, merge sort and Tim sort. In CSC108 I was introduced to the concept of sorting algorithms and learned insertion sort, selection sort and bubble sort. I'm still not very confident with Tim sort and quick sort, but I do know that they are the fastest of the sorting algorithms we've seen, and I know Tim sort is a hybrid (it combines merge sort and insertion sort). Maybe if I continue "slogging" I'll update this with a more detailed understanding.

Merge Sort

Merge sort was fairly hard to grasp, but I watched this really helpful YouTube video (https://www.youtube.com/watch?v=EeQ8pwjQxTM). It's better when you actually see someone move physical objects around to understand the sorting types. Merge sort has n log n runtime in its worst case, and that is totally understandable: splitting the list down into single-element lists and then merging them into lists of 2, then 4, then 8 and so on is fairly tedious, and it doesn't matter whether your list is sorted or not, because merge sort goes through all the same steps in the best and the worst case.

Merge sort basically takes all the elements in a list and splits them into single-element lists. You then merge the first two lists, placing their elements in order from lowest to highest, then do the same with the next two, until you have a bunch of lists that are each two elements long (and, if the length is odd, one leftover single-element list). You now merge (this is why it's called merge sort, I assume) the first list with the second, putting them in order by repeatedly comparing the smallest remaining element of the first list with the smallest remaining element of the second, until you're left with one merged, sorted list from the first two. You do this with the third and fourth lists, and keep following these steps until you have one sorted list left.
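Here's a rough sketch of how I picture merge sort in Python (the names merge and merge_sort are just my own, not from lecture):

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        # Compare the smallest remaining element of each list.
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One list is used up; tack on whatever is left of the other.
    result.extend(left[i:])
    result.extend(right[j:])
    return result

def merge_sort(lst):
    """Return a new sorted copy of lst using merge sort."""
    if len(lst) <= 1:          # a list of 0 or 1 elements is already sorted
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

print(merge_sort([5, 1, 4, 2, 3]))   # [1, 2, 3, 4, 5]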

Insertion Sort

Insertion sort can do better than merge sort on some inputs because its runtime is affected by whether your list is already sorted or not. In the worst case insertion sort has a runtime of n^2, because it has to keep looking back through the sorted part for the position of the next element; in the worst case, which is a list sorted in reverse, every single element gets shuffled at every single step. In its best case it has a runtime of n, because the list is already sorted and no shuffling is really needed: each element is just added to the end of the sorted part. You start at index 0 in the list and consider that one element sorted. You then look at the next element, the first one in the unsorted part of the list, and insert it either before or after index 0 depending on whether it is less or greater than that number. After that you just keep moving down the list, inserting each next element where it fits in the sorted part.
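Here's a small sketch of insertion sort in Python as I understand it (insertion_sort is just my own name for it):

def insertion_sort(lst):
    """Sort lst in place using insertion sort."""
    for i in range(1, len(lst)):
        current = lst[i]        # next element from the unsorted part
        j = i - 1
        # Shuffle bigger elements one spot to the right
        # until we find where current belongs.
        while j >= 0 and lst[j] > current:
            lst[j + 1] = lst[j]
            j -= 1
        lst[j + 1] = current

nums = [4, 2, 5, 1, 3]
insertion_sort(nums)
print(nums)   # [1, 2, 3, 4, 5]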

Selection Sort

Selection sort is simpler than merge sort but, in its best case, worse off than insertion sort. Its worst case runtime and best case runtime are both n^2. This is because, regardless of whether the list is sorted or not, you go through the entire unsorted part and select the largest element the same number of times. You pass through the list, select the largest element and place it at the end; you then look through the remaining unsorted portion, get its largest element and place it right before the one you placed last time. This repeats until you reach the start of the list and everything is sorted.
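Here's a little sketch of the version I described, where you keep selecting the largest unsorted element and putting it at the end (again, just my own code, not from lecture):

def selection_sort(lst):
    """Sort lst in place by repeatedly selecting the largest unsorted element."""
    for end in range(len(lst) - 1, 0, -1):
        # Find the index of the largest element in the unsorted part lst[0:end + 1].
        largest = 0
        for j in range(1, end + 1):
            if lst[j] > lst[largest]:
                largest = j
        # Swap it to the end of the unsorted part.
        lst[end], lst[largest] = lst[largest], lst[end]

nums = [3, 1, 5, 2, 4]
selection_sort(nums)
print(nums)   # [1, 2, 3, 4, 5]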

Bubble Sort
The last sort I had previously been familiar with was bubble sort. Bubble sort compares the first and second elements and swaps them if the larger one is on the left, then looks at the second and third and swaps them if they're out of order, then the third and fourth, and so on, always moving the larger one to the right. This continues until the largest element is at the end, and then it starts from the beginning again and again until everything is sorted.
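A quick sketch of bubble sort in Python, the way I just described it (my own code, so take it with a grain of salt):

def bubble_sort(lst):
    """Sort lst in place using bubble sort."""
    for end in range(len(lst) - 1, 0, -1):
        for i in range(end):
            # Swap neighbours that are out of order; the largest
            # remaining element "bubbles" its way to index end.
            if lst[i] > lst[i + 1]:
                lst[i], lst[i + 1] = lst[i + 1], lst[i]

nums = [5, 3, 1, 4, 2]
bubble_sort(nums)
print(nums)   # [1, 2, 3, 4, 5]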

Friday 28 February 2014

Recursion

AHHHHH!!
We just got introduced to recursion, and at first glance it is the most confusing thing I've ever seen.
To be quite honest, the combination of not having a lot of programming experience and recursion has really thrown me off.

Thursday 13 February 2014

Recursive Tree Lists

This week we mainly discussed methods that work with trees. I previously worked with search lists in the labs, which I understood and was able to complete, but the tree list was like maybe 10 steps ahead of that. We learned that a leaf is a node with no children, a root is a node with no parent, and a child can itself be a whole tree, but all the last nodes are leaves (children with no children of their own). We then learned about post-order and in-order traversals, which are two different ways of listing the nodes in a tree.
We did a lot of recursive functions in class that were fairly similar but accomplished different things.
We worked with list comprehensions in a lot of the functions while working with tree lists.
In comp sci, a binary tree is a tree data structure where each node can have a max of 2 children; in other words, the degree of each node can be at most two. Binary trees are used to implement binary search trees and binary heaps, and are used for efficient searching and sorting. A binary tree is a special case of a k-ary tree where k is 2.
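To make the traversal stuff concrete for myself, here's a tiny sketch of a binary tree with pre-order, in-order and post-order traversals. This is just my own version, not the class from lecture:

class BinaryTree:
    """A simple binary tree node: a value plus optional left and right subtrees."""

    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def preorder(tree):
    """Return the values in order: root, then left subtree, then right subtree."""
    if tree is None:
        return []
    return [tree.value] + preorder(tree.left) + preorder(tree.right)

def inorder(tree):
    """Return the values in order: left subtree, then root, then right subtree."""
    if tree is None:
        return []
    return inorder(tree.left) + [tree.value] + inorder(tree.right)

def postorder(tree):
    """Return the values in order: left subtree, then right subtree, then root."""
    if tree is None:
        return []
    return postorder(tree.left) + postorder(tree.right) + [tree.value]

# A tree with 2 at the root, 1 on the left and 3 on the right.
t = BinaryTree(2, BinaryTree(1), BinaryTree(3))
print(preorder(t))    # [2, 1, 3]
print(inorder(t))     # [1, 2, 3]  -- sorted order, since this is a binary search tree
print(postorder(t))   # [1, 3, 2]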



Monday 10 February 2014

CONQUERING RECURSION

So recursive functions
It's when you create a function and call it within its own body (often in the return statement).
Recursive functions have 3 parts in general
the base
the recursive
and the work.
The base is the simplest case of the function, the one that requires no recursion at all; this is usually the first if statement.
The recursive part is the call to the function within your own function, which breaks the problem into smaller pieces to make your work easier.
The work is what you do with the smaller results once you've recursed, to build the final answer.
It's like when you do something in a function, manipulate it, and then call the function again and again until it reaches the base case. If it never does, it will loop infinitely, which is like super duper bad.
It's like the factorial example, which gave me my eureka moment.
5! is the same as 5*4*3*2*1,
but with recursion you can just compute 5 * 4! and it will work the same, because the function recursively gets the value of 4!.
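Here's the factorial example written out the way I picture it (my own sketch), with the base case and the recursive case marked:

def factorial(n):
    """Return n! for a non-negative integer n."""
    if n <= 1:                        # base case: 0! and 1! are both 1, no recursion needed
        return 1
    return n * factorial(n - 1)       # recursive case: 5! == 5 * 4!

print(factorial(5))   # 120, same as 5*4*3*2*1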
Recursion saves time energy and effort
I finally got it!
Like I had an epiphany and I can shout it out off the top of mountains!!!!!
Seriously though... I feel like I was in a black hole because I didn't grasp it sooner.
I think the main part was making my brain forget what it already knew. I was stuck trying to put my "csc108 for loop" into the recursion somehow. I had to kick my brain out its comfort zone to finally get out the old habits.
RECURSION!!!! erm.. uh.. yea.. Had another outburst...
I don't think the world can even fathom how happy I am to have been able to grasp recursion.
And search lists, I think that was the only lab I can say I took charge of with confidence!!
I think Im on a roll ...
And pre-order traversal, for anyone who doesn't understand, I think is like visiting the root first, then the left subtree, then the right subtree, and in-order (left subtree, then root, then right subtree) doesn't seem like any kind of order to me, to be quite honest. I'm gonna try and look into that, so hopefully the next time I check in there will be more on it to come!

Thursday 23 January 2014

Object Oriented Programming Week 1


This week we learned about Object Oriented Programming.
From my understanding, object oriented programming is when you define a class and create methods that you can call on any object you create from that class. We also learned that you can create a new class and place the name of another class in the parentheses, and it will inherit the attributes and methods of that class so you don't have to be repetitive.
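Here's a tiny made-up example of what I mean; the Animal and Dog classes are just my own invention, not from class:

class Animal:
    """A really simple class just to show the idea."""

    def __init__(self, name):
        self.name = name

    def speak(self):
        return self.name + ' makes a sound.'

class Dog(Animal):
    """Dog inherits from Animal, so __init__ comes for free;
    we only write the methods we want to change."""

    def speak(self):
        return self.name + ' barks.'

d = Dog('Rex')
print(d.speak())   # Rex barks.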

Also this week we learned recursion, which is very, very confusing. Recursion is when you create a function that you call within the body of the function itself. It is a very hard concept to grasp, and I'm gonna be reading up on this because it's not something you can grasp after seeing it once.