Thursday 27 March 2014

Sorting, efficiency, my final goodbyes.

This week, I'll provide a quick run-through of the various sorting techniques covered in class and how to improve their efficiencies. Before that, however, I must vent about the one thing that made me feel defeated this week: term test two. I studied. Oh, boy, did I ever. I went over past labs, I went over lecture notes, I played with code to break and fix it. I even re-commented all the code I could find by the profs. Alright, being honest, all of this still didn't give me the proper confidence to enter a test with- but it gave me enough to believe I could pass fairly well.

So, what happened?

I don't know if it was because I relied too much on the IDE while studying, or if it was because I didn't enter with confidence- but something that morning made me feel pretty crappy about myself. It was a constant struggle to decide if I should keep the code I wrote and risk a 0 if it is completely wrong or if I should play it safe and just leave it blank for the small mark...

Oh well, better start cramming for the final now to keep that mark up.

I'll now continue with my understanding of each sorting technique.

Selection sort repeatedly scans the unsorted part of the list, finds the smallest element and swaps it into the next position, starting at index 0. Bubble sort looks at adjacent elements and swaps them when necessary. Insertion sort takes each element in turn and inserts it into its proper place among the already-sorted elements before it. This is all just review from 108.
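To jog my own memory, here's a quick selection sort I sketched myself (my own code, not the profs'):

```python
def selection_sort(lst):
    """Sort lst in place by repeatedly selecting the smallest remaining element."""
    for i in range(len(lst)):
        # find the index of the smallest element in lst[i:]
        smallest = i
        for j in range(i + 1, len(lst)):
            if lst[j] < lst[smallest]:
                smallest = j
        # swap it into position i
        lst[i], lst[smallest] = lst[smallest], lst[i]

nums = [5, 2, 4, 1, 3]
selection_sort(nums)
print(nums)  # [1, 2, 3, 4, 5]
```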

Now it gets interesting.

Quick sort uses a pivot. This pivot, which can be any element in the list, determines how the list is reordered: elements smaller than the pivot are placed before it, while elements larger than the pivot are placed after it. After this partition, the pivot sits in its final position, and the same process runs recursively on each side.
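Here's roughly how I picture it in code- my own sketch, using the first element as the pivot:

```python
def quick_sort(lst):
    """Return a new sorted list, partitioning around the first element."""
    if len(lst) <= 1:
        return lst
    pivot = lst[0]
    smaller = [x for x in lst[1:] if x <= pivot]
    larger = [x for x in lst[1:] if x > pivot]
    # after partitioning, the pivot lands in its final position
    return quick_sort(smaller) + [pivot] + quick_sort(larger)

print(quick_sort([3, 6, 1, 8, 2, 9, 4]))  # [1, 2, 3, 4, 6, 8, 9]
```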

Merge sort continuously halves the list (by the magic of recursion), sorts these halves and merges the sorted pieces back together. This algorithm maintains the same efficiency for its worst, average and best case. Although both this and quick sort have an average complexity of O(n log n), merge sort actually does better in the worst case, since quick sort can then degrade to O(n²).
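And a sketch of merge sort, again my own code rather than anything from lecture:

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # one of the two slices below is empty; the other is the leftovers
    return result + left[i:] + right[j:]

def merge_sort(lst):
    """Halve the list, recursively sort each half, then merge."""
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

print(merge_sort([5, 1, 4, 2, 3]))  # [1, 2, 3, 4, 5]
```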

Efficiency basically means the run-time of an algorithm. Though, I didn't really get it until I experienced it. By this, I mean that the lab was super helpful in illustrating the differences in runtimes for each type of sort. During this lab, I manually recorded runtimes and graphed each sort. Although I knew it already, bubble sort and selection sort were, by far, the slowest in their worst case. Python's built-in sort, timsort, was fastest. This made me look up the efficiencies of other Python algorithms, and gave me the euphoric realization that built-ins are almost always more efficient than any substitute I'd try to write. This makes life as a coder so much easier. And lazier.
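If you want to see this without the lab handout, here's a rough timing sketch I put together with the timeit module (the list size and repeat count are just numbers I picked):

```python
import random
import timeit

def bubble_sort(lst):
    """Repeatedly swap adjacent out-of-order elements until sorted."""
    for i in range(len(lst)):
        for j in range(len(lst) - 1 - i):
            if lst[j] > lst[j + 1]:
                lst[j], lst[j + 1] = lst[j + 1], lst[j]

data = [random.randint(0, 10000) for _ in range(1000)]

# time each sort on a fresh copy of the same data
bubble_time = timeit.timeit(lambda: bubble_sort(data[:]), number=3)
builtin_time = timeit.timeit(lambda: sorted(data), number=3)
print(bubble_time > builtin_time)  # the built-in timsort wins easily
```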

I think I'm done.

Also, I want to say until next time, but this is going to be my last slog, so I shan't.

Good luck to all on finals!

Wednesday 12 March 2014

Links and lists and trees, oh my!

I've come to a conclusion that I need to reevaluate my knowledge of linked lists and binary trees.

So, building upon my last post, I'm going to regurgitate everything I know. For my own sake, that is. I don't know how many times I've already mentioned that explaining what I don't understand gives me understanding (probably because I research and triple-check my facts to not make an ass out of myself).

As I explained in an earlier post, linked lists are structures whose items each link to their successor. They are useful for fast insertions and deletions. This is because a linked list adds a node wherever you need it: you point the predecessor at the new node, and point the new node at the item it was inserted in front of. Regular lists, on the other hand, insert an item at index i and must shift every element after it forward by one position.
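Here's a tiny sketch of that cheap insertion, using a Node class like the one from my earlier post- two pointer updates, no shifting, no matter how long the list is:

```python
class Node:
    def __init__(self, cargo=None, next=None):
        self.cargo = cargo  # the data this node carries
        self.next = next    # the next node in the list (or None)

# build a -> c, then splice b in after a
a = Node('a')
c = Node('c')
a.next = c
b = Node('b', next=a.next)  # b now points at c
a.next = b                  # a now points at b

# walk the list to check the order
node, order = a, []
while node is not None:
    order.append(node.cargo)
    node = node.next
print(order)  # ['a', 'b', 'c']
```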

I then realized that linked lists are similar to trees, in that each node points to as many children as it needs. I reviewed the properties of trees to see exactly what they could be used for. First, they're hierarchical, meaning they're layered with general items at the top and specific items near the bottom. Second, the children of one node are completely independent of all the others. Trees are trees, not spider maps. Last, there is exactly one path from the root to each node. Given all this, you can use trees to work out mathematical expressions, build sentences, create guessing games, and store hierarchical data (family trees, evolutionary trees, etc.).

Oh, but regular trees aren't enough. We need to complicate things. Here come binary trees.

Binary trees are similar to regular ones, in that parents have children- but at most two. Each parent has a maximum of two children: a left subtree and a right subtree. These trees are useful in that the data can be kept in a specific arrangement- which is great when you need the organization, but a pain to look through for a specific item. And how can you solve this problem?

With another type of tree!

Binary search trees are also binary, but with more structure. Each node still has two or fewer children, but now everything in the left subtree is smaller than the node itself, while everything in the right subtree is larger than it.
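Here's a little binary search tree sketch I wrote to convince myself (the class and helper names are my own, not from class):

```python
class BSTNode:
    """One node of a binary search tree."""
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(node, value):
    """Insert value, keeping smaller values left and larger values right."""
    if node is None:
        return BSTNode(value)
    if value < node.value:
        node.left = insert(node.left, value)
    else:
        node.right = insert(node.right, value)
    return node

def contains(node, value):
    """Search by choosing one side at each node."""
    if node is None:
        return False
    if value == node.value:
        return True
    if value < node.value:
        return contains(node.left, value)
    return contains(node.right, value)

root = None
for v in [5, 3, 8, 1, 4]:
    root = insert(root, v)
print(contains(root, 4), contains(root, 7))  # True False
```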

This is all that I'm willing to talk about right now, since we're currently going through binary search trees and how to actually search through them.

Speaking of binary- I learned how to count in it! 0, 1, 10, 11, 100, 101 ... 
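Python can even check my counting for me- the built-in bin() converts a number to its binary string:

```python
# counting 0 through 5 in binary, checked with Python's built-in bin()
for n in range(6):
    print(n, bin(n)[2:])  # strip the '0b' prefix
```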

Until next time!

Friday 7 March 2014

[Linked, -> [Lists, ->]]

So, look at this:

[image: a diagram of a linked list, each node pointing to the next]
This image is the definition of linked lists. At least, it's what I use to help me understand what on earth is going on.

The concept of linked lists, to me, appeared simple. The last index of the list points to another list, as the image I showed you demonstrates. The last index of that list points to yet another. And so on, and so on..

Simple.

Then I started working with these damn things.

Not so simple.

I figured these lists were meant to save space and make it easier to delete or insert nodes wherever you need. So I observed the code:

class Node:
    def __init__(self, cargo=None, next=None):
        self.cargo = cargo  # the data this node carries
        self.next = next    # the next node in the list (or None)

    def __str__(self):
        return str(self.cargo)

This reassured me that linked lists are only made up of two things: cargo and a reference to the next node (which is made of its own cargo and its own reference to the node after it). Seeing this, I asked questions such as 'how do we see where a specific node is?'- which made me worry about deleting, inserting, length counting, etc. We need indexing in lists, don't we? So can these lists be indexed somehow?

My answer came in the form of a tutorial. After a ridiculous amount of frustration on my part (and the TA's), I managed to link my lists, index them, delete + insert items AND find a list length. What was odd was that the above code is just the node of a list, not the list class itself. The code for that is:

class LinkedList:
    def __init__(self: 'LinkedList', item: object=None, rest: 'LinkedList'=None) -> None:
        self.empty = (item is None) and (rest is None)
        if not self.empty:
            self.item = item
            if rest is None:
                self.rest = LinkedList()
            else:
                self.rest = rest

Being given this made me realize that you really start from scratch with these things. You have to declare it empty, you have to declare where the items are and what they mean.
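For example, here's how I'd walk node to node to find a length- my own helper, not from the handout:

```python
class Node:
    def __init__(self, cargo=None, next=None):
        self.cargo = cargo
        self.next = next

    def __str__(self):
        return str(self.cargo)

def length(head):
    """Count nodes by following next until we run out of links."""
    count = 0
    node = head
    while node is not None:
        count += 1
        node = node.next
    return count

# build the three-node list 1 -> 2 -> 3
head = Node(1, Node(2, Node(3)))
print(length(head))  # 3
```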

Okay, I'm done here. This slog made me want to experiment some more with these lists, so I'm off to do just that.

Until next time!

Wednesday 26 February 2014

Recursion and other thoughts

So it took me a while to understand recursion. Then it struck me. This is recursion (and this is mutual recursion).
(ha ha!)

I've been reading that recursion should be avoided unless it's actually necessary. I've realized, however, that it's impossible to avoid recursion in linked lists and trees- then it struck me that recursive structures like these are exactly where recursion earns its keep.

Trees are an easy example. You start with a line with two branches that form a Y. Okay, great. Now you have a sideways Y. What do you do to expand this tree? You draw another Y on each branch. How do you do this? By telling yourself to do again exactly what you just did! In Python, this is possible with a recursive function: a function that calls itself until it hits a base case.


import turtle

def tree(length, t):
    # base case: stop once the branch is too short to bother drawing
    if length > 10:
        t.forward(length)
        t.right(20)
        tree(length - 10, t)  # draw a smaller tree on the right branch
        t.left(40)
        tree(length - 10, t)  # and another on the left branch
        t.right(20)
        t.backward(length)

t = turtle.Turtle()
t.color('orange')
tree(50, t)


This is what we get! Notice tree(length-10, t)? That's what makes our function recursive, as it calls itself to run again.


Recursion isn't only needed for these pretty colourful trees. It is also a requirement for binary trees to exist. Instead of lines, binary trees hold data. You can look at them as family trees (with parents and children) or physical trees, with a root and leaves. Similarly to the drawing function above, operations on binary trees call themselves on each subtree until they hit a leaf (or your predetermined end). I won't go into detail, since the given example already provides some knowledge.
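As a tiny illustration (my own throwaway representation, using nested dicts instead of a proper class), counting the nodes in a binary tree is naturally recursive:

```python
def size(node):
    """Count the nodes in a binary tree; None marks an empty subtree."""
    if node is None:  # base case: we ran past a leaf
        return 0
    return 1 + size(node['left']) + size(node['right'])

# a small tree: 1 at the root, with children 2 and 3
tree = {'value': 1,
        'left': {'value': 2, 'left': None, 'right': None},
        'right': {'value': 3, 'left': None, 'right': None}}
print(size(tree))  # 3
```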

Hope that cleared some things up. Otherwise, for more information on recursion, see my recursion post.
(ha ha!)

Until next time!

Friday 14 February 2014

Alright, cheese could be worse.

One word: Elation.

I finished the program and it actually works. I was in shock and crying tears of joy (alright, that's a little dramatic). I realized that the best way to learn about what you're doing is by troubleshooting. Since I was lost almost the entire time, my code didn't work too well. I kept having to troubleshoot, go over the code given, and see which modules use what. I'm very stubborn in that I don't like to ask for help. I want to learn and see why I'm wrong, rather than have someone tell me what the problem is.

And that's what led me to finish the assignment.

I'll be honest, though. I'm pretty disappointed with my tour module. I mean, I finished it, but I didn't optimize it as much as I should have. I simply didn't give myself enough time. I spent 2 days working on tour, and most of my time was actually spent frustrated and not understanding what to do.

But generally, I'm happy. I was super relieved that I finished the assignment with only a few hiccups. I went to submit it on MarkUs, only to see the next two assignments asking for submission. Gah! There goes my brief relief.

Until next time.

Sunday 9 February 2014

I really do hate cheese.

I'm quite frustrated. Upon first looking at the assignment, I was terrified. I was in denial that I'd be able to finish it- let alone start it. I scheduled an entire day to work on it, during which I got nothing done because I was crying in the corner of my room, considering dropping out of school and selling bottle caps for a living. It really was a low time in the semester for me.

So I went to see the professor. I asked him questions that didn't make sense and I'm pretty sure he was ready to kick me out (sorry, Dustin). I then went to the CSSU office and asked them how to approach an assignment that I'm terrified of.

They calmed me down, they're a nice group.


So I printed out the instructions and all the code given to me. I read over it, I studied the code and made sure I understood 100%. I only understood about 40%, and would possibly be able to give you some made up explanation for the rest if I really needed to.

Then I started typing.

Then something worked.

So I kept typing.

Something else worked.

This went on until I thought I was done with the third step.

Nope. I broke something. Why aren't the cheeses stacking? You, there! Smallest cheese! What on earth do you think you're doing? You don't belong in that bigger cheese! You're supposed to be on top of it!! Gaaaahhhh!

Then I fixed it.

And after an entire day of frustration, anger, pity and utter confusion, I've come to acceptance.  I can totally do this. This isn't rocket science, it's just stupidly hard if you moan and complain and don't calm down to actually get some work done.


Any guidance I may need is given to me. I'll finish this assignment, I just need some patience and good troubleshooting skills. 

Alright. Until next time.

Sunday 2 February 2014

There are no exceptions to this smooth week.

I think the week went well. While working with subclasses and exceptions on the lab this week, my partner and I got a little too excited seeing everything work. I always do a happy dance on the inside when something I write works on first go. It's quite an accomplishment, in my eyes.

I was thinking quite a bit about this course over the past while, and have concluded that it won't be as easy as I hoped. Programming doesn't seem to come naturally to me. At least, not yet. Right now, I have to try code a million ways before getting it to work. Very seldom does something I write work on the first go- even though it does work eventually. This wouldn't be a problem if it wasn't for the fact that we'll have written tests. I can't test and debug!! That terrifies me. These written portions are what hurt my mark in csc108.

My strategy is to stop using my laptop for notes. I'm going to print out exercises and finish them on paper. How else will I learn?

On another note, I found this fun debugger guide for python.



Until next time.