Saturday 5 April 2014

The Last Slog

Week 12: The Last Slog

As the final week of classes finishes, the time to study for finals has arrived! Luckily, the CSC148 final is not until the 23rd, but there are many others to be done before then (at least for me). This week in class, we went over how dictionaries work, and we also had a review session covering everything from the semester. The explanation of dictionaries was interesting, as in CSC108 we were told the ordering was “random” (but not quite) in that it seemed random to us, but the computer had its own way of ordering the keys. I had been curious about what the instructor meant by this, and now I know: the keys are ordered through hashing. That is, the keys (which must be immutable) are each given a number by a hash function and are stored based on those numbers. In addition, there was also the review on Wednesday, which helped me remember what was covered this semester.
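
As a toy sketch of the idea (my own illustration, not CPython's actual dictionary implementation, which is more sophisticated): each immutable key is hashed to a number, and that number picks where the key is stored.

```python
# A toy hash table placement scheme, assuming a table of 8 buckets.
# (This is my own sketch, not how CPython's dict really works.)
def bucket_for(key, num_buckets=8):
    """Return the bucket index where an immutable key would be stored."""
    return hash(key) % num_buckets

# The storage order depends on the hashes, not on insertion order,
# which is why dictionary ordering looked "random" to us in CSC108.
for key in ["apple", "banana", "cherry"]:
    print(key, "-> bucket", bucket_for(key))
```

Mutable objects like lists can't be hashed at all, which is why they can't be dictionary keys.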

Outside of class, the main thing was E4. Overall, E4 was not that bad; the given code for the insert function was very helpful in writing the two functions needed. This task probably would have been more challenging had that code not been provided. I did make a careless (and probably common) mistake of returning node.item instead of the node itself for E4b, which caused the auto test to fail, but this was easily fixed. It seems one recurring problem for me in this class (and in other classes as well) is making careless mistakes even though I understand what I am doing. For example, it is very easy to make a simple typing mistake (such as leaving out a character in a variable name), forget to include a parameter in a function call, or misread directions (especially on a test when I am nervous). I am prone to such mistakes, which can sometimes be a major problem. The best way to improve, in my opinion, is to just check over my work before I submit it. I usually catch most of my errors, but something almost always manages to slip through my fingers (or I run out of time). These mistakes are especially fatal in programming, as one small error can cause the entire program to fail. But with more thorough checking (something I normally do), and maybe having someone else check my work (something I don't normally do but could try), I can hopefully eliminate these mistakes.
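
For illustration, here is a minimal sketch of the kind of slip I mean (the class and function names here are my own, not the actual E4 code):

```python
class BTNode:
    """A minimal binary search tree node (hypothetical, not E4's class)."""
    def __init__(self, item, left=None, right=None):
        self.item = item
        self.left = left
        self.right = right

def find(node, item):
    """Return the BTNode containing item, or None if it is absent."""
    if node is None:
        return None
    if item < node.item:
        return find(node.left, item)
    if item > node.item:
        return find(node.right, item)
    # Careless version: "return node.item" -- the value, not the node.
    # An auto test expecting a node (so it can check .left and .right)
    # would fail on that, even though the search logic is correct.
    return node
```

The recursion itself was fine; only the very last line was wrong, which is exactly why these mistakes are so easy to miss.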

The second last slog I picked to read is http://47tulips.wordpress.com/. This person found sorting and big O to be the most difficult unit of the semester, and I feel the same way. To me, big O requires a different way of thinking than writing code does, which may be why some people are uncomfortable with it. This person also mentioned that the sorts we learned in CSC108 were very inefficient compared to the ones we learned this semester. I agree, as I said something similar in my entry on sorting. The very last slog from another person I will read is http://supercskid.blogspot.ca/. I thought this person's descriptions of merge sort and quick sort were fairly accurate. Like the previous slogger, this person is also still getting used to big O. However, they mentioned that the lab on sorting was the easiest. I actually agree with this, as a large part of it was just copying, pasting, and making graphs. Interpreting the graphs was not too bad either, as we already knew the run times from class, and the graphs confirmed what was said there.


Well, that was the last slog. Goodbye everyone! See you at the final :)

Wednesday 2 April 2014

Sorting and Efficiency

Week 11: Sorting and Efficiency

The last topic of CSC148 is sorting and efficiency. My first exploration of this topic began in CSC108, where the bubble, insertion, and selection sorting methods were introduced and their run times analyzed. Now we have returned to the topic with more to learn: more sorting methods (mainly quick and merge sort) were introduced, their run times were analyzed, and we began using big O notation to describe efficiency.

To begin with sorting, here is a little summary. For quick sort, a pivot is chosen; then all values less than the pivot go to its left, all values greater go to its right, and the pivot lands in its proper position. Quick sort is then recursively called on each side of the pivot, and the process continues until the list is sorted. For merge sort, the list is first divided in half. Merge sort is then recursively called to sort each half, and the sorted halves are “merged” together.
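
As a sketch (my own code, not the versions from class), the two descriptions above translate to something like this:

```python
def quick_sort(lst):
    """Pick a pivot, partition around it, and recurse on each side."""
    if len(lst) <= 1:
        return lst  # a list of 0 or 1 items is already sorted
    pivot, rest = lst[0], lst[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quick_sort(left) + [pivot] + quick_sort(right)

def merge_sort(lst):
    """Divide the list in half, sort each half, then merge them."""
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

def merge(a, b):
    """Combine two sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            result.append(a[i])
            i += 1
        else:
            result.append(b[j])
            j += 1
    return result + a[i:] + b[j:]
```

These versions build new lists for simplicity; the in-place versions from class shuffle elements within one list instead.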

Something I have noticed about these sorts is that they are generally much more efficient than bubble, selection, and insertion sort. For example, all of the 108 sorts have a worst-case run time of O(n²), while merge and quick sort's run times are generally O(n lg n). I wondered why this was, and then it occurred to me. The 108 sorts iterate over multiple rounds. The number of comparisons in one round increases linearly with the size of the list, giving O(n). In addition, the number of rounds is simply the length of the list (or sometimes the length of the list minus one), so the number of rounds also increases linearly with the size of the list, giving O(n). This leads the total number of comparisons/exchanges to be quadratic, by the famous formula for the summation of every number from 1 to n. Merge sort, however, takes a more recursive and binary approach. By dividing the list in half each round (a binary strategy), the number of recursive levels required only increases by one every time the list size doubles, greatly reducing the run time, as opposed to the fourfold increase in work the 108 sorts suffer when the list doubles. This is where the lg n comes from in merge sort's run time. So where does the additional n come from? At each recursive level of merge sort, all of the elements in the list are compared, so the number of comparisons per level increases linearly with the size of the list, giving O(n). Therefore, the total number of comparisons is the number of comparisons at each level times the number of levels. The number of comparisons at each level is n, and the number of levels is lg n, so the run time of merge sort is O(n lg n). The run time of quick sort has a similar explanation, but only if the chosen pivot lies near the middle of the list. When the pivot is in a middle position, the list is roughly divided in half, a binary strategy like merge sort's.
However, when the pivot repeatedly lands too far to the left or too far to the right (the worst case), quick sort starts to resemble a combination of insertion and bubble sort. The run time here is O(n²), the same as insertion and bubble sort in the worst case. So, back to the first sentence of this paragraph: these new sorts are generally more efficient than the 108 sorts because, by taking a binary approach, they greatly reduce the number of rounds of comparisons.
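
A quick back-of-the-envelope check of these growth rates (my own numbers, not figures from class): the summation formula gives the comparison count for a 108-style sort, and n lg n approximates merge sort's.

```python
import math

def quadratic_comparisons(n):
    """Comparisons for a 108-style sort: (n-1) + (n-2) + ... + 1 = n(n-1)/2."""
    return n * (n - 1) // 2

# Doubling the list size roughly quadruples the quadratic count,
# while the n lg n count only slightly more than doubles.
for n in [8, 16, 32]:
    print(n, quadratic_comparisons(n), round(n * math.log2(n)))
```

Running this shows the gap widening quickly, which matches the graphs we made in the sorting lab.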

Of the things we learned in this class, the one I had the least practice and felt most uncomfortable with was sorting and efficiency (especially run time analysis). However, the examples in class, the past test question, and writing this slog helped me understand it better! 

Sunday 23 March 2014

Week 10

Week 10

Ten weeks of class means five sixths of the course is over! I shouldn't think about this too much, as it will only distract me. (For example, I almost forgot to write this slog.) Now turning to the actual course: this week, various sorting algorithms were reviewed. Some, such as selection, bubble, and insertion sort, I already knew from CSC108. Others, such as quick and merge sort, were new to me. When these were explained in class, I felt that I could understand them fairly well. I found the card demonstrations helpful, as I am sometimes a visual learner. In addition to the methods themselves, the run times of these sorting algorithms were analyzed. Although I am less confident with run time analysis, once it was explained, I think I understand why merge and quick sort have the run times they have.

Outside of class, I worked on A2 part 2, and I started studying for the second test. I had finished most of A2 part 2 the week before, so this week was mainly for revision (and there were many revisions to be made!). In the end, however, I believe I have a working program. To study for the next test, I have so far been redoing the old labs, starting with week six. When redoing the labs, I find that they are much easier and more straightforward. For example, a function that took my partner and me forever to write in the lab took only a fraction of the time at home. I also plan to look over in-class examples and old tests. After doing the labs, I feel that I better understand linked lists and binary trees, but on Piazza, it was also mentioned that big O would be on the test. This makes me nervous, as I do not feel that confident about big O. I remember doing some run time analysis in CSC108, but it was only a little bit near the end. I hope to get more practice by doing the old test question on it (but I would also like the answers to check my work!). I should probably work through the in-class examples too, to make sure I really understand them.


The address of the first slog I randomly picked is http://codezerda.tumblr.com/. This person basically summarized binary search trees. I agree with what this person has to say, as it is mainly objective. In the “about me” section of the slog, the person mentions being a computer science student and wanting a job coding video games (which could be cool, in my opinion). I know I said this in a previous slog, but right now, I'm not a CS student; I'm taking this class out of family tradition. But as the semester comes to a close, the time has come to choose a POSt. Among a few other things, I'm actually considering CS, but I need to think about it more first! On to the next slog: http://bighairyleg.blogspot.ca/ (eww, big hairy leg). This person has only one post, the week 3 entry on object-oriented programming, so I can only comment on that, but it will be fun looking back. This person (at least in week 3) didn't feel very confident about object-oriented programming. I can relate, as in CSC108, when we covered it at the end, I wasn't confident either. When I found out this was the first topic of CSC148, I thought, oh no! On top of not feeling great about it, I hadn't touched it at all during winter break. I, too, was worried those first few weeks. But as we learned harder things like recursion, object-oriented programming suddenly didn't seem so bad. A good example of this was A2 part 1, which was basically OOP: designing classes and using inheritance. Many people, including me, didn't think it was so bad. I don't know when OOP “clicked” for me, but it must have at some point, and I hope this person also feels the same way now.

Sunday 16 March 2014

Week 9

Week 9

Another week completed means another slog to be written, so here it is. This week in class, more on binary search trees and run time analysis was covered. For binary search trees, we looked further into the methods that can be performed on them, the importance of ordering, and how ordering can be preserved. It was also mentioned that searching a BST is most efficient when the tree is balanced, so that half of the remaining values can be ignored after every round of a search. This began the transition into run time analysis, where the next question posed was: given a well-balanced BST, how would its run time be described? The answer, as it turns out, is logarithmic (which is efficient). In addition, big O notation, which is used to describe run times, was introduced. Apparently, big O was covered in CSC165, but I am not taking that (I'm not a CS student), so this is new to me.
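
To see why halving the values each round gives a logarithmic run time, here is a sketch (my own, not from class) that counts comparison rounds in a binary search, which follows the same discard-half idea as a balanced BST search:

```python
def binary_search_rounds(lst, item):
    """Return the number of comparison rounds binary search needs on a
    sorted list. Each round discards half of the remaining values."""
    low, high, rounds = 0, len(lst) - 1, 0
    while low <= high:
        rounds += 1
        mid = (low + high) // 2
        if lst[mid] == item:
            return rounds
        elif lst[mid] < item:
            low = mid + 1
        else:
            high = mid - 1
    return rounds  # item not found; rounds taken to rule it out

# Even for 1024 items, only about 10 rounds are needed: O(lg n).
print(binary_search_rounds(list(range(1024)), 0))
```

Doubling the list adds only about one more round, which is exactly the logarithmic behaviour described above.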

Outside of lecture, the main things I worked on were exercise 3 and assignment 2 part 2. I actually finished exercise 3 last week, but I had made some careless mistakes that caused failures in my auto test report. Also, I think there was a mistake in the auto test itself, where the test for sum to deepest had an invalid input. This was mentioned on Piazza several times, but I think the person running the auto test fixed it, as I did not get that error again. So, after fixing my errors, I worked on A2 part 2. Because I had midterms in other classes to study for, I did not start part 2 until Thursday. When I started working on it, I read that we only had to write four functions. Initially, I thought that if I wrote two functions a day, I could finish by Friday (tomorrow). As it turns out, however, even though there are only four functions, they were much more difficult than they seemed. Writing one function and getting it to work was very time consuming for me, so I realized that two functions per day was too ambitious and decided to write one per day. When working on the assignment, I found the Piazza forum very helpful, as many of the questions I had were already answered there. I have now written three functions, and I plan to write the last one (the one that builds a tree from the regular expression) later today.

The first slog I chose to read comes from “The Angry Student” at http://angryprogram.blogspot.ca/. (What makes this student angry, anyway?) I usually look at the most recent post, but this time, I decided to look at some of the earlier ones. In one entry, this person writes about Project Euler. I have not done any problems on that website, but I have heard about it. A few years ago, when I did not know anything about programming, one of my older siblings wanted me to try it as a way to learn programming. But, as I was lazy and not interested, that never worked out. Now, however, I do know a little and would like to see if I can solve these problems. Maybe one day, when I have more free time, I will try it. Anyway, moving on to the next slog: http://pythonisnotasnake.wordpress.com/ (I picked this one based on the funny title). The first sentence in it was “Wow I need to post more often on this blog!”. When I read this, I checked to see how many entries this person had, and there were only three, so I guess that statement was right after all. Aside from that, the most recent entry had to do with recursion. Three laws (obtained from a website) were mentioned. I agree with them, but in my opinion, the first two, both about the base case, are saying almost the same thing and could be combined into one. Having three laws also reminds me of Newton and physics.

Sunday 9 March 2014

Week 8

Week 8

Eight weeks into the class means two thirds of the course is finished! Since there is no assigned topic, I will write this in my usual format. This week, there were two main topics: linked lists and binary search trees. Linked lists were reviewed last week, but this week, they were implemented differently by using a wrapper and a helper. A wrapper and a helper are essentially two parts of one function: the helper does the core part of the work, while the wrapper calls the helper and completes the function. As a result, only the wrapper is called directly, and the helper runs when the wrapper is. Having learned this, I can see how it would be useful, as it makes the program easier to follow. The other topic of the week was binary search trees. A binary tree is like the trees we have been reviewing, but what is special about it is that each node has at most two children. A string method was written in class, showing the use of wrapper and helper methods.
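
A minimal sketch of the wrapper/helper pattern (my own code and names, not the string method from class): the wrapper presents a clean interface, while the helper carries the extra state the recursion needs.

```python
class BinaryTree:
    """A minimal binary tree node: at most two children."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def tree_str(tree):
    """Wrapper: what callers use directly. No extra parameters."""
    return _tree_str(tree, 0)

def _tree_str(tree, indent):
    """Helper: does the recursive work, carrying the indentation depth
    that callers of the wrapper never need to know about."""
    if tree is None:
        return ""
    return (_tree_str(tree.right, indent + 1)
            + " " * 4 * indent + str(tree.value) + "\n"
            + _tree_str(tree.left, indent + 1))

t = BinaryTree(2, BinaryTree(1), BinaryTree(3))
print(tree_str(t))
```

The underscore on the helper's name is a Python convention hinting that it is internal and should not be called directly.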

Outside of lecture, I worked on the lab, assignment two part one, and exercise three. The lab dealt with linked lists, like last week's. After working through it, I feel much more secure about linked lists than I did last week; I was better able to understand what I was supposed to do and why. The second thing I worked on was A2 part 1. With this, I had several designs and could not decide which was best. So, after several changes and adjustments, I have something I think is okay (and I hope the grader thinks so too). Last, I worked on and finished exercise three early so I would have many opportunities to get autograder feedback before the due date. I thought this exercise was harder than the first two, and I had to spend much more time thinking about how to solve the problem than I expected. I also had to draw some pictures, which was not necessary for the first two exercises.

The first SLOG I picked to read is http://confabulatingastringofcontrivances148.blogspot.ca/. This log begins with the quote “In order to understand recursion, one must first understand recursion.” I thought this quote was humorous in that not only is it itself recursive, but it doesn't much help someone who is trying to understand recursion. One thing this person wrote was “How can we write our function in a way to use a value, before we even know what the value is going to be?”. He/she was describing a source of confusion many people have when first learning recursion. I can relate, as I actually wrote about having the same thought in my own slog entry on recursion. The rest of the slog is a straightforward example of how to solve a recursive problem (and has an interesting name). The second slog I picked is http://everyonelovesaslog.blogspot.ca/. This log mentioned the Fibonacci sequence, something I also mentioned in my recursion entry. The slog also mentions that in order to write a recursive function, one must know the base case. I think this is true, as when I write recursive functions, the first part I write is the base case. That way, once I know what my function does in the simplest case, I can build from that with the recursive calls.

Sunday 2 March 2014

Week 7

Week 7: Recursion

There is an assigned topic for this week's slog: recursion! To begin, a simple definition of recursion would be a situation where, in a function body, the function itself is called. A function body, however, cannot contain only recursive calls, as this would lead to endless recursion. This problem is solved by having a base case: a piece of non-recursive code that the function executes under certain conditions, stopping the chain of calls.
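
As a tiny illustration of both pieces (my own example, not one from class):

```python
def factorial(n):
    """Compute n! recursively."""
    if n == 0:
        return 1  # base case: no recursive call, so the chain stops here
    return n * factorial(n - 1)  # recursive call on a smaller problem

print(factorial(5))  # 120
```

Without the `n == 0` branch, the function would keep calling itself with smaller and smaller (eventually negative) numbers forever, which is exactly the endless recursion problem the base case prevents.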

Over the past four weeks since recursion was introduced, the most common case I have seen in which recursion is used is nesting. For example, the first recursive function written in class was one to find the sum of a nested list. Since then, we have written more functions for nested lists and have been introduced to various other recursive structures, including trees, linked lists, and the search lists from the lab, which are recursive because they may contain other instances of themselves nested within them. Another case where recursion is used, and where I was first introduced to the concept, is in a mathematical function whose result depends on the previous result(s). For example, the week five reading mentioned the Fibonacci sequence, and the function M(n) used to find the minimum number of moves required for moving n cheeses from A1 is recursive. Lastly, recursion can also be used for drawing pretty fractal pictures.
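
A sketch of both kinds of recursion mentioned above (my own code, not the class versions): summing a nested list, and a mathematical function where each result depends on previous ones.

```python
def nested_sum(lst):
    """Sum a possibly nested list of numbers: recurse on sublists,
    add plain numbers directly."""
    return sum(nested_sum(x) if isinstance(x, list) else x for x in lst)

def fib(n):
    """The Fibonacci sequence: each result depends on the previous two."""
    if n < 2:
        return n  # base cases: fib(0) = 0, fib(1) = 1
    return fib(n - 1) + fib(n - 2)

print(nested_sum([1, [2, [3, 4]], 5]))  # 15
print(fib(10))                          # 55
```

Note how the structure of `nested_sum` mirrors the structure of the data: a nested list contains lists, so a function on nested lists calls itself on those sublists.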


When this topic was first mentioned, I thought it was strange that a function could call itself again without finishing its original call. But now I understand that in order to finish the original call, the function needs to call itself to solve a “sub-problem” and use that result in finishing the first call. The first time I wrote a recursive function, I was overly concerned with my base case because I was worried about endless recursion, but as I got more practice, I was better able to write an effective solution with considerable confidence and a better understanding of how the function runs. I also now see why recursion is useful. In the case of nesting, for example, it is much simpler and more efficient to write recursive code than non-recursive code: for a large degree of nesting, one would need a large number of repetitive code segments to deal with each level, while a recursive call takes up only one line. Reading that one line is also easier than reading numerous lines of repetition. Overall, I find recursion to be an interesting topic.

Monday 17 February 2014

Week 6

Week 6

This week in class, we went over trees. The terminology was straightforward and seemed logical (for example: root, leaf, node, path, length). How to write code that visits every element in a tree, called a traversal, was also reviewed. There were three types: pre-order, in-order, and post-order traversal, and each involved recursion. In addition, various methods for a tree class were introduced.
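
The three traversals can be sketched on a binary tree like this (a minimal class and my own function names, not the ones from class); they all visit every node recursively and differ only in when the root's value is recorded.

```python
class BTree:
    """A minimal binary tree node."""
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def preorder(t):
    """Root first, then left subtree, then right subtree."""
    return [] if t is None else [t.value] + preorder(t.left) + preorder(t.right)

def inorder(t):
    """Left subtree, then root, then right subtree."""
    return [] if t is None else inorder(t.left) + [t.value] + inorder(t.right)

def postorder(t):
    """Left subtree, then right subtree, then root last."""
    return [] if t is None else postorder(t.left) + postorder(t.right) + [t.value]

t = BTree(2, BTree(1), BTree(3))
print(preorder(t), inorder(t), postorder(t))  # [2, 1, 3] [1, 2, 3] [1, 3, 2]
```

On a binary search tree, the in-order traversal conveniently produces the values in sorted order.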

Outside of class, the things I worked on this week were the lab and assignment 1. The lab involved list comprehensions as well as writing some unit test cases. I remember that the last time I had to do this, in a previous lab, I forgot how and was a little confused. But this time, when I looked at the examples from the previous lab, the process was much clearer to me. The other part, converting list comprehensions to for loops, went okay. The other main thing this week was assignment 1. Although I thought I had finished it last week, I realized I could make some major improvements to what I had. This mainly applies to step five, which solves the Towers of Anne Hoy game for four stools in the most efficient way (hopefully). What I originally had matched the most efficient known method for up to nine cheeses. Starting with the tenth cheese, my method took more moves, and by twenty cheeses, it was using more than double the moves of the best known method. (I found this out by looking at other people's posts on Piazza.) To make matters worse, I did not see all this until February 13 at around 4:30-5:00 P.M., leaving me about five hours to fix my code. After several failed attempts, I did figure out something that I believe produces the same number of moves as the best known solution for any number of cheeses (not just up to nine, like my old method), with a few hours to spare. With assignment one out of the way, I guess the thing to do now is study for the midterm and start A2!
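
For anyone curious, the move counts for the best known four-stool method come from a recurrence along these lines (this is the general idea, often called the Frame-Stewart approach; the function names are mine, and my actual A1 code differs): move some i cheeses aside using all four stools, move the remaining n - i with the classic three-stool method, then move the i cheeses back, choosing the i that minimizes the total.

```python
def three_stool_moves(n):
    """Minimum moves for the classic 3-stool Towers puzzle: 2^n - 1."""
    return 2 ** n - 1

def four_stool_moves(n):
    """Minimum moves for n cheeses on 4 stools, trying every split i."""
    if n == 1:
        return 1
    return min(2 * four_stool_moves(i) + three_stool_moves(n - i)
               for i in range(1, n))

print([four_stool_moves(n) for n in range(1, 6)])  # [1, 3, 5, 9, 13]
```

Comparing my program's move counts against this recurrence for ten or twenty cheeses would have caught my inefficiency much earlier than Piazza did.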


The first SLOG I randomly picked today is http://csc148minimalist.tumblr.com/. After reading it, I (and probably many people) can relate to this person's “drifting in and out of classes in a zombie-like fashion” and hoping that this class would help ease them into the new semester. I do agree that the first lab was harder than expected, and like I mentioned in a previous log, the directions were a little confusing. This person also makes a comment about CSC108 being ego boosting, which I thought was funny. The second SLOG is http://heapofcode.blogspot.ca/. The first thing that stood out on this SLOG is the title of the person's first entry (with the def __init__), which I thought was creative. I also liked how this person tried to sound enthusiastic, and how he/she mentioned Pokemon in the object-oriented programming log, because I happen to have a bunch of Pokemon toys that bring back some very sweet memories :)