Friday 18 April 2014

Week 12

Since this is our last week of lectures, Mr. Heap used it to help us prepare for the final exam. He did everything from reviewing all the concepts taught in this course to showing us the front page of the exam and explaining how he and Mr. Wehr plan to write it. I plan to make an extra effort in preparing for this exam since a lot of students, including myself, botched the second mid-term. This was, in fact, enough to convince Mr. Heap to bump up our marks on the test. I plan to prepare for the exam by tackling the exercise problems and past exams posted on the course website, as well as reading over the course notes again and figuring out which bits I should also copy to my aid sheet. If I run into any problems, I'll go see Mr. Heap at his office hours.

Wednesday 2 April 2014

Week 11

Sorting and Efficiency

In my entry for last week, I mentioned that Mr. Heap introduced us to two new sorting methods. I learned very quickly in CSC108 that computer science is about more than just building a program that works; sorting data properly and developing an efficient program emphasize that.

Sorting in computer science refers to arranging a number of elements in a certain order, whether ascending or descending by value. This may seem unnecessary, but it can significantly cut down on the number of steps needed to accomplish a task. Furthermore, some methods, such as binary search, require the elements they operate on to already be sorted. Some programs even require data to be sorted in order to function properly. Because of this, sorting is essential to writing organized code.
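
As a quick sanity check on why sortedness matters, here's a small sketch using Python's built-in bisect module (the data here is just my own made-up example):

```python
import bisect

# Binary search only works on sorted data, so sort first.
data = [34, 7, 23, 32, 5, 62]
data.sort()  # now [5, 7, 23, 32, 34, 62]

# bisect_left does a binary search and returns the index of 23.
index = bisect.bisect_left(data, 23)
print(index)            # -> 2
print(data[index] == 23)  # -> True
```

Calling bisect_left on the unsorted list would have given a meaningless index, which is exactly why binary search needs sorted input.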

Efficiency, also known as performance, describes the number of steps executed when a program is run. This can be determined by finding the algorithm's complexity: the big-oh of the program. I mentioned in an earlier entry that the big-oh of a program describes the number of steps taken on the worst-case input, or the maximum number of steps the program will take given any input. Knowing the big-ohs of programs makes it very easy to compare efficiencies. For example, program 1 with big-oh O(log n) is more efficient than program 2 with big-oh O(n^2); in their worst cases, program 1 will take fewer steps than program 2, given that the size of the input is sufficiently large for both programs.
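
To convince myself of this, I sketched rough worst-case step counts for two hypothetical programs (the functions are my own stand-ins, not real measurements):

```python
import math

def steps_program1(n):
    """O(log n) steps: e.g. binary search halves the input each step."""
    return max(1, math.ceil(math.log2(n)))

def steps_program2(n):
    """O(n^2) steps: e.g. comparing every pair of elements."""
    return n * n

for n in [10, 1000, 1_000_000]:
    print(n, steps_program1(n), steps_program2(n))
```

Even at n = 1000 the gap is enormous: roughly 10 steps versus 1,000,000.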

Friday 28 March 2014

Week 10

The lab this week was really difficult. I didn't get very far in trying to implement an in-order traversal for a binary tree node. I'm hoping to ask Mr. Heap about this at his next office hour session.
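
After the lab, I tried sketching what I think an in-order traversal might look like, assuming a simple node class with value, left, and right attributes (my own names, not necessarily the lab's starter code):

```python
class BTNode:
    """A simple binary tree node (my own sketch, not the lab handout)."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def inorder(node):
    """Return a list of values from an in-order (left, root, right) traversal."""
    if node is None:
        return []
    return inorder(node.left) + [node.value] + inorder(node.right)

# A small tree:    2
#                 / \
#                1   3
root = BTNode(2, BTNode(1), BTNode(3))
print(inorder(root))  # -> [1, 2, 3]
```

I'll check with Mr. Heap whether this recursive structure is on the right track.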

In this week's lecture slides, Mr. Heap started with a couple of pointers to help us with the second part of Assignment 2. They really helped clarify what each function was supposed to do and provided shortcuts for some of them. If there is one thing I am going to remember about my time in CSC148, it's that I got a prof who actually wanted us to succeed. Mr. Heap then introduced us to two more sorting methods: quick sort and merge sort. Quick sort was completely new to me; I had never heard of it until now. Mr. Heap's tests and comparisons in lecture taught me how one simple detail in a function can drastically affect its efficiency. I was taught the concept of merge sort in high school, but I never understood how to execute it in practice. I didn't even know that I could have a function nested within another function! But according to Mr. Heap's code, that is how merge sort works: by using a helper function. In the lectures, Mr. Heap used a couple of cards and applied the concepts of the sort methods to help us understand them. By doing this, he stressed that learning concepts to be used in code is more important than simply learning more coding functions and methods.
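
From what I remember, the nested-helper idea in merge sort looks roughly like this (my own reconstruction, not Mr. Heap's exact code):

```python
def merge_sort(lst):
    """Return a new sorted list using merge sort."""
    def merge(left, right):
        """Nested helper: merge two sorted lists into one sorted list."""
        result = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                result.append(left[i])
                i += 1
            else:
                result.append(right[j])
                j += 1
        # One of these slices is empty; the other holds the leftovers.
        return result + left[i:] + right[j:]

    if len(lst) <= 1:
        return lst[:]
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

print(merge_sort([5, 2, 8, 1, 9, 3]))  # -> [1, 2, 3, 5, 8, 9]
```

The helper only exists inside merge_sort, which is the nesting that surprised me.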

Tuesday 18 March 2014

Week 9

The new concept of Binary Search Trees is a bit harder to understand than I thought. The structure of a BST is simple enough; it's basically a regular binary tree, but sorted. Each value that is smaller than the root is contained in the left sub-tree, while each value larger than the root is contained in the right sub-tree. Each value is automatically placed in sorted position as it is inserted into the tree, so there is rarely a need to call a sort method on a BST. Since BSTs are sorted, they are a suitable structure for binary search. A binary search cuts its search range in half every time it does not find its desired value. Compared to a linear search, which checks each value one by one, binary search is more efficient. One of the concepts I'm having difficulty understanding is how some of a BST's methods work. For example, I don't understand how a delete method removes the root of a BST and chooses its replacement.
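
To at least pin down the parts I do understand, here's a minimal BST sketch with insert and search (my own simplified version; I've left delete out since that's the part I don't get yet):

```python
class BSTNode:
    """A node in a binary search tree (my own sketch)."""
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(node, value):
    """Insert value, keeping the BST property; return the (possibly new) root."""
    if node is None:
        return BSTNode(value)
    if value < node.value:
        node.left = insert(node.left, value)
    else:
        node.right = insert(node.right, value)
    return node

def contains(node, value):
    """Binary search: each comparison discards one whole subtree."""
    if node is None:
        return False
    if value == node.value:
        return True
    if value < node.value:
        return contains(node.left, value)
    return contains(node.right, value)

root = None
for v in [8, 3, 10, 1, 6]:
    root = insert(root, v)
print(contains(root, 6), contains(root, 7))  # -> True False
```

Because insert places each value on the correct side, the tree stays sorted without ever calling a sort method.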

Mr. Heap also went into the concept of big-ohs this week. I already learned about this in CSC165, but it's still a nice refresher to have it introduced again. The big-oh represents the maximum number of steps to be executed by a program, which means the big-oh of a program is focused on the worst-case argument. Big-ohs represent these maximums in the form of mathematical functions (e.g. constant, linear, quadratic), where the function is truncated to the term with the highest power (e.g. if the function n^2 + 3n + 16 were to represent the number of steps a program would take in the worst case, only the term n^2 would be considered important). This makes big-ohs useful for determining the efficiency of programs.
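
To see for myself why only the n^2 term matters, I computed the ratio between the full step count and just n^2 for a few sizes:

```python
def worst_case_steps(n):
    # Hypothetical worst-case step count from the example above.
    return n ** 2 + 3 * n + 16

# As n grows, the 3n + 16 part becomes negligible next to n^2,
# so the ratio approaches 1.
for n in [10, 100, 10_000]:
    print(n, round(worst_case_steps(n) / n ** 2, 4))
```

At n = 10 the lower-order terms still account for almost half the steps, but by n = 10,000 they change the total by a fraction of a percent, which is why big-oh drops them.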

Sunday 9 March 2014

Week 8

This week, Mr. Heap talked more about linked lists. Linked lists were mentioned in high school, but like many other coding concepts, I never truly understood them until I came to university. The way I see it, a linked list is a list with only two elements, where the first element is an object and the second element is either empty or another linked list. There's recursion in this; the nested linked lists can be accessed with recursive concepts. The lab this week gave me some much-needed hands-on practice with linked lists; implementing methods like __len__ and __setitem__ gave me some useful insight into a linked list's structure, its strengths, and its flaws.
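
That two-element picture translates into code roughly like this (my own sketch, not the lab handout):

```python
class LinkedList:
    """A linked list as (value, rest): rest is None or another LinkedList."""
    def __init__(self, value, rest=None):
        self.value = value
        self.rest = rest

    def __len__(self):
        # Recursive: 1 for this node plus the length of the rest.
        if self.rest is None:
            return 1
        return 1 + len(self.rest)

lst = LinkedList(1, LinkedList(2, LinkedList(3)))
print(len(lst))  # -> 3
```

The recursion shows up naturally: __len__ on the whole list is defined in terms of __len__ on the nested list.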

One of the things I like about linked lists is that they are a coding concept, like recursion, rather than a feature of one language. Linked lists are not limited to just Python; they can be implemented in other programming languages as well. Some languages like Java are very rigid when it comes to list-like variable types. For example, a Java array (Java's equivalent of Python's list) must have its length permanently set upon initialization. This can cause problems with processing efficiency, but linked lists can help Java programmers get around this problem.

Linked lists mirror recursion, however, when it comes to their flaws. Like a recursive function, a linked list class also requires a clever and thoughtful design. Failing to plan out a linked list's structure effectively can result in a heavily error-prone class. In other words, linked lists can be harder to design and work with than basic lists. Overall, like recursion, linked lists require more thought and planning in their design, but if designed properly, they can offer an alternative, more efficient method of storing data.

Monday 3 March 2014

Week 7

Recursion

Sometimes, a programmer's task will involve executing the same process repeatedly (e.g. verifying each item in a list, constructing a list). While there's the basic method of using loops, there is another, more advanced method called recursion. Recursion, put simply, is a process triggering itself over and over again. A recursive function follows this concept; it calls itself repeatedly until a certain condition is met. This condition is known as the base case, and it is the case where the function stops calling itself. A base case is essential for a recursive function to be usable; without one, the function would keep calling itself indefinitely. Coding with loops will usually require quite a bit of code, but a fully functional recursive function can consist of only one line. This can make recursion preferable when working on large projects that require several functions and classes.
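
For example, a working recursive function really can be a single line. Here's factorial, where n == 0 is the base case that stops the calls:

```python
def factorial(n):
    """Return n! recursively; the base case n == 0 stops the calls."""
    return 1 if n == 0 else n * factorial(n - 1)

print(factorial(5))  # -> 120
```

Without the n == 0 branch, the function would call itself forever (well, until Python's recursion limit stops it).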

So far though, I've found recursive functions difficult to implement. While recursive functions can require little code, I find that their structures require a more clever, thoughtful, flexible design; otherwise, there are many cases where the function could fail. On our first midterm, we had to write a recursive function to answer one of the questions, and I found writing it challenging. Overall, while a recursive function can be useful if implemented properly, I am not sure if it is worth the effort to do so.

Monday 17 February 2014

Week 6

With the first midterm for this course coming up, I can't say I'm not nervous. I plan to use the old midterm he posted on the course calendar as a "diagnostic" for how well I know the material. I'm not sure if it will be reliable enough, though, since Mr. Heap mentioned that he planned to design this midterm differently. I wish he posted more prep exercises and review material. Nothing helps with understanding concepts and methods more than applying them.

I had a bit of trouble understanding the order traversals Mr. Heap showed us this week. I can grasp the concept, but I don't know the structure of a function that would return a traversal in a specific order. I'm hoping some peers on Piazza might be able to help me with this, but even with help, I'm sure this topic will cause me problems on the midterm.