Friday, December 5, 2008

NFSA

Just had the term test today. It went better than I expected since I wasn't too prepared on NFSAs, and I got lucky there wasn't a question on them. There was one question on DFSAs, which was fairly easy since no proof was required. The second and third questions were about regular expressions, and I've had enough practice with those that they didn't trouble me much at all.
I took a look at NFSAs after the test, however, and they don't seem as bad as I had thought. Apparently they are equivalent to DFSAs, but a lot simpler to construct and deal with. My main concern was those seemingly "random" choices that are allowed to be made. But now I see that it's just a different form of expression; it's not like a computer randomly chooses one path. It's just that there are different allowed paths that can be taken. For some reason NFSAs actually feel closer to the more tangible handling of regular expressions that we did in CSC207. Anyways, it looks like I'm not too badly prepared for next week. I just need to look through NFSAs a little more in depth, take a look at the Pumping Lemma, and I should be set for the exam.
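To convince myself the nondeterminism isn't really "random", I tried simulating an NFSA by tracking the whole set of states it could be in at once, which is the same idea behind the subset construction that makes NFSAs equivalent to DFSAs. This is just a sketch; the machine below (accepting strings ending in "01") is a made-up example, not one from the course.

```python
# Hypothetical NFSA over {0,1} accepting strings that end in "01".
# Transitions map (state, symbol) -> set of possible next states.
nfsa_delta = {
    ("q0", "0"): {"q0", "q1"},  # on '0': stay, or guess the "01" suffix starts here
    ("q0", "1"): {"q0"},
    ("q1", "1"): {"q2"},
}
start, accepting = "q0", {"q2"}

def nfsa_accepts(s):
    """Simulate the NFSA by tracking the SET of states reachable so far.
    No coin flips needed: we just follow every allowed path at once."""
    current = {start}
    for ch in s:
        current = set().union(*(nfsa_delta.get((q, ch), set()) for q in current))
    return bool(current & accepting)
```

Each reachable set of NFSA states plays the role of a single DFSA state, which is exactly why the two models recognize the same languages.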
This is my last blog post for this course I suppose. I wasn't exactly jumping for joy about this course in the beginning since I'm more into algorithms that are a little more practical than the purely theoretical side of computer science, but I actually quite enjoyed it and I must say it has made me more curious about what CSC263 has in store. Danny was an excellent prof and I hope I get to take a class by him again.

Wednesday, December 3, 2008

FSA, regex and CFG

I've been out of touch with the SLOGs lately. I guess here I'm going to reflect on everything that happened in the last four weeks.
To start off, despite them being able to represent the same things, I still find it sort of hard to convert regexes into FSAs and vice versa fluently. However, the feeling I got from Danny's handling of these two is that certain languages are easier to handle with one of them and others are easier with the other. The first example he showed us with an FSA clearly demonstrates this. The language of binary strings with an even number of zeros and an odd number of ones is quite difficult to find a regular expression for, but an FSA can be built very simply. Individually, i.e. without the worry of converting from one to the other, I think I have a pretty good handle on both methods. This is especially true for regular expressions, since I got quite a lot of practice dealing with them in the second assignment for CSC207.
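To see why that language is so easy for an FSA, I sketched it out: the machine only has to remember the parity of the zeros and the parity of the ones, so four states suffice. A rough simulation (my own encoding of the states, not Danny's diagram):

```python
# Four states encode (parity of zeros, parity of ones): "e" = even, "o" = odd.
# Finite memory is all this language needs, which is why the FSA is so simple.
delta = {
    ("ee", "0"): "oe", ("ee", "1"): "eo",
    ("oe", "0"): "ee", ("oe", "1"): "oo",
    ("eo", "0"): "oo", ("eo", "1"): "ee",
    ("oo", "0"): "eo", ("oo", "1"): "oe",
}

def accepts(s):
    state = "ee"  # start: even zeros, even ones
    for ch in s:
        state = delta[(state, ch)]
    return state == "eo"  # accept: even zeros, odd ones
```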
I'm not too clear on NFSAs however. I missed some important lectures on that topic and now I'm behind. Hopefully I will be able to catch up fully before Friday's test. It isn't very clear to me how to decide which path to take when there is more than one option. Another thing that throws me off is the necessity to include epsilon transitions for the start and ending states. I was unable to solve the last question of Assignment 3, but I have yet to study this topic properly, so hopefully things will clear up soon.
And then we covered context-free grammars. These seem pretty straightforward, but I'm having trouble understanding why they are equivalent to PDAs. In a PDA, we can use a stack to remember certain information. But where is this memory feature in CFGs? PDAs are cool though. I liked how one could handle binary strings with the same number of 0s and 1s. A friend of mine in 3rd year computer science told me about this exact same language and how an FSA or regular expression cannot handle it, and that apparently it takes a Turing Machine to do it. But as it turned out, all it takes is the stack feature of a PDA.
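Here's a minimal sketch of the stack idea for that language (equal numbers of 0s and 1s). This is my own simplified simulation, not the formal PDA from lecture: the stack holds the surplus of whichever symbol is currently ahead, and we accept when the stack empties out.

```python
def pda_accepts(s):
    """Sketch of the PDA's stack at work on {w : #0(w) == #1(w)}.
    Push when the new symbol extends the current surplus, pop when it
    cancels one out. Accept iff the stack is empty at the end."""
    stack = []
    for ch in s:
        if stack and stack[-1] != ch:
            stack.pop()        # new symbol cancels one surplus symbol
        else:
            stack.append(ch)   # new symbol adds to the surplus
    return not stack
```

The unbounded stack is precisely the memory an FSA lacks; it counts an arbitrarily large surplus, which no finite set of states can do.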

Tuesday, November 11, 2008

Iterative Correctness

I missed a couple of lectures on iterative correctness and it took last week's test to get me up to speed. It is, as Danny had mentioned, definitely harder than recursive correctness. I found the use of induction here to be more traditional when compared with recursive correctness. Recursive programs are broken down into smaller parts, so we had to assume those smaller parts were correct and then conclude that this implied correctness at full size. But in iterative programs we do induction over i, the iteration number of the loop, and prove a formula called the loop invariant. So to me this seems much more like ordinary induction.
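To make this concrete for myself, here's a toy loop (my own example, not one from class) where the invariant is checked at the top of every iteration, exactly mirroring the induction on i:

```python
def sum_to(n):
    """Invariant: entering iteration i, total == 0 + 1 + ... + (i - 1),
    i.e. total == (i - 1) * i // 2. Induction on i proves it holds for
    every iteration; at exit (after i == n) it gives total == n*(n+1)//2."""
    total = 0
    for i in range(1, n + 1):
        assert total == (i - 1) * i // 2  # the loop invariant, checked at runtime
        total += i
    return total
```

The postcondition falls out of the invariant plus the exit condition, which is the same structure as the formal proofs, just executable.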
However, the hard part is actually figuring out what the loop invariant is. I seem to always have trouble with this unless it's a really simple program. The example that Danny showed in class about the counting down in base 7 bothers me even now. I don't think I'd have been able to figure out the loop invariant would be 7a instead of 6a.
Another thing I don't understand is the necessity to prove the termination of the program. The necessity to prove the termination of the loop itself is fine, so that it doesn't iterate infinitely. But if we need to prove the termination of the program in this case, why was it not necessary to do so for recursive correctness? Maybe it's simply that, unless there is a loop, it's trivial to see the program will terminate. But formally speaking, shouldn't we mention it in recursive correctness as well? A recursion without a base case could quite possibly run forever too.
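The standard termination argument, as I understand it, is to exhibit a natural-number quantity that strictly decreases every iteration; since the naturals can't decrease forever, the loop must halt. A toy example of my own, with the decreasing measure checked explicitly:

```python
def halve_until_zero(n):
    """Termination sketch: n is a natural number that strictly decreases
    on every iteration, so the loop cannot run forever. The same style of
    argument covers recursion: the argument shrinks toward the base case."""
    steps = 0
    while n > 0:
        prev = n
        n //= 2
        assert n < prev  # the strictly decreasing measure
        steps += 1
    return steps
```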

Saturday, October 25, 2008

Correctness

We started proving the correctness of recursive algorithms this week. I'm starting to realize how powerful induction is. Almost anything can be expressed in an "inductive" way.

Sometimes, though, it can be a bit tricky to figure out how to break a problem down into that form. Danny showed us this in class when proving the greatest common factor algorithm. Simplifying the two numbers into one by specially relating the two (for all m, for all n > m) helped turn it into a simple one-variable induction. This was a new way of seeing it for me.
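I don't have Danny's exact formulation written down, but the standard recursive Euclid algorithm shows the shape of the argument: assume the recursive call on the smaller pair is correct, and the correctness of the whole follows from gcd(m, n) == gcd(n mod m, m).

```python
def gcd(m, n):
    """Recursive Euclid (textbook version, assumed here rather than
    Danny's exact code). Correctness by induction on the first argument:
    the base case m == 0 gives gcd(0, n) == n, and the recursive call's
    first argument n % m is strictly smaller than m, so the induction
    hypothesis applies."""
    if m == 0:
        return n
    return gcd(n % m, m)
```

Notice how the proof only has to trust the single recursive call, not trace the whole call tree; that's the "top layer only" point below.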

Danny mentioned that recursive programs are easier to prove correct than iterative programs. This was at first kind of against my intuition, since I have a better picture of what's going on in an iterative program (recursive programs that branch off into stacks of trees and whatnot are especially confusing to follow along). However, now it seems this is actually true. For correctness we don't need to follow through with the whole recursion. Instead we just need to worry about the topmost layer, which is really simple. For iterative programs, however, I'm guessing the complexity comes from the fact that we actually need to worry about ALL the iterations, not just one of them.
Things will get clearer next week.

Monday, October 13, 2008

Post Term Test blues

I had my term test on Friday and I am disappointed with the outcome. There were three questions in total and the first two seemed simple enough. The last one, however, threw me for a loop for a good 20 minutes or so. The worst part is that, with less than a minute left to go, I figured out it was actually the simplest one. I spent most of the time attacking it with induction, but as it turned out I just had to know how to count! Ironic how I always preach "think outside the box" to everyone in any situation with the slightest relevance to it. Oh well, nobody's fault but mine (heh, pointless thing to say but I do love Led Zeppelin). Hopefully the TA can decrypt my scribbled attempts and sympathize a little bit.

The test did its job though. I was brimming with overconfidence after the Assignment 1 marks were handed back, and I slacked off during the week before the test.

Now problem set 3 is due and my week is absolute HELL. I have at least one piece of work due for each of my courses. All on Friday (except one). Need to get cracking. Not to mention I have some catching up to do on program correctness since I missed the first two lectures last week. I will post again after this hellish week is over.

Friday, October 3, 2008

First Impression

So this is my first blog. I'm kind of late in starting it but I think I needed this time to gain some experience with the class and reflect back on it.

Danny's lectures are quite informative and fun to be in. I really enjoy his practice of sparing 10 minutes of class time every now and then to let us go think about a problem and then discuss the solution with the rest of the class. I find this particularly useful since it instills a sense of competition among us students to come up with a better and more efficient solution quickly. I've never treated assignments in this way. Those were always (and still are) just a work session, no matter what the topic. It also gives me a better idea of what he expects from us than a handout or rubric of some sort would.

Another thing I really like is Danny's availability on the boards. I couldn't make it to the tutorials for Assignment 1 but it wasn't a problem at all since all my questions were answered by Danny or some of the other students on the board.

All that said, however, I have to say I found the first three weeks a bit slow. I thought more time was spent on induction than necessary. This sentiment is probably not shared by everyone though. I should probably read ahead on the upcoming topics, but I never manage to get around to it. I'm hoping the pace picks up a bit more.

Anyhoo that's enough rambling for now. I'll post again after another week of classes.