Udacity CS101 & CS373 – HW1 Review

Week 1 is over, and the results for the homework are available. I like the new progress section; it looks great. A couple of feature requests I would have:

  1. Result summary in the sidebar to let me see all my results at a glance
  2. Ability to see the question and explanations right from within the progress section so that I can more easily hop around between questions.

However, overall I am very impressed by the quality of the site. It was put together in just a couple of months, and the user experience is fantastic. Meanwhile, I was just paying bills on the Bank of America site, and the experience was awful. I’m very disappointed in a site that has surely had several million dollars invested in it. Please, can you just automatically fill in CA when I’ve typed in Santa Monica 90404?

Now, for a quick review of the questions:

UDACITY – CS101

I saw someone with this question on reddit, so just in case you didn’t know: the Udacity questions all have explanation videos. From the question, just click the next button.

These questions are mostly pretty straightforward, even for those who don’t have any programming experience. The quizzes after the lectures were trickier than the homework problems. However, I wouldn’t blame anyone for getting a few questions wrong. Not everyone has time to fully listen to all of the lectures or to verify their work. I will nevertheless address a few of the questions:

  • Q5. This is the question asking which expressions involving s would have the same value as s. This was a little tricky since almost all of the expressions were the same as s, and the remaining one was equal to s in all but the trivial case of s == '', and then only because ('')[0] throws an error (or in Python speak, raises an error).
  • Q8. Print out the index of the second occurrence of the string 'zip'. Most programmers’ first instinct was probably to use an if statement checking whether the first find was successful: if firstIndex > -1: return text.find('zip', firstIndex + 1) else: return -1. However, this is CS101, and you haven’t learned the if statement yet. Oops. But then you realize that if text.find('zip') returned -1, the second search would simply start from index 0 and return -1 as well, and so you have your solution: text.find('zip', text.find('zip') + 1)
  • Q9. This was rated two gold stars because of the difficulty, and it did take some thought. I had even written the solution using the if statement. But then I remembered some post I had read on performance and how many of the AS3 Math functions were slow. That person’s fast round solution was to add 0.5 and cast to int. So there was my answer. I’m not sure I would have thought of it if not for that distant memory.
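The Q8 and Q9 tricks above can both be sketched in a few lines. This is my own illustration, not Udacity’s grading code; the sample string is made up.

```python
# Q8: index of the second occurrence of 'zip', without an if statement.
# If the first find returns -1, the second search starts at index 0,
# so a string with no match still yields -1 overall.
text = "all zip files are zipped"   # sample input for illustration
first = text.find('zip')
second = text.find('zip', first + 1)
print(second)   # 18

# Q9: fast rounding for a non-negative float: add 0.5 and truncate.
# (int() truncates toward zero, so this only rounds correctly for x >= 0.)
def round_half_up(x):
    return int(x + 0.5)
```

Note the non-negative caveat: int(-1.8) is -1, not -2, so the add-0.5 trick breaks for negative inputs.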

UDACITY – CS373 – HW 1

First, I’d just like to say how happy I am that there is actual programming in this class. The programming assignments were the single greatest factor that made Ng’s machine learning class so much better than Norvig and Thrun’s Intro to AI class. Now the playing field is even.

  • Q1. Just very straightforward probability here. People might get sloppy or misread something and get part of this wrong, but I don’t see it as worthy of any additional explanation.
  • Q2. How does memory scale in the number of variables? A. Exponentially. I’m embarrassed to say that I got this one wrong. Not quite understanding what was being asked, I just sort of guessed that the memory would scale linearly. However, if I had actually listened to the question I would have heard Thrun say, “… memory scale… for our histogram based localization method.” From that phrase, I could have worked out that with 1 variable you have m buckets. For 2 variables, you have m divisions for the first variable and m for the second, so m^2 buckets. For 3 variables, m divisions for each variable, so m^3 buckets… for n variables you need m^n buckets. In other words, the number of buckets scales exponentially with the number of variables. Thrun gave a good explanation in the response video. I recommend taking a look at it if you are unfamiliar with big O notation.
  • Q3. A Bayes rule question. This question was a matter of applying the formula and plugging in the values. It sounds easier than it is. It’s easy to miss a step or plug in the wrong value. For those that got this question wrong, my recommendation is to redo all of the quiz questions on Bayes and listen to Thrun’s explanations. It should be just a matter of straightforward algebra once you understand how to find the correct values.
  • Q4. My favorite question by far. It really allowed me to practice my Python and achieve some comfort in the standard language constructs. (I’m using far fewer semicolons.) I did get this marked correct, but I like Thrun’s solution more. It is far more elegant and Pythonic. I’ll need to gain more familiarity with the syntactic sugar of Python, for example aux = [[0 for row in p[0]] for col in p].
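For the Q3 Bayes rule mechanics, here is a minimal sketch of the plug-in-the-values computation. The numbers are made up for illustration; they are not HW1’s actual values.

```python
# Bayes rule: P(A|B) = P(B|A) * P(A) / P(B),
# with P(B) expanded by the law of total probability.
def bayes(prior, p_obs_given_true, p_obs_given_false):
    """Posterior P(true | observation)."""
    p_obs = p_obs_given_true * prior + p_obs_given_false * (1.0 - prior)
    return p_obs_given_true * prior / p_obs

# e.g. prior 0.5 of being on a red cell, sensor correct 80% of the time:
posterior = bayes(0.5, 0.8, 0.2)
print(posterior)   # 0.8
```

Most of the errors people describe come from the P(B) denominator, so computing it explicitly as its own line helps.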
def sense(p, Z):
    # Measurement update: scale each cell by the likelihood of seeing Z
    # there, then normalize so the distribution sums to 1.
    q = []
    for i in range(m):
        q.append([])
        for j in range(n):
            factor = sensor_right
            if colors[i][j] != Z:
                factor = 1. - sensor_right
            q[i].append(p[i][j] * factor)

    q_sum = 0.
    for row in q:
        q_sum += sum(row)

    # normalize
    for i in range(m):
        for j in range(n):
            q[i][j] = q[i][j] / q_sum

    return q


def move(p, motion):
    # Motion update on a cyclic grid: each cell keeps its own mass if the
    # move failed (1 - p_move) and receives mass from the cell 'motion'
    # behind it if the move succeeded (p_move).
    q = []
    for i in range(m):
        q.append([])
        for j in range(n):
            failed_to_move = p[i][j] * (1. - p_move)
            moved = p[(i - motion[0]) % m][(j - motion[1]) % n] * p_move
            q[i].append(failed_to_move + moved)

    return q


# Start with a uniform prior over the m x n grid.
p = []

m = len(colors)
n = len(colors[0])

uniform_p = 1./float(m*n)

for row in colors:
    p_row = []
    p.append(p_row)
    for cell in row:
        p_row.append(uniform_p)

# Alternate motion and measurement updates.
for i in range(len(motions)):
    p = move(p, motions[i])
    p = sense(p, measurements[i])
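The loop-based uniform initialization above can also be written in the comprehension style Thrun favors. A minimal sketch, using a small sample colors grid of my own rather than the homework’s:

```python
# Uniform prior over an m x n grid, comprehension style. This is
# equivalent to the append-based loop initialization, just terser.
colors = [['R', 'G', 'G'],
          ['G', 'R', 'R']]   # sample grid for illustration

m, n = len(colors), len(colors[0])
p = [[1. / (m * n) for col in range(n)] for row in range(m)]
```

The inner comprehension builds one row; the outer one builds the list of rows, which is exactly the shape the aux = [[0 for row in p[0]] for col in p] idiom produces.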

Conclusion

Awesome, great job. I would say I’m looking forward to Unit 2, but I’ve already completed the lectures. So, here’s to Unit 3!

Note: There was also one strange new thing that happened. The time scrubber on the videos stopped showing where I was in the video. I’m not sure if this is a Udacity bug, a YouTube bug, or something with the browser, but it is very strange.

This entry was posted in Machine Learning, Online Learning. Bookmark the permalink.
