Support Vector Machine - Georgia Tech - Machine Learning

Check out the full Machine Learning course for free at: www.udacity.com/course/ud262
Georgia Tech online Master's program: www.udacity.com/georgia-tech

Comments: 51

  • @fani5000
    5 years ago

    This format of learning through a dialog like this is fantastic! Thanks for posting :)

  • @hamed7600
    7 years ago

    These two guys are hilarious! Eases the pain of learning :D

  • @leif1075

    3 years ago

    Why is learning a pain?

  • @DanielRodrigues-bx6lr

    3 years ago

    @@leif1075 According to this video (kzread.info/dash/bejne/jGaCl8NwiL29l5s.html), learning something new activates stress responses in the brain. I can't speak to the veracity of that claim, but the video seems good.

  • @guruphiji
    7 years ago

    The data revisionist and invisible point joke is hilarious! These two guys make it easier to understand! Very good idea to have this noob-expert exchange to make it more accessible!

  • @tehuatzi
    7 years ago

    Michael's comment at 2:32-2:47 was super helpful, esp. for folks with weaker linear algebra backgrounds.

  • @moazzamali3587

    5 years ago

    Can you please explain? I wasn't able to understand it.

  • @Nupur2308
    3 years ago

    Awesome video. This is one of the most intuitive explanations of SVM I've seen so far. And I'm coming from Andrew NG's course where he went about explaining it in a very roundabout way and I didn't grasp anything.

  • @vtrandal
    a year ago

    Excellent. Thank you for making this video. I needed to know about hyperplanes in the context of support vector machines. And you nailed it.

  • @tymothylim6550
    3 years ago

    Thank you very much for this video! It helps me get introduced to the conceptual understanding behind SVM!

  • @SatishSharma-ff5ug
    3 years ago

    Awesome and creative way to present by two genuine persons

  • @leiyin5544
    8 years ago

    good explanation!

  • @joshfann8495
    5 years ago

    I like the two-man system where one guy acts like he barely has any idea of what's going on (just like me).

  • @sevilaybayatl6315
    4 years ago

    Thanks, very useful information!

  • @NikolaRJK1
    7 years ago

    One question: Since the gray lines are different, shouldn't they have different b's (intercepts)? Like b1 and b2

  • @brianvaughan633

    6 years ago

    The b is the intercept of the yellow line, where wx + b = 0. When they move to the gray lines, they set the expression equal to -1 and 1, but b itself remains constant.

  • @yanshi9071
    7 years ago

    I still can't figure out why it equals 1 or -1 for points on those two lines. If that's the case, what is the equation for points that don't lie on the lines? Is it something like w*u + b > 1, and where does that come from?

  • @lobbielobbie1766

    7 years ago

    Because in the beginning of this lecture, he mentioned that this is a binary classification problem where the class variable has 2 levels i.e. + and - so he used +1 for + level and used -1 for - level. In this sense, the top grey line will be wX+b = 1 and all the + data points above this line are in the zone wX+b >= 1. Same applies to the bottom grey line. Hope this helps as I am also still learning.
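The labeling convention described in that reply can be sketched in a few lines of NumPy. The w and b below are hypothetical values, not from the video; they are chosen only so that two sample points land exactly on the ±1 margin lines:

```python
import numpy as np

# Decision boundary: w.x + b = 0; gray margin lines: w.x + b = +1 and -1.
w = np.array([1.0, 1.0])   # hypothetical weight vector
b = -3.0                   # hypothetical intercept

def classify(x):
    """Label a point +1 or -1 by which side of the hyperplane it falls on."""
    return 1 if np.dot(w, x) + b >= 0 else -1

x_plus = np.array([2.0, 2.0])    # on the top margin: w.x + b = 2 + 2 - 3 = 1
x_minus = np.array([1.0, 1.0])   # on the bottom margin: w.x + b = 1 + 1 - 3 = -1

print(classify(x_plus), classify(x_minus))   # 1 -1
```

Any + point above the top line gives w.x + b >= 1, and any - point below the bottom line gives w.x + b <= -1, which is the zone description in the comment above.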

  • @yuhaooo8143
    5 years ago

    Why is the top line w^T x + b = 1? I mean, shouldn't it be equal to the distance between the plane and the first positive point?

  • @anarbay24

    3 years ago

    because they want to classify positive points as +1 in order to distinguish the cases

  • @wilfredomartel7781
    7 years ago

    Excelente... Where can i find more videos from the same topic?

  • @iamdurgeshk

    7 years ago

    Udacity

  • @anarbay24

    3 years ago

    Check out the Georgia Tech Udacity course. You will not regret it!

  • @johnmichaelkovachi3338
    6 years ago

    Why do we transpose w in the equation?

  • @lightningblade9347

    6 years ago

    I have the same question. I can't understand how he wants to explain SVM while ignoring the most crucial part: why is w transposed in the equation?

  • @vishweshnayak2331

    6 years ago

    w is a vector (a matrix of n rows and one column) representing the parameters, and x is another vector (n*1 again) representing a point. To multiply them, since we know we can't multiply two matrices of dimensions n*1, and y has to be a single value (or a 1*1 matrix), multiplying w transpose with x gives you a 1*1 matrix which is a summation of the product of every value of the vector w with the corresponding value in the vector x.
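The shape argument in that reply is easy to verify in NumPy. The numbers below are hypothetical, just to make the shapes concrete: w and x are both n×1 column vectors, so w @ x is undefined, while w.T @ x is (1×n)(n×1) = 1×1, a single number:

```python
import numpy as np

# w and x as explicit n x 1 column vectors (hypothetical values).
w = np.array([[1.0], [2.0], [3.0]])   # shape (3, 1)
x = np.array([[4.0], [5.0], [6.0]])   # shape (3, 1)

# w @ x would raise an error: (3, 1) x (3, 1) shapes don't align.
y = w.T @ x   # (1, 3) x (3, 1) -> (1, 1)

print(y.shape, y[0, 0])   # (1, 1) 32.0, i.e. 1*4 + 2*5 + 3*6
```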

  • @BonsiownsGADU

    6 years ago

    You rock

  • @fugangdeng4423
    2 years ago

    what does quiz mean at the end?

  • @peciarovazuli2370
    5 years ago

    Please make a video about SVR.

  • @anarbay24

    3 years ago

    what is SVR?

  • @user-pt1el8wc4d
    7 years ago

    The plane helps you to define the support vectors. The support vectors help you to define the plane. But when I start with only a set of points, with no support vectors or planes, what do I do? How would I know which is which?

  • @redsnow123456
    4 years ago

    Why is it w transpose? Why are we transposing the values of the feature matrix x?

  • @dileep31

    4 years ago

    w is a vector, or a matrix of dimension n*1. The convention in Machine Learning is that vectors are column vectors. So both w and x are column vectors, and to multiply the two as matrices we need to transpose one of them to a compatible shape. So we compute w.T * x, which is equivalent to the dot product of the two vectors.
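The equivalence in that reply can be checked directly. With 1-D NumPy arrays (hypothetical values below), the "w transpose times x" of the math is just the plain dot product:

```python
import numpy as np

# 1-D arrays: NumPy treats both np.dot and @ as the vector dot product here,
# so no explicit transpose is needed.
w = np.array([1.0, 2.0, 3.0])
x = np.array([4.0, 5.0, 6.0])

print(np.dot(w, x), w @ x)   # 32.0 32.0, both are w1*x1 + w2*x2 + w3*x3
```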

  • @saptarshi9433
    6 years ago

    It didn't mention the term "Support Vector Machine", apart from in the title of the graph.

  • @anarbay24

    3 years ago

    Udacity's system lets you learn new concepts little by little. If you want more information, check out the other videos. These guys have managed to make one of the hardest topics very easy to understand!

  • @paperstars9078
    4 years ago

    How do they make the hand and the pen transparent?

  • @anarbay24

    3 years ago

    I guess it is Udacity's own program. I searched for it for a long time, and a friend of mine working at Udacity told me it is their own in-house tool.

  • @16avnisharma
    6 years ago

    why transpose of W?

  • @lightningblade9347

    6 years ago

    same question

  • @JaskaranSingh-fj1iw

    5 years ago

    Because w and x are two vectors/matrices of the same dimensions. You would have to transpose one of them to multiply them.

  • @anarbay24

    3 years ago

    In order to have a valid multiplication, you need to transpose the w matrix.

  • @tsunghan_yu
    5 years ago

    0:54 I'm going to fix that... by putting a minus sign here LMAO

  • @anarbay24

    3 years ago

    hahaha they are just awesome

  • @raise7935
    6 years ago

    lol

  • @aliparcheforosh4895
    2 years ago

    Not useful at all!