Support Vector Machines (SVM) - Part 1 - Linear Support Vector Machines

In this lesson we look at Support Vector Machine (SVM) algorithms, which are used in classification.
Support Vector Machine (SVM) Part 2: Non Linear SVM • Non Linear Support Vec...
Videos on Neural Networks
Part 1: • Artificial Neural Netw... (Single Layer Perceptrons)
Part 2: • Artificial Neural Netw... (Multi Layer Perceptrons)
Part 3: • Artificial Neural Netw... (Backpropagation)
More Free Video Books: scholastic-videos.com/

Comments: 68

  • @muhammadkamranjaved2424 · 6 years ago

    Thank you so much for this explanation. It cleared about 80% of the concept for me...

  • @bhavyar2986 · 4 years ago

    Amazing! :)))) Loved it... didn't find any other content like this.

  • @lucasscott6163 · 10 years ago

    Part 1 is great. Could you please upload part 2 also. I would be very thankful to you.

  • @RahulKumar-dj8bt · 3 years ago

    This was great. Please provide more worked solutions on Naive Bayes, Linear and Logistic Regression; very helpful.

  • @Jaro966 · 5 years ago

    Thanks, the best lesson I've found.

  • @aravindcheruvu2671 · 8 years ago

    Superb explanation !!!

  • @jobmathtubebd · 6 years ago

    Nice explanation... thanks, bro.

  • @SuccessfulChan · 7 years ago

    Please explain how you calculate the support vectors from the data set. How do you determine which points are the support vectors?

  • @anandachetanelikapati6388 · 5 years ago

    Thanks. It eases out the terminology: kernel, Lagrangian multiplier, vector algebra, etc. The procedure appears to work only on trivial cases, though; I've tried it with a few points and it doesn't seem to yield correct results.

  • @ghaliahmed · 9 years ago

    10000000000 thanks. It's very useful.

  • @anushreepriyadarshini2036 · 9 years ago

    I have one confusion regarding the offset. Why are you changing the sign of the offset?

  • @akankshapande8366 · 7 years ago

    Best explanation. Very useful. Can you please tell me how to plot support vectors in text classification?

  • @toutankhadance · a year ago

    Excellent !

  • @malaraj9184 · 10 years ago

    Thanks for the video... is there a Part II?

  • @jordanmakesmaps · 5 years ago

    When using four support vectors, the s4 point has a bias of zero; was that intentional or accidental?

  • @mohammedshaker13 · 3 years ago

    I see that classifying these new points must use the bias offset, as follows: W.X + b... Does this affect the final result?

  • @graemelyon · 7 years ago

    How do you draw the boundary from W? You mention W(3) is the bias, but what do W(1) and W(2) represent?

  • @gyanprabhat3120 · 4 years ago

    How did you determine that there will be only three support vectors? Why not 4 or 2? And what do you do when you don't know which vectors are support vectors? How do you proceed?

  • @philipscanlon2268 · 9 years ago

    Great simplified explanation but, at about 18 mins, when describing the 4-support-vector version, why is S4 = [3 1 0] and not [3 1 1]? Many thanks.

  • @rainclouds4346 · 2 years ago

    Can you please explain where the equation a1 S1.S1 + a2 S2.S2 etc. was derived from? I haven't found anyone who could explain that.

  • @lvuindia3507 · 2 years ago

    Please make a video on how we calculate the alpha values for each sample in the training set.

  • @congphuocphan · 3 years ago

    At 2:16, could you explain how to select the support vectors S1, S2, S3 in a computational approach? We can recognize them by observation, but to do it automatically I think we need a way to define which points should be considered support vectors among the many data points.

  • @SwavimanKumar · 3 years ago

    Did you find an answer to this elsewhere? I have the same doubt. Can you help, please?

  • @vincyjoseph5893 · 5 years ago

    The equation I found for w is multiplied by the output yi. Why is that not considered here?

  • @joshuac9142 · a year ago

    How do you mathematically find the support vectors?

  • @edgesvm9331 · 10 years ago

    Great video. But I do have a question... how did you get x2 = 1 for w = [0, 1, -1]? I understand the 1 but not the x2. Thank you.

  • @edgesvm9331 · 10 years ago

    Never mind... I get what you mean by x2.

  • @emilyhuang2759 · 6 years ago

    Is this how you compute the canonical weights?

  • @CEDMAYAKUNTLAPRASANNAKUMAR · 3 years ago

    Sir, you took 1 and -1 because there are two classes. Suppose we have three classes; what values should we take?

  • @jesbuddy07 · 9 years ago

    I know the support vectors are the ones closest to the class boundary; that is still observable in 2-dimensional space. If the data is in 4-, 5- or 6-dimensional space, how do you find the support vectors?

  • @hamidawan687 · 2 years ago

    The fundamental question is still unanswered by every tutor I have found so far: how do you assume/specify/determine the support vectors? It is of course implausible to pick support vectors just by visualizing the data points. How can that work in multi-dimensional space? Please guide me. I need to program SVMs from scratch, without using any library, in Matlab or .Net.

  • @armylove9733 · 2 years ago

    As per my understanding, the support vectors are the ones that lie on the hyperplane: you first fit a plane between the two datasets, and whichever points satisfy that plane's equation are called support vectors.

  • @ghaliahmed · 6 years ago

    I'd like to thank you.

  • @WahranRai · 8 years ago

    The main problem is to find the support vectors. You must give the general algorithm for finding them!!! In an algorithm we don't have a visual in front of us.

  • @shubhamrathi3734 · 8 years ago

    Why the need to augment the vectors? Isn't augmentation usually done to project points to a higher dimension, to make them linearly separable? These points are separable without augmentation. Could you also tell us how we get the 3 equations at 6:19?

  • @TheGr0eg · 8 years ago

    +Shubham Rathi I think he just does it to simplify the expression w.x + b to just w~.x~, where x_0 = 1 and w_0 = b.
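A minimal sketch of this augmentation trick (the weight, bias, and sample values below are hypothetical, not taken from the video):

```python
# Sketch: augmenting each vector with a constant 1 folds the bias b into
# the weight vector, so w.x + b becomes a single dot product w~.x~
# with x~ = (x, 1) and w~ = (w, b).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

w, b = [1.0, 0.0], -3.0        # hypothetical weight vector and bias
x = [2.0, 1.0]                 # a sample point

plain = dot(w, x) + b          # the usual form: w.x + b
x_aug = x + [1.0]              # x~ = (x, 1)
w_aug = w + [b]                # w~ = (w, b)
augmented = dot(w_aug, x_aug)  # the augmented form: w~.x~

print(plain, augmented)        # both give -1.0
```

The two forms are identical for every point, which is why the lesson can work with augmented vectors throughout and read the bias back off as the last component of w~.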

  • @shubhamrathi3734 · 8 years ago

    +TheGr0eg You're right :) How did he get the 3 equations at 6:16?

  • @TheGr0eg · 8 years ago

    +Shubham Rathi He does not derive them. They are basically a result of the mathematical theory of SVMs that he does not go into here. I don't really feel comfortable explaining that, though, because I have not yet understood it completely myself.

  • @maged4087 · 2 years ago

    Why did you add the bias 1?

  • @mohsintufail5334 · a year ago

    Can anybody tell me how we find the values of the alphas?

  • @aimalrehman3657 · 4 years ago

    1. Why add the bias in the first place? 2. Why is the bias 1?

  • @mikek8329 · 9 years ago

    Why did you choose the bias to be 1?

  • @pritomdasradheshyam2154 · 4 years ago

    You can choose any other positive constant too... it doesn't affect much, so we choose the simplest term.

  • @akrsrivastava · 7 years ago

    Why does w = {1,0} mean a vertical line and {0,1} a horizontal line? When defining the first support vector S1, it was represented as S1 = {2,1} because it lay at (2,1) on the coordinate axes. Shouldn't the same interpretation apply to w = {1,0}, meaning that the straight line passes through (1,0) on the coordinate axes?

  • @bobwylie158 · 3 years ago

    Did you ever get an answer to this?

  • @aritraroy3220 · a year ago

    At 13:46, if w = (1,0), why is it a vertical line???

  • @circuithead94 · 5 years ago

    The vector should be horizontal for w = (1,0).
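One way to see why w = (1, 0) gives a vertical boundary is that w is the normal of the line w.x + b = 0, not a point on it. A small sketch with hypothetical values of w and b:

```python
# Sketch (assumed values, not from the video): w is the NORMAL of the
# boundary w.x + b = 0, not a point on it. With w = (1, 0) the equation
# reduces to x1 + b = 0, i.e. the vertical line x1 = -b, even though
# the arrow (1, 0) itself points horizontally.
w, b = (1.0, 0.0), -3.0

# Every point (3, y) satisfies the equation, for any y: a vertical line.
for y in (-2.0, 0.0, 5.0):
    assert w[0] * 3.0 + w[1] * y + b == 0.0

# A point off the line x1 = 3 does not satisfy it.
assert w[0] * 4.0 + w[1] * 0.0 + b != 0.0
print("w = (1, 0) gives the vertical line x1 = 3")
```

So the arrow w points perpendicular to the boundary; interpreting w = {1,0} as a point the line passes through is the confusion in this thread.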

  • @anushreepriyadarshini2036 · 9 years ago

    If there are more than 4 support vectors, how will we solve the equations manually?

  • @MrSchwszlsky · 5 years ago

    You can use matrix methods or linear programming, I think.

  • @RabindranathBhattacharya · 3 years ago

    Why has the bias been taken as 1? Won't the result change if the bias is changed?

  • @maged4087 · 2 years ago

    I have the same question; did you find the answer?

  • @nikhilphadtare7662 · 3 years ago

    It's "tilde", not "tidle". Good video.

  • @parasiansimanungkalit9876 · 3 years ago

    I hope this will end my thesis revision 😭😭😭

  • @thepurgenight1136 · 6 years ago

    This video is really helpful to my teacher; she knows nothing but copying notes from youtube... So if you are watching my comment: "Sudhar Jao" (shape up).

  • @homevideotutor · 6 years ago

    Thanks, sir.

  • @swatiantv · 9 years ago

    Please explain how I can compute ᾱ1 = ᾱ2 = -3.25 and ᾱ3 = 3.5???????

  • @homevideotutor · 9 years ago

    Your question refers to time stamp 9:30 in the video. These are 3 simultaneous equations. First eliminate alpha1: Eq 1 x 2 - Eq 2 x 3, and Eq 2 x 9 - Eq 3 x 4. Then use the resulting two equations to eliminate alpha2 and find alpha3. Then substitute alpha3 to find alpha2, and then alpha1. Or simply use Cramer's rule: www.purplemath.com/modules/cramers.htm. Hope you got it.
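The Cramer's-rule route can be sketched in code. The augmented support vectors below (s1 = (2, 1, 1), s2 = (2, -1, 1), s3 = (4, 0, 1)) are an assumption on my part, reconstructed rather than quoted from the video, but they reproduce the ᾱ values mentioned in the question (ᾱ1 = ᾱ2 = -3.25, ᾱ3 = 3.5):

```python
# Sketch: solving the three simultaneous equations for the alphas with
# Cramer's rule. The augmented support vectors are assumed, not quoted
# from the video.

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def cramer3(A, rhs):
    """Solve the 3x3 system A x = rhs via Cramer's rule."""
    d = det3(A)
    out = []
    for col in range(3):
        Ac = [row[:] for row in A]      # copy A, then swap in the rhs column
        for r in range(3):
            Ac[r][col] = rhs[r]
        out.append(det3(Ac) / d)
    return out

s = [[2, 1, 1], [2, -1, 1], [4, 0, 1]]  # assumed augmented support vectors
labels = [-1, -1, 1]                    # s1, s2 negative class; s3 positive

# Equation i reads: sum_j alpha_j * (s_j . s_i) = label_i
gram = [[sum(a * b for a, b in zip(sj, si)) for sj in s] for si in s]
alphas = cramer3(gram, labels)
print(alphas)                           # [-3.25, -3.25, 3.5]

# The augmented weight vector is the alpha-weighted sum of the vectors:
w = [sum(al * sv[k] for al, sv in zip(alphas, s)) for k in range(3)]
print(w)                                # [1.0, 0.0, -3.0] -> w = (1, 0), b = -3
```

The resulting w = (1, 0) with bias -3 is the vertical line x1 = 3, consistent with the other threads here about w = (1, 0).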

  • @swatiantv · 9 years ago

    @homevideotutor Thank you so much.

  • @saifuddinabdullah7286 · 9 years ago

    @homevideotutor Sir, can you please write a little about how to select the support vectors to begin with? You selected three support vectors; how did you select them? Can we pick random vectors from each class and consider them our support vectors?

  • @homevideotutor · 9 years ago

    They are the closest points to the class boundary.
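As a rough illustration of "closest points to the class boundary" (the boundary and toy data below are made up, not taken from the video):

```python
# Sketch with made-up data: given a candidate boundary w.x + b = 0, the
# support vectors are the points of each class at the smallest
# perpendicular distance |w.x + b| / ||w||.
import math

def distance(p, w, b):
    return abs(sum(wi * xi for wi, xi in zip(w, p)) + b) / math.hypot(*w)

w, b = [1.0, 0.0], -3.0                        # hypothetical boundary: x1 = 3
neg = [(2.0, 1.0), (2.0, -1.0), (0.0, 0.0)]    # hypothetical class -1 points
pos = [(4.0, 0.0), (5.0, 1.0), (6.0, -1.0)]    # hypothetical class +1 points

closest_neg = min(neg, key=lambda p: distance(p, w, b))
closest_pos = min(pos, key=lambda p: distance(p, w, b))
print(closest_neg, closest_pos)                # (2.0, 1.0) (4.0, 0.0)
```

Note the caveats raised elsewhere in this thread still apply: several points can tie at the minimal distance (here (2, -1) ties with (2, 1)), and a real solver finds the boundary and the support vectors jointly through the optimization, not by this two-step picking.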

  • @volkanaltuntas1051 · 6 years ago

    How can you determine the class boundary?

  • @guoyixu5793 · 3 years ago

    Only a few special cases are covered, where the solution is a line parallel to either the x or the y axis. I don't know whether this solution works for other, more general cases. Actually, I think the solution form is not correct: it is not the formulation you get if you derive the optimization solution from the KKT conditions. The w vector is a linear combination of the support vectors, but the augmented w vector is not, in general, a linear combination of the augmented support vectors. At least I think so now; I can't prove that they are equivalent, so I believe the solution in this video is wrong and just happens to work for the special cases shown. If someone can prove that the solution in the video is correct, please correct me. The correct solution is in the MIT video here: kzread.info/dash/bejne/kYSrysuQqKuxaNI.html

  • @prasad9012 · 6 years ago

    It would be nice if you spoke a bit louder.

  • @kelixoderamirez · 3 years ago

    Permission to learn, sir.

  • @homevideotutor · 10 years ago

    www.scientific.net/AMM.534.137

  • @AmdavadiHitman · 8 years ago

    Great video! I would suggest not speaking so softly; it's sometimes really difficult to catch what you say. :p Thanks! :D