Lecture 06 - SIFT - 2014

Science and Technology

Comments: 29

  • @joshmiller5703
    5 years ago

    Take a shot every time he tries to watch Naruto during the lecture

  • @soumalyasahoo2664
    4 years ago

    If you are struggling to understand this video, go through his 2012 lecture on SIFT and come back.

  • @nioncao
    4 years ago

    Best SIFT introduction on the internet. If you don't agree, show me a better one.

  • @TheBirdBrothers
    8 years ago

    amazing series, thank you for sharing them.

  • @jibran6635
    3 years ago

    His lecture makes far more sense when students start asking questions persistently. Otherwise, it's easy to get lost in the details.

  • @SunilMeena-do7xn
    3 years ago

    Great explanation

  • @hamedalsufyani2628
    9 years ago

    That is an amazing lecturer. I like the clarity of your speech, and it is a really useful presentation. Thank you very much, prof.

  • @DIYGUY999
    6 years ago

    I am a bit confused. In the butterfly example you told us to select the scale which gives the maximum response, and now in the DoG we look at 26 neighbours and select an extremum. How do these two things connect?
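
On the question above: the two steps are really the same idea. Picking the scale with the maximum response, as in the butterfly example, is what the 26-neighbour test does locally: a DoG sample is kept only if it beats its 8 spatial neighbours at its own scale plus the 9 + 9 neighbours at the adjacent scales, so the scale of an accepted extremum is precisely the one where the response peaks. A minimal sketch, assuming a DoG stack dog of shape (num_scales, H, W) and an interior sample index; the function name and the contrast threshold are illustrative only.

```python
import numpy as np

def is_scale_space_extremum(dog, s, y, x, contrast_threshold=0.03):
    """Return True if dog[s, y, x] is an extremum of its 26 neighbours
    in the 3x3x3 space-and-scale neighbourhood.

    dog: difference-of-Gaussian stack of shape (num_scales, H, W);
    s, y, x must index an interior sample (not on any border).
    """
    value = dog[s, y, x]
    if abs(value) < contrast_threshold:                  # drop weak responses early
        return False
    cube = dog[s - 1:s + 2, y - 1:y + 2, x - 1:x + 2]    # centre plus its 26 neighbours
    if value > 0:
        return value >= cube.max()                       # local maximum over space and scale
    return value <= cube.min()                           # local minimum over space and scale
```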

  • @stoka43
    6 years ago

    In the extraction of local image descriptors, it is mentioned to "store numbers in a vector". Which numbers are being referred to here?

  • @feyilon
    5 years ago

    Thanks a lot, very well explained... It helps a lot...

  • @ner30
    7 years ago

    I don't get the usage of the Taylor series expansion. Why is this done?
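
In case it helps: in Lowe's paper the Taylor expansion fits a quadratic to the DoG function around each detected extremum, which lets the extremum be interpolated to sub-pixel and sub-scale accuracy and gives an interpolated contrast value for rejecting weak keypoints. A rough sketch of that step, assuming grad holds the three first derivatives (dD/dx, dD/dy, dD/dsigma) and hessian the 3x3 matrix of second derivatives at the sampled extremum; the function name is just for illustration.

```python
import numpy as np

def refine_extremum(value, grad, hessian):
    """Second-order Taylor fit D(x) ~ value + grad.x + 0.5 * x'Hx around a
    detected extremum; returns the sub-pixel offset and interpolated value."""
    # The quadratic has zero derivative at the true extremum:
    # grad + H * offset = 0  =>  offset = -H^{-1} * grad
    offset = -np.linalg.solve(hessian, grad)
    # Interpolated DoG value at the refined location, used to reject
    # low-contrast keypoints.
    refined_value = value + 0.5 * float(grad @ offset)
    # If the offset exceeds half a sample spacing in any dimension, the
    # extremum really lies closer to a neighbouring sample.
    converged = bool(np.all(np.abs(offset) <= 0.5))
    return offset, refined_value, converged
```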

  • @coffle1
    8 years ago

    How can we match interest point gradient histograms if, when we encode them into the 128-element vector, we could end up with the same numbers but with the x_i's permuted? So essentially what I'm asking is: if we somehow have a gradient vector of [1;0] for an interest point in one image, and the corresponding interest point in an identical but rotated image puts the 0 first so that the vector is [0;1], how can we find a way to match these up?

  • @thanaphontangchoopong1449
    4 years ago

    It's late, but (it might be useful for others) the key is that it is a "relative histogram" with respect to the highest-magnitude orientation of the keypoint under consideration.
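
A toy sketch of what that answer means (not the exact implementation from the lecture): every gradient angle in the 16x16 patch is measured relative to the keypoint's dominant orientation before it is binned, and the 4x4 grid of 8-bin histograms is laid out in that rotated frame, so an identical but rotated patch produces the same 128 numbers rather than a permuted vector; those 128 histogram entries are also the "numbers stored in a vector" asked about earlier. The array names and shapes below are assumptions for illustration.

```python
import numpy as np

def sift_like_descriptor(magnitudes, orientations, keypoint_orientation):
    """Build a 4x4x8 = 128-element descriptor from a 16x16 patch.

    magnitudes, orientations: gradient magnitude and angle (radians) on a
    16x16 grid already sampled in the keypoint's rotated frame;
    keypoint_orientation: the keypoint's dominant orientation (radians).
    """
    # Measure every gradient angle relative to the keypoint orientation,
    # which is the "relative histogram" mentioned in the reply above.
    relative = (orientations - keypoint_orientation) % (2 * np.pi)
    bins = np.floor(relative / (2 * np.pi / 8)).astype(int) % 8

    descriptor = np.zeros((4, 4, 8))
    for cy in range(4):                                  # 4x4 grid of 4x4-pixel cells
        for cx in range(4):
            cell_bins = bins[4 * cy:4 * cy + 4, 4 * cx:4 * cx + 4]
            cell_mag = magnitudes[4 * cy:4 * cy + 4, 4 * cx:4 * cx + 4]
            for b in range(8):                           # magnitude-weighted 8-bin histogram
                descriptor[cy, cx, b] = cell_mag[cell_bins == b].sum()

    vec = descriptor.ravel()                             # the 128 numbers stored in a vector
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec               # normalise for illumination invariance
```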

  • @ayushdayani7157
    3 years ago

    He was vague during the initial part, the further outlier rejection, and the scale-space peak detection. After watching the video for the 3rd time I am still not able to understand it.

  • @nkhullar1
    4 years ago

    Thank you * infinity for this lecture.

  • @MonkkSoori
    4 years ago

    How is the orientation of the keypoint used? He explained making the 128-dimensional vector as a descriptor for keypoint matching, and he said previously that the orientation of the keypoint is for rotation invariance, but is the direction of the keypoint added to the 128-D vector? Where is it stored, etc.?

  • @MonkkSoori
    4 years ago

    It's still not super clear but I found the answer in the original paper: "In order to achieve orientation invariance, the coordinates of the descriptor and the gradient orientations are rotated relative to the keypoint orientation. For efficiency, the gradients are precomputed for all levels of the pyramid as described in Section 5."

  • @MW-yz8rm
    8 years ago

    I like this version better.

  • @soumalyasahoo2664
    4 years ago

    Can somebody explain why k^2*sigma is higher in the 2nd level than in the 1st level?

  • @zhengqianyu7913
    4 years ago

    I have a similar question; if you have solved yours, please leave me a comment.
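
In case this is still open: within an octave the levels are blurred with sigma, k*sigma, k^2*sigma, ..., where k = 2^(1/s) > 1, so each level simply carries more blur than the one below it; after s levels the blur has doubled and the next octave restarts on a downsampled image. A small numerical sketch; sigma = 1.6 and s = 3 are only example values, not read off the slides.

```python
# Illustrative scale-space parameters within one octave.
sigma, s = 1.6, 3
k = 2 ** (1.0 / s)                             # k > 1, so every level adds blur

levels = [sigma * k ** i for i in range(s + 1)]
print([round(level, 2) for level in levels])   # [1.6, 2.02, 2.54, 3.2]
# k^2*sigma > k*sigma > sigma simply because k > 1; after s levels the blur
# has doubled, and the next octave continues on a half-resolution image.
```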

  • @roar363
    8 years ago

    Why is the content of this lecture exactly the same as the last one (lecture 5)?

  • @hoanghieu6389
    8 years ago

    I don't think so; lecture 5 is about pyramids.

  • @mouadmorabit3618
    8 years ago

    Pay attention to the year; there is a playlist for 2012 and one for 2014 (Chapter 5 - SIFT in the 2012 version).

  • @roar363
    8 years ago

    Mouad Morabit yeah, I realized that later.

  • @awais_latif
    4 years ago

    He is not clarifying anything, just reading the slides.

  • @nkhullar1
    4 years ago

    What the hell? He is the best teacher... please point me to a better lecture than this.

  • @debangshasarkar5137
    4 years ago

    The presentation is horrible

  • @nkhullar1
    4 years ago

    Really? How? Please point me to a lecture better than this.
