[CVPR20 Tutorial] Billion-scale Approximate Nearest Neighbor Search
Science & Technology
[CVPR20 Tutorial] Image Retrieval in the Wild
matsui528.github.io/cvpr2020_...
Billion-scale Approximate Nearest Neighbor Search
Yusuke Matsui
slide: speakerdeck.com/matsui_528/cv...
Comments: 10
Very well explained! Thank you!
This presentation is great. I love it! It's a fast way to understand the summary of the state of the art of large scale search.
Very well explained topic, and a great presentation as well, with nice colors for the hashing function and the visual on the coarse-to-fine graph slide. Thanks so much!
Thank you for a great summary!
Thank you for your explanation!
Thank you! That is awesome.
Thank you so much.
I expect that we can do better than k-means clustering for dimension reduction and coarse quantization, at least for image data, by smart feature extraction using transfer learning, e.g., ResNet50 as the main body plus a VAE as the head of the network for smart dimension reduction. Also, use built-in tensor quantization on the short vector, or build your own quantizer that is differentiable (for the purpose of backprop) from custom sequential ReLU activations that start at zero (standard ReLU), then 1, 2, 3, ... until the entire 8-bit range (i.e., 256) or 16-bit range (i.e., 64k) is swept. Consequently, no fine search is needed, just direct addressing to the correct hash bucket and pulling an item out of its list. Or you can search the fine list in other ways, like PQ or LSH.
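For context, the k-means coarse quantization that this comment proposes to improve on is the standard inverted-file (IVF) building block in billion-scale ANN systems: cluster the database with k-means, keep one inverted list per centroid, and at query time probe only the nearest cell(s). A minimal NumPy sketch (variable names like `ivf` are illustrative, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 16))  # database vectors
k = 8                                # number of coarse cells

# Plain Lloyd's k-means; a few iterations suffice for a sketch
centroids = X[rng.choice(len(X), k, replace=False)]
for _ in range(10):
    # assign each vector to its nearest centroid
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    assign = d.argmin(axis=1)
    # move each centroid to the mean of its cell
    for c in range(k):
        if (assign == c).any():
            centroids[c] = X[assign == c].mean(axis=0)

# Inverted lists: cell id -> indices of the vectors in that cell
ivf = {c: np.flatnonzero(assign == c) for c in range(k)}

# At query time, probe only the nearest cell instead of scanning all of X
q = rng.standard_normal(16)
cell = int(((centroids - q) ** 2).sum(-1).argmin())
candidates = ivf[cell]
```

In practice several cells are probed (`nprobe > 1`) and the candidates are re-ranked with PQ codes, as the talk describes.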
At 11:44, shouldn't it be dist = q_norms[m] + x_norms[n] - **2** * ip[m][n]? That is, shouldn't ip[m][n] be multiplied by 2, to match the formula at the top of the slide?
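The commenter is right that the factor of 2 is required: expanding the squared Euclidean distance gives ||q - x||^2 = ||q||^2 + ||x||^2 - 2<q, x>. A small NumPy check of the identity (array names follow the slide's notation; shapes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))  # query vectors
X = rng.standard_normal((5, 8))  # database vectors

# Terms of the expansion ||q - x||^2 = ||q||^2 + ||x||^2 - 2 * <q, x>
q_norms = (Q ** 2).sum(axis=1)   # shape (3,)
x_norms = (X ** 2).sum(axis=1)   # shape (5,)
ip = Q @ X.T                     # inner products, shape (3, 5)

# dist[m][n] = q_norms[m] + x_norms[n] - 2 * ip[m][n]
dist = q_norms[:, None] + x_norms[None, :] - 2 * ip

# Sanity check against the direct pairwise computation
direct = ((Q[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
assert np.allclose(dist, direct)
```

Dropping the factor of 2 would make the two computations disagree, which confirms the comment's reading of the formula.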
Thank you, Professor Matsui!