3. Codebook optimization
We have implemented the Incremental Codebook Optimization algorithm described in Algorithm 4.1 of Wang et al. (2010). This algorithm optimizes a codebook specifically for LLC. It promises a further performance improvement in the LLC pipeline over one that uses a dictionary learned through K-Means.
Instead of a joint optimization of the codebook and the encoding of feature descriptors, the authors propose an online learning algorithm that iterates over the set of feature descriptors, using one descriptor to update the codebook in each iteration.
We show the results of this dictionary learning method in the Experiments section.
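To make the update loop concrete, here is a minimal NumPy sketch of the incremental optimization as we read Algorithm 4.1. The function name, parameter names, and the default values of `lam`, `sigma`, and the 0.01 activation threshold are our own illustrative choices rather than values taken from the paper or our implementation; the coding step uses the analytical LLC solution, and the basis update is written in the dimension-consistent form discussed in the corrections below.

```python
import numpy as np

def incremental_codebook_optimization(X, B, lam=500.0, sigma=100.0, thresh=0.01):
    """Sketch of incremental codebook optimization (after Algorithm 4.1,
    Wang et al. 2010). Parameter names and defaults are illustrative.

    X : (D, N) feature descriptors, one per column
    B : (D, M) initial codebook (e.g. from K-Means), one basis per column
    """
    B = np.array(B, dtype=float, copy=True)   # do not modify the caller's codebook
    D, N = X.shape
    M = B.shape[1]
    for i in range(N):
        x = X[:, i]                                    # current descriptor (D,)

        # Locality-constrained coding of x against the full codebook B
        dist = np.sum((B - x[:, None]) ** 2, axis=0)
        d = np.exp(dist / sigma)
        d /= d.max()                                   # locality adaptor, normalized to (0, 1]
        Z = B - x[:, None]                             # shifted bases (D, M)
        C = Z.T @ Z                                    # data covariance (M, M)
        c = np.linalg.solve(C + lam * np.diag(d), np.ones(M))
        c /= c.sum()                                   # enforce the sum-to-one constraint

        # Keep only the bases with significant coefficients
        idx = np.where(np.abs(c) > thresh)[0]
        if len(idx) == 0:
            continue
        Bi = B[:, idx]                                 # local bases B_i (D, K)
        K = len(idx)

        # Refit the code on the local bases: C_i computed from B_i, not B
        Zi = Bi - x[:, None]
        Ci = Zi.T @ Zi
        c_tilde = np.linalg.solve(Ci + 1e-8 * np.eye(K), np.ones(K))  # small ridge added for stability (our addition)
        c_tilde /= c_tilde.sum()

        # Gradient step on the selected bases: (x_i - B_i c_tilde) c_tilde^T
        # has shape (D, K), matching B_i (the rearranged form of Line 11)
        resid = x - Bi @ c_tilde                       # (D,)
        delta = -2.0 * np.outer(resid, c_tilde)        # (D, K)
        mu = np.sqrt(1.0 / (i + 1))
        Bi = Bi - mu * delta / np.linalg.norm(c_tilde)

        # Project each updated basis back to unit length and write it back
        B[:, idx] = Bi / np.linalg.norm(Bi, axis=0, keepdims=True)
    return B
```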
We also report two errors in the paper's description of the Incremental Codebook Optimization algorithm:
- In Line 11 of Algorithm 4.1, the second and third factors need to be rearranged for the matrix multiplication to be possible. The second factor, `c_tilde`, is of size Kx1, while the third factor, `x_i - B_i * c_tilde`, is of size Dx1. Their places should be swapped, and `c_tilde` should also be transposed, for the product to have the same size as `B_i` (see the corrected expressions below).
- In Equation 5, which is part of the computation of Algorithm 4.1, `C_i` should be a function of the local bases `B_i` instead of the full dictionary `B`. Without this correction, the element-wise summation in Equation 5 is not possible due to a matrix size mismatch.
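For reference, the corrected expressions, written under the column convention used in the list above (`B_i` of size DxK, `c_tilde` of size Kx1, `x_i` of size Dx1); the paper stacks the bases differently in places, so the transposes here reflect our convention rather than the paper's notation:

```math
\Delta B_i = -2\,\big(x_i - B_i\,\tilde{c}\big)\,\tilde{c}^{\top} \in \mathbb{R}^{D \times K},
\qquad
C_i = \big(B_i - x_i\,\mathbf{1}^{\top}\big)^{\top}\big(B_i - x_i\,\mathbf{1}^{\top}\big) \in \mathbb{R}^{K \times K}.
```

With these shapes, the Line 11 product has the same size as `B_i`, and the element-wise summation in Equation 5 involves matrices of matching size.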