
3. Codebook optimization

Urmish edited this page Apr 18, 2015 · 1 revision

Contribution

We have implemented the Incremental Codebook Optimization algorithm, as described in Algorithm 4.1 of Wang et al. (2010). This algorithm optimizes a codebook specifically for LLC, which promises a further performance improvement in the LLC pipeline over one that uses a dictionary learned through K-Means.

Instead of jointly optimizing the codebook and the encoding of all feature descriptors, the authors propose an online learning algorithm that iterates over the set of feature descriptors, using one descriptor to update the codebook in each iteration.
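One iteration of this online scheme can be sketched as follows. This is a simplified NumPy illustration, not the paper's exact procedure: the locality selection (k nearest codewords), the regularization constant `lam`, and the function and variable names are our assumptions, and the gradient step already uses the dimension-corrected outer-product form discussed in the errata below.

```python
import numpy as np

def incremental_update(B, x, i, lam=1e-4):
    """One online codebook update (sketch, assumed names and parameters).

    B : D x K codebook, columns are codewords.
    x : D-dimensional feature descriptor for iteration i (1-based).
    """
    D, K = B.shape
    # Locality: select the k nearest codewords as the local bases B_i.
    k = min(5, K)
    dists = np.linalg.norm(B - x[:, None], axis=0)
    idx = np.argsort(dists)[:k]
    Bi = B[:, idx]                       # local bases, size D x k
    # Solve min ||x - Bi c||^2 s.t. 1^T c = 1 (analytic LLC coding step).
    z = Bi - x[:, None]                  # bases shifted by the descriptor
    C = z.T @ z + lam * np.eye(k)        # regularized covariance
    c = np.linalg.solve(C, np.ones(k))
    c /= c.sum()                         # enforce the sum-to-one constraint
    # Gradient step on the local bases: dB = -2 (x - Bi c) c^T,
    # a D x k outer product matching Bi's shape.
    resid = x - Bi @ c
    dB = -2.0 * np.outer(resid, c)
    mu = 1.0 / np.sqrt(i)                # decaying learning rate
    Bi = Bi - mu * dB / np.linalg.norm(c)
    # Project the updated codewords back onto the unit sphere.
    Bi /= np.linalg.norm(Bi, axis=0, keepdims=True)
    B[:, idx] = Bi
    return B
```

Looping this update over all descriptors (possibly for several passes) yields the LLC-adapted codebook.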

We show the results of this dictionary learning method in the Experiments section.

Errata in Wang et al

We report errors in the paper with regard to the Incremental Codebook Optimization algorithm.

  1. In Line 11 of Algorithm 4.1, the second and third factors must be rearranged for the matrix multiplication to be possible. The second factor, c_tilde, is of size Kx1, while the third factor, x_i - B_i * c_tilde, is of size Dx1. The two factors should be swapped, and c_tilde transposed, so that the product has the same size as B_i.
  2. In Equation 5, which is part of the computation of Algorithm 4.1, C_i should be a function of the local bases B_i instead of the full dictionary B. Without this correction, the element-wise summation of Equation 5 will not be possible due to a matrix size mismatch.
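The dimension argument behind the first erratum can be checked mechanically. The snippet below uses illustrative sizes (a 128-dimensional descriptor and 5 local bases, both our assumptions) to show that swapping the factors and transposing c_tilde yields a matrix of the same shape as B_i:

```python
import numpy as np

# Dimension check for the corrected Line 11 of Algorithm 4.1.
# Sizes are illustrative: D-dimensional descriptors, k selected local bases.
D, k = 128, 5
rng = np.random.default_rng(0)
Bi = rng.standard_normal((D, k))       # local bases B_i, size D x k
x = rng.standard_normal(D)             # descriptor x_i, size D x 1
c = rng.standard_normal(k)             # code c_tilde, size k x 1

resid = x - Bi @ c                     # (x_i - B_i * c_tilde), size D x 1
# As printed, c_tilde (k x 1) times resid (D x 1) does not conform.
# Swapping the factors and transposing c_tilde gives the outer product
# resid * c_tilde^T, a D x k matrix of the same shape as B_i:
dB = -2.0 * np.outer(resid, c)
assert dB.shape == Bi.shape
```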
