
Add an upper bound of the operator norm for limited-memory qn #395

Merged
dpo merged 6 commits into JuliaSmoothOptimizers:main from MaxenceGollier:upper_bound
Feb 2, 2026

Conversation

@MaxenceGollier
Contributor Author

MaxenceGollier commented Jan 27, 2026

@dpo,

You can get the bound as follows. We have

$$a_k = \frac{B_{k-1}s_k}{\sqrt{s_k^T B_{k-1}s_k}},$$

define

$$u_k = \frac{B_{k-1}^{1/2}s_k}{\|B_{k-1}^{1/2}s_k\|}.$$

Then,

$$a_k a_k^T = \frac{B_{k-1}s_k s_k^T B_{k-1}}{s_k^T B_{k-1} s_k} = B_{k-1}^{1/2} (u_k u_k^T) B_{k-1}^{1/2}.$$

Therefore,

$$B_{k-1} - a_k a_k^T = B_{k-1}^{1/2}(I - u_k u_k^T) B_{k-1}^{1/2},$$

and

$$\| B_{k-1} - a_k a_k^T \|_2 \leq \|B_{k-1}^{1/2}\|_2^2 \, \|I - u_k u_k^T\|_2 = \|B_{k-1}\|_2,$$

since $$I - u_k u_k^T$$ is an orthogonal projection (so its 2-norm is 1) and $$\|B_{k-1}^{1/2}\|_2^2 = \|B_{k-1}\|_2$$. Therefore we can ignore the $$\|a_k\|^2$$ term in the formula.
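The inequality is easy to check numerically. A minimal sketch (the matrix `B` and vector `s` below are arbitrary stand-ins for illustration, not objects from the package):

```julia
using LinearAlgebra

n = 10
A = randn(n, n)
B = A' * A + I                      # a random symmetric positive definite matrix
s = randn(n)

a = (B * s) / sqrt(s' * B * s)      # a_k as defined above
opnorm_diff = opnorm(B - a * a')    # ‖B_{k-1} - a_k a_kᵀ‖₂
@assert opnorm_diff <= opnorm(B) + 1e-8   # the claimed upper bound
```

The bound holds for any symmetric positive definite `B` and nonzero `s`, so no particular seed is needed.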

@dpo
Member

dpo commented Jan 27, 2026

Great! Thank you.

@codecov
codecov bot commented Jan 27, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 95.35%. Comparing base (32dbc5e) to head (4719536).
⚠️ Report is 59 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #395      +/-   ##
==========================================
+ Coverage   95.00%   95.35%   +0.35%     
==========================================
  Files          17       20       +3     
  Lines        1100     1140      +40     
==========================================
+ Hits         1045     1087      +42     
+ Misses         55       53       -2     

@MaxenceGollier
Contributor Author

If you approve, please merge @dpo.

src/lbfgs.jl Outdated
if !op.inverse
data.opnorm_upper_bound -= norm(data.b[insert])^2
data.b[insert] .= y ./ sqrt(ys)
data.opnorm_upper_bound += norm(data.b[insert])^2
Member

@MaxenceGollier I thought that we should store those norms instead of recomputing them. That should be an easy change.

Contributor Author

In RegularizedOptimization, I computed it like this:

@inbounds for i = 1:data.mem
   bound += norm(data.b[i], 2)^2
end

which is $$\mathcal{O}(mn)$$. Here I only subtract the norm of the $$b$$ being replaced and add the norm of the new one, which is $$\mathcal{O}(2n)$$.

Member

It would be $O(n)$ if we stored the norm. The norm of the vector being removed was computed earlier.
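The cached-norm idea can be sketched as follows. This is an illustrative sketch only: the struct and field names (`BoundData`, `b_norms`, `insert_vector!`) are hypothetical and are not the actual names in src/lbfgs.jl.

```julia
# Hypothetical sketch: cache ‖b[i]‖² so updating the bound costs O(n), not O(mn).
mutable struct BoundData
    b::Vector{Vector{Float64}}   # stored vectors b[1], …, b[mem]
    b_norms::Vector{Float64}     # cached squared norms ‖b[i]‖²
    opnorm_upper_bound::Float64
end

function insert_vector!(data::BoundData, insert::Int, y::Vector{Float64}, ys::Float64)
    data.opnorm_upper_bound -= data.b_norms[insert]  # cached: no recomputation
    data.b[insert] .= y ./ sqrt(ys)
    new_norm = sum(abs2, data.b[insert])             # one O(n) pass for the new vector
    data.b_norms[insert] = new_norm
    data.opnorm_upper_bound += new_norm
    return data
end
```

Only the norm of the incoming vector is computed; the norm of the vector being evicted is read from the cache, which is the $O(n)$ version dpo describes.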

Contributor Author

Done.

@dpo dpo merged commit d9cd737 into JuliaSmoothOptimizers:main Feb 2, 2026
53 of 57 checks passed
@MaxenceGollier MaxenceGollier deleted the upper_bound branch February 2, 2026 18:08
@MaxenceGollier
Contributor Author

Can we add a release?

Then I can update JuliaSmoothOptimizers/RegularizedOptimization.jl#275.
