
Permitting Non-Unique Sampling for KL-Divergence, Improving Dependency Compatibility #35

Open
andrewsalij wants to merge 4 commits into BenevolentAI:master from lanl:dev

Conversation

@andrewsalij

When assessing a model, it may be desirable to sample from potentially non-unique molecules (e.g., when one wants to characterize mode collapse). This is now permitted via the interface

KLDivBenchmark().assess_model(model, unique_sample=False)

where the default behavior is the prior behavior (i.e., unique_sample=True).

Also, there have been a number of fixes to ensure compatibility with more recent versions of dependencies. All tests now pass using pytest tests.
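As a rough illustration of what the new flag changes (a minimal, self-contained sketch, not the actual KLDivBenchmark internals; the `sample_molecules` helper and the toy generator below are hypothetical), unique sampling discards duplicate molecules until enough distinct ones are drawn, while non-unique sampling keeps every draw, so a mode-collapsed generator is visible in the sample itself:

```python
import itertools

def sample_molecules(generator, n, unique_sample=True):
    """Hypothetical sketch: draw n molecules from a generator.

    unique_sample=True  -> deduplicate draws (the prior behavior)
    unique_sample=False -> keep every draw, duplicates included
    """
    samples = []
    seen = set()
    attempts = 0
    while len(samples) < n and attempts < 100 * n:  # guard against infinite loops
        mol = generator()
        attempts += 1
        if unique_sample:
            if mol in seen:
                continue  # skip duplicates; only distinct molecules count
            seen.add(mol)
        samples.append(mol)
    return samples

# A toy "mode-collapsed" generator that mostly emits the same SMILES string
pool = itertools.cycle(["CCO", "CCO", "CCO", "c1ccccc1"])
gen = lambda: next(pool)

non_unique = sample_molecules(gen, 8, unique_sample=False)
unique = sample_molecules(gen, 2, unique_sample=True)
print(len(non_unique), len(set(non_unique)))  # 8 draws, only 2 distinct molecules
print(sorted(set(unique)))                    # ['CCO', 'c1ccccc1']
```

With deduplication on, the collapse is hidden (every returned molecule is distinct); with it off, the duplicate-heavy sample feeds directly into the KL-divergence statistics.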

@andrewsalij
Author

andrewsalij commented Jan 26, 2026

These changes are compatibly-licensed (MIT) and under copyright by Triad National Security, LLC, under open source release #O5007.

© 2026. Triad National Security, LLC. All rights reserved.
This program was produced under U.S. Government contract 89233218CNA000001 for Los Alamos National Laboratory (LANL), which is operated by Triad National Security, LLC for the U.S. Department of Energy/National Nuclear Security Administration. All rights in the program are reserved by Triad National Security, LLC, and the U.S. Department of Energy/National Nuclear Security Administration. The Government is granted for itself and others acting on its behalf a nonexclusive, paid-up, irrevocable worldwide license in this material to reproduce, prepare derivative works, distribute copies to the public, perform publicly and display publicly, and to permit others to do so.

