Permitting Non-Unique Sampling for KL-Divergence, Improving Dependency Compatibility #35
andrewsalij wants to merge 4 commits into BenevolentAI:master from
Conversation
These changes are compatibly-licensed (MIT) and under copyright by Triad National Security, LLC, under open source release #O5007. © 2026. Triad National Security, LLC. All rights reserved.
When assessing a model, it may be desirable to sample from potentially non-unique molecules (e.g., to characterize mode collapse). This is now permitted via the interface `KLDivBenchmark().assess_model(model, unique_sample=False)`, where the default behavior matches the prior behavior (i.e., `unique_sample=True`).

There have also been a number of fixes to ensure compatibility with more recent versions of dependencies. All tests now pass with `pytest tests`.
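To illustrate what the toggle changes, here is a minimal, self-contained sketch of unique vs. non-unique sampling. The function and names here are illustrative only and do not reproduce the actual `KLDivBenchmark` internals; the point is that deduplication hides mode collapse, while non-unique sampling exposes it:

```python
def sample_molecules(generate, n, unique_sample=True, max_attempts=10000):
    """Draw up to n SMILES strings from `generate`.

    If unique_sample is True (the prior behavior), duplicates are skipped,
    so a mode-collapsed generator yields fewer than n samples.
    If False, every draw is kept, preserving the raw output distribution.
    """
    samples = []
    seen = set()
    attempts = 0
    while len(samples) < n and attempts < max_attempts:
        smi = generate()
        attempts += 1
        if unique_sample:
            if smi in seen:
                continue  # duplicate: discard and keep sampling
            seen.add(smi)
        samples.append(smi)
    return samples

# A fully mode-collapsed "model" that always emits the same molecule:
collapsed = lambda: "CCO"
print(len(sample_molecules(collapsed, 5, unique_sample=False)))  # 5
print(len(sample_molecules(collapsed, 5, unique_sample=True)))   # 1
```

With `unique_sample=False`, the five identical draws all enter the KL-divergence statistics, so the collapse shows up in the benchmark score rather than being silently filtered out.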