Bound tightenings using INVPROP and Partial Multi-Neuron Relaxation algorithms. #892
Open
ido-shm-uel wants to merge 93 commits into NeuralNetworkVerification:master from ido-shm-uel:INVPROP_PMNR
Conversation
Added assignment and store-into-other NLR tests for every activation function.
0-2-0: Implementing layer simulations for every activation function.
0-2-1: Testing layer simulations for every activation function.
0-3-0: Implementing interval arithmetic for all activation functions.
0-3-1: Adding interval arithmetic tests for all activation functions.
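For monotonically non-decreasing activations such as ReLU and Sigmoid, interval arithmetic reduces to evaluating the function at the interval's endpoints. A minimal self-contained sketch (illustrative names, not Marabou's Layer interface):

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // Interval [lb, ub] for a single neuron's input.
    struct Interval { double lb, ub; };

    // ReLU is monotonically non-decreasing, so applying it to the
    // endpoints yields the exact output interval.
    Interval reluInterval( Interval in )
    {
        return { std::max( in.lb, 0.0 ), std::max( in.ub, 0.0 ) };
    }

    // Sigmoid is strictly increasing, so the same endpoint rule applies.
    double sigmoid( double x ) { return 1.0 / ( 1.0 + std::exp( -x ) ); }
    Interval sigmoidInterval( Interval in )
    {
        return { sigmoid( in.lb ), sigmoid( in.ub ) };
    }

    int main()
    {
        Interval out = reluInterval( { -2.0, 3.0 } );
        std::printf( "ReLU: [%g, %g]\n", out.lb, out.ub );    // [0, 3]
        out = sigmoidInterval( { -1.0, 1.0 } );
        std::printf( "Sigmoid: [%g, %g]\n", out.lb, out.ub ); // ~[0.269, 0.731]
        return 0;
    }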
0-4-0: Implementing symbolic bound tightening for every activation function.
0-4-1: Adding symbolic bound tightening tests for all activation functions.
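For an unfixed ReLU, the symbolic bounds come from the standard triangle relaxation; a sketch of the coefficients, using illustrative names rather than Marabou's Layer API:

    #include <cassert>

    // Linear bound y >= slope * x + bias (lower) or y <= slope * x + bias (upper).
    struct LinearBound { double slope, bias; };

    // Standard triangle relaxation of y = ReLU(x) on [l, u] with l < 0 < u:
    //   upper: y <= u/(u-l) * (x - l)
    //   lower: y >= alpha * x for any alpha in [0, 1] (DeepPoly picks alpha
    //          by comparing |l| and u; parameterised variants optimize alpha).
    void reluRelaxation( double l, double u, LinearBound &lower, LinearBound &upper )
    {
        assert( l < 0 && u > 0 );
        double lambda = u / ( u - l );
        upper = { lambda, -lambda * l };
        double alpha = ( u >= -l ) ? 1.0 : 0.0; // DeepPoly's heuristic choice
        lower = { alpha, 0.0 };
    }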
0-5-0: Adding rounding constant for LP relaxation propagation.
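A rounding constant guards the LP relaxation against floating-point round-off: every derived bound is loosened outward by a small slack before being asserted, so solver imprecision cannot turn a mathematically sound bound into an unsound (too tight) one. A sketch with an assumed constant name and value:

    // Hypothetical constant mirroring the idea of a rounding slack for
    // bounds obtained from an LP relaxation (value chosen for illustration).
    constexpr double LP_ROUNDING_CONSTANT = 1e-6;

    // Loosen a derived bound outward so that solver round-off cannot
    // yield an unsound tightening.
    double soundLowerBound( double lpLb ) { return lpLb - LP_ROUNDING_CONSTANT; }
    double soundUpperBound( double lpUb ) { return lpUb + LP_ROUNDING_CONSTANT; }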
0-5-1: Implementing forward-backward LP propagation algorithm for all activation functions.
0-5-2: Adding forward-backward LP propagation tests for all activation functions.
0-6-0: Adding MILP bound types for two bound-backpropagation algorithms; adjusting the Engine accordingly.
0-6-1: Adjusting Options for the two new MILP bound types.
0-6-2: Introducing Polygonal Tightenings (the output type of the two new algorithms).
0-6-3: Adjusting NLR and LayerOwner for Polygonal Tightenings.
0-6-4: Introducing parameterised bounds in LPFormulator (the two new algorithms will optimize the parameter's value).
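Conceptually, each Polygonal Tightening is a single linear inequality coupling several neurons, e.g. sum_i c_i * x_i <= bias. A minimal sketch of such a representation (illustrative naming; Marabou's actual class may differ):

    #include <map>
    #include <utility>

    // A polygonal tightening couples several neurons in one linear
    // inequality:  sum_i c_i * x_i <= bias  (UB type) or >= (LB type).
    struct PolygonalTightening
    {
        enum Type { LB, UB } type;
        // Map from (layerIndex, neuronIndex) to its coefficient c_i.
        std::map<std::pair<unsigned, unsigned>, double> coefficients;
        double bias;
    };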
0-6-5: Current state of PreimageApproximation + cleanup + corrections.
0-6-6: New constants for PreimageApproximation (random seeds).
0-6-7: TEMPORARY CHANGE - use Boost v1.85 for the Differential Evolution algorithm (gradient-free optimization for PreimageApproximation).
0-6-8: Revert last change.
0-6-9: Adapt DifferentialEvolution from Boost v1.85.
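For reference, the core of DE/rand/1/bin, the classic Differential Evolution scheme, looks roughly as follows. This is a self-contained sketch, not the Boost 1.85 implementation the commits adapt; a production version would also force the indices a, b, c and i to be pairwise distinct:

    #include <algorithm>
    #include <random>
    #include <vector>

    // Minimal DE/rand/1/bin sketch: gradient-free minimization of f
    // over the box [lo, hi]^dim.
    template <typename F>
    std::vector<double> differentialEvolution( F f, double lo, double hi, unsigned dim,
                                               unsigned popSize = 20,
                                               unsigned generations = 200,
                                               double scale = 0.8, double crossover = 0.9 )
    {
        std::mt19937 rng( 42 );
        std::uniform_real_distribution<double> unit( 0.0, 1.0 );
        std::uniform_int_distribution<unsigned> pick( 0, popSize - 1 );
        std::uniform_int_distribution<unsigned> pickDim( 0, dim - 1 );

        // Random initial population inside the box.
        std::vector<std::vector<double>> pop( popSize, std::vector<double>( dim ) );
        for ( auto &x : pop )
            for ( auto &xi : x )
                xi = lo + ( hi - lo ) * unit( rng );

        for ( unsigned g = 0; g < generations; ++g )
            for ( unsigned i = 0; i < popSize; ++i )
            {
                // Mutate three population members, cross over with pop[i].
                unsigned a = pick( rng ), b = pick( rng ), c = pick( rng );
                unsigned forced = pickDim( rng ); // at least one mutated coordinate
                std::vector<double> trial = pop[i];
                for ( unsigned d = 0; d < dim; ++d )
                    if ( d == forced || unit( rng ) < crossover )
                        trial[d] = std::clamp( pop[a][d] + scale * ( pop[b][d] - pop[c][d] ),
                                               lo, hi );
                if ( f( trial ) < f( pop[i] ) ) // greedy selection
                    pop[i] = trial;
            }

        return *std::min_element( pop.begin(), pop.end(),
                                  [&]( const std::vector<double> &x,
                                       const std::vector<double> &y )
                                  { return f( x ) < f( y ); } );
    }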
0-6-10: Current state of PreimageApproximation (EstimateVolume implemented).
0-6-11: Add constants for PreimageApproximation.
0-6-12: Current state of PreimageApproximation (implement coordinate descent).
0-6-13: Timing PreimageApproximation with coordinate descent (very slow).
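Coordinate descent optimizes one parameter at a time while holding the rest fixed, which helps explain the slow timing noted above: every trial move costs a full objective evaluation. A minimal sketch under illustrative naming:

    #include <vector>

    // Minimal coordinate-descent sketch: sweep the coordinates, trying a
    // fixed step in each direction and keeping any move that improves f.
    template <typename F>
    void coordinateDescent( F f, std::vector<double> &x, double step = 0.1,
                            unsigned sweeps = 100 )
    {
        for ( unsigned s = 0; s < sweeps; ++s )
            for ( unsigned d = 0; d < x.size(); ++d )
            {
                double best = f( x );
                for ( double delta : { -step, step } )
                {
                    x[d] += delta;
                    double val = f( x );
                    if ( val < best )
                        best = val;    // keep the improving move
                    else
                        x[d] -= delta; // undo it
                }
            }
    }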
0-6-14: Small Fix (same as 0-5---4).
UNSAT certification statistics (NeuralNetworkVerification#850) (Fork Sync)
- Apply 0-5--6 fixes (Backward-Forward Algorithm branch) to this branch.
- Add new optimization algorithms for PreimageApproximation: Coordinate Descent, Gradient Descent, Adam Optimizer.
- Change the default algorithm (Adam) + parameters to reduce runtime.
- Apply clang-format to all changed files.
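The Adam update that became the default is standard (Kingma & Ba, 2015); a minimal sketch, independent of Marabou's actual optimizer code:

    #include <cmath>
    #include <vector>

    // Minimal Adam optimizer sketch: bias-corrected first and second
    // moment estimates drive a per-coordinate adaptive step.
    struct Adam
    {
        double lr = 0.01, beta1 = 0.9, beta2 = 0.999, eps = 1e-8;
        std::vector<double> m, v; // first/second moment estimates
        unsigned long t = 0;

        void step( std::vector<double> &params, const std::vector<double> &grad )
        {
            if ( m.empty() )
            {
                m.assign( params.size(), 0.0 );
                v.assign( params.size(), 0.0 );
            }
            ++t;
            for ( size_t i = 0; i < params.size(); ++i )
            {
                m[i] = beta1 * m[i] + ( 1 - beta1 ) * grad[i];
                v[i] = beta2 * v[i] + ( 1 - beta2 ) * grad[i] * grad[i];
                double mHat = m[i] / ( 1 - std::pow( beta1, (double)t ) );
                double vHat = v[i] / ( 1 - std::pow( beta2, (double)t ) );
                params[i] -= lr * mHat / ( std::sqrt( vHat ) + eps );
            }
        }
    };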
Small fix for bug in SumOfInfeasibilitiesManager::proposePhasePatternU…
- Merge idan0610's fork to solve regression test N273 failing due to fixed Sign Layer splitting (MarabouError 7).
…ption, global constants, function skeletons, and implement countNonlinearLayers, addPolygonalTighteningsToLpRelaxation.
…ounds using DeepPoly module.
2. Implement OptimizeSingleParameterisedPolygonalTightening and most of selectConstraints, getParameterisdPolygonalTighteningLowerBound.
3. Move some bound propagation functions and helpers from Layer, LPFormulator to NetworkLevelReasoner.
4. Delete duplicate helpers from DeepPolySoftmaxElement and replace calls to their copies in Layer.
…or MILP_SOLVER_BOUND_TIGHTENING_TYPE.
1. Extending initializeSymbolicBoundsMaps using DeepPoly module.
2. Implement generatePolygonalTightenings, getParameterisdPolygonalTighteningLowerBound and helper functions.
3. Corrections in Layer, OptionParser.
4. Creating new test file Test_PMNR.h.
…(in DeepPoly module), required by PMNR's selectConstraints.
2. Implemented unit tests for parameterisedSymbolicBoundPropagation and initializeSymbolicBounds, in Test_NetworkLevelReasoner.h and nlr/tests/Test_NetworkLevelReasoner.h.
3. Various corrections to PMNR's generatePolygonalTightening, getParameterisdPolygonalTighteningLowerBound and related functions (in NetworkLevelReasoner.cpp).
4. Renamings in Layer, LPFormulator, NetworkLevelReasoner for consistency.
5. A few corrections to nlr/tests/Test_NetworkLevelReasoner.h calculations (documentation).
6. One correction to Sigmoid regular SBT: changed if ( FloatUtils::areEqual( FloatUtils::round( sourceLb ), FloatUtils::round( sourceUb ) ) ) to if ( FloatUtils::areEqual( sourceLb, sourceUb ) ) in Layer.cpp's computeSymbolicBoundsForSigmoid (see the sketch below).
7. One correction to DeepPoly module: added line _phaseFixed[i] = phaseFixed; in DeepPolyMaxPoolElement.cpp's execute(), which is required by its storePredecessorSymbolicBounds() (otherwise the map _phaseFixed would be used without being updated).
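The correction in item 6 matters because rounding both bounds before comparing declares a neuron "fixed" whenever the bounds round to the same integer, even though lb != ub. A small illustration, using std::round and a plain tolerance check in place of FloatUtils:

    #include <cassert>
    #include <cmath>

    // Why the old criterion was wrong: rounding both bounds to the nearest
    // integer treats the neuron as "fixed" even when lb != ub.
    int main()
    {
        double sourceLb = 0.1, sourceUb = 0.4;  // clearly not a fixed neuron
        bool oldCheck = std::round( sourceLb ) == std::round( sourceUb );
        bool newCheck = std::fabs( sourceLb - sourceUb ) < 1e-10;
        assert( oldCheck == true );   // old criterion: spuriously "fixed"
        assert( newCheck == false );  // corrected criterion: not fixed
        return 0;
    }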
Sync Fork: Support Sign Constraints in Onnx Parser (NeuralNetworkVerification#886)
…unctions in DeepPoly module, restored Test_PMNR.h, removed Gurobi guard for Test_PMNRSelection.h, resumed implementing selectConstraints/generatePolygonalTightenings and related functions.
Proof production bugfix (NeuralNetworkVerification#887)
SBT for Sigmoid: Corrected criterion for fixed Sigmoid neuron.
SBT for Softmax: Symbolic bounds now stored correctly in _symbolicLb, _symbolicUb.
SBT for Bilinear: Fixed calculation of symbolic lower/upper bias; fixed the mistaken assumption that both source neurons are from the same layer.
Parameterised SBT for Bilinear: Fixed calculation of symbolic lower/upper bias; fixed the mistaken assumption that both source neurons are from the same layer.
LP relaxation for Bilinear: Fixed the mistaken assumption that both source neurons are from the same layer.
Parameterised LP relaxation for Bilinear: Fixed the mistaken assumption that both source neurons are from the same layer.
Minor changes to the PreimageApproximation optimization function's code: size_t -> unsigned, define the sign variable at the start of the function, use the Vector( double, unsigned ) constructor and std::min/std::max.
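The Bilinear fixes concern the standard McCormick envelope for y = x1 * x2, whose four linear bounds stay valid regardless of which layers the two source neurons belong to. A sketch of the envelope (illustrative struct, not Marabou's Layer code):

    // McCormick envelope for y = x1 * x2 with x1 in [l1, u1], x2 in [l2, u2].
    // Each bound is linear in x1 and x2, so the relaxation works no matter
    // which layers the two source neurons come from.
    struct BilinearBound { double c1, c2, bias; }; // y >=/<= c1*x1 + c2*x2 + bias

    void mcCormick( double l1, double u1, double l2, double u2,
                    BilinearBound lower[2], BilinearBound upper[2] )
    {
        lower[0] = { l2, l1, -l1 * l2 }; // y >= l2*x1 + l1*x2 - l1*l2
        lower[1] = { u2, u1, -u1 * u2 }; // y >= u2*x1 + u1*x2 - u1*u2
        upper[0] = { u2, l1, -l1 * u2 }; // y <= u2*x1 + l1*x2 - l1*u2
        upper[1] = { l2, u1, -u1 * l2 }; // y <= l2*x1 + u1*x2 - u1*l2
    }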
…tion#889), plus:
1. Various corrections in generatePolygonalTightening, Extended INVPROP and hyperparameters, initializeSymbolicBounds (DeepPoly module), gradient-based PMNR heuristic, unit tests.
2. Implemented calculations of branch symbolic bounds in NLR.
3. Implemented branching after Extended INVPROP to derive stronger bounds.
4. Implemented detection of some infeasible branch combinations during INVPROP.
5. Implemented heuristic generation of branching points and tightening loss.
6. Wrote unit tests for parameterised SBT/DeepPoly in Test_NetworkLevelReasoner/Test_DeepPolyAnalysis.h.
7. Implemented BBPS-based heuristic for PMNR.
8. Wrote Test_PMNRSelection.h, a unit test file for symbolic bound maps, branching points and PMNR neuron scores.
9. Wrote Test_PMNR.h, a unit test file for the backward-invprop, backward-pmnr-(random/gradient/bbps) bound tightening options.
…ulation modified.
2. Alpha optimization removed from Extended INVPROP.
3. PreimageApproximation is run before Extended INVPROP.
4. BBPS branching candidate count increased to 100.
5. Removed milp-tightening options backward-invprop, backward-pmnr-random, backward-pmnr-gradient; renamed backward-pmnr-bbps to backward-pmnr.
6. Updated test files Test_PMNRSelection.h, Test_LPRelaxation.h, Test_PMNR.h.
Manually Sync Fork + Apply Results of PMNR Optimization Benchmarks
…d function declarations from LPFormulator.h.
3. Removing redundant 'include LPFormulator.h' from NetworkLevelReasoner.h to solve Release build failure on clang++.
Implemented INVPROP (arXiv:2302.01404v4 [cs.LG]) and Partial Multi-Neuron Relaxation bound tightening algorithms.
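For intuition, INVPROP tightens bounds by propagating output-side constraints backwards through the network; in one dimension the idea reduces to inverting an affine map. A toy sketch (the paper, arXiv:2302.01404, handles full networks by optimizing over relaxations rather than inverting a single affine map):

    #include <algorithm>
    #include <cstdio>

    // One-dimensional sketch of the INVPROP idea: an output-side constraint
    // y <= c, with y = w * x + b and w > 0, implies the input bound
    // x <= (c - b) / w, which may be tighter than the forward bound on x.
    int main()
    {
        double w = 2.0, b = 1.0, c = 3.0;  // y = 2x + 1, constraint y <= 3
        double xUb = 10.0;                 // loose forward upper bound on x
        xUb = std::min( xUb, ( c - b ) / w );
        std::printf( "tightened x upper bound: %g\n", xUb ); // prints 1
        return 0;
    }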