ITK 6.0.0 (Insight Toolkit)
#include <itkGradientDescentOptimizerv4.h>
Gradient descent optimizer.
GradientDescentOptimizerv4Template implements a simple gradient descent optimizer. At each iteration the current position is updated according to
\[ p_{n+1} = p_n + \mbox{learningRate} \, \frac{\partial f(p_n) }{\partial p_n} \]
Optionally, the best metric value and matching parameters can be stored and retrieved via GetValue() and GetCurrentPosition(). See SetReturnBestParametersAndValue().
Gradient scales can be manually set or automatically estimated, as documented in the base class. The learning rate defaults to 1.0 and can be set in two ways: 1) manually, via SetLearningRate(); or 2) automatically, either at each iteration or only at the first iteration, by assigning a ScalesEstimator via SetScalesEstimator(). When a ScalesEstimator is assigned, the optimizer by default estimates the learning rate only once, during the first iteration. This behavior can be changed via SetDoEstimateLearningRateAtEachIteration() and SetDoEstimateLearningRateOnce(). For the learning rate to be estimated at each iteration, the user must call SetDoEstimateLearningRateAtEachIteration(true) and SetDoEstimateLearningRateOnce(false). When enabled, the optimizer computes the learning rate such that at each step, each voxel's change in physical space will be less than m_MaximumStepSizeInPhysicalUnits:
m_LearningRate = m_MaximumStepSizeInPhysicalUnits / m_ScalesEstimator->EstimateStepScale(scaledGradient)
where m_MaximumStepSizeInPhysicalUnits defaults to the voxel spacing returned by m_ScalesEstimator->EstimateMaximumStepSize() (which is typically 1 voxel) and can be set by the user via SetMaximumStepSizeInPhysicalUnits(). When SetDoEstimateLearningRateOnce is enabled, the voxel change may become greater than m_MaximumStepSizeInPhysicalUnits in later iterations (see the usage sketch below).
Definition at line 78 of file itkGradientDescentOptimizerv4.h.
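As an illustration only, the following sketch configures the optimizer with a ScalesEstimator so that the learning rate is estimated once during the first iteration, as described above. The surrounding setup (synthetic GaussianImageSource images, a TranslationTransform, MeanSquaresImageToImageMetricv4 and RegistrationParameterScalesFromPhysicalShift) is one possible way to supply a metric and is not prescribed by this class.

#include "itkGradientDescentOptimizerv4.h"
#include "itkMeanSquaresImageToImageMetricv4.h"
#include "itkRegistrationParameterScalesFromPhysicalShift.h"
#include "itkTranslationTransform.h"
#include "itkGaussianImageSource.h"
#include "itkImage.h"
#include "itkFixedArray.h"
#include <iostream>

int main()
{
  constexpr unsigned int Dimension = 2;
  using ImageType = itk::Image<float, Dimension>;

  // Synthesize a fixed image and a moving image whose Gaussian blob is shifted by 2 pixels.
  auto makeGaussian = [](double center) {
    auto source = itk::GaussianImageSource<ImageType>::New();
    ImageType::SizeType size;
    size.Fill(64);
    source->SetSize(size);
    itk::FixedArray<double, Dimension> mean;
    mean.Fill(center);
    source->SetMean(mean);
    source->Update();
    ImageType::Pointer image = source->GetOutput();
    image->DisconnectPipeline();
    return image;
  };
  ImageType::Pointer fixedImage = makeGaussian(32.0);
  ImageType::Pointer movingImage = makeGaussian(34.0);

  // Transform whose parameters the optimizer adjusts.
  using TransformType = itk::TranslationTransform<double, Dimension>;
  auto transform = TransformType::New();
  transform->SetIdentity();

  // Metric connecting the images and the transform.
  using MetricType = itk::MeanSquaresImageToImageMetricv4<ImageType, ImageType>;
  auto metric = MetricType::New();
  metric->SetFixedImage(fixedImage);
  metric->SetMovingImage(movingImage);
  metric->SetMovingTransform(transform);
  metric->Initialize();

  // With a ScalesEstimator assigned, the learning rate is estimated once,
  // during the first iteration (the default behavior documented above).
  using ScalesEstimatorType = itk::RegistrationParameterScalesFromPhysicalShift<MetricType>;
  auto scalesEstimator = ScalesEstimatorType::New();
  scalesEstimator->SetMetric(metric);

  auto optimizer = itk::GradientDescentOptimizerv4::New();
  optimizer->SetMetric(metric);
  optimizer->SetScalesEstimator(scalesEstimator);
  optimizer->SetNumberOfIterations(50);
  optimizer->SetReturnBestParametersAndValue(true);
  optimizer->StartOptimization();

  std::cout << "Stop condition: " << optimizer->GetStopConditionDescription() << '\n';
  std::cout << "Best value:     " << optimizer->GetValue() << '\n';
  std::cout << "Best position:  " << optimizer->GetCurrentPosition() << '\n';
  return 0;
}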
Public Member Functions | |
virtual void | EstimateLearningRate () |
virtual const TInternalComputationValueType & | GetConvergenceValue () const |
const char * | GetNameOfClass () const override |
void | ResumeOptimization () override |
virtual void | SetConvergenceWindowSize (SizeValueType _arg) |
virtual void | SetMinimumConvergenceValue (TInternalComputationValueType _arg) |
void | StartOptimization (bool doOnlyInitialization=false) override |
void | StopOptimization () override |
virtual void | SetLearningRate (TInternalComputationValueType _arg) |
virtual const TInternalComputationValueType & | GetLearningRate () const |
virtual void | SetMaximumStepSizeInPhysicalUnits (TInternalComputationValueType _arg) |
virtual const TInternalComputationValueType & | GetMaximumStepSizeInPhysicalUnits () const |
virtual void | SetDoEstimateLearningRateAtEachIteration (bool _arg) |
virtual const bool & | GetDoEstimateLearningRateAtEachIteration () const |
virtual void | DoEstimateLearningRateAtEachIterationOn () |
virtual void | SetDoEstimateLearningRateOnce (bool _arg) |
virtual const bool & | GetDoEstimateLearningRateOnce () const |
virtual void | DoEstimateLearningRateOnceOn () |
virtual void | SetReturnBestParametersAndValue (bool _arg) |
virtual const bool & | GetReturnBestParametersAndValue () const |
virtual void | ReturnBestParametersAndValueOn () |
Public Member Functions inherited from itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType > | |
virtual const DerivativeType & | GetGradient () const |
const char * | GetNameOfClass () const override |
virtual const StopConditionObjectToObjectOptimizerEnum & | GetStopCondition () const |
StopConditionReturnStringType | GetStopConditionDescription () const override |
virtual void | ModifyGradientByLearningRateOverSubRange (const IndexRangeType &subrange)=0 |
virtual void | ModifyGradientByScalesOverSubRange (const IndexRangeType &subrange)=0 |
virtual void | ResumeOptimization ()=0 |
void | StartOptimization (bool doOnlyInitialization=false) override |
virtual void | StopOptimization () |
virtual void | ModifyGradientByScales () |
virtual void | ModifyGradientByLearningRate () |
Public Member Functions inherited from itk::ObjectToObjectOptimizerBaseTemplate< TInternalComputationValueType > | |
virtual bool | CanUseScales () const |
virtual SizeValueType | GetCurrentIteration () const |
virtual const MeasureType & | GetCurrentMetricValue () const |
virtual const ParametersType & | GetCurrentPosition () const |
const char * | GetNameOfClass () const override |
virtual SizeValueType | GetNumberOfIterations () const |
virtual const ThreadIdType & | GetNumberOfWorkUnits () const |
virtual const ScalesType & | GetScales () const |
virtual const bool & | GetScalesAreIdentity () const |
bool | GetScalesInitialized () const |
virtual StopConditionReturnStringType | GetStopConditionDescription () const=0 |
virtual const MeasureType & | GetValue () const |
virtual const ScalesType & | GetWeights () const |
virtual const bool & | GetWeightsAreIdentity () const |
virtual void | SetNumberOfIterations (SizeValueType _arg) |
virtual void | SetNumberOfWorkUnits (ThreadIdType number) |
virtual void | SetScalesEstimator (ScalesEstimatorType *_arg) |
virtual void | SetWeights (ScalesType _arg) |
virtual void | StartOptimization (bool doOnlyInitialization=false) |
virtual void | SetMetric (MetricType *_arg) |
virtual MetricType * | GetModifiableMetric () |
virtual void | SetScales (const ScalesType &scales) |
virtual void | SetDoEstimateScales (bool _arg) |
virtual const bool & | GetDoEstimateScales () const |
virtual void | DoEstimateScalesOn () |
Public Member Functions inherited from itk::Object | |
unsigned long | AddObserver (const EventObject &event, Command *cmd) const |
unsigned long | AddObserver (const EventObject &event, std::function< void(const EventObject &)> function) const |
LightObject::Pointer | CreateAnother () const override |
virtual void | DebugOff () const |
virtual void | DebugOn () const |
Command * | GetCommand (unsigned long tag) |
bool | GetDebug () const |
MetaDataDictionary & | GetMetaDataDictionary () |
const MetaDataDictionary & | GetMetaDataDictionary () const |
virtual ModifiedTimeType | GetMTime () const |
const char * | GetNameOfClass () const override |
virtual const TimeStamp & | GetTimeStamp () const |
bool | HasObserver (const EventObject &event) const |
void | InvokeEvent (const EventObject &) |
void | InvokeEvent (const EventObject &) const |
virtual void | Modified () const |
void | Register () const override |
void | RemoveAllObservers () |
void | RemoveObserver (unsigned long tag) const |
void | SetDebug (bool debugFlag) const |
void | SetReferenceCount (int) override |
void | UnRegister () const noexcept override |
void | SetMetaDataDictionary (const MetaDataDictionary &rhs) |
void | SetMetaDataDictionary (MetaDataDictionary &&rrhs) |
virtual void | SetObjectName (std::string _arg) |
virtual const std::string & | GetObjectName () const |
Public Member Functions inherited from itk::LightObject | |
Pointer | Clone () const |
virtual Pointer | CreateAnother () const |
virtual void | Delete () |
virtual const char * | GetNameOfClass () const |
virtual int | GetReferenceCount () const |
void | Print (std::ostream &os, Indent indent=0) const |
virtual void | Register () const |
virtual void | SetReferenceCount (int) |
virtual void | UnRegister () const noexcept |
Static Public Member Functions | |
static Pointer | New () |
Static Public Member Functions inherited from itk::Object | |
static bool | GetGlobalWarningDisplay () |
static void | GlobalWarningDisplayOff () |
static void | GlobalWarningDisplayOn () |
static Pointer | New () |
static void | SetGlobalWarningDisplay (bool val) |
Static Public Member Functions inherited from itk::LightObject | |
static void | BreakOnError () |
static Pointer | New () |
Protected Member Functions | |
virtual void | AdvanceOneStep () |
GradientDescentOptimizerv4Template () | |
void | ModifyGradientByLearningRateOverSubRange (const IndexRangeType &subrange) override |
void | ModifyGradientByScalesOverSubRange (const IndexRangeType &subrange) override |
void | PrintSelf (std::ostream &os, Indent indent) const override |
~GradientDescentOptimizerv4Template () override=default | |
Protected Member Functions inherited from itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType > | |
void | PrintSelf (std::ostream &os, Indent indent) const override |
GradientDescentOptimizerBasev4Template () | |
~GradientDescentOptimizerBasev4Template () override=default | |
Protected Member Functions inherited from itk::ObjectToObjectOptimizerBaseTemplate< TInternalComputationValueType > | |
void | PrintSelf (std::ostream &os, Indent indent) const override |
ObjectToObjectOptimizerBaseTemplate () | |
~ObjectToObjectOptimizerBaseTemplate () override | |
Protected Member Functions inherited from itk::Object | |
Object () | |
bool | PrintObservers (std::ostream &os, Indent indent) const |
void | PrintSelf (std::ostream &os, Indent indent) const override |
virtual void | SetTimeStamp (const TimeStamp &timeStamp) |
~Object () override | |
Protected Member Functions inherited from itk::LightObject | |
virtual LightObject::Pointer | InternalClone () const |
LightObject () | |
virtual void | PrintHeader (std::ostream &os, Indent indent) const |
virtual void | PrintSelf (std::ostream &os, Indent indent) const |
virtual void | PrintTrailer (std::ostream &os, Indent indent) const |
virtual | ~LightObject () |
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::ConstPointer = SmartPointer<const Self> |
Definition at line 89 of file itkGradientDescentOptimizerv4.h.
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::InternalComputationValueType = TInternalComputationValueType |
It should be possible to derive the internal computation type from the class object.
Definition at line 99 of file itkGradientDescentOptimizerv4.h.
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::Pointer = SmartPointer<Self> |
Definition at line 88 of file itkGradientDescentOptimizerv4.h.
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::Self = GradientDescentOptimizerv4Template |
Standard class type aliases.
Definition at line 86 of file itkGradientDescentOptimizerv4.h.
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::Superclass = GradientDescentOptimizerBasev4Template<TInternalComputationValueType> |
Definition at line 87 of file itkGradientDescentOptimizerv4.h.
|
protected |
Default constructor
|
override, protected, default
Destructor
|
protected, virtual
Advance one step following the gradient direction. Includes transform update.
Reimplemented in itk::ConjugateGradientLineSearchOptimizerv4Template< TInternalComputationValueType >, itk::GradientDescentLineSearchOptimizerv4Template< TInternalComputationValueType >, itk::LBFGS2Optimizerv4Template< TInternalComputationValueType >, itk::QuasiNewtonOptimizerv4Template< TInternalComputationValueType >, and itk::RegularStepGradientDescentOptimizerv4< TInternalComputationValueType >.
|
virtual |
Option to use ScalesEstimator for learning rate estimation at each iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is false.
|
virtual |
Option to use ScalesEstimator for learning rate estimation only once, during first iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is true.
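A minimal sketch (the helper function name is hypothetical): assuming a ScalesEstimator has already been assigned, per-iteration learning rate estimation can be enabled as follows.

#include "itkGradientDescentOptimizerv4.h"

// Hypothetical helper: replace the default estimate-once behavior with
// learning rate estimation at every iteration.
void EstimateLearningRateEachIteration(itk::GradientDescentOptimizerv4 * optimizer)
{
  optimizer->SetDoEstimateLearningRateOnce(false);
  optimizer->SetDoEstimateLearningRateAtEachIteration(true);
}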
|
virtual |
Estimate the learning rate based on the current gradient.
Reimplemented in itk::RegularStepGradientDescentOptimizerv4< TInternalComputationValueType >.
|
virtual |
Get the current convergence value. WindowConvergenceMonitoringFunction always returns the output convergence value in 'TInternalComputationValueType' precision.
Reimplemented in itk::LBFGS2Optimizerv4Template< TInternalComputationValueType >.
|
virtual |
Option to use ScalesEstimator for learning rate estimation at each iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is false.
|
virtual |
Option to use ScalesEstimator for learning rate estimation only once, during first iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is true.
|
virtual |
Set/Get the learning rate to apply. It is overridden by automatic learning rate estimation if enabled. See main documentation.
|
virtual |
Set/Get the maximum step size, in physical space units.
Only relevant when m_ScalesEstimator is set by the user and automatic learning rate estimation is enabled. See the main documentation.
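A minimal sketch (the helper name is hypothetical, and the 0.5 value is an arbitrary example) of capping the per-iteration change in physical units when a ScalesEstimator drives learning rate estimation:

#include "itkGradientDescentOptimizerv4.h"

// Hypothetical helper: bound the estimated step so that no voxel moves more
// than 0.5 physical units per iteration (an arbitrary example value).
void LimitStepSize(itk::GradientDescentOptimizerv4 * optimizer)
{
  optimizer->SetMaximumStepSizeInPhysicalUnits(0.5);
}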
|
override, virtual
Reimplemented from itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >.
Reimplemented in itk::LBFGS2Optimizerv4Template< TInternalComputationValueType >, itk::MultiGradientOptimizerv4Template< TInternalComputationValueType >, and itk::QuasiNewtonOptimizerv4Template< TInternalComputationValueType >.
|
virtual |
Flag. Set to have the optimizer track and return the best metric value and corresponding best parameters that were calculated during the optimization. This captures the best solution when the optimizer oversteps or oscillates near the end of an optimization. Results are stored in m_CurrentMetricValue and in the assigned metric's parameters, retrievable via optimizer->GetCurrentPosition(). This option requires additional memory to store the best parameters, which can be large when working with high-dimensional transforms such as DisplacementFieldTransform.
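A short sketch (the helper name is hypothetical) of enabling this flag and reading back the best solution after the optimizer stops:

#include "itkGradientDescentOptimizerv4.h"
#include <iostream>

// Hypothetical helper: track the best-visited solution during optimization and
// report it once the optimizer stops.
void RunAndReportBest(itk::GradientDescentOptimizerv4 * optimizer)
{
  optimizer->SetReturnBestParametersAndValue(true);
  optimizer->StartOptimization();
  std::cout << "Best metric value: " << optimizer->GetValue() << '\n';
  std::cout << "Best parameters:   " << optimizer->GetCurrentPosition() << '\n';
}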
|
override, protected, virtual
Modify the gradient by learning rate over a given index range.
Implements itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >.
|
override, protected, virtual
Modify the gradient by scales and weights over a given index range.
Implements itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >.
|
static |
New() macro for object creation through a SmartPointer.
|
override, protected, virtual
Methods invoked by Print() to print information about the object including superclasses. Typically not called by the user (use Print() instead) but used in the hierarchical print process to combine the output of several classes.
Reimplemented from itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >.
Reimplemented in itk::LBFGS2Optimizerv4Template< TInternalComputationValueType >, itk::MultiGradientOptimizerv4Template< TInternalComputationValueType >, and itk::QuasiNewtonOptimizerv4Template< TInternalComputationValueType >.
|
override, virtual
Resume the optimization.
Implements itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >.
Reimplemented in itk::LBFGS2Optimizerv4Template< TInternalComputationValueType >, and itk::MultiGradientOptimizerv4Template< TInternalComputationValueType >.
|
virtual |
Flag. Set to have the optimizer track and return the best metric value and corresponding best parameters that were calculated during the optimization. This captures the best solution when the optimizer oversteps or oscillates near the end of an optimization. Results are stored in m_CurrentMetricValue and in the assigned metric's parameters, retrievable via optimizer->GetCurrentPosition(). This option requires additional memory to store the best parameters, which can be large when working with high-dimensional transforms such as DisplacementFieldTransform.
|
virtual |
Window size for the convergence checker. The convergence checker calculates the convergence value by fitting to a window of the energy (metric value) profile.
The default m_ConvergenceWindowSize is set to 50 to pass all tests. It is suggested to use 10 for less stringent convergence checking.
Reimplemented in itk::LBFGS2Optimizerv4Template< TInternalComputationValueType >.
|
virtual |
Option to use ScalesEstimator for learning rate estimation at each iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is false.
|
virtual |
Option to use ScalesEstimator for learning rate estimation only once, during first iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is true.
|
virtual |
Set/Get the learning rate to apply. It is overridden by automatic learning rate estimation if enabled. See main documentation.
|
virtual |
Set/Get the maximum step size, in physical space units.
Only relevant when m_ScalesEstimator is set by the user and automatic learning rate estimation is enabled. See the main documentation.
|
virtual |
Minimum convergence value for convergence checking. The convergence checker calculates the convergence value by fitting to a window of the energy profile. When the convergence value falls below this minimum, the optimization is treated as converged.
The default m_MinimumConvergenceValue is set to 1e-8 to pass all tests. It is suggested to use 1e-6 for less stringent convergence checking.
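For example, a sketch (the helper name is hypothetical) applying the less stringent settings suggested above, a window of 10 and a minimum convergence value of 1e-6:

#include "itkGradientDescentOptimizerv4.h"

// Hypothetical helper: relax the convergence test to the suggested less
// stringent values (window of 10 metric samples, threshold of 1e-6).
void UseRelaxedConvergenceChecking(itk::GradientDescentOptimizerv4 * optimizer)
{
  optimizer->SetConvergenceWindowSize(10);
  optimizer->SetMinimumConvergenceValue(1e-6);
}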
|
virtual |
Flag. Set to have the optimizer track and return the best metric value and corresponding best parameters that were calculated during the optimization. This captures the best solution when the optimizer oversteps or oscillates near the end of an optimization. Results are stored in m_CurrentMetricValue and in the assigned metric's parameters, retrievable via optimizer->GetCurrentPosition(). This option requires additional memory to store the best parameters, which can be large when working with high-dimensional transforms such as DisplacementFieldTransform.
|
override, virtual
Start and run the optimization.
Reimplemented from itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >.
Reimplemented in itk::LBFGS2Optimizerv4Template< TInternalComputationValueType >, itk::MultiGradientOptimizerv4Template< TInternalComputationValueType >, and itk::QuasiNewtonOptimizerv4Template< TInternalComputationValueType >.
|
override, virtual
Stop the optimization.
Reimplemented from itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >.
Reimplemented in itk::MultiGradientOptimizerv4Template< TInternalComputationValueType >.
|
protected |
Definition at line 240 of file itkGradientDescentOptimizerv4.h.
|
protected |
Definition at line 236 of file itkGradientDescentOptimizerv4.h.
|
protected |
Store the best value and related parameters.
Definition at line 239 of file itkGradientDescentOptimizerv4.h.
|
protected |
Definition at line 234 of file itkGradientDescentOptimizerv4.h.
|
protected |
Definition at line 235 of file itkGradientDescentOptimizerv4.h.
|
protected |
Store the previous gradient value at each iteration, so we can detect the changes in gradient direction. This is needed by the regular step gradient descent and Quasi Newton optimizers.
Definition at line 249 of file itkGradientDescentOptimizerv4.h.
|
protected |
Definition at line 242 of file itkGradientDescentOptimizerv4.h.