C++ - optimizing gradient descent.

It's time to refactor. I have the classes `Function`, `Sample`, `Scale`, `SampleMinimum`, `History` and `KineticEnergy`, and the following variables that live outside the (epoch) loop: `Function L`, `double learning_rate`, `History history`, `Sample new_sample`, `Scale scale`, `KineticEnergy energy`, `int vdirection`, `int hdirection`, `std::list<SampleMinimum> extremes`, `best_minimum` and `last_extreme`.
We need a single State object that holds all the internal state needed to find the best global minimum, perhaps `gradient_descent::State`. This class will then contain all of the above variables.
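As a rough sketch, assuming only the class and variable names listed above (the actual layout and interface in the repository may differ), the refactor could look something like this:

```cpp
#include <list>

// Placeholder stand-ins for the project's real classes;
// they only exist so this sketch compiles on its own.
struct Function {};
struct Sample {};
struct Scale {};
struct SampleMinimum {};
struct History {};
struct KineticEnergy {};

namespace gradient_descent {

class State
{
 private:
  Function L_;                                        // The function being minimized.
  double learning_rate_;
  History history_;                                   // Samples collected so far.
  Sample new_sample_;
  Scale scale_;
  KineticEnergy energy_;
  int vdirection_;                                    // Vertical direction.
  int hdirection_;                                    // Horizontal direction.
  std::list<SampleMinimum> extremes_;                 // All extremes found so far.
  std::list<SampleMinimum>::iterator best_minimum_;   // Assumed: iterator into extremes_.
  std::list<SampleMinimum>::iterator last_extreme_;   // Idem.

 public:
  State(Function const& L, double learning_rate) :
    L_(L), learning_rate_(learning_rate),
    vdirection_(0), hdirection_(0),
    best_minimum_(extremes_.end()), last_extreme_(extremes_.end())
  {
  }

  // The epoch loop would then only talk to State, for example:
  //   State state(L, learning_rate);
  //   while (state.do_epoch()) ;
  // bool do_epoch();   // Hypothetical; returns false when finished.
};

} // namespace gradient_descent
```

The point of the refactor is that the epoch loop no longer juggles a dozen loose variables: everything that has to survive between iterations lives in one object that can be passed around, reset, or inspected as a whole.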
You can follow the progress of the project as a whole here: github.com/CarloWood/cairowindow