Introduction To Optimization: Gradient Free Algorithms (1/2) - Genetic - Particle Swarm

Science & Technology

A conceptual overview of gradient free optimization algorithms, part one of two.
This video is part of an introductory optimization series.
TRANSCRIPT:
Hello, and welcome to Introduction To Optimization. This video covers gradient free algorithms.
Gradient based algorithms and gradient free algorithms are the two main types of methods for solving optimization problems. In this video, we will survey a variety of gradient free algorithms and discuss some of the basic ideas behind how they work.
The main difference between gradient based and gradient free algorithms is that gradient free algorithms do not require derivatives. This means that they can be used for optimization problems where derivatives can’t be obtained, or are difficult to obtain. This can include functions that are discrete, discontinuous, or noisy. This makes gradient free algorithms very flexible in the types of problems they can be applied to. The major disadvantage of gradient free algorithms is that they are generally much slower than gradient based algorithms.
There are a huge variety of gradient free algorithms, and many variations on those algorithms. Let’s take a look at some of the most common:
Exhaustive Search
First off, Exhaustive Search.
The simplest, and most inefficient, gradient free optimization method is to try every possible solution and pick the best answer. While this approach may work for very small problems, it quickly becomes computationally intractable as problems grow, because the number of candidate solutions explodes with each added design variable.
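As a rough illustration (not code from the video), a minimal exhaustive grid search in Python might look like the sketch below. The objective is the example function f(x, y) = x^3 + 15x^2 + y^3 + 15y^2 used later in the video, and the [-5, 5] bounds and 0.5 grid spacing are assumptions chosen for illustration.

```python
import itertools

# Example objective used later in the video: f(x, y) = x^3 + 15x^2 + y^3 + 15y^2
def f(x, y):
    return x**3 + 15 * x**2 + y**3 + 15 * y**2

# Assumed illustrative search domain: x, y in [-5, 5], sampled every 0.5.
candidates = [i * 0.5 for i in range(-10, 11)]

# Exhaustive search: evaluate every combination and keep the best (here, the minimum).
best = min(itertools.product(candidates, repeat=2), key=lambda p: f(*p))
print("best (x, y):", best, "f:", f(*best))
```

Even this tiny two-variable example requires 21 × 21 = 441 function evaluations; each additional variable multiplies the count by another 21, which is why exhaustive search breaks down so quickly.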
Genetic Algorithms
Genetic algorithms are another type of gradient free optimization algorithm. Genetic algorithms are based on the ideas of biology and evolution. Instead of proposing just a single solution to an optimization problem, a genetic algorithm generates many possible solutions that form a “population”. For example, if we were trying to optimize x and y, a population would be formed from various combinations of values of x and y. The solutions are scored using a fitness function, or objective function, to decide which solutions are better than others. These candidate solutions are then recombined so that the best solutions “reproduce” to form a new generation of solutions with the best traits of the previous solutions. This continues until improvement stops or the maximum number of generations is reached. Let’s observe the progress of a genetic algorithm on the function x^3 + 15x^2 + y^3 + 15y^2.
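To make the loop concrete, here is a hedged, minimal genetic algorithm sketch in Python for minimizing that function. It is not the implementation shown in the video; the population size, number of generations, mutation rate and scale, and the [-5, 5] bounds are all illustrative assumptions, and real libraries offer many alternative selection, crossover, and mutation schemes.

```python
import random

# Example objective from the video: f(x, y) = x^3 + 15x^2 + y^3 + 15y^2 (minimized here)
def fitness(ind):
    x, y = ind
    return x**3 + 15 * x**2 + y**3 + 15 * y**2

BOUNDS = (-5.0, 5.0)            # assumed illustrative search domain
POP_SIZE, GENERATIONS = 40, 50  # assumed illustrative settings

def random_individual():
    return [random.uniform(*BOUNDS) for _ in range(2)]

def crossover(a, b):
    # Each gene of the child is taken from one of the two parents at random.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ind, rate=0.1, scale=0.5):
    # Occasionally perturb a gene with Gaussian noise, clipped to the bounds.
    return [min(max(g + random.gauss(0, scale), BOUNDS[0]), BOUNDS[1])
            if random.random() < rate else g for g in ind]

population = [random_individual() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    # Score the population and keep the fitter half as parents.
    population.sort(key=fitness)
    parents = population[: POP_SIZE // 2]
    # "Reproduce": recombine random pairs of parents and mutate the offspring.
    children = [mutate(crossover(*random.sample(parents, 2)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = min(population, key=fitness)
print("best (x, y):", best, "f:", fitness(best))
```

A stopping check on lack of improvement could replace the fixed generation count, matching the "continue until improvement stops" idea in the transcript.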
Particle Swarm
Particle swarm is similar to a genetic algorithm in that it creates a population, or in this case a swarm, of possible solutions at each iteration. Each solution, or particle, in the swarm has a direction and velocity. At each iteration, the movement of each particle is determined by a mixture of the direction it is currently moving, the direction of the best point it has found in the past, and the direction of the best point the whole swarm has discovered. The idea is that more and more particles will move towards areas where better solutions are found, and that the swarm will eventually converge to the optimal value.
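As before, the following is only a hedged sketch of the idea rather than the video's code: each particle's new velocity is a weighted mix of its current velocity (inertia), a pull toward the best point that particle has seen, and a pull toward the best point the whole swarm has seen. The inertia weight, the two pull coefficients, the swarm size, and the bounds are assumed illustrative values.

```python
import random

# Same illustrative objective as above.
def f(x, y):
    return x**3 + 15 * x**2 + y**3 + 15 * y**2

BOUNDS = (-5.0, 5.0)
N_PARTICLES, ITERATIONS = 30, 100
W, C1, C2 = 0.7, 1.5, 1.5   # inertia, personal-best pull, swarm-best pull (assumed values)

# Random starting positions, zero starting velocities, personal bests = starting points.
pos = [[random.uniform(*BOUNDS) for _ in range(2)] for _ in range(N_PARTICLES)]
vel = [[0.0, 0.0] for _ in range(N_PARTICLES)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=lambda p: f(*p))[:]

for _ in range(ITERATIONS):
    for i in range(N_PARTICLES):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            # Mix current direction, pull toward this particle's best point,
            # and pull toward the best point the whole swarm has discovered.
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] = min(max(pos[i][d] + vel[i][d], BOUNDS[0]), BOUNDS[1])
        # Update the particle's personal best and the swarm's global best.
        if f(*pos[i]) < f(*pbest[i]):
            pbest[i] = pos[i][:]
            if f(*pbest[i]) < f(*gbest):
                gbest = pbest[i][:]

print("best (x, y):", gbest, "f:", f(*gbest))
```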

Comments: 22

  • @lugaseth3732
    1 year ago

    Thank you for great videos. Concise, engaging, and clear explanations.

  • @sunnymag1093
    6 years ago

    Once again, thank you for such a wonderfully helpful video!

  • @mirzaaslambaig9687
    3 years ago

God bless you so much for all the concepts you cleared. Love your calm voice and the way you teach with good, relevant examples and analogies. Indebted to you.

  • @halomary4693
    4 years ago

    THANK YOU FOR THE AWESOME SERIES. WOW WHAT AN AWESOME EFFORT. THANK YOU SO MUCH.

  • @ThePheonix123
    6 years ago

    Amazing video, explained the concepts so clearly, we need more tutorials like yours :D

  • @alphaopt2024
    6 years ago

    Thanks, glad you enjoyed it!

  • @user-tv4cl7gq1s
    3 years ago

This was a great and simple explanation. Thanks!

  • @krenovaFromSG
    4 years ago

    really good and concise explanation. superb!

  • @twisties.seeker
    4 years ago

    Thank you very much. I really enjoyed watching your video.

  • @joshsavage8659
    6 years ago

    Awesome Video!! Thank you!

  • @kingdeku7674
    3 years ago

    Brilliant. Thank you!

  • @tiagovla
    6 years ago

    That is so coooool! :D

  • @nishah4058
    2 years ago

    The best

  • @PritishMishra
    3 years ago

I request a video on the 'Conjugate Gradient method' and 'Newton's method' of optimization.

  • @hazemahmed8389
    7 years ago

Thank you... very important video. Please could you give the code for the genetic algorithm or particle swarm?

  • @ThePheonix123
    6 years ago

    There are many places online where you can find that. This video and tutorial series is just to explain the concepts clearly.

  • @muhammadawais581
    6 years ago

    where is part 2 ?

  • @alphaopt2024
    6 years ago

    Working on it, should be up soon.

  • @sanzhang2647
    6 years ago

    Looking forward to it! Thanks for making these videos! They're a great overview and pointer for further reading.

  • @alphaopt2024
    6 years ago

    Here we are: kzread.info/dash/bejne/gH1nuc6lotm9n8Y.html

  • @sanzhang2647
    6 years ago

    sweet! thanks!

  • @danielkrajnik3817
    3 years ago

    an animation is worth a thousand images
