Enthought

Enthought is a global consulting and software company that powers digital transformation for science. As the creators of the SciPy package and founders of the conference for Scientific Computing with Python, we’ve been leaders of scientific software development for over 20 years.

Our technology and deep scientific expertise enable faster discovery and continuous innovation. We solve complex problems for the most innovative and respected organizations across the life sciences, materials science, chemical, semiconductor, and energy industries. Enthought helps companies leverage data strategy, modeling, simulation, machine learning, and AI to accelerate scientific discovery and uncover new revenue opportunities through the transformation of people, processes, and technologies.

enthought.com #digitaltransformation #python #scipy #machinelearning #ai #training #software #datascience #materialsinformatics #materialsscience #bioinformatics #biopharmaceutics #semiconductor

Comments

  • @kamalnathkadirvel2691
    15 hours ago

    14:05 Multiple input elements

  • @user-nd7zh6gy6b
    20 hours ago

    It's interesting for a novice in the field; I thought you would present multi-objective convex optimization with non-standard forms of objective functions.

  • @PaulIvanov314
    1 day ago

    54:15 is where you get your waffles!

  • @MahmoudMohamed-lo7oz
    6 days ago

    😊

  • @thenerdguy9985
    11 days ago

    Great Talk

  • @caine7024
    12 days ago

    Very useful tool. It's such a freaking pain to merge Jupyter notebooks D:

  • @wolpumba4099
    16 days ago

    *Summary*

    *Overall Tutorial:*
    * *Focus (0:03):* Practical application and problem-solving for numerical optimization using Python libraries.
    * *Libraries:* `scipy.optimize`, `estimagic`, and `jaxopt`.
    * *Exercises (0:36):* Hands-on Jupyter notebooks with examples of common optimization issues and their solutions.
    * *Prerequisites:* Basic Python, NumPy, and function definition knowledge.

    *Library Breakdown:*
    * *`scipy.optimize` (0:03):*
      * Simple, mature, and reliable starting point.
      * Provides access to 14 local optimizers suitable for various optimization problems.
      * Parameters are 1D NumPy arrays.
      * Lacks features like parallelization, interactive feedback, and flexible parameter representation.
    * *`estimagic` (0:31):*
      * Built on top of `scipy.optimize` and other libraries (`nlopt`, `tao`, `pygmo`, etc.), providing a harmonized interface.
      * Offers a wider range of optimizers and advanced features, including:
        * Flexible parameter representation using dictionaries, Pandas Series/DataFrames, and nested structures.
        * Interactive dashboard, logging, and visualization tools.
        * Built-in scaling and constraint-handling mechanisms.
        * Support for global optimization techniques.
      * Emphasizes informed algorithm choice and robust convergence assessment.
    * *`jaxopt` (0:47):*
      * Utilizes JAX for automatic differentiation, JIT compilation, and GPU acceleration.
      * Provides differentiable optimizers, enabling gradient-based approaches with high precision and speed.
      * Excels at solving many instances of similar optimization problems efficiently through vectorization.

    *Key Concepts and Exercise Highlights:*
    * *Criterion Functions (3:28):* Defining optimization targets as Python functions.
    * *Start Parameters (5:26):* Understanding their importance and setting them appropriately.
    * *Algorithm Selection (30:11):* Choosing the right algorithm based on function properties (differentiability, complexity, constraints, size). Exercises involve identifying and fixing optimization failures by switching algorithms.
    * *Scaling (40:22):* Recognizing the impact of poorly scaled problems and using `estimagic`'s scaling capabilities to improve performance. Visualizing scaling issues with slice plots.
    * *Benchmarking (53:32):* Comparing optimizer performance on a set of benchmark problems with known optima. Utilizing profile plots and convergence plots for analysis.
    * *Bounds and Constraints (1:08:48):* Using bounds, fixed parameters, and linear constraints to define the optimization problem. Exercises involve implementing these constraints in `estimagic`.
    * *Automatic Differentiation (49:59):* Employing JAX to calculate gradients efficiently and accurately. Implementing JAX gradients within `estimagic`.
    * *Global Optimization (1:29:49):* Briefly introducing techniques like genetic algorithms, Bayesian optimization, and multi-start optimization.
    * *Vectorization with JAX (2:00:19):* Utilizing `jaxopt` and the `vmap` function transformation to solve multiple optimization problems concurrently.

    *Exercise Breakdown with Timestamps:*

    *Exercise 1: First Optimization with `scipy.optimize` (6:57):*
    * *Goal:* Familiarize yourself with basic optimization in Python using `scipy.optimize.minimize`.
    * *Task:*
      * Translate a mathematical criterion function (a function of multiple variables to be minimized) into Python code.
      * Set up starting parameters for the optimization.
      * Use `scipy.optimize.minimize` to find the minimum of the function.
    * *Key Takeaway:* You learn the essential steps involved in setting up and solving a basic optimization problem in Python. (A minimal code sketch follows after this summary.)

    *Exercise 2: Convert Previous Example to `estimagic` (13:45):*
    * *Goal:* Experience the advantages of `estimagic`'s interface and features.
    * *Task:*
      * Convert the criterion function and starting parameters from Exercise 1 to work with `estimagic.minimize`.
      * Use dictionaries instead of flat arrays to represent parameters, taking advantage of `estimagic`'s flexibility.
      * Plot the optimization history using `estimagic`'s built-in plotting functions (`criterion_plot` and `params_plot`) to visualize the optimization process.
    * *Key Takeaway:* You become comfortable with `estimagic`'s syntax, understand how to represent parameters flexibly, and learn to use visualization tools for analyzing optimization runs. (See the `estimagic` sketch below.)

    *Exercise 3: Play with Algorithm and `algo_options` (30:26):*
    * *Goal:* Develop an intuition for choosing appropriate algorithms and understanding their impact on optimization success.
    * *Task:*
      * You receive code snippets for two optimization problems, each with a pre-selected algorithm that *appears* to succeed but produces incorrect results.
      * Analyze the criterion functions to understand why the initial algorithm choice fails.
      * Choose a different algorithm (and potentially fine-tune `algo_options`) that successfully finds the true minimum.
    * *Key Takeaway:* You gain a deeper understanding of the strengths and weaknesses of different optimization algorithms and learn how to diagnose and address optimization failures caused by inappropriate algorithm choices.

    *Exercise 4: Benchmarking Optimizers (54:53):*
    * *Goal:* Learn to systematically compare optimizers and understand their relative performance on different types of problems.
    * *Task:*
      * Use `estimagic`'s benchmarking tools to run a set of benchmark problems with various optimizers.
      * Visualize the results using profile plots (showing the share of problems solved over the number of function evaluations) and convergence plots (detailing the convergence paths for individual problems).
      * Compare different implementations of the Nelder-Mead algorithm to see how implementation details can affect performance.
    * *Key Takeaway:* You gain experience with benchmarking optimizers, understand how to interpret benchmark results, and learn to appreciate the importance of choosing well-implemented algorithms.

    *Exercise 5: Constrained Optimization (1:24:31):*
    * *Goal:* Implement bounds and constraints to define a more realistic optimization problem.
    * *Task:*
      * Use `estimagic`'s constraint-handling features to:
        * Set upper and lower bounds on specific parameters.
        * Fix certain parameters at their starting values.
        * Implement a linear constraint on the average of a subset of parameters.
      * Solve the constrained optimization problem and compare the results to the unconstrained case.
    * *Key Takeaway:* You learn to define and solve constrained optimization problems in `estimagic` and understand the impact of constraints on the solution.

    *Exercise 6: Scaling of Optimization Problems (timestamp not available):*
    * *Goal:* Visualize and address the challenges posed by poorly scaled optimization problems.
    * *Task:*
      * Work with a badly scaled benchmark problem.
      * Use `estimagic`'s `slice_plot` function to visualize the sensitivity of the criterion function to changes in each parameter.
      * Run the optimization with and without scaling (`scaling=True` in `estimagic.minimize`) and compare the results using a criterion plot.
    * *Key Takeaway:* You understand the concept of scaling in optimization, learn to recognize scaling issues through visualization, and experience how `estimagic`'s scaling feature can significantly improve optimizer performance.

    *Exercise 7: Using JAX Derivatives in `estimagic` (1:53:58):*
    * *Goal:* Integrate JAX's automatic differentiation capabilities into `estimagic` for faster and more precise gradients.
    * *Task:*
      * Translate a criterion function to use JAX arrays (`jnp`).
      * Compute the gradient of the function using `jax.grad` and optionally JIT-compile it for a further speedup.
      * Solve the optimization problem using `estimagic.minimize`, passing the JAX gradient as the `derivative` argument.
    * *Key Takeaway:* You learn to combine the strengths of `estimagic` and JAX, demonstrating how automatic differentiation can be seamlessly integrated to enhance optimization performance. (See the JAX sketch below.)

    *Exercise 8: Vectorized Optimization in `jaxopt` (2:00:19) (Optional):*
    * *Goal:* Explore `jaxopt`'s capabilities for solving multiple instances of the same optimization problem concurrently.
    * *Task:*
      * Define a wrapper function that encapsulates the `jaxopt` optimization process for a single problem instance.
      * Use JAX's `vmap` function transformation to vectorize the wrapper function, enabling it to handle batches of problems.
      * Solve a set of problems with varying parameters efficiently using the vectorized solver.
    * *Key Takeaway:* You gain exposure to `jaxopt` and understand how to leverage JAX's vectorization features for situations where you need to solve many similar optimization problems. (See the `jaxopt` sketch below.)

    These exercises offer a comprehensive, hands-on approach to learning practical numerical optimization in Python, covering a wide range of topics from basic problem setup to advanced techniques using JAX and `jaxopt`. They are designed to build your intuition, problem-solving skills, and confidence in tackling real-world optimization challenges.

    I used Gemini 1.5 Pro to summarize the transcript.
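    A minimal sketch of the Exercise 1 workflow with `scipy.optimize.minimize`. The criterion function here is a toy example (sum of squared deviations), not the one from the notebook:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def criterion(x):
        # Toy criterion: sum of squared deviations from the vector [0, 1, 2].
        return np.sum((x - np.arange(3)) ** 2)

    x0 = np.ones(3)  # start parameters as a flat 1D NumPy array, as scipy expects
    res = minimize(criterion, x0, method="L-BFGS-B")
    print(res.x)    # approximately [0, 1, 2]
    print(res.fun)  # approximately 0
    ```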
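    A sketch of the Exercise 2 idea: the same kind of toy problem expressed for `estimagic`, with dict-valued parameters and a history plot. The algorithm name and the exact plotting call are my assumptions about estimagic's interface, not the notebook solution:

    ```python
    import numpy as np
    import estimagic as em

    def criterion(params):
        # Params are a dict rather than a flat array; estimagic handles the flattening.
        return np.sum(params["a"] ** 2) + (params["b"] - 1.0) ** 2

    start_params = {"a": np.array([1.0, 2.0]), "b": 5.0}

    res = em.minimize(
        criterion=criterion,
        params=start_params,
        algorithm="scipy_lbfgsb",  # assumed algorithm name; any suitable local optimizer works
    )

    print(res.params)             # optimum reported in the same dict structure
    fig = em.criterion_plot(res)  # history of criterion values over the optimization
    fig.show()
    ```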
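    A sketch of the Exercise 7 idea: write the criterion with JAX arrays, take its gradient with `jax.grad`, and hand that gradient to `estimagic.minimize` through the `derivative` argument (the argument name comes from the talk; the criterion and algorithm choice are toy assumptions):

    ```python
    import jax
    import jax.numpy as jnp
    import estimagic as em

    def criterion(x):
        # Criterion written with jax.numpy so JAX can differentiate it.
        return jnp.sum((x - jnp.arange(3.0)) ** 2)

    gradient = jax.jit(jax.grad(criterion))  # JIT-compile the gradient for extra speed

    res = em.minimize(
        criterion=criterion,
        params=jnp.ones(3),
        algorithm="scipy_lbfgsb",  # assumed; any gradient-based optimizer applies
        derivative=gradient,
    )
    print(res.params)
    ```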
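    A sketch of the Exercise 8 idea: wrap a single `jaxopt` solve in a function and vectorize it over a batch of problem instances with `jax.vmap`. The choice of `jaxopt.LBFGS` is mine for concreteness; the tutorial may use a different solver:

    ```python
    import jax
    import jax.numpy as jnp
    import jaxopt

    def criterion(x, target):
        # One problem instance: minimize the squared distance to a given target vector.
        return jnp.sum((x - target) ** 2)

    def solve_one(target):
        # Wrapper that solves a single instance starting from zeros.
        solver = jaxopt.LBFGS(fun=criterion)
        return solver.run(jnp.zeros_like(target), target=target).params

    # Vectorize the wrapper so a whole batch of targets is solved in one call.
    batched_solve = jax.vmap(solve_one)
    targets = jnp.arange(12.0).reshape(4, 3)  # 4 problem instances with 3 parameters each
    solutions = batched_solve(targets)        # shape (4, 3), each row close to its target
    print(solutions)
    ```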

  • @salimtlemcani4122
    19 days ago

    Awesome presentation! Do you provide solutions to the exercises, please?

  • @user-tg2gm1ih9g
    22 days ago

    The mere fact that everyone and his brother is working on a way to speed up Python makes it *crystal* clear that Python is unusable except for throw-away prototypes or maybe a bignum calculator.

  • @kisho2679
    24 days ago

    How can an external LaTeX file be called/included/embedded in a JupyterLab cell?

  • @ravindarmadishetty736
    28 days ago

    Such a nice presentation on Plotly. In one section, while following the code, I don't see any response from fig.show('json') at 12:53 (root, data, etc.). Please help me if anything else is required.

  • @mohammadgaeini4500
    28 days ago

    It was very helpful. Thank you.

  • @JJayaraj-fs1di
    1 month ago

    Hi. I have a regular Python dictionary whose values are lists of numbers. Is there a way to convert this dictionary into a Numba typed dictionary?

  • @JoseCostas-nd8np
    1 month ago

    Please remove this noise; there is no way to hear anything with such poor quality.

  • @kcvinu
    1 month ago

    I tested Cython with Python 3.11. I just created a GUI library with Cython, and it seems like a nice language and a nice bridge between C and Python. But surprisingly, "ctypes" ran faster than Cython. Yes! I wrote the same GUI lib in Odin & C3, then called the functions from Python with ctypes. It was 2.5 times faster than Cython. Both Odin & C3 are newer languages with manual memory management, and both aim to be alternatives to C. Because of this performance difference, I checked my Cython code again and again. I realized that type conversion takes more time in Cython. But the ctypes module in CPython 3.11 is marvelous.

  • @mmorpe
    1 month ago

    Thanks, this was fast. A little difficult for newcomers, but it was great.

  • @rithanyabalamurali6936
    2 months ago

    How do I install Mayavi? It always shows "file not responding"!

  • @maximecros4090
    2 months ago

    Can you implement your own manifold with geomstats and then customize your metric and connection, or is that not possible?

  • @xmurisfurderx
    2 months ago

    why isn't there some kind of goddamn ISO standard already

  • @zapy422
    2 months ago

    How are the dependencies managed?

  • @user-mv3im2fi4f
    2 months ago

    I started out understanding nothing and finished understanding nothing, and then some.

  • @Rohull-12345
    2 months ago

    Very good❤❤❤🎉🎉🎉🎉

  • @andresfelipehiguera785
    2 months ago

    Easy. Uninstall MATLAB and use Python.

  • @AdityaBhoite-vj1lb
    3 months ago

    Can you guys provide the materials for this tutorial? The link in the description is not working.

  • @MrMate12345
    3 months ago

    Didn't expect to understand the (I know, very, very) basics of gene expression from a NumPy tutorial. Thank you.

  • @code2compass
    3 months ago

    Ahhh, such a polite teacher, and the way she talks and explains. OMG, she and people like her are really a gift to our society. Stay safe, keep teaching, and keep smiling. Thank you.

  • @convexset
    3 months ago

    Useful.

  • @sandipdas7206
    3 months ago

    Is nobody gonna talk about how swiftly he switched from Mac to Windows?

  • @diodin8587
    4 months ago

    22:06 pybind11

  • @codejunkes4607
    4 months ago

    40 degrees C, where are you?

  • @mmmhorsesteaks
    4 months ago

    "the posterior is what we're interested in" - brother knows what's up!

  • @AkashKumar-lr6hc
    4 months ago

    thanks for the presentation

  • @YesSirPi314
    4 months ago

    Government using python notebooks

  • @AndrewDArcyEvans
    4 months ago

    Potentially such an interesting talk. Do you have a link without the masks and with better sound?

  • @bephrem
    5 months ago

    great talk

  • @spinj3307
    5 months ago

    How is this library installed? I've really never successfully installed it on any platform that I've used.

  • @pyajudeme9245
    5 months ago

    Awesome!

  • @mattpopovich
    5 months ago

    This is a very impressive demonstration. I wish it was recorded at a higher resolution. Thank you for sharing.

  • @MaskedPixel
    5 months ago

    Yup. Using this one tomorrow.

  • @alerdoballabani8322
    5 months ago

    Very good explanation. Unfortunately, I cannot access the study material.

  • @piotr780
    5 months ago

    Tfx is useless junk😂

  • @pyajudeme9245
    5 months ago

    Best project ever! Keep on!

  • @flowy-moe
    5 months ago

    Would someone be able to share the Jupyter Notebooks? The link in the description is not working for me ...

  • @stevehageman6785
    5 months ago

    well done talk, thanks! :-)

  • @affable.pebble
    6 months ago

    thank you for this concise explanation!

  • @djangoworldwide7925
    6 months ago

    Coming from R I was like dang

  • @forheuristiclifeksh7836
    6 months ago

    2:17:20

  • @forheuristiclifeksh7836
    6 months ago

    2:01:17

  • @forheuristiclifeksh7836
    6 months ago

    52:27

  • @forheuristiclifeksh7836
    6 months ago

    1:03