Jesper Dramsch - Real-world Machine Learning
I build neural networks in Earth science and physics
Danger, here be Pythons. 🐍
I do real-world machine learning and here I share insights and nuggets I find while doing so.
Comments
Tremendous research work, thanks for taking the time to gather all this information and summarize it.
For GPUs: unless you're a deep learning guy with loaded pockets, it is not worth buying an RTX laptop or anything with low VRAM, simply because these things require a LOT of compute. The free Kaggle GPUs are better in that regard.
Can I do Ho 15 s
The YouTube algorithm gods finally decided to give me a relevant video rather than clickbait
Neat! Welcome!
Most people keep pondering whether to buy an SUV when what they actually need is a bicycle.
No one cares if you have a PhD or anything of that sort, just don't say it right at the beginning of the video. No one cares.
Well, it depends on the complexity of the task. Suppose you're working on time series forecasting: you could do that with either scikit-learn or TF/PyTorch, which are ML and DL respectively. By following the TF/PyTorch approach you may get better results thanks to the neural networks, but that approach brings heavy requirements like CUDA for parallel computation to accelerate the process. Meanwhile, if you're satisfied with slightly worse results, you can just stick to classical ML, which doesn't demand much.
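To make the classical-ML route above concrete, here is a minimal sketch of time series forecasting with scikit-learn via lagged features; no GPU or CUDA needed. The sine-wave series, the lag count, and the hyperparameters are all invented for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Synthetic noisy sine wave standing in for a real series
series = np.sin(np.linspace(0, 20, 400)) + rng.normal(0, 0.1, 400)

n_lags = 10
# Turn the series into a supervised problem: [y(t-10) ... y(t-1)] -> y(t)
X = np.array([series[i : i + n_lags] for i in range(len(series) - n_lags)])
y = series[n_lags:]

split = 300  # train on the first 300 windows, test on the rest
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X[:split], y[:split])

mae = np.mean(np.abs(model.predict(X[split:]) - y[split:]))
print(f"test MAE: {mae:.3f}")
```

This runs comfortably on a laptop CPU in seconds, which is the point of the comment: for many forecasting tasks the CPU-only toolchain is good enough.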
I am interested in 4D seismic inversion for predicting Sw, Sg, or pressure. I went through your GitHub but I don't see the data input; could you please share that with me? I am also researching applying ML in oil & gas, especially in geoscience. Many thanks for your sharing!
Could you please recommend any old workstation laptops?
Hugging Face, Keras, Kaggle 🎉🎉
Hi! I am also starting to learn AI and ML now. Can you please help me with a few things? 1) After what amount of time will I need a better laptop, or can I do it on my current laptop? Right now I have an office laptop with an Intel i5 10th-gen U-series processor with integrated graphics. 2) Since I am starting to learn, where should I start for AI & ML? 3) Is the Asus ROG Flow X13 2023 a good option? It has a Ryzen 9 7940HS and an Nvidia RTX 4050 6GB (60W). I want this one because it is super portable and would also help in taking notes since it is a touchscreen. Also, is 16GB RAM enough in the laptop? It would be great if you could help me out a bit. Thanks!
Awesome post. Can you elaborate a bit more on the mechanics of content-based filtering? Thanks a lot
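For anyone else wondering about those mechanics: content-based filtering describes each item by a feature vector and recommends whatever is most similar to what the user already liked. A toy sketch, where the movie names and genre scores are entirely made up:

```python
import numpy as np

# Each item gets a feature vector: [action, romance, sci-fi] scores
items = {
    "movie_a": np.array([1.0, 0.0, 0.5]),
    "movie_b": np.array([0.9, 0.1, 0.6]),
    "movie_c": np.array([0.0, 1.0, 0.1]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def recommend(liked, candidates):
    # Build a user profile as the mean of liked-item vectors,
    # then rank candidates by cosine similarity to that profile.
    profile = np.mean([items[name] for name in liked], axis=0)
    return max(candidates, key=lambda name: cosine(profile, items[name]))

print(recommend(["movie_a"], ["movie_b", "movie_c"]))  # movie_b
```

Real systems use richer features (text embeddings, tags, metadata), but the compare-to-profile mechanic is the same.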
Thank you for a wonderful work and sharing the application of ML in Geoscience. Really appreciate it.
Thanks man
He is so pretty...
I want a video on classification vs clustering, umm, more of a supervised giant vs unsupervised giant!!
Do I need to upgrade my GTX 1660 Ti GPU?
Was looking for an EDC (everyday carry) work laptop that can handle personally scoped AI/ML on my own air-gapped network at home. That way I can keep the models and data private and connect the laptop by cable to train the model when needed. Appreciate this extremely detailed information and how it all relates to AI and ML. My very own private air-gapped AI 😀 Sounds like I'm better off setting up a tower AI build and going with maybe a laptop with one of the new Intel Core Ultras for on-the-go everyday carry.
Such a helpful video! I need an update for 2024 products. Can't decide on which laptop to buy.
Thanks! Honestly it mostly still holds 😅
Great explanation video! One thing I would have liked to hear more about is the dominance of Nvidia's CUDA framework. It seems to me that a lot of ML Python libraries are compiled to work with CUDA and therefore need to run on Nvidia hardware. That's the advantage Nvidia has: it started developing the CUDA framework 20 years ago and was miles ahead of everyone else in the field of deep learning. As you said, things like TensorFlow are just starting to get Apple Silicon/aarch64/arm64 support. But Nvidia continues to innovate with RAPIDS (cuDF vs. pandas) and with NVLink on their DGX A100 and DGX H100 (8 GPUs with 80GB VRAM each, all linked together).

However, with respect to a laptop for ML, would it make sense from the perspective of a DevOps use case? Rather than using the laptop to train a huge LLM (Llama 2, Falcon 40B, Mistral, etc.), what if I just want to test a few of the prepackaged Nvidia NGC containers in Docker, add some additional Python packages/libraries to them, test training a smaller model on a smaller dataset to confirm that things work, and then move the container over to a cloud like Amazon AWS and run it on Nvidia A100 or DGX A100 resources to do the full training? Would laptops with Nvidia GPUs (for Docker, Kubernetes, VMware) be useful for DevOps testing purposes, or not at all? Thanks.
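For what it's worth, the CUDA-lock-in situation described above usually shows up in code as a device check at the top of a training script: CUDA if an Nvidia GPU is there, Apple's MPS backend on Apple Silicon, otherwise CPU. A minimal sketch of that priority logic as plain Python (in real PyTorch code the flags would come from `torch.cuda.is_available()` and `torch.backends.mps.is_available()`):

```python
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    """Return the device string a typical training script would use."""
    if cuda_available:   # Nvidia GPU via the CUDA stack
        return "cuda"
    if mps_available:    # Apple Silicon via Metal Performance Shaders
        return "mps"
    return "cpu"         # always-available fallback

print(pick_device(False, True))  # mps
```

The asymmetry is the point of the comment: "cuda" is the fast path most libraries optimize for, and everything else is a fallback of varying maturity.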
Thank you Jesper!
Thank you 🙏 really helpful ❤❤❤
Good content, but ... why the background music?
Loved the way you explained this. I also thought of using an autoencoder to find, say, data that is similar to other data in ways I didn't think of, not just finding outliers.
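The reconstruction-error idea behind that can be sketched without any deep learning framework: PCA acts as a stand-in for a linear autoencoder here. Compress to a few components, reconstruct, and the points that reconstruct worst are the ones that break the learned structure. The data and the planted outlier below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
X[:, 3] = X[:, 0] * 2              # hidden structure the "encoder" can learn
X[0] = [3.0, 0.0, 0.0, -6.0, 0.0]  # planted outlier that breaks that structure

Xc = X - X.mean(axis=0)
# "Encode" onto the top-4 principal components, then "decode" back
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
W = Vt[:4]
recon = (Xc @ W.T) @ W
errors = np.linalg.norm(Xc - recon, axis=1)

print(int(np.argmax(errors)))  # index of the planted outlier
```

A nonlinear autoencoder generalizes the same recipe: the bottleneck learns what "normal" looks like, and both outliers (high error) and unexpected neighbors in the latent space fall out of it.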
Thank you! You just saved me from going broke 😅 I was thinking of getting a PC but was confused about which graphics card to get
Hi! I am planning to buy a laptop and have some questions. Can you share your email? I want to elaborate, and the comment section is not for that. @Jesper
on a laptop???
For Windows, when you say RAM, do you mean system RAM or GPU VRAM?
Is there a big difference between dedicated and integrated? I want to buy an ExpertBook with a Core i7 and 32GB RAM, but it has Iris Xe. Is this GPU enough for my work, as you said?
I am in a bootcamp right now. I have a master's in philosophy, and because of my interest in the ethics of AI I enrolled in a 3-month intensive full-time bootcamp for data science. I got it paid for, but I have to live off my savings. The content and pace are tough for someone without a formal CS education. I study on weekends too and it takes most of my time. Yet at this point, halfway through, I cannot imagine being job-ready at the end. Question: do you think people like me are even sought after in the job market (living in Berlin, Germany)?
Thanks, good stuff. You're a good teacher.
Thanks!
With the release of AMD's new APU lineup this month, among them a neural processing unit (NPU), I was wondering whether you could do a review of this chip, either in laptop or desktop form, and compare it to any of the Apple and Windows products you own.
What is GPU?
A graphics card. "Graphics Processing Unit"
@@JesperDramsch Alright. Thanks :)
pytorch🔥🔥🔥
Really nice video bro
My man out here looking beautiful fr
That's a good one. You can also do deep networks on a CPU, and for small models it's even faster than a GPU. About using the cloud I have a different take: I'm using one computer for browsing and one for training. 😂
I'm leaning toward a laptop with a Ryzen 7 Pro 6850U. Is AMD good for someone beginning to learn machine learning? Hoping for your response.
Also, for doing neural networks the GPU is not always needed. I was playing with a small model and then tried my brother's computer to see if I could speed everything up with the GPU, and it wasn't faster.
Agreed!
I don't know how to explain this, but when I see you, I see a German senior C++ engineer for some reason
Hey, can I get a way to contact you directly? Because I'm having real trouble choosing an IT field: I don't know whether to do AI or software engineering, and both are hard for me
Urgent!! Is an i5 12th-gen H-series, 16GB laptop with Intel Iris Xe graphics good for neural network training?
No. You'll definitely want a modern Nvidia GPU to do deep learning (neural networks).
Please, what do you advise? I want to get a laptop for DeepFaceLab 2.0 to make deepfake videos. Here are my laptop specs: MSI Raider GE78HX, Core i7-13700HX, Nvidia RTX 4070 with 8GB VRAM, 32GB of memory, 1TB. The second laptop is an MSI Stealth GS77: Intel Core i7 12th gen, Nvidia RTX 3070 Ti with 8GB VRAM, 32GB DDR5 memory, 1TB. Are these specs good enough to run DeepFaceLab and get decent results?
🐍 Python + 🐼 pandas + 🤖 ai
It's between a Radeon 6700 XT and an Nvidia 4060 Ti. I want to game and be able to train fast on my PC. 😫
I think Radeon cards can't do machine learning? Correct me if I'm wrong
@@romo6015 I think you're right. 🙂 I ended up buying the 4060 Ti and I am so glad I did lol. It runs great
@@df6148 The 16GB VRAM version?? Good for you
@@df6148 Did you get the 8GB or the 16GB? How is it so far? I am trying to get a GPU to learn machine learning
You're wrong, but Nvidia is better for machine learning @@romo6015
We know eGPUs perform poorly for high-end games because of the constant GPU traffic through the limited bandwidth of Thunderbolt, but what about ML/LLM training? I'm curious to learn about others' experience with Thunderbolt 4 and perhaps an RTX 40x0 series or higher GPU. Is ML believed to hit limitations as significant as games do?
Boss, you just opened my eyes, otherwise I was about to blow a whopping $3,000 on an i9, RTX 4070 laptop with 16 gigs of RAM for training my thesis algorithm for object detection.
In fact, MacBooks are not that expensive: maybe 1.3x the price of a Dell or 2x a new Acer :) and a lightly used MacBook with a nice layout/look can go well under 1000 euro. I still don't recommend them, but they are not that expensive, as long as you don't need 10 MacBooks.
Excellent information!