Abonia Sojasingarayar

👋 Hi there! Welcome to my channel!

🚀 I share my learnings and insights from building industry-level AI solutions. Dive into the exciting world of Artificial Intelligence, where I make complex concepts simple and captivating. Whether you're curious about Natural Language Processing, Computer Vision, Machine Learning, Generative AI, or MLOps, we've got you covered!

💡 With over 7 years of hands-on experience, I'm here to share practical insights and industry-level applications in an easy and engaging way.

🎓 Let's learn, innovate, and shape the future together!

Comments

  • @World-um5vo · 6 days ago

Hi, thank you for the video. If we want to fine-tune the model and evaluate it on videos, how do we do that?

  • @AboniaSojasingarayar · 4 days ago

You're most welcome. Here I have introduced the basic usage of the SAM 2 models. If you want to evaluate your fine-tuned model, you can compute the mean IoU score over a set of predictions and targets, or metrics such as Dice, precision, recall, and mAP.
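    For instance, IoU and Dice over binary segmentation masks take only a few lines of NumPy. A minimal sketch (the function names and mask shapes are illustrative, not from the video):

```python
import numpy as np

def iou_score(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection over Union for a pair of binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    union = np.logical_or(pred, target).sum()
    return float(np.logical_and(pred, target).sum() / union) if union else 1.0

def dice_score(pred: np.ndarray, target: np.ndarray) -> float:
    """Dice coefficient: 2|A∩B| / (|A| + |B|)."""
    pred, target = pred.astype(bool), target.astype(bool)
    denom = pred.sum() + target.sum()
    return float(2 * np.logical_and(pred, target).sum() / denom) if denom else 1.0

def mean_iou(preds, targets) -> float:
    """Mean IoU over a set of predictions and ground-truth masks."""
    return float(np.mean([iou_score(p, t) for p, t in zip(preds, targets)]))
```

    The same per-mask scores can then be averaged across video frames to evaluate on videos.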

  • @Basant5911 · 15 days ago

Streaming doesn't work by doing this. I wrote the code from scratch without LangChain.

  • @AboniaSojasingarayar · 13 days ago

@Basant5911 Can you share your codebase and the error or issue you are currently facing, please?

  • @DenisRothman · 15 days ago

    ❤Thank you for this fantastic educational video on my book!!! 🎉

  • @AboniaSojasingarayar · 15 days ago

@DenisRothman Thank you for your kind words. I'm grateful for the opportunity to review the book and share my thoughts. The praise is well deserved; it's truly one of the most insightful books I've read.

  • @MohamedMohamed-xf7wh · 27 days ago

You used a webpage as the data source for the RAG app. What if I use a PDF file instead of the webpage as the data source? How can I deploy that in AWS Lambda?

  • @AboniaSojasingarayar · 27 days ago

To build RAG with a PDF in the AWS ecosystem, you need to follow steps that involve uploading the PDF to an S3 bucket, extracting the text from the PDF, and then integrating this data with your RAG application.
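    As a rough illustration of the middle step: once the PDF text has been extracted (for example with a PDF parsing library after downloading the file from S3 via boto3), it is typically split into overlapping chunks before being embedded into a vector store. A minimal sketch of that chunking step (the function name and default sizes are illustrative):

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list:
    """Split extracted PDF text into fixed-size chunks with a small overlap,
    so context is not lost at chunk boundaries before embedding."""
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + chunk_size, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap  # step back so consecutive chunks share context
    return chunks
```

    Each chunk is then embedded and stored in the vector database your RAG app queries.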

  • @MohamedMohamed-xf7wh · 27 days ago

@AboniaSojasingarayar Can I extract the text from the PDF locally, build the vector DB locally using VS Code, and then build the Docker image and push it to AWS ECR like you did in the video?

  • @AboniaSojasingarayar · 27 days ago

@MohamedMohamed-xf7wh Yes, you can extract text from PDF files locally, build a vector database, and then prepare your application for deployment on AWS Lambda by building a Docker image and pushing it to ECR. But which vector DB are you using? Is it accessible via an API?

  • @MohamedMohamed-xf7wh · 27 days ago

@AboniaSojasingarayar FAISS... what is the problem with the vector DB?

  • @AboniaSojasingarayar · 23 days ago

@MohamedMohamed-xf7wh Great!

  • @htayaung3812 · a month ago

    Really Nice! Keep going. You deserve more subscribers.

  • @AboniaSojasingarayar · a month ago

@htayaung3812 Thank you so much for your support! I'm working on bringing more tutorials.

  • @raulpradodantas9386 · a month ago

Saved my life creating Lambda layers... I had been trying for days. Thanks!

  • @AboniaSojasingarayar · a month ago

@raulpradodantas9386 Glad to hear that! You're most welcome.

  • @SidSid-kp4ij · a month ago

Hi, I'm trying to run my trained model with an interface to a webcam, but I'm getting an error. Can you share any insight on it?

  • @AboniaSojasingarayar · a month ago

@SidSid-kp4ij Hello Sid, sure. Can you post your error message here, please?

  • @gk4457 · 2 months ago

    All the best

  • @RajuSubramaniam-ho6kd · 2 months ago

Thanks for the video. Very useful for me as I am new to AWS Lambda and Bedrock. Can you please upload the Lambda function source code? Thanks again!

  • @AboniaSojasingarayar · 2 months ago

Glad it helped. Sure, you can find the code and the complete article on this topic in the description. In any case, here is the link to the code: medium.com/@abonia/build-and-deploy-llm-application-in-aws-cca46c662749

  • @jannatbellouchi3908 · 2 months ago

Which version of BERT is used in BERTScore?

  • @AboniaSojasingarayar · 2 months ago

As we are using lang="en", it uses roberta-large by default. We can also customize it using the model_type param of the BERTScorer class. For the default models for other languages, see: github.com/Tiiiger/bert_score/blob/master/bert_score/utils.py
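    Under the hood, BERTScore greedily matches each token embedding in the candidate against its most similar token embedding in the reference using cosine similarity. A minimal NumPy sketch of that core computation (the toy embeddings are illustrative; the real library uses contextual embeddings from the model above):

```python
import numpy as np

def bertscore_core(cand: np.ndarray, ref: np.ndarray):
    """Greedy cosine matching over (tokens, dim) embedding matrices,
    as in BERTScore's precision/recall/F1 computation."""
    cand = cand / np.linalg.norm(cand, axis=1, keepdims=True)
    ref = ref / np.linalg.norm(ref, axis=1, keepdims=True)
    sim = cand @ ref.T                  # pairwise cosine similarities
    precision = sim.max(axis=1).mean()  # best reference match per candidate token
    recall = sim.max(axis=0).mean()     # best candidate match per reference token
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```

    Changing model_type simply swaps the model that produces these token embeddings.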

  • @jagadeeshprasad5252 · 2 months ago

Hey, great content. Please continue to do more videos and real-time projects. Thanks!

  • @AboniaSojasingarayar · 2 months ago

    Glad it helped. Sure I am already on it.

  • @zerofive3699 · 2 months ago

    Awesome mam , very easy to understand

  • @NJ-hn8yu · 2 months ago

Hi Abonia, thanks for sharing. I am facing this error; can you please tell me how to resolve it? "errorMessage": "Unable to import module 'lambda_function': No module named 'langchain_community'"

  • @AboniaSojasingarayar · 2 months ago

Hello, you are most welcome. You must prepare your ZIP file with all the necessary packages. You can refer to the instructions starting at 09:04.

  • @humayounkhan7946 · 2 months ago

Hi Abonia, thanks for the thorough guide, but I'm a bit confused by the lambda_layer.zip file. Why did you have to create it through Docker? Is there an easier way to provide the dependencies in a zip file without going through Docker? Thanks in advance!

  • @AboniaSojasingarayar · 2 months ago

Hi Humayoun Khan, yes we can, but Docker facilitates the inclusion of the runtime interface client for Python, making the image compatible with AWS Lambda. It also ensures a consistent and reproducible environment for the Lambda function's dependencies, which is crucial for avoiding discrepancies between development, testing, and production environments. Hope this helps.
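    For reference, building layer dependencies inside the official Lambda Python base image usually looks something like this (a sketch only; the package name and Python version are illustrative, not taken from the video):

```dockerfile
# Install dependencies in an environment matching the Lambda runtime.
# Lambda layers expect Python packages under a top-level python/ directory.
FROM public.ecr.aws/lambda/python:3.11

RUN pip install langchain_community -t /opt/layer/python
```

    After building, the /opt/layer directory is copied out of the container and zipped into lambda_layer.zip; because pip ran on the same OS and Python version as Lambda, compiled dependencies will match the runtime.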

  • @evellynnicolemachadorosa2666 · 3 months ago

Hello! Thanks for the video. I am from Brazil. What would you recommend for large documents, averaging 150 pages? I tried map-reduce, but the inference time was 40 minutes. Are there any tips for these very long documents?

  • @AboniaSojasingarayar · 3 months ago

Thanks for your kind words, and glad this helped. You could implement a strategy that combines semantic chunking with K-means clustering to address the model's contextual limitations. By employing efficient clustering techniques, we can extract key passages effectively, thereby reducing the overhead associated with processing large volumes of text. This approach not only significantly lowers costs by minimizing the number of tokens processed, but also mitigates the recency and primacy effects inherent in LLMs, ensuring a balanced consideration of all text segments.
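    The idea can be sketched as: embed each chunk, cluster the embeddings with K-means, and summarize only the chunk nearest each centroid. A minimal NumPy sketch (in a real pipeline, `embeddings` would come from an embedding model; the k-means here is hand-rolled for self-containment, and a library implementation would do the same):

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain NumPy k-means: alternate assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        labels = np.argmin(((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return centroids, labels

def select_key_chunks(embeddings, k):
    """Return indices of the chunks closest to each cluster centroid;
    these representative chunks are what gets sent to the LLM."""
    centroids, _ = kmeans(embeddings, k)
    picks = {int(np.argmin(((embeddings - c) ** 2).sum(-1))) for c in centroids}
    return sorted(picks)
```

    With, say, k=10 representative chunks from a 150-page document, a single summarization pass replaces the long map-reduce chain, which is where the time and cost savings come from.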

  • @Coff03 · 3 months ago

    Did you use OpenAI API key here?

  • @AboniaSojasingarayar · 3 months ago

Here we use the open-source Mixtral via Ollama. But yes, we can use OpenAI models as well.

  • @MishelMichel · 3 months ago

Very informative, and your voice is very clear.

  • @AboniaSojasingarayar · 3 months ago

    Glad it helped!

  • @fkeb37e9w0 · 3 months ago

Can we use OpenAI and ChromaDB on AWS?

  • @AboniaSojasingarayar · 3 months ago

Yes we can! In the tutorial below, I demonstrate how to create and deploy a Lambda layer via a container for larger dependencies: kzread.info/dash/bejne/mZ2X1cRyoJrbmpc.htmlsi=F_X7-6YCAb0Kz3Jc

  • @fkeb37e9w0 · 3 months ago

@AboniaSojasingarayar Yes, but can this be done without EKS or containers?

  • @AboniaSojasingarayar · 3 months ago

Yes! You can try it by creating a custom Lambda layer. If you face issues, try to use only the required libraries and remove any unnecessary dependencies from your zip file. Hope this helps.

  • @vijaygandhi7313 · 3 months ago

In the abstractive summarization use case, a lot of focus is usually given to the LLMs being used and their performance. Limitations of LLMs, including context length, and ways to overcome them are often overlooked. It's important to make sure that our application is scalable when dealing with large document sizes. Thank you for this great and insightful video.

  • @AboniaSojasingarayar · 3 months ago

    Thank you Vijay Gandhi, for your insightful comment! You've raised an excellent point about the importance of considering the limitations of LLMs in the context of abstractive summarization, especially regarding their context length and scalability issues when dealing with large documents. Indeed, one of the significant challenges in using LLMs for abstractive summarization is their inherent limitation in processing long texts due to the maximum token limit imposed by these models. This constraint can be particularly problematic when summarizing lengthy documents or articles, where the full context might not fit within the model's capacity.

  • @zerofive3699 · 3 months ago

    Really useful info mam , keep up the good work

  • @AboniaSojasingarayar · 3 months ago

    It's my pleasure.

  • @Bumbblyfestyle · 4 months ago

    👍👍

  • @akshaykotawar5816 · 4 months ago

Very informative, thanks for uploading.

  • @AboniaSojasingarayar · 4 months ago

    Glad it helped!

  • @akshaykotawar5816 · 4 months ago

    Nice video

  • @AboniaSojasingarayar · 4 months ago

    Thanks Akshay. Glad it helped!

  • @MishelMichel · 4 months ago

    Nyccc Mam 😍

  • @AboniaSojasingarayar · 4 months ago

    Glad it helped 😊

  • @zerofive3699 · 4 months ago

    Very nice video, learnt a lot

  • @AboniaSojasingarayar · 4 months ago

    Thank you! Glad it helped🤓

  • @user-ht5ev7il3h · 4 months ago

Please do more on AWS Bedrock for developing RAG applications... your explanation is simple and effective... stay motivated and upload more videos about LLMs.

  • @AboniaSojasingarayar · 4 months ago

    Thanks for your kind words! Sure I will do it.

  • @akshaykotawar5816 · 4 months ago

@AboniaSojasingarayar Yes, that's the same thing I want.

  • @AboniaSojasingarayar · 3 months ago

Here is the tutorial link for deploying a Retrieval-Augmented Generation (RAG) app in AWS: kzread.info/dash/bejne/mZ2X1cRyoJrbmpc.html

  • @Bumbblyfestyle · 5 months ago

    👍👍

  • @AboniaSojasingarayar · 4 months ago

    😊😊

  • @zerofive3699 · 5 months ago

    Very informative

  • @AboniaSojasingarayar · 4 months ago

    Glad it was helpful!

  • @zerofive3699 · 5 months ago

    Awesome , thanks

  • @AboniaSojasingarayar · 5 months ago

    🎯 LLM Training Frameworks 🎯

    ✅ Alpa is a system for training and serving large-scale neural networks. Scaling neural networks to hundreds of billions of parameters has enabled dramatic breakthroughs such as GPT-3, but training and serving these large-scale networks requires complicated distributed systems techniques. Alpa aims to automate large-scale distributed training and serving with just a few lines of code.
    📌 Alpa: github.com/alpa-projects/alpa
    📌 Serving OPT-175B, BLOOM-176B and CodeGen-16B using Alpa: alpa.ai/tutorials/opt_serving.html

    ✅ DeepSpeed is an easy-to-use deep learning optimization software suite that enables unprecedented scale and speed for DL training and inference.
    📌 Megatron-LM GPT2 tutorial: www.deepspeed.ai/tutorials/megatron/
    📌 DeepSpeed: github.com/microsoft/DeepSpeed

    ✅ Megatron-LM is a large, powerful transformer developed by the Applied Deep Learning Research team at NVIDIA. The repository below hosts ongoing research on training large transformer language models at scale, developing efficient, model-parallel (tensor, sequence, and pipeline) and multi-node pre-training of transformer-based models such as GPT, BERT, and T5 using mixed precision.
    📌 pretrain_gpt3_175B.sh: github.com/NVIDIA/Megatron-LM/blob/main/examples/pretrain_gpt3_175B.sh
    📌 Megatron-LM: github.com/NVIDIA/Megatron-LM

    ✅ Colossal-AI provides a collection of parallel components. It aims to let us write distributed deep learning models just like we write models on a laptop, with user-friendly tools to kickstart distributed training and inference in a few lines.
    📌 Colossal-AI: colossalai.org/
    📌 Open-source solution replicating the ChatGPT training process, ready to go with only 1.6GB GPU memory and 7.73x faster training: www.hpc-ai.tech/blog/colossal-ai-chatgpt

    ✅ BMTrain is an efficient large-model training toolkit that can be used to train models with tens of billions of parameters. It can train models in a distributed manner while keeping the code as simple as stand-alone training.
    📌 BMTrain: github.com/OpenBMB/BMTrain

    ✅ Mesh TensorFlow (mtf) is a language for distributed deep learning, capable of specifying a broad class of distributed tensor computations. Its purpose is to formalize and implement distribution strategies for your computation graph over your hardware/processors, for example: "Split the batch over rows of processors and split the units in the hidden layer across columns of processors." Mesh TensorFlow is implemented as a layer over TensorFlow.
    📌 Mesh TensorFlow: github.com/tensorflow/mesh

    Please let me know in the comment section if there are any frameworks missing from this curated list 👇

  • @MishelMichel · 5 months ago

    Good dr.....❤

  • @AboniaSojasingarayar · 5 months ago

    Thank you 🙂

  • @AboniaSojasingarayar · 6 months ago

    Top 3 repos to start your LLM learning in 2024:
    1. Awesome-LLM: github.com/Hannibal046/Awesome-LLM
    2. Awesome-LLMOps: github.com/tensorchord/awesome-llmops
    3. awesome-llm: github.com/KennethanCeyer/awesome-llm

  • @tommyshelby6277 · 6 months ago

    thanks!

  • @AboniaSojasingarayar · 6 months ago

    Glad it helped!

  • @vindovirasingarayar6961 · 6 months ago

    👍👍👍

  • @rabelrayar · 6 months ago

    😕👍

  • @gk4457 · 6 months ago

    all the best

  • @Bumbblyfestyle · 6 months ago

    👍👍👍