NLP & AI database integration | Get Insights from database using NLP | Chat with database | AI | NLP

Science & Technology

Get ready for some exciting tech🚀!
In this video we're building a Streamlit app that helps you get insights from a SQL database using natural language, powered by Natural Language Processing (NLP)! Imagine being able to ask questions like "What's the total sales?" or "Which products are most popular?" and getting instant answers from your SQL database, all without leaving your local machine!
We are using Llama 3, an open-source Large Language Model (LLM) that runs locally on our machine. This means we keep our data safe and secure within our own network. Here is how you can set up Ollama and OpenWebUI for the local LLM setup:
Ollama: • How to run LLM Locally...
OpenWebUI: • Build custom private c...
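Once Ollama is serving Llama 3 locally, a quick sanity check from Python might look like the minimal sketch below. It assumes Ollama is listening on its default port (11434), the llama3 model has been pulled, and the langchain-community package is installed.

# Minimal connectivity check (sketch): assumes `ollama pull llama3` has been run
# and the Ollama server is running on its default port (11434).
from langchain_community.chat_models import ChatOllama

llm = ChatOllama(model="llama3", temperature=0)
print(llm.invoke("Reply with the single word: ready").content)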
Link to AI Playlist: hnawaz007.github.io/ai.html
DBT series for database development: hnawaz007.github.io/mds.html
How to install Postgres & restore sample database: • How to install Postgre...
Follow the step-by-step guide on how to build this app and unlock the power of natural language insights from your data.
Link to GitHub repo: github.com/hnawaz007/pythonda...
#ai #chatwithdatabase #opensourceai
Link to Channel's site:
hnawaz007.github.io/
--------------------------------------------------------------
💥Subscribe to our channel:
/ haqnawaz
📌 Links
-----------------------------------------
#️⃣ Follow me on social media! #️⃣
🔗 GitHub: github.com/hnawaz007
📸 Instagram: / bi_insights_inc
📝 LinkedIn: / haq-nawaz
🔗 / hnawaz100
🚀 hnawaz007.github.io/
-----------------------------------------
Topics in this video (click to jump around):
==================================
0:00 - Overview of the App
1:18 - Custom LLM for SQL
2:40 - Develop LangChain Chain
3:52 - First Chain: Generate SQL
3:53 - Second Chain: Database Interface
5:55 - Streamlit App
6:46 - Test the NLP and AI database integration
8:01 - Use Cases for this App
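
For orientation, here is a rough sketch of the two-chain pattern covered in the chapters above: the first chain turns a natural-language question into SQL, the second chain runs that SQL against the database, and Streamlit provides the UI. The class and function names follow langchain / langchain-community (circa v0.1/v0.2) and may differ in your version; the connection string is only a placeholder, and the actual implementation lives in the GitHub repo linked above.

# Sketch only — assumes langchain, langchain-community, streamlit, and a
# Postgres driver such as psycopg2 are installed. The connection string
# below is a placeholder; see the GitHub repo for the real implementation.
import streamlit as st
from langchain_community.chat_models import ChatOllama
from langchain_community.utilities import SQLDatabase
from langchain_community.tools.sql_database.tool import QuerySQLDataBaseTool
from langchain.chains import create_sql_query_chain

# Database interface (the second chain's target).
db = SQLDatabase.from_uri("postgresql://postgres:postgres@localhost:5432/dvdrental")
llm = ChatOllama(model="llama3", temperature=0)

# First chain: natural-language question -> SQL query.
write_query = create_sql_query_chain(llm, db)
# Second chain: execute the generated SQL against the database.
execute_query = QuerySQLDataBaseTool(db=db)
chain = write_query | execute_query

# Streamlit front end.
st.title("Chat with your database")
question = st.text_input("Ask a question about your data")
if question:
    st.write(chain.invoke({"question": question}))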

Comments: 6

  • @GordonShamway1984 (17 days ago)

    Wonderful as always and just in time. I was going to build a similar use case next week that auto-generates database docs for business users. This comes in handy 🎉 Thank you again and again.

  • @BiInsightsInc (17 days ago)

    Glad it was helpful! Happy coding.

  • @mohdmuqtadar8538 (16 days ago)

    Great video! What if the response from the database exhausts the context window of the model?

  • @BiInsightsInc (16 days ago)

    Thanks. If you are hitting the model's maximum context length, you can try the following:
    1. Choose a different LLM that supports a larger context window.
    2. Brute force: chunk the document and extract content from each chunk.
    3. RAG: chunk the document and only extract content from the subset of chunks that look "relevant".
    Here is an example of these approaches from LangChain: js.langchain.com/v0.1/docs/use_cases/extraction/how_to/handle_long_text/
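
A minimal sketch of option 2 above (brute-force chunking), assuming the same local Llama 3 model served by Ollama via langchain-community; the chunk sizes and prompts are only illustrative, not the approach used in the video.

# Sketch: split an oversized query result into chunks that fit the context
# window, extract what answers the question from each chunk, then combine.
from langchain_community.chat_models import ChatOllama
from langchain_text_splitters import RecursiveCharacterTextSplitter

llm = ChatOllama(model="llama3", temperature=0)
splitter = RecursiveCharacterTextSplitter(chunk_size=2000, chunk_overlap=100)

def summarize_long_result(db_result: str, question: str) -> str:
    chunks = splitter.split_text(db_result)
    partials = [
        llm.invoke(
            f"Question: {question}\n\nData:\n{c}\n\n"
            "Summarize only what is relevant to the question."
        ).content
        for c in chunks
    ]
    return llm.invoke(
        f"Question: {question}\n\nPartial answers:\n"
        + "\n".join(partials)
        + "\n\nGive a final answer."
    ).content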

  • @krishnarajuyoutube (12 days ago)

    Can we run Llama 3 locally on any simple VPS server, or do we need GPUs?

  • @BiInsightsInc (12 days ago)

    Hi, you'd need a GPU to run an LLM. By the way, VPS servers can have GPUs.
