Groq - New ChatGPT competitor with INSANE Speed

There is a brand-new AI chatbot platform called Groq that can answer prompts at near real-time speed.
In my testing, it was 3-4 times faster than ChatGPT.
Two important notes before I show you how it works in more detail.
Groq with a Q is not the same as Grok on Twitter.
When I looked into it, it turns out Groq has been around much longer, and they even hold the trademark on the name. They also released an open letter asking Elon Musk to change the name of his AI chatbot, Grok.
Groq is also not a large language model like Grok or ChatGPT. It is built around a language processing unit (LPU), so its purpose is to run AI models, including large language models, on that hardware and deliver speeds other platforms haven't achieved.
The way Groq explains it, its speed comes from its hardware: the LPU is a new kind of chip they created specifically for this purpose.
Most other LLM services run on GPUs.
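
To make that concrete, here is a minimal sketch of how you might query a model hosted on Groq over its OpenAI-compatible HTTP API and time the round trip. The endpoint path, model id, and response shape are assumptions based on Groq's published API, so treat this as illustrative rather than definitive.

```python
# Minimal sketch: query a Groq-hosted model and time the round trip.
# Endpoint, model id, and response shape are assumptions; check the docs.
import os
import time

import requests

API_URL = "https://api.groq.com/openai/v1/chat/completions"  # assumed endpoint
API_KEY = os.environ["GROQ_API_KEY"]                         # your Groq API key

payload = {
    "model": "mixtral-8x7b-32768",  # assumed id of a hosted open model
    "messages": [
        {"role": "user", "content": "Explain what an LPU is in one paragraph."}
    ],
}

start = time.perf_counter()
resp = requests.post(
    API_URL, json=payload, headers={"Authorization": f"Bearer {API_KEY}"}
)
resp.raise_for_status()
elapsed = time.perf_counter() - start

print(resp.json()["choices"][0]["message"]["content"])
print(f"Round trip: {elapsed:.2f}s")
```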

Comments: 49

  • @wonsunparque
    3 months ago

    Groq is a competitor of Nvidia, not OpenAI. Their chips are specialized for processing generative AI, and the significance is not so much in text/typed interaction as in oral conversation/speech. It allows closer-to-real-time conversation, instead of you speaking, waiting, and then ChatGPT speaking, which isn't very natural.

  • @ChukwumaOnyeijeMD
    3 months ago

    This is unbelievably fast. Incredible.

  • @pythonner3644
    3 months ago

    This video is clickbait; it will affect chipmakers and the future chips that OpenAI was planning to build.

  • @SkillLeapAI
    3 months ago

    There is a website called groq.com. It runs LLMs that are comparable to GPT-3.5, sometimes better, and it's a free chatbot. So what I showed in this video is the exact definition of an alternative: someone can literally choose to use Mistral on this website for free instead of ChatGPT.

  • @SkillLeapAI
    3 months ago

    Don't they offer API access and a free website with an LLM you can use? Aren't those the two things OpenAI provides as well?

  • @pythonner3644
    3 months ago

    @SkillLeapAI The primary business of Groq is selling chips, as opposed to hosting an LLM. The revenue generated from hosting is minimal, and their API does not bring in a significant amount either. Nvidia is a more fitting comparison. There are several free hosting options available for Mixtral models on Hugging Face and many other sites.

  • @js6909
    3 months ago

    I gave Groq a spin. What I like about it is the fact that it gives me a response in double-quick time. It also answers my queries from structured data. Indeed, Groq is a powerful and flexible query language that simplifies working with data in a way that I like. Thanks, my friend, for shining a light on this powerful platform.

  • @miladkhahil
    3 months ago

    Groq can serve up to 500 tokens per second. It is able to do this because it uses custom hardware built around Language Processing Units (LPUs) instead of GPUs. LPUs are designed to deliver deterministic performance for AI computations. They offer a more streamlined approach that eliminates the need for complex scheduling hardware, allowing every clock cycle to be utilized effectively. The system ensures consistent latency and throughput, and LPUs can be linked together without the traditional bottlenecks found in GPU clusters, making them extremely scalable.
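
As a rough way to picture what the 500-tokens-per-second figure above means in practice, the sketch below compares generation time for a longish answer at that rate versus a slower, hypothetical GPU-served rate. The 40 tokens/second figure is an assumed baseline for comparison, not a measured number.

```python
# Back-of-the-envelope: time to stream a 600-token answer at different
# serving rates. The GPU-side rate is a hypothetical figure for comparison.
def generation_time(num_tokens: int, tokens_per_second: float) -> float:
    """Seconds needed to stream num_tokens at a constant rate."""
    return num_tokens / tokens_per_second

answer_tokens = 600  # a fairly long chat answer
for label, rate in [("Groq LPU (claimed)", 500), ("typical GPU serving (assumed)", 40)]:
    print(f"{label}: {generation_time(answer_tokens, rate):.1f} s")

# Prints roughly:
#   Groq LPU (claimed): 1.2 s
#   typical GPU serving (assumed): 15.0 s
```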

  • @ricardocnn
    3 months ago

    Hardware is evolving again after a long period. It's part of the AI race.

  • @jupaogasol1663
    3 months ago

    Nice.

  • @carkawalakhatulistiwa
    3 months ago

    Groq is not a new ChatGPT but new hardware: it replaces the Nvidia GPU with the new Groq LPU.

  • @dogecoinx3093
    3 months ago

    Maybe GPU makers should write Groq a letter saying LPU is too similar to GPU and that they should change the name to megabyte processor.

  • @nhtna4706
    3 months ago

    How about testing on some images and videos using Stable Diffusion models?

  • @alessoalesso2154
    2 months ago

    You can't; it's an LPU, not a GPU. This kind of chip is only for text processing, which is why it's so fast. Instead of VRAM, it uses something like cache RAM.

  • @kiyonmcdowell5603
    3 months ago

    So did Elon come up with Grok, or someone else?

  • @amykpop1
    3 months ago

    It is not a ChatGPT competitor lol. They are designing and producing the chips that run large language models. ChatGPT could then train its models with these chips. One develops hardware, and the other develops software.

  • @Luxcium
    2 months ago

    I think those chips are more for inference and less for training, but it would be nice to have more information; it is very interesting 😅

  • @Srindal4657
    3 months ago

    LPUs are just the start.

  • @Luxcium
    2 months ago

    The actual title of this video is misleading because it says that Groq is a competitor to ChatGPT (that sounds like calling Nvidia a competitor to Twitter or something)… I don't know if the video correctly explains this, and I don't know how to make a title that is compelling and concise in a similar manner… Theoretically, Groq could be used to run ChatGPT (I don't know the technical implications of that last claim, though) 😅😅😅😅

  • @GameHEADtime
    3 months ago

    Yeah, it's fast.

  • @CoachOlali
    3 months ago

    I don't see the speed as a benefit. I'm not even close to keeping up with the speed of any GPT version, let alone GPT-4, so I don't find the speed helpful. Now, if it had a larger knowledge base or more credible information, I think that would be an advantage.

  • @SkillLeapAI
    3 months ago

    Yeah, I think it's more useful on the API side. Building apps that work at real-time speed can be a big benefit.
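
Since the point above is about building real-time apps on the API, here is a hedged sketch of streaming tokens as they are generated so a UI can render them immediately. It assumes Groq's API follows the same server-sent-events format as OpenAI-style chat APIs; the endpoint, model id, and chunk format are assumptions.

```python
# Sketch: stream tokens from an assumed OpenAI-style SSE endpoint and print
# them as they arrive, instead of waiting for the full completion.
import json
import os

import requests

API_URL = "https://api.groq.com/openai/v1/chat/completions"  # assumed endpoint
headers = {"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"}
payload = {
    "model": "mixtral-8x7b-32768",  # assumed model id
    "messages": [{"role": "user", "content": "Write a haiku about fast inference."}],
    "stream": True,
}

with requests.post(API_URL, json=payload, headers=headers, stream=True) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        # SSE lines look like: b'data: {...json chunk...}'
        if not line or not line.startswith(b"data: "):
            continue
        chunk = line[len(b"data: "):]
        if chunk == b"[DONE]":
            break
        delta = json.loads(chunk)["choices"][0]["delta"]
        print(delta.get("content", ""), end="", flush=True)
print()
```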

  • @CoachOlali
    3 months ago

    @SkillLeapAI Makes sense.

  • @eth852
    3 months ago

    Assuming the cost per millisecond is the same between GPU and LPU, running on an LPU is cheaper if it runs faster.
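
The cost logic in the comment above is worth spelling out with numbers. The price and the two completion times below are made-up placeholders, chosen only to show that equal per-millisecond pricing plus a shorter run means a lower per-request cost.

```python
# Toy cost comparison: same price per millisecond of compute, different
# completion times. All numbers are illustrative placeholders.
PRICE_PER_MS = 0.00001  # dollars per millisecond of compute (made up)

gpu_time_ms = 4_000     # hypothetical GPU completion time for one request
lpu_time_ms = 1_000     # hypothetical LPU completion time for the same request

gpu_cost = PRICE_PER_MS * gpu_time_ms
lpu_cost = PRICE_PER_MS * lpu_time_ms
print(f"GPU request: ${gpu_cost:.3f}, LPU request: ${lpu_cost:.3f}")
# At equal per-ms pricing, the faster run is proportionally cheaper:
# here $0.040 vs $0.010 per request.
```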

  • @carkawalakhatulistiwa
    3 months ago

    @CoachOlali Google created Gemini 1.5 Pro with a 1-million-token context. The problem is that it takes a minute to come up with an answer because it runs on GPUs. Groq introduces the LPU, an LLM-focused chip, making it 100x faster than an Nvidia GPU. This year, LLMs with 10 million tokens will be out to the public, and the LPU is very useful so that people don't have to wait long when processing a lot of data with an LLM.

  • @carkawalakhatulistiwa
    3 months ago

    @SkillLeapAI With the emergence of LLMs with 10 million tokens or more to process research data and corporate data, the Groq LPU will be very useful for this task.

  • @TheHistoryCode125
    3 months ago

    It's just using the OpenAI ChatGPT API...

  • @jaysonp9426
    3 months ago

    You didn't show Mixtral...which is the superior model and twice as fast

  • @SkillLeapAI
    3 months ago

    It had a long wait queue when I tested it, so I couldn't even use it.

  • @jaysonp9426
    3 months ago

    @SkillLeapAI Ah, that makes sense. It rips at 400-500 tokens/second and has the best reasoning I've seen outside of GPT-4.

  • @SkillLeapAI
    3 months ago

    Yeah, since uploading I've been doing some testing, and it's pretty great. Crazy speed.

  • @olternaut
    3 months ago

    ChatGPT competitor? Isn't this more of an Nvidia competitor?

  • @SkillLeapAI
    3 months ago

    Yeah, but I was talking more about the API side.

  • @jaysonp9426
    3 months ago

    Nvidia has to be 💩 themselves

  • @thierry-le-frippon
    3 months ago

    Yes... their share of the pie will shrink a lot. The news is slow to reach the stock market, but it will get there eventually.

  • @Luxcium
    2 months ago

    0:45 Groq is *NOT* a model… Grok is a model, an AI model, but you cannot say that Groq with a Q is a model… 😮😮😮😮

  • @ryanscorner170
    3 months ago

    I used it a few days back... it's not precise

  • @mariusklos9987
    3 months ago

    I asked some simple questions, and in comparison with Claude 2 the result was clear: Claude beats Groq.

  • @misterm4590
    3 months ago

    Sounds OK, but in my opinion ChatGPT is one of the best AI tools.

  • @dogecoinx3093
    3 months ago

    The only reason people would ever remember that name is because of Elon, and they want to be assholes.