All About AI

Welcome to my channel All About AI =)

Website:
www.allabtai.com

Learn how you can start using Generative AI to help with creative and other daily tasks.

My aim is to bring Generative AI to everyone.

- AI Engineering
- ChatGPT
- GPT-4
- Midjourney
- Stable Diffusion
- Python
- AI Automation

We will answer all your questions about AI, general questions about the future of AI, and the ethics of artificial general intelligence.

Follow our channel if you want to get going in this space =)

Comments

  • @smnomad9276 · 3 hours ago

    You need to make a custom cap, and make three big letters AAI that stand for All About AI in the front so that it is visible when you wear it.

  • @Etienne_O · 6 hours ago

    Cool tool, but no MLX support :(

  • @DefaultFlame · 8 hours ago

    Awwww . . . I was hoping the video was about finetuning an LLM into only speaking in 1337 5p34k.

  • @jpsl5281 · 1 day ago

    When creating the dataset to train gpt-3.5-turbo for a conversational AI, if I want to train the model to answer something specific for a range of similar questions that can occur during a conversation (*the conversation is a script which the AI follows*), should I include only the question and the response, or should I include all of the conversation up to the point where that specific question is asked?
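
    For context, here is a rough sketch of what the two options could look like in OpenAI's chat fine-tuning JSONL format (one JSON object per line). The scripted turns below are made-up placeholders, not from the video; the second option simply carries the scripted conversation up to the question as extra user/assistant messages.

        # Sketch of the two dataset options in OpenAI's chat fine-tuning format.
        # All conversation content below is an invented placeholder script.
        import json

        # Option A: only the specific question and its answer.
        short_example = {
            "messages": [
                {"role": "system", "content": "Follow the booking script."},
                {"role": "user", "content": "Can I change my appointment time?"},
                {"role": "assistant", "content": "Of course, which day works better for you?"},
            ]
        }

        # Option B: the scripted conversation up to the point where the question appears.
        long_example = {
            "messages": [
                {"role": "system", "content": "Follow the booking script."},
                {"role": "user", "content": "Hi, I'd like to book a consultation."},
                {"role": "assistant", "content": "Sure, I can help with that. What service do you need?"},
                {"role": "user", "content": "Can I change my appointment time?"},
                {"role": "assistant", "content": "Of course, which day works better for you?"},
            ]
        }

        # Write one JSON object per line, as the fine-tuning endpoint expects.
        with open("train.jsonl", "w") as f:
            for example in (short_example, long_example):
                f.write(json.dumps(example) + "\n")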

  • @drhenkharms6514 · 1 day ago

    Ok... new kid on the block here... where do I find this example on the members' GitHub?

  • @akimezra7178 · 1 day ago

    Bruuuuuuh, just found this channel, you sure you're human?!?! Wish I had 5% of your brain... thank you so much for your work! I'm learning so much!!

  • @b1e2n3rl · 1 day ago

    Amazing stuff Kris, I subscribed but my Discord invite is expired? Can you resend me an invite? Have to mention the presentation and content is super high quality! Big thanks!

  • @Moukrea · 1 day ago

    I wonder if OpenVoice behind RVC would produce good results with a fine tuned RVC model, 'cause with OpenVoice you can explicitly control the emotion given to the output (whispering, cheerful, terrified, angry, sad, friendly), which XTTS cannot... OpenVoice sounds more robotic than XTTS, I guess that could be somehow fixed thanks to RVC!

  • @rocket-mx6bh · 2 days ago

    Is Mistral even necessary in this structure?

  • @TheHistoryCode125 · 2 days ago

    Hold on a minute! This video is basically an infomercial disguised as a helpful tutorial. The creator spends a good chunk of the video promoting their own GitHub repository for an AI-powered terminal, which honestly seems more like a gimmick than a truly useful tool. They keep hyping up the AI features, but let's be real, most terminal users already know basic commands or can easily look them up. The AI explanations for commands are often verbose and unnecessary, and the whole "ask openAI" thing feels clunky. Sure, it might be fun to play around with for a while, but it's not going to revolutionize your terminal experience. Don't get me wrong, integrating AI into software can be interesting, but this particular application feels forced and lacks practical value. Let's focus on developing AI tools that genuinely solve problems, not just create more distractions.

  • @ti0v283 · 2 days ago

    I'm in need of someone to install this collection of open-source software on my server and develop APIs for it. This is for the purpose of building a mobile app for conversational AI chat. Compensation will be provided for these services.

  • @ti0v283 · 2 days ago

    Where can I get all this code?

  • @BStudioT · 2 days ago

    Genius!

  • @duffy666 · 2 days ago

    I really like it! Is this already on GitHub for members (I could not find it)?

  • @avi7278 · 2 days ago

    ShellGPT is the best integration of AI in the terminal.

  • @xspydazx · 2 days ago

    My problem, my friend, is that you're using OpenAI for EVERYTHING! Why not a local model? It would be nice to see all these things running from a local model via one of the servers (llama_cpp, LM Studio, GPT4All, etc.). These are just servers, so the API they provide will accept OpenAI-style requests to the local LLM. It makes it easier to follow, and to know that everything can be done completely free, locally! But also, bro, good work as always! Great videos and projects, especially the videos giving character to the models.

    I deployed similar prompts in training my model. (I need to always remember to remove the prompt from the template afterwards, to allow other previous prompts to rise.) So in my home env I have an empty prompt, and in training I put my detailed prompts in; I train with and without the prompt. My latest model (Samantha) is more personalized, as it will always flourish tasks with some form of comment, and even ask you questions, personal thoughts, and what your opinion is on stuff, sometimes before you ask! My thing is to build these models with the personality hidden inside. I found that when I served the models for Open Interpreter etc. they still produced great formulaic responses as required and performed tasks exactly as they should (previous training was task-focused, hence it lost its ability to talk, so I had to re-add the personality).
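
    As a rough illustration of the local-server point above: a minimal sketch that points the standard OpenAI Python client at a local OpenAI-compatible endpoint instead. The URL, API key, and model name are assumptions; LM Studio, llama.cpp's server, and similar tools each expose their own port and model IDs.

        # Minimal sketch: reuse the OpenAI client against a local
        # OpenAI-compatible server (e.g. LM Studio or llama.cpp's server).
        # The base_url, api_key, and model below are placeholders; check
        # what your local server actually exposes.
        from openai import OpenAI

        client = OpenAI(
            base_url="http://localhost:1234/v1",  # hypothetical local endpoint
            api_key="not-needed-locally",         # most local servers ignore the key
        )

        response = client.chat.completions.create(
            model="local-model",  # placeholder; use the model ID your server reports
            messages=[
                {"role": "system", "content": "You are a helpful assistant."},
                {"role": "user", "content": "What is an OpenAI-compatible server?"},
            ],
        )
        print(response.choices[0].message.content)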

  • @xspydazx · 2 days ago

    After today's training bout I found that when I had added the template "### Input: {} ### Thoughts: What are your thoughts on this {} ### Response: {}", I used a DPO dataset: the prompt as the input, the rejected answer as the thought, the chosen answer as the output! I trained without thoughts first, then with thoughts. I also re-framed many other datasets in this manner which have chains of thought (I put the explanation in the thoughts), or some wiki entries, with the term as the input and the definition as the output but the source in the thoughts, or even sometimes the associated entity list, i.e. giving it additional data to associate with the completion.

    In later discussions it was those thoughts I put in its mind that the LLM asked me about. I gave some random opinion and it told me the other thing as the real truth... lol... so it asked me about the rejected and answered with the chosen? I also did this with various NSFW data, so now it asks me some silly questions, even questions about various scenes in movies, as I also used many movie scripts, and some where others had prompted ChatGPT to answer all questions from the perspective of..., removing all references to the character and changing them to BABE or BRO (and BOSS, MASTER) or ME & YOU, i.e. the AI should never consider itself an AI! The original data for functions and program calling is still inside, and you can still set complex or roleplay prompts! So LOCAL is the way to have full autonomy over the CHARACTER of the bot, especially when you need it to sound more real and even emotional! This technique of fine-tuning actually works well for small and large datasets which have additional fields (also, I never add the instruct!).
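
    A rough sketch of the re-framing described above, assuming a DPO-style dataset with prompt/chosen/rejected fields. The field names and template wording are taken from the comment; the toy record is invented.

        # Sketch: reframe DPO-style records (prompt / chosen / rejected)
        # into the "Input / Thoughts / Response" template described above.
        # Field names and template text are assumptions based on the comment.

        def to_thought_template(record: dict) -> str:
            """One training example: the rejected answer becomes the 'thoughts',
            the chosen answer becomes the response."""
            return (
                f"### Input:\n{record['prompt']}\n\n"
                f"### Thoughts: What are your thoughts on this?\n{record['rejected']}\n\n"
                f"### Response:\n{record['chosen']}\n"
            )

        # Toy record (invented data):
        example = {
            "prompt": "What is fine-tuning?",
            "rejected": "It is when you tune a guitar very precisely.",
            "chosen": "It is further training of a pretrained model on a smaller, task-specific dataset.",
        }
        print(to_thought_template(example))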