Stable Warpfusion Tutorial: Turn Your Video into an AI Animation

Film & Animation

The first 1,000 people to use the link will get a 1 month free trial of Skillshare skl.sh/mdmz06231
Learn how to use Warpfusion to stylize your videos. Discover key settings and tips for excellent results so you can turn your own videos into AI animations.
Tech support: / discord
📁Warpfusion Settings:
bit.ly/42rJLPw
🔗Links:
Warpfusion v0.16(FREE & recommended): bit.ly/3pBh5X3
Warpfusion v0.14: bit.ly/42HozoG
DreamShaper: civitai.com/models/4384/dream...
Stable WarpFusion local install guide: • Stable WarpFusion loca...
Another local install guide: github.com/Sxela/WarpFusion/b...
Best Custom Stable Diffusion Models stablecog.com/blog/best-custo...
How to get good prompts: bit.ly/3IEAzjQ
How to use Luma AI: • Create FPV-Like Videos...
Disclaimer: Some links in the description are affiliate links. If you make a purchase through them, I may earn a small commission at no extra cost to you.
©️ Credits:
Stock video: www.pexels.com/video/energeti...
James Gerde: / gerdegotit
Marc Donahue: / permagrinfilms
Markus Paolo Pe Benito: / markuspaolo_
Alex Spirin: / defileroff
Noah Miller: / noahrobertmiller
Willis Hsieh: / willis.visual
Diesellord: / diesel_ai_art
Stefano Knoll: / steknoll
Josh Doctors: / fewjative
patchesflows: / patchesflows
Yüksel Aykilic: / designyukos
Oleh Ibrahimov: / drimota.ai
nointroproductions: / nointroproductions
Positive Prompts:
"0": [
"realistic female beautiful statue of liberty is a rocky statue dancing, manhattan city skyline in the background, the environment is new york city in day time, realism, hyper detailed, cinematic lighting, photograpny, High detail RAW color art, diffused soft lighting, sharp focus, hyperrealism, cinematic lighting, unreal engine, 4k, vibrant colours, dynamic lighting, digital art, winning award masterpiece, fantastically beautiful, illustration, aesthetically, trending on artstation, art by Zdzisaw Beksiski x Jean Michel Basquiat, high quality, 8k, "
]
Negative prompts:
"0": [
"smoke, fog, lowres, (bad anatomy:1.2), EasyNegative, multiple views, six fingers, black & white, monochrome, (bad hands:1.2), (text:1.2), error, cropped, worst quality, low quality, normal quality, jpeg artifacts, (signature:1.2), (watermark:1.3), username, blurry, out of focus, amateur drawing, colored, shading, displaced feet, out of frame, massive breasts, large breasts ,((ugly)), nude nsfw"
]
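
A note on the format above: these are fragments of WarpFusion's frame-keyed prompt schedule, entered as a Python dict in the notebook. The key is the frame number at which a prompt starts applying, so a single "0" entry styles the entire clip. A minimal sketch of the idea (the second entry and its frame number below are made-up examples for illustration, not settings used in the video):

    # Hypothetical prompt schedule, for illustration only.
    # Key = frame at which the prompt takes effect; a lone 0 entry covers the whole video.
    text_prompts = {
        0: ["realistic statue of liberty dancing, hyper detailed, cinematic lighting"],
        120: ["bronze statue of liberty dancing at night, neon reflections"],  # assumed example
    }
    negative_prompts = {
        0: ["lowres, bad anatomy, bad hands, text, watermark, blurry"],
    }
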
⏲ Chapters:
0:00 Introducing Warpfusion
0:34 How to start with Warpfusion
1:08 Google colab: local vs online runtime
2:01 How to transform a video
2:34 What's an AI model?
3:06 Settings
8:35 How to run Warpfusion
9:23 Animation preview
9:30 How to change GUI settings
12:06 How to export the animation
12:36 Get featured
12:49 Warpfusion + Luma AI
Support me on Patreon:
bit.ly/2MW56A1
🎵 Where I get my Music:
bit.ly/3boTeyv
🎤 My Microphone:
amzn.to/3kuHeki
🔈 Join my Discord server:
bit.ly/3qixniz
Join me!
Instagram: / justmdmz
Tiktok: / justmdmz
Twitter: / justmdmz
Facebook: / medmehrez.bss
Website: medmehrez.com/
#warpfusion #ai #stablediffusion
Who am I?
-----------------------------------------
My name is Mohamed Mehrez and I create videos around visual effects and filmmaking techniques. I currently focus on making tutorials in the areas of digital art, visual effects, and incorporating AI in creative projects.

Comments: 415

  • @MDMZ
    @MDMZ 1 year ago

    Update: I recommend using Warpfusion v0.16: bit.ly/3pBh5X3
    Update 03/04: Just re-tested the same exact steps in the tutorial using v0.14 and the Dreamshaper 8 model, it works perfectly!
    The first 1,000 people to use the link will get a 1 month free trial of Skillshare skl.sh/mdmz06231
    For tech support and other questions: discord.gg/YrpJRgVcax
    Don't forget #mdmz when you post your Warpfusion videos 😉🥳

  • @juanjuanchen6814

    @juanjuanchen6814

    9 months ago

    the problem is if I pay you, can I use it on a free colab or free kaggle account? if not, seeming useless

  • @kelvinpatricio8842

    @kelvinpatricio8842

    9 months ago

    I'm using v0_16_13 and the script is giving an error on Generate optical flow and consistency maps 🙁

  • @kelvinpatricio8842

    @kelvinpatricio8842

    9 months ago

    Can someone help me?

  • @KREOGHOSTOFFICIAL

    @KREOGHOSTOFFICIAL

    3 months ago

    YOU ARE CONFUSING THE SHIT OUTTA ME BRO

  • @MDMZ
    @MDMZ 1 year ago

    📁Warpfusion Settings: bit.ly/42rJLPw If you keep getting errors, use Warpfusion v0.16: bit.ly/3pBh5X3

  • @qwax

    @qwax

    1 year ago

    What are the GPU requirements/VRAM requirements for Warpfusion?

  • @BoomBoomMac

    @BoomBoomMac

    11 months ago

    Does it work with M1 MacBook OR any apple computers?

  • @MREDZ

    @MREDZ

    10 months ago

    Hey man, thanks for your in-depth tutorials on stable diffusion and warp fusion, they've helped me understand the software greatly. Unfortunately I am having an issue when trying to create a warp fusion, specifically at the 'define SD + K functions, load model' section. I keep getting this error no matter what I do.
    NameError                                 Traceback (most recent call last)
    Cell In[8], line 6
          4 import argparse
          5 import math,os,time
    ----> 6 os.chdir( f'{root_dir}/src/taming-transformers')
          7 import taming
          8 os.chdir( f'{root_dir}')
    NameError: name 'root_dir' is not defined
    Any help would be much appreciated, as there is nothing online that comes up when searching for a solution. Thanks.
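
For anyone hitting the same NameError: root_dir is created by the notebook's earlier setup cells, so the "define SD + K functions, load model" cell only works after those have run (use "Run all" or run the cells in order). A minimal sketch of the dependency, with an assumed path value for illustration:

    # Earlier setup cell: defines the notebook's working folder.
    import os
    root_dir = '/content'  # assumed value, for illustration only

    # Later "load model" cell: raises NameError if the cell above was skipped.
    # (The target folder exists only inside the Colab runtime after setup clones the repos.)
    os.chdir(f'{root_dir}/src/taming-transformers')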

  • @Rishivlogs551

    @Rishivlogs551

    10 months ago

    1:19

  • @MREDZ

    @MREDZ

    10 months ago

    @@Rishivlogs551 Ah okay thanks, I should've checked that out before I started the process. I am now getting a different type of error when trying to run through a hosted runtime, under Install and import dependencies:
    ImportError: cannot import name 'isDirectory' from 'PIL._util' (/usr/local/lib/python3.10/dist-packages/PIL/_util.py)
    Any idea what could be causing this? :\

  • @creatorsmafia
    @creatorsmafia 1 year ago

    I'm definitely going to give it a try and experiment with different settings.

  • @bdwedgeofanimotion4106
    @bdwedgeofanimotion4106 1 year ago

    amazing and it really does look good

  • @korujaa
    @korujaa 1 year ago

    Very good, thanks !!!

  • @bdnwfantaziedreams
    @bdnwfantaziedreams 9 months ago

    very nice and I always wondered how it was done, not easy but the output is impressive

  • @MDMZ

    @MDMZ

    9 months ago

    Thank you! Cheers!

  • @saraeljamal5009
    @saraeljamal5009 1 year ago

    great tutorial, I have followed another tutorial to train my own AI model using rendered images of a character and used it, my first try wasn't so successful ( not sure if the reason is the video or the model) , any chance you can perhaps create a tutorial on creating our own AI models and using it on warpfusion?

  • @MDMZ

    @MDMZ

    1 year ago

    I followed this once before and it worked great!: kzread.info/dash/bejne/nXeXutSmhs6XdpM.html

  • @saraeljamal5009

    @saraeljamal5009

    1 year ago

    @MDMZ, Thank you for your assistance! I managed to train my AI model and achieved some progress. However, I'm still struggling with maintaining consistency in masking the female's head throughout each frame. Initially, the mask works for a few frames, but then it starts to take on the form of the original face in the video.

  • @saumyajeetbhowmick7803

    @saumyajeetbhowmick7803

    11 months ago

    which video tutorial did you use

  • @kartunenetwork9232
    @kartunenetwork9232 1 year ago

    thanks for the awesome tutorial! Looks amazing, only thing is mine keeps changing the subject's aesthetic looks and especially the face within a couple frames... is there a way to make it keep the same look as the first frame?

  • @MDMZ

    @MDMZ

    1 year ago

    you can try to fix that by scheduling
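
For readers unfamiliar with the term: "scheduling" means letting a setting change per frame instead of staying constant, which helps when the look drifts after a few frames. A rough, hypothetical sketch of the concept (the parameter names and values below are illustrative assumptions, not settings taken from the video):

    # Frame-keyed schedules (illustrative only): the value changes at the given frames.
    style_strength_schedule = {0: 0.9, 30: 0.6, 120: 0.45}
    cfg_scale_schedule = {0: 7.5}  # constant after frame 0

    def value_at(schedule, frame):
        """Return the most recently scheduled value at or before `frame`."""
        keys = [k for k in sorted(schedule) if k <= frame]
        return schedule[keys[-1]] if keys else None

    print(value_at(style_strength_schedule, 45))  # -> 0.6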

  • @AiLabxArts
    @AiLabxArts 1 year ago

    That's impressive!!

  • @MDMZ

    @MDMZ

    1 year ago

    🙏

  • @consciousHarmony
    @consciousHarmony 1 year ago

    In the "define SD + K functions, load model" section should I select CPU or GPU for the 'load_to' variable?

  • @user-uk9qk3sj4b
    @user-uk9qk3sj4b 1 year ago

    Please do a tutorial for the cola shorts clip it's so amazing

  • @baraazidan4946
    @baraazidan4946 1 year ago

    Wonderful 👍👍

  • @WajihSouilemm
    @WajihSouilemm 1 year ago

    Cool bro !! 🔥

  • @MDMZ

    @MDMZ

    1 year ago

    🙏

  • @JuanPerez_2023
    @JuanPerez_2023 1 year ago

    Amazing !!!!

  • @owensy365
    @owensy365 1 year ago

    ty vv much legend❣

  • @chocaholic65
    @chocaholic65 8 months ago

    This is an awesome tutorial ❤❤❤

  • @MDMZ

    @MDMZ

    8 months ago

    Thank you! Cheers!

  • @Fyhan69
    @Fyhan69 1 year ago

    Awesome. Great Tutorial, ❤

  • @MDMZ

    @MDMZ

    1 year ago

    Thank you! Cheers!

  • @borcan7287
    @borcan7287 1 year ago

    Which is better, Warpfusion v0.14 or Stable WarpFusion v0.5.12 ?

  • @staffan_ofwerman
    @staffan_ofwerman 9 months ago

    I tried to follow your instruction here with my own video clip, but I seem to get errors all the time. Maybe it's because there are new versions up and running now that behave different. What I'm looking for is to use the video clip I have (it's me in front of a green screen). I would like to change myself into something fun, like some kind of animation, but not all different. Just making me look animated. And still have the Green Screen in the background in the final output. Maybe it's not possible in WarpFusion or what do you think? Should I look at something else or is it possible to make this with the right prompt and right model? Just can't find any tutorials about it. And I thought your video was great.

  • @MDMZ

    @MDMZ

    9 months ago

    it is possible, I have instructions on how to keep the background untouched in this same tutorial, shooting on a green screen will definitely help with the separation. and YES, you should look into using a newer version

  • @BigBrisian
    @BigBrisian 1 year ago

    Hi MDMZ, my run stopped at 'Video Masking' with the issue of 'NameError: name 'os' is not defined'. Would be amazing if you can help, thank you.

  • @AnnaBednarek

    @AnnaBednarek

    10 months ago

    Same here. Can somebody help us, please? :(

  • @nizamkoc8261
    @nizamkoc8261 8 months ago

    Best vid. Thanks

  • @MDMZ

    @MDMZ

    7 months ago

    Glad you liked it!

  • @rafaeladvincula4564
    @rafaeladvincula4564 1 year ago

    Would you recommend using this to a horizontal 1080p video? I have an NVIDIA 3070.

  • @MDMZ

    @MDMZ

    1 year ago

    both will work fine, depends how you plan to use the output, if it's for IG/TikTok just go with vertical

  • @CONCEPTSJRS
    @CONCEPTSJRS 1 year ago

    question, will this tutorial basically work if i run it locally? Im not familiar with colab pro but i have a 4080.

  • @MDMZ

    @MDMZ

    1 year ago

    yes same process right after you connect to local run

  • @DearVMON
    @DearVMON 1 year ago

    Awesome tutorial!! Quick question, I do have a windows pc, but was wondering will this work on a macbook as well?

  • @5XM-Film

    @5XM-Film

    1 year ago

    Obviously not for mac. Also would prefer if he would mention this right at the beginning 🤷🏻‍♂️

  • @MDMZ

    @MDMZ

    1 year ago

    It actually works on the cloud! So your OS doesnt matter

  • @MDMZ

    @MDMZ

    1 year ago

    I think you are referring to the local method, this is the online one 😉

  • @DearVMON

    @DearVMON

    1 year ago

    @@MDMZ yes, that's exactly what I wanted to understand, so I know which PC I can work on; if it only needs the Colab part and the local install doesn't matter, that's a relief hh, thank you for the info^^

  • @5XM-Film

    @5XM-Film

    1 year ago

    Can anybody help how to get this done with a mac?

  • @MarylandDevin
    @MarylandDevin 11 months ago

    How does this compare to using stable diffusion image to image batching for creating a stylized look for videos?

  • @MDMZ

    @MDMZ

    11 months ago

    this is much more consistent

  • @braedongarner
    @braedongarner 1 year ago

    Took about 4 hours to render 4 seconds but man it looks buttery smooth. My 1080ti was really trying🤣

  • @MDMZ

    @MDMZ

    1 year ago

    glad it worked for you 😁

  • @AhvaBidu

    @AhvaBidu

    1 year ago

    970 here. I envy you! AhaHaHa

  • @Twigslap

    @Twigslap

    1 year ago

    About to try this today wish me luck lol

  • @Tamannasehgal19

    @Tamannasehgal19

    1 year ago

    I've got a GTX 1650, would it be okay?

  • @AhvaBidu

    @AhvaBidu

    1 year ago

    ​@@Tamannasehgal19 Yes. Better than a 970. But will take time. Oh, I think it's ok. I don't really know. Your card is better than mine, so... I will just shut up now.

  • @minigabiworld
    @minigabiworld 1 year ago

    Thank you so much! Great video! Does this also work for cartoon characters with different human proportions?

  • @CYBERNORM

    @CYBERNORM

    1 year ago

    Aah, sorry, I think we r out of cartoon characters.

  • @ToMgRoEbE
    @ToMgRoEbE 11 months ago

    If I have AMD GPU is it still safe to use the online version only/its the same as not having strong enough hardware?

  • @XViewer
    @XViewer 1 year ago

    Nice

  • @dr.greenvil7679
    @dr.greenvil7679 1 year ago

    Hey! I'm considering buying a new PC with 8GB of VRAM. Since Warpfusion seems to require more than that (which means I'd have to pay for Colab Pro anyway), is there any benefit to buying a better 8GB VRAM PC, or should I just stick with my laptop? Thanks for the tutorial.

  • @MDMZ

    @MDMZ

    1 year ago

    depends on what you intend to use it for, 8GB is a bit low for SD

  • @triangulummapping4516
    @triangulummapping4516 1 year ago

    How you increase the trails effect?

  • @Voidedsomeone
    @Voidedsomeone 1 year ago

    whats the song that people use for stabled diffusion

  • @TheDroneExperiment
    @TheDroneExperiment 9 months ago

    Quick Question. If I want to try to keep the original background which options do I select?

  • @MDMZ

    @MDMZ

    9 months ago

    I actually explain that in the video

  • @cocoysalinas1
    @cocoysalinas1 11 months ago

    Loved your video! Super Super Helpfull. Is there a way or a prompt to achieve a better lipsync or mouth movement? I'm struggling with this.

  • @MDMZ

    @MDMZ

    11 months ago

    not yet!

  • @MarylandDevin
    @MarylandDevin 11 months ago

    Is this not part of stabled diffusion a1111 web ui, like an extension? This is it's own thing? Also, i have 12 gb vram. Does anyone have any input if similar ram worked for them? Thx

  • @MDMZ

    @MDMZ

    11 months ago

    this is its own thing

  • @hurgerburger.
    @hurgerburger. 10 months ago

    Do you need the later versions of warpfusion or can you use the earlier ones?

  • @MDMZ

    @MDMZ

    10 months ago

    It's best to use the latest

  • @dlysid
    @dlysid 1 year ago

    Does anyone know how much time it takes to make a 30-second video with Warpfusion? I need to understand this in order to present it at a live activation! Many thanks in advance!

  • @MDMZ

    @MDMZ

    1 year ago

    no one will be able to give you the correct answer, it depends on so many factors and it's pretty much impossible to predict until you run it.

  • @VRMOTION
    @VRMOTION 1 year ago

    You're a handsome man!!! I've been really looking forward to this video. And there is also a question: how to process VR180 3D video in this way? After all, we cannot get consistently the same result for both lenses (left and right). Please let me know if you have a guide for such a solution with style generation in VR180 3D video. Thank you. We will be following your news, with our whole small team.

  • @MDMZ

    @MDMZ

    1 year ago

    I'm not so familiar with VR, but you can try using the same seed for both videos, or render both videos side by side in a single file then run it through Warp, if that makes sense
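
The "same seed" suggestion works because diffusion output is deterministic for a fixed seed and identical settings, so both lenses receive the same stylization. A toy illustration of that property (the stylize function is a stand-in, not WarpFusion's API):

    # Toy stand-in: with an identical seed and settings, a deterministic
    # generator produces identical "styling" for the left and right renders.
    import random

    def stylize(frame_index, seed):
        rng = random.Random(f"{seed}-{frame_index}")
        return rng.random()  # placeholder for the diffused frame

    SEED = 12345
    left = [stylize(i, SEED) for i in range(3)]
    right = [stylize(i, SEED) for i in range(3)]
    assert left == right  # same seed -> matching result for both lenses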

  • @FirstLast-tx3yj

    @FirstLast-tx3yj

    11 months ago

    @@MDMZ Every time I run it locally I get the VRAM error, and I could not find a way to install xformers for it (everything out there is about stable diffusion). How can I install xformers so that I lower the VRAM usage? Also, when running the code it shows "no xformers module found", so it must work with xformers, I just don't know what to change to activate it. Please help

  • @johnnyc.31

    @johnnyc.31

    11 months ago

    Use A1111 and Deforum or Deforumation. You can control camera angles and more.

  • @clash9927
    @clash9927 8 months ago

    where can I find the stable_warpfusion_settings_sample document for the default_settings_path?

  • @LucidFirAI
    @LucidFirAI 11 months ago

    Can I use my own GPU or do I need to pay for Google Colab? Can you achieve the same results with Temporal Kit?

  • @zuzana7366
    @zuzana7366 9 months ago

    hey, how to only diffuse the background but keep the object original? whats the setting for this masking, thanksss

  • @MDMZ

    @MDMZ

    9 months ago

    I have covered that in the video

  • @SA-Brawl
    @SA-Brawl 1 year ago

    im using the free version of google colab so it doesent let it run do i need colab pro ?

  • @MDMZ

    @MDMZ

    1 year ago

    Hi, as explained in the video, colab pro will give you access to more resources

  • @Heartog.Design
    @Heartog.Design 10 months ago

    I'm 2 minutes in and I'm like 🤯 ... so many steps and it feels so complicated

  • @MDMZ

    @MDMZ

    10 months ago

    it only takes a bit of patience, you can do it!

  • @AhvaBidu
    @AhvaBidu 1 year ago

    You are a monster, man! And I own a GTX970 😂 so, some others tutorials are more "for me"

  • @MDMZ

    @MDMZ

    1 year ago

    Enjoy!

  • @theartforeststudio8667
    @theartforeststudio8667 1 year ago

    Is there a way I could use warpfusion locally with automatic 1111? . Please make a tutorial on it 🙏

  • @MDMZ

    @MDMZ

    1 year ago

    you can use stable diffusion locally both with A1111 and warpfusion as well, I do have a stable diffusion tutorial on how to install it with A1111

  • @theartforeststudio8667

    @theartforeststudio8667

    1 year ago

    @@MDMZ thankyou!!! You mean a tutorial on using warpfusion with automatic 1111 , not Google colab. Right?

  • @MDMZ

    @MDMZ

    1 year ago

    @theartforeststudio8667 Pretty much the same thing, just different platforms. Warpfusion on Google Colab is used to run stable diffusion; A1111 is used to run stable diffusion in your browser. Both are set up and work differently, so it depends on which one you are more comfortable with

  • @hinlee1947
    @hinlee1947 1 year ago

    I have a trouble about not having really good consistency, is there a tutorial about the settings to make it perfect?

  • @MDMZ

    @MDMZ

    1 year ago

    if you're seeking perfect consistency, we're not there yet! I suggest playing with the settings I covered, try enabling fixed_code, etc...

  • @vyasbrothers
    @vyasbrothers 6 months ago

    Hi super video..however I have been trying since 2 days..it disconnected at 20% .Is there any fix for that? Thank you in advance :)

  • @raunaksharma3604
    @raunaksharma3604 10 months ago

    @MDMZ, While Processing Video Input setttings, I got the following error: NameError: name 'generate_file_hash' is not defined Please Guide

  • @sonnyalexis2204
    @sonnyalexis2204 1 year ago

    Can we used for photo ??

  • @vyasbrothers
    @vyasbrothers 6 months ago

    Hi..thank you for the amazing videos ....but it keeps disconnecting after a few hours and it goes back to square one! how do I keep the connection alive?

  • @MDMZ

    @MDMZ

    5 months ago

    I usually play a 10 hour youtube video on another tab 😅 you gotta keep your computer active

  • @radstartrek
    @radstartrek 1 year ago

    bro, if you don't mind telling us, how many compute units did you use per video on average? especially that video you just showed?

  • @reubzdubz

    @reubzdubz

    1 year ago

    I burnt like 20 units just for a 13s vid lol

  • @radstartrek

    @radstartrek

    1 year ago

    @@reubzdubz wow man! thats some expensive job :D

  • @reubzdubz

    @reubzdubz

    1 year ago

    @@radstartrek that is if you follow the resolution in the video tho. I went down to 540x960 afterwards.

  • @radstartrek

    @radstartrek

    1 year ago

    @@reubzdubz ok, so it would cost even more compute units on something like 720p.

  • @MDMZ

    @MDMZ

    1 year ago

    honestly I have never documented as I was experimenting regularly with different resolutions and settings which affects the rendering time heavily, but yes the lower the resolution, the faster it runs

  • @Raharajabimindset-vg3rz
    @Raharajabimindset-vg3rz 5 months ago

    Thanks, it was really useful. When I save my video and run the last cell it takes almost 1 hour to complete, though the video that I diffused (output video) is only about 1 second long. I don't really know what is wrong.

  • @MikeBishoptv
    @MikeBishoptv 1 year ago

    When I hit "run all' it can't get passed the "1.4 Install and import dependencies" section, says it's missing some modules (timm, lpips) been scouring discord and see others with this problem but no solutions. I'm using colab pro remotely on a Mac.

  • @MDMZ

    @MDMZ

    1 year ago

    did you try re-running? or using a different version ?

  • @MikeBishoptv

    @MikeBishoptv

    1 year ago

    @@MDMZ yeah I fixed it by downloading the latest version and not the one in your tutorial

  • @MDMZ

    @MDMZ

    1 year ago

    @@MikeBishoptv cool !

  • @user-uk9qk3sj4b
    @user-uk9qk3sj4b 1 year ago

    Can you do a tutorial for Deforum Stable Diffusion for google colab Because my installed version is not working

  • @MDMZ

    @MDMZ

    1 year ago

    will look into it

  • @koa8299
    @koa8299 6 months ago

    this is probably the most complicated ai program i used by far. so many errors you cant find a fix for online and confusing settings you got to learn on your own because nobody has a full setting explanation for it. it took me almost 300 renders to understand what most settings do but i feel like its all going to be worth it once i get it all down.

  • @MDMZ

    @MDMZ

    6 months ago

    it's definitely challenging and can be frustrating at times, keep an eye on updates, newer notebooks are much more stable

  • @koa8299

    @koa8299

    6 months ago

    @@MDMZ lol turns out all i needed to do was tweak was the controlnet settings to get the output i desire. i had no clue consistency and controlnet correlated with eachother

  • @jaknowsss
    @jaknowsss 1 year ago

    Why my colab always reconnecting, when i reconnect all my settings will be back to default settings and i cant go back to the 1st i made

  • @tr4dingsinperdidas
    @tr4dingsinperdidas 11 months ago

    Do you have the local tutorial?

  • @parzimav
    @parzimav 1 year ago

    Does A111 stable diffusion capable of this output?

  • @MDMZ

    @MDMZ

    1 year ago

    technically yes, but warpfusion is way way easier

  • @user-ll3pp6lh6b
    @user-ll3pp6lh6b 1 year ago

    Will it be on mobile?

  • @user-tx8gq5zf1l
    @user-tx8gq5zf1l 9 months ago

    Is there anyway to create videos like this on an iphone?

  • @Howling_Moon
    @Howling_Moon 1 year ago

    Which one you prefer? This Warpfusion or Difussion with it's Auto1111 interface? I tried this with stable difussion, got similar results and what's most important, it's free.

  • @MDMZ

    @MDMZ

    1 year ago

    I find this more consistent, perhaps I need to play around with A1111 a bit more

  • @BeetjeVreemd

    @BeetjeVreemd

    1 year ago

    What do you need exactly to make these kind of videos for free in Stable Diffusion?

  • @SultanHz

    @SultanHz

    1 year ago

    @@BeetjeVreemd did you find out how

  • @BeetjeVreemd

    @BeetjeVreemd

    1 year ago

    @@SultanHz Unfortunately no i didn't :(

  • @kubagacek7352

    @kubagacek7352

    1 year ago

    @@BeetjeVreemd did you find out by now ?

  • @vivienatan8039
    @vivienatan8039 11 months ago

    Hi does this work on MAC M2 chip?

  • @jannroche
    @jannroche 1 year ago

    Can u model a specific image instead of copying known ones like statue of liberty? I want to dance an image of myself for example ?

  • @MDMZ

    @MDMZ

    1 year ago

    in the example of using your own image, you will probably need to train a model first using your images, there are plenty of tutorials on how to do that on youtube

  • @dwmwi7216
    @dwmwi7216 11 months ago

    1.4 import dependencies, define functions Runtime error

  • @davidw717
    @davidw717 1 year ago

    Anyone know of a free alternative to Warpfusion

  • @VisitBeforeHumanPollute
    @VisitBeforeHumanPollute 1 year ago

    Can you please discuss about some "free ai site" for video

  • @MDMZ

    @MDMZ

    1 year ago

    sure

  • @KayaDeus
    @KayaDeus 1 year ago

    On average how much does it cost to make a 30 second video? Supposing it's 1080 vertical and you use the online processing option

  • @MDMZ

    @MDMZ

    11 months ago

    very difficult to predict

  • @triangulummapping4516
    @triangulummapping4516 1 year ago

    How i can standby the process , turn off my laptop and continue later from the last frame generated?

  • @MDMZ

    @MDMZ

    1 year ago

    try using the resume_run feature
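
For context, resuming works by pointing a fresh session at the frames already saved in the run's output folder and continuing after the last one instead of starting at frame 0. A hypothetical sketch of that logic (the folder path and file pattern are assumptions, not WarpFusion's actual layout):

    # Illustrative resume logic: find the last rendered frame and continue after it.
    import glob
    import os
    import re

    batch_folder = '/content/drive/MyDrive/WarpFusion/images_out/my_run'  # assumed path

    frames = sorted(glob.glob(os.path.join(batch_folder, '*.png')))
    start_frame = 0
    if frames:
        match = re.search(r'(\d+)\.png$', frames[-1])
        if match:
            start_frame = int(match.group(1)) + 1
    print(f'Resuming diffusion at frame {start_frame}')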

  • @Privacyking
    @Privacyking 1 year ago

    I am having issues connecting to google colab to local host.... i have posted into discord on the issue

  • @goldalemanha6330

    @goldalemanha6330

    1 year ago

    Is it possible to do this on your cell phone or do you need a computer?

  • @triangulummapping4516
    @triangulummapping4516 10 months ago

    After getting any error or server disconnection, is there a way to continue from the latest frame without running all the process again?

  • @MDMZ

    @MDMZ

    10 months ago

    You can use the resume run feature

  • @user-rq8km3us3r
    @user-rq8km3us3r 1 year ago

    Can the generated video be used commercially

  • @Deviiiiiilllll
    @Deviiiiiilllll 11 months ago

    hello, I followed your video step by step until the launch of all the scripts, but an error is displayed at optical map settings and it tells me NameError: name 'os' is not defined. Can you help me? (I have already tried 3 times but it's still the same, and I used warpfusion 0.16)

  • @MDMZ

    @MDMZ

    11 months ago

    hi, check the pinned comment

  • @Deviiiiiilllll

    @Deviiiiiilllll

    11 months ago

    I still have to pay another subscription to make warpfusion work?

  • @dougiejones628
    @dougiejones628 7 months ago

    Does anyone know, can this be done using another image as reference instead of a text prompt?

  • @MDMZ

    @MDMZ

    7 months ago

    I believe it's possible now with IPadapter

  • @MeowVibrations
    @MeowVibrations 1 year ago

    First time, please help: I got an error at 1.2 PyTorch - 'No such file or directory: 'nvidia-smi''. Followed the entire tutorial with no luck. None of them talk about switching the notebook's Hardware accelerator setting from None to GPU; I have no idea if I'm supposed to do that, but that's the only way I can get the error to go away and keep the runtime going past 1.2. However, with this GPU setting, it finishes down to the GUI cell, then disconnects my runtime and won't reconnect. I then switched the notebook setting back to None and it connected to the runtime, but now I am back at square one with the 1.2 PyTorch nvidia-smi error. Please help!

  • @MDMZ

    @MDMZ

    1 year ago

    hi, check the pinned comment

  • @jasontreyes8078
    @jasontreyes8078 4 months ago

    Does the AI have the capability of animating a drawing that I created (do I need to create the same subject in several angles?), and applying that drawing to a video, dance, walk or jumping video clip?

  • @MDMZ

    @MDMZ

    4 months ago

    you can try image to video, I have a video on that

  • @jessecallahan480
    @jessecallahan480 1 year ago

    Do you need CUDA and Visual Studio installed to run this locally on Win 10

  • @MDMZ

    @MDMZ

    1 year ago

    you can follow the installation guide, the pre-required tools are listed there

  • @bigdaddysho962
    @bigdaddysho962 1 year ago

    Hello dear sir, can I do it with Mac studio?

  • @MDMZ

    @MDMZ

    1 year ago

    Yes, you can! this works on the cloud so your computer's brand/model is irrelevant 😊😉

  • @bigdaddysho962

    @bigdaddysho962

    1 year ago

    @@MDMZ Thank you very much, stay healthy🙌

  • @VL20IG
    @VL20IG 1 year ago

    So, after trying a few times and getting all types of different errors, I realized that the problem was not within my settings, but with the unstable free GPU provided.. once I signed up for Colab Pro I ran the same notebook and it worked.

  • @MDMZ

    @MDMZ

    1 year ago

    glad it worked

  • @fatjon6117
    @fatjon6117 9 months ago

    which runtime should i use on colab? T4 or V100

  • @MDMZ

    @MDMZ

    8 months ago

    I recommend u try both, one will cost you more over the other, but u get more speed

  • @samlavi
    @samlavi 9 months ago

    Getting an error msg failing at the Load a Stable tab saying; ModuleNotFoundError: No module named 'jsonmerge'. Even after getting a fresh install file and manually installing jsonmerge using pip install jsonmerge. Anyone else had this issue and managed to solve it?

  • @MDMZ

    @MDMZ

    9 months ago

    hey, please visit Alex's discord for technical support, link in the description

  • @user-rq8km3us3r
    @user-rq8km3us3r 1 year ago

    Are subscription members allowed unlimited use of generation

  • @ProtRifprottoyislam
    @ProtRifprottoyislam 1 year ago

    is it not possible to do the same with stable diffusion?

  • @MDMZ

    @MDMZ

    1 year ago

    warpfusion results are much more consistent

  • @essencialreal
    @essencialreal 1 year ago

    So, do I have to pay on Patreon to have access to Warpfusion online? I didn't understand how to access it. Can I buy it? I can't run it on my PC, I have a poor 3070.

  • @MDMZ

    @MDMZ

    1 year ago

    you dont need your local GPU for this method

  • @andrestamashiro
    @andrestamashiro 10 months ago

    I can't do it because google colab disconnects all the time in the 5th, 6th step so I have to start again. Is there any way to solve that?

  • @MDMZ

    @MDMZ

    10 months ago

    try using the latest version of warpfusion

  • @myronkoch
    @myronkoch 1 year ago

    will this work on a Mac m1?

  • @MDMZ

    @MDMZ

    1 year ago

    this is the online method, it should work, I suggest you try it out u have nothing to lose

  • @aidigitalgoddess
    @aidigitalgoddess 10 months ago

    Hi, i used this tutorial and i have a question, why is my video at the end only 4 second if i uploaded video on 16 sec, did i do something wrong? i'm new in AI :(

  • @MDMZ

    @MDMZ

    10 months ago

    probably, check the step at 7:36 and make sure you set the right frame range, [0,0] to process all frames
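
To make the [0,0] convention concrete: the value is a [start, end] pair where an end of 0 is treated as "through the last frame", so [0,0] covers the whole clip, while something like [0,96] stops after 96 frames (about 4 seconds at 24 fps, which matches the short outputs described above). A small hypothetical sketch of that interpretation:

    # Illustrative only: resolve a [start, end] frame range where end == 0
    # means "process every extracted frame".
    def resolve_frame_range(frame_range, total_frames):
        start, end = frame_range
        if end == 0 or end > total_frames:
            end = total_frames
        return start, end

    print(resolve_frame_range([0, 0], 400))   # -> (0, 400): the whole video
    print(resolve_frame_range([0, 96], 400))  # -> (0, 96): roughly 4 s at 24 fps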

  • @ojasvisingh786
    @ojasvisingh786 1 year ago

    🎉🎉

  • @jabeeyow186
    @jabeeyow186 11 months ago

    i have an error says OS is not define how to fix it? tia

  • @thekarmicbrat
    @thekarmicbrat 10 months ago

    Can this also work with still images or is it only video to video?

  • @MDMZ

    @MDMZ

    10 months ago

    for images i suggest you use stable diffusion on A1111, it's free and easier to use

  • @goldalemanha6330
    @goldalemanha6330 1 year ago

    Please bring a mobile option. I don't have a PC and I wanted to do this on my phone 😢

  • @riyando
    @riyando 1 year ago

    is there any free alternative?

  • @valideliyev8243
    @valideliyev8243 1 year ago

    what about how to install to PC (Auto1111 )not google navigate?

  • @MDMZ

    @MDMZ

    1 year ago

    I have another tutorial on A1111, but this method works better in many scenarios

  • @valideliyev8243

    @valideliyev8243

    1 year ago

    @@MDMZ no I need this effect how to make in from pc not in google navigate.

  • @OwlFatherTarnished
    @OwlFatherTarnished 1 year ago

    I followed the video step by step, But i generated a video of 4 seconds. Any tips on how to get a longer video ??

  • @MDMZ

    @MDMZ

    1 year ago

    did you change your end frame from 0 to another number ?

  • @KoyaEry
    @KoyaEry 11 months ago

    Will it also work when using a Macbook?

  • @MDMZ

    @MDMZ

    11 months ago

    i suggest you try, cause this is the cloud method

  • @stevopatiz
    @stevopatiz 10 months ago

    I tried to link my video after I uploaded the file but I get "FileNotFoundError: [WinError 2] The system cannot find the file specified: '/FILENMAME'". I linked it just like you did in the video. Any help is appreciated!

  • @MDMZ

    @MDMZ

    10 months ago

    can you try the process from scratch? it might be referring to another setup file

  • @stevopatiz

    @stevopatiz

    10 months ago

    @@MDMZ I've uninstalled and reinstalled everything the local guide said to install. It seems it has trouble finding the video? I put everything in the same folder.

  • @bboysounds
    @bboysounds 1 year ago

    Hey! my run crashed at line 4: controlnet_multimodel = get_value('controlnet_multimodel',guis) NameError: name 'get_value' is not defined Could you help?

  • @MDMZ

    @MDMZ

    1 year ago

    hi, check the description

  • @dancewithrajiv94
    @dancewithrajiv94 1 year ago

    there is an error , "NameError: name 'get_value' is not defined". how do I fix this. please help !

  • @MDMZ

    @MDMZ

    1 year ago

    hi, check the pinned comment for technical support

  • @blurise
    @blurise 1 year ago

    @mdmz I Guess i know the answer because of the GPU, but can i somehow use this with my Surface Pro? and does anybody have maybe an alternative app or programm?

  • @MDMZ

    @MDMZ

    1 year ago

    this runs online, your hardware doesn't matter here

  • @blurise

    @blurise

    1 year ago

    @MDMZ so why is everybody in the comments talking about the hardware and how long rendering takes?

  • @MDMZ

    @MDMZ

    1 year ago

    @@blurise cause people nowadays don't even bother watching, the info is literally in the video

  • @blurise

    @blurise

    1 year ago

    @@MDMZ okay you got me 🥲

  • @MDMZ

    @MDMZ

    1 year ago

    @@blurise 🤣

  • @jaknowsss
    @jaknowsss 1 year ago

    hi there, is 4070ti with 12gb vram will work? for local runtime?

  • @MDMZ

    @MDMZ

    1 year ago

    yep should work fine

  • @jaknowsss

    @jaknowsss

    1 year ago

    @@MDMZ do you think 4070ti 12gb is faster than the one with the colab plan?

  • @MDMZ

    @MDMZ

    1 year ago

    @@jaknowsss I'm not sure 😅, anything stopping you from trying it out ?

  • @MDMZ

    @MDMZ

    1 year ago

    I suggest you try it locally first since u have 12gb, before paying for colab pro

  • @chrissmarrujo5869
    @chrissmarrujo5869 1 year ago

    Hi! Is this thing works with stable_warpfusion_v0_14_14.ipynb version?

  • @MDMZ

    @MDMZ

    1 year ago

    it should, you can always move on to the newest version, settings shouldnt be much different
