Interactive Disco Diffusion 5.2

Film & Animation

#ai #discodiffusion #aianimation
Interactive?
Yes
Automatic?
Kinda
Real Time?
If you call 2 frames per minute real time,
yes it's real time
created with Disco Diffusion 5.2, TouchDesigner, and a couple more tools
prompt:
"a beautiful painting of a single source of light in a meditative landscape by Asher Durand Brown, Trending on artstation, german romanticism"
If you want to chat with me,
ask something, give compliments,
make a job offer, or talk about commissions
discord is the bestest way to do so
/ discord
made myself a new server
or just email me
saturdaycuteblack@gmail.com

Comments: 29

  • @generator71 · 2 years ago

    This is amazing! May I ask which notebook settings you used for the generated art? Texture is awesome.

  • @ayacyte443 · 2 years ago

    Very very cool!

  • @atom_unhinged · a year ago

    i like your magik

  • @thedorbrothers5047 · a year ago

    Smart!

  • @BeatoxYT · 2 years ago

    Do you have an explainer video/guide on how you made this?! Love this so much.

  • @TomHutchinson5 · 2 years ago

    Good stuff

  • @1DusDB · 2 years ago

    Ok, now with music as input, to be rhythm-sensitive ;-)

  • @techattack · a year ago

    Neat! I wonder whether you could use Google's hand-tracking pipeline to estimate hand position instead of having to use the phone to pipe data in.
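
For anyone curious how the suggestion above might look in practice, here is a minimal sketch using Google's MediaPipe Hands to estimate hand position from a webcam. It assumes the mediapipe and opencv-python packages and a webcam at index 0; it is not the setup used in the video, which pipes phone sensor data in instead.

```python
# Minimal sketch of the hand-tracking suggestion above, not the author's setup.
# Assumes `pip install mediapipe opencv-python` and a webcam at index 0.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.5)
cap = cv2.VideoCapture(0)

try:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            wrist = results.multi_hand_landmarks[0].landmark[0]  # normalized 0..1 coords
            print(f"hand at x={wrist.x:.2f}, y={wrist.y:.2f}")
except KeyboardInterrupt:
    pass
finally:
    cap.release()
    hands.close()
```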

  • @thesoundarchitects.official · a year ago

    Amazing stuff, really! I'm trying to do something similar, but I'm struggling with mapping movement to translation x and y. What range did you use?

  • @hotmigchannel · 2 years ago

    Hello! Which program did you use in this video to rotate the camera in DD?

  • @cuteblack4252 · 2 years ago

    Sensor Kinetics Pro from the Play Store, to read and export the smartphone's sensors.

  • @kristapskazaks4015 · 2 years ago

    Very, very cool. If you don't mind me asking, how did you manage to feed in the live video input? Is it somehow piped in through the video init while DD runs as normal, or is there a bigger modification to the DD code?

  • @cuteblack4252 · 2 years ago

    No, it's simple. TouchDesigner slices the video into a processed image sequence in real time, and that gets mounted onto Google Drive via some custom TouchDesigner node I found on Google. Then you have to upload the mounted images into Colab "manually," and then it gets disco-diffusioned. That's why I'm calling it semi-automatic. 'Uploading the image into Colab' and 'downloading the image from Colab' are the only two things that can't be automated. Still, only two clicks are required. I think the whole process could be automated, and I think I could do it if I had a GPU sufficient to handle DD. But I don't, so I had to use Colab.
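
A rough sketch of the Drive handoff described in the reply above, not the author's exact setup: it assumes TouchDesigner is writing frames into a hypothetical Drive folder named td_frames/ and is meant to run inside the Disco Diffusion Colab, where google.colab provides the Drive mount.

```python
# Rough sketch of the Drive handoff described above, not the author's setup.
# Assumes TouchDesigner saves frames to a (hypothetical) folder td_frames/ on
# Google Drive; run inside the Disco Diffusion Colab notebook.
import glob
import os

from google.colab import drive

drive.mount('/content/drive')

frame_dir = '/content/drive/MyDrive/td_frames'   # hypothetical input folder
out_dir = '/content/drive/MyDrive/dd_frames'     # hypothetical output folder
os.makedirs(out_dir, exist_ok=True)

# Pick the newest frame TouchDesigner has written so far and use it as the init.
frames = sorted(glob.glob(os.path.join(frame_dir, '*.png')), key=os.path.getmtime)
latest = frames[-1] if frames else None
print('init_image for this run:', latest)

# In the DD settings cell you would then point the run at that frame, e.g.:
#   init_image = latest
#   skip_steps = 50   # a common value when starting from an init image (a guess here)
```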

  • @kristapskazaks4015 · 2 years ago

    @@cuteblack4252 Wait, so how did you get it to work in real time / interactively? Is the input video generated beforehand in TouchDesigner, uploaded to Google Drive, and then DD is launched manually and you get the processed video back, or are you somehow processing each frame as it comes from TouchDesigner? Thanks for the reply, appreciate it!

  • @TJ-om4zy · 2 years ago

    @@kristapskazaks4015 The Disco Diffusion stuff isn't in real time; it can't be. The video is being turned into images in real time, then those images are used in making the DD stuff. We are still a ways away from DD being able to render that quickly.

  • @xthefacelessbassistx · 2 years ago

    @@cuteblack4252 Wait, please explain this further to me! I have been using DD for about a month now and am familiar with making interactive projections in TouchDesigner, but this interactive diffusion thing has got me going! I friggen love it.

  • @cuteblack4252 · 2 years ago

    @@xthefacelessbassistx Even more detailed information??? OK, I'll try. It's a super, super simple experiment I did for an art school assignment within a couple of days. First of all, it's an interactive Disco Diffusion project, but not real time; that's impossible. Secondly, when using an init image, Disco Diffusion struggles to generate on pure black and white, while the image gets generated the most on medium-saturation/brightness colors, or gray, just like it's shown in the video. So, using that idea, I created an image sequence of a gray ball tracking my hand coordinates, with TouchDesigner, since that seemed the easiest way to do it. Using that image sequence as the init image, I was able to create an interactive-like video just by waving my hand at the camera. Simple stuff. The third and fourth parts are even simpler: this app called 'Sensor Kinetics Pro' that I bought on the Play Store can export the phone's rotation vector as CSV data, and you can just copy and paste that data into the rotation vector x, y, z fields in the Disco Diffusion animation settings. Done.
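
A hedged sketch of the phone-rotation part of the explanation above, not the author's script: it assumes a hypothetical rotation.csv exported by Sensor Kinetics Pro with columns named x, y, z, and that Disco Diffusion's 3D animation mode is used, where rotation_3d_x/y/z accept keyframe strings such as "0: (0.0), 1: (0.01), ..."; the scale and stride values are placeholders, not the ones used in the video.

```python
# Hedged sketch of the rotation-vector trick described above, not the author's
# script. Assumes a hypothetical rotation.csv with columns x, y, z and DD's 3D
# animation mode, where rotation_3d_x/y/z take keyframe strings.
import csv

def csv_to_keyframes(path, column, scale=1.0, stride=1):
    """Turn one CSV column into a DD keyframe string, one keyframe per frame."""
    values = []
    with open(path, newline='') as f:
        for row in csv.DictReader(f):
            values.append(float(row[column]) * scale)
    samples = values[::stride]  # thin the sensor stream to roughly one sample per frame
    return ", ".join(f"{i}: ({v:.4f})" for i, v in enumerate(samples))

# scale and stride below are placeholder guesses, not the values used in the video.
rotation_3d_x = csv_to_keyframes("rotation.csv", "x", scale=0.05, stride=10)
rotation_3d_y = csv_to_keyframes("rotation.csv", "y", scale=0.05, stride=10)
rotation_3d_z = csv_to_keyframes("rotation.csv", "z", scale=0.05, stride=10)
print(rotation_3d_x[:80], "...")  # paste these strings into the DD animation settings
```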

  • @Tom-xn9kx · 2 years ago

    2 frames a second is still amazing. How is it so fast, when it generally takes me 10 minutes to make an image in Disco Diffusion?

  • @tsdoihasdoihasdoih2493 · 2 years ago

    setup must be a hundred RTXs

  • @masegado · 2 years ago

    It's 2 frames per minute according to the description, not per second =)

  • @euclideanplane · 2 years ago

    @@masegado yeah, people hear what they want to hear, even when they read, haha.

  • @TheOlderSoldier · 2 years ago

    He's recording the motion data, then feeding it into each frame's init and syncing after the fact... pretty cool results!

  • @comble999 · 2 years ago

    DD in real time? What kind of 2030 technique is it lol

  • @euclideanplane · 2 years ago

    It's not in real time.

  • @euclideanplane · 2 years ago

    There is a program called "Dream Studio" that gandamu made recently; it works quite well. Do you have a Discord handle where I could invite you to the dev server he and I are a part of? Honestly, I kinda stopped using his Dream Studio just because the warp notebook works so well with video inits; you could just record your surroundings with a phone camera and get far better quality output than anything we're doing. But, idk, programs like this could prove useful to me again in the near future when DALL-E 2 notebooks come out. kzread.info/dash/bejne/np2h1ah-g5SuZ6w.html
