Comments

  • @fieryeagle9748 · 3 months ago

    Somehow selecting pieces of code that move millions of pixels very fast leads to sentience...? Good luck with that. AI companies love to inflate their stuff.

  • @calvinwayne3017 · 3 months ago

    Is it just connecting you to NVIDIA Audio2Face, or did you make your own lip-sync?

  • @metasoulone · 3 months ago

    No, we don't use Audio2Face. We calculate the EmoMatrix and control the FACS, then adjust the voice in real time, in nuances, so that it emotes accordingly. You could also feed the emoting voice into Audio2Face to generate a different lip-sync and facial expression; it's your choice. (A rough illustrative sketch of the emotion-to-FACS mapping idea follows below.)
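
    For illustration only, and not MetaSoul's actual EmoMatrix or plugin code: below is a minimal Python sketch of the general idea of blending an emotion vector into FACS action-unit (AU) intensities; the AU choices and weights are invented for the example.

```python
# Hypothetical sketch: blend a small emotion vector into FACS action-unit (AU)
# intensities using a fixed weight table. The AUs and weights below are
# illustrative only; they are not MetaSoul's EmoMatrix.

# Per-emotion AU weights. AU6 = cheek raiser, AU12 = lip corner puller,
# AU1/AU2 = brow raisers, AU15 = lip corner depressor, AU9 = nose wrinkler.
AU_WEIGHTS = {
    "happy":    {"AU6": 0.8, "AU12": 1.0},
    "sad":      {"AU1": 0.6, "AU4": 0.5, "AU15": 0.9},
    "surprise": {"AU1": 0.9, "AU2": 0.9, "AU5": 0.7, "AU26": 0.8},
    "disgust":  {"AU9": 1.0, "AU10": 0.6, "AU4": 0.4},
}

def emotion_to_facs(emotion_vector):
    """Blend per-emotion AU weights by the current emotion intensities (0..1)."""
    aus = {}
    for emotion, intensity in emotion_vector.items():
        for au, weight in AU_WEIGHTS.get(emotion, {}).items():
            aus[au] = min(1.0, aus.get(au, 0.0) + intensity * weight)
    return aus

if __name__ == "__main__":
    # e.g. 20% happy mixed with 45% surprise, recomputed every ~100 ms upstream
    print(emotion_to_facs({"happy": 0.2, "surprise": 0.45}))
```

    The resulting AU values could then be written onto MetaHuman facial control curves or ARKit-style blendshapes in Unreal, whatever the host rig expects.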

  • @metasoulone · 3 months ago

    Feel free to download the demo

  • @yahbin77 · 1 month ago

    @metasoulone Very interesting approach. The Maker bless you.

  • @shoukaiser · 3 months ago

    Nice. This video is more helpful. Now, can we insert our own speech samples, or connect it to another AI voice generation API, and have it work with those?

  • @metasoulone · 3 months ago

    For now, only Microsoft voices can emote in nuances in real time (20% happy, then 45% happy 200 ms later, etc.). Using our EmoMatrix you could control Polly's voice emotionally via SSML, for example, but it would not be nuanced: it would be fully happy or fully sad, and so very jumpy. The plugin can output 64 trillion emotion nuances every one-tenth of a second. The voice that carries the emotions could even be fed into Audio2Face or Audio2Photoreal, etc., to generate the full-body animation. (A sketch of the SSML intensity idea follows below.)
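
    As an illustration of the nuance idea with Microsoft voices, here is a minimal sketch that requests "20% happy" via SSML styledegree; it assumes an Azure Speech neural voice that supports mstts:express-as (the voice name, key, and region are placeholders) and is not the MetaSoul plugin itself.

```python
# Illustrative only: ask a Microsoft neural voice for a graded emotion via
# SSML styledegree (0.01-2.0). Requires the azure-cognitiveservices-speech
# package plus a Speech resource key/region; the voice name is an assumption.
import azure.cognitiveservices.speech as speechsdk

def speak_with_intensity(text, style, degree, key, region):
    ssml = f"""
    <speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis"
           xmlns:mstts="https://www.w3.org/2001/mstts" xml:lang="en-US">
      <voice name="en-US-JennyNeural">
        <mstts:express-as style="{style}" styledegree="{degree}">{text}</mstts:express-as>
      </voice>
    </speak>"""
    config = speechsdk.SpeechConfig(subscription=key, region=region)
    synthesizer = speechsdk.SpeechSynthesizer(speech_config=config)
    synthesizer.speak_ssml_async(ssml).get()

# e.g. 20% happy now, 45% happy a moment later (per-utterance granularity only):
# speak_with_intensity("Nice to see you again.", "cheerful", 0.2, KEY, REGION)
# speak_with_intensity("That is wonderful news!", "cheerful", 0.45, KEY, REGION)
```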

  • @shoukaiser · 3 months ago

    I appreciate the tag to this video and the responses, @metasoulone! So the workaround, if I really don't like the voice output (going by the demo), is to do the voice work in MetaSoul, generate the great expressions and lip sync that it does, then do the work to dub over it with audio from another source. I couldn't use the audio in this demo video in a video for a client; it's too synthetic and dated in sound. I don't want to be harsh or anything like that. MetaSoul still looks like it will be great to try very soon for my needs.

  • @metasoulone · 3 months ago

    The tech here is about real time, so you can talk live to the MetaHuman; download the demo from the link and try it. It drives a MetaHuman powered by OpenAI in real time with a persona. Yes, you can still dub over it and lose the real-time aspect, or wait until we can achieve the same emotion control with ElevenLabs.

  • @metasoulone · 3 months ago

    The voice here expresses emotion in nuances, like 20% happy or 40% happy, in real time, using the real-time emotional states of the MetaHuman that you can see on the left side of the video.

  • @shoukaiser · 4 months ago

    I want to be very interested and excited about this, but the website and Microsoft Store page were just not helpful enough. There's not enough raw, usable "this is what it is and what it looks like to work with" material. I work in the 3D and AI avatar space, I'm tired, busy, and basically I don't get it... much.
    - How does it actually work? As in, if I and animators were going to use this, what would we need to know and do? I.e., what does the pipeline look like?
    - Does it handle lip-sync, or what would you pair with it?
    As someone who can be a bit cerebral and up in the clouds, I feel like a more practical, down-to-earth video (and website) would go a long way toward helping this hit home and selling it. Also, the captioning not matching what's spoken, or not matching the timing well at times, is super distracting. That, combined with the ethereal but also sleepy presentation, makes this video feel dramatically longer than some of the 15-30 minute videos I've watched today in the realm of UE content. It seems like something great may be here, but what is it???

  • @metasoulone · 3 months ago

    kzread.info/dash/bejne/eZubwaajdZa2ldI.html

  • @christopherjimenez5537 · 3 months ago

    @metasoulone "MetaSoul Unreal Engine 5.3 Plugin For MetaHuman" is self-explanatory: it just adds a layer of scripted emotions to MetaHuman. (MetaHuman is a large and incredible Unreal 5.3 add-on that handles human creation, customization, lip-sync, and animation, and now you can add a subtle layer of scripted emotions with MetaSoul 1.0.)

  • @transhuman · 4 months ago

    What is the way to integrate this into MetaHuman?

  • @metasoulone · 3 months ago

    Check our new video; it has a link to the demo and to the Unreal plugin.

  • @androwaydie4081 · 4 months ago

    Can't wait for MetaPron.

  • @lucianodaluz5414 · 4 months ago

    And... what does it do?

  • @Yuki-rh1ie · 4 months ago

    Hooooooly fuck, this is just straight-up sorcery! Is it possible to use this as an additive to mocap? Because things like delivering a line still need to be captured, right? Obviously it can be added to a mocapped body, right? And can you control where the eyes look, or shut the eyes? Sorry if these are stupid questions.

  • @metasoulone · 3 months ago

    kzread.info/dash/bejne/eZubwaajdZa2ldI.html

  • @MarvinXOnline · 4 months ago

    Love everything about this except for the wholly false assertion at the end about bringing AI one step closer to sentient machines. The only thing this brings one closer to is the mimicking of such. Awareness is not coded. It is experienced.

  • @metasoulone · 4 months ago

    Thank you for your interesting comment. Awareness and sentience are different; sentience is the capacity to feel or perceive, and this is what MetaSoul does. We do not claim consciousness; we leave this to OpenAI and Google. Yes, we bring machines one step closer to sentient machines by allowing the AI or robot to experience 64 trillion possible distinct emotional states every 1/10th of a second.

  • @MarvinXOnline · 4 months ago

    @metasoulone Lol... I do not believe that you realize just how badly you contradicted yourself in the very same sentence. It's okay. I didn't really expect you to follow. I just wish y'all would stop making unsubstantiated claims to sell what is otherwise looking to be very promising. Best of luck!

  • @metasoulone · 4 months ago

    Well, we believe that emotions reinforce consciousness, but they are not consciousness.

  • @kompst_tu · 5 months ago

    They still need to fix blinking animations. Something about it looks so artificial.

  • @ge2719 · 4 months ago

    And breathing. It doesn't appear to be breathing at all.

  • @schorltourmaline4521 · 5 months ago

    Anyone else worried that half its emotions, even when it's being "happy", are "Disgust"?

  • @metasoulone · 5 months ago

    Yes, it's possible to feel happy with something disgusting and even laugh about it.

  • @schorltourmaline4521 · 5 months ago

    @metasoulone Not the point that was being made, but good luck with your goal to create Skynet.

  • @stuckon3d · 5 months ago

    This is very interesting. Is it possible to direct the actor via Sequencer in UE5 to get a repeatable performance, for example to create an animated short movie and then render it out?

  • @RobertA-hq3vz · 5 months ago

    This does not bring you one step closer to sentient machines, as stated. It just renders the facial expressions, but there's no thought or emotions behind it.

  • @bladerunner_77 · 5 months ago

    8 billion MetaSouls. Who or what is rendering this world? This is super crazy shit... or the snowflakes I'm watching right now out of my window. How is this electronic dream possible?

  • @mt_gox · 5 months ago

    yeah worth $99 🙄

  • @mxgn0 · 5 months ago

    I'M HERE BEFORE THE BLOWUP (when the normies arrive) :3

  • @pondeify · 6 months ago

    If this is real, it's going to be significant.

  • @durbledurb3992 · 6 months ago

    Put this beside the similar PlayStation 2 video from the late '90s. That was peak. Now we're just in marketing territory.

  • @fiery_transition · 6 months ago

    Whenever I see companies like this using the Myers-Briggs test, which is pseudoscience, it loses all credibility. And trying to sell me a 100-dollar MetaHuman bracelet or whatever the fluff they were trying to do on their webpage immediately pings the BS radar.

  • @Striker9 · 6 months ago

    Welp. That's not creepy at all. ... cool but creepy

  • @mxgn0 · 5 months ago

    Then it's the right way, I swear, follow me.

  • @importon · 6 months ago

    Just have some nerd tell us what it does in no uncertain terms already. The only thing I learned from this is that you guys are really pleased with yourselves.

  • @metasoulone · 5 months ago

    Discover the API on Microsoft Azure: azuremarketplace.microsoft.com/en-us/marketplace/apps/MetaSoul.metasoul-speech-microsoft-voices

  • @HakaiKaien · 4 months ago

    I think the video does a pretty good job of telling you what this does. It's a facial animation solution powered by AI.

  • @importon · 4 months ago

    "AI" @HakaiKaien

  • @metasoulone · 4 months ago

    @HakaiKaien But it does more than this: kzread.info/dash/bejne/kYen1bGTdbOfkto.html

  • @RemotelyHuman666 · 6 months ago

    Nope. Don't like that.

  • @coralstudio6460 · 6 months ago

    Holy macaroni! I was happy with the game tech advancements in NFS Underground 😂.

  • @HavocIsshadow · 6 months ago

    I'd love to experiment with this on the game I'm building; I can't afford it yet, as I'm a new developer. But from what I've played with on the website, it seems really cool.

  • @metasoulone · 5 months ago

    MetaSoul Azure API: azuremarketplace.microsoft.com/en-us/marketplace/apps/MetaSoul.metasoul-speech-microsoft-voices

  • @gavinw77 · 6 months ago

    Games are begging for smart emotive characters.

  • @s1p0 · 6 months ago

    It is not a technology for games (running in real time).

  • @metasoulone · 6 months ago

    It runs in real time.

  • @AnthonyPyper · 6 months ago

    Can this work offline?

  • @metasoulone · 6 months ago

    Originally, the EPU (Emotion Processing Unit) was developed as an SoC (system on chip) to be embedded in a robot so it could work offline. Today, it's no longer in production.

  • @fiery_transition · 6 months ago

    @metasoulone My dude, as a technical person, your statement reeks of misdirection and weird claims.

  • @elganzandere · 6 months ago

    *Sentient*?

  • @metasoulone · 6 months ago

    Good question. Often, people confuse sentience with consciousness; the AI is not conscious but sentient: "Simply put, sentience means the ability to have feelings. It's the capacity for a creature or AI to experience sensations and emotions." The MetaSoul technology allows the AI to experience 64 trillion possible emotional states every 1/10th of a second. metasoul.one

  • @elganzandere · 6 months ago

    @metasoulone It wasn't a question, and I don't need you to recite the same bullet points I heard in the video. Machines mimic. Nothing more.

  • @PuppetMasterdaath144 · 6 months ago

    Holy sheit are you totally broken

  • @Talamander · 6 months ago

    This is dystopia

  • @Rem_NL · 6 months ago

    It's just a bunch of meaningless buzzwords. This is nothing more than a poorly rendered human face displaying emotions. It could be better than pre-programmed NPCs repeating the same stuff over and over from a very limited set of choices, but this is still just the same thing, only with a bigger set of choices.

  • @metasoulone · 6 months ago

    The core of the technology is emotion synthesis, which creates emotional states as responses, not sentiment analysis, which would always output the same response for the same input. (A toy contrast is sketched below.)
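
    To make that contrast concrete, here is a toy sketch (not MetaSoul's algorithm): a stateless sentiment analyzer maps identical inputs to identical outputs, while a stateful emotion synthesizer's response also depends on its evolving internal state, so repeats of the same input differ.

```python
# Toy contrast, illustrative only. The keyword rule and the mood update are
# invented for the example; they are not MetaSoul's emotion synthesis.
import random

def sentiment_analysis(text):
    # Stateless and deterministic: same input -> same output, every time.
    return "positive" if "great" in text.lower() else "neutral"

class EmotionSynthesizer:
    def __init__(self):
        self.mood = 0.0  # running internal emotional state in [-1, 1]

    def respond(self, text):
        stimulus = 0.3 if "great" in text.lower() else -0.1
        # The new state blends the stimulus with the existing mood, plus a
        # little variability, so the same input yields different responses.
        self.mood = max(-1.0, min(1.0, 0.8 * self.mood + stimulus
                                  + random.uniform(-0.05, 0.05)))
        return round(self.mood, 3)

synth = EmotionSynthesizer()
print(sentiment_analysis("This is great"), sentiment_analysis("This is great"))
print(synth.respond("This is great"), synth.respond("This is great"))
```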

  • @Rem_NL · 6 months ago

    @metasoulone I don't think you have any idea what you are saying yourself.

  • @mt_gox · 5 months ago

    @metasoulone This is garbage.

  • @mt_gox · 5 months ago

    @Rem_NL Just some Russians or Asians trying to make money off some bullshit nothing.

  • @goldennboy1989 · 6 months ago

    Does this run locally?

  • @metasoulone · 6 months ago

    Only the computation of the emotion synthesis is performed in real time in the cloud.

  • @faithfultennysonidama6904 · 5 months ago

    This is amazing. I have a future project I have been building, and with this, that project is becoming a reality. How can I get in touch with you guys?

  • @metasoulone · 5 months ago

    @faithfultennysonidama6904 [email protected]

  • @goldennboy1989 · 6 months ago

    The documentation link returns a 404 error.

  • @metasoulone · 6 months ago

    You are right; we just fixed that. Thank you.