38 - Connecting to Local LLM Server with Microsoft Semantic Kernel

Science & Technology

The Phi-2 model from Microsoft is a compact Large Language Model (LLM) with 2.7 billion parameters. Developed by Microsoft Research, this artificial neural network is proficient at generating natural language text. Trained on synthetic data produced by the GPT-3.5 model, Phi-2 is designed to excel at tasks like reasoning, language comprehension, and coding. Notably, it offers high performance at a smaller size and lower computational cost than other LLMs. #llm #local #semantic #kernel
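The video connects Microsoft Semantic Kernel (a .NET library) to a local model server. As a rough sketch of the same idea, the snippet below calls an OpenAI-compatible local endpoint directly from Python. The address `localhost:1234` and the model name `"phi-2"` are assumptions based on LM Studio's defaults, not details taken from the video; adjust them to your own setup.

```python
import json
import urllib.request

# Assumed default address of LM Studio's OpenAI-compatible local server.
BASE_URL = "http://localhost:1234/v1"


def build_chat_request(prompt: str, model: str = "phi-2") -> dict:
    """Build an OpenAI-style chat-completion payload for the local server.

    The model name "phi-2" is a placeholder; LM Studio typically serves
    whichever model is currently loaded regardless of this field.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.7,
    }


def ask_local_llm(prompt: str) -> str:
    """POST the payload to the local endpoint and return the reply text."""
    data = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Requires a local server (e.g. LM Studio) to be running.
    print(ask_local_llm("Explain Semantic Kernel in one sentence."))
```

The same endpoint can be registered in Semantic Kernel's C# SDK as an OpenAI chat-completion service with the base URL overridden to the local server.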

Comments: 9

  • @skaus123
    5 months ago

    Does LM Studio offer faster-whisper models for speech-to-text with lower latency?

  • @VinothRajendran
    5 months ago

    I don't have that information at the moment; I'll need to verify.

  • @skaus123
    5 months ago

    @VinothRajendran Seems it doesn't; it focuses only on text generation.

  • @VijayDChauhaan
    3 months ago

    Or can we use the faster-whisper library?

  • @jorgeromero4680
    4 months ago

    it's not HTTP

  • @MrBigdogtim69
    2 months ago

    Great stuff! How would I connect to a local LLM in a .NET MAUI app? I won't be able to run a web server on iOS or Android; it needs to be offline only.

  • @VinothRajendran
    2 months ago

    I believe it's feasible. Refer to this guide on connecting to a local web service: learn.microsoft.com/en-us/dotnet/maui/data-cloud/local-web-services?view=net-maui-8.0

  • @MrBigdogtim69
    2 months ago

    @VinothRajendran That guide covers developing and deploying on an emulator or simulator. I want to fully embed it on the device.

  • @MrBigdogtim69
    2 months ago

    @VinothRajendran Actually, the link is about setting up emulators to connect to web services running locally, i.e. a typical client-server app. I want to embed the model into the mobile app itself; for example, I want to consume Phi-3.
