Phi-3 with ONNX Runtime at the edge

Science & Technology

Use Olive and ONNX Runtime to efficiently run inference with the Phi-3 model at the edge, in mobile and web applications.

Part 1: Getting optimized models for mobile and web platforms - Optimizing Phi-3 for chosen hardware with Olive (see the sketch after this list): github.com/microsoft/Olive/tr...
Part 2: Running Phi-3 on an Android phone - Android chat app with Phi-3 and ONNX Runtime Mobile: github.com/microsoft/onnxrunt...
Part 3: Running Phi-3 in the browser - Web chat app with Phi-3 and ONNX Runtime Web: github.com/microsoft/onnxrunt...
Blog: Enjoy the Power of Phi-3 with ONNX Runtime on your device: huggingface.co/blog/Emma-N/en...
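
As a rough illustration of the Part 1 step, here is a minimal sketch that kicks off an Olive optimization workflow from Python. It assumes Olive is installed (pip install olive-ai) and that a workflow config file already exists; the config file name and the passes mentioned in the comments are assumptions for illustration, not the exact recipe from the linked example.

```python
# Minimal sketch: run an Olive optimization workflow from Python.
# Assumption: a config file named "phi3_config.json" describes the input model
# (e.g. microsoft/Phi-3-mini-4k-instruct), the target hardware (CPU/GPU/NPU),
# and the passes to apply (ONNX conversion, graph optimization, int4 quantization).
from olive.workflows import run as olive_run

# Produces an optimized ONNX model in the output folder named by the config.
olive_run("phi3_config.json")
```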
Learn more
Olive: microsoft.github.io/Olive/
ONNX Runtime: onnxruntime.ai/
ONNX Runtime Generate API: github.com/microsoft/onnxrunt...
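
The mobile and web chat samples above all drive the optimized model through the ONNX Runtime Generate API. Below is a minimal Python sketch of that generation loop, assuming the onnxruntime-genai package and a locally available optimized Phi-3 model folder; the folder name and prompt are placeholders, and exact method names can differ between releases of the package.

```python
# Minimal sketch of token generation with the ONNX Runtime Generate API (Python).
# Assumptions: pip install onnxruntime-genai, plus an optimized Phi-3 model folder
# (for example, one produced by the Olive step above). The folder name, prompt,
# and search options are illustrative only.
import onnxruntime_genai as og

model = og.Model("phi3-mini-4k-instruct-onnx")   # placeholder path to the model folder
tokenizer = og.Tokenizer(model)
stream = tokenizer.create_stream()

# Phi-3 chat prompt template
prompt = "<|user|>\nWhat does ONNX Runtime do?<|end|>\n<|assistant|>\n"
input_tokens = tokenizer.encode(prompt)

params = og.GeneratorParams(model)
params.set_search_options(max_length=256)
params.input_ids = input_tokens

# Generate tokens one at a time and stream them to the console.
generator = og.Generator(model, params)
while not generator.is_done():
    generator.compute_logits()
    generator.generate_next_token()
    print(stream.decode(generator.get_next_tokens()[0]), end="", flush=True)
```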

Comments: 6

  • @l.halawani (17 days ago)

    Great demo, thank you!

  • @cocooooooooo9150 (1 month ago)

    Is there an example of using the Phi-3 vision model on Android?

  • @ONNXRuntime (1 month ago)

    It's coming!

  • @MrBigdogtim69 (1 month ago)

    Great content and exciting to see! Is there an example using MAUI?

  • @ONNXRuntime (1 month ago)

    Here is a simple example of ORT Mobile for MAUI: github.com/microsoft/onnxruntime-inference-examples/tree/8dc4650a77b849fb706bee1bdaa4dda91a52511d/mobile/examples/Maui/MauiVisionSample

  • @MrBigdogtim69 (1 month ago)

    @ONNXRuntime Thank you!
