Hi, I'm Kante Yin 👋

R&D Engineer • Open Source Enthusiast • Cat Owner • Sports Fan ⚽️ 🏀 🥊


KubeCon London - Sailing Multi-Host Inference with LWS

[Slides] [Project]

Inference workloads are becoming increasingly prevalent and vital in the Cloud Native world. However, serving them is not easy: one of the biggest challenges is that large foundation models cannot fit on a single node, which calls for distributed inference with model parallelism and, in turn, makes serving inference workloads even more complicated.

Read more...

KServe, AIBrix, and llmaz

As a follower of and active contributor to inference platforms, I created the llmaz project to provide a unified inference platform for LLMs, and I also joined the AIBrix community to help build the next-gen GenAI infrastructure.

Read more...