phixtral-2x2_8 is the first Mixture of Experts (MoE) model built from two microsoft/phi-2 models, inspired by the mistralai/Mixtral-8x7B-v0.1 architecture.
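As a minimal sketch of how the model could be loaded with 🤗 Transformers (assuming the checkpoint lives on the Hugging Face Hub under a repo id like `mlabonne/phixtral-2x2_8` and that the custom MoE modeling code ships with the repo, requiring `trust_remote_code=True`):

```python
# Minimal sketch: load phixtral-2x2_8 and run a short generation.
# The repo id below is an assumption; adjust it to the actual Hub path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlabonne/phixtral-2x2_8"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision to fit on a single GPU
    device_map="auto",           # requires the `accelerate` package
    trust_remote_code=True,      # loads the custom MoE (phi-2 experts) code
)

prompt = "Instruct: Write a haiku about mixtures of experts.\nOutput:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```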