phixtral-2x2_8

phixtral-2x2_8 is the first Mixture of Experts (MoE) built from two microsoft/phi-2 models, inspired by the mistralai/Mixtral-8x7B-v0.1 architecture.
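
A minimal sketch of how such a model could be loaded and queried with the Hugging Face transformers library. The repository ID "mlabonne/phixtral-2x2_8" and the need for trust_remote_code=True are assumptions (custom MoE wrappers are typically shipped as remote code on the Hub); check the model card before relying on them.

```python
# Sketch: loading phixtral-2x2_8 via Hugging Face transformers.
# The repo ID and trust_remote_code requirement are assumptions, not
# confirmed by this page; consult the model card for the exact setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlabonne/phixtral-2x2_8"  # assumed Hub repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision to fit on a single consumer GPU
    device_map="auto",           # place layers on available devices automatically
    trust_remote_code=True,      # the MoE routing code is loaded from the repo
)

prompt = "Explain what a Mixture of Experts model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```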
