Merge pull request #403 from emory/patch-1

Update mixtral.en.mdx (very minor)
Elvis Saravia 2 months ago committed by GitHub
commit aeda144340

@@ -71,7 +71,7 @@ In addition, the model's perplexity decreases monotonically as the size of conte
 A Mixtral 8x7B - Instruct model is also released together with the base Mixtral 8x7B model. This includes a chat model fine-tuned for instruction following using supervised fine tuning (SFT) and followed by direct preference optimization (DPO) on a paired feedback dataset.
-As of the writing of this guide (28 January 2028), Mixtral ranks 8th on the [Chatbot Arena Leaderboard](https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard) (an independent human evaluation conducted by LMSys).
+As of the writing of this guide (28 January 2024), Mixtral ranks 8th on the [Chatbot Arena Leaderboard](https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard) (an independent human evaluation conducted by LMSys).
 <Screenshot src={mixtralchat} alt="Mixtral Performance on the Chatbot Arena" />
