petals/src/petals/models
Artem Chumachenko d6f4f80f3f
Fix Mixtral-related issues (#570)
This PR fixes problems related to #569:
- block initialization
- throughput calculation and cache usage
- mixtral in tests

Beam search is removed for Mixtral and Llama for now. These models use DynamicCache, which requires a special function to modify during beam search (see https://github.com/huggingface/transformers/blob/main/src/transformers/cache_utils.py#L161).
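
For context, here is a minimal sketch of the reordering step that beam search needs. It assumes transformers 4.38.x, where DynamicCache exposes a reorder_cache method; the tensor shapes are purely illustrative. Legacy tuple caches can be reordered with a plain index_select over the batch dimension, while a DynamicCache has to be reordered through its own API, which is the special handling that is not supported here yet.

```python
# Hedged sketch (not Petals code): reordering the KV cache for beam search.
import torch
from transformers import DynamicCache

cache = DynamicCache()

# Populate one layer with dummy key/value states: (batch, heads, seq_len, head_dim).
key = torch.randn(4, 2, 5, 8)
value = torch.randn(4, 2, 5, 8)
cache.update(key, value, layer_idx=0)

# At each beam-search step, only some beams survive; beam_idx says which
# batch entries each new beam should inherit its cache from.
beam_idx = torch.tensor([0, 0, 2, 3])

# With legacy tuple caches this is a plain index_select over the batch dim:
reordered_key = key.index_select(0, beam_idx)

# With DynamicCache the reorder has to go through the cache object itself
# (reorder_cache in recent transformers releases) instead of manipulating
# the tensors directly, so beam search needs extra support for these models.
cache.reorder_cache(beam_idx)
```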

---------

Co-authored-by: Max Ryabinin <mryabinin0@gmail.com>
3 weeks ago
bloom Fix Mixtral-related issues (#570) 3 weeks ago
falcon Add position_ids argument to DistributedFalconModel (#525) 7 months ago
llama Fix p2p pushing in rpc_inference (by @miaoqijun), support transformers 4.38.2 (#563) 1 month ago
mixtral Fix Mixtral-related issues (#570) 3 weeks ago
__init__.py Add Mixtral models (#553) 1 month ago