petals/src/petals/models/bloom
Artem Chumachenko d6f4f80f3f
Fix Mixtral-related issues (#570)
This PR fixes problems related to #569:
- block initialization
- throughput calculation and cache usage
- mixtral in tests

Beam search is disabled for Mixtral and Llama for now. Those models use DynamicCache, which requires a special function to reorder the cached states during beam search (see https://github.com/huggingface/transformers/blob/main/src/transformers/cache_utils.py#L161)
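The reordering requirement can be illustrated with a minimal, self-contained sketch (the `ToyCache`/`reorder` names below are illustrative, not the transformers API): during beam search, each beam keeps its own per-layer key/value cache, and whenever beams are re-ranked the cache entries must be gathered to follow the surviving beams.

```python
class ToyCache:
    """Toy per-layer key/value cache, one entry per beam (illustrative only)."""

    def __init__(self, layers):
        # layers: list of (keys, values) pairs; keys/values are lists
        # indexed by beam position.
        self.layers = layers

    def reorder(self, beam_idx):
        # After beam re-ranking, beam i must continue from the state of
        # the beam it was forked from (beam_idx[i]), so we gather the
        # cached keys/values along the beam dimension.
        for i, (keys, values) in enumerate(self.layers):
            self.layers[i] = (
                [keys[j] for j in beam_idx],
                [values[j] for j in beam_idx],
            )


# Two beams; beam re-ranking picks beam 1 first, then beam 0.
cache = ToyCache([(["k0", "k1"], ["v0", "v1"])])
cache.reorder([1, 0])
print(cache.layers[0])  # (['k1', 'k0'], ['v1', 'v0'])
```

A cache class that lacks such a reorder hook cannot be used with beam search, which is why support was dropped here until the DynamicCache path is wired up.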

---------

Co-authored-by: Max Ryabinin <mryabinin0@gmail.com>
1 month ago
__init__.py Add AutoDistributed{Model, ModelForCausalLM, ModelForSequenceClassification} (#329) 11 months ago
block.py Fix Mixtral-related issues (#570) 1 month ago
config.py Replace dots in repo names when building DHT prefixes (#489) 9 months ago
model.py Bump transformers and accelerate versions (#554) 3 months ago