petals/src/petals/models/llama
justheuristic 2ad0b2b936
Fix p2p pushing in rpc_inference (by @miaoqijun), support transformers 4.38.2 (#563)
This pull request solves #560 using a solution proposed by @miaoqijun.
It also bumps transformers to the latest version so the fix is tested against the latest transformers code.

---------

Co-authored-by: Yingtong Dou <ytongdou@gmail.com>
2 months ago
__init__.py Add AutoDistributed{Model, ModelForCausalLM, ModelForSequenceClassification} (#329, see the usage sketch below) 11 months ago
block.py Fix p2p pushing in rpc_inference (by @miaoqijun), support transformers 4.38.2 (#563) 2 months ago
config.py Force use_cache=True in config only (#497) 8 months ago
model.py Fix p2p pushing in rpc_inference (by @miaoqijun), support transformers 4.38.2 (#563) 2 months ago
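
The __init__.py entry above refers to the AutoDistributed* wrapper classes added in #329. As a rough illustration of how those wrappers are typically used with the Llama models in this directory, here is a minimal sketch based on the public Petals API described in its README; the model name is an illustrative placeholder and is not implied by this listing.

```python
# Minimal usage sketch (assumes the public Petals API from its README;
# the model name below is an illustrative placeholder).
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "petals-team/StableBeluga2"  # any Llama-family checkpoint served by a swarm

tokenizer = AutoTokenizer.from_pretrained(model_name)
# Embeddings and the LM head run locally; the transformer blocks are executed
# by remote servers in the swarm via rpc_inference.
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("A cat sat on", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0]))
```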