gpt4free/g4f/api/_tokenizer.py


# import tiktoken
# from typing import List, Tuple
#
# def tokenize(text: str, model: str = 'gpt-3.5-turbo') -> Tuple[int, List[int]]:
#     encoding = tiktoken.encoding_for_model(model)
#     encoded = encoding.encode(text)
#     num_tokens = len(encoded)
#     return num_tokens, encoded
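
The module above is entirely commented out, so importing it yields no callable. A minimal sketch of how the helper could be revived and used follows; it assumes `tiktoken` may or may not be installed and falls back to a rough whitespace count (not real BPE tokenization) when it is absent. The fallback branch is an illustrative assumption, not part of the original file.

```python
from typing import List, Tuple, Union

try:
    import tiktoken

    def tokenize(text: str, model: str = "gpt-3.5-turbo") -> Tuple[int, List[int]]:
        # Real BPE tokenization via tiktoken, as in the commented-out helper.
        encoding = tiktoken.encoding_for_model(model)
        encoded = encoding.encode(text)
        return len(encoded), encoded

except ImportError:
    def tokenize(text: str, model: str = "gpt-3.5-turbo") -> Tuple[int, List[str]]:
        # Hypothetical fallback: whitespace split gives only a rough token count.
        tokens = text.split()
        return len(tokens), tokens

num_tokens, encoded = tokenize("Hello, world!")
```

Note that the original signature annotated the return type as `Union[int, str]`, but the function actually returns a `(count, encoded_tokens)` tuple; the sketch above corrects that.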