llamamodel: free the batch in embedInternal

Signed-off-by: Jared Van Bortel <jared@nomic.ai>
Branch: fix-missing-batch-free
Author: Jared Van Bortel (3 weeks ago)
parent 61cefcfd8a
commit 9604d76708

@@ -940,6 +940,8 @@ void LLamaModel::embedInternal(
     }
     if (tokenCount) { *tokenCount = totalTokens; }
+    llama_batch_free(batch);
 }
 #if defined(_WIN32)
