gpt4all/gpt4all-bindings/typescript
Jacob Nguyen 545c23b4bd
typescript: fix final bugs and polishing, circle ci documentation (#960)
* fix: esm and cjs compatibility

Signed-off-by: Jacob Nguyen <76754747+jacoobes@users.noreply.github.com>

* Update prebuild.js

Signed-off-by: Jacob Nguyen <76754747+jacoobes@users.noreply.github.com>

* fix gpt4all.js

Signed-off-by: Jacob Nguyen <76754747+jacoobes@users.noreply.github.com>

* Fix compile for Windows and Linux again. PLEASE DON'T REVERT THIS!

* version bump

* polish up spec and build scripts

* lock file refresh

* fix: proper resource closing and error handling

* check to make sure libPath is not null

* add msvc build script and update readme requirements

* python workflows in circleci

* dummy python change

* no need for main

* second hold for pypi deploy

* let me deploy pls

* bring back when condition

* Typo, ignore list  (#967)

Fix typo in javadoc,
Add word to ignore list for codespellrc

---------

Co-authored-by: felix <felix@zaslavskiy.net>

* llmodel: change tokenToString to not use string_view (#968)

Fixes a definite use-after-free and likely avoids some other potential ones. A std::string will convert to a std::string_view automatically, but as soon as the std::string in question goes out of scope it is freed and the string_view is left pointing at freed memory. This is *mostly* fine if it's returning a reference to the tokenizer's internal vocab table, but it's, imo, too easy to return a reference to a dynamically constructed string this way, as replit is doing (and unfortunately needs to do to convert the internal whitespace replacement symbol back to a space).

* Initial Library Loader for .NET Bindings / Update bindings to support newest changes (#763)

* Initial Library Loader

* Load library as part of Model factory

* Dynamically search and find the dlls

* Update tests to use locally built runtimes

* Fix dylib loading, add macos runtime support for sample/tests

* Bypass automatic loading by default.

* Only set CMAKE_OSX_ARCHITECTURES if not already set, allow cross-compile

* Switch Loading again

* Update build scripts for mac/linux

* Update bindings to support newest breaking changes

* Fix build

* Use llmodel for Windows

* Actually, it does need to be libllmodel

* Name

* Remove TFMs, bypass loading by default

* Fix script

* Delete mac script

---------

Co-authored-by: Tim Miller <innerlogic4321@ghmail.com>

* bump llama.cpp mainline to latest (#964)

* fix prompt context so it's preserved in class

* update setup.py

* metal replit (#931)

metal+replit

makes replit work with Metal and removes its use of `mem_per_token`
in favor of fixed size scratch buffers (closer to llama.cpp)

* update documentation scripts and generation to include readme.md

* update readme and documentation for source

* begin tests, import jest, fix listModels export

* fix typo

* chore: update spec

* fix: finally, reduced potential of empty string

* chore: add stub for createTokenStream

* refactor: protecting resources properly

* add basic jest tests

* update

* update readme

* refactor: namespace the res variable

* circleci integration to automatically build docs

* add starter docs

* typo

* more circle ci typo

* forgot to add nodejs circle ci orb

* fix circle ci

* feat: @iimez verify download and fix prebuild script

* fix: oops, option name wrong

* fix: gpt4all utils not emitting docs

* chore: fix up scripts

* fix: update docs and typings for md5 sum

* fix: macos compilation

* some refactoring

* Update index.cc

Signed-off-by: Jacob Nguyen <76754747+jacoobes@users.noreply.github.com>

* update readme and enable exceptions on mac

* circle ci progress

* basic embedding with sbert (not tested & cpp side only)

* fix circle ci

* fix circle ci

* update circle ci script

* bruh

* fix again

* fix

* fixed required workflows

* fix ci

* fix pwd

* fix pwd

* update ci

* revert

* fix

* prevent rebuild

* remove noop

* Update continue_config.yml

Signed-off-by: Jacob Nguyen <76754747+jacoobes@users.noreply.github.com>

* Update binding.gyp

Signed-off-by: Jacob Nguyen <76754747+jacoobes@users.noreply.github.com>

* fix fs not found

* remove cpp 20 standard

* fix warnings, safer way to calculate arrsize

* readd build backend

* basic embeddings and yarn test

* fix circle ci

Signed-off-by: Jacob Nguyen <76754747+jacoobes@users.noreply.github.com>

Update continue_config.yml

Signed-off-by: Jacob Nguyen <76754747+jacoobes@users.noreply.github.com>

Signed-off-by: Jacob Nguyen <76754747+jacoobes@users.noreply.github.com>

fix macos paths

update readme and roadmap

split up spec

update readme

check for url in modelsjson

update docs and inline stuff

update yarn configuration and readme

update readme

readd npm publish script

add exceptions

bruh one space broke the yaml

codespell

oops forgot to add runtimes folder

bump version

try code snippet https://support.circleci.com/hc/en-us/articles/8325075309339-How-to-install-NPM-on-Windows-images

add fallback for unknown architectures

attached to wrong workspace

hopefully fix

moving everything under backend to persist

should work now


* Update README.md

Signed-off-by: Jacob Nguyen <76754747+jacoobes@users.noreply.github.com>

---------

Signed-off-by: Jacob Nguyen <76754747+jacoobes@users.noreply.github.com>
Co-authored-by: Adam Treat <treat.adam@gmail.com>
Co-authored-by: Richard Guo <richardg7890@gmail.com>
Co-authored-by: Felix Zaslavskiy <felix.zaslavskiy@gmail.com>
Co-authored-by: felix <felix@zaslavskiy.net>
Co-authored-by: Aaron Miller <apage43@ninjawhale.com>
Co-authored-by: Tim Miller <drasticactions@users.noreply.github.com>
Co-authored-by: Tim Miller <innerlogic4321@ghmail.com>
Directory contents: .yarn/releases, scripts, spec, src, test, .gitignore, .npmignore, .yarnrc.yml, binding.ci.gyp, binding.gyp, index.cc, index.h, package.json, prompt.cc, prompt.h, README.md, yarn.lock

GPT4All Node.js API

yarn add gpt4all@alpha

npm install gpt4all@alpha

pnpm install gpt4all@alpha

The original GPT4All typescript bindings are now out of date.

Chat Completion (alpha)

import { createCompletion, loadModel } from '../src/gpt4all.js'

const ll = await loadModel('ggml-vicuna-7b-1.1-q4_2', { verbose: true });

const response = await createCompletion(ll, [
    { role : 'system', content: 'You are meant to be annoying and unhelpful.'  },
    { role : 'user', content: 'What is 1 + 1?'  } 
]);
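
The resolved response mirrors an OpenAI-style chat completion object (treat the exact field names below as an assumption and check the typings in src/ for the real shape), so the generated text can be read roughly like this:

// assumed OpenAI-style result shape; verify against the typings in src/
console.log(response.choices[0].message.content);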

Embedding (alpha)

import { createEmbedding, loadModel } from '../src/gpt4all.js'

const ll = await loadModel('ggml-all-MiniLM-L6-v2-f16', { verbose: true });

const fltArray = createEmbedding(ll, "Pain is inevitable, suffering optional");
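
createEmbedding returns a plain array of floats, so ordinary vector math applies to the result. As an illustration only (the helper below is not part of the bindings), cosine similarity between two embeddings can be computed like this, continuing from the example above:

const a = createEmbedding(ll, "Pain is inevitable");
const b = createEmbedding(ll, "Suffering is optional");

// cosine similarity between two equal-length numeric arrays
function cosineSimilarity(x, y) {
    let dot = 0, nx = 0, ny = 0;
    for (let i = 0; i < x.length; i++) {
        dot += x[i] * y[i];
        nx += x[i] * x[i];
        ny += y[i] * y[i];
    }
    return dot / (Math.sqrt(nx) * Math.sqrt(ny));
}

console.log(cosineSimilarity(a, b));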

API

  • The Node.js API has made strides to mirror the Python API. It is not 100% mirrored yet, but many pieces of the API resemble their Python counterparts.
  • Everything should work out of the box.
  • docs

Build Instructions

  • binding.gyp is the compile config
  • Tested on Ubuntu. Everything seems to work fine.
  • Tested on Windows. Everything works fine.
  • Sparse testing on macOS.
  • MinGW also works to build the gpt4all-backend. HOWEVER, this package works only with MSVC-built DLLs.

Requirements

  • git
  • node.js >= 18.0.0
  • yarn
  • node-gyp
    • all of its requirements.
  • (unix) gcc version 12
  • (win) MSVC version 143
    • Can be obtained with Visual Studio 2022 build tools
  • python 3

Build (from source)

git clone https://github.com/nomic-ai/gpt4all.git
cd gpt4all/gpt4all-bindings/typescript
  • The below shell commands assume the current working directory is typescript.

  • To Build and Rebuild:

yarn
  • The llama.cpp git submodule for gpt4all may be absent. If this is the case, make sure to run the following in the llama.cpp parent directory:
git submodule update --init --depth 1 --recursive

As of the new backend, to build the backend run:

yarn build:backend

This will build platform-dependent dynamic libraries, which will be located in runtimes/(platform)/native. The only current way to use them is to put them in the current working directory of your application, that is, WHEREVER YOU RUN YOUR NODE APPLICATION.

  • llama-xxxx.dll is required.
  • Depending on the model you are using, you'll need to select the proper model loader.
    • For example, if you are running a Mosaic MPT model, you will need to select the mpt-(buildvariant).(dynamiclibrary) file.
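
As a quick sanity check (an illustrative snippet, not part of the package), you can verify from Node.js that the expected runtime libraries are present in the directory your application runs from:

import { readdirSync } from 'node:fs';

// List the native libraries visible from the current working directory.
// You should see llama-(buildvariant) plus the loader for your model family,
// e.g. mpt-(buildvariant), with a .dll/.so/.dylib extension per platform.
const nativeLibs = readdirSync(process.cwd()).filter(f => /\.(dll|so|dylib)$/.test(f));
console.log(nativeLibs);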

Test

yarn test

Source Overview

src/

  • Extra functions to aid developer experience
  • Typings for the native node addon
  • The JavaScript interface

test/

  • Simple unit tests for some of the exported functions.
  • More advanced AI testing is not handled.
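
For illustration only (a hypothetical test; listModels is an exported helper, but the exact shape of its return value is an assumption), a minimal Jest test might look like this:

import { listModels } from '../src/gpt4all.js';

test('listModels resolves to the list of available models', async () => {
    // assumption: listModels() fetches models.json and resolves to an array
    const models = await listModels();
    expect(Array.isArray(models)).toBe(true);
});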

spec/

  • Shows the average look and feel of the API
  • Should work assuming a model and the libraries are installed locally in the working directory

index.cc

  • The bridge between Node.js and C++. Where the bindings are.

prompt.cc

  • Handling prompting and inference of models in a threadsafe, asynchronous way.

docs/

  • Autogenerated documentation using the script yarn docs:build

Known Issues

* why your model may be spewing bull 💩 
    - The downloaded model is broken (just reinstall it or download from the official site)
    - That's it so far

Roadmap

This package is in active development, and breaking changes may happen until the API stabilizes. Here's the todo list:

  • [x] prompt models via a threadsafe function in order to have proper non blocking behavior in nodejs
  • [ ] ~~createTokenStream, an async iterator that streams each token emitted from the model. Planning on following this [example](https://github.com/nodejs/node-addon-examples/tree/main/threadsafe-async-iterator)~~ May not implement unless someone else can complete
  • [x] proper unit testing (integrate with circle ci)
  • [x] publish to npm under alpha tag `gpt4all@alpha`
  • [x] have more people test on other platforms (mac tester needed)
  • [x] switch to new pluggable backend
  • [ ] NPM bundle size reduction via optionalDependencies strategy (need help)
    - Should include prebuilds to avoid painful node-gyp errors
  • [ ] createChatSession (the python equivalent to create_chat_session)
    

Documentation