Adam Treat
e1eac00ee0
Fix the download and settings dialog to take more real estate if available on large monitors.
Signed-off-by: Adam Treat <treat.adam@gmail.com>
2024-02-01 15:43:34 -05:00
Adam Treat
111e152a5d
Fix the sizing for model download.
Signed-off-by: Adam Treat <adam@nomic.ai>
2024-02-01 15:39:28 -05:00
Adam Treat
ffed2ff823
Fix for progress bar color on legacy theme.
Signed-off-by: Adam Treat <treat.adam@gmail.com>
2024-02-01 08:29:44 -05:00
Adam Treat
a5275ea9e7
Bump the version and release notes for v2.6.2.
Signed-off-by: Adam Treat <treat.adam@gmail.com>
2024-01-31 23:25:58 -05:00
Adam Treat
cdf0fedae2
Make sure to use the search_query tag for nomic embed.
Signed-off-by: Adam Treat <treat.adam@gmail.com>
2024-01-31 22:44:16 -05:00
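The search_query tag in the commit above refers to the task-instruction prefixes that the nomic-embed-text models expect on their inputs: retrieval queries are prefixed with "search_query: " and the documents being indexed with "search_document: ". A minimal sketch of that tagging (the helper name is illustrative, not from the codebase):

```python
# nomic-embed-text task prefixes: "search_query" for the query side of
# retrieval (e.g. a LocalDocs search), "search_document" for the texts
# being embedded into the index.

def tag_for_embedding(text: str, is_query: bool) -> str:
    """Prepend the nomic-embed task prefix to a single text."""
    prefix = "search_query: " if is_query else "search_document: "
    return prefix + text

query = tag_for_embedding("how do I build on Fedora?", is_query=True)
docs = [tag_for_embedding(d, is_query=False)
        for d in ["Install Qt first.", "Run cmake, then make."]]
```

Mixing up the two prefixes degrades retrieval quality, since the model was trained to treat queries and documents asymmetrically.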
Adam Treat
d14b95f4bd
Add Nomic Embed model for atlas with localdocs.
2024-01-31 22:22:08 -05:00
Jared Van Bortel
eadc3b8d80
backend: bump llama.cpp for VRAM leak fix when switching models
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-31 17:24:01 -05:00
Jared Van Bortel
6db5307730
update llama.cpp for unhandled Vulkan OOM exception fix
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-31 16:44:58 -05:00
Jared Van Bortel
0a40e71652
Maxwell/Pascal GPU support and crash fix (#1895)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-31 16:32:32 -05:00
Jared Van Bortel
b11c3f679e
bump llama.cpp-mainline for C++11 compat
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-31 15:02:34 -05:00
Jared Van Bortel
061d1969f8
expose n_gpu_layers parameter of llama.cpp (#1890)
Also dynamically limit the GPU layers and context length fields to the maximum supported by the model.
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-31 14:17:44 -05:00
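The dynamic limiting described in the commit above amounts to clamping the user-entered settings values to the maximums the loaded model supports. A minimal sketch of the idea (the maxima shown are hypothetical, not from any particular model):

```python
def clamp(value: int, lo: int, hi: int) -> int:
    """Constrain a settings value to the range the model supports."""
    return max(lo, min(value, hi))

# Suppose the loaded model reports these maxima (hypothetical values):
max_gpu_layers = 33   # e.g. layer count of the model
max_context = 4096    # model's trained context length

n_gpu_layers = clamp(100, 0, max_gpu_layers)   # user asked for 100
n_ctx = clamp(8192, 8, max_context)            # user asked for 8192
```

Clamping in the UI keeps out-of-range values from ever reaching the backend, where they could cause load failures or silent truncation.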
Jared Van Bortel
f549d5a70a
backend : quick llama.cpp update to fix fallback to CPU
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-29 17:16:40 -05:00
Jared Van Bortel
38c61493d2
backend: update to latest commit of llama.cpp Vulkan PR
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-29 15:47:26 -06:00
Jared Van Bortel
29d2c936d1
chat: don't show "retrieving localdocs" for zero collections (#1874)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-29 13:57:42 -05:00
Adam Treat
cfa22ab1c4
Change to a color that exists.
Signed-off-by: Adam Treat <treat.adam@gmail.com>
2024-01-29 13:06:47 -05:00
Adam Treat
3556f63a29
Make the setting labels font a bit bigger and fix hover.
Signed-off-by: Adam Treat <treat.adam@gmail.com>
2024-01-29 12:02:51 -06:00
Adam Treat
34de19ebf6
Add a legacy dark mode.
Signed-off-by: Adam Treat <treat.adam@gmail.com>
2024-01-29 12:02:51 -06:00
Adam Treat
c1fce502f7
Fix checkbox background in dark mode.
Signed-off-by: Adam Treat <treat.adam@gmail.com>
2024-01-29 12:02:51 -06:00
Adam Treat
363f6659e4
Fix the settings font size to be a tad bigger.
Signed-off-by: Adam Treat <treat.adam@gmail.com>
2024-01-29 12:02:51 -06:00
Adam Treat
6abeefb303
Hover for links and increase font size a bit.
Signed-off-by: Adam Treat <treat.adam@gmail.com>
2024-01-29 12:02:51 -06:00
Adam Treat
697a5f5d2a
New lightmode and darkmode themes with UI revamp.
Signed-off-by: Adam Treat <treat.adam@gmail.com>
2024-01-29 12:02:51 -06:00
Karthik Nair
0a45dd384e
add Fedora command for Qt and related packages (#1871)
Signed-off-by: Karthik Nair <realkarthiknair@gmail.com>
Co-authored-by: Jared Van Bortel <cebtenzzre@gmail.com>
2024-01-24 18:00:49 -05:00
Adam Treat
27912f6e1a
Fix bug with install of online models.
2024-01-22 14:16:09 -05:00
Jared Van Bortel
26acdebafa
convert: replace GPTJConfig with AutoConfig (#1866)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-22 12:14:55 -05:00
Jared Van Bortel
c7ea283f1f
chatllm: fix deserialization version mismatch (#1859)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-22 10:01:31 -05:00
Jared Van Bortel
b881598166
py: improve README (#1860)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-21 19:53:55 -05:00
Jared Van Bortel
a9c5f53562
update llama.cpp for nomic-ai/llama.cpp#12
Fixes #1477
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-17 14:05:33 -05:00
Jared Van Bortel
15ce428672
ci: run all workflows on config change (#1829)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-17 12:41:52 -05:00
Jared Van Bortel
b98e5f396a
docs: add missing dependencies to Linux build instructions (#1728)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-01-17 11:33:23 -05:00
Jared Van Bortel
b7c92c5afd
sync llama.cpp with latest Vulkan PR and newer upstream (#1819)
2024-01-16 16:36:21 -05:00
Jared Van Bortel
e7c4680b51
github: enable blank issues
2024-01-16 15:27:01 -05:00
Jared Van Bortel
03a9f0bedf
csharp: update C# bindings to work with GGUF (#1651)
2024-01-16 14:33:41 -05:00
Jared Van Bortel
f8564398fc
minor change to trigger CircleCI
2024-01-12 16:13:46 -05:00
Jared Van Bortel
b96406669d
CI: fix Windows Python build
2024-01-12 16:02:56 -05:00
Adam Treat
e51a504550
Add the new 2.6.1 release notes and bump the version.
2024-01-12 11:10:16 -05:00
Jared Van Bortel
eef604fd64
python: release bindings version 2.1.0
The backend has a breaking change for Falcon and MPT models, so we need
to make a new release.
2024-01-12 09:38:16 -05:00
Jared Van Bortel
b803d51586
restore network.h #include
The online installers need this.
2024-01-12 09:27:48 -05:00
Jared Van Bortel
7e9786fccf
chat: set search path early
This fixes the issues with installed versions of v2.6.0.
2024-01-11 12:04:18 -05:00
Adam Treat
f7aeeca884
Revert the release.
2024-01-10 10:41:33 -05:00
Adam Treat
16a84972f6
Bump to the new version and write the release notes.
2024-01-10 10:21:45 -05:00
Jared Van Bortel
4dbe2634aa
models2.json: update models list for the next release
2024-01-10 09:18:31 -06:00
Adam Treat
233f0c4201
Bump the version for our next release.
2024-01-05 09:46:03 -05:00
AT
96cee4f9ac
Explicitly clear the kv cache each time we eval tokens to match n_past. (#1808)
2024-01-03 14:06:08 -05:00
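Conceptually, the fix above keeps the KV cache in sync with n_past: before evaluating new tokens, any cached positions at or beyond n_past are discarded, so the cache can never disagree with the logical position after a rewind (e.g. regenerating a response). A toy sketch of the idea, not the actual llama.cpp calls:

```python
class ToyKVCache:
    """Toy stand-in for a transformer KV cache keyed by token position."""

    def __init__(self):
        self.entries = []  # one entry per evaluated token position

    def truncate_to(self, n_past: int) -> None:
        # Drop cached positions >= n_past so a re-eval starts from a
        # state consistent with the claimed past length.
        del self.entries[n_past:]

    def eval_tokens(self, n_past: int, tokens: list) -> int:
        self.truncate_to(n_past)       # clear stale cache first
        self.entries.extend(tokens)    # then evaluate the new tokens
        return len(self.entries)       # new n_past

cache = ToyKVCache()
cache.eval_tokens(0, ["Hello", ",", " world"])
n_past = cache.eval_tokens(1, [" there"])  # rewind to position 1
# cache now holds ["Hello", " there"], n_past == 2
```

Without the truncation step, stale entries past the rewind point would be reused and the model would condition on tokens that were logically removed.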
ThiloteE
2d566710e5
Address review
2024-01-03 11:13:07 -06:00
ThiloteE
a0f7d7ae0e
Fix for "LLModel ERROR: Could not find CPU LLaMA implementation" v2
2024-01-03 11:13:07 -06:00
ThiloteE
38d81c14d0
Fixes https://github.com/nomic-ai/gpt4all/issues/1760 LLModel ERROR: Could not find CPU LLaMA implementation.
Inspired by the Microsoft docs for LoadLibraryExA (https://learn.microsoft.com/en-us/windows/win32/api/libloaderapi/nf-libloaderapi-loadlibraryexa).
When using LOAD_LIBRARY_SEARCH_DLL_LOAD_DIR, the lpFileName parameter must be a fully qualified path, and it must use backslashes (\), not forward slashes (/).
2024-01-03 11:13:07 -06:00
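Per the Microsoft docs cited above, the path handed to LoadLibraryExA/W with LOAD_LIBRARY_SEARCH_DLL_LOAD_DIR must be absolute and use backslashes. A minimal sketch of that normalization (the DLL filename is illustrative; the actual load is Windows-only and guarded):

```python
import os
import sys

def dll_load_path(path: str) -> str:
    """Return an absolute path with backslashes, as LoadLibraryEx*
    requires when LOAD_LIBRARY_SEARCH_DLL_LOAD_DIR is used."""
    return os.path.abspath(path).replace("/", "\\")

if sys.platform == "win32":
    import ctypes
    # Flag values from the Windows SDK (libloaderapi.h).
    LOAD_LIBRARY_SEARCH_DLL_LOAD_DIR = 0x00000100
    LOAD_LIBRARY_SEARCH_DEFAULT_DIRS = 0x00001000
    handle = ctypes.windll.kernel32.LoadLibraryExW(
        dll_load_path("lib/example-implementation.dll"),  # hypothetical name
        None,
        LOAD_LIBRARY_SEARCH_DLL_LOAD_DIR | LOAD_LIBRARY_SEARCH_DEFAULT_DIRS,
    )
```

A relative or forward-slash path makes the flag silently ineffective, which is exactly the "Could not find CPU LLaMA implementation" failure mode the commit addresses.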
Gerhard Stein
3e99b90c0b
Some cleanups
2024-01-03 08:41:40 -06:00
Daniel Salvatierra
c72c73a94f
app.py: add --device option for GPU support (#1769)
Signed-off-by: Daniel Salvatierra <dsalvat1@gmail.com>
Co-authored-by: Jared Van Bortel <jared@nomic.ai>
2023-12-20 16:01:03 -05:00
Cal Alaera
528eb1e7ad
Update server.cpp to return valid created timestamps (#1763)
Signed-off-by: Cal Alaera <59891537+CalAlaera@users.noreply.github.com>
2023-12-18 14:06:25 -05:00
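The created field in an OpenAI-style completion response is a Unix timestamp in whole seconds; the fix above makes the local server fill it in properly. A sketch of what a valid envelope looks like (field names follow the OpenAI response shape; the helper is illustrative):

```python
import time

def completion_envelope(completion_id: str, model: str) -> dict:
    """Build the metadata portion of an OpenAI-style completion response."""
    return {
        "id": completion_id,
        "object": "text_completion",
        "created": int(time.time()),  # Unix seconds, not zero or garbage
        "model": model,
    }

resp = completion_envelope("cmpl-1", "gpt4all-falcon")
```

Clients that sort or expire responses by created break when the server returns an invalid value, which is why a real wall-clock timestamp matters here.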
Jared Van Bortel
d1c56b8b28
Implement configurable context length (#1749)
2023-12-16 17:58:15 -05:00