Author | Commit | Message | Date
------- | ---------- | ------- | ----
shamoon | 0690fd36c5 | Fix openai api key, config settings saving | 2025-07-02 11:03:55 -07:00
shamoon | bb3336f7bc | Just use the built-in ollama LLM class of course | 2025-07-02 11:01:58 -07:00
shamoon | a9ed46de11 | Fix naming | 2025-07-02 11:01:58 -07:00
shamoon | 1ccaf66869 | Trim nodes | 2025-07-02 11:01:57 -07:00
shamoon | e864a51497 | Backend streaming chat | 2025-07-02 11:01:57 -07:00
shamoon | 4a28be233e | Fixup some tests | 2025-07-02 11:01:56 -07:00
shamoon | 5f26139a5f | Unify, respect perms [ci skip] | 2025-07-02 11:01:56 -07:00
shamoon | ccfc7d98b1 | Individual doc chat [ci skip] | 2025-07-02 11:01:55 -07:00
shamoon | d1bd2af49c | Super basic doc chat [ci skip] | 2025-07-02 11:01:55 -07:00
shamoon | e2eec6dc71 | Better encapsulate backends, use llama_index OpenAI | 2025-07-02 11:01:54 -07:00
shamoon | df8f07555f | Tweak ollama timeout, prompt [ci skip] | 2025-07-02 11:01:54 -07:00
shamoon | 3660336bcf | Fix ollama, fix RAG [ci skip] | 2025-07-02 11:01:53 -07:00
shamoon | aeceaf60a2 | RAG into suggestions | 2025-07-02 11:01:53 -07:00
shamoon | 959ebdbb85 | llamaindex vector index, llmindex management command | 2025-07-02 11:01:52 -07:00
shamoon | f5fc04cfe2 | Use a frontend config | 2025-07-02 11:01:51 -07:00
shamoon | 767118fa8a | Fix | 2025-07-02 11:01:48 -07:00
shamoon | 339612f4ec | Backend tests | 2025-07-02 11:01:48 -07:00
shamoon | e7592c6269 | Correct object retrieval | 2025-07-02 11:01:48 -07:00
shamoon | ffc0b936f3 | Refactor | 2025-07-02 11:01:47 -07:00
shamoon | 1a6540e8ed | Move module | 2025-07-02 11:01:47 -07:00