This project also works well with papers from [PubMed](https://pubmed.ncbi.nlm.nih.gov/).

### Setup

Install the following.
```bash
# Change autoawq[kernels] to "autoawq autoawq-kernels" if a flash-attn error is raised
pip install annotateai autoawq[kernels]

# macOS users should run this instead
pip install annotateai llama-cpp-python
```
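The two install commands differ only by platform. For mixed environments, the choice can be scripted; a minimal sketch, assuming macOS (`Darwin` per `uname`) is the only platform that needs the `llama-cpp-python` backend, with the package names taken from the commands above:

```shell
#!/bin/sh
# Pick the install command for the current platform and print it.
# (Assumption: the macOS/other split mirrors the two commands above.)
if [ "$(uname)" = "Darwin" ]; then
    cmd="pip install annotateai llama-cpp-python"
else
    cmd="pip install annotateai autoawq[kernels]"
fi
echo "$cmd"
```

Note that in zsh (the macOS default shell) the bracketed extra must be quoted, e.g. `pip install annotateai "autoawq[kernels]"`, or the shell will treat the brackets as a glob pattern.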
The primary input parameter is the path to the LLM. This project is backed by [txtai](https://github.com/neuml/txtai) and supports any [txtai-supported LLM](https://neuml.github.io/txtai/pipeline/text/llm/).
```python
from annotateai import Annotate

# This model works well with medical and scientific literature