README.md (+11 -2)
@@ -65,10 +65,14 @@ This project also works well with papers from [PubMed](https://pubmed.ncbi.nlm.n
 
 ### Setup
 
-The examples below use an AutoAWQ model, make sure to install that dependency first.
+Install the following.
 
-```
+```bash
+# Change autoawq[kernels] to "autoawq autoawq-kernels" if a flash-attn error is raised
 pip install annotateai autoawq[kernels]
+
+# macOS users should run this instead
+pip install annotateai llama-cpp-python
 ```
 
 The primary input parameter is the path to the LLM. This project is backed by [txtai](https://github.com/neuml/txtai) and it supports any [txtai-supported LLM](https://neuml.github.io/txtai/pipeline/text/llm/).
@@ -78,6 +82,11 @@ from annotateai import Annotate
 
 # This model works well with medical and scientific literature
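For context, the second hunk continues into the README's usage snippet. Below is a minimal sketch of how the installed package might be used, assuming the `Annotate` pipeline is constructed with the LLM path (a local path or Hugging Face model ID) and then called with the paper to annotate; the model ID and paper URL shown are illustrative, not taken from this diff:

```python
from annotateai import Annotate

# This model works well with medical and scientific literature
# (model ID is illustrative; any txtai-supported LLM path should work)
annotate = Annotate("NeuML/Llama-3.1_OpenScholar-8B-AWQ")

# Annotate a paper by URL or local file path (example input, assumed call signature)
annotate("https://arxiv.org/pdf/2005.11401")
```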