# Elastic Langchain Sample App

This is a sample app that combines Elasticsearch, Langchain and a number of different LLMs to create a semantic search experience with ELSER.

## 1. Download the Project

Download the project from Github and extract the `workplace-search` folder.

```bash
curl https://codeload.github.com/elastic/elasticsearch-labs/tar.gz/main | \
tar -xz --strip=2 elasticsearch-labs-main/example-apps/workplace-search
```

## 2. Connecting to Elasticsearch

This app requires the following environment variables to be set to connect to Elasticsearch:

```sh
export ELASTIC_CLOUD_ID=...
export ELASTIC_USERNAME=...
export ELASTIC_PASSWORD=...
```

Note:

- If you don't have an Elastic Cloud deployment, sign up [here](https://cloud.elastic.co/registration?utm_source=github&utm_content=elasticsearch-labs-samples) for a free trial.

  1. Go to the [Create deployment](https://cloud.elastic.co/deployments/create) page
  2. Select **Create deployment** and follow the instructions

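For reference, here is a minimal sketch of how these variables map onto a client connection. It assumes the official `elasticsearch` Python client (which Langchain uses under the hood); `es_connection_kwargs` is a hypothetical helper for illustration, not part of the app:

```python
import os

def es_connection_kwargs() -> dict:
    """Assemble keyword arguments for elasticsearch.Elasticsearch
    from the ELASTIC_* environment variables set above."""
    return {
        "cloud_id": os.environ["ELASTIC_CLOUD_ID"],
        "basic_auth": (
            os.environ["ELASTIC_USERNAME"],
            os.environ["ELASTIC_PASSWORD"],
        ),
    }

# The client itself would then be created with:
#   from elasticsearch import Elasticsearch
#   es = Elasticsearch(**es_connection_kwargs())
```
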
### Change the Elasticsearch index and chat_history index

By default, the app will use the `workplace-app-docs` index, and the chat history will be stored in `workplace-app-docs-chat-history`. If you want to change these, you can set the following environment variables:

```sh
export ES_INDEX=workplace-app-docs
export ES_INDEX_CHAT_HISTORY=workplace-app-docs-chat-history
```

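In code, this resolution is just a lookup with fallbacks. A sketch (the `index_names` helper is hypothetical; only the default values come from this README):

```python
import os

def index_names(env=os.environ):
    """Resolve the document index and chat-history index names,
    falling back to the app defaults when the variables are unset."""
    return (
        env.get("ES_INDEX", "workplace-app-docs"),
        env.get("ES_INDEX_CHAT_HISTORY", "workplace-app-docs-chat-history"),
    )
```
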
## 3. Connecting to LLM

We support four LLM providers: OpenAI, Azure OpenAI, Bedrock and Vertex AI.

To use one of them, you need to set the `LLM_TYPE` environment variable:

```sh
export LLM_TYPE=azure
```

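A value like this typically drives a small factory inside the app. The sketch below shows that dispatch pattern; the string placeholders stand in for the real Langchain LLM classes, and `make_llm` is a hypothetical name:

```python
import os

# Placeholder constructors standing in for the real Langchain LLM classes
PROVIDERS = {
    "openai": lambda: "ChatOpenAI",
    "azure": lambda: "AzureChatOpenAI",
    "bedrock": lambda: "BedrockChat",
    "vertex": lambda: "ChatVertexAI",
}

def make_llm(llm_type=None):
    """Pick the LLM constructor named by the LLM_TYPE variable."""
    llm_type = llm_type or os.environ.get("LLM_TYPE", "openai")
    try:
        return PROVIDERS[llm_type]()
    except KeyError:
        raise ValueError(f"Unknown LLM_TYPE: {llm_type!r}")
```
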
### OpenAI

To use the OpenAI LLM, you will need to provide your OpenAI key via the `OPENAI_API_KEY` environment variable:

```sh
export LLM_TYPE=openai
export OPENAI_API_KEY=...
```

You can get your OpenAI key from the [OpenAI dashboard](https://platform.openai.com/account/api-keys).

### Azure OpenAI

If you are using Azure OpenAI, you will need to set the following environment variables:

```sh
export LLM_TYPE=azure
export OPENAI_VERSION=... # e.g. 2023-05-15
export OPENAI_BASE_URL=...
export OPENAI_API_KEY=...
export OPENAI_ENGINE=... # deployment name in Azure
```

### Bedrock LLM

To use the Bedrock LLM, you need to set the following environment variables in order to connect to AWS:

```sh
export LLM_TYPE=bedrock
export AWS_ACCESS_KEY=...
export AWS_SECRET_KEY=...
export AWS_REGION=... # e.g. us-east-1
export AWS_MODEL_ID=... # Default is anthropic.claude-v2
```

#### AWS Config

Optionally, you can connect to AWS via the config file in `~/.aws/config`, as described in the [boto3 credentials guide](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#configuring-credentials):

```ini
[default]
aws_access_key_id=...
aws_secret_access_key=...
region=...
```

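The config file is standard INI, so you can sanity-check what boto3 will see using Python's stdlib `configparser`. This is a diagnostic sketch, not something the app itself does:

```python
import configparser
from pathlib import Path

def read_aws_profile(path="~/.aws/config", profile="default"):
    """Return the key/value pairs of one profile from an AWS config file,
    or an empty dict if the profile is missing."""
    config = configparser.ConfigParser()
    config.read(Path(path).expanduser())
    return dict(config[profile]) if profile in config else {}
```
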
### Vertex AI

To use Vertex AI, you need to set the following environment variables. More information [here](https://python.langchain.com/docs/integrations/llms/google_vertex_ai_palm).

```sh
export LLM_TYPE=vertex
export VERTEX_PROJECT_ID=<gcp-project-id>
export VERTEX_REGION=<gcp-region> # Default is us-central1
export GOOGLE_APPLICATION_CREDENTIALS=<path-json-service-account>
```

## 4. Ingest Data

You can index the sample data from the provided .json files in the `data` folder:

```sh
python data/index-data.py
```

By default, this will index the data into the `workplace-app-docs` index. You can change this by setting the `ES_INDEX` environment variable.

### Indexing your own data

`index-data.py` is a simple script that uses Langchain to index data into Elasticsearch, using the `JSONLoader` and `CharacterTextSplitter` to split large documents into passages. Modify this script to index your own data.

Langchain offers many other ways to load and index data if `JSONLoader` doesn't fit your use case. See the [Langchain documentation](https://python.langchain.com/docs/modules/data_connection/document_loaders).

Remember to keep the `ES_INDEX` environment variable set to the index you want to index into and query from.

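To make the splitting step concrete, here is a simplified stand-in for what `CharacterTextSplitter` does: fixed-size chunks with overlap, so adjacent passages keep some shared context. The real Langchain splitter also honors a separator, so treat this purely as an illustration:

```python
def split_text(text: str, chunk_size: int = 800, overlap: int = 100):
    """Split text into chunks of at most chunk_size characters,
    where consecutive chunks overlap by `overlap` characters."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks
```

Tuning `chunk_size` and `overlap` trades retrieval granularity against how much context each passage carries.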
## Running the App

Once you have indexed data into the Elasticsearch index, there are two ways to run the app: via Docker or locally. Docker is advised for testing and production use; running locally is advised for development.

### Through Docker

Build the Docker image:

```sh
docker build -f Dockerfile -t workplace-search-app .
```

Then run the image with the required environment variables. The example below uses the OpenAI LLM.

If you're using one of the other LLMs, set its environment variables via the `-e` flag instead.

```sh
docker run -p 4000:4000 \
  -e "ELASTIC_CLOUD_ID=<cloud_id>" \
  -e "ELASTIC_USERNAME=elastic" \
  -e "ELASTIC_PASSWORD=<password>" \
  -e "LLM_TYPE=openai" \
  -e "OPENAI_API_KEY=<openai_key>" \
  -d workplace-search-app
```

### Locally (for development)

With the environment variables set, you can run the following commands to start the server and frontend.

#### Pre-requisites

- Python 3.8+
- Node 14+

#### Install the dependencies

For Python we recommend using a virtual environment.

_ℹ️ Here's a good [primer](https://realpython.com/python-virtual-environments-a-primer) on virtual environments from Real Python._

```sh
# Create a virtual environment
python -m venv .venv

# Activate the virtual environment
source .venv/bin/activate
```

```sh
# Install Python dependencies
pip install -r requirements.txt

# Install Node dependencies
cd frontend && yarn
```

#### Run API and frontend

```sh
# Launch API app
python api/app.py

# In a separate terminal launch frontend app
cd frontend && yarn start
```

You can now access the frontend at http://localhost:3000. Changes are automatically reloaded.

**Note:** This app has now moved to <https://github.com/elastic/elasticsearch-labs/tree/main/example-apps/chatbot-rag-app>.