
# TIMDEX Is Making Discovery EXcellent @ MIT

- This application interfaces with an ElasticSearch backend and exposes a set of
- API Endpoints to allow registered users to query our data.
-
- The backend is populated via [pipelines](https://github.com/MITLibraries/mario).
+ This application interfaces with an OpenSearch backend and exposes a GraphQL endpoint to allow anonymous users to query our data.

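For a quick sense of the API, a search query can be POSTed to the GraphQL endpoint of a locally running instance. The sketch below is illustrative only: the `/graphql` path and the `search`/`records`/`title` field names are assumptions, so check the app's GraphQL playground for the authoritative schema.

```shell
# Minimal sketch of a GraphQL search query against a local TIMDEX instance.
# The /graphql path and the field names below are assumptions; consult the
# GraphQL playground exposed by the running app for the real schema.
curl -s 'http://localhost:3000/graphql' \
  -H 'Content-Type: application/json' \
  -d '{"query": "{ search(searchterm: \"archives\") { records { title } } }"}'
```
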
## Architecture Decision Records

@@ -22,26 +19,6 @@ additional records with a standardized template.
- don't commit your .env or .env.development, but do commit .env.test after
  confirming your test values are not actual secrets that need protecting

- ### Updating the data model
- Updating the data model is somewhat complicated because many files need to be
- edited across multiple repositories and deployment steps should happen in a
- particular order so as not to break production services.
- - Start by updating the data model in [Mario](https://github.com/MITLibraries/mario). Instructions for that can be found
- in the [Mario README](https://github.com/MITLibraries/mario/blob/master/README.md). Then complete the following steps here in TIMDEX.
- - Update `app/models/search.rb` to build/update/remove queries for the added/
- edited/deleted fields as appropriate. Make sure to update filters and
- aggregations if relevant to the changed fields.
- - Update `app/views/api/[version]/search/_base_json_jbuilder` to
- add/update/remove changed fields OR update
- `views/api/[version]/search/_extended_json_jbuilder` if the changed fields
- aren’t/shouldn’t be in the brief record result.
- - If changed fields should be aggregated, update
- `views/api/[version]/search/_aggregations_json_jbuilder` as appropriate.
- - Update tests as necessary. Make sure to test with all current data
- source samples ingested into a local ES instance.
- - Update `openapi.json` to make sure our spec matches any changes made
- (including bumping the version number).
-
## Publishing User Facing Documentation

### Running jekyll documentation locally
@@ -79,46 +56,32 @@ locally.
  `yourapp.herokuapp.com`. However, if you use a custom domain in production,
  that should be the value you use in production.
- `JWT_SECRET_KEY`: generate with `rails secret`
- - `ELASTICSEARCH_INDEX`: Elasticsearch index or alias to query
- - `ELASTICSEARCH_URL`: defaults to `http://localhost:9200`

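One way to wire this up locally is the sketch below, assuming development settings live in the uncommitted `.env.development` mentioned above:

```shell
# Append a freshly generated JWT secret to your local, uncommitted env file.
echo "JWT_SECRET_KEY=$(rails secret)" >> .env.development
```
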
## Production required Environment Variables

- - `AWS_ACCESS_KEY`
- - `AWS_ELASTICSEARCH`: boolean. Set to true to enable AWSv4 Signing
- - `AWS_SECRET_ACCESS_KEY`
+ - `AWS_OPENSEARCH`: boolean. Set to true to enable AWSv4 Signing
+ - `AWS_OPENSEARCH_ACCESS_KEY_ID`
+ - `AWS_OPENSEARCH_SECRET_ACCESS_KEY`
- `AWS_REGION`
+ - `OPENSEARCH_INDEX`: OpenSearch index or alias to query. If unset, all indexes are searched, which is generally
+   not what you want. `timdex` and `all-current` are aliases used consistently in our data pipelines; `timdex` is
+   the right choice for most use cases.
+ - `OPENSEARCH_URL`: OpenSearch URL, defaults to `http://localhost:9200`
- `SMTP_ADDRESS`
- `SMTP_PASSWORD`
- `SMTP_PORT`
- `SMTP_USER`

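As an illustration, a production configuration might look roughly like the sketch below; every value is a placeholder, not a real credential, region, or endpoint:

```shell
# Hypothetical production values -- placeholders only.
AWS_OPENSEARCH=true                        # enables AWSv4 request signing
AWS_OPENSEARCH_ACCESS_KEY_ID="<access-key-id>"
AWS_OPENSEARCH_SECRET_ACCESS_KEY="<secret-access-key>"
AWS_REGION=us-east-1
OPENSEARCH_INDEX=timdex                    # or all-current
OPENSEARCH_URL=https://search-example.us-east-1.es.amazonaws.com
SMTP_ADDRESS="<smtp-host>"
SMTP_PORT=587
SMTP_USER="<smtp-user>"
SMTP_PASSWORD="<smtp-password>"
```
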
- ### Additional required Environment Variables when Opensearch is enabled (aka v2=true)
-
- - `v2`: set to `true`
- - `OPENSEARCH_INDEX`: Opensearch index or alias to query
- - `OPENSEARCH_URL`: Opensearch URL
- - `OPENSEARCH_LOG` set to `true`
-
- - `AWS_OPENSEARCH`
- - `AWS_OPENSEARCH_ACCESS_KEY_ID`
- - `AWS_OPENSEARCH_SECRET_ACCESS_KEY`
## Optional Environment Variables (all ENVs)

- - `ELASTICSEARCH_LOG` if `true`, verbosely logs ElasticSearch queries.
+ - `OPENSEARCH_LOG` if `true`, verbosely logs OpenSearch queries.

```text
NOTE: do not set this ENV at all if you want ES logging fully disabled.
Setting it to `false` is still setting it and you will be annoyed and
confused.
```

- - `ES_LOG_LEVEL` set elasticsearch transport log level. Defaults to `INFO`.
-
- ```text
- NOTE: `ELASTICSEARCH_LOG` must also be set for logging to function.
- ```
-
- `PREFERRED_DOMAIN` - set this to the domain you would like to use. Any
  other requests that come to the app will redirect to the root of this domain.
  This is useful to prevent access to herokuapp.com domains.
@@ -127,47 +90,3 @@ NOTE: `ELASTICSEARCH_LOG` must also be set for logging to function.
  Default is 1.
- `SENTRY_DSN`: client key for Sentry exception logging
- `SENTRY_ENV`: Sentry environment for the application. Defaults to 'unknown' if unset.
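For instance, a deployment that wants verbose query logging, a canonical domain, and Sentry reporting might set something like the sketch below (placeholder values only):

```shell
# Hypothetical optional settings -- placeholder values.
OPENSEARCH_LOG=true                # to disable logging, leave this unset entirely
PREFERRED_DOMAIN="<your-custom-domain>"
SENTRY_DSN="<sentry-client-key>"
SENTRY_ENV=staging
```
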
-
- ## Docker Compose Orchestrated Local Environment
-
- This section will describe how to use the included docker compose files to spin up ElasticSearch
- and optionally use Mario to load sample data for testing.
-
- You may set `ELASTICSEARCH_URL` to `http://0.0.0.0:9200` to use this ES instance in development if you
- choose to not use the included Dockerfile
-
- ### Startup ElasticSearch and Timdex
-
- `make up`
-
- ### Shutdown ElasticSearch and Timdex when you are done
-
- `make down`
-
- ### Optionally, load sample data
-
- After ElasticSearch is running from `make up` command:
-
- `make sampledata`
-
- Note: if you run this and it fails, try again in a few seconds as ES may still be loading
-
- ### Run arbitrary Mario commands
-
- You can also run arbitrary Mario commands using a syntax like this after first running `make up`.
-
- `docker run --network timdex_default mitlibraries/mario --url http://elasticsearch:9200 YOUR_MARIO_COMMAND_HERE [e.g. indexes]`
-
- Note: if you have no indexes loaded, many mario commands will fail. Try `make sampledata` or load the data you
- need before proceeding.
-
- ### Quick curl examples with sample data in mind
-
- `curl 'http://0.0.0.0:3000/api/v1/ping'`
-
- `curl 'http://0.0.0.0:3000/api/v1/search?q=archives'`
-
- You can also use the playground via your browser, see the index page of the running app for a link
- and example queries.
-
- http://0.0.0.0:3000