Report bugs, feature requests, questions or ideas at https://github.com/fapi-search/fastapi-search-cookiecutter/issues.
Because this is a cookiecutter, that is, a template that generates a running service rather than a running service itself, working on many features is a bit "meta". A typical way of dealing with this looks like:
- Fork the repo and clone locally
- Create a cookiecutter-built working instance of the service, e.g.:

```shell
cookiecutter fastapi-search-cookiecutter
```
- Answer the questions, choosing a project_slug for your working instance (e.g., `fastapi-search-work`):

```
project_name [Search Project]:
project_slug [search-project]:
default_local_host [0.0.0.0]:
default_local_port [8001]:
default_docker_port [8000]:
Select search_backend:
1 - elasticsearch
2 - opensearch
Choose from 1, 2 [1]:
default_postgres_url [postgresql+asyncpg://db_user:db_pass@localhost:5432/app_db]:
default_search_url [http://elastic:elastic@localhost:9200]:
use_precommit [y]:
```

```shell
cd fastapi-search-work
```
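If you would rather skip the interactive prompts, cookiecutter can also take answers on the command line; a minimal sketch, assuming you only want to override the project_slug (here `fastapi-search-work`) and accept the other defaults:

```shell
# Generate the working instance non-interactively, overriding only project_slug
cookiecutter fastapi-search-cookiecutter --no-input project_slug=fastapi-search-work
```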
- Code, test, and run your changes
- Copy changed files back into the `fastapi-search-cookiecutter` repo, taking care with files that contain cookiecutter template logic (Jinja expressions such as `{{ cookiecutter.project_slug }}`).
- Run cookiecutter on your updated repo:
```shell
cookiecutter fastapi-search-cookiecutter
```
- Answer the questions for a new test instance, giving it a different project_slug than the working instance:

```
project_name [Search Project]:
project_slug [search-project]:
default_local_host [0.0.0.0]:
default_local_port [8001]:
default_docker_port [8000]:
Select search_backend:
1 - elasticsearch
2 - opensearch
Choose from 1, 2 [1]:
default_postgres_url [postgresql+asyncpg://db_user:db_pass@localhost:5432/app_db]:
default_search_url [http://elastic:elastic@localhost:9200]:
use_precommit [y]:
```
- Make sure everything looks right, tests pass, etc. Running a tree diff between the work and test instances can be helpful.
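A minimal sketch of such a diff, assuming the working instance was generated as `fastapi-search-work` (as above) and the test instance into a sibling directory, say `fastapi-search-test`:

```shell
# Recursively compare the two generated trees; -q only reports which files differ
diff -qr fastapi-search-work fastapi-search-test
```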
Until CI and versioning are set up, our SDLC will be a bit ad hoc. Assuming you're working on an issue already in the repo, submit a PR from your fork citing that issue. For those working directly on the upstream repo, use the issue's "Create a branch" link, which provides a reasonably slugified branch name.
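If you prefer the command line, the GitHub CLI can create and check out the same linked branch; a sketch, assuming `gh` is installed and authenticated (the issue number here is hypothetical):

```shell
# Create a branch linked to issue #123 and check it out locally
gh issue develop 123 --checkout
```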