Fletchling is a Golbat webhook receiver that processes pokemon webhooks and computes nesting pokemon.
- receives and processes pokemon on the fly via webhook from Golbat
- fletchling-osm-importer (separate tool): can copy nests from Overpass into the nests DB (or Koji, soon)
- Koji can be used as an authoritative source for nests (soon)
- has an API to pull stats, purge stats, reload config, etc.
- rename and edit configs in configs/
- in Golbat, add a new webhook. (A restart may be required.) It should look like this in the config:
[[webhooks]]
url = "http://your-fletchling-hostname:9042/webhook"
types = ["pokemon_iv"]
If you are using pm2, I would leave 'addr' in fletchling.toml as the default. Use '127.0.0.1' for your fletchling hostname in the above.
If you are using the included docker-compose.yml, Golbat is also running under Docker, and everything is attached to the same Docker network, then your fletchling hostname will be 'fletchling'.
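Before wiring up Golbat, you can confirm the webhook endpoint is reachable by posting a hand-built event to it. This is only a sketch: the array-of-events body shape and the message fields below are illustrative assumptions, not Golbat's real schema; the point is just to see the endpoint answer.

```python
import json
import urllib.request

def make_events(event_type, message):
    """Build a webhook body in an assumed array-of-events shape:
    [{"type": ..., "message": {...}}]. The message fields are
    illustrative assumptions, not Golbat's real schema."""
    return [{"type": event_type, "message": message}]

if __name__ == "__main__":
    # Hostname/port taken from the webhook config example above.
    url = "http://127.0.0.1:9042/webhook"
    body = json.dumps(make_events("pokemon_iv", {"pokemon_id": 16})).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            print("webhook endpoint answered:", resp.status)
    except OSError as e:
        print("could not reach fletchling:", e)
```

If the endpoint does not answer, recheck the 'addr' setting and the hostname notes above.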
- There is an included docker-compose.yml.example file. Copy it to docker-compose.yml, edit it if needed, and then build and start the container:
docker-compose build
docker-compose up -d
- Alternatively, to build and run under pm2:
make
pm2 start ./fletchling --name fletchling -o /dev/null
- curl http://localhost:9042/api/config -- this should give you the current processor configuration
- The logfile is configurable in configs/fletchling.toml but defaults to logs/fletchling.log. Check it for errors.
- Every minute, a log message will appear saying how many pokemon were processed. If this is 0, it means that Fletchling is not getting any webhooks. Check your Golbat webhooks configuration. Check the address Fletchling is listening on (http section in config).
- Gather your Golbat DB info.
- Create configs/fletchling.toml from the existing example config.
- Configure your existing Golbat DB in configs/fletchling.toml in both the 'nests_db' and 'golbat_db' sections.
- fix the listen address in 'http' section, if necessary.
- nuke your cronjob.
- start up fletchling
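As a sketch of what the edited config might look like (only the section names 'nests_db', 'golbat_db', and 'http' and the 'addr' key come from this document; the DB key names are assumptions based on typical MySQL settings, so check the shipped example config for the authoritative names):

```toml
# configs/fletchling.toml (sketch; see the example config
# for the real key names and defaults)

[http]
# Listen address; leave the default if running under pm2 on the same host.
addr = "127.0.0.1:9042"

[nests_db]
# Key names below are assumptions; check the example config.
host = "127.0.0.1"
user = "golbat-user"
password = "golbat-password"
db = "golbat"

[golbat_db]
host = "127.0.0.1"
user = "golbat-user"
password = "golbat-password"
db = "golbat"
```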
This will be available soon.
I would just start over, personally. :)
Overpass API can be queried to find parks, etc., to import into your nests DB (or Koji, soon).
The fences/areas that are searched can come from either a poracle-style JSON file, a geojson FeatureCollection file, or Koji (soon).
- If you want to find nests for your areas that are in Koji, make sure you have a project which exports your areas.
- If you want to find nests for your areas that are in a file, make note of its location.
- Gather your nests DB info/credentials. If you are currently using the Golbat DB for nests, use that. Otherwise, if you are starting from scratch, either use the Golbat DB or create a new database.
- (Optional) Gather your Golbat DB info/credentials (might be the same as the nests DB above). min_spawnpoints filtering will only work with this configured.
- Create configs/fletchling-osm-importer.toml from the existing example config. The comments in the file should explain the options.
- Build with make, then run:
./fletchling-osm-importer 'AreaName'
to import a single area first, if you wish, or:
./fletchling-osm-importer -all-areas
to import all areas.
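For the file-based areas option mentioned above, a minimal geojson FeatureCollection might look like the sketch below. The geometry is arbitrary, and the 'name' property is an assumption about how areas are labeled; check the example config for the property names the importer actually reads.

```json
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": { "name": "AreaName" },
      "geometry": {
        "type": "Polygon",
        "coordinates": [[
          [-73.99, 40.75], [-73.96, 40.75],
          [-73.96, 40.77], [-73.99, 40.77],
          [-73.99, 40.75]
        ]]
      }
    }
  ]
}
```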
curl http://localhost:9042/api/config
curl http://localhost:9042/api/config/reload
(Also supports PUT. You can also send a SIGHUP signal to the process.)
curl http://localhost:9042/api/nests
curl http://localhost:9042/api/nests/:nest_id
curl http://localhost:9042/api/nests/_/stats
curl http://localhost:9042/api/nests/_/:nest_id
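The same read-only endpoints can be scripted. A small sketch, assuming the default listen address and nothing about the JSON shape of the responses beyond their being JSON:

```python
import json
import urllib.request

BASE = "http://localhost:9042"  # assumed default listen address

def api_url(*parts):
    """Join path segments onto the Fletchling base URL."""
    return BASE + "/" + "/".join(p.strip("/") for p in parts)

def get_json(url, timeout=5):
    """GET a Fletchling API endpoint and decode the JSON body."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)

if __name__ == "__main__":
    try:
        # Current processor configuration, then the nest list.
        print("config:", get_json(api_url("api", "config")))
        print("nests:", get_json(api_url("api", "nests")))
    except OSError as e:
        print("fletchling not reachable:", e)
```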
Untested:
curl -X PUT http://localhost:9042/api/stats/purge/all
This ditches all stats history, including the current time period, giving the stats a clean slate as at startup.
curl -X PUT http://localhost:9042/api/stats/purge/oldest -d '{ "duration_minutes": xx }'
Purges the specified duration of the stats starting from the oldest. This will never remove the current unfinished time period. This can be used to nuke everything but the current time period by specifying a very high number of minutes.
curl -X PUT http://localhost:9042/api/stats/purge/newest -d '{ "duration_minutes": xx, "include_current": false }'
Purges the specified number of minutes of stats, starting from the newest. 'include_current' specifies whether purging should start with the current (unfinished) time period or with the last completed period.
curl -X PUT http://localhost:9042/api/stats/purge/keep -d '{ "duration_minutes": xx }'
This is another way to purge oldest stats. But with this one, you specify the duration to keep, not the duration to purge.
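The purge calls can also be scripted. In this sketch, only the endpoint paths and the body fields shown above ('duration_minutes', 'include_current') come from this document; everything else is an assumption, and note that these endpoints are marked untested above.

```python
import json
import urllib.request

def purge_body(duration_minutes, include_current=None):
    """Build the JSON body for the purge endpoints. 'include_current'
    is only meaningful for /api/stats/purge/newest."""
    body = {"duration_minutes": duration_minutes}
    if include_current is not None:
        body["include_current"] = include_current
    return body

def purge(kind, duration_minutes=None, include_current=None,
          base="http://localhost:9042"):
    """PUT to /api/stats/purge/<kind>, where kind is one of
    'all', 'oldest', 'newest', or 'keep'. 'all' takes no body."""
    data = None
    if duration_minutes is not None:
        data = json.dumps(purge_body(duration_minutes, include_current)).encode()
    req = urllib.request.Request(
        f"{base}/api/stats/purge/{kind}",
        data=data,
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

# Example (network call, so not run here):
# purge("oldest", duration_minutes=120)
```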
The importer can import nests that are fully contained by other nests. For example, if a large park has a number of baseball fields, it is possible that nests for both the park and the fields will be imported. This will be fixed soon.
All your nest are belong to us.