1 Scripts

This is the scripts directory, where we place scripts of various types to help with various activities. :)

Let’s get a little more concrete though.

1.1 apps-process-count.sh

A simple script to query the Erlang VM’s process count

./scripts/apps-process-count.sh
10

1.2 bump-copyright-year.sh

Python script that walks the supplied files and bumps the copyright year where appropriate.

./scripts/bump-copyright-year.sh [FILE]

1.3 check-app-registered.sh

Checks Erlang applications for registered processes and compares that to the application’s .app.src file.

./scripts/check-app-registered.sh [PATH/TO/APP]

For example, I set `{registered, []}` in callflow.app.src, then ran the script:

./scripts/check-app-registered.sh applications/callflow
cf_event_handler_sup, callflow_sup, cf_exe_sup
applications/callflow has no registered modules??
1 errors
1 errors in total

Now you have a listing of registered processes to put in your .app.src
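
Based on the output above, the registered tuple in callflow.app.src could then be updated to, for example:

{registered, [cf_event_handler_sup, callflow_sup, cf_exe_sup]}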

1.4 check-dialyzer.escript

An Erlang escript that dialyzes changed files. Run it using the makefile target ‘dialyze’ with the files to dialyze:

TO_DIALYZE=applications/callflow/ebin/callflow_sup.beam make dialyze
scanning "applications/callflow/ebin/callflow_sup.beam"
0 Dialyzer warnings

Typically `TO_DIALYZE` would be a generated list of files.

Do note: this will only check the file itself for issues. To really leverage Dialyzer, you’ll want to include remote project modules for Dialyzer to use as well.
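
One hypothetical way to build `TO_DIALYZE` from the files changed against master (the sed rewrite assumes the usual src/ -> ebin/ layout):

TO_DIALYZE=$(git diff --name-only master... -- applications core \
    | grep '\.erl$' \
    | sed -e 's%/src/%/ebin/%' -e 's%\.erl$%.beam%' \
    | tr '\n' ' ')
TO_DIALYZE="$TO_DIALYZE" make dialyze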

1.5 check-release-startup.sh

Creates a release, starts it, and issues some commands to test that the release starts up and appears to be running

1.6 check-scripts-readme.bash

A quick script to check that all scripts in $(ROOT)/scripts are documented in this file!

1.7 check-spelling.bash

Takes misspellings.txt and checks the codebase for the common mistakes it lists.

Each line in the text file has the format {correct}|{misspelt} [{misspelt} ...]
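
A hypothetical entry:

occurred|occured ocurred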

1.8 check-unstaged.bash

Checks whether any unstaged changes are found in the repo and exits with an error if so. Used in CircleCI to fail builds that have unstaged changes left over after applying code checks, spell checking, etc.
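
A minimal sketch of this kind of check (not the script’s exact contents):

if [ -n "$(git status --porcelain)" ]; then
    echo "unstaged changes found:"
    git status --porcelain
    exit 1
fi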

1.9 check-xref.escript

An Erlang escript for cross referencing (xref) calls to remote modules. Set `TO_XREF` to ebin paths (or use the default):

make xref
Pass: global
Loading modules...
Running xref analysis...
Xref: listing undefined_function_calls
Xref: listing undefined_functions
Done

If there are any calls to non-existent modules or non-exported functions, you will get errors listed here.

1.10 circleci-build-erlang.sh

Fetches kerl and installs the configured Erlang version (used in CircleCI)

1.11 code_checks.bash

Checks source code files for the project’s various formatting and style expectations and exits if any violations are found.

./scripts/code_checks.bash applications/crossbar/src/cb_context.erl
Check for andalso/orelse dropped lines
Check for uses of module in lieu of ?MODULE
Check for TAB characters
Check for trailing whitespaces

1.13 conn-to-apps.sh

Opens a remote shell to the kazoo_apps@hostname VM.

./scripts/conn-to-apps.sh [{VM@HOSTNAME}, {LOCAL_SHELL@HOSTNAME}]

1.14 conn-to-ecallmgr.sh

A convenience wrapper for connecting to ecallmgr@HOSTNAME via conn-to-apps.sh

1.15 convert_org_files.bash

Script that helps when converting Org files from Org 8.x to 9.x

1.16 cover.escript

Creates and sends a test coverage report for the codebase

1.17 crash-apps.sh

Forces the running VM to halt, producing a crash dump and exiting with status code 1 (as per the docs). The VM name is currently hard-coded to ‘kazoo_apps’

1.18 crash-ecallmgr.sh

Same as crash-apps.sh but for the ecallmgr VM.

1.19 dev-exec-mfa.sh

Runs M:F(A) on the node; see the usage notes near the top of dev-exec-mfa.sh itself (lines 3-6).

1.20 dev-start-apps.sh

Starts a VM with an interactive shell. {VM_NAME} defaults to ‘kazoo_apps’

./scripts/dev-start-apps.sh {VM_NAME}

1.21 dev-start-ecallmgr.sh

Defaults node name to ‘ecallmgr’; otherwise the same as dev-start-apps.sh

1.22 dev/kazoo.sh

When using releases, executes a release command against the running VM:

KAZOO_CONFIG=/etc/kazoo/core/config.ini ./scripts/dev/kazoo.sh {CMD}

{CMD} can be:

  • ‘attach’: attach to a running VM
  • ‘console’: connect to the VM with an interactive shell
  • ‘escript’: run an escript under the node’s environment
  • ‘eval’: evaluate the string in the running VM
  • ‘foreground’: start up the release in the foreground
  • ‘pid’: get the OS pid of the VM
  • ‘ping’: test aliveness of the VM
  • ‘reboot’: restart the VM completely (new OS process)
  • ‘remote_console’: connect as a remote shell
  • ‘restart’: restart the VM without exiting the OS process
  • ‘rpc’: execute a remote procedure call
  • ‘rpcterms’:
  • ‘start’/‘start_boot’: start the VM
  • ‘stop’: stop the VM
  • ‘unpack’: unpack a tar.gz for upgrade/downgrade/installation
  • ‘upgrade’/‘downgrade’/‘install’: perform an upgrade/downgrade/installation
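
For example, to get a remote shell on the running release:

KAZOO_CONFIG=/etc/kazoo/core/config.ini ./scripts/dev/kazoo.sh remote_console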

1.23 dev/sup.sh

Runs the SUP escript against the running release

1.24 dialyze-changed.bash

This script gets a diff set (against master) of .erl files from the current branch and dialyzes all changed files. You can include extra beam files on the end of the script (for things like gen_listener, kz_json, etc).

./scripts/dialyze-changed.bash core/kazoo/ebin/kz_json.beam
dialyzing changed files:
  Checking whether the PLT .kazoo.plt is up-to-date... yes
  Compiling some key modules to native code... done in 0m0.28s
  Proceeding with analysis...
  ...Issues Found...
  Unknown functions:
  ...Unknown functions...
  Unknown types:
  ...Unknown types...
 done in 0m6.69s
done (warnings were emitted)

1.25 dialyze-usage.bash

Given a module name, such as ‘props’ or ‘kz_json’, searches core/ and applications/ for modules that call the supplied module and dialyzes those beam files, looking for Dialyzer complaints. You will likely see complaints unrelated to your supplied module - go ahead and fix those too if possible ;)

The more heavily utilized the module is, the longer this will take to run!

 ./scripts/dialyze-usage.bash kz_config
dialyzing usages of kz_config
  Checking whether the PLT .kazoo.plt is up-to-date... yes
  Proceeding with analysis...
kz_dataconfig.erl:26: Function connection/0 has no local return
kz_dataconfig.erl:27: The call kz_config:get('data','config',['bigcouch',...]) breaks the contract (section(),atom(),Default) -> kz_proplist() | Default
kz_dataconfig.erl:32: Function connection_options/1 will never be called
...
 done in 0m4.08s
done (warnings were emitted)

1.26 ecallmgr-process-count.sh

Connects to the ecallmgr VM and outputs a count of running Erlang processes.

1.27 empty_schema_descriptions.bash

Checks JSON schemas for empty “description” properties and exits with status 1 if any are found
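
For example, a schema fragment like this would be flagged (the property name is hypothetical):

"name": {
    "description": "",
    "type": "string"
}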

1.28 export_auth_token.bash

Script for exporting AUTH_TOKEN and ACCOUNT_ID when doing Crossbar authentication. Handy when running curl commands to use $AUTH_TOKEN instead of the raw value (and for re-authenticating when the auth token expires).
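
A hypothetical session, assuming the script is sourced and Crossbar is listening on its default local port:

source ./scripts/export_auth_token.bash
curl -H "X-Auth-Token: $AUTH_TOKEN" http://localhost:8000/v2/accounts/$ACCOUNT_ID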

1.29 format-json.sh

Python script to format JSON files (like CouchDB views, JSON schemas) and write the formatted version back to the file. ‘make apis’ runs this as part of its instructions.

./scripts/format-json.sh path/to/file.json [path/to/other/file.json,...]

1.30 generate-api-endpoints.escript

Builds the Crossbar reference docs in ‘applications/crossbar/doc/ref’. Helps detect when Crossbar endpoints have client-facing changes to their functionality.

Also builds the Swagger JSON file in applications/crossbar/priv/api/swagger.json

1.31 generate-doc-schemas.sh

Updates the Crossbar docs with the schema table from the auto-generated reference version

1.32 generate-fs-headers-hrl.escript

Parses the ecallmgr code looking for keys used to access values in the FreeSWITCH proplist and builds a header file at applications/ecallmgr/src/fs_event_filters.hrl for use when initializing mod_kazoo.

1.33 generate-schemas.escript

Parses the core/applications code looking for calls to kapps_config (the module used to access documents in the system_config database) and builds a base JSON schema file for each document found.

Also parses callflow’s action modules looking for keys used to access values in the Data JSON object to build a base JSON schema file for each callflow action.
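
For instance, a call of the kind the script looks for (the category, key, and default here are hypothetical):

kapps_config:get_binary(<<"crossbar">>, <<"default_language">>, <<"en-US">>)

would typically contribute a property to the base schema generated for that system_config document.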

1.34 kz_diaspora.bash

Script for updating Erlang code to account for functions that have moved modules.

  • kz_util to alternative modules
  • kz_json to kz_doc for public/private fields
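
An illustrative (hypothetical) rewrite of the kind this script performs:

%% before
Bin = kz_util:to_binary(Value),
%% after
Bin = kz_term:to_binary(Value),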

1.35 no_raw_json.escript

Erlang has a handful of internal representations of JSON used by the various parsers. The kz_json module handles these details and Kazoo programmers should treat the data structure used as opaque. This script parses the codebase looking for instances where the opaqueness of the data structure is violated.
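
A sketch of the difference, assuming the tuple/proplist internal representation:

%% raw internal representation - flagged by the script
RawJObj = {[{<<"enabled">>, true}]},
%% opaque construction via kz_json instead
JObj = kz_json:set_value(<<"enabled">>, true, kz_json:new()),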

1.36 rabbitmq-generic.sh

Wrapper for running rabbitmq script commands?

1.37 rabbitmq-server.init

Init.d script for rabbitmq

1.38 reconcile_docs_to_index.bash

Finds all docs in the repo and checks which are included in the mkdocs.yml index

1.39 setup-dev.sh

Script to set up a dev environment, including:

  • Symlink SUP to /usr/bin
  • Symlink rabbitmq init.d script to /etc/init.d
  • Symlink kazoo init.d scripts to /etc/init.d
  • Reset RabbitMQ Mnesia databases and logs
  • Set up users for RabbitMQ and Kazoo

1.40 setup-git.sh

Sets up the username/email to use in Git commits and other Git settings

1.41 setup_docs.bash

Script for setting up a local environment for running the mkdocs-built docs site

1.42 src2any.escript

Reads the .app.src file and writes a .src file?

1.43 start-apps.sh

Starts a VM in the background with name kazoo_apps

1.44 start-ecallmgr.sh

Starts a VM in the background with name ecallmgr

1.45 state-of-docs.sh

Searches for undocumented APIs and reports percentage of doc coverage.

./scripts/state-of-docs.sh
Undocumented API endpoints:
> DELETE /v2/templates/{TEMPLATE_NAME}
> PUT /v2/templates/{TEMPLATE_NAME}
> GET /v2/sup/{MODULE}
> GET /v2/accounts/{ACCOUNT_ID}/agents
> GET /v2/accounts/{ACCOUNT_ID}/agents/stats
> GET /v2/accounts/{ACCOUNT_ID}/agents/status
> POST /v2/accounts/{ACCOUNT_ID}/agents/status/{USER_ID}
> GET /v2/accounts/{ACCOUNT_ID}/agents/status/{USER_ID}
> GET /v2/accounts/{ACCOUNT_ID}/agents/{USER_ID}
> GET /v2/accounts/{ACCOUNT_ID}/agents/{USER_ID}/queue_status
> POST /v2/accounts/{ACCOUNT_ID}/agents/{USER_ID}/queue_status
> GET /v2/accounts/{ACCOUNT_ID}/agents/{USER_ID}/status
> POST /v2/accounts/{ACCOUNT_ID}/agents/{USER_ID}/status
> GET /v2/accounts/{ACCOUNT_ID}/alerts
> PUT /v2/accounts/{ACCOUNT_ID}/alerts
> DELETE /v2/accounts/{ACCOUNT_ID}/alerts/{ALERT_ID}
> GET /v2/accounts/{ACCOUNT_ID}/alerts/{ALERT_ID}
> GET /v2/accounts/{ACCOUNT_ID}/blacklists
> PUT /v2/accounts/{ACCOUNT_ID}/blacklists
> GET /v2/accounts/{ACCOUNT_ID}/blacklists/{BLACKLIST_ID}
> POST /v2/accounts/{ACCOUNT_ID}/blacklists/{BLACKLIST_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/blacklists/{BLACKLIST_ID}
> PATCH /v2/accounts/{ACCOUNT_ID}/blacklists/{BLACKLIST_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/bulk
> POST /v2/accounts/{ACCOUNT_ID}/bulk
> PUT /v2/accounts/{ACCOUNT_ID}/cccps
> PUT /v2/accounts/{ACCOUNT_ID}/cccps/{CCCP_ID}
> POST /v2/accounts/{ACCOUNT_ID}/cccps/{CCCP_ID}
> GET /v2/accounts/{ACCOUNT_ID}/cccps/{CCCP_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/cccps/{CCCP_ID}
> GET /v2/accounts/{ACCOUNT_ID}/cdrs/summary
> PUT /v2/accounts/{ACCOUNT_ID}/clicktocall
> PATCH /v2/accounts/{ACCOUNT_ID}/clicktocall/{C2C_ID}
> POST /v2/accounts/{ACCOUNT_ID}/clicktocall/{C2C_ID}
> GET /v2/accounts/{ACCOUNT_ID}/clicktocall/{C2C_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/clicktocall/{C2C_ID}
> GET /v2/accounts/{ACCOUNT_ID}/clicktocall/{C2C_ID}/connect
> POST /v2/accounts/{ACCOUNT_ID}/clicktocall/{C2C_ID}/connect
> GET /v2/accounts/{ACCOUNT_ID}/clicktocall/{C2C_ID}/history
> GET /v2/accounts/{ACCOUNT_ID}/conferences
> PUT /v2/accounts/{ACCOUNT_ID}/conferences
> PATCH /v2/accounts/{ACCOUNT_ID}/conferences/{CONFERENCE_ID}
> GET /v2/accounts/{ACCOUNT_ID}/conferences/{CONFERENCE_ID}
> POST /v2/accounts/{ACCOUNT_ID}/conferences/{CONFERENCE_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/conferences/{CONFERENCE_ID}
> GET /v2/accounts/{ACCOUNT_ID}/conferences/{CONFERENCE_ID}/participants
> GET /v2/accounts/{ACCOUNT_ID}/conferences/{CONFERENCE_ID}/participants/{PARTICIPANT_ID}
> PATCH /v2/accounts/{ACCOUNT_ID}/configs/{CONFIG_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/configs/{CONFIG_ID}
> GET /v2/accounts/{ACCOUNT_ID}/configs/{CONFIG_ID}
> PUT /v2/accounts/{ACCOUNT_ID}/configs/{CONFIG_ID}
> POST /v2/accounts/{ACCOUNT_ID}/configs/{CONFIG_ID}
> PUT /v2/accounts/{ACCOUNT_ID}/connectivity
> DELETE /v2/accounts/{ACCOUNT_ID}/connectivity/{CONNECTIVITY_ID}
> PATCH /v2/accounts/{ACCOUNT_ID}/connectivity/{CONNECTIVITY_ID}
> POST /v2/accounts/{ACCOUNT_ID}/connectivity/{CONNECTIVITY_ID}
> GET /v2/accounts/{ACCOUNT_ID}/connectivity/{CONNECTIVITY_ID}
> PUT /v2/accounts/{ACCOUNT_ID}/directories
> POST /v2/accounts/{ACCOUNT_ID}/directories/{DIRECTORY_ID}
> PATCH /v2/accounts/{ACCOUNT_ID}/directories/{DIRECTORY_ID}
> GET /v2/accounts/{ACCOUNT_ID}/faxboxes
> PUT /v2/accounts/{ACCOUNT_ID}/faxboxes
> DELETE /v2/accounts/{ACCOUNT_ID}/faxboxes/{FAXBOX_ID}
> GET /v2/accounts/{ACCOUNT_ID}/faxboxes/{FAXBOX_ID}
> PATCH /v2/accounts/{ACCOUNT_ID}/faxboxes/{FAXBOX_ID}
> POST /v2/accounts/{ACCOUNT_ID}/faxboxes/{FAXBOX_ID}
> PUT /v2/accounts/{ACCOUNT_ID}/faxes/inbox/{FAX_ID}
> GET /v2/accounts/{ACCOUNT_ID}/freeswitch
> PUT /v2/accounts/{ACCOUNT_ID}/global_provisioner_templates
> GET /v2/accounts/{ACCOUNT_ID}/global_provisioner_templates
> GET /v2/accounts/{ACCOUNT_ID}/global_provisioner_templates/{TEMPLATE_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/global_provisioner_templates/{TEMPLATE_ID}
> POST /v2/accounts/{ACCOUNT_ID}/global_provisioner_templates/{TEMPLATE_ID}
> POST /v2/accounts/{ACCOUNT_ID}/global_provisioner_templates/{TEMPLATE_ID}/image
> GET /v2/accounts/{ACCOUNT_ID}/global_provisioner_templates/{TEMPLATE_ID}/image
> DELETE /v2/accounts/{ACCOUNT_ID}/global_provisioner_templates/{TEMPLATE_ID}/image
> GET /v2/accounts/{ACCOUNT_ID}/hotdesks
> GET /v2/accounts/{ACCOUNT_ID}/local_provisioner_templates
> PUT /v2/accounts/{ACCOUNT_ID}/local_provisioner_templates
> GET /v2/accounts/{ACCOUNT_ID}/local_provisioner_templates/{TEMPLATE_ID}
> POST /v2/accounts/{ACCOUNT_ID}/local_provisioner_templates/{TEMPLATE_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/local_provisioner_templates/{TEMPLATE_ID}
> GET /v2/accounts/{ACCOUNT_ID}/local_provisioner_templates/{TEMPLATE_ID}/image
> POST /v2/accounts/{ACCOUNT_ID}/local_provisioner_templates/{TEMPLATE_ID}/image
> DELETE /v2/accounts/{ACCOUNT_ID}/local_provisioner_templates/{TEMPLATE_ID}/image
> GET /v2/accounts/{ACCOUNT_ID}/menus
> PUT /v2/accounts/{ACCOUNT_ID}/menus
> PATCH /v2/accounts/{ACCOUNT_ID}/menus/{MENU_ID}
> GET /v2/accounts/{ACCOUNT_ID}/menus/{MENU_ID}
> POST /v2/accounts/{ACCOUNT_ID}/menus/{MENU_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/menus/{MENU_ID}
> GET /v2/accounts/{ACCOUNT_ID}/metaflows
> DELETE /v2/accounts/{ACCOUNT_ID}/metaflows
> POST /v2/accounts/{ACCOUNT_ID}/metaflows
> PUT /v2/accounts/{ACCOUNT_ID}/onboard
> GET /v2/accounts/{ACCOUNT_ID}/parked_calls
> POST /v2/accounts/{ACCOUNT_ID}/presence
> GET /v2/accounts/{ACCOUNT_ID}/presence/report-{REPORT_ID}
> GET /v2/accounts/{ACCOUNT_ID}/presence/{EXTENSION}
> PUT /v2/accounts/{ACCOUNT_ID}/queues/eavesdrop
> PUT /v2/accounts/{ACCOUNT_ID}/queues/{QUEUE_ID}/eavesdrop
> POST /v2/accounts/{ACCOUNT_ID}/queues/{QUEUE_ID}/roster
> GET /v2/accounts/{ACCOUNT_ID}/rate_limits
> DELETE /v2/accounts/{ACCOUNT_ID}/rate_limits
> POST /v2/accounts/{ACCOUNT_ID}/rate_limits
> GET /v2/accounts/{ACCOUNT_ID}/resource_selectors
> GET /v2/accounts/{ACCOUNT_ID}/resource_selectors/name/{SELECTOR_NAME}/resource/{RESOURCE_ID}
> GET /v2/accounts/{ACCOUNT_ID}/resource_selectors/rules
> POST /v2/accounts/{ACCOUNT_ID}/resource_selectors/rules
> DELETE /v2/accounts/{ACCOUNT_ID}/resource_selectors/{UUID}
> GET /v2/accounts/{ACCOUNT_ID}/resource_selectors/{UUID}
> POST /v2/accounts/{ACCOUNT_ID}/resource_selectors/{UUID}
> PUT /v2/accounts/{ACCOUNT_ID}/resource_templates
> GET /v2/accounts/{ACCOUNT_ID}/resource_templates
> POST /v2/accounts/{ACCOUNT_ID}/resource_templates/{RESOURCE_TEMPLATE_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/resource_templates/{RESOURCE_TEMPLATE_ID}
> GET /v2/accounts/{ACCOUNT_ID}/resource_templates/{RESOURCE_TEMPLATE_ID}
> PATCH /v2/accounts/{ACCOUNT_ID}/resource_templates/{RESOURCE_TEMPLATE_ID}
> POST /v2/accounts/{ACCOUNT_ID}/service_plans/reconciliation
> POST /v2/accounts/{ACCOUNT_ID}/service_plans/synchronization
> GET /v2/accounts/{ACCOUNT_ID}/services/plan
> POST /v2/accounts/{ACCOUNT_ID}/services/status
> GET /v2/accounts/{ACCOUNT_ID}/services/status
> PUT /v2/accounts/{ACCOUNT_ID}/signup
> POST /v2/accounts/{ACCOUNT_ID}/signup/{THING}
> PUT /v2/accounts/{ACCOUNT_ID}/sms
> GET /v2/accounts/{ACCOUNT_ID}/sms/{SMS_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/sms/{SMS_ID}
> PATCH /v2/accounts/{ACCOUNT_ID}/storage
> DELETE /v2/accounts/{ACCOUNT_ID}/storage
> PUT /v2/accounts/{ACCOUNT_ID}/storage
> POST /v2/accounts/{ACCOUNT_ID}/storage
> PUT /v2/accounts/{ACCOUNT_ID}/storage/plans
> GET /v2/accounts/{ACCOUNT_ID}/storage/plans
> PATCH /v2/accounts/{ACCOUNT_ID}/storage/plans/{STORAGE_PLAN_ID}
> GET /v2/accounts/{ACCOUNT_ID}/storage/plans/{STORAGE_PLAN_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/storage/plans/{STORAGE_PLAN_ID}
> POST /v2/accounts/{ACCOUNT_ID}/storage/plans/{STORAGE_PLAN_ID}
> GET /v2/accounts/{ACCOUNT_ID}/tasks/{TASK_ID}/output
> PUT /v2/accounts/{ACCOUNT_ID}/temporal_rules
> POST /v2/accounts/{ACCOUNT_ID}/temporal_rules/{TEMPORAL_RULE_ID}
> GET /v2/accounts/{ACCOUNT_ID}/temporal_rules/{TEMPORAL_RULE_ID}
> DELETE /v2/accounts/{ACCOUNT_ID}/temporal_rules/{TEMPORAL_RULE_ID}
> PATCH /v2/accounts/{ACCOUNT_ID}/temporal_rules/{TEMPORAL_RULE_ID}
> PUT /v2/accounts/{ACCOUNT_ID}/temporal_rules_sets
> GET /v2/accounts/{ACCOUNT_ID}/temporal_rules_sets
> POST /v2/accounts/{ACCOUNT_ID}/temporal_rules_sets/{TEMPORAL_RULE_SET}
> PATCH /v2/accounts/{ACCOUNT_ID}/temporal_rules_sets/{TEMPORAL_RULE_SET}
> GET /v2/accounts/{ACCOUNT_ID}/temporal_rules_sets/{TEMPORAL_RULE_SET}
> DELETE /v2/accounts/{ACCOUNT_ID}/temporal_rules_sets/{TEMPORAL_RULE_SET}
> DELETE /v2/accounts/{ACCOUNT_ID}/whitelabel
> PUT /v2/accounts/{ACCOUNT_ID}/whitelabel
> POST /v2/accounts/{ACCOUNT_ID}/whitelabel
> GET /v2/accounts/{ACCOUNT_ID}/whitelabel
> POST /v2/accounts/{ACCOUNT_ID}/whitelabel/icon
> GET /v2/accounts/{ACCOUNT_ID}/whitelabel/icon
> POST /v2/accounts/{ACCOUNT_ID}/whitelabel/logo
> GET /v2/accounts/{ACCOUNT_ID}/whitelabel/logo
> POST /v2/accounts/{ACCOUNT_ID}/whitelabel/welcome
> GET /v2/accounts/{ACCOUNT_ID}/whitelabel/welcome
> GET /v2/accounts/{ACCOUNT_ID}/whitelabel/{WHITELABEL_DOMAIN}
> GET /v2/accounts/{ACCOUNT_ID}/whitelabel/{WHITELABEL_DOMAIN}/icon
> GET /v2/accounts/{ACCOUNT_ID}/whitelabel/{WHITELABEL_DOMAIN}/logo
> GET /v2/accounts/{ACCOUNT_ID}/whitelabel/{WHITELABEL_DOMAIN}/welcome
> GET /v2/sup/{MODULE}/{FUNCTION}
> GET /v2/sup/{MODULE}/{FUNCTION}/{ARGS}
> DELETE /v2/auth/links
> GET /v2/about
> GET /v2/auth/links
> GET /v2/auth/tokeninfo
> GET /v2/templates
> POST /v2/auth/links
> PUT /v2/auth/authorize
> PUT /v2/auth/callback
> PUT /v2/ip_auth
> PUT /v2/shared_auth

349 / 526 ( 66% documented )

Documented but not matching any allowed_method:
> DELETE /v2/notifications/{NOTIFICATION_ID}
> GET /v2/accounts/{ACCOUNT_ID}/about
> GET /v2/accounts/{ACCOUNT_ID}/descendants/port_requests
> PATCH /v2/accounts/{ACCOUNT_ID}/descendants/webhooks
> DELETE /v2/accounts/{ACCOUNT_ID}/devices/{DEVICE_ID}/access_lists
> GET /v2/accounts/{ACCOUNT_ID}/devices/{DEVICE_ID}/channels
> GET /v2/accounts/{ACCOUNT_ID}/users/{USER_ID}/cdrs
> GET /v2/accounts/{ACCOUNT_ID}/users/{USER_ID}/channels
> GET /v2/accounts/{ACCOUNT_ID}/users/{USER_ID}/devices
> GET /v2/accounts/{ACCOUNT_ID}/users/{USER_ID}/recordings
> GET /v1/accounts
> GET /v2/channels
> GET /v2/notifications
> GET /v2/phone_numbers
> GET /v2/resource_selectors/rules
> GET /v2/search
> GET /v2/search/multi
> GET /v2/tasks
> GET /v2/webhooks
> GET /v2/websockets
> POST /v2/resource_selectors/rules
> POST /v2/whitelabel/domains

1.46 sync_to_remote.bash

HOST="server.com" ERL_FILES="path/to/source.erl" BEAM_PATH="/tmp/beams" ./scripts/sync_to_remote.bash

Takes the provided Erlang files, finds their .beam files, and syncs those to the provided remote server.

  • ERL_FILES: which source files to sync (defaults to the files changed against master).
  • HOST: The Host to use for the scp command
  • BEAM_PATH: Where on the Host to put the beam files

1.47 sync_to_release.bash

Useful in conjunction with sync_to_remote.bash. Takes .beam files in a directory, moves them into the proper application ebin/ within a release, and reloads them in the default VMs

  • BEAMS: Path to beam files, defaults to /tmp/beams/*.beam
  • DEST: Path to the release’s lib/ directory, defaults to /opt/kazoo/lib
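
A hypothetical invocation, spelling out the defaults noted above:

BEAMS=/tmp/beams/*.beam DEST=/opt/kazoo/lib ./scripts/sync_to_release.bash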

1.48 update-the-types.sh

Searches the code for deprecated Erlang functions and types and replaces them with the newer versions as appropriate
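
Illustrative examples of the kind of replacement involved (not necessarily the script’s exact list):

%% deprecated built-in type syntax becomes a remote type
-spec ids() -> dict().        %% becomes: -spec ids() -> dict:dict().
%% deprecated function call becomes its replacement
Now = erlang:now(),           %% becomes: Now = os:timestamp(),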

1.49 validate-js.sh

Processes JSON files:

  • Checks that _id matches the file name in schema files
  • Checks map functions in CouchDB views for ‘Object.keys’ usage
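
For example (illustrative file name), a schema file devices.json is expected to contain:

"_id": "devices"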

1.50 validate-swagger.sh

Validates the Swagger file using an online validator

./scripts/validate-swagger.sh
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  2973  100  2973    0     0   4945      0 --:--:-- --:--:-- --:--:--  4938
Swagger file validation errors: 2
{
    "messages": [
        "malformed or unreadable swagger supplied"
    ],
    "schemaValidationMessages": [
        {
            "domain": "validation",
            "instance": {
                "pointer": "/definitions/allotments"
            },
            "keyword": "additionalProperties",
            "level": "error",
            "message": "object instance has properties which are not allowed by the schema: [\"patternProperties\"]",
            "schema": {
                "loadingURI": "http://swagger.io/v2/schema.json#",
                "pointer": "/definitions/schema"
            }
        },
        {
            "domain": "validation",
            "instance": {
                "pointer": "/definitions/domain_hosts"
            },
            "keyword": "additionalProperties",
            "level": "error",
            "message": "object instance has properties which are not allowed by the schema: [\"patternProperties\"]",
            "schema": {
                "loadingURI": "http://swagger.io/v2/schema.json#",
                "pointer": "/definitions/schema"
            }
        },
        {
            "domain": "validation",
            "instance": {
                "pointer": "/definitions/metaflow"
            },
            "keyword": "additionalProperties",
            "level": "error",
            "message": "object instance has properties which are not allowed by the schema: [\"oneOf\"]",
            "schema": {
                "loadingURI": "http://swagger.io/v2/schema.json#",
                "pointer": "/definitions/schema"
            }
        },
        {
            "domain": "validation",
            "instance": {
                "pointer": "/definitions/metaflow_children"
            },
            "keyword": "additionalProperties",
            "level": "error",
            "message": "object instance has properties which are not allowed by the schema: [\"patternProperties\"]",
            "schema": {
                "loadingURI": "http://swagger.io/v2/schema.json#",
                "pointer": "/definitions/schema"
            }
        },
        {
            "domain": "validation",
            "instance": {
                "pointer": "/definitions/storage"
            },
            "keyword": "additionalProperties",
            "level": "error",
            "message": "object instance has properties which are not allowed by the schema: [\"patternProperties\"]",
            "schema": {
                "loadingURI": "http://swagger.io/v2/schema.json#",
                "pointer": "/definitions/schema"
            }
        },
        {
            "domain": "validation",
            "instance": {
                "pointer": "/definitions/storage.attachments"
            },
            "keyword": "additionalProperties",
            "level": "error",
            "message": "object instance has properties which are not allowed by the schema: [\"patternProperties\"]",
            "schema": {
                "loadingURI": "http://swagger.io/v2/schema.json#",
                "pointer": "/definitions/schema"
            }
        },
        {
            "domain": "validation",
            "instance": {
                "pointer": "/definitions/storage.connection.couchdb"
            },
            "keyword": "additionalProperties",
            "level": "error",
            "message": "object instance has properties which are not allowed by the schema: [\"definitions\"]",
            "schema": {
                "loadingURI": "http://swagger.io/v2/schema.json#",
                "pointer": "/definitions/schema"
            }
        },
        {
            "domain": "validation",
            "instance": {
                "pointer": "/definitions/storage.connections"
            },
            "keyword": "additionalProperties",
            "level": "error",
            "message": "object instance has properties which are not allowed by the schema: [\"patternProperties\"]",
            "schema": {
                "loadingURI": "http://swagger.io/v2/schema.json#",
                "pointer": "/definitions/schema"
            }
        },
        {
            "domain": "validation",
            "instance": {
                "pointer": "/definitions/storage.plan.database"
            },
            "keyword": "additionalProperties",
            "level": "error",
            "message": "object instance has properties which are not allowed by the schema: [\"definitions\"]",
            "schema": {
                "loadingURI": "http://swagger.io/v2/schema.json#",
                "pointer": "/definitions/schema"
            }
        }
    ]
}
FIX THESE ISSUES

1.51 validate_mkdocs.py

Parses the mkdocs.yml and looks for non-existent docs

1.52 wh_to_kz.sh

Part of the great rename, converts Whistle-related names to Kazoo-specific names