Server

Getting started

Before running Polar locally (specifically the API), you should set up a GitHub app with the necessary permissions. For a list of required permissions, refer to this file. You can then run uv run task verify_github_app to verify that everything is set up correctly, and then run the commands below.

# Run these commands in this directory (./server)
#
# Create a .env file and edit it
cp .env.template .env

# Start PostgreSQL and Redis
docker compose up -d

# Install dependencies
uv sync

# Check out what powers are in the toolbelt (lists the available tasks)
uv run task --list

# Use our VSCode workspace (extensions, settings etc)
code polar.code-workspace

# Run database migrations
uv run task db_migrate

# Start the FastAPI backend
uv run task api

# (in another terminal) Start the arq worker
uv run task worker

# Run the tests
uv run task test

# Our VSCode settings configure Ruff, but you can run it manually too
uv run task lint

Quick Start with Make

We provide a Makefile and helper scripts to automate the development setup and common tasks. The setup process will:

  1. Create a .env file from template
  2. Start Docker services
  3. Install dependencies
  4. Generate development JWKS (JSON Web Key Set) file
  5. Run database migrations

Here are the available commands:

# First-time setup (copies .env, starts Docker services, installs dependencies, generates JWKS, runs migrations)
make setup

# If you need to generate JWKS manually
uv run task generate_dev_jwks

# Start the development server
make dev

# Start the worker (run in a separate terminal)
make worker

# Other useful commands:
make test           # Run tests
make lint           # Run linter
make migrate        # Run database migrations
make verify-github  # Verify GitHub app configuration
make docker-up      # Start Docker services
make docker-down    # Stop Docker services

# Create a new migration
make new-migration m="your migration description"

Development Requirements

Before running the server, ensure you have:

  1. Set up a GitHub app with proper permissions
  2. A valid .env file (created from template)
  3. A JWKS file (generated automatically during setup)

If you encounter JWKS-related errors, you can regenerate the JWKS file using:

uv run task generate_dev_jwks

Helper Scripts

The scripts/ directory contains Python scripts that help automate common tasks:

  • scripts/dev.py: Handles development server setup and startup
  • scripts/worker.py: Manages the worker process
  • scripts/setup_scripts.sh: Makes the scripts executable

To use the scripts directly:

# First make them executable
chmod +x scripts/setup_scripts.sh
./scripts/setup_scripts.sh

# Then you can run
./scripts/dev.py    # Start development server
./scripts/worker.py # Start worker

Create a database migration

Modify the models in polar.models, then run

alembic revision --autogenerate -m "[description]"

and a migration will be generated for you.
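
Alembic's autogenerate compares the SQLAlchemy model metadata against the current database schema, so changing the model is enough for it to emit the migration. As a rough, hypothetical sketch (not Polar's actual model code; the base class, table and column names are invented for illustration):

# Hypothetical model change; Polar's real models live in polar.models.
from sqlalchemy import String
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class Organization(Base):
    __tablename__ = "organizations"

    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(255))
    # Newly added column: `alembic revision --autogenerate` detects it and
    # emits an `op.add_column()` in the generated migration.
    billing_email: Mapped[str | None] = mapped_column(String(320), nullable=True)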

Design

Polar started out structured in modules per function, e.g. models, schemas, API endpoints, etc. However, a given resource and/or service then ended up scattered across a handful of modules. Inspired by Netflix Dispatch, we've moved to modules per domain(ish), containing all business logic surrounding a given resource/service in one place.

Exception to this rule: database models and core modules (see more on why below).


How a module is structured

| polar/your_module/ | Explanation & Usage |
| --- | --- |
| endpoints.py | FastAPI router for the module. Mounted and routed in polar.api. Endpoint functions should be in charge of validation and authentication, but business logic should be contained within service.py (see the sketch below). |
| schemas.py | Pydantic schemas for request/response and data validation. Resources should have schemas for their applicable CRUD operations, named `(Read… |
| service.py | Module containing all the business logic. Essentially the non-public API for the resource/service, which its own API utilizes along with any other services. |
| signals.py | Blinker signals (if any). A great way for other services to listen for specific events and do their own thing. |
| receivers.py | Receiver functions for Blinker signals (if any). Need to be registered in polar.receivers, which is mounted to ensure all signals/receivers are set up. |
| exceptions.py | Any local exceptions. |

*** Not all resources/services are the same, so extend this structure as needed.

Of course, only add what's needed for a given resource/service. Once we've finalized this structure and feel we're used to it, we can create a nice little CLI generator :-)
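
To make the layout concrete, here's a deliberately minimal sketch of a hypothetical polar/greeting/ module, collapsed into one snippet for brevity. The names, schemas and service are invented for illustration and are not Polar's actual code:

# schemas.py: Pydantic schemas for request/response validation
from pydantic import BaseModel


class GreetingRead(BaseModel):
    id: int
    message: str


# service.py: all business logic lives here, so endpoints stay thin
class GreetingService:
    async def get(self, greeting_id: int) -> GreetingRead:
        # A real service would query the database here.
        return GreetingRead(id=greeting_id, message="Hello!")


greeting_service = GreetingService()


# signals.py: Blinker signals other modules can subscribe to
from blinker import signal

greeting_created = signal("greeting.created")


# receivers.py: registered in polar.receivers so they are always loaded
@greeting_created.connect
def on_greeting_created(sender, **kwargs):
    # e.g. notify another service or enqueue a worker task
    ...


# endpoints.py: FastAPI router, mounted and routed in polar.api
from fastapi import APIRouter

router = APIRouter(prefix="/greetings", tags=["greetings"])


@router.get("/{greeting_id}", response_model=GreetingRead)
async def get_greeting(greeting_id: int) -> GreetingRead:
    # Validation/authentication belongs here; business logic stays in the service.
    return await greeting_service.get(greeting_id)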

Great resources and inspirations

  • Netflix Dispatch

Q&A

Why are database models not in their respective modules vs. polar.models?

Pragmatic solution. Ideally, they would be, and polar.models could be a global container for them all. The challenge with this is model relationships, e.g. Organization having many Repository records, and keeping a feature-complete metadata object for SQLAlchemy. There are ways around this, but they introduce complexity and some magic that needs to run at runtime. In short, it creates more problems than it solves.
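
As a sketch of the relationship in question (simplified, hypothetical versions of the two models): with both classes defined under a single polar.models package they share one DeclarativeBase metadata, so the relationship resolves without import gymnastics.

# Simplified, illustrative models; not the actual Polar definitions.
from sqlalchemy import ForeignKey
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, relationship


class Base(DeclarativeBase):
    pass


class Organization(Base):
    __tablename__ = "organizations"

    id: Mapped[int] = mapped_column(primary_key=True)
    # One organization has many repositories.
    repositories: Mapped[list["Repository"]] = relationship(back_populates="organization")


class Repository(Base):
    __tablename__ = "repositories"

    id: Mapped[int] = mapped_column(primary_key=True)
    organization_id: Mapped[int] = mapped_column(ForeignKey("organizations.id"))
    organization: Mapped["Organization"] = relationship(back_populates="repositories")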

Absolute vs. relative imports?

By default we use absolute imports. However, within an isolated module, e.g. polar.organization, we use relative imports for its corresponding schemas, endpoints, services and the like, for better readability and separation. Let's not be fanatical about it, though, and optimize for readability, e.g. prefer absolute imports over deep relative imports.
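
For instance, inside a hypothetical polar/organization/endpoints.py (the imported names and paths below are made up for illustration):

# Relative imports for the module's own pieces: short and obviously local.
from .schemas import OrganizationRead
from .service import organization_service

# Absolute imports for everything outside the module.
from polar.postgres import AsyncSession  # hypothetical path, for illustration

# Avoid deep relative imports such as:
#   from ..postgres import AsyncSession
# The absolute form above reads better.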