Maxun lets you train a robot in 2 minutes and scrape the web on auto-pilot. Web data extraction doesn't get easier than this!
Website | Discord | Twitter | Join Maxun Cloud | Watch Tutorials
Note: Maxun is in its early stages of development and does not yet support self-hosting. However, you can run Maxun locally. Self-hosting will be available in an upcoming release.
# clone the repository
git clone https://github.com/getmaxun/maxun

# change directory to the project root
cd maxun

# start all services in the background
docker-compose up -d
You can access the frontend at http://localhost:5173/ and backend at http://localhost:8080/
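If the app does not come up, standard Docker Compose commands are a quick way to see what happened (nothing Maxun-specific is assumed here):

# list the containers started by the compose file and their status
docker-compose ps

# follow the logs of all services to spot startup errors
docker-compose logs -f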
- Ensure you have Node.js, PostgreSQL, MinIO and Redis installed and running on your system.
- Run the commands below:
git clone https://github.com/getmaxun/maxun
# change directory to the project root
cd maxun
# install dependencies
npm install
# change directory to maxun-core to install dependencies
cd maxun-core
npm install
# get back to the root directory
cd ..
# make sure playwright is properly initialized
npx playwright install
npx playwright install-deps
# start frontend and backend together
npm run start
You can access the frontend at http://localhost:5173/ and backend at http://localhost:8080/
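To quickly confirm that both processes are reachable on the default ports above, a plain curl check works (this only verifies that something is listening, nothing more):

# frontend (Vite dev server)
curl -I http://localhost:5173/

# backend
curl -I http://localhost:8080/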
- Create a file named `.env` in the root folder of the project. An example env file can be viewed here.
| Variable | Mandatory | Description | If Not Set |
|---|---|---|---|
| BACKEND_PORT | Yes | Port to run the backend on. Needed for Docker setup. | Default value: 8080 |
| FRONTEND_PORT | Yes | Port to run the frontend on. Needed for Docker setup. | Default value: 5173 |
| BACKEND_URL | Yes | URL to run the backend on. | Default value: http://localhost:8080 |
| VITE_BACKEND_URL | Yes | URL used by the frontend to connect to the backend. | Default value: http://localhost:8080 |
| PUBLIC_URL | Yes | URL to run the frontend on. | Default value: http://localhost:5173 |
| VITE_PUBLIC_URL | Yes | URL used by the backend to connect to the frontend. | Default value: http://localhost:5173 |
| JWT_SECRET | Yes | Secret key used to sign and verify JSON Web Tokens (JWTs) for authentication. | JWT authentication will not work. |
| DB_NAME | Yes | Name of the Postgres database to connect to. | Database connection will fail. |
| DB_USER | Yes | Username for Postgres database authentication. | Database connection will fail. |
| DB_PASSWORD | Yes | Password for Postgres database authentication. | Database connection will fail. |
| DB_HOST | Yes | Host address where the Postgres database server is running. | Database connection will fail. |
| DB_PORT | Yes | Port number used to connect to the Postgres database server. | Database connection will fail. |
| ENCRYPTION_KEY | Yes | Key used for encrypting sensitive data (proxies, passwords). | Encryption functionality will not work. |
| MINIO_ENDPOINT | Yes | Endpoint URL for MinIO, used to store robot run screenshots. | Connection to MinIO storage will fail. |
| MINIO_PORT | Yes | Port number for the MinIO service. | Connection to MinIO storage will fail. |
| MINIO_CONSOLE_PORT | No | Port number for the MinIO Web UI. Needed for Docker setup. | Cannot access the MinIO Web UI. |
| MINIO_ACCESS_KEY | Yes | Access key for authenticating with MinIO. | MinIO authentication will fail. |
| GOOGLE_CLIENT_ID | No | Client ID for Google OAuth, used for Google Sheet integration authentication. | Google login will not work. |
| GOOGLE_CLIENT_SECRET | No | Client secret for Google OAuth. | Google login will not work. |
| GOOGLE_REDIRECT_URI | No | Redirect URI for handling Google OAuth responses. | Google login will not work. |
| REDIS_HOST | Yes | Host address of the Redis server, used by BullMQ for scheduling robots. | Redis connection will fail. |
| REDIS_PORT | Yes | Port number for the Redis server. | Redis connection will fail. |
| MAXUN_TELEMETRY | No | Controls telemetry (anonymous usage data). Keeping it enabled helps us understand how the product is used and assess the impact of new changes. Please keep it enabled. | Telemetry data will not be collected. |
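For reference, here is a minimal sketch of a local .env. All values are placeholders chosen for a typical local setup (they are assumptions, not defaults shipped with Maxun); point them at your own Postgres, MinIO and Redis instances and use long random strings for the secrets.

# illustrative .env for local development (placeholder values only)
BACKEND_PORT=8080
FRONTEND_PORT=5173
BACKEND_URL=http://localhost:8080
VITE_BACKEND_URL=http://localhost:8080
PUBLIC_URL=http://localhost:5173
VITE_PUBLIC_URL=http://localhost:5173
JWT_SECRET=replace-with-a-long-random-string
ENCRYPTION_KEY=replace-with-another-long-random-string
DB_NAME=maxun
DB_USER=postgres
DB_PASSWORD=postgres
DB_HOST=localhost
DB_PORT=5432
MINIO_ENDPOINT=localhost
MINIO_PORT=9000
MINIO_CONSOLE_PORT=9001
MINIO_ACCESS_KEY=replace-with-your-minio-access-key
REDIS_HOST=localhost
REDIS_PORT=6379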
Maxun lets you create custom robots which emulate user actions and extract data. A robot can perform any of the following actions: Capture List, Capture Text or Capture Screenshot. Once a robot is created, it keeps extracting data for you without manual intervention.
- Capture List: Useful for extracting structured, bulk items from a website. Example: scrape products from Amazon (see the sample output after this list).
- Capture Text: Useful for extracting individual pieces of text content from a website.
- Capture Screenshot: Get full-page or visible-section screenshots of a website.
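As a purely illustrative sketch (not Maxun's exact output schema, and with made-up data), a Capture List robot trained on a product grid produces one record per item, with fields matching the elements you selected while training:

[
  { "Title": "Example Product A", "Price": "$19.99", "Rating": "4.5 out of 5" },
  { "Title": "Example Product B", "Price": "$24.99", "Rating": "4.2 out of 5" }
]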
BYOP (Bring Your Own Proxy) lets you connect external proxies to bypass anti-bot protection. Currently, proxies are configured per user; soon you'll be able to configure a proxy per robot.
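The exact fields depend on Maxun's proxy settings screen, but an upstream HTTP/HTTPS proxy is conventionally described by a URL of this general shape (hypothetical host and credentials):

# generic proxy URL format, placeholder values only
http://username:password@proxy.example.com:8000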
- ✨ Extract Data With No-Code
- ✨ Handle Pagination & Scrolling
- ✨ Run Robots On A Specific Schedule
- ✨ Turn Websites to APIs
- ✨ Turn Websites to Spreadsheets
- ✨ Adapt To Website Layout Changes (coming soon)
- ✨ Extract Behind Login, With Two-Factor Authentication Support (coming soon)
- ✨ Integrations (currently Google Sheet)
- +++ A lot of amazing things soon!
We offer a managed cloud version that lets you run Maxun without managing infrastructure and extract data at scale. Maxun Cloud also handles anti-bot detection, provides a large proxy network with automatic proxy rotation, and solves CAPTCHAs. If this interests you, join the cloud waitlist, as we are launching soon.
This project is in its early stages of development. Your feedback is very important to us, and we're actively working to improve the product. Drop anonymous feedback here.
This project is licensed under AGPLv3.
Thanks to the combined efforts of everyone who contributes!