A web crawler built in Go using the standard library, shipped as a CLI with Cobra.
- Clone this repository.
- Fetch the dependencies declared in go.mod (go mod download).
- Run it:
go run main.go [args] [flags]
- Optionally, install the binary:
go install
Web-Crawler [args] [flags]
- Make sure $GOPATH/bin (typically $HOME/go/bin) is in your PATH.
- crawl (the main subcommand and entry point of the crawler)
- help
- --version
- --help
Web-Crawler --help
- --depth (depth of crawling, including the root page) (Required) (int)
- --help
- --root-relative (crawl and scrape only pages on the same domain as the root) (Optional) (bool)
- --url (root URL to start crawling from) (Required) (string)
- --generate (generate a .txt file listing all crawled links) (Optional) (bool)
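One way the --root-relative restriction might be implemented is by comparing the host of each discovered link against the host of the root URL using net/url from the standard library. This is a minimal sketch, not the tool's actual code; the helper name `sameRootDomain` is hypothetical.

```go
package main

import (
	"fmt"
	"net/url"
)

// sameRootDomain reports whether candidate lives on the same host as root.
// Hypothetical helper name; the real implementation may differ.
func sameRootDomain(root, candidate string) bool {
	r, err := url.Parse(root)
	if err != nil {
		return false
	}
	c, err := url.Parse(candidate)
	if err != nil {
		return false
	}
	return r.Host == c.Host
}

func main() {
	// Links on a different host would be skipped when --root-relative is set.
	fmt.Println(sameRootDomain("https://transform.tools/", "https://transform.tools/json-to-go"))
	fmt.Println(sameRootDomain("https://transform.tools/", "https://example.com/"))
}
```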
Example
Web-Crawler crawl --url=https://transform.tools/ --depth 3 --generate --root-relative
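To illustrate what a depth-limited crawl like the one above does, here is a standard-library-only sketch: a breadth-first loop that fetches pages level by level, with the root counting as level 1 as the --depth flag describes. This is an assumption-laden illustration, not the repository's actual code; the names `crawl` and `extractLinks` are hypothetical, and a real crawler would use an HTML parser rather than a regexp.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"regexp"
)

// hrefPattern matches absolute http(s) href attributes. A regexp keeps this
// sketch dependency-free; production code should parse HTML properly.
var hrefPattern = regexp.MustCompile(`href="(https?://[^"]+)"`)

// extractLinks returns all absolute http(s) links found in an HTML page.
func extractLinks(body string) []string {
	var links []string
	for _, m := range hrefPattern.FindAllStringSubmatch(body, -1) {
		links = append(links, m[1])
	}
	return links
}

// crawl visits pages breadth-first down to the given depth (root = level 1),
// skipping URLs it has already seen, and returns them in visit order.
func crawl(rootURL string, depth int) []string {
	seen := map[string]bool{rootURL: true}
	queue := []string{rootURL}
	var visited []string
	for level := 1; level <= depth && len(queue) > 0; level++ {
		var next []string
		for _, u := range queue {
			visited = append(visited, u)
			resp, err := http.Get(u)
			if err != nil {
				continue // skip unreachable pages
			}
			body, _ := io.ReadAll(resp.Body)
			resp.Body.Close()
			for _, link := range extractLinks(string(body)) {
				if !seen[link] {
					seen[link] = true
					next = append(next, link)
				}
			}
		}
		queue = next
	}
	return visited
}

func main() {
	// Offline demo of link extraction; crawl() would be called with a real URL.
	page := `<a href="https://example.com/a">A</a> <a href="/rel">R</a>`
	fmt.Println(extractLinks(page))
}
```

The generated .txt output of --generate could then simply be this visit-order slice written one URL per line.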