Hey folks, today in this tutorial we are going to discuss a web application security testing tool called “hakrawler”. hakrawler is a web crawler written in Go, designed for easy, quick discovery of endpoints and assets within a web application. It can be used to discover:
- Forms
- Endpoints
- Subdomains
- Related domains
- JavaScript files
Let’s take a look.

Install Golang
The tool is written in Go, so we first need to install the Go toolchain with the following command before we can use it.
apt install golang

Installation of Hakrawler
Now the time has come to install the tool with the go utility. We only have to execute the following command.
go get github.com/hakluke/hakrawler
Note that on Go 1.17 and later, go get no longer builds and installs binaries; use go install github.com/hakluke/hakrawler@latest instead.

The go utility places the binary in a directory that Kali Linux includes on the PATH, which means we can run hakrawler from anywhere.
hakrawler -h
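If the hakrawler command is not found after installation, go get typically drops binaries into $GOPATH/bin (defaulting to ~/go/bin), which is not always on the PATH. A minimal sketch, assuming a default Go setup:

```shell
# go get places binaries in $GOPATH/bin (default: ~/go/bin);
# add that directory to PATH so hakrawler is reachable from anywhere
GOBIN_DIR="$(go env GOPATH 2>/dev/null || echo "$HOME/go")/bin"
export PATH="$PATH:$GOBIN_DIR"
echo "$GOBIN_DIR"
```

Adding the export line to ~/.bashrc or ~/.zshrc makes the change permanent across shells.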

Robots.txt Parsing
Basically, robots.txt is a standard that websites use to communicate with web crawlers and other web robots, telling them which paths they may or may not visit; hakrawler can parse it to discover additional endpoints.
hakrawler -url < website > -robots
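To see what kind of information robots.txt parsing surfaces, here is a small sketch with a hypothetical robots.txt, extracting the Disallow entries (the sort of hidden paths a crawler turns into crawl targets) with awk:

```shell
# A hypothetical robots.txt for illustration
cat > /tmp/robots.txt <<'EOF'
User-agent: *
Disallow: /admin/
Disallow: /backup/
Allow: /blog/
EOF

# Pull out the disallowed paths -- exactly the kind of
# "hidden" endpoints that make robots.txt interesting to a crawler
awk -F': ' '/^Disallow:/ {print $2}' /tmp/robots.txt
```

Paths listed under Disallow are often the most interesting to a tester, since they point at areas the site owner did not want indexed.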

Subdomains
Subdomain enumeration is a common feature across many tools, and hakrawler offers it as well.
hakrawler -url fintaxico.in -subs

Depth Scan
If you want to crawl a website more thoroughly, you can set the crawl depth with the following command and increase the depth as needed.
hakrawler -url secnhack.in -depth 10

Likewise, this tool has many more features that can give you a good experience while crawling any web application.
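Because hakrawler prints results one URL per line, its output pipes cleanly into standard Unix tools. A small sketch filtering hypothetical crawl output for JavaScript files:

```shell
# Hypothetical hakrawler output, saved for post-processing
cat > /tmp/crawl.txt <<'EOF'
https://example.com/login
https://example.com/assets/app.js
https://example.com/about
https://example.com/assets/vendor.js
EOF

# Keep only the JavaScript files, de-duplicated and sorted
grep '\.js$' /tmp/crawl.txt | sort -u
```

The same pattern works for pulling out forms, endpoints, or subdomains from a live run, e.g. piping hakrawler straight into grep and sort.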
