
Waybackurls – A Web Crawler To Fetch URLs



Hey Folks, as you may already know, there are various web crawler tools used to crawl the documents available on a web application, and one of them, which I am going to present here, is called “waybackurls”. It works in much the same way as other web crawlers: the tool accepts line-delimited domains on stdin, fetches known URLs from the Wayback Machine for *.domain, and outputs them on stdout.
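
As a quick sketch of that stdin/stdout contract (assuming a file named domains.txt with one domain per line, which is not part of the original walkthrough):

# feed line-delimited domains on stdin; archived URLs come out on stdout
cat domains.txt | waybackurls > urls.txt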

Let’s take a look 🙂 !!

Install Golang

To run this tool, you first need to install the Go toolchain on your system; without it, the tool cannot be built or run. It installs easily with the following command.

apt install golang

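If you want to confirm that Go installed correctly before moving on (an optional check, not part of the original steps), printing the version is enough:

go version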

Waybackurls Tool Installation

Once the installation is done, we can download this tool through the Go utility and then run it from anywhere.

go get github.com/tomnomnom/waybackurls
waybackurls -h

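Note that on newer Go releases (1.17 and later), go get no longer installs binaries, so the command above may fail with a deprecation error. In that case the usual replacement is go install with a version suffix (the resulting binary lands in $HOME/go/bin, which may need to be on your PATH):

go install github.com/tomnomnom/waybackurls@latest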


Done 🙂 !! Everything looks good and the time has come to test this tool. We only need to give the command a target URL that we want to crawl and that’s it. It will automatically fetch all the URLs and documents of the web application that are known to the Wayback Machine’s archive.

Usage 🙂 !! waybackurls < URL >

waybackurls testphp.vulnweb.com

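Because the results arrive on stdout, they pipe cleanly into other command-line tools. For example, a small sketch that keeps only archived JavaScript files (the .js filter is just an illustration, not part of the original walkthrough):

# keep only URLs ending in .js from the crawl results
waybackurls testphp.vulnweb.com | grep "\.js$"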

Exclude Subdomains

By default it automatically fetches all subdomains of a given domain and includes them in the scan as well, but if you only want to crawl the specific domain you give it, you can add “-no-subs” after the URL.

Usage 🙂 !! waybackurls < URL > -no-subs

waybackurls testphp.vulnweb.com -no-subs


Save Output

This tool has no dedicated flag for saving output, but if you want to save the output to a txt file, you can use shell redirection as in the following command.

Usage 🙂 !! waybackurls < URL > > < output file name >

waybackurls fintaxico.in > res.txt

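The archive often returns the same URL many times, so it can help to deduplicate the list before saving it (an optional extra step, not part of the original command):

waybackurls fintaxico.in | sort -u > res.txt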
About the Author: Shubham Goyal, Certified Ethical Hacker, information security analyst, penetration tester and researcher. He can be contacted on LinkedIn.
 
