
The Dark Web: here are some tools

TorBot
TorBot probably is and will continue to be overkill. As of this writing, the last update to TorBot was in February. It uses Python 3.x and requires a Tor dependency. TorBot has a list of features that makes it useful for multiple applications (a few of them are sketched after the list). Features include:
  • Onion Crawler (.onion) (Completed)
  • Returns page title and address with a short description (Partially Completed)
  • Save links to database (PR to be reviewed)
  • Get emails from site (Completed)
  • Save crawl info to JSON file (Completed)
  • Crawl custom domains (Completed)
  • Check if the link is live (Completed)
  • Built-in Updater (Completed)
  • Visualizer module (Not started)
  • Social Media integration (Not started) … (will be updated)
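To make that list a little more concrete, here is a minimal sketch of my own (not TorBot’s code) covering a few of the items: fetch a page through Tor’s SOCKS proxy, check that it’s live, grab the title, pull email addresses, and save the results to JSON. It assumes Tor is listening locally on port 9050 and that requests[socks] and beautifulsoup4 are installed; the .onion address is a placeholder.
Code:
import json
import re

import requests
from bs4 import BeautifulSoup

TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",   # socks5h: let Tor resolve .onion names
    "https": "socks5h://127.0.0.1:9050",
}
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def inspect_onion(url, timeout=30):
    """Return basic crawl info (liveness, title, emails) for one hidden service."""
    info = {"url": url, "live": False, "title": None, "emails": []}
    try:
        resp = requests.get(url, proxies=TOR_PROXIES, timeout=timeout)
        info["live"] = resp.ok
        soup = BeautifulSoup(resp.text, "html.parser")
        if soup.title and soup.title.string:
            info["title"] = soup.title.string.strip()
        info["emails"] = sorted(set(EMAIL_RE.findall(resp.text)))
    except requests.RequestException:
        pass  # unreachable or dead service; leave live=False
    return info

if __name__ == "__main__":
    result = inspect_onion("http://exampleonionaddress.onion")  # placeholder
    with open("crawl_info.json", "w") as fh:
        json.dump(result, fh, indent=2)
    print(result)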
What I appreciate about TorBot is how ambitious the project is. There is an exciting laundry list of promised features currently being worked on (the connection-randomization idea is sketched after the list), including:
  • Visualization Module
  • Implement BFS Search for webcrawler
  • Multithreading for Get Links
  • Improve stability (handle errors gracefully, expand test coverage, etc.)
  • Create a user-friendly GUI
  • Randomize Tor Connection (Random Header and Identity)
  • Keyword/Phrase search
  • Social Media Integration
  • Increase anonymity and efficiency
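The Randomize Tor Connection item is something you can already experiment with on your own. The sketch below is my own rough take, not TorBot’s implementation: it asks the Tor control port for a new identity via stem and sends a random User-Agent with each request. It assumes Tor’s ControlPort is enabled on 9051 with cookie or password auth configured in torrc, plus stem and requests[socks] installed.
Code:
import random

import requests
from stem import Signal
from stem.control import Controller

USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101 Firefox/102.0",
    "Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0",
]
TOR_PROXIES = {"http": "socks5h://127.0.0.1:9050",
               "https": "socks5h://127.0.0.1:9050"}

def new_tor_identity(control_port=9051, password=None):
    """Ask Tor to build fresh circuits, i.e. a new 'identity'."""
    with Controller.from_port(port=control_port) as controller:
        controller.authenticate(password=password)  # cookie auth if password is None
        controller.signal(Signal.NEWNYM)            # request new circuits

def fetch_with_random_header(url):
    """GET a page over Tor with a randomly chosen User-Agent."""
    headers = {"User-Agent": random.choice(USER_AGENTS)}
    return requests.get(url, proxies=TOR_PROXIES, headers=headers, timeout=30)

# Example (placeholder address): rotate identity, then fetch.
# new_tor_identity()
# resp = fetch_with_random_header("http://exampleonionaddress.onion")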
Make sure to put in the time to get this script running.
Fresh Onions
Fresh Onions is a tool that hasn’t been updated in a while. As a disclaimer, you may have issues running the script, as the last GitHub push was in 2017. However, even as an academic piece of what is possible on the dark web using Python, it’s worth taking a look at the features this tool offers, or once offered. Here’s a list of the features:
  • Crawls the darknet looking for new hidden services
  • Find hidden services from a number of clearnet sources
  • Optional fulltext elasticsearch support
  • Marks clone sites of the /r/darknet superlist
  • Finds SSH fingerprints across hidden services
  • Finds email addresses across hidden services
  • Finds bitcoin addresses across hidden services
  • Shows incoming / outgoing links to onion domains
  • Up-to-date alive / dead hidden service status
  • Portscanner
  • Search for “interesting” URL paths, useful 404 detection
  • Automatic language detection
  • Fuzzy clone detection (requires elasticsearch, more advanced than superlist clone detection)
  • Doesn’t fuck around in general.
That last feature is very important. Now, this spider is hosted on the dark web, and the developer has provided the onion URL; you can view it on the GitHub repo. If you can get this tool to work, you can combine it with other databasing tools, like the ones previously discussed, to create your own archive. I don’t have a ton of experience with this tool, though many in the OSINT community have commented on its value. Make sure to check it out as you sift through this list!
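As a rough illustration of that combine-it-with-a-database idea (my own sketch, not Fresh Onions’ schema), here is how you might scan page text your crawler has already fetched for email and bitcoin addresses, two of the artifacts Fresh Onions hunts for, and archive them in a local SQLite file. The table and column names are made up for the example.
Code:
import re
import sqlite3

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
# Legacy (1.../3...) and bech32 (bc1...) bitcoin address patterns, loosely matched.
BTC_RE = re.compile(r"\b(?:[13][a-km-zA-HJ-NP-Z1-9]{25,34}|bc1[a-z0-9]{25,59})\b")

def archive_artifacts(db_path, onion_url, page_text):
    """Store any email/bitcoin artifacts found in page_text, keyed by onion URL."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS artifacts "
        "(onion TEXT, kind TEXT, value TEXT, UNIQUE(onion, kind, value))"
    )
    rows = [(onion_url, "email", e) for e in set(EMAIL_RE.findall(page_text))]
    rows += [(onion_url, "bitcoin", b) for b in set(BTC_RE.findall(page_text))]
    conn.executemany("INSERT OR IGNORE INTO artifacts VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()

# archive_artifacts("darkweb_archive.db", "http://exampleonionaddress.onion", html_text)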
Onioff
Once you’ve created a database of hidden services and onion domains on Tor, you need to inspect them to avoid exposing yourself to malicious material, or worse.
Onioff is an onion URL inspector for deep web links. The developer was kind enough to record a demo at https://asciinema.org/a/87557 (so I don’t have to!). It’s also worth noting that the developer (https://nikolaskama.me/) is a high school student, and his other projects are pretty awesome, high-level stuff as well.
Make sure you know what you’re about to open before you open a link.
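Onioff’s own code isn’t reproduced here, but the look-before-you-click idea can be sketched in a few lines: over Tor, grab only the HTTP status and the page title (reading at most a few kilobytes) so you can judge an onion link without rendering the full page in a browser. Tor on 127.0.0.1:9050 is assumed, and the peek() helper is just a name I made up.
Code:
import re

import requests

TOR_PROXIES = {"http": "socks5h://127.0.0.1:9050",
               "https": "socks5h://127.0.0.1:9050"}
TITLE_RE = re.compile(rb"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)

def peek(url, max_bytes=8192):
    """Return (status_code, title) for an onion URL without a full download."""
    with requests.get(url, proxies=TOR_PROXIES, timeout=30, stream=True) as resp:
        chunk = next(resp.iter_content(chunk_size=max_bytes), b"")
        match = TITLE_RE.search(chunk)
        title = match.group(1).decode("utf-8", "replace").strip() if match else None
        return resp.status_code, title

# print(peek("http://exampleonionaddress.onion"))  # placeholder address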
TorCrawl
Now, if you’re looking for a powerful, robust tool with a really good wiki, I’m bringing up the rear with TorCrawl.
TorCrawl not only crawls hidden services on Tor, it also extracts the code of each service’s webpage. Installation is pretty standard: clone the repo and install the requirements with pip. However, if you don’t already have the Tor service installed, the wiki provides a link to instructions on how to do that. If you’ve made it this far without doing that already, bravo.
So, what is this useful for? In a world with infinite time, you could set up and run TorBot, figure out how to get everything running, and have a reliable tool that will consistently get new DLCs. In a semi-perfect world, you’d have the time to database services with subscriptions, manual tools, and Fresh Onions, then inspect each onion webpage for possible malicious content, then manually inspect each page for your investigation.
But it’s not a perfect world, and in most cases the Pareto Principle applies: you have to get the most work done in the least amount of time. So instead of worrying about crawling, then inspection, then investigation, do it all in one with TorCrawl. You get the webpage markup so you can view the content without having to physically access the page. You can also view the static webpage by saving it as an .html file.
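This isn’t TorCrawl’s CLI, just a sketch of the payoff described above: pull a hidden service’s markup over Tor and write it to a static .html file you can review offline instead of browsing the live page. Again, Tor on 127.0.0.1:9050 is assumed and the save_markup() helper is made up for the example.
Code:
from urllib.parse import urlparse

import requests

TOR_PROXIES = {"http": "socks5h://127.0.0.1:9050",
               "https": "socks5h://127.0.0.1:9050"}

def save_markup(url, out_dir="."):
    """Fetch a hidden service's HTML over Tor and save it as a static file."""
    resp = requests.get(url, proxies=TOR_PROXIES, timeout=30)
    resp.raise_for_status()
    name = (urlparse(url).hostname or "page").replace(".", "_") + ".html"
    path = f"{out_dir}/{name}"
    with open(path, "w", encoding="utf-8") as fh:
        fh.write(resp.text)
    return path

# save_markup("http://exampleonionaddress.onion")  # placeholder address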
Check out TorCrawl and see which process you prefer. I haven’t spent enough time to consistently compare each tool/process, nor have I seen how each tool performs over time. As of this writing, TorCrawl was last updated 25 days ago. It’s by far the most recently updated tool, which gives me even more confidence in the tool and its developer.
Source (adapted):
Code:
https://onehack.us/t/dark-web-tools-inspecting-the-dark-web/65356