
Inspecting Deep Web Links

December 30, 2016 by Nikolaos Kamarinakis

Introduction

If you have ever decided to explore what on earth someone can find on the so-called 'Deep Web,' then you have probably gone through the pain of discovering that more than half the links available online do not work.

The majority of people will look for 'hidden wikis,' which contain lists of deep web (.onion) URLs, and then use the Tor Browser to discover whether each site actually provides the 'services' the wiki claims it does. These wikis contain links to social networks, forums, and blogs, as well as sites selling drugs or offering hitman and hacking services.


This is all great for gaining awareness of all the crazy madness found on the trails of the Deep Web. The problem, however - and trust me, it is a big one - is that in my experience more than 70% of the links will not work. We clearly want a way to check whether the sites are active before visiting them one by one and realizing that most of them are not - so, let's go ahead and do that.

Action

We are going to use a tool I recently developed called ONIOFF - a simple tool, written in pure Python, for inspecting Deep Web URLs (or onions). Make sure Tor is running on your device ($ service tor start), then download ONIOFF by cloning the Git repo and installing its requirements:

$ git clone https://github.com/k4m4/onioff.git

$ cd onioff

$ pip install -r requirements.txt
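Before going any further, it is worth confirming that Tor's SOCKS proxy is actually up and listening. A minimal sketch of such a check (assuming Tor's default SOCKS port, 9050, on localhost - adjust if your torrc says otherwise):

```python
import socket

def tor_socks_reachable(host="127.0.0.1", port=9050):
    """Return True if something is accepting connections on Tor's SOCKS port."""
    try:
        with socket.create_connection((host, port), timeout=3):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("Tor SOCKS port reachable:", tor_socks_reachable())
```

If this prints False, start (or restart) the Tor service before moving on.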

Next, we want to create a flat (.txt) file containing a number of onion links, one per line (the exact amount is up to you, obviously). To do so, we can use hidden wikis such as thehiddenwiki.org or torhiddenwiki.com and extract all the links into our text file.
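Extracting the links does not have to be done by hand. A quick sketch of scraping onion addresses out of a saved wiki page with a regular expression (the 16-character pattern matches the v2 onion addresses in use at the time of writing; the file names here are purely illustrative):

```python
import re

# Matches v2 onion addresses: 16 base32 characters followed by ".onion".
ONION_RE = re.compile(r"\b[a-z2-7]{16}\.onion\b")

def extract_onions(text):
    """Return the unique .onion hostnames found in a blob of HTML/text."""
    return sorted(set(ONION_RE.findall(text)))

if __name__ == "__main__":
    # Hypothetical file names: a saved hidden-wiki page in, a link list out.
    with open("hiddenwiki.html") as f:
        links = extract_onions(f.read())
    with open("onions.txt", "w") as f:
        f.write("\n".join(links) + "\n")
```

The resulting onions.txt is exactly the kind of line-separated link file ONIOFF expects.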

Subsequently, we want to use ONIOFF to inspect our links and see which of them are active and running. For each site that is indeed up, ONIOFF will also give us its title. Without any further ado, let's run it:

$ python onioff.py -f [Link File] -o [Report File]

Once we have executed our command, all active sites and their titles will be displayed in our CLI, and a report will be saved into our output (-o) file.
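For the curious, the core of what a check like this involves can be sketched as follows: request each onion through Tor's SOCKS proxy and, if it responds, pull out the page title. This is a rough illustration, not ONIOFF's actual code; it assumes `requests` with SOCKS support (`pip install requests[socks]`) and Tor listening on 127.0.0.1:9050:

```python
import re
import requests

# socks5h (note the "h") makes Tor resolve the .onion hostname itself.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def parse_title(html):
    """Extract the contents of the first <title> tag, if any."""
    match = re.search(r"<title[^>]*>(.*?)</title>", html,
                      re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else "(no title)"

def check_onion(url, timeout=30):
    """Return (is_up, title) for a single onion URL."""
    try:
        resp = requests.get(url, proxies=TOR_PROXIES, timeout=timeout)
        return True, parse_title(resp.text)
    except requests.RequestException:
        return False, None

if __name__ == "__main__":
    # Hypothetical address, for illustration only.
    up, title = check_onion("http://example.onion")
    print("active" if up else "inactive", title)
```

Onion sites are often slow, so a generous timeout (and a graceful failure path) matters more here than on the clearnet.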


Conclusion

Now we are aware of all the sites that actually work and can thus explore the Deep Web with ease, without having to go through the struggle of inactive .onion URLs! I hope you found this useful. Go ahead and explore all the weird things you can find on the 'Deep' part of the Web - and remember, always stay on the white side.

Nikolaos Kamarinakis

Nikolaos Kamarinakis is an Information Security Researcher and a member of the Greek Cyber Security Team. He enjoys contributing to the open source community, building InfoSec tools, and writing about cybersecurity matters. You can contact him at nikolaskam{at}gmail{dot}com, check out his work at github.com/k4m4, and follow him on Twitter at @nikolas_kama.