The murky past of OnionCity

Onion.City was a search tool built on the Tor2web proxy system that allowed users to search underground markets directly from a clearnet browser, without installing or browsing through the Tor Browser.

Tor2web was developed by Aaron Swartz and Virgil Griffith. After Aaron's passing, Virgil continued maintaining Tor2web and launched OnionCity as a new way to search the dark web, with a Google-style results page. The engine indexes the dark web: a user enters a search term, OnionCity searches its database, and it returns all the results it was able to gather.

The search engine is powered by the Tor2web proxy, a project that enables clearnet internet users to access .onion domains.

This doesn’t remove the need for the Tor Browser: as many know, Tor2web has clearly stated that its proxy is not a secure browsing method. Think of the Tor2web proxy as a middleman between a clearnet browser, such as Apple’s Safari, and the hidden world of .onion sites. Another key difference with Onion.City is that results appear with a .city suffix appended to the usual .onion. The image above is a prime example: searching for a wiki returns results such as xxxx.onion.city rather than xxxx.onion.
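The suffix mapping described above can be sketched as a simple hostname rewrite. This is a minimal illustration, not Tor2web's actual code; the function name `to_clearnet` and the use of onion.city as the proxy domain are assumptions for the example.

```python
from urllib.parse import urlsplit, urlunsplit

def to_clearnet(url: str, proxy_domain: str = "onion.city") -> str:
    """Rewrite an .onion URL into a Tor2web-style clearnet URL.

    A hidden-service hostname like xxxx.onion becomes a subdomain of
    the proxy (xxxx.onion.city), so an ordinary browser can resolve it
    via normal DNS while the proxy fetches the page over Tor.
    """
    parts = urlsplit(url)
    host = parts.hostname or ""
    if not host.endswith(".onion"):
        return url  # not a hidden service; leave the URL untouched
    # Replace the trailing ".onion" with ".onion.<proxy_domain suffix>",
    # e.g. exampleabc.onion -> exampleabc.onion.city
    new_host = host[: -len(".onion")] + "." + proxy_domain
    return urlunsplit((parts.scheme, new_host, parts.path,
                       parts.query, parts.fragment))
```

Under this sketch, a link like http://exampleabc.onion/wiki would be served to clearnet users as http://exampleabc.onion.city/wiki.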

Because the dark web harbors a great deal of illicit material, such as child pornography and black markets, OnionCity blacklisted a number of domains from its search results to comply with United States law and Internet regulations. According to the OnionCity site, it had indexed approximately 664,000 sites.

During its early years of service, OnionCity operated over an insecure http:// connection, meaning web traffic could be intercepted and read. Many also voiced concern over the search engine's use of Google to help index the site and make its vast collection of pages searchable. Griffith responded that Google is suboptimal, but that he sees the search giant's help as a temporary ladder to get the ball rolling.

Virgil Griffith publicly shared his reasoning on the Tor Project's pipermail archives during an ethics investigation by the Tor Project:

Virgil Griffith:

“FWIW, Aaron Swartz was the one who chose the somewhat-odd subdomain structure of tor2web URLs. He chose this structure for the *explicit purpose* of making /robots.txt "just work". So we can put Aaron down in the column for "supports the robots.txt precedent". I find it peculiar that the position of the person to whom Tor 0.2.4.x was dedicated, on one of his signature projects, is considered so out-of-the-norm to attract an analogy to rape.”

Isis (Tor Contributor):

Perhaps, more explicitly, what we'd like to eliminate is people like you, Virgil. You've admitted publicly, in person, to several of our developers that you harvested HSDir data and then further attempted (unsuccessfully) to sell said data on users to INTERPOL and the Singaporean government.

Virgil Griffith:

As to Isis's suggestion that I have conspired to or was an accomplice in de-anonymizing Tor users: it is mistaken, against my values, and moreover lacks any evidence implying otherwise. The closest thing I do to this spurious charge is sell minimized logs (which, ironically, aims to protect user privacy from ad-networks). So here, let us concretize this---I emailed a day's worth of premium logs [249MB] to tor-assistants@ . I am totally fine going on record saying that this data is less damaging to privacy than Google Adsense or something similar.

The end of OnionCity

Concerns over OnionCity’s data logs and its minimal HTTPS encryption widened the rift between the Tor Project and Griffith. OnionCity let its domain registration lapse in 2019 and no longer serves as a proxy between users and Tor. Tor2web is now maintained and developed by Giovanni Pellerano of the Hermes Center for Transparency and Digital Human Rights as part of the GlobaLeaks project, with financial support from the Open Technology Fund.