Search Social Networks for free username - Moriarty part-1
decentralizuj
Posted on April 1, 2021
You can find this gem on GitHub and rubygems. Sharing is the best way to support my development. This is part of a multi-part series about the moriarty gem and its creation, from script to gem. It's intended for beginners.
Moriarty - Tool to check social networks for available username
GIF preview v-0.2.0 - without latest changes
The gem is still in alpha; many things will be added (like scraping info from social networks).
Breaking changes are highly possible, so I do not suggest using this gem except for development and testing.
That means: do not use it in production services. Otherwise, feel free to play with it.
Moriarty
Tool to check social networks for an available username.
Idea from the Python tool Sherlock.
What it does
Searches multiple social networks for a free username.
It's the reverse of Sherlock, so a not-found username counts as success.
The --hunt argument runs the search the way Sherlock would, looking for registered users.
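To make the reversal concrete, here's a tiny sketch of the logic (my own illustrative helper, not the gem's actual API):

```ruby
# By default, "user not found" counts as success (the username is free).
# With hunt: true the logic flips to Sherlock's behaviour: a found
# profile is the success case.
def search_success?(user_found, hunt: false)
  hunt ? user_found : !user_found
end

search_success?(false)             # => true  (username is free)
search_success?(true, hunt: true)  # => true  (Sherlock-style hit)
```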
How to install
Clone repo and install dependencies:
# Moriarty uses the rest-client, nokogiri and colorize gems
git clone https://github.com/decentralizuj/moriarty.git && cd moriarty && bundle install
Or install from rubygems:
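The command itself is missing here; assuming the gem is published on rubygems.org under the same name as the repository, it would be:

```shell
# assumption: gem name matches the repo name
gem install moriarty
```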
DISCLAIMER
The GitHub repository is updated before Rubygems.
Before v-1.0.0 is released, the recommended way is to clone the repo,…
After using Sherlock for a long time (I like it!), I decided to create something similar in Ruby. This gives me more power to easily extend the software with new features, like scraping info from a website, comments from FB, tags from photos, etc. For me, Ruby is far above Python (because of developer happiness, and the many ways to do the same thing). When I say "above" Python, I'm talking about personal experience, not about Ruby vs. Python.
Start Project
What the script should do:
take username
take website url
submit request
receive response
For the HTTP request I'll go with the rest-client gem, and here's why. The RestClient response is used to scrape the page HTML with the Nokogiri::HTML method. If we used Net::HTTP#get instead of RestClient#get, we would get all positive results, because Net::HTTP returns error responses instead of raising. With rest-client in use, the request fails when there's no page to scrape (RestClient raises an exception on error responses such as 404), so I used rescue to set @success to false. This doesn't work for all sites, since some return a 200 page even when the user doesn't exist, so I'll need to check the scraped data in the future.
```ruby
#!/usr/bin/env ruby

require 'rest-client'
require 'nokogiri'

class Moriarty
  # initialize new object, set user and url
  # add https and '/' to the end
  def initialize(name = '', site = 'github.com', type = :https)
    @user = name.to_s
    @url  = type.to_s + '://' + site.to_s
    @url += '/' unless @url.end_with?('/')
  end

  # make request to url
  # accept optional options hash
  def go(opt = {})
    # use initialized data or enter new
    opt[:user] ||= @user
    opt[:site] ||= @url
    # construct url from sitename and username
    uri = opt[:site].to_s + opt[:user].to_s
    # make request with rest-client
    @response = RestClient.get uri
    # get page html with nokogiri
    @html = Nokogiri::HTML @response
    # if everything is fine, set and return true
    # otherwise, set and return false
    return @success = true
  rescue
    return @success = false
  end
end
```
Part of the job is done. But we must somehow access these variables, so we need to define attr_reader (we will create setter methods manually).
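A minimal sketch of what that could look like, reopening the class. The exact reader list and the success? predicate are my assumptions (the usage example calls success?, so some predicate like this must exist in the gem):

```ruby
class Moriarty
  # read-only access from outside; setters will be written by hand later
  attr_reader :user, :url, :success, :response, :html

  # convenience predicate so callers can write jim.success?
  # instead of checking jim.success directly (illustrative name)
  def success?
    @success == true
  end
end
```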
```ruby
# This way we can check GitHub, dev, instagram, but we can't
# check linkedin etc... This is because those pages return
# data to nokogiri to scrap, even if no user
@jim = Moriarty.new('moriarty', 'github.com')
@jim.go

if @jim.success?
  puts "#{@jim.user} is registered on #{@jim.url}"
else
  puts "Username #{@jim.user} seems to be free on #{@jim.url}"
end
```
This post is part of a multi-part series. In the next part I will add all the other methods from the gem, and in the third I will create the Ruby gem itself. Last, but not least, I will explain the CLI interface.
With the next updates I will add methods to scrape data from the HTML and check whether the user is really registered or a false positive. If registered, the data will be saved in some kind of database (I like CSV for this kind of data, or just a .txt file).
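Since CSV is mentioned, here's a minimal sketch using Ruby's built-in CSV library of how results could be appended to a file. The file name and column layout are my own illustrative choices, not the gem's format:

```ruby
require 'csv'

# one row per check: username, site url, registered?
results = [
  ['moriarty', 'https://github.com/', true],
  ['moriarty', 'https://linkedin.com/', false]
]

# append mode, so repeated runs keep accumulating results
CSV.open('results.csv', 'a') do |csv|
  results.each { |row| csv << row }
end
```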