
Bot Auto Visitor Website Using Proxy

Ade Yonatan


Tested on Linux Mint with Python version 2.6.6

This bot was originally coded by jimmyromanticdevil for PTC sites,

and I recoded it to visit websites.

# This code is for educational purposes only ;)
# coder : jimmyromanticdevil
# code for the Python tutorial [ Membuat Bot Auto Clicker (Building an Auto Clicker Bot) ]
import urllib2
import urllib
import sys
import time
import random
import re
import os
proxylisttext = "proxylist.txt"
useragent = ['Mozilla/4.0 (compatible; MSIE 5.0; SunOS 5.10 sun4u; X11)',
		   'Mozilla/5.0 (X11; U; Linux i686; en-US; rv: Gecko/20100207 Ubuntu/9.04 (jaunty) Namoroka/3.6.2pre',
		   'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Avant Browser;',
		   'Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)',
		   'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1)',
		   'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:',
		   'Microsoft Internet Explorer/4.0b1 (Windows 95)',
		   'Opera/8.00 (Windows NT 5.1; U; en)',
		   'amaya/9.51 libwww/5.4.0',
		   'Mozilla/4.0 (compatible; MSIE 5.0; AOL 4.0; Windows 95; c_athome)',
		   'Mozilla/4.0 (compatible; MSIE 5.5; Windows NT)',
		   'Mozilla/5.0 (compatible; Konqueror/3.5; Linux) KHTML/3.5.5 (like Gecko) (Kubuntu)',
		   'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0; ZoomSpider.net bot; .NET CLR 1.1.4322)',
		   'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; QihooBot 1.0 qihoobot@qihoo.net)',
		'Mozilla/4.0 (compatible; MSIE 5.0; Windows ME) Opera 5.11 [en]']

referer = ['http://google.com', 'http://bing.com', 'http://facebook.com', 'http://twitter.com']
link_invation = 'http://jadicontoh.com'

def Autoclicker(proxy1):
    proxy = proxy1.split(":")
    print 'Auto Click Using proxy :', proxy1
    try:
        proxy_set = urllib2.ProxyHandler({"http": "%s:%d" % (proxy[0], int(proxy[1]))})
        opener = urllib2.build_opener(proxy_set, urllib2.HTTPHandler)
        opener.addheaders = [('User-agent', random.choice(useragent)),
                             ('Referer', random.choice(referer))]
        f = opener.open(link_invation)  # use the proxy opener, not the default urlopen
        if "jadicontoh.com" in f.read():
            print "[*] Link visited successfully ..."
        else:
            print "[*] Link visit failed !"
            print "[!] Proxy failed"
    except Exception:
        print "[!] Proxy Error"

def loadproxy():
    try:
        get_file = open(proxylisttext, "r")
        proxylist = get_file.readlines()
        get_file.close()
        count = 0
        proxy = []
        while count < len(proxylist):
            proxy.append(proxylist[count].strip())
            count += 1
        for i in proxy:
            Autoclicker(i)
    except IOError:
        print "\n[-] Error: Check your proxylist path\n"

def main():
    print """
Simulation Bot Autoclicker
coder : jimmyromanticdevil
"""
    loadproxy()

if __name__ == '__main__':
    main()
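Python 2 and urllib2 are long past end of life. For anyone trying this on Python 3, the request path of the script above can be sketched roughly like this; the proxy string, UA pool, and function name are my placeholders, and the target URL is only reachable with a working proxy:

```python
import random
import urllib.request

# small placeholder pools; reuse the full lists from the script above
useragent = ['Mozilla/5.0 (X11; Linux x86_64)',
             'Opera/8.00 (Windows NT 5.1; U; en)']
referer = ['http://google.com', 'http://bing.com']

def build_opener_for(proxy):
    # proxy is a "host:port" string, e.g. "127.0.0.1:8080"
    host, port = proxy.split(":")
    proxy_set = urllib.request.ProxyHandler({"http": "%s:%d" % (host, int(port))})
    opener = urllib.request.build_opener(proxy_set, urllib.request.HTTPHandler)
    # random User-Agent and Referer per opener, as in the Python 2 version
    opener.addheaders = [('User-agent', random.choice(useragent)),
                         ('Referer', random.choice(referer))]
    return opener

# usage (needs a live proxy): build_opener_for('1.2.3.4:8080').open('http://jadicontoh.com')
```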

Don't forget to change the text jadicontoh.com to your own website, and build a proxylist.txt for your proxy list.
If you don't have a proxy list, you can copy-paste one from http://spys.ru/en/free-proxy-list, and you can clean up the pasted text with this code:

<?php
$lines = file('mentah.txt');

foreach ($lines as $line_num => $line) {
    $data = explode(' ', $line);
    $data = $data[1] . '';
    $cek = explode('HTTP', $data);
    $cek = $cek[0] . '';
    echo "$cek";
    echo "<br>";
}
?>

Copy-paste all the content on spys.ru and save it to a text file named mentah.txt.

That way you get a clean proxy list without copy-pasting the entries one by one.
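If you would rather stay in Python than run the PHP snippet, the same cleanup can be sketched like this; I'm assuming the pasted spys.ru lines keep the proxy in the second space-separated field, glued to the word HTTP, exactly as the PHP code does:

```python
def clean_proxy_line(line):
    # mirror the PHP snippet: take the second space-separated field,
    # then drop everything from 'HTTP' onward
    parts = line.split(' ')
    if len(parts) < 2:
        return ''
    return parts[1].split('HTTP')[0].strip()

def clean_proxy_file(path='mentah.txt'):
    # returns one "host:port" entry per usable line
    with open(path) as f:
        return [p for p in (clean_proxy_line(l) for l in f) if p]
```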


  • 4 weeks later...

Great script. I like it.

I might add a function to randomize the order of the proxies and the times each is used within a given 24-hour period, so the hits look more like unique visitors.

Maybe run a random integer against the total line count to read and set a random proxy, or load all of the proxies into an array and use something like a shuffle function to change up the order, kind of like you did with the referrer:

('Referer', random.choice(referer))]

I'm not up on Python, but here's something like what I'm talking about.

Adding a sleep for a random amount of time between page hits is also not a bad idea.
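A minimal Python sketch of that idea, shuffling the proxy list and sleeping a random interval between hits; the function name and the wait range are placeholders of mine:

```python
import random
import time

def visit_shuffled(proxies, visit, min_wait=5, max_wait=30):
    # copy, then shuffle in place, so each run walks the proxies in a fresh order
    order = list(proxies)
    random.shuffle(order)
    for proxy in order:
        visit(proxy)
        # random pause between page hits so the timing looks less bot-like
        time.sleep(random.uniform(min_wait, max_wait))
```

Here `visit` would be the `Autoclicker` function from the script above.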


Having a bunch of hits from a long list of IPs in the same order every day is going to be a dead giveaway that it's bot traffic. But nonetheless, good idea.

Edited by vailixi

I created a script some time ago to do the same thing using Java. It has the ability to randomize the User-Agent.

I meant to go back and finish it by randomizing some other headers, but never did. Still, I would definitely recommend randomizing the User-Agent, because it can be used as a unique identifier to link all requests coming from the same "browser".
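In Python terms, the per-request User-Agent rotation described above is just a `random.choice` over a pool each time the headers are built; the pool entries below are placeholders:

```python
import random

USER_AGENTS = [
    'Mozilla/5.0 (X11; Linux x86_64)',            # placeholder entries
    'Mozilla/5.0 (Windows NT 10.0; Win64; x64)',
    'Opera/8.00 (Windows NT 5.1; U; en)',
]

def random_headers():
    # a fresh User-Agent on every call, so no single UA string
    # links all requests back to one "browser"
    return {'User-Agent': random.choice(USER_AGENTS)}
```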

