A modified version of https://github.com/WangYihang/SourceLeakHacker that can scan multiple websites in one run.
 
Usage:

```
usage: infoFiles.py [-h] [--file FILE] [--url URL] [--save]

Sensitive information file

optional arguments:
  -h, --help            show this help message and exit
  --file FILE, -f FILE  url file
  --url URL, -u URL     may be you only want to test one url
  --save, -s            save scan infos
```
 
-f specifies a file of URLs, -u specifies a single URL, and -s saves the scan results.
4/10: still a poor man's version with no multithreading.
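A minimal sketch of how multithreading could later be bolted on with the standard library's `concurrent.futures`; the `check` function here is a hypothetical stand-in for the real per-URL request, not code from the script:

```python
from concurrent.futures import ThreadPoolExecutor

def check(url):
    # Hypothetical stand-in for the real per-URL HEAD request.
    return (url, 200)

urls = ["http://a.example/", "http://b.example/", "http://c.example/"]

# Scan the URLs concurrently; pool.map preserves the input order.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(check, urls))
```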
```python
class ColorPrinter:
    def print_red_text(self, content):
        print("\033[1;31;40m %s \033[0m" % (content), end='')

    def print_green_text(self, content):
        print("\033[1;32;40m %s \033[0m" % (content), end='')

    def print_blue_text(self, content):
        print("\033[1;34;40m %s \033[0m" % (content), end='')
```
 
Defines a class that prints text in different colors. Note: the ANSI escape codes work on Linux terminals.
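To see what the escape sequence actually emits, the red printer can be captured with `contextlib.redirect_stdout` (the `[ 403 ]` sample text is mine):

```python
import io
import contextlib

class ColorPrinter:
    def print_red_text(self, content):
        # ANSI escape: bold, red foreground, black background.
        print("\033[1;31;40m %s \033[0m" % (content), end='')

# Capture the output instead of writing it to the terminal.
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    ColorPrinter().print_red_text("[ 403 ]")
captured = buf.getvalue()
```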
```python
def urlFormater(url):
    if (not url.startswith("http://")) and (not url.startswith("https://")):
        url = "http://" + url
    if not url.endswith("/"):
        url += "/"
    return url
```
 
Normalizes the given URL into the form http(s)://example.com/, adding a scheme and a trailing slash when they are missing.
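The same function applied to a few sample inputs (the function is copied here so the snippet runs standalone):

```python
def urlFormater(url):
    # Prepend http:// when no scheme is given, append a trailing slash.
    if (not url.startswith("http://")) and (not url.startswith("https://")):
        url = "http://" + url
    if not url.endswith("/"):
        url += "/"
    return url

print(urlFormater("example.com"))          # http://example.com/
print(urlFormater("https://example.com"))  # https://example.com/
```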
```python
parse = argparse.ArgumentParser(description="Sensitive information file")
parse.add_argument('--file', '-f', help='url file')
parse.add_argument('--url', '-u', help='may be you only want to test one url')
parse.add_argument('--save', '-s', action='store_true', help='save scan infos')
args = parse.parse_args()
```
 
Imports the argparse module to handle command-line arguments: -f specifies a file of URLs, -u a single URL, and -s saves the scan results.
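Feeding the same parser a sample argv (instead of the real command line) shows how the three flags land on `args`; the `-u example.com -s` invocation is just an illustration:

```python
import argparse

parse = argparse.ArgumentParser(description="Sensitive information file")
parse.add_argument('--file', '-f', help='url file')
parse.add_argument('--url', '-u', help='may be you only want to test one url')
parse.add_argument('--save', '-s', action='store_true', help='save scan infos')

# Parse a sample command line instead of sys.argv.
args = parse.parse_args(['-u', 'example.com', '-s'])
```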
```python
with open('list.txt', 'r') as listFile:
    for i in listFile:
        i = i.replace("\n", "").replace("\r", "")
        if "?" in i:
            # Entries containing '?' are templates: fill the placeholder
            # with every filename from file.txt.
            with open('file.txt', 'r') as fileFile:
                for j in fileFile:
                    j = j.replace("\n", "").replace("\r", "")
                    urls.append(website + i.replace("?", j))
        else:
            urls.append(website + i)
```
 
Reads the names of commonly leaked files from list.txt; if an entry contains a "?", it is treated as a template and the "?" is replaced with each filename from file.txt.
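A self-contained sketch of that expansion, with in-memory lists standing in for list.txt and file.txt (the sample entries are mine, not from the tool's wordlists):

```python
# Hypothetical sample entries standing in for list.txt and file.txt.
list_entries = [".git/config", "?.bak", "robots.txt"]
file_entries = ["index.php", "config.php"]

website = "http://example.com/"
urls = []
for i in list_entries:
    if "?" in i:
        # '?' marks a template: substitute every filename from file.txt.
        for j in file_entries:
            urls.append(website + i.replace("?", j))
    else:
        urls.append(website + i)
```

So one `?.bak` template fans out into `index.php.bak` and `config.php.bak`.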
```python
response = requests.head(url, timeout=timeout)
code = response.status_code
logs.append(str(code) + ',' + url)
if code == 200:
    colorPrinter.print_green_text("[ " + str(code) + " ]")
    print("Checking : " + url)
    # A HEAD response carries no body, so fetch the page with GET
    # before checking for a soft 404.
    if "404" in requests.get(url, timeout=timeout).text:
        colorPrinter.print_blue_text(url + "\tMaybe every page same!")
elif code == 404 or code == 405:
    pass
else:
    colorPrinter.print_red_text("[ " + str(code) + " ]")
    print("Checking : " + url)
```
 
Checks each URL's response status, prints the result, and records it in the log.
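The branching above boils down to a three-way classification of the status code, which can be checked without any network traffic; the `classify` helper is my naming, not a function in the script:

```python
def classify(code):
    # Hypothetical helper mirroring the branch logic of the scanner.
    if code == 200:
        return "hit"       # printed in green, then soft-404 checked
    if code in (404, 405):
        return "skip"      # silently ignored
    return "notable"       # anything else is printed in red
```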
Code: https://github.com/No4l/MyTools/tree/master/SourceLeakHacker%20plus