
VPN.FAIL / FREE PROXY LIST UPDATED IN REAL-TIME + SCRAPE SCRIPT | SQL INJECTION

M@sterLog
Malware Analyst
Hello, vpn.fail/free-proxy provides a proxy list that is updated in real time. The page also offers a download option for grabbing the full proxy list.
OR NOT?
Actually, when you check the network requests, you can see that the page sends a ts value, the time parameter used to update the proxy list, i.e. to request proxies "based on timeline". You can modify that value well beyond the time range the download section makes available.
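Before running the full scraper, you can hit the endpoint once with a hand-picked ts value to confirm this behaviour. A minimal sketch, assuming the endpoint accepts a bare form-encoded ts field without the full browser header set (if it refuses, reuse the headers from the script below); the timestamp here is just an example:
Code:
import requests

# One-off probe of the update endpoint with a custom ts value.
url = 'https://vpn.fail/free-proxy/update'
ts_value = 1700000042  # example Unix timestamp, far older than the download range

response = requests.post(
    url,
    data={'ts': str(ts_value)},
    headers={
        'X-Requested-With': 'XMLHttpRequest',   # the page sends this; assumed to be enough on its own
        'Referer': 'https://vpn.fail/free-proxy',
    },
    timeout=15,
)
print(response.status_code)   # 200 means the server accepted the timestamp
print(response.text[:500])    # eyeball the payload for IP:PORT entries
If that prints proxy entries, the full script below just loops the same request over a range of timestamps and collects the results.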
Code:
import requests
import time
import re

url = 'https://vpn.fail/free-proxy/update'

# Headers copied from the captured browser request.
# Content-Length is omitted because requests sets it automatically.
headers = {
    'Host': 'vpn.fail',
    'Cookie': 'ci_session=4b80s7g8j6or8t5351bjvilcva; _ga_TZK290V6F3=GS1.1.1711875778.1.0.1711875778.0.0.0; _ga=GA1.1.1000176836.1711875778',
    'Sec-Ch-Ua': '"Not=A?Brand";v="99", "Chromium";v="118"',
    'Accept': 'application/json, text/javascript, */*; q=0.01',
    'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8',
    'X-Requested-With': 'XMLHttpRequest',
    'Sec-Ch-Ua-Mobile': '?0',
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/118.0.5993.90 Safari/537.36',
    'Sec-Ch-Ua-Platform': '"Windows"',
    'Origin': 'https://vpn.fail',
    'Sec-Fetch-Site': 'same-origin',
    'Sec-Fetch-Mode': 'cors',
    'Sec-Fetch-Dest': 'empty',
    'Referer': 'https://vpn.fail/free-proxy',
    'Accept-Encoding': 'gzip, deflate, br',
    'Accept-Language': 'en-US,en;q=0.9'
}

ts_value = 1700000042  # Initial ts value (Unix timestamp), taken from a request captured on 31.3.2024 (earliest value possible before crash == 1660000042)

while True:
    data = {'ts': str(ts_value)}
    response = requests.post(url, headers=headers, data=data)
    if response.status_code == 200:
        # Extract IP:PORT pairs using a regular expression
        proxies = re.findall(r'\b(?:[0-9]{1,3}\.){3}[0-9]{1,3}:[0-9]+\b', response.text)
        # Append the IP:PORT pairs to the file (created if it doesn't exist)
        with open("scraped.txt", "a") as file:
            for proxy in proxies:
                file.write(proxy + '\n')
        print("Proxies saved to scraped.txt!")
        # Remove duplicate lines from the file
        with open("scraped.txt", "r") as file:
            lines = set(file.readlines())
        # Rewrite the file with unique lines only
        with open("scraped.txt", "w") as file:
            file.writelines(lines)
    else:
        print('Request failed with status code:', response.status_code)
    ts_value += 5   # Move the timestamp forward by 5 seconds
    time.sleep(10)  # Wait 10 seconds before the next request
Remember, before using it, to change ts_value based on a network request captured close to the date you run it. If you get a 500 error, the server is overloaded with data (to put it simply): you can't scrape any more proxies past that point, because of the server configuration.
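If you want the loop to react to that instead of hammering the endpoint forever, here is a small sketch of one way to do it, reusing the url and headers defined in the script above (the retry count and sleep times are arbitrary choices of mine, not anything the site documents):
Code:
import time
import requests

def fetch_with_backoff(url, headers, ts_value, max_retries=3):
    # Try one ts value; retry a couple of times on HTTP 500 in case it is
    # transient, and return None so the caller can stop the scrape loop.
    for attempt in range(max_retries):
        response = requests.post(url, headers=headers, data={'ts': str(ts_value)})
        if response.status_code == 200:
            return response.text
        if response.status_code == 500:
            print(f"500 at ts={ts_value}, attempt {attempt + 1}/{max_retries}")
            time.sleep(30 * (attempt + 1))  # back off before retrying
        else:
            print('Unexpected status:', response.status_code)
            break
    return None
In the while loop you would then call fetch_with_backoff instead of requests.post directly and break once it returns None.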
+ one juicy extra: there is a SQL injection in that ts value.
Good luck, hunters
 
