Domains Blocklist for Squid-Cache
BlackWeb is a project that collects and unifies public blocklists of domains (porn, downloads, drugs, malware, spyware, trackers, bots, social networks, warez, weapons, etc.) to make them compatible with Squid-Cache.
| ACL | Blocked Domains | File Size |
| --- | --- | --- |
| blackweb.txt | 4,927,229 | 123.7 MB |
git clone --depth=1 https://github.com/maravento/blackweb.git
blackweb.txt is already updated and optimized for Squid-Cache. Download it, unzip it in the path of your preference, and activate the Squid-Cache rule.
wget -q -c -N https://raw.githubusercontent.com/maravento/blackweb/master/blackweb.tar.gz && cat blackweb.tar.gz* | tar xzf -
Alternatively, use the following script, which falls back to downloading and rebuilding multipart archives when the main file is not found:
#!/bin/bash
# Variables
url="https://raw.githubusercontent.com/maravento/blackweb/master/blackweb.tar.gz"
wgetd="wget -q -c --timestamping --no-check-certificate --retry-connrefused --timeout=10 --tries=4 --show-progress"
# TMP folder
output_dir="bwtmp"
mkdir -p "$output_dir"
# Download
if $wgetd "$url"; then
echo "File downloaded: $(basename $url)"
else
echo "Main file not found. Searching for multiparts..."
# Multipart suffixes from aa to zz
all_parts_downloaded=true
for part in {a..z}{a..z}; do
part_url="${url%.*}.$part"
if $wgetd "$part_url"; then
echo "Part downloaded: $(basename $part_url)"
else
echo "Part not found: $part"
all_parts_downloaded=false
break
fi
done
if $all_parts_downloaded; then
# Rebuild the original file in the current directory
cat blackweb.tar.gz.* > blackweb.tar.gz
echo "Multipart file rebuilt"
else
echo "Multipart process cannot be completed"
exit 1
fi
fi
# Unzip the file to the output folder
tar -xzf blackweb.tar.gz -C "$output_dir"
echo "Done"
# Download the published checksum and print both hashes for comparison
wget -q -c -N https://raw.githubusercontent.com/maravento/blackweb/master/checksum.md5
md5sum blackweb.txt | awk '{print $1}' && awk '{print $1}' checksum.md5
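The two commands above print the local and published hashes for manual comparison. A minimal sketch that automates the check (assuming blackweb.txt and checksum.md5 are in the current directory, as in the steps above):

```shell
#!/bin/bash
# Hedged sketch: compare the MD5 of the local blackweb.txt against the
# first field of checksum.md5. Both filenames come from the steps above.
verify_checksum() {
  local local_hash remote_hash
  local_hash=$(md5sum "$1" | awk '{print $1}')
  remote_hash=$(awk '{print $1; exit}' "$2")
  if [ "$local_hash" = "$remote_hash" ]; then
    echo "OK: checksums match"
  else
    echo "FAIL: checksum mismatch" >&2
    return 1
  fi
}
# Usage: verify_checksum blackweb.txt checksum.md5
```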
Edit:
/etc/squid/squid.conf
And add the following lines:
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
# Block Rule for Blackweb
acl blackweb dstdomain "/path_to/blackweb.txt"
http_access deny blackweb
BlackWeb contains millions of domains, therefore it is recommended to:
Use allowdomains.txt to exclude essential domains or subdomains, such as .accounts.google.com, .yahoo.com, .github.com, etc. According to Squid's documentation, the subdomains accounts.google.com and accounts.youtube.com may be used by Google for authentication within its ecosystem. Blocking them could disrupt access to services like Gmail, Drive, Docs, and others.
acl allowdomains dstdomain "/path_to/allowdomains.txt"
http_access allow allowdomains
Use blockdomains.txt to add domains not included in blackweb.txt (e.g., .youtube.com, .googlevideo.com, .ytimg.com, etc.).
acl blockdomains dstdomain "/path_to/blockdomains.txt"
http_access deny blockdomains
Use blocktlds.txt to block unwanted gTLDs, sTLDs, ccTLDs, etc.
acl blocktlds dstdomain "/path_to/blocktlds.txt"
http_access deny blocktlds
Input:
.bardomain.xxx
.subdomain.bardomain.xxx
.bardomain.ru
.bardomain.adult
.foodomain.com
.foodomain.porn
Output:
.foodomain.com
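As a rough illustration (outside Squid itself), the same filtering can be reproduced with grep: entries ending in a blocked TLD are denied and only the rest pass. The TLD set below mirrors the sample input above and is purely illustrative.

```shell
#!/bin/bash
# Hedged illustration: approximate what a blocktlds.txt containing
# .xxx, .ru, .adult and .porn would deny; only remaining domains pass.
allowed_after_tld_block() {
  grep -Ev '\.(xxx|ru|adult|porn)$'
}
# Usage: allowed_after_tld_block < domains.txt
```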
Use this rule to block Punycode (RFC 3492) and non-ASCII IDN TLDs or domains, to prevent IDN homograph attacks. For more information, visit welivesecurity: Homograph attacks.
acl punycode dstdom_regex -i \.xn--.*
http_access deny punycode
Input:
.bücher.com
.mañana.com
.google.com
.auth.wikimedia.org
.xn--fiqz9s
.xn--p1ai
ASCII Output:
.google.com
.auth.wikimedia.org
Use this rule to block words (Optional. Can generate false positives).
# Download ACL:
sudo wget -P /etc/acl/ https://raw.githubusercontent.com/maravento/vault/refs/heads/master/blackword/blockwords.txt
# Squid Rule to Block Words:
acl blockwords url_regex -i "/etc/acl/blockwords.txt"
http_access deny blockwords
Input:
.bittorrent.com
https://www.google.com/search?q=torrent
https://www.google.com/search?q=mydomain
https://www.google.com/search?q=porn
.mydomain.com
Output:
https://www.google.com/search?q=mydomain
.mydomain.com
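The deny behaviour of the word list can be approximated outside Squid with a case-insensitive grep. The two patterns below are inlined for the example; the real list lives in /etc/acl/blockwords.txt. Note how .bittorrent.com is caught by the substring "torrent", which illustrates the false-positive risk mentioned above.

```shell
#!/bin/bash
# Hedged illustration: show which sample entries survive a word filter.
# Patterns are inlined for the example; Squid reads them from blockwords.txt.
allowed_after_word_block() {
  grep -iv -e 'torrent' -e 'porn'
}
# Usage: allowed_after_word_block < urls.txt
```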
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
# Allow Rule for Domains
acl allowdomains dstdomain "/path_to/allowdomains.txt"
http_access allow allowdomains
# Block Rule for Punycode
acl punycode dstdom_regex -i \.xn--.*
http_access deny punycode
# Block Rule for gTLD, sTLD, ccTLD
acl blocktlds dstdomain "/path_to/blocktlds.txt"
http_access deny blocktlds
# Block Rule for Words (Optional)
acl blockwords url_regex -i "/etc/acl/blockwords.txt"
http_access deny blockwords
# Block Rule for Domains
acl blockdomains dstdomain "/path_to/blockdomains.txt"
http_access deny blockdomains
# Block Rule for Blackweb
acl blackweb dstdomain "/path_to/blackweb.txt"
http_access deny blackweb
This section only explains how the update and optimization process works; users do not need to run it. The process can take a long time and consume considerable hardware and bandwidth resources, so it is recommended to use test equipment.
The update of blackweb.txt consists of several steps, executed in sequence by the script bwupdate.sh. The script will request privileges when required.
wget -q -N https://raw.githubusercontent.com/maravento/blackweb/master/bwupdate/bwupdate.sh && chmod +x bwupdate.sh && ./bwupdate.sh
The update requires Python 3.x and Bash 5.x. It also requires the following dependencies:
wget git curl libnotify-bin perl tar rar unrar unzip zip gzip python-is-python3 idn2 iconv
Make sure Squid is installed correctly. If you have any problems, run the following script (sudo ./squid_install.sh):
#!/bin/bash
# kill old version
while pgrep squid > /dev/null; do
echo "Waiting for Squid to stop..."
killall -s SIGTERM squid &>/dev/null
sleep 5
done
# squid remove (if exist)
apt purge -y squid* &>/dev/null
rm -rf /var/spool/squid* /var/log/squid* /etc/squid* /dev/shm/* &>/dev/null
# squid install (you can use 'squid-openssl' or 'squid')
apt install -y squid-openssl squid-langpack squid-common squidclient squid-purge
# create log files
mkdir -p /var/log/squid
for log in access cache store deny; do
    [ -f "/var/log/squid/$log.log" ] || touch "/var/log/squid/$log.log"
done
# permissions
chown -R proxy:proxy /var/log/squid
# enable service
systemctl enable squid.service
systemctl start squid.service
echo "Done"
Captures domains from downloaded public blocklists (see SOURCES) and unifies them into a single file.
Removes overlapping domains ('.sub.example.com' is a subdomain of '.example.com'), normalizes entries to the Squid-Cache format, and excludes false positives (google, hotmail, yahoo, etc.) with an allowlist (debugwl.txt).
Input:
com
.com
.domain.com
domain.com
0.0.0.0 domain.com
127.0.0.1 domain.com
::1 domain.com
domain.com.co
foo.bar.subdomain.domain.com
.subdomain.domain.com.co
www.domain.com
www.foo.bar.subdomain.domain.com
domain.co.uk
xxx.foo.bar.subdomain.domain.co.uk
Output:
.domain.com
.domain.com.co
.domain.co.uk
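The normalization and overlap removal above can be sketched in shell. This is a simplified illustration of the idea, not the project's actual pipeline; dedup_domains reads raw blocklist lines on stdin and prints collapsed Squid-style entries.

```shell
#!/bin/bash
# Hedged sketch of normalization + overlap removal (illustrative only).
normalize() {
  # strip hosts-file prefixes, leading dots and leading "www.", drop bare TLDs
  sed -E 's/^(0\.0\.0\.0|127\.0\.0\.1|::1)[[:space:]]+//; s/^\.//; s/^www\.//' \
    | awk -F. 'NF >= 2 { print tolower($0) }' | sort -u
}
collapse() {
  # parents are shorter than their subdomains, so process shortest first;
  # keep a domain only if none of its parent suffixes was already kept
  awk -F. '{
    keep = 1
    for (i = 2; i < NF; i++) {
      parent = $i
      for (j = i + 1; j <= NF; j++) parent = parent "." $j
      if (parent in seen) { keep = 0; break }
    }
    if (keep) { seen[$0] = 1; print "." $0 }
  }'
}
dedup_domains() {
  normalize | awk '{ print length($0), $0 }' | sort -n | cut -d" " -f2- | collapse | sort
}
# Usage: dedup_domains < raw_lists.txt > blackweb.txt
```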
Removes domains with invalid TLDs, using a list of public and private suffixes (ccTLD, ccSLD, sTLD, uTLD, gSLD, gTLD, eTLD, etc.) up to the 4th level (4LDs).
Input:
.domain.exe
.domain.com
.domain.edu.co
Output:
.domain.com
.domain.edu.co
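A simplified sketch of the idea, assuming a hypothetical tlds.txt holding one valid suffix label per line (the real check resolves multi-level public and private suffixes up to 4LDs):

```shell
#!/bin/bash
# Hedged sketch: keep only domains whose last label appears in a TLD list.
# The real pipeline validates multi-level public/private suffixes as well.
filter_valid_tlds() {
  awk -F. 'NR == FNR { ok[$1] = 1; next } $NF in ok' "$1" -
}
# Usage: filter_valid_tlds tlds.txt < domains.txt
```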
Removes hostnames longer than 63 characters (RFC 1035) and other characters not admissible in IDNs, and converts domains with international (non-ASCII) characters, which can be used in homograph attacks, to Punycode/IDNA format.
Input:
bücher.com
café.fr
españa.com
köln-düsseldorfer-rhein-main.de
mañana.com
mūsųlaikas.lt
sendesık.com
президент.рф
Output:
xn--bcher-kva.com
xn--caf-dma.fr
xn--d1abbgf6aiiy.xn--p1ai
xn--espaa-rta.com
xn--kln-dsseldorfer-rhein-main-cvc6o.de
xn--maana-pta.com
xn--mslaikas-qzb5f.lt
xn--sendesk-wfb.com
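The conversion can be reproduced with the idn2 tool (listed in the dependencies above). This wrapper is an illustrative sketch, not the update script's actual code:

```shell
#!/bin/bash
# Hedged sketch: convert IDN (non-ASCII) domains on stdin to Punycode
# using idn2; names idn2 rejects are reported on stderr and skipped.
to_punycode() {
  local domain
  while IFS= read -r domain; do
    idn2 "$domain" 2>/dev/null || echo "SKIP: $domain" >&2
  done
}
# Usage: to_punycode < idn_domains.txt
```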
Most of the SOURCES contain millions of invalid and nonexistent domains, so each domain is double-checked (in 2 steps) via DNS, and invalid or nonexistent domains are excluded from BlackWeb. This process may take a while: by default it processes domains in parallel, at roughly 6,000 to 12,000 per minute, depending on hardware and bandwidth.
HIT google.com
google.com has address 142.251.35.238
google.com has IPv6 address 2607:f8b0:4008:80b::200e
google.com mail is handled by 10 smtp.google.com.
FAULT testfaultdomain.com
Host testfaultdomain.com not found: 3(NXDOMAIN)
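A sketch of the idea behind the check (illustrative; the actual script's lookup commands and parallelism differ). It uses getent hosts, so results depend on your resolver:

```shell
#!/bin/bash
# Hedged sketch: classify a domain as HIT or FAULT by resolving it.
# The real update script performs a 2-step DNS double check.
check_domain() {
  local d="${1#.}"   # strip the leading Squid dot, if any
  if getent hosts "$d" > /dev/null 2>&1; then
    echo "HIT $d"
  else
    echo "FAULT $d"
  fi
}
export -f check_domain
# Run checks in parallel, e.g. 8 at a time:
# xargs -n1 -P8 bash -c 'check_domain "$1"' _ < domains.txt
```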
For more information, check Internet Live Stats.
Removes government domains (.gov) and other related TLDs from BlackWeb.
Input:
.argentina.gob.ar
.mydomain.com
.gob.mx
.gov.uk
.navy.mil
Output:
.mydomain.com
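An illustrative grep over the sample list above (the suffix pattern here is an assumption for the example; the script uses its own list of government-related TLDs):

```shell
#!/bin/bash
# Hedged sketch: drop .gov/.gob/.mil entries and their country variants.
remove_gov_domains() {
  grep -Ev '\.(gov|gob|mil)(\.[a-z]{2})?$'
}
# Usage: remove_gov_domains < blackweb.txt
```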
Runs Squid-Cache with BlackWeb; any errors are sent to SquidError.txt on your desktop.
BlackWeb: Done 06/05/2023 15:47:14
By default, the ACLs are located in /etc/acl. You can change this path to your preference.
bwupdate.sh includes lists of remote-support-related domains (TeamViewer, AnyDesk, LogMeIn, etc.) and web3 domains. They are commented out by default (unless their domains are in SOURCES). To block or exclude them, you must activate the corresponding lines in the script (# JOIN LIST), although this is not recommended, to avoid conflicts or false positives.
If you stop bwupdate.sh (Ctrl+C) and it stopped at the DNS lookup step, it will restart at that point. If you stop it earlier, you will have to start from the beginning or modify the script manually so that it starts from the desired point.
If your Squid cache_dir uses aufs, temporarily change it to ufs during the update, to avoid: ERROR: Can't change type of existing cache_dir aufs /var/spool/squid to ufs. Restart required.
To check which SOURCES contain a given domain, use checksources.sh:
wget https://raw.githubusercontent.com/maravento/blackweb/refs/heads/master/bwupdate/tools/checksources.sh
chmod +x checksources.sh
./checksources.sh
e.g.:
[?] Enter domain to search: kickass.to
[*] Searching for 'kickass.to'...
[+] Domain found in: https://github.com/fabriziosalmi/blacklists/releases/download/latest/blacklist.txt
[+] Domain found in: https://hostsfile.org/Downloads/hosts.txt
[+] Domain found in: https://raw.githubusercontent.com/blocklistproject/Lists/master/everything.txt
[+] Domain found in: https://raw.githubusercontent.com/hagezi/dns-blocklists/main/domains/ultimate.txt
[+] Domain found in: https://raw.githubusercontent.com/Ultimate-Hosts-Blacklist/Ultimate.Hosts.Blacklist/master/hosts/hosts0
[+] Domain found in: https://sysctl.org/cameleon/hosts
[+] Domain found in: https://v.firebog.net/hosts/Kowabit.txt
Done
We thank all those who have contributed to this project. Anyone interested can contribute by sending us links to new lists to be included in this project.
Special thanks to: Jhonatan Sneider
THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Due to recent arbitrary changes in computer terminology, it is necessary to clarify the meaning and connotation of the term blacklist, associated with this project:
In computing, a blacklist, denylist, or blocklist is a basic access control mechanism that lets all elements through (email addresses, users, passwords, URLs, IP addresses, domain names, file hashes, etc.) except those explicitly listed: items on the list are denied access. The opposite is a whitelist, where only items on the list are let through whatever gate is being used. (Source: Wikipedia)
Therefore, blacklist, blocklist, blackweb, blackip, whitelist, and similar terms have nothing to do with racial discrimination.