I was bored at work, so I logged into home and looked at the Deep Packet Inspection on my Ubiquiti router. Well, bored, and someone asked a question about UniFi and I wanted to show off. The DPI showed some site called “Financial Times News” consuming 59GB of data over the last week. I pay Wave cable an extra $20 a month for unlimited bandwidth. However, unlimited to them means that after 2TB a month they “slow you down” instead of charging you more money. Since I’m going over 2TB on a regular basis now, 59GB is a number worth being annoyed at; it’s about 3% of my total bandwidth for the month before they slow me down.
“Financial Times News” appears to mostly come from the FT.COM domain space. I run my own DNS at home, hosted in AD, and figured a simple way to block FT.COM would be to set up a primary zone with zero records in it. Then I got to thinking: on my home machine I manually download a hosts file from GitHub once a month to block advertising things, malware things, and other annoying sites.
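For the record, the empty-zone trick is a one-liner. Making the server authoritative for a zone with no records means every query under that name comes back empty. A minimal sketch (the zone name here is just my example target):

```powershell
# Create an empty, AD-integrated primary zone for ft.com.
# The DNS server becomes authoritative for the name and answers
# queries with no records, effectively black-holing the domain.
Add-DnsServerPrimaryZone -Name "ft.com" -ReplicationScope Forest
```

Any client using this DNS server then fails to resolve ft.com and everything under it.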
The StevenBlack hosts file on GitHub that I download is updated often, and currently has over 32,000 blocked FQDNs in it. I’m pretty lazy and not planning on adding those FQDNs to the DNS server by hand, so I scripted it. After I wrote the first script, which downloaded the hosts file to a file on my domain controller, I thought to myself, “Self, using a file is lame sauce.” It would be much cooler if the script downloaded to a variable and did all of the work in RAM.
Two different versions
Downloading to a file with WebClient.DownloadFile made parsing the hosts file down to only FQDNs pretty simple. Simple because the PowerShell Get-Content command placed each line of the hosts file in a separate element of an array for me, making it easy to shove the array into the Add-DnsServerPrimaryZone command one entry at a time to make new zones.
Downloading to memory with the Invoke-WebRequest command took some more thinking. Invoke-WebRequest sets the file content to the .Content property of the returned value, and stores the data there as a single string. Converting the string to an array took manual work instead of PowerShell doing all of the work for me.
I used the Split method to break the string into array lines at each line break in the .Content string returned by Invoke-WebRequest: ($h.Content).Split([environment]::NewLine). Sadly the newline escapes \n and `n were not working for me. The internet helped, and eventually I came up with a solution: to correctly break the lines apart I used the funky [environment]::NewLine bits.
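A quick illustration of why this was fiddly (the $text value here is made up). The raw hosts file uses Windows line endings (CR+LF), which is exactly what [environment]::NewLine expands to on Windows. Note that on Windows PowerShell 5.1, passing a string to Split coerces it to a char array, so CR and LF are treated as separate delimiters and empty entries can appear between them:

```powershell
# Hypothetical two-line string with Windows (CR+LF) line endings
$text = "line one" + [environment]::NewLine + "line two"

# Split on the newline characters; on older PowerShell this splits
# on CR and LF individually, which can leave empty array entries
$lines = $text.Split([environment]::NewLine)

# Filtering drops the empties - in the real script the
# Where-Object 0.0.0.0 filter does this as a side effect
$lines | Where-Object { $_ }
</imports>
```

In the full script those stray empty entries are harmless, because the very next pipeline stage only keeps lines starting with 0.0.0.0 anyway.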
The two versions of the script are below.
Version of code that downloads to a file in the middle
#download the host file from the GIT repository https://github.com/StevenBlack/hosts -
#using the raw file download link of https://raw.githubusercontent.com/StevenBlack/hosts/master/hosts
#and place the host file in the folder c:\dns
$WebClient = New-Object -TypeName System.Net.WebClient
$url = "https://raw.githubusercontent.com/StevenBlack/hosts/master/hosts"
$target = "c:\dns\host"
$WebClient.DownloadFile($url, $target)
#create the list of domains from the downloaded host file
#grab the host file only where the line starts with 0.0.0.0, then remove the 0.0.0.0 from the line
$zones = Get-Content $target | Where-Object {$_ -like "0.0.0.0*"} | ForEach-Object {$_ -replace "0.0.0.0 ", ""}
#add every new line in the host file to the DNS server as a new primary zone
$i = 0
$total = $zones.Count
foreach ($zone in $zones) {
    $i++
    if ((Get-DnsServerZone $zone -ErrorAction SilentlyContinue).ZoneName -eq $null) {Add-DnsServerPrimaryZone -Name $zone -ReplicationScope Forest}
    Write-Progress -Activity "adding domain $zone - zone $i of $total" -PercentComplete (($i / $total) * 100)
}
Version of code that downloads to memory
#download the host file from the GIT repository https://github.com/StevenBlack/hosts - using the raw file download link of https://raw.githubusercontent.com/StevenBlack/hosts/master/hosts
#set the file to a variable and deal with it in memory vs creating a file
$h = Invoke-WebRequest -Uri "https://raw.githubusercontent.com/StevenBlack/hosts/master/hosts"
#parse the file and extract only the URIs
$q = ($h.Content).Split([environment]::NewLine)
$zones = $q | Where-Object {$_ -like "0.0.0.0*"} | Foreach-Object {$_ -replace "0.0.0.0 ", ""}
#add every new line in $zones to the DNS server as a new primary zone
$i = 0
$total = $zones.Count
foreach ($zone in $zones) {
    $i++
    if ((Get-DnsServerZone $zone -ErrorAction SilentlyContinue).ZoneName -eq $null) {Add-DnsServerPrimaryZone -Name $zone -ReplicationScope Forest}
    Write-Progress -Activity "adding domain $zone - zone $i of $total" -PercentComplete (($i / $total) * 100)
}
Note || updated with a progress bar – thanks, Wes, for the suggestion
Next step
The next step, when I have a round tuit again, will be to turn this into a .ps1 script and set up a scheduled task to run once a week. When I do that I’d better add in a report to email me when it’s done and tell me how many new entries were added, and maybe even a quick method to remove everything?
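The scheduled-task half of that is straightforward enough to sketch now. This is untested and the script path and task name are my own placeholders, but the shape would be something like:

```powershell
# Run the (hypothetical) C:\dns\Update-BlockZones.ps1 script
# every Sunday at 3am as SYSTEM
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\dns\Update-BlockZones.ps1"
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 3am
Register-ScheduledTask -TaskName "Update DNS Block Zones" `
    -Action $action -Trigger $trigger -User "SYSTEM"
```

Running as SYSTEM on the domain controller avoids storing credentials, though the email-report piece would still need a mail server to talk to.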
For another time
UPDATE – 11/24/2016 – Blocked Bing, oops
My wife sent me an email shortly after I added all of the DNS zones to the home DNS server and said, “You silly goose, you blocked Bing. Don’t you work for Microsoft?” Why yes, I do work for Microsoft, and I use Bing. On my home logged-in-all-day RDP machine I checked, and Bing is indeed blocked. WTF? Bing was not blocked when I loaded this file on the home laptop a few weeks ago. Turns out there is a section in the hosts file set up to block Windows telemetry under the following header:
### Extra rules for @StevenBlack ‘s hosts project
### https://github.com/FadeMind/hosts.extras
### <Windows 10 Telemetry> < B E G I N >
Under that section heading are a number of FQDNs required to access Bing. Microsoft uses anycast to route DNS queries to the DNS server closest to your location, then converts the CNAME record www.bing.com into an A record and provides you with a site to connect to that is both close to you and currently up and operating. Close is anycast based on source IP; up is our method of redundancy: by modifying the A record responses we can bring datacenters in and out of rotation for maintenance as needed. The following domains are where the A record responses for the CNAME www.bing.com come from, and are needed to connect to Bing:
a-0001.a-msedge.net
a-0002.a-msedge.net
a-0003.a-msedge.net
a-0004.a-msedge.net
a-0005.a-msedge.net
a-0006.a-msedge.net
a-0007.a-msedge.net
a-0008.a-msedge.net
a-0009.a-msedge.net
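You can watch that CNAME-to-A conversion happen yourself with Resolve-DnsName; which a-000x name comes back varies, since anycast hands your query to whichever DNS server is closest:

```powershell
# Follow www.bing.com through its CNAME chain to the A records.
# The specific a-000x.a-msedge.net name returned depends on which
# anycast DNS server answered the query.
Resolve-DnsName -Name www.bing.com -Type A
```

If any of the a-msedge.net zones above are black-holed on your DNS server, this lookup comes back empty and Bing is unreachable.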
I don’t have an issue with Windows telemetry because I’m aware of how the data is used, so I chose to remove that entire section of the hosts file from my DNS servers and allow the traffic. To do that I took all of the names, placed them in a file called remove.txt, then ran the following commands:
$bob = Get-Content remove.txt
foreach ($name in $bob) {Remove-DnsServerZone $name -Force}
Before and After Examples of NSlookup from my HappyLab server
Before (www.bing.com still blocked):

C:\Windows\system32>nslookup www.bing.com
Server:  UnKnown
Address:  10.10.10.5

Non-authoritative answer:
Name:    www.bing.com

After (a-msedge.net zones removed):

C:\Windows\system32>nslookup www.bing.com
Server:  UnKnown
Address:  10.10.10.5

Non-authoritative answer:
Name:    a-0001.a-msedge.net
Addresses:  204.79.197.200
          13.107.21.200
Aliases:  www.bing.com
          www-bing-com.a-0001.a-msedge.net
2 thoughts on “Blocking Some Sites at Home with DNS”
Did this end up working? I have the same issue – dozens of GB showing as Financial Times. I haven’t blocked it yet, but I wonder if it’s a mis-categorization.
Yes, it worked – no more Financial Times.