#61
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On Sat, 30 Aug 2014 14:09:49 -0400, Mayayana wrote:

> servers can just keep changing the subdomain to thwart your HOSTS file.
> You might have entries for 200 Doubleclick subdomains, but you don't
> have an entry for the one they might create next week.

Very true. I, for one, wish the HOSTS file format allowed for regular expressions.
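[As the poster notes, the hosts format has no pattern support. For comparison only: a small local resolver such as dnsmasq can blackhole an entire domain with one line, which also covers any subdomain created next week. A sketch, in dnsmasq configuration syntax (not hosts syntax) — the domain here is just the thread's own example:]

```
# /etc/dnsmasq.conf -- blackholes doubleclick.net and every
# current or future subdomain of it
address=/doubleclick.net/0.0.0.0
```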
#62
I have a fantastic HOSTS file (where can I post it for others to benefit)?
Ned Turnbull wrote:
> On Sat, 30 Aug 2014 15:56:19 +0200, J.O. Aho wrote:
>> Hosts file has the negative side, you need to copy it to each and
>> every machine you have, including portable devices
> Portable devices require root, unfortunately.

Which makes this idiotic method impractical there.

> But, how can copying a text file to Windows, Linux, and Mac be harder
> than trying to get a huge number of programs to perform the same task
> to work on all these platforms?

That task is "slowing down each and every network connection"? Brilliant, you found it.

> Besides, the hosts file works for all protocols, whereas most of these
> other methods work only for some protocols.

Yes, it will slow down each connection, regardless of protocol.

> There's no way there is any easier yet more complete & yet totally
> portable method than the hosts file method. Yes, there *are* other
> methods, but let's keep this thread to just the hosts file method.

Yes, a very "portable method" to build a completely unmaintainable file.
#63
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On Sat, 30 Aug 2014 17:04:44 +0200, J.O. Aho wrote:

> Just a reminder, no one benefits from a such list and they will never do.

I don't understand. Are you intimating no one benefits from the MVP HOSTS file?
http://winhelp2002.mvps.org/hosts.htm

Many sites must be recommending the HOSTS file for some reason, e.g., Lifehacker, TechRepublic, SomeoneWhoCares, etc.
http://lifehacker.com/5817447/how-to...the-hosts-file
http://www.techrepublic.com/blog/win...ws-hosts-file/
http://someonewhocares.org/hosts/

BTW, that last URL is yet another HOSTS file which I will include in mine, but since it's already published, I'll let you integrate it into yours.

NOTE: The effort is in culling out duplicates, which I do simply by removing all extraneous lines, and then removing extraneous characters in the good lines.
#64
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On Sat, 30 Aug 2014 09:46:58 -0400, Big Al wrote:

> How does one go about finding out all these bogus host addresses so you
> can build your own HOSTS file?

Hi Big Al,
Here's another one I just found by accident:
http://someonewhocares.org/hosts/

I will run my scripts on that one to see if it adds any value to my existing hosts file (which is 2/3 MVP HOSTS and 1/3 other hosts).
#65
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On Sat, 30 Aug 2014 14:23:42 -0500, J² wrote:

> You need to verify that the additions you have put on your file over
> the years are still relevant. There is a program available for download
> that will check each and every one, and it takes hours, if not days, to
> check as many as you have. I'm sorry but I can't recall the name of
> this utility.

I suspect I can run a ping in a script, capturing the results, to test them. But, really, I have never typed in a URL that I *wanted* to go to that came back *accidentally* blocked by my hosts file, so it's not really a concern of mine. Of course, *you* might have it as a concern, as I would think you should, since you have no idea *how* these domains got into "my" hosts file.

BTW, you can *skip* "my" hosts file and just use the trusted ones out there, of which MVP HOSTS is certainly one, but I just found this one, which I'm going to incorporate:
http://someonewhocares.org/hosts/

It has 10,250 entries in it, once I cleaned up all the comments. Here is the cleaned-up file before I tested it with mine:
http://pastebin.com/vhNMYa4x
#66
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On 08/30/2014 09:31 AM, Ned Turnbull wrote:

> On Sat, 30 Aug 2014 10:05:55 -0300, Shadow wrote:
>> The problem is when you seed it you will have to be online, so that
>> unique seeder will be your IP.
> Can I seed from Tor?

I don't think that's a good idea. It's probably just like installing a program that you downloaded while Tor is still open. It destroys your anonymity. Tor warns not to do that for that reason.
--
Caver1
#67
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On Sat, 30 Aug 2014 15:51:35 -0500, John Hasler wrote:

> I find that Privoxy plus NoScript blocks 99.9% of all advertising with
> no need for any configuration.

Do others concur that Privoxy plus NoScript is the panacea that solves all but 0.01% of the obnoxious web sites? If so, that's a winner, at least for one browser and one port.
#68
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On Sat, 30 Aug 2014 23:03:06 +0200, Peter Köhlmann wrote:

> That task is "slowing down each and every network connection"?
> Brilliant, you found it

I have noticed absolutely no slowdown when I run speedtest.net with and without a hosts file. I notice absolutely no slowdown when I access web pages, with and without the hosts file. In fact, things *speed up*, because far fewer background sites are being connected to.

If the 127.0.0.1 syntax is "slowing" you down, the 0.0.0.0 syntax on Win8 machines might help, but this site says the speed difference is minuscule:
http://someonewhocares.org/hosts/

Since I have a decade of experience with huge HOSTS files, I wonder where you get your *facts* from, since you've never even tried it, it seems. Are you just making that up?
#69
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On Sat, 30 Aug 2014 21:06:15 +0000, Ned Turnbull wrote:

> NOTE: The effort is in culling out duplicates, which I do simply by
> removing all extraneous lines, and then removing extraneous characters
> in the good lines.

Since the only work in merging hosts files is in culling out the duplicates, I've done that work for you, for this site:
http://someonewhocares.org/hosts/

Here is the 10,250-line file, cleaned up, from that site:
http://pastebin.com/vhNMYa4x

When I compare that to yesterday's MVP HOSTS file, I see:
13626 domains in HOSTS
10250 domains in someonewhocares
11761 domains in HOSTS that are not in someonewhocares

Based on a quick output from these simple commands:
$ comm -23 <(sort file1) <(sort file2) > file3
$ wc -l file*

So I'll make a *new* combined hosts file for myself. If anyone wants it, let me know.
#70
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On Sat, 30 Aug 2014 21:07:38 +0000, Ned Turnbull wrote:

> Here's another one I just found by accident:
> http://someonewhocares.org/hosts/

I combined that recently found hosts file with mine, and, after removing duplicates, I found that my hosts file grew by 232 hosts (from 23,735 hosts to 23,967 hosts). So, with about 30 seconds' effort, we were able to add over two hundred additional hosts that we will never see on any machine in our networks.
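[The merge-and-count step above can be sketched in two commands; the tiny sample files here are placeholders for the real 23,735- and 10,250-line lists:]

```shell
# Sample lists: "mine" has three hosts, "theirs" has two, one overlapping
printf '0.0.0.0 a.example\n0.0.0.0 b.example\n0.0.0.0 c.example\n' > mine.txt
printf '0.0.0.0 b.example\n0.0.0.0 d.example\n' > theirs.txt

# Merge, de-duplicate, and report how many hosts were actually added
sort -u mine.txt theirs.txt > merged.txt
before=$(sort -u mine.txt | wc -l)
after=$(grep -c . merged.txt)
echo "added $((after - before)) new hosts"
```

With this sample, only d.example is new, so it reports one added host.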
#71
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On Sat, 30 Aug 2014 17:25:02 -0400, Caver1 wrote:

> I don't think that's a good idea. It's probably just like installing a
> program that you downloaded while Tor is still open. It destroys your
> anonymity. Tor warns not to do that for that reason.

That's what I was worried about. Thanks. Since a hosts file is a TEXT file, the pastebin.com method seems to work, as long as I keep it below the 500 KB limit.
#72
I have a fantastic HOSTS file (where can I post it for others to benefit)?
Ned Turnbull wrote:
> [...]
> So I'll make a *new* combined hosts file for myself. If anyone wants
> it, let me know.
> Ned

You haven't mentioned the Hosts file incorporated in the Malwarebytes anti-malware software. Have you actually reviewed it? You can find same here:
http://hosts-file.net/?s=Download

HTH
Dave
--
First they ignore you, then they laugh at you, then they fight you, then you win.
#73
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On Sat, 30 Aug 2014 21:27:17 +0000, Ned Turnbull wrote:

> Do others concur that Privoxy plus NoScript is the panacea that solves
> all but 0.01% of the obnoxious web sites?

Oops. I was off by one decimal place, but the question remains whether Privoxy & NoScript solve almost all the problems that the hosts file attempts to solve, on the three major PC platforms, for all the browsers one would use, and for all the ports.
#74
I have a fantastic HOSTS file (where can I post it for others to benefit)?
~BD~ wrote:
> [...]
> You haven't mentioned the Hosts file incorporated in the Malwarebytes
> anti-malware software. Have you actually reviewed it? You can find same
> here: http://hosts-file.net/?s=Download

I should mention that the guy preparing this Hosts file also hosts the web site of Dustin Cook's BugHunter software. Verify here:
http://mysteryfcm.co.uk/?mode=Contact

--
First they ignore you, then they laugh at you, then they fight you, then you win.
#75
I have a fantastic HOSTS file (where can I post it for others to benefit)?
On Sat, 30 Aug 2014 22:48:07 +0100, ~BD~ wrote:

> You haven't mentioned the Hosts file incorporated in the Malwarebytes
> anti-malware software. Have you actually reviewed it? You can find same
> here: http://hosts-file.net/?s=Download

Hi Dave,

I was not aware of that hosts-file.net hosts file! I was only using these hosts files:
a) Mine
b) Plus MVP HOSTS (http://winhelp2002.mvps.org/hosts.htm)
c) Plus someonewhocares (http://someonewhocares.org/hosts/)

Thanks for helping out!

I had just added 232 unique lines to my current hosts file based on what I accidentally found at http://someonewhocares.org/hosts/, so my current hosts file contained 23,967 unique hosts. That file from http://hosts-file.net/?s=Download, once cleaned up and extraneous ASCII text removed, contains a whopping 901,983 unique hosts (after I removed extraneous spaces, tabs, & comments).

Renaming my hosts lines "file1" and this Malwarebytes hosts file "file2" and running the suggested commands gives me a surprising result:
$ comm -23 <(sort file1) <(sort file2) > file3
$ wc -l *
23967 file1 (my hosts file unique lines)
901983 file2 (the malwarebytes hosts file unique lines)
18451 file3
944401 total

So, there are 18,451 hosts in my /etc/hosts file that are *not* one of the malwarebytes 901,983 hosts.

Likewise, renaming the malwarebytes hosts file as "file1" and my hosts file as "file2", we find almost nine hundred thousand hosts listed in the malwarebytes hosts file which are not (yet) in "my" hosts file:
901983 file1 (the malwarebytes hosts file unique lines)
23967 file2 (my hosts file unique lines)
896467 file3
1822417 total

Whew! That's a huge difference! To test out a million-line hosts file, now "my" hosts file has all 917,434 unique hosts. With almost a million hosts, we'll see what happens (if anything) to my Linux network performance.

If you can suggest a procedural test, to run with and without the hosts file in place, that would be helpful to see what the performance impact (if any) is.