NoelC

Anti-Malware Suggestions

Recommended Posts

Yes, but be aware of portable apps too.

Many Russian repack sites make a mess of them and bundle garbage in them.


Most of these apps can be installed normally, then moved to another directory, and they will still work.


The biggest piece of misinformation I see in this thread is the misuse of the Hosts file. The Hosts file is not meant to be used as a blacklist for malicious domains.

This article is a good read: Blocking malware and advertisements safely. (And remember to whitelist MSFN to show your support.)


"Meant to be used"?  Who set that?  Some ancient Unix author?

 

You can't argue that it's not effective.

 

Personally I cannot see how repurposing a feature that's otherwise generally unused is "misuse" by any stretch of the imagination.  In my hosts file I have a couple of entries at the top that do some special things for me, and the bottom part is all the hosts file from MVPS.  It works, plain and simple.  In fact, it's one of the most effective things you can do to make the Internet safer.

  • Do you anticipate performance problems on today's computers?
  • Do you imagine it's blocking things you DO want to see? 

 

Per your linked article,

 

"The Hosts file should only be used for redirecting a website to a new IP address."

 

Redirecting a malicious parasite web site to a non-existent IP address sounds like a perfect use to me.
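The mechanism being argued over is simple enough to sketch. Below is a minimal illustration of how hosts-file blocking works conceptually; it is not anyone's actual tooling, and every domain name in it is invented:

```python
# A minimal sketch of hosts-file blocking. The parser follows the standard
# hosts-file format (address, whitespace, hostnames, optional "#" comment);
# the blocked domain names below are made up for illustration.

def parse_hosts(text):
    """Map each hostname to the address the hosts file assigns it."""
    mapping = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        address, *names = line.split()
        for name in names:
            mapping[name.lower()] = address
    return mapping

SAMPLE = """
# blacklist-style entries (hypothetical domains)
0.0.0.0  ads.example.net
0.0.0.0  tracker.example.org
127.0.0.1  localhost
"""

hosts = parse_hosts(SAMPLE)

def is_blocked(domain):
    # A name pinned to 0.0.0.0 can never reach the real server, so the
    # request dies locally before any network traffic is generated.
    return hosts.get(domain.lower()) == "0.0.0.0"

print(is_blocked("ads.example.net"))  # True: resolved locally to 0.0.0.0
print(is_blocked("msfn.org"))         # False: not listed, normal DNS applies
```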

 

Please, by all means, elucidate the disadvantages that lead you to classify locally blocking known malicious domains as "misuse".

 

-Noel

Edited by NoelC


Having to parse through tens of thousands of lines in the hosts file is a cost no matter what web browser or web service you use. Every time you access the web, your computer has to filter through all of these lines to find a match, which degrades the performance of the browser and the machine.

 

Redirecting to your local machine (localhost) leaves a persisting connection. You have to wait for these persistent connections to time out on your local machine, too. That is unnecessary activity for the machine and the network.

127.0.0.1          myfakemaliciousdomain.com
::1                myfakemaliciousdomain.com

Why degrade your network performance too, with connections that have to open and then wait to time out, solely to block advertisements and malicious software? Many entries in these hosts files are also not kept up to date; numerous domains have expired yet still take up an entry in the hosts file.

 

Back before the Internet became popular, the hosts file was a text document that acted as a DNS. Now we have DNS services, so the hosts file is very rarely used for these purposes, unless people are having an issue loading a website due to a slow DNS refresh/update. The hosts file is simply a file used by computers to map domain names to IP addresses.


The MVPS hosts file redirects parasite web server names to 0.0.0.0, not home (127.0.0.1).  No connections show up (e.g., in Resource Monitor). 

 

After a long day of using my system pretty heavily, and listening to Pandora, the command ipconfig /displaydns shows that I have 7503 entries, which have a TTL of 1 day.

 

Thing is, a modern CPU can search a table of tens of thousands of lines in a few microseconds, maybe less.  And what's even more important, the whole set of local table searches likely takes less time than resolving any one of the blocked site addresses via online DNS.  I suggest this will actually make browsing faster, not degrade performance.
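The order-of-magnitude claim is easy to sanity-check with a toy table. The hostnames here are synthetic, and a real resolver caches the parsed file rather than rescanning the text, so this is only a rough upper bound on the lookup itself:

```python
# Rough sanity check of the lookup-cost argument: membership tests against an
# in-memory table of ~100,000 synthetic hostnames complete in microseconds,
# while a single networked DNS round trip typically costs tens of milliseconds.
import time

table = {f"host{i}.example.com" for i in range(100_000)}

start = time.perf_counter()
hit = "host54321.example.com" in table
miss = "some-unlisted-site.example.org" in table
elapsed_us = (time.perf_counter() - start) * 1e6

print(hit, miss)                                 # True False
print(f"two lookups took ~{elapsed_us:.1f} us")
```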

 

Table lookup time is really not a practical worry.  And this proves out in practice - from where I sit, with literally 15,801 lines in it, I browse virtually instantaneously to any site I choose with Internet Explorer.

 

I've set up IE's network trace, browsed to a site I haven't been to any time in recent history, and haven't seen anything that makes me think there's any slowdown worth mentioning.  Example:  Apple.com's first response got back to me in 16 ms (that's 1/60 of a second).  That did not change with the hosts file renamed and out of the way.

 

Can you suggest alternate ways to measure degradations or compare performance?

 

-Noel


A connection to 0.0.0.0 is still a connection that is used.

 

Those numbers are excessively high...
 

 

When the ipconfig /displaydns command is used to display current resolver cache contents, the resultant output generally includes the local host and loopback IP address (127.0.0.1) mappings. This is because these mappings typically exist in the default (unmodified) contents of the local Hosts file.

 

 

Source: https://technet.microsoft.com/en-us/library/cc758108%28v=ws.10%29.aspx

 

There are so many things wrong with using this. It also causes problems with other software; Comodo antivirus, System Mechanic and others have been listed as affected. Many of these sites foolishly tell you to disable the DNS Cache service. You should never alter a Microsoft service; the stock settings work fine, and these "optimizations" are just myths.

 

Also, the Hosts file is loaded into your memory (RAM) at startup, another unnecessary performance degradation.

 

 

Microsoft has said flat out:

Note The overall performance of the client computer decreases and the network traffic for DNS queries increases if the DNS resolver cache is deactivated.

 

The DNS Client service optimizes performance. There's a Knowledge Base article that details this, and I'm more than certain Microsoft knows how their services and files perform.

 

Also, what's the point of using a Hosts file improperly when there are addons and such designed for use with browsers?

 

All of the things I'm covering here are already covered in more comprehensive detail on my wiki.


The point is, it's effective, and, notwithstanding your inferences, simply without practical downside on a modern system.

 

A managed blacklist is an extremely effective means of blocking bad sites.  IMO, the mvps list in particular is well-managed. 

 

And let's be clear, I don't advocate doing the things you have described to the DNS cache service. Though I started this thread with some general-sounding recommendations, I mean to suggest that people just load the particular mvps hosts file.

 

And I'm sorry, even though you bill yourself as an expert, I'm one as well.  I simply won't accept a blanket statement like "the Hosts file is loaded into your memory (RAM) at startup, another unnecessary performance degradation" without measurements to back it up.  I've already posted my measurements supporting my argument...  If you want to debate this further, please make some suggestions for capturing actual measurements of degradation.

 

-Noel


Only as an historical note, up to roughly January 2013 the MVPS site sported a note:

https://web.archive.org/web/20130131065053/http://winhelp2002.mvps.org/hosts.htm

Editors Note: in most cases a large HOSTS file (over 135 kb) tends to slow down the machine.

 

and for this case the suggestion was to disable the DNS Client service. The current page:

http://winhelp2002.mvps.org/hosts.htm

maintains this recommendation, offering as an alternative the periodic flushing of the DNS cache, or altering the Service's settings in the Registry to shorten the life of cached records.
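For reference, the Registry setting usually cited for shortening the life of cached records is the MaxCacheTtl value of the DNS Client (Dnscache) service. The fragment below is an example only, based on Microsoft's documented parameters for the service; verify it against current documentation and back up the Registry before applying anything like it:

```
Windows Registry Editor Version 5.00

; Example only: cap positive DNS cache entries at 1 hour (0xE10 = 3600 seconds).
; The default maximum is commonly documented as 86,400 seconds (one day).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Dnscache\Parameters]
"MaxCacheTtl"=dword:00000e10
```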

 

This should mean that at least on some systems it may actually cause a slowdown, while on others (possibly more powerful, with more RAM, or *whatever*) the slowdown is so trifling that it is not an issue in practice.

 

Sorry to have interrupted this battle of the experts..., please by all means continue :).

 

 

 

jaclaz



A note removed in 2013 might have been there for a very long time.  Computers have gotten faster (and yes, that statement implicitly excludes those still running hardware from yesteryear).  The fact is, that statement has been removed - I suspect because it no longer applies.  And, in case people don't go read it, right now the web page says, "...there is no need to turn on, adjust or change any settings with the exception of the DNS Client service..."

 

Jaclaz, do *you* have any suggestions for how to measure "slowdowns" or indeed any downside?

 

You know I'm all about using real, measured data to make decisions on performance.  With such information I'd imagine a number of us could do some testing and post our results here.  THAT is the power of a forum.  Pissing contests are useless (especially against Italians :angel ).

 

FWIW, the mvps hosts file is now over 500 kB.  It seems to me that if it were going to cause problems I'd be seeing them - which I'm clearly not.  But as you noted, I tend to run big systems.  Have *you* seen actual problems with it?

 

Finally, I don't buy that a big hosts file causes problems on even a lesser computer.  The burden is on those who say it does to show it.  I've been using the mvps hosts file for a long time, on systems that were in their day high-end but really were quite tiny and pale by comparison to even a modest computer system today.  Granted, the list probably wasn't as big back then, but it was big enough to be effective.

 

FYI, I have it also on another, smaller system with only 4 GB of RAM running Win 7.  That system boots up quickly and navigates to sites it's never been to before in a few tens of milliseconds.  Internet speed tests max out the fiber optic link I have.  Where are the problems?

 

-Noel


Here are my measurements, by the way...  Before each test I flushed the DNS Resolver Cache to simulate a worst case scenario.  I can repeat these results with small variations in the timing numbers.  Note the response times for the first accesses to the site.

 

Big 12 core 48 GB workstation running Win 8.1 x64.

 

[Image: SpeedTestOnBigWorkstation.png]

 

 

Smaller 2 core 4 GB computer running Win 7 x64.

 

[Image: SpeedTestOnSmallerSystem.png]

 

 

Do we need to be talking about even older or lesser systems?  I'd have to test using virtual machines.  I can do so if it's interesting.

 

-Noel


Jaclaz, do *you* have any suggestions for how to measure "slowdowns" or indeed any downside?
 

No, I have no specific experience on this.

 


 THAT is the power of a forum.  p***ing contests are useless (especially against Italians :angel ).
  

Sure, Italians do it better ;), there is simply no match. :whistle:

 

What is the difficult part or problem in "Only as an historical note"? :unsure:

Text before January 2013:

Editors Note: in most cases a large HOSTS file (over 135 kb) tends to slow down the machine.

To resolve this issue (manually) open the "Services Editor"
Start | Run (type) "services.msc" (no quotes)
Win8 users - Control Panel > Administrative Tools > Services
Scroll down to "DNS Client", Right-click and select: Properties - click Stop
Click the drop-down arrow for "Startup type"
Select: Manual (recommended) or Disabled click Apply/Ok and restart.

 
Text (current):
 

In most cases the DNS Client Service is not needed, it is recommended to turn it off. These instructions are intended for a single (home-user) PC. If your machine is part of a "Domain", check with your IT Dept. before applying this work-around. This especially applies to Laptop users who travel or bring their work machines home. Make sure to reset the Service (if needed) prior to connecting (reboot required) to your work Domain ...

To resolve this issue (manually) open the "Services Editor"
Start | Run (type) "services.msc" (no quotes)
Win8 users - Control Panel > Administrative Tools > Services
Scroll down to "DNS Client", Right-click and select: Properties - click Stop
Click the drop-down arrow for "Startup type"
Select: Manual (recommended) or Disabled click Apply/Ok and restart.

 
Both versions of the page are followed by workarounds to keep the DNS Client "normally" enabled by flushing the DNS cache (or limiting the cache duration).

From a purely linguistic point of view, offering a solution for an issue should mean that, at least in some cases, the issue may exist.

 

jaclaz


It seems to me you may focus too much on what people say.  I pay more attention to how things actually work. 

 

  • Tarun has said there'll be a timeout, even with 0.0.0.0.  I see no evidence of that.  Connections are aborted immediately, per traces from F12 developer tools in IE.
     
  • Tarun has said that it causes problems with other software.  While that can be true at the highest level, understand that it will be the case if that software required a connection to a site that's considered a parasite by the manager of the mvps hosts file and didn't handle an inability to reach the server.  That would be, in basic terms, a BUG in the application's web programming.  Personally I've seen it block Avast antivirus from gathering tracking information during an update, and when called on it Avast made changes to their product to not fail if they couldn't gather tracking info.  I've also seen it block tracking information in some of Adobe Photoshop's web code, and upon reporting it I have received word that they'll be fixing that oversight as well.  In practice, it's revealing things you didn't know were happening and may not WANT happening.
     
  • Tarun has said it's a misuse of the hosts file.  I disagree.  The intent of that file is to redirect certain names to particular addresses.  That's EXACTLY what's being done.  I say that redirecting parasite web site names to an address that can't possibly respond is an excellent use of the capability the operating system is providing.
     
  • Tarun and Jaclaz (indirectly) have said that there will be performance degradation.  I'm unable to measure a performance degradation on $10K and $299 computers running Windows 8.1 and 7.  Are we talking about some ancient system here?

By the way, I misread the advice on the MVPS site.  I read it before as saying "there is no need to change the DNS Client service".  I see now that it says otherwise.  I do agree that it's bad advice.  For what it's worth, I never have changed the default status of the DNS Client service.  It's always worked just fine.

 

There's a LOT of "rule of thumb" info out there that's been accumulated over the years and which is simply no longer pertinent.  Maybe it used to cause problems in Windows ME or something.  Or maybe it's just unfounded lore - there's plenty of that out there too.

 

You have a computer.  Instead of blabbering about what other people said and when, do some tests.  I have done my part, using both a $10K computer and a $299 computer and two different operating systems - and showed that there's no measurable degradation.  There's no instability either.  My systems all run without fault between reboots mandated by Windows Updates. 

 

So I ask again:  Ignoring the lore, where's the measurable downside?

 

-Noel

Edited by NoelC


Jaclaz has not actually stated, directly or indirectly, that there will be (or that there will not be) a performance degradation. He doesn't know :w00t::ph34r:, has not personally tested this particular thing, and is not even minimally interested in the specific topic :no:, which is, as stated earlier, outside his personal experience and interests. He simply reported what was (and is) written on a given page, remarking that it is logically a sign that a performance degradation may happen, and that it has evidently been reported as happening in some cases to that particular site; not necessarily on all machines, and possibly not even on "most" machines.

 

If you prefer: jaclaz attempted (failing completely :blushing: ) simply to highlight how the personal experience of NoelC does not match the experiences reported on the referenced site, which is, BTW, the actual source of the suggested HOSTS file.

 

This may well mean that the site's contents are partially outdated, that the author of the site is completely wrong on this specific topic, or that he received misleading or incorrect reports from one or more less experienced users (or from people running obsolete OSes on obsolete hardware); there could be another thousand possible reasons for the mismatch.

 

Even more specifically and explicitly, jaclaz does not endorse, recommend, or express an opinion of his own on:

  • whatever is written on the MVPS site
  • whatever Tarun stated in this thread
  • whatever NoelC stated in this thread

all of which he declares to be a SEP:

http://en.wikipedia.org/wiki/Somebody_Else's_Problem

claiming it to be a perfectly legitimate "choice of ignorance by an individual".

jaclaz


More info:

  • I booted up a Windows XP SP3 (32 bit) VM.
  • Updated the hosts file (with the same one I use on my Win 8.1 system).
  • Installed the latest Windows Update (Windows Malicious Software Removal Tool) and an update to my antivirus software, then rebooted.
  • Brought up IE8 and saw that it displayed my home page data immediately.
  • Cleared the DNS cache.
  • Brought up IE and saw that it displayed my home page data immediately.  Using a stopwatch, the time between display of the IE window with white background and display of the web site data was 0.4 seconds.
  • Typed in another URL.  Same timing.
  • Moved the 500+kbyte hosts file out of the way.
  • Rebooted.
  • Did the same tests and saw no visible improvement in responsiveness.  In fact, I measured 2 seconds to display Apple.com, where it had only taken 1.8 seconds with the hosts file in place.  This could be human error.
  • Restored the big hosts file.
  • Rebooted again.  Times to boot and login (18 seconds and 3 seconds) were the same with or without the hosts file.
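The stopwatch timings above can be complemented with a repeatable, offline measurement of the one claim that is easy to isolate: the cost of loading a hosts list into memory. The sketch below builds a synthetic list about the size of the MVPS file (~15,000 entries, roughly 500 KB) and times parsing it into a lookup table; every entry name is invented, and this models the one-time load cost only, not the resolver's per-request behavior:

```python
# Time the one-off cost of parsing an MVPS-sized hosts list (~15,000 entries,
# roughly 500 KB of text) into an in-memory lookup table. Entries are synthetic.
import time

lines = [f"0.0.0.0 blocked{i:05d}.example.com" for i in range(15_000)]
text = "\n".join(lines)
print(f"synthetic hosts file: {len(text) / 1024:.0f} KB")

start = time.perf_counter()
table = {}
for line in text.splitlines():
    address, name = line.split()
    table[name] = address
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"{len(table)} entries parsed in ~{elapsed_ms:.1f} ms")
```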

 

I don't see a way to start a web tracing tool in the F12 developer tools in IE8 to give millisecond accuracy, but in all observable ways Windows XP is working equivalently with or without a half-megabyte hosts file.  If you want to discuss older systems or smaller hardware, reasonably speaking that's outside the scope of this thread.

 

-Noel

