• Disk cleanup and performance improvements?


    #481659

    Cleaning up a hard drive and improving performance is a frequently discussed topic, but rarely with any specific target or root-cause explanation.

    Years ago, as a young greenhorn, my mentor gave me an important tip: “if you don’t understand the problem you can’t be certain to have fixed it”.

    So, with that in mind, I would like to drill into some detail.

    Before I begin, please understand that this is not an attempt to detract from the reported improvements, but to discuss why such cleaning has a positive effect so that others can benefit.

    OK, here goes:

    My understanding is that a system can only be slowed down in a very few ways:

      - CPU bottleneck – not enough processor power to perform the tasks required
      - I/O bottleneck – fragmented or full disks, bad drivers or other latency while waiting for data from a storage device
      - Network latency – waiting for a network resource response, be it from the WAN or LAN
      - Insufficient memory – causing excessive disk swapping (arguably an I/O bottleneck)
      - Poorly configured system preferences – asking for system resources that do not exist or cannot support the request – e.g. graphics capabilities

    Each has its own sub-divisions: for example, network latency may be caused by a malware infection or a hardware issue. Anyway, assuming I have not missed anything big (please correct me if I have), I would like to reflect on those against the observed performance increases obtained by cleaning and maintenance as suggested.

    Clearly there has been an increase in performance observed, but why? What is it about the removal of log files, setup files, temporary internet files, etc. that makes the system run faster?

    If I might offer a few of my own answers to see if they fit the bill….

    Example 1. A system that has high disk utilisation will benefit from a disk clean up because free space is generated which can then be rationalized using a defrag, making the drive work less to access the data. In effect reducing the I/O bottleneck. But that should only have a marked effect on a drive that is heavily used.

    Example 2. Many applications come bundled with branding and additional features that are configured to load at startup. This can affect CPU utilisation and memory utilisation, both of which can have a detrimental effect on performance. Being careful during an installation can often allow one to request not to install additional features. Using msconfig or 3rd-party apps can get to the root of the issue, by preventing various startup items from launching. Fine and well, but other than consuming RAM and potentially causing disk swapping, a startup item should not generate a CPU bottleneck after it has loaded.

    Example 3. Browsing the internet may be more responsive if the Temporary Internet Files are rationalised, but the purpose of these files is to provide a local cache to reduce network utilisation, so if there is no local cache, the system must download the files from the web server again. That suggests that a total removal of temporary internet files may actually have a detrimental effect. On the flip side however, if the drive is getting full, removal of these files may improve the disk performance as in Example 1. Also, if a corrupt file is located in the cache, removing it would be beneficial to prevent errors.

    Example 4. Network latency is not something that removing data from a drive should affect, but perhaps refreshing the TCP/IP stack may have an effect if one is experiencing slow network browsing or access of data on a network share.

    Example 5. A poorly configured system can often be helped by resetting some parameters to default, for example Internet Explorer settings. This isn’t really disk cleanup or removal of data, but has been included for completeness.

    So, where does that leave us?

    There are observed performance increases by regularly maintaining a system and/or data drive, but why?

    What are your experiences and understanding?

    • #1320398

      WOW, a thread that deals with probably my MOST favorite subject, “Garbage Removal” and system performance. (my speciality)

      “Disk Cleanup” is an MS app that has been around since the Windows 98 days, and is ignored by 99.9999% of users.
      The ones who DO use it are the ones where I put it on their Desktop as a part of a weekly Maintenance Routine. (that is IF they do their maintenance)

      I run Disk Cleanup from a desktop shortcut, like this:

      %SystemRoot%\System32\Cmd.exe /c Cleanmgr /sageset:65 & Cleanmgr /sagerun:65

      But, in this form, it has to be RUN by the user, by clicking on the [OK] button after the program starts.
      At least, once I set the things to delete, the program remembers those so the user only has to click on the [OK] button to run the program and remove the preset files. I always tell them: “Don’t change anything, just click the [OK] button.”

      Now, I’d love to run that line from within my own XPCleanup.bat program, but I’m stuck wanting to send an [ENTER] keypress to that command, to operate it automatically.

      So maybe someone knows how I can cause the above command to run within a batch file with NO operator intervention, by simulating an [ENTER] keypress.

      I hope my post will be found to be in line with the original post and not as topic Hijacking.

      Cheers Mates!
      The Doctor 😎

    • #1320401

      TIF removal speeds things up because our system does not have to look at all that garbage first before going online and getting new pages. This sounds pretty simplistic, and I am NOT a network guy at all. Many of us now have fast cable connections, at least here in the US. It is faster for us to go to the web and load a fresh page rather than find the old page in history (and hope it’s not corrupted), then go out and find a new page, load it, compare the two, then refresh the page and utilize resources to do all this while we are doing 3 other things with our PCs. At least IMHO.

      Same thing with all the other garbage that collects on our PCs. Windows tends to like to collect GBs of old garbage. It’s like a hoarder on steroids. I can see keeping some of the newest log files, but many have log files from months or years ago as DrWho has suggested, then wonder why they are running short on disk space.

    • #1320402

      Okay, you asked for it.

      1: Disk “Utilization” has IMHO to be separated in three subsets (talking about PCs only!):

      1a: Disk physically full, that is 90% of space or more used: Much of the effect depends on the file system used. FAT machines used to get perceivably slower around 75% to 80% used space and more. I have seen NTFS systems more than 90% full and still running (maybe not great but running).

      Main difference here IMHO is the methods the OS uses to access the FAT/MFT and from there find free space and access the data.

      1b: File fragmentation naturally raises the number of physical read/write head movements and thus can add lots of latency and accesses to other tracks that would be avoided were the disk decently defragmented. Can “kill” some older PATA drives (remember those 30 and 60GB RLL drives?). Regularly defragmenting was a MUST from the beginnings through NT and early XP times.

      Back then the trick was to have a dependable and efficient defragmenter program. MS’s own defragmenter was the clumsiest I have ever watched.

      For most defragmenters I remember the rule of thumb was that you needed at least 25% free space for them to work well.

      1c: I/O load: Several disk I/O intensive programs running concurrently and pushing the disk drive to its physical limits.

      The only remedies I know for that are either a much faster disk drive (Solid State, IMHO the future anyway) or load balancing/distributing (batch instead of concurrently). Yes, I know about RAID but every disk subsystem has a physical throughput limit somewhere.

      2: CPU bottleneck: Generally I see on plain vanilla usage home computers only very few CPU bottlenecks. I leave installations of Service Packs out of scope (W7 SP1 anyone?), they happen too rarely.

      Older XP machines with less than a 2.2GHz P4, older Celeron or Turion CPUs just are sluggish, with almost any amount of RAM. Example with equal RAM, a 2GHz P4 and a 2GHz Celeron: the main difference is the P4 has, depending on the model, twice to four times the amount of L1 cache built in. The P4 feels significantly “zippier”, responses are more immediate. Yes, this is human experience, not scientific measurements.

      So far with most Windows systems I have seen, more RAM has a much bigger positive effect on performance than a faster CPU. Which seems to show that at least some perceived CPU bottlenecks actually are I/O bottlenecks because of paging – see below.

      On modern multi-CPU machines stuff like video editing and rendering or huge spreadsheet computations can show limits of a CPU but I can’t comment since I have no experience in these fields.

      3: Network latency: The only problem I ever encountered (besides outages) is slow DNS resolution and bottlenecks beyond my control (like the Chicago bottleneck). The former I solved with OpenDNS (or GoogleDNS if you want), the latter teaches me patience but occurs increasingly seldom.

      My LAN is under my control and thus should never be an issue.

      4: Insufficient memory: Again I believe this is two related but separate issues.

      4a: RAM
      Six or seven years ago XP ran fine with 512MB. The same machine now needs 1GB minimum, better more. For Vista and Win7 4GB IMHO is the minimum for satisfactory performance.

      Do you have any idea how many machines are out there with 3GB dual channel memory? And all because of the limitation of the 32-bit architecture.

      Supposedly we get the full speed of dual channel only when the chips are paired equally by speed and size. So the manufacturers throttled memory accesses knowingly. Ridiculous.

      4b: Page file size
      Have your customers ever called because of the “Insufficient virtual memory” message?

      Microsoft still hangs on to an IMHO ridiculous formula for the page file size. IMHO this always was way too low. I don’t know for how many years on XP systems I always set page file size >=2GB or more with no difference between minimum and maximum sizes. Dynamic shrinking and enlarging of the page file are just additional I/Os.

      If disk space is tight I get the space easily by limiting the space for System Restore significantly. This defaults to 15% of the gross disk capacity – which in turn is an absolutely ridiculous value on modern disks of many 100s of GB.

      On Win7 I have not ever even looked at it – yet. :o:
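
      For what it’s worth, a rough command-line sketch of setting a fixed page file of the kind I mean (equal minimum and maximum, 2GB here). On XP I do this through System Properties > Advanced > Performance Settings > Advanced > Virtual memory; the WMIC lines below are only an illustration, assume WMIC is available, and on Vista/7 automatic page file management has to be switched off first:

        rem Vista/7 only: stop Windows from managing the page file size automatically
        wmic computersystem where name="%computername%" set AutomaticManagedPagefile=False
        rem Set a fixed 2GB page file (values in MB) so it never shrinks or grows
        wmic pagefileset where name="C:\\pagefile.sys" set InitialSize=2048,MaximumSize=2048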

      5: System preferences
      What IMHO is most important I have already mentioned:

      As much RAM as the motherboard will hold (within reason…).

      On XP: Reasonably large minimum page file size.

      On FAT systems keep the disk de-fragmented or convert to NTFS (only after chkdsk and sfc!). And did I mention to keep the disk de-fragmented?

      Especially in older XP systems with certain SiS and/or VIA graphics chip sets, turn ALL visual effects off! Much of that eye candy seems to be done by the CPU. Grrr.

      Regularly remove leftover temporary files. CCleaner comes to mind as the tool of choice for grown-ups, Auslogics Temporary File Cleaner for my average customer.

      ———
      I wonder what all I forgot to mention. Can’t wait for comments, suggestions and the chiming in that seems hard to avoid in these discussions, as in “I am more right than you”.

      • #1320408

        Thanks guys, keep the thoughts coming: none of them are new to me, but all thought provoking.

        @ DrWho: I always read your disk cleanup posts with interest and it certainly seems to work for your customers. Can you expand on why removal of junk improves performance? What is the system doing that causes the junk to slow it down. Junk it may be, but unless it’s doing or preventing something it shouldn’t impact performance (that’s not to say it should be kept!). The obvious answer is it’s consuming too much disk space slowing down access to data that’s needed, but surely that can’t be the case every time as that would suggest most drives you come across are nearly full.

        @ Ted: Re Temporary Internet Files…..are you sure it takes less time to query and download a file off a webserver than to pull it off a hard drive? Perhaps in some cases, but that would imply a very fast internet connection. I also sometimes see improvements in browsing experience by cleaning up the TIF’s, but have yet to adequately explain them.

        @ eikelein: Very much my line of analysis too: If I might be so bold as to summarise what I think you are saying – it’s not so much the removal of data and disk clean-up: it’s the optimisation after the clean-up that has the most benefit.

    • #1320406

      I am sincerely glad that there are so many “smarter” people than me in this Lounge. As I get older it is getting harder to “fire up” my brain cell (singular). Perhaps it’s because I am not an educated computer tech person and did/do not work in the computer field. I am totally self taught (took a fortran programming course in 1970 as part of my electronics degree, but that’s it) and am still learning. Today I feel as though I have learned more. Now I have to digest some of it. That might be the problem. Guys I am humbled by your expertise. Please keep it up! Have a great day all of you!

    • #1320409

      WOW again!

      That was a long and very detailed dissertation, but essentially correct.

      It’s a little bit “pie in the sky” though, for as a computer tech, when I get a call that the resident PC is taking a half hour to boot up and is so slow to shut down that the user is just killing the power and walking away, I’m very limited in what I can do.

      I can’t add RAM, I can’t install an SSD drive, nor can I change the CPU, so what CAN I do?

      Well, I can clean up the mess and get the PC pretty much back to like-new condition, with a few improvements.
      Even with a brand new PC, right out of the box, I can improve performance by over 50%, because Windows XX is written with so many SAFE defaults that it only runs at a small percentage of the capability of the hardware. (CPU, Ram & Hard Drive)

      Without going into the details of where I cut and trim, I clean out every program and unneeded file on the HD that is just loading down the PC. I set up my own XPCleanup.bat program, (or VICleanup.bat for Vista or W7Cleanup.bat for Win-7) in the Startup folder, so that every time the PC is turned on, the junk from the previous session is removed, in just a few seconds.
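
      For anyone wanting to roll their own version, here is a bare-bones illustration of the idea. The paths below are only examples, not my actual list, so check anything you add by hand before letting a batch file delete from it automatically:

        @echo off
        rem Example startup cleanup - clears leftover temp files from the previous session
        del /f /s /q "%TEMP%\*.*"
        del /f /s /q "%WINDIR%\Temp\*.*"
        rem Add further verified junk-file locations here

      Paths differ between XP, Vista and Win-7, so each line needs adjusting per OS, just as with my own XP/VI/W7 versions.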

      I give the user a “Weekly Maintenance” routine and an instruction sheet on how to do it, which includes running the “Disk Cleanup” program in extended mode, as I outlined earlier, followed by a defrag. Regardless of what anyone thinks of the MS Defrag, it is probably the safest defrag program out there. *
      * NEVER RUN DEFRAG BEFORE REMOVING ALL THE JUNK FROM THE HARD DRIVE. (WHY DEFRAG JUNK?)

      Just one time, not too long ago, I ran the Auslogics Defrag program. It made such a total mess out of my HD that I had to do a Backup + Restore to get my drive back in shape again. That program had my files scattered from one end of the HD to the other. NEVER AGAIN!

      Ok guys and dolls, I gotta run, but “I’ll be back!”

      Cheers Mates!
      The Doctor 😎

    • #1320411

      Tinto, I guess it’s more that the old page gets queried and loaded while the web site is queried, then the new page is loaded. The PC is working finding the correct old page at the same time as finding the correct web site to get the new page, rather than just getting the new page from the get go. The other thing is, and perhaps this is just my paranoia, having this old info sitting on my PC is showing where I’ve been, perhaps holding some of my personal info or sign-on info, just waiting for someone to break through and grab it. Paranoid, you betcha, but then I have both a sister and a sister-in-law who have had their info stolen and have had terrible times solving those problems. I cannot swear this is what happened to them. No one can, so yes I’m paranoid and I wipe everything regularly.

      And yes I do have a fast connection. Time Warner Road Runner, approx 15.5 Mb/s download, 1.5 Mb/s upload. Pretty fast.

      • #1320412

        Nothing wrong with being safe rather than sorry Ted.

        My understanding was that the Temporary Internet Files are searched for a cached copy of the page and only then if one is not found is the webserver queried. I think that’s how it used to work, but not sure nowadays.

        So, if that’s right, the size and quantity of TIFs may have a bearing on the time taken to respond to a web request, but I would have expected the indexing to be able to cope with that. Perhaps the indexing isn’t as good as it could be? Your broadband connection certainly is fast, but it is still much slower than reading data off the drive.

        As previously noted, I too have seen improvements by cleaning TIFs, but the cache is meant to make pages load faster, so it’s counter-intuitive that emptying the cache makes the online page load faster.

    • #1320443

      As another data point, I only clean up TIF and temp files once or twice a year and don’t experience any slowdowns or speedups with the cleanup of gigs of temporary files. But then my system disk is 70% free space. Just removed 1.5 gigs of temp and TIF space on a client XP laptop with no effect on performance. Cleaned up the number of programs in Windows startup and that had considerable positive improvement in performance. In my experience, the only time temp file cleanup has had an impact on performance is when the disk is close to full (80% full or more).

      Things I have found that improve performance:
      1. Removing unnecessary startup items from the Windows boot.
      2. Malware removal or repair of corrupted system files.
      3. Disable Windows eye candy (Window animations, transparent background)
      4. Add memory if less than 1 gig on XP, 3 gigs on Vista/Windows 7
      5. Temp files only if the system disk is close to full.

      Jerry

    • #1320448

      Just a few hasty and random thoughts:

      From the disk performance aspect, having enough free space and keeping fragmentation levels low, so that there’s ample contiguous free space for streaming video temporary storage, for example, is a good starting point. The 25% free space factor is pretty close to that ideal; just look at how disk drive performance drops off when reading/writing in the last 25% or so of a spinning drive. So, if you add a few GB above that for big temp files, you should keep the drive performing close to the optimum. Say around 30% free space is good; less would need some serious attention.

      As for TIFs, I think they’re in a different category as they are (usually) very small files; they don’t require contiguous space and Windows will place them ‘randomly’, probably in the first available space closest to the head’s current position(s). On a sessional basis, many are also duplicated in the paging file and will be pulled from there for reuse (or usually from a ReadyBoost drive, if that’s in use).

      For borderline disk I/O bottleneck cases, some tweaking of background NTFS data read/writing might improve things a little, such as disabling the creation of 8.3 file names.
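
      For anyone curious how that particular tweak is applied, it’s a couple of lines at an elevated command prompt. Treat it as an experiment rather than a guaranteed win; it only affects newly created files and a few old installers still expect the short names:

        rem Check whether 8.3 short-name creation is currently enabled (0 = enabled)
        fsutil behavior query disable8dot3
        rem Disable creation of 8.3 names for new files
        fsutil behavior set disable8dot3 1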

      Of course, if the machine has multiple drives and to a lesser extent, ReadyBoost and/or RAM drives further tweaks can be made.

      • #1320758

        A good, fast, well maintained PC does not need to rely on TIF’s to get to where you want to go. Eh?
        Any time lag of downloading a web page should be inconsequential.
        Just like you shouldn’t need a road map to go to work each day.

        I’ve cleaned up PC’s where the TIF’s go back years, , , hundreds of thousands of them. Windows will remember places you went to just once, years ago. How important can that really be?
        And that’s only the tip of the iceberg. There are other programs, like AV and AS software that constantly create .log and .bak files that just never go away, , , EVER!
        AOL, for instance, is horrible about creating garbage and then hanging on to it forever, and if it’s updated, the old versions never uninstall or just go away.
        One day, I worked on a PC that had nine versions of AOL installed on it. Just one version of AOL is bad enough, but nine?

        I just cleaned up an old Dell a few days back, that would no longer defrag, due to lack of free space on the HD. I sat down with the owner/user and I took 20 gigs of just pure old-fashioned CRAP off of that HD. A lot of that was old programs that were not needed or ever used. Most were put on there by Dell.

        I search through my Hard Drive for those “Fibber McGee’s Closets” and add those paths to my cleanup batch file. The programs that create all that clutter don’t seem to mind me taking out their garbage. 🙂

        Microsoft programs like Disk Cleanup only address MS-created files. They do nothing about the internet cache (TIFs) created by Firefox or other 3rd party browsers.
        That path will be different for each user, so you just have to search it out and then wipe it out.
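
        For the record, here’s a hedged example of what such a line might look like for Firefox on a Vista/7 machine. The profile folder name is random for each user, and newer Firefox builds keep the cache in a differently named folder, so verify the path on the machine in question before using anything like this:

          rem Clear the disk cache in every Firefox profile found under the user's Local AppData
          for /d %%P in ("%LOCALAPPDATA%\Mozilla\Firefox\Profiles\*") do del /f /s /q "%%P\Cache\*.*"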

        My newest cleanup batch file includes the line to run the Extended Disk Cleanup in the automated mode as the last line.
        I’ve experimented with installing it on XP, Win-7 and even Win-8/DP and it does work.
        The batch file itself needs to be customized for 7 and 8, as many of the paths are different than on XP.
        That too, is still a work in progress.

        My list of services to disable is just a suggestion, not a word from God. So it’s up to each person to only do what they feel comfortable with.

        I still suggest that anyone wanting to minimize their running services, consult the listings at Black Vipers web site and stay within the SAFE zone.

        Cheers Mates!
        😎

        • #1320764

          A good, fast, well maintained PC does not need to rely on TIF’s to get to where you want to go. Eh?
          Any time lag of downloading a web page should be inconsequential.

          So by implication you are saying that it is slightly faster if the Temporary Internet Files are present. That’s how I would expect it to be, but appears in contradiction to anecdotal observations.

          I’ve cleaned up PC’s where the TIF’s go back years, , , hundreds of thousands of them. Windows will remember places you went to just once, years ago. How important can that really be?
          And that’s only the tip of the iceberg. There are other programs, like AV and AS software that constantly create .log and .bak files that just never go away, , , EVER!

          I just cleaned up an old Dell a few days back, that would no longer defrag, due to lack of free space on the HD. I sat down with the owner/user and I took 20 gigs of just pure old-fashioned CRAP off of that HD.

          A really good clean up there Doc. Old machines fill up with user data and other stuff, but what is it about the quantity of data that makes the machine slow? Yes, more data will extend times to run AV scans or defrags, but to slow a machine something must be impacting CPU, I/O, network or some other resource. Data that’s old and just sitting there is probably not required, but until the drive is nearly full, how does it slow the machine…what is the mechanism?

          I still suggest that anyone wanting to minimize their running services, consult the listings at Black Vipers web site and stay within the SAFE zone.

          A good site that. Recommended, but use with caution unless experienced.

    • #1320454

      For borderline disk I/O bottleneck cases, some tweaking of background NTFS data read/writing might improve things a little, such as disabling the creation of 8.3 file names.

      That’s an interesting one satrow, and one that I hadn’t considered previously. {For the casual reader: 8.3 filenames = Short File Names}

      @ Jwitalka: Jerry, your bullet point list is pretty much what I subscribe to too.

    • #1320458

      Naming no names, but I see many answers along the lines of “This is what I do with MY computer”. (That’s fine! You do that.)
      That’s totally immaterial, when you’re a computer tech who has to deal with hundreds of computers, some of them not having any service for maybe the last ten years.

      I recently ran my XPCleanup.bat program on an OLD PC that essentially had never had any service. Bootup took many minutes and shutdown was well near impossible. A defrag took hours and hours and simple scans like those from an Anti-Virus or Anti-Spyware program were nearly an all-day affair.
      Anyway, when I ran my little cleanup batch file, it removed over 100,000 garbage files.
      You say that a temp file or TIF doesn’t slow down a computer, but it sure as heck WILL if you ever do any maintenance or have a Good AV or AS program that does a full scan of the HD every day.

      That’s why I get every file off of the HD that doesn’t absolutely HAVE to be there.
      It’s not just to save space, although it does, but it speeds up many operations of the PC.
      I just took over 20 gig’s of junk off of a PC this past week. The owner of that PC was elated at how much better his PC ran once I was done working on it.

      The Doctor 😎

      PS: I’m still looking for a way to run the MS “Disk Cleanup” automatically from a batch file. Any help would be greatly appreciated.
      Here’s the line I need to run:
      %SystemRoot%\System32\Cmd.exe /c Cleanmgr /sageset:65 & Cleanmgr /sagerun:65

      Addendum: The above mentioned problem has been solved by adding “Echo. |” (sans quotes) to the beginning of the command.
      Sageset only needs to run once to set the parameters for Disk Cleanup and then sagerun will do the same job over and over again. I have it now running nicely.
      Problem Solved!
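
      In case it helps anyone else trying the same thing, the last line of the batch file ends up looking essentially like this (assuming the files to delete were already chosen once with /sageset:65, as described above):

        rem Pipe an ENTER into the command so Disk Cleanup profile 65 runs with no operator intervention
        Echo. | %SystemRoot%\System32\Cmd.exe /c Cleanmgr /sagerun:65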

      • #1320472

        Naming no names, but I see many answers along the lines of….

        I would prefer to leave the debate about whether one person’s method is better than another for a different time: it could easily lead to misunderstandings. Suffice to say there are probably as many different experiences as methods.

        No, what interests me is the root cause for poor performance, what improves that performance and whether “disk cleaning” is part of that solution. By determining a root cause we can understand how the improvement methods work, and perhaps, why they are effective in some cases, but not in others.

        So, if I pick up the point by DrWho above that leaving TIFs or other temp files on the PC will slow it during an AV scan, I’m led to ask how so?

        I can understand that it will increase the scanning time and by rationalising the TIF’s one undoubtedly reduces the overall scanning time. However, on an individual file basis, scanning a temporary internet file should be no different from scanning any other file. So the machine should be more or less equally loaded scanning operating system files as it is scanning TIF’s. As such, the system load should not be any different compared to scanning other files.

        Having said that, I wonder if the small size of them may make the drive work harder opening many more small files – i.e. the I/O throughput may be limited by having to process large numbers of small files. Anyone have thoughts on that?

        While speculating on performance improvements: How about the Windows (7) defrag compared to 3rd party? A defragged mechanical drive is normally considered essential, but presumably the different tools implement different algorithms. Does anyone have any evidence of improved performance after using different tools?

        • #1320546

          To all previous posters:

          THANK YOU!

          1: Most important and THE NUMBER ONE thing I totally forgot to mention in my long, effusive previous post:

          CLEAN UP THE STARTUP programs! Everything non-essential has to go. Especially on older XP systems with 1GB RAM or less and on Vista/7 with only 2GB RAM that alone often has miraculous effects.

          And for me the central point here is: MSCONFIG and CCleaner and the like are NOT sufficient at all! I use CCleaner a lot but only for the first shot at cleaning up startups.

          Especially after removing any of the well known AV or security suites I ALWAYS find left over drivers and/or services running! The older the basic version of the removed tool was the more left-over drivers and services show up.

          My tool of choice for this is AutoRuns. Yes, I know, this definitely is a tool for grown-up people only. But then, I am not “the kid next door who knows computers” either.

          What just came to mind – but I have never tried it before – is using Revo Uninstaller, letting it do its Hunter Mode thing and then checking drivers and services with AutoRuns.

          Isn’t it fun that there always seems to be a “next time” to look forward to? Even at my age!

          2: Important on slow CPU, old XP machines:
          Always, always set the Internet Cache size down to 50MB.

          XP by default seems to allocate space for TIFs by some weird percentage of the raw disk capacity. And since HDDs have grown almost exponentially over XP’s lifetime, we end up with cache sizes that are slowing down the system, by Microsoft’s own admission; see below. History, history…

          We all have seen this message I assume [screenshot of the warning attached].
          And in the dialogue to actually set the value, MS says as much [screenshot attached].

          Please ask Microsoft, not me, why too large a cache slows down the system. I could speculate a lot but I ain’t good at that.
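
          For those who prefer to script it rather than click through the Internet Options dialogue, the cache limit can also be set in the registry. The key below is the commonly cited IE location on XP-era systems but it can differ between IE versions, so treat it as an assumption to verify, not gospel (the value is in KB, so 51200 = 50MB):

            rem Set the IE Temporary Internet Files cache limit to 50MB for the current user
            reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings\5.0\Cache\Content" /v CacheLimit /t REG_DWORD /d 51200 /f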

          Cheers to y’all.

        • #1321490

          Have read the first couple of pages of this discussion and it’s filled with good advice. But what about a good cleanup/tuneup routine for SSDs? We’ve upgraded two of our home PCs with an SSD. We did the basic step of enabling AHCI mode in the BIOS before installing Windows 7 Home Premium 64-bit. Windows was smart enough to turn off the scheduled task Defrag, and we also disabled Superfetch in the Services window (recommended by a tweaky friend!).

          Our basic maintenance is to run CCleaner about once a week. Great little cleaner, that.

          But what else should we do to keep our SSDs running in top form? Are there other things to turn off? Or, turn on? Or move to another drive, such as Temporary Internet Files, temp files or .log files? And where is the best location for them?

          Is there a guide for all this written in plain English somewhere?

        • #1321989

          On a SSD, unless the drive is essentially completely full, removing junk will not necessarily improve performance. Seek and access times do not depend on fragmentation or amount of stored data on these drives if they are properly configured.

          Windows performance with a mechanical hard drive is limited by I/O (USB drives have very slow transfers of large files, for example), how full the drive is (seek times increase with additional data, fragmented or not) and how often the system needs to read and write to/from the Paging Files instead of physical RAM. If the drive light is not constantly on (“thrashing”) there is sufficient RAM in most cases. And there are other factors which affect the efficiency of drive performance under Windows.

          Generally, Windows does a very poor job of managing its use of hard drive resources, and Windows does not clean up junk as well as it could if Microsoft cared about such things. Part of the issue is the Windows File Systems, but there are other, more fundamental inefficiencies. Linux, for example, makes much smaller and less fragmented files than Windows.

          While fragmentation does slow down seek and access times a bit, much more significant is the sheer amount of data on the drive. And the file structure has a considerable effect, as Windows Explorer (not Internet Explorer) is not very efficient at handling excessively deep file structures (folders nested within folders deeper than about seven levels).

          If a folder’s contents need to be displayed on the Windows Desktop as icons or thumbnails, excessive numbers of files per folder will slow down Explorer as well. These folders full of many files will create an Icon Cache which will need to be read or updated each time Explorer opens the Folder or saves files to that folder. Turning off thumbnails relieves a lot of the system resources demand which the Thumbnail Cache can impose when a folder has a lot of files.

          When Windows starts up or shuts down, or when switching from one user to another, the Registry must be read and updated, and Windows and Program Services must be started and shut down. Even without conflicts, this process can become very lengthy as the number of Services increases. Hence the need to limit background processes and Startups. CCleaner can help manage Startups with one of its Tools items.

          Background processes which can slow Windows include scheduled or automatic updaters, which are practically ubiquitous these days. Scanners have also proliferated, and these are not limited to anti-malware scans. And then there are all these Web Services which sit around doing nothing useful while a program is not being used. Lots of stuff in the System Tray and in the Running Processes.

          All of these things can be trimmed or cut out, and system performance will improve. In some extreme cases, performance makes dramatic gains when Windows is de-cluttered. While I might not go as far as DrWho in trimming the bloat, I have put my laptops on pretty strict background process and startups diets.

          All of which does not stop my Windows XP laptop from taking forever to start up MSE (single core system). And if I set up any Group Policies in Windows 7 Home Premium (which the WinBubbles utility program does), the system seems to take forever to shut down.

          -- rc primak

    • #1320463

      That’s so true DrWho. Any time you run a complete scan, all the junk files will be scanned as well, including all the old TIFs as well as the many years’ worth of old log files, etc, etc, etc. And all this stuff has to be defragged whenever you do defrag, etc, etc, etc. I just feel more warm and cozy getting rid of this stuff. I did notice marked improvement on my mom’s PC a while ago when I started getting her to do the cleanup stuff on her PC. DrWho, I did set up a Cleanup batch file for her and showed her how to do other cleanup and defrag chores. The first time I did this stuff she could not believe how much more responsive her PC was. She also could not believe how quickly web pages loaded. No, I cannot explain why it works, but getting rid of the old TIFs did speed things up.

    • #1320515

      Installing programs at the limit of or beyond your hardware’s capacity
      Another big problem with slow computers, especially with the novice user, is the tendency to over-install programs for which, in many instances, the physical hardware is just not up to the task. Not merely installing programs which the computer’s hardware may be just barely able to run, but having every feature of them running at a full “on” configuration.

      “Buying cheap assembly line computers with paltry hardware and expecting to throw everything at them and have them run everything smoothly.”
      Use and run your computer within its means.

      Failure to maintain the computer
      (The number one novice trap)
      Basically this is much of what was previously mentioned: not cleaning up the junk that accumulates over time and failing to uninstall the programs that have long since gone unused. These are definitely among the root causes.

      Poor Computer Habits and Bad File System Organization
      One of the biggest things I see is the tendency toward disorganization.
      *Dumping everything onto the desktop is just one good example of poor organization.
      *Having a very large amount of your personal data and files intermixed with the operating system will produce the same effect as junk files accumulating over time. You’ll get a gradual slowdown of everything, which might even seem imperceptible at first.
      *Flat out overt laziness; doing everything & anything to avoid a clean install. Just prolonging the inevitable imo.
      Windows 7 is the most secure and sophisticated MS operating system to date, but that does not mean a clean install is no longer needed. Novices are the ones who need it most.
      Many of us have learned how to use and maintain computers by our own mistakes; we’ve fudged them up, have learned how to do clean installs and have learned from our mistakes. In a way we deny others this in order to impart to them an “instant gratification”.
      Sometimes we need to realize that helping too much makes people stupid.
      …Then again, there will always be a subset of people that will never be computer literate.

    • #1320548

      For those of us who are the “kid” next door (OK I’m using the term pretty loosely here) What’s In Startup is a very easy to use app for stopping those apps that always want to load at Windows startup. Eike I also have Autoruns.


      • #1320554

        For those of us who are the “kid” next door (OK I’m using the term pretty loosely here) What’s In Startup is a very easy to use app for stopping those apps that always want to load at Windows startup. Eike I also have Autoruns.

        Yes, you are correct. That’s one of the tools I referred to as “… and the like…”.

        But it does not show all the drivers and services as AutoRuns does, does it?

          The previous was a serious question because I don’t use What’s In Startup regularly, exactly for this reason.

        • #1321033

          Tinto, I wonder, if you’ve noticed one particular brand of PC as being exceptionally bad about putting out hardware that will barely even run the installed OS? (I have!)

          Ted,
          Why don’t you install Team Viewer for your mother, the next time you’re there? You could do her weekly maintenance and some repairs from your home. I’ve done that with several of my friends and customers, just to fix little glitches that pop up from time to time. I’ve even done a complete tune-up on a PC in Ca and one in Ks, while I sit here and chat with them via Google Phone.

          Laters!
          😎

      • #1322011

        For those of us who are the “kid” next door (OK I’m using the term pretty loosely here) What’s In Startup is a very easy to use app for stopping those apps that always want to load at Windows startup.

        Here sits another “kid next door”, residing in one of the back alleys……..

        I am so thankful for all of your thoughts, from ALL posters/Loungers. Now to describe my situation: computing (using electronic devices) was part of my 3rd or 4th round of education, very late in my life. Saying this, it means… I learned how to USE those pieces of hard- and software, but everything beyond that I keep my fingers clear of…

        Nevertheless, the main subject of this thread, a slow machine, also is true for at least 3 of my PCs. I am eager to learn how to handle all the suggestions you all described here. Ted, Eike, Dr.Who, Tinto, and all others, THANK YOU. The shortcoming (my shortfall) is, I have no clue what to remove safely (is it the temp, cache, startup, etc.?) or even how to access those locations. I confess, I know pretty well how to operate my PC to get the results and the services I want, but… OK, I know I need to get this done, or simply buy another new machine… (that’s what I have mostly done up to now).

        On the other hand, now I have much more time to “waste” (that’s what Miss Alpine, my better half, calls it) so I will try to speed up (or trash) her little old PC first. (She announced last Christmas it would be time to get her a faster machine.)

        My question is: is there a sort of a list/handbook/walkthrough? You all have the basic knowledge to do those important steps in an instant without much effort; for me it’s a FIRST, and I want to do it right.

        Thanks for any guidance that gives me a rail to humble along.

        Have a nice Sunday

        ••K®IS••

        PS: pardon my english

        • #1322019

          …I am so thankful, for all of your thoughts, of ALL posters/Loungers….

          You are welcome – Gern geschehen!

          …the main subject of this thread, a slow machine, also is true for at least 3 of my PCs… I have no clue what to remove safely…
          …or simply buy another new machine… …I will try to speed up (or trash) her little old PC first…

          I will send you a personal message so we can take it off the forum; we will be off topic in an instant.

          …is there a sort of a list/handbook/walkthrough/?

          Not that I know of.

          PS: pardon my english

          No need to be bashful. I find your English better than some local posters’…

    • #1320555

      That is correct. But for the average user that really is afraid of tinkering too far, What’s in Startup is an easy way to start the process and get their feet wet. Once they get that taste, perhaps they will venture further and say “Hey, that wasn’t hard, maybe I can do this” and take some more baby steps. That’s how many of us learned how to do this. Start small and gradually build up to the more complex stuff. Build up our confidence.

    • #1320558

      I already know that some “Expert” will say it doesn’t make any difference, but one thing I do when I set up a new PC or do a first time tune-up on an older PC, is shut down or put into Manual mode, all those redundant Services that Windows installs by default. I’ve mentioned SAFE defaults before but it’s really one of my pet peeves.

      To make sure I don’t miss one, or set something wrong, I’ve committed everything to a batch file, that I call
      SetServices.bat
      It only takes a few seconds to run and DOES improve performance.

      In XP I can shut down as many as 11 to 24 services and in Win-7, the number stays at 24, but that’s still a work in progress. 🙂

      A good source for information about that can be found on “Black Viper’s website”.
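
      To give a flavour of what SetServices.bat does, here is an illustrative sketch only, NOT my real list. The two XP services shown are just common examples from the Black Viper style listings, and setting things to Manual is the gentler option if you are unsure:

        @echo off
        rem Set a couple of non-essential XP services to Manual, then stop them if running
        sc config Alerter start= demand
        sc config Messenger start= demand
        net stop Alerter
        net stop Messenger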

      If it only makes 1% improvement in performance, that’s 1% we didn’t have before. Eh?

      Cheers Mates!
      The Doctor 😎

      • #1320586

        DrWho – All those 1%’s can definitely add up: and as long as you have grabbed the 2, 3, 4 and 5%’s first then it’s very worthwhile.

        I assume in the past Microsoft profiled what is expected to be used by a range of users and set the services to support those to start automatically.

        One point though: the list of services you disable (I assume I’m looking at the correct document on your site – services1.rtf ?) includes Performance Logs & Alerts and System Event Notification. I would counsel against stopping those services as they can be highly effective in debugging hardware or other system problems. For example, without Performance Logs & Alerts the Event Viewer probably won’t work, which would be a real problem for debug.

        I certainly can’t argue against the others, given the very important caveat you note at the top of the document – only if the machine is not networked.

        Have you measured any decrease in memory allocation or other performance indicator that can be judged across a range of systems?

      • #1320590

        Please note:

        The following should only be tried if you like crashing your computer or seeing error messages pop up. A lot of experience and some trial and error is needed to try these things out – if in doubt, don’t do it!

        In XP I can shut down as many as 11 to 24 services and in Win-7, the number stays at 24, but that’s still a work in progress. 🙂

        Noob 😉 getting XP down to 9 is possible if you’re troubleshooting offline, 23 is my normal in W7; got Workstation turned off yet?

        A good source for information about that can be found on “Black Viper’s website”.

        If it only makes 1% improvement in performance, that’s 1% we didn’t have before. Eh?

        Not solely for performance reasons, the less you have loaded or accessible, the smaller your profile is to potential attacks – if it ain’t loaded (as in it’s disabled) it’s pretty hard to squeeze malware into it.

        Tinto “without Performance Logs & Alerts the Event Viewer probably won’t work … ” Event Viewer works fine for me with Performance Logs & Alerts set to Manual.

        • #1321454

          A PC is a complex piece of kit and ideally needs careful setup and a range of maintenance actions to keep it in top form.

          That’s exactly where I’m at!

          No one would dream of getting into a new car, putting the pedal to the metal and keeping it there till the car crashes, but that’s EXACTLY the way 99.9% of the world run their computers. Without the expense for gasoline, of course. :rolleyes:
          It’s turn it on, run every conceivable program at the same time and then let it run till it crashes.
          Even worse are those who run their computer like that without the benefit of any AV or AS software, to keep it safe.
          I could go to jail, in some countries, for even telling you what I’d like to do to people like that.:mad:
          Computer Abuse, falls right in line with animal abuse or Child Abuse.

          Enter The Doctor: I spend a lot more time setting up a new PC than any other tech I know of. Two to three hours is typical.
          But even after what seems like a lifetime of trying to get computer maintenance fully automated, I’m still learning of new ways to do things.

          One day I showed a local computer tech how I set up and tune a computer. His response was not what I expected… he said, “Oh, I’d never do that… it would make a computer too reliable and my customers would never have to call me back”.
          All too many PC techs share that sentiment, and PC repair shops are getting as common as gas stations or 7-11s.

          With a little help, I’ve just recently fully automated Microsoft’s “Disk Cleanup” so it can run daily, or whenever the PC is rebooted.
          For those hard-headed customers that refuse to turn their computer off at night when they go to bed, I have an automated way to do a reboot at some wee hour of the night. That saves and refreshes the Registry stored on the HD, and can run an automated cleanup routine to keep the PC free of garbage file buildup.
          Even a PC used as a server or FAX receiver, can benefit from a daily Re-Boot. In mainframes, that’s called “Bouncing the System”. They do that as a part of a scheduled maintenance routine.
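
          For what it’s worth, a sketch of how such a wee-hours reboot can be scheduled from the command line. The task name is just a placeholder, the syntax shown is the Vista/7 form, and on XP the start time may need to be given as 03:00:00:

            rem Create a daily task that reboots the PC at 3:00 AM with a 60-second warning
            schtasks /create /tn "NightlyReboot" /tr "shutdown /r /t 60" /sc daily /st 03:00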

          Are we back on topic yet?

          The Doctor 😎

    • #1320591

      Thanks satrow I stand corrected re the Event Viewer – I haven’t tried those tweaks for the reasons you alluded to.

    • #1320646

      I don’t play much with services on my customers’ computers because I find it hard to predict what their future requirements are and see very little performance gain from messing with them. If you do mess with them, it’s safer to set an unneeded service to manual rather than disabled. There is something to be said for satrow’s comment:

      Not solely for performance reasons, the less you have loaded or accessible, the smaller your profile is to potential attacks – if it ain’t loaded (as in it’s disabled) it’s pretty hard to squeeze malware into it.

      but the problem is minimized with decent anti-malware programs. Dr Who’s methods obviously work for him and I have a happy customer base with mine as well.

      Jerry

      • #1320649

        Yes Jerry, running a decent AV and antimalware will cover many potential threats.

        Removing (by disabling) the running Services that are the target of many of these threats is somewhat different, no?

        Like Dr Who, I’m quite happy with my methods, thank you :), I’d not recommend them to all users though, as I noted at the top of my last post.

    • #1320652

      satrow, I tried to agree with you. A good example is disabling networking services, a possible source of malware infections, if you don’t use them. But 6 months down the road, if one of my customers gets a new tablet or laptop and decides they want to network them, I don’t want to get a call asking why it doesn’t work. This whole thread has been very useful but can border on theology, sort of like the free AV discussion under the Windows Secrets column. We just need to keep the discussion civil as it has been and everyone will pick and choose what works for them.

      Jerry

    • #1320669

      Sorry Jerry, my last post didn’t quite work out as intended, I was (poorly) trying to clarify a few things for those that may have been unsure as to what had recently been referred to. If you get time, study the reg. patches that can be created at BV’s site, they may have uses not dissimilar to the old (underused) Hardware Profiles on XP, in that you can enable/disable a batch of Services by just rebooting.

      Anyway, enough of my peregrinations >Thread: Disk cleanup and performance improvements? > back to normal, carry on all!

    • #1320677

      Some random thoughts on TIFs.

      When one starts a PC for the first time the TIF cache is empty therefore everything has to be downloaded. (slower?)

      After a few days of use the cache will probably contain the static parts of the websites one frequents (faster?)

      Later on the cache will become full and older elements will need to be removed (I/O hit and as a result slower?)

      So, by my reckoning, we need a browser that knows where you are going to go before you do!

      • #1320712

        So, by my reckoning, we need a browser that knows where you are going to go before you do!

        In my house, such a browser is called Mrs Tinto Tech!

        Seriously though, I thought the TIF cache has a retention period, which if I understand it correctly, means that files downloaded from sites are only retained in the cache for a predetermined time. That being the case, it follows that the cache should not grow beyond a reasonable size.

        My counter-intuitive observation is that clearing the cache appears to improve browsing performance, but it’s such a difficult subject to define: it could easily be different due to differing websites.

        Wasn’t it Deming who once said “In God we trust: all others must bring data”? I have no doubt that very many if not all comments make a difference, but how to pin that down to measurable effects that we can apply against specific conditions?

        What else then? Memory utilisation, disk I/O, CPU Bottlenecks, network latency, startup programs and services…..I think there is a missing factor that CLiNT was referring to: the human factor. Is it not just as important to devise processes and organisational methods that encourage (if not enforce) better data organisation?

        Perhaps another item also worth consideration is drivers and firmware. SSD’s and SATA3 provide huge improvements, but for most people these are not implemented. Often I come across out of date drivers, particularly chipset. Do people have experience of improved performance from updated drivers – perhaps not a fair question because the answer is probably dependent on the manufacturer etc….but worth asking anyway.

    • #1320785

      There are observed performance increases by regularly maintaining a system and/or data drive, but why?
      What are your experiences and understanding?

      Software

      Normally when we run any particular software application we get the intended and desired result, which is the task at hand for which the software was intended, in a relatively timely manner.
      Occasionally there are problems: software misbehaves, code gets corrupted, the memory in which the code resides gets corrupted; a wide variety of issues related to program interactions can occur, in other words. The languages in which all software, including drivers, is written are far from perfect.
      A good example of this is any piece of software you download and install will have a log detailing the bug fixes from the previous version.
      The very fact that computers need to be maintained is a testament to the above.

      We all know what bad software is, remember AOL?

      Bad software is perhaps the worst of it; most all software applications are far from perfect but will run just as effectively all the same.
      Improper or corrupt uninstall routines leave components behind, that in turn become part of the operating system’s file system tally and may or may not interact in a negative way with other components, …or future components. No way to know for sure. There are so many variables.
      I’m thinking that an accumulation over time will contribute to degradation of overall performance between the application itself and the operating system.
      Maybe this is something only a Ph.D can answer, who knows.

      The operating system is no different from any other piece of written software; Microsoft’s Windows Update is a testament to that.
      Previous MS operating systems were highly prone to degradation by having some or many of their root components replaced or otherwise modified by other software applications and their processes. Although Windows 7 is purported to have made some great strides to correct this anomaly, it truly remains to be seen in its entirety.

      Hardware drivers fall into the same general category as any other piece of software in that bugs are addressed from version to version.
      A driver issue may have a cascading effect simply because drivers are the interactive elements between the physical hardware of the computer and the software environment of the operating system. The mentality “if it ain’t broke don’t fix it” is not fully correct when it comes to drivers. Many times one won’t see the hidden effects of accumulated bugs until their effect becomes catastrophic, i.e. some new variable interacts with them.

      What we commonly refer to as junk is also the accumulative effect of bugs, errant code, and large amounts of potentially useless code that some process somewhere is trying to read through and make sense of.
      If you have a computer that has been running reasonably well with considerable up time, take a look through the event viewer; you may be surprised to see so many yellow flags and errors that you were not previously aware of, that up until now may or may not have seemed to impact performance much.

      The “WHY” of the need to maintain a computer’s performance is rooted firmly in the fact that the technology, which is a constant work in progress, is far from perfect.


    • #1320796

      A good analysis CLiNT.

      As our systems become more complex, either by design or by incremental installations and updates, there are more opportunities for errors. Modern software development and testing procedures are designed to standardise processes, but in the real world there is no such thing as a bug-free system (caveat: I’m not a software developer).

      Some systems are stacked, and a weakness at an early stage or lower level can cause a Tower of Babel. Other systems have obscure parallel interactions that, if we are honest, are perhaps not always fully understood by the developers. These interactions can consume resources, eat CPU cycles, swamp networks and generally limit the system performance.

      So, if I can answer my own rhetorical question posed earlier: it’s not necessarily the quantity of data, but the context and inter-dependencies of that data which can lead to issues. Uncontrolled or poorly managed build up of inter-dependencies can lead to unpredictable consequences.

      A good analogy is Six Sigma, in which processes are developed to a degree of maturity such that they should be statistically defect-free to six standard deviations (99.99966%). When one looks at the complexity of an OS, even Six Sigma leaves plenty of scope for inconsistencies and errors, and adding further complexity builds on that.

      By building systems that are optimised and employing regular preventative maintenance, we can remove many of the causes of these inconsistencies and errors. Hence cleaning per se does not necessarily attack the root cause but can be a very effective containment for systems that cannot deliver stability over the long term.

    • #1320812

      Wheeewwww. And all I said was I think my PC works better when I keep the junk off it. I am humbled by you gentlemen! :clapping:

    • #1320999

      Another big issue when it comes to performance and usability…

      USER INTERACTION & CHOICE

      User interaction probably has the most profound effect on computer performance, since the average user is one of the biggest variables in this equation, along with software.
      Everything from habits that invite infection to choices about what gets installed and run will have an effect on performance, and sometimes no amount of upkeep or maintenance will fix the result.

      As I attempted to touch on earlier, it’s the user who will contribute many of the issues we see, both in the day to day operation of the computer and in the long term.
      One doesn’t need to understand a whole lot about computers in order to use them, but one does need to know some basic precepts.

      This is one…

      Matching Hardware with Expectation & task
      When purchasing a computer for a specific task or set of tasks, whether as simple and rudimentary as surfing the web and sending email or as demanding as playing intensive games, one needs to buy hardware that will get the job done. It would then stand to reason that one should have an idea of the hardware requirements of the task prior to purchase.

      That is not always what happens: we quite often see folks running multiple applications on inadequate hardware, or mismatching their perception of their actual needs against the hardware requirements and cost. Expectations are out of sync with reality through lack of knowledge.
      If one doesn’t understand this basic precept then one will have problems; you’ll never hear anyone complaining that their computer is too fast or too powerful.

      I recently came across a laptop with 64-bit Windows 7 Home Premium installed on it. It had a single-core processor running at less than 2 GHz, and 2 GB of DDR3 RAM at probably 800 MHz.
      This is what I call a marketplace joke, and a very good reason to look and think carefully about exactly what you are buying.
      Who but a novice would buy something like this and hope to do real work with it?

      The laptop in question had quite a bit installed, not to mention full drive encryption, which on a 5400 RPM drive and the above specs can be somewhat taxing in itself.
      Curiously, the user had made some meaningful attempts to improve performance; I noticed 8-10 entries in the startup tab of MSCONFIG were unchecked.
      The performance was still horrible. Cleaning this up means uninstalling programs and ensuring there are no remnants of them left behind. I also had to disable a few services, along with indexing, System Restore, remote access, and others, and I still could not get the performance I thought I should have gotten out of it.

      Bottom line;
      The user has to be able to use and maintain the computer’s software environment within the limitations of the physical hardware specifications, otherwise one is going to hit the wall when it comes to performance and satisfaction.
      The above system clearly wasn’t suited to any real level of productivity.

      • #1321027

        Matching Hardware with Expectation & task

        That’s so right, but is often overlooked. Marketing and advertiser pressure can very easily make for a poor user experience.

        Last week, I was working on a laptop sold less than a year ago by one of the largest online retailers in the world. It was running Win7 32-bit with just 1GB of RAM and swapping to disk almost permanently. I took a note of the asset tag and looked it up on the manufacturer’s website to confirm the original spec at point of sale. Sure enough, it was the same. Frankly I was shocked that such a machine could be sold so recently with that spec. The phrase “not fit for purpose” nearly crossed my lips.

        So the user must make an informed choice about what their requirements are, keep them under review and not expect a low end system to handle intensive tasks with ease.


        @Ted
        , your family experiences are far from unique – I’m afraid it goes with the territory. 🙁

    • #1321006

      Clint, I think your last paragraph is one of the most difficult to teach some of the average users out there. Many of these average users get their PCs to play games, send and receive email and surf the web a little. My mother is a prime example of this. She has had a PC for almost 10 years now. She started with XP on a hand-me-down PC. When she started having problems with that PC, we got together and bought her a nice mid-range Vista desktop which she still owns. I have finally taught her how to clean the junk off her PC (I set up a Cleanup batch file and do have CCleaner for temp file cleanup only). She finally knows how to update her AV/AM app (MSE) and run periodic scans, and knows how to let WU do its thing. She knows how to update Malwarebytes and Spybot regularly and run regular scans with them. Guess what, it only took 10 years to get her to this point!

      She still contacts me almost weekly about emails she gets describing special deals, even after I have told her repeatedly that she should just delete them all. Every time we visit, my wife or I have to assist with loading pictures from their digital camera or email to her HD so she can look at them, because she can’t remember how to do it. I wrote the steps down for her.

      Now this is a fairly intelligent woman who has a son with at least one brain cell that works intermittently to help her out with her PC. How many PC users out there don’t have me, or you, or Fred, or Jerry, or Wayne or whoever to help them with their PC needs? I would suspect many times more than those who do. It is very easy to see why people have PCs that are so messed up.

      I guess to sum up, yes, it is vital to be able to use and maintain our PCs, but it is equally important to keep getting the word out to all those others about how important this chore is.

      Sorry for the long rant.

    • #1321055

      So that we do not get off topic I will PM you Wayne.

      • #1321075

        Tinto, I wonder if you’ve noticed one particular brand of PC being exceptionally bad about putting out hardware that will barely even run the installed OS?

        Apart from one or two exceptions, most consumer models here are fairly reasonable for everyday surfing and email {as opposed to gfx, media or other intensive purposes}.

        For example, my daughter’s laptop runs just fine with a dual-core CPU and 2GB of RAM. When doing her schoolwork it is perfectly adequate: Word, Excel and PowerPoint all running simultaneously without a significant hit. Sure, I have to remind her to shut down the 10 or more tabs open in Chrome, limit the simultaneous iTunes downloads while trying to watch BBC iPlayer, and wind down her Behringer recording suite from time to time. But she is a teenager and we all know what that means – the entire bandwidth of BT, Virgin, Sky and TalkTalk put together still wouldn’t be enough to satisfy the internet-savvy teenager. She knows she can also call on dad to fix things when they go wrong. :huh:


        I guess what we have been discussing is that when a machine is stretched beyond its everyday capabilities, or is not maintained within its original requirement, it will likely experience performance degradation. That degradation may not be an easy thing to spot at first and may not exhibit an intuitive cause and effect.

        Moreover, we could spend our whole lives looking for a magic-bullet explanation, when in reality, there is no single cause, no single performance indicator and no single fix.

        A PC is a complex piece of kit and ideally needs careful setup and a range of maintenance actions to keep it in top form.

        • #1321140

          … She knows she can also call on dad to fix things when they go wrong. :huh:

          Tinto, I feel with you. Does your daughter have at least an idea of how blessed she is with this dad?

          A PC is a complex piece of kit and ideally needs careful setup and a range of maintenance actions to keep it in top form.

          Amen to that, brother!

    • #1321464

      Hey Dr, since I don’t do this every day, I spent several more hours than that setting up my granddaughter’s new PC. I removed all the D(H)ell stuff (as you called it :D), did quite a bit of customizing, and used quite a few of your Win7 rar files, not all but quite a bit. I’ve loaded quite a few apps and got rid of the Dell Recovery partition (burned that to my ext HD as an image just in case), so there is just one large HD at present. I created an image of the final setup as my Gold Standard Image for her. When I present the gift I will discuss the pros/cons of partitioning a data drive with her.

      I do have a Cleanup batch file in which I included the sageset:99 and sagerun:99 commands as well as the other commands you use. I also use CCleaner and Privacy Mantra (I realize you do not use CCleaner, but I do not have a problem with it). I will discuss with her how important it is to keep things clean.

      Thank you to everyone for the fantastic suggestions throughout this thread on the importance of maintenance. This has got to be one of my all time favorite threads on one of the most important subjects IMHO, so to all of the contributors here a hearty :fanfare: :clapping: :thewave:

      • #1322440

        In the previous post, sageset is in the first line of the batch file. That’s correct if the batch file is only going to be run once, because once the setting is done, it will be remembered by Windows.
        If that batch file is going to be run again, as a regular cleanup routine for example, then that line can be rem’ed out or just deleted after the first run.
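
        As a small illustration, stripped down to just the Disk Cleanup part, such a batch file might go from this on its first run:

        cleanmgr /sageset:99
        cleanmgr /sagerun:99

        to this on every run after that:

        rem cleanmgr /sageset:99
        cleanmgr /sagerun:99

        since Windows remembers the categories you ticked during the sageset step.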

        From a previous post of mine:

        I spend a lot more time setting up a new PC than any other tech I know of. Two to three hours is typical. But even after what seems like a lifetime of trying to get computer maintenance fully automated, I’m still learning of new ways to do things.

        Somewhere, I made the comment that I’d like to see Spybot Search & Destroy be able to run Automatically.
        Well, with some digging, I found out how to do that. There is an extensive list of command line switches, to run the program from a batch file, for instance, to run the program without any user intervention.

        I’ll be installing this for one of my elderly customers, who absolutely refuses to do anything above reading her email and playing Solitaire. She calls me about once a year, because her PC has slowed to a crawl and I have to go in and delete thousands of garbage files and run Spybot S&D and remove a lot of spyware.

        The following line, in a batch file, can run an update followed by a scan with spyware removal. The batch file can be put into your Startup folder to run every day, or it can be set up in Task Scheduler to run only once a week (Wednesday), when the updates are posted for Spybot S&D.

        "C:\Program Files\Spybot - Search & Destroy\SpybotSD.exe" /taskbarhide /minimized /autoupdate /autocheck /autofix /autoclose

        For this to be successful, Spybot S&D has to first be installed properly in your computer.
        Installation and setup instructions for Spybot S&D are on my web site.
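
        If anyone prefers the Task Scheduler route over the Startup folder, a rough sketch might look like the following. The batch file name, folder and time are just made-up examples, not part of The Doctor's setup; adjust them to suit:

        rem SpybotAuto.bat - update, scan and fix with no user intervention
        "C:\Program Files\Spybot - Search & Destroy\SpybotSD.exe" /taskbarhide /minimized /autoupdate /autocheck /autofix /autoclose

        rem one-off command to run that batch file every Wednesday morning
        schtasks /create /tn "Spybot Weekly Scan" /tr "C:\Scripts\SpybotAuto.bat" /sc weekly /d WED /st 09:00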

        Cheers Mates!
        The Doctor 😎

    • #1321494

      These maintenance tips were all about platter drives, I believe. Beware of using most if not all of them with SSDs. Perhaps those of you that have these drives can chime in, or better yet, perhaps a Moderator or Administrator can move the last couple of posts to a new thread just for SSDs to give them the exposure they deserve. Having them tacked on here will bury them in this thread.

    • #1321532

      Referring back to mechanical drives:

      What about program uninstallers – hold on there, tiger, before anyone bites my head off: I generally don’t use them, and I approach with trepidation when I see a seriously ill machine with a CCleaner or similar shortcut on the desktop.

      But here’s a thought (and perhaps let’s try to keep this to system performance rather than the can-o-worms that is registry performance or size etc)…

      If the performance of a machine can be affected by inconsistencies and bad setup files etc, how about what’s left behind after removal of a program? Not the quantity of data (because I think we have established it’s not the quantity but the type of data), but garbage from a previous software package – unused DLLs, or registry inconsistencies pointing to a version of software that no longer exists, for example.

      Now, all that should make no difference until such time as a bit of code wants to reference that registry key or shared DLL and merrily tries to use a resource which no longer exists or is the incorrect version.

      So, despite my propensity to avoid them, could careful use of a registry or uninstaller tool after removal of an application have a positive benefit? Not in “compacting the registry”, because that makes no sense given the nature of the beast, but in removing inconsistencies or other leftover files. Perhaps at this point I should also say that, due to my avoidance of them, I don’t know in detail the features of all of them. Are there any tools that can scan for specific left-overs rather than taking a one-size-fits-all approach, and are they effective in keeping things ship-shape?

    • #1321549

      Well Tinto, I do use CCleaner, generally for temp file cleanup, but when I uninstall an app, whether with the Windows uninstaller or Revo Uninstaller, I run CCleaner’s registry cleaner to see what’s left, and if I find something from the recently uninstalled app I remove it. I have never had a single failure because of it. I’m sold that this helps, for the reasons you have stated in your last post. Perhaps this leftover stuff would never cause a problem, and Revo does not leave much behind, I’ll grant you, but I always find one or two items, sometimes a few. And I’ve never had a problem related to these removals.

    • #1321611

      Cleaning up an SSD isn’t too much different from a mechanical drive, except that one does not defragment an SSD.
      There will be some specific read/write tweaks that one may want to initiate on an SSD, but for the most part, cleaning up the drive will be the same as discussed here.

      One of your best advantages with an SSD will be the overall performance increase over mechanical drives.
      But this does not negate the need to keep the drive well maintained and groomed to be lean.

      Ensure you have as much free space as you can stand; the more free space on an SSD the better.
      If your overall computer specs are decent, and you didn’t buy into some cheap 80-dollar SSD, you can easily disable Prefetch or Superfetch and limit the page file to its bare minimum.
      Set the page file to a fixed minimum and maximum – whatever minimum it takes to preserve error logging.
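
      For anyone who wants to experiment along those lines on Windows 7, a rough sketch of the sort of commands involved might look like the following, run from an elevated command prompt. The 1024 MB page file size is only an example; use whatever minimum preserves your error logging:

      rem disable the Superfetch service (its service name is SysMain) so it no longer starts
      sc config SysMain start= disabled
      rem turn off the prefetcher via the registry
      reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management\PrefetchParameters" /v EnablePrefetcher /t REG_DWORD /d 0 /f
      rem take manual control of the page file and pin it to a small fixed size
      wmic computersystem where name="%computername%" set AutomaticManagedPagefile=False
      wmic pagefileset where name="C:\\pagefile.sys" set InitialSize=1024,MaximumSize=1024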

    • #1321626

      Ted, from my understanding, after looking at Revo in the past, it uses the Windows uninstall process first anyway.

      Nonetheless, regarding Revo, do you use the free or Pro version?

      Reviewing their marketing info, the Pro version appears to perform an audit of installed software which is then used to search for specific left-overs after an uninstall. If that’s the case, it could be a reasonably well tuned search rather than casting the net wide and potentially finding problems that don’t exist.

      I can’t recall ever seeing a big issue after uninstalling a program using Windows, but I’m open minded enough to wonder if there is a place for such utilities to help maintain the performance of a system – when used in a controlled way.

      Re: SSDs. Building on CLiNT’s comments, the other thing that I do is to max out the RAM (it’s a cheap upgrade), which minimises page file utilisation.
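
      If anyone wants to check how much headroom they actually have before spending on RAM, a couple of the built-in wmic queries will report it; this is just a quick sketch, no third-party tools needed:

      rem free vs. total physical memory, reported in KB
      wmic os get FreePhysicalMemory,TotalVisibleMemorySize
      rem allocated, current and peak page file usage, reported in MB
      wmic pagefileusage get AllocatedBaseSize,CurrentUsage,PeakUsage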

    • #1321627

      Tinto, I use the pro version and really like it.

      Particularly the feature of installing with Revo, which keeps a log of everything the install does, so that when you un-install it uses the log to remove everything the install put in.
      There are other features in the Pro version I also like.

    • #1321629

      I use the Pro version as well, but only because it can also uninstall 64-bit apps; the free version is limited to 32-bit apps. I do not at this time have Revo running in real time monitoring installs. I do not see a big advantage to this, as I normally only see 2 or 3 items at most (sometimes none) left after the Revo uninstall, and those are cleaned with CCleaner. I would rather not have anything extra running in real time beyond the bare minimum unless I’m working with it. All that’s normally running on my PC is MSE, the Online Armor firewall and the Alps touchpad utility for my laptop. That’s it other than the MS services; that’s all the 3rd-party S/W. I shut down everything else.

      You would be very surprised in most cases how much Revo finds after it finishes with either the Windows uninstaller or the app’s own uninstaller (yes, it starts with whichever of these is available) and then does a scan for leftovers. Remember that if an uninstall requires a restart, DO NOT restart prior to scanning for leftover items; wait until after Revo finishes. I always do the Advanced, most aggressive scan and always delete everything found, and so far have never had a single problem. I try out, and throw out, a lot of S/W.

      I suppose if I were trying a piece of S/W I was unsure about, having Revo running during that install might be beneficial.

    • #1321635

      hey Ted, maybe we should see what PC Mag has to say about this…

    • #1321636

      I used to read PC Mag but have not for a few years. I still have a digital subscription to PCWorld and love that one. I’m switching all my subscriptions to digital; killing fewer trees that way. Quite honestly, if it weren’t for the lack of 64-bit support in the free version of Revo I would not have bought the Pro edition; that was the feature that convinced me. I do not want Malwarebytes to run in real time, so I see no need for the Pro version of it. The free version does everything I want.

    • #1321637

      Ted, I don’t get any of them anymore because they just seem to be bloated with advertising.

      I use Malwarebytes in real time…

      What I like about Revo pro is that I can force un-install for programs that are not listed in the program list to un-install…

    • #1321654

      Hmmm, interesting. I just downloaded and installed a trial of that product and it seems different to what I recall. Perhaps that was the free version that I am remembering.

      Anyway, it does appear to be different to the classic registry cleaning tools that we have seen from time to time. What interests me is the ability to use a tool to find issues that may affect performance. All too often when working with tools such as these they weigh heavily on “cleaning the registry” and use weasel words to describe what exactly it is doing, so it’s refreshing to see a 61 page pdf help file describing in detail the features and functions.

      There are quite a few functions that are potentially beneficial to maintaining a clean (and hence well performing) system, just as there are some which probably have no measurable effect (the unrecoverable delete function for example). I guess that some of these features are a nod in the direction of the more widely used “cleaners”.

      What does worry me a little is that the interface is perhaps not what one would recommend to a novice. If we are to try to help less experienced users maintain the performance of their systems, a semi-automated feature set or a standard/advanced switch may be helpful. (Caveat – I’ve only been playing with this for an hour or so: I may have overlooked something.)

      Of course, whether any left overs from a software removal actually have a detrimental effect on performance is very specific to the individual system, but in general eradication of any left-over files etc. seems to be beneficial.

      This is an interesting journey.

    • #1321655

      Tinto, I believe when you initially install Revo it’s the Pro version, and you have, I believe, 30 days before it reverts to the free version, which doesn’t include a lot of the tools that are in the Pro version, like the forced un-install and registry cleaner etc.

    • #1321673

      Most of the other tools available in Revo Uninstaller I have not tried. Just as many of the additional tools in CCleaner I have not tried. I suppose I should do some of this investigation at some point in time. I will see what happens.

    • #1321739

      Example 1. A system that has high disk utilisation will benefit from a disk clean up because free space is generated which can then be rationalized using a defrag, making the drive work less to access the data. In effect reducing the I/O bottleneck. But that only should a marked effect on a drive that is heavily used.

      With HDD seek and access speeds (even before SSDs) being so much faster today than they were 10-15 years ago, the impact of a fragmented HDD just isn’t what it used to be. I suspect that much of the emphasis on defragmentation is based on lessons learned “back in the day” and passed on as gospel over the years (just like people who learned to pump their carbureted car’s accelerator a couple of times on a cold day still doing it today on a fuel-injected car).

      One time that I can think of fragmentation’s still having a significant impact is on a nearly-full system drive–in this case, the Windows swapfile can end up getting fragmented (due to insufficient contiguous space). As even a 6GB Windows 7 system still makes ample use of the swapfile (I have mine isolated on its own partition and constantly monitor activity on that partition, so I’m not making this up), having a fragmented swapfile will slow things down noticeably. That said, it’s probably true that the best solution is to first (a) isolate the swapfile on a dedicated partition and (b) replace that nearly-full HDD with a larger one (they’re not that expensive, even if the price is still recovering from the Thailand flood).

      Other than with the swapfile issue (and maybe even with that), I suspect that as soon as we’re fully on SSDs, the whole defrag utility discussion will finally die.
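
      Incidentally, rather than defragmenting out of habit, it may be worth letting Windows report whether fragmentation is even significant first. On Windows 7, something like this from an elevated command prompt analyses the drive without actually defragmenting it:

      rem analyse C: and report the fragmentation level, without changing anything
      defrag C: /A /V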

      Example 2. Many applications come bundled with branding and additional features that are configured to load at startup. This can affect the CPU utilisation and memory utilisation, both of which can have a detrimental effect on performance. Being careful during an installation can often allow one to request not to install additional features. Using msconfig, or 3rd-party apps can get to the root of the issue, buy preventing various startup items from launching. Fine and well, but other than consuming RAM and potentially causing disk swapping, a start up item should not generate a CPU bottleneck after it has loaded.

      As you pointed out, this is mainly a startup issue, which is why the articles on this subject are all about minimizing the startup time. I don’t know about you, but I restart only once every 5-15 days or so, so this seems more relevant to those who restart regularly (like software testers and reviewers).

      Example 3. Browsing the internet may be more responsive if the Temporary Internet Files are rationalised, but the purpose of these files are to provide a local cache to reduce network utilisation, so if there is no local cache, the system must download the files from the web server again. That suggests that a total removal of temporary internet files may actually have a detrimental effect. On the flip side however, if the drive is getting full, removal of these files may improve the disk performance as in Example 1. Also if a corrupt file is located in the cache, removing it would be beneficial to prevent errors.

      I’ve occasionally had to clear the Internet cache in order to get a flash video to play. Other than that, I can’t say that it’s ever posed a performance problem. I still let CCleaner clean out those cached files for me, but only on those rare times that I run it (every 2-3 months?).

      One more thing: Cached and temp files can slow down a backup and use more of the backup medium, but the solution there is to be less broad about what you’re including in your backup.

      Example 5. A poorly configured system can often be helped by resetting some parameters to default, for example Internet Explorer settings. This isn’t really disk cleanup or removal of data, but has been included for completeness.

      Firefox profile preference settings can get corrupted, but the symptom is usually a freeze, not a slowdown.

      What are your experiences and understanding?

      The only other factor I’ve looked at is the load time for fonts, primarily with respect to how it affects the start-up time for Word. Just for fun, I’ve backed up and uninstalled the fonts I don’t regularly use (which wasn’t as clear a thing as I had expected) and use a font utility to load the backups on demand. Having done that, I can’t really swear that I’ve seen a significant improvement in Word’s startup time–but it’s not costing me anything to leave it this way, and I do like that the fonts list is about 80% shorter now.

    • #1322025

      Kris,

      First, Welcome to the Lounge, and your English is very good.

      I also do not know of any handbook of cleanup/maintenance chores. Here’s the way I started: go ahead and begin with the oldest/slowest of the bunch and see if it makes a difference. If it does, it will give you the initiative and determination to do the same with the newer PCs. Be advised that I do not know how this advice translates between an English language version and a German language version of Windows; I assume here that you are using the German language version.

      Start with a small batch file that was developed by Fred Langa back in the Win 95 days, I think, and modified through the Windows versions since. Just copy the lines below into Notepad and save them as a .bat file on your desktop. Select all the check boxes when they are presented. If you know any DOS you can add to this batch file; all it does is open a cmd window and run commands that delete all items in certain temporary folders. If you have other temporary folders, by all means add more lines with the paths to those folders. In Win XP there are temporary Internet file folders that can be added, but I no longer have the valid paths to those folders, so I would not wish to give you bad info.

      Start your copy on the next line.

      cleanmgr /sageset:99
      del /q /s "C:\Users\%username%\AppData\Local\Temp\*.*" (please note, %username% expands to the logged-in user; this line is specific to Win 7, I believe, and can just be deleted if you do not use Win 7)
      del /q /s "C:\Windows\Temp\*.*"
      del /q /s "C:\Temp\*.*"
      cleanmgr /sagerun:99
      Pause

      End your copy on the previous line.

      The sageset/sagerun command runs the Windows Disk Cleanup utility on steroids.

      Next I would download the CCleaner and Privacy Mantra apps and run their temp file cleanup areas. Be careful about what you choose to clean up. The same caveat about language versions applies.

      Since you are going to start with the oldest and slowest of the PCs, you might want to check out the registry cleaner section. After you analyze, look through the entries. You will find MANY entries, especially the first time. Since it sounds like this is a “throw away” PC anyway, I would just go for it. You will find dozens, if not hundreds, of orphaned or leftover entries from apps you might have uninstalled years before. Those old uninstallers never did a good job; heck, even modern uninstallers do a terrible job. That’s why most of us are switching to uninstaller apps.

      Just try these things to start and see what happens.

      Note: You will have to excuse my English; after all, it is my first language and I have developed nasty habits of using slang. If I have used any terms that you do not understand, please ask what the heck I am talking about.

    • #1322536

      OK, I did not realize the sageset/sagerun settings would be remembered. The original batch file I used was from Fred Langa, way back in Win 95 I believe.
