Hasn’t the problem of Windows updates being partially installed been solved? (msdn.microsoft.com)
52 points by nikbackm on Dec 12, 2015 | 48 comments


Recently I was surprised by how long Windows Update took to simply decide which updates were available. When rejuvenating a relative's Vista machine that hadn't been online for about 6 years, it took WU more than 2 days of fully pegging the CPU to finally deliver the news that ~200 updates were available.

What was it doing all that time? Some kind of solver?


About a month ago I decided to reinstall Windows on one of my old PCs, an Intel Ivy Bridge desktop i5. Ending up with a fully patched system was frustrating and nearly impossible. The process was pretty much as follows:

1. Manually check for updates. This took forever and thrashed the hard disk, every time.

2. Select, download, and install the updates.

3. Goto 1 until Windows couldn't find any more updates.

In some cases, step 2 would fail with various unknown errors and hex codes. The only way to fix this is to run a Windows diagnostic that "resets Windows Update components", which makes step 1 take even longer than it would have otherwise.
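
(For what it's worth, the manual equivalent of that reset, run from an elevated command prompt, is roughly the following. This is a sketch from memory; the exact services and paths vary by Windows version:)

    net stop wuauserv
    net stop bits
    ren C:\Windows\SoftwareDistribution SoftwareDistribution.old
    net start bits
    net start wuauserv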

All in all, I think the update cycle took me about 4 hours before I gave up on the third reboot and decided to just let Windows figure itself out. Compare this to an 'apt-get update && apt-get upgrade' or the OS X Software Update, and it's a wonder people update Windows at all.


> Compare this to an 'apt-get update && apt-get upgrade' or the OS X Software Update

There was a major update to Arch recently, due to GCC 5 changing the C++ standard library ABI. Basically every package written in C++ and compiled with GCC needed recompiling and updating.

Mine took about an hour to update (a 6-year-old machine), with over 500 packages updated. Of course everything kept running, but I chose to reboot since it was such a major change, and everything worked flawlessly. Kudos to the whole Arch team.
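
(For anyone unfamiliar with Arch, a full system upgrade is a single command; nothing exotic about the flags:)

    # refresh all package databases and upgrade every installed package
    sudo pacman -Syu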

But take a minute to think about that - it's a major percentage of the whole system updated.

Updating my Windows machines, in comparison, is terrible: the amount of time and effort is so variable, plus you have to be careful to avoid or remove the dreaded tracking updates backported from Windows 10.


Win10 actually improves things here BTW, with the new cumulative update model requiring only a few updates to be installed at most.


Windows 10 should (eventually) move to a model where the initial install is much closer to the current update, requiring less patching.


And only a few updates are needed at most when you do need them, thanks to the new cumulative update model.


Interesting, do you have any links about this?


The latest is that they actually did it, then pulled it, but presumably they'll try again: http://arstechnica.com/information-technology/2015/11/window...


Even though I don't use Windows anymore, I have done this many, many times with Windows Vista and Windows 7. It never took more than a few hours. Even though a few hours is pretty freaking long, taking a couple of days doesn't sound normal.



So others don't have to click through to this:

"Windows 7 uses Component-Based Servicing [1], which means Windows Update has to work ridiculously hard to determine file and component dependencies/inter-dependencies, maintain side-by-side versions of older files/components, while still making it possible to uninstall individual updates/components but without breaking any other updates/components, all the while taking into account supercedence and god knows what else. The code that does all this must be hellishly complex."

[1] https://technet.microsoft.com/en-us/library/cc756291(v=ws.10...
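
If you're curious what that component store actually tracks, DISM will list the packages it knows about; this is just a peek at the bookkeeping, not part of the normal update flow (run from an elevated prompt):

    dism /online /get-packages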


Yeah, definitely much more complex than the update.exe that was used back in XP.


There was a brutal bug in Vista on some machines where it would peg the CPU for hours during the update check on start-up. Users learned never to fully reset unless it was critical. I saw it personally on multiple machines, and I'm a dev, not an admin; I only use a handful of computers.


Maybe there's a problem somewhere, because I've seen reports of other people being affected on various forums in October and November. I am currently trying to update a W7 SP1 VM, but it just keeps looking for updates, even after installing two updates for the update client itself.

It's extremely annoying and not related to Vista particularly, nor specific types of hardware as far as I can tell.


Try manually installing the patch at https://support.microsoft.com/en-us/kb/3102810
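
If you'd rather script it, standalone .msu files install via wusa.exe; the file name below is just a placeholder for whatever the KB3102810 download is actually called:

    wusa.exe C:\Downloads\kb3102810-x64.msu /norestart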


Already did, thanks. It didn't seem to have an effect when I installed it yesterday.

Today, after about 3 hours, it found the updates.


This is the thing that amazed me most when switching from Windows to Mac: no more endless updating and rebooting. Updates seem less frequent and also less time-consuming.

I noticed the same thing you describe on Windows 7 by the way.


Windows Update used to be okay in Windows XP, Vista, Windows 7, and 8. But Windows 10, with its forced updates and the waiting they impose after every reboot, is unbearable. Linux doesn't need a reboot for 99.9% of its updates and updates only the files that have changed. Something like the Threshold 2 update for Win10, with all its problems (downloading 2+ GB, doing a complete OS re-installation (WTF), losing drivers, applications, and settings), is laughable in 2015. Also, the new OldNewThing blog software isn't that great.


> Windows Update used to be okay in Windows XP

Nostalgia much? How many times did, e.g., .NET updates force XP computers into endless loops or just kill the OS?

Windows Update has always been a barely functional, embarrassing mess. It's just that it seems to be stuck in 2001, while others finally managed to sort their crap out.


> It's just that it seems to be stuck in 2001, while others finally managed to sort their crap out.

Those others do not include Apple, as anyone who's been indefinitely waiting for "installing" OS X updates can confirm. The OS X fixes needed have the same weird incantation-like qualities as the fixes for Windows updates (i.e. remove the update, then restart, then run again, then panic, then redo the sequence). No particular hate meant for Apple or MS, but seriously, learn how to write a basic update mechanism that doesn't fail at random.


I've managed a ton of Macs and can't say I've ever come across the need to remove an update, or really any issue with updates failing, but maybe I'm lucky?


I manage Windows and Mac networks. Mac updates may take up to 5 minutes to discover the necessary updates, resulting in 5-10 package downloads, even on machines that haven't been regularly updated. Windows, depending on the version, can take upwards of 20-30 minutes, with up to 50-60 package downloads; on machines that haven't been updated recently, 1-2 hours and hundreds of package retrievals.

> the OS X fixes needed have the same weird incantation-like qualities as the fixes for Windows updates

I find your statement inaccurate. Even if the update operations are fairly similar in Windows and OS X, the latter's update system seems fairly efficient. OS X doesn't have the library and dependency hell that Windows does. The update model fits well into the OS X environment, and not well at all in Windows.

Also, Mac updates that touch system files are 'rare' and usually only require one reboot. Windows often requires 2 or 3 reboots. Again, I believe this is because of the dependencies and libraries Windows is forced to maintain and keep up to date.

Macs don't hang on update or install, frankly, ever, unless there is something very wrong.


The update procedure works fine; the newly updated OS X, seldom so. Far too often, updating OS X means having something break.


I manage Macs for myself and my family, as well as having used many through my employer over the years. I have never seen what you describe.


There has definitely been a trade-off. I recently had to install Win 8 on a Surface and had all but forgotten the days of a fresh install needing 100+ updates, which required multiple download/install/reboot/repeat cycles. Win 10 has less flexibility for deciding how the update happens, but I also almost never reboot anymore. I can probably count the number of times I've had to reboot for Win 10 on one hand, whereas it was a given for nearly every update pre-Win 10. Having said that, your point about needing a reboot at all is well taken.


> updates only the files that have changed

No Linux system that I ever used worked that way. The only system I've seen that did was Solaris, and that patching software was a disaster.

Systems like dpkg and rpm replace the entire contents of a package when there is an update. The benefit of this is that if things are very out of date or a package is corrupted, the only thing you need to do to get things up to date is (re)install the latest package.
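
(With apt, for example, the "just reinstall the latest package" fix is a one-liner; the package name here is only an example:)

    apt-get install --reinstall coreutils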

Packages on Linux are fairly granular, so in practice doing full individual package updates is not a huge problem. I have 2,300 packages or so installed on my laptop and all but 100 or so are under 10 MB.
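
(If anyone wants the same breakdown on a Debian-ish system, something like this prints installed sizes in KB, largest last:)

    dpkg-query -W -f='${Installed-Size}\t${Package}\n' | sort -n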

Google did a lot of work optimizing Chrome package updates, but I'm not sure most distributions are prioritizing this sort of thing anymore.


Correct, of course. However, at least Fedora does delta RPM updates nowadays, so full RPM downloads for updates are rare. Most drpms turn out to be a few hundred KB.
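
(With dnf this is controlled by a config flag; as far as I know it defaults to on, but you can set it explicitly. A sketch:)

    # /etc/dnf/dnf.conf
    [main]
    deltarpm=true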


That's interesting... I looked up how it is implemented.

They seem to pull down the updated bits, merge them with what is currently on the system to reconstruct the new package, and then install that.

That would save bandwidth, but not make the actual operation of installing the new package any faster.
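
The reconstruction step is done by applydeltarpm, roughly like this (file names made up); without -r it rebuilds the new rpm from the files already installed on disk plus the delta:

    applydeltarpm foo-1.1-2.x86_64.drpm foo-1.1-2.x86_64.rpm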


Yes, that is basically an in-place upgrade, but Pro users can defer it (this is called CBB). And when a new build is not being installed, the time spent on each reboot should be much shorter, since less processing is required after each cumulative update.


Recently after Windows 10 performed an update it notified me that it had uninstalled a VPN client because of incompatibility.

Personally I would have preferred it not perform the Windows Update and allow me to connect to my client's network considering that's one of the requirements of getting paid by the client.

This is exactly why I keep a Windows 7 VM ready.


Was this after the 1511 upgrade? I think you can go back to a previous build pretty easily in Win10.


Having talked to many Windows users, I'm convinced that they have never experienced the package and release management process of Linux distros. If they had, they would constantly complain about Windows. With many Linux distros you can even update packages as part of the installation process, giving you a fully up-to-date system. There's no reason for Microsoft to do anything less, since they require serial keys, activation, and network connections anyway.

Compare the time it takes to remove or install a component (application or library) and you will be very disappointed in Windows. No matter how fast your disk or CPU is, Windows finds a way to make you wait half a weekend. This is unacceptable, but nobody complains loudly enough for Microsoft to reconsider.

Windows software management is a total mess and it got worse with newer Windows versions.

But other parts of Windows get better with each release, mostly in kernel land. This seems to be a case of different teams having different cultures (Windows kernel, Windows shell, Windows installer, ...). Take Windows 2003's desktop, put it on Windows 10's kernel, and you have a nice system.


I'm not sure the comparison here is entirely fair.

On Linux systems, often most or all of the software being used is installed from the distro packages using the distro's package manager. Usually things go smoothly in this context.

On Windows systems, often most or all of the software being installed is installed using third party installers, which may have their own versions of various dependencies. The Windows infrastructure has to cope with this much less predictable context, and much of the time it does so reasonably effectively.

If you need to build third party software from source on Linux, or do more tricky things like cross-compiling to run on some embedded device, things are often far less pretty. Often you are just trusting that someone's makefile running under sudo won't dump junk all over your system, and that if it does, the uninstall target will tidy up properly -- which, in my experience, is not a safe bet on either count. Even building something from source in the first place on Linux typically depends on system-provided compilers and probably standard libraries, which as we saw with some relatively recent GCC changes can switch ABIs in non-backwards-compatible ways, potentially causing serious problems of their own.
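
One partial mitigation, assuming the makefile honors the conventional DESTDIR variable (many don't): stage the install into a scratch directory first, so you can at least see what it would write before letting it near the real system.

    make
    make DESTDIR=/tmp/stage install
    find /tmp/stage -type f    # inspect what would have been installed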

In short, things look much better on Linux as long as you stay within the walled garden of your distro's own package management system. As soon as you need to go beyond that, you're probably in the Wild West.


Package management on Linux can be summed up as: use this old, custom-patched version we provide, or build from source.

The alternative is just Windows style software distribution: find a random exe on the internet and run it.


Some distros provide unpatched software that is up-to-date. :)


Perhaps I should have used those years ago instead of becoming jaded and cynical.


The trade-off to having stable, secure packages has always been not getting new features.

For example, in LTS releases, major versions are frozen but perpetually updated and patched for the lifetime of the distro. In more bleeding-edge rolling-release distros, you sometimes receive (or compile) major version updates of software within a short time after their public release. In those distros you can still freeze your packages at certain versions, but it's not quite the same as making sure every library and component in the system gets the same treatment.
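
(On pacman-based distros that freezing is done with IgnorePkg in pacman.conf; a minimal sketch, and the package names are only examples:)

    # /etc/pacman.conf
    [options]
    IgnorePkg = linux xorg-server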

Try Manjaro (https://manjaro.github.io/). It uses pacman (like Arch) and yaourt (for the AUR, building from source). Their driver and kernel management makes switching between free/non-free drivers or different kernels easy. And they're pretty close to the bleeding edge; I've been running kernel 4.2 with Xorg 1.17 (released 2015/02) for months, and up to kernel 4.4rc4 and Xorg 1.18 are available in the repos.


I use Linux as my day-to-day work platform, but my GOSH I hate the patching hell to which popular distros (mainly Debian and RH-based ones) subject everybody.


For third-party software, it's a trade-off.

But when the distributor is also the author – like Microsoft is for everything that is delivered via Windows Update – that is no longer a concern. We can compare the solutions by technical merit alone… and Windows doesn't come off too well.


The kernel is getting better? It is a disaster compared to Linux.


Care to elaborate?


No ext4 or other alternative file system support, no file system snapshots, no file system write overlay, no alternative schedulers, no kernel upgrades without a reboot...

Just regular things from Linux.
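
For instance, the write-overlay piece is just a mount on Linux (overlayfs, in mainline since 3.18); the directory names below are arbitrary:

    mount -t overlay overlay \
      -o lowerdir=/ro,upperdir=/rw/upper,workdir=/rw/work /merged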


"No ext4" might be a feature.


Don't know the situation now, but this was posted some time back:

http://blog.zorinaq.com/?e=74


"If the role of an internal developer conference is to encourage discussions among teams, then that one certainly succeeded. Because discussions most definitely ensued."

Been there, got the video, in less exalted company.

Is the book simply a repackaging of the blog posts or would it provide a more coherent view of the history of the Windows project?


The book just contains selected blog posts, as well as two downloadable appendices with more stories that haven't been published on the blog.


My system somehow lost all the original installation files for my programs. Now whenever I try to update a program, it fails on the "removing previous version" step. Insane.


I don't have such a problem; I have been Microsoft-free for years...



