I'm talking about the desktop client: it will try to use as much upload bandwidth as you have if you put popular tracks in offline playlists.
AFAIK, the Spotify desktop app throttles uploads so that they don't disturb the user (or at least it used to). If it used up all my upload bandwidth, I would consider that a bug and file a report.
> I'm a bit sick of hearing this meme perpetuated. Give Unity a chance ... in fact, the author's main gripe about Debian is resolved in a really fluid way by Ubuntu + Unity. I think Unity's multi-monitor support is one reason why it's worth sticking with.
I've used Unity on my desktop for six months or so, but I just wasn't compatible with it.
The display issue is more of an issue with drivers or something similar: the problem I'm having with my DP-connected 30" Dell is that I can't make it the sole display without first disabling the laptop's internal screen with xrandr. If I keep my laptop display on, the screen works fine as a mirror or a secondary screen.
Now this might work in Unity, but unfortunately I've got no way of testing it.
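For reference, the xrandr dance I mean looks roughly like this (the output names are whatever xrandr reports on your machine; LVDS1 and DP1 here are just examples):

    # turn off the internal panel first, then make the external Dell the only active display
    xrandr --output LVDS1 --off
    xrandr --output DP1 --auto --primary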
You bring up good points, and I agree that my article needs clarification in some parts. I'll just write quick replies here, and try to work something into the article itself later.
Purge really did free memory, and quite a lot of it. I'm not much of an expert (as you can probably tell) on how OS X memory management works, so I mostly settled on solutions that seemed to help with my problem. Maybe there was some third-party software that messed things up.
The problem with the inactive memory is that it is not freed, it is swapped. So when my system hit its memory limit, the computer started swapping. Just freeing the memory would, in my case, have been much quicker. In practice my machine was swapping constantly whenever it got near the limit. As you said, Repair Disk Permissions caused all this by filling memory with disk cache. So that was a nice solution to my problem: a way to force the inactive memory to be swapped out.
The Python point is a bit off: I was indeed trying to argue that installing Python packages through Homebrew is impossible. However, I did use pip+virtualenv a lot, so at best that's a vague argument against OS X. That said, in production I always rely on the packages provided by the OS rather than pip + venv, unless it's really necessary. This is mostly because it makes it easier to keep the system up to date.
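For context, the pip+virtualenv workflow I mean is roughly this (the path and package name are just examples):

    # keep a project's packages isolated from the system Python
    virtualenv ~/envs/example-project
    source ~/envs/example-project/bin/activate
    pip install Django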
I'm sorry if this came across as an uninformed rant; I just wanted to share how it felt to use OS X and a MacBook as my primary computer for a year.
> The problem with the inactive memory is that it is not freed, it is swapped. So when my system hit its memory limit, the computer started swapping. Just freeing the memory would, in my case, have been much quicker. In practice my machine was swapping constantly whenever it got near the limit.
That's just not true. If inactive memory is something that's already backed by disk (like a memory mapped file), it'll be discarded. Only if it's read-write memory that's /currently held allocated by a running process/ will it be swapped out to disk. Unless you're doing something pathological (like suddenly allocating lots of RAM and forcing paging - what purge utilities do...), the architecture /speeds things up/.
If it's really true that, without doing anything special, things were always being swapped out to disk for you, it necessarily means that there was a process that had allocated (a lot of, it sounds like) RAM and written to it, so that stuff had to be paged out to disk to free up RAM without losing data.
It sounds like you were running purge commands or utilities to 'free up RAM'. That is counterproductive. It causes the system to release cached 'clean' (i.e. already on disk - a mapped file, essentially) RAM and swap read-write memory out to disk, only to have to re-read it all when you actually need it. In other words, using purge 'utilities' actually puts the system into the worst possible state.
The problem with the way OS X keeps data cached in inactive memory is that it assumes you are going to re-use the same app within a reasonable amount of time. Given the current behavior/performance (without knowing exactly how Apple's engineers implemented it), it really feels like a giant garbage collection system that takes ages to free up its own memory, with no real notion that if I haven't used certain apps for a day, chances are it will be a long while before I use them again.
I am one of the devs out there running a MacBook Pro with 8GB of memory (I wish I could have more, but it's an old 2010 model). For web development, I have at least Firefox/Chrome/PhpStorm/SmartGit/MAMP/Thunderbird/Terminal/Notational Velocity/Dropbox/Alfred/Sophos AntiVirus open at all times. Along the way I may open a few other apps that I use rarely, like Photoshop/CyberDuck/VMware Fusion/iTunes/iCal/iOS Simulator/Preview/LibreOffice/Skype. Pretty quickly the 8GB gets used up, and the system runs itself into the ground shortly after.
If I then shut down VMware Fusion for a while and relaunch it later, the system really just can't take it anymore. The last resort? purge&
At least, that's my day-to-day experience with OS X. Personally, I find the memory management really lousy, worse than the other OSes I've used in the past (both Windows and Ubuntu/Fedora/Gentoo).
So why the heck do I still use a Mac? Because driver support is still far better than on Linux. With a Mac, you are less likely to need to blacklist drivers because of freeze-up issues.
Either way, I am definitely not a happy camper with the current memory management system in OS X.
The inactive count is a misleading figure in many ways. It includes both 'dirty' and 'clean' (i.e. identical to what's on disk, and hasn't been altered) memory that's not been used for a long time. Purging 'clean' memory is instantaneous. The only cost you'll see is writing 'dirty' memory out to disk.
Your explicit purging is changing the cost of writing dirty data out from an ongoing cost to a single, longer, upfront cost. Instead of writing only when more RAM is required, you're forcing it all to happen at once.
Incidentally, the OS does try to keep an area of free RAM so that some memory can be allocated instantly; it's not only swapping things out when RAM is absolutely full. It's possible to outrun this process, though, if an app tries to allocate huge amounts of RAM at once (i.e. more than is kept free for this purpose).
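If you're curious, you can watch this behaviour from the command line; vm_stat prints the kernel's page counts (free, active, inactive, wired, pageouts), with the page size noted at the top so you can convert to bytes:

    # one snapshot of the VM subsystem's page counts
    vm_stat
    # or sample every 5 seconds to watch free/inactive move as apps allocate
    vm_stat 5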
For your specific case, presuming apps are behaving well (see below), you would be better off quitting apps and relaunching them when you need them later. This will free up all the dirty RAM they've allocated (just like when you purge), but the 'clean' inactive RAM will not be purged (because that's not necessary - as I said, it's free to purge that kind of memory when it's needed for something else).
You also want to run Activity Monitor when your system is in its bad state and see if it's one of the apps you're using in particular that's allocating lots of memory (check out the "Real Mem" column). The OS can't do anything if an app is genuinely allocating and writing to memory; it's obviously not able to just discard written-to memory.
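(Roughly the Terminal equivalent, if you prefer the command line; rss is reported in kilobytes:)

    # top 15 processes by resident memory, similar to Activity Monitor's "Real Mem" column
    ps axo rss,comm | sort -rn | head -15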
Really though, if you want to do all those things at once, more RAM might be required. Remember, with VMware and the iOS Simulator running, you've got two whole other OSes running at the same time; it's reasonable that they'd require lots of memory to work well!
By the way, the purge command was written to simulate /worst case/ conditions when performance testing. It's designed to flush out caches so that the system has to e.g. load all an app's code from disk when launching.
[Source: I worked analysing this kind of thing at Apple until a couple of years ago].
> So why the heck do I still use a Mac? Because driver support is still far better than on Linux.
I feel like this is a stereotype that just won't die.
Yes, up till a few years ago you might have had to do some poking in /etc to get things working, but as long as you spend a few minutes looking up basic background info before you buy, those problems just don't happen these days. I haven't had to edit a config file to get hardware working since 2007.
I've had to edit configuration files to get hardware working in the last month.
On a laptop we have here at the office, I had to stop one driver from loading before a different one, or else the two would squabble over the wifi card and it would never show up.
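A typical modprobe.d edit for this kind of conflict looks something like the following (the module name here is only an example, not the actual driver involved):

    # stop the conflicting wifi module from being auto-loaded at boot
    echo "blacklist brcmsmac" | sudo tee /etc/modprobe.d/blacklist-wifi.conf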
The other thing, which is more software-related than hardware-related, is that this is an MS Windows shop: all of the local domains are machine.domainname.local. As you can imagine, this conflicts with mDNS, so the Linux machines were unable to access any resources on machines named machine.domainname.local because mDNS would respond with a failure. I had to modify /etc/nsswitch.conf to fix that issue.
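For anyone hitting the same thing, the change is a one-line edit to the hosts entry, roughly like this (the "before" line is the typical Debian/Ubuntu default):

    # /etc/nsswitch.conf
    # before: mdns4_minimal claims *.local and returns NOTFOUND, so DNS is never consulted
    #   hosts: files mdns4_minimal [NOTFOUND=return] dns
    # after: let unicast DNS resolve the corporate .local names
    hosts: files dns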
Linux is not without its failures. Saying it "just works" is certainly not the case. The Mac OS X machines I deploy, by contrast, come out of the box, get configured and are ready to go. Drivers work, software works, and there's no need to go googling for hours trying to figure out why ping won't resolve a machine.domainname.local address while dig is doing just fine.
I said if you spend a little time scoping these kinds of things out up-front, it's very easy to get a machine on which it will work without issues. Obviously if you found a machine lying around the office and tried to load an OS on it, your chances of it working well are not going to be as good. You can't just load OS X on random hardware and expect it to work either.
After 10 years of Linux desktops, including purchasing a laptop with low-performance hardware just because it used entirely OSS drivers, I still had issues with basic things like multiple monitor support (it worked, but only when I disabled compositing, which makes redraw suck).
Sorry, from my personal experience that still isn't true, as much as I wish it were.
I've been using dual-monitor setups at home and at work for a few years now, across several hardware builds, both desktops and laptops: first on Ubuntu, then switching to Debian Testing for a rolling distribution about a year ago (after finally realizing I love the latest software but don't want to spend time upgrading/rebuilding every 6 months to keep up with Ubuntu's release schedule, or deal with the hassle of installing everything from source). I've been using Nvidia cards with the nvidia driver, and dual-monitor support has been fantastic for me.
Well, I've been using Nvidia cards with the nvidia driver and dual-monitor support as well, and it was fantastic, until I upgraded my Ubuntu install and got Unity without asking for it, after which multi-monitor support was completely broken. My colleague, who has 2 screens of a slightly different type (all are Lenovo ThinkVision) and is running Arch with Gnome 3, has been experiencing random multi-monitor glitches since day one. Sometimes, for no apparent reason, one of his displays doesn't get a signal after waking his laptop from sleep or hibernate, and the only way to resolve it is a reboot.
To make a long story short: we could exchange anecdotes all day about the state of "Just Works" on Linux, but at the end of the day, I think no one with enough experience using various Linux distros and OS X can honestly and sincerely say Linux is even close to OS X in that respect.
Myself, I've been using Linux since Slackware 4 and have tried about 10 different distros over time, alongside OS X for the last 5 years or so. To this day, I regularly run into problems that need fixing on Linux, particularly after upgrades or when switching hardware. Whether it's wifi cards, USB hardware, multi-monitor support, network configuration issues, software that stops working, or system library problems: there's always something. With OS X on 3 different machines, from 10.4 through 10.7, I've only had one issue that required maintenance, a b0rked upgrade. It was pretty nasty, but fortunately OS X has Time Machine and target disk mode, so in no time I was able to pull off any important data just to be safe, re-install the OS, and restore my Time Machine backup, only to find everything was back to normal, to the point that I didn't even need the files I'd pulled off before the restore.
I still haven't figured out how to get my phone to tether over USB on my friend's Mac. Works flawlessly on my Debian machine though. It's not like OS X is seamless.
Well, I would say yes: Linux has come a long way and is much more mature and usable than before.
However, to this date it is not without issues, especially on laptop hardware. Remember the Lenovo ThinkPad T400 from a few years ago? The level of stability with a popular distro like Ubuntu has been quite up and down: in one release (11.04) I had trouble getting it to boot and play nice with dual graphics mode; today, with 11.10, it's much better. How about that shiny Acer Aspire One 722 netbook? You should check the online threads: there are still discussions about how to prevent freeze-ups and the like. All these little quirks here and there are the reason I would still run OS X.
Interesting - I do about the same thing (minus the antivirus... that could be your big pig there), with 8 gigs of physical RAM, and I have paging (swapping?) disabled... and I've never had a "not enough memory" crash or anything like it.
With VMware going, iTunes, Xcode building stuff, Skype, Dropbox, iTerm, Mail, a few browsers, a bunch of tools... maybe a video going on the second monitor. On a mid-2009 MBP.
I do happen to have an 8 GB machine, and almost never reboot (only reboot for OS updates).
I run half a menu bar full of resident helper apps, like Dropbox (a big one), Fantastical, ScanSnap, Xmarks (another big one), Transmit (another big one), Evernote, and more. I also keep Apple Mail running, mapped to half a dozen Gmail IMAP accounts. I have GeekTool updating my desktop with iCal appointments and various ps outputs.
I'm running a local MAMP stack and local Django stack. I run a Parallels Windows 7 VM for testing things in IE and testing from Windows in general.
Other running software is usually Safari, Terminal, Sublime Text 2, Codebox, Source Tree, Sequel Pro, Adium. I run and quit Office 2011 every time I need to edit a document. I run and quit Aperture and Photoshop.
Using this command line to see memory used by processes:
> I do happen to have an 8 GB machine, and almost never reboot (only reboot for OS updates).
Here's what I don't understand - why would anyone in their right mind purchase a $2000 laptop and then not spend the 20 minutes and 100 bucks to max out the memory on the thing? It's the easiest thing in the world to do, and basically means you never have to worry about memory usage again.
Recall that what "matters" is really that second line, which translates to 740MB being really "used" and 3.1GB being just "stuff I happened to read from disk at one point", which as it happens includes rather a lot of media files. Loading another 4GB of media into RAM isn't going to help my system performance any.
This is with a respectable Linux dev loadout, minus my VMs, though even those tend not to strain my system much. $100 on RAM would just be a wasted $100.
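(The "second line" I'm referring to is the -/+ buffers/cache row that free prints on Linux; its "used" column excludes the page cache, which the kernel drops the instant an application actually needs the RAM:)

    # the "-/+ buffers/cache" row is the figure that actually matters
    free -m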
And/or get an SSD - the heavyweight disk-space users like movies and music are trivial to move to an external drive on a Mac, and with an SSD so much disk-thrashing pain just goes away. It's pretty great.
After noting my rather older MBP with 8GB RAM was outperforming my new iMac, I spent $16 for 4GB of RAM (doubling the total to 8GB); the iMac's performance improved immeasurably. By far the best bang for the buck on OSX is maxing out the RAM. (Well, at least until I can afford a solid state drive of adequate size.)
You add up the individual RSS of every process, but processes share physical pages, so that doesn't make any sense: you're counting physical pages more than once. For example, on my 4GB laptop your command yields 5.6GB, and I don't even have a swap partition.
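(For illustration, a sum along these lines, which is presumably what the script does, counts every shared page once per process that maps it:)

    # naive total: add up every process's resident set size (in KB), shared pages and all
    ps axo rss | awk '{sum += $1} END {printf "%.1f GB\n", sum/1048576}'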
I'm skeptical about your little script (though I too have a lot of heavy apps open all the time and almost never reboot, and certainly have never, ever had any memory-related problems whatsoever).
My MBP has 8 GB of RAM, and this is what Activity Monitor tells me:
I think the problem is that you only used OS X for a year. I've been using my Mac for 4 years, and Linux for another 4 before that. I have Python (in fact I have Python 2.5, 2.6, 2.7, 3.0 and 3.1 installed at the same time for testing purposes), and I have installed so many packages I've lost count. You only need to be sure the path is correct when doing it.

And I have 2 GB of RAM. Mac OS can be a bit hungry, but there's an "easy" solution for this: closing unused applications. That bright dot in the dock means something is using an incredible amount of RAM just to sit there doing nothing. If I only leave my mail app, Chrome, emacs and my Twitter client open, I can go for days without ever hitting the disk cache. If I start adding more things (or tabs like I'm crazy), I'll hit the cache, of course. But this also happens on my Linux systems; it's not a Mac OS problem.

As for the DVI port: I usually leave the DVI adapter attached to the monitor I use, and use Bluetooth devices when possible. So far I have only once used a hub to connect 3 things to my MacBook (an SD reader to copy images, my Ben Nanonote to install some packages and my iTouch to sync with iTunes). That happened once, in 4 years (and I also have a drawing tablet, btw).
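(By "the path is correct" I just mean knowing which interpreter a given shell picks up, e.g.:)

    # check which python the shell resolves and where its packages live
    which python
    python -c "import sys; print(sys.prefix)"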
That's not a solution, that's a workaround. I'm sitting here with a 4GB RAM machine running Arch Linux with XMonad. I usually run the following applications in day to day usage:
- Firefox Nightly with 50-100 tabs (largest memory hog, the rest doesn't even come close)
- Thunderbird
- Pidgin
- XChat
- smplayer, which I usually keep open
- deluge
- mpd + ncmpcpp
- half a dozen to a dozen terminator instances with zsh, most running text-mode applications like vim, htop or the aforementioned ncmpcpp
I reboot my PC once in a blue moon, usually after kernel updates; otherwise it's running 24/7. Right now I sit at ~35% memory usage (and that should include memory used for disk caching), a bit less than half of which is Firefox at ~16% (65 tabs). I usually never go over 50% unless I'm compiling heavy stuff (Qt-level heavy). I don't really know what the fuck OS X does to eat all that RAM, but it apparently does something wrong.
Emulating a setup similar to yours, the OS X machine in front of me sits at 2.8GB active, of which a little less than 800MB is taken by Terminal.app, because I have infinite scrollback enabled and two Rails processes that have been hit 8 hours a day since forever as I develop. So it's really closer to 2GB, and I didn't even try hard. This includes mds+mdworker (the indexer), which clocks in at 200MB. Normally I also have LibreOffice, Pixelmator, iCal, iTunes, Reeder and VMware Fusion open, which makes it balloon to 3.1GB and up as I open more documents/VMs.
I have experienced OS X's "swappiness", having gone as far as disabling dynamic paging in an attempt to avoid it. Upgrading memory was the only real solution. A little bit of research reveals that a lot of other people have run into the same problem; you aren't alone in that at all.
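(For the record, the recipe that usually gets passed around for disabling dynamic paging is roughly the following; I wouldn't recommend it, since upgrading memory was what actually fixed things for me:)

    # unload the dynamic pager and remove the existing swap files (reboot afterwards)
    sudo launchctl unload -w /System/Library/LaunchDaemons/com.apple.dynamic_pager.plist
    sudo rm /private/var/vm/swapfile*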
I split my time between OS X and Linux, and it's pretty obvious to me that Linux is vastly superior in terms of performance in a wide range of scenarios. I prefer to use Linux on older and/or memory-constrained systems.