Do you remember if the textbook was orange (possibly with a two-tone cover design)? I had a really good textbook in college that had a... 4? CD set (with the big jewel case) that had a bunch of tracks and like you I really enjoyed it.
It was a reddish (could have been orange, could have been maroon) color, lightly mottled in black, with a picture of a violin (or cello, idk) set in the lower 2/3 of the cover. I'm somewhat certain it had 6 CDs because it filled the disc changer in my stereo, although that detail is fuzzy too.
In contrast to seemingly all the other responses to this, I use it the same way you do (only opening it when needed) and I'm fine with the delay: even at its slowest, rebuilding the index and searching is faster than the built-in Windows Search.
I always liked the cases where people have broken open ROMs only to find that the compilation process stuffed a bunch of directory and filenames in the silicon. It's funny to think that in a time when literally every byte cost money there's a fair number of carts that have people's FAT entries burned into them.
Well... every byte costs money, but the cost is heavily quantized. If your game doesn't fit in a power of two, you have to pay more for the bigger chip (maybe a lot more if you also need to include a mapper, and using a mapper means adjusting your code and whatnot). But if you're only using 75% of your quantized space, you don't need to be that picky.
The last time I built a ROM (back in the '90s) it had 7 free bytes of space. That was not a coincidence: it started out larger, I optimized until it fit in a nice power of two, and then I stopped optimizing.
It is quantized, but not necessarily to a power of two. RAM and flash sizes of 3x2^n are common even in contemporary microcontrollers, and in the less integrated computers of yore it was clearly possible to populate fewer RAM chips than your bus decoder supported.
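That quantization is easy to picture with a hypothetical part list; the sizes below are purely illustrative (not any particular vendor's lineup), including the 3x2^n "odd" parts like 48K and 96K:

```python
# Hypothetical stocked ROM sizes in KiB: powers of two plus
# 3*2^n parts (48K = 3*2^14, 96K = 3*2^15).
PART_SIZES = [16, 32, 48, 64, 96, 128]

def smallest_part(image_kib: float) -> int:
    """Return the smallest stocked chip that fits the image."""
    for size in sorted(PART_SIZES):
        if size >= image_kib:
            return size
    raise ValueError("image too large for any stocked part")

print(smallest_part(28))   # 32: ~12% slack, no incentive to squeeze further
print(smallest_part(40))   # 48: the 3*2^n part spares you a 64K chip
```

The jump between adjacent sizes is what makes the cost "quantized": shaving bytes only pays off when it drops you across one of those boundaries.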
And once you have a project with a fixed amount of ROM, either because the hardware is at a point in its life cycle where you can’t change it, or it’s third-party hardware and you have to use what you get, or because you’re already at your BoM budget, then your software will behave like an ideal gas - it will expand to fill the available ROM. This happens because until you run out of ROM, you will write your software in whatever way is easiest to get your job done. But then once you run out of ROM, you will go back and look for something that you can make smaller. Then you can add a few more features or whatever, until you run out of space again. This process will repeat until you’re done, but your ROM will always be nearly full.
Software complexity following an ideal gas law is an analogy that works in a ton of ways… time (you’ll keep adding stuff until you’re running close to a deadline at which point you have to actually decide what to cut), head count (complexity will rise to match the number of people working on it, since people aren’t just gonna sit idle), speed (it’ll get slower until it’s “noticeable” on the dev’s machine, at which point it gets optimized), etc etc etc.
I’ve used this analogy with PMs who have trouble understanding why adding more engineers to a project doesn’t make it go faster: the software expands to fill its container.
It's usually more common on CD games, with the directory/file being unlisted, but it's always interesting to find partial source code baked into ROMs. Last I heard, it usually gets there as a kind of padding; in the case of CD-ROMs, it's an artifact of the sector-copying mechanism in disc duplicators.
Also, the bytes cost money, but if you have a 32K ROM (because the next lowest size is 16K) and only 28K of data, it's not costing anything extra to fill that space.
Every chip cost money, right? If your game shipped on a single 1 Mbit EEPROM, it cost the same amount whether it was 90% full or 99.99% full. There was of course a cost incentive not to go over 1 Mbit, or to get your size down enough to fit onto a 512 Kbit EEPROM.
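Since the leftover space is free either way, the build's last step is typically just padding the image out to the chip size; a minimal sketch (the function and fill convention here are illustrative, though 0xFF is the usual choice since it's the erased state of EPROM/flash):

```python
def pad_rom(image: bytes, rom_size: int, fill: int = 0xFF) -> bytes:
    """Pad a program image out to the full ROM size.

    0xFF is the conventional fill: it's the erased state of an
    EPROM/flash cell, so the padding costs no programming time.
    """
    if len(image) > rom_size:
        raise ValueError(f"image ({len(image)} B) exceeds ROM ({rom_size} B)")
    return image + bytes([fill]) * (rom_size - len(image))

padded = pad_rom(b"\x00" * (28 * 1024), 32 * 1024)
print(len(padded))  # 32768: a 28K image in a 32K chip, 4K of 0xFF slack
```

Tools that pad with whatever happens to be in the build host's buffers instead of a fixed fill byte are exactly how stray FAT entries and source fragments end up burned into carts.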
In Windows, when you set a wallpaper it (sometimes) silently transcodes the one you selected into a new, smaller version. It doesn't always; there's a heuristic, but the assumption is that it happens to prevent people from selecting a 1TB terapixel photo and having it destroy the machine.
Anyway, since it transcodes the wallpaper into a JPEG, it gets to pick a compression quality. That quality is pretty famously < 100%, and as a result there are degenerate cases where a wallpaper that looks good when viewed from the filesystem looks terrible once set as the background.
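If memory serves, that transcode quality can be raised via a registry value (JPEGImportQuality, reportedly defaulting to 85); treat the exact key and behavior as something to verify on your particular Windows build before relying on it:

```
reg add "HKCU\Control Panel\Desktop" /v JPEGImportQuality /t REG_DWORD /d 100 /f
```

Setting it to 100 doesn't skip the transcode, it just makes the recompression nearly lossless.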
I've seen machines which needed to swap to show the desktop, because the wallpaper was a high/true-colour BMP and it was like half the memory the machine had!
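The arithmetic on that is believable; as a quick back-of-the-envelope (the resolution and RAM figures here are illustrative, not from any specific machine):

```python
# A true-colour desktop bitmap is held uncompressed in memory:
# width * height * bytes-per-pixel.
width, height = 1600, 1200
bytes_per_pixel = 4                 # 32-bit "true colour"
bitmap_bytes = width * height * bytes_per_pixel

ram_mib = 16                        # a plausible machine of that era
print(bitmap_bytes / 2**20)              # ~7.3 MiB for the wallpaper alone
print(bitmap_bytes / (ram_mib * 2**20))  # ~0.46, i.e. roughly half of RAM
```

On a 16 MiB box, one decoded wallpaper really could crowd out everything else and force the machine to swap just to paint the desktop.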
Do things like the upgradable firmware and integrated text editor live in a path that's separate from the 8 KB of data? If not, I'd fear it would die before the actual storage did.
I also vaguely recall that Sonic 3 used this storage type? Like it was a weird outlier for cartridge-based saves IIRC.
Or more broadly, don't do something unbidden on my behalf then complain when it doesn't work. I swear 75% of the times I've daydreamed about a return to a single-tasking OS like DOS was on account of software trying to 'help' me.
Yeah, people frequently trot out the classic SimCity example, but the fact is that games in particular have borne the brunt of Windows API changes. If that weren't the case, we'd have no need for wrappers like dgVoodoo: software originally meant only to wrap Glide calls, but now used to get around deprecated DirectX interfaces as well.
Not to mention the breaking changes to the audio stack that happened between 9x and Vista.
I don't know if anything's changed, but for a long time a lot of Word's formatting depended on the printer you had installed and/or selected. So it was pretty easy to run into scenarios on a network where a printer being taken offline, or its capabilities changing, resulted in documents getting reformatted.