Having used both, Bitwarden has been a better overall experience for me and much better for my friends and family. Fast, easy, polished, powerful (TOTP available by default).

* Great data ownership philosophy and data storage flexibility.
* Poor cross-platform app experience, especially on mobile (iOS in particular). This isn't a big deal for many of us on here, but it presented a large barrier to entry for my non-tech-savvy friends & family.

> Why does the file size matter? Are the devices you use so short on storage that an extra 100 mb is an issue?

No, but just about any aspect of computing benefits from smaller file sizes, or smaller data size in general, starting with CPU caches, RAM caching of files, and file transfers, including syncing things over the network and backing things up. The more free space an SSD has, the smarter it can be about wear leveling (I think, though I have no idea how much this matters in practice). I mean sure, if the data just sits there and you don't do anything else with the machine but run a password manager, it really doesn't matter, but we tend to run dozens of programs actively, with even more running in the background, and all of this adds up quick even on one machine. And then there are billions of people using even more devices.

In this case, I like using KeePassXC portable, so if the size is the result of having fewer outside dependencies, I'm fine with it, don't get me wrong. But generally, this attitude of just throwing hardware at software is a problem, which by now has reached gigantic proportions IMO, and you made the argument generally. Imagine some kind of character encoding that is exactly Unicode, but every character gets repeated 10 times. Not for any useful reason, just so people can show they can afford beefy hardware and waste it. Consider that this runs on hardware from 1981: We cannot even begin to imagine what our current hardware would be capable of, if we only allowed ourselves the time to use it well. The same achieved with less is always better, I'll just claim that. The whole universe in all its infinite wealth cannot change that, and we live on a planet that's about to get ruined real hard because of our consumption of materials and energy. For one-off things with limited use, knock yourself out, of course, but if you package something for distribution, it may stay around "forever" and get handled countless times, by servers and end-users, so if it can be made smaller without making it worse and without extreme hassle, make it smaller. Storage may very well become so big and cheap as to be practically infinite, but CPU will always cost energy.
My preferred way of installing and managing software on Ubuntu is compiling the source to /opt and then installing it to the system with the Debian/Ubuntu Update Alternatives system: it allows you to install multiple different versions side by side and toggle which of them is called by the canonical system command in /bin, /usr/bin or wherever. You install them in /opt or somewhere else where collisions won't occur, then use the update-alternatives command to link them into /bin, /usr/bin or other system directories. It takes a little more work up front, but it is worth it: it makes both upgrades and rollbacks completely painless, and you get to decide which version you use. The upfront work is in scripting the various binaries and manpage files that need to be linked together, atomically, into system directories. Here's an old (out-of-date) example script for installing Haskell Platform after building it in /opt/ that demonstrates this. All subcommands must be slaved to the main command, and update-alternatives enables this.
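For example, a minimal sketch of what such a registration can look like for two GHC builds living under /opt (the paths, version numbers and priorities here are purely illustrative, not the original Haskell Platform script):

    # Register the 9.4 build; ghci, ghc-pkg and the manpage are slaved to the
    # main ghc command so they always switch together with it.
    sudo update-alternatives --install /usr/bin/ghc ghc /opt/ghc-9.4/bin/ghc 100 \
        --slave /usr/bin/ghci ghci /opt/ghc-9.4/bin/ghci \
        --slave /usr/bin/ghc-pkg ghc-pkg /opt/ghc-9.4/bin/ghc-pkg \
        --slave /usr/share/man/man1/ghc.1.gz ghc.1.gz /opt/ghc-9.4/share/man/man1/ghc.1.gz

    # Register a second build of the same alternative group with a different priority.
    sudo update-alternatives --install /usr/bin/ghc ghc /opt/ghc-9.8/bin/ghc 110 \
        --slave /usr/bin/ghci ghci /opt/ghc-9.8/bin/ghci \
        --slave /usr/bin/ghc-pkg ghc-pkg /opt/ghc-9.8/bin/ghc-pkg \
        --slave /usr/share/man/man1/ghc.1.gz ghc.1.gz /opt/ghc-9.8/share/man/man1/ghc.1.gz

    # Choose which version the /usr/bin links point at (upgrade or rollback):
    sudo update-alternatives --config ghc

Because the whole group switches as one alternative, the main binary, its subcommands and the manpage always point at the same /opt installation.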