• 0 Posts
  • 34 Comments
Joined 1 year ago
Cake day: August 8th, 2023


  • Until a couple of weeks ago I used Fedora Silverblue.

    Then, after mostly using GNOME Shell for about a decade, I (reluctantly) tried KDE Plasma 5.27 on my desktop due to its support for variable refresh rate and since then I have fallen in love with KDE Plasma for the first time (retrospectively I couldn’t stand it from version 4 until around 5.20).

    Now I am using Fedora 39 Kinoite on two of my three devices and Fedora 39 KDE on a 2-in-1 laptop that requires custom DKMS modules (not possible on atomic Fedora spins) for the speakers.

    Personally I try to use containers (Flatpaks on the desktop and OCI images on my homeserver) whenever possible. I love that I can easily restrict or expand permissions (e. g. I have a global nosocket=x11 override) and that my documentation is valid with most distributions, since Flatpak always behaves the same.
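As a sketch, a global X11 socket restriction like the one mentioned above is set with `flatpak override` (the app ID below is a placeholder; `--nosocket`, `--socket` and `--show` are the standard flags):

```shell
# Remove X11 socket access from all Flatpak apps by default
flatpak override --global --nosocket=x11

# Inspect the resulting global override
flatpak override --global --show

# Grant the socket back to a single app that genuinely needs it
# (org.example.LegacyApp is a placeholder ID)
flatpak override --user --socket=x11 org.example.LegacyApp
```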

    I like using Fedora, since it isn’t a rolling release, but its software is still up-to-date and it has always (the first version I used was Fedora 15) given me a clean, stable and relatively bug-free experience.

    In my opinion Ubuntu actually has the perfect release cycle, but Canonical lost me with their flawed-by-design snap packages and their new installers with incredibly limited manual partitioning options (encryption without LVM, etc.).


  • My whole infrastructure is designed so that my homeserver is expendable.

    Therefore my most important tool is Syncthing. It is decentralised, which is awesome for uptime and for reducing dependence on a single point of failure. My server is configured as the “introducer” node for convenience.

    I try to find file-based applications, such as KeePassXC or Obsidian, whenever I can so that I can sync as much as possible with Syncthing.

    Therefore there is (luckily) not much left to host and all of it is less critical:

    • Nextcloud AIO: calendar, contacts, RSS, Syncthing files via external storage
    • Webserver: Firefox search plugins (Why is this necessary, Mozilla?!), custom uBlock Origin filter list, personal website

    So the worst thing that can happen when my server fails is: I need to import my OPML into a cloud provider, I lose syncing for some less important stuff and my homepage is not accessible.

    Since I just rebuilt my server, I can confirm that I managed a whole week without it just fine. Thank you very much, Syncthing!


  • FOSS Is Fun@lemmy.ml to Open Source@lemmy.ml · Distrochooser · 1 year ago

    Linux Mint nowadays supports release upgrades, but you have to follow their blog to know when a new major Mint release is out and you have to manually install mintupgrade and do the upgrade.

    So it is definitely not caused by technical constraints, as Mint has already implemented the difficult part (providing and testing an upgrade path). Notifying the user about a new release shouldn’t be too difficult either. E. g. in its simplest form, you could probably preinstall a package that does nothing at first, but receives an update once the next Mint release is out and then sends a notification informing the user about the new release.
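Roughly, the manual upgrade flow on Mint looks like this (paraphrased; the exact steps for each release are spelled out in Mint’s upgrade announcements, which you still have to follow to know the release exists):

```shell
# In-place release upgrade on Linux Mint (sketch)
sudo apt update
sudo apt install mintupgrade
sudo mintupgrade              # runs pre-upgrade checks, then performs the upgrade
sudo apt remove mintupgrade   # clean up the upgrade tool afterwards
```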

    When it comes to elementary OS, I think they could support in-place upgrades, as they properly use metapackages (unlike Mint, which marks most packages as manually installed and doesn’t really utilise automatically installed packages and metapackages the way you would expect on an Ubuntu-based distro), but they probably don’t want to allocate, or don’t have, the resources to test an official upgrade path.
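A quick way to see this difference is apt’s manual/automatic markings; on a cleanly metapackage-driven system the manually installed list stays short (`ubuntu-desktop` below is just the usual example of such a metapackage):

```shell
# Packages apt considers manually installed; on a metapackage-based
# system this should mostly be the metapackages themselves
apt-mark showmanual | head

# A desktop metapackage pulls in the actual desktop packages as
# automatic dependencies, so they can be upgraded or swapped cleanly
apt-cache depends ubuntu-desktop | head
```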

    But again, I don’t understand why it is so difficult for elementary OS to at least show a simple notification to the user that a new version is out. Even if users have to reinstall, it is critical to inform them that their OS is about to reach end of life. You know, people do things like online banking on their computers …

    It’s the first thing I check with every distribution and if it doesn’t have an EOL / upgrade notification, it is immediately out.


  • This has always been the case with Ubuntu. Ubuntu only ever supported its main repository with security updates. Now they offer (paid) support for the universe repository in addition, which is a bonus for Ubuntu users, as they now have a greater selection of packages with security updates.

    If you don’t opt in to Ubuntu Pro, nothing changes and Ubuntu will be as secure (or insecure) as it has always been. If you disable universe and multiverse, you have an Ubuntu system where all packages receive guaranteed security updates for free.
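For example, the universe and multiverse components can be disabled with `add-apt-repository` (a sketch; editing the sources list by hand works just as well):

```shell
# Restrict the system to the repositories with guaranteed
# free security updates (main and restricted)
sudo add-apt-repository --remove universe
sudo add-apt-repository --remove multiverse
sudo apt update
```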

    Please note: I still don’t recommend Ubuntu due to snapd not supporting third-party repositories, but that’s no reason not to get the facts right.


    Debian has always been the better choice if you required security updates for the complete package repository.

    Personally I have my doubts whether Debian actually manages to reliably backport security updates for all its packages. After all, Eclipse was stuck on version 3.8 for multiple Debian releases due to the lack of a maintainer …


  • There are plenty of reasons to get rid of Ubuntu, but this isn’t one of them.

    Before Ubuntu Pro, packages in universe (and multiverse) did not receive (security) updates at all, unless someone from the community stepped up and maintained the package. Now Canonical provides security updates for universe, for the first time since Ubuntu was introduced, via Ubuntu Pro, which is free for up to five personal devices and paid for all other use cases.
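The opt-in itself is a short procedure with the Ubuntu Pro client (`<token>` stands in for the token from your Ubuntu account):

```shell
# Attach the machine to a (free for personal use) Ubuntu Pro subscription
sudo pro attach <token>

# Enable the expanded security coverage for universe packages
sudo pro enable esm-apps

# Show which installed packages are covered by which repository
pro security-status
```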

    Debian is actually not that different (anymore). If you read the release notes of Debian 12, you’ll notice that quite a few package groups are excluded from guaranteed security updates, just like packages in universe are in Ubuntu. Unlike Ubuntu, Debian doesn’t split its package repository by security support though.


  • FOSS Is Fun@lemmy.ml to Open Source@lemmy.ml · Distrochooser · 1 year ago

    It misses one important choice: “I want to get notified of new releases of the operating system and want to have a graphical upgrade path.”

    Otherwise people just run their no longer supported OS until something stops working (I’ve seen this countless times …), as very few people follow blog posts or social media feeds of their operating system.

    This rules out lots of supposedly “beginner friendly” distributions, such as elementary OS or Linux Mint, as they don’t notify users about the availability of a new distribution release. Elementary OS doesn’t even offer in-place upgrades and requires a reinstallation.





  • Innovation or regression?

    Innovation doesn’t necessarily mean that all past functionality needs to be carried over. Actually innovation often means that past technology becomes obsolete and gets replaced with something new.

    “Gnome used to have optional desktop icons. They removed them.”

    They removed them because with GNOME Shell those icons no longer made sense. There was no longer a concept of dragging apps from a panel menu to a desktop, instead apps were now pinned from the fullscreen app overview to the dash.

    Since the code was no longer used by the default GNOME experience, it became unmaintained and eventually got removed.


  • “Because GNOME is the only DE with some potential and by not having 2 or 3 simple optional features aren’t getting more traction.”

    But everyone has different requirements and my “2 or 3 simple optional features” that are missing are completely different than what you think is missing. I couldn’t care less about desktop icons or system trays. I even prefer not having a system tray, as this functionality should be provided via notifications and regular application shortcuts in my opinion.

    But in the end, a software project only has a limited amount of resources available and developers have to decide where to focus. GNOME chose not to focus on desktop icons:

    “GNOME had icons, v3.28 discontinued them”

    Because the code was “old and unmaintained” and probably no one was willing to modernise and maintain it. Desktop icons were already disabled by default before 3.28, so they didn’t “re-invent” this feature with the removal of the code in Nautilus.

    “Using other DE doesn’t make much sense as you’ll inevitable run in GTK and parts of GNOME and having to mix and match to get a working desktop experience.”

    I use GNOME and KDE and use the same applications (as Flatpaks) on both desktops: I use GNOME Calculator on KDE, because I dislike both KDE calculators, and I use Ark on GNOME with a Nautilus script, as File Roller doesn’t allow me to set the compression ratio (I need to create zip files with 0 compression for modding games). So for me it has become the norm to mix applications created with different toolkits. Thanks to Flatpak I still have a “clean” base system though.
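For context, a Nautilus script is just an executable file dropped into Nautilus’ scripts directory; a minimal sketch that hands the selected files to Ark (assuming Ark’s batch options, which open a dialog where archive format and compression level, e. g. zip with level 0, can be chosen) could look like this:

```shell
#!/bin/sh
# Save as e.g. ~/.local/share/nautilus/scripts/compress-with-ark
# and mark it executable. Nautilus passes the selected files as
# arguments; Ark's --add --dialog asks for the archive name and
# compression settings.
exec ark --add --dialog "$@"
```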

    Btw. I am getting tired of these recurring complaints that GNOME works differently than other desktops. I don’t constantly complain about the features KDE is, in my opinion, missing either (e. g. dynamic workspaces, the same wallpaper and desktop configuration across all existing and new monitors, online account integration, a command line config tool, etc.); instead I accept that this is how it is at the moment and either use KDE the way it is (like I do on my desktop PC) or use something that better suits my needs (like I do on all my laptops).


  • FOSS Is Fun@lemmy.ml to Linux@lemmy.ml · Who uses pure GNOME (no extensions) · 1 year ago

    Because it takes manpower to develop and maintain these features?

    Desktop icons especially are difficult to get right (see workarounds like “ReIcon” on Windows), e. g. keeping icon positions across multiple monitors with varying resolutions and displays that can be unplugged at any time. They can also be a privacy issue, e. g. when giving a presentation.

    But most importantly: GNOME doesn’t want to be a traditional (Windows-like) desktop, so why would they implement features that don’t align with their ideas for a desktop experience?

    There are lots of other desktops, like Cinnamon, that offer a traditional desktop experience within the GTK ecosystem. There is also plenty of room for desktops, like GNOME, that have a different philosophy and feature set.

    In my opinion it would be boring if every desktop tried to do the same thing. And there wouldn’t be any innovation if no one tried to do things differently.



  • I’ve tried to combat this a bit with a global Flatpak override that takes unnecessarily broad permissions away by default, like filesystem=home, but apps could easily circumvent it by requesting permissions for specific subdirectories. This cat-and-mouse game could be fixed by allowing a recursive override, such as nofilesystem=home/*.
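For reference, the global override and a per-app inspection might look like this (`org.example.SomeApp` is a placeholder ID):

```shell
# Revoke broad home-directory access from all Flatpak apps
flatpak override --global --nofilesystem=home

# An app's own static permissions can still request a specific
# subdirectory, which the override above does not catch; check
# what a given app effectively requests:
flatpak info --show-permissions org.example.SomeApp
```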

    But even then, there is still the issue with D-Bus access, which is even more difficult to control …

    I think it is sad that Flatpak finally provides the tooling to restrict desktop apps in the same way that mobile apps have been restricted for a decade, but the implementation chooses to be insecure by default and only provides limited options to harden it.


  • I was in a similar situation not too long ago.

    My criteria for another scripting language included that it should be preinstalled on all target systems (i. e. Debian and Fedora), it should be an interpreted language and it needs to have type safety.

    In the end I settled on Python due to its popularity, its syntax and features (optional type annotations since v3.6, etc.) and the fact that it is preinstalled on many Linux distributions. System components often use Python as well, which means that libraries to interact with the system tend to be included by default.
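As a small illustration of the kind of type safety Python offers (annotations are not enforced at runtime; a checker like mypy validates them), `parse_port` here is just a made-up example:

```python
from typing import Optional

def parse_port(value: str) -> Optional[int]:
    """Return the port as an int, or None if it is not a valid port."""
    if not value.isdigit():
        return None
    port: int = int(value)  # variable annotations arrived in Python 3.6 (PEP 526)
    return port if 0 < port <= 65535 else None

print(parse_port("8080"))  # 8080
print(parse_port("http"))  # None
```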




  • Actually that’s one of the main reasons I use Syncthing: It doesn’t need a server, as it has a peer-to-peer architecture. Unlike with a centralised solution (cloud storage, Nextcloud, etc.), devices sync directly with each other. If they are on the same local network, you get to enjoy the full bandwidth of your local network. If they need to sync over a long distance over the internet, you are limited by the upload and download speeds of your internet provider, just like with centralised storage.

    I have a server that serves as an introducer, so I don’t have to connect each device with every other device manually. But the server doesn’t need to be available once all devices are connected with each other.

    Syncing continues to work without it for as long as I don’t reinstall any of the other devices. And even if I’d reinstall a device, I could delegate any other device to be the introducer or connect the devices manually with each other. It really is quite robust and fail-safe.
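For reference, the introducer role is a single attribute on the device entry in Syncthing’s config.xml (device ID shortened here; normally you just tick the “Introducer” checkbox when adding the device in the web UI):

```xml
<!-- Fragment of config.xml on a client device: introducer="true"
     means new peers known to "homeserver" are added automatically. -->
<device id="ABCDEFG-..." name="homeserver" introducer="true">
    <address>dynamic</address>
</device>
```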


  • FOSS Is Fun@lemmy.ml to Linux@lemmy.ml · Back to linux! · 1 year ago

    Nowadays switching to Windows isn’t really an option for me anymore, as I am just too invested in the Linux ecosystem.

    It’s always funny hearing about how difficult it is to switch from Windows to Linux, because you have to relearn how to use a computer and all your favourite software isn’t available.

    But for me it’s the same, just the other way around! I would have to relearn how to document my installation (scripts, etc.), which program to use for which task, or how to force a game onto a certain monitor (the last time I looked into this, the only way on Windows was to switch the primary monitor before starting said game; on Linux I can just tell KWin how the program should behave).

    It would be a lot of work with little or no benefit to me and I’m not even sure if all my hardware is compatible with Windows, as I did all my software and hardware purchases in the last decade with only Linux in mind and I usually didn’t purchase something if the manufacturer offered no support for Linux (money talks).