Canadian software engineer living in Europe.

  • 4 Posts
  • 107 Comments
Joined 2 years ago
Cake day: June 7th, 2023


  • The bit of information you’re missing is that du aggregates the size of all subfolders, so when you say du /, you’re saying: “how much stuff is in / and everything under it?”

    If you’re sticking with du, then you’ll need to traverse your folders, working downward until you find the culprit folder:

    $ du -sh /*
    (Note which folder looks the biggest)
    $ du -sh /home/*
    (If /home looks the biggest)
    

    … and so on.

    The trouble with this method however is that * won’t include folders with a . in front, which is often the culprit: .cache, .local/share, etc. For that, you can do:

    $ du -sh /home/.[!.]*
    

    Which should do the job. (The .[!.]* pattern is used instead of a plain .* because .* also matches .., which would drag the parent directory into the totals.)
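
    If you’d rather not walk the tree by hand at all, a one-liner along these lines (assuming GNU du and sort) summarises every top-level entry, hidden ones included, and sorts them by size:

    $ sudo du -xh --max-depth=1 / | sort -h
    

    Here -x keeps du on a single filesystem, --max-depth=1 limits the output to the top level, and sort -h orders the human-readable sizes so the culprit ends up at the bottom.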

    If you’ve got a GUI though, things get a lot easier 'cause you have access to GNOME Disk Usage Analyzer which will draw you a fancy tree graph of your filesystem state all the way down to the smallest folder. It’s pretty handy.



  • Daniel Quinn@lemmy.ca to Linux@lemmy.ml · Why? · 25 days ago

    I was a Windows user as a kid in the 80s & 90s doing pirate installs of 3.11 and later 95 for friends and family. I got into “computers” early and was pretty dedicated to the “Windows is the best!” camp from a young age. I had a friend who was a dedicated Mac user though, and she was bringing me around. The idea of a more-stable, virus-free desktop experience was pretty compelling.

    That all changed when I went to school and had access to a proper “Mac lab” though. Those motherfuckers crashed multiple times an hour, and took the whole OS with them when they did it. What really got to me though was the little “DAAAAAAAAAAA!” noise it would make when you had to hard reboot it. It was as if it was celebrating its inadequacy and expected you to participate… every time it fucked you over and erased your work.

    So yeah, Macs were out.

    I hadn’t even heard of Linux in 2000 when I first discovered the GPL, which (for some reason) I conflated with GNOME. I guess I thought that GNOME was a new OS based on what I could only describe as communist licensing. I loved the idea, but was intimidated by the “ix” in the name. “Ix” meant “Unix” to me, and Unix was using Pine to check email, so not a real computer as far as I was concerned.

    It wasn’t until 2000, when I joined a video game company called “Moshpit Entertainment”, that I actually tried it. You see, the CEO, CTO, and the majority of the tech people at Moshpit were huge Linux nerds, and they indoctrinated me into their cult. I started with SuSE (their favourite), then Red Hat, then used Gentoo for 10 years before switching to Arch for another 10+.

    TL;DR: Anticapitalism and FOSS cultists led me into the light.


  • Daniel Quinn@lemmy.ca to Linux@lemmy.ml · Does it get better? · 2 months ago

    It does get better, but… it’s kinda like river rafting.

    Coming from Windows, Linux can and does often feel like discovering you’ve spent your whole life trapped in a box. Suddenly “that thing that’s always annoyed you” is something you can turn off, replace, or improve with very little effort. I remember, for example, that when I made the switch back in 2000, I was blown away by a checkbox in the KDE PDF viewer. You could, in the basic settings, with no special hackery required, simply uncheck the box labelled Respect Adobe DRM. Suddenly, my computer was actually mine.

    Using Linux these days is still just as amazing. You go from an OS that spies on you, pushes ads into your eyeballs, and has some of the worst design patterns ever, to a literal bazaar of Free options. It’s different for everyone, and that’s sort of the point: Linux is “Free” in all senses of the word, as you can make your machine do whatever you want.

    It takes some time to get there though, and a lot of it is hardware unfortunately. A lot of the machines out there are built exclusively for Windows and the companies that make these things hide a lot of their inadequacies in their (proprietary) Windows drivers. So, when you try to use not-Windows, you end up using drivers written by people who had to reverse engineer or just do some guesswork to get that hardware working. This arrangement works very well for both Microsoft and these budget hardware vendors because it provides lock-in for the former, and a steady market for the latter.

    The reality is that if you want to make the switch to Linux, you’re more likely to have a hard time if your hardware falls into this camp. For example, sometimes it’s just easier to buy a €12 USB WiFi or Bluetooth adapter that you know works with Linux than it is to rely on the chip that came with your laptop. It’s better now than it once was, but Nvidia cards, the occasional webcam, and a few WiFi devices have all given me trouble in the last few years.

    My advice is to embrace that “patience and stubbornness” and temper it with an honest accounting of your time vs. the cost of replacing the problematic hardware. When you’re shopping for new gear, look up its Linux support online before handing over any money. You’ll save yourself a lot of pain.

    In cases when you really want to dig in and understand/fix your problem (because it’s Linux, you’re allowed to understand and fix things on your computer!) then I recommend looking at the Arch Wiki and even using Arch Linux since (a) that’s the basis for most of the information there, and (b) Arch tends to favour “bleeding edge” stuff, so you’re more able to install the latest version of things that may well support your hardware.

    I know it’s probably not the answer you were hoping for, but if you stick it out, I promise it’s worth it. I’ve been doing this for 25 years now and I’m never going back. Windows makes me so inexplicably angry with its constant nagging, spying, and inadequacies, I just can’t do it.


  • Most of the comments here seem to be from the consumer perspective, but if you want broader adoption, you need to consider the corporate market too. Most corporate software these days is web-based, so the problem is less with the software and more with the people responsible for it.

    The biggest hurdle is friction with the internal IT team. They like Windows because that’s all they ever learnt and they’re not interested in maintaining a diverse set of company laptops. They won’t entertain Linux in a corporate environment unless it’s mandated by management, and even if the bosses approve it, IT will want a way to lock you out of your laptop, force updates, do a remote wipe, etc.

    There are (proprietary) tools to do some of this, but they generally suck and often clash with your package manager. Microsoft is just way ahead of Linux in the “bloatware that ties your hands” department.


  • I don’t think there’s an official “way”, but here’s mine (which I love):

    On start-up I open all the apps I usually use, one per designated workspace:

    1. Slack/Teams/Mattermost, whatever my work requires.
    2. Thunderbird
    3. Kitty
    4. PyCharm/RustRover, whatever the job requires
    5. Firefox

    Workspaces 6-9 are left empty, ready for whatever app I need in the moment, but only ever one app per workspace.

    With this setup, I’ve mapped Ctrl+Fx to each workspace, so Ctrl+F4 takes me to PyCharm where I write the code, and Ctrl+F5 followed by another F5 takes me to Firefox and reloads the page. Ctrl+F3 is always the terminal, etc., so the shortcuts quickly come to mean “Fx is $APP_NAME” in your muscle memory.
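
    How the binding itself is done depends on your desktop environment, but as a rough sketch: on an EWMH-compliant X11 window manager you could point each Ctrl+Fx shortcut at a tiny script like the one below (wmctrl is assumed to be installed, and the script name is just a placeholder):

    #!/usr/bin/env bash
    # go-to-workspace.sh N  --  switch to workspace N (wmctrl numbers desktops from 0)
    wmctrl -s "$(( ${1:?usage: go-to-workspace.sh N} - 1 ))"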

    I almost never use the mouse, unless what I’m doing is necessarily mouse-driven: browsing or drawing charts etc. Everything else is keyboard-driven.



  • I have a few interesting ones.

    Download a video:

    alias yt="yt-dlp -o '%(title)s-%(id)s.%(ext)s' "
    

    Execute the previous command as root:

    alias please='sudo $(fc -n -l -1)'
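    # fc -n -l -1 prints the previous history entry (-l list, -n no line numbers, -1 the last command)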
    

    Delete all the Docker things. I do this surprisingly often:

    alias docker-nuke="docker system prune --all --volumes --force"
    

    This is a handy one for checking whether a file has other hard links pointing at it:

    function is-hardlink {
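      # stat's %h format prints the file's hard-link count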
      count=$(stat -c %h -- "${1}")
      if [ "${count}" -gt 1 ]; then
        echo "Yes.  There are ${count} links to this file."
      else
        echo "Nope.  This file is unique."
      fi
    }
    

    I run this one pretty much every day. Regardless of the distro I’m using, it Updates All The Things:

    function up {
      if [[ $(command -v yay) ]]; then
        yay -Syu --noconfirm
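        # -Yc cleans out unneeded dependencies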
        yay -Yc --noconfirm
      elif [[ $(command -v apt) ]]; then
        sudo apt update
        sudo apt upgrade -y
        sudo apt autoremove -y
      fi
      flatpak update --assumeyes
      flatpak remove --unused --assumeyes
    }
    

    I maintain an aliases file in GitLab with all the stuff I have in my environment if anyone is curious.


  • I have much the same:

    • Files on the network with NFS
    • Kodi on an old laptop under the TV so we can watch said files.
    • Syncthing on our phones and laptops to pull films from there onto that file server.

    The only difference is that I’m using a Synology 'cause I have 15TB and don’t know how to do RAID myself, let alone how to do it with an old laptop. I can’t really recommend a Synology though. It’s got too many useless add-ons and simple tools like rsync never work properly with it.
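
    For anyone wanting to copy the NFS part of this setup, the client side is a one-liner once the server is exporting a share (the hostname and paths here are just placeholders, and it assumes your distro’s NFS client package, nfs-common or nfs-utils, is installed):

    $ sudo mkdir -p /mnt/media
    $ sudo mount -t nfs nas.local:/export/media /mnt/media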