• 3 Posts
  • 61 Comments
Joined 1 year ago
Cake day: September 7th, 2023


  • I haven’t done too much work with WASM myself, but when I did, the only languages I saw recommended were Rust, C++, or TinyGo. From what I’ve heard, Rust and C++ are smoother to work with than TinyGo. Garbage-collected languages usually aren’t great choices for compiling to wasm because wasm doesn’t have any native garbage collection support, and that narrows your selection a lot.

    But another option you may want to consider is Nim. As I understand it, Nim compiles to C, so any C->Wasm compiler should theoretically work for you as well. I did a quick search and wasn’t able to find any great resources on how to do this, but you might have more luck. Good luck!


  • You’re probably right. I think COBOL development is one of the cases where the crazier stories are the ones that bubble to the top. The regular scene is probably more mundane.

    I do think there are a few advantages to learning COBOL over C++. COBOL seems to be much stickier - companies that use it seem much more hesitant to replace it than a lot of the companies that use C++, and as a result, they will probably get more desperate. And while there’s definitely a lot more C++ out there than COBOL, I have to imagine that the number of people under 50 that use COBOL is probably tiny, while C++ still has a very large userbase. On the other hand, consulting depends a lot on your portfolio, references, and past accomplishments, and nobody’s going to pay 1k EUR/USD/etc. per hour (exaggerating, obviously) if you don’t have any credentials. It takes time to build that up.

    Ultimately, I do think you’re pretty spot on, but we’ll have to see. This is more just a fantasy I tell myself to make it seem like retirement is closer than it probably is…



  • It was always obvious to me that, as long as I was using closed-source software, any day could come when the vendor would screw me over. In fact, it could have been shipping with bundles and bundles of spyware already and I would have had no way of knowing. So I pledged to start using only open source software, to make sure that couldn’t happen. First, I migrated all my desktop applications to open source alternatives. Then I finally made the switch.




  • This is very interesting! Things like this make me wish programmers would give functional^W declarative programming more of a chance. I’ve long fantasized about being able to write programs as declarative code that the computer can optimize automatically without human intervention. When you implement your program in a more restrictive (i.e. stateless) paradigm, you can reason about the code more easily, which in turn makes it easier to optimize or run in different environments.

    SQL is a great example of this - many of the optimizations that servers like PostgreSQL can do under the hood are only possible because the language inherently limits what you can express, which gives the system actually executing your instructions the freedom to do very different things with them for better performance and reliability. Things like this are what make query optimizers possible, and it’s really fascinating to read carefully what query analyzers report (beyond just checking whether your indices are being used or not).
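
    To illustrate the same idea outside of SQL, here’s a small Java sketch of my own (purely hypothetical, not from the linked chart): a stream pipeline describes what you want, and the library decides how to run it - including in parallel - loosely the way a query planner does.

    ```java
    import java.util.List;

    public class DeclarativeSketch {
        public static void main(String[] args) {
            List<Integer> numbers = List.of(3, 1, 4, 1, 5, 9, 2, 6);

            // Declarative-ish: state what you want (even values, squared, summed);
            // the stream library chooses the execution strategy. Swapping
            // numbers.stream() for numbers.parallelStream() changes how the work
            // runs without touching the logic - loosely what a query planner does.
            int sumOfEvenSquares = numbers.stream()
                    .filter(n -> n % 2 == 0)
                    .map(n -> n * n)
                    .reduce(0, Integer::sum);

            System.out.println(sumOfEvenSquares); // 4 + 16 + 36 = 56
        }
    }
    ```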

    Beautiful chart. Thanks for sharing!



  • What exactly is it that people obsess over? The desktop environment and terminal customisation? Setting up NetworkManager with nmcli? Using Vim to edit a .conf file?

    Welcome to the crowd! Eventually, you realize that an operating system is just an operating system: something you use to get work done, and the less you notice it, the better it’s doing its job. The pride of setting it all up fades shortly after you’re done. At that point, you realize that pretty much all distros are the same, give or take.

    That said, there are always moments that remind you your OS is amazing: you’re faced with a new, difficult task you don’t know how to approach, you check your distro’s documentation, and you solve it in a few elegant steps. I’m not an Arch user, but that’s when the Arch wiki really becomes your friend, along with all the other resources Arch has for its users. I can’t think of concrete examples because these moments are so rare, but they’re the ones that feel great and really make you appreciate your OS.




  • Do you know how vim has distributions like lunarvim, lazyvim, nvchad, etc.? Simply installing something like lazyvim can quickly and easily turn vim from a text editor into a full-blown IDE.

    I think Gnome needs something like this: a curated set of plugins that are easy to install and that maintain compatibility across different versions of Gnome - something that would absorb the API churn in Gnome while keeping a stable, usable desktop environment.

    I don’t know how feasible this is, since I haven’t used Gnome since 2.x, but I think it would really help make it an actual full-blown DE.



  • Agreed. Objects are nice and a great way to program. Composition is great. Traits/interfaces are great. Namespaces are great. Objects are a really convenient way to reap the benefits of principles like these.

    But then there are aspects of OOP that absolutely suck, like inheritance. I hate inheritance. The rules get very confusing very quickly. For example, try understanding method overriding. Do I need to call the superclass method or not? If not, does it get called automatically? If so, in what order? How do these rules change for the constructor? Now repeat this exercise for every OOP language you use and try not to mix them up… Java, C++, Python, etc.
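
    To make the asymmetry concrete, here’s a toy Java sketch (my own example, nothing more): constructors chain to the superclass automatically, while overridden methods only call the superclass version if you explicitly ask.

    ```java
    class Base {
        Base() { System.out.println("Base constructor"); }
        void greet() { System.out.println("Base greet"); }
    }

    class Derived extends Base {
        Derived() {
            // An implicit super() runs first, so Base's constructor is always called.
            System.out.println("Derived constructor");
        }

        @Override
        void greet() {
            // The superclass method is NOT called automatically; you have to opt in.
            super.greet();
            System.out.println("Derived greet");
        }
    }

    public class OverrideDemo {
        public static void main(String[] args) {
            new Derived().greet();
            // Prints: Base constructor, Derived constructor, Base greet, Derived greet
        }
    }
    ```

    (And to the “try not to mix them up” point: Python flips the constructor half of this - __init__ only runs the parent’s version if you call super().__init__() yourself.)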

    Fortunately, it feels like we rely on inheritance less and less these days. As an example, I really like how Java lets you hand a Runnable to a thread these days. Before, if you wanted to run a thread, the usual approach was a separate class that extended Thread. And what if that class needed to extend another one too? Things would get out of hand quickly. (This is a very old example, but with lambdas and other new features, things are getting even better now.)
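
    Roughly what I mean, as a before/after sketch (class names made up):

    ```java
    // Old style: subclass Thread, which burns the class's single inheritance slot.
    class PollingWorker extends Thread {
        @Override
        public void run() {
            System.out.println("polling in " + getName());
        }
    }

    public class RunnableDemo {
        public static void main(String[] args) throws InterruptedException {
            Thread oldStyle = new PollingWorker();
            oldStyle.start();

            // Newer style: Runnable is a functional interface, so a lambda is enough,
            // and the surrounding class stays free to extend whatever it needs.
            Thread lambdaStyle = new Thread(
                    () -> System.out.println("polling in " + Thread.currentThread().getName()));
            lambdaStyle.start();

            oldStyle.join();
            lambdaStyle.join();
        }
    }
    ```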

    Anyway, long story short, I think OOP is a complicated way to arrive at good principles, and there are simpler ways to get those benefits than a full OOP implementation.