• 0 Posts
  • 21 Comments
Joined 1 year ago
Cake day: June 2nd, 2023


  • Why are you using networkd instead of networkmanager on a desktop?

    What a weird question. Networkd works anywhere systemd works, so why would desktops be any different?

    It’s the same as asking someone “why are you using systemd-boot instead of grub?” Because I like systemd-boot better and it’s easier to configure. Same with networkd: configuration is stupid simple; I’ve even installed it on my work machine.

    As for OP: since you can manually ping IP addresses and the issue seems to be time-based, could it be that your machine is somehow not renegotiating its DHCP lease?
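    For reference, a minimal networkd sketch (the interface name enp3s0 and the file name are just placeholders; adjust to your NIC). `networkctl status` on the link should show its current addresses and whether the DHCP client is actually holding a lease:

    ```ini
    # /etc/systemd/network/20-wired.network (sketch; match your actual interface)
    [Match]
    Name=enp3s0

    [Network]
    DHCP=yes
    ```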


    Yeah, that’s what we did last time. I implemented a basic framework on top of a very widespread system in our codebase, one that would allow a number of requested minor features to be implemented in the same way, with minimal boilerplate, leaving the bulk of the work to implementing the actual meat of each request.

    These requests were completely independent and so could easily be parallelized. The “framework” I implemented was also incredibly thin (basically just a helper function plus a written instruction along the lines of “do this for this use case”) on top of a system that is pre-existing knowledge. My expectation was to bring someone up to speed on a few things and then let them loose on this collection of tasks, maybe answering a question a couple of times a day.

    Instead, since the assigned colleague is basically just a Copilot frontend, I had to spend 80% or more of my days explaining exactly what needed to be done (I would always start with the whys of things, since the whats are derived from them, but this particular colleague seems uninterested in that).

    So I was basically spending my time programming a set of features by proxy, while I was ostensibly working on a different set of features.

    So yeah, splitting up work only works if you also have people capable of doing it in the first place. Of course I couldn’t refuse to help this colleague either; that’s a bad mark come performance review, you know, even when the colleague has no intention of learning or being productive in any way. (I live in a country with strong employee protections, so almost nobody can be fired over actual work performance, and this particular colleague doesn’t hide that they don’t care about doing a good job, except to managers, so they still get pay raises for “improving”.)

    Yeah, you can tell I’m unhappy


  • who is actually stopping them from dealing with it?

    Management. Someone in management sets idiotic deadlines, then someone tells you “do X”, you estimate it and come back with “it will take T amount of time”, and production simply tells you “that’s too long, do it faster”.

    they don’t care about the details or maintenance

    They don’t; they care about time. If there are 6 weeks to implement a feature that requires reworking half the product, they don’t care to know that half the product needs to be reworked. They only care to hear you say that you’ll get it done in 6 weeks. And if you say that’s impossible, they tell you to do it anyway.

    you have to include the cost of managing technical debt

    I do, and when I get asked why my estimates are so much longer than those of other colleagues, I say that I include the known costs required to develop the feature, plus a buffer for known unknowns and unknown unknowns. Historically that buffer has been necessary 100% of the time and has never been included, causing us development difficulties, cost and schedule overruns, and delays and quality issues that led to internal unhappiness, sometimes mandatory overtime, and usually a crappy product that the customers are unhappy with. That’s me doing a good job, right? Except I got told to ignore all of that and only include the minimum time to get each of the dozens of tiny pieces working. We went over time and over cost, and each tiny piece “works” in isolation but doesn’t really mix with everything else, because there was no integration time, so each feature kinda just exists there on its own.

    Then we do retrospectives in which we highlight all the process mistakes we ran into, only to repeat them all next time. And I get blamed come performance review time because I was stressed and wasn’t at the top of my game in the last year, due to being chronically overburdened, overworked, and underpaid.


    I am always amazed by how the Japanese are oftentimes very willing to experiment and be inventive in melding their own culinary culture with foreign ones, considering the isolationist and conservative history and reputation they have as a people overall.

    To me, that simply says that food really is one of the universal languages.

    I’d love to try this dish, if only for the experiment, although I suspect it wouldn’t be something I’d have more than once lol


    Yeah, no judgement here; when one is poor, they gotta do what they gotta do, and ketchup is probably cheaper than decent tomato sauce in some parts of the world, I would imagine.

    That said, I am willing to bet that the same pasta but with actual prepared tomato sauce (that means put it on the stove, let it simmer, add some salt, maybe a bit of pepper or a pinch of chili flakes if you like, and a drop of EVO oil when it comes off the heat) in place of ketchup would be even better.

    Although in your case, the ketchup recipe likely brings back happy emotions relating to your childhood which, after all, are also part of the food experience. Cheers!


  • ugo@feddit.it to Linux@lemmy.ml · Goldilocks distro? · 2 months ago

    +1. Arch is super easy to install, just open the install guide on the wiki and do what it says.

    It’s also really stable nowadays, I can’t actually remember the last time something broke.

    As a counterpoint, on Ubuntu I constantly had weird issues where the system would change something apparently on its own, like the key repeat rate resetting every so often (I mean multiple times an hour), weirdness with graphics drivers, and so on.

    That said, I also appreciate Debian for server usage. Getting only security updates can be desirable for something that should be little more than an appliance. Doing a dist upgrade scares the shit out of me though, while on Arch that’s not even close to a concern.



  • What “it” is configurable? If the code is indented with 4 spaces, it is indented with 4 spaces. You can configure your editor to indent with 1 space if you want, but then your code is not going to respect the 4 spaces of indentation used by the rest of the code.

    I repeat: the only accessible indentation option is tabs. This is not an opinion, because every other option forces extra painful steps on those with vision issues (including, but not limited to, having to reformat the source files to tabs so they can work on them, and then reformat them back to spaces in order to commit them).
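    As a sketch of what that looks like in practice (assuming an EditorConfig-aware setup, purely for illustration): the repository stores tab characters, and every reader renders them at whatever width works for their eyes.

    ```ini
    # .editorconfig (sketch): files are indented with tab characters;
    # display width stays a per-editor, per-person setting, not a property of the file.
    root = true

    [*]
    indent_style = tab
    ```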


  • Meh. Been developing professionally with C++ for 10 years at this point. I’m one of the weird people that kinda likes C++ and its pragmatism despite all its warts.

    I’d like C++ better if it didn’t have inheritance. There are better solutions to model interfaces, and without inheritance people can’t write class hierarchies that are 10 levels deep with a different set of virtual functions overridden (and new virtual functions added) at each level.

    And yes, that is not hypothetical. Real codebases in the real world shipping working products do that, and it’s about as nice as you can imagine.


  • You do have a terminology mismatch. In C++, an abstract class is a class with at least one pure virtual method.

    Such classes cannot be instantiated, so they are useful only as base classes.

    An interface is more of a concept than a thing.

    Sure, you can say that Iterable is an interface that provides Next() and Prev() methods, and that Array is an Iterable because it inherits from Iterable (and then overrides those methods to do the correct thing). That’s one way to implement an interface in C++.
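    Roughly, that option looks like this (a minimal sketch; the Iterable/Array names and the int element type are just for illustration):

    ```cpp
    #include <cstddef>
    #include <utility>
    #include <vector>

    // Abstract class acting as the interface: it has pure virtual methods,
    // so it cannot be instantiated and is only useful as a base class.
    struct Iterable {
        virtual ~Iterable() = default;
        virtual int Next() = 0;
        virtual int Prev() = 0;
    };

    // Array "is an" Iterable: it inherits and overrides the pure virtuals.
    class Array : public Iterable {
    public:
        explicit Array(std::vector<int> data) : data_(std::move(data)) {}
        int Next() override { return data_[pos_++]; }
        int Prev() override { return data_[--pos_]; }

    private:
        std::vector<int> data_;
        std::size_t pos_ = 0;
    };
    ```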

    But you can also say that Iterable<T> is a class template that provides Next() and Prev() methods which call the methods of the same name on the type they wrap (CRTP, aka static polymorphism).
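    A sketch of the CRTP version, again with made-up names:

    ```cpp
    #include <cstddef>
    #include <utility>
    #include <vector>

    // CRTP: the base template forwards to the derived type at compile time,
    // so no virtual dispatch is involved.
    template <typename T>
    struct Iterable {
        int Next() { return static_cast<T&>(*this).NextImpl(); }
        int Prev() { return static_cast<T&>(*this).PrevImpl(); }
    };

    class Array : public Iterable<Array> {
    public:
        explicit Array(std::vector<int> data) : data_(std::move(data)) {}
        int NextImpl() { return data_[pos_++]; }
        int PrevImpl() { return data_[--pos_]; }

    private:
        std::vector<int> data_;
        std::size_t pos_ = 0;
    };
    ```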

    Or you can say that an algorithm that scans a collection T forward requires the collection to have a Next() method, simply by calling Next() on it.
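    For instance (a sketch, hypothetical names): the requirement is implicit in the template, and anything with a suitable Next() compiles.

    ```cpp
    // The "interface" here is purely implicit: any T with a Next() whose result
    // can be added to an int will work; nothing is inherited or declared.
    template <typename T>
    int SumForward(T& collection, int count) {
        int total = 0;
        for (int i = 0; i < count; ++i) {
            total += collection.Next();
        }
        return total;
    }
    ```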

    And I can think of at least 2 other ways to define an interface without using abstract classes.

    And even when using abstract classes, making the concrete class inherit from them is definitely the least flexible way to define an interface. It doesn’t, by itself, let you mock functionality in tests, because you can’t redefine the class under test to inherit from a test implementation of the interface with mocked functionality, so you still need something to the effect of dependency injection anyway.
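    Something like this sketch (made-up names) is what I mean by dependency injection: the class under test receives the interface, so a test can hand it a mock instead.

    ```cpp
    // Abstract interface as before, trimmed to one method for brevity.
    struct Iterable {
        virtual ~Iterable() = default;
        virtual int Next() = 0;
    };

    // The class under test takes its dependency by reference instead of
    // inheriting behaviour, so any Iterable implementation can be swapped in.
    class Consumer {
    public:
        explicit Consumer(Iterable& source) : source_(source) {}
        int TakeTwo() { return source_.Next() + source_.Next(); }

    private:
        Iterable& source_;
    };

    // In a test, a mock implementing the same interface is injected:
    struct MockIterable : Iterable {
        int Next() override { return 21; }  // canned value for the test
    };
    // MockIterable mock; Consumer consumer{mock}; consumer.TakeTwo() == 42
    ```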

    So yeah, an abstract class is very different from inheritance, and it’s also very different from an interface, even though it relates to both.


  • Looks to me like the ruling is saying that the output of a model trained on copyrighted data is not copyrighted in itself.

    By that logic, if I train a model on Marvel movies and get something that is exactly the same as an existing movie, that output is not copyrighted.

    It’s a stretch, for sure, and the judge did say that he didn’t consider the output to be similar enough to the source copyrighted material, but it’s unclear what “similar enough” means.

    What if my model is trained on Star Wars and outputs a story that is novel, with different characters and different voices? Is that not copyrighted then, despite the model being trained exclusively on copyrighted data?