• 0 Posts
  • 177 Comments
Joined 1 year ago
Cake day: June 15th, 2023





  • Speed is less of a factor than endurance in a persistence-hunting scenario where we’re much slower than our prey anyway.

    I don’t know the facts for this specific claim, but the logic is fair. One group can be better suited for endurance without being faster. One group could also be faster on average without having the individual fastest performers. Not only because of cultural factors, but also because the distribution curves might have different shapes for men vs women. There could be greater outliers (top performers) among men even if the average is higher among women in general. It’s not necessarily as straightforward as, say, height, where men’s distribution curve is almost the same shape as women’s, just shifted up a few inches.
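    To put rough, made-up numbers on the shape argument (a toy simulation, not real athletic data), here's how one group can win on the average while the other still produces the most extreme top performers:

    ```python
    # Toy simulation with invented parameters, purely to illustrate the point:
    # group A has the higher mean, group B has the wider spread, so the most
    # extreme individuals still come from group B.
    import numpy as np

    rng = np.random.default_rng(0)
    group_a = rng.normal(loc=100, scale=5, size=1_000_000)  # higher average, narrower spread
    group_b = rng.normal(loc=98, scale=8, size=1_000_000)   # lower average, wider spread

    print("mean A:", group_a.mean(), "  mean B:", group_b.mean())
    print("best A:", group_a.max(),  "  best B:", group_b.max())
    # A wins on the average; B wins at the extreme tail.
    ```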

    I don’t have the data to draw any real conclusions, though.

    One of the problems with looking at athletic records is that they represent just the elite of a self-selected group of enthusiasts, which doesn’t tell us a whole lot about what might have been the norm 100,000 years ago, or what might be the norm today if all else were equal between genders. These are not controlled trials.

    I’ve read that the top women outperform the top men in long-distance open-water swimming, supposedly due in part to higher body fat making women more buoyant, helping to regulate body temperature, and providing fuel. This is the first time I’ve read that women might have an advantage in running, though.

    I wish the article provided citations. The reality is probably too complex to fit into a headline or pop-sci writeup.








  • Apple tried to allow clones, but ran into the same problem because the clone makers could make cheaper machines by slapping together parts.

    Yeah, this is exactly what happened, although some of the clone brands were perfectly high-quality (Power Computing in particular made great machines, usually the fastest on the market). In the Mac community at the time, a lot of people (myself included) wished Apple would just exit the hardware business and focus on what they were good at: software.

    Then Steve Jobs came back and did exactly the opposite of that. First order of business was to kill cloning. Then came the iPod.

    To be fair, the next generation of Power Macs after that was about half the price of the previous gen.


  • GenderNeutralBro@lemmy.sdf.org to Memes@lemmy.ml · Costs Less? When That Happened?

    Most of Apple’s history, actually.

    Macs have a reputation for being expensive because people compare the cheapest Mac to the cheapest PC, or to a custom-built PC. That’s reasonable if the cheapest PC meets your needs or if you’re into building your own PC, but if you compare to a similarly equipped name-brand PC, the numbers shift a LOT.

    From the G3-G5 era ('97-2006) through most of the Intel era (2006-2020), if you went to Dell or HP and configured a machine to match Apple’s specs as closely as possible, you’d find the Macs were almost never much more expensive, and often cheaper. I say this as someone who routinely did such comparisons as part of their job. There were some notable exceptions, like most of the Intel MacBook Air models (they ranged from “okay” to “so bad it feels like a personal insult”), but that was never the rule. Even in the early-to-mid 90s, while Apple’s own hardware was grossly overpriced, you could buy Mac clones for much less (clones were third parties licensed to build Macs, and they were far and away the best value in the pre-G3 PowerPC era).

    Macs also historically have a lower total cost of ownership, factoring in lifespan (cheap PCs fail frequently), support costs, etc. One of the most recent and extensive analyses of this that I know of comes from IBM. See https://www.computerworld.com/article/1666267/ibm-mac-users-are-happier-and-more-productive.html

    Toward the tail end of the Intel era, let’s say around 2016-2020, Apple put out some real garbage, e.g. butterfly keyboards and the aforementioned craptastic Airs. But historically those are the exceptions, not the rule.

    As for the “does more”, well, that’s debatable. Considering this is using Apple’s 90s logo, I think it’s pretty fair. Compare System 7 (released in '91) to Windows 3.1 (released in '92), and there is no contest. Windows was shit. This was generally true up until the 2000s, when the first few versions of OS X were half-baked and Apple was only just exiting its “beleaguered” period, and the mainstream press kept ringing the death knell. Windows lagged behind its competition by at least a few years up until Microsoft successfully killed or sufficiently hampered all that competition. I don’t think you can make an honest argument in favor of Windows compared to any of its contemporaries in the 90s (e.g. Macintosh, OS/2, BeOS) that doesn’t boil down to “we’re used to it” or “we’re locked in”.




  • Probably ~15TB through file-level syncing tools (rsync or similar; I forget exactly what I used), just copying my internal RAID array to an external HDD. I’ve done this a few times, either for backup purposes or to prepare to reformat my array. I originally used ZFS on the array, but converted it to something with built-in kernel support a while back because it got troublesome when switching distros. Might switch it to bcachefs at some point.

    With dd specifically, maybe 1TB? I’ve used it to temporarily back up my boot drive on occasion, on the assumption that restoring my entire system that way would be simpler in case whatever I was planning blew up in my face. Fortunately never needed to restore it that way.
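    For reference, the invocations I’m describing look roughly like this (wrapped in Python purely for illustration; the device and mount paths are placeholders, and I don’t remember my exact flags):

    ```python
    # Sketch of the two backup approaches above. Paths are placeholders;
    # dd in particular needs root and is destructive if if=/of= are swapped.
    import subprocess

    # File-level sync of a mounted RAID array to an external HDD:
    # -a preserves permissions/times/links, -H keeps hard links,
    # -A/-X keep ACLs and extended attributes, --progress shows status.
    subprocess.run(
        ["rsync", "-aHAX", "--progress", "/mnt/array/", "/mnt/external-backup/"],
        check=True,
    )

    # Block-level image of the boot drive with dd:
    subprocess.run(
        ["dd", "if=/dev/sda", "of=/mnt/external-backup/boot.img",
         "bs=4M", "status=progress", "conv=fsync"],
        check=True,
    )
    ```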




  • I don’t think there’s any way to count years without rooting it somewhere arbitrary. We cannot calculate the age of the planet, the sun, or the universe to the accuracy of a year (much less a second or nanosecond). We cannot define what “modern man” is to a meaningful level of accuracy, either, or pin down the age of historical artifacts.

    Most computers use a system called “epoch time” or “UNIX time”, which counts the seconds from January 1, 1970. Converting this into a human-friendly date representation is surprisingly non-trivial, since the human timekeeping systems in common use are messy and not rooted in hard math or in the scientific definition of a second, which was only standardized in 1967.

    Tom Scott has an amusing video about this: https://www.youtube.com/watch?v=-5wpm-gesOY
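    To make the epoch-time idea concrete (Python here, just as an illustration):

    ```python
    # Unix/epoch time is just a count of seconds since 1970-01-01 00:00:00 UTC.
    # Turning that number into a human-friendly date drags in calendars,
    # time zones, and leap-year rules (and Unix time ignores leap seconds
    # entirely, which is part of the mess).
    import time
    from datetime import datetime, timezone

    now = time.time()        # a float like 1718000000.123: seconds since the epoch
    print(now)

    # The same instant as an ISO 8601 string in UTC:
    print(datetime.fromtimestamp(now, tz=timezone.utc).isoformat())

    # The epoch itself:
    print(datetime.fromtimestamp(0, tz=timezone.utc))  # 1970-01-01 00:00:00+00:00
    ```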

    There is also International Atomic Time, which, like Unix Time, counts seconds from an arbitrary date that aligns with the Gregorian calendar. Atomic Time is rooted at the beginning of 1958.

    ISO 8601 also aligns with the Gregorian calendar, but only as far back as 1582. The official standard does not allow expressing dates before that without explicit agreement of definitions by both parties. Go figure.

    The core problem here is that a year, as defined by Earth’s revolution around the sun, is not consistent across broad time periods. The length of a day changes, as well. Humans all around the world have traditionally tracked time by looking at the sun and the moon, which simply do not give us the precision and consistency we need over long time periods. So it’s really difficult to make a system that is simple, logical, and also aligns with everyday usage going back centuries. And I don’t think it is possible to find any zero point that is truly meaningful and independent of wishy-washy human culture.