• Magister@lemmy.world · 7 months ago

    It happens. I once asked a question about a spinning wheel and the RPM required to produce 1 G. ChatGPT worked through a couple of calculations and in the end concluded that to get 1 G, the diameter should be 7 times the radius. I answered back “this does not make sense, a diameter is by definition 2 times the radius”, it apologized and redid the calculation correctly :)
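
    For reference, the correct relationship is just centripetal acceleration: ω²·r = g, so ω = √(g/r). A quick sketch of that calculation (the 1 m radius is a made-up example value):

    ```python
    import math

    G = 9.81  # standard gravity, m/s^2

    def rpm_for_1g(radius_m: float) -> float:
        """RPM at which centripetal acceleration omega^2 * r equals 1 G."""
        omega = math.sqrt(G / radius_m)    # rad/s, from omega^2 * r = g
        return omega * 60 / (2 * math.pi)  # rad/s -> revolutions per minute

    # Example: a wheel with 1 m radius (hypothetical value, for illustration only)
    print(f"{rpm_for_1g(1.0):.1f} rpm")  # ~29.9 rpm
    ```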

    • Technus@lemmy.zip · 7 months ago

      This is because LLMs do not inherently understand math. They string together tokens that are statistically likely to follow one another, based on the content they were trained on. They’re literally just glorified autocorrect.

      If you want a tool that can actually do math from natural language input, try WolframAlpha.
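
      For example, there’s a third-party wolframalpha package on PyPI that wraps their API. A rough sketch (the App ID is a placeholder you’d have to register yourself, and the exact query phrasing is just a guess at something it can parse):

      ```python
      import wolframalpha

      # "YOUR_APP_ID" is a placeholder; register a real one on the WolframAlpha developer portal
      client = wolframalpha.Client("YOUR_APP_ID")

      # Ask the same spinning-wheel question in natural language
      res = client.query("rpm for 1 g of centripetal acceleration at 1 meter radius")
      print(next(res.results).text)  # WolframAlpha's plain-text answer
      ```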