It happens. I once asked a question about a spinning wheel and the rpm required to produce 1 g. ChatGPT ran through a couple of calculations and concluded that, for 1 g, the diameter should be 7 times the radius. I answered back "that doesn't make sense, a diameter is by definition 2 times the radius", it apologized and redid the calculation correctly :)
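For what it's worth, the underlying physics is simple: centripetal acceleration is a = ω²r, so for 1 g you solve ω = √(g/r) and convert to rpm. A minimal Python sketch of that calculation (the 0.5 m radius is just an assumed example value, not the wheel from the original question):

```python
import math

G = 9.81  # standard gravity, m/s^2

def rpm_for_one_g(radius_m: float) -> float:
    """RPM needed for 1 g of centripetal acceleration at the given radius.

    From a = omega^2 * r:  omega = sqrt(g / r)  [rad/s],
    then rpm = omega * 60 / (2 * pi).
    """
    omega = math.sqrt(G / radius_m)        # angular velocity in rad/s
    return omega * 60 / (2 * math.pi)      # convert rad/s to revolutions per minute

# Example: a wheel with an assumed 0.5 m radius
print(f"{rpm_for_one_g(0.5):.1f} rpm")     # ~42.3 rpm
```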
This is because LLMs do not inherently understand math. They string together tokens that are statistically likely to follow one another, based on the text they were trained on. They're literally just glorified autocorrect.
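To make the "glorified autocorrect" point concrete, here is a toy sketch of the idea: a bigram model that picks the next word purely from frequency counts over its training text, with no grasp of what the words mean. The training sentence and function name are made up for illustration:

```python
import random
from collections import defaultdict

# Toy "language model": record which word follows which in the training text,
# then sample the next word from those counts. No understanding, just statistics.
training_text = "the diameter is two times the radius and the radius is half the diameter"

follows = defaultdict(list)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)

def predict_next(word: str) -> str:
    # Pick a continuation purely from observed frequencies (duplicates act as weights).
    return random.choice(follows[word]) if word in follows else "?"

print(predict_next("the"))  # e.g. "diameter" or "radius": plausible, not reasoned
```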
If you want a tool that can actually do math from natural language input, try WolframAlpha.