4 Comments
Jed Rothwell:

These things are not interesting in this context because this AI model cannot yet deal with them. If this were physics and we were discussing a single-body equation that could not be used in a multi-body simulation, the multi-body simulation would not be "interesting." Perhaps it would be more accurate to say "not relevant."

flagrante delicto:

I looked your name up. If you're the physicist, I will take your word for this. I have no counter comment.

My purpose in asking an AI program about objective reality is that the answer has epistemological implications for how we gain knowledge going forward.

Jed Rothwell:

I am not a physicist. I edit and translate papers for physicists. They often talk about models that work with one or two particles, but fail with more complex systems.

The ChatGPT AI program has no actual knowledge of anything. It has no more intelligence than an old-fashioned library paper index card system. There are some AI programs that attempt to simulate knowledge and apply logic to problems, but this one does not. At least, it did not when I read about it a few years ago.

flagrante delicto:

Thank you for the context of your comment. I am curious: is there a way to prevent bias when doing scientific inquiry, outside mathematics? My knowledge about AI is very limited.
