I looked your name up. If you're the physicist, I will take your word for this. I have no counter comment.
My purpose in asking an AI program about objective reality has epistemological implications for gaining knowledge going forward.
I am not a physicist. I edit and translate papers for physicists. They often talk about models that work with one or two particles, but fail with more complex systems.
The ChatGPT AI program has no actual knowledge of anything. It has no more intelligence than an old-fashioned library card catalog. There are some AI programs that attempt to simulate knowledge and apply logic to problems, but this one does not. At least, it did not when I read about it a few years ago.
Thank you for the context of your comment. I am curious: is there a way to prevent bias when doing scientific inquiry, outside mathematics? My knowledge about AI is very limited.