We don't know how humans arrive at a lot of their decisions, either. We tell stories about how they did, but in the end they are only stories. WE DON'T KNOW.
I have a student who does something (usually stupid and bad). I ask them why they did it. They often lie to me with some made-up reason (a rationalization after the fact), when the reality (which some of them will admit) is: they don't know. The mind is not less complex than an AI (it is probably more complex), and we have the additional factor that the mind will lie to itself, it will lie to other minds, and it is largely unaware of all the things it uses to decide and act.
We probably have a better idea of what is going on inside an AI than we do of what is going on inside our own heads. We did not build our minds; we DID build the AI. It isn't unexplainable, it is just very, very complex.
The same thing is true (in a sense) of the human mind. While we did not build it, we can (to an extent) analyze it. The problem is that it is a human mind, and a lot of things get in the way of that analysis: preconceptions, morality, metaphysical beliefs, etc.