It's people faking things to do evil that's the danger, not AI per se. If someone forges a signature, does it really matter if they used AI to do it? AI is making it easier, but the responsibility still lies with the criminal or the irresponsible engineer or company.
Oh, no doubt the responsibility always will lie with the human driving, but force multipliers make it easier for individual bad actors to do great harm.
Agreed, but we should avoid making laws that target "AI" specifically, as that just gives the bad guys a reason to avoid the term. What is and what is not AI will always be debatable. It's all just software tools that allow people to do good or bad.
There is most definitely a lot of thought to be put into how to legislate for an AI-enabled world. I don't know what the right answers are.
Until we have AGI, perhaps decades from now, I think it isn't that hard. It's mostly a matter of holding people responsible for their actions. AGI will bring a whole different set of problems, as we'll likely want to give the robots responsibility. Until then, people are in charge and therefore bear all responsibility. In short, laws should prevent people from saying, "Not me. The AI did it."