Well, presumably the speed would ultimately be constrained by Planck time: roughly the time it takes light, in a vacuum, to cross one Planck length, the smallest physically meaningful distance. When you talk about change, you need a thing that's changing and a thing that's causing the change. Smallest thing changing + fastest thing causing the change = limit. I promise that's as physics-nerd as I will get here. But believe me, it's not only faster than you think it is, it's literally faster than you *can* think it is.
I think you've neglected something possibly more important that we're facing, though: What happens to our idea of ourselves as our occupations become obsolete? Who are each of us if everything we do and did can now be done better, faster, and more cheaply by machines? We all know what happens in countries with large numbers of military-age men with nothing to do, but that's not really what I mean. Say what you want about capitalism, but I don't think we have a good enough idea of how deeply we are what we do. I'm a banker, you're a writer, the other guy's a dentist. The ultimate AI scenario turns us all into dilettantes and hobbyists. I do what I do because I enjoy it, and I feel like I'm pretty good at it, but also because I think I "contribute" something. I wake up wanting to contribute that thing. Take that away, and you take away a part of me. Humans have a concept of earning. We earn our keep; we earn the right; we earn respect. In a very real way, we have earned the right to be the person we see when we look in the mirror. How do we keep earning these things if machines do everything better?
The scariest part is that I think this is the best-case scenario, the only end game if everything with AI *goes right*. If we don't start having serious conversations about how we value ourselves and each other beyond what we do or make, they might as well turn us all into paper clips.
Also, agree we all want our lives and our efforts to matter.
And I don't think JVL was neglecting the potential loss of work AI will bring. I recognize that risk, and I'm just a thoughtful person who reads and writes in the Bulwark comments section. So I'm sure JVL, who writes for the Bulwark, knows it too. He just wasn't including it in the two things that scare him.
FWIW, my fears about AI are two: the loss of work, as you noted, and the inability to know what's true. That ability has already eroded since social media and Trumpism. It's only going to get worse with AI-produced digital content.
Just a thought. I don't think AI can replace the relationship between a parent and a child, or the child-rearing that parents do. It's just too much of a mystery how good parenting works. There are too many factors involved. Anything involving humans - juries, voters, marriage, raising children - cannot be predicted. And theory of mind, which children on the autism spectrum often lack - I don't see AI having the capacity for it that's necessary to sustain human relationships. And parents rely on their own parents to guide them (if their parents did a good job at parenting).
However, at least one of those parents needs to be a breadwinner to sustain the family, and to date the vast majority of breadwinners have been men. So I think the loss of identity grounded in what we do is going to be much worse for men than for women.
Just a thought.
And an excellent one.