Salt Lake City

University of Utah

B. What of the duality of mind and body?

David:
OK, erm, duality. The duality of mind and body.

Prof Norman:
I believe that the mind is deterministic, it's just we haven't figured out how it works yet. And erm, so I believe that the mind's gonna be no more mystical or different than the body is; it just works with a different sort of cells. Different circuitry. I'm not sure we'll ever be able to understand the mind, because of its complexity, but I do believe that in fact the brain is just another organ, like the liver or the heart or the lungs.

David:
And the personality is defined by that.

Prof Norman:
Yeah, of course. It's a complicated system. My Macintosh computer or PC seems to have a personality of its own when it gets into sort of bizarre states. If you get a system with sufficient complexity, it can manifest rather interesting behaviour - and in fact this is very interesting, and maybe you've thought a bit about this as well - other people have written programs that cause 'bugs' to move across the computer screen, and interact with other 'bugs'.

David:
Oh yes, I've heard of this - artificial life on screen.

Prof Norman:
Yeah, right. And these bugs manifest incredibly sophisticated behaviour. And you think, wow, the algorithms that must be causing this to happen have got to be incredibly complicated - and in fact they're incredibly simple, just two or three rules are required.

David:
Like the chaos theory thing.

Prof Norman:
That's right - very simple rules make what seem to be extremely sophisticated behaviours. So in fact the behaviours that we manifest… erm…

David:
May indeed boil down to very simple equations.

Prof Norman:
Well, not equations, but very simple sort of, you know, connections. What amazes me is, erm, we used to believe that we were special - you know, humans are special, a unique form of animal life. Now I think that belief has fallen by the wayside. Quite properly. I don't believe we're special. I believe that we do have some unique features - we have hands and our language, which make us unique - but what amazes me is the capacity for lower life forms to manifest an incredible range of behaviours that's almost as rich as ours. You take a cat - a cat has a brain the size of a walnut, yet that cat manifests rage, anger, affection, curiosity, play, erm, communication - all the sort of basic emotions that we experience. It just can't articulate them very effectively. But they're there. So, you know, our brain is probably 30-40 times the size of a cat's brain, but we're not much different. So I don't believe that we're something special; I think that the brain is just another organ.

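[The 'bugs' programs discussed above aren't named in the conversation; Conway's Game of Life is a classic illustration of the same point, where just two rules over a grid of cells produce gliders, oscillators and other surprisingly lifelike behaviour. A minimal sketch - the grid representation and starting pattern are illustrative choices, not the programs referred to:]

```python
from collections import Counter

# Conway's Game of Life - the whole "algorithm" is two rules:
#   1. a live cell survives iff it has 2 or 3 live neighbours;
#   2. a dead cell comes alive iff it has exactly 3 live neighbours.

def step(live):
    """Advance one generation; `live` is a set of (x, y) cells."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "glider": five cells that crawl diagonally across the grid forever.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = step(glider)
# After 4 generations the glider has its original shape, shifted by (1, 1).
```
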
David:
OK, staying with philosophical and social things. Okay, first thought. Personal responsibility, in legal and ethical terms, is predicated on the control over the self-as-a-body exercised by the self-as-a-moral-intelligence. This is a kind of dualistic mind/body set-up, but it's kind of what we understand in today's culture as defining humanity, and it's enshrined in the fundamental structures of our society. There's work ongoing at CalTech, which I mentioned earlier, showcased at the Exoskeleton conference. They're using the Utah electrode array to derive accurate directional and volitional data from the areas of monkeys' brains where intent is formed, prior to the related processes in the motor cortex, and they have achieved some success with monkeys. Although DARPA director Dr Garcia was clear that invasive surgery for soldiers was not on the agenda, some neuro-mechanical response system based on this research, which no longer required invasive surgery but enabled soldiers to fire their weapons by wanting to, would certainly give them the edge, which is something he was quite keen on. In the longer term, such technology could grant the self-as-a-moral-intelligence control over a body no longer delimited by the skin - as you said on your website, moving wheelchairs and so forth. Now a self-body incorporating new organs of varied function and complexity. What intrigues me is: when the skin no longer demarcates the limit of the self, what effect will this have socially on the notion of personal responsibility - as in, e.g., 'it wasn't me, officer, it was a bug in the software'. I suppose part of my research here is a kind of risk analysis - let's think about these things before the problem arises.

Prof Norman:
Right. Well, that is a problem, actually. I have thought a little bit about that. Let's talk about controlling the wheelchair. Are you familiar with the work we've done in collaboration with John Donahue at Brown University? OK. We've implanted one of our electrode arrays in the motor cortex of a primate. The primate plays video games. The video game is a very simple video game: a spot comes up on the screen, and he has to put the cursor over the spot; then one of eight radial positions lights up and he has to move the joystick and put the cursor over the spot that's lit up. And if he can do that in a second, he gets a reward. And the primate can do this very well. We've had our electrode arrays implanted in a primate for over 3 years playing this game. And we look at the firing patterns of the neurons in the motor cortex, and we use those firing patterns to determine where he's moving his hand. To start with, of course, we have no idea, but you can train a neural network - each time he moves to the right, you say, look at the firing pattern, this means move to the right; each time he moves up, look at the firing pattern, this means move up - and you can train a neural network to figure out the patterns that are in the firing… firing patterns, and you can use that to estimate where he's going to move. And you can estimate where he's going to move 80-90% correct. You still can't do it 100% correct. But this is with only 16 electrodes. So the question is: how many electrodes would be required to reliably control a wheelchair? So you could sort of say, if he's already doing this at 80 or 90% correct, we're going to misinterpret which way he wants to move 10-20% of the time, and if we're controlling a wheelchair, 10% of the time this guy's gonna want to turn right and he's gonna go faster straight ahead and smash into a car. Not a good idea. Right.
But the thing that's causing the cursor to move on the screen is not the firing pattern of the neurons. It is the joystick. So if we could actually get the monkey to move the cursor on the screen based on the firing patterns of his neurons, he might think that this is what's doing it, but he'll start to make subtle motions which will allow this to happen probably 100% correct - might. So this notion of responsibility - in terms of MY responsibility: if I implant some electrodes into a person who's wheelchair-bound, and he's going to use these controls to control an external device, what is MY responsibility in this going to be? What happens when the system fails? Look, when you drive a car, what happens when your wheel falls off and you kill other people? It's your responsibility the wheel fell off.

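[The decoding scheme described above - record firing patterns for known movements, learn the mapping, then predict intent - can be sketched with synthetic data. The actual work trained a neural network on real recordings; the toy below substitutes a nearest-centroid classifier over invented, cosine-tuned firing rates, so every number and the tuning model are illustrative assumptions rather than the published method:]

```python
import math
import random

random.seed(0)  # reproducible synthetic "recording"

N_ELECTRODES = 16
DIRECTIONS = [k * math.pi / 4 for k in range(8)]  # 8 radial targets

# Motor-cortex neurons tend to fire most for movements near one
# "preferred direction" (cosine tuning). Assign one per electrode.
preferred = [random.uniform(0, 2 * math.pi) for _ in range(N_ELECTRODES)]

def firing_rates(direction, noise=0.3):
    """One trial's synthetic firing-rate vector for a reach in `direction`."""
    return [max(0.0, math.cos(direction - p) + random.gauss(0, noise))
            for p in preferred]

# "Training": for each known direction, average the firing patterns seen.
centroids = {d: [sum(col) / 50
                 for col in zip(*(firing_rates(d) for _ in range(50)))]
             for d in DIRECTIONS}

def decode(rates):
    """Estimate the intended direction: nearest centroid by squared distance."""
    return min(centroids, key=lambda d: sum((r - c) ** 2
                                            for r, c in zip(rates, centroids[d])))

# Evaluate on fresh trials: with 16 noisy channels the decoder is good but
# not perfect - well above the 12.5% chance level, in the spirit of the
# 80-90% figure in the conversation, though the exact score is synthetic.
trials = [(d, firing_rates(d)) for d in DIRECTIONS for _ in range(50)]
accuracy = sum(decode(r) == d for d, r in trials) / len(trials)
```
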
David:
Because you're in control of the car…

Prof Norman:
Yeah - you're supposed to be driving slowly enough that you can actually control the car, but in fact nobody does. At freeway speeds, if your wheel falls off you'll kill people. And you'll probably be killed too. So we just assume. The wheel can always fall off - there is a finite amount of risk associated with it, so you just try to minimise that as much as you can. So before these systems are implanted in human volunteers, you don't allow them to get into situations where significant damage could happen to them and to others. As the systems become more and more proficient - I mean, the Wright brothers' flyer was not a very good thing to put twenty passengers on, to start with, but with evolution, better systems evolved. And that will certainly happen with these kinds of systems that we're working with. But that's not the question you're talking about. You're talking about, I think, a more sinister view of responsibility. You're talking about a more instrumented individual than what I'm talking about. How much instrumentation are you talking about? I mean, if you can give me a specific.

David:
Well, because of the Exoskeletons conference I went to, the vision I gained from that - looking at the ways in which these technologies are being applied by the military - they were after some kind of external neuro-mechanical response system that didn't involve invasive surgery, but could be worn in a helmet, which you could then plug into all sorts of things and move around. At that point, the self-as-a-body includes mechanical organs.

Prof Norman:
Let's not make that step, because in fact it's not clear how simple that's going to be. Intellectually you can make that leap with facility. But that might be technically a very difficult leap to make. I mean, the amount of control that you might be able to achieve with external systems is very minimal, so let's not think about that; let's think about the legitimacy of implanting arrays. Now, I'm not endorsing this. But let's just… Then you start getting many more degrees of control. You can put a whole bunch of electrode arrays in the motor parts of the brain, you can put a whole bunch of electrode arrays in the sensory parts of the brain, so in fact, instead of using your eyes you could use TV cameras to get information, or you could use rotating satellites in orbit to give you information which you could directly visualise. And you could - you could - I'm not advocating this. And then you could implement things: rather than having to sort of move your hands to do things, you could just think volitional thoughts, with sufficient training, and you could sort of pull triggers or do whatever you wished to do, just through volitional thought. I don't see that as being a significantly different situation - your element of personal responsibility is still the same. There's a firing pattern in my brain that causes my finger to do that [bends finger]. And it's not God's will that causes me to do that, it's a firing pattern in my brain. I think I have personal responsibility if I pull the trigger on a gun and shoot somebody. Whether I physically pull that trigger, because of this connection between the motor part of my brain and my muscles, or whether in fact I've got an external system which allows me to do that, I still have that responsibility. I think I could - right now - create a system that could shoot you just through my verbal command. I could say 'OK, shoot him!' and it would shoot you.
Er - I still have the responsibility, 'cause I built that system, and I told it to shoot you.

David:
So in terms of the social contract of legal and ethical personal responsibility, you would have to be able to measure these volitional firing patterns somehow, to prove the case…

Prof Norman:
Oh yes, and your argument that there was a bug in the software is legitimate. That argument is used today, as a matter of fact. I mean, the bug in the software is schizophrenia, or Ritalin, or a variety of psychotropic drugs that are being taken now. There's a psychotropic drug - I can't think of the name of it - which in 95% of people calms them down, and these are people who are kind of, you know, psychopathic, and it calms them down; but in 5% of people, or actually maybe even less, maybe 2%, it actually causes them to become even more hostile and aggressive, and there are murders that have been committed by people on this particular drug. And now the claim is: 'the drug made me do it.' So I think the issue you're talking about is already here. It's been here for a long time. I mean, what is responsibility? A person who is 'criminally insane' doesn't go through the same legal system as a person who is a murderer goes through.

David:
It's already a bug in the system.

Prof Norman:
That's right, exactly. So I don't see significant differences there. I still think people are personally responsible.

David:
Oh yeah. On the Bionic Technologies website, there's mention of the capability of electrical stimulation of the sensory cortex.

Prof Norman:
That's the whole notion of artificial vision and artificial hearing and whatever.

David:
One of the ideas that came from that was: could this take, say, electronic tagging to its ultimate extreme - controlling behaviour in the criminally insane, or the politically undesirable? A kind of electronic soma.

Prof Norman:
People already see what they wanna see. TV commercials make us see things that we want to start seeing, want to start buying, and things like that - so already there are systems in place which can rather dramatically alter our behaviour.

David:
Sure. So this is an extension of what's already here.

Prof Norman:
Yeah.

David:
But I wonder if the intimacy and complexity of things takes it onto a new level. What I'm angling at, I think, is questions of identity. At the moment we can choose to switch the TV off.

Prof Norman:
And you can choose to switch your input into your sensory systems off as well, too.

David:
Will we always be able to do that?

Prof Norman:
Well, if you're blind - well, you always can. I mean, you will be in control of whatever. It sounds like I'm advocating very much a Brave New World sort of thing - I'm not. I made the premise to start with that I thought the best we could do is to take people who are non-performers and make them very poor performers. That's the best that we can do. A person who's profoundly blind - we will be able to reproduce some limited visual experience, but it's going to be extremely primitive compared to what you and I enjoy. And so if I wanted to make you see better, I would not use an implant in you. I would put you in front of a video monitor, and let you look at the signals that were coming down from the spy satellite or whatever.

David:
The human eye is going to be better.

Prof Norman:
That's right - it will always be better than what we can do. So I don't think we need to worry too much about that. The only thing that we might wanna worry about is the motor system. Now, if you're a jet fighter pilot, and you're flying at Mach 1 and having a dogfight with some enemy or something like that, a 10th of a second becomes very important. And it takes about a 10th of a second to go from here [brain] down to here [finger], and if you can recognise this firing pattern directly then it gives you that 10th of a second back in performance. And if it's only one thing - which is to pull the trigger - that's the only signal you're decoding; you do all the flying here [hands], but the decision to pull the trigger happens up here [brain]. Then it could make the difference - you know, a 10th of a second - you can see something, fire, and in a 10th of a second it could be out of the way. So that could be a significant advantage. So I think - I'm not advocating for this - that's the only thing I can think of where performance enhancement could be possible through an implant system. And I think that the military has correctly decided that this is not a good idea. In fact there are better ways to do that. A better way would be to make your plane fly even faster and turn sharper, so the G-loads will be twice as high as they are now, and take the pilot out of the plane and let the pilot fly the plane remotely.

David:
Telepresent.

Prof Norman:
Yeah, exactly. Then in fact you can move much faster, and dodge much more effectively, without having to worry about blacking out, which is the limitation right now. So there are better ways to do that than putting chips in people's heads. So I don't think we have to worry too much about chips in people's heads, period. Except for people who are really, completely disabled.

David:
In order to bring them some performance.

Prof Norman:
That's right, exactly.