Monday, 23 March 2015

Removing trauma with technology - what could go wrong?

It’s a curious thing to discuss memory like it’s retrievable when no one's sure how it was stored in the first place.

The linked article outlines an emerging field of research out of the U.S. Defense Advanced Research Projects Agency (DARPA), which apparently wants to have implantable brain computers in the...well...I guess it wants them for the future? The earlier the better, right? They don't exist yet (thank Christ), but they'll appear on a shelf soon. Or maybe you'll be the shelf? I dunno, I didn't read the research very closely.

From what I remember, the mere recalling of a memory can change the nature of that memory – you can add or edit or delete or combine similar events/memories without even realising what you’re doing. I’m also sceptical of the hard-drive analogy in the story because in reality, how the brain works is still very much up for debate. They’re not even sure how or where memory is stored in the brain – the hippocampus is one candidate, but there’s also been discussion of memory being distributed across the cortex.
What's that? The Kool-Aid? Drink up, buddy.

I wonder if the troops whose memories they’re trying to restore lost some of those memories for good reason. Bear with me.

All those traumatic events they experienced may have been ‘dulled’ by memory loss (for their survival), with the side effect that other important memories were lost along with the 'dulling'. So it’d be interesting to see whether regaining some memories turns out to be worse for them in the aggregate.

Not that I’m totally against it – I mean, I wouldn’t know who I am without my memories, and I’d imagine it’s pretty painful for the brain-damaged troops and their families and friends. But I also know, for example, that there have been experiments where people (rape victims) have had the opposite treatment (a dulling of the memories) so they can cope in life.

The thing that excites/bothers me is the potential for near-human or pseudo-human brains. It’ll make us more-than-human in a way we’ve been attempting as a species, through stranger and stranger ideologies, for hundreds of years. There wouldn't be any more “human condition” to worry about if you could just switch off the drive to murder, lie, hate, frighten, etc. I can see why they're chasing this.

But we’re already far too malleable from advertising and propaganda for me to think this is an unalloyed good for humanity. Let’s assume a liquid hard drive could be implanted in human brains to “help” them get over trauma (it’s odd that these things are always sold as helping people, isn’t it?…). What would it take for someone to “train” a brain hard drive to make that person a “better” person? Surely not much.

I have one question: better according to whom? A Nazi “better” might be very different to a Jain “better”. Jains don’t condone any kind of violence (even, presumably, implanting a device that makes a person better in almost every respect), which means this technology would only be used by the Nazi in this scenario. So now you’ve got a problem where it doesn’t matter what someone means by “better”, only that we keep the technology out of the hands of people whose concept of “better” is something we don’t already agree with.

Yeah…good luck with that…worked so well with nuclear weapons…  

And this is assuming the creators of the technology already know what “better” means! I’m pretty sure we don’t. And it assumes the equivalents of the Nazis don't beat the rest of us to the invention in the first place.

I reckon the good-enough-brain analogy is apt. I can’t really see any good reason to manipulate how a brain functions based on some company’s blue-sky theory that it’ll help trauma victims.

After all, what is trauma exactly? Is stubbing my toe trauma? I know it bloody hurts, and now I’m scared of table legs. So would I qualify for a brain hard drive to resurrect my trust in furniture? If so, then we’re pushing the bar waaaay too close to the ground for my liking.

Also, if it’s not just trauma that’s on the table here (sorry, bad pun), what else would be acceptable in the grand scheme of implantable computers? Emotional pain? Repressed memories? Sibling rivalry? Hatred of the colour green? In fact, wouldn’t a more frightening question be: where does this whole enterprise stop?

I don’t trust humans a whole lot now. What makes the agency think I’m going to trust a bunch of humans with machines in their skulls programmed to make their every move more ethical than everyone else’s? Especially when no one’s ever come up with anything remotely resembling a universal theory of ethics. That's asking for problems. I still think that although we can advance technology, it'll take a lot more to advance humans - even with all this new technology.

In saying all that, teaming this technology with prosthetics is a cool idea and might give people true freedom when they’ve lost limbs. Then again, all the arguments I’ve already outlined apply here in exactly the same way. We can trust the tech; we just can't trust the humans wielding it.

This is gonna take some serious philosophical ethics to figure out…
