Scary fucking shit

A team of world-leading neuroscientists has developed a powerful technique that allows them to look deep inside a person's brain and read their intentions before they act.

The research breaks controversial new ground in scientists' ability to probe people's minds and eavesdrop on their thoughts, and raises serious ethical issues over how brain-reading technology may be used in the future.

The team used high-resolution brain scans to identify patterns of activity before translating them into meaningful thoughts, revealing what a person planned to do in the near future. It is the first time scientists have succeeded in reading intentions in this way.

"Using the scanner, we could look around the brain for this information and read out something that from the outside there's no way you could possibly tell is in there. It's like shining a torch around, looking for writing on a wall," said John-Dylan Haynes at the Max Planck Institute for Human Cognitive and Brain Sciences in Germany, who led the study with colleagues at University College London and Oxford University.

The research builds on a series of recent studies in which brain imaging has been used to identify tell-tale activity linked to lying, violent behaviour and racial prejudice.

The latest work reveals the dramatic pace at which neuroscience is progressing, prompting the researchers to call for an urgent debate into the ethical issues surrounding future uses for the technology. If brain-reading can be refined, it could quickly be adopted to assist interrogations of criminals and terrorists, and even usher in a "Minority Report" era (as portrayed in the Steven Spielberg science fiction film of that name), where judgments are handed down before the law is broken on the strength of an incriminating brain scan.

"These techniques are emerging and we need an ethical debate about the implications, so that one day we're not surprised and overwhelmed and caught on the wrong foot by what they can do. These things are going to come to us in the next few years and we should really be prepared," Professor Haynes told the Guardian.

The use of brain scanners to judge whether people are likely to commit crimes is a contentious issue that society should tackle now, according to Prof Haynes. "We see the danger that this might become compulsory one day, but we have to be aware that if we prohibit it, we are also denying people who aren't going to commit any crime the possibility of proving their innocence."

During the study, the researchers asked volunteers to decide whether to add or subtract two numbers they were later shown on a screen.

Before the numbers flashed up, they were given a brain scan using a technique called functional magnetic resonance imaging. The researchers then used software that had been designed to spot subtle differences in brain activity to predict the person's intentions with 70% accuracy.

The study revealed signatures of activity in a marble-sized part of the brain called the medial prefrontal cortex that changed when a person intended to add the numbers or subtract them.

Because brains differ so much, the scientists need a good idea of what a person's brain activity looks like when they are thinking something to be able to spot it in a scan, but researchers are already devising ways of deducing what patterns are associated with different thoughts.
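The article gives only a high-level account of the decoding step: record scans while the intention is held, then train software on that person's own activity patterns to tell "add" from "subtract". As a rough sketch only, the example below trains a linear classifier on synthetic voxel data and scores it with cross-validation; the library (scikit-learn), the choice of classifier, and every number in it are assumptions made for illustration, not details taken from the study.

```python
# Minimal sketch of per-subject intention decoding on synthetic data.
# This is NOT the research team's actual pipeline; it only illustrates the
# general idea: train a pattern classifier on one person's scans, then
# estimate how often it tells "add" from "subtract" correctly.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_voxels = 120, 200          # hypothetical trial and voxel counts
labels = rng.integers(0, 2, n_trials)  # 0 = "subtract", 1 = "add"

# Fake voxel activity: mostly noise, with a weak intention-dependent signal
# in a small subset of voxels (standing in for the medial prefrontal cortex).
signal = np.zeros((n_trials, n_voxels))
signal[:, :10] = labels[:, None] * 0.4
scans = signal + rng.normal(size=(n_trials, n_voxels))

# A linear classifier trained and tested with 10-fold cross-validation,
# entirely within the same (simulated) subject.
clf = LinearSVC(dual=False)
accuracy = cross_val_score(clf, scans, labels, cv=10).mean()
print(f"decoding accuracy: {accuracy:.0%}")  # well above the 50% chance level
```

Cross-validating within a single simulated subject mirrors the point above about individual differences: the classifier only works for the person whose scans it was trained on, and the article does not say how the 70% figure itself was computed.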

Barbara Sahakian, a professor of neuropsychology at Cambridge University, said the rapid advances in neuroscience had forced scientists in the field to set up their own neuroethics society late last year to consider the ramifications of their research.

"Do we want to become a 'Minority Report' society where we're preventing crimes that might not happen?," she asked. "For some of these techniques, it's just a matter of time. It is just another new technology that society has to come to terms with and use for the good, but we should discuss and debate it now because what we don't want is for it to leak into use in court willy nilly without people having thought about the consequences.

"A lot of neuroscientists in the field are very cautious and say we can't talk about reading individuals' minds, and right now that is very true, but we're moving ahead so rapidly, it's not going to be that long before we will be able to tell whether someone's making up a story, or whether someone intended to do a crime with a certain degree of certainty."

Professor Colin Blakemore, a neuroscientist and director of the Medical Research Council, said: "We shouldn't go overboard about the power of these techniques at the moment, but what you can be absolutely sure of is that these will continue to roll out and we will have more and more ability to probe people's intentions, minds, background thoughts, hopes and emotions.

"Some of that is extremely desirable, because it will help with diagnosis, education and so on, but we need to be thinking the ethical issues through. It adds a whole new gloss to personal medical data and how it might be used."

The technology could also drive advances in brain-controlled computers and machinery to boost the quality of life for disabled people. Being able to read thoughts as they arise in a person's mind could lead to computers that allow people to operate email and the internet using thought alone, and write with word processors that can predict which word or sentence the user wants to type. The technology is also expected to lead to improvements in thought-controlled wheelchairs and artificial limbs that respond when a person imagines moving.

"You can imagine how tedious it is if you want to write a letter by using a cursor to pick out letters on a screen," said Prof Haynes. "It would be much better if you thought, 'I want to reply to this email', or, 'I'm thinking this word', and the computer can read that and understand what you want to do."

FAQ: Mind reading

What have the scientists developed?
They have devised a system that analyses brain activity to work out a person's intentions before they have acted on them. More advanced versions may be able to read complex thoughts and even pick them up before the person is conscious of them.

How does it work?
The computer learns unique patterns of brain activity, or signatures, that correspond to different thoughts. It then scans the brain to look for these signatures and predicts what the person is thinking.
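The "signature" description in this answer maps loosely onto simple template matching: average the scans recorded for each known thought into a signature, then label a new scan by whichever signature it resembles most. The toy sketch below uses made-up numbers and a correlation rule chosen purely for illustration; it is not the researchers' method.

```python
# Toy illustration of "signature" matching -- not the researchers' actual method.
# Learn one average activity pattern per thought, then label a new scan by
# the signature it correlates with most strongly.
import numpy as np

def learn_signatures(scans, labels):
    """Average the voxel patterns recorded for each label into a signature."""
    return {lab: scans[labels == lab].mean(axis=0) for lab in np.unique(labels)}

def read_intention(signatures, new_scan):
    """Return the label whose signature correlates best with the new scan."""
    return max(signatures,
               key=lambda lab: np.corrcoef(signatures[lab], new_scan)[0, 1])

# Made-up data: 40 training scans of 50 voxels, two intentions, and a
# voxel-wise activity pattern that only appears on "add" trials.
rng = np.random.default_rng(1)
labels = np.array(["add", "subtract"] * 20)
add_pattern = rng.normal(size=50)
scans = rng.normal(size=(40, 50)) + (labels == "add")[:, None] * add_pattern

signatures = learn_signatures(scans, labels)
probe = rng.normal(size=50) + add_pattern   # a new, unlabelled "add" scan
print(read_intention(signatures, probe))    # prints "add" on this toy data
```

A trained classifier, as in the earlier sketch, would normally stand in for the plain correlation rule, but the learn-then-match logic is the same idea the FAQ describes.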

How could it be used?
It is expected to drive advances in brain-controlled computers, leading to artificial limbs and machinery that respond to thoughts. More advanced versions could be used to help interrogate criminals and assess prisoners before they are released. Controversially, they may be able to spot people who plan to commit crimes before they break the law.

What is next?
The researchers are honing the technique to distinguish between passing thoughts and genuine intentions.

http://www.guardian.co.uk/frontpage/story/0,,2009229,00.html

Comments

consortofvenus
Feb. 17th, 2007 06:12 pm (UTC)
More reasons to be thrilled that I'm not immortal. Of course, if I were, they might forget all about scanning my head and just focus on the immortal thing, so.
fayanora
Feb. 20th, 2007 06:07 pm (UTC)
You ARE immortal. That's why we die eventually... the energy of us that lives on, whatever you want to call it, gets bored with one lifetime. :-)
scwizard
Feb. 17th, 2007 07:09 pm (UTC)
Well, as scary as it may be this is progress and this is the world of the future.

I've been particularly interested in brain reading technology (I have a collection of newspaper clippings, and I'll add this article to them). I even planned to write a science fiction book where people's thoughts were as available as the data on their personal computers and hackers could read minds. It'll be kind of funny if I don't get around to writing the story until psychic hackers already exist.

The biggest problem we have now is that brain imaging requires a big machine and a big computer to process the output of the big machine. However, as we know, part of progress is that big stuff gets smaller as time goes on, so this kind of development is pretty inevitable now.

If the government has more or less exclusive access to this kind of technology, then we will get a 1984 scenario. Scientists need to get this type of technology as cheap as possible once it's viable. If it's too expensive, the military will have far more access to it than civilians.

Another way to end up in 1984 is if we cede control of the data on our personal computers to the government (PCs these days contain a lot of our thoughts, but this is nothing compared to what will come). Microsoft and a bunch of other corporations increasingly want to control what data you can access on your own hard drive through DRM. This eventually leads to trusted computing. This leads into 1984 once, in the name of security (national security this time, not computer security), you can no longer be trusted to have access to, control the transmission of, or delete the records of your own thoughts.
fayanora
Feb. 20th, 2007 06:22 pm (UTC)
The thing is, it's not the idea of a computer being able to record thoughts that bothers me... in one or more of my Nokwahl novels, she has a computer in her apartment with a "Psionic Transceiver" built into it, so she can give it commands with a directed thought and have it record her thoughts for her diary or for when writing a letter. And the computer is an AI. But this is a future where the Psionic Transceiver was developed not by humans but by the Ah'Koi Bahnis. The AI technology itself is mostly Droid technology, with a little human influence.

So, in the right hands... or rather, made by a race that has advanced quite a ways in its ways of thinking, I wouldn't mind the technology. In fact, I'd LOVE to be able to write without typing or speaking. But I don't trust the human race at this point in its memetic evolution. If this technology were invented by them, I would start seeking out Psionic Transceiver Blocking Technology.
gkathellar
Feb. 20th, 2007 10:25 pm (UTC)
To be fair, it seems unlikely they're going to be able to do this without enormous pieces of metal any time soon. Maybe I've misunderstood, but isn't brain imaging fairly similar to MRI?
fayanora
Feb. 22nd, 2007 07:27 pm (UTC)
Yes, it is. In fact, I think they need an MRI machine to do a brain scan.
gkathellar
Feb. 22nd, 2007 08:34 pm (UTC)
Therefore, given the huge, mildly dangerous magnets involved, and the expense involved in using those magnets, this doesn't seem like it'll be a major concern in the near future.
fayanora
Feb. 23rd, 2007 06:53 pm (UTC)
*Mildly* dangerous magnets? *MILDLY* dangerous??? :-)

I see your point, and raise you double elephants.
gkathellar
Feb. 23rd, 2007 07:45 pm (UTC)
Not double anteaters?
fayanora
Feb. 23rd, 2007 07:57 pm (UTC)
Sprockets of doom collide with combovers from Mercury.
gkathellar
Feb. 23rd, 2007 08:12 pm (UTC)
Mooooo!
fayanora
Feb. 24th, 2007 04:41 pm (UTC)
Dorkus porkus nincompoop,
Sally O'Malley alley-oop!
beautifulpyre
Feb. 17th, 2007 09:30 pm (UTC)
MINORITY REPORT!
wizwom
Feb. 19th, 2007 03:13 pm (UTC)
Nah, M.R. used psychics to predict futures that were going to happen if no action was taken.
This is more like Brin's "Sundiver" Society, where people who fail psychological testing are "permanent probationers" and second-class citizens, to the point of being fitted with a locating device, like a dog.
fayanora
Feb. 20th, 2007 06:24 pm (UTC)
I don't know how it is that I read that book and never noticed the thing about permanent probationers. I mean, it sounds familiar, so I must have read it and remembered it, but I never *noticed* it.
gkathellar
Feb. 17th, 2007 09:50 pm (UTC)
There are bigger problems.

I think that this could easily be over-amplified in the news media, and is probably being done so. I'm not saying it's not a problem - but I think people shouldn't be getting over-excited about it.
fayanora
Feb. 20th, 2007 06:41 pm (UTC)
What bigger problems did you have in mind, specifically?
gkathellar
Feb. 20th, 2007 06:52 pm (UTC)
Global warming. You know ... the immediate problem that's going to be irreversible in a fairly short amount of time?
fayanora
Feb. 20th, 2007 07:18 pm (UTC)
Oh yeah.
gkathellar
Feb. 20th, 2007 10:22 pm (UTC)
It's really difficult to wrap your head around, but the worst thing is ... it actually is the single worst problem in the world right now.
wizwom
Feb. 19th, 2007 03:16 pm (UTC)
Well, I've known that brain imaging of this sort could be used to diagnose difficult diseases such as schizophrenia since the mid-80s, and it just wasn't done because, er, the Psychiatrists would Rather Not Know For Sure.
fayanora
Feb. 20th, 2007 06:37 pm (UTC)
Psychiatrists would Rather Not Know For Sure.

Really? Why?
wizwom
Feb. 20th, 2007 06:40 pm (UTC)
As I understand it, because then many people on Prozac and Therapy and/or Ritalin and therapy would have it shown that they are normal, and thus they would need no treatment.

No law against a psychiatrist having stock in drug companies. Or wanting to make a buck.

caveat emptor, as they say.
fayanora
Feb. 20th, 2007 07:18 pm (UTC)
Weird.
shadowdf
Feb. 19th, 2007 07:36 pm (UTC)
Somehow... Total Recall comes to mind...
fayanora
Feb. 20th, 2007 06:26 pm (UTC)
Yeah, I suppose it makes sense that once they master the ability to read the mind, they could develop how to write to the mind.

Ghost In The Shell comes to mind, too. Specifically, in the original movie where some people were getting "ghost-hacked." Someone erasing their memories, replacing them with new ones. *Shudders*
(Anonymous)
Feb. 24th, 2007 11:09 pm (UTC)
Here's hoping that it never comes to that... though with the way society is quietly going, it seems a likelihood...