AI-controlled brain implants for mood disorders tested in people (nature.com)
116 points by doener on Nov 23, 2017 | 34 comments


> One challenge with stimulating areas of the brain associated with mood, he says, is the possibility of overcorrecting emotions to create extreme happiness that overwhelms all other feelings.

Terrifyingly, this risk does not seem far off from the wireheading described in gwern's article, Terrorism is not Effective.

> I have just laid out a scheme whereby agents extraordinary only in dedication have exerted world-shaking power. Similar scenarios are true of other sectors. (The Secret Service works hard, but can they protect the President against the 100 fanatics?) Destruction and offense is always easier than construction and defense, but it’s hard to see why the fanatic advantage would be completely negated in constructive enterprises. (Small groups of programmers and engineers routinely revolutionize sectors of technology, without being especially fanatical.) But of course, we see very few such schemes in either direction. That is the point. There is a very large gap between what we can do and what we will do. Coordination is extremely hard (see again the principal-agent problem).

> But the scary thought is - will things remain that way? I have been at pains to keep the agents ordinary. Is there any way now or in the future to create such agents? [...]

> In short, is there any reason to believe wireheading will not work in humans like it works in mice? [...] That is one scenario. Here is another: the electrode is under the control of a program connected to metrics chosen by the subject, like going to the gym. (Related topic: nicotine & habit-formation.) The incentives are much more closely aligned: the subject could gain control of the stimulation, but that would frustrate another goal of his (going to the gym). Imagine the program hooked up to a comprehensive plan for attacking Goldman Sachs; one rather doubts that an agent will break the plan and not eat bulgur pilaf if that means he is simultaneously sabotaging the plan and also depriving himself of pleasure.

http://www.gwern.net/Terrorism-is-not-Effective
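
To make the incentive scheme gwern describes concrete, here's a rough sketch of stimulation gated on an externally verified goal. Every name here (Stimulator, verify_gym_visit, etc.) is hypothetical, not any real device API:

    import time

    class Stimulator:
        """Stand-in for a reward-circuit stimulator (illustrative only)."""
        def pulse(self, seconds: float) -> None:
            print(f"stimulating reward circuit for {seconds:.1f}s")

    def verify_gym_visit(subject_id: str) -> bool:
        """Stub for a tamper-resistant external metric check, e.g. querying
        a gym's door-entry log. Always False here; a real check would hit
        an outside data source the subject can't easily fake."""
        return False

    def reward_loop(subject_id: str, stim: Stimulator,
                    poll_interval: float = 3600.0) -> None:
        """Deliver reward stimulation only after verified progress, so
        hijacking the stimulator would also sabotage the subject's own plan."""
        while True:
            if verify_gym_visit(subject_id):
                stim.pulse(2.0)  # pleasure follows the goal, aligning incentives
            time.sleep(poll_interval)

The subject could seize control of the stimulation, but only by frustrating the very goal the program was set up to serve, which is the alignment gwern is pointing at.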


Off topic:

> (Small groups of programmers and engineers routinely revolutionize sectors of technology, without being especially fanatical.)

In my opinion, that's because of the current state of software as a field. In the future, as the field grows and the distance between subfields grows, it's going to be more and more difficult to get stuff done.

You could argue that abstraction will offset this difference, but I doubt that, as you generally need people who are invested in those layers of abstraction to fix bugs. Occasional bugs will be fixable by the small team, but we're already outsourcing Big Bugs to other, more specialized teams, even if we don't realize it (using libraries is just outsourcing work to other 'research teams'). But then the question is: if a bug fix for team X relies on seven different teams unconnected to team X's project, does it still count as a 'small group'? At what point do you account for the critical-yet-unconnected effort of other teams?



Also https://en.wikipedia.org/wiki/A_Deepness_in_the_Sky

> when a person began failing or slowing at a set task ... they were able to reverse it

Vernor Vinge anticipated this treatment and called it "focus".


It reminds me of The Dark Forest by Liu Cixin (the second book of the Three-Body Problem trilogy), where one of the Wallfacers came up with a chip to overcome defeatism in the troops (the Mental Seal).

Scary but fascinating how fast the future approaches.


I've always thought that whenever DARPA funds something that sounds good, there's a catch, like "yeah, let's publish the good-natured stuff..." and, in this case, "make sure the implants can be used in other parts of the brain".

Or am I just delusional? Or is it just the nature of tools?


I don't even see why you'd need them to work in other parts of the brain. Directly manipulating mood and emotion would already give you enormous power over a person.

As psychology research shows (Skinner etc., and the current discussion about manipulation in games), controlling emotional rewards would also have a big influence on that person's "rational" decision making.

I think the "let's publish the good-natured stuff" is more about the target values they seem to tune this system for. Sure, you could tune it to counteract mood disorders and steer the subject towards emotional stability. You could also tune it to other targets.


Yes, you are delusional. How do you define new technology that is "good natured"?

The reason DARPA is studying this, it seems, is to reduce its casualty rate by researching new ways to treat mental illness. If you can do that, you will spend less and also keep people alive. A lot of medical research has been driven by this motivation.

ARPA created the prototype of the internet (ARPANET). The catch of the internet is that it allowed people to disseminate information faster than anything before. You get the good (the advancement of science) and the bad (misleading people).

DARPA is in the business of moonshots. They are all about high risk / high reward.

It's more likely that the stuff you don't hear about didn't work in the first place.


>The catch of the internet is that it allowed for people to disseminate information faster than anything before.

The specific idea, IIRC, was to create a resilient network allowing decentralized command-and-control communications in the event of a direct nuclear strike on well-defined targets like the White House, the Pentagon, or the Capitol building. From what I can find by searching right now, the only currently claimed intent was to provide communications links between research installations, but I remember reading about the purpose I described in histories of the ARPANET/Internet decades ago on the net.


I had exactly the ARPANET in mind when I said "Or is it just the nature of tools?" :D

I should've worded it better: "good-natured" here means good intentions, alongside which there will be bad intentions that we will probably NEVER see (C&D, military secrets, etc...).

In this case, what's stopping the scientists and/or funders from misusing or weaponizing this implant?

P.S.: I guess I'm biased when it's DARPA, since it's the US military complex... but hey, I'm enjoying the Internet :D


For one thing, how would you weaponize this? Do you think that scientists are robots and don't have their own morality?

You still haven't told me how to design a technology to be "good intentioned". Technology has no intention; it's the people who use it who do.

It's really disturbing that you automatically assume that technology is evil because of who created it, and that you have a prejudice because it's DARPA.


> You still haven't told me how to design a technology to be "good intentioned".

Design.

Design simply encodes the values of the designer in the physical technology. A virtuous designer will design something which makes good things easy. A nihilistic designer will design something which makes their life easy. It’s unavoidable.

That’s why so much software these days is harmful: the tech world is largely populated with supremacists and nihilists.

(I don’t mean that in any particularly judgemental way. Supremacism is the norm in the wealthier slices of American culture, so it’s unremarkable that Silicon Valley would end up reflecting that. And the competitive nature of funding and hiring self-selects for supremacists as well. And for those who don’t tend towards supremacism, nihilism is one of the few ways to stay sane amongst supremacists.)


I automatically assume any new technology will be used for evil because we are all humans. Individual morality breaks down over large groups with separation of knowledge.


> The catch of the internet is that it allowed for people to disseminate information faster than anything before.

That's an unintended consequence, years and tech generations apart from the DARPA work.

The ARPANET's goal was always explicit: to permit troop coordination even in the event of large-scale bombings.


Just imagine a future where this technology is stolen and used to control dissent...

Coming to a future dystopia near you


Imagine a future when this technology is proprietary.


Imagine a future where the technology triggers good feelings in you when you view certain products or depictions thereof.


I could easily see this becoming as widespread as ADD medication.


Actually, there is a mission achievement in the latest Hitman game that has you using just such an implant to wirelessly hack someone’s mood.


Yeah, we need some tool to ensure that no one is going to steal it.


We already have technology to control dissent. Why do you think every country has a media sector, an education system, etc.?

But direct manipulation of the brain itself, rather than through your senses via propaganda, is on another level.

It introduces a whole slew of issues: an individual and his role in society; a citizen and their relation to government; and, of course, an individual and his relation to his own identity and free will. If a chip can manipulate your brain to feel or think a certain way, then are you really you? Do you have free will?


The logical endpoint of the "you don't really own your smartphone/notebook/computer/IoT-device/etc" thread running through much of HN discussion:

> One challenge with stimulating areas of the brain associated with mood, he says, is the possibility of ... extreme happiness

Extreme happiness you'd be able to feel if you were in control of your "own" implants, which in turn are in control of your "own" brain.

Needless to say, you wouldn't be.


This story is ripe for conspiracy theories

http://edition.cnn.com/2013/03/16/health/mental-illness-over...



This makes sense. The human body consists of many biological control systems, and if a few of them are out of order, the human becomes ill. So add an artificial control system and you might have a fix.

However, as humans evolve, we may become dependent on this type of technology. That's what worries me. On the other hand, patents will expire within one or two generations.


How long do they last?

It was my understanding that a big problem with brain implants (both probes and stimulators) was that after some time the body rejects them. Is that true here too and they have limited lifetimes, or have they got around that somehow (or is it otherwise a non-issue for whatever reason)?


Reminds me of The Terminal Man by Michael Crichton. I just re-read it the other day, and although it’s fiction and overblown, it is crazy to see the future catch up to what was science fiction 30 years ago.


No


What could possibly go wrong? It's just your brain.


Imagine a future in which every North Korean has this tech and is actually, truly happier in NK than in the West. And NK starts sending these people out to actually "follow the general's star and bring the Juche idea to the world." Now we bank on NK not wanting their countrymen to discover how much more technologically advanced the West is.


I'm not sure happiness is that simple. It's "easy" to fix depression in someone if they're well fed, in good physical health, and not overly concerned about their physical safety (I'm not trying to minimize depression here, just pointing out that its problems are relative). However, DPRK soldiers apparently aren't fed that well, lack access to good healthcare, and above all they'd be fighting on a battlefield, seeing death around them and in constant fear for their lives. It might be possible to amp up the soldiers, but the Germans also drugged their troops and even military leadership during WW II, which didn't work out so well for them in the medium term.


Why do you say it didn't work out well for the Germans?


Because they lost WW II and had their country occupied for decades by Americans and Russians, both of whom many Germans resented (at least according to my parents who were part of the occupying force).


But did they lose because of the drugs, or because they were massively outnumbered, made ridiculously stupid mistakes on the Eastern Front, ran out of fuel, and didn't have nukes in time?



