
Can we create new senses for humans? | David Eagleman

October 19, 2019


We are built out of very small stuff, and we are embedded in a very large cosmos, and the fact is that we are not very good at understanding reality at either of those scales, because our brains haven’t evolved to understand the world at those scales. Instead, we’re trapped on this very thin slice of perception right in the middle. But it gets strange, because even at that slice of reality that we call home, we’re not seeing most of the action that’s going on. So take the colors of our world. These are light waves: electromagnetic radiation that bounces off objects and hits specialized receptors in the back of our eyes. But we’re not seeing all the waves out there. In fact, what we see is less than a ten-trillionth of what’s out there. So you have radio waves and microwaves and X-rays and gamma rays passing through your body right now, and you’re completely unaware of it, because you don’t come with the proper biological receptors for picking it up. There are thousands of cell phone conversations passing through you right now, and you’re utterly blind to it.

Now, it’s not that these things are inherently unseeable. Snakes include some infrared in their reality, and honeybees include ultraviolet in their view of the world, and of course we build machines in the dashboards of our cars to pick up on signals in the radio frequency range, and we build machines in hospitals to pick up on the X-ray range. But you can’t sense any of those by yourself, at least not yet, because you don’t come equipped with the proper sensors. What this means is that our experience of reality is constrained by our biology, and that goes against the common-sense notion that our eyes and our ears and our fingertips are just picking up the objective reality that’s out there. Instead, our brains are sampling just a little bit of the world.

Now, across the animal kingdom, different animals pick up on different parts of reality. In the blind and deaf world of the tick, the important signals are temperature and butyric acid; in the world of the black ghost knifefish, the sensory world is lavishly colored by electrical fields; and for the echolocating bat, reality is constructed out of air compression waves. That’s the slice of their ecosystem that they can pick up on, and we have a word for this in science: it’s called the umwelt, which is the German word for the surrounding world. Presumably, every animal assumes that its umwelt is the entire objective reality out there, because why would you ever stop to imagine that there’s something beyond what you can sense? Instead, what we all do is accept reality as it’s presented to us.

Let’s do a consciousness-raiser on this. Imagine that you are a bloodhound dog. Your whole world is about smelling. You’ve got a long snout that has 200 million scent receptors in it, you have wet nostrils that attract and trap scent molecules, and your nostrils even have slits so you can take big nosefuls of air. Everything is about smell for you. So one day, you stop in your tracks with a revelation. You look at your human owner and you think, “What is it like to have the pitiful, impoverished nose of a human? (Laughter) What is it like when you take a feeble little noseful of air? How can you not know that there’s a cat 100 yards away, or that your neighbor was on this very spot six hours ago?” (Laughter)

So because we’re humans, we’ve never experienced that world of smell, so we don’t miss it, because we are firmly settled into our umwelt. But the question is,
do we have to be stuck there? So as a neuroscientist, I’m interested in the way that technology might expand our umwelt, and how that’s going to change the experience of being human.

So we already know that we can marry our technology to our biology, because there are hundreds of thousands of people walking around with artificial hearing and artificial vision. So the way this works is, you take a microphone and you digitize the signal, and you put an electrode strip directly into the inner ear. Or, with the retinal implant, you take a camera and you digitize the signal, and then you plug an electrode grid directly into the optic nerve. And as recently as 15 years ago, there were a lot of scientists who thought these technologies wouldn’t work. Why? It’s because these technologies speak the language of Silicon Valley, and it’s not exactly the same dialect as our natural biological sense organs. But the fact is that it works; the brain figures out how to use the signals just fine.

Now, how do we understand that? Well, here’s the big secret: Your brain is not hearing or seeing any of this. Your brain is locked in a vault of silence and darkness inside your skull. All it ever sees are electrochemical signals that come in along different data cables, and this is all it has to work with, and nothing more. Now, amazingly, the brain is really good at taking in these signals and extracting patterns and assigning meaning, so that it takes this inner cosmos and puts together a story of this, your subjective world.

But here’s the key point: Your brain doesn’t know, and it doesn’t care, where it gets the data from. Whatever information comes in, it just figures out what to do with it. And this is a very efficient kind of machine. It’s essentially a general purpose computing device, and it just takes in everything and figures out what it’s going to do with it, and that, I think, frees up Mother Nature to tinker around with different sorts of input channels.

So I call this the P.H. model of evolution, and I don’t want to get too technical here, but P.H. stands for Potato Head, and I use this name to emphasize that all these sensors that we know and love, like our eyes and our ears and our fingertips, are merely peripheral plug-and-play devices: you stick them in, and you’re good to go. The brain figures out what to do with the data that comes in.

And when you look across the animal kingdom, you find lots of peripheral devices. So snakes have heat pits with which to detect infrared, and the ghost knifefish has electroreceptors, and the star-nosed mole has this appendage with 22 fingers on it with which it feels around and constructs a 3D model of the world, and many birds have magnetite so they can orient to the magnetic field of the planet. So what this means is that nature doesn’t have to continually redesign the brain. Instead, with the principles of brain operation established, all nature has to worry about is designing new peripherals.

Okay. So what this means is this: The lesson that surfaces is that there’s nothing really special or fundamental about the biology that we come to the table with. It’s just what we have inherited from a complex road of evolution. But it’s not what we have to stick with, and our best proof of principle of this comes from what’s called sensory substitution. And that refers to feeding information into the brain via unusual sensory channels, and the brain just figures out what to do with it. Now, that might sound speculative, but the first paper demonstrating this was
published in the journal Nature in 1969.

So a scientist named Paul Bach-y-Rita put blind people in a modified dental chair, set up a video feed, and put something in front of the camera, and then you would feel that poked into your back with a grid of solenoids. So if you wiggle a coffee cup in front of the camera, you’re feeling that in your back, and amazingly, blind people got pretty good at being able to determine what was in front of the camera just by feeling it in the small of their back.

Now, there have been many modern incarnations of this. The sonic glasses take a video feed right in front of you and turn that into a sonic landscape, so as things move around and get closer and farther, it sounds like “Bzz, bzz, bzz.” It sounds like a cacophony, but after several weeks, blind people start getting pretty good at understanding what’s in front of them just based on what they’re hearing. And it doesn’t have to be through the ears: another system uses an electrotactile grid on the forehead, so whatever’s in front of the video feed, you’re feeling it on your forehead. Why the forehead? Because you’re not using it for much else.

The most modern incarnation is called the BrainPort, a little electrogrid that sits on your tongue; the video feed gets turned into little electrotactile signals, and blind people get so good at using this that they can throw a ball into a basket or navigate complex obstacle courses. They can come to see through their tongue. Now, that sounds completely insane, right? But remember, all vision ever is, is electrochemical signals coursing around in your brain. Your brain doesn’t know where the signals come from. It just figures out what to do with them.

So my interest in my lab is sensory substitution for the deaf, and this is a project I’ve undertaken with a graduate student in my lab, Scott Novich, who is spearheading this for his thesis. Here is what we wanted to do: we wanted to make it so that sound from the world gets converted in some way so that a deaf person can understand what is being said. And given the power and ubiquity of portable computing, we wanted to make sure that this would run on cell phones and tablets, and we also wanted to make it a wearable, something that you could wear under your clothing.

So here’s the concept. As I’m speaking, my sound is getting captured by the tablet, and then it’s getting mapped onto a vest that’s covered in vibratory motors, just like the motors in your cell phone. So as I’m speaking, the sound is getting translated to a pattern of vibration on the vest. Now, this is not just conceptual: this tablet is transmitting Bluetooth, and I’m wearing the vest right now. So as I’m speaking — (Applause) — the sound is getting translated into dynamic patterns of vibration. I’m feeling the sonic world around me.

So we’ve been testing this with deaf people now, and it turns out that after just a little bit of time, people can start feeling, they can start understanding, the language of the vest. So this is Jonathan. He’s 37 years old.
He has a master’s degree. He was born profoundly deaf, which means that there’s a part of his umwelt that’s unavailable to him. So we had Jonathan train with the vest for four days, two hours a day, and here he is on the fifth day.

Scott Novich: You.

David Eagleman: So Scott says a word, Jonathan feels it on the vest, and he writes it on the board.

SN: Where. Where.

DE: Jonathan is able to translate this complicated pattern of vibrations into an understanding of what’s being said.

SN: Touch. Touch.

DE: Now, he’s not doing this — (Applause) — Jonathan is not doing this consciously, because the patterns are too complicated, but his brain is starting to unlock the pattern that allows it to figure out what the data mean, and our expectation is that, after wearing this for about three months, he will have a direct perceptual experience of hearing, in the same way that when a blind person passes a finger over braille, the meaning comes directly off the page without any conscious intervention at all.

Now, this technology has the potential to be a game-changer, because the only other solution for deafness is a cochlear implant, and that requires invasive surgery. And this can be built for 40 times cheaper than a cochlear implant, which opens up this technology globally, even for the poorest countries.

Now, we’ve been very encouraged by our results with sensory substitution, but what we’ve been thinking a lot about is sensory addition. How could we use a technology like this to add a completely new kind of sense, to expand the human umwelt? For example, could we feed real-time data from the Internet directly into somebody’s brain, and can they develop a direct perceptual experience?

So here’s an experiment we’re doing in the lab. A subject is feeling a real-time streaming feed of data from the Net for five seconds. Then two buttons appear, and he has to make a choice. He doesn’t know what’s going on. He makes a choice, and he gets feedback after one second. Now, here’s the thing: The subject has no idea what all the patterns mean, but we’re seeing if he gets better at figuring out which button to press. He doesn’t know that what we’re feeding him is real-time data from the stock market, and he’s making buy and sell decisions. (Laughter) And the feedback is telling him whether he did the right thing or not. What we’re seeing is, can we expand the human umwelt so that he comes to have, after several weeks, a direct perceptual experience of the economic movements of the planet? So we’ll report on that later
to see how well this goes. (Laughter)

Here’s another thing we’re doing: During the talks this morning, we’ve been automatically scraping Twitter for the TED2015 hashtag, and we’ve been doing an automated sentiment analysis, which means: are people using positive words, negative words, or neutral ones? And while this has been going on, I have been feeling it, and so I am plugged in to the aggregate emotion of thousands of people in real time. That’s a new kind of human experience, because now I can know how everyone’s doing and how much you’re loving this. (Laughter) (Applause) It’s a bigger experience than a human can normally have.

We’re also expanding the umwelt of pilots. So in this case, the vest is streaming nine different measures from this quadcopter — pitch and yaw and roll and orientation and heading — and that improves this pilot’s ability to fly it. It’s essentially like he’s extending his skin up there, far away. And that’s just the beginning. What we’re envisioning is taking a modern cockpit full of gauges and, instead of trying to read the whole thing, you feel it. We live in a world of information now, and there is a difference between accessing big data and experiencing it.

So I think there’s really no end to the possibilities on the horizon for human expansion. Just imagine an astronaut being able to feel the overall health of the International Space Station, or, for that matter, having you feel the invisible states of your own health, like your blood sugar and the state of your microbiome, or having 360-degree vision or seeing in infrared or ultraviolet.

So the key is this: As we move into the future, we’re going to increasingly be able to choose our own peripheral devices. We no longer have to wait for Mother Nature’s sensory gifts on her timescales; instead, like any good parent, she’s given us the tools that we need to go out and define our own trajectory. So the question now is, how do you want to go out and experience your universe?

Thank you.

(Applause)

Chris Anderson: Can you feel it?
DE: Yeah. Actually, this was the first time I felt applause on the vest. It’s nice. It’s like a massage. (Laughter)

CA: Twitter’s going crazy. Twitter’s going mad. So that stock market experiment. This could be the first experiment that secures its funding forevermore, right, if successful?

DE: Well, that’s right, I wouldn’t have to write to NIH anymore.

CA: Well look, just to be skeptical for a minute, I mean, this is amazing, but isn’t most of the evidence so far that sensory substitution works, not necessarily that sensory addition works? I mean, isn’t it possible that the blind person can see through their tongue because the visual cortex is still there, ready to process, and that that is needed as part of it?

DE: That’s a great question. We actually have no idea what the theoretical limits are of what kind of data the brain can take in. The general story, though, is that it’s extraordinarily flexible. So when a person goes blind, what we used to call their visual cortex gets taken over by other things, by touch, by hearing, by vocabulary. So what that tells us is that the cortex is kind of a one-trick pony. It just runs certain kinds of computations on things. And when we look around at things like braille, for example, people are getting information through bumps on their fingers. So I don’t think we have any reason to think there’s a theoretical limit that we know the edge of.

CA: If this checks out, you’re going to be deluged. There are so many possible applications for this. Are you ready for this? What are you most excited about, the direction it might go?

DE: I mean, I think there’s a lot of applications here. In terms of beyond sensory substitution, the things I started mentioning about astronauts on the space station, they spend a lot of their time monitoring things, and they could instead just get what’s going on, because what this is really good for is multidimensional data. The key is this: Our visual systems are good at detecting blobs and edges, but they’re really bad at what our world has become, which is screens with lots and lots of data. We have to crawl that with our attentional systems. So this is a way of just feeling the state of something, just like the way you know the state of your body as you’re standing around. So I think heavy machinery, safety, feeling the state of a factory, of your equipment, that’s one place it’ll go right away.

CA: David Eagleman, that was one mind-blowing talk. Thank you very much.

DE: Thank you, Chris.

(Applause)
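The sound-to-vibration mapping the vest relies on can be sketched in a few lines of Python. This is an illustrative reconstruction, not the lab’s actual code: the sample rate, frame length, motor count, and log-spaced frequency bands are all assumptions made for the example.

```python
import math

def frame_to_motor_levels(samples, sample_rate=8000, n_motors=8):
    """Map one audio frame to per-motor vibration levels in [0, 1].

    Spectral magnitude is summed in log-spaced frequency bands,
    one band per motor (all parameters here are illustrative).
    """
    n = len(samples)
    # Magnitude spectrum via a plain DFT (fine for a short demo frame).
    spectrum = []
    for k in range(n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        spectrum.append(math.hypot(re, im))
    # Log-spaced band edges from 100 Hz up to the Nyquist frequency.
    nyquist = sample_rate / 2
    edges = [100 * (nyquist / 100) ** (m / n_motors) for m in range(n_motors + 1)]
    hz_per_bin = sample_rate / n
    # Sum the magnitude falling inside each motor's band.
    levels = []
    for m in range(n_motors):
        lo, hi = edges[m], edges[m + 1]
        levels.append(sum(a for k, a in enumerate(spectrum) if lo <= k * hz_per_bin < hi))
    peak = max(levels) or 1.0  # avoid dividing by zero on silence
    return [lv / peak for lv in levels]

# A pure 440 Hz tone should drive mainly the motor whose band contains 440 Hz.
tone = [math.sin(2 * math.pi * 440 * i / 8000) for i in range(256)]
levels = frame_to_motor_levels(tone)
```

A real implementation would window overlapping frames and use an FFT, but the core idea is the same: each motor’s drive level tracks the energy in one slice of the audio spectrum.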

100 Comments

  • Reply Patrick Moore August 12, 2019 at 5:06 am

    I think the "adding a new peripheral" process might take longer than you are insinuating

  • Reply Jim Macdonald August 12, 2019 at 5:38 am

    awesome!

  • Reply Daan Van IJcken August 12, 2019 at 6:18 pm

    I so want this field to have a new breakthrough!

  • Reply Secret Journey August 13, 2019 at 3:38 pm

    I already deal with sensory overload. I can’t imagine what else

  • Reply Owen Campbell August 14, 2019 at 12:26 am

    Well, every thought of helping humanity and advancing technology is a good thought, even if money-making came first. There's nothing against getting paid if it's worth it.
    It seems not to be the best solution, but I'm optimistic that it can only develop further.

  • Reply Hector Gonzalez August 14, 2019 at 12:49 pm

    This is the birth of A.I.

  • Reply Hector Gonzalez August 14, 2019 at 12:49 pm

    They played us lmao

  • Reply Hector Gonzalez August 14, 2019 at 12:55 pm

    This is how they’re training their computers, this is why they’re advancing so fast so quickly. This is why this hasn’t been released to the public yet. They realized it would be more useful to train their computers, rather than waste it on humans early on. Make money now and later. Master it and learn to control it before we do.

  • Reply GeniiExE August 16, 2019 at 11:59 am

    Who’s watching this in 2019 and shocked they haven’t heard of this, or something like this product?

  • Reply Andre V.C. August 16, 2019 at 7:13 pm

    audience: claps
    speaker: please stop, it tickles!

  • Reply Paradox Productions - Rinohifive August 17, 2019 at 1:31 am

    The hive mind is nigh……

  • Reply Clumsy Captain August 17, 2019 at 2:02 am

    Language learning's gonna get a whole lot easier

  • Reply Wolv August 17, 2019 at 4:41 am

    I have always dreamt of being able to see a much larger range of the EM spectrum and to see magnetic/electric fields. I can't even explain how excited this video makes me feel. I want this right now.

  • Reply David Oliden Rodríguez August 17, 2019 at 8:51 am

    Pure transhumanism…

  • Reply backfire357 August 18, 2019 at 3:00 pm

    100% snake oil.

  • Reply Wilmer August 19, 2019 at 6:44 pm

    "Air compression waves". Wow, big brain time

  • Reply Stephen Milligan August 22, 2019 at 12:02 am

    My left eye is used almost solely for peripheral vision… If anyone wants to implant my left eye with infrared and UV light cameras I'd consent.

  • Reply OperationWeekend August 22, 2019 at 3:50 pm

    If this is true then I can put eyes (cameras) in the back of my head

  • Reply OmegaFalcon August 24, 2019 at 2:24 am

    This is still my favorite Ted Talk

  • Reply TheAmazeer August 24, 2019 at 7:50 am

    In the future, sexual relations will take place with no contact at all

  • Reply Shaerif August 24, 2019 at 12:12 pm

    To create a new race called cyborgs? Been there, not a fan of it.

    Mixed humans and machines are not the same!

  • Reply M๑ᘞᏓᎥå August 24, 2019 at 2:37 pm

    Sound waves converted into sign language, converted into vibrations in fixed regions. The principle is simple, but hard to come up with.

  • Reply Bazooka Sniper August 25, 2019 at 8:37 am

    Would the back vibrator vest work on the blind and deaf?

  • Reply Duke Necromancer August 25, 2019 at 3:22 pm

    Now, when a person born profoundly deaf gets introduced to this before they learn to sign, will their brain interpret those thoughts as vibrations or images first? It won't be human speech in their head; even though we can think in our own language without stimulating the actual peripherals, would a deaf person then feel the world around them in their head, or just picture it, or both? This could completely separate feeling physically and feeling auditory senses through peripheral devices. This technology fascinates me beyond what I could ever comprehend, because it is beyond my umwelt.

  • Reply MadPaperPeople August 25, 2019 at 7:22 pm

    x ray vision…cool

  • Reply Chris Dragotta August 26, 2019 at 1:03 am

    He won't be able to out trade a computer.

  • Reply Matt V August 26, 2019 at 1:37 pm

    4 years later, Neuralink was announced. (^_^)

  • Reply America Is my city August 26, 2019 at 4:46 pm

    So if you put the vibrations on their back, would they perceive it as just vibrations and translate it, or would the brain turn it into a physical language?

  • Reply TobiEeck August 26, 2019 at 6:44 pm

    Funny how he talks about the internet being sensory in 2015. In 2017 or 2018, Elon Musk announced the start of Neuralink, a chip in your head to read the internet and interact with it with your brain

  • Reply Jens Videbaek August 26, 2019 at 7:09 pm

    I think the correct phrase is perception of the electromagnetic spectrum from gamma rays through to long wavelength radio frequencies – visible light is in the 400 to 700 nanometer range.

  • Reply Mathieu Blake August 26, 2019 at 8:34 pm

    mother nature… a good parent..

  • Reply Mathieu Blake August 26, 2019 at 9:06 pm

    The brain SEES… The brain is seeing only darkness??

  • Reply heureka47 August 26, 2019 at 9:31 pm

    Most civilized people suffer unconsciously from being separated from their spiritual soul (caused by trauma). Get aware of the "Disease of Society", "Collective Neurosis", etc.

  • Reply Ophelia Rolle August 27, 2019 at 3:37 am

    What are we thinking?….They've already figured exactly how to do that too?

  • Reply Doppelpunktdrei August 27, 2019 at 5:15 pm

    This is amazing!

  • Reply MachineThatCreates August 27, 2019 at 9:00 pm

    Plus side; This will allow us to experience any higher dimensions.
    Down side: We become less human.
    Other side: Is this evolution?🌴

  • Reply Rishit Jain August 28, 2019 at 7:46 am

    Is it possible that we sample a large crowd to observe their electrochemical nervous activity, and then use it to create implants that could essentially help us to converse without words? The way it could work is we could trigger the electrochemical activity of our own implants based on the electrochemical activity around the sensors of the implant. Or we could assign addresses to each implant, so everyone can communicate, even at long distances. The implant could detect the impulses of our own brain when we're trying to send a message and convert it into radio waves. The sensors of other implants could check if that data packet is sent towards them. The ones that aren't the recipients could simply ignore those waves, or convert it into helpful energy, maybe electrical energy to power the implants? The recipient implant could accept the data packet, trigger electrochemical impulses according to the radio waves detected, and the brain could sense that. We could evolve to become telepaths, also cutting down on our sound pollution.

  • Reply Chirag Gupta August 30, 2019 at 3:44 am

    the best video I have ever seen in my life

  • Reply litle snek August 30, 2019 at 5:23 pm

    This is awesome!

  • Reply Spartan Spark August 31, 2019 at 2:07 am

    this is really cool!

  • Reply George Isaak August 31, 2019 at 2:20 am

    Promising, interesting, and I hope the human brain can keep up with this!

  • Reply papa yeast stack August 31, 2019 at 2:24 pm

    idiot

  • Reply the_audiophiles September 1, 2019 at 5:19 pm

    Great Scott! dooooooooooooooooode! love it mate.

  • Reply Kyle Noe September 5, 2019 at 7:27 pm

    I think I have a sense that few others have. The word sense is so intimate that nobody can seem to understand it like I see it.

  • Reply Jesse Mnmehe September 6, 2019 at 7:19 pm

    It was objectively a good TED talk. One more step into the future!
    #FromBelgium

  • Reply Jesse Mnmehe September 6, 2019 at 7:21 pm

    So fresh 😃😃😃 especially for AVIATION

  • Reply Matt Johnston September 6, 2019 at 11:46 pm

    I'm curious where this technology and research is at now in 2019…

  • Reply Clog Sexton September 9, 2019 at 3:41 am

    In the beginning of the lecture, the speaker suggested that the brain can interpret sensory data from inputs other than our regular ones; however, this is not what he demonstrated. What he explained in the beginning would logically be possible, given his account of the brain sitting in a dark space, processing sensory input (e.g. from the eye) and forming a perception of light, colour and shapes.
    What he demonstrated is an already existing sensory input on the body being stimulated by an artificial sensor, with the brain making sense of it.
    That is something completely different from feeding artificial sensor data into the brain.
    What he should have demonstrated is replacing an eye with, e.g., an IR sensor connected directly to the optic nerve, letting the brain interpret the foreign sensor input and learn what to make of it.
    I don't want to take away from the fact that the speaker's demonstrated aids are helpful for impaired people, but he demonstrated something other than what he explained in the beginning.
    It is misleading and a bit disappointing; I thought I was going to see some groundbreaking innovations giving humans sensory capabilities outside our spectrum.

  • Reply auditore63 September 9, 2019 at 8:14 pm

    UmWelat

  • Reply Tim September 11, 2019 at 8:25 pm

    A few steps down the "experience" path: can we control people's behavior with certain impulses, where you get a physical response to good/bad behavior/emotions, and when thinking or acting in certain ways you get a "feel-good" shot, or vice versa?

    Yeah, that won't get misused… xD

    Nonetheless, it's too good to pass up, as the benefits for the people such systems would help are too great.
    Gotta hand it to the brain, it's a marvelous thing. He had a nice angle; it makes you wonder how far we can push it and just how much potential it could have.

  • Reply Harsh T September 12, 2019 at 5:24 am

    4:45

  • Reply fofi fernandez September 13, 2019 at 1:51 am

    I don't have a sense of smell. How do I experience smell if my brain doesn't process it and I have neurological damage?

  • Reply Patrick Andro • September 13, 2019 at 3:42 pm

    I was waiting for him to say
    "Im actually deaf!"

  • Reply mac 187 September 14, 2019 at 3:40 am

    So is this an interview for him to be a new Marvel character?

  • Reply see1050 m September 17, 2019 at 9:41 am

    empathy would be nice

  • Reply badreddine rahim September 19, 2019 at 4:38 pm

    I'm kinda having a bad time between watching this talk and hearing my sister talk right now, so I guess brain potential is a bit overrated, bro: focus on one sense and you forget the others; that's everyday human life. That's why, in order to amplify our sensory data, we could just decode each animal language that has that incredible umwelt, maybe input it into a PC to translate it. But as our PCs are too undeveloped for these tasks, animal-human communication should finally arise in peaceful form, to understand the mysteries that all of us living beings share: the discomfort of not knowing 🙂 #thisworldsfuturepath Yep, that's the solution, because even after analysing the entire environmental state of the earth, the matters ahead will have other creatures' brains to think about, and then we're on to the next realm.
    We may even say bye to the 1+1 world that we built by ourselves and find a new scratch, bro!!!! Hahahahahahahaha ha! Good luck scientists 😉

  • Reply Nicholas Gerry September 19, 2019 at 4:55 pm

    Apple be like: "Unlock your new sense. Only $200000000"

  • Reply herumetto-san September 21, 2019 at 9:25 pm

    wow no comments mentioning Paul Rudd at all? c'mon guys….

  • Reply MjrDario September 22, 2019 at 4:30 pm

    A suit that literally allows you to "feel a disturbance in the force".

  • Reply 60lego September 22, 2019 at 11:11 pm

    …like he's extending his skin up there, far away

  • Reply renonkkk September 23, 2019 at 1:43 am

    I came up with an advanced scientific explanation. According to the story, the alien who was alive in the Roswell case seems to have had eyes that could see around the blind spot of a door. The human race has now gained a similar, common super technology.

  • Reply Steven Zhu September 23, 2019 at 2:49 am

    Now, create devices we can interact with using not our regular motors (arms, legs, etc.)

  • Reply noe leonel lopez de leon September 24, 2019 at 1:08 pm

    That's the most amazing discovery I have ever seen, get my Like 👍

  • Reply LordFryofKent September 24, 2019 at 7:17 pm

    It's a cool device, but Deaf people have lots of other options that they can use without having to wear the same thing every day.

  • Reply Patrick Monroe September 25, 2019 at 1:36 am

    Wonder if a cadaver brain can be implemented in modern AI technologies as a literal brain to help computers learn and adapt, much like Elon musk is looking to do

  • Reply Larry SAL September 25, 2019 at 6:30 pm

    So… I can send a smell in a chat message?

  • Reply Ashley Whispers September 26, 2019 at 2:02 am

    Wow this is absolutely amazing!

  • Reply Yashaswi Kulshreshtha September 26, 2019 at 2:58 pm

    Sometimes it feels like we have extraordinary intelligence that's beyond the most intelligent species that will ever live

  • Reply CallahanJones September 27, 2019 at 6:53 am

    Re plug and play: Don't brain processes and interfaces evolve like our input mechanisms do? If granted the smell capabilities of a dog in an instant, would a person's brain have the capability to make the same sense of the signals that a dog does? Or any sense at all? After all, different breeds of dogs have different powers of smell (or do they?). Do they have different noses? Or different brains? Both?

  • Reply Rowan Chavez September 27, 2019 at 5:16 pm

    Imagine parents being able to feel their baby's health. Like a sensory baby monitor

  • Reply Dilan Rigby September 28, 2019 at 8:16 am

    Imagine this tech in VR gaming.

  • Reply Julie q September 28, 2019 at 7:00 pm

    16:15 this is how aliens do it

  • Reply Theo Suharto September 29, 2019 at 2:23 pm

    Dude just wanted to flex them triceps

  • Reply Jordan Barela September 29, 2019 at 7:33 pm

    Amazing work! 💯

  • Reply Seamo One September 30, 2019 at 3:17 am

    I’m only a few seconds into this, I do hope he mentions it but if not… Here it is!
    I will tell you now that the best eyes on the planet that we’ve discovered are on the peacock mantis shrimp.
    Oh and BTW, each eye sees 360° independently and simultaneously…
    And its little claw hammers pack the punch of a .22-caliber bullet!
    So fast that the water underneath it boils for a split second, then it destroys its prey!
    I applaud him and his team, they are doing something revolutionary. Go do something revolutionary. Build something, think of something, invent something, do it!

  • Reply Michael Mills September 30, 2019 at 12:29 pm

    More absolute GENIUS at a TED talk

  • Reply Rain Tamer October 1, 2019 at 4:49 am

    Resistance is futile.

  • Reply Emma Frost October 1, 2019 at 3:15 pm

    This should be a video game

  • Reply Quackadoo October 1, 2019 at 8:23 pm

    His sideburns are on point.

  • Reply sulaiman syed October 4, 2019 at 1:37 am

    one of the best ted talks ever

  • Reply Anil Kumar Sharma October 4, 2019 at 5:39 am

    giving food grade, quality, shape, size, aroma, intensity, chemically reactivity with hydrochloric acid, so we got that scale which gives us the standard of food on basis of alkalinity and hotness and swad

  • Reply Anil Kumar Sharma October 4, 2019 at 5:40 am

    sensing the universe is based on intensity of number like one chilly, and number of chilly

  • Reply Greg Bystroff October 5, 2019 at 1:27 am

    4 years ago. Where is it?

  • Reply Scrap Rocket October 5, 2019 at 12:36 pm

    Hmmm… what do I wanna feel today?

  • Reply Jaden CM October 7, 2019 at 7:57 am

    No
    outro music

  • Reply UpNorth October 10, 2019 at 12:20 am

    This makes perfect sense for how a baby learns language, etc. Their brain learns patterns by processing information.

  • Reply Sorry Mom October 11, 2019 at 2:59 am

    watched 3 ted talks back to back iq is now 7000

  • Reply MaxBas October 11, 2019 at 4:15 pm

    I like how he says Umwelt, it was so strange to hear as a German xD

  • Reply Grace October 12, 2019 at 12:16 am

    Who’s here because of Landon

  • Reply Sean O'Donovan October 13, 2019 at 6:02 pm

    Wow, just wow.

  • Reply crazy Guy October 14, 2019 at 5:00 pm

    short circuit with brain

  • Reply daniyal k October 15, 2019 at 12:13 am

    So now science also proves that God exists, as he said: just because you can't see it, or you don't come equipped with the sensors to see it, doesn't mean it doesn't exist. Atheists have their own stupid world view, like the bats or ticks or that fish; they say Allah doesn't exist because we can't see him, and how stupid and unscientific that statement is.

  • Reply Snow Chan October 15, 2019 at 9:53 am

    The brain is the most sophisticated computer in the world. Fantastic. The electrical synapse is the same everywhere; it is only the location in the brain in which it occurs that confers its meaning.

  • Reply Asami xd October 16, 2019 at 12:34 am

    This is honestly great

  • Reply Wade James Kennedy October 18, 2019 at 5:14 am

    Wow !!!

  • Reply Ethernos Grace October 18, 2019 at 10:19 am

    Operate on me

  • Reply Deviant October 18, 2019 at 3:51 pm

    Ah, now I understand The Matrix so much better.

  • Reply RDnAC October 18, 2019 at 4:38 pm

    He shoulda made the vest black so it would look more badass.

  • Reply Az Da Sinista October 19, 2019 at 12:34 pm

    Evolution is a theory.
