TED Session 4: Oliver Sacks, Joann Kuchera-Morin…
The theme is SEE.
Hallucinations. Tells the story of a 95-year-old patient who hallucinated a silent movie; he diagnosed her with Charles Bonnet syndrome. In his experience, about 10% of the hearing impaired have auditory hallucinations and about 10% of the visually impaired have visual ones. Brain scans done while people are having hallucinations show that visual hallucinations activate the fusiform gyrus; cartoon-like hallucinations activate different cells, and car hallucinations different ones again. We can't think of them as dreams. This is common: there must be hundreds of thousands of blind people who have these hallucinations but are too scared to mention them, yet they are valuable to understanding how the brain works. Bonnet wanted to understand the theater of the mind created by the infrastructure of the brain. Sacks himself is partially blind and sees the geometric hallucinations, but for him they stop there.
The AlloSphere at UC Santa Barbara, for research and for artistic and scientific installations. She's a composer who works with visual artists to map complex mathematical patterns visually and sonically. Shows a demo of flying into research projects inside the AlloSphere: flying through the cortex of a brain built from real MRI data, with blood density heard sonically; another of bacteria; another of lattices of atoms, for looking at hydrogen bonds; another of electron flow based on the Schrödinger equation. Art, science, and engineering combined into one instrument.
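The core idea of "blood density is heard sonically" is data sonification: mapping a stream of data values onto pitch. A minimal sketch of that mapping (entirely hypothetical; this is not the AlloSphere's actual synthesis engine, and the value-to-pitch scaling is an assumption):

```python
import numpy as np

SR = 8000  # sample rate in Hz (arbitrary choice for the sketch)

def sonify(values, f_lo=220.0, f_hi=880.0, dur=0.1):
    """Linearly map each data value (e.g. blood density along a
    fly-through path) to a frequency in [f_lo, f_hi], then render
    each as a short sine tone and concatenate into one waveform."""
    v = np.asarray(values, dtype=float)
    span = v.max() - v.min()
    norm = (v - v.min()) / span if span else np.zeros_like(v)
    freqs = f_lo + norm * (f_hi - f_lo)
    t = np.arange(int(SR * dur)) / SR
    return np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])

# Denser tissue maps to a higher pitch in this toy mapping.
wave = sonify([0.1, 0.5, 0.9, 0.3])
```

Each input value becomes one 0.1-second tone, so four values yield a 0.4-second waveform ready to hand to any audio output library.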
3-4. Dale Chihuly and Olafur Eliasson, sculptors
FYI: I'm not writing up the art and music performances, just the sessions that are more talk-like and more techie. Which kind of ruins the effect, but that's OK.
5. Ed Ulbrich, visual storyteller
A breakthrough in computer-generated animation for The Curious Case of Benjamin Button. For nearly the first hour of the film, Brad Pitt's character is completely computer-generated from the neck up; no makeup. The movie had floated around Hollywood for decades. Ulbrich first got involved when Ron Howard was directing, but it was too hard at the time; the human head was the holy grail of the visual-effects industry. A decade later, David Fincher pushed it through. He wanted the main character to be played from cradle to grave by one actor. Very hard: prosthetic makeup wouldn't hold up, particularly in close-up; the character needed to be appealing; and Brad Pitt's face is so well known. They attached a computer-generated head to the body of another actor. Ulbrich threw up when he got the phone call saying the project was a go.
Motion capture. The available techniques, marker-based capture with infrared sensors and facial-marker tracking, were pretty crappy. They needed to know what was going on between the markers: skin, creases, dimples. So they walked away from the strategies of the day, out of their comfort zone, and looked to other fields (medical imaging, video games) for technology to reappropriate. Surface capture: Pitt in phosphorescent makeup in front of an array of cameras, yielding millions of polygons. They made a 3-D database of everything Brad Pitt's face is capable of doing. But that was his face at age 44, and the character needed to be in his 80s; shows a bust of Brad/Ben at 87. So they retargeted the facial expressions onto the older face. Shows the body actors in New Orleans with blue heads. Then the footage actually needed to be acted: Pitt watched the on-screen footage six months later and improvised to it, captured with four HD cameras. They could choose positions on the face to pull the expressions from, e.g. when a particular eyebrow is raised. Fast-forwarding through the lighting system: global illumination, creating a lighting environment to match whatever happened in the real world. They also had to create an eye system and an articulating tongue; one person was devoted to the tongue for nine months. The skin had to be good enough for Benjamin to fit in among real old people at the old folks' home.
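The retargeting step, driving an older face with expressions captured from a younger one, can be sketched with blendshapes: a mesh is a neutral shape plus a weighted sum of expression deltas, and retargeting means applying the weights solved from one face's performance to deltas sculpted for another. This is a standard technique and only a toy sketch; all names and shapes here are hypothetical, not Digital Domain's actual pipeline:

```python
import numpy as np

def blend(neutral, deltas, weights):
    """neutral: (V, 3) mesh vertices; deltas: (E, V, 3) per-expression
    vertex offsets; weights: (E,) activation of each expression.
    Returns the deformed (V, 3) mesh."""
    return neutral + np.tensordot(weights, deltas, axes=1)

# Toy data: a 4-vertex mesh with 2 expressions (say, brow-raise and smile).
rng = np.random.default_rng(0)
young_neutral = rng.normal(size=(4, 3))
old_neutral   = rng.normal(size=(4, 3))
young_deltas  = rng.normal(size=(2, 4, 3))
old_deltas    = rng.normal(size=(2, 4, 3))  # same expressions, older sculpt

# Weights solved from the captured performance (hypothetical values).
captured = np.array([0.8, 0.2])
young_face = blend(young_neutral, young_deltas, captured)
old_face   = blend(old_neutral,   old_deltas,   captured)  # retargeted
```

The key property is that the performance lives entirely in the weight vector, so the same captured weights can drive any face rig that defines the same set of expressions.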
Effectively we created a digital puppet that Brad Pitt could operate using his own face.
One other thing they created was a digital Botox. It took humans to differentiate between an ironic smile, a happy smile, and so on: "emotion capture." The system took 155 people over two years to create. If another actor were to enter the system, it wouldn't have his tics and idiosyncrasies; this one is shaped to be Brad.
6. Renny Gleeson, TEDster
A culture of availability driven by mobile-device proliferation. Furtive mobile-device use: "the lean," quickly checking the device while the other person looks away; "the stretch," looking at the phone held at the end of an outstretched arm; recumbent motorcycle text-messaging. We've developed a fundamental need to create shared narratives. What you're literally saying is: what's happening here, now, isn't as important to me as what could be happening anywhere else. The moment doesn't exist unless it's documented. I share, therefore I am. What we push out becomes who we are. We aren't merely projecting identity, we're creating it. Let's make technologies that make people more human, not less.
7. Ray Zahab, crazy outdoorsman
A month ago today he was at the geographic South Pole, having just broken the world speed record to the South Pole by shaving five days off. Hercules Inlet to the South Pole, entirely on foot. 40 below every single day, massive headwinds, crevasses in the snow, uphill the entire way. He had just gotten back from 111 days in the desert in Africa. We are capable of doing anything we set our minds to, but we need to be doing it for a reason. So they built a live website, updated every day, talking about the ozone-depletion effects they were witnessing. Young people sent in questions and were inspired. He has only been running for five years; a year before that he was a pack-a-day smoker. "I'm learning this at 40; imagine being 13 years old, hearing those words and believing them."
8. Golan Levin, interactive artist
Art and technology are still pretty far apart; he's trying to bring them together. Phonesthesia we all have, whereas synesthesia is uncommon. Cognitive psychologists have sussed out the shapes of phonemes: sounds and shapes. A really neat installation where a person stands in front of a microphone with a projector behind, and the shapes and letters of their sounds are animated. This took only one or two people a few months (not Benjamin Button scale), the payoff of being a hybrid person. Live subtitles for a nonsense poem. "Typing" with the eyes, using a camera that captures a short clip of your eyes every time you blink. What if art looked back at you? The Opto-Isolator robot blinks in response to you, and looks away if you look at it too long because it gets shy. A new one: an eight-foot snout with a googly eye, with an eight-foot robot arm inside, a robot that seems continually surprised to see you. The computer targets whichever person is moving around the most. The aim is to create a novel body language that still communicates something to the person who sees it: that it's interested and surprised to see you.
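The "target whoever is moving around the most" behavior can be sketched with simple frame differencing: subtract consecutive grayscale frames, sum the motion energy per region of the image, and aim the eye at the busiest region. A hypothetical simplification under assumed inputs, not Levin's actual code:

```python
import numpy as np

def busiest_region(prev, curr, grid=(3, 3)):
    """prev, curr: (H, W) grayscale frames. Divides the image into a
    grid, sums per-cell motion energy (absolute pixel difference), and
    returns the (row, col) of the cell with the most motion."""
    motion = np.abs(curr.astype(float) - prev.astype(float))
    h, w = motion.shape
    gh, gw = grid
    # Trim so the frame divides evenly into grid cells, then pool.
    cells = motion[: h - h % gh, : w - w % gw]
    energy = cells.reshape(gh, h // gh, gw, w // gw).sum(axis=(1, 3))
    r, c = np.unravel_index(np.argmax(energy), energy.shape)
    return int(r), int(c)

# Toy frames: all the motion sits in the top-right cell of a 3x3 grid.
prev = np.zeros((90, 90))
curr = np.zeros((90, 90))
curr[0:30, 60:90] = 255.0
target = busiest_region(prev, curr)
```

Here the top-right cell is (0, 2); a real installation would then map that grid cell to pan/tilt angles for the robot arm.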