My frequent collaborator Christine Sun Kim performs with various textual interfaces to create works about sound and deaf culture. In “a real line ran near an ear,” Christine led “a workshop to investigate the various states of breath, wind, sound, silence and communication as they exist and transform through physical interaction and installation,” and the performance invited collaborators and participants to communicate via text.
We share a fascination with art’s potential to unlearn, unearth, and untangle the complexities and power imposed on the common perception of normalcy.
“Kim uses a mélange of visual and sound art tools in her unorthodox, defiant art. Her aim, she has said, is to ‘perceive sound without considering social norms’—a practice that she calls ‘unlearning sound etiquette.’” – ARTnews
On Nov 4th, CK emailed me “do you know anyone who can make me a very quick app where i can use textedit while my camera is in use? my face would show up in the background with large text typing across my face.”
Over the weekend, I made an application for her performance Bounce House in Tokyo on Nov 15, 2015. Some awesome musicians contributed tracks! The concept of the piece is stated in the following:
“This is neither an opportunity for the hearing to virtually experience the perception of the deaf, nor scientific experimentation; this is a dance party for extracting as much as possible, playing back, listening to, and if possible, dancing to the sounds that have been considered valueless as “social currency” and excluded merely for their inefficiency in conveying information.”
Concept by Christine Sun Kim
Tracks contributed by 34423, Katsuhiro Chiba, Matt Karmil, kyoka, Ah-Reum Lee, nanonum, NOEL-KIT, Nyolfen, Phasma, Marina Rosenfeld, hiromi sunaga, James Talambas, and TOKiMONSTA + edIT
The application takes keyboard input and lets the user control text, much like the TextEdit app, over a real-time video feed. It’s a simple application, but it can be very effective for Christine, who communicates in American Sign Language. With the app, she can type text while signing with facial expressions and body movement at the same time. I hope it can be used by others in various situations, for assistive technologies and for more inclusive design applications.
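The core idea is straightforward: keystrokes accumulate in a text buffer that gets drawn as large text over the live camera feed. A rough sketch of that behavior (this is a hypothetical illustration, not the actual app’s code) might look like:

```javascript
// Sketch of TextEdit-like key handling: each keystroke updates a buffer
// that would be rendered as large text over the camera feed.
function applyKey(buffer, key) {
  // "Backspace" removes the last character, "Enter" adds a newline,
  // single-character keys are appended, and modifier keys are ignored.
  if (key === "Backspace") return buffer.slice(0, -1);
  if (key === "Enter") return buffer + "\n";
  if (key.length === 1) return buffer + key;
  return buffer;
}

// In the browser, this would be wired to the webcam and an overlay
// element, along these lines (untested sketch; element names assumed):
//
//   navigator.mediaDevices.getUserMedia({ video: true })
//     .then((stream) => { videoEl.srcObject = stream; });
//   let buffer = "";
//   document.addEventListener("keydown", (e) => {
//     buffer = applyKey(buffer, e.key);
//     overlayEl.textContent = buffer; // large text across the video
//   });
```

The pure `applyKey` function keeps the typing logic separate from the camera and rendering code, which makes cross-browser quirks easier to isolate.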
Check out christinesunkim.com/typer
Start typing away while looking at yourself! 🙂
Code still needs work, and might have bugs in different browsers.
Feel free to contribute to the app through the GitHub repository.
Special thanks to Kyle McDonald and Lauren McCarthy for technical advice and help.
More of our collaboration on Absence Presence.
Pics Christine sent during the setup!
Venue is Super Deluxe. (I misspelled and wrote Super Deleuze… haha…)