"Alan" Dingtian Zhang

alandtzhang@gmail.com




Current Projects



COSMOS: COmputational Skins for Multifunctional Objects and Systems

I am leading an interdisciplinary research team to design and develop ubiquitous computational skins that weave into everyday life. It is a collaborative effort across disciplines to fabricate flexible, non-silicon nanomaterial circuits into sensor networks that can collect, process, and communicate data using energy harvested from the environment. Applications include interactive wallpaper and Post-it notes, informative food packaging, and object localization.

Previous Projects (2013–2016)



Whoosh: Non-Voice Acoustics for Low-Cost, Hands-Free, and Rapid Input on Smartwatches (2016)

I worked in a team to develop Whoosh, a non-voice acoustic input technique (e.g., blowing, shooshing, and other dynamic events) for low-cost and rapid interaction on smartwatches. I designed and 3D-printed a passive watch case, inspired by traditional Asian flutes, to expand the input vocabulary (directional and bezel blows) for commodity smartwatches. [Paper]
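
At a high level, such non-voice events can be picked out of the microphone stream with simple frame-level acoustic features. Below is a minimal Python sketch that flags "blow" frames as loud and low-frequency-heavy; the thresholds, sampling rate, and feature choice are my illustrative assumptions, not Whoosh's actual recognition pipeline.

```python
# Toy blow-event detector (illustrative; thresholds, sampling rate, and
# features are assumptions, not Whoosh's actual recognition pipeline).
import numpy as np

FS = 16000     # microphone sampling rate (Hz), assumed
FRAME = 1024   # samples per analysis frame

def is_blow(frame: np.ndarray,
            energy_thresh: float = 0.01,
            centroid_thresh: float = 1500.0) -> bool:
    """Flag a frame as a blow: loud, with energy concentrated at low
    frequencies (turbulent airflow), unlike most speech or silence."""
    energy = float(np.mean(frame ** 2))
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / FS)
    centroid = float((freqs * spectrum).sum() / (spectrum.sum() + 1e-12))
    return energy > energy_thresh and centroid < centroid_thresh

# Example: a low-frequency noise burst (blow-like) vs. a quiet frame
rng = np.random.default_rng(0)
blow = np.convolve(rng.normal(size=FRAME), np.ones(32) / 32, mode="same")
print(is_blow(blow), is_blow(0.001 * rng.normal(size=FRAME)))
```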




Applying Design Studio Pedagogy in STEM Learning with Novel Presentation and Sensing Technologies (2015)

I worked in a team to develop projected augmented reality that brings design studio learning models into STEM classrooms, encouraging creativity and innovation and helping build strong peer learning environments. Students do classwork using an enhanced version of Pythy, a web IDE for Python and Jython, which captures students' work and displays it around the room. We leverage the Microsoft RoomAlive Toolkit to construct a room-scale augmented reality using pairs of projectors and depth cameras. The system "pins" students' work to the walls, where teachers and students can view, interact with, and discuss it. [Poster]



Proprioceptive Interface: Enabling Eye-Free Interaction for Textile Buttons (2015)

I have been investigating ways to facilitate eye-free interaction with textile buttons using techniques such as distinct vibration patterns, varied textures, and locking mechanisms. I am also experimenting with different types of textile buttons, including capacitive sensing, resistive sensing, and piezoelectric energy-harvesting buttons.



Cardboard Augmented Reality for Sweet Auburn (2015)

I led a team to develop a stereoscopic augmented reality view of Sweet Auburn, Atlanta, using Google Cardboard and Three.js, augmenting the tourist experience with immersive AR and location-based context awareness. It can be used, for example, by Atlanta Streetcar riders. [Demo]



IUI 2015 -- BeyondTouch: Extending the Input Language with Built-in Sensors on Commodity Smartphones (2015)

I worked in a team to develop BeyondTouch, which extends and enriches smartphone input with a wide variety of additional tapping and sliding gestures on the case of the phone and on the surface adjacent to it, using only the existing sensing capabilities of a commodity smartphone. It can be applied to a variety of application scenarios. [Video] [Paper]
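
To give a flavor of sensing off-screen input with built-in sensors, here is a minimal Python sketch that flags tap-like events as brief spikes in accelerometer magnitude. The sampling rate, threshold, and debounce window are illustrative assumptions; BeyondTouch's actual recognizers are described in the paper.

```python
# Toy off-screen tap detector using accelerometer magnitude spikes
# (sampling rate, threshold, and debounce are illustrative assumptions,
# not BeyondTouch's actual recognizers).
import numpy as np

FS = 100          # accelerometer sampling rate (Hz), assumed
THRESHOLD = 0.5   # deviation from 1 g that counts as a spike, assumed
REFRACTORY = 10   # samples to ignore after a detection (debounce)

def detect_taps(accel: np.ndarray) -> list[int]:
    """Return sample indices of tap-like spikes.

    accel: (n_samples, 3) array of x/y/z acceleration in g.
    """
    magnitude = np.linalg.norm(accel, axis=1)
    deviation = np.abs(magnitude - 1.0)   # subtract the gravity baseline
    taps, last = [], -REFRACTORY
    for i, d in enumerate(deviation):
        if d > THRESHOLD and i - last >= REFRACTORY:
            taps.append(i)
            last = i
    return taps

# Example: resting phone (0, 0, 1 g) with a sharp jolt at sample 150
accel = np.tile([0.0, 0.0, 1.0], (300, 1))
accel[150] = [0.3, 0.2, 2.0]
print(detect_taps(accel))   # -> [150]
```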



Master Project -- Quadcopter Navigation Using Google Glass and Brain-Computer Interface (2015)

I developed assistive technologies that let ALS patients explore their surroundings with wearable technology and a camera-mounted quadcopter. Google Glass is used to create telepresence by displaying the drone's first-person view and presenting visual stimuli for Steady-State Visually Evoked Potential (SSVEP). OpenBCI, a mobile brain-computer interface, acquires the user's electroencephalogram (EEG) for real-time analysis. The user's attention to different icons presented on Glass is thus used to navigate the quadcopter wirelessly. The project was built with Java, Android Studio, and Matlab. [Video]
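
The core of SSVEP detection is comparing EEG spectral power at each icon's flicker frequency. Below is a minimal sketch of that step, written in Python for brevity even though the project itself used Java and Matlab; the flicker frequencies, window length, and command mapping are illustrative assumptions.

```python
# Minimal SSVEP classification sketch (illustrative; frequencies,
# sampling rate, and command mapping are assumptions, not the
# project's actual parameters).
import numpy as np

FS = 250                                  # EEG sampling rate (Hz), assumed
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]      # flicker rate of each Glass icon, assumed
COMMANDS = ["forward", "left", "right", "land"]  # hypothetical mapping

def ssvep_command(eeg_window: np.ndarray) -> str:
    """Pick the command whose stimulus frequency dominates the spectrum.

    eeg_window: 1-D array of samples from an occipital EEG channel.
    """
    spectrum = np.abs(np.fft.rfft(eeg_window))
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / FS)
    powers = []
    for f in STIM_FREQS:
        # Sum power near the fundamental and its 2nd harmonic, since
        # SSVEP responses appear at the flicker frequency and harmonics.
        band = (np.abs(freqs - f) < 0.5) | (np.abs(freqs - 2 * f) < 0.5)
        powers.append(spectrum[band].sum())
    return COMMANDS[int(np.argmax(powers))]

# Example: a 2-second window dominated by a 10 Hz response -> "left"
t = np.arange(2 * FS) / FS
window = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)
print(ssvep_command(window))
```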



NASA Wearable Symposium 2014 -- Wearable Unobtrusive Noise Canceling Vest for ISS (2014)

I worked in a team to develop an unobtrusive wearable noise-canceling system for NASA astronauts onboard the International Space Station (ISS), where life support systems constantly generate very high levels of noise. Together we designed a vest with a 3D-printed adjustable collar that integrates circuit boards, speakers, microphones, and power supplies. I implemented anti-noise generation in Max/MSP, C, and Matlab, which reduced various kinds of noise by up to 10 dBA SPL. [Poster]
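
As an intuition for the anti-noise generation, here is a toy feedforward LMS canceller in Python. It is a simplification of what the project did in Max/MSP, C, and Matlab; a deployable ANC system would also have to compensate for the speaker-to-ear acoustic path (e.g., with FxLMS).

```python
# Toy feedforward LMS noise canceller (illustrative only; the real system
# ran in Max/MSP, C, and Matlab and would need an FxLMS-style correction
# for the speaker-to-ear acoustic path).
import numpy as np

def lms_anc(reference: np.ndarray, primary: np.ndarray,
            n_taps: int = 32, mu: float = 0.01) -> np.ndarray:
    """Adaptively estimate the noise in `primary` from `reference`.

    reference: noise picked up near the source (reference microphone)
    primary:   signal at the ear (error microphone)
    Returns the residual after subtracting the adaptive estimate,
    i.e., what the wearer would hear.
    """
    w = np.zeros(n_taps)        # adaptive FIR weights
    buf = np.zeros(n_taps)      # most recent reference samples
    residual = np.zeros(len(primary))
    for n in range(len(primary)):
        buf = np.roll(buf, 1)
        buf[0] = reference[n]
        y = w @ buf             # anti-noise estimate
        e = primary[n] - y      # error = what remains at the ear
        w += mu * e * buf       # LMS weight update
        residual[n] = e
    return residual

# Example: tonal machinery hum reaching the ear through a small delay
fs = 8000
t = np.arange(fs) / fs
noise = np.sin(2 * np.pi * 120 * t)    # 120 Hz hum at the reference mic
primary = 0.8 * np.roll(noise, 5)      # delayed, attenuated copy at the ear
out = lms_anc(noise, primary)
print(f"power before: {np.mean(primary**2):.4f}, after: {np.mean(out[2000:]**2):.6f}")
```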



Dressage Horse Pattern Recognition (2014)

To help dressage riders analyze and review their performance, I developed an approach combining wearable technology, data analytics and visualization, and pattern recognition. Data are collected from sensors instrumented on the rider and horse; signal processing and visualization techniques reveal insights into the sport; and a machine learning model classifies ten gait classes with an overall accuracy of 97.4%. Development was done with Java, Android Studio, and Bluetooth Low Energy. [Poster]
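
The pattern recognition stage can be thought of as windowed feature extraction followed by a standard classifier. The Python snippet below sketches that idea; the window length, features, and random-forest choice are illustrative assumptions, not the project's actual pipeline (which was implemented in Java).

```python
# Minimal gait-classification sketch (illustrative; window size, features,
# and classifier are assumptions, not the project's actual pipeline).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(accel: np.ndarray, win: int = 128) -> np.ndarray:
    """Slice a (n_samples, 3) accelerometer stream into fixed windows
    and compute simple per-axis statistics for each window."""
    n = (len(accel) // win) * win
    windows = accel[:n].reshape(-1, win, 3)
    feats = [windows.mean(axis=1), windows.std(axis=1),
             windows.min(axis=1), windows.max(axis=1)]
    return np.concatenate(feats, axis=1)   # shape: (n_windows, 12)

# Hypothetical training data: accelerometer stream + per-window gait labels
rng = np.random.default_rng(0)
accel = rng.normal(size=(128 * 200, 3))     # placeholder sensor stream
X = window_features(accel)
y = rng.integers(0, 10, size=len(X))        # 10 gait classes, dummy labels

clf = RandomForestClassifier(n_estimators=100).fit(X, y)
print(clf.predict(X[:5]))
```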



ICCC 2014 -- Building Artistic Computer Colleagues with an Enactive Model of Creativity (2014)

I worked on computational creativity, which investigates ways to make computers generate creative products or use technology to support and enhance human creativity. We developed an artificial intelligence program called Drawing Apprentice, which collaborates with human users as they draw on a digital canvas. The user and the program take turns drawing strokes; the program learns the style of the human user and adds its own creative contributions. [Paper]