Jan 2018 - May 2018 | UX Engineer, Entertainment Technology Center, Carnegie Mellon University
Using the pipeline SDK our client was developing, the armband device learns from and predicts the user's continuous gestures, and those predictions drive the sound filters. I designed the whole exhibit, implemented the sound filters, mapped the gesture data to the filter parameters, and visualized the sound to create an immersive experience.
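The gesture-to-filter mapping can be sketched in miniature. This is a hypothetical illustration, not the actual exhibit code: it assumes the SDK emits a normalized gesture prediction in [0, 1], maps it logarithmically to a cutoff frequency, and applies a simple one-pole low-pass filter (the function names and parameter ranges are my own placeholders).

```python
import math

def gesture_to_cutoff(gesture, f_min=200.0, f_max=8000.0):
    """Map a normalized gesture prediction in [0, 1] to a cutoff
    frequency in Hz on a logarithmic scale (placeholder mapping)."""
    g = min(max(gesture, 0.0), 1.0)  # clamp noisy predictions
    return f_min * (f_max / f_min) ** g

def one_pole_lowpass(samples, cutoff, sample_rate=44100.0):
    """Apply a basic one-pole low-pass filter at `cutoff` Hz."""
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff / sample_rate)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)  # exponential smoothing toward the input
        out.append(y)
    return out

# A fully open gesture (1.0) maps to the top of the cutoff range.
cutoff = gesture_to_cutoff(1.0)
filtered = one_pole_lowpass([1.0] * 1000, cutoff)
```

In the real exhibit the mapping would be re-evaluated every audio block as new predictions arrive from the armband.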
From our experience, we believe the device has great potential in a variety of applications, including physical training therapy, on-the-go alternative controllers, VR/AR entertainment, live theater performance, controlling smart homes, and controlling robotics. However, several technical challenges remain in the current hardware and network communication that later SDK releases would need to resolve. It was therefore essential for us to communicate daily as a team to scope the design process, set appropriate weekly sprints to experiment and learn through trial and error, keep the client informed through weekly documentation and meetings, and manage our schedule to plan ahead for any unexpected problems that could arise.
In a way, we were our client's first external developers, responsible for providing quality-assurance feedback on the device and the SDK documentation. We are proud of what we achieved, and we hope other developers can learn from our experience to create better applications for this cutting-edge technology.