
Iteration 2

My main focus for Iteration 2 was haptic rendering for the right Haply (volume) and building the overall experience with both Haplys. Based on Iteration 1, our team decided to continue working on each Haply separately, with each member developing features specific to a single Haply. My role was similar to Iteration 1: I worked on controlling volume with the right Haply. The second team member continued working on the pitch selection interface for the left Haply, and we decided to explore a circular interface as opposed to the original piano interface. The third member also worked on volume with the right Haply, but using a different haptic approach from mine. Since the previous iteration set up volume control for sound, we would now explore the haptic experience of this control: I explored rendering the plucking sensation of a string to control volume, while the third member explored a bowing sensation. As in Iteration 1, I took on the role of combining our individual parts into one (combining both Haplys).

My motivation for this iteration was to 1) create an intuitive, haptic method to control volume and 2) set up our system so we could work separately and later combine our parts easily. My approach for 1) was to set up a physical string apparatus as a reference and recreate its plucking feel on the Haply; this approach was inspired by Labs 1 and 2. For 2), my approach was to restructure the code to be modular and loosely coupled. I used an object-oriented strategy with a Haply class responsible for communicating with the Haply. The user passes a Model instance to this class's constructor, and each teammate did their work in their own child class inheriting from Model. That way, it was easy to swap and add different models to the Haply. I set up the code so it would be easy to test with either one or both Haplys. The outcome of this structure was faster development with less debugging time; it was easy to merge our code together in the end and test everything with minimal changes.
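A minimal sketch of this structure is below. Only the class names Haply and Model come from the description above; the method names, the SpringModel example, and all constants are illustrative assumptions, not the actual project code.

```java
// Base class each teammate extends with their own haptic model.
abstract class Model {
    // Given the end-effector position, return the force to render (fx, fy).
    abstract double[] computeForce(double x, double y);
}

// Example child model: a simple spring pulling back toward the origin.
class SpringModel extends Model {
    private final double k; // spring stiffness, illustrative value

    SpringModel(double k) { this.k = k; }

    @Override
    double[] computeForce(double x, double y) {
        return new double[] { -k * x, -k * y };
    }
}

// Wraps communication with one Haply; the model is injected through the
// constructor, so models can be swapped without touching device code.
class Haply {
    private final Model model;

    Haply(Model model) { this.model = model; }

    double[] update(double x, double y) {
        // In the real sketch this would read the device position and
        // write torques; here we only delegate to the model.
        return model.computeForce(x, y);
    }
}

public class Demo {
    public static void main(String[] args) {
        Haply right = new Haply(new SpringModel(200.0));
        double[] f = right.update(0.01, -0.02); // 1 cm right, 2 cm down
        System.out.println(f[0] + " " + f[1]); // force opposes displacement
    }
}
```

Because each teammate's model only implements computeForce, merging meant instantiating two Haply objects with different Model children.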

The commented code on the left shows the Pure Data file from the previous iteration, which I simplified and commented. We decided to focus more on the Haply side of development, so I kept the sound rendering minimal. We will develop this sound file in the next iteration by adding timbre.

Link to the code

Haptic Rendering: Pulling

The left video shows haptic force with volume control for the right Haply. As the user pulls downwards, the volume increases and the Haply provides resistance upwards. This produces a pulling sensation; I associated it with the metaphor of opening a valve under tension to release air. I based this force on the spring equation, with the volume as a function of the displacement from the resting position (zero force).
The volume control was originally on the x axis, but I switched it to the y axis because I found pulling downwards to feel more natural than moving left and right. I discovered this when combining the Haplys and evaluating the playing experience holistically. The left video shows both Haplys being used: the left with pitch selection on a piano, and the right with my code controlling volume.
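The spring-based pulling model above can be sketched as follows. The stiffness, the resting position, and the travel distance that maps to full volume are assumed values for illustration; only the Hooke's-law force and the displacement-to-volume mapping come from the description.

```java
public class PullVolume {
    static final double K = 150.0;       // spring stiffness, assumed value
    static final double REST_Y = 0.0;    // resting position (zero force)
    static final double MAX_DISP = 0.05; // travel mapping to full volume (m), assumed

    // Upward resisting force via the spring equation: F = -k * displacement.
    static double force(double y) {
        return -K * (y - REST_Y);
    }

    // Volume grows with downward displacement, clamped to [0, 1].
    static double volume(double y) {
        double disp = Math.max(0.0, -(y - REST_Y)); // downward pull only
        return Math.min(1.0, disp / MAX_DISP);
    }

    public static void main(String[] args) {
        // Pulling 2.5 cm down: half volume, upward resistance of k * 0.025.
        System.out.println(volume(-0.025));
        System.out.println(force(-0.025));
    }
}
```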

Haptic Rendering: String

The left video shows the right Haply producing the feeling of plucking a string, through visuals and haptics. I explored having the string horizontal instead of vertical, as in the previous example, but felt vertical to be more natural. I learned this layout depends on the type of haptic experience: a string might feel most natural vertically, while a pulling motion could be horizontal. When the user pushes against the string in close proximity, I modelled the force response like a spring, but past a threshold distance from the string's center (the "breaking point") I removed the force suddenly, creating a plucking effect. I varied this threshold distance as a function of velocity: the faster (or harder) the user plucked, the smaller the distance. I felt this corresponded to how real plucking varies with the speed or force (I used velocity as a proxy for force) with which the user plucks the string.
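The breaking-point behaviour above can be sketched as below. The stiffness, the base threshold, and the linear velocity-to-threshold mapping are all illustrative assumptions; the description only states that the force is spring-like up to a threshold that shrinks with velocity, then drops to zero.

```java
public class PluckModel {
    static final double K = 300.0;         // spring stiffness near the string, assumed
    static final double BASE_BREAK = 0.02; // breaking distance at zero velocity (m), assumed
    static final double VEL_SCALE = 0.01;  // how much speed shrinks the threshold, assumed
    static final double MIN_BREAK = 0.005; // floor so the threshold never vanishes, assumed

    // Breaking point shrinks as the user moves faster
    // (velocity used as a proxy for plucking force).
    static double breakDistance(double speed) {
        return Math.max(MIN_BREAK, BASE_BREAK - VEL_SCALE * speed);
    }

    // Spring-like resistance up to the breaking point, then zero force:
    // the sudden release is what is felt as a pluck.
    static double force(double distFromString, double speed) {
        if (Math.abs(distFromString) >= breakDistance(speed)) return 0.0;
        return -K * distFromString;
    }

    public static void main(String[] args) {
        System.out.println(force(0.01, 0.0)); // inside threshold: resists
        System.out.println(force(0.03, 0.0)); // past threshold: released (0.0)
    }
}
```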

I found one very important aspect of making the string pluck feel realistic was adding a subtle vibration force (a sine function of time) at the moment after the pluck, which decayed to zero in less than a second. This was a very small, almost imperceptible addition that I felt made a huge difference in realism, both haptically (feeling a "buzz") and visually (seeing the string oscillate).
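This decaying vibration can be sketched as a sine wave with an exponentially shrinking envelope. The frequency, amplitude, and decay rate below are assumed values chosen so the force dies out well within a second, as described; the exact parameters and decay shape in the project may differ.

```java
public class PluckVibration {
    static final double AMP = 0.5;    // initial vibration amplitude, assumed
    static final double FREQ = 110.0; // vibration frequency (Hz), assumed
    static final double DECAY = 8.0;  // decay rate (1/s): near zero well under 1 s

    // t = seconds since the pluck; returns the small added "buzz" force.
    static double vibrationForce(double t) {
        return AMP * Math.exp(-DECAY * t) * Math.sin(2 * Math.PI * FREQ * t);
    }

    public static void main(String[] args) {
        System.out.println(vibrationForce(0.0)); // 0.0 at the instant of release
        System.out.println(Math.abs(vibrationForce(1.0)) < 1e-3); // decayed away
    }
}
```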
In the left video, I combine both Haplys: the left Haply again handles pitch selection, in a circular interface created by another teammate. The right uses the same plucking code from above, but without the visuals. I also connected the string plucking and its resulting vibration to the volume of the string, which produced a sound like a piano key. I found this experience to be enjoyable and easier to play than our previous iteration in terms of haptics and sound. The sound felt less artificial (less like a pure sine wave) because of the subtle vibrato of the string.


From this iteration, I learned a lot about haptic rendering in terms of experience, and about how to structure our code for collaboration. Adding a small vibration after plucking greatly improved the realism of the overall haptic experience. I learned that the key to haptic rendering is subtlety: users are sensitive to small parameter changes in force rendering. When recreating a haptic experience, I should focus on the details and go beyond its obvious aspects. I draw a similar comparison to creating visuals or sound: what someone consciously notices is only a fraction of what is actually happening. The subtle additions the user may not be directly aware of inform their experience nonetheless. For example, in a song, the listener may not be aware of the bass or background ambient notes, but these are essential to the feeling of the song, and they would notice right away if you took them away (they notice their absence but not their presence).

For code collaboration, I learned that restructuring the code to be modular, with fewer parts relying on each other, helps down the road when merging others' code. I think we made a good choice to assign each teammate features specific to one Haply and then combine them in the end; there would otherwise have been too much overlap, conflict, and duplicated effort across our code.

Next Iteration

My goal for the next iteration is to work on the sound aspect and merge my string haptics with the other teammate's bow haptics. For sound, I plan to incorporate timbre and explore other ways of relating the Haply to sound. Our idea for the right Haply is to have a single string, with the plucking sensation on the top half and the bowing sensation on the bottom. These would correspond to discrete and continuous volume control respectively. I will also focus more on the holistic playing experience: how the device looks, feels, and sounds with both Haplys as one unified instrument.

© Derrek Chow 2022