Tactus Update: Force Feedback — March 29, 2018

(This is a small piece of a larger project I have been working on. Read the About Tactus page for an overview and stay tuned for more updates!)

As the main focus of most haptics companies, force feedback is essential to our sense of touch in the virtual world. It is what gives virtual objects physicality and presence. By pulling back against a user’s fingers, a haptic glove can simulate the shape and function of objects in the hand. Many haptic gloves do this with pneumatics or servo motors. Pneumatics are incredibly strong and accurate, but a pneumatic system does not fit on your hand: it means a large box next to you containing an air compressor and a valve array. Servo motors are easy to use and can track position well, but their force feedback capabilities are limited. In addition, the bulk of a servo-based system means that these gloves are usually restricted to one degree of freedom per finger.

At Tactus, we are developing tiny, force feedback modules called Tensors.

These are similar to servos but are specifically designed and programmed for use with haptics. This means that their form factor, actuator motion type, speed, and controls are all completely different and optimized for haptics.

Tensors are small. The smallest common servos have about an 11 cm³ volumetric footprint while Tensors are half that size at 5.5 cm³. This small size allows Tensors to be used in almost any imaginable position on the glove, whether that is on the back of the hand or even mounted directly to a finger. In addition, Tensors were built with people in mind. While the larger, more utilitarian, blocky form of a servo matches robots well, humans are sleek and made up of organic shapes. Tensors are thin and almost finger-like, matching the design of the human hand. Because of this, they integrate naturally with our bodies and allow for a much higher actuator density, much like our muscles.

Tensors are linear actuators, just like our muscles. While our joint movements are rotational, muscles are not. In order to smoothly integrate technology with the human body, we have to match its movements. This means that actuator motion must be more natural, allowing actuators to work well with (or, in the case of haptics, against) the evolutionarily refined human body.

Tensors are fast. With a gear ratio and motor choice suited for high-frequency actuation, they are built to race right along with a user’s hands. Normal servos do not prioritize quickness and require bulky external gear chains to increase speed. For haptics, speed is key because latency and sluggish controls can ruin the magic that the technology is designed to create.

Finally, Tensors are smart. Really smart. Normal servos run a series of control algorithms to precisely change their position. Tensors have this positional control but go far beyond it. Through current modulation and current-draw measurement, along with positional and voltage control, they know what is happening at the tip of your finger. Carefully tuned algorithms give Tensors position, velocity, acceleration, and force control and measurement. This is where Tensors truly excel. With all of this control and data, they have complete command over touch sensation and can simulate almost any object.
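As a rough illustration of how force can be inferred and regulated through current modulation, here is a minimal sketch. The torque constant, gear ratio, and spool radius are placeholder values I've assumed for illustration, not Tensor specifications:

```python
# Illustrative only: constants below are assumed placeholders, not Tensor specs.
KT = 0.0042            # motor torque constant, N*m per amp (assumed)
GEAR_RATIO = 50        # gearbox reduction (assumed)
SPOOL_RADIUS = 0.003   # cable spool radius in meters (assumed)

def cable_force(current_a):
    """Estimate cable tension (N) from measured motor current draw."""
    output_torque = KT * current_a * GEAR_RATIO
    return output_torque / SPOOL_RADIUS

def next_current(target_force_n, measured_current_a, gain=0.5):
    """One proportional control step toward a target fingertip force."""
    error = target_force_n - cable_force(measured_current_a)
    # convert the force error back into a current adjustment
    return measured_current_a + gain * error * SPOOL_RADIUS / (KT * GEAR_RATIO)
```

A real controller would run a loop like this at high frequency alongside position and velocity estimation, but the core idea is the same: measured current stands in for measured force.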

Through an innovative form factor, specialized linear actuation, fast response times and intelligent control algorithms, Tensors have the ability to change haptic interaction.

Tensor top view: full mechanism with actuation cable extending from the bottom
Tensor with clear casing


The real magic comes when two Tensors work together.

Due to space constraints, tetherless haptic technologies have mostly been limited to a single degree of freedom per finger. This is analogous to having one muscle to control each finger. As anyone who uses sign language could tell you, this couldn’t be further from the truth. The human hand has 14 joints and 34 muscles to control them with. Because of this, current haptic technologies can only access a fraction of the interactions that your hands are capable of.

Most haptic gloves focus on flexion, the most common and useful motion we do with our hands. They do this by running a single cable down the back of the finger, pulling back to simulate force on the fingertip. However, flexion of each finger is controlled not by one, but by two muscles. One flexes the lowest joint on your finger (the MCP), essentially bending from the palm. The other muscle flexes the other two joints (PIP and DIP). Tensors are small enough that we can place one on top of your finger, right above the MCP. With another mounted directly behind it on the back of your palm, the Tensors mirror their respective muscles. By matching the dexterity of your hand, Tactus can simulate the exact force of any flexion-based interaction. This is a process called force-vectoring: controlling not just the intensity of forces (as most haptic gloves do), but their exact direction as well. 
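The force-vectoring idea above can be sketched as a small computation: given a desired fingertip force vector and the pull directions of the two cables, solve for the two cable tensions. The geometry and names here are illustrative assumptions, not measured glove parameters:

```python
# Illustrative force-vectoring sketch: split a desired 2-D fingertip force
# between two cables with known pull directions.

def solve_tensions(fx, fy, d1, d2):
    """Solve t1*d1 + t2*d2 = (fx, fy) for cable tensions t1, t2.

    d1 and d2 are (x, y) unit vectors along each cable's pull direction.
    """
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < 1e-9:
        raise ValueError("cable directions are parallel; force is ambiguous")
    t1 = (fx * d2[1] - fy * d2[0]) / det
    t2 = (d1[0] * fy - d1[1] * fx) / det
    return t1, t2
```

With one Tensor's cable pulling along each direction, commanding tensions `t1` and `t2` reproduces both the magnitude and the direction of the target force, which is exactly what a single-cable glove cannot do.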


With complete positional and force control over the entirety of each finger, Tensors will be able to create more realistic force feedback than any other system on the market.

CS 3043: Social Implications of Technology — March 12, 2018

As a final project for my Social Implications of Technology course, I took a look at the UN’s Human Development Index dataset. I wanted to look at the relationship that education level has with other aspects of life. With 188 countries in this dataset and over 150 dimensions, there are almost 30,000 data points. I cut this down to 54 dimensions to focus on. This included data on expected years of schooling, mean years of schooling, and secondary education levels in every country.

To start off, I built a correlation matrix to relate these dimensions.
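A correlation matrix like this takes only a few lines to build. Here is a minimal pure-Python sketch; the sample numbers are made up and stand in for two of the HDI columns:

```python
# Minimal correlation-matrix sketch. Values below are illustrative only,
# not actual HDI data.
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

dimensions = {
    "mean_years_of_schooling": [12.1, 9.8, 13.2, 6.4],
    "median_age":              [40.2, 30.1, 42.0, 19.5],
}

# One correlation per pair of dimensions; the diagonal is always 1.
matrix = {
    (a, b): pearson(va, vb)
    for a, va in dimensions.items()
    for b, vb in dimensions.items()
}
```

For 54 dimensions this produces a 54x54 grid, which is what the heatmap below visualizes.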


Sample of the correlation matrix.


Then, I looked more closely at how strongly each of these dimensions correlated with education. For example, years of education correlates strongly with median age, suggesting that more educated populations tend to live longer and vice versa.

Going from here, I took a look at the graphs of each of these correlations.

From these, we can visualize how tight each correlation is.

The final thing I looked at was the growth rate of education systems across the world. I calculated an index for how much each country’s education system grew every year. This dataset can’t really capture the quality of education, so the index only looks at quantity. I found that developing countries have the fastest growth rates and that richer countries tend to level off in their educational growth fairly quickly. This is probably because wealthier countries spend more on the quality of education rather than its quantity.
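The exact index isn't specified in the post, but one plausible formulation is the average year-over-year relative growth of a schooling statistic, sketched here with made-up numbers:

```python
# Hypothetical version of the education-growth index: average year-over-year
# relative growth in, e.g., mean years of schooling. The project's actual
# index formula is not specified, so this is one reasonable guess.

def growth_index(yearly_values):
    """Mean relative year-over-year growth across a time series."""
    rates = [
        (curr - prev) / prev
        for prev, curr in zip(yearly_values, yearly_values[1:])
    ]
    return sum(rates) / len(rates)
```

A developing country whose schooling statistic climbs steadily would score high on an index like this, while a wealthy country whose statistic has plateaued would score near zero.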

Green = Higher Education Growth; Red = Lower Education Growth

Wealthier countries focus much less on education growth and more on education quality.



This all concluded in a box plot showing the relationship between education levels and each dimension in the dataset.




Green indicates a positive correlation with years of education; red indicates a negative correlation. Size indicates the strength of the correlation.


This was presented as an interactive Tableau Story that can be found here:

Social Effects of Education


More data visualizations with Tableau coming soon!

MU 3620 Final Project — October 16, 2017

For the final project in Electronic Music Composition, we were each assigned to compose and produce a two-to-five-minute piece using one or more of the methods discussed in class: randomness, sonification, remixing, multimedia, or interactivity.

I chose to write a remix of Genghis Khan by Miike Snow. I wanted to give the song a bit more kick, and switch it up from an indie pop to an EDM format. I made a lot of changes, but I mostly focused on rewriting both the song and chord structures and cutting through the vocals to make the song more accessible as a dance track.


Even though this class is over, I hope to continue making music on a regular basis. I’ll post big projects here but follow my SoundCloud for more regular updates.

Week 3 and 4: Remixing and Scoring — September 28, 2017

For the last two weeks, we have been looking at writing music based on others’ creative work. The first week covered re-arrangement and remixing. By taking someone else’s music and playing it with a different set of instruments, or by cutting it up and piecing it back together, we can create new and different compositions. For that assignment, I remixed Halsey’s song Colors. I focused on the synth and bass and away from her voice to give the song a fresh interpretation.


For this past week, we have been looking at writing music for video games and film. The important part of this type of composition is that your music does not hijack the scene. This means creating music that fits the mood, the tempo, and sometimes even fits the cuts of the imagery. Everything must mesh so that the music adds to the imagery without taking too much of the viewer’s attention. For this project, I wrote a score for the crash scene from the movie Sully.

EMC Week 2: Mapping — September 15, 2017

Our second assignment of Electronic Music Composition was to write a song using mapping or sonification. Basically, take data and turn it into music in some way. We could use live data (game controllers, mouse position, etc) or static data (climate stats, population stats, sales data, etc).

I used sorting algorithms. Some of you may remember the sorting algorithm visualizations that I did last year. These showed visually how computers sort through scrambled lists. In order to show this musically, I converted every number to a note and had my algorithms sort them into a scale.

The result of this algorithm is played by the guitar. It starts with a scrambled scale and plays over each step of the sorting, eventually reaching a straight scale. Layered with strings and drums, this produces a unique, poppy song.
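The mapping described above can be sketched in a few lines (assuming bubble sort and a C-major scale; the post doesn't say which algorithm or key was actually used). Each comparison step "plays" the note currently under inspection:

```python
# Sorting-to-music sketch: bubble-sort a scrambled scale and record the
# sequence of notes that would be played at each comparison step.
import random

SCALE = ["C", "D", "E", "F", "G", "A", "B", "C'"]  # scale degrees 0-7

def sonify_bubble_sort(degrees):
    """Bubble-sort a list of scale degrees, logging each note inspected."""
    played = []
    seq = list(degrees)
    for i in range(len(seq)):
        for j in range(len(seq) - i - 1):
            played.append(SCALE[seq[j]])      # "play" the note being compared
            if seq[j] > seq[j + 1]:
                seq[j], seq[j + 1] = seq[j + 1], seq[j]
    return seq, played

scrambled = list(range(len(SCALE)))
random.shuffle(scrambled)
final, notes = sonify_bubble_sort(scrambled)  # final is the ascending scale
```

The `notes` list is the melody: chaotic at first, then settling into runs up the scale as the list approaches sorted order.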

EMC Week 1: Randomness — September 13, 2017

Our first assignment for my Electronic Music Composition class was to write a piece using randomness in any way we see fit. I decided to use white noise (sounds like radio static) as my random source.

Inspired by the band “Public Service Broadcasting,” I used Apollo Flight Control audio from NASA as the basis for my song. Then, I created a drum kit from white noise samples to make the drums fit into the general theme of the song.

The resulting song is a fairly cinematic, ambient piece.

New Music Page — September 12, 2017

Check out the new tab in the top menu: Music!

I have been writing a lot more music since I started college (love it!) and I will be posting most of it here. At the moment it is all instrumental, and many pieces have some sort of story behind them. I am currently taking an Electronic Music Composition class and write at least one song each week.

Songs will start going up in the next few days but you can also follow me on SoundCloud!

Haptic Glove Update — March 21, 2017

The haptic glove project is progressing nicely!

At the beginning of the year, we decided to branch out a bit from our previous design to explore other solutions. However, we have returned to a largely similar design with some key modifications. The new glove is almost entirely made of 3D-printed parts, uses new communication protocols and tracking algorithms, and now integrates with Unity!

We have already printed most of the glove and have been working on mounting components. The next steps are finishing up the finger exoskeleton design and integrating it with the rest of the glove. The exoskeleton should be one of the more interesting improvements in this prototype. Instead of extending the Exo along the sides of the fingers to create joints, it now runs exclusively across the top. The new mechanism creates what we call “virtual joints” outside of its physical structure, which allow it to move freely with the finger while maintaining its internal structure. This allows us to create pressure just at the fingertip without using the user’s own fingers for leverage.

On the software side, we are currently focusing on smoothing out the motion of the servos and improving the reliability of the force measurements. Instead of modifying servos to read their position, we are using external force sensors. While it adds a bit of complexity, it greatly reduces the twitchiness of the glove. This also means our communication system is a lot cleaner. Now, instead of sending full coordinates and force data back and forth, we just send position values to Unity and Unity responds with a target force. Unity builds a model finger from the glove’s positional tracking and calculates the required force on each joint that will result in a realistic experience. This system cuts back on the amount of data that is sent back and forth and improves the speed of the overall simulation.
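Sketched in code, that exchange might look like the following. JSON messages and these particular field names are my assumptions for illustration; the post doesn't specify the actual wire format:

```python
# Hypothetical sketch of the position-out / force-back exchange described
# above. Message format and field names are assumed, not the project's own.
import json

def encode_positions(joint_angles):
    """Glove -> Unity: raw position values for each tracked joint."""
    return json.dumps({"type": "positions", "angles": joint_angles})

def decode_target_forces(message):
    """Unity -> glove: target force per joint, computed from the model hand."""
    payload = json.loads(message)
    assert payload["type"] == "forces"
    return payload["forces"]
```

Keeping the traffic down to raw positions one way and target forces the other, rather than full coordinate and force state in both directions, is what trims the per-frame data volume.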

We are finishing up integrating all of these parts and hope to have a full demo online by mid-April.

New Links Page! — March 12, 2017

Check out my new Links page at the top of the site!

I am working on a few web projects and decided to have a common place to link to them. I will add to the list as I produce more.

You’ll notice that the first link is a web version of the Letter By Letter Visualization that I discussed in an earlier post. This is the first of a series of text visualizations that I am working on putting online. I am planning on setting up a site to sell custom art pieces generated from customer-submitted text once I have a few viz’s together.

Let me know if there are any specific visualizations I have done that you’d like to be able to experiment with online!

BEAM Programming Camp — September 26, 2016

Earlier this summer, my friend Eli and I planned and taught a 1.5-week programming camp for a group of middle and high schoolers. Some of them had no programming experience, some had dabbled in code, and some had taken full computer science classes. We created a camp that tried to cater to everyone and teach them all something new.

We taught a combination of software and hardware development. Starting off with Processing, the campers created a basic runner game with obstacles that the player had to jump over. They learned all of the basics of Object Oriented Programming and simple Java syntax.

Building off of this, we moved into hardware development with the Arduino. Programming in Processing and Arduino is very similar, and it actually isn’t hard to use them together. This allowed us, after a quick intro and a basic Arduino lesson, to have the campers build game controllers with the Arduino for their Processing games.

The students created a runner game, a brick-breaker game, and a projectile-dodging game. After building the Arduino controllers, we moved them on to using third-party libraries to connect their Xbox controllers to their games.

Finally, the students focused on finishing one game of their choice. Some made high-score systems, some themed their games (a Breaking Bad-themed brick breaker), and some added bullets to the dodging game. A few students even made their games 3D.