New Links Page! — March 12, 2017

Check out my new Links page at the top of the site!

I am working on a few web projects and decided to have a common place to link to them. I will add to the list as I produce more.

You’ll notice that the first link is a web version of the Letter By Letter Visualization that I discussed in an earlier post. It is the first of a series of text visualizations that I am working on putting online. Once I have a few of them together, I am planning on setting up a site to sell custom art pieces generated from customer-submitted text.

Let me know if there are any specific visualizations I have done that you’d like to be able to experiment with online!

BEAM Programming Camp — September 26, 2016

Earlier this summer, my friend Eli and I planned and taught a week-and-a-half programming camp for a group of middle and high schoolers. Some of them had no programming experience, some had dabbled in code, and some had taken full computer science classes. We designed the camp to cater to everyone and teach them all something new.

We taught a combination of software and hardware development. Starting off in Processing, the campers created a basic runner game with obstacles that the player had to jump over. Along the way they learned the basics of object-oriented programming and simple Java syntax.
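
The core of that runner game fits in surprisingly few lines. The campers worked in Processing, so the sketch below is only a rough Python translation of the idea; every class, name, and constant in it is invented for illustration, not the camp’s actual code:

```python
# Toy version of the camp's runner game: a player that can jump,
# and an obstacle that scrolls toward them. Units are arbitrary.

GROUND = 0
GRAVITY = -1.0
JUMP_SPEED = 10.0

class Player:
    def __init__(self):
        self.y = GROUND          # height above the ground
        self.vy = 0.0            # vertical velocity

    def jump(self):
        if self.y == GROUND:     # only jump while standing on the ground
            self.vy = JUMP_SPEED

    def update(self):
        self.vy += GRAVITY       # gravity pulls the player back down
        self.y = max(GROUND, self.y + self.vy)
        if self.y == GROUND:
            self.vy = 0.0

class Obstacle:
    def __init__(self, x, height=3):
        self.x = x               # horizontal distance from the player
        self.height = height

    def update(self, speed=1):
        self.x -= speed          # obstacles scroll toward the player

    def hits(self, player):
        # collision if the obstacle reaches the player while they are too low
        return self.x <= 0 and player.y < self.height

player = Player()
obstacle = Obstacle(x=5)
player.jump()                    # jump just in time
for _ in range(5):
    player.update()
    obstacle.update()
print(obstacle.hits(player))     # -> False (the jump cleared it)
```

In Processing, the `update()` calls would live in the `draw()` loop and the jump would be triggered from `keyPressed()`, but the object-oriented structure is the same lesson the campers learned.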

Building off of this, we moved into hardware development with the Arduino. Programming for Processing and for the Arduino is very similar, and it isn’t hard to use the two together. After a quick intro and a basic Arduino lesson, this let the campers build Arduino game controllers for their Processing games.
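
The controller-to-game link boils down to a tiny serial protocol: the Arduino sends a byte when a button is pressed, and the game translates bytes into actions. The sketch below is a made-up illustration of that idea in Python, not our camp code — the byte values and names are assumptions, and the serial port is faked with a byte string:

```python
# Sketch of how an Arduino controller can drive a game over a serial link.
# A real setup would read from Processing's Serial library (or pyserial);
# here we assume the Arduino sends b'J' whenever its jump button is pressed.

def read_events(serial_bytes):
    """Translate raw controller bytes into game actions."""
    actions = []
    for b in serial_bytes:
        if b == ord('J'):
            actions.append('jump')
        # unknown bytes (line noise, other buttons) are ignored
    return actions

fake_port = b'J.J'              # two presses with noise in between
print(read_events(fake_port))   # -> ['jump', 'jump']
```

On the Arduino side, the matching sketch is just a `digitalRead()` of the button pin followed by a `Serial.write()` of the agreed-upon byte.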

Over the course of the camp, the students created a runner game, a brick breaker game, and a projectile dodging game. After they built the Arduino controllers, we moved them on to using third-party libraries so they could play their games with Xbox controllers.

Finally, the students focused on finishing one of the games of their choice. Some made high score systems, some themed their games (Breaking Bad themed brick breaker), and some added bullets to the dodging game. A few students even made their games 3D.


Fabric Simulation — August 8, 2016

For my AP Computer Science final project, I created a fabric simulator from scratch in Processing. This idea spawned from my HAPTIX project. I wanted a way to better model the full 3D positioning of a glove with only a few fabric stretching measurements.

I designed the simulator to handle different stitching patterns so that I could model any combination of materials for the glove project.

I tested the simulation by using a ball to push the virtual fabric around.

The fabric was made up of a matrix of “nodes” connected by stretchy “links.” It also included stiffness links that keep the fabric from bunching up; essentially, these represent the thickness and actual matter of real-world fabric. In the next two pictures, the black links are stretch links and the red links are stiffness links. Each stiffness link actually spans two nodes in order to prevent the fabric from folding right in half.
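
Here is a minimal sketch of that node-and-link idea, reduced to a one-dimensional strand for clarity. The constants, the overdamped update rule, and all the names are invented for this example and are not the simulator’s actual code:

```python
# 1D sketch of the fabric model: nodes joined by "stretch" links between
# neighbors (rest length 1) and "stiffness" links that span two nodes
# (rest length 2) to resist sharp folds.

def spring_force(p_a, p_b, rest, k):
    """1D force on node a from its spring to node b."""
    d = p_b - p_a
    stretch = abs(d) - rest          # how far past the rest length we are
    return k * stretch * (1 if d > 0 else -1)

def step(positions, dt=0.01, k_stretch=10.0, k_stiff=2.0):
    """One overdamped update step; node 0 is pinned in place."""
    n = len(positions)
    new_positions = [positions[0]]   # pinned end
    for i in range(1, n):
        f = 0.0
        for j, rest, k in [(i - 1, 1.0, k_stretch), (i + 1, 1.0, k_stretch),
                           (i - 2, 2.0, k_stiff), (i + 2, 2.0, k_stiff)]:
            if 0 <= j < n:
                f += spring_force(positions[i], positions[j], rest, k)
        new_positions.append(positions[i] + dt * f)
    return new_positions

# A strand stretched past its rest spacing slowly relaxes back toward
# one unit between nodes (equilibrium is [0, 1, 2, 3]).
strand = [0.0, 1.4, 2.6, 4.2]
for _ in range(500):
    strand = step(strand)
```

The real simulator works in 3D and adds a colliding ball, but the principle is the same: sum the forces from every link touching a node, then nudge the node along that force.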

Next year, in my Projects in CS class, I hope to use this simulation to create accurate, real-time hand models for HAPTIX.

Projector Add-on — September 15, 2015

I have been thinking a lot about an add-on for a projector. The idea is a camera attached to the projector and wired back to the source computer. It could be an open-source system, possibly with 3D-printed clips for the camera. It is a very simple hardware setup with many possible features.

The original idea’s main feature was a color adjustment system. The camera could compare the projected image to the source image from the computer and notice any differences. This would let you project onto any flat surface regardless of color variation. Say there was a post-it note on the projection surface: it would make a small part of the projection appear as a different color. The system would notice that and modify the projected image to make the post-it “disappear.”
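
At its core this is a feedback loop: compare what the camera observed with what the computer intended, then pre-compensate the next projected frame. The one-pixel, one-channel sketch below is an invented illustration of that loop (a real system would need camera–projector calibration and per-pixel alignment); intensities are 0–255:

```python
# Feedback color correction for one pixel/channel: the surface (a post-it)
# reflects more light than intended, so the projector learns to under-drive
# that region until the camera sees the intended value.

def compensate(intended, observed, projected):
    """Adjust the projected value so the observed value drifts toward intended."""
    error = intended - observed           # e.g. a post-it makes this nonzero
    return max(0, min(255, projected + error))

# A bright post-it "adds" 60 to whatever we project; we wanted to see 120.
out = 120                                 # currently projecting the raw value
for _ in range(3):
    observed = min(255, out + 60)         # what the camera reports
    out = compensate(120, observed, out)
print(out)   # -> 60: the projector now under-drives the post-it region
```

After the first correction the camera already sees the intended 120, which is why the loop settles immediately; real surfaces and lighting would need a gentler gain and repeated passes.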

After thinking about this post-it concept for a while, I realized you could do even more with the camera’s ability to recognize things on the projection surface. With a specifically formatted sticky note, you could have the system recognize and highlight that note. You could even have it recognize and convert the handwriting on the note to text. This could be a nice form of input for meetings.

Another piece of this idea, one that some companies have already tried, is making the system recognize hands. That would effectively turn the projection surface into a giant touchscreen, which would be useful for all sorts of purposes.

Overall, the system could be taught to recognize any number of situations and change the projected image accordingly. Using gestures, handwriting, and color-specific changes to the surface, a single person or a whole group could learn to work with the projection.

Does anyone have any other ideas on ways to interact with this sort of system?

Human Fluid Dynamics — September 11, 2015

I had an idea for an interesting project a few weeks ago while touring Columbia University. The tour group was entering a workshop through a small door. After passing through, I noticed how we had all slowed down to go through, and once we spread out, we had not sped up again. I started thinking about how, as a crowd, we were acting like a sort of fluid.

This got me thinking about how different situations could be used to model pedestrian traffic flow. If you think of the crowd as a whole acting as a fluid, you can set up interesting experiments. Think about a hallway: it is kind of like a channel for a liquid to flow through. If you cut off half of the flow with a wall, you would see some interesting flow patterns appear. People would start getting stuck in front of the wall because they could not squeeze into the stream of people bypassing it, and some might stop behind the wall to get out of the flow. This is similar to fluid dynamics, where fluids form high-flow channels (bypassing the wall) and eddies (slow, sometimes backwards flow behind the wall).
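
That bottleneck can even be put in rough numbers using the fluid-style conservation law q = density × speed × width: if fewer people can pass the gap per second than arrive at it, a queue grows in front of the wall. Everything below — the speed cap, the jam density, the function itself — is an invented back-of-the-envelope illustration:

```python
# Back-of-the-envelope "people flow" through a half-blocked hallway,
# using conservation of people: flow q = density * speed * width.

WALK_SPEED = 1.4      # m/s, comfortable walking pace (assumed cap)
MAX_DENSITY = 4.0     # people/m^2, rough jam density (assumed)

def queue_growth(density_in, speed_in, width, gap_width):
    """People per second piling up in front of the wall (0 means free flow)."""
    inflow = density_in * speed_in * width
    # the gap can pass at most jam density moving at full walking speed
    capacity = MAX_DENSITY * WALK_SPEED * gap_width
    return max(0.0, inflow - capacity)

# A 4 m hallway half-blocked by a wall, with a dense crowd arriving:
rate = queue_growth(density_in=3.0, speed_in=1.4, width=4.0, gap_width=2.0)
print(round(rate, 2))   # -> 5.6 people per second join the queue
```

The same arithmetic says a lighter crowd (2 people/m² here) slips through the gap with no queue at all, which matches the everyday experience that bottlenecks only “activate” above some crowd density.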

Some more interesting behaviors arise when you consider that humans can make predictions and react to more than just a sense of touch. Imagine walking down a hallway in a crowd and seeing a wall blocking half of the hallway ahead. You would naturally move to the open side of the hallway so that you could easily pass the wall, and the same would repeat in the minds of everyone who can see it; as a group, the crowd would avoid the wall. This is different from how a liquid would interact with the wall, because the liquid cannot see it ahead. So, to better match the crowd’s behavior to the liquid’s, we could build the wall at hip height. People would not see it until they were up close, forcing them into more liquid-like behavior.

I am thinking about setting up an experiment related to this in the halls at school (if school ever starts). I think it would be interesting to see how people move around obstacles, and whether that movement can be matched to certain types of liquids.

Another interesting piece of this is viscosity. Viscosity is the “thickness” of a liquid and is directly related to the friction between particles. Different human-flow situations could be matched to different viscosities, which might affect the flow patterns. For example, a crowd of business people or students might have a lower viscosity, and flow better, than a crowd of families.

I might look further into this as a way to model traffic flow through buildings using fluids. It could be useful for architects and space planners who want to quickly test traffic flow. Comment below if you have ideas for different flow experiments to try, or thoughts on the viscosity of different types of crowds.

Mind Reading via Vocal Cords — September 3, 2015

I read an article about a week ago from The Guardian about internal monologue. The article explained how internal monologue (the voice you hear in your head when you read something) is accompanied by tiny movements of your vocal cords.

I have been thinking a lot about how this motion could be captured and used. With the proper setup, I think this movement could be an indicator of what someone is thinking. (That’s right: mind reading.)

However, since these tiny vocal cord movements don’t capture all of your speech, the technique would probably be limited to detecting tone and inflection. For example, when asked a yes-or-no question they know the answer to, many people will immediately respond (mentally) in a positive or negative manner. This could show up as a difference in vocal cord movement because of the difference in tone. Or, you could measure someone’s interest or curiosity in a subject by looking for low-to-high inflection (questions) from their vocal cords as they read about it.

This could all be measured with a combination of ultrasound and electrodes that measure muscle activity. I think this could be a really interesting technology, if not just a cool science fair project.

3D Printed Jewelry — April 26, 2015

With my Printrbot up and working, I am trying to find interesting uses for it. I recently had the idea of printing a small charm (an elephant/heart shape) for a necklace. The only issue with a small piece like this is the need for a smooth finish. After some looking around, I plan to go with a brush-on acetone solution: sealing off the charm and getting rid of excess filament with an acetone-based nail polish remover. I will report back on the success of the process after I finish. YAY!