MU 3620 Final Project — October 16, 2017

For the final project in Electronic Music Composition, we were each assigned to compose and produce a two- to five-minute piece using one or more of the methods discussed in class: randomness, sonification, remixing, multimedia, or interactivity.

I chose to write a remix of Genghis Khan by Miike Snow. I wanted to give the song a bit more kick and switch it up from indie pop to an EDM format. I made a lot of changes, but I mostly focused on rewriting the song structure and chord progressions and cutting up the vocals to make the song more accessible as a dance track.

Even though this class is over, I hope to continue making music on a regular basis. I’ll post big projects here but follow my SoundCloud for more regular updates.

Week 3 and 4: Remixing and Scoring — September 28, 2017

For the last two weeks, we have been looking at writing music based on others’ creative work. The first week covered re-arrangement and remixing: by taking someone else’s music and playing it with a different set of instruments, or by cutting it up and piecing it back together, we can create new and different compositions. For that assignment, I remixed Halsey’s song, Colors. I focused on the synth and bass rather than her voice to give the song a fresh interpretation.

For this past week, we have been looking at writing music for video games and film. The important part of this type of composition is that your music does not hijack the scene. This means creating music that fits the mood, the tempo, and sometimes even fits the cuts of the imagery. Everything must mesh so that the music adds to the imagery without taking too much of the viewer’s attention. For this project, I wrote a score for the crash scene from the movie Sully.

EMC Week 2: Mapping — September 15, 2017

Our second assignment in Electronic Music Composition was to write a song using mapping, or sonification: take data and turn it into music in some way. We could use live data (game controllers, mouse position, etc.) or static data (climate stats, population stats, sales data, etc.).

I used sorting algorithms. Some of you may remember the sorting algorithm visualizations I did last year, which showed visually how computers sort scrambled lists. To express this musically, I converted every number to a note and had my algorithms sort the notes into a scale.

The result of this algorithm is played by the guitar. It starts with a scrambled scale and plays through each step of the sorting, eventually arriving at a clean ascending scale. Layered with strings and drums, this produces a unique, poppy song.
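As a rough sketch of the mapping (plain Java rather than my actual project code, with an assumed C major scale and a bubble sort standing in for the algorithms I used), every swap the sort makes emits the current list as one melodic step:

    import java.util.Arrays;

    // Sketch: sonifying a bubble sort. Each time the algorithm swaps two
    // elements, we "play" the whole list as a melody (printed here as MIDI
    // note numbers; a real version would send these to a synth).
    public class SortSonification {
        // C major scale as MIDI notes, one per list value (illustrative only).
        static final int[] SCALE = {60, 62, 64, 65, 67, 69, 71, 72};

        public static void main(String[] args) {
            int[] order = {5, 2, 7, 1, 4, 0, 6, 3};    // scrambled scale degrees
            playStep(order);                           // the opening scrambled scale
            for (int i = 0; i < order.length; i++) {
                for (int j = 0; j < order.length - 1 - i; j++) {
                    if (order[j] > order[j + 1]) {
                        int tmp = order[j]; order[j] = order[j + 1]; order[j + 1] = tmp;
                        playStep(order);               // one melodic step per swap
                    }
                }
            }
        }

        static void playStep(int[] order) {
            int[] notes = new int[order.length];
            for (int i = 0; i < order.length; i++) notes[i] = SCALE[order[i]];
            System.out.println(Arrays.toString(notes)); // stand-in for audio output
        }
    }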

EMC Week 1: Randomness — September 13, 2017

Our first assignment for my Electronic Music Composition class was to write a piece using randomness in any way we saw fit. I decided to use white noise (which sounds like radio static) as my random source.

Inspired by the band “Public Service Broadcasting,” I used Apollo Flight Control audio from NASA as the basis for my song. Then, I created a drum kit from white noise samples to make the drums fit into the general theme of the song.

The resulting song is a fairly cinematic, ambient piece.

New Music Page — September 12, 2017

Check out the new tab in the top menu: Music!

I have been writing a lot more music since I started college (love it!), and I will be posting most of it here. At the moment it is all instrumental, and many pieces have some sort of story behind them. I am currently taking an Electronic Music Composition class and write at least one song each week.

Songs will start going up in the next few days, but you can also follow me on SoundCloud!

Haptic Glove Update — March 21, 2017

The haptic glove project is progressing nicely!

At the beginning of the year, we decided to branch out a bit from our previous design to explore other solutions. However, we have returned to a largely similar design with some key modifications. The new glove is almost entirely made of 3D-printed parts, uses new communication protocols and tracking algorithms, and now integrates with Unity!

We have already printed most of the glove and have been working on mounting components. The next steps are finishing the finger exoskeleton design and integrating it with the rest of the glove. The exoskeleton should be one of the more interesting improvements in this prototype. Instead of extending along the sides of the fingers to create joints, it now runs exclusively across the top. The new mechanism creates what we call “virtual joints” outside of its physical structure, which allow it to move freely with the finger while maintaining its internal structure. This lets us apply pressure just at the fingertip without using the user’s own fingers for leverage.

On the software side, we are currently focused on smoothing out the motion of the servos and improving the reliability of the force measurements. Instead of modifying servos to read their own positions, we are using external force sensors. While this adds a bit of complexity, it greatly reduces the twitchiness of the glove. It also makes our communication system a lot cleaner: instead of sending full coordinate and force data back and forth, we just send position values to Unity, and Unity responds with a target force. Unity builds a model finger from the glove’s positional tracking and calculates the force on each joint that will produce a realistic experience. This cuts down the amount of data sent back and forth and speeds up the overall simulation.
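To give a feel for how small each exchange can be under this scheme, here is a hypothetical packet layout in Java; the joint count and field sizes are assumptions for illustration, not our actual wire format:

    import java.nio.ByteBuffer;

    // Hypothetical packet layout for the glove <-> Unity exchange described
    // above: the glove sends one position value per joint, and Unity replies
    // with one target force per joint. Joint count and field sizes are assumed.
    public class GlovePacket {
        static final int JOINTS = 15; // e.g. 3 joints x 5 fingers (assumption)

        // Encode joint angles (radians) into a compact outgoing packet.
        static byte[] encodePositions(float[] angles) {
            ByteBuffer buf = ByteBuffer.allocate(4 * JOINTS);
            for (float a : angles) buf.putFloat(a);
            return buf.array();
        }

        // Decode Unity's reply: one target force per joint, in the same order.
        static float[] decodeForces(byte[] packet) {
            ByteBuffer buf = ByteBuffer.wrap(packet);
            float[] forces = new float[JOINTS];
            for (int i = 0; i < JOINTS; i++) forces[i] = buf.getFloat();
            return forces;
        }

        public static void main(String[] args) {
            float[] angles = new float[JOINTS];           // all joints at 0 rad
            byte[] out = encodePositions(angles);         // -> Unity
            System.out.println(out.length + " bytes per update");
        }
    }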

We are finishing up integrating all of these parts and hope to have a full demo online by mid-April.

New Links Page! — March 12, 2017

Check out my new Links page at the top of the site!

I am working on a few web projects and decided to have a common place to link to them. I will add to the list as I produce more.

You’ll notice that the first link is a web version of the Letter By Letter Visualization that I discussed in an earlier post. This is the first of a series of text visualizations that I am working on putting online. Once I have a few visualizations together, I am planning to set up a site selling custom art pieces generated from customer-submitted text.

Let me know if there are any specific visualizations I have done that you’d like to be able to experiment with online!

BEAM Programming Camp — September 26, 2016

Earlier this summer, my friend Eli and I planned and taught a 1.5-week programming camp for a group of middle and high schoolers. Some of them had no programming experience, some had dabbled in code, and some had taken full computer science classes. We designed the camp to cater to everyone and teach them all something new.

We taught a combination of software and hardware development. Starting with Processing, the campers created a basic runner game with obstacles that the player had to jump over. Along the way, they learned the basics of object-oriented programming and simple Java syntax.

Building on this, we moved into hardware development with the Arduino. Programming in Processing and Arduino is very similar, and it isn’t hard to use the two together. After a quick intro and a basic Arduino lesson, this let the campers build Arduino game controllers for their Processing games.

The students created a runner game, a brick-breaker game, and a projectile-dodging game. After building their Arduino controllers, they moved on to using third-party libraries to drive their games with Xbox controllers.

Finally, the students focused on finishing one game of their choice. Some built high-score systems, some themed their games (one made a Breaking Bad-themed brick breaker), and some added bullets to the dodging game. A few students even made their games 3D.

Terrain Generation — August 28, 2016

My third series in my AP Portfolio was on terrain generation, which computers mainly use for simulations and games. Each visualization in this series is a variation on the midpoint displacement algorithm. The idea behind this algorithm is that you can add layers of detail to a data set by inserting new values based on the averages of the surrounding values. For example, if you have two points with the values 10 and 20, you can add a new point between them at 15. By adding a bit of random variation to each inserted point, a computer can create realistic terrain from only two starting points.
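In code, the core idea only takes a few lines. Here is a minimal one-dimensional sketch in plain Java (my portfolio pieces were written in Processing, but the logic is the same; the array size and roughness value are just examples):

    import java.util.Random;

    // 1D midpoint displacement: repeatedly fill in midpoints between existing
    // values, nudging each new value by a random amount that shrinks each level.
    public class MidpointDisplacement {
        static Random rng = new Random();

        // heights must have length 2^n + 1; the endpoints are the seed values.
        static void displace(double[] heights, double roughness) {
            int step = heights.length - 1;
            double jitter = roughness;
            while (step > 1) {
                for (int i = 0; i < heights.length - 1; i += step) {
                    double mid = (heights[i] + heights[i + step]) / 2.0;
                    heights[i + step / 2] = mid + (rng.nextDouble() * 2 - 1) * jitter;
                }
                step /= 2;
                jitter /= 2; // halve the variation so detail appears at every scale
            }
        }

        public static void main(String[] args) {
            double[] h = new double[17];   // 2^4 + 1 points
            h[0] = 10; h[16] = 20;         // the two starting points
            displace(h, 5.0);
            for (double v : h) System.out.printf("%.2f%n", v);
        }
    }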

One Dimension: Noise

The first version of this generates noise. At each iteration, the computer creates a new point in between each pair of existing points.

Over multiple iterations, this creates an increasingly detailed set of points spaced at varying distances. The result can be sonified as tones (hence the name “noise”) or simply left as a series of randomly spaced points.

My final piece for this visualization included some extra colored circles for context:

[Image: final noise visualization with colored circles]


Two Dimensions: Path

In two dimensions, instead of varying the spacing, the algorithm varies the height of each point. With some coloring, this eventually creates a detailed 2D terrain like the bottom layer:

[Image: terrain generation algorithm layers]

To better illustrate this, I laid out the algorithm’s steps by color:

[Image: algorithm steps separated by color]


Three Dimensions: Terrain

The midpoint displacement algorithm is most commonly used for 3D terrain generation. In this form, it is called the diamond-square algorithm.

[Image: the alternating diamond and square steps of the algorithm]

As shown in the above image, the algorithm takes a series of alternating diamonds and squares and fills in the center point of each. By repeating this hundreds of times, you end up with a 3D terrain map.
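A compact version of the procedure looks like this; a sketch in plain Java rather than my actual Processing code, with an arbitrary roughness and seed:

    import java.util.Random;

    // Diamond-square terrain: a (2^n + 1)^2 grid, alternating "diamond" passes
    // (fill each square's center from its 4 corners) and "square" passes (fill
    // edge midpoints from their neighbors), with shrinking random displacement.
    public class DiamondSquare {
        public static double[][] generate(int n, double roughness, long seed) {
            int size = (1 << n) + 1;
            double[][] h = new double[size][size];
            Random rng = new Random(seed);
            double jitter = roughness;
            for (int step = size - 1; step > 1; step /= 2, jitter /= 2) {
                int half = step / 2;
                // Diamond step: center of each square = average of its 4 corners.
                for (int y = half; y < size; y += step)
                    for (int x = half; x < size; x += step)
                        h[y][x] = (h[y - half][x - half] + h[y - half][x + half]
                                 + h[y + half][x - half] + h[y + half][x + half]) / 4
                                 + (rng.nextDouble() * 2 - 1) * jitter;
                // Square step: each edge midpoint = average of its in-bounds neighbors.
                for (int y = 0; y < size; y += half)
                    for (int x = (y + half) % step; x < size; x += step) {
                        double sum = 0; int count = 0;
                        if (y >= half)       { sum += h[y - half][x]; count++; }
                        if (y + half < size) { sum += h[y + half][x]; count++; }
                        if (x >= half)       { sum += h[y][x - half]; count++; }
                        if (x + half < size) { sum += h[y][x + half]; count++; }
                        h[y][x] = sum / count + (rng.nextDouble() * 2 - 1) * jitter;
                    }
            }
            return h;
        }

        public static void main(String[] args) {
            double[][] terrain = generate(5, 10.0, 42L);   // a 33 x 33 grid
            System.out.println(terrain.length + " x " + terrain[0].length);
        }
    }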

For my final piece on 3D terrain generation for my concentration, I again combined every step of the algorithm into the final image, shown below. Each step is stacked on top of the previous one and denoted by a change in color. You can see the terrain gain more and more detail as the algorithm progresses.

[Image: 3D terrain generation with stacked, color-coded steps]


Four Dimensions: Space

This is where it all gets a little confusing. The first three dimensions of this algorithm are fairly straightforward: each has a clear goal and display method. The fourth dimension, time, is harder to picture. To make it easier, imagine the fourth dimension is color. In each previous visualization, only the last dimension is affected by the algorithm; for the terrain, the x and y positions of each pixel are fixed and only the height changes. Extending this to four dimensions, the position is completely locked and only the color changes. Here are two examples:

[Images: two color-cloud renders]

These are clouds of color formed by what I call the Octo-Hexahedron algorithm, a 3D extension of the diamond-square algorithm: full cubes filled with clouds of color.
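I won’t walk through my full Octo-Hexahedron code here, but the cube-center pass below conveys the idea (a sketch under my own naming, not the code behind these renders; the face-center and edge-midpoint passes follow the same pattern with 6 and 4 neighbors):

    import java.util.Random;

    // One pass of 3D midpoint displacement: the center of each cube becomes
    // the average of its 8 corners plus shrinking random jitter. Here the
    // displaced value is a color channel rather than a height.
    public class OctoHexahedronPass {
        static void cubeCenters(double[][][] v, int step, double jitter, Random rng) {
            int half = step / 2, size = v.length;
            for (int z = half; z < size; z += step)
                for (int y = half; y < size; y += step)
                    for (int x = half; x < size; x += step) {
                        double sum = 0;
                        for (int dz = -half; dz <= half; dz += step)
                            for (int dy = -half; dy <= half; dy += step)
                                for (int dx = -half; dx <= half; dx += step)
                                    sum += v[z + dz][y + dy][x + dx]; // 8 corners
                        v[z][y][x] = sum / 8 + (rng.nextDouble() * 2 - 1) * jitter;
                    }
        }

        public static void main(String[] args) {
            int size = 9;                                  // 2^3 + 1 per axis
            double[][][] v = new double[size][size][size];
            cubeCenters(v, size - 1, 1.0, new Random(7));  // first, coarsest pass
            System.out.println(v[4][4][4]);                // the first filled center
        }
    }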

The fourth dimension can also be represented by transparency:

[Image: color cloud rendered with transparency]


Finally, you can represent the fourth dimension with actual time, in the form of an animation.


To tie these all together, I made a video combining both the color- and time-based animations.


For the final still image for my concentration, I once again had to combine the steps into one image. I used the same stacked approach, with color changes showing the progression through time.


I had a ton of fun making this collection of pieces and exploring the midpoint displacement algorithm and its uses.

If you have any questions, comment below! Thanks!

Fabric Simulation — August 8, 2016

For my AP Computer Science final project, I created a fabric simulator from scratch in Processing. The idea grew out of my HAPTIX project: I wanted a way to better model the full 3D positioning of a glove from only a few fabric-stretch measurements.

I designed the simulator to handle different stitching patterns, so I could model any combination of materials for the glove project.

I tested the simulation by using a ball to push the virtual fabric around.

[Image: a ball deforming the virtual fabric]

The fabric is made up of a matrix of “nodes” connected by stretchy “links.” It also includes stiffness links that keep the fabric from bunching up; essentially, these represent the thickness and physical substance of real fabric. In the simulator’s renderings, the black links are stretch links and the red links are stiffness links. Each stiffness link spans two nodes to prevent the fabric from folding right in half.
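A stripped-down sketch of that structure in plain Java (my simulator was written in Processing and also handled 3D and collisions; the grid size, spacing, and link strengths here are illustrative):

    import java.util.ArrayList;
    import java.util.List;

    // Node-and-link fabric: stretch links join adjacent nodes, and stiffness
    // links skip a node to resist sharp folds. One relaxation pass nudges each
    // linked pair toward the link's rest length.
    public class Fabric {
        static class Node {
            double x, y; boolean pinned;
            Node(double x, double y) { this.x = x; this.y = y; }
        }
        static class Link {
            Node a, b; double rest, strength;
            Link(Node a, Node b, double rest, double strength) {
                this.a = a; this.b = b; this.rest = rest; this.strength = strength;
            }
        }

        List<Node> nodes = new ArrayList<>();
        List<Link> links = new ArrayList<>();

        Fabric(int cols, int rows, double spacing) {
            Node[][] grid = new Node[rows][cols];
            for (int r = 0; r < rows; r++)
                for (int c = 0; c < cols; c++) {
                    grid[r][c] = new Node(c * spacing, r * spacing);
                    nodes.add(grid[r][c]);
                }
            for (int r = 0; r < rows; r++)
                for (int c = 0; c < cols; c++) {
                    // Stretch links between direct neighbors.
                    if (c + 1 < cols) links.add(new Link(grid[r][c], grid[r][c + 1], spacing, 1.0));
                    if (r + 1 < rows) links.add(new Link(grid[r][c], grid[r + 1][c], spacing, 1.0));
                    // Stiffness links span two nodes to stop the fabric folding in half.
                    if (c + 2 < cols) links.add(new Link(grid[r][c], grid[r][c + 2], 2 * spacing, 0.3));
                    if (r + 2 < rows) links.add(new Link(grid[r][c], grid[r + 2][c], 2 * spacing, 0.3));
                }
        }

        // One constraint-relaxation pass (run several per frame for stability).
        void relax() {
            for (Link l : links) {
                double dx = l.b.x - l.a.x, dy = l.b.y - l.a.y;
                double dist = Math.sqrt(dx * dx + dy * dy);
                if (dist == 0) continue;
                double push = l.strength * 0.5 * (dist - l.rest) / dist;
                if (!l.a.pinned) { l.a.x += dx * push; l.a.y += dy * push; }
                if (!l.b.pinned) { l.b.x -= dx * push; l.b.y -= dy * push; }
            }
        }

        public static void main(String[] args) {
            Fabric cloth = new Fabric(10, 10, 5.0);
            cloth.nodes.get(0).pinned = true;            // pin one corner
            for (int i = 0; i < 30; i++) cloth.relax();  // let the constraints settle
            System.out.println(cloth.links.size() + " links");
        }
    }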

Next year, in my Projects in CS class, I hope to use this simulation to create accurate, real-time hand models for HAPTIX.