Project Overview

 Our project is to build a realistic guitar that has MIDI capabilities.

Background

 Almost all music these days is created using Digital Audio Workstations, or DAWs for short. There are two main ways of inputting music into these programs: one is to record audio straight into the program, and the other is to record or program MIDI notes.

 MIDI notes are a fairly simple concept. They consist of two message types: note-on and note-off. A note-on message (shown below in Fig. 1) simply consists of three parts:

  • Status byte (which indicates that this is a note-on message AND which channel this message is coming from)
  • Note number
  • Velocity
Fig. 1: An example of a MIDI message[1]

 Note that "length" is nowhere encoded in this message. That's because MIDI is meant to be a real-time standard: the computer can't know how long a note will be at the moment you press it. That's where the note-off message comes in. The note-off message is identical to the note-on, except that the upper half of the status byte is 1000 instead of 1001[4]. The note number indicates which note to turn off, and the velocity says how fast the note was released (release velocity is not implemented on all keyboards, as sensing how fast you let go of a key is more of a premium feature).
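To make this concrete, here is a minimal Python sketch of how these two message types can be assembled as raw bytes. The helper names are our own, but the byte layout follows the note-on/note-off format described above: the status byte packs the message type (1001 or 1000) in its upper four bits and the channel in its lower four, and the note number and velocity are 7-bit values.

```python
def note_on(channel, note, velocity):
    """Build a 3-byte note-on message.

    Status byte = 0b1001cccc (0x90 | channel), then note number, then velocity.
    """
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])


def note_off(channel, note, velocity=64):
    """Build a 3-byte note-off message.

    Status byte = 0b1000cccc (0x80 | channel); velocity here is the
    release velocity, with 64 as a common default when it isn't sensed.
    """
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])


# Middle C (note number 60) on channel 0, struck at velocity 100:
press = note_on(0, 60, 100)    # bytes 0x90, 0x3C, 0x64
release = note_off(0, 60)      # bytes 0x80, 0x3C, 0x40
```

Sending `press` now and `release` later is exactly how a controller communicates a note's duration in real time.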

Problem

 Most MIDI controllers come in the form of a piano keyboard, one of which is pictured below in Fig. 2. This is likely because there is no ambiguity as to which key corresponds to which note; each key has a 1:1 correspondence with a note. While this naturally makes sense for people who grow up learning piano, it is not at all intuitive for people whose only instrument is guitar. On top of that, a keyboard may not be the best controller for the job, especially when playing virtual guitar instruments.

Fig. 2: An example of a MIDI controller[2]

Solution

 Our solution is to create a MIDI controller in the form of a guitar. Such controllers do exist; however, many of them use buttons on the fretboard for note detection, as shown below in Fig. 3. Our design instead uses real guitar strings and real frets to determine which note the user is playing, in order to achieve a more natural feel.

Fig. 3: A MIDI guitar that uses buttons for note detection[3]

Team Members

Alec Meyer

Project Lead

Alec is a senior in Software Engineering pursuing a minor in Data Science. Alec is also a Computer Science TA and works part time as a Software Engineer at Maverick Software Consulting. Alec is from Cedar Rapids, Iowa.

Ethan Cooper

Embedded Applications Engineer

Ethan is a senior studying Electrical Engineering and Music. He has worked for three years as a recording technician for the ISU Department of Music and Theatre and has internship experience in acoustics, audiovisual design, and RF testing. Ethan will be moving to New York City in the Summer of 2022 to work as a hardware engineering intern for Crestron Electronics. Ethan is from Cedar Rapids, Iowa.

Hassein Rife

Hardware Engineer

Hassein is a senior in Electrical Engineering with a focus on analog design. He works as a freelance audio engineer and has run sound at the M-Shop since 2018. He has a strong interest in the overlap between music and analog engineering.

Kyle Strozinsky

Software Engineer

Kyle is a senior Software Engineering student. Kyle is a four-year trumpet player in the ISUCFV'MB. Kyle is from Eden Prairie, Minnesota.

Sam Lavin

Documentation/Team Organization

Sam is a senior in Computer Engineering with minors in Music and Music Technology. They work in the recording studio for the Department of Music and Theatre and are also the Production Director at 88.5 KURE Ames Alternative. In the future, they would like to work in audio engineering/live sound.

Weekly Reports

IRP Documents

Design Document

You can also view our design document directly here.

Videos

Below is a playlist of all of our lightning talk videos and our news report video. You can also view the playlist directly here.

References

  1. Breve, Bernardo & Cirillo, Stefano & Cuofano, Mariano & Desiato, Domenico. (2020). Perceiving space through sound: mapping human movements into MIDI. 49-56. 10.18293/DMSVIVA20-011.
  2. Brandon Daniel from USA, CC BY-SA 2.0, via Wikimedia Commons
  3. Starr Labs. "A New Ztar for Rob Swire of Pendulum." Starr Log, 3 May 2011, http://starrlabs.blogspot.com/2011/05/new-ztar-for-rob-swire-of-pendulum.html.
  4. The MIDI Manufacturers Association, "MIDI 1.0 Detailed Specification," Feb. 1996