I spent much of January 2016 working on a robot for MIT's Mobile Autonomous Systems Laboratory (MASLAB), a competition that gives teams one month to build a fully autonomous robot to complete a task. The task for 2016 was to locate stacks of cubes in a randomly shaped arena and restack them by color. I wrote software for the robot, helped design and implement our computer vision system, and did electrical work on our microcontroller. My team took first place in the competition. Our team wiki has details about our design and a log of how we managed our time throughout the month, and our code is available in our repository.
Pyxida is the flight computer that I helped MIT's Rocket Team develop as lead of the Avionics subteam. It features an ARM microcontroller, a barometer, a GPS receiver, and an IMU, which it uses to estimate a rocket's position and orientation. It uses these estimates to trigger flight events and also transmits them to a ground station via an onboard XBee radio. The hardware was designed and fabricated by members of the team and went through several revisions over the course of a semester. I wrote most of the firmware for the device and the prototype version of the ground-station application. I also developed testing and flight-qualification plans for the project, managed the project repository, and set goals and deadlines for tasks.
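Pyxida's actual event logic lives in its firmware repository; as an illustration of the kind of flight-event detection such a computer performs, here is a minimal sketch of barometric apogee detection. The names, thresholds, and structure here are hypothetical, not taken from the Pyxida code.

```python
def pressure_to_altitude(pressure_pa, sea_level_pa=101325.0):
    """Standard barometric formula: static pressure (Pa) to altitude (m)."""
    return 44330.0 * (1.0 - (pressure_pa / sea_level_pa) ** 0.1903)

class ApogeeDetector:
    """Declare apogee only after altitude has fallen for N consecutive
    samples, which filters out single-sample pressure noise."""

    def __init__(self, confirm_samples=5):
        self.confirm_samples = confirm_samples
        self.max_altitude = float("-inf")
        self.falling_count = 0

    def update(self, altitude_m):
        # A new maximum resets the descent counter; otherwise count
        # how many samples in a row we have been below the peak.
        if altitude_m > self.max_altitude:
            self.max_altitude = altitude_m
            self.falling_count = 0
        else:
            self.falling_count += 1
        return self.falling_count >= self.confirm_samples
```

Requiring several consecutive falling samples is a common way to avoid firing a recovery event on a transient pressure spike, at the cost of a small detection delay.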
I have experimented with processes for generating fractals for many years. My most recent project is a fractal renderer that uses the parallel nature of GPU computation to draw Mandelbrot and Julia sets quickly. The majority of the calculation is done on the GPU in a shader program, so the host program only needs to forward the user's inputs to the graphics card. Many properties of the fractal are exposed to the user, including the equation that is iterated: it can be changed to an arbitrary expression at runtime, which triggers a recompile of the shader.
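The per-pixel work that each shader invocation performs is the classic escape-time iteration. A CPU sketch of the same computation (the renderer runs this on the GPU, one invocation per pixel):

```python
def escape_time(c, max_iter=256, z0=0j):
    """Iterate z = z^2 + c and return the iteration count at which |z|
    exceeds the escape radius of 2, or max_iter if the point never
    escapes (i.e. it is treated as inside the set).

    For the Mandelbrot set, c varies per pixel and z0 = 0; for a Julia
    set, c is fixed and z0 varies per pixel instead."""
    z = z0
    for i in range(max_iter):
        if abs(z) > 2.0:
            return i
        z = z * z + c
    return max_iter
```

The returned count is what gets mapped to a color; letting the user swap the `z * z + c` expression at runtime is what the shader recompilation supports.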
TDML is a basic 3D game engine that I wrote in C++ using OpenGL. It was the project I used to teach myself OpenGL, and the original version was written using immediate mode and the fixed-function pipeline. As the project progressed, I transitioned it to modern OpenGL features such as VBOs, VAOs, and shaders. It features terrain heightmap loading, particle systems, and object/world management. I have learned a lot more about C++, graphics programming, and version control since I developed TDML, but the repository for the project is still hosted here for historical purposes.
RemotePrint is a tool that I developed to make a 3D printer at my school accessible over the network. From a single web interface, users can upload G-code files for prints, preheat extruders, initiate jobs, and monitor the printer's status. The page embeds a webcam stream of the printer, and the tool captures still images over time to produce a timelapse that the user can download when the print is complete. It is written entirely in Python and is hosted as a CGI script on a server connected to the printer.
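RemotePrint's own sources are the reference for its behavior; as an illustration of the kind of G-code handling such a tool needs, here is a small hypothetical helper (not code from RemotePrint) that scans an uploaded file for the first commanded hotend temperature, so the server could preheat to the right value before starting the job. M104 and M109 are the standard "set extruder temperature" G-code commands.

```python
import re

# Matches M104/M109 and captures the S<temperature> parameter.
TEMP_COMMAND = re.compile(r"^\s*M10[49]\b.*?\bS(\d+(?:\.\d+)?)")

def first_hotend_temp(gcode_lines):
    """Return the first M104/M109 S-value in the file as a float,
    or None if the file never sets a hotend temperature.
    Text after ';' is a comment in standard G-code and is stripped."""
    for line in gcode_lines:
        line = line.split(";", 1)[0]
        m = TEMP_COMMAND.match(line)
        if m:
            return float(m.group(1))
    return None
```

A server-side helper like this avoids trusting a user-entered temperature field: the preheat target comes from the sliced file itself.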
I have built numerous high-power rockets over the past six years. I hold a Level 2 certification from the National Association of Rocketry and continue to be an active member of MIT's Rocket Team. Though I am currently the lead of the Avionics subteam, I still enjoy the physical construction of rockets, as it lets me step away from the digital world for a short time and do something with my hands.