The UC Berkeley Cal Rover team is a student-run team supported by the American Institute of Aeronautics and Astronautics. The current version of the Cal Rover has six wheels, holonomic drive, and a rocker-bogie suspension inspired by NASA's Curiosity rover.
In 2018/2019, I worked as the Cal Rover embedded systems lead. Working with the team, I helped build out the electrical/embedded systems and controls for the drive system. The drive system is composed of six brushed DC drive motors and six servos for steering. All low-level controls are handled by a set of Cypress Semiconductor PSoC 5LP microcontrollers, and the drive motors are additionally controlled by a Basicmicro RoboClaw motor controller.
When I joined the Cal Rover team, the project was in the middle of a major overhaul. I led the team in redeveloping the low-level functionality of the drive system. The low-level controls were written in C and designed to receive higher-level commands via I2C. Higher-level remote and autonomous tasks were developed in LabVIEW and Python, respectively.
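To give a flavor of how a high-level controller might frame a drive command for an I2C link like this, here is a minimal Python sketch. The byte layout, command ID, and checksum scheme below are illustrative assumptions, not the rover's actual protocol.

```python
import struct

CMD_SET_DRIVE = 0x01  # hypothetical command ID

def pack_drive_command(wheel_speeds, steer_angles_deg):
    """Pack six wheel speeds (-1.0..1.0) and six steering angles (degrees)
    into a framed byte string: [cmd][6 x int8 speed][6 x int16 angle][checksum]."""
    assert len(wheel_speeds) == 6 and len(steer_angles_deg) == 6
    payload = bytearray([CMD_SET_DRIVE])
    for v in wheel_speeds:
        # Clamp to [-1, 1] and scale to a signed byte
        payload += struct.pack('b', int(max(-1.0, min(1.0, v)) * 127))
    for a in steer_angles_deg:
        # Store angles as centidegrees in a signed 16-bit int
        payload += struct.pack('<h', int(a * 100))
    payload.append(sum(payload) & 0xFF)  # simple additive checksum
    return bytes(payload)
```

On the receiving side, the C firmware would unpack the same layout from its I2C buffer and validate the checksum before acting on the command.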
The drive system we developed has two modes. The first is a standard differential drive, in which the wheels are locked facing forward; it is used for longer-distance treks and serves the dual purpose of increasing efficiency and making the best use of the rocker-bogie suspension. The second is a swerve-style drive mode used at lower speeds, especially during manipulation tasks. The swerve mode is truly omnidirectional, as each wheel can be driven and steered independently of the rest of the system. This allows the rover to translate (forward/backward or strafe laterally at any angle), rotate in place about its center, or translate and rotate simultaneously.
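The swerve mode boils down to standard rigid-body inverse kinematics: each wheel's velocity is the commanded body translation plus the rotational contribution at that wheel's position. A minimal Python sketch, with wheel positions that are illustrative rather than the Cal Rover's actual geometry:

```python
import math

# Six wheel (x, y) positions measured from the rover's center, in meters.
# These coordinates are made up for illustration.
WHEEL_POSITIONS = [(x, y) for x in (-0.4, 0.0, 0.4) for y in (-0.3, 0.3)]

def swerve_setpoints(vx, vy, omega):
    """Return a (speed, steer_angle_rad) pair per wheel for body-frame
    velocities vx, vy (m/s) and yaw rate omega (rad/s)."""
    setpoints = []
    for x, y in WHEEL_POSITIONS:
        # Velocity of the wheel contact point = translation + omega x r
        wx = vx - omega * y
        wy = vy + omega * x
        setpoints.append((math.hypot(wx, wy), math.atan2(wy, wx)))
    return setpoints
```

Pure translation gives every wheel the same speed and heading; pure rotation points each wheel tangent to a circle about the center, which is exactly the rotate-in-place behavior described above.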
This drive testing video was shot before the wireless communication was implemented. (original music by me)
For much of my life, I worked in the arts. When I went back to school to get an engineering degree, I had to put some of my creative passions on hold. Time Drop is a project that grew out of the desire to combine my love for the arts with my engineering skills, while building a creative community of collaborators. Contributors to Time Drop include Kasey Boekholt, Sonia Aggarwal, Wesley MacHardy, and Scott MacHardy.
Time Drop is an immersive art experience that lets users feel like they can control the rate and direction of time. The experience is housed inside a 15’ diameter hexagonal room with mirror-clad walls that reflect the interior lighting and décor, creating a fractalized expanse. Inside the room, music plays and 20 emitters pour water from the ceiling. All this adds up to create a feeling of being inside an infinite rainstorm.
In the center of the room, there is a single Time Control Knob that acts as the main point of user interaction. By turning the knob, users can manipulate the playback speed of music, and the rate at which the drops of water appear to be falling. These two effects are synchronized to give users the feeling that they can control the rate and direction of time.
The illusion of water falling in slow motion, or even rising through the air, is created by aliasing between the rate of water-drop formation and the frequency of the strobing light. The rate of drop formation is precisely controlled by oscillating pumps, which provide small bursts of pressure at 60 Hz. By strobing all the interior lights (faster than can be seen with the naked eye), we create perceived motion that depends on the difference between the strobe frequency and 60 Hz; at exactly 60 Hz, all the drops appear frozen in midair. The playback rate of the audio and the strobe are synchronized to create a multisensory illusion in which time itself seems to bend to the will of the user.
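The aliasing math can be made concrete with a short sketch. Drops forming at 60 Hz and falling at speed v are spaced v/60 apart; between flashes the whole pattern advances v/f_strobe, and the eye interprets only the wrapped remainder of that displacement. This is my own simplified model of the effect, ignoring gravity's acceleration of the drops over the visible span:

```python
def apparent_drop_velocity(true_velocity, strobe_hz, drop_hz=60.0):
    """Apparent velocity of a periodic drop stream under strobed light.

    Successive drops are spaced (true_velocity / drop_hz) apart. Between
    flashes the pattern advances (true_velocity / strobe_hz); the eye sees
    the smallest equivalent displacement, wrapped to (-spacing/2, spacing/2].
    """
    spacing = true_velocity / drop_hz
    step = true_velocity / strobe_hz
    wrapped = (step + spacing / 2) % spacing - spacing / 2
    return wrapped * strobe_hz
```

At exactly 60 Hz the wrapped displacement is zero, so the drops freeze; strobing slightly faster than 60 Hz makes the apparent velocity negative, so the rain appears to rise, and slightly slower makes it fall in slow motion.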
Time Drop’s systems can be broken down into three main subgroups: lighting/audio, water/mechatronics, and structure/décor.
LIGHTING/AUDIO: The program that controls the lighting and audio was written in Max/MSP, a visual programming language used primarily for audio and multimedia. This program receives MIDI signals from the main Time Control Knob, handles the time-stretching algorithm, controls audio playback, and outputs a PWM signal for triggering the lights. The PWM signal is fed into a custom power supply that amplifies the strobing signal to voltages appropriate for the connected LED strands. To achieve a clean stroboscopic effect, the pulses of light need to be very short, so the PWM signal is run at a very low duty cycle of 0.5%. With all the interior lights strobing, the inside of Time Drop is essentially a giant stroboscope. The electronics for the lighting are fairly simple, consisting primarily of variable-voltage (0-48 V) switched power supplies, IRF540 N-channel MOSFETs, and low-voltage (5-12 V) waterproof LED strands. The PWM signal generated by the Max/MSP program is delivered via an external sound card to a small audio amplifier and filter. After the signal is slightly boosted and filtered for noise, it is delivered to the gates of the N-channel MOSFETs. When triggered, the source/drain pins of each MOSFET complete the circuit between the LEDs and the various ground and power buses. Multiple power buses accommodate the use of different LED strands; strands of similar length are connected in parallel to a dedicated power bus/supply.
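The 0.5% duty cycle implies very short flashes, which is what keeps each drop from smearing across the retina. A quick back-of-the-envelope calculation:

```python
def strobe_pulse_width_us(strobe_hz, duty=0.005):
    """On-time per flash, in microseconds, for a given strobe frequency
    and duty cycle (0.5% by default, as used in Time Drop)."""
    period_s = 1.0 / strobe_hz
    return duty * period_s * 1e6
```

At 60 Hz and 0.5% duty, each flash lasts roughly 83 microseconds, during which a drop falling at a few meters per second moves only a fraction of a millimeter, so it reads as a sharp, frozen point of light.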
WATER/MECHATRONICS: The water pouring from the ceiling is delivered by 11 Gorman-Rupp oscillating pumps, which are housed in wooden enclosures on the exterior of the structure. The water is piped into the room with vinyl tubing and released through custom-made laminar-flow nozzles. Creating consistent, well-formed droplets was by far the most complicated part of the entire project. Developing our water emitters required extensive testing and design iteration, and the final design was as much a function of cost as of our desired design parameters. The design we settled on uses a set of ball valves actuated by servos, with which we can manually adjust the water flow rate at each nozzle. After the ball-valve throttle, the water moves into nozzles designed to make the flow more laminar. Each nozzle consists of a 3D-printed taper section leading into a ¼" ID PVC tube stuffed with partitioned coffee stir straws. The stir straws partition the flow so that any vortices present are broken down to a size that has relatively little effect on the overall flow. The water pouring from the nozzles is collected in a system of interconnected buckets along the floor, which drain to two main water-collector buckets outside the structure. Inside the main collector buckets are sump pumps that move the water to a 55-gallon holding tank. The holding tank then supplies the oscillating pumps, creating a closed-loop system.
Each water line has three servos: two for throttles at the droplet nozzles, and one inline with the vinyl tubing, used to purge any air in the water line and help balance the pressure. The servos are adjusted from a central control panel that also handles turning on and off all the mechanical functions of Time Drop. The control panel primarily consists of three Arduino Mega boards and four rotary encoders/buttons. In all, the panel controls twenty-nine servos and fifteen relays. The program running on the boards is written in the Arduino language (essentially C++) and uses an object-oriented design.
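The core object-oriented idea is simple: each servo is wrapped in a channel object that an encoder can nudge within safe limits. The actual firmware is Arduino C++; this Python sketch only illustrates the pattern, and the class, pin, and limit values are made up:

```python
class ServoChannel:
    """One adjustable valve servo on the control panel (illustrative)."""

    def __init__(self, pin, min_angle=0, max_angle=180):
        self.pin = pin
        self.min_angle = min_angle
        self.max_angle = max_angle
        # Start centered in the allowed range
        self.angle = (min_angle + max_angle) // 2

    def adjust(self, delta):
        """Nudge the servo by an encoder step, clamped to its range,
        and return the new commanded angle."""
        self.angle = max(self.min_angle,
                         min(self.max_angle, self.angle + delta))
        return self.angle
```

With twenty-nine servos spread across three boards, keeping each one behind an object like this makes it easy to give every channel its own limits and to route the four encoders to whichever channel is currently selected.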
STRUCTURE/DECOR: Time Drop is housed in a 15' diameter wood-framed hexagon. It has an A-frame roof made of wood and corrugated plastic. Toward the center of the back wall, there is a raised hexagonal platform with a podium for the Time Control Knob, where people can experience the piece. Off the platform, the floor is covered with live plants that obscure the water-collection buckets and create the feeling of a garden oasis. Scattered among the plants are lighting fixtures and glass-blown mirrored sculptures (generously donated to the project by glass artist Alexander Sarkis Abajian). The ceiling is formed by numerous strands of waterproof LEDs, which create a star-like topography of lighting. We consciously worked on giving the interior space a loose, natural feel. We wanted to create an environment that felt like a plausible place to experience the rain. All of the interior decoration is multiplied by the fact that the interior walls are mirrored. The seemingly infinite reflections turn our small room into a vast expanse.
No “slow motion” video effects were used to produce this video. The effect of slowing and reversing time is part of the art piece, and can be experienced in person.
This knob lets users feel like they can control the rate and direction of time. (Housed inside the podium below it is a wireless controller that sends MIDI to the audio/lighting program - this is pictured below.)
A quick caveat about the videos posted here for Time Drop: we haven’t completed our own video documentation yet, but it’s coming! In order to properly film the effect, we need to use a “global shutter” camera, which is quite costly (even to rent). For now, the videos posted here were taken by individuals as they experienced the piece. While some of the videos look cool, they were all filmed on smartphones and have distortions caused by aliasing with the cameras’ “rolling shutters.” These videos are no substitute for the in-person experience.
This controls the lighting and audio.
The PWM signal from the Max program/sound card triggers the gates of the N-channel MOSFETs, which then deliver power from the larger variable-voltage supplies to the strands of LED lights.
This lives inside the podium underneath the Time Control Knob, and is hidden from users.
VertiCal is a suction-based climbing robot. Using three servos and two vacuum pumps as actuators, the robot moves to any location on a two-dimensional smooth surface.
I worked on the VertiCal robot as part of a five person team in a mechatronics design competition at UC Berkeley. The project won first prize in the competition, and all five members of our team were awarded the Tan Scholarship.
Our team was highly collaborative, and most design ideas were developed collectively. Working with the team, I had the opportunity to contribute to the design of the robot’s electrical and mechanical systems, as well as modeling the robot’s dynamics and sourcing parts. My main contribution was leading the development and implementation of the robot’s software/control system.
The robot’s movement is provided by three 20 kg·cm servos, and its suction by two vacuum pumps. The negative pressure is routed to or away from the cups using relay switches and three-way solenoid valves. The main chassis of the robot was machined from 6061 aluminum, and the cups were 3D-printed in PLA. The robot is controlled by a National Instruments myRIO-1900 embedded controller. Its software was developed in LabVIEW using a state-machine architecture, and it can be remotely controlled through a GUI on a tablet or laptop.
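The state-machine pattern behind the LabVIEW software can be sketched as a simple transition table: in each state, the machine waits for an event (a command, a pressure reading, a servo reaching position) and then moves to the next state. The real implementation is in LabVIEW, and the state and event names below are illustrative, not the robot's actual gait:

```python
# (state, event) -> next state; anything unlisted leaves the state unchanged.
TRANSITIONS = {
    ("IDLE", "move_cmd"): "RELEASE_CUP",
    ("RELEASE_CUP", "cup_free"): "STEP",
    ("STEP", "step_done"): "ATTACH_CUP",
    ("ATTACH_CUP", "cup_sealed"): "IDLE",
}

def next_state(state, event):
    """Advance the machine one transition; unknown events are ignored."""
    return TRANSITIONS.get((state, event), state)
```

Structuring the control loop this way makes it straightforward to guarantee that a suction cup is never released until the other cup is sealed, which is essential on a vertical surface.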
In the image above, the GUI is pictured top left, the robot is on the right, and the control box is below.
We originally designed the robot with all the components on-board. However, when sourcing the parts, we realized they would cost more than a group of engineering students could afford. Because of this, we decided to off-board the larger components to a control box tethered to the robot.
In the summer of 2018, I interned with suitX, a robotic exoskeleton company. The medical division of suitX makes wearable robotic devices designed to assist people with mobility issues. At suitX, I worked on redesigning a support arm used to attach their PHOENIX Medical Exoskeleton to a range of commercially available medical walkers. The support arm was originally designed as part of a PhD thesis by Nicholas Errico. While medical exoskeletons can give paraplegics and others with limited mobility the ability to walk, most of them still require some type of balance assistance, and even with crutches, some users can still be at risk of falling.
The idea behind the support arm was to give novice users of the exoskeletons the ability to practice with minimal supervision, while greatly reducing the risk of dangerous falls. This could be particularly useful for physical therapists working with more than one client at a time. In addition to fall prevention, the support arm was also designed to aid controlled locomotion by restricting movement in the sagittal plane while allowing free movement in the frontal and transverse planes.
As is the case with many first versions of a product, there were a number of limitations and unforeseen points of failure in the design when I received it. As part of my internship, I was asked to redesign the product. My redesign was extensive. I modeled the dynamics of a fall event to find what stresses would be placed on the support arm. I then did extensive Finite Element Analysis (FEA) to make sure components and assemblies could handle the stresses. By building out areas of high stress concentration and reducing material in areas of low stress concentration, I was able to increase the strength of the support arm while reducing its weight. I redesigned the height and angle adjustment mechanisms and designed a quick-connect system to make the support arm easier to use. I also developed an adjustable system for attaching various other products to the spine of the exoskeleton.
The support arm connects the cross beam of the walker to the main spinal support of the exoskeleton.
(The chassis of the quick connect mechanism is transparent in this shot.)
The angle adjustment mechanism (left) and cart attachment mechanism (right) are pictured here.
The four-bar mechanism allows movement in the transverse and frontal planes while restricting movement in the sagittal plane. It also has a height adjustment lever, which can be seen in the lower part of this picture.
This animation was produced as part of a final project for an Advanced Engineering Design Graphics class at UC Berkeley. Working in a three-person team, we modeled a classic Kodak Pageant AV-126-TR 16mm film projector. To make our CAD model, we disassembled a Kodak Pageant and painstakingly measured and modeled each individual component. The modeling was done in Creo Parametric (formerly known as Pro/ENGINEER), and the animation was done in 3ds Max.
This project helped develop my modeling skills, and it was also great practice in managing a large CAD project working with multiple other engineers. Additionally, it served as an incredibly interesting study into a complex mechanical system, from a time when mechanical systems were king. Though the digital revolution has allowed us to make countless systems more effective and efficient, there are still many brilliant lessons to be learned from these old relics.
I’m an engineer and artist currently living in Albuquerque, NM, where I am building an 8,000-square-foot immersive art space in downtown ABQ called Cafe Entropy. I'm obsessed with the intersection of art, science, and engineering. I love to create: whether it’s producing music, building immersive art experiences, programming, or making mechatronic devices, I love the process of bringing something new into the world. I am also passionate about learning. We live in such an interesting world that sometimes I find myself overwhelmed by all the things I want to learn more about. Here are a few subjects I find particularly interesting: thermodynamics, electrochemistry, energy storage, quantum mechanics, control systems, carbon capture, system dynamics modeling, ecology, embedded systems, machine learning, electrical engineering, and applied math. For me, the processes of learning and creating go hand in hand. I love to dream up ambitious ideas, and then learn what I need to know to make them happen.