
DSP Capstone

Fig 1: Overdramatic demo video

For EE434: Digital Signal Processing Design Laboratory at USC, I designed, 3D-printed, and programmed a mechanical limb to track and mimic the user's hand movements using computer vision and inverse kinematics.

The limb itself was designed to be maximally modular: six components were modeled in CAD software and outfitted with friction-fit connectors for convenient, non-destructive rearrangement throughout the design process.


Fig 2: Components' CAD models


Fig 3: 2D IK simulation transmitting to limb (controlled via mouse-movement for demo)

A 2D visualization of the relevant inverse kinematics equations was then created in Python and made to transmit its calculated joint angles over serial; an Arduino was prepared to receive these angles and actuate the limb's motors.
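At its core, the 2D solver reduces to the classic two-link inverse kinematics solution. Below is a minimal sketch of that math and the serial hand-off; the link lengths, port name, and message format are all assumptions rather than the project's actual conventions:

```python
import math
import serial  # pyserial

L1, L2 = 10.0, 10.0  # hypothetical link lengths (cm)

def two_link_ik(x, y):
    """Return (shoulder, elbow) joint angles in radians for a reachable target (x, y)."""
    # Law of cosines gives the elbow angle; clamp for numerical safety.
    c2 = (x * x + y * y - L1**2 - L2**2) / (2 * L1 * L2)
    c2 = max(-1.0, min(1.0, c2))
    theta2 = math.acos(c2)  # elbow-down solution
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2

# Stream the angles (in degrees) to the Arduino as newline-terminated text.
with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:  # port name is an assumption
    t1, t2 = two_link_ik(12.0, 5.0)
    port.write(f"{math.degrees(t1):.1f},{math.degrees(t2):.1f}\n".encode())
```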

Finally, a script was written to derive the translation of the end-effector from the movement of a human hand via computer vision. The translation obtained could then be sent to the 2D visualization via TCP, which in turn passed its computed angles along to the limb's Arduino, and voilà.
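As a rough sketch of that pipeline (assuming MediaPipe for the hand landmarks and a simple "dx,dy" wire format, neither of which is confirmed by the project), the frame-to-frame translation of the wrist could be extracted and forwarded like so:

```python
import socket
import cv2
import mediapipe as mp

HOST, PORT = "127.0.0.1", 5005  # hypothetical address of the 2D visualization

sock = socket.create_connection((HOST, PORT))
hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)
prev = None

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        wrist = results.multi_hand_landmarks[0].landmark[0]  # normalized wrist coordinates
        if prev is not None:
            dx, dy = wrist.x - prev[0], wrist.y - prev[1]
            sock.sendall(f"{dx:.4f},{dy:.4f}\n".encode())  # frame-to-frame translation
        prev = (wrist.x, wrist.y)
```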

The relevant code can be found on my GitHub.

Crani-Arm

With Makers, an engineering club at USC, my team and I trained an LSTM to derive human hand and finger positions from sEMG sensor data; we then designed and 3D-printed a mechanical hand that assumes positions in response to the user's hand movements, demonstrated by playing rock-paper-scissors. The mechanical hand was also equipped to use computer vision in place of the sEMG sensors.
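A minimal sketch of such a classifier (in PyTorch, with the channel count, window length, and layer sizes all hypothetical) could look like:

```python
import torch
import torch.nn as nn

class EMGGestureLSTM(nn.Module):
    """Classify a window of multi-channel sEMG samples as rock, paper, or scissors."""
    def __init__(self, n_channels=8, hidden=64, n_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):            # x: (batch, time, channels)
        _, (h_n, _) = self.lstm(x)   # h_n: (num_layers, batch, hidden)
        return self.head(h_n[-1])    # logits over the three gestures

model = EMGGestureLSTM()
window = torch.randn(1, 200, 8)           # one 200-sample window of 8-channel sEMG
prediction = model(window).argmax(dim=1)  # predicted gesture index
```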

The mechanical hand utilized synthetic tendons to actuate each finger individually while maintaining a relatively compact form factor.
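In that scheme, each gesture boils down to a set of per-finger tendon positions. Here is a hypothetical sketch of the driving side; the serial protocol, angle values, and port name are all assumptions, not the project's real firmware interface:

```python
import serial  # pyserial

# Per-finger servo angles (degrees): 0 = finger extended, 180 = fully curled.
# Order: thumb, index, middle, ring, pinky.
GESTURES = {
    "rock":     [180, 180, 180, 180, 180],
    "paper":    [0, 0, 0, 0, 0],
    "scissors": [180, 0, 0, 180, 180],
}

def play(gesture, port):
    """Send one comma-separated line of servo angles to the hand's controller."""
    angles = GESTURES[gesture]
    port.write((",".join(map(str, angles)) + "\n").encode())

with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:  # port name is an assumption
    play("scissors", port)
```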

The relevant code can be found on my GitHub.


Fig 1: Demonstration of a mechanical claw actuating in response to sEMG sensor data.


Fig 2: The mechanical hand and me playing rock-paper-scissors (with the hand using CV).

Zooming Kitties

With Makers, an engineering club at USC, my team and I built a remote-controlled cat carrier capable of tracking and following AprilTags; the carrier's perspective was also streamed to a VR headset for first-person piloting.
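The tag-following loop itself can be sketched briefly (assuming the pupil-apriltags detector and a proportional steering rule, neither confirmed as the project's actual approach):

```python
import cv2
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detections = detector.detect(gray)
    if detections:
        tag = detections[0]
        # Steer proportionally to the tag's horizontal offset from frame center.
        error = tag.center[0] - frame.shape[1] / 2
        turn = 0.005 * error  # hypothetical gain
        print(f"tag {tag.tag_id}: turn command {turn:+.2f}")
```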

The relevant code can be found on my GitHub.


Fig 1: The cat carrier in question


Fig 2: AprilTag tracking demo


Fig 3: First-person view demo
