![convergence.png](https://static.wixstatic.com/media/f689e4_ad0f56cc520d459a923a8567f8eb94cf~mv2.png/v1/crop/x_0,y_643,w_7680,h_3230/fill/w_844,h_355,al_c,q_85,usm_0.66_1.00_0.01,enc_auto/convergence.png)
We are what hasn't happened yet; we are the engine and
the engineer of everything immanent.
-Quixitl Mexer
DSP Capstone
Fig 1: Overdramatic demo video
For EE434: Digital Signal Processing Design Laboratory at USC, I designed, 3D-printed, and programmed a mechanical limb that tracks and mimics the user's hand movements using computer vision and inverse kinematics.
The limb itself was designed to be maximally modular: six components were modeled in CAD software and outfitted with friction-fit connectors for convenient, non-destructive rearrangement throughout the design process.
![image.png](https://static.wixstatic.com/media/f689e4_4ea6fb32622c4c87aa0b8ce7638f6bf8~mv2.png/v1/fill/w_88,h_54,al_c,q_85,usm_0.66_1.00_0.01,blur_2,enc_auto/f689e4_4ea6fb32622c4c87aa0b8ce7638f6bf8~mv2.png)
Fig 2: Components' CAD models
![ezgif-2-c224c1af8d.gif](https://static.wixstatic.com/media/f689e4_1d59ba16c3394a63aaeea3f57778d554~mv2.gif)
Fig 3: 2D IK simulation transmitting to limb (controlled via mouse movement for the demo)
A 2D visualization of the relevant inverse kinematics equations was then created in Python and made to transmit its calculated joint angles over serial; an Arduino was programmed to receive these angles and actuate the limb's motors.
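For a sense of what that pipeline looks like, here's a minimal sketch of 2-link planar IK with the solved angles forwarded over serial. The link lengths, port name, and line-based message format are placeholders for illustration, not the project's actual values:

```python
import math
import serial  # pyserial

L1, L2 = 100.0, 100.0  # link lengths (same units as the target x, y)
arduino = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)

def solve_ik(x, y):
    """Return (shoulder, elbow) angles in radians for end-effector target (x, y)."""
    d2 = x * x + y * y
    # Law of cosines for the elbow; clamp to [-1, 1] to absorb float error.
    cos_elbow = max(-1.0, min(1.0, (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)))
    elbow = math.acos(cos_elbow)  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

def send_angles(shoulder, elbow):
    # One comma-separated line per update, in degrees -- a stand-in protocol.
    arduino.write(f"{math.degrees(shoulder):.1f},{math.degrees(elbow):.1f}\n".encode())

shoulder, elbow = solve_ik(120.0, 80.0)
send_angles(shoulder, elbow)
```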
Finally, a script was written to derive the end-effector's translation from the movement of a human hand via computer vision. The translation was sent to the 2D visualization over TCP, which in turn passed its computed angles along to the limb's Arduino -- and voila.
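As a rough sketch of the hand-tracking side, the snippet below derives the wrist's frame-to-frame translation with MediaPipe Hands and ships it over TCP. The host, port, and plain-text message format are assumptions for illustration:

```python
import socket
import cv2
import mediapipe as mp

sock = socket.create_connection(("127.0.0.1", 5005))  # the visualization's listener
hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)
prev = None

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        wrist = results.multi_hand_landmarks[0].landmark[0]  # landmark 0 = wrist
        if prev is not None:
            dx, dy = wrist.x - prev[0], wrist.y - prev[1]  # normalized image units
            sock.sendall(f"{dx:.4f},{dy:.4f}\n".encode())
        prev = (wrist.x, wrist.y)

cap.release()
sock.close()
```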
The relevant code can be found on my GitHub.
Crani-Arm
With Makers, an engineering club at USC, my team and I trained an LSTM to infer hand and finger positions from sEMG sensor data; we then designed and 3D-printed a mechanical hand that assumes poses in response to the user's hand movements -- more specifically, by playing rock-paper-scissors. The mechanical hand could also be driven by computer vision instead of sEMG sensors.
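A minimal sketch of this kind of sEMG gesture classifier, in PyTorch. The channel count, window length, and hidden size are illustrative assumptions; the actual architecture lives in the repo:

```python
import torch
import torch.nn as nn

class GestureLSTM(nn.Module):
    def __init__(self, n_channels=8, hidden=64, n_classes=3):  # rock/paper/scissors
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):          # x: (batch, time, channels)
        _, (h, _) = self.lstm(x)   # h: (num_layers, batch, hidden)
        return self.head(h[-1])    # logits over gestures

model = GestureLSTM()
window = torch.randn(1, 200, 8)        # one 200-sample window of 8-channel sEMG
gesture = model(window).argmax(dim=1)  # predicted class index
```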
The mechanical hand used synthetic tendons to actuate each finger individually while maintaining a relatively compact form factor.
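In practice, that means each predicted gesture maps to a set of per-finger tendon positions. A sketch of that mapping, sent to the hand's controller over serial; the gesture set, servo angles, and message format are placeholders:

```python
import serial  # pyserial

CURL, OPEN = 180, 0  # servo angles that fully pull / fully release a tendon
POSES = {            # thumb, index, middle, ring, pinky
    "rock":     [CURL, CURL, CURL, CURL, CURL],
    "paper":    [OPEN, OPEN, OPEN, OPEN, OPEN],
    "scissors": [CURL, OPEN, OPEN, CURL, CURL],
}

hand = serial.Serial("/dev/ttyACM0", 115200, timeout=1)

def set_pose(gesture):
    angles = POSES[gesture]
    hand.write((",".join(map(str, angles)) + "\n").encode())

set_pose("scissors")
```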
The relevant code can be found on my GitHub.
![ezgif-1-0a1baf37a6.gif](https://static.wixstatic.com/media/f689e4_7716a78a29d1429fa744215b756fc58b~mv2.gif)
Fig 1: Demonstration of a mechanical claw actuating in response to sEMG sensor data.
![ezgif-7-3551d1fda1.gif](https://static.wixstatic.com/media/f689e4_467d6d4100ce44169d5caf3d7f6159d1~mv2.gif)
Fig 2: Mechanical hand and myself playing rock-paper-scissors (w/ the hand using CV).
Zooming Kitties
With Makers, an engineering club at USC, my team and I built a remote-controlled cat carrier capable of tracking and following AprilTags; the carrier's perspective was also streamed to a VR headset for first-person piloting.
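A hedged sketch of the tag-following loop, using OpenCV's ArUco module (which ships AprilTag dictionaries; the ArucoDetector API assumes opencv-contrib-python 4.7+). The steering logic and dead-band are illustrative, not the carrier's actual controller:

```python
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = detector.detectMarkers(
        cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if ids is not None:
        cx = corners[0][0][:, 0].mean()  # tag center, x (pixels)
        error = cx - frame.shape[1] / 2  # horizontal offset from image center
        if abs(error) > 20:              # dead-band before turning
            command = "RIGHT" if error > 0 else "LEFT"
        else:
            command = "FORWARD"
        # send `command` to the drive motors (e.g., over serial) here
```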
The relevant code can be found on my GitHub.
![59025ee3-0f35-4f8a-9819-f3be8721b74c_edi](https://static.wixstatic.com/media/f689e4_2d18a663b0fc4f06b04fccd593d1aa37~mv2.jpg/v1/crop/x_125,y_304,w_612,h_512/fill/w_87,h_73,al_c,q_80,usm_0.66_1.00_0.01,blur_2,enc_auto/59025ee3-0f35-4f8a-9819-f3be8721b74c_edi.jpg)
Fig 1: The cat carrier in question
![ezgif-3-924fe97752.gif](https://static.wixstatic.com/media/f689e4_38f3ce66ccbe4b11b0ca9b04caf256af~mv2.gif)
Fig 2: AprilTag tracking demo
![ezgif-3-cbd0f33770.gif](https://static.wixstatic.com/media/f689e4_2400f8391a994f25a0400fd5cca4a9ed~mv2.gif)