Open Source Skid Steer Requirements + Value Proposition
Specification
Video discussion of the specification - [1]
From Avinash Baskaran, Ph.D., OSHWA Postdoctoral Fellow
Based on our initial conversation, I identified broad steps for a full-stack, open-source grading autonomy system. The stack is scoped within ROS and a SLAM (Simultaneous Localization and Mapping) framework; SLAM was a major focus of my PhD:
- Drone-based multiview reconstruction using ROS + openMVG/MVS for topographical mapping
- Open hardware rotary laser level + machine-mounted receiver for fine grading
- Real-time toolpath planning using RViz + Open3D + RTAB-Map (see the sketch after this list)
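As a rough illustration of the toolpath-planning step, the sketch below rasterizes a reconstructed terrain point cloud into a height grid and computes a cut/fill map against a target grade using Open3D and NumPy. This is a minimal sketch, not the actual implementation: the file name, cell size, and target elevation are assumptions, and in the real stack the cloud would come from the drone reconstruction or RTAB-Map, with results displayed in RViz.

```python
# Hypothetical sketch: terrain point cloud -> height grid -> cut/fill map.
# Assumptions: "site_reconstruction.ply" exists, 0.25 m grid cells, and the
# target grade is the median site elevation. Not part of the original spec.
import numpy as np
import open3d as o3d

def height_grid(points, cell=0.25):
    """Rasterize XYZ points into a max-height grid with the given cell size (m)."""
    xy = points[:, :2]
    z = points[:, 2]
    mins = xy.min(axis=0)
    idx = np.floor((xy - mins) / cell).astype(int)
    grid = np.full(idx.max(axis=0) + 1, np.nan)
    for (i, j), h in zip(idx, z):
        if np.isnan(grid[i, j]) or h > grid[i, j]:
            grid[i, j] = h
    return grid

def cut_fill(grid, target_z, cell=0.25):
    """Per-cell cut (+) / fill (-) depth and total volumes in cubic metres."""
    delta = grid - target_z                          # positive = material to remove
    cut = np.nansum(np.clip(delta, 0, None)) * cell ** 2
    fill = -np.nansum(np.clip(delta, None, 0)) * cell ** 2
    return delta, cut, fill

if __name__ == "__main__":
    pcd = o3d.io.read_point_cloud("site_reconstruction.ply")   # assumed file name
    grid = height_grid(np.asarray(pcd.points))
    delta, cut_vol, fill_vol = cut_fill(grid, target_z=np.nanmedian(grid))
    print(f"cut: {cut_vol:.1f} m^3, fill: {fill_vol:.1f} m^3")
```

The cut/fill map (`delta`) is the natural input to a toolpath planner: passes can be ordered over cells with the largest positive values first, and the same grid can be re-published for visualization as the site is regraded.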
I plan to use the BeagleBone AI-64, an open-source platform, to control the skid steer; is this acceptable? Step 1 above (drone-based reconstruction) is optional when elevation changes are minor compared to the receiver height, but I think it's a valuable addition for enabling full autonomy. I've also received a high-fidelity SparkFun RTK GPS module from Nathan Seidle, which I'll use to validate and calibrate the localization. Overall, this approach balances cost, industry-aligned best practices, and high ROI. It also generalizes to applications at PARC, like autonomous crop and livestock management and hydrology.
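To make the RTK validation step concrete, here is a minimal Python sketch for collecting RTK-fixed positions from the receiver over serial and scoring a logged SLAM trajectory against them. It is an illustration under stated assumptions, not the project's implementation: the serial port, baud rate, number of fixes, and the assumption that SLAM poses are already expressed in a local ENU frame anchored at the first RTK fix are all hypothetical choices.

```python
# Hypothetical sketch: validate SLAM localization against RTK GPS ground truth.
# Assumes the RTK receiver streams NMEA over /dev/ttyUSB0 at 115200 baud and
# that SLAM poses are available as (x, y) points in the same local frame.
import math
import serial          # pyserial
import pynmea2

EARTH_RADIUS_M = 6378137.0

def lla_to_local_xy(lat, lon, lat0, lon0):
    """Equirectangular approximation: adequate over a small grading site."""
    x = math.radians(lon - lon0) * EARTH_RADIUS_M * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_RADIUS_M
    return x, y

def collect_rtk_fixes(port="/dev/ttyUSB0", baud=115200, n_fixes=100):
    """Read n_fixes RTK-fixed GGA sentences and return local-frame (x, y) points."""
    fixes, origin = [], None
    with serial.Serial(port, baud, timeout=2) as ser:
        while len(fixes) < n_fixes:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            if not line.startswith("$") or "GGA" not in line:
                continue
            try:
                msg = pynmea2.parse(line)
                if msg.gps_qual < 4:        # 4 = RTK fixed, 5 = RTK float
                    continue
            except (pynmea2.ParseError, ValueError, AttributeError):
                continue
            if origin is None:
                origin = (msg.latitude, msg.longitude)
            fixes.append(lla_to_local_xy(msg.latitude, msg.longitude, *origin))
    return fixes

def rmse(est_xy, ref_xy):
    """Root-mean-square position error between matched estimate/reference pairs."""
    errs = [(ex - rx) ** 2 + (ey - ry) ** 2
            for (ex, ey), (rx, ry) in zip(est_xy, ref_xy)]
    return math.sqrt(sum(errs) / len(errs))
```

With time-aligned SLAM poses, `rmse(slam_xy, rtk_xy)` gives a single calibration figure of merit; the same RTK fixes can also anchor the map frame to geodetic coordinates.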
Looking forward to sharing and discussing initial results soon.
Details
Wheels
- We will build two versions: an industry-standard version with skid steer wheels, and a low-cost version with 3D-printed airless wheels made from TPU for lifetime durability
- Budget: shredder + filament maker.
- TPU Regrind
Further Details
This is the RTK GPS payload: https://www.sparkfun.com/sparkfun-rtk-facet.html - it comes prepped to mount, with UI, connectivity, etc.
Yes, I'll package an open-source drone with self-charging and a universal joystick controller shared between the drone and the skid steer. I agree that a headless skid steer would be ideal; it would reduce cost and improve cross-functionality. I would also make it cab-compatible, with an optional cab module that can receive the joystick much like a steering wheel.
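For the shared joystick, the sketch below shows one possible mapping from a single stick's forward/turn axes to left/right wheel speeds for the headless skid steer (standard differential mixing). The axis convention, top speed, and deadband are assumptions rather than part of the spec; the drone would use a separate mode on the same controller.

```python
# Hypothetical sketch: map joystick forward/turn axes to skid-steer wheel speeds.
# Axis convention, max speed, and deadband are assumed values.
def skid_steer_mix(forward, turn, max_speed_mps=1.5, deadband=0.05):
    """Differential ("tank") mixing: forward/turn in [-1, 1] -> (left, right) m/s."""
    if abs(forward) < deadband:
        forward = 0.0
    if abs(turn) < deadband:
        turn = 0.0
    left = forward + turn
    right = forward - turn
    # Normalize so neither side exceeds full scale while preserving the turn ratio.
    scale = max(1.0, abs(left), abs(right))
    return (left / scale * max_speed_mps,
            right / scale * max_speed_mps)

# Example: half throttle with a gentle right turn.
print(skid_steer_mix(0.5, 0.2))   # -> (1.05, 0.45) m/s: left side faster, turning right
```

The same function could run on the BeagleBone regardless of whether the joystick input arrives from the handheld controller or from the optional cab module, which keeps the headless and cab-equipped configurations interchangeable.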