Open Source Skid Steer Requirements + Value Proposition

From Open Source Ecology
Revision as of 20:28, 8 August 2025 by Marcin (talk | contribs) (→‎Further Details)

Specification

Video discussion of the specification - [1]

From Avinash Baskaran, Ph.D., OSHWA Postdoctoral Fellow

Based on our initial conversation, I identified broad steps for a full-stack, open-source grading autonomy system. The stack is scoped within ROS and the SLAM framework (Simultaneous Localization and Mapping, which was a major focus of my PhD):

  1. Drone-based multiview reconstruction using ROS + openMVG/openMVS for topographical mapping
  2. Open hardware rotary laser level + machine-mounted receiver for fine grading
  3. Real-time toolpath planning using RViz + Open3D + RTAB-Map
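As a minimal sketch of what the grading planner downstream of steps 1-3 must compute, the snippet below estimates cut and fill volumes from a reconstructed heightmap against a target grade surface. The function name, grid representation, and cell size are illustrative assumptions, not part of the actual stack:

```python
def cut_fill_volumes(heightmap, target, cell_area_m2):
    """Estimate cut and fill volumes (m^3) needed to bring a terrain
    heightmap (elevations in m, row-major grid) to a target grade
    surface of the same shape. Sketch only: real toolpath planning
    also needs machine kinematics, blade width, and pass ordering."""
    cut = 0.0   # material above grade, to be removed
    fill = 0.0  # material below grade, to be added
    for row_h, row_t in zip(heightmap, target):
        for h, t in zip(row_h, row_t):
            d = h - t
            if d > 0:
                cut += d * cell_area_m2
            else:
                fill += -d * cell_area_m2
    return cut, fill

# Example: flatten a 2x2 site (1 m^2 cells) to elevation 1.0 m
cut, fill = cut_fill_volumes([[1.2, 1.0], [0.8, 1.0]],
                             [[1.0, 1.0], [1.0, 1.0]], 1.0)
# cut and fill are both approximately 0.2 m^3, so the site balances
```

A balanced cut/fill total is the usual optimization target for grading, since hauling material on or off site dominates cost.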

I plan to use the BeagleBone AI-64, an open-source hardware platform, to control the skid steer; is this acceptable? Step 1 above is optional when elevation changes are minor relative to the receiver height, but I think it's a valuable addition for enabling full autonomy. I've also received a high-fidelity SparkFun RTK GPS module from Nathan Seidle, which I'll use to validate and calibrate the localization. Overall, this approach balances cost, industry-aligned best practices, and high ROI. It also generalizes to applications at PARC, such as autonomous crop and livestock management and hydrology.
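Validating the SLAM localization against RTK fixes typically reduces to an error metric over time-aligned position pairs. A minimal sketch, assuming both trajectories have already been synchronized and expressed as (x, y) tuples in a shared local frame (the function name and data layout are illustrative):

```python
import math

def localization_rmse(estimated, reference):
    """Root-mean-square planar error (m) between SLAM pose estimates
    and time-aligned RTK reference fixes. Both inputs are sequences
    of (x, y) tuples in the same local frame, same length."""
    if len(estimated) != len(reference) or not estimated:
        raise ValueError("need equal-length, non-empty trajectories")
    sq_errors = [(ex - rx) ** 2 + (ey - ry) ** 2
                 for (ex, ey), (rx, ry) in zip(estimated, reference)]
    return math.sqrt(sum(sq_errors) / len(sq_errors))

# SLAM estimates vs. RTK truth: two samples, each 0.1 m off in y
rmse = localization_rmse([(0.0, 0.0), (1.0, 0.0)],
                         [(0.0, 0.1), (1.0, -0.1)])
# rmse is approximately 0.1 m
```

With an RTK fix accurate to roughly centimeter level, the RTK track can serve as ground truth for calibrating the SLAM estimate.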

Looking forward to sharing and discussing initial results soon.

Further Details

This is the RTK GPS payload: https://www.sparkfun.com/sparkfun-rtk-facet.html - It comes ready to mount, with UI, connectivity, etc.
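RTK receivers like this one stream standard NMEA sentences, and the GGA sentence carries the fix-quality field that distinguishes an RTK-fixed solution from a float or plain GNSS fix. A small sketch of checking that field (the function name is illustrative; the field positions follow the NMEA 0183 GGA layout):

```python
def gga_fix_quality(sentence):
    """Return the fix-quality field of an NMEA GGA sentence:
    0 = no fix, 1 = standard GNSS, 2 = DGPS,
    4 = RTK fixed (cm-level), 5 = RTK float."""
    fields = sentence.split(',')
    # fields[0] is the talker + sentence ID, e.g. "$GNGGA"
    if not fields[0].endswith('GGA'):
        raise ValueError("not a GGA sentence")
    return int(fields[6])

# A GGA sentence reporting an RTK-fixed solution (quality field = 4)
q = gga_fix_quality(
    "$GNGGA,123519.00,4807.038,N,01131.000,E,4,12,0.9,545.4,M,46.9,M,1.0,0000*47")
# q == 4, i.e. RTK fixed
```

Gating the autonomy stack on quality 4 (RTK fixed) would ensure grading decisions only use centimeter-level positions.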

Yes, I'll package an open-source drone with self-charging and a universal controller joystick shared between the drone and the skid steer. I agree that a headless skid steer would be ideal; it would reduce cost and improve cross-functionality. I would also make it cab-compatible, with an optional cab module that accepts the joystick the way it would a steering wheel.
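For the shared joystick, the skid-steer side reduces to standard differential ("tank") mixing: one axis for throttle, one for turn, mapped to left/right track commands. A minimal sketch under that assumption (function name and normalization scheme are illustrative):

```python
def skid_steer_mix(throttle, turn, limit=1.0):
    """Map joystick axes (throttle: forward/back, turn: left/right,
    both in [-1, 1]) to (left, right) track commands for a skid steer,
    normalized so neither track command exceeds the given limit."""
    left = throttle + turn
    right = throttle - turn
    scale = max(abs(left), abs(right), limit)
    return left * limit / scale, right * limit / scale

# Full forward with a slight right turn: left track leads, right lags
left, right = skid_steer_mix(1.0, 0.3)
# left == 1.0; right is reduced proportionally (0.7 / 1.3)
```

The same joystick axes can map directly to the drone's pitch/yaw channels, which is what makes a single universal controller practical across both machines.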