
How to Define AGV Drive Wheel Acceptance Criteria Before Sampling
Set measurable pass/fail standards before sample delivery to avoid subjective review and redesign loops.
Sampling delays often come from unclear acceptance rules, not from hardware alone.
If pass/fail thresholds are undefined, teams debate results instead of closing corrective actions.
1) Build a One-Page Acceptance Matrix
Before sample shipment, lock a matrix with three columns:
- Metric definition.
- Test method and condition.
- Pass/fail threshold.
No metric should exist without a declared condition.
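The three-column rule can be made machine-checkable. The sketch below is illustrative only; the row contents and field names are placeholders, not values from any real project:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AcceptanceRow:
    """One row of the acceptance matrix: no metric without a declared condition."""
    metric: str
    test_condition: str   # declared condition is mandatory
    threshold: str        # explicit pass/fail rule

matrix = [
    AcceptanceRow("Continuous wheel torque", "rated voltage, steady state", ">= rated value"),
    AcceptanceRow("Noise at 1 m", "fixed speed and load", "<= project dBA target"),
]

# A metric with an empty condition should never enter the matrix.
assert all(row.test_condition for row in matrix)
```

Storing the matrix as data rather than prose makes it easy to diff between revisions and to verify that every metric carries a condition before sample shipment.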
2) Recommended Baseline Metrics
Use these as a practical starting point, then adjust by load class and environment.
| Category | Metric | Suggested Baseline | Test Condition |
|---|---|---|---|
| Traction | Continuous wheel output torque | ≥ rated value at defined speed | Rated voltage, steady state |
| Traction | Peak wheel torque capability | ≥ 1.8x continuous for declared duration | Burst test with thermal logging |
| Thermal | Motor/gearbox temperature rise | Within agreed limit over ambient | Duty cycle simulation |
| NVH | Noise at 1 m distance | Within project dBA target | Fixed speed and load |
| NVH | Vibration at housing | Within RMS limit | Same fixture and sensor location |
| Precision | Radial runout | Within tolerance window | Dial indicator at wheel OD |
| Precision | Mounting interface accuracy | Within drawing tolerance | CMM or qualified fixture |
| Protection | IP compliance | Meets declared IP level | Defined ingress test method |
| Reliability | Endurance completion | No critical fault after target hours | Continuous cyclic run |
3) Example Threshold Pack (Reference Only)
A medium-load indoor AGV project may use a pack like:
- Continuous torque: ≥ 100% of rated at specified speed.
- Peak torque: ≥ 180% of continuous for 10 s.
- Temperature rise: ≤ 65 °C over ambient in steady state.
- Noise: ≤ 72 dBA at 1 m.
- Radial runout: ≤ 0.40 mm.
- Endurance: no functional failure within defined validation duration.
These are reference values. Final limits must be tied to your platform risk and safety case.
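Once limits are numeric, pass/fail evaluation is mechanical. A minimal sketch using the reference pack above (metric keys and the sample measurements are hypothetical):

```python
# Reference threshold pack: (comparison operator, limit).
THRESHOLDS = {
    "continuous_torque_pct_rated": (">=", 100.0),
    "peak_torque_pct_continuous": (">=", 180.0),
    "temp_rise_c": ("<=", 65.0),
    "noise_dba_1m": ("<=", 72.0),
    "radial_runout_mm": ("<=", 0.40),
}

def evaluate(measured: dict) -> dict:
    """Return pass/fail per metric; a missing measurement is an automatic fail."""
    results = {}
    for metric, (op, limit) in THRESHOLDS.items():
        value = measured.get(metric)
        if value is None:
            results[metric] = False
        elif op == ">=":
            results[metric] = value >= limit
        else:
            results[metric] = value <= limit
    return results

# Hypothetical sample report values.
sample = {
    "continuous_torque_pct_rated": 103.0,
    "peak_torque_pct_continuous": 185.0,
    "temp_rise_c": 58.2,
    "noise_dba_1m": 70.1,
    "radial_runout_mm": 0.32,
}
print(all(evaluate(sample).values()))  # True: all metrics pass
```

Treating an absent measurement as a fail enforces the matrix rule that every metric must be tested under its declared condition.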
4) Standardize the Test Setup
Most disputes come from inconsistent setup. Lock these items in writing:
- Payload fixture mass and center-of-gravity location.
- Floor/friction simulation method.
- Ramp profile and duty sequence.
- Ambient temperature range.
- Sensor type, sampling frequency, and calibration status.
- Data logging format and naming convention.
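One way to keep both sides honest about setup is to serialize the locked configuration and exchange a fingerprint of it. The field names and values below are illustrative assumptions, not a prescribed schema:

```python
import hashlib
import json

# Hypothetical locked test setup; fields mirror the checklist above.
setup = {
    "payload_mass_kg": 150.0,
    "cog_offset_mm": [0.0, 0.0, 120.0],
    "floor_simulation": "steel plate, dry",
    "ramp_profile": "trapezoidal, 0.5 m/s^2",
    "ambient_c_range": [15, 30],
    "sensor": {"type": "triaxial accelerometer", "sampling_hz": 10000, "calibrated": True},
    "log_format": "csv-v1",
}

# sort_keys makes the serialization deterministic, so the same setup
# always yields the same fingerprint on both buyer and supplier side.
setup_id = hashlib.sha256(json.dumps(setup, sort_keys=True).encode()).hexdigest()[:12]
print(setup_id)
```

If the fingerprint in the supplier's test report does not match the agreed one, the data was collected under a different setup and the retest discussion can start from facts rather than recollection.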
5) Severity and Disposition Rules
Define what happens when a metric fails.
| Severity | Definition | Typical Action |
|---|---|---|
| Critical | Safety or functional stop risk | Immediate stop, redesign required |
| Major | Performance outside agreed limit | Corrective action + retest |
| Minor | Cosmetic or low-risk deviation | Record and monitor |
Also assign owners and closure dates for every deviation.
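The severity table maps directly onto a small disposition routine. This is a sketch of the idea; the owner and date values are placeholders:

```python
# Disposition rules mirroring the severity table above.
DISPOSITION = {
    "critical": "immediate stop, redesign required",
    "major": "corrective action + retest",
    "minor": "record and monitor",
}

def disposition(severity: str, owner: str, closure_date: str) -> dict:
    """Every deviation gets an action, an owner, and a closure date."""
    if severity not in DISPOSITION:
        raise ValueError(f"unknown severity: {severity}")
    return {
        "action": DISPOSITION[severity],
        "owner": owner,
        "closure_date": closure_date,
    }

print(disposition("major", "supplier QE", "2025-07-01")["action"])
```

Rejecting unknown severities forces every deviation into one of the three agreed classes instead of an ad-hoc category that escapes the process.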
6) Buyer Checklist Before Accepting Sample
- Test report includes raw data, not only screenshots.
- Conditions match agreed matrix and mission profile.
- Any deviation has root cause and corrective action evidence.
- Retest scope is explicitly defined and completed.
- Approval status is clear: approved, conditional, or rejected.
7) Pilot Entry Gate (After Sample Pass)
A sample pass should unlock the pilot phase only when:
- Design baseline is frozen.
- Process controls for key dimensions are in place.
- Incoming and outgoing inspection criteria are aligned.
- Traceability method is confirmed for pilot lot.
Without these four gates, pilot data is usually not representative of SOP reality.
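The four gates reduce to a single all-or-nothing check. A minimal sketch with hypothetical flag values:

```python
# Hypothetical gate status; pilot unlocks only when all four are true.
gates = {
    "design_baseline_frozen": True,
    "process_controls_in_place": True,
    "inspection_criteria_aligned": True,
    "traceability_confirmed": False,
}

pilot_ready = all(gates.values())
blocked_on = [name for name, ok in gates.items() if not ok]
print(pilot_ready, blocked_on)
```

Listing the blocking gates by name gives the program team a concrete closure list instead of a vague "not ready" verdict.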
8) Practical Document Set You Should Request
From the supplier side, request:
- Acceptance matrix signed by both teams.
- Test procedure (step-by-step).
- Sample test report with raw datasets.
- Deviation log and closure evidence.
- Pilot readiness checklist.
If you need a pre-sample acceptance sheet, send your target load class and test conditions to [email protected]. Jimmy Su will provide a manually reviewed matrix template for your project.



