Available Steps
All steps available in the LibStp DSL, grouped by category. These are the building blocks you use to compose robot missions.
calibrate(distance_cm: float = 30.0, persist_to_yaml: bool = True, ema_alpha: float = DEFAULT_EMA_ALPHA, calibration_sets = None, exclude_ir_sensors = None) -> CalibrateDistance
Run a unified distance and IR sensor calibration.
This is the recommended all-in-one calibration entry point. It drives
the robot a known distance, prompts the user to measure the actual
distance traveled, then adjusts the per-wheel ticks_to_rad values
to correct odometry. After distance calibration, it automatically
calibrates IR sensors by sampling them during a drive over the
calibration surface(s).
Calibration values are persisted to raccoon.project.yml using an
exponential moving average (EMA), so the baseline converges toward the
true value over multiple calibration runs. The EMA formula is:
new_baseline = old_baseline * alpha + measured * (1 - alpha).
Prerequisites:
The robot must have drive motors with encoder feedback and
a configured kinematics model. For IR calibration, IR sensors
must be registered in robot.defs.analog_sensors.
Args:
distance_cm: Distance (in cm) the robot drives during calibration.
Default is 30 cm. Longer distances yield more accurate results.
persist_to_yaml: If True, write the EMA-filtered baseline
back to raccoon.project.yml so it persists across runs.
ema_alpha: EMA smoothing coefficient between 0.0 and 1.0. Higher
values produce slower convergence but a more stable baseline.
With the default of 0.7, approximately 83% of systematic error
is absorbed after 5 calibration runs.
calibration_sets: List of named IR calibration surface sets to
run (e.g. ["default", "transparent"]). Each set beyond
the first triggers an additional drive-and-sample cycle.
Defaults to ["default"].
exclude_ir_sensors: List of IRSensor instances to skip during
IR calibration (e.g. sensors not mounted near the ground).
Returns: CalibrateDistance: A step that performs the full calibration flow.
Example::
from libstp.step.calibration import calibrate
# Basic calibration with defaults (30cm drive, persist to YAML)
seq([
calibrate(),
drive_forward(100),
])
# Custom calibration: longer drive, two IR surface sets
calibrate(
distance_cm=50.0,
calibration_sets=["default", "transparent"],
ema_alpha=0.8,
)
calibrate_distance(distance_cm: float = 30.0, calibrate_light_sensors: bool = False, persist_to_yaml: bool = True, ema_alpha: float = DEFAULT_EMA_ALPHA, calibration_sets: Optional[List[str]] = None, exclude_ir_sensors: Optional[List['IRSensor']] = None) -> CalibrateDistance
Calibrate per-wheel distance estimation via encoder measurement.
Drives the robot a known distance, then prompts the user to enter the
actual measured distance. The step computes a corrected ticks_to_rad
value for each drive motor so that odometry matches real-world distances.
The calibration operates in two modes simultaneously:
- Runtime: Applies the measured ticks_to_rad directly for best accuracy during this run.
- Persistent: Updates the YAML baseline using an exponential moving average (EMA) so the stored value converges toward the true value over multiple calibration runs. The EMA formula is: new_baseline = old_baseline * alpha + measured * (1 - alpha).
Convergence rates for different alpha values:
- alpha=0.5: ~97% of error absorbed after 5 calibrations (responsive)
- alpha=0.7 (default): ~83% after 5 calibrations (balanced)
- alpha=0.9: ~41% after 5 calibrations (very stable)
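These convergence figures follow directly from the EMA update: after n calibrations against a constant true value, a fraction alpha**n of the original baseline error remains. A minimal self-contained sketch in plain Python (independent of LibStp):

```python
def ema_update(old_baseline: float, measured: float, alpha: float) -> float:
    """One EMA step: higher alpha keeps more of the old baseline."""
    return old_baseline * alpha + measured * (1 - alpha)

def remaining_error(alpha: float, runs: int) -> float:
    """Fraction of the initial baseline error left after `runs` calibrations."""
    baseline, true_value = 1.0, 0.0  # start with an error of 1.0
    for _ in range(runs):
        baseline = ema_update(baseline, true_value, alpha)
    return baseline  # mathematically equals alpha ** runs

for alpha in (0.5, 0.7, 0.9):
    absorbed = 1 - remaining_error(alpha, runs=5)
    print(f"alpha={alpha}: {absorbed:.0%} absorbed after 5 runs")
```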
Prerequisites: The robot must have drive motors with encoder feedback and a configured kinematics model providing wheel radius.
Args:
distance_cm: Distance (in cm) the robot drives during calibration.
Default is 30 cm. Longer distances yield better accuracy.
calibrate_light_sensors: If True, run IR sensor calibration
after the distance calibration is confirmed. Sensors are
sampled during an additional drive over calibration surfaces.
persist_to_yaml: If True, write the EMA-filtered baseline to
raccoon.project.yml so it persists across program runs.
ema_alpha: EMA smoothing coefficient between 0.0 and 1.0. Higher
values produce slower convergence but a more stable baseline.
calibration_sets: List of named IR calibration surface sets (e.g.
["default", "transparent"]). Only used when
calibrate_light_sensors is True.
exclude_ir_sensors: List of IRSensor instances to skip during
IR calibration.
Returns: CalibrateDistance: A step that performs the interactive distance calibration flow.
Example::
from libstp.step.calibration import calibrate_distance
# Distance-only calibration with defaults
seq([
calibrate_distance(),
drive_forward(100),
])
# Distance + IR sensor calibration, slower EMA convergence
calibrate_distance(
distance_cm=50.0,
calibrate_light_sensors=True,
ema_alpha=0.9,
calibration_sets=["default", "transparent"],
)
calibrate_wait_for_light(sensor: AnalogSensor) -> CalibrateWaitForLight
Create a wait-for-light calibration step.
Args: sensor: The AnalogSensor to calibrate
Returns: CalibrateWaitForLight step instance
calibrate_deadzone(motor_ports: Optional[List[int]] = None, start_percent: int = 1, max_percent: int = 30, settle_time: float = 0.3) -> CalibrateDeadzone
Create a deadzone calibration step.
BEMF readings are unreliable at low RPM, so this step uses human observation via the UI to find the exact motor percentage where the wheel starts turning.
Args:
motor_ports: List of motor ports to calibrate (None = all motors).
start_percent: Starting power percentage to test (default: 1).
max_percent: Maximum power percentage before giving up (default: 30).
settle_time: Seconds to wait after setting power before asking (default: 0.3).
Returns: CalibrateDeadzone step instance
calibrate_sensors(calibration_time: float = 5.0, allow_use_existing: bool = True, calibration_sets: Optional[List[str]] = None) -> CalibrateSensors
Create an IR sensor calibration step.
Args:
calibration_time: Duration for calibration sampling (seconds).
allow_use_existing: If True, offer existing calibration values.
calibration_sets: Named calibration sets to calibrate (e.g. ["default", "transparent"]).
Returns: CalibrateSensors step instance
switch_calibration_set(set_name: str = 'default') -> SwitchCalibrationSet
Switch IR sensors to a named calibration set.
Args: set_name: Name of the calibration set to apply (e.g. "default", "transparent")
Returns: SwitchCalibrationSet step instance
do_while_active(reference_step: Step, task: Step)
Run a task concurrently with a reference step, cancelling the task when the reference finishes.
Both steps start executing at the same time. When reference_step
completes (either normally or via exception), task is immediately
cancelled. This is useful for running a background activity (e.g.
sensor polling, motor oscillation) only for as long as a primary
action is running.
If task finishes before reference_step, the reference step
continues running until it completes on its own.
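These cancellation semantics can be sketched with plain asyncio coroutines (a hypothetical helper, not the LibStp implementation):

```python
import asyncio

async def do_while_active_sketch(reference, task):
    """Run `task` alongside `reference`; cancel `task` when `reference` ends."""
    background = asyncio.ensure_future(task)
    try:
        return await reference  # the reference runs to completion on its own
    finally:
        background.cancel()  # cancelled even if the reference raised
        try:
            await background
        except asyncio.CancelledError:
            pass

async def main():
    events = []

    async def reference():
        await asyncio.sleep(0.05)
        events.append("reference done")

    async def blinker():
        while True:  # would run forever without cancellation
            events.append("blink")
            await asyncio.sleep(0.02)

    await do_while_active_sketch(reference(), blinker())
    return events

events = asyncio.run(main())
```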
Args:
reference_step: The primary step whose lifetime controls the task.
When this step finishes, task is cancelled.
task: The secondary step that runs concurrently and is cancelled
once reference_step completes.
Returns: DoWhileActive: A step that manages the concurrent execution.
Example::
from libstp.step.logic import do_while_active, loop_forever
# Flash an LED while the robot drives forward
flash_led = loop_forever(seq([
set_digital(0, True),
wait(0.25),
set_digital(0, False),
wait(0.25),
]))
do_while_active(drive_forward(50), flash_led)
defer(factory: Callable[['GenericRobot'], Step]) -> Defer
Defer step construction until execution time.
Wraps a factory callable that receives the robot instance and returns
a step. The factory is called when the Defer step executes, not
when the step tree is built. This allows steps to depend on runtime
values such as sensor readings, odometry data, or results computed by
earlier steps in a sequence.
Args:
factory: A callable that takes a GenericRobot and returns a
Step to execute. Called exactly once when the deferred
step runs.
Returns: Defer: A step that lazily constructs and runs its child.
Example::
from libstp.step.logic import defer
# Turn by an angle computed from a sensor reading at runtime
seq([
scan_step,
defer(lambda robot: turn_left(
compute_angle_from_scan(robot)
)),
])
# Drive a distance based on current odometry position
defer(lambda robot: drive_forward(
target_x - robot.odometry().getPosition().x
))
loop_for(step: StepProtocol, iterations: int) -> LoopForStep
Repeat a step a fixed number of times.
Wraps the given step in a counted loop. Each iteration awaits the child step to completion before starting the next. After all iterations complete, the step finishes normally.
Args:
step: The step to execute repeatedly. Must satisfy StepProtocol.
iterations: Number of times to run the step. Must be a positive
integer.
Returns:
LoopForStep: A step that runs step exactly iterations times.
Example::
from libstp.step.logic import loop_for
from libstp.step.motor import set_motor_speed
# Drive forward in three short bursts
burst = seq([
set_motor_speed(0, 800),
wait(0.3),
set_motor_speed(0, 0),
wait(0.2),
])
loop_for(burst, 3)
loop_forever(step: StepProtocol) -> LoopForeverStep
Repeat a step indefinitely until externally cancelled.
Wraps the given step in an infinite loop. Each iteration awaits the
child step to completion before starting the next. The loop only
terminates when the enclosing context cancels it (e.g. via
do_while_active or do_until_checkpoint).
Args:
step: The step to execute repeatedly. Must satisfy StepProtocol.
Returns:
LoopForeverStep: A step that runs step in an infinite loop.
Example::
from libstp.step.logic import loop_forever
from libstp.step.motor import set_motor_speed
# Continuously toggle a motor on and off (until parent cancels)
toggle = seq([
set_motor_speed(0, 1000),
wait(0.5),
set_motor_speed(0, 0),
wait(0.5),
])
do_until_checkpoint(30.0, loop_forever(toggle))
run(action: Callable[['GenericRobot'], Union[None, Awaitable[None]]]) -> Run
Execute an arbitrary callable as a step.
Wraps a sync or async callable so it can be used inline in a step sequence. This is useful for one-off side effects, logging, variable assignments, or any imperative code that does not warrant its own step class. The callable receives the robot instance and its return value is ignored (unless it returns an awaitable, which is then awaited).
Args:
action: A callable that takes a GenericRobot. May be
synchronous (returning None) or asynchronous (returning
an Awaitable[None]).
Returns:
Run: A step that calls action when executed.
Example::
from libstp.step.logic import run
# Emergency stop before turning
seq([
run(lambda robot: robot.drive.hard_stop()),
turn_left(90),
])
# Log odometry mid-sequence
run(lambda robot: print(f"Heading: {robot.odometry().getHeading()}"))
timeout(step: StepProtocol, seconds: float) -> Timeout
Wrap a step with a time limit, cancelling it if it runs too long.
Executes the given step normally but enforces a maximum wall-clock
duration. If the wrapped step completes within the budget, the
timeout step finishes successfully. If the step exceeds the time
limit, it is cancelled via asyncio.wait_for and an error is
logged. Any exception raised by the wrapped step propagates
normally.
This is especially useful around blocking steps like
motor_move_to or wait_for_button that could stall
indefinitely if the hardware misbehaves.
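Since the description notes that cancellation goes through asyncio.wait_for, the behavior can be reproduced with plain asyncio (stand-in coroutine names, not LibStp internals):

```python
import asyncio

async def slow_motor_move():
    await asyncio.sleep(10)  # stands in for a stalled hardware step
    return "reached"

async def main():
    try:
        # Cancel the wrapped coroutine if it exceeds the time budget
        return await asyncio.wait_for(slow_motor_move(), timeout=0.05)
    except asyncio.TimeoutError:
        return "timed out"

result = asyncio.run(main())
print(result)  # timed out
```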
Args:
step: The step to execute under a time constraint. Must be a
valid Step (or StepProtocol) instance.
seconds: Maximum allowed execution time in seconds. Must be
positive.
Returns:
A Timeout step that wraps the original step with the given
time limit.
Example::
from libstp.step import timeout
from libstp.step.motor import motor_move_to
# Give the arm 5 seconds to reach position 300; cancel if stuck
timeout(
motor_move_to(robot.motor(2), position=300, velocity=800),
seconds=5.0,
)
# Ensure operator presses button within 30 seconds
timeout(wait_for_button(), seconds=30.0)
drive_arc(radius_cm: float, degrees: float, speed: float = 1.0) -> Arc
Drive along a circular arc with explicit direction.
Positive degrees = counter-clockwise (left), negative = clockwise (right).
Args:
radius_cm: Turning radius in centimeters (always positive).
degrees: Arc angle in degrees. Positive = left/CCW, negative = right/CW.
speed: Fraction of max speed, 0.0 to 1.0 (default 1.0).
Returns: An Arc step.
Example::
from libstp.step.motion import drive_arc
# Left arc
drive_arc(radius_cm=30, degrees=90)
# Right arc
drive_arc(radius_cm=30, degrees=-90)
drive_arc_left(radius_cm: float, degrees: float, speed: float = 1.0) -> Arc
Drive along a circular arc curving to the left.
The robot drives forward while simultaneously turning counter-clockwise, tracing a circular arc of the given radius. The motion completes when the robot has turned by the specified number of degrees.
Args:
radius_cm: Turning radius in centimeters (center of arc to robot center).
degrees: Arc angle in degrees (how much the robot turns).
speed: Fraction of max speed, 0.0 to 1.0 (default 1.0).
Returns: An Arc step configured for a left (CCW) arc.
Example::
from libstp.step.motion import drive_arc_left
# Quarter-circle left with 30 cm radius
drive_arc_left(radius_cm=30, degrees=90)
# Gentle wide arc at half speed
drive_arc_left(radius_cm=50, degrees=45, speed=0.5)
drive_arc_right(radius_cm: float, degrees: float, speed: float = 1.0) -> Arc
Drive along a circular arc curving to the right.
The robot drives forward while simultaneously turning clockwise, tracing a circular arc of the given radius. The motion completes when the robot has turned by the specified number of degrees.
Args:
radius_cm: Turning radius in centimeters (center of arc to robot center).
degrees: Arc angle in degrees (how much the robot turns).
speed: Fraction of max speed, 0.0 to 1.0 (default 1.0).
Returns: An Arc step configured for a right (CW) arc.
Example::
from libstp.step.motion import drive_arc_right
# Quarter-circle right with 30 cm radius
drive_arc_right(radius_cm=30, degrees=90)
auto_tune(vel_axes: list[str] | None = None, characterize_axes: list[str] | None = None, motion_axes: list[str] | None = None, tune_characterize: bool = True, tune_velocity: bool = True, tune_motion: bool = True, characterize_trials: int = 3, characterize_command_speed: float = 1.0, persist: bool = True, csv_dir: str = _DEFAULT_CSV_DIR) -> AutoTune
Auto-tune the full drive system: characterize, velocity PID, motion PID.
Runs a three-phase sequential pipeline that takes the robot from unknown hardware limits to fully tuned PID controllers. Each phase builds on the results of the previous one:
Phase 1 – Drive characterization. Commands raw velocities to measure max velocity, acceleration, and deceleration for each axis. Multiple trials are run and the median is taken. Results are stored so that subsequent phases have accurate physical constraints for profile generation.
Phase 2 – Velocity controller tuning. For each velocity axis (e.g.,
vx, wz), a step-response is recorded at 100 Hz, plant parameters
(gain Ks, dead time Tu, time constant Tg) are identified via the
inflection tangent method, and PID gains are computed using CHR
set-point-follow formulas. A validation step-response is run with the
new gains; if the integral of squared error (ISE) improves, the gains
are accepted, otherwise the baseline is kept.
Phase 3 – Motion controller tuning. Uses Hooke-Jeeves coordinate descent to optimize the high-level distance and heading PID controllers. Real test drives and turns are executed, scored on a weighted combination of settling time, overshoot, and final error. The optimizer adjusts kp/kd iteratively, halving the search delta when no improvement is found.
All results are applied in-memory immediately and, if persist is
enabled, written to raccoon.project.yml so they survive restarts.
This step requires significant clear space and takes several minutes to complete. It is intended for initial robot setup, not competition runs.
Args:
vel_axes: Velocity axes to tune in Phase 2. Each entry is a velocity
component name ("vx" for forward, "wz" for angular).
Default ["vx", "wz"].
characterize_axes: Axes to characterize in Phase 1. Options are
"forward", "lateral", "angular". Default
["forward", "angular"].
motion_axes: Motion parameters to optimize in Phase 3. Options are
"distance" and "heading". Default
["distance", "heading"].
tune_characterize: Whether to run Phase 1. Set to False if the
robot’s limits are already known. Default True.
tune_velocity: Whether to run Phase 2. Default True.
tune_motion: Whether to run Phase 3. Default True.
characterize_trials: Number of trials per axis in Phase 1. More
trials improve robustness but take longer. Default 3.
characterize_command_speed: Raw velocity command magnitude for
Phase 1, in m/s (linear) or rad/s (angular). Default 1.0.
persist: If True, write all results to
raccoon.project.yml. Default True.
csv_dir: Directory for diagnostic CSV output (step-response
recordings, etc.). Default "/tmp/auto_tune".
Returns:
An AutoTune step that runs the full three-phase pipeline.
Example::
from libstp.step.motion import auto_tune
# Full auto-tune with defaults
step = auto_tune()
# Skip characterization (already done), tune only velocity + motion
step = auto_tune(tune_characterize=False)
# Tune only the forward velocity axis and distance controller
step = auto_tune(
vel_axes=["vx"],
motion_axes=["distance"],
characterize_axes=["forward"],
)
# Dry run without persisting to YAML
step = auto_tune(persist=False, csv_dir="/tmp/auto_tune_test")
auto_tune_motion(axes: list[str] | None = None, persist: bool = True, csv_dir: str = _DEFAULT_CSV_DIR) -> AutoTuneMotion
Tune motion PID controllers via iterative real-world optimization.
This is Phase 3 of the full auto-tune pipeline, usable standalone when velocity controllers are already tuned and drive limits are characterized.
Uses Hooke-Jeeves coordinate descent to optimize the high-level distance and/or heading PID controllers (kp and kd). The process for each parameter set is:
- Initial evaluation – run a test motion (0.5 m drive for distance, 90-degree turn for heading) with the current gains and compute a weighted score from settling time, overshoot, and final error.
- Coordinate descent – try perturbing kp up/down by a delta, keep the direction that improves the score. Then do the same for kd. Repeat for up to 10 iterations. When neither direction improves, halve the deltas. Stop when deltas fall below a minimum threshold.
- Constraint-aware scoring – the optimizer prioritizes fast completion but heavily penalizes candidates that exceed soft limits on overshoot (10 mm / 3 degrees) or final error (10 mm / 2 degrees), preventing “fast but sloppy” solutions.
- Secondary speed checks – after optimization at full speed, the final gains are tested at lower speeds (50%, 30%) for monitoring purposes (these do not affect the optimization).
Trials alternate between forward/backward (or CW/CCW) to reduce directional bias.
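The coordinate-descent loop described above can be sketched with a stand-in score function instead of real test drives (all names and starting deltas here are hypothetical, lower score = better):

```python
def coordinate_descent(score, kp, kd, delta_kp=0.5, delta_kd=0.1,
                       min_delta=0.01, max_iters=10):
    """Hooke-Jeeves style descent on (kp, kd)."""
    best = score(kp, kd)
    for _ in range(max_iters):
        improved = False
        # Try perturbing each gain up/down; keep an improving direction.
        for sign in (+1, -1):
            if score(kp + sign * delta_kp, kd) < best:
                kp += sign * delta_kp
                best = score(kp, kd)
                improved = True
                break
        for sign in (+1, -1):
            if score(kp, kd + sign * delta_kd) < best:
                kd += sign * delta_kd
                best = score(kp, kd)
                improved = True
                break
        if not improved:
            # Neither direction helped: halve the search deltas.
            delta_kp, delta_kd = delta_kp / 2, delta_kd / 2
            if delta_kp < min_delta and delta_kd < min_delta:
                break
    return kp, kd

# Stand-in score with a known optimum at kp=2.0, kd=0.3 (a real run would
# score settling time, overshoot, and final error of an actual test drive).
kp, kd = coordinate_descent(lambda p, d: (p - 2.0) ** 2 + (d - 0.3) ** 2,
                            kp=1.0, kd=0.1)
```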
Prerequisites: Velocity controllers should be tuned first (Phase 2) and drive limits characterized (Phase 1) for best results. The robot needs enough space for 0.5 m drives and 90-degree turns.
Args:
axes: Motion parameters to optimize. Options are "distance"
(linear drive kp/kd) and "heading" (turn kp/kd). Default
["distance", "heading"].
persist: If True, write final gains to
raccoon.project.yml under robot.motion_pid. Default
True.
csv_dir: Directory for diagnostic CSV output. Default
"/tmp/auto_tune".
Returns:
An AutoTuneMotion step that optimizes motion controller gains.
Example::
from libstp.step.motion import auto_tune_motion
# Tune both distance and heading controllers
step = auto_tune_motion()
# Tune only the heading controller
step = auto_tune_motion(axes=["heading"])
# Tune without persisting (for experimentation)
step = auto_tune_motion(persist=False)
auto_tune_velocity(axes: list[str] | None = None, persist: bool = True, csv_dir: str = _DEFAULT_CSV_DIR) -> AutoTuneVelocity
Tune velocity controllers via step-response system identification.
This is Phase 2 of the full auto-tune pipeline, usable standalone when
drive characterization has already been performed (or when the hardware
limits are already configured in raccoon.project.yml).
For each velocity axis the process is:
- Baseline recording – command a step velocity at 50% of the characterized max and record the response at 100 Hz.
- Plant identification – extract gain (Ks), dead time (Tu), and time constant (Tg) using the inflection tangent method (local quadratic regression to find the inflection point, then tangent-line intersections). Falls back to 10%/63% rise-time estimation if the inflection method fails.
- Gain computation – compute PID gains via CHR set-point-follow formulas, scaled down because a kV=1.0 feedforward handles steady-state.
- Validation – apply the new gains and re-run the step response. If the integral of squared error (ISE) improves, the gains are accepted; otherwise the baseline gains are restored.
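The gain-computation step can be illustrated with the textbook CHR set-point formulas for the identified Ks/Tu/Tg plant; the 0%-overshoot variant is shown here, and the additional feedforward-related scaling LibStp applies is an assumption, not reproduced:

```python
def chr_setpoint_gains(Ks: float, Tu: float, Tg: float):
    """CHR set-point tuning (0% overshoot variant).

    Ks: process gain, Tu: dead time, Tg: time constant, as identified
    from the inflection-tangent fit of the recorded step response.
    """
    Kp = 0.35 * Tg / (Ks * Tu)
    Ti = 1.2 * Tg          # integral time
    Td = 0.5 * Tu          # derivative time
    return {"kp": Kp, "ki": Kp / Ti, "kd": Kp * Td}

# Example plant identified from a step response (illustrative values)
gains = chr_setpoint_gains(Ks=1.2, Tu=0.05, Tg=0.4)
```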
Accepted gains are applied in-memory to the drive’s velocity control
config and optionally persisted to raccoon.project.yml under
robot.drive.vel_config.
Prerequisites: Drive characterization should have been run first so that max velocity values are available for computing the step command magnitude.
Args:
axes: Velocity axes to tune. Each entry is a velocity component
name: "vx" (forward linear) or "wz" (angular). Default
["vx", "wz"].
persist: If True, write accepted gains to
raccoon.project.yml. Default True.
csv_dir: Directory for step-response CSV files (baseline and tuned
recordings per axis). Default "/tmp/auto_tune".
Returns:
An AutoTuneVelocity step that identifies and tunes velocity
controllers.
Example::
from libstp.step.motion import auto_tune_velocity
# Tune both velocity axes with defaults
step = auto_tune_velocity()
# Tune only the forward axis, save CSVs for plotting
step = auto_tune_velocity(
axes=["vx"],
csv_dir="/tmp/vel_tune_plots",
)
characterize_drive(axes: list[str] | None = None, trials: int = 3, command_speed: float = 1.0, accel_timeout: float = 3.0, decel_timeout: float = 3.0, persist: bool = True) -> CharacterizeDrive
Characterize the robot’s physical drive limits by commanding raw velocities.
Bypasses the profile and PID systems entirely to discover the true hardware capabilities of each drive axis. For each axis, the step runs multiple independent trials consisting of two phases:
Acceleration phase – commands the specified velocity and records odometry at ~100 Hz until a velocity plateau is detected or the timeout expires. From this data, max velocity and acceleration are extracted using 10%–90% rise-time analysis.
Deceleration phase – ramps the robot back up to speed, then commands zero velocity and records the coast-down. Deceleration is computed from 90%–10% fall-time analysis.
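The 10%–90% rise-time extraction can be sketched on synthetic velocity samples; the actual plateau detection and filtering in LibStp are more involved than this:

```python
def rise_time_metrics(times, velocities):
    """Estimate max velocity and acceleration from a recorded speed-up.

    Max velocity is the plateau value; acceleration is the velocity gained
    between the 10% and 90% crossings divided by the time between them.
    """
    v_max = max(velocities)
    lo, hi = 0.1 * v_max, 0.9 * v_max
    t_lo = next(t for t, v in zip(times, velocities) if v >= lo)
    t_hi = next(t for t, v in zip(times, velocities) if v >= hi)
    accel = (hi - lo) / (t_hi - t_lo)
    return v_max, accel

# Synthetic trial: linear ramp to 1.0 m/s over 0.5 s, then plateau
times = [i * 0.01 for i in range(100)]
velocities = [min(t * 2.0, 1.0) for t in times]
v_max, accel = rise_time_metrics(times, velocities)
```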
The median of all valid trials for each metric is taken as the final
result. Measured values are applied to the in-memory
motion_pid_config immediately and, if persist is enabled, written
to the robot.motion_pid section of raccoon.project.yml so they
survive restarts.
This step is intended for initial robot setup and should be run on a flat
surface with enough room for the robot to accelerate to full speed. It is
typically the first phase of the full auto_tune() pipeline.
Args:
axes: Which axes to characterize. Options are "forward",
"lateral", and "angular". Default ["forward"].
trials: Number of trials per axis. The median is used for
robustness against outliers. Default 3.
command_speed: Raw velocity command magnitude sent to the motors.
For forward/lateral this is in m/s; for angular it is in rad/s.
Should be at or near the maximum the robot can sustain.
Default 1.0.
accel_timeout: Maximum time in seconds to wait for the acceleration
phase before giving up. Default 3.0.
decel_timeout: Maximum time in seconds to record the deceleration
(coast-down) phase. Default 3.0.
persist: If True, write the measured limits to
raccoon.project.yml under robot.motion_pid. Default
True.
Returns:
A CharacterizeDrive step that measures and persists drive limits.
Example::
from libstp.step.motion import characterize_drive
# Characterize forward axis with default settings
step = characterize_drive()
# Characterize forward and angular axes with 5 trials each
step = characterize_drive(
axes=["forward", "angular"],
trials=5,
)
# Characterize without saving to disk (dry run)
step = characterize_drive(
axes=["forward", "lateral", "angular"],
persist=False,
)
tune_drive(distances_cm: list[float] | None = None, speeds: list[float] | None = None, csv_dir: str = '/tmp/drive_telemetry', axis: str = 'forward', settle_time: float = 1.5, timeout: float = 15.0) -> TuneDrive
Run test drives at various distances and speeds, saving telemetry to CSV.
Executes every combination of (distance, speed) as a real LinearMotion
drive, collecting per-cycle telemetry at 50 Hz. Each run produces a CSV
file containing columns such as time_s, position_m,
setpoint_position_m, setpoint_velocity_mps, distance_error_m,
actual_error_m, filtered_velocity_mps, cmd_vx_mps,
pid_primary_raw, heading_rad, saturated, and more (see
CSV_HEADER in the module source for the full list).
The CSV files are intended for offline analysis – plot position vs. setpoint to check tracking, examine PID outputs for saturation, compare overshoot across speeds, etc. This is a diagnostic/tuning tool used during robot setup, not during competition runs.
The robot must have enough clear space to drive the longest requested distance.
Args:
distances_cm: List of distances to test, in centimeters. Negative
values drive in reverse. Default [10, 25, 50, 100].
speeds: List of speed scales to test (0.0–1.0). Each distance is
driven at each speed. Default [0.3, 0.6, 1.0].
csv_dir: Directory where CSV files are written. Created
automatically if it does not exist. Default
"/tmp/drive_telemetry".
axis: Drive axis to test: "forward" or "lateral". Default
"forward".
settle_time: Seconds to wait between runs for the robot to come to
rest. Default 1.5.
timeout: Maximum seconds per run before the drive is aborted.
Default 15.0.
Returns:
A TuneDrive step that executes the test matrix and writes CSV
files.
Example::
from libstp.step.motion import tune_drive
# Quick test at a single distance and speed
step = tune_drive(
distances_cm=[50],
speeds=[0.5],
csv_dir="/tmp/quick_test",
)
# Full sweep for forward axis tuning
step = tune_drive(
distances_cm=[10, 25, 50, 100],
speeds=[0.3, 0.6, 1.0],
)
# Lateral axis characterization
step = tune_drive(
distances_cm=[20, 40],
speeds=[0.3, 0.6],
axis="lateral",
)
drive_angle(angle_deg: float, cm: float, speed: float = 1.0) -> DriveAngle
Drive at an arbitrary angle for a specified distance.
Decomposes the desired heading into forward and lateral velocity components, then runs a profiled PID controller in a rotated coordinate frame with heading maintenance and cross-track correction.
Requires a mecanum or omni-wheel drivetrain.
Angle convention (robot-centric):
- 0 = forward
- 90 = right
- -90 = left
- 180 = backward
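Under this convention the heading decomposes into forward and lateral components by simple trigonometry; a plain-math sketch (not the LibStp internals):

```python
import math

def decompose_angle(angle_deg: float, speed: float = 1.0):
    """Split a robot-centric travel angle into forward and rightward
    velocity fractions (0 deg = forward, 90 deg = right)."""
    theta = math.radians(angle_deg)
    forward = math.cos(theta) * speed
    lateral = math.sin(theta) * speed  # positive = right
    return forward, lateral

fwd, lat = decompose_angle(45, speed=1.0)   # equal forward/right parts
back, side = decompose_angle(180)           # pure reverse, no strafe
```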
Args:
angle_deg: Travel angle in degrees.
cm: Distance to travel in centimeters.
speed: Fraction of max speed, 0.0 to 1.0 (default 1.0).
Returns: A DriveAngle step.
Example::
from libstp.step.motion import drive_angle
# Drive diagonally forward-right at 45 degrees
drive_angle(45, cm=30)
# Drive pure right (same as strafe_right)
drive_angle(90, cm=20)
drive_backward(cm: float, speed: float = 1.0) -> Drive
Drive backward a specified distance using profiled PID motion control.
Identical to drive_forward() but in reverse. The robot drives
backward while maintaining heading via IMU feedback.
Requires calibrate_distance() to have been run first.
Args:
cm: Distance to drive in centimeters.
speed: Fraction of max speed, 0.0 to 1.0 (default 1.0).
Returns: A Drive step configured for backward motion.
Raises:
CalibrationRequiredError: If calibrate_distance() has not been run.
Example::
from libstp.step.motion import drive_backward
# Back up 20 cm
drive_backward(20)
drive_forward(cm: float, speed: float = 1.0) -> Drive
Drive forward a specified distance using profiled PID motion control.
The robot accelerates, cruises, and decelerates along a trapezoidal velocity profile while maintaining heading via IMU feedback. Odometry tracks the distance traveled and the step completes when the target is reached.
Requires calibrate_distance() to have been run first so that
encoder-to-meter conversion is accurate.
Args:
cm: Distance to drive in centimeters.
speed: Fraction of max speed, 0.0 to 1.0 (default 1.0).
Returns: A Drive step configured for forward motion.
Raises:
CalibrationRequiredError: If calibrate_distance() has not been run.
Example::
from libstp.step.motion import drive_forward
# Drive forward 50 cm at full speed
drive_forward(50)
# Drive forward 30 cm at half speed
drive_forward(30, speed=0.5)
directional_follow_line(left_sensor: IRSensor, right_sensor: IRSensor, distance_cm: float, heading_speed: float = 0.0, strafe_speed: float = 0.0, kp: float = 0.75, ki: float = 0.0, kd: float = 0.5) -> DirectionalLineFollow
Follow a line for a distance with independent heading and strafe speeds.
Drive along a line using any combination of forward and lateral velocity
while a PID controller steers the robot via angular velocity. The error
signal is the difference between the left and right sensors'
probabilityOfBlack() readings. Distance is tracked via odometry as
the euclidean distance from the start position.
Unlike follow_line which only drives forward, this step accepts both
heading_speed (forward/backward) and strafe_speed (left/right)
as independent fractions of max velocity, enabling line following while
strafing or driving diagonally.
Both sensors must be calibrated (white/black thresholds set) before use.
Requires a mecanum or omni-wheel drivetrain if strafe_speed is
nonzero.
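The steering error described above is simply the difference of the two probabilityOfBlack() readings, fed to a PID whose output is an angular-velocity command. A stand-in sketch with fake sensors (hypothetical classes, not the LibStp API):

```python
class FakeIRSensor:
    """Stand-in for a calibrated IR sensor."""
    def __init__(self, p_black: float):
        self._p = p_black

    def probabilityOfBlack(self) -> float:
        return self._p

def steering_error(left, right) -> float:
    """Positive when the line has drifted toward the left sensor."""
    return left.probabilityOfBlack() - right.probabilityOfBlack()

def pid_turn_rate(error, prev_error, integral, dt, kp=0.75, ki=0.0, kd=0.5):
    """One PID cycle producing an angular-velocity command."""
    integral += error * dt
    derivative = (error - prev_error) / dt
    return kp * error + ki * integral + kd * derivative, integral

# Centered on the line: both sensors half on black, no correction needed
centered = steering_error(FakeIRSensor(0.5), FakeIRSensor(0.5))
# Drifted right: left sensor sees mostly black, so steer back left
drifted = steering_error(FakeIRSensor(0.9), FakeIRSensor(0.1))
turn_cmd, _ = pid_turn_rate(drifted, prev_error=0.0, integral=0.0, dt=0.02)
```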
Args:
left_sensor: Left IR sensor instance, positioned to the left of the line.
right_sensor: Right IR sensor instance, positioned to the right of the line.
distance_cm: Distance to follow in centimeters. The step finishes when this euclidean distance has been traveled.
heading_speed: Forward/backward speed as a fraction of max velocity (-1.0 to 1.0). Positive = forward, negative = backward. Default 0.0.
strafe_speed: Lateral speed as a fraction of max velocity (-1.0 to 1.0). Positive = right, negative = left. Default 0.0.
kp: Proportional gain for steering PID. Default 0.75.
ki: Integral gain for steering PID. Default 0.0.
kd: Derivative gain for steering PID. Default 0.5.
Returns:
A DirectionalLineFollow step.
Example::
from libstp.step.motion import directional_follow_line
# Strafe right while following a line for 50 cm
directional_follow_line(
left_sensor=robot.left_ir,
right_sensor=robot.right_ir,
distance_cm=50.0,
strafe_speed=0.5,
)
# Drive diagonally forward-right along a line
directional_follow_line(
left_sensor=robot.left_ir,
right_sensor=robot.right_ir,
distance_cm=80.0,
heading_speed=0.3,
strafe_speed=0.4,
)
directional_follow_line_single(sensor: IRSensor, distance_cm: float, heading_speed: float = 0.0, strafe_speed: float = 0.0, side: LineSide = LineSide.LEFT, kp: float = 1.0, ki: float = 0.0, kd: float = 0.3) -> DirectionalSingleLineFollow
Follow a line edge with a single sensor and independent heading/strafe speeds.
The sensor tracks the boundary between the line and the background, where
probabilityOfBlack() is approximately 0.5. The side parameter
selects which edge to track. The PID output controls angular velocity
while heading and strafe velocities are set independently.
The sensor must be calibrated (white/black thresholds set) before use.
Requires a mecanum or omni-wheel drivetrain if strafe_speed is
nonzero.
Args:
sensor: IR sensor for edge tracking.
distance_cm: Distance to follow in centimeters.
heading_speed: Forward/backward speed fraction (-1.0 to 1.0).
Default 0.0.
strafe_speed: Lateral speed fraction (-1.0 to 1.0). Default 0.0.
side: Which edge of the line to track. Default LineSide.LEFT.
kp: Proportional gain for steering PID. Default 1.0.
ki: Integral gain for steering PID. Default 0.0.
kd: Derivative gain for steering PID. Default 0.3.
Returns:
A DirectionalSingleLineFollow step.
Example::
from libstp.step.motion import directional_follow_line_single, LineSide
# Strafe right while tracking the left edge of a line
directional_follow_line_single(
sensor=robot.front_ir,
distance_cm=50.0,
strafe_speed=0.4,
side=LineSide.LEFT,
)
directional_follow_line_single_until_black(sensor: IRSensor, stop_sensor: IRSensor, heading_speed: float = 0.0, strafe_speed: float = 0.0, side: LineSide = LineSide.LEFT, stop_threshold: float = 0.7, kp: float = 1.0, ki: float = 0.0, kd: float = 0.3) -> DirectionalSingleLineFollow
Follow a line edge with heading/strafe, stopping when a second sensor sees black.
Combines single-sensor edge tracking with an event-based stop condition.
The sensor tracks the line edge using PID control while the
stop_sensor is polled each cycle.
Both sensors must be calibrated before use. Requires a mecanum or
omni-wheel drivetrain if strafe_speed is nonzero.
Args:
sensor: IR sensor for edge tracking.
stop_sensor: Second IR sensor for the stop condition.
heading_speed: Forward/backward speed fraction (-1.0 to 1.0).
Default 0.0.
strafe_speed: Lateral speed fraction (-1.0 to 1.0). Default 0.0.
side: Which edge of the line to track. Default LineSide.LEFT.
stop_threshold: probabilityOfBlack() the stop sensor must
exceed. Default 0.7.
kp: Proportional gain for steering PID. Default 1.0.
ki: Integral gain for steering PID. Default 0.0.
kd: Derivative gain for steering PID. Default 0.3.
Returns:
A DirectionalSingleLineFollow step.
Example::
from libstp.step.motion import directional_follow_line_single_until_black, LineSide
# Strafe right along a line edge until the stop sensor hits a cross-line
directional_follow_line_single_until_black(
sensor=robot.left_ir,
stop_sensor=robot.right_ir,
strafe_speed=0.4,
side=LineSide.LEFT,
)
directional_follow_line_until_both_black(left_sensor: IRSensor, right_sensor: IRSensor, heading_speed: float = 0.0, strafe_speed: float = 0.0, kp: float = 0.75, ki: float = 0.0, kd: float = 0.5, both_black_threshold: float = 0.7) -> DirectionalLineFollow
Follow a line with heading and strafe until both sensors detect black.
Same as directional_follow_line but instead of stopping after a
fixed distance, the step monitors both sensors each cycle and finishes
when both probabilityOfBlack() readings exceed
both_black_threshold simultaneously, indicating an intersection.
Both sensors must be calibrated (white/black thresholds set) before use.
Requires a mecanum or omni-wheel drivetrain if strafe_speed is
nonzero.
Args:
left_sensor: Left IR sensor instance.
right_sensor: Right IR sensor instance.
heading_speed: Forward/backward speed fraction (-1.0 to 1.0).
Default 0.0.
strafe_speed: Lateral speed fraction (-1.0 to 1.0). Default 0.0.
kp: Proportional gain for steering PID. Default 0.75.
ki: Integral gain for steering PID. Default 0.0.
kd: Derivative gain for steering PID. Default 0.5.
both_black_threshold: probabilityOfBlack() value that both
sensors must exceed to trigger the stop. Default 0.7.
Returns:
A DirectionalLineFollow step that stops at an intersection.
Example::
from libstp.step.motion import directional_follow_line_until_both_black
# Strafe right along a line until hitting a cross-line
directional_follow_line_until_both_black(
left_sensor=robot.left_ir,
right_sensor=robot.right_ir,
strafe_speed=0.4,
)
follow_line(left_sensor: IRSensor, right_sensor: IRSensor, distance_cm: float, speed: float = 0.5, kp: float = 0.75, ki: float = 0.0, kd: float = 0.5) -> LineFollow
Follow a line for a specified distance using two IR sensors for steering.
Drives forward while a PID controller steers the robot to keep it centered
on a line. The error signal is the difference between the left and right
sensors’ probabilityOfBlack() readings. A positive error (left sees
more black) steers the robot back toward center. The underlying
LinearMotion handles profiled velocity control and odometry-based
distance tracking, while the PID output overrides the heading command
as an angular velocity (omega).
Both sensors must be calibrated (white/black thresholds set) before use.
Args:
left_sensor: Left IR sensor instance, positioned to the left of the line.
right_sensor: Right IR sensor instance, positioned to the right of the line.
distance_cm: Distance to follow in centimeters. The step finishes when
this distance has been traveled according to odometry.
speed: Fraction of max velocity (0.0–1.0). Lower speeds give the PID
more time to correct but are slower overall. Default 0.5.
kp: Proportional gain for steering PID. Higher values produce sharper
corrections. Default 0.75.
ki: Integral gain for steering PID. Typically left at 0.0 unless there
is a persistent drift. Default 0.0.
kd: Derivative gain for steering PID. Damps oscillation around the
line. Default 0.5.
Returns:
A LineFollow step configured for distance-based line following.
Example::
from libstp.step.motion import follow_line
# Follow a line for 80 cm at half speed
step = follow_line(
left_sensor=robot.left_ir,
right_sensor=robot.right_ir,
distance_cm=80.0,
speed=0.5,
)
# Tighter tracking with higher kp
step = follow_line(
left_sensor=robot.left_ir,
right_sensor=robot.right_ir,
distance_cm=120.0,
speed=0.3,
kp=1.2,
kd=0.8,
)
follow_line_single(sensor: IRSensor, distance_cm: float, speed: float = 0.5, side: LineSide = LineSide.LEFT, kp: float = 1.0, ki: float = 0.0, kd: float = 0.3) -> SingleSensorLineFollow
Follow a line edge using a single IR sensor for a specified distance.
The sensor tracks the boundary between the line and the background, where
probabilityOfBlack() is approximately 0.5. The PID controller drives
the error (reading - 0.5) toward zero, keeping the sensor positioned
right on the edge. The side parameter controls which edge: LEFT
means the sensor is to the left of the line (steers right when it sees
black), and RIGHT is the opposite.
This variant is useful when only one sensor is available, or when the line
is too narrow for two sensors. The underlying LinearMotion handles
profiled velocity and odometry-based distance tracking.
The sensor must be calibrated (white/black thresholds set) before use.
Args:
sensor: The IR sensor instance used for edge tracking.
distance_cm: Distance to follow in centimeters. The step finishes
when this distance has been traveled.
speed: Fraction of max velocity (0.0–1.0). Default 0.5.
side: Which edge of the line to track. LineSide.LEFT (default)
or LineSide.RIGHT.
kp: Proportional gain for steering PID. Default 1.0.
ki: Integral gain for steering PID. Default 0.0.
kd: Derivative gain for steering PID. Default 0.3.
Returns:
A SingleSensorLineFollow step.
Example::
from libstp.step.motion import follow_line_single, LineSide
# Follow the left edge of a line for 60 cm
step = follow_line_single(
sensor=robot.front_ir,
distance_cm=60.0,
speed=0.4,
side=LineSide.LEFT,
)
# Follow the right edge with custom PID gains
step = follow_line_single(
sensor=robot.front_ir,
distance_cm=100.0,
speed=0.5,
side=LineSide.RIGHT,
kp=1.5,
kd=0.5,
)
follow_line_single_until_black(sensor: IRSensor, stop_sensor: IRSensor, speed: float = 0.5, side: LineSide = LineSide.LEFT, stop_threshold: float = 0.7, kp: float = 1.0, ki: float = 0.0, kd: float = 0.3) -> SingleSensorLineFollowUntilBlack
Follow a line edge using one sensor, stopping when a second sensor sees black.
Combines single-sensor edge tracking with an event-based stop condition.
The sensor tracks the line edge (targeting probabilityOfBlack() ~ 0.5) using PID control, while the stop_sensor is polled each cycle.
When the stop sensor’s probabilityOfBlack() exceeds
stop_threshold, the step finishes immediately.
This is useful for following a line until the robot reaches a perpendicular marker or a specific position detected by a second sensor (e.g., a side-mounted sensor that crosses a branch line).
Both sensors must be calibrated (white/black thresholds set) before use.
Args:
sensor: The IR sensor used for edge-tracking along the line.
stop_sensor: A second IR sensor monitored for the stop condition.
The step finishes when this sensor’s probabilityOfBlack()
exceeds stop_threshold.
speed: Fraction of max velocity (0.0–1.0). Default 0.5.
side: Which edge of the line to track. LineSide.LEFT (default)
or LineSide.RIGHT.
stop_threshold: The probabilityOfBlack() value the stop sensor
must exceed to trigger the stop. Default 0.7.
kp: Proportional gain for steering PID. Default 1.0.
ki: Integral gain for steering PID. Default 0.0.
kd: Derivative gain for steering PID. Default 0.3.
Returns:
A SingleSensorLineFollowUntilBlack step.
Example::
from libstp.step.motion import follow_line_single_until_black, LineSide
# Follow left edge until the right sensor hits a cross-line
step = follow_line_single_until_black(
sensor=robot.left_ir,
stop_sensor=robot.right_ir,
speed=0.4,
side=LineSide.LEFT,
)
# Lower stop threshold for earlier detection
step = follow_line_single_until_black(
sensor=robot.front_ir,
stop_sensor=robot.side_ir,
speed=0.3,
stop_threshold=0.5,
kp=1.2,
)
follow_line_until_both_black(left_sensor: IRSensor, right_sensor: IRSensor, speed: float = 0.5, kp: float = 0.75, ki: float = 0.0, kd: float = 0.5, both_black_threshold: float = 0.7) -> LineFollow
Follow a line until both sensors detect black, indicating an intersection.
Drives forward with PID-based steering (same as follow_line) but instead
of stopping after a fixed distance, the step monitors both sensors each
cycle. When both probabilityOfBlack() readings exceed
both_black_threshold simultaneously, the robot has reached a
perpendicular line or intersection and the step finishes.
Internally the distance target is set very large so LinearMotion never
finishes on its own – the both-black condition is the sole termination
criterion.
Both sensors must be calibrated (white/black thresholds set) before use.
Args:
left_sensor: Left IR sensor instance, positioned to the left of the
line.
right_sensor: Right IR sensor instance, positioned to the right of
the line.
speed: Fraction of max velocity (0.0–1.0). Default 0.5.
kp: Proportional gain for steering PID. Default 0.75.
ki: Integral gain for steering PID. Default 0.0.
kd: Derivative gain for steering PID. Default 0.5.
both_black_threshold: The probabilityOfBlack() value that both
sensors must exceed to trigger the stop. Default 0.7.
Returns:
A LineFollow step that stops when an intersection is detected.
Example::
from libstp.step.motion import follow_line_until_both_black
# Follow a line until hitting a cross-line
step = follow_line_until_both_black(
left_sensor=robot.left_ir,
right_sensor=robot.right_ir,
speed=0.4,
)
# More sensitive intersection detection
step = follow_line_until_both_black(
left_sensor=robot.left_ir,
right_sensor=robot.right_ir,
speed=0.3,
both_black_threshold=0.6,
)
strafe_follow_line(left_sensor: IRSensor, right_sensor: IRSensor, distance_cm: float, speed: float = 0.5, kp: float = 0.75, ki: float = 0.0, kd: float = 0.5) -> DirectionalLineFollow
Follow a line by strafing right for a specified distance.
Convenience wrapper around directional_follow_line for pure lateral
line following. The robot strafes right at the given speed while PID
steering keeps it centered on the line using two sensors.
Both sensors must be calibrated. Requires a mecanum or omni-wheel drivetrain.
Args:
left_sensor: Left IR sensor instance.
right_sensor: Right IR sensor instance.
distance_cm: Distance to strafe in centimeters.
speed: Strafe speed as fraction of max lateral velocity (0.0 to 1.0).
Default 0.5. Use negative values to strafe left.
kp: Proportional gain for steering PID. Default 0.75.
ki: Integral gain for steering PID. Default 0.0.
kd: Derivative gain for steering PID. Default 0.5.
Returns:
A DirectionalLineFollow step configured for lateral motion.
Example::
from libstp.step.motion import strafe_follow_line
# Strafe right along a line for 40 cm
strafe_follow_line(
left_sensor=robot.left_ir,
right_sensor=robot.right_ir,
distance_cm=40.0,
speed=0.4,
)
# Strafe left along a line for 30 cm
strafe_follow_line(
left_sensor=robot.left_ir,
right_sensor=robot.right_ir,
distance_cm=30.0,
speed=-0.4,
)
strafe_follow_line_single(sensor: IRSensor, distance_cm: float, speed: float = 0.5, side: LineSide = LineSide.LEFT, kp: float = 1.0, ki: float = 0.0, kd: float = 0.3) -> DirectionalSingleLineFollow
Follow a line edge by strafing right using a single sensor.
Convenience wrapper around directional_follow_line_single for pure
lateral single-sensor line following. The robot strafes at the given
speed while PID edge-tracking keeps the sensor on the line boundary.
The sensor must be calibrated. Requires a mecanum or omni-wheel drivetrain.
Args:
sensor: IR sensor for edge tracking.
distance_cm: Distance to strafe in centimeters.
speed: Strafe speed as fraction of max lateral velocity (0.0 to
1.0). Default 0.5. Use negative values to strafe left.
side: Which edge of the line to track. Default LineSide.LEFT.
kp: Proportional gain for steering PID. Default 1.0.
ki: Integral gain for steering PID. Default 0.0.
kd: Derivative gain for steering PID. Default 0.3.
Returns:
A DirectionalSingleLineFollow step configured for lateral motion.
Example::
from libstp.step.motion import strafe_follow_line_single, LineSide
# Strafe right along a line edge for 40 cm
strafe_follow_line_single(
sensor=robot.front_ir,
distance_cm=40.0,
speed=0.4,
side=LineSide.LEFT,
)
# Strafe left along a line edge for 30 cm
strafe_follow_line_single(
sensor=robot.front_ir,
distance_cm=30.0,
speed=-0.4,
side=LineSide.RIGHT,
)
strafe_follow_line_single_until_black(sensor: IRSensor, stop_sensor: IRSensor, speed: float = 0.5, side: LineSide = LineSide.LEFT, stop_threshold: float = 0.7, kp: float = 1.0, ki: float = 0.0, kd: float = 0.3) -> DirectionalSingleLineFollow
Follow a line edge by strafing, stopping when a second sensor sees black.
Convenience wrapper around directional_follow_line_single_until_black
for pure lateral single-sensor line following with a stop-sensor trigger.
Both sensors must be calibrated. Requires a mecanum or omni-wheel drivetrain.
Args:
sensor: IR sensor for edge tracking.
stop_sensor: Second IR sensor for the stop condition.
speed: Strafe speed as fraction of max lateral velocity (0.0 to
1.0). Default 0.5. Use negative values to strafe left.
side: Which edge of the line to track. Default LineSide.LEFT.
stop_threshold: probabilityOfBlack() the stop sensor must
exceed. Default 0.7.
kp: Proportional gain for steering PID. Default 1.0.
ki: Integral gain for steering PID. Default 0.0.
kd: Derivative gain for steering PID. Default 0.3.
Returns:
A DirectionalSingleLineFollow step that stops on sensor trigger.
Example::
from libstp.step.motion import strafe_follow_line_single_until_black, LineSide
# Strafe right along a line edge until the stop sensor hits a cross-line
strafe_follow_line_single_until_black(
sensor=robot.left_ir,
stop_sensor=robot.right_ir,
speed=0.4,
side=LineSide.LEFT,
)
strafe_follow_line_until_both_black(left_sensor: IRSensor, right_sensor: IRSensor, speed: float = 0.5, kp: float = 0.75, ki: float = 0.0, kd: float = 0.5, both_black_threshold: float = 0.7) -> DirectionalLineFollow
Follow a line by strafing right until both sensors detect black.
Convenience wrapper around directional_follow_line_until_both_black
for pure lateral line following until an intersection is reached.
Both sensors must be calibrated. Requires a mecanum or omni-wheel drivetrain.
Args:
left_sensor: Left IR sensor instance.
right_sensor: Right IR sensor instance.
speed: Strafe speed as fraction of max lateral velocity (0.0 to
1.0). Default 0.5. Use negative values to strafe left.
kp: Proportional gain for steering PID. Default 0.75.
ki: Integral gain for steering PID. Default 0.0.
kd: Derivative gain for steering PID. Default 0.5.
both_black_threshold: probabilityOfBlack() value that both
sensors must exceed to trigger the stop. Default 0.7.
Returns:
A DirectionalLineFollow step that stops at an intersection.
Example::
from libstp.step.motion import strafe_follow_line_until_both_black
# Strafe right along a line until a cross-line
strafe_follow_line_until_both_black(
left_sensor=robot.left_ir,
right_sensor=robot.right_ir,
speed=0.4,
)
backward_lineup_on_black(left_sensor: IRSensor, right_sensor: IRSensor, detection_threshold: float = 0.7) -> Sequential
Drive backward onto a black line, align perpendicular, then clear to white.
Identical to forward_lineup_on_black but the robot reverses into the
line instead of driving forward. The corrective turn direction is
automatically mirrored to account for the reversed geometry. After
alignment the robot continues backward at half speed until both sensors see
white.
Prerequisites:
Two IR line sensors mounted on the left and right sides of the chassis,
with the robot’s distance_between_sensors configured.
Args:
left_sensor: IR sensor on the left side of the chassis.
right_sensor: IR sensor on the right side of the chassis.
detection_threshold: Confidence value (0–1) each sensor must reach to
count as detecting a line.
Returns:
Sequential: measure (backward) + corrective turn + drive-until-white (backward).
Example::
from libstp.step.motion.lineup.forward import backward_lineup_on_black
step = backward_lineup_on_black(
left_sensor=robot.left_line_sensor,
right_sensor=robot.right_line_sensor,
detection_threshold=0.7,
)
step.run(robot)
backward_lineup_on_white(left_sensor: IRSensor, right_sensor: IRSensor, detection_threshold: float = 0.7) -> Sequential
Drive backward onto a white line, align perpendicular, then clear to black.
Identical to forward_lineup_on_white but the robot reverses into the
line instead of driving forward. The corrective turn direction is
automatically mirrored to account for the reversed geometry. After
alignment the robot continues backward at half speed until both sensors see
black.
Prerequisites:
Two IR line sensors mounted on the left and right sides of the chassis,
with the robot’s distance_between_sensors configured.
Args:
left_sensor: IR sensor on the left side of the chassis.
right_sensor: IR sensor on the right side of the chassis.
detection_threshold: Confidence value (0–1) each sensor must reach to
count as detecting a white surface.
Returns:
Sequential: measure (backward) + corrective turn + drive-until-black (backward).
Example::
from libstp.step.motion.lineup.forward import backward_lineup_on_white
step = backward_lineup_on_white(
left_sensor=robot.left_line_sensor,
right_sensor=robot.right_line_sensor,
)
step.run(robot)
backward_single_lineup(sensor: IRSensor, line_width_cm: float = 5.0, correction_side: CorrectionSide = CorrectionSide.LEFT, forward_speed: float = 0.15, entry_threshold: float = 0.7, exit_threshold: float = 0.3) -> Sequential
Drive backward across a black line, measure skew, and correct with a turn.
Identical to forward_single_lineup but the robot reverses into the
line instead of driving forward. The forward_speed value is negated
internally so the robot always moves backward regardless of the sign
passed in. The same acos(line_width / apparent_crossing_width)
computation is used to determine the corrective angle.
Prerequisites:
One IR line sensor and odometry. No lateral-drive capability required.
Args:
sensor: The IR sensor instance used to detect the line.
line_width_cm: Known physical width of the line in centimeters.
correction_side: Direction to turn for correction (LEFT or RIGHT).
forward_speed: Approach speed in m/s. The absolute value is negated
so the robot always drives backward.
entry_threshold: Black confidence (0–1) for detecting the leading
edge of the line.
exit_threshold: Black confidence (0–1) for detecting the trailing
edge. Should be lower than entry_threshold.
Returns:
Sequential: crossing measurement (backward) + corrective turn.
Example::
from libstp.step.motion.lineup.single import (
backward_single_lineup, CorrectionSide,
)
step = backward_single_lineup(
sensor=robot.center_line_sensor,
line_width_cm=5.0,
correction_side=CorrectionSide.RIGHT,
forward_speed=0.12,
)
step.run(robot)
forward_lineup_on_black(left_sensor: IRSensor, right_sensor: IRSensor, detection_threshold: float = 0.7) -> Sequential
Drive forward onto a black line, align perpendicular, then clear to white.
The robot drives forward until both IR sensors detect a black line. The stagger distance between the two sensor hits is used to compute a corrective turn that aligns the chassis perpendicular to the line. After the turn the robot creeps forward at half speed until both sensors see white, leaving it just past the line and squared up.
Prerequisites:
Two IR line sensors mounted on the left and right sides of the chassis,
with the robot’s distance_between_sensors configured.
Args:
left_sensor: IR sensor on the left side of the chassis.
right_sensor: IR sensor on the right side of the chassis.
detection_threshold: Confidence value (0–1) each sensor must reach to
count as detecting a line. Lower values trigger earlier but are more
susceptible to noise.
Returns:
Sequential: measure + corrective turn + drive-until-white.
Example::
from libstp.step.motion.lineup.forward import forward_lineup_on_black
step = forward_lineup_on_black(
left_sensor=robot.left_line_sensor,
right_sensor=robot.right_line_sensor,
detection_threshold=0.8,
)
step.run(robot)
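The corrective-turn geometry described above can be illustrated with a short sketch. The atan relation is an assumption inferred from the description (the library source is not quoted here): if one sensor crosses the line a measured stagger distance before the other, and the sensor spacing is known, the skew angle follows from the resulting right triangle.

```python
import math

# Assumed geometry, for illustration only: the stagger between the two
# sensor hits and the configured distance_between_sensors form a right
# triangle whose angle is the chassis skew relative to the line.
def lineup_correction_deg(stagger_cm: float,
                          distance_between_sensors_cm: float) -> float:
    return math.degrees(math.atan2(stagger_cm, distance_between_sensors_cm))
```

A zero stagger (both sensors hit simultaneously) means the chassis is already perpendicular and no corrective turn is needed.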
forward_lineup_on_white(left_sensor: IRSensor, right_sensor: IRSensor, detection_threshold: float = 0.7) -> Sequential
Drive forward onto a white line, align perpendicular, then clear to black.
The robot drives forward until both IR sensors detect a white surface. The stagger distance between the two sensor hits is used to compute a corrective turn that aligns the chassis perpendicular to the line edge. After the turn the robot creeps forward at half speed until both sensors see black, leaving it just past the white region and squared up.
Prerequisites:
Two IR line sensors mounted on the left and right sides of the chassis,
with the robot’s distance_between_sensors configured.
Args:
left_sensor: IR sensor on the left side of the chassis.
right_sensor: IR sensor on the right side of the chassis.
detection_threshold: Confidence value (0–1) each sensor must reach to
count as detecting a white surface. Lower values trigger earlier but
are more susceptible to noise.
Returns:
Sequential: measure + corrective turn + drive-until-black.
Example::
from libstp.step.motion.lineup.forward import forward_lineup_on_white
step = forward_lineup_on_white(
left_sensor=robot.left_line_sensor,
right_sensor=robot.right_line_sensor,
)
step.run(robot)
forward_single_lineup(sensor: IRSensor, line_width_cm: float = 5.0, correction_side: CorrectionSide = CorrectionSide.LEFT, forward_speed: float = 0.15, entry_threshold: float = 0.7, exit_threshold: float = 0.3) -> Sequential
Drive forward across a black line, measure skew, and correct with a turn.
The robot drives forward at forward_speed while a single IR sensor
watches for a black line. As it crosses, the apparent width of the line
is measured via odometry and compared against the known line_width_cm
to compute the angular displacement using
acos(line_width / apparent_crossing_width). A corrective turn is then
executed in the direction given by correction_side.
Because a single sensor cannot determine which way the robot is skewed, the caller must know (from the arena layout or prior steps) whether to correct left or right.
Prerequisites:
One IR line sensor and odometry. No lateral-drive capability required.
Args:
sensor: The IR sensor instance used to detect the line.
line_width_cm: Known physical width of the line in centimeters.
correction_side: Direction to turn for correction (LEFT or RIGHT).
forward_speed: Approach speed in m/s. Always driven forward
(the absolute value is used).
entry_threshold: Black confidence (0–1) for detecting the leading
edge of the line.
exit_threshold: Black confidence (0–1) for detecting the trailing
edge. Should be lower than entry_threshold.
Returns:
Sequential: crossing measurement + corrective turn.
Example::
from libstp.step.motion.lineup.single import (
forward_single_lineup, CorrectionSide,
)
step = forward_single_lineup(
sensor=robot.center_line_sensor,
line_width_cm=5.0,
correction_side=CorrectionSide.LEFT,
forward_speed=0.10,
)
step.run(robot)
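The acos computation quoted above can be worked through directly (a sketch for illustration; the clamping guard is an added assumption to protect against odometry rounding):

```python
import math

# Skew from an apparent line-crossing width: driving at an angle makes
# the line look wider than its true width, and
# acos(line_width / apparent_width) recovers the skew angle.
def crossing_skew_deg(line_width_cm: float, apparent_width_cm: float) -> float:
    ratio = min(1.0, line_width_cm / apparent_width_cm)  # guard rounding noise
    return math.degrees(math.acos(ratio))
```

For example, a 5 cm line that appears 10 cm wide implies a 60-degree skew, since acos(0.5) = 60 degrees; an apparent width equal to the true width implies a perfectly perpendicular crossing.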
strafe_left_lineup_on_black(front_sensor: IRSensor, back_sensor: IRSensor, detection_threshold: float = 0.7) -> Sequential
Strafe left onto a black line, align perpendicular, then clear to white.
Identical to strafe_right_lineup_on_black but the robot strafes left
instead of right. The corrective turn direction is automatically mirrored
to account for the reversed lateral geometry. After alignment the robot
continues strafing left at reduced speed until both sensors see white.
Prerequisites:
Two IR line sensors mounted on the front and rear of the chassis, with
the robot’s distance_between_sensors configured. Requires a
mecanum or omni-wheel drivetrain capable of lateral strafing.
Args:
front_sensor: IR sensor on the front of the chassis.
back_sensor: IR sensor on the rear of the chassis.
detection_threshold: Confidence value (0–1) each sensor must reach to
count as detecting a line.
Returns:
Sequential: measure + corrective turn + strafe-until-white (leftward).
Example::
from libstp.step.motion.lineup.strafe import strafe_left_lineup_on_black
step = strafe_left_lineup_on_black(
front_sensor=robot.front_line_sensor,
back_sensor=robot.back_line_sensor,
)
step.run(robot)
strafe_left_lineup_on_white(front_sensor: IRSensor, back_sensor: IRSensor, detection_threshold: float = 0.7) -> Sequential
Strafe left onto a white line, align perpendicular, then clear to black.
Identical to strafe_right_lineup_on_white but the robot strafes left
instead of right. The corrective turn direction is automatically mirrored
to account for the reversed lateral geometry. After alignment the robot
continues strafing left at reduced speed until both sensors see black.
Prerequisites:
Two IR line sensors mounted on the front and rear of the chassis, with
the robot’s distance_between_sensors configured. Requires a
mecanum or omni-wheel drivetrain capable of lateral strafing.
Args:
front_sensor: IR sensor on the front of the chassis.
back_sensor: IR sensor on the rear of the chassis.
detection_threshold: Confidence value (0–1) each sensor must reach to
count as detecting a white surface.
Returns:
Sequential: measure + corrective turn + strafe-until-black (leftward).
Example::
from libstp.step.motion.lineup.strafe import strafe_left_lineup_on_white
step = strafe_left_lineup_on_white(
front_sensor=robot.front_line_sensor,
back_sensor=robot.back_line_sensor,
detection_threshold=0.65,
)
step.run(robot)
strafe_right_lineup_on_black(front_sensor: IRSensor, back_sensor: IRSensor, detection_threshold: float = 0.7) -> Sequential
Strafe right onto a black line, align perpendicular, then clear to white.
The robot strafes right until both front and back IR sensors detect a black line. The lateral stagger distance between the two sensor hits is used to compute a corrective turn that aligns the chassis perpendicular to the line. After the turn the robot strafes right at reduced speed until both sensors see white, leaving it just past the line and squared up.
Prerequisites:
Two IR line sensors mounted on the front and rear of the chassis, with
the robot’s distance_between_sensors configured. Requires a
mecanum or omni-wheel drivetrain capable of lateral strafing.
Args:
front_sensor: IR sensor on the front of the chassis.
back_sensor: IR sensor on the rear of the chassis.
detection_threshold: Confidence value (0–1) each sensor must reach to
count as detecting a line. Lower values trigger earlier but are more
susceptible to noise.
Returns:
Sequential: measure + corrective turn + strafe-until-white.
Example::
from libstp.step.motion.lineup.strafe import strafe_right_lineup_on_black
step = strafe_right_lineup_on_black(
front_sensor=robot.front_line_sensor,
back_sensor=robot.back_line_sensor,
detection_threshold=0.8,
)
step.run(robot)
strafe_right_lineup_on_white(front_sensor: IRSensor, back_sensor: IRSensor, detection_threshold: float = 0.7) -> Sequential
Strafe right onto a white line, align perpendicular, then clear to black.
The robot strafes right until both front and back IR sensors detect a white surface. The lateral stagger distance between the two sensor hits is used to compute a corrective turn that aligns the chassis perpendicular to the line edge. After the turn the robot strafes right at reduced speed until both sensors see black, leaving it just past the white region and squared up.
Prerequisites:
Two IR line sensors mounted on the front and rear of the chassis, with
the robot’s distance_between_sensors configured. Requires a
mecanum or omni-wheel drivetrain capable of lateral strafing.
Args:
front_sensor: IR sensor on the front of the chassis.
back_sensor: IR sensor on the rear of the chassis.
detection_threshold: Confidence value (0–1) each sensor must reach to
count as detecting a white surface.
Returns:
Sequential: measure + corrective turn + strafe-until-black.
Example::
from libstp.step.motion.lineup.strafe import strafe_right_lineup_on_white
step = strafe_right_lineup_on_white(
front_sensor=robot.front_line_sensor,
back_sensor=robot.back_line_sensor,
)
step.run(robot)
drive_angle_until_black(sensor: Union[IRSensor, list[IRSensor]], angle_deg: float, speed: float = 0.3, confidence_threshold: float = 0.7) -> MoveUntil
Drive at an arbitrary angle until any sensor detects a black surface.
Decomposes the desired travel direction into forward and strafe velocity
components using trigonometry (cos for forward, sin for strafe) and
commands them simultaneously. The robot’s heading does not change – only
its translational direction of travel. Each control cycle, the given IR
sensor(s) are polled and the step completes as soon as any sensor’s
probabilityOfBlack() meets or exceeds confidence_threshold.
The angle convention is: 0 degrees = pure forward, 90 degrees = pure right strafe, -90 degrees = pure left strafe, and 180 degrees = pure backward.
Prerequisites:
Requires a mecanum or holonomic drivetrain capable of lateral movement.
Args:
sensor: A single IRSensor or a list of sensors. The step triggers
when any sensor in the list detects black.
angle_deg: Travel angle in degrees relative to the robot’s forward
axis. 0 = forward, 90 = right, -90 = left, 180 = backward.
speed: Speed magnitude in m/s (positive value). This is the overall
speed; it is decomposed into forward and strafe components based on
the angle. Defaults to 0.3.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for black before the step considers the condition met.
Defaults to 0.7.
Returns:
MoveUntil: A configured motion step that drives at the specified angle
and stops when black is detected.
Example::
from libstp.sensor_ir import IRSensor
front_ir = IRSensor(0)
# Drive diagonally forward-right (45 deg) until black is detected
step = drive_angle_until_black(front_ir, angle_deg=45, speed=0.3)
# Drive purely left (-90 deg) until black -- equivalent to strafe left
step = drive_angle_until_black(front_ir, angle_deg=-90, speed=0.2)
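The trigonometric decomposition described above can be sketched directly (illustration only, not the library's internal code; the function name is hypothetical):

```python
import math

# Split a travel angle into forward (cos) and strafe (sin) velocity
# components; the robot's heading is unchanged, only its direction of
# translation. 0 deg = forward, 90 deg = right, -90 deg = left.
def decompose(angle_deg: float, speed: float) -> tuple[float, float]:
    rad = math.radians(angle_deg)
    return speed * math.cos(rad), speed * math.sin(rad)
```

This makes the equivalences in the examples concrete: 90 degrees yields a pure right strafe (forward component zero), and 45 degrees splits the speed evenly between forward and strafe.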
drive_angle_until_white(sensor: Union[IRSensor, list[IRSensor]], angle_deg: float, speed: float = 0.3, confidence_threshold: float = 0.7) -> MoveUntil
Drive at an arbitrary angle until any sensor detects a white surface.
Decomposes the desired travel direction into forward and strafe velocity
components using trigonometry (cos for forward, sin for strafe) and
commands them simultaneously. The robot’s heading does not change – only
its translational direction of travel. Each control cycle, the given IR
sensor(s) are polled and the step completes as soon as any sensor’s
probabilityOfWhite() meets or exceeds confidence_threshold.
The angle convention is: 0 degrees = pure forward, 90 degrees = pure right strafe, -90 degrees = pure left strafe, and 180 degrees = pure backward.
Prerequisites: Requires a mecanum or holonomic drivetrain capable of lateral movement.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
white.
angle_deg: Travel angle in degrees relative to the robot’s forward
axis. 0 = forward, 90 = right, -90 = left, 180 = backward.
speed: Speed magnitude in m/s (positive value). This is the overall
speed; it is decomposed into forward and strafe components based on
the angle. Defaults to 0.3.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for white before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that drives at the specified angle and stops when white is detected.
Example::
from libstp.sensor_ir import IRSensor
front_ir = IRSensor(0)
# Drive diagonally forward-left (-45 deg) until white is detected
step = drive_angle_until_white(front_ir, angle_deg=-45, speed=0.3)
# Drive backward (180 deg) until white -- equivalent to drive backward
step = drive_angle_until_white(front_ir, angle_deg=180, speed=0.2)
drive_backward_until_black(sensor: Union[IRSensor, list[IRSensor]], speed: float = 1.0, confidence_threshold: float = 0.7) -> MoveUntil
Drive backward until any sensor detects a black surface.
Commands a constant backward velocity and polls the given IR sensor(s)
each control cycle. The step completes as soon as any sensor’s
probabilityOfBlack() meets or exceeds confidence_threshold. The
speed is negated internally, so you should pass a positive value.
Useful for backing up toward a black boundary line, for example repositioning before a scoring action.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
black.
speed: Backward driving speed in m/s. Pass a positive value; the sign
is negated internally. Defaults to 1.0.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for black before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that drives backward and stops when black is detected.
Example::
from libstp.sensor_ir import IRSensor
rear_ir = IRSensor(2)
# Back up at 0.4 m/s until the rear sensor detects black
step = drive_backward_until_black(rear_ir, speed=0.4)
drive_backward_until_white(sensor: Union[IRSensor, list[IRSensor]], speed: float = 1.0, confidence_threshold: float = 0.7) -> MoveUntil
Drive backward until any sensor detects a white surface.
Commands a constant backward velocity and polls the given IR sensor(s)
each control cycle. The step completes as soon as any sensor’s
probabilityOfWhite() meets or exceeds confidence_threshold. The
speed is negated internally, so you should pass a positive value.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
white.
speed: Backward driving speed in m/s. Pass a positive value; the sign
is negated internally. Defaults to 1.0.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for white before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that drives backward and stops when white is detected.
Example::
from libstp.sensor_ir import IRSensor
rear_ir = IRSensor(2)
# Back up slowly until the rear sensor sees white
step = drive_backward_until_white(rear_ir, speed=0.3)
drive_forward_until_black(sensor: Union[IRSensor, list[IRSensor]], speed: float = 1.0, confidence_threshold: float = 0.7) -> MoveUntil
Drive forward until any sensor detects a black surface.
Commands a constant forward velocity and polls the given IR sensor(s) each
control cycle. The step completes as soon as any sensor’s
probabilityOfBlack() meets or exceeds confidence_threshold. The
speed is forced positive (forward) regardless of the sign passed in.
This is the recommended way to drive forward until a black line or region is detected, for example when approaching a Botball scoring zone bounded by black tape.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
black.
speed: Forward driving speed in m/s. The absolute value is used, so
negative inputs are treated as positive. Defaults to 1.0.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for black before the step considers the condition met.
Lower values make detection more sensitive but increase the risk of
false positives. Defaults to 0.7.
Returns: MoveUntil: A configured motion step that drives forward and stops when black is detected.
Example::
from libstp.sensor_ir import IRSensor
front_ir = IRSensor(0)
# Drive forward at 0.5 m/s until the front sensor sees black
step = drive_forward_until_black(front_ir, speed=0.5)
# Use two sensors -- stop when either one detects black
left_ir = IRSensor(1)
step = drive_forward_until_black([front_ir, left_ir], speed=0.8)
drive_forward_until_white(sensor: Union[IRSensor, list[IRSensor]], speed: float = 1.0, confidence_threshold: float = 0.7) -> MoveUntil
Drive forward until any sensor detects a white surface.
Commands a constant forward velocity and polls the given IR sensor(s) each
control cycle. The step completes as soon as any sensor’s
probabilityOfWhite() meets or exceeds confidence_threshold. The
speed is forced positive (forward) regardless of the sign passed in.
Useful for driving across a dark region until the robot reaches a white surface, for example crossing black tape to re-enter the playing field.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
white.
speed: Forward driving speed in m/s. The absolute value is used, so
negative inputs are treated as positive. Defaults to 1.0.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for white before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that drives forward and stops when white is detected.
Example::
from libstp.sensor_ir import IRSensor
front_ir = IRSensor(0)
# Drive forward at default speed until white is found
step = drive_forward_until_white(front_ir)
# Slower approach with stricter detection
step = drive_forward_until_white(front_ir, speed=0.3, confidence_threshold=0.9)
strafe_left_until_black(sensor: Union[IRSensor, list[IRSensor]], speed: float = 0.3, confidence_threshold: float = 0.7) -> MoveUntil
Strafe left until any sensor detects a black surface.
Commands a constant leftward lateral velocity and polls the given IR
sensor(s) each control cycle. The step completes as soon as any sensor’s
probabilityOfBlack() meets or exceeds confidence_threshold. The
speed is negated internally to produce leftward motion, so you should pass
a positive value.
Prerequisites: Requires a mecanum or holonomic drivetrain capable of lateral movement.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
black.
speed: Lateral speed in m/s. Pass a positive value; the sign is
negated internally to strafe left. Defaults to 0.3.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for black before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that strafes left and stops when black is detected.
Example::
from libstp.sensor_ir import IRSensor
left_ir = IRSensor(1)
# Strafe left at 0.2 m/s until the left sensor finds a black line
step = strafe_left_until_black(left_ir, speed=0.2)
strafe_left_until_white(sensor: Union[IRSensor, list[IRSensor]], speed: float = 0.3, confidence_threshold: float = 0.7) -> MoveUntil
Strafe left until any sensor detects a white surface.
Commands a constant leftward lateral velocity and polls the given IR
sensor(s) each control cycle. The step completes as soon as any sensor’s
probabilityOfWhite() meets or exceeds confidence_threshold. The
speed is negated internally to produce leftward motion, so you should pass
a positive value.
Prerequisites: Requires a mecanum or holonomic drivetrain capable of lateral movement.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
white.
speed: Lateral speed in m/s. Pass a positive value; the sign is
negated internally to strafe left. Defaults to 0.3.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for white before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that strafes left and stops when white is detected.
Example::
from libstp.sensor_ir import IRSensor
left_ir = IRSensor(1)
# Strafe left at default speed until the left sensor finds white
step = strafe_left_until_white(left_ir)
strafe_right_until_black(sensor: Union[IRSensor, list[IRSensor]], speed: float = 0.3, confidence_threshold: float = 0.7) -> MoveUntil
Strafe right until any sensor detects a black surface.
Commands a constant rightward lateral velocity and polls the given IR
sensor(s) each control cycle. The step completes as soon as any sensor’s
probabilityOfBlack() meets or exceeds confidence_threshold. The
speed is forced positive (rightward) regardless of the sign passed in.
Prerequisites: Requires a mecanum or holonomic drivetrain capable of lateral movement.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
black.
speed: Lateral speed in m/s. The absolute value is used, so negative
inputs are treated as positive. Defaults to 0.3.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for black before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that strafes right and stops when black is detected.
Example::
from libstp.sensor_ir import IRSensor
right_ir = IRSensor(3)
# Strafe right at 0.25 m/s until the right sensor hits a black line
step = strafe_right_until_black(right_ir, speed=0.25)
strafe_right_until_white(sensor: Union[IRSensor, list[IRSensor]], speed: float = 0.3, confidence_threshold: float = 0.7) -> MoveUntil
Strafe right until any sensor detects a white surface.
Commands a constant rightward lateral velocity and polls the given IR
sensor(s) each control cycle. The step completes as soon as any sensor’s
probabilityOfWhite() meets or exceeds confidence_threshold. The
speed is forced positive (rightward) regardless of the sign passed in.
Prerequisites: Requires a mecanum or holonomic drivetrain capable of lateral movement.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
white.
speed: Lateral speed in m/s. The absolute value is used, so negative
inputs are treated as positive. Defaults to 0.3.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for white before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that strafes right and stops when white is detected.
Example::
from libstp.sensor_ir import IRSensor
right_ir = IRSensor(3)
# Strafe right at default speed until the right sensor finds white
step = strafe_right_until_white(right_ir)
# Use multiple sensors with high confidence
bottom_ir = IRSensor(4)
step = strafe_right_until_white(
[right_ir, bottom_ir], speed=0.2, confidence_threshold=0.85
)
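All of these until-black/until-white steps share the same completion rule: the step finishes as soon as any one sensor crosses the threshold. A minimal sketch of that predicate, using a stand-in class in place of the real IRSensor (only the probabilityOfWhite accessor named above is assumed):

```python
class FakeSensor:
    """Stand-in for IRSensor, exposing only the probability accessor."""
    def __init__(self, p: float):
        self._p = p

    def probabilityOfWhite(self) -> float:
        return self._p

def white_detected(sensors, confidence_threshold: float = 0.7) -> bool:
    # Completion condition: ANY sensor meets or exceeds the threshold.
    return any(s.probabilityOfWhite() >= confidence_threshold for s in sensors)

sensors = [FakeSensor(0.2), FakeSensor(0.9)]
done = white_detected(sensors)   # second sensor is above 0.7, so done
```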
turn_left_until_black(sensor: Union[IRSensor, list[IRSensor]], speed: float = 1.0, confidence_threshold: float = 0.7) -> MoveUntil
Turn left (counter-clockwise) until any sensor detects a black surface.
Commands a constant counter-clockwise angular velocity and polls the given
IR sensor(s) each control cycle. The step completes as soon as any sensor’s
probabilityOfBlack() meets or exceeds confidence_threshold. The
speed is forced positive (CCW) regardless of the sign passed in.
A common use case is sweeping a side-mounted IR sensor across the field to locate a black line, then stopping precisely when found.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
black.
speed: Angular speed in rad/s. The absolute value is used, so negative
inputs are treated as positive. Defaults to 1.0.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for black before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that turns left and stops when black is detected.
Example::
from libstp.sensor_ir import IRSensor
right_ir = IRSensor(3)
# Sweep left at 0.8 rad/s until the right sensor crosses a black line
step = turn_left_until_black(right_ir, speed=0.8)
turn_left_until_white(sensor: Union[IRSensor, list[IRSensor]], speed: float = 1.0, confidence_threshold: float = 0.7) -> MoveUntil
Turn left (counter-clockwise) until any sensor detects a white surface.
Commands a constant counter-clockwise angular velocity and polls the given
IR sensor(s) each control cycle. The step completes as soon as any sensor’s
probabilityOfWhite() meets or exceeds confidence_threshold. The
speed is forced positive (CCW) regardless of the sign passed in.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
white.
speed: Angular speed in rad/s. The absolute value is used, so negative
inputs are treated as positive. Defaults to 1.0.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for white before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that turns left and stops when white is detected.
Example::
from libstp.sensor_ir import IRSensor
right_ir = IRSensor(3)
# Sweep left at 0.6 rad/s until white surface is found
step = turn_left_until_white(right_ir, speed=0.6)
turn_right_until_black(sensor: Union[IRSensor, list[IRSensor]], speed: float = 1.0, confidence_threshold: float = 0.7) -> MoveUntil
Turn right (clockwise) until any sensor detects a black surface.
Commands a constant clockwise angular velocity and polls the given IR
sensor(s) each control cycle. The step completes as soon as any sensor’s
probabilityOfBlack() meets or exceeds confidence_threshold. The
speed is negated internally to produce clockwise rotation, so you should
pass a positive value.
A common use case is sweeping a side-mounted IR sensor across the field to locate a black line, then stopping precisely when found.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
black.
speed: Angular speed in rad/s. Pass a positive value; the sign is
negated internally to rotate clockwise. Defaults to 1.0.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for black before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that turns right and stops when black is detected.
Example::
from libstp.sensor_ir import IRSensor
left_ir = IRSensor(1)
# Sweep right at 0.8 rad/s until the left sensor crosses a black line
step = turn_right_until_black(left_ir, speed=0.8)
turn_right_until_white(sensor: Union[IRSensor, list[IRSensor]], speed: float = 1.0, confidence_threshold: float = 0.7) -> MoveUntil
Turn right (clockwise) until any sensor detects a white surface.
Commands a constant clockwise angular velocity and polls the given IR
sensor(s) each control cycle. The step completes as soon as any sensor’s
probabilityOfWhite() meets or exceeds confidence_threshold. The
speed is negated internally to produce clockwise rotation, so you should
pass a positive value.
Args:
sensor: A single :class:~libstp.sensor_ir.IRSensor or a list of
sensors. The step triggers when any sensor in the list detects
white.
speed: Angular speed in rad/s. Pass a positive value; the sign is
negated internally to rotate clockwise. Defaults to 1.0.
confidence_threshold: Minimum probability (0.0 – 1.0) that the sensor
must report for white before the step considers the condition met.
Defaults to 0.7.
Returns: MoveUntil: A configured motion step that turns right and stops when white is detected.
Example::
from libstp.sensor_ir import IRSensor
left_ir = IRSensor(1)
# Sweep right slowly until the left sensor finds white
step = turn_right_until_white(left_ir, speed=0.5)
stop(hard: bool = True) -> Stop
Stop all drive motors immediately.
Use this between motion sequences or at the end of a mission to ensure the robot is stationary.
Args:
hard: If True (default), immediately zero motor output.
If False, decelerate smoothly using the drive controller.
Returns: A Stop step instance.
Example::
from libstp.step import seq
from libstp.step.motion import drive_forward, stop
# Drive forward then stop
seq([drive_forward(50), stop()])
strafe_left(cm: float, speed: float = 1.0) -> Drive
Strafe left by a specified distance using profiled PID motion control.
Requires a mecanum or omni-wheel drivetrain. The robot moves laterally to the left while maintaining heading via IMU feedback.
Args:
cm: Distance to strafe in centimeters.
speed: Fraction of max speed, 0.0 to 1.0 (default 1.0).
Returns: A Drive step configured for leftward lateral motion.
Example::
from libstp.step.motion import strafe_left
# Strafe left 15 cm to dodge an obstacle
strafe_left(15)
strafe_right(cm: float, speed: float = 1.0) -> Drive
Strafe right by a specified distance using profiled PID motion control.
Requires a mecanum or omni-wheel drivetrain. The robot moves laterally to the right while maintaining heading via IMU feedback.
Args:
cm: Distance to strafe in centimeters.
speed: Fraction of max speed, 0.0 to 1.0 (default 1.0).
Returns: A Drive step configured for rightward lateral motion.
Example::
from libstp.step.motion import strafe_right
# Strafe right 15 cm to align with a game piece
strafe_right(15)
mark_heading_reference() -> MarkHeadingReference
Mark the current IMU heading as a reference point for absolute turns.
Captures the robot’s current absolute IMU heading and stores it as
a reference. Subsequent calls to :func:turn_to_heading will compute
turn angles relative to this stored reference, enabling absolute
heading control even after the robot has moved and turned through
other motion steps.
The reference uses the raw IMU heading, which is unaffected by the odometry resets that occur during normal motion steps.
Multiple calls overwrite the previous reference.
Returns: A MarkHeadingReference step that records the heading when executed.
Example::
from libstp.step.motion import mark_heading_reference, turn_to_heading
# Mark current heading as 0-degree reference
mark_heading_reference()
# ... robot drives around ...
# Turn to face 180 degrees from where we marked
turn_to_heading(180)
turn_left(degrees: float, speed: float = 1.0) -> Turn
Turn counter-clockwise (left) by a specified angle.
Uses a PID controller on the IMU heading to rotate the robot in place. The controller saturates output at the configured max angular rate, producing an implicit trapezoidal velocity profile.
Args:
degrees: Angle to turn in degrees (positive = counter-clockwise).
speed: Fraction of max angular speed, 0.0 to 1.0 (default 1.0).
Returns: A Turn step configured for counter-clockwise rotation.
Example::
from libstp.step.motion import turn_left
# Turn 90 degrees to the left
turn_left(90)
# Gentle 45-degree turn at half speed
turn_left(45, speed=0.5)
turn_right(degrees: float, speed: float = 1.0) -> Turn
Turn clockwise (right) by a specified angle.
Uses a PID controller on the IMU heading to rotate the robot in place. The controller saturates output at the configured max angular rate, producing an implicit trapezoidal velocity profile.
Args:
degrees: Angle to turn in degrees (positive = clockwise).
speed: Fraction of max angular speed, 0.0 to 1.0 (default 1.0).
Returns: A Turn step configured for clockwise rotation.
Example::
from libstp.step.motion import turn_right
# Turn 90 degrees to the right
turn_right(90)
turn_to_heading(degrees: float, speed: float = 1.0) -> Defer
Turn to an absolute heading relative to the marked reference.
Computes the shortest rotation from the robot’s current heading to
the target heading (reference + degrees) at execution time, then
delegates to :func:turn_left or :func:turn_right accordingly.
The turn direction is chosen automatically to minimize rotation.
Requires :func:mark_heading_reference to have been called earlier
in the mission.
Args:
degrees: Target heading in degrees relative to the reference. 0 returns
to the reference heading, 90 faces 90 degrees counter-clockwise from it,
-90 faces 90 degrees clockwise, etc.
speed: Fraction of max angular speed, 0.0 to 1.0 (default 1.0).
Returns: A deferred step that computes and executes the turn at runtime.
Raises: RuntimeError: If no heading reference has been set.
Example::
from libstp.step.motion import mark_heading_reference, turn_to_heading
# Mark heading at start of mission
mark_heading_reference()
# Drive around, turn, etc.
drive_forward(30)
turn_left(45)
drive_forward(20)
# Return to original heading
turn_to_heading(0)
# Face the opposite direction from reference
turn_to_heading(180)
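The shortest-rotation choice can be sketched as a signed angle difference wrapped into (-180, 180]. This is an illustrative reconstruction, not the library's actual implementation; per the conventions above, a positive result would delegate to turn_left (counter-clockwise) and a negative one to turn_right:

```python
def shortest_delta(current_deg: float, target_deg: float) -> float:
    """Signed smallest rotation from current to target heading, in degrees.

    Result lies in (-180, 180]: positive = counter-clockwise (turn_left),
    negative = clockwise (turn_right).
    """
    delta = (target_deg - current_deg) % 360.0
    if delta > 180.0:
        delta -= 360.0
    return delta

# Turning from 350 deg to 10 deg is a 20 deg CCW turn, not 340 deg CW
```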
wait_until_distance(cm: float) -> WaitUntilDistance
Wait until the robot has driven at least the given distance.
Polls the odometry's straight-line distance from the origin at 100 Hz.
Designed to run inside a parallel() branch alongside a drive step,
enabling actions to trigger at specific distances during a drive.
Args:
cm: Distance threshold in centimeters.
Returns: A WaitUntilDistance step.
Example::
from libstp.step import parallel, seq
from libstp.step.motion import drive_forward, wait_until_distance
from libstp.step.servo import servo
# Open a servo after driving 30 cm into a 50 cm drive
parallel([
drive_forward(50),
seq([wait_until_distance(30), servo(claw, 90)]),
])
wall_align_backward(speed: float = 1.0, accel_threshold: float = 0.5, settle_duration: float = 0.2, max_duration: float = 5.0, grace_period: float = 0.3) -> WallAlign
Drive backward into a wall and align the back of the robot.
Applies a constant backward velocity without heading correction so that the robot naturally rotates flush against the wall surface. Uses IMU bump detection to know when the wall has been reached.
Args:
speed: Drive speed in m/s (default 1.0).
accel_threshold: Minimum XY linear-acceleration magnitude in m/s² to
classify as a bump (default 0.5).
settle_duration: Seconds to keep pushing after impact (default 0.2).
max_duration: Safety timeout in seconds (default 5.0).
grace_period: Seconds to ignore acceleration at start (default 0.3).
Returns: A WallAlign step driving backward with bump detection.
Example::
from libstp.step import seq
from libstp.step.motion import wall_align_backward, drive_backward
# Drive to the wall in reverse, then align against it
seq([drive_backward(30), wall_align_backward()])
wall_align_forward(speed: float = 1.0, accel_threshold: float = 0.5, settle_duration: float = 0.2, max_duration: float = 5.0, grace_period: float = 0.3) -> WallAlign
Drive forward into a wall and align the front of the robot.
Applies a constant forward velocity without heading correction so that the robot naturally rotates flush against the wall surface. The step monitors gravity-compensated linear acceleration from the IMU and stops once a collision spike is detected and a short settle period has elapsed.
The grace period prevents false triggers from the initial acceleration transient when the robot starts moving.
After the step completes, step.bump_result contains the impact
magnitude, estimated wall misalignment angle, and the heading
correction applied during the settle push.
Args:
speed: Drive speed in m/s (default 1.0).
accel_threshold: Minimum XY linear-acceleration magnitude in m/s² to
classify as a bump (default 0.5). Lower values are more sensitive but
may false-trigger on rough surfaces.
settle_duration: Seconds to keep pushing after the bump is detected,
letting the chassis rotate flush (default 0.2).
max_duration: Safety timeout in seconds; the step finishes even if no
bump is detected (default 5.0).
grace_period: Seconds to ignore acceleration after starting, so the
robot's own acceleration doesn't trigger detection (default 0.3).
Returns: A WallAlign step driving forward with bump detection.
Example::
from libstp.step import seq
from libstp.step.motion import wall_align_forward, drive_forward
# Drive near the wall, then bump-align against it
seq([drive_forward(30), wall_align_forward()])
# More sensitive detection at slower speed
wall_align_forward(speed=0.3, accel_threshold=0.3)
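The grace-period, threshold, and settle behavior described above can be modeled as a small loop over time-stamped acceleration samples. This is a sketch of the detection logic only; the sample timing and exact firmware behavior are assumptions:

```python
def detect_bump(samples, accel_threshold=0.5, grace_period=0.3,
                settle_duration=0.2, max_duration=5.0):
    """Return the time the settle push should end, or None on timeout.

    samples: iterable of (t_seconds, xy_accel_magnitude) pairs.
    """
    for t, accel in samples:
        if t > max_duration:
            return None                  # safety timeout, no bump seen
        if t < grace_period:
            continue                     # ignore the launch transient
        if accel >= accel_threshold:
            return t + settle_duration   # keep pushing to rotate flush
    return None

# The spike at t=0.1 s falls inside the grace period and is ignored;
# the bump at t=1.0 s ends the push at t=1.2 s
end = detect_bump([(0.1, 2.0), (0.5, 0.1), (1.0, 0.8)])
```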
wall_align_strafe_left(speed: float = 0.5, accel_threshold: float = 0.5, settle_duration: float = 0.2, max_duration: float = 5.0, grace_period: float = 0.3) -> WallAlign
Strafe left into a wall and align the left side of the robot.
Applies a constant leftward velocity without heading correction so that the robot naturally rotates flush against the wall surface. Uses IMU bump detection. Requires a mecanum or omni drivetrain capable of lateral movement.
Args:
speed: Strafe speed in m/s (default 0.5).
accel_threshold: Minimum XY linear-acceleration magnitude in m/s² to
classify as a bump (default 0.5).
settle_duration: Seconds to keep pushing after impact (default 0.2).
max_duration: Safety timeout in seconds (default 5.0).
grace_period: Seconds to ignore acceleration at start (default 0.3).
Returns: A WallAlign step strafing left with bump detection.
Example::
from libstp.step.motion import wall_align_strafe_left
# Strafe-align the left side against a wall
wall_align_strafe_left(speed=0.4)
wall_align_strafe_right(speed: float = 0.5, accel_threshold: float = 0.5, settle_duration: float = 0.2, max_duration: float = 5.0, grace_period: float = 0.3) -> WallAlign
Strafe right into a wall and align the right side of the robot.
Applies a constant rightward velocity without heading correction so that the robot naturally rotates flush against the wall surface. Uses IMU bump detection. Requires a mecanum or omni drivetrain capable of lateral movement.
Args:
speed: Strafe speed in m/s (default 0.5).
accel_threshold: Minimum XY linear-acceleration magnitude in m/s² to
classify as a bump (default 0.5).
settle_duration: Seconds to keep pushing after impact (default 0.2).
max_duration: Safety timeout in seconds (default 5.0).
grace_period: Seconds to ignore acceleration at start (default 0.3).
Returns: A WallAlign step strafing right with bump detection.
Example::
from libstp.step.motion import wall_align_strafe_right
# Strafe-align the right side against a wall
wall_align_strafe_right(speed=0.4)
motor_brake(motor: IMotor) -> StopMotor
Actively brake a motor and hold its current position.
Engages the firmware’s active brake (stop latch), which commands the motor controller to resist any external force and maintain the current shaft position. This is the strongest stop mode and is appropriate when the mechanism must not move (e.g. holding an arm up against gravity).
Args:
motor: The motor to brake, obtained from the robot hardware map
(e.g. robot.motor(2)).
Returns:
A StopMotor step configured for active braking mode.
Example::
from libstp.step.motor import motor_move_to, motor_brake
# Move the arm to a target, then lock it in place
sequence(
motor_move_to(robot.motor(2), position=400),
motor_brake(robot.motor(2)),
)
motor_dps(motor: IMotor, dps: float) -> SetMotorDps
Set a motor to closed-loop velocity in degrees per second.
Converts the requested angular velocity from degrees per second into firmware BEMF units using the motor’s calibration data, then engages the firmware’s closed-loop velocity controller. The step completes immediately; the motor continues at the requested speed until changed.
Prerequisites:
The motor must have valid calibration data (ticks_to_rad) so
the deg/s to BEMF conversion is accurate. Run the motor
calibration step first if calibration has not been performed.
Args:
motor: The motor to control, obtained from the robot hardware map
(e.g. robot.motor(0)).
dps: Target angular velocity in degrees per second. Positive
values drive forward; negative values drive in reverse.
Returns:
A SetMotorDps step ready to be scheduled in a step sequence.
Example::
from libstp.step.motor import motor_dps, motor_brake
# Spin the left wheel at 180 deg/s for 2 seconds
sequence(
motor_dps(robot.motor(0), 180.0),
wait(2.0),
motor_brake(robot.motor(0)),
)
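One plausible shape of the deg/s conversion described above, assuming ticks_to_rad is expressed in radians per encoder tick. The additional scaling into ticks per BEMF sample period is omitted because the sample period is not specified here; this is an assumption-laden sketch, not the library's code:

```python
import math

def dps_to_ticks_per_s(dps: float, ticks_to_rad: float) -> float:
    """Convert deg/s to encoder ticks/s.

    Assumes ticks_to_rad is the calibration value in radians per tick.
    """
    rad_per_s = math.radians(dps)       # deg/s -> rad/s
    return rad_per_s / ticks_to_rad     # rad/s -> ticks/s

# 180 deg/s at a hypothetical 0.01 rad per tick is roughly 314 ticks/s
rate = dps_to_ticks_per_s(180.0, 0.01)
```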
motor_move_relative(motor: IMotor, delta: int, velocity: int = 1000, timeout: float | None = None) -> MoveMotorRelative
Move a motor by a relative encoder delta and wait for completion.
Commands the motor’s firmware position controller to move by the given number of encoder ticks relative to its current position. The step blocks until the firmware reports the move is complete, or until the optional timeout expires.
Args:
motor: The motor to control, obtained from the robot hardware map
(e.g. robot.motor(2)).
delta: Number of encoder ticks to move. Positive values move
forward; negative values move in reverse.
velocity: Movement speed in firmware ticks/s. Must be positive.
Defaults to 1000.
timeout: Maximum seconds to wait for the move to finish. None
(the default) means wait indefinitely. If the timeout fires,
a warning is logged and the step returns without stopping the
motor.
Returns:
A MoveMotorRelative step ready to be scheduled in a step
sequence.
Example::
from libstp.step.motor import motor_move_relative, motor_brake
# Rotate the arm motor 200 ticks forward, then brake
sequence(
motor_move_relative(robot.motor(2), delta=200, velocity=600),
motor_brake(robot.motor(2)),
)
motor_move_to(motor: IMotor, position: int, velocity: int = 1000, timeout: float | None = None) -> MoveMotorTo
Move a motor to an absolute encoder position and wait for completion.
Commands the motor’s firmware position controller to drive to the given absolute encoder tick count. The step blocks (yielding to the async loop) until the firmware reports the move is complete, or until the optional timeout expires.
Args:
motor: The motor to control, obtained from the robot hardware map
(e.g. robot.motor(2)).
position: Target position in absolute encoder ticks.
velocity: Movement speed in firmware ticks/s. Must be positive.
Defaults to 1000.
timeout: Maximum seconds to wait for the move to finish. None
(the default) means wait indefinitely. If the timeout fires,
a warning is logged and the step returns without stopping the
motor.
Returns:
A MoveMotorTo step ready to be scheduled in a step sequence.
Example::
from libstp.step.motor import motor_move_to
# Raise the arm to encoder position 500 at speed 800, with a 3s safety timeout
motor_move_to(robot.motor(2), position=500, velocity=800, timeout=3.0)
motor_off(motor: IMotor) -> StopMotor
Turn a motor off, allowing it to coast freely.
Removes all power from the motor so it spins down under friction alone. The motor shaft is not held in place and can be back-driven. Use this when you want the mechanism to move freely (e.g. letting an arm fall under gravity).
Args:
motor: The motor to turn off, obtained from the robot hardware map
(e.g. robot.motor(0)).
Returns:
A StopMotor step configured for coast / free-spin mode.
Example::
from libstp.step.motor import motor_power, motor_off
# Run a motor briefly, then let it coast to a stop
sequence(
motor_power(robot.motor(3), 80),
wait(1.5),
motor_off(robot.motor(3)),
)
motor_passive_brake(motor: IMotor) -> StopMotor
Passively brake a motor by commanding zero power.
Sets the motor power to zero, which causes the H-bridge to short the
motor leads. This provides electrical braking that decelerates the
motor faster than coasting (motor_off), but does not actively hold
position once the motor stops.
Args:
motor: The motor to brake, obtained from the robot hardware map
(e.g. robot.motor(0)).
Returns:
A StopMotor step configured for passive braking mode.
Example::
from libstp.step.motor import motor_velocity, motor_passive_brake
# Drive forward then passively brake
sequence(
motor_velocity(robot.motor(0), 600),
wait(2.0),
motor_passive_brake(robot.motor(0)),
)
motor_power(motor: IMotor, percent: int) -> SetMotorPower
Set a motor to open-loop percent power.
Commands the motor at a raw duty-cycle percentage without any feedback control. The command is applied immediately and the step completes without waiting – the motor keeps spinning until another command (or a stop step) is issued.
Args:
motor: The motor to control, obtained from the robot hardware map
(e.g. robot.motor(0)).
percent: Duty-cycle power from -100 (full reverse) to 100 (full
forward). Values outside this range raise ValueError.
Returns:
A SetMotorPower step ready to be scheduled in a step sequence.
Example::
from libstp.step.motor import motor_power, motor_off
# Spin the claw motor forward at half power for 2 seconds
sequence(
motor_power(robot.motor(1), 50),
wait(2.0),
motor_off(robot.motor(1)),
)
motor_velocity(motor: IMotor, velocity: int) -> SetMotorVelocity
Set a motor to closed-loop velocity in firmware BEMF units.
Commands the motor’s firmware PID controller to maintain a target velocity expressed in raw BEMF tick units. The step completes immediately; the motor continues at the requested speed until changed.
For a human-friendly velocity interface, prefer motor_dps which
accepts degrees per second.
Args:
motor: The motor to control, obtained from the robot hardware map
(e.g. robot.motor(0)).
velocity: Target velocity in firmware BEMF units (ticks per BEMF
sample period). Positive values drive forward; negative values
drive in reverse.
Returns:
A SetMotorVelocity step ready to be scheduled in a step sequence.
Example::
from libstp.step.motor import motor_velocity, motor_brake
# Drive the left wheel at 800 BEMF units, then brake
sequence(
motor_velocity(robot.motor(0), 800),
wait(3.0),
motor_brake(robot.motor(0)),
)
fully_disable_servos() -> FullyDisableServos
Fully disable all servo outputs, removing all power from the servo pins.
Commands the firmware to enter the fully-disabled servo mode for every servo port. In this mode, no PWM signal is sent and the servos can be moved freely by hand. This is useful for saving power or when the servos should not hold position (e.g. at the end of a run).
Servos will automatically re-enable when a new position command is
sent (e.g. via servo() or slow_servo()).
Returns:
A FullyDisableServos step.
Example::
from libstp.step.servo import fully_disable_servos
# Release all servos at the end of a mission
fully_disable_servos()
servo(servo: Servo, angle: float) -> SetServoPosition
Set a servo to a specific angle and wait for the move to finish.
Commands the servo to the requested angle, then sleeps for an estimated duration based on the angular distance the servo needs to travel and its approximate speed. This ensures subsequent steps do not begin until the servo has physically reached its target.
The valid angle range is 0 to 170 degrees.
Args:
servo: The servo to control, obtained from the robot hardware map
(e.g. robot.servo(0)).
angle: Target angle in degrees, from 0.0 to 170.0.
Returns:
A SetServoPosition step with automatically estimated wait
duration.
Example::
from libstp.step.servo import servo
# Open a claw by moving servo 0 to 90 degrees
servo(robot.servo(0), 90.0)
# Close the claw, then raise the arm
sequence(
servo(robot.servo(0), 10.0),
motor_move_to(robot.motor(2), position=400),
)
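The blocking behavior described above boils down to angular distance divided by an assumed servo speed. A minimal sketch of that estimate, assuming a hypothetical speed constant (`ASSUMED_SERVO_SPEED_DPS` and `estimate_move_seconds` are illustrative names, not library API):

```python
# Illustrative sketch, not library internals: estimate how long a servo
# move takes so a blocking step knows how long to sleep.
ASSUMED_SERVO_SPEED_DPS = 200.0  # hypothetical speed; real servos vary


def estimate_move_seconds(current_angle: float, target_angle: float) -> float:
    """Estimate the time the servo needs to physically reach the target."""
    distance = abs(target_angle - current_angle)
    return distance / ASSUMED_SERVO_SPEED_DPS


# A 90-degree move at an assumed 200 deg/s takes about 0.45 s.
estimate_move_seconds(0.0, 90.0)
```

The real step presumably tracks the last commanded angle to compute the distance; the sketch takes both angles explicitly for clarity.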
shake_servo(servo: Servo, duration: float, angle_a: float, angle_b: float) -> ShakeServo
Oscillate a servo back and forth between two angles for a set time.
Rapidly alternates the servo between angle_a and angle_b for
the given duration. The dwell time at each angle is automatically
estimated from the angular distance so the servo has time to
physically reach each endpoint before reversing. Useful for shaking
objects loose or signalling the operator.
The valid angle range is 0 to 170 degrees.
Args:
servo: The servo to control, obtained from the robot hardware map
(e.g. robot.servo(1)).
duration: Total oscillation time in seconds. Must be >= 0.
angle_a: First oscillation endpoint in degrees (0.0 – 170.0).
angle_b: Second oscillation endpoint in degrees (0.0 – 170.0).
Returns:
A ShakeServo step ready to be scheduled in a step sequence.
Example::
from libstp.step.servo import shake_servo
# Shake a sorting tray for 3 seconds between 60 and 120 degrees
shake_servo(robot.servo(1), duration=3.0, angle_a=60.0, angle_b=120.0)
slow_servo(servo: Servo, angle: float, speed: float = 60.0) -> EaseServo
Move a servo to an angle with smooth ease-in/ease-out motion.
Instead of commanding the servo to jump straight to the target (as
servo() does), this step interpolates through intermediate
positions using a smoothstep curve (3t^2 - 2t^3). The result is a
gentle acceleration and deceleration that avoids mechanical shock and
reduces jerk on the mechanism.
The total move duration is derived from the angular distance divided
by speed. Intermediate positions are updated at ~10 Hz.
The valid angle range is 0 to 170 degrees.
Args:
servo: The servo to control, obtained from the robot hardware map
(e.g. robot.servo(0)).
angle: Target angle in degrees (0.0 – 170.0).
speed: Movement speed in degrees per second. Must be positive.
Defaults to 60.0 deg/s, which moves the full range in about
2.8 seconds.
Returns:
An EaseServo step ready to be scheduled in a step sequence.
Example::
from libstp.step.servo import slow_servo
# Gently lower the arm servo to 20 degrees at 45 deg/s
slow_servo(robot.servo(0), angle=20.0, speed=45.0)
# Use default speed for a smooth open
slow_servo(robot.servo(0), angle=150.0)
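The easing described above can be pictured as sampling the smoothstep curve at ~10 Hz. A minimal sketch under those assumptions (`smoothstep` and `ease_positions` are illustrative names, not the library's internals):

```python
# Illustrative sketch, not library code: interpolate servo angles with
# the smoothstep curve 3t^2 - 2t^3 described in the slow_servo docs.

def smoothstep(t: float) -> float:
    """Smoothstep easing: maps 0 -> 0 and 1 -> 1 with zero slope at both ends."""
    return 3 * t**2 - 2 * t**3


def ease_positions(start: float, target: float, speed_dps: float, rate_hz: float = 10.0):
    """Yield intermediate angles at ~rate_hz until the target is reached."""
    duration = abs(target - start) / speed_dps      # total move time in seconds
    steps = max(1, int(duration * rate_hz))         # number of ~10 Hz updates
    for i in range(1, steps + 1):
        t = i / steps                               # normalized time in [0, 1]
        yield start + (target - start) * smoothstep(t)


# Moving 20 -> 80 degrees at 60 deg/s takes ~1 s, i.e. ~10 updates at 10 Hz.
angles = list(ease_positions(20.0, 80.0, 60.0))
```

Because the curve's slope is zero at both endpoints, the commanded angle changes slowly at the start and end of the move, which is what avoids the mechanical shock mentioned above.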
wait_for_button() -> WaitForButton
Wait for the operator to press the hardware button before continuing.
Blocks the step sequence until the physical button on the robot controller is pressed. A “Waiting for button press…” message is displayed on the UI while waiting. This is useful for pausing between autonomous phases, confirming the robot is correctly positioned before starting, or gating destructive actions behind human approval.
Returns:
A WaitForButton UI step ready to be scheduled in a step
sequence.
Example::
from libstp.step import wait_for_button
# Wait for the operator to confirm placement before starting
sequence(
wait_for_button(),
motor_power(robot.motor(0), 100),
wait(3.0),
motor_brake(robot.motor(0)),
)
do_until_checkpoint(checkpoint: float, step) -> DoUntilCheckpoint
Run a step until a mission-relative time checkpoint, then cancel it.
Starts executing step immediately and cancels it when the robot’s
global synchronizer clock reaches checkpoint seconds since mission
start. If the step finishes before the checkpoint, execution continues
without waiting. This is useful for time-boxing actions within a timed
Botball run (e.g. “search for objects, but stop at T=45s no matter what”).
Prerequisites:
The robot must have a synchronizer configured. The synchronizer
clock starts when the mission begins.
Args:
checkpoint: The mission-relative deadline (in seconds) at which
step will be cancelled.
step: The step to run. Will be cancelled if still active when the
checkpoint time is reached.
Returns:
A DoUntilCheckpoint step that manages the time-boxed execution.
Example::
from libstp.step.timing import do_until_checkpoint
from libstp.step.logic import loop_forever
# Search for objects until T=45s, then move on
search = loop_forever(seq([
scan_for_object(),
drive_forward(10),
]))
seq([
do_until_checkpoint(45.0, search),
drive_to_start(),
])
wait_for_checkpoint(checkpoint_seconds: float) -> WaitForCheckpoint
Wait until a mission-relative time checkpoint is reached.
Pauses execution until the robot’s global synchronizer clock reaches the specified number of seconds since mission start. If the checkpoint time has already passed, the step returns immediately. This is useful for synchronizing actions to absolute times within a timed Botball run (e.g. “at T=20s, start collecting”).
Prerequisites:
The robot must have a synchronizer configured. The synchronizer
clock starts when the mission begins.
Args:
checkpoint_seconds: The mission-relative time (in seconds) to wait
for. Must be non-negative.
Returns:
A WaitForCheckpoint step that blocks until the checkpoint time.
Example::
from libstp.step.timing import wait_for_checkpoint
seq([
drive_forward(50),
# Ensure we don't start the next action before T=10s
wait_for_checkpoint(10.0),
pick_up_tribble(),
# Gate the final action to T=25s
wait_for_checkpoint(25.0),
drive_to_bin(),
])
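The return-immediately-if-passed behavior amounts to clamping the remaining wait at zero. A minimal sketch of that rule (`seconds_until_checkpoint` is a hypothetical helper, not library API):

```python
# Illustrative sketch, not library internals: how long to sleep so that
# execution resumes exactly at mission time T = checkpoint_seconds.

def seconds_until_checkpoint(mission_start: float, checkpoint_seconds: float, now: float) -> float:
    """Remaining wait until the checkpoint; 0.0 if it has already passed."""
    elapsed = now - mission_start
    return max(0.0, checkpoint_seconds - elapsed)


# Mission started at t=100; at t=108, a T=10s checkpoint means 2 s to go.
seconds_until_checkpoint(100.0, 10.0, now=108.0)
```

The clamp to zero is what makes a late checkpoint a no-op rather than an error, matching the behavior documented above.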
wait(seconds: float) -> WaitForSeconds
Pause execution for a fixed number of seconds.
Suspends the current step sequence for the specified duration using an async sleep. No hardware commands are issued during the wait, and other concurrent tasks (e.g. odometry updates) continue running normally. The wait duration is deterministic and is reflected accurately in simulation estimates.
Args:
seconds: Duration to pause in seconds. Must be non-negative. Passing 0
yields control to the event loop for one tick without any meaningful
delay.
Returns:
A WaitForSeconds step ready to be scheduled in a step
sequence.
Example::
from libstp.step import wait
# Pause between two motor commands
sequence(
motor_power(robot.motor(0), 100),
wait(2.5),
motor_brake(robot.motor(0)),
)
# Brief yield to let sensors settle
wait(0.1)
wait_for_light(sensor: AnalogSensor, drop_fraction: float = 0.15, confirm_count: int = 3, warmup_seconds: float = 1.0, poll_interval: float = 0.005) -> WaitForLight
Wait for the start lamp using automatic Kalman-filtered flank detection.
Mount the light sensor facing downward with no shielding. The step
establishes a stable baseline reading via a 1D Kalman filter during a
short warm-up phase, then arms and polls for a sharp brightness
increase (sensor value drop). When the raw reading falls below
baseline * (1 - drop_fraction) for confirm_count consecutive
samples, the step returns and the mission begins.
The downward-facing mount reduces environmental noise by up to 76% compared to a horizontal mount (Gosling et al., 2023). No black tape or straw shielding is required.
Prerequisites:
An analog light sensor (LDR) connected to the Wombat and mounted
facing the table surface. The start lamp should be positioned
diagonally above the sensor.
Args:
sensor: The AnalogSensor instance for the light sensor.
drop_fraction: Fraction the raw value must drop below the baseline
to trigger. 0.15 means a 15% brightness increase triggers the
start. Lower values are more sensitive (faster but riskier);
higher values are safer but need a stronger signal. Default 0.15.
confirm_count: Number of consecutive triggering samples required
before starting. At the default 200 Hz poll rate, 3 samples equals
~15 ms of confirmation — effectively instant while rejecting
single-sample noise spikes. Default 3.
warmup_seconds: Duration in seconds to collect baseline samples
before arming the detector. Default 1.0.
poll_interval: Seconds between sensor reads. 0.005 gives ~200 Hz.
Default 0.005.
Returns:
A WaitForLight step that blocks until the lamp turns on.
Example::
from libstp.step.wait_for_light import wait_for_light
# Default settings — works well for most setups
wait_for_light(robot.defs.wait_for_light_sensor)
# More sensitive (10% drop, 2 confirms) for weak lamp signal
wait_for_light(robot.defs.wait_for_light_sensor, drop_fraction=0.10, confirm_count=2)
# More conservative (20% drop, 5 confirms) for noisy environments
wait_for_light(robot.defs.wait_for_light_sensor, drop_fraction=0.20, confirm_count=5)
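The trigger rule described above — raw value below baseline * (1 - drop_fraction) for confirm_count consecutive samples — can be sketched on its own, leaving out the Kalman-filtered baseline. `detect_flank` is an illustrative name, not library API:

```python
# Illustrative sketch, not library internals: the confirmation logic of
# flank detection. A single noisy dip resets the counter; only a sustained
# brightness increase (lower raw readings) completes the run.

def detect_flank(samples, baseline, drop_fraction=0.15, confirm_count=3):
    """Return the index of the sample that completes the confirmation run,
    or None if the lamp never triggers."""
    threshold = baseline * (1 - drop_fraction)
    consecutive = 0
    for i, raw in enumerate(samples):
        consecutive = consecutive + 1 if raw < threshold else 0
        if consecutive >= confirm_count:
            return i
    return None


# Baseline 2000 gives a threshold of 1700. The lone dip to 1600 is
# rejected; three consecutive bright samples trigger the start.
readings = [2010, 1990, 1600, 2005, 1650, 1640, 1630, 1620]
detect_flank(readings, baseline=2000.0)
```

This is why raising confirm_count rejects noise at the cost of a slightly later start: the counter must survive confirm_count polls without a single reading above the threshold.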
wait_for_light_legacy(sensor: AnalogSensor) -> WaitForLightLegacy
Wait for light using the legacy manual-calibration threshold method.
Runs the traditional two-step calibration flow: the operator measures the sensor with the lamp off, then with it on, confirms the threshold, and the robot starts immediately. This approach requires manual interaction at the start of each run and uses a fixed midpoint threshold between the dark and light readings.
Use this only if the automatic flank-detection method
(wait_for_light) does not work for your setup, for example when
the sensor is not mounted downward or when the lamp signal is too
weak for reliable flank detection.
Args:
sensor: The AnalogSensor instance for the light sensor.
Returns:
A WaitForLightLegacy step that runs manual calibration.
Example::
from libstp.step.wait_for_light import wait_for_light_legacy
wait_for_light_legacy(robot.defs.wait_for_light_sensor)
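The fixed midpoint threshold mentioned above is simply the average of the two calibration readings. A minimal sketch, assuming lower analog readings mean more light as in the automatic method (`midpoint_threshold` and `lamp_is_on` are hypothetical helpers):

```python
# Illustrative sketch, not library internals: the legacy threshold is
# placed halfway between the lamp-off (dark) and lamp-on (light) readings.

def midpoint_threshold(dark_reading: float, light_reading: float) -> float:
    """Fixed threshold halfway between the two calibration measurements."""
    return (dark_reading + light_reading) / 2.0


def lamp_is_on(raw: float, threshold: float) -> bool:
    # Lower analog readings mean more light on a typical LDR divider.
    return raw < threshold


threshold = midpoint_threshold(dark_reading=2800.0, light_reading=1200.0)
lamp_is_on(1500.0, threshold)
```

Because the threshold is fixed at calibration time, ambient light changes between calibration and the start of the run can defeat it — the drawback that motivates the automatic flank-detection method.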