Navigation Service
The navigation service is the stateful definition of Viam’s motion service. It uses GPS to autonomously navigate a rover base to user-defined waypoints.
Configure your base with a navigation service, add waypoints, and set the mode of the service to Waypoint to move your rover along a defined path at your desired motion configuration.
Requirements
You must configure a base with movement sensors as part of your machine before you can configure a navigation service.
To use the navigation service, configure a stack of movement sensors that implement the following methods in their models' implementations of the movement sensor API:
The base should implement the following:
See navigation concepts for more info on how to implement and use movement sensors taking these measurements.
Configuration
First, make sure your base is physically assembled and powered on. Then, configure the service:
Edit and fill in the attributes as applicable.
The following attributes are available for navigation services:
Name | Type | Required? | Description |
---|---|---|---|
store | obj | Required | The type and configuration of data storage to use. Either type `"memory"`, where no additional configuration is needed and the waypoints are stored in local memory while the navigation process is running, or `"mongodb"`, where data persists at the specified MongoDB URI of your MongoDB deployment. Default: `"memory"` |
base | string | Required | The name you have configured for the base you are operating with this service. |
movement_sensor | string | Required | The name of the movement sensor you have configured for the base you are operating with this service. |
motion_service | string | Optional | The name of the motion service you have configured for the base you are operating with this service. If you have not added a motion service to your machine, the default motion service will be used. Reference this default service in your code with the name `"builtin"`. |
obstacle_detectors | array | Optional | An array containing objects with the name of each `"camera"` you have configured for the base you are navigating, along with the name of the `"vision_service"` you are using to detect obstacles. Note that any vision services on remote parts will only be able to access cameras on the same remote part. |
position_polling_frequency_hz | float | Optional | The frequency in Hz to poll for the position of the machine. Default: `1` |
obstacle_polling_frequency_hz | float | Optional | The frequency in Hz to poll each vision service for new obstacles. Default: `1` |
plan_deviation_m | float | Optional | The distance in meters that a machine is allowed to deviate from the motion plan. Default: `2.6` |
degs_per_sec | float | Optional | The default angular velocity for the base in degrees per second. Default: `20` |
meters_per_sec | float | Optional | The default linear velocity for the base in meters per second. Default: `0.3` |
obstacles | obj | Optional | Any obstacles you wish to add to the machine's path. See the motion service for more information. |
bounding_regions | obj | Optional | Set of bounds which the machine must remain within while navigating. See the motion service for more information. |
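
For reference, a navigation service configured with the default in-memory store might look something like the following sketch. The service name `my_nav_service` and the referenced names `my-base`, `my-gps`, `my-camera`, and `my-vision-service` are placeholders for resources configured on your own machine, and the surrounding top-level keys may differ slightly depending on your config version:

```json
{
  "name": "my_nav_service",
  "type": "navigation",
  "attributes": {
    "store": {
      "type": "memory"
    },
    "base": "my-base",
    "movement_sensor": "my-gps",
    "obstacle_detectors": [
      {
        "camera": "my-camera",
        "vision_service": "my-vision-service"
      }
    ],
    "meters_per_sec": 0.3,
    "degs_per_sec": 20
  }
}
```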
Configure and calibrate the frame system service for GPS navigation
Info
The frame system service is an internally managed and mostly static system for storing the reference frame of each component of a machine within a coordinate system configured by the user.
It stores the required contextual information for Viam’s services like Motion and Vision to use the position and orientation readings returned by components like movement sensors.
To make sure your rover base’s autonomous GPS navigation with the navigation service is accurate, configure and calibrate the frame system service for the components of your machine.
Configure
Add a nested reference frame configuration to your rover base and movement sensor:
Navigate to the CONFIGURE tab of your machine’s page in the Viam app and select the Frame mode.
From the left-hand menu, select your base.
Since you haven’t adjusted any parameters yet, the default reference frame will be shown for your base:
Keep the parent frame as `world`. Select the Geometry dropdown menu. Configure a Geometry for the base that reflects its physical dimensions. Measure the physical dimensions of your base and use them to configure the size of your geometry. Units are in mm.
For example, you would configure a box-shaped base with dimensions of 100mm x 100mm x 100mm (l x h x w) as follows:
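The Frame mode edits correspond to a `frame` block in the component's JSON configuration. As a rough sketch (the name `my-base` and the model `wheeled` are placeholders, and field names may vary slightly between config versions), the box geometry above would look something like:

```json
{
  "name": "my-base",
  "model": "wheeled",
  "type": "base",
  "frame": {
    "parent": "world",
    "translation": { "x": 0, "y": 0, "z": 0 },
    "geometry": {
      "type": "box",
      "x": 100,
      "y": 100,
      "z": 100
    }
  }
}
```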
Next, select your movement sensor from the left-hand menu. Click on the Parent menu and select your base component.
Give the movement sensor a Translation that reflects where it is mounted on your base, measuring the coordinates with respect to the origin of the base. In other words, designate the base origin as `(0, 0, 0)` and measure the distance between that and the origin of the sensor to obtain the coordinates.
For example, you would configure a movement sensor mounted 200mm on top of your base as follows:
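Continuing the sketch above (the sensor name `my-gps` and model `gps-nmea` are placeholders), the movement sensor's frame would set the base as its parent and translate it 200mm along the z axis:

```json
{
  "name": "my-gps",
  "model": "gps-nmea",
  "type": "movement_sensor",
  "frame": {
    "parent": "my-base",
    "translation": { "x": 0, "y": 0, "z": 200 }
  }
}
```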
You can also adjust the Orientation and Geometry of your movement sensor or base, if necessary. See the frame system service for instructions.
Calibrate
Then, to calibrate your frame system for the most accurate autonomous GPS navigation with the navigation service:
- After configuring your machine, navigate to the CONTROL tab and select the card matching the name of your movement sensor.
- Monitor the readings displayed on the card, and verify that the compass or orientation readings from the movement sensor report `0` when the base is facing north.
- If you cannot verify this:
  - Navigate back to your machine's CONFIGURE tab and Frame subtab. Scroll to the card with the name of your movement sensor. Adjust the Orientation of the frame to compensate for the mismatch.
  - Navigate back to the movement sensor card on your CONTROL page, and confirm that the compass or orientation readings from the movement sensor now report `0` when the base is facing north, confirming that you've successfully calibrated your machine to be oriented accurately within the frame system.
  - If you cannot verify this, repeat as necessary.
API
The navigation service API supports the following methods:
Method Name | Description |
---|---|
GetMode | Get the Mode the service is operating in. |
SetMode | Set the Mode the service is operating in. |
GetLocation | Get the current location of the robot in the navigation service. |
GetWaypoints | Get an array of waypoints currently in the service’s data storage. |
AddWaypoint | Add a waypoint to the service’s data storage. |
RemoveWaypoint | Remove a waypoint from the service’s data storage. |
GetObstacles | Get an array or list of the obstacles currently in the service’s data storage. |
GetPaths | Get each path, the series of geo points the robot plans to travel through to get to a destination waypoint, in the machine’s motion planning. |
GetProperties | Get information about the navigation service. |
Reconfigure | Reconfigure this resource. |
DoCommand | Execute model-specific commands that are not otherwise defined by the service API. |
GetResourceName | Get the ResourceName for this instance of the navigation service with the given name. |
Close | Safely shut down the resource and prevent further use. |
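
For example, the following is a minimal sketch of driving the service from the Go SDK. It assumes `machine` is an already-connected robot client, that the service is configured under the name `my_nav_service`, and that the method names match the navigation package in your SDK version; check the API reference for your SDK for exact signatures:

```go
package main

import (
	"context"
	"fmt"

	geo "github.com/kellydunn/golang-geo"
	"go.viam.com/rdk/robot"
	"go.viam.com/rdk/services/navigation"
)

// startWaypointNavigation adds a waypoint and switches the navigation
// service into waypoint mode so the base begins navigating toward it.
func startWaypointNavigation(ctx context.Context, machine robot.Robot) error {
	nav, err := navigation.FromRobot(machine, "my_nav_service")
	if err != nil {
		return err
	}

	// Add a destination waypoint by latitude and longitude.
	if err := nav.AddWaypoint(ctx, geo.NewPoint(40.7, -73.98), nil); err != nil {
		return err
	}

	// Switch to waypoint mode so the base starts moving toward the waypoint.
	if err := nav.SetMode(ctx, navigation.ModeWaypoint, nil); err != nil {
		return err
	}

	// Report the machine's current location.
	loc, err := nav.Location(ctx, nil)
	if err != nil {
		return err
	}
	fmt.Printf("current location: %v\n", loc)
	return nil
}
```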
Control tab usage
After configuring the navigation service for your machine, navigate to the CONTROL tab of the machine’s page in the Viam app and expand the card matching the name of your service to use an interface for rover navigation.
Here, you can toggle the mode of the service between Manual and Waypoint to start and stop navigation, add waypoints and obstacles, and view the position of your rover base on a map:
Navigation concepts
The following concepts are important to understand when utilizing the navigation service. Each concept is a type of relative or absolute measurement, taken by a movement sensor, which can then be used by your machine to navigate across space.
Here’s how to use the following types of measurements:
Compass heading
The following models of movement sensor take compass heading measurements:
- gps-nmea - some GPS hardware only report heading while moving.
- gps-nmea-rtk-pmtk - some GPS hardware only report heading while moving.
- gps-nmea-rtk-serial - some GPS hardware only report heading while moving.
- imu-wit
- imu-wit-hwt905
An example of a CompassHeading
reading:
// heading is a float64 between 0-360
heading, err := gps.CompassHeading(context.Background(), nil)
Use compass heading readings to determine the bearing of your machine, or the cardinal direction that your machine is facing.
To read compass headings, configure a capable movement sensor on your machine.
Then use the movement sensor API’s GetCompassHeading()
method to get readings from the sensor.
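As an illustration only, this hypothetical helper (not part of any Viam SDK) maps a 0-360 degree compass heading, where 0 is north and 90 is east, to the nearest cardinal direction:

```go
package main

import (
	"fmt"
	"math"
)

// cardinalFromHeading maps a compass heading in degrees (0 = north,
// 90 = east, 180 = south, 270 = west) to the nearest cardinal direction.
func cardinalFromHeading(heading float64) string {
	dirs := []string{"north", "east", "south", "west"}
	idx := int(math.Round(heading/90)) % 4
	return dirs[idx]
}

func main() {
	fmt.Println(cardinalFromHeading(350)) // north
	fmt.Println(cardinalFromHeading(95))  // east
}
```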
Orientation
The following models of movement sensor take orientation measurements:
An example of an Orientation
reading:
// orientation is an OrientationVector struct with OX, OY, OZ denoting the coordinates of the vector, and Theta denoting rotation about the z-axis
orientation, err := imuwit.Orientation(context.Background(), nil)
Use orientation readings to determine the orientation of an object in 3D space as an orientation vector.
An orientation vector indicates how the object is rotated relative to an origin coordinate system around the x, y, and z axes.
You can choose the origin reference frame by configuring it using Viam’s frame system.
The GetOrientation
readings will report orientations relative to that initial frame.
To read orientation, first configure a capable movement sensor on your machine.
Additionally, follow these instructions to configure the geometries of each component of your machine within the frame system.
Then use the movement sensor API’s GetOrientation()
method to get orientation readings.
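To work with the individual components of the reading in Go, you can convert the returned orientation to an orientation vector in degrees. This is a sketch assuming `sensor` is a movement sensor client you have already retrieved from your machine; the conversion comes from the RDK's spatialmath package:

```go
// sensor is a movement sensor client retrieved from the machine (placeholder).
orientation, err := sensor.Orientation(context.Background(), nil)
if err != nil {
	logger.Fatal(err) // logger is assumed to exist in your program
}
// Convert the orientation to an orientation vector in degrees.
ov := orientation.OrientationVectorDegrees()
fmt.Printf("OX=%v OY=%v OZ=%v Theta=%v\n", ov.OX, ov.OY, ov.OZ, ov.Theta)
```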
Angular velocity
The following models of the movement sensor component take angular velocity measurements:
An example of an AngularVelocity
reading:
// angularVelocity is an AngularVelocity r3 Vector with X, Y, and Z magnitudes
angularVelocity, err := imu.AngularVelocity(context.Background(), nil)
Use angular velocity readings to determine the speed and direction at which your machine is rotating.
To get an angular velocity reading, first configure a capable movement sensor on your machine.
Then use the movement sensor API’s GetAngularVelocity()
method to get angular velocity readings from the sensor.
Position
The following models of the movement sensor component take position measurements:
An example of a Position
reading:
// position is a geo.Point consisting of latitude and longitude; altitude is a float64
position, altitude, err := imu.Position(context.Background(), nil)
Use position readings to determine the GPS coordinates of an object in 3D space or its position in the geographic coordinate system (GCS). These position readings reflect the absolute position of components.
To get a position, configure a capable movement sensor on your machine.
Then use the movement sensor API’s GetPosition()
method to get position readings from the sensor.
Linear velocity
The following models of movement sensor take linear velocity measurements:
- gps-nmea
- gps-nmea-rtk-pmtk
- gps-nmea-rtk-serial
- wheeled-odometry (provides a relative estimate only based on where the base component has started)
An example of a LinearVelocity
reading:
// linearVelocity is an r3.Vector with X, Y, and Z magnitudes
linearVelocity, err := imu.LinearVelocity(context.Background(), nil)
Use linear velocity readings to determine the speed at which your machine is moving through space.
To get linear velocity, configure a capable movement sensor on your machine.
Then use the movement sensor API’s GetLinearVelocity()
method to get linear velocity readings from the sensor.
Linear acceleration
The following models of movement sensor take linear acceleration measurements:
An example of a LinearAcceleration
reading:
// linearAcceleration is an r3.Vector with X, Y, and Z magnitudes
linearAcceleration, err := imu.LinearAcceleration(context.Background(), nil)
You can use linear acceleration readings to determine the rate of change of the linear velocity of your machine, or the acceleration at which your machine is moving through space.
To get linear acceleration, configure a capable movement sensor on your machine.
Then use the movement sensor API’s GetLinearAcceleration()
method to get linear acceleration readings from the sensor.
Next steps
If you would like to see the navigation service in action, check out this tutorial: