The Navigation Service
The Navigation service is the stateful definition of Viam’s motion service.
It uses GPS to autonomously navigate a rover base to user-defined endpoints called waypoints.
Once these waypoints are added and the mode of the service is set to MODE_WAYPOINT, the service begins to define the robot’s path.
Configuration
To configure a Navigation service, you must first configure a base with a movement sensor as part of your robot.
Important

Make sure the movement sensor you use supports GetPosition() and at least one of GetCompassHeading() or GetOrientation() in its model's implementation of the Movement Sensor API.

- It must support GetPosition() to report the robot’s current GPS location.
- It must also support either GetCompassHeading() or GetOrientation() to report which way the robot is facing.
- If your movement sensor provides multiple methods, your robot will default to using the values returned by GetCompassHeading().
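If you are not sure whether your movement sensor reports these measurements, you can query its supported properties before configuring navigation. The following is a minimal Go sketch, assuming you have already connected to your robot: the robot client and logger come from your connection boilerplate, "my_movement_sensor" is a placeholder name, and the property field names follow the Go SDK’s movement sensor Properties struct, which may vary slightly between SDK versions.

```go
// Requires the go.viam.com/rdk/components/movementsensor package.
// Look up the configured movement sensor by name.
ms, err := movementsensor.FromRobot(robot, "my_movement_sensor")
if err != nil {
	logger.Fatal(err)
}

// Ask the sensor which kinds of readings it supports.
props, err := ms.Properties(context.Background(), nil)
if err != nil {
	logger.Fatal(err)
}

// Navigation needs position plus either compass heading or orientation.
if !props.PositionSupported {
	logger.Fatal("movement sensor does not support GetPosition()")
}
if !props.CompassHeadingSupported && !props.OrientationSupported {
	logger.Fatal("movement sensor supports neither GetCompassHeading() nor GetOrientation()")
}
```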
Edit and fill in the attributes as applicable.
The following attributes are available for Navigation services:
Name | Type | Inclusion | Description |
---|---|---|---|
store | obj | Required | The type and configuration of data storage to use. Either type "memory", where no additional configuration is needed and the waypoints are stored in local memory while the Navigation process is running, or "mongodb", where data persists at the specified MongoDB URI of your MongoDB deployment. |
base | string | Required | The name you have configured for the base you are operating with this service. |
movement_sensor | string | Required | The name of the movement sensor you have configured for the base you are operating with this service. |
motion_service | string | Optional | The name of the motion service you have configured for the base you are operating with this service. If you have not added a motion service to your robot, the default motion service will be used. Reference this default service in your code with the name "builtin". |
vision_services | array | Optional | The name of each vision service you have configured for the base you are operating with this service. |
position_polling_frequency | float | Optional | The frequency in Hz to poll for the position of the robot. Default: 2 |
obstacle_polling_frequency_hz | float | Optional | The frequency in Hz to poll each vision service for new obstacles. Default: 2 |
plan_deviation_m | float | Optional | The distance in meters that a robot is allowed to deviate from the motion plan. Default: 3 |
degs_per_sec | float | Optional | The default angular velocity for the base in degrees per second. Default: 60 |
meters_per_sec | float | Optional | The default linear velocity for the base in meters per second. Default: 0.3 |
obstacles | obj | Optional | Any obstacles you wish to add to the robot’s path. See the motion service for more information. |
Configure and calibrate the frame system service for GPS navigation
Info
The frame system service is an internally managed and mostly static system for storing the reference frame of each component of a robot within a coordinate system configured by the user.
It stores the required contextual information for Viam’s services like Motion and Vision to use the position and orientation readings returned by components like movement sensors.
To make sure your rover base’s autonomous GPS navigation with the navigation service is accurate, configure and calibrate the frame system service for the components of your robot. To start, add the frame system service to your rover base and movement sensor.
1. Navigate to the Config tab of your robot’s page in the Viam app. Scroll to the card with the name of your base.
2. Click Add Frame. Keep the parent frame as world.
3. Select the Geometry drop-down menu. Configure a Geometry for the base that reflects its physical dimensions. Reference these instructions to configure your geometry and measure the physical dimensions of your base.
4. Scroll to the card with the name of your movement sensor. Click Add Frame and select the Parent box.
5. Type in the name of your base to specify this component as the parent of the sensor in the reference frame coordinate system, and click Save Config to save your configuration. See how to configure nested reference frames for an explanation of this configuration process.
6. Give the movement sensor a Translation that reflects where it is mounted on your base, measuring the coordinates with respect to the origin of the base. In other words, designate the origin of the base as (0,0,0), and measure the distance between the origin of the base and the origin of the sensor to obtain the coordinates of the Translation. See the frame system service for more information, and the Viam Internals for a detailed guide on conducting this measurement.
Then, to calibrate your frame system for the most accurate autonomous GPS navigation with the navigation service:
- After configuring your robot, navigate to the Control page and select the card matching the name of your movement sensor.
- Monitor the readings displayed on the card, and verify that the compass or orientation readings from the movement sensor report 0 when the base is facing north.
- If you cannot verify this:
  - Navigate back to your robot’s Config page. Scroll to the card with the name of your movement sensor. Adjust the Orientation of the frame to compensate for the mismatch.
  - Navigate back to the Navigation card on your Control page, and confirm that the compass or orientation readings from the movement sensor now report 0 when the base is facing north, confirming that you’ve successfully calibrated your robot to be oriented accurately within the frame system.
  - If you cannot verify this, repeat as necessary.
API
The navigation service supports the following methods:
Method Name | Description |
---|---|
Mode | Get the mode the service is operating in. |
SetMode | Set the mode the service is operating in. |
Location | Get the current location of the robot. |
Waypoints | Get the waypoints currently in the service’s data storage. |
AddWaypoint | Add a waypoint to the service’s data storage. |
RemoveWaypoint | Remove a waypoint from the service’s data storage. |
GetObstacles | Get the obstacles currently in the service’s data storage. |
Tip
The following code examples assume that you have a robot configured with a Navigation service, and that you add the required code to connect to your robot and import any required packages at the top of your code file.
Go to your robot’s Code Sample tab on the Viam app for boilerplate code to connect to your robot.
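For example, with the Go SDK you can retrieve the navigation service from your connected robot once and reuse that handle for the calls below. This is a sketch: "my_nav_service" is a placeholder for the name you gave the service, and robot and logger come from your connection boilerplate.

```go
// Requires the go.viam.com/rdk/services/navigation package.
// Get a handle to the navigation service configured on the robot.
navService, err := navigation.FromRobot(robot, "my_nav_service")
if err != nil {
	logger.Fatal(err)
}
```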
Mode
Get the Mode the service is operating in.

There are two options for modes: MODE_MANUAL or MODE_WAYPOINT.

- MODE_WAYPOINT: Start to look for added waypoints and begin autonomous navigation.
- MODE_MANUAL: Stop autonomous navigation between waypoints and allow the base to be controlled manually.
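A minimal Go sketch, assuming navService was retrieved as shown above:

```go
// Get the mode the navigation service is currently operating in.
mode, err := navService.Mode(context.Background(), nil)
if err != nil {
	logger.Fatal(err)
}
logger.Infof("navigation mode: %v", mode)
```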
SetMode
Set the Mode the service is operating in.

There are two options for modes: MODE_MANUAL or MODE_WAYPOINT.

- MODE_WAYPOINT: Start to look for added waypoints and begin autonomous navigation.
- MODE_MANUAL: Stop autonomous navigation between waypoints and allow the base to be controlled manually.
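A minimal Go sketch, assuming navService was retrieved as shown above; ModeWaypoint and ModeManual are the Go SDK’s constants for the two modes:

```go
// Switch into waypoint mode so the service starts navigating to stored waypoints.
err := navService.SetMode(context.Background(), navigation.ModeWaypoint, nil)
if err != nil {
	logger.Fatal(err)
}

// Later, switch back to manual mode to take direct control of the base.
err = navService.SetMode(context.Background(), navigation.ModeManual, nil)
if err != nil {
	logger.Fatal(err)
}
```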
Location
Get the current location of the robot in the navigation service.
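A minimal Go sketch, assuming navService was retrieved as shown above; the exact return type of the location (a GPS point or geo pose) depends on your SDK version, so this example just logs it:

```go
// Get the robot's current location as reported by its movement sensor.
location, err := navService.Location(context.Background(), nil)
if err != nil {
	logger.Fatal(err)
}
logger.Infof("current location: %v", location)
```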
Waypoints
Get an array of waypoints currently in the service’s data storage. These are locations designated within a path for the robot to navigate to.
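A minimal Go sketch, assuming navService was retrieved as shown above:

```go
// List the waypoints currently stored by the navigation service.
waypoints, err := navService.Waypoints(context.Background(), nil)
if err != nil {
	logger.Fatal(err)
}
for _, waypoint := range waypoints {
	logger.Infof("waypoint: %v", waypoint)
}
```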
AddWaypoint
Add a waypoint to the service’s data storage.
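A minimal Go sketch, assuming navService was retrieved as shown above; the latitude and longitude are placeholder values, and geo.NewPoint comes from the golang-geo package that the Go SDK uses for GPS points (imported as geo):

```go
// Add a waypoint at a placeholder latitude and longitude.
point := geo.NewPoint(40.76, -73.98)
err := navService.AddWaypoint(context.Background(), point, nil)
if err != nil {
	logger.Fatal(err)
}
```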
RemoveWaypoint
Remove a waypoint from the service’s data storage. If the robot is currently navigating to this waypoint, the motion will be canceled, and the robot will proceed to the next waypoint.
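A minimal Go sketch, assuming you have already fetched the stored waypoints with Waypoints() and that each waypoint exposes its ID:

```go
// Remove the first stored waypoint using its ID.
// Assumes `waypoints` is a non-empty slice returned by a previous Waypoints() call.
err := navService.RemoveWaypoint(context.Background(), waypoints[0].ID, nil)
if err != nil {
	logger.Fatal(err)
}
```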
GetObstacles
Get an array of obstacles currently in the service’s data storage. These are locations designated for the robot to avoid when navigating. See the motion service for more information.
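A minimal Go sketch, assuming navService was retrieved as shown above and that the Go SDK exposes this call as Obstacles() (worth verifying against your SDK version):

```go
// List the obstacles the navigation service currently knows about.
obstacles, err := navService.Obstacles(context.Background(), nil)
if err != nil {
	logger.Fatal(err)
}
for _, obstacle := range obstacles {
	logger.Infof("obstacle: %v", obstacle)
}
```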
Concepts
The following concepts are important to understand when utilizing the navigation service. Each concept is a type of relative or absolute measurement, taken by a movement sensor, which can then be utilized by your robot to navigate across space.
Here’s how to make use of the following types of measurements:
Compass Heading
The following models of movement sensor take compass heading measurements:
- gps-nmea - some GPS hardware only reports heading while moving.
- gps-nmea-rtk-pmtk - some GPS hardware only reports heading while moving.
- gps-nmea-rtk-serial - some GPS hardware only reports heading while moving.
- imu-wit
An example of a Compass Heading reading:

// heading is a float64 between 0-360
heading, err := gps.CompassHeading(context.Background(), nil)

Use compass heading readings to determine the bearing of your robot, or the cardinal direction that your robot is facing.
To read compass headings, configure a capable movement sensor on your robot.
Then use the movement sensor API’s GetCompassHeading()
method to get readings from the sensor.
Orientation
The following models of movement sensor take orientation measurements:
An example of an Orientation reading:

// orientation is an OrientationVector struct with OX, OY, and OZ denoting the coordinates of the vector, and Theta denoting rotation about the z-axis
orientation, err := imuwit.Orientation(context.Background(), nil)
Use orientation readings to determine the orientation of an object in 3D space as an orientation vector.
An orientation vector indicates how the object is rotated relative to an origin coordinate system around the x, y, and z axes.
You can choose the origin reference frame by configuring it using Viam’s frame system.
The GetOrientation readings will report orientations relative to that initial frame.
To read orientation, first configure a capable movement sensor on your robot.
Additionally, follow these instructions to configure the geometries of each component of your robot within the frame system.
Then use the movement sensor API’s GetOrientation()
method to get orientation readings.
Angular Velocity
The following models of the movement sensor component take angular velocity measurements:
An example of an AngularVelocity reading:

// angularVelocity is an AngularVelocity r3.Vector with X, Y, and Z magnitudes
angularVelocity, err := imu.AngularVelocity(context.Background(), nil)
Use angular velocity readings to determine the speed and direction at which your robot is rotating.
To get an angular velocity reading, first configure a capable movement sensor on your robot.
Then use the movement sensor API’s GetAngularVelocity()
method to get angular velocity readings from the sensor.
Position
The following models of the movement sensor component take position measurements:
An example of a Position reading:

// position is a *geo.Point consisting of latitude and longitude, and altitude is a float64
position, altitude, err := imu.Position(context.Background(), nil)
Use position readings to determine the GPS coordinates of an object in 3D space or its position in the geographic coordinate system (GCS). These position readings reflect the absolute position of components.
To get a position, configure a capable movement sensor on your robot.
Then use the movement sensor API’s GetPosition()
method to get position readings from the sensor.
Linear Velocity
The following models of movement sensor take linear velocity measurements:
- gps-nmea
- gps-nmea-rtk-pmtk
- gps-nmea-rtk-serial
- wheeled-odometry (provides a relative estimate only based on where the base component has started)
An example of a Linear Velocity reading:

// linearVelocity is an r3.Vector with X, Y, and Z magnitudes
linearVelocity, err := imu.LinearVelocity(context.Background(), nil)
Use linear velocity readings to determine the speed at which your robot is moving through space.
To get linear velocity, configure a capable movement sensor on your robot.
Then use the movement sensor API’s GetLinearVelocity()
method to get linear velocity readings from the sensor.
Linear Acceleration
The following models of movement sensor take linear acceleration measurements:
An example of a Linear Acceleration reading:

// linearAcceleration is an r3.Vector with X, Y, and Z magnitudes
linearAcceleration, err := imu.LinearAcceleration(context.Background(), nil)
You can use linear acceleration readings to determine the rate of change of the linear velocity of your robot, or the acceleration at which your robot is moving through space.
To get linear acceleration, configure a capable movement sensor on your robot.
Then use the movement sensor API’s GetLinearAcceleration()
method to get linear acceleration readings from the sensor.