
Introduction to the Penn State mapping vehicle

Welcome to the information page for Penn State’s mapping vehicle, aka the Penn State Mapping Van. Here you can find details, schedules, and public results as they are released. Vehicle activity was on hold during COVID and resumed in Spring ’21, at which point hardware issues were discovered. As of Summer ’22, the necessary upgrades began so that mapping activities can properly resume, with a particular goal of cleaning up details for I-99 between Altoona and Penn State and completing our digital representations of the entire campus, State College, and local highway environments.

Overview of the Penn State Mapping Vehicle

The Penn State Intelligent Vehicle and Systems Group mapping vehicle, shown in Fig. 1, has been retrofitted with inertial and perception-based sensors for measuring the pose of the vehicle as well as the surrounding environment. This vehicle is critical for collecting the necessary pose and environmental data, all of which is referenced to specific geolocations.


Figure 1: The mapping vehicle, shown at Beaver Stadium, used to collect data and test algorithms.

Most of the research with this vehicle occurs at the Larson Institute test track in State College, PA, shown in Fig. 2, which provides a measurable and repeatable location to collect testing data for algorithm development and validation. The test track is also where the vehicle’s sensors are calibrated, where the DGPS base station that broadcasts corrections is housed, and often where remote telemetry for the vehicle, such as on-the-road delivery of information via cloud services, is hosted.


Figure 2: An overview of the Larson Test Track Facility, a one-mile closed test track used for various transportation-based research.

 

Sensors commonly used for mapping

The main sensors used for road mapping are mounted on a rigid frame on the roof of the vehicle and housed in a Thule roof rack. The rigid frame allows the motion of each sensor to be decoupled from the flex of the vehicle body. Data collection occurs on an automotive-grade computer system using the Robot Operating System (ROS) as the communication layer between sensors. All sensors are time-synchronized to a GPS pulse-per-second (PPS) triggering signal to ensure a common time scale. The PPS signal is, as it sounds, one high-low transition per second with nanosecond accuracy. A microcontroller measures this transition, calculates the required sampling time step (Table 1) for each sensor, and sends a trigger to each sensor to collect data at that particular instant in time. This technique greatly improves the geolocation of each sensor reading by removing the transport delay of the sensor data from the sensor to the data-collection computer. This time-of-transport correction can be made during post-processing when creating maps or archiving data. For real-time implementation, the transport delay can be incorporated as a slowly varying parameter in each of the system models, if necessary.
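
To make the triggering scheme concrete, the following Arduino-style sketch shows one way a microcontroller can divide the PPS signal into a per-sensor trigger train. This is a minimal illustration only, not the firmware actually running in the van (the van's trigger boxes, described further down this page, are built around an Arduino Mega); the pin numbers and the 20 Hz rate are assumptions.

```cpp
// Minimal sketch: derive a 20 Hz sensor trigger train from a GPS
// pulse-per-second (PPS) signal. Pins and rate are illustrative only.
const int PPS_PIN = 2;              // PPS input (interrupt-capable pin)
const int TRIGGER_PIN = 8;          // trigger output to one sensor
const unsigned long RATE_HZ = 20;   // desired sensor sampling rate

volatile unsigned long ppsMicros = 0;  // local clock value at last PPS edge

void onPps() {
  // Re-anchor the local microsecond clock to the PPS edge once per second.
  ppsMicros = micros();
}

void setup() {
  pinMode(TRIGGER_PIN, OUTPUT);
  pinMode(PPS_PIN, INPUT);
  attachInterrupt(digitalPinToInterrupt(PPS_PIN), onPps, RISING);
}

void loop() {
  // Elapsed time since the last PPS edge tells us which trigger slot
  // within the current second we are in.
  unsigned long sincePps = micros() - ppsMicros;
  unsigned long slot = sincePps / (1000000UL / RATE_HZ);
  static unsigned long lastSlot = 0;
  if (slot != lastSlot) {
    lastSlot = slot;
    digitalWrite(TRIGGER_PIN, HIGH);   // emit a short trigger pulse
    delayMicroseconds(100);
    digitalWrite(TRIGGER_PIN, LOW);
  }
}
```

Because every trigger is phase-locked to the PPS edge, the true sample time of each reading can be reconstructed in post-processing no matter when the data actually arrives at the logging computer.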

 

Table 1: Data rates of the sensors on the mapping vehicle.

Global Positioning System (GPS)
Description: These sensors measure the vehicle’s position relative to Earth-fixed coordinates (latitude, longitude, and altitude). Up to 4 units run on the vehicle simultaneously, ranging from defense-grade to commercial-grade systems.
Accuracy: 1 to 2 cm typical (1-sigma). At the test track, we regularly get less than 1 cm with track-based DGPS corrections and about 10 cm with WAAS corrections (PA area); DGPS accuracy degrades by roughly 1 cm for each 10 km away from the test track.
Data rate: 20 Hz typically.

Inertial Measurement Unit (IMU)
Description: These sensors measure the rotational rate and acceleration of the vehicle, thus providing very rapid measurements of changes in position. There are two primary systems in use: a defense-grade ring-laser gyroscope (RLG) from Honeywell and a commercial-grade, high-accuracy MEMS-based unit from Analog Devices.
Accuracy: The RLG unit has a random-walk drift rate of approximately 1 degree per 10 minutes; the ADIS MEMS unit drifts about 1 degree per minute. The RLG has factory-integrated calibration and pose estimation using DGPS.
Data rate: 100 Hz typically, but 1 kHz or more if necessary.

Cameras
Description: These sensors are similar to the typical camera systems one might find in a phone, but have research-grade features, including much higher dynamic range (e.g. the ability to operate in very bright or dark situations), fast operation, and, most importantly, external triggering that synchronizes image capture to the GPS PPS signal. The cameras used are the Point Grey Research Flea series (now owned by FLIR); they capture RGB at HD resolutions. The limiting factor in camera usage is their data feeds: each camera by itself can send enough data to fully congest a 1-Gbit Ethernet link. Three of the cameras are Ethernet-enabled, but we typically operate the cameras via USB3 connections that allow networking of the cameras within the vehicle. Typically, we run 3 to 8 cameras simultaneously, generating about 10 GB of data per second before compression.
Accuracy: Pixel resolution is generally about 0.02 degrees of arc.
Data rate: 25 Hz typically, but up to 200 Hz if needed.

Wheel Encoders
Description: These sensors are mounted to each wheel and measure the angular position of the tire to a resolution of about 1/20,000th of a rotation (0.018 degrees). This allows us to measure the forward travel of the vehicle with exceptional accuracy. When integrated with the DGPS system and calibrated for the tire’s steady-state condition (inflation pressure, outdoor temperature, etc.), the system can obtain about 0.2 mm of position accuracy in the direction of travel. This is about half the width of a human fingernail!
Accuracy: 0.018 degrees of rotation per pulse.
Data rate: 100 Hz typically, but up to 2 kHz.

Road Wheel Angle String Potentiometers and Encoders
Description: We measure the steering position of the vehicle by measuring the deflection of the steering rack. This sensor is not widely used, but is intended to determine whether the driver is making sudden corrections, which may affect vibration of the vehicle. The sensor uses a string potentiometer coupled with encoders.
Accuracy: 2 degrees of steering angle.
Data rate: 100 Hz.

LiDAR (not triggered)
Description: These sensors use a laser to measure the reflectivity of surrounding surfaces and the distance to them, based on the laser striking a surface and reflecting back. The units use an infrared, low-power (eye-safe) beam that is not visible to the human eye. The LiDAR sensors used on the vehicle include a SICK LMS unit that spins at 25 Hz with 0.1-degree resolution (up to 3 of these units may be used) and multi-scan Velodyne LiDAR systems typical of autonomous-vehicle systems.
Accuracy: Distance errors can be 3 to 6 cm absolute for long-range measurements (up to 80 meters), but less error is typically seen for near features.
Data rate: 25 Hz.

 

The mapping vehicle houses three high-definition FLIR Blackfly USB3 cameras (BFLY-U3-13S2C-CS) mounted on a camera bar located above the windshield, as shown in Fig. 3, to measure the surrounding environment.


Figure 3: The forward-facing cameras located above the windshield.

 

The main sensor used to provide a global position estimate of the vehicle is a military-grade Novatel/Honeywell Inertial Navigation System, shown in Fig. 4, which integrates GPS, an IMU, and a ring-laser gyroscope through an Extended Kalman Filter (EKF) to provide a state estimate. A base station at the Larson Institute was calibrated to provide differential corrections. The base station was calibrated over an eight-hour period, resulting in an absolute error (1-σ) of 2 mm in X, 2 mm in Y, and 8 mm in Z. The precision error of the differential GPS was tested by long-term calibration (days) with the mapping vehicle stationary. A fast calibration, over ten minutes, results in a precision error (1-σ) of 4.6 mm in X, 5.5 mm in Y, and 3.7 mm in Z. A lower-cost IMU produced by Analog Devices (ADIS 16407) is attached to the same frame as the Honeywell unit.
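
For readers unfamiliar with this style of sensor fusion, the sketch below shows the idea in one dimension: IMU acceleration drives a high-rate prediction step, and each GPS fix corrects the accumulated drift. This is only a minimal linear illustration of the technique (the commercial system's actual filter runs over full 3-D pose, and its states and tuning are not public); all noise values here are assumed.

```cpp
// Minimal 1-D GPS/IMU fusion sketch (linear Kalman filter). Illustrative
// only; noise variances below are assumptions, not the real system's.
struct Fuser1D {
  double x = 0.0, v = 0.0;                   // state: position, velocity
  double P[2][2] = {{1.0, 0.0}, {0.0, 1.0}}; // state covariance
  double qa = 0.01;   // IMU acceleration noise variance (assumed)
  double rp = 0.0004; // GPS position noise variance, (2 cm)^2 (assumed)

  // Predict step, driven by an IMU acceleration sample every dt seconds.
  void predict(double accel, double dt) {
    x += v * dt + 0.5 * accel * dt * dt;
    v += accel * dt;
    // P = F P F^T + Q, with F = [[1, dt], [0, 1]]
    double p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1];
    double p01 = P[0][1] + dt * P[1][1];
    double p10 = P[1][0] + dt * P[1][1];
    P[0][0] = p00 + qa * dt * dt * dt * dt / 4.0;
    P[0][1] = p01 + qa * dt * dt * dt / 2.0;
    P[1][0] = p10 + qa * dt * dt * dt / 2.0;
    P[1][1] += qa * dt * dt;
  }

  // Update step, driven by a GPS position fix (measurement H = [1, 0]).
  void update(double gpsPos) {
    double s = P[0][0] + rp;   // innovation covariance
    double k0 = P[0][0] / s;   // Kalman gain
    double k1 = P[1][0] / s;
    double innov = gpsPos - x;
    x += k0 * innov;
    v += k1 * innov;
    double p00 = P[0][0], p01 = P[0][1];
    P[0][0] -= k0 * p00; P[0][1] -= k0 * p01;
    P[1][0] -= k1 * p00; P[1][1] -= k1 * p01;
  }
};
```

In the van's setting, the prediction would run at the IMU rate (100 Hz) and the update at the GPS rate (20 Hz), with the covariance naturally weighting whichever source is currently more trustworthy.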


Figure 4: The Novatel/Honeywell Inertial Navigation System integrating GPS, IMU, and ring-laser gyroscope, housed within the upper storage area of the vehicle.

Additionally, two US Digital HD25 optical wheel encoders are mounted on the rear tires to measure the orientation and angular velocity of each wheel, as shown in Fig. 5. At a rate of 100 Hz, these encoders provide measurements at a resolution of 10,000 counts per revolution, or approximately one count every 0.2 mm of travel. Fig. 6 shows specifics of the new design of the encoder-plate mounting apparatus in the CAD software SolidWorks. It consists of three plates and new wheel studs that assist in aligning the encoder concentrically and parallel to the wheel.
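
The counts-to-distance conversion behind those numbers is simple arithmetic, sketched below. The tire's rolling circumference is the calibrated quantity; the 2.0 m value used here is a placeholder consistent with the ~0.2 mm-per-count figure above (10,000 counts x 0.2 mm = 2 m of travel per revolution), not the van's actual calibration.

```cpp
#include <cstdint>
#include <cstdio>

// Convert wheel-encoder counts to forward travel. The rolling
// circumference is the calibrated quantity; 2.0 m is an assumed
// placeholder consistent with ~0.2 mm of travel per count.
constexpr double COUNTS_PER_REV = 10000.0;
constexpr double ROLLING_CIRCUMFERENCE_M = 2.0;  // assumed calibration value

double countsToMeters(int64_t deltaCounts) {
  return deltaCounts / COUNTS_PER_REV * ROLLING_CIRCUMFERENCE_M;
}

int main() {
  // e.g. 1,500 counts over one 100 Hz sample interval at highway speed
  std::printf("travel = %.4f m\n", countsToMeters(1500));  // 0.3000 m
  return 0;
}
```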

Figure 5: The wheel-encoder mount used for collecting odometry.

Figure 6: CAD model of the encoder mounting apparatus.

A downward-facing SICK LMS511 LiDAR, oriented in a “rake” style to measure the road perpendicular to the direction of travel, is mounted on the rear of the vehicle, as shown in Fig. 7. This LiDAR generates a two-dimensional scan of the road, measuring both the range and the bearing of each laser return. The SICK LMS511 has a rotating sensor with a 190° field of view that returns 1141 measurements per scan at a rate of 25 Hz, giving an angular resolution of 0.1667° between measurements.
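
Since each scan arrives as 1141 ranges at known bearings, turning a scan into a road cross-section is a polar-to-Cartesian conversion, roughly as sketched below. The frame conventions here are assumptions for illustration, and this covers only the sensor frame; mapping the points to geolocations additionally requires the vehicle pose at the trigger time.

```cpp
#include <cmath>
#include <vector>

// Convert one SICK LMS511 scan (1141 ranges over a 190-degree fan) into
// sensor-frame Cartesian points. Frame conventions are assumed: y is
// lateral across the road, z is along the beam at boresight.
struct Point2D { double y; double z; };

std::vector<Point2D> scanToPoints(const std::vector<double>& ranges_m) {
  const double kPi = 3.14159265358979323846;
  const double fovDeg = 190.0;
  const double stepDeg = fovDeg / (ranges_m.size() - 1);  // ~0.1667 deg
  std::vector<Point2D> pts;
  pts.reserve(ranges_m.size());
  for (size_t i = 0; i < ranges_m.size(); ++i) {
    // Bearing swept across the fan, zero at boresight.
    double bearingRad = (i * stepDeg - fovDeg / 2.0) * kPi / 180.0;
    pts.push_back({ranges_m[i] * std::sin(bearingRad),
                   ranges_m[i] * std::cos(bearingRad)});
  }
  return pts;
}
```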


Figure 7: The SICK LMS511 laser-rangefinder used for scanning the road.

In Fig. 8, you can see the three road-facing cameras as well as the Velodyne LiDAR in their respective updated PVC enclosures. These enclosures not only protect the equipment from the environment but also prevent the LiDAR from taking direct damage when driving under obstacles (such as short overpasses). The LiDAR is positioned at an angle from which it can see both the road and traffic signals.

Figure 8: The three road-facing cameras and the angled Velodyne VLP-16 Hi-Res LiDAR.

Unimeasure JX-PA string potentiometers are mounted to each front wheel, as shown in Fig. 9, to measure the road wheel angle. The string potentiometers are time-triggered at 100 Hz.


Figure 9: The Unimeasure JX-PA string potentiometers used to measure the road wheel angle of the front wheels.

To measure the steering angle with higher resolution than the string potentiometer alone can provide, a new design was created that combines the string potentiometer with the same model of encoder used on the rear wheels. The CAD design in Fig. 10 shows how the string potentiometer is coupled with the encoder.

Figure 10: CAD of the steering encoder

The power system was also updated during the summer to resolve issues discovered during Spring ’21. The updates include vibration-dampening the plates that hold the various power-distribution devices; the addition of a new switch panel that allows easy reading of the voltages of the car’s battery and the batteries in the rear of the vehicle; and secure mounting for the battery charger, keyboard, PC, and required fire extinguishers. A CAD model of the new power system can be seen in Fig. 11.

Figure 11: Screenshot of the CAD model showing the updated power system that sits in the back of the van.

The cameras and encoders are activated by a trigger signal generated from the GPS PPS signal: the signal is manipulated by an Arduino Mega and then sent to its respective sensor. The box has diagnostic LEDs on top to communicate its activity to the user, as well as trigger wires that allow the user to visualize the signal as it leaves the box. Two boxes will be installed in the van: one for the cameras and rear encoders, and one for the front steering encoders. A CAD model of the trigger box can be seen in Fig. 12.

Figure 12: CAD of the trigger box

The encoder signals are processed by the encoder box before they are sent to a ROS node. The encoder box is similar to the trigger box in that it has diagnostic LEDs on top to help the user understand the code's processing, but it is instead based around a Teensy 4.1. Each encoder box can read 4 encoders and 2 potentiometers simultaneously. There are two encoder boxes installed in the van: one in the rear to read the rear encoders, and one in front to read the front encoders and potentiometers. Fig. 13 shows a CAD model of the encoder box.
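
At its core, what such a box does is quadrature decoding, sketched below in Arduino-style C++ for a single encoder channel. This is an illustration only: the pin choices are assumed, and the actual firmware (reading four encoders plus two potentiometers and publishing to ROS) is more involved.

```cpp
// Minimal 1x quadrature decoder for one encoder (pins are illustrative).
const int PIN_A = 3;  // encoder channel A (interrupt-capable pin)
const int PIN_B = 4;  // encoder channel B

volatile long counts = 0;

void onChannelARise() {
  // On each rising edge of A, channel B's level gives the direction.
  if (digitalRead(PIN_B)) counts--; else counts++;
}

void setup() {
  pinMode(PIN_A, INPUT_PULLUP);
  pinMode(PIN_B, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(PIN_A), onChannelARise, RISING);
  Serial.begin(115200);
}

void loop() {
  // Report the count at 100 Hz, matching the data rate in Table 1.
  noInterrupts();
  long snapshot = counts;   // copy atomically before printing
  interrupts();
  Serial.println(snapshot);
  delay(10);
}
```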

Figure 13: CAD of the encoder box

The vehicle is also equipped with a radar unit, but it has not been tested yet; it has just been installed behind the front fascia, as shown in Fig. 14.

Figure 14: Image showing the location of the front radar unit, behind the front fascia.

Signal flow in data collection and processing

Many real-world features can be mapped by the mapping vehicle, including lane marker locations, road geometry, road reflectivity, near-road geometry, road images, road conditions (potholes), and near-road barriers. The collected data are often used together with other map sources, such as features from traffic simulation and online data sets. The raw data are usually noisy and carry some uncertainty, so they are typically stored in a raw-data database. A data-cleaning procedure is then performed to detect drop-outs, perform time alignment, reject outliers, etc., and the results are typically stored in a cleaned-data database. After that, the cleaned data are further processed to extract features such as the lane center line, road geometry, and elevation changes in the road. These aggregated data are stored in a shareable format (we typically use RoadXML) in a road-map database, which can be queried for user applications such as localization, map-making for driving simulators, road assessment, fuel consumption, and road-preview information for automated vehicles. The signal flow is shown in Fig. 15.
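
As one small example of the cleaning stage, the snippet below flags outliers in a measurement series using a median-of-neighbors test. This is only an illustrative sketch of the kind of operation performed, with an assumed window size and threshold; the actual pipeline also handles drop-outs and time alignment.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Flag samples that deviate from their local median by more than a
// threshold. Window size and threshold are assumed, for illustration.
std::vector<bool> flagOutliers(const std::vector<double>& x,
                               int halfWindow = 5, double threshold = 0.5) {
  std::vector<bool> outlier(x.size(), false);
  for (int i = 0; i < static_cast<int>(x.size()); ++i) {
    int lo = std::max(0, i - halfWindow);
    int hi = std::min(static_cast<int>(x.size()) - 1, i + halfWindow);
    std::vector<double> window(x.begin() + lo, x.begin() + hi + 1);
    // Partial sort is enough to locate the median of the window.
    std::nth_element(window.begin(), window.begin() + window.size() / 2,
                     window.end());
    double median = window[window.size() / 2];
    outlier[i] = std::fabs(x[i] - median) > threshold;
  }
  return outlier;
}
```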

A challenge with data processing is the variety of data representations used. For those interested in the details of data processing and the various coordinate representations, the team maintains GitHub repositories that include common tools for import/export conversions, coordinate transforms, plotting, etc.
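
As an example of the coordinate-transform category, the sketch below converts latitude/longitude to local east/north offsets from a base point (for instance, the test-track base station) using a flat-earth approximation. This is a standard small-area technique, not code taken from the team's repositories, and the numbers in the usage comment are placeholders.

```cpp
#include <cmath>

// Flat-earth approximation: convert (lat, lon) in degrees to local
// east/north offsets in meters from a base point. Adequate for small
// areas such as a test track; illustrative, not the team's actual code.
struct EastNorth { double east; double north; };

EastNorth toLocal(double latDeg, double lonDeg,
                  double baseLatDeg, double baseLonDeg) {
  const double kEarthRadiusM = 6371000.0;  // mean Earth radius
  const double kPi = 3.14159265358979323846;
  const double kDegToRad = kPi / 180.0;
  double dLat = (latDeg - baseLatDeg) * kDegToRad;
  double dLon = (lonDeg - baseLonDeg) * kDegToRad;
  // Longitude circles shrink by cos(latitude) away from the equator.
  return {kEarthRadiusM * dLon * std::cos(baseLatDeg * kDegToRad),
          kEarthRadiusM * dLat};
}

// Usage (placeholder coordinates): toLocal(40.8540, -77.8350,
//                                          40.8533, -77.8360);
```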


Figure 15: Signal flow in data collection and processing.