Newly integrated Accelerometer Sensor and Loads of Optimization.
We took a short break over the summer to work on a few new features and do a bit of optimization before the end of the year.
BeamNG.tech now features an accelerometer sensor, which can be used to measure acceleration or g-forces at a given point, either directly on the vehicle or relative to it. The accelerometer sensor has its own coordinate system, allowing the user to observe acceleration changes along these local axes. The readings are available to the tech user, and graphs can be made with the data, similar to those found in the accelerometer experimentation literature.
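The local-coordinate-system idea can be sketched in plain Python. This is illustrative math only, not the BeamNG.tech API: a world-space acceleration vector is rotated into the sensor's local axes, and its magnitude is expressed in g.

```python
import math

G = 9.81  # standard gravity, m/s^2

def world_to_local(accel_world, rotation):
    """Rotate a world-space acceleration vector into the sensor's
    local frame using a 3x3 rotation matrix (rows = local axes)."""
    return [sum(rotation[i][j] * accel_world[j] for j in range(3))
            for i in range(3)]

def g_force(accel_local):
    """Magnitude of the local acceleration, expressed in g."""
    return math.sqrt(sum(a * a for a in accel_local)) / G

# A hard braking event: 14.7 m/s^2 deceleration along the world x axis,
# read by a sensor whose local frame is rotated 90 degrees about z.
rot_90z = [[0.0, 1.0, 0.0],
           [-1.0, 0.0, 0.0],
           [0.0, 0.0, 1.0]]
local = world_to_local([-14.7, 0.0, 0.0], rot_90z)
print(round(g_force(local), 2))  # 1.5
```

In the local frame the braking shows up entirely on one axis, which is exactly why per-sensor coordinate systems make the readings easier to interpret.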
We’ve also taken some time to do optimization work on our sensor suite as well as BeamNG.tech’s Python API - BeamNGpy.
As always, the BeamNG.tech v0.26 release includes all the cool features of the latest BeamNG.drive release.
Check the full report on BeamNG.drive v0.26
Sensors
In general
- Multi-sensor: GPU scheduling introduced to balance computational load across sensors, which is needed when many sensors are used.
- Automatic updates introduced for Camera, Lidar, Ultrasonic and Accelerometer sensors, based on a set update time.
- Ad-hoc polling requests introduced for Camera, Lidar, Ultrasonic and Accelerometer sensors. This can also be used as the sole means of querying a sensor, instead of tying it to an automated update cycle.
- API changes to sensor suite, to allow for default and optional parameters, simplifying things for the tech user.
- Exposure of more sensor control parameters to the tech user, including sensor matrices and coordinate systems.
Accelerometer
- New physically-based tri-axial accelerometer sensor, which can be automated and attached to a vehicle.
- Sensor runs in its own local coordinate system, for more realistic measurements.
- Sensor can be used with or without gravity consideration, typical of commercial accelerometers.
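The "with or without gravity" option can be illustrated with a minimal sketch (the function and names below are made up for illustration, not BeamNGpy's API): an accelerometer measures specific force, so a stationary sensor that includes gravity reads about 1 g upward, while a gravity-compensated reading is zero.

```python
G = 9.81
gravity_world = (0.0, 0.0, -G)  # world z points up (assumed convention)

def reading(accel_world, include_gravity):
    # An accelerometer measures specific force: acceleration minus gravity.
    if include_gravity:
        return tuple(a - g for a, g in zip(accel_world, gravity_world))
    return accel_world  # gravity-compensated: motion acceleration only

at_rest = (0.0, 0.0, 0.0)
print(reading(at_rest, include_gravity=True))   # (0.0, 0.0, 9.81)
print(reading(at_rest, include_gravity=False))  # (0.0, 0.0, 0.0)
```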
Camera
- New automated camera sensor, capable of color, annotation, and depth imaging. This replaces the old camera sensor.
- New functionality to convert world-space points to camera coordinates.
- New image processing algorithm for depth images to make them visibly clearer.
- Can now be used with or without shared memory.
- base64 encoding removed and replaced with binary strings, for efficiency.
- Minor bug fixes.
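The world-space-to-camera conversion follows standard pinhole-camera math. The snippet below is a generic sketch of that transform; the function name, intrinsics, and axis conventions are illustrative assumptions, not BeamNGpy's actual API.

```python
def world_to_camera(point_world, cam_pos, cam_rot, fx, fy, cx, cy):
    """Project a world-space point to pixel coordinates with a pinhole
    model: translate/rotate into camera space, then apply the
    intrinsics. cam_rot is a 3x3 world-to-camera rotation matrix."""
    p = [point_world[i] - cam_pos[i] for i in range(3)]
    x, y, z = (sum(cam_rot[i][j] * p[j] for j in range(3)) for i in range(3))
    if z <= 0:
        return None  # point is behind the camera
    return (fx * x / z + cx, fy * y / z + cy)

# Camera at the origin with identity orientation, looking along +z.
I = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(world_to_camera((1.0, 0.5, 2.0), (0.0, 0.0, 0.0), I,
                      fx=500, fy=500, cx=320, cy=240))  # (570.0, 365.0)
```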
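To see why dropping base64 in favor of raw binary strings helps, a quick standard-library demonstration: base64 inflates binary payloads by roughly a third, so every image transfer was about 33% larger than necessary.

```python
import base64

raw = bytes(range(256)) * 1024  # 256 KiB of sample pixel data
encoded = base64.b64encode(raw)
# base64 maps every 3 input bytes to 4 output bytes (~33% overhead).
print(len(raw), len(encoded), round(len(encoded) / len(raw), 2))
# 262144 349528 1.33
```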
Lidar
- Color data is now available to the user, along with point cloud data. This is needed for the tech user to perform processing on the annotation data.
- Rendering efficiency improvements.
- Efficiency improvements to data transfer from simulation.
- Minor bug fixes.
Ultrasonic
- Rendering efficiency improvements.
- Minor bug fixes.
ROS1
- Brake feature added to beamng_teleop_keyboard
BeamNGpy
- major changes to the LuaSocket connection between Lua and Python; it now uses TCP_NODELAY and runs faster.
- major updates to the beamngpy sensor suite and its API, including more robust sensors, each managed in its own class.
- a comprehensive set of tests for each sensor (Camera, Lidar, Ultrasonic, and Accelerometer), covering all sensor functionality and written in Python. These double as complete examples of how to use every sensor feature available in beamngpy.
- brand new accelerometer sensor now available.
- removed: deprecated BeamNGpy functionality
  - setup_logging (superseded by set_up_simple_logging and config_logging)
  - the rot argument used for setting the rotation of objects and vehicles in Euler angles; use rot_quat instead, which expects quaternions (the helper function angle_to_quat converts Euler angles to quaternions)
  - the update_vehicle function
  - the requests argument of Vehicle.poll_sensors
  - poll_sensors no longer returns a value
  - the deploy argument of BeamNGpy.open
- added: support for loading TrackBuilder tracks
- added: support for loading Flowgraph scenarios
- fixed: multiple vehicles now do not share color in instance annotations
- added: Vehicle.teleport helper function, which allows teleporting a vehicle directly through its instance
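For readers unfamiliar with TCP_NODELAY: it disables Nagle's algorithm, so small request/response messages (like sensor polls) are sent immediately instead of being buffered, trading a little bandwidth for lower latency. A minimal sketch of how the option is set on a plain Python socket (this is generic socket code, not BeamNGpy's connection internals):

```python
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Disable Nagle's algorithm: send small writes immediately
# rather than coalescing them into larger segments.
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
print(sock.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY) != 0)  # True
sock.close()
```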
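Since rot has been replaced by the quaternion-based rot_quat, here is a generic yaw-pitch-roll (ZYX) Euler-to-quaternion conversion, shown to illustrate the kind of helper angle_to_quat provides; BeamNGpy's own angle conventions and component ordering may differ, so treat this as a sketch only.

```python
import math

def euler_to_quat(roll, pitch, yaw):
    """Convert yaw-pitch-roll (ZYX) Euler angles in degrees
    to a quaternion in (x, y, z, w) order."""
    r, p, y = (math.radians(a) / 2 for a in (roll, pitch, yaw))
    cr, sr = math.cos(r), math.sin(r)
    cp, sp = math.cos(p), math.sin(p)
    cy, sy = math.cos(y), math.sin(y)
    return (sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy,
            cr * cp * cy + sr * sp * sy)

# A 90-degree yaw about the z axis.
print(euler_to_quat(0, 0, 90))
```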