# robosuite 1.3.0 Release Notes
- Highlights
- New Features
- Improvements
- Critical Bug Fixes
- Other Bug Fixes

# Highlights
This release of robosuite brings powerful rendering functionalities including new renderers and multiple vision modalities, in addition to some general-purpose camera utilities. Below, we discuss the key details of these new features:

## Renderers
In addition to the native Mujoco renderer, we present two new renderers, [NVISII](https://github.com/owl-project/NVISII) and [iGibson](http://svl.stanford.edu/igibson/), and introduce a standardized rendering interface API to enable easy swapping of renderers.

NVISII is a high-fidelity ray-tracing renderer originally developed by NVIDIA and adapted for plug-and-play usage in **robosuite**. It is primarily used for training perception models and visualizing results in high quality, and runs at up to ~0.5 fps on a GTX 1080Ti GPU. Note that NVISII must be installed (`pip install nvisii`) in order to use this renderer.

iGibson is a much faster renderer that additionally supports physics-based rendering (PBR) and direct rendering to PyTorch tensors. While not as high-fidelity as NVISII, it is extremely fast and can run at up to ~1500 fps on a GTX 1080Ti GPU. Note that iGibson must be installed (`pip install igibson`) in order to use this renderer.

With the addition of these new renderers, we also introduce a standardized [renderer interface](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/renderers/base.py) for easy usage and customization of the various renderers. During each environment step, the renderer updates its internal state by calling `update()` and renders by calling `render(...)`. The resulting visual observations can be polled by calling `get_pixel_obs()` or other renderer-specific methods. We provide a [demo script](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/demos/demo_segmentation.py) for testing each new renderer, and our docs provide [additional information](http://robosuite.ai/docs/modules/renderers.md) on renderer-specific details and installation procedures.
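As a minimal sketch of what swapping renderers can look like (the `renderer` argument and the observation handling below are assumptions; see the renderer docs for exact usage):

```python
import numpy as np
import robosuite as suite

# A minimal sketch: the renderer is chosen at construction time. "mujoco" is
# the default; "nvisii" and "igibson" assume the corresponding packages are
# installed. The `renderer` argument name is an assumption from the docs.
env = suite.make(
    env_name="Lift",
    robots="Panda",
    renderer="igibson",
    use_camera_obs=True,
    camera_names="agentview",
)

env.reset()
low, high = env.action_spec
for _ in range(10):
    # Each env.step() drives the renderer's update()/render() cycle;
    # visual observations come back through the observation dict.
    obs, reward, done, info = env.step(np.random.uniform(low, high))
```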

## Vision Modalities
In addition to new renderers, we also provide broad support for multiple vision modalities across all (Mujoco, NVISII, iGibson) renderers:

- **RGB**: Standard 3-channel color frames with values in range `[0, 255]`. This is set during environment construction with the `use_camera_obs` argument.
- **Depth**: 1-channel frames with values normalized to range `[0, 1]`. This is set during environment construction with the `camera_depths` argument.
- **Segmentation**: 1-channel frames with pixel values corresponding to integer IDs of various objects. Segmentation can occur by class, instance, or geom, and is set during environment construction with the `camera_segmentations` argument.
    
In addition to the above modalities, the following modalities are supported by a subset of renderers:

- **Surface Normals**: [NVISII, iGibson] 3-channel (x,y,z) normalized direction vectors.
- **Texture Coordinates**: [NVISII] 3-channel (x,y,z) coordinate texture mappings for each element.
- **Texture Positioning**: [NVISII, iGibson] 3-channel (x,y,z) global coordinates of each pixel.

Specific modalities can be set during environment and renderer construction. We provide demo scripts for testing the modalities supported by [NVISII](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/demos/demo_nvisii_modalities.py) and by [iGibson](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/demos/demo_igibson_modalities.py).
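For illustration, here is a hedged sketch of enabling several modalities at once (argument names follow the notes above, while the `"instance"` value and the observation key names are assumptions; see the docs for the exact accepted values):

```python
import robosuite as suite

# Request RGB, depth, and instance segmentation from the "agentview" camera.
env = suite.make(
    env_name="Lift",
    robots="Panda",
    use_camera_obs=True,               # RGB frames in [0, 255]
    camera_names="agentview",
    camera_depths=True,                # 1-channel depth normalized to [0, 1]
    camera_segmentations="instance",   # per-pixel integer object IDs
)

obs = env.reset()
rgb = obs["agentview_image"]                    # (H, W, 3) uint8, assumed key
depth = obs["agentview_depth"]                  # (H, W, 1) float, assumed key
seg = obs["agentview_segmentation_instance"]    # (H, W, 1) int, assumed key
```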

## Camera Utilities
We provide a set of general-purpose [camera utilities](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/utils/camera_utils.py) intended to enable easy manipulation of environment cameras. Of note, we include transform utilities for mapping between pixel, camera, and world frames, as well as a [CameraMover](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/utils/camera_utils.py#L244) class for dynamically moving a camera during simulation. For example, the [DemoPlaybackCameraMover](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/utils/camera_utils.py#L419) subclass enables smooth visualization during demonstration playback.
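As a brief sketch (helper and method names below are taken from `camera_utils.py`, but their exact signatures should be checked against the file; `env` and `obs` are assumed to come from an environment like the one above):

```python
import numpy as np
from robosuite.utils import camera_utils

# Build the world-to-pixel transform for a camera, then invert it to map a
# pixel (plus depth) back into world coordinates.
world_to_pixel = camera_utils.get_camera_transform_matrix(
    sim=env.sim, camera_name="agentview", camera_height=256, camera_width=256
)
pixel_to_world = np.linalg.inv(world_to_pixel)

# Convert the normalized depth buffer to metric depth before unprojecting.
depth = camera_utils.get_real_depth_map(env.sim, obs["agentview_depth"])
points = camera_utils.transform_from_pixels_to_world(
    pixels=np.array([[128, 128]]),
    depth_map=depth,
    camera_to_world_transform=pixel_to_world,
)

# Dynamically reposition a camera during simulation with CameraMover.
mover = camera_utils.CameraMover(env, camera="agentview")
mover.rotate_camera(point=None, axis=[0.0, 0.0, 1.0], angle=5.0)
```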

# Improvements
The following briefly describes other changes that improve on the pre-existing structure. This is not an exhaustive list, but rather a selection of notable changes.

- Standardize EEF frames (#204). All grippers now follow identical frame conventions, enabling plug-and-play usage across gripper types.

- Add OSC_POSITION control option for spacemouse (#209).

- Improve the model class hierarchy for robots. Robots now own a subset of models (gripper(s), mount(s), etc.), allowing easy external access to the robot's internal model hierarchy (see the sketch after this list).

- Update robosuite docs.

- Add new papers
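
For example, under the new hierarchy a robot's owned models might be accessed as follows (a sketch; the attribute names are assumptions based on the description above):

```python
# A sketch of walking the robot's model hierarchy; `env` is an existing
# robosuite environment, and the attribute names are assumptions.
robot = env.robots[0]
print(robot.robot_model)   # the robot's own MuJoCo model
print(robot.gripper)       # gripper model(s) owned by the robot
```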


# Critical Bug Fixes
- Fix OSC global orientation limits (#228)


# Other Bug Fixes
- Fix default OSC orientation control (valid default rotation matrix) (#232)

- Fix Jaco self-collisions (#235)

- Fix joint velocity controller clipping and tune default kp (#236)

-------

## Contributor Spotlight
A big thank you to the following community members for spearheading the renderer PRs for this release!
@awesome-aj0123 
@divyanshj16