Sensor configuration
Firstly, thank you for your great work!
Questions
- In configs/tasks/pointnav_gibson.yaml (line 5) and configs/tasks/pointnav_mp3d.yaml (line 5), SENSORS for the SIMULATOR is set to ['RGB_SENSOR']. However, the observations returned by the simulator still contain 'rgb', 'depth', and 'goal'. How can we correctly specify which sensors appear in the simulator observations?
- Are all the instances in the Gibson/MP3D train/val/test sets reachable within 500 steps? (One instance means one episode_id data entry; I am not sure whether the word "instance" is suitable to describe it.)
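For reference, the sensor setting being asked about looks roughly like the fragment below. This is a sketch based only on the question's description of the YAML files; the commented-out line is an assumption illustrating how a second sensor might be listed, not a confirmed fix:

```yaml
SIMULATOR:
  # Only the sensors listed here are expected to appear in observations.
  SENSORS: ['RGB_SENSOR']
  # Presumably, to also receive depth observations one would extend the list:
  # SENSORS: ['RGB_SENSOR', 'DEPTH_SENSOR']
```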
Issue Analytics
- State:
- Created 4 years ago
- Comments: 10 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
A single instance of the simulator can only utilize one GPU at a time; however, it can use any GPU when built in headless mode (using the `--headless` flag; this is only supported with NVIDIA GPUs on Linux). Currently the baseline code isn't set up to use multiple GPUs, but it shouldn't be hard to modify. For the simulator, you can spawn different instances on different GPUs by modifying the loop here: https://github.com/facebookresearch/habitat-api/blob/master/baselines/evaluate_ppo.py#L49 to change the GPU id based on the rank of the process. For PyTorch, proceed as normal.
The coordinate system in habitat-sim is -z forward, +y up.
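As a small illustration of that convention, the agent's forward direction after a yaw rotation can be computed as below. This is a sketch assuming a right-handed frame with yaw measured counter-clockwise about the +y (up) axis; `forward_vector` is a hypothetical helper, not a habitat-sim API:

```python
import math

def forward_vector(yaw: float) -> tuple:
    """Unit forward direction (x, y, z) in habitat-sim's frame
    (-z forward, +y up) after rotating by `yaw` radians about +y."""
    return (-math.sin(yaw), 0.0, -math.cos(yaw))
```

With yaw = 0 this returns (0, 0, -1), i.e. straight down the -z axis, matching the stated convention.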
Thank you for your quick reply!