
How can we know agent's angle in a top-view map?

See original GitHub issue

I’m trying to draw the agent’s trajectory and its final position & rotation, like this: (screenshot: agent trajectory and pose overlaid on a top-down map)

I understand that sim.get_agent_state().rotation gives the rotation as a quaternion, but I’m still struggling to convert that to an angle on the top-down map.

Is there any utility that does this conversion in this repository?

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 6 (1 by maintainers)

Top GitHub Comments

2 reactions · mathfac commented, Nov 7, 2019

In your TASK config, specify the TOP_DOWN_MAP measure and the HEADING_SENSOR sensor. Here is an example of creating a video with the top-down map:

import habitat
import numpy as np

config = habitat.get_config("./configs/tasks/pointnav.yaml")
config.defrost()
config.TASK.SENSORS.append('HEADING_SENSOR')
config.TASK.MEASUREMENTS.append('TOP_DOWN_MAP')
config.freeze()
env = habitat.Env(config)
observations = env.reset()
images = []
while not env.episode_over:
    observations = env.step(env.action_space.sample())
    im = observations["rgb"]
    # draw_top_down_map is defined in examples/shortest_path_follower_example.py:
    # https://github.com/facebookresearch/habitat-api/blob/61af1b666bf082000280e407870fd42aa73da6a9/examples/shortest_path_follower_example.py#L37
    top_down_map = draw_top_down_map(
        env.get_metrics(), observations["heading"], im.shape[0]
    )
    output_im = np.concatenate((im, top_down_map), axis=1)
    images.append(output_im)
habitat.utils.visualizations.utils.images_to_video(images, "./", "trajectory")

It doesn’t include the trajectory itself, though.
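If you want the trajectory as well, one option is to record the agent's ground-plane position each step and paint it onto the map image yourself. A minimal sketch in plain NumPy; the helper names and the linear world-to-pixel mapping here are illustrative, not part of the Habitat API:

```python
import numpy as np

def world_to_pixel(xz, bounds_min, bounds_max, map_shape):
    """Map a ground-plane (x, z) world coordinate to a (row, col) pixel."""
    span = np.asarray(bounds_max, dtype=float) - np.asarray(bounds_min, dtype=float)
    frac = (np.asarray(xz, dtype=float) - np.asarray(bounds_min, dtype=float)) / span
    row = int(frac[1] * (map_shape[0] - 1))
    col = int(frac[0] * (map_shape[1] - 1))
    return row, col

def draw_trajectory(top_down_map, positions, bounds_min, bounds_max,
                    color=(255, 0, 0)):
    """Paint each visited (x, z) position onto a copy of the map image."""
    out = top_down_map.copy()
    for xz in positions:
        r, c = world_to_pixel(xz, bounds_min, bounds_max, out.shape[:2])
        out[r, c] = color
    return out
```

Collect `sim.get_agent_state().position` (its x and z components) inside the step loop and pass the list to `draw_trajectory` before concatenating the frames.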

2 reactions · erikwijmans commented, Nov 7, 2019

There isn’t one that I know of, but

import numpy as np
from habitat.tasks.utils import quaternion_rotate_vector, cartesian_to_polar

agent_state = sim.get_agent_state()

# Rotate the agent's forward vector (-z in Habitat) into the world frame,
# then take its polar angle in the ground plane.
heading_vector = quaternion_rotate_vector(
    agent_state.rotation.inverse(), np.array([0, 0, -1])
)

phi = cartesian_to_polar(-heading_vector[2], heading_vector[0])[1]
top_down_map_angle = phi - np.pi / 2

should do it
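The same conversion can also be done in plain NumPy without the habitat helpers. This is a sketch, assuming the quaternion is stored as (w, x, y, z) and Habitat's convention that the agent looks down -z; to get the top-down-map angle used above, subtract π/2 from the result:

```python
import numpy as np

def quat_rotate_vector(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z], dtype=float)
    v = np.asarray(v, dtype=float)
    # q v q* expanded into vector operations
    return (2.0 * np.dot(u, v) * u
            + (w * w - np.dot(u, u)) * v
            + 2.0 * w * np.cross(u, v))

def heading_from_quaternion(q):
    """Polar angle of the agent's forward (-z) direction in the ground plane."""
    # The inverse of a unit quaternion is its conjugate.
    q_inv = np.array([q[0], -q[1], -q[2], -q[3]], dtype=float)
    direction = quat_rotate_vector(q_inv, np.array([0.0, 0.0, -1.0]))
    # Same as cartesian_to_polar(-direction[2], direction[0])[1].
    return np.arctan2(direction[0], -direction[2])
```

For the identity rotation this yields a heading of 0; a 90° rotation about the vertical (y) axis yields π/2.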

EDIT: @mathfac has a better solution below 😃
