
RTSP Output as a Pipeline Node

See original GitHub issue

Start with the why:

In some cases it may be desirable to expose an RTSP (Real Time Streaming Protocol, i.e. IP-camera-style) output over Ethernet for compatibility with existing software stacks. This provides instant, drop-in integration, particularly when working with closed-source (or hard-to-modify) tools that already accept RTSP inputs.

This becomes even more powerful when the RTSP output is a node in the pipeline: instead of only the whole camera feed being output as an RTSP video stream, any video stream in the pipeline can be. Take, for example, using an object detector to guide digital PTZ (https://github.com/luxonis/depthai/issues/135) and then outputting that panned/tilted/zoomed stream directly over RTSP.

This way, the RTSP output would appear to a computer, smartphone, YouTube, etc. as if someone were actually moving a camera around.

Move to the how:

Leverage live555 and the Gen2 Pipeline Builder (#136) to implement an RTSP output node over Ethernet, targeting PoE-capable DepthAI devices such as the BW2098POE and other future DepthAI hardware designs (such as the OAK-1-POE and OAK-D-POE) based on the Ethernet-capable BW2099 module.
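
There is no RTSP node yet, but the pieces such a node would sit behind already exist in the Gen2 Python API. Below is a minimal sketch, assuming the depthai Python package (exact method signatures such as setDefaultProfilePreset vary slightly between releases), of a pipeline that produces a hardware-encoded H.264 bitstream, which is the payload an on-device RTSP server built on live555 would ultimately serve:

```python
# Sketch only: builds the device-side pipeline that an RTSP output node would sit behind.
import depthai as dai

pipeline = dai.Pipeline()

# Color camera source; any node that produces video frames could take its place.
cam = pipeline.create(dai.node.ColorCamera)
cam.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
cam.setFps(30)

# Hardware H.264 encoder; an RTSP server can repackage this bitstream without re-encoding.
enc = pipeline.create(dai.node.VideoEncoder)
enc.setDefaultProfilePreset(30, dai.VideoEncoderProperties.Profile.H264_MAIN)
cam.video.link(enc.input)

# Until an RTSP node exists, the encoded bitstream is shipped to the host over XLink.
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("h264")
enc.bitstream.link(xout.input)
```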

Move to the what:

Implement an RTSP output node in the Gen2 Pipeline Builder (https://github.com/luxonis/depthai/issues/136). This will allow any node which produces video to be output over Ethernet.
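
Until such a node exists, the behavior can be approximated on the host. The sketch below is an illustrative workaround, not the proposed implementation: it rebuilds the pipeline above, pulls the H.264 bitstream over XLink, and pipes it into ffmpeg, which publishes to an RTSP server assumed to already be running (for example mediamtx / rtsp-simple-server listening on port 8554). The tool choices and stream path are assumptions for illustration only.

```python
# Host-side workaround sketch: serve the device's H.264 bitstream over RTSP via ffmpeg.
import subprocess
import depthai as dai

# Same minimal pipeline as in the sketch above (comments omitted).
pipeline = dai.Pipeline()
cam = pipeline.create(dai.node.ColorCamera)
cam.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
cam.setFps(30)
enc = pipeline.create(dai.node.VideoEncoder)
enc.setDefaultProfilePreset(30, dai.VideoEncoderProperties.Profile.H264_MAIN)
cam.video.link(enc.input)
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("h264")
enc.bitstream.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue(name="h264", maxSize=30, blocking=True)

    # ffmpeg remuxes the raw H.264 elementary stream and publishes it over RTSP
    # to a server assumed to be listening on localhost:8554 (e.g. mediamtx).
    ffmpeg = subprocess.Popen(
        ["ffmpeg", "-loglevel", "error",
         "-f", "h264", "-i", "-",   # raw bitstream arrives on stdin
         "-c", "copy",              # no re-encode, just repackage
         "-f", "rtsp", "rtsp://localhost:8554/oak"],
        stdin=subprocess.PIPE,
    )
    try:
        while True:
            packet = q.get()        # one encoded chunk from the device
            ffmpeg.stdin.write(packet.getData().tobytes())
    except (BrokenPipeError, KeyboardInterrupt):
        pass
    finally:
        ffmpeg.stdin.close()
        ffmpeg.wait()
```

The stream can then be opened from any RTSP-aware client (for example, vlc rtsp://<host>:8554/oak), which is exactly the drop-in compatibility described above, except that the host rather than the PoE device is doing the serving.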

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Reactions: 2
  • Comments: 12 (1 by maintainers)

Top GitHub Comments

2 reactions
VanDavv commented, Jul 19, 2021

That’s a great idea @YijinLiu, thanks! I’ll migrate RTSP streaming to Gen2 and see if I can also use VideoEncoder as you suggested.

1 reaction
Luxonis-Brandon commented, Jul 16, 2021

(Note that streaming disparity there seems broken.)

Read more comments on GitHub >

Top Results From Across the Web

  • Need to direct rtsp video output to stdout - node.js
    I am able to do this on a terminal (and get raw binary data back) but when inside of node, I...
  • Get started with Video Analyzer live pipelines in the Azure portal
    This quickstart walks you through the steps to capture and record video from an RTSP camera using live pipelines in Azure Video Analyzer...
  • RTSP Streaming of processed video - Luxonis
    My understanding, from some comments I've read, is that these processed output nodes cannot be linked to encoder nodes in the onboard pipeline...
  • node-rtsp-stream - npm
    Please note that framerate from cameras must be greater than or equal to 15fps for mpeg1 encoding, otherwise ffmpeg errors...
  • Build a video streaming server with Node.js - LogRocket Blog
    As you may have guessed from the application overview, our streaming server will be fairly simple to implement. Essentially, we're creating a...
