Fatal Python errors and reverting to CPU instead of TPU

See original GitHub issue

Describe the bug: Frigate is showing fatal Python errors and reverting to CPU.

Version of frigate: docker stable-amd64

Config file

# Optional: port for http server (default: shown below)
web_port: 5000

# Optional: detectors configuration
# USB Coral devices will be auto detected with CPU fallback
detectors:
  # Required: name of the detector
  coral:
    # Required: type of the detector
    # Valid values are 'edgetpu' (requires device property below) and 'cpu'.
    type: edgetpu
    # Optional: device name as defined here: https://coral.ai/docs/edgetpu/multiple-edgetpu/#using-the-tensorflow-lite-python-api
    device: usb

# Required: mqtt configuration
mqtt:
  # Required: host name
  host: 192.168.0.84
  # Optional: port (default: shown below)
  port: 1883
  # Optional: topic prefix (default: shown below)
  # WARNING: must be unique if you are running multiple instances
  topic_prefix: motion
  # Optional: client id (default: shown below)
  # WARNING: must be unique if you are running multiple instances
  client_id: home
  # Optional: user
  user: mosquitto
  # Optional: password
  # NOTE: Environment variables that begin with 'FRIGATE_' may be referenced in {}. 
  #       eg. password: '{FRIGATE_MQTT_PASSWORD}'
  password: 

# Optional: Global configuration for saving clips
save_clips:
  # Optional: Maximum length of time to retain video during long events. (default: shown below)
  # NOTE: If an object is being tracked for longer than this amount of time, the cache
  #       will begin to expire and the resulting clip will be the last x seconds of the event.
  max_seconds: 300
  # Optional: Location to save event clips. (default: shown below)
  clips_dir: /clips
  # Optional: Location to save cache files for creating clips. (default: shown below)
  # NOTE: To reduce wear on SSDs and SD cards, use a tmpfs volume.
  cache_dir: /cache

# Optional: Global ffmpeg args
# "ffmpeg" + global_args + input_args + "-i" + input + output_args
ffmpeg:
  # Optional: global ffmpeg args (default: shown below)
  global_args:
    - -hide_banner
    - -loglevel
    - panic
  # Optional: global hwaccel args (default: shown below)
  # NOTE: See hardware acceleration docs for your specific device
  hwaccel_args: []
  # Optional: global input args (default: shown below)
  input_args:
    - -avoid_negative_ts
    - make_zero
    - -fflags
    - nobuffer
    - -flags
    - low_delay
    - -strict
    - experimental
    - -fflags
    - +genpts+discardcorrupt
    - -rtsp_transport
    - tcp
    - -stimeout
    - '5000000'
    - -use_wallclock_as_timestamps
    - '1'
  # Optional: global output args (default: shown below)
  output_args:
    - -f
    - rawvideo
    - -pix_fmt
    - yuv420p

# Optional: Global object filters for all cameras.
# NOTE: can be overridden at the camera level
objects:
  # Optional: list of objects to track from labelmap.txt (default: shown below)
  track:
    - person
  # Optional: filters to reduce false positives for specific object types
  filters:
    person:
      # Optional: minimum width*height of the bounding box for the detected object (default: 0)
      min_area: 5000
      # Optional: maximum width*height of the bounding box for the detected object (default: max_int)
      max_area: 100000
      # Optional: minimum score for the object to initiate tracking (default: shown below)
      min_score: 0.5
      # Optional: minimum decimal percentage for tracked object's computed score to be considered a true positive (default: shown below)
      threshold: 0.85

# Required: configuration section for cameras
cameras:
  # Required: name of the camera
  porch:
    # Required: ffmpeg settings for the camera
    ffmpeg:
      # Required: Source passed to ffmpeg after the -i parameter.
      # NOTE: Environment variables that begin with 'FRIGATE_' may be referenced in {}
      input: rtsp://admin:.....@192.168.0.10:554/cam/realmonitor?channel=1&subtype=1
      
    # Optional: height of the frame
    # NOTE: Recommended to set this value, but frigate will attempt to autodetect.
    height: 720
    # Optional: width of the frame
    # NOTE: Recommended to set this value, but frigate will attempt to autodetect.
    width: 1280
    # Optional: desired fps for your camera
    # NOTE: Recommended value of 5. Ideally, try and reduce your FPS on the camera.
    #       Frigate will attempt to autodetect if not specified.
    fps: 5

    # Optional: motion mask
    # NOTE: see docs for more detailed info on creating masks
    mask: poly,0,900,1080,900,1080,1920,0,1920

    # Optional: timeout for highest scoring image before allowing it
    # to be replaced by a newer image. (default: shown below)
    best_image_timeout: 60

    # Optional: camera specific mqtt settings
    mqtt:
      # Optional: crop the camera frame to the detection region of the object (default: False)
      crop_to_region: True
      # Optional: resize the image before publishing over mqtt
      snapshot_height: 300

    # Optional: zones for this camera
    zones:
      # Required: name of the zone
      # NOTE: This must be different than any camera names, but can match with another zone on another
      #       camera.
      front_steps:
        # Required: List of x,y coordinates to define the polygon of the zone.
        # NOTE: Coordinates can be generated at https://www.image-map.net/
        coordinates: 545,1077,747,939,788,805
        # Optional: Zone level object filters.
        # NOTE: The global and camera filters are applied upstream.
        filters:
          person:
            min_area: 5000
            max_area: 100000
            threshold: 0.8

    # Optional: save clips configuration
    # NOTE: This feature does not work if you have added "-vsync drop" in your input params. 
    #       This will only work for camera feeds that can be copied into the mp4 container format without
    #       encoding such as h264. It may not work for some types of streams.
    save_clips:
      # Required: enables clips for the camera (default: shown below)
      enabled: False
      # Optional: Number of seconds before the event to include in the clips (default: shown below)
      pre_capture: 30
      # Optional: Objects to save clips for. (default: all tracked objects)
      objects:
        - person      

    # Optional: Configuration for the snapshots in the debug view and mqtt
    snapshots:
      # Optional: print a timestamp on the snapshots (default: shown below)
      show_timestamp: True
      # Optional: draw zones on the debug mjpeg feed (default: shown below)
      draw_zones: False
      # Optional: draw bounding boxes on the mqtt snapshots (default: shown below)
      draw_bounding_boxes: True

    # Optional: Camera level object filters config. If defined, this is used instead of the global config.
    objects:
      track:
        - person
      filters:
        person:
          min_area: 5000
          max_area: 100000
          min_score: 0.5
          threshold: 0.85

Logs

On connect called
Starting detection process: 23
Attempting to load TPU as usb
Camera capture process started for porch: 24
Camera process started for porch: 25
Creating ffmpeg process...
ffmpeg -hide_banner -loglevel panic -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -rtsp_transport tcp -stimeout 5000000 -use_wallclock_as_timestamps 1 -i rtsp://admin:akbkmb20@192.168.0.20:554/cam/realmonitor?channel=1&subtype=1 -r 5 -f rawvideo -pix_fmt yuv420p pipe:
* Serving Flask app "detect_objects" (lazy loading)
* Environment: development
* Debug mode: off

d
F :1147] HandleQueuedBulkIn transfer in failed. Unknown: USB transfer error 1 [LibUsbDataInCallback]

Fatal Python error: Aborted


Thread 0x000014edf9e55740 (most recent call first):
File "/usr/lib/python3.8/multiprocessing/connection.py", line 379 in _recv
File "/usr/lib/python3.8/multiprocessing/connection.py", line 414 in _recv_bytes
File "/usr/lib/python3.8/multiprocessing/connection.py", line 216 in recv_bytes
File "/usr/lib/python3.8/multiprocessing/queues.py", line 97 in get
File "/opt/frigate/frigate/edgetpu.py", line 118 in run_detector
File "/usr/lib/python3.8/multiprocessing/process.py", line 108 in run
File "/usr/lib/python3.8/multiprocessing/process.py", line 315 in _bootstrap
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 75 in _launch
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 19 in __init__
File "/usr/lib/python3.8/multiprocessing/context.py", line 277 in _Popen
File "/usr/lib/python3.8/multiprocessing/context.py", line 224 in _Popen
File "/usr/lib/python3.8/multiprocessing/process.py", line 121 in start
File "/opt/frigate/frigate/edgetpu.py", line 159 in start_or_restart
File "/opt/frigate/frigate/edgetpu.py", line 142 in __init__
File "detect_objects.py", line 189 in main
File "detect_objects.py", line 441 in <module>
Detection appears to have stopped. Restarting detection process
Starting detection process: 66
Attempting to load TPU as usb

d
F :1147] HandleQueuedBulkIn transfer in failed. Unknown: USB transfer error 1 [LibUsbDataInCallback]

Fatal Python error: Aborted


Thread 0x000014edee840700 (most recent call first):
File "/usr/lib/python3.8/multiprocessing/synchronize.py", line 95 in __enter__
File "/usr/lib/python3.8/multiprocessing/queues.py", line 96 in get
File "/opt/frigate/frigate/edgetpu.py", line 118 in run_detector
File "/usr/lib/python3.8/multiprocessing/process.py", line 108 in run
File "/usr/lib/python3.8/multiprocessing/process.py", line 315 in _bootstrap
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 75 in _launch
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 19 in __init__
File "/usr/lib/python3.8/multiprocessing/context.py", line 277 in _Popen
File "/usr/lib/python3.8/multiprocessing/context.py", line 224 in _Popen
File "/usr/lib/python3.8/multiprocessing/process.py", line 121 in start
File "/opt/frigate/frigate/edgetpu.py", line 159 in start_or_restart
File "detect_objects.py", line 113 in run
File "/usr/lib/python3.8/threading.py", line 932 in _bootstrap_inner
File "/usr/lib/python3.8/threading.py", line 890 in _bootstrap
Detection appears to have stopped. Restarting detection process
Starting detection process: 75
Attempting to load TPU as usb

d
F :1147] HandleQueuedBulkIn transfer in failed. Unknown: USB transfer error 1 [LibUsbDataInCallback]

Fatal Python error: Aborted


Thread 0x000014edee840700 (most recent call first):
File "/usr/lib/python3.8/multiprocessing/synchronize.py", line 95 in __enter__
File "/usr/lib/python3.8/multiprocessing/queues.py", line 96 in get
File "/opt/frigate/frigate/edgetpu.py", line 118 in run_detector
File "/usr/lib/python3.8/multiprocessing/process.py", line 108 in run
File "/usr/lib/python3.8/multiprocessing/process.py", line 315 in _bootstrap
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 75 in _launch
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 19 in __init__
File "/usr/lib/python3.8/multiprocessing/context.py", line 277 in _Popen
File "/usr/lib/python3.8/multiprocessing/context.py", line 224 in _Popen
File "/usr/lib/python3.8/multiprocessing/process.py", line 121 in start
File "/opt/frigate/frigate/edgetpu.py", line 159 in start_or_restart
File "detect_objects.py", line 113 in run
File "/usr/lib/python3.8/threading.py", line 932 in _bootstrap_inner
File "/usr/lib/python3.8/threading.py", line 890 in _bootstrap
Detection appears to have stopped. Restarting detection process
Starting detection process: 84
Attempting to load TPU as usb
No EdgeTPU detected. Falling back to CPU.
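
The repeated "HandleQueuedBulkIn transfer in failed ... USB transfer error" lines above show the Edge TPU library losing its USB connection to the Coral, which is what eventually leads to the "No EdgeTPU detected. Falling back to CPU." message. As a first check (a hedged suggestion, assuming the standard Coral USB Accelerator on a Linux Docker host; this is not part of the original report), you can confirm the host still sees the device:

# Run on the Docker host, not inside the container.
# The Coral USB Accelerator typically enumerates as "1a6e:089a Global Unichip Corp."
# before the Edge TPU runtime initializes it, and re-enumerates as
# "18d1:9302 Google Inc." afterwards. If neither line appears, the device has
# dropped off the bus (cable, power, or passthrough problem).
lsusb | grep -iE 'Global Unichip|Google'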

Frigate debug stats

{
   "detection_fps":0.1,
   "detectors":{
      "coral":{
         "detection_start":0.0,
         "inference_speed":10.0,
         "pid":84
      }
   },
   "porch":{
      "camera_fps":2.3,
      "capture_pid":24,
      "detection_fps":0.1,
      "frame_info":{
         "detect":1607106166.109396,
         "process":0.0
      },
      "pid":25,
      "process_fps":0.1,
      "skipped_fps":0.0
   }
}
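
For context, these stats are served by Frigate's built-in web server on the web_port configured above (5000). Assuming the /debug/stats endpoint used by Frigate releases of this era (newer versions moved it to /api/stats) and a placeholder host address, they can be fetched directly:

# <frigate-host> is a placeholder for the machine running the Frigate container.
curl http://<frigate-host>:5000/debug/stats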

FFprobe from your camera

ffprobe version n4.3.1 Copyright (c) 2007-2020 the FFmpeg developers
  built with gcc 10.1.0 (GCC)
  configuration: --prefix=/usr --disable-debug --disable-static --disable-stripping --enable-avisynth --enable-fontconfig --enable-gmp --enable-gnutls --enable-gpl --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libdav1d --enable-libdrm --enable-libfreetype --enable-libfribidi --enable-libgsm --enable-libiec61883 --enable-libjack --enable-libmfx --enable-libmodplug --enable-libmp3lame --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librav1e --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxml2 --enable-libxvid --enable-nvdec --enable-nvenc --enable-omx --enable-shared --enable-version3
  libavutil      56. 51.100 / 56. 51.100
  libavcodec     58. 91.100 / 58. 91.100
  libavformat    58. 45.100 / 58. 45.100
  libavdevice    58. 10.100 / 58. 10.100
  libavfilter     7. 85.100 /  7. 85.100
  libswscale      5.  7.100 /  5.  7.100
  libswresample   3.  7.100 /  3.  7.100
  libpostproc    55.  7.100 / 55.  7.100
Input #0, rtsp, from 'rtsp://@192.168.0.20:554/cam/realmonitor?channel=1&subtype=1':
  Metadata:
    title           : Media Server
  Duration: N/A, start: 0.040000, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuvj420p(pc, bt709, progressive), 704x576, 25 tbr, 90k tbn, 180k tbc
    Stream #0:1: Audio: pcm_alaw, 64000 Hz, 1 channels, s16, 512 kb/s

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 8 (3 by maintainers)

Top GitHub Comments

1 reaction
rayzorben commented, Dec 4, 2020

OK nevermind, found the option for privileged. I’m not seeing that message anymore.
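
For anyone hitting the same USB transfer errors, the fix described here is giving the container unrestricted access to the USB bus so the Coral stays reachable after it re-enumerates. A minimal sketch of the two usual approaches (the image tag comes from the version above; the repository name, config path, and container name are illustrative, not taken from this issue):

# Option 1: privileged mode, the option the reporter found.
docker run -d --name frigate \
  --privileged \
  -v /path/to/config:/config \
  blakeblackshear/frigate:stable-amd64

# Option 2: a commonly used narrower alternative - pass through the whole
# /dev/bus/usb directory rather than a single device node (the Coral changes
# USB IDs when its firmware loads, so mapping one node alone is not enough).
docker run -d --name frigate \
  --device /dev/bus/usb:/dev/bus/usb \
  -v /path/to/config:/config \
  blakeblackshear/frigate:stable-amd64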

0 reactions
blakeblackshear commented, Dec 5, 2020

