[Support]:
Describe the problem you are having
I upgraded to the newest version today. I have a setup where Home Assistant pushes me a notification when an object has been detected. The new version has the stationary objects
feature. At the moment I don't have any use for it, but I think there will be some use cases once I have time to think about it! So thank you for that, but it's causing me trouble:
There are two stationary cars from my neighbour at the moment. One of them is giving me a headache. It's usually not there all the time, but today it is. It's a silver one (side view). Unfortunately I can't post a picture of the car (privacy), but it's 15 meters away from my cam. Another dark blue car (front view) is 5 meters to the right of the silver one; that one isn't causing any trouble.
I have set contour_area: 125
and have also tried values up to 1000.
The area where the two cars are is also completely masked.
I don't know how to solve this issue, so I have to deactivate Frigate for now. My config worked for more than 6 months without an issue like this.
Thank you for your time and help!
Version
0.10.0-BFECEE9
Frigate config file
detectors:
  coral:
    type: edgetpu
    device: usb

cameras:
  hof:
    ffmpeg:
      inputs:
        # Main stream (1920x1080)
        - path: ***
          roles:
            - record
            - rtmp
        # Sub stream (640x352)
        - path: ***
          roles:
            - detect
    motion:
      contour_area: 125 # reduce the sensitivity for motion detection
    detect:
      width: 640
      height: 352
      fps: 5
      # record:
      #   # Optional: Enable recording (default: shown below)
      #   enabled: True
      # snapshots:
      #   enabled: True
      # Optional: Configuration for stationary object tracking
      stationary:
        # Optional: Frequency for running detection on stationary objects (default: shown below)
        # When set to 0, object detection will never be run on stationary objects. If set to 10,
        # it will be run on every 10th frame.
        interval: 0
        # Optional: Number of frames without a position change for an object to be considered
        # stationary (default: 10x the frame rate or 10s)
        threshold: 50
        # Optional: Define a maximum number of frames for tracking a stationary object (default: not set, track forever)
        # This can help with false positives for objects that should only be stationary for a limited amount of time.
        # It can also be used to disable stationary object tracking. For example, you may want to set a value for person,
        # but leave car at the default.
        max_frames:
          # Optional: Default for all object types (default: not set, track forever)
          default: 3000
          # Optional: Object specific values
          objects:
            person: 1000
  doorbird:
    ffmpeg:
      inputs:
        # It's not possible to run two separate streams from a DoorBird!
        # Main stream (1920x1080)
        # - path: ***
        #   roles:
        # Sub stream (1280x720)
        - path: ***
          roles:
            - detect
            - record
            - rtmp
    motion:
      mask:
        - 1280,0,1280,315,1280,521,1020,459,740,473,378,454,374,317,437,315,435,289,0,147,0,0,625,0
      # reduce the sensitivity for motion detection
      contour_area: 125
    detect:
      width: 1280
      height: 720
      fps: 5
      # Optional: Configuration for stationary object tracking
      stationary:
        # Optional: Frequency for running detection on stationary objects (default: shown below)
        # When set to 0, object detection will never be run on stationary objects. If set to 10,
        # it will be run on every 10th frame.
        interval: 0
        # Optional: Number of frames without a position change for an object to be considered
        # stationary (default: 10x the frame rate or 10s)
        threshold: 100
        # Optional: Define a maximum number of frames for tracking a stationary object (default: not set, track forever)
        # This can help with false positives for objects that should only be stationary for a limited amount of time.
        # It can also be used to disable stationary object tracking. For example, you may want to set a value for person,
        # but leave car at the default.
        max_frames:
          # Optional: Default for all object types (default: not set, track forever)
          default: 3000
          # Optional: Object specific values
          objects:
            person: 1000

objects:
  track:
    - car
    - person
    - bicycle
  filters:
    person:
      # Optional: minimum width*height of the bounding box for the detected object (default: 0)
      min_area: 5000
      # Optional: maximum width*height of the bounding box for the detected object (default: 24000000)
      max_area: 100000
      # Optional: minimum score for the object to initiate tracking (default: shown below)
      min_score: 0.5
      # Optional: minimum decimal percentage for tracked object's computed score to be considered a true positive (default: shown below)
      threshold: 0.7

# Optional: Record configuration
# NOTE: Can be overridden at the camera level
record:
  # Optional: Enable recording (default: shown below)
  enabled: True
  # Optional: Number of days to retain recordings regardless of events (default: shown below)
  # NOTE: This should be set to 0 and retention should be defined in the events section below
  # if you only want to retain recordings of events.
  retain:
    days: 7
    mode: motion
  # Optional: Event recording settings
  events:
    # Optional: Maximum length of time to retain video during long events. (default: shown below)
    # NOTE: If an object is being tracked for longer than this amount of time, the retained recordings
    # will be the last x seconds of the event unless retain_days under record is > 0.
    max_seconds: 300
    # Optional: Number of seconds before the event to include (default: shown below)
    pre_capture: 5
    # Optional: Number of seconds after the event to include (default: shown below)
    post_capture: 5
    # Optional: Objects to save recordings for. (default: all tracked objects)
    # objects:
    #   - person
    # Optional: Restrict recordings to objects that entered any of the listed zones (default: no required zones)
    # required_zones: []
    # Optional: Retention settings for recordings of events
    retain:
      # Required: Default retention days (default: shown below)
      default: 10
      mode: active_objects

snapshots:
  # Optional: Enable writing jpg snapshot to /media/frigate/clips (default: shown below)
  # This value can be set via MQTT and will be updated in startup based on retained value
  enabled: True
  # Optional: print a timestamp on the snapshots (default: shown below)
  timestamp: False
  # Optional: draw bounding box on the snapshots (default: shown below)
  bounding_box: False
  # Optional: crop the snapshot (default: shown below)
  crop: False
  # Optional: height to resize the snapshot to (default: original size)
  height: 175
  # Optional: Restrict snapshots to objects that entered any of the listed zones (default: no required zones)
  required_zones: []
  # Optional: Camera override for retention settings (default: global values)
  retain:
    # Required: Default retention days (default: shown below)
    default: 10
    # Optional: Per object retention days
    objects:
      person: 15
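Worth noting: per the Frigate mask documentation, a motion mask only prevents the masked pixels from registering as motion; it does not prevent object detection or stop tracking of an object that is already in frame, which is why the masked cars can still be tracked as stationary objects. One way to stop tracking stationary cars is a per-object max_frames cap. A minimal sketch, assuming stationary sits under detect as in the config above; the value 50 (about 10 seconds at 5 fps) is illustrative, not a recommendation:

```yaml
detect:
  stationary:
    max_frames:
      objects:
        # Illustrative: drop a stationary car from tracking after ~10 s at 5 fps
        car: 50
```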
Relevant log output
---
FFprobe output from your camera
# ffprobe rtsp://***
ffprobe version 4.3.1 Copyright (c) 2007-2020 the FFmpeg developers
built with gcc 9 (Ubuntu 9.3.0-17ubuntu1~20.04)
configuration: --disable-debug --disable-doc --disable-ffplay --enable-shared --enable-avresample --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-gpl --enable-libfreetype --enable-libvidstab --enable-libmfx --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libxcb --enable-libx265 --enable-libxvid --enable-libx264 --enable-nonfree --enable-openssl --enable-libfdk_aac --enable-postproc --enable-small --enable-version3 --enable-libzmq --extra-libs=-ldl --prefix=/opt/ffmpeg --enable-libopenjpeg --enable-libkvazaar --enable-libaom --extra-libs=-lpthread --enable-vaapi --extra-cflags=-I/opt/ffmpeg/include --extra-ldflags=-L/opt/ffmpeg/lib
libavutil 56. 51.100 / 56. 51.100
libavcodec 58. 91.100 / 58. 91.100
libavformat 58. 45.100 / 58. 45.100
libavdevice 58. 10.100 / 58. 10.100
libavfilter 7. 85.100 / 7. 85.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 7.100 / 5. 7.100
libswresample 3. 7.100 / 3. 7.100
libpostproc 55. 7.100 / 55. 7.100
Input #0, rtsp, from 'rtsp://***':
Metadata:
title : RTSP/RTP stream from DoorBird
comment : mpeg/1080p/media.amp
Duration: N/A, start: 0.094633, bitrate: N/A
Stream #0:0: Video: h264, yuvj420p(pc, bt709, progressive), 1920x1080, 10 fps, 10 tbr, 90k tbn, 180k tbc
Frigate stats
{"detection_fps":0.0,"detectors":{"coral":{"detection_start":0.0,"inference_speed":9.09,"pid":221}},"doorbird":{"camera_fps":5.0,"capture_pid":234,"detection_fps":0.0,"pid":230,"process_fps":5.0,"skipped_fps":0.0},"hof":{"camera_fps":5.1,"capture_pid":232,"detection_fps":0.0,"pid":228,"process_fps":5.1,"skipped_fps":0.0},"service":{"storage":{"/dev/shm":{"free":5366.0,"mount_type":"tmpfs","total":5368.7,"used":2.7},"/media/frigate/clips":{"free":5101658.4,"mount_type":"fuse.shfs","total":11997243.4,"used":6895584.9},"/media/frigate/recordings":{"free":5101658.4,"mount_type":"fuse.shfs","total":11997243.4,"used":6895584.9},"/tmp/cache":{"free":7337.1,"mount_type":"rootfs","total":8297.5,"used":960.4}},"temperatures":{},"uptime":53,"version":"0.10.0-bfecee9"}}
Operating system
UNRAID
Install method
Docker Compose
Coral version
USB
Network connection
Wired
Camera make and model
Doorbird D1101V
Any other information that may be helpful
I am using this Home Assistant config to push notifications when objects appear:
alias: Notification, Motion, Cam - Frigate object detected
trigger:
  - platform: mqtt
    topic: frigate/events
condition:
  - condition: state
    entity_id: input_boolean.disable_camera_hof_motion_notification_door_open
    state: 'off'
  - condition: state
    entity_id: input_boolean.disable_camera_hof_motion_notification
    state: 'off'
action:
  - service: notify.mobile_app_iphone_12_pro_max
    data_template:
      message: Ein {{trigger.payload_json["after"]["label"]}} wurde erkannt.
      data:
        image: >-
          https://***/api/frigate/notifications/{{trigger.payload_json["after"]["id"]}}/thumbnail.jpg?format=android
        tag: '{{trigger.payload_json["after"]["id"]}}'
        url: /lovelace/kameras
        clickAction: /lovelace/kameras
  - delay:
      hours: 0
      minutes: 0
      seconds: 30
      milliseconds: 0
mode: single
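One likely reason for the repeated notifications is that this automation fires on every message published to frigate/events, including the periodic "update" messages emitted for tracked (stationary) objects. A hedged sketch of an additional condition that only passes for newly created events, assuming the payload's top-level type field ("new"/"update"/"end") described in the Frigate integration docs:

```yaml
condition:
  # Illustrative: only notify when a new event is created,
  # not on "update" messages for already-tracked objects
  - condition: template
    value_template: "{{ trigger.payload_json['type'] == 'new' }}"
```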
Issue Analytics
- Created: 2 years ago
- Comments: 15 (7 by maintainers)
Top GitHub Comments
@Coolie1101
It’s documented here: https://docs.frigate.video/configuration/masks/#further-clarification
and here: https://docs.frigate.video/guides/stationary_objects/
I’m not sure why you are getting notifications there. You should look at the automation trace tools to see what is happening.