This issue is for a: (mark with an `x`)
- [ ] bug report -> please search issues before submitting
- [x] feature request
- [ ] documentation issue or request
- [ ] regression (a behavior that used to work and stopped in a new release)
Minimal steps to reproduce
We are trying to use the Factory-AI-Vision sample; however, RTSP streams have noticeable lag (1-2 seconds in our tests), and for our scenario we need to be as close to real-time as possible. Would it be possible to specify a camera that is directly connected to the host (NVIDIA Jetson Nano), say with `host:0` or something similar, instead of an RTSP stream URL when adding a camera?
Expected/desired behavior
I've connected my camera to the Jetson in graphical mode and run the following code, which gives the real-time performance we are looking for, so I think this is technically feasible.
import cv2

# Open the first locally attached camera (/dev/video0).
cap = cv2.VideoCapture(0)

if not cap.isOpened():
    print("Error opening video stream or file")

while cap.isOpened():
    ret, frame = cap.read()
    if ret:
        cv2.imshow('Frame', frame)
        # Quit when 'q' is pressed.
        if cv2.waitKey(25) & 0xFF == ord('q'):
            break
    else:
        break

cap.release()
cv2.destroyAllWindows()
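For context, if the sample ever supports local devices, the camera source could be dispatched on the value the user enters instead of always being treated as an RTSP URL. A minimal sketch, assuming the module uses OpenCV; `open_capture` is a hypothetical helper for illustration only, not something that exists in the sample today:

```python
import cv2

def open_capture(source):
    """Open a capture from a network URL, a camera index, or a V4L2 device path.

    Hypothetical helper: "rtsp://..." stays a network stream, "0" becomes
    camera index 0, and "/dev/video0" is opened as a local device.
    """
    if isinstance(source, str):
        if source.startswith(("rtsp://", "http://", "https://")):
            return cv2.VideoCapture(source)       # network stream (current behavior)
        if source.isdigit():
            return cv2.VideoCapture(int(source))  # local camera index, e.g. "0"
    return cv2.VideoCapture(source)               # device path, e.g. "/dev/video0"
```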
OS and Version?
NVIDIA Jetson L4T Linux (JetPack 4.4) running IoT Edge
Mention any other details that might be useful
A hint at the "HostConfig" settings needed to mimic `docker run --device=/dev/video0` in the deployment manifest, so I can expose the camera to the InferenceModule (I assume it's the InferenceModule) on IoT Edge, would be helpful as well.
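For reference, my understanding is that the Docker Engine equivalent of `docker run --device=/dev/video0` is a `HostConfig.Devices` entry in the module's `createOptions`. A rough sketch of what the module settings might look like in a deployment manifest (the module name `InferenceModule` and the image are placeholders; I haven't verified this against the sample's own manifest):

```json
"InferenceModule": {
  "type": "docker",
  "status": "running",
  "restartPolicy": "always",
  "settings": {
    "image": "<inference-module-image>",
    "createOptions": {
      "HostConfig": {
        "Devices": [
          {
            "PathOnHost": "/dev/video0",
            "PathInContainer": "/dev/video0",
            "CgroupPermissions": "rwm"
          }
        ]
      }
    }
  }
}
```

Note that in a final (non-template) deployment manifest, `createOptions` has to be a stringified JSON value rather than a nested object.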
Being able to add the device directly would allow for a lot of additional use cases. I would suggest that investigation into this feature also cover both UVC-based and non-UVC-based cameras; this would improve our ability to support industrial machine cameras. Example: