I understand that you are not able to stream the video.
But you said that the data transfer works successfully. If the data transfer is successful, how can the sockets be invalid?
The reason the video does not stream may be something else, such as the frame format or the probe control settings.
Please check whether the following debug steps help you.
A UVC implementation requires the camera device to always send exactly the number of bytes per frame that was declared. Anything more or less leads to the frame being dropped, and if this happens you will see a black screen in the application.
Follow the steps below to debug this issue:
1. Does the camera send out exactly as much data as you configured it for? Probe the FV and LV lines and make sure that the pulse widths are exactly what you expect, i.e.:
LV high duration = (number of pixels per line) * (bytes per pixel) / PCLK frequency
FV high duration = (LV high duration + line blanking time) * (number of lines per frame)
Certain cameras insert dead bands (blank lines) into the frame, which are also read by FX3. This leads to a larger frame size than what was reported to the PC.
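The timing formulas above can be sketched as a quick calculation. All sensor values below (resolution, pixel format, PCLK, line blanking) are placeholder assumptions; substitute the numbers from your own sensor's datasheet:

```python
# Hypothetical sensor settings -- replace with your sensor's actual values.
pixels_per_line = 1280
lines_per_frame = 720
bytes_per_pixel = 2          # e.g. 16-bit YUY2
pclk_hz = 48_000_000         # PCLK frequency driven by the sensor
line_blanking_s = 2e-6       # line blanking time from the sensor datasheet

# LV high duration = (pixels per line) * (bytes per pixel) / PCLK frequency
lv_high_s = pixels_per_line * bytes_per_pixel / pclk_hz

# FV high duration = (LV high + line blanking) * (lines per frame)
fv_high_s = (lv_high_s + line_blanking_s) * lines_per_frame

print(f"Expected LV high: {lv_high_s * 1e6:.2f} us per line")
print(f"Expected FV high: {fv_high_s * 1e3:.2f} ms per frame")
```

Compare the printed values against what you measure on the FV and LV lines with a scope; a mismatch usually means extra blank lines or a different PCLK than you assumed.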
2. Make sure that the vertical blanking period is at least 500 us to ensure a constant frame rate. At the end of a frame, the GPIF state machine interrupts the CPU (INTR_CPU). Once the firmware has committed the partial buffer, the GPIF state machine switches to the other socket. The vertical blanking period should be large enough to allow the GPIF state machine to switch to another socket, so that no data is lost in the next frame. If the frame size does not match what was reported to the PC, the frame will be dropped.
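The 500 us check can be estimated from the sensor's line timing. The number of blanking lines and the line timing values below are assumptions for illustration:

```python
# Estimate the vertical blanking interval and check it against the ~500 us
# minimum needed for the GPIF socket switch. Placeholder sensor values:
pixels_per_line = 1280
bytes_per_pixel = 2
pclk_hz = 48_000_000
line_blanking_s = 2e-6
blank_lines = 12             # vertical blanking lines between frames (assumed)

# One full line time = active line time + line blanking
line_time_s = pixels_per_line * bytes_per_pixel / pclk_hz + line_blanking_s
v_blank_s = blank_lines * line_time_s

if v_blank_s < 500e-6:
    print(f"Vertical blanking {v_blank_s * 1e6:.1f} us is below 500 us: "
          "the GPIF state machine may not have time to switch sockets")
else:
    print(f"Vertical blanking {v_blank_s * 1e6:.1f} us meets the 500 us minimum")
```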
3. Make sure that the PCLK frequency set by the image sensor is less than 100 MHz.
4. Make sure that the GPIF lines are terminated with 22 Ω series resistors. Also make sure that these lines are length-matched to within 500 mils.
5. Are the following fields in the descriptor file cyfxuvcdscr.c correctly set to what the camera actually gives out?
a. Frame width and height in VS Frame Descriptor
b. Bytes per pixel in VS Format Descriptor
c. Frame rate in VS Frame Descriptor
d. Video format as set by the GUID in VS Format Descriptor
6. Make sure that the maximum video frame size set in the probe control structure (glProbeCtrl) is equal to or greater than the number of bytes you are sending in one frame. Also, make sure that the maximum payload size set in the probe control structure is more than the amount you send in one payload (which is typically the DMA buffer size).
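A minimal sanity check for those two sizes, using the UVC probe control field names (dwMaxVideoFrameSize, dwMaxPayloadTransferSize). The resolution, pixel format, DMA buffer size, and the probe control values themselves are example assumptions; plug in what your descriptor and firmware actually report:

```python
# Placeholder stream configuration -- use your actual values.
width, height = 1280, 720
bytes_per_pixel = 2                  # e.g. YUY2
dma_buffer_size = 16384              # bytes per payload/DMA buffer (assumed)

frame_bytes = width * height * bytes_per_pixel   # bytes sent in one frame

# Values reported to the host in the probe control structure (examples):
dw_max_video_frame_size = 1_843_200
dw_max_payload_transfer_size = 16384

ok_frame = dw_max_video_frame_size >= frame_bytes
ok_payload = dw_max_payload_transfer_size >= dma_buffer_size
print("dwMaxVideoFrameSize OK:", ok_frame,
      "| dwMaxPayloadTransferSize OK:", ok_payload)
```

If either check fails, the host will discard frames even though the data transfer itself appears to work.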
7. Is the “Error in CyU3PDmaMultiChannelCommitBuffer: code 71” debug message displayed on a UART terminal when streaming video through FX3? If yes, implement the modifications mentioned in the following Knowledge Base Article:
https://community.cypress.com/docs/DOC-10463
8. You can take a USB trace to make sure that the UVC header field in the payload toggles between 0x8C and 0x8D in subsequent frames.
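The 0x8C/0x8D toggle is the FID bit (bit 0) of the UVC payload header's bmHeaderInfo byte alternating each frame. A sketch of checking this from captured headers; the `payloads` list is stand-in data for the first payload of each frame from your USB trace:

```python
# Each entry mimics the start of a UVC payload header:
# byte 0 = bHeaderLength (12 in the FX3 UVC example), byte 1 = bmHeaderInfo.
payloads = [bytes([0x0C, 0x8C]),   # frame N:   FID = 0
            bytes([0x0C, 0x8D]),   # frame N+1: FID = 1
            bytes([0x0C, 0x8C])]   # frame N+2: FID = 0

# Extract the FID bit (bit 0 of bmHeaderInfo) from each frame's header.
fids = [p[1] & 0x01 for p in payloads]

# FID must alternate on every new frame.
toggles = all(a != b for a, b in zip(fids, fids[1:]))
print("FID toggles each frame:", toggles)
```

If FID does not alternate in your trace, the firmware is not marking frame boundaries correctly and the host application will drop or merge frames.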