This topic describes the parameters related to the stream grabber.
The AccessMode parameter indicates the mode of access the current application has to the device:
This parameter is read-only.
Use the AutoPacketSize parameter to optimize the size of the data packets transferred via Ethernet.
When the parameter is set to true, the camera automatically negotiates the packet size to find the largest possible packet size.
To retrieve the current packet size, get the value of the GevSCPSPacketSize parameter.
Using large packets reduces the overhead for transferring images. The maximum packet size depends on the network hardware and its configuration.
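For example, a minimal C++ sketch (assuming an opened GigE camera object, as in the code listings at the end of this topic, and a camera that exposes the GevSCPSPacketSize node) could look like this:
// Let the driver negotiate the largest possible packet size.
camera.GetStreamGrabberParams().AutoPacketSize.SetValue(true);
// Read back the packet size that is actually used; the negotiated value is
// available once grabbing has been started.
int64_t packetSize = camera.GevSCPSPacketSize.GetValue();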
Use the MaxBufferSize parameter to specify the maximum size (in bytes) of a buffer used for grabbing images.
A grab application must set this parameter before grabbing starts.
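A minimal sketch of the required order, assuming an opened camera object as in the listings below (the buffer size is an example value and must be large enough to hold one image in your pixel format):
// Configure the buffers first ...
camera.GetStreamGrabberParams().MaxBufferSize.SetValue(4 * 1024 * 1024); // example: 4 MB per buffer
camera.GetStreamGrabberParams().MaxNumBuffer.SetValue(16);
// ... and only then start grabbing.
camera.StartGrabbing();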
Use the MaxNumBuffer parameter to specify the maximum number of buffers that can be used simultaneously for grabbing images.
Use the MaxTransferSize parameter to specify the maximum USB data transfer size in bytes. The default value is appropriate for most applications. Increase the value to lower the CPU load.
USB host adapter drivers may require decreasing the value if the application fails to receive the image stream. The maximum value depends on the operating system.
Use the NumMaxQueuedUrbs parameter to specify the maximum number of USB request blocks (URBs) to be enqueued simultaneously.
Increasing this value may improve stability and reduce jitter, but requires more resources on the host computer.
Decreasing this value can be helpful if you get error messages related to insufficient system memory, e.g., "Failed to probe and lock buffer=0xe2010130" or "Failed to submit transfer status=0xe2100001".
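As a sketch, both USB-related parameters could be reduced together when such errors occur (assuming a USB camera object as in the listings below; the values are examples only):
// Reduce the per-transfer size and the number of simultaneously queued URBs
// to lower the amount of memory the USB driver has to lock at the same time.
camera.GetStreamGrabberParams().MaxTransferSize.SetValue(262144); // example: 256 kB per transfer
camera.GetStreamGrabberParams().NumMaxQueuedUrbs.SetValue(32);    // example value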
Use the ReceiveThreadPriorityOverride parameter to enable assigning a custom priority to the thread which receives incoming stream packets. Only available if the socket driver is used.
To assign the priority, use the ReceiveThreadPriority parameter.
Use the ReceiveThreadPriority parameter to set the thread priority of the receive thread. Only available if the socket driver is used.
To assign the priority, the ReceiveThreadPriorityOverride parameter must be set to true.
Use the SocketBufferSize parameter to set the socket buffer size in kilobytes. Only available if the socket driver is used.
The Status parameter indicates the current status of the stream grabber:
This parameter is read-only.
Use the TransferLoopThreadPriority parameter to specify the priority of the threads that handle USB requests from the stream interface.
In pylon, there are two threads belonging to the USB transport layer, one for the image URBs (USB request blocks) and one for the event URBs. The transport layer enqueues the URBs to the xHCI driver and polls the bus for delivered URBs.
You can control the priority of both threads via the TransferLoopThreadPriority parameter.
On Windows, by default, the parameter is set to the following value:
On Linux and macOS, the default parameter value and the parameter value range may differ.
The transfer loop priority should always be higher than the grab engine thread priority (InternalGrabEngineThreadPriority parameter) and the grab loop thread priority (GrabLoopThreadPriority parameter).
For more information, see the C++ Programmer's Guide and Reference Documentation delivered with the Basler pylon Camera Software Suite ("Advanced Topics" -> "Application Settings for High Performance").
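A sketch of this ordering, assuming a USB camera object as in the listings below and that the Instant Camera's priority override parameters (InternalGrabEngineThreadPriorityOverride, GrabLoopThreadPriorityOverride) are used to enable the custom priorities; all priority values are examples only:
// Assign custom priorities to the grab engine and grab loop threads ...
camera.InternalGrabEngineThreadPriorityOverride.SetValue(true);
camera.InternalGrabEngineThreadPriority.SetValue(12);
camera.GrabLoopThreadPriorityOverride.SetValue(true);
camera.GrabLoopThreadPriority.SetValue(11);
// ... and keep the USB transfer loop threads above both of them.
camera.GetStreamGrabberParams().TransferLoopThreadPriority.SetValue(15);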
Use the Type parameter to set the host application's GigE Vision driver type:
The TypeIsSocketDriverAvailable parameter indicates whether the socket driver is currently available (1) or not available (0).
The TypeIsWindowsFilterDriverAvailable parameter indicates whether the pylon GigE Vision Filter Driver is currently available (1) or not available (0).
The TypeIsWindowsIntelPerformanceDriverAvailable parameter indicates whether the pylon GigE Vision Performance Driver is currently available (1) or not available (0).
The packet resend mechanism (GigE Vision only) optimizes the network performance by detecting and resending missing data packets.
In GigE Vision data transmission, each data packet carries a header that includes an ascending 24-bit packet ID. This allows the receiving end to detect whether a packet is missing.
You have to weigh the disadvantages and advantages for your special application to decide whether to enable or disable the mechanism:
The pylon GigE Vision Filter Driver and the Performance Driver use different packet resend mechanisms.
Use the EnableResend parameter to enable the packet resend mechanism.
The pylon GigE Vision Filter Driver has a simple packet resend mechanism.
If the driver detects that packets are missing, it waits for a specified period of time. If the packets don't arrive within the time specified, the driver sends one resend request.
Use the PacketTimeout parameter to specify how long (in milliseconds) the filter driver waits for the next expected packet before it initiates a resend request.
Make sure that the parameter is set to a longer time interval than the inter-packet delay.
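For example, a sketch that derives the packet timeout from the camera's current inter-packet delay (assuming a GigE camera that exposes the GevSCPD and GevTimestampTickFrequency nodes; the 10 ms margin is an arbitrary example):
// The inter-packet delay is given in timestamp ticks; convert it to milliseconds.
int64_t delayTicks = camera.GevSCPD.GetValue();
int64_t ticksPerSecond = camera.GevTimestampTickFrequency.GetValue();
int64_t interPacketDelayMs = (1000 * delayTicks) / ticksPerSecond;
// Make the packet timeout clearly longer than the inter-packet delay.
camera.GetStreamGrabberParams().PacketTimeout.SetValue(interPacketDelayMs + 10);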
Use the FrameRetention parameter to specify the maximum time in milliseconds to receive all packets of a frame. The timer starts when the first packet has been received. If the transmission is not completed within the time specified, the corresponding frame is delivered with the status "Failed".
The pylon GigE Vision Performance Driver has a more advanced packet resend mechanism.
It allows more fine-tuning. Also, the driver can send consecutive resend requests until a maximum number of requests has been reached.
Use the ReceiveWindowSize parameter to specify the size (in frames) of the "receive window" in which the stream grabber looks for missing packets.
Example: Assume the receive window size is set to 15. This means that the stream grabber looks for missing packets within the last 15 acquired frames.
The maximum value of the ReceiveWindowSize parameter is 16. If the parameter is set to 0, the packet resend mechanism is disabled.
Use the ResendRequestThreshold parameter to set the threshold after which resend requests are initiated.
The parameter value is set in percent of the receive window size.
Example: Assume the receive window size is set to 15, and the resend request threshold is set to 33 %. This means that the threshold is set after 15 * 0.33 ≈ 5 frames.
In the example above, frames 99 and 100 are already within the receive window. The stream grabber detects missing packets in these frames. However, the stream grabber does not yet send a resend request.
Rather, the grabber waits until frame 99 has passed the threshold:
Now, the grabber sends resend requests for missing packets in frames 99 and 100.
Use the ResendRequestBatching parameter to specify the number of resend requests to be batched, i.e., sent together.
The parameter value is set in percent of the number of frames between the resend request threshold and the start of the receive window.
Example: Assume the receive window size is set to 15, the resend request threshold is set to 33 %, and the resend request batching is set to 80 %. This means that the batching covers 15 * 0.33 * 0.80 ≈ 4 frames.
In the example above, frame 99 has just passed the resend request threshold. The stream grabber looks for missing packets in the frames between the two thresholds and groups them.
Now, the stream grabber sends a single resend request for all missing packets in frames 99, 100, 101, and 102.
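The following small, self-contained sketch only reproduces the arithmetic of the two examples above; it does not use the pylon API:
#include <cmath>
#include <iostream>

int main()
{
    const int receiveWindowSize = 15;     // frames in the receive window
    const double thresholdPercent = 33.0; // ResendRequestThreshold
    const double batchingPercent = 80.0;  // ResendRequestBatching

    // Threshold in frames: 15 * 0.33 ≈ 5
    const long thresholdFrames = std::lround(receiveWindowSize * thresholdPercent / 100.0);
    // Batching in frames: 5 * 0.80 = 4
    const long batchedFrames = std::lround(thresholdFrames * batchingPercent / 100.0);

    std::cout << "Resend request threshold corresponds to " << thresholdFrames << " frames\n";
    std::cout << "Resend request batching corresponds to " << batchedFrames << " frames\n";
    return 0;
}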
Use the MaximumNumberResendRequests parameter to specify the maximum number of resend requests per missing packet.
Use the ResendTimeout parameter to specify how long (in milliseconds) the stream grabber waits between detecting a missing packet and sending a resend request.
Use the ResendRequestResponseTimeout parameter to specify how long (in milliseconds) the stream grabber waits between sending a resend request and considering the request as lost.
If a request is considered lost and the maximum number of resend requests hasn't been reached yet, the grabber sends another request.
If a request is considered lost and the maximum number of resend requests has been reached, the packet is considered lost.
The following parameters (GigE Vision only) allow you to configure where the stream grabber sends the grabbed data.
The stream grabber can send the stream data to one specific device or to multiple devices in the network.
Use the TransmissionType parameter to define how stream data is transferred within the network. You can set the parameter to the following values:
Controlling and Monitoring Applications
When using limited broadcast, subnet-directed broadcast, or multicast, you usually want to send the image data stream from one camera to multiple destinations.
To achieve this, you must set up exactly one controlling application and one or more monitoring applications.
For testing purposes, you can use the pylon Viewer as the controlling application and the "pylon Viewer Multicast Monitor" as the monitoring application.
To start the pylon Viewer Multicast Monitor:
For more information about setting up controlling and monitoring applications, see the C++ Programmer's Guide and Reference Documentation delivered with the Basler pylon Camera Software Suite ("Advanced Topics" -> "GigE Multicast/Broadcast").
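As a rough sketch, a controlling application could configure multicast transmission before it starts grabbing (assuming an opened CBaslerGigEInstantCamera object; the multicast address and port are example values):
// Send the stream to a multicast group instead of only to this host.
camera.GetStreamGrabberParams().TransmissionType.SetValue(TransmissionType_Multicast);
camera.GetStreamGrabberParams().DestinationAddr.SetValue("239.0.0.1"); // example multicast address
camera.GetStreamGrabberParams().DestinationPort.SetValue(49152);       // example port; 0 = auto-select
camera.StartGrabbing();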
The DestinationAddr parameter indicates the IP address to which the stream grabber sends all stream data.
The value and the access mode of the parameter depend on the TransmissionType parameter value:
TransmissionType Parameter Value | DestinationAddr Parameter Value | DestinationAddr Access Mode |
---|---|---|
Unicast | IP address of the camera's GigE network adapter | Read-only |
LimitedBroadcast | 255.255.255.255 | Read-only |
SubnetDirectedBroadcasting | (Camera's IP address) OR NOT (camera's subnet mask) | Read-only |
Multicast | Default: 239.0.0.1, allowed range: 224.0.0.0 to 239.255.255.255 (a) | Read/Write |
(a) Some addresses in this range are reserved. If you are unsure, use an address between 239.255.0.0 and 239.255.255.255. This range is assigned by RFC 2365 as a locally administered address space.
The DestinationPort parameter indicates the port to which the stream grabber sends all stream data.
If the parameter is set to 0, pylon automatically selects an unused port.
For more information, see the C++ Programmer's Guide and Reference Documentation delivered with the Basler pylon Camera Software Suite ("Advanced Topics" -> "Selecting a Destination Port").
The pylon API provides statistics parameters that allow you to check whether your camera is set up correctly, your hardware components are appropriate, and your system performs well.
At camera startup, all statistics parameters are set to 0. While continuously grabbing images, the parameters are continuously updated to provide information about, e.g., lost images or buffers that were grabbed incompletely.
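For example, after a grab loop you might read a few of the counters to assess the transmission quality (a sketch assuming an opened camera object and the usual includes and using directives from the pylon samples, as in the listings below):
// Grab a limited number of images ...
camera.StartGrabbing(100);
CGrabResultPtr grabResult;
while (camera.IsGrabbing())
{
    camera.RetrieveResult(5000, grabResult, TimeoutHandling_ThrowException);
}
// ... then check how many buffers were grabbed incompletely.
int64_t failedBuffers = camera.GetStreamGrabberParams().Statistic_Failed_Buffer_Count.GetValue();
int64_t totalBuffers = camera.GetStreamGrabberParams().Statistic_Total_Buffer_Count.GetValue();
std::cout << failedBuffers << " of " << totalBuffers << " buffers failed." << std::endl;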
The Statistic_Buffer_Underrun_Count parameter counts the number of frames lost because there were no buffers in the queue.
The parameter value increases whenever an image is received but there are no free buffers queued in the driver's input queue; the frame is therefore lost.
The Statistic_Failed_Buffer_Count parameter counts the number of buffers that returned with status "failed", i.e., buffers that were grabbed incompletely.
The error code for incompletely grabbed buffers is 0xE1000014 on GigE cameras and 0xE2000212 on USB 3.0 cameras.
The Statistic_Failed_Packet_Count parameter counts packets that were successfully received by the stream grabber, but have been reported as "failed" by the camera.
The most common reason for packets being reported as "failed" is that a packet resend request couldn't be satisfied by the camera. This occurs, e.g., if the requested data has already been overwritten by new image data inside the camera's memory.
The Failed Packet Count does not count packets that are considered lost because all resend requests have failed. In this case, the Failed Buffer Count will be increased, but not the Failed Packet Count.
The Statistic_Last_Block_Id parameter indicates the last grabbed block ID.
The Statistic_Last_Failed_Buffer_Status parameter indicates the status code of the last failed buffer.
The Statistic_Last_Failed_Buffer_Status_Text parameter indicates the last error status of a read or write operation.
The Statistic_Missed_Frame_Count parameter counts the number of frames that were acquired but skipped because the camera's internal frame buffer was already full. Basler USB 3.0 cameras are equipped with a frame buffer of 56 MB.
A high Missed Frame Count indicates that the xHCI host controller doesn't support the bandwidth of the camera, i.e., the host controller does not retrieve the acquired images on time. This causes the camera to buffer images in its internal frame buffer. When the internal frame buffer is full, the camera will start skipping newly acquired sensor data. For more information, see the USB 3.0 specification (Bulk Transaction type).
The Statistic_Resend_Packet_Count parameter counts the number of packets requested by resend requests.
The Statistic_Resend_Request_Count parameter counts the number of packet resend requests sent.
Depending on the driver type and the stream grabber settings, the stream grabber may send multiple requests for one missing packet, or it may send one request for multiple packets. Therefore, the Resend Request Count and the Resend Packet Count will most likely be different.
The Statistic_Resynchronization_Count parameter counts the number of stream resynchronizations.
If the host gets out of sync within the streaming process, it initiates a resynchronization, and the camera's internal buffer is flushed.
A host may get out of sync if it requests stream packets with a specific sequence of IDs, but the device delivers packets with a different sequence. This may occur when the connection between the camera and the host is faulty. A host being out of sync results in massive image loss.
A host resynchronization is considered the most serious error case in the USB 3.0 and USB3 Vision specification.
The Statistic_Total_Buffer_Count parameter counts the number of buffers that returned with "success" or "failed" status, i.e., all successfully or incompletely grabbed buffers.
The error code for incompletely grabbed buffers is 0xE1000014 on GigE cameras and 0xE2000212 on USB 3.0 cameras.
The Statistic_Total_Packet_Count parameter counts all packets received, including packets that have been reported as "failed", i.e., including the Failed Packet Count.
/* General Parameters */
// Access Mode
AccessModeEnums accessMode = camera.GetStreamGrabberParams().AccessMode.GetValue();
// Auto Packet Size
camera.GetStreamGrabberParams().AutoPacketSize.SetValue(true);
// Maximum Buffer Size
camera.GetStreamGrabberParams().MaxBufferSize.SetValue(131072);
// Maximum Number of Buffers
camera.GetStreamGrabberParams().MaxNumBuffer.SetValue(16);
// Maximum Transfer Size
camera.GetStreamGrabberParams().MaxTransferSize.SetValue(1048568);
// Num Max Queued Urbs
camera.GetStreamGrabberParams().NumMaxQueuedUrbs.SetValue(64);
// Receive Thread Priority Override
camera.GetStreamGrabberParams().ReceiveThreadPriorityOverride.SetValue(true);
// Receive Thread Priority
camera.GetStreamGrabberParams().ReceiveThreadPriority.SetValue(15);
// Socket Buffer Size (socket driver only)
camera.GetStreamGrabberParams().SocketBufferSize.SetValue(2048);
// Status
StatusEnums streamGrabberStatus = camera.GetStreamGrabberParams().Status.GetValue();
// Transfer Loop Thread Priority
camera.GetStreamGrabberParams().TransferLoopThreadPriority.SetValue(15);
// Type of GigE Vision Driver
camera.GetStreamGrabberParams().Type.SetValue(Type_WindowsIntelPerformanceDriver);
// Type: Socket Driver Available
int64_t socketDriverAvailable = camera.GetStreamGrabberParams().TypeIsSocketDriverAvailable.GetValue();
// Type: Windows Filter Driver Available
int64_t filterDriverAvailable = camera.GetStreamGrabberParams().TypeIsWindowsFilterDriverAvailable.GetValue();
// Type: Windows Intel Performance Driver Available
int64_t performanceDriverAvailable = camera.GetStreamGrabberParams().TypeIsWindowsIntelPerformanceDriverAvailable.GetValue();
/* Packet Resend Mechanism Parameters */
// Enable Resends
camera.GetStreamGrabberParams().EnableResend.SetValue(true);
// Packet Timeout (Filter Driver only)
camera.GetStreamGrabberParams().PacketTimeout.SetValue(40);
// Frame Retention (Filter Driver only)
camera.GetStreamGrabberParams().FrameRetention.SetValue(200);
// Receive Window Size (Performance Driver only)
camera.GetStreamGrabberParams().ReceiveWindowSize.SetValue(16);
// Resend Request Threshold (Performance Driver only)
camera.GetStreamGrabberParams().ResendRequestThreshold.SetValue(5);
// Resend Request Batching (Performance Driver only)
camera.GetStreamGrabberParams().ResendRequestBatching.SetValue(10);
// Maximum Number of Resend Requests (Performance Driver only)
camera.GetStreamGrabberParams().MaximumNumberResendRequests.SetValue(25);
// Resend Timeout (Performance Driver only)
camera.GetStreamGrabberParams().ResendTimeout.SetValue(2);
// Resend Request Response Timeout (Performance Driver only)
camera.GetStreamGrabberParams().ResendRequestResponseTimeout.SetValue(2);
/* Stream Destination Parameters */
// Transmission Type
camera.GetStreamGrabberParams().TransmissionType.SetValue(TransmissionType_Unicast);
// Destination Address
GenICam::gcstring destinationAddr = camera.GetStreamGrabberParams().DestinationAddr.GetValue();
// Destination Port
camera.GetStreamGrabberParams().DestinationPort.SetValue(0);
/* Statistics Parameters */
// Buffer Underrun Count
int64_t bufferUnderrunCount = camera.GetStreamGrabberParams().Statistic_Buffer_Underrun_Count.GetValue();
// Failed Buffer Count
int64_t failedBufferCount = camera.GetStreamGrabberParams().Statistic_Failed_Buffer_Count.GetValue();
// Failed Packet Count
int64_t failedPacketCount = camera.GetStreamGrabberParams().Statistic_Failed_Packet_Count.GetValue();
// Last Block ID
int64_t lastBlockId = camera.GetStreamGrabberParams().Statistic_Last_Block_Id.GetValue();
// Last Failed Buffer Status
int64_t lastFailedBufferStatus = camera.GetStreamGrabberParams().Statistic_Last_Failed_Buffer_Status.GetValue();
// Last Failed Buffer Status Text
GenICam::gcstring lastFailedBufferStatusText = camera.GetStreamGrabberParams().Statistic_Last_Failed_Buffer_Status_Text.GetValue();
// Missed Frame Count
int64_t missedFrameCount = camera.GetStreamGrabberParams().Statistic_Missed_Frame_Count.GetValue();
// Resend Request Count
int64_t resendRequestCount = camera.GetStreamGrabberParams().Statistic_Resend_Request_Count.GetValue();
// Resend Packet Count
int64_t resendPacketCount = camera.GetStreamGrabberParams().Statistic_Resend_Packet_Count.GetValue();
// Resynchronization Count
int64_t resynchronizationCount = camera.GetStreamGrabberParams().Statistic_Resynchronization_Count.GetValue();
// Total Buffer Count
int64_t totalBufferCount = camera.GetStreamGrabberParams().Statistic_Total_Buffer_Count.GetValue();
// Total Packet Count
int64_t totalPacketCount = camera.GetStreamGrabberParams().Statistic_Total_Packet_Count.GetValue();
/* General Parameters */
// Access Mode
string accessMode = camera.Parameters[PLStream.AccessMode].GetValue();
// Auto Packet Size
camera.Parameters[PLStream.AutoPacketSize].SetValue(true);
// Maximum Buffer Size
camera.Parameters[PLStream.MaxBufferSize].SetValue(131072);
// Maximum Number of Buffers
camera.Parameters[PLStream.MaxNumBuffer].SetValue(16);
// Maximum Transfer Size
camera.Parameters[PLStream.MaxTransferSize].SetValue(1048568);
// Num Max Queued Urbs
camera.Parameters[PLStream.NumMaxQueuedUrbs].SetValue(64);
// Receive Thread Priority Override
camera.Parameters[PLStream.ReceiveThreadPriorityOverride].SetValue(true);
// Receive Thread Priority
camera.Parameters[PLStream.ReceiveThreadPriority].SetValue(15);
// Socket Buffer Size (socket driver only)
camera.Parameters[PLStream.SocketBufferSize].SetValue(2048);
// Status
string streamGrabberStatus = camera.Parameters[PLStream.Status].GetValue();
// Transfer Loop Thread Priority
camera.Parameters[PLStream.TransferLoopThreadPriority].SetValue(15);
// Type of GigE Vision Driver
camera.Parameters[PLStream.Type].SetValue(PLStream.Type.WindowsIntelPerformanceDriver);
// Type: Socket Driver Available
Int64 socketDriverAvailable = camera.Parameters[PLStream.TypeIsSocketDriverAvailable].GetValue();
// Type: Windows Filter Driver Available
Int64 filterDriverAvailable = camera.Parameters[PLStream.TypeIsWindowsFilterDriverAvailable].GetValue();
// Type: Windows Intel Performance Driver Available
Int64 performanceDriverAvailable = camera.Parameters[PLStream.TypeIsWindowsIntelPerformanceDriverAvailable].GetValue();
/* Packet Resend Mechanism Parameters */
// Enable Resends
camera.Parameters[PLStream.EnableResend].SetValue(true);
// Packet Timeout (Filter Driver only)
camera.Parameters[PLStream.PacketTimeout].SetValue(40);
// Frame Retention (Filter Driver only)
camera.Parameters[PLStream.FrameRetention].SetValue(200);
// Receive Window Size (Performance Driver only)
camera.Parameters[PLStream.ReceiveWindowSize].SetValue(16);
// Resend Request Threshold (Performance Driver only)
camera.Parameters[PLStream.ResendRequestThreshold].SetValue(5);
// Resend Request Batching (Performance Driver only)
camera.Parameters[PLStream.ResendRequestBatching].SetValue(10);
// Maximum Number of Resend Requests (Performance Driver only)
camera.Parameters[PLStream.MaximumNumberResendRequests].SetValue(25);
// Resend Timeout (Performance Driver only)
camera.Parameters[PLStream.ResendTimeout].SetValue(2);
// Resend Request Response Timeout (Performance Driver only)
camera.Parameters[PLStream.ResendRequestResponseTimeout].SetValue(2);
/* Stream Destination Parameters */
// Transmission Type
camera.Parameters[PLStream.TransmissionType].SetValue(PLStream.TransmissionType.Unicast);
// Destination Address
string destinationAddr = camera.Parameters[PLStream.DestinationAddr].GetValue();
// Destination Port
camera.Parameters[PLStream.DestinationPort].SetValue(0);
/* Statistics Parameters */
// Buffer Underrun Count
Int64 bufferUnderrunCount = camera.Parameters[PLStream.Statistic_Buffer_Underrun_Count].GetValue();
// Failed Buffer Count
Int64 failedBufferCount = camera.Parameters[PLStream.Statistic_Failed_Buffer_Count].GetValue();
// Failed Packet Count
Int64 failedPacketCount = camera.Parameters[PLStream.Statistic_Failed_Packet_Count].GetValue();
// Last Block ID
Int64 lastBlockId = camera.Parameters[PLStream.Statistic_Last_Block_Id].GetValue();
// Last Failed Buffer Status
Int64 lastFailedBufferStatus = camera.Parameters[PLStream.Statistic_Last_Failed_Buffer_Status].GetValue();
// Last Failed Buffer Status Text
string lastFailedBufferStatusText = camera.Parameters[PLStream.Statistic_Last_Failed_Buffer_Status_Text].GetValue();
// Missed Frame Count
Int64 missedFrameCount = camera.Parameters[PLStream.Statistic_Missed_Frame_Count].GetValue();
// Resend Packet Count
Int64 resendPacketCount = camera.Parameters[PLStream.Statistic_Resend_Packet_Count].GetValue();
// Resend Request Count
Int64 resendRequestCount = camera.Parameters[PLStream.Statistic_Resend_Request_Count].GetValue();
// Resynchronization Count
Int64 resynchronizationCount = camera.Parameters[PLStream.Statistic_Resynchronization_Count].GetValue();
// Total Buffer Count
Int64 totalBufferCount = camera.Parameters[PLStream.Statistic_Total_Buffer_Count].GetValue();
// Total Packet Count
Int64 totalPacketCount = camera.Parameters[PLStream.Statistic_Total_Packet_Count].GetValue();
You can also use the pylon Viewer to easily set the parameters.