Hello, I'm a student currently programming with Qt.
I'm writing a server program and a client program that communicate so that pictures taken with the computer's webcam can be sent from one to the other, but it doesn't work at all.
I can send the data to the client, but when the client reconstructs the video, the image somehow comes out broken up (it looks like a TV with bad reception).
Could you give me a hint about a program that sends and receives camera images between server and client over UDP?
I'll briefly describe my method below. Just pointing out the problem is fine.
Server Side
Shoot video with the webcam → capture it with an OpenCV function → convert to QByteArray → append the width and height information with the QByteArray member function append → send to the client via UDP writeDatagram
"Client side"
Receive the data with UDP readDatagram → find the width and height information with the QByteArray member function indexOf (I tag them before sending, so they can be found) → convert the data from QByteArray to char → build a QImage from the char data → convert to QPixmap → display
Function (1)
QImage *MainWindow::char2QImage(char *cdata)
{
    int channels = 3;
    QImage *qimg = new QImage(width_data, height_data, QImage::Format_ARGB32);
    // Read the pixel bytes as unsigned; otherwise values above 127
    // sign-extend to negative ints and corrupt qRgb()/qRgba()
    unsigned char *data = reinterpret_cast<unsigned char *>(cdata);
    for (int y = 0; y < height_data; y++, data += width_data * channels)
    {
        for (int x = 0; x < width_data; x++)
        {
            unsigned char r = 0, g = 0, b = 0, a = 0;
            if (channels == 1)
            {
                r = data[x * channels];
                g = data[x * channels];
                b = data[x * channels];
            }
            else if (channels == 3 || channels == 4)
            {
                // OpenCV stores pixels in BGR(A) order
                b = data[x * channels];
                g = data[x * channels + 1];
                r = data[x * channels + 2];
            }
            if (channels == 4)
            {
                a = data[x * channels + 3];
                qimg->setPixel(x, y, qRgba(r, g, b, a));
            }
            else
            {
                qimg->setPixel(x, y, qRgb(r, g, b));
            }
        }
    }
    return qimg;
}
"Function for server-side transmission"
// Function representing the process of transmitting camera images
void MainWindow::sendDatagram()
{
    int i;
    // Image data → QByteArray
    ba = new QByteArray(img->imageData, img->imageSize);
    qDebug() << "w=" << img->width;
    qDebug() << "h=" << img->height;
    qDebug() << "ws=" << img->widthStep;
    // Protocol: image data (variable length) <WSTART> image width </WEND> <HSTART> image height </HEND>
    // Add width to QByteArray
    ba->append("<WSTART>");
    sqwidth.setNum(img->width);
    ba->append(sqwidth);
    ba->append("</WEND>");
    // Add height to QByteArray
    ba->append("<HSTART>");
    sqheight.setNum(img->height);
    ba->append(sqheight);
    ba->append("</HEND>");
    // Compress the video data
    QByteArray comp = qCompress(*ba, 5);
    // Send the video data in chunks of at most 100 bytes;
    // mid() clamps the length at the end of the array, so the
    // final (shorter) chunk needs no special case
    for (i = 0; i < comp.size(); i += 100)
        udpserver.writeDatagram(comp.mid(i, 100), QHostAddress::LocalHost, 10000);
    // Display that the transmission is complete
    ui->label_2->setText("transfer finished");
    // Discard the QByteArray
    delete ba;
}
A 640*480 frame in 24-bit color is over 7 Mbit per frame. The target frame rate isn't stated, but it needs to be at least 15 fps to look like video, and that comes to over 100 Mbps.
If you don't have enough throughput, you have to reduce the amount of data, and how to reduce it depends on the line speed. At 10 Mbps rather than 100 Mbps, JPEG-compressing every frame might be enough; thinner lines need proper video encoding.
Even if the throughput should be sufficient, the problem may be that the sender's transmission interval is so short that it overflows its own buffer, or that the receiver's processing can't keep up and its buffer overflows, so datagrams get dropped.
© 2024 OneMinuteCode. All rights reserved.