I am running a streaming server that produces artificial frames. Each frame is 50% black pixels and 50% white pixels. I then JPEG-compress each frame and stream it. Here are the results:
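
For reference, here is a rough sketch of how the frames are generated and the compression is timed. My actual server code, image library, and JPEG quality settings are not shown here, so the Pillow/NumPy calls and the left-half/right-half pixel split below are assumptions, not my real implementation:

    # Minimal benchmark sketch (assumed setup): build a half-black / half-white
    # frame, JPEG-compress it repeatedly, and report the average time and size.
    import time
    from io import BytesIO

    import numpy as np
    from PIL import Image

    def make_frame(width, height):
        # Artificial frame: left half black, right half white (50/50 split assumed)
        frame = np.zeros((height, width, 3), dtype=np.uint8)
        frame[:, width // 2:, :] = 255
        return Image.fromarray(frame)

    def benchmark(width, height, runs=100):
        img = make_frame(width, height)
        size = 0
        start = time.perf_counter()
        for _ in range(runs):
            buf = BytesIO()
            img.save(buf, format="JPEG")  # default quality; real setting unknown
            size = buf.tell()
        elapsed = (time.perf_counter() - start) / runs
        print(f"{width} x {height}: {elapsed * 1e6:.0f} microseconds per frame, "
              f"{size / 1024:.1f} KB")

    benchmark(320, 200)
    benchmark(640, 400)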

320 x 200
JPEG compression time: 12,400 microseconds
FPS: 42
Size of the JPEG frame: 2 KB

640 x 400
JPEG compression time: 43,200 microseconds
FPS: 21
Size of the JPEG frame: 4 KB


What I am wondering about is that VGA (640 x 400) runs at 21 fps while QVGA (320 x 200), which has only a quarter of the pixels (64,000 vs. 256,000), runs at 42 fps. In other words, the frame rate only halves while the pixel count quadruples.
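
Spelling out the arithmetic behind that comparison (pixel counts computed from the resolutions above, frame rates taken from my measurements):

    # Ratio of pixel counts vs. ratio of measured frame rates
    qvga_pixels = 320 * 200          # 64,000 pixels
    vga_pixels = 640 * 400           # 256,000 pixels
    print(vga_pixels / qvga_pixels)  # 4.0 -> VGA frame has 4x the pixels
    print(42 / 21)                   # 2.0 -> but the frame rate only halves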