Description
Hello there,
I was using this demo to calibrate my own camera, but the result shows an unacceptable drift from the ground truth.
OpenCV version 3.2.0
Python version 3.6.7
The input image series is like below:
and the corresponding detected images are:

One thing I am not sure about: the asymmetric calibration target I used has a different distance between the columns and the rows, which can be seen clearly in my code:
longerCalibrate.txt
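For reference, here is a minimal sketch of how object points for an asymmetric circle grid with different column and row spacing are typically built; the pattern size and the two spacings below are placeholders, not the actual values from longerCalibrate.txt. If a single spacing were applied to both axes, that alone could produce different fx and fy:

```python
import numpy as np

# Placeholder pattern layout (replace with the real target values).
cols, rows = 4, 11          # pattern_size passed to cv2.findCirclesGrid
dx = 0.02                   # half the horizontal distance between circles in one row (assumed)
dy = 0.01                   # vertical distance between consecutive rows (assumed)

objp = np.zeros((rows * cols, 3), np.float32)
for i in range(rows):
    for j in range(cols):
        # In an asymmetric grid every second row is shifted by half a column
        # step, so the two spacings must be applied independently.
        objp[i * cols + j, 0] = (2 * j + i % 2) * dx
        objp[i * cols + j, 1] = i * dy
```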
Finally, the result of intrinsic calibration is
camera_matrix:
- [451.6701112652139, 0.0, 777.2304056394676]
- [0.0, 469.37232191132745, 321.2437268938009]
- [0.0, 0.0, 1.0]
dist_coeff:
- [-0.1705178885286108, 0.013773972732487686, 0.0012506566857040959, 0.004203751684462792, 0.006202670677996154]
but actually, the intrinsic matrix should be
9.543190920000000000e+02 0.000000000000000000e+00 6.399660030000000000e+02
0.000000000000000000e+00 9.543190920000000000e+02 3.634058530000000000e+02
0.000000000000000000e+00 0.000000000000000000e+00 1.000000000000000000e+00
This is more plausible because the resolution of these pictures is 1280x720, so the principal point should be near (640, 360).
So the question is: is there anything wrong with my code, or is this a problem inside the OpenCV function itself?
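In case it helps, this is the kind of sanity check I had in mind: re-run the calibration with a rough initial guess and look at the RMS reprojection error, to see whether the drift comes from the input data or from the optimisation. This is only a sketch: object_points and image_points are assumed to be the lists already collected from cv2.findCirclesGrid, and the image size and initial focal length are guesses, not measured values.

```python
import cv2
import numpy as np

# Assumed to exist from the detection loop:
#   object_points: list of (N, 3) float32 arrays, one per image
#   image_points:  list of (N, 1, 2) float32 arrays from cv2.findCirclesGrid
image_size = (1280, 720)                     # width, height of the input pictures
K_init = np.array([[950.0,   0.0, 640.0],    # rough initial guess, not a calibrated value
                   [  0.0, 950.0, 360.0],
                   [  0.0,   0.0,   1.0]])

flags = cv2.CALIB_USE_INTRINSIC_GUESS | cv2.CALIB_FIX_ASPECT_RATIO
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, image_size, K_init, None, flags=flags)

# A large RMS error points at bad detections or wrong object points;
# a small error with an implausible K points at a degenerate set of views.
print("RMS reprojection error:", rms)
print("camera_matrix:\n", K)
print("dist_coeff:", dist.ravel())
```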
