1 | /* This is the contributed code:
2 |
|
3 | File: cvcap_v4l.cpp
|
4 | Current Location: ../opencv-0.9.6/otherlibs/highgui
|
5 |
|
6 | Original Version: 2003-03-12 Magnus Lundin [email protected]
|
7 | Original Comments:
|
8 |
|
9 | ML: This set of files adds support for firewire and USB cameras.
|
10 | First it tries to install a firewire camera;
|
11 | if that fails, it tries a V4L/USB camera.
|
12 | It has been tested with the motempl sample program
|
13 |
|
14 | First Patch: August 24, 2004 Travis Wood [email protected]
|
15 | For Release: OpenCV-Linux Beta4 opencv-0.9.6
|
16 | Tested On: LMLBT44 with 8 video inputs
|
17 | Problems? Post your questions at answers.opencv.org,
|
18 | Report bugs at code.opencv.org,
|
19 | Submit your fixes at https://github.com/Itseez/opencv/
|
20 | Patched Comments:
|
21 |
|
22 | TW: The cv cam utils that came with the initial release of OpenCV for LINUX Beta4
|
23 | were not working.  I have rewritten them so they work for me. At the same time, I tried
|
24 | to keep the original code as ML wrote it as unchanged as possible. No one likes to debug
|
25 | someone else's code, so I resisted changes as much as possible.  I have tried to keep the
|
26 | same "ideas" where applicable, that is, where I could figure out what the previous author
|
27 | intended. In some areas I just could not help myself and had to "spiffy it up" my way.
|
28 |
|
29 | These drivers should work with V4L frame capture cards other than my bttv-driven
|
30 | frame capture card.
|
31 |
|
32 | Rewritten driver for standard V4L mode. Tested using the LMLBT44 video capture card.
|
33 | The LMLBT44 runs standard bttv drivers and provides up to 8 inputs.
|
34 |
|
35 | This utility was written with the help of the document:
|
36 | http://pages.cpsc.ucalgary.ca/~sayles/VFL_HowTo
|
37 | as a general guide for interfacing with the V4L standard.
|
38 |
|
39 | Made the index value passed to icvOpenCAM_V4L(index) be the number of the
|
40 | video device source in the /dev tree. An index of -1 uses the original /dev/video.
|
41 |
|
42 | Index Device
|
43 | 0 /dev/video0
|
44 | 1 /dev/video1
|
45 | 2 /dev/video2
|
46 | 3 /dev/video3
|
47 | ...
|
48 | 7 /dev/video7
|
49 | with
|
50 | -1 /dev/video
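For example (an illustrative sketch of the standard OpenCV C API; the exact call path
depends on which capture backend OpenCV selects at runtime):

    CvCapture* cap = cvCaptureFromCAM(2);    // reaches icvOpenCAM_V4L(2) and opens /dev/video2
    CvCapture* any = cvCaptureFromCAM(-1);   // -1 takes the first available device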
|
51 |
|
52 | TW: You can select any video source, but this package was limited from the start to only
|
53 | ONE camera opened at any ONE time.
|
54 | This is an original program limitation.
|
55 | If you are interested, I will make my version available to other OpenCV users. The big
|
56 | difference in mine is that you may pass the camera number as part of the cv argument, but this
|
57 | convention is non-standard for current OpenCV calls and the camera number is not currently
|
58 | passed into the called routine.
|
59 |
|
60 | Second Patch: August 28, 2004 Sfuncia Fabio [email protected]
|
61 | For Release: OpenCV-Linux Beta4 Opencv-0.9.6
|
62 |
|
63 | FS: This patch fixes handling of non-sequential device indices (unplugged devices) and reports the real numCameras.
|
64 | For index -1 (icvOpenCAM_V4L) I do not use /dev/video but the first real device available, because
|
65 | if /dev/video is a link to /dev/video0 and the device on /dev/video0 is unplugged, /dev/video
|
66 | becomes a broken link. I search for the first available device using indexList.
|
67 |
|
68 | Third Patch: December 9, 2004 Frederic Devernay [email protected]
|
69 | For Release: OpenCV-Linux Beta4 Opencv-0.9.6
|
70 |
|
71 | [FD] I modified the following:
|
72 | - handle YUV420P, YUV420, and YUV411P palettes (for many webcams) without using floating-point
|
73 | - cvGrabFrame should not wait for the end of the first frame, and should return quickly
|
74 | (see highgui doc)
|
75 | - cvRetrieveFrame should in turn wait for the end of frame capture, and should not
|
76 |   trigger the capture of the next frame (the user chooses when to do it using GrabFrame).
|
77 |   To get the old behavior, re-call cvRetrieveFrame just after cvGrabFrame (see the usage sketch after this list).
|
78 | - having global bufferIndex and FirstCapture variables makes the code non-reentrant
|
79 | (e.g. when using several cameras), put these in the CvCapture struct.
|
80 | - according to V4L HowTo, incrementing the buffer index must be done before VIDIOCMCAPTURE.
|
81 | - the VID_TYPE_SCALES stuff from V4L HowTo is wrong: image size can be changed
|
82 | even if the hardware does not support scaling (e.g. webcams can have several
|
83 | resolutions available). Just don't try to set the size at 640x480 if the hardware supports
|
84 | scaling: open with the default (probably best) image size, and let the user scale it
|
85 | using SetProperty.
|
86 | - image size can be changed by two subsequent calls to SetProperty (for width and height)
|
87 | - bug fix: if the image size changes, realloc the new image only when it is grabbed
|
88 | - issue errors only when necessary, fix error message formatting.
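A minimal usage sketch of the grab/retrieve split and the SetProperty-based resizing
described above (illustrative only; these are the standard OpenCV 1.x C API calls):

    CvCapture* cap = cvCaptureFromCAM(0);                       // open /dev/video0
    cvSetCaptureProperty(cap, CV_CAP_PROP_FRAME_WIDTH,  320);   // set width, then
    cvSetCaptureProperty(cap, CV_CAP_PROP_FRAME_HEIGHT, 240);   // set height
    if (cvGrabFrame(cap)) {                                     // returns quickly
        IplImage* frame = cvRetrieveFrame(cap, 0);              // waits for the frame data
        // ... use frame; do not release it, the capture owns the buffer ...
    }
    cvReleaseCapture(&cap);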
|
89 |
|
90 | Fourth Patch: Sept 7, 2005 Csaba Kertesz [email protected]
|
91 | For Release: OpenCV-Linux Beta5 OpenCV-0.9.7
|
92 |
|
93 | I modified the following:
|
94 | - Additional Video4Linux2 support :)
|
95 | - Use mmap functions (v4l2)
|
96 | - New methods are internal:
|
97 | try_palette_v4l2 -> rewrite try_palette for v4l2
|
98 |       mainloop_v4l2, read_image_v4l2 -> these methods are adapted from the official v4l2 capture.c example
|
99 | try_init_v4l -> device v4l initialisation
|
100 | try_init_v4l2 -> device v4l2 initialisation
|
101 | autosetup_capture_mode_v4l -> autodetect capture modes for v4l
|
102 | autosetup_capture_mode_v4l2 -> autodetect capture modes for v4l2
|
103 |   - Modifications are kept consistent with the old Video4Linux code
|
104 |   - Video4Linux handling is used automatically when a Video4Linux2 device is not recognized
|
105 |   - Tested successfully with Logitech Quickcam Express (V4L), Creative Vista (V4L) and Genius VideoCam Notebook (V4L2)
|
106 |   - Corrected source lines that produced compiler warning messages
|
107 |   - Added an information message for v4l/v4l2 detection
|
108 |
|
109 | Fifth Patch: Sept 7, 2005 Csaba Kertesz [email protected]
|
110 | For Release: OpenCV-Linux Beta5 OpenCV-0.9.7
|
111 |
|
112 | I modified the following:
|
113 |   - Support for SN9C10x chip based webcams
|
114 | - New methods are internal:
|
115 | bayer2rgb24, sonix_decompress -> decoder routines for SN9C10x decoding from Takafumi Mizuno <[email protected]> with his pleasure :)
|
116 |   - Tested successfully with Genius VideoCam Notebook (V4L2)
|
117 |
|
118 | Sixth Patch: Sept 10, 2005 Csaba Kertesz [email protected]
|
119 | For Release: OpenCV-Linux Beta5 OpenCV-0.9.7
|
120 |
|
121 | I added the following:
|
122 | - Add capture control support (hue, saturation, brightness, contrast, gain)
|
123 | - Get and change V4L capture controls (hue, saturation, brightness, contrast)
|
124 | - New method is internal:
|
125 | icvSetControl -> set capture controls
|
126 |   - Tested successfully with Creative Vista (V4L)
|
127 |
|
128 | Seventh Patch: Sept 10, 2005 Csaba Kertesz [email protected]
|
129 | For Release: OpenCV-Linux Beta5 OpenCV-0.9.7
|
130 |
|
131 | I added the following:
|
132 | - Detect, get and change V4L2 capture controls (hue, saturation, brightness, contrast, gain)
|
133 | - New methods are internal:
|
134 | v4l2_scan_controls_enumerate_menu, v4l2_scan_controls -> detect capture control intervals
|
135 |   - Tested successfully with Genius VideoCam Notebook (V4L2)
|
136 |
|
137 | 8th patch: Jan 5, 2006, [email protected]
|
138 | Add support of V4L2_PIX_FMT_YUYV and V4L2_PIX_FMT_MJPEG.
|
139 | With this patch, newer Logitech webcams, such as the QuickCam Fusion, work.
|
140 | Note: to use these webcams, see the UVC driver at
|
141 | http://linux-uvc.berlios.de/
|
142 |
|
143 | 9th patch: Mar 4, 2006, [email protected]
|
144 | - try V4L2 before V4L, because some devices are V4L2 by default
|
145 |   but only partially implement the V4L compatibility layer;
|
146 |   it is therefore better to try V4L2 before V4L.
|
147 | - better separation between V4L2 and V4L initialization (this was needed to support
|
148 |   drivers that work with V4L2 only partially, where we otherwise do not know when we
|
149 |   need to switch from V4L2 to V4L).
|
150 |
|
151 | 10th patch: July 02, 2008, Mikhail Afanasyev [email protected]
|
152 | Fix reliability problems with high-resolution UVC cameras on Linux;
|
153 |    the symptoms were damaged images and 'Corrupt JPEG data: premature end of data segment' on stderr
|
154 | - V4L_ABORT_BADJPEG detects JPEG warnings and turns them into errors, so bad images
|
155 | could be filtered out
|
156 | - USE_TEMP_BUFFER fixes the main problem (improper buffer management) and
|
157 | prevents bad images in the first place
|
158 |
|
159 | 11th patch: April 2, 2013, Forrest Reiling [email protected]
|
160 | Added v4l2 support for getting capture property CV_CAP_PROP_POS_MSEC.
|
161 | Returns the millisecond timestamp of the last frame grabbed, or 0 if no frames have been grabbed.
|
162 | Used to successfully synchronize 2 Logitech C310 USB webcams to within 16 ms of one another.
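Example (illustrative) of reading that timestamp through the standard property API:

    cvGrabFrame(capture);
    double msec = cvGetCaptureProperty(capture, CV_CAP_PROP_POS_MSEC);
    // msec is the v4l2 buffer timestamp of the last grabbed frame, in milliseconds
    // (0 if nothing has been grabbed yet)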
|
163 |
|
164 |
|
165 | make & enjoy!
|
166 |
|
167 | */
|
168 |
|
169 | /*M///////////////////////////////////////////////////////////////////////////////////////
170 | //
|
171 | // IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
|
172 | //
|
173 | // By downloading, copying, installing or using the software you agree to this license.
|
174 | // If you do not agree to this license, do not download, install,
|
175 | // copy or use the software.
|
176 | //
|
177 | //
|
178 | // Intel License Agreement
|
179 | // For Open Source Computer Vision Library
|
180 | //
|
181 | // Copyright (C) 2000, Intel Corporation, all rights reserved.
|
182 | // Third party copyrights are property of their respective owners.
|
183 | //
|
184 | // Redistribution and use in source and binary forms, with or without modification,
|
185 | // are permitted provided that the following conditions are met:
|
186 | //
|
187 | // * Redistribution's of source code must retain the above copyright notice,
|
188 | // this list of conditions and the following disclaimer.
|
189 | //
|
190 | // * Redistribution's in binary form must reproduce the above copyright notice,
|
191 | // this list of conditions and the following disclaimer in the documentation
|
192 | // and/or other materials provided with the distribution.
|
193 | //
|
194 | // * The name of Intel Corporation may not be used to endorse or promote products
|
195 | // derived from this software without specific prior written permission.
|
196 | //
|
197 | // This software is provided by the copyright holders and contributors "as is" and
|
198 | // any express or implied warranties, including, but not limited to, the implied
|
199 | // warranties of merchantability and fitness for a particular purpose are disclaimed.
|
200 | // In no event shall the Intel Corporation or contributors be liable for any direct,
|
201 | // indirect, incidental, special, exemplary, or consequential damages
|
202 | // (including, but not limited to, procurement of substitute goods or services;
|
203 | // loss of use, data, or profits; or business interruption) however caused
|
204 | // and on any theory of liability, whether in contract, strict liability,
|
205 | // or tort (including negligence or otherwise) arising in any way out of
|
206 | // the use of this software, even if advised of the possibility of such damage.
|
207 | //
|
208 | //M*/
|
209 |
|
210 | #include "precomp.hpp"
|
211 |
|
212 | #if !defined WIN32 && (defined HAVE_CAMV4L || defined HAVE_CAMV4L2 || defined HAVE_VIDEOIO)
|
213 |
|
214 | #define CLEAR(x) memset (&(x), 0, sizeof (x))
|
215 |
|
216 | #include <stdio.h>
|
217 | #include <unistd.h>
|
218 | #include <fcntl.h>
|
219 | #include <errno.h>
|
220 | #include <sys/ioctl.h>
|
221 | #include <sys/types.h>
|
222 | #include <sys/mman.h>
|
223 |
|
224 | #ifdef HAVE_CAMV4L
|
225 | #include <linux/videodev.h>
|
226 | #endif
|
227 |
|
228 | #include <string.h>
|
229 | #include <stdlib.h>
|
230 | #include <assert.h>
|
231 | #include <sys/stat.h>
|
232 | #include <sys/ioctl.h>
|
233 |
|
234 | #ifdef HAVE_CAMV4L2
|
235 | #include <asm/types.h>
|
236 | #include <linux/videodev2.h>
|
237 | #endif
|
238 |
|
239 | #ifdef HAVE_VIDEOIO
|
240 | #include <sys/videoio.h>
|
241 | #define HAVE_CAMV4L2
|
242 | #endif
|
243 |
|
244 |
|
245 | #define DEFAULT_V4L_WIDTH 640
|
246 | #define DEFAULT_V4L_HEIGHT 480
|
247 |
|
248 | #define CHANNEL_NUMBER 1
|
249 | #define MAX_CAMERAS 8
|
250 |
|
251 |
|
252 |
|
253 | #define MAX_V4L_BUFFERS 10
|
254 | #define DEFAULT_V4L_BUFFERS 4
|
255 |
|
256 |
|
257 | #define V4L_ABORT_BADJPEG
|
258 |
|
259 | #define MAX_DEVICE_DRIVER_NAME 80
|
260 |
|
261 |
|
262 |
|
263 | #ifdef HAVE_CAMV4L2
|
264 |
|
265 |
|
266 | struct buffer
|
267 | {
|
268 | void * start;
|
269 | size_t length;
|
270 | };
|
271 |
|
272 | static unsigned int n_buffers = 0;
|
273 |
|
274 |
|
275 | #ifndef V4L2_PIX_FMT_SBGGR8
|
276 | #define V4L2_PIX_FMT_SBGGR8 v4l2_fourcc('B','A','8','1')
|
277 | #endif
|
278 | #ifndef V4L2_PIX_FMT_SN9C10X
|
279 | #define V4L2_PIX_FMT_SN9C10X v4l2_fourcc('S','9','1','0')
|
280 | #endif
|
281 |
|
282 | #ifndef V4L2_PIX_FMT_SGBRG
|
283 | #define V4L2_PIX_FMT_SGBRG v4l2_fourcc('G','B','R','G')
|
284 | #endif
|
285 |
|
286 | #endif
|
287 |
|
288 | enum PALETTE_TYPE {
|
289 | PALETTE_BGR24 = 1,
|
290 | PALETTE_YVU420,
|
291 | PALETTE_YUV411P,
|
292 | PALETTE_YUYV,
|
293 | PALETTE_UYVY,
|
294 | PALETTE_SBGGR8,
|
295 | PALETTE_SN9C10X,
|
296 | PALETTE_MJPEG,
|
297 | PALETTE_SGBRG
|
298 | };
|
299 |
|
300 | typedef struct CvCaptureCAM_V4L
|
301 | {
|
302 | int deviceHandle;
|
303 | int bufferIndex;
|
304 | int FirstCapture;
|
305 | #ifdef HAVE_CAMV4L
|
306 | struct video_capability capability;
|
307 | struct video_window captureWindow;
|
308 | struct video_picture imageProperties;
|
309 | struct video_mbuf memoryBuffer;
|
310 | struct video_mmap *mmaps;
|
311 | #endif
|
312 | char *memoryMap;
|
313 | IplImage frame;
|
314 |
|
315 | #ifdef HAVE_CAMV4L2
|
316 | enum PALETTE_TYPE palette;
|
317 |
|
318 | buffer buffers[MAX_V4L_BUFFERS + 1];
|
319 | struct v4l2_capability cap;
|
320 | struct v4l2_input inp;
|
321 | struct v4l2_format form;
|
322 | struct v4l2_crop crop;
|
323 | struct v4l2_cropcap cropcap;
|
324 | struct v4l2_requestbuffers req;
|
325 | struct v4l2_control control;
|
326 | enum v4l2_buf_type type;
|
327 | struct v4l2_queryctrl queryctrl;
|
328 |
|
329 | struct timeval timestamp;
|
330 |
|
331 |
|
332 | int v4l2_brightness, v4l2_brightness_min, v4l2_brightness_max;
|
333 | int v4l2_contrast, v4l2_contrast_min, v4l2_contrast_max;
|
334 | int v4l2_saturation, v4l2_saturation_min, v4l2_saturation_max;
|
335 | int v4l2_hue, v4l2_hue_min, v4l2_hue_max;
|
336 | int v4l2_gain, v4l2_gain_min, v4l2_gain_max;
|
337 | int v4l2_exposure, v4l2_exposure_min, v4l2_exposure_max;
|
338 |
|
339 | #endif
|
340 |
|
341 | }
|
342 | CvCaptureCAM_V4L;
|
343 |
|
344 | #ifdef HAVE_CAMV4L2
|
345 |
|
346 | int V4L2_SUPPORT = 0;
|
347 |
|
348 | #endif
|
349 |
|
350 | static void icvCloseCAM_V4L( CvCaptureCAM_V4L* capture );
|
351 |
|
352 | static int icvGrabFrameCAM_V4L( CvCaptureCAM_V4L* capture );
|
353 | static IplImage* icvRetrieveFrameCAM_V4L( CvCaptureCAM_V4L* capture, int );
|
354 |
|
355 | static double icvGetPropertyCAM_V4L( CvCaptureCAM_V4L* capture, int property_id );
|
356 | static int icvSetPropertyCAM_V4L( CvCaptureCAM_V4L* capture, int property_id, double value );
|
357 |
|
358 | static int icvSetVideoSize( CvCaptureCAM_V4L* capture, int w, int h);
|
359 |
|
360 |
|
361 |
|
362 | static int numCameras = 0;
|
363 | static int indexList = 0;
|
364 |
|
365 | /* Simple test program: find the number of video sources available.
366 | Start from 0 and go to MAX_CAMERAS while checking for the device with that name.
|
367 | If it fails on the first attempt of /dev/video0, then check if /dev/video is valid.
|
368 | Returns the global numCameras with the correct value (we hope) */
|
369 |
|
370 | static void icvInitCapture_V4L() {
|
371 | int deviceHandle;
|
372 | int CameraNumber;
|
373 | char deviceName[MAX_DEVICE_DRIVER_NAME];
|
374 |
|
375 | CameraNumber = 0;
|
376 | while(CameraNumber < MAX_CAMERAS) {
|
377 |
|
378 | sprintf(deviceName, "/dev/video%1d", CameraNumber);
|
379 |
|
380 | deviceHandle = open(deviceName, O_RDONLY);
|
381 | if (deviceHandle != -1) {
|
382 |
|
383 |
|
384 | indexList|=(1 << CameraNumber);
|
385 | numCameras++;
|
386 | }
|
387 | if (deviceHandle != -1)
|
388 | close(deviceHandle);
|
389 |
|
390 | CameraNumber++;
|
391 | }
|
392 |
|
393 | };
|
394 |
|
395 | #ifdef HAVE_CAMV4L
|
396 |
|
397 | static int
|
398 | try_palette(int fd,
|
399 | struct video_picture *cam_pic,
|
400 | int pal,
|
401 | int depth)
|
402 | {
|
403 | cam_pic->palette = pal;
|
404 | cam_pic->depth = depth;
|
405 | if (ioctl(fd, VIDIOCSPICT, cam_pic) < 0)
|
406 | return 0;
|
407 | if (ioctl(fd, VIDIOCGPICT, cam_pic) < 0)
|
408 | return 0;
|
409 | if (cam_pic->palette == pal)
|
410 | return 1;
|
411 | return 0;
|
412 | }
|
413 |
|
414 | #endif
|
415 |
|
416 | #ifdef HAVE_CAMV4L2
|
417 |
|
418 | static int try_palette_v4l2(CvCaptureCAM_V4L* capture, unsigned long colorspace)
|
419 | {
|
420 | CLEAR (capture->form);
|
421 |
|
422 | capture->form.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
423 | capture->form.fmt.pix.pixelformat = colorspace;
|
424 | capture->form.fmt.pix.field = V4L2_FIELD_ANY;
|
425 | capture->form.fmt.pix.width = DEFAULT_V4L_WIDTH;
|
426 | capture->form.fmt.pix.height = DEFAULT_V4L_HEIGHT;
|
427 |
|
428 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_S_FMT, &capture->form))
|
429 | return -1;
|
430 |
|
431 |
|
432 | if (colorspace != capture->form.fmt.pix.pixelformat)
|
433 | return -1;
|
434 | else
|
435 | return 0;
|
436 | }
|
437 |
|
438 | #endif
|
439 |
|
440 | #ifdef HAVE_CAMV4L
|
441 |
|
442 | static int try_init_v4l(CvCaptureCAM_V4L* capture, char *deviceName)
|
443 | {
|
444 |
|
445 |
|
446 |
|
447 |
|
448 | int detect = 0;
|
449 |
|
450 |
|
451 |
|
452 |
|
453 |
|
454 |
|
455 | capture->deviceHandle = open(deviceName, O_RDWR);
|
456 |
|
457 |   if (capture->deviceHandle == -1)   /* open() returns -1 on failure; 0 is a valid descriptor */
|
458 | {
|
459 | detect = -1;
|
460 |
|
461 | icvCloseCAM_V4L(capture);
|
462 | }
|
463 |
|
464 | if (detect == 0)
|
465 | {
|
466 |
|
467 | if (ioctl(capture->deviceHandle, VIDIOCGCAP, &capture->capability) < 0)
|
468 | {
|
469 | detect = 0;
|
470 | icvCloseCAM_V4L(capture);
|
471 | }
|
472 | else
|
473 | {
|
474 | detect = 1;
|
475 | }
|
476 | }
|
477 |
|
478 | return detect;
|
479 |
|
480 | }
|
481 |
|
482 | #endif
|
483 |
|
484 | #ifdef HAVE_CAMV4L2
|
485 |
|
486 | static int try_init_v4l2(CvCaptureCAM_V4L* capture, char *deviceName)
|
487 | {
|
488 |
|
489 |
|
490 |
|
491 |
|
492 |
|
493 |
|
494 | int deviceIndex;
|
495 |
|
496 |
|
497 | capture->deviceHandle = open (deviceName, O_RDWR | O_NONBLOCK, 0);
|
498 | if (-1 == capture->deviceHandle)
|
499 | {
|
500 | #ifndef NDEBUG
|
501 | fprintf(stderr, "(DEBUG) try_init_v4l2 open \"%s\": %s\n", deviceName, strerror(errno));
|
502 | #endif
|
503 | icvCloseCAM_V4L(capture);
|
504 | return -1;
|
505 | }
|
506 |
|
507 | CLEAR (capture->cap);
|
508 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_QUERYCAP, &capture->cap))
|
509 | {
|
510 | #ifndef NDEBUG
|
511 | fprintf(stderr, "(DEBUG) try_init_v4l2 VIDIOC_QUERYCAP \"%s\": %s\n", deviceName, strerror(errno));
|
512 | #endif
|
513 | icvCloseCAM_V4L(capture);
|
514 | return 0;
|
515 | }
|
516 |
|
517 |
|
518 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_G_INPUT, &deviceIndex))
|
519 | {
|
520 | #ifndef NDEBUG
|
521 | fprintf(stderr, "(DEBUG) try_init_v4l2 VIDIOC_G_INPUT \"%s\": %s\n", deviceName, strerror(errno));
|
522 | #endif
|
523 | icvCloseCAM_V4L(capture);
|
524 | return 0;
|
525 | }
|
526 |
|
527 |
|
528 | CLEAR (capture->inp);
|
529 | capture->inp.index = deviceIndex;
|
530 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_ENUMINPUT, &capture->inp))
|
531 | {
|
532 | #ifndef NDEBUG
|
533 | fprintf(stderr, "(DEBUG) try_init_v4l2 VIDIOC_ENUMINPUT \"%s\": %s\n", deviceName, strerror(errno));
|
534 | #endif
|
535 | icvCloseCAM_V4L(capture);
|
536 | return 0;
|
537 | }
|
538 |
|
539 | return 1;
|
540 |
|
541 | }
|
542 |
|
543 | static int autosetup_capture_mode_v4l2(CvCaptureCAM_V4L* capture)
|
544 | {
|
545 | if (try_palette_v4l2(capture, V4L2_PIX_FMT_BGR24) == 0)
|
546 | {
|
547 | capture->palette = PALETTE_BGR24;
|
548 | }
|
549 | else
|
550 | if (try_palette_v4l2(capture, V4L2_PIX_FMT_YVU420) == 0)
|
551 | {
|
552 | capture->palette = PALETTE_YVU420;
|
553 | }
|
554 | else
|
555 | if (try_palette_v4l2(capture, V4L2_PIX_FMT_YUV411P) == 0)
|
556 | {
|
557 | capture->palette = PALETTE_YUV411P;
|
558 | }
|
559 | else
|
560 |
|
561 | #ifdef HAVE_JPEG
|
562 | if (try_palette_v4l2(capture, V4L2_PIX_FMT_MJPEG) == 0 ||
|
563 | try_palette_v4l2(capture, V4L2_PIX_FMT_JPEG) == 0)
|
564 | {
|
565 | capture->palette = PALETTE_MJPEG;
|
566 | }
|
567 | else
|
568 | #endif
|
569 |
|
570 | if (try_palette_v4l2(capture, V4L2_PIX_FMT_YUYV) == 0)
|
571 | {
|
572 | capture->palette = PALETTE_YUYV;
|
573 | }
|
574 | else if (try_palette_v4l2(capture, V4L2_PIX_FMT_UYVY) == 0)
|
575 | {
|
576 | capture->palette = PALETTE_UYVY;
|
577 | }
|
578 | else
|
579 | if (try_palette_v4l2(capture, V4L2_PIX_FMT_SN9C10X) == 0)
|
580 | {
|
581 | capture->palette = PALETTE_SN9C10X;
|
582 | } else
|
583 | if (try_palette_v4l2(capture, V4L2_PIX_FMT_SBGGR8) == 0)
|
584 | {
|
585 | capture->palette = PALETTE_SBGGR8;
|
586 | } else
|
587 | if (try_palette_v4l2(capture, V4L2_PIX_FMT_SGBRG) == 0)
|
588 | {
|
589 | capture->palette = PALETTE_SGBRG;
|
590 | }
|
591 | else
|
592 | {
|
593 | fprintf(stderr, "HIGHGUI ERROR: V4L2: Pixel format of incoming image is unsupported by OpenCV\n");
|
594 | icvCloseCAM_V4L(capture);
|
595 | return -1;
|
596 | }
|
597 |
|
598 | return 0;
|
599 |
|
600 | }
|
601 |
|
602 | #endif
|
603 |
|
604 | #ifdef HAVE_CAMV4L
|
605 |
|
606 | static int autosetup_capture_mode_v4l(CvCaptureCAM_V4L* capture)
|
607 | {
|
608 |
|
609 | if(ioctl(capture->deviceHandle, VIDIOCGPICT, &capture->imageProperties) < 0) {
|
610 | fprintf( stderr, "HIGHGUI ERROR: V4L: Unable to determine size of incoming image\n");
|
611 | icvCloseCAM_V4L(capture);
|
612 | return -1;
|
613 | }
|
614 |
|
615 |
|
616 |
|
617 | if (try_palette(capture->deviceHandle, &capture->imageProperties, VIDEO_PALETTE_RGB24, 24)) {
|
618 |
|
619 | }
|
620 | else if (try_palette(capture->deviceHandle, &capture->imageProperties, VIDEO_PALETTE_YUV420P, 16)) {
|
621 |
|
622 | }
|
623 | else if (try_palette(capture->deviceHandle, &capture->imageProperties, VIDEO_PALETTE_YUV420, 16)) {
|
624 |
|
625 | }
|
626 | else if (try_palette(capture->deviceHandle, &capture->imageProperties, VIDEO_PALETTE_YUV411P, 16)) {
|
627 |
|
628 | }
|
629 | else {
|
630 | fprintf(stderr, "HIGHGUI ERROR: V4L: Pixel format of incoming image is unsupported by OpenCV\n");
|
631 | icvCloseCAM_V4L(capture);
|
632 | return -1;
|
633 | }
|
634 |
|
635 | return 0;
|
636 |
|
637 | }
|
638 |
|
639 | #endif
|
640 |
|
641 | #ifdef HAVE_CAMV4L2
|
642 |
|
643 |
|
644 | static void v4l2_scan_controls(CvCaptureCAM_V4L* capture)
|
645 | {
|
646 |
|
647 | __u32 ctrl_id;
|
648 |
|
649 | for (ctrl_id = V4L2_CID_BASE;
|
650 | ctrl_id < V4L2_CID_LASTP1;
|
651 | ctrl_id++)
|
652 | {
|
653 |
|
654 |
|
655 | CLEAR (capture->queryctrl);
|
656 | capture->queryctrl.id = ctrl_id;
|
657 |
|
658 | if (0 == ioctl (capture->deviceHandle, VIDIOC_QUERYCTRL,
|
659 | &capture->queryctrl))
|
660 | {
|
661 |
|
662 | if (capture->queryctrl.flags & V4L2_CTRL_FLAG_DISABLED)
|
663 | continue;
|
664 |
|
665 | if (capture->queryctrl.id == V4L2_CID_BRIGHTNESS)
|
666 | {
|
667 | capture->v4l2_brightness = 1;
|
668 | capture->v4l2_brightness_min = capture->queryctrl.minimum;
|
669 | capture->v4l2_brightness_max = capture->queryctrl.maximum;
|
670 | }
|
671 |
|
672 | if (capture->queryctrl.id == V4L2_CID_CONTRAST)
|
673 | {
|
674 | capture->v4l2_contrast = 1;
|
675 | capture->v4l2_contrast_min = capture->queryctrl.minimum;
|
676 | capture->v4l2_contrast_max = capture->queryctrl.maximum;
|
677 | }
|
678 |
|
679 | if (capture->queryctrl.id == V4L2_CID_SATURATION)
|
680 | {
|
681 | capture->v4l2_saturation = 1;
|
682 | capture->v4l2_saturation_min = capture->queryctrl.minimum;
|
683 | capture->v4l2_saturation_max = capture->queryctrl.maximum;
|
684 | }
|
685 |
|
686 | if (capture->queryctrl.id == V4L2_CID_HUE)
|
687 | {
|
688 | capture->v4l2_hue = 1;
|
689 | capture->v4l2_hue_min = capture->queryctrl.minimum;
|
690 | capture->v4l2_hue_max = capture->queryctrl.maximum;
|
691 | }
|
692 |
|
693 | if (capture->queryctrl.id == V4L2_CID_GAIN)
|
694 | {
|
695 | capture->v4l2_gain = 1;
|
696 | capture->v4l2_gain_min = capture->queryctrl.minimum;
|
697 | capture->v4l2_gain_max = capture->queryctrl.maximum;
|
698 | }
|
699 |
|
700 | if (capture->queryctrl.id == V4L2_CID_EXPOSURE)
|
701 | {
|
702 | capture->v4l2_exposure = 1;
|
703 | capture->v4l2_exposure_min = capture->queryctrl.minimum;
|
704 | capture->v4l2_exposure_max = capture->queryctrl.maximum;
|
705 | }
|
706 |
|
707 |
|
708 | } else {
|
709 |
|
710 | if (errno == EINVAL)
|
711 | continue;
|
712 |
|
713 | perror ("VIDIOC_QUERYCTRL");
|
714 |
|
715 | }
|
716 |
|
717 | }
|
718 |
|
719 | for (ctrl_id = V4L2_CID_PRIVATE_BASE;;ctrl_id++)
|
720 | {
|
721 |
|
722 |
|
723 | CLEAR (capture->queryctrl);
|
724 | capture->queryctrl.id = ctrl_id;
|
725 |
|
726 | if (0 == ioctl (capture->deviceHandle, VIDIOC_QUERYCTRL,
|
727 | &capture->queryctrl))
|
728 | {
|
729 |
|
730 | if (capture->queryctrl.flags & V4L2_CTRL_FLAG_DISABLED)
|
731 | continue;
|
732 |
|
733 | if (capture->queryctrl.id == V4L2_CID_BRIGHTNESS)
|
734 | {
|
735 | capture->v4l2_brightness = 1;
|
736 | capture->v4l2_brightness_min = capture->queryctrl.minimum;
|
737 | capture->v4l2_brightness_max = capture->queryctrl.maximum;
|
738 | }
|
739 |
|
740 | if (capture->queryctrl.id == V4L2_CID_CONTRAST)
|
741 | {
|
742 | capture->v4l2_contrast = 1;
|
743 | capture->v4l2_contrast_min = capture->queryctrl.minimum;
|
744 | capture->v4l2_contrast_max = capture->queryctrl.maximum;
|
745 | }
|
746 |
|
747 | if (capture->queryctrl.id == V4L2_CID_SATURATION)
|
748 | {
|
749 | capture->v4l2_saturation = 1;
|
750 | capture->v4l2_saturation_min = capture->queryctrl.minimum;
|
751 | capture->v4l2_saturation_max = capture->queryctrl.maximum;
|
752 | }
|
753 |
|
754 | if (capture->queryctrl.id == V4L2_CID_HUE)
|
755 | {
|
756 | capture->v4l2_hue = 1;
|
757 | capture->v4l2_hue_min = capture->queryctrl.minimum;
|
758 | capture->v4l2_hue_max = capture->queryctrl.maximum;
|
759 | }
|
760 |
|
761 | if (capture->queryctrl.id == V4L2_CID_GAIN)
|
762 | {
|
763 | capture->v4l2_gain = 1;
|
764 | capture->v4l2_gain_min = capture->queryctrl.minimum;
|
765 | capture->v4l2_gain_max = capture->queryctrl.maximum;
|
766 | }
|
767 |
|
768 | if (capture->queryctrl.id == V4L2_CID_EXPOSURE)
|
769 | {
|
770 | capture->v4l2_exposure = 1;
|
771 | capture->v4l2_exposure_min = capture->queryctrl.minimum;
|
772 | capture->v4l2_exposure_max = capture->queryctrl.maximum;
|
773 | }
|
774 |
|
775 | } else {
|
776 |
|
777 | if (errno == EINVAL)
|
778 | break;
|
779 |
|
780 | perror ("VIDIOC_QUERYCTRL");
|
781 |
|
782 | }
|
783 |
|
784 | }
|
785 |
|
786 | }
|
787 |
|
788 | static int _capture_V4L2 (CvCaptureCAM_V4L *capture, char *deviceName)
|
789 | {
|
790 | int detect_v4l2 = 0;
|
791 |
|
792 | detect_v4l2 = try_init_v4l2(capture, deviceName);
|
793 |
|
794 | if (detect_v4l2 != 1) {
|
795 |
|
796 | return -1;
|
797 | }
|
798 |
|
799 |
|
800 | V4L2_SUPPORT = 1;
|
801 |
|
802 |
|
803 | capture->v4l2_brightness = 0;
|
804 | capture->v4l2_contrast = 0;
|
805 | capture->v4l2_saturation = 0;
|
806 | capture->v4l2_hue = 0;
|
807 | capture->v4l2_gain = 0;
|
808 | capture->v4l2_exposure = 0;
|
809 |
|
810 | capture->v4l2_brightness_min = 0;
|
811 | capture->v4l2_contrast_min = 0;
|
812 | capture->v4l2_saturation_min = 0;
|
813 | capture->v4l2_hue_min = 0;
|
814 | capture->v4l2_gain_min = 0;
|
815 | capture->v4l2_exposure_min = 0;
|
816 |
|
817 | capture->v4l2_brightness_max = 0;
|
818 | capture->v4l2_contrast_max = 0;
|
819 | capture->v4l2_saturation_max = 0;
|
820 | capture->v4l2_hue_max = 0;
|
821 | capture->v4l2_gain_max = 0;
|
822 | capture->v4l2_exposure_max = 0;
|
823 |
|
824 | capture->timestamp.tv_sec = 0;
|
825 | capture->timestamp.tv_usec = 0;
|
826 |
|
827 |
|
828 | v4l2_scan_controls(capture);
|
829 |
|
830 | if ((capture->cap.capabilities & V4L2_CAP_VIDEO_CAPTURE) == 0) {
|
831 |
|
832 | fprintf( stderr, "HIGHGUI ERROR: V4L2: device %s is unable to capture video memory.\n",deviceName);
|
833 | icvCloseCAM_V4L(capture);
|
834 | return -1;
|
835 | }
|
836 |
|
837 | /* The following code sets the CHANNEL_NUMBER of the video input.  Some video sources
838 | have sub "Channel Numbers". For a typical V4L TV capture card, this is usually 1.
|
839 | I myself am using a simple NTSC video input capture card that uses the value of 1.
|
840 | If you are not in North America or have a different video standard, you WILL have to change
|
841 | the following settings and recompile/reinstall. This set of settings is based on
|
842 | the most commonly encountered input video source types (like my bttv card) */
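   /* Illustrative sketch (not part of the driver): rather than hard-coding CHANNEL_NUMBER,
      the available inputs can be listed with the standard VIDIOC_ENUMINPUT ioctl before
      choosing one.  The loop below is only an example; "inp" is a local variable here.

          struct v4l2_input inp;
          int i;
          for (i = 0; ; ++i) {
              memset(&inp, 0, sizeof(inp));
              inp.index = i;
              if (ioctl(capture->deviceHandle, VIDIOC_ENUMINPUT, &inp) == -1)
                  break;                         // no more inputs to enumerate
              fprintf(stderr, "input %d: %s\n", i, inp.name);
          }
   */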
|
843 |
|
844 | if(capture->inp.index > 0) {
|
845 | CLEAR (capture->inp);
|
846 | capture->inp.index = CHANNEL_NUMBER;
|
847 |
|
848 |
|
849 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_ENUMINPUT, &capture->inp))
|
850 | {
|
851 | fprintf (stderr, "HIGHGUI ERROR: V4L2: Unable to set channel number\n");
|
852 | icvCloseCAM_V4L (capture);
|
853 | return -1;
|
854 | }
|
855 | }
|
856 |
|
857 |
|
858 | CLEAR (capture->form);
|
859 | capture->form.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
860 |
|
861 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_G_FMT, &capture->form)) {
|
862 | fprintf( stderr, "HIGHGUI ERROR: V4L2: Could not obtain specifics of capture window.\n\n");
|
863 | icvCloseCAM_V4L(capture);
|
864 | return -1;
|
865 | }
|
866 |
|
867 | if (V4L2_SUPPORT == 0)
|
868 | {
|
869 | }
|
870 |
|
871 | if (autosetup_capture_mode_v4l2(capture) == -1)
|
872 | return -1;
|
873 |
|
874 | icvSetVideoSize(capture, DEFAULT_V4L_WIDTH, DEFAULT_V4L_HEIGHT);
|
875 |
|
876 | unsigned int min;
|
877 |
|
878 |
|
879 | min = capture->form.fmt.pix.width * 2;
|
880 |
|
881 | if (capture->form.fmt.pix.bytesperline < min)
|
882 | capture->form.fmt.pix.bytesperline = min;
|
883 |
|
884 | min = capture->form.fmt.pix.bytesperline * capture->form.fmt.pix.height;
|
885 |
|
886 | if (capture->form.fmt.pix.sizeimage < min)
|
887 | capture->form.fmt.pix.sizeimage = min;
|
888 |
|
889 | CLEAR (capture->req);
|
890 |
|
891 | unsigned int buffer_number = DEFAULT_V4L_BUFFERS;
|
892 |
|
893 | try_again:
|
894 |
|
895 | capture->req.count = buffer_number;
|
896 | capture->req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
897 | capture->req.memory = V4L2_MEMORY_MMAP;
|
898 |
|
899 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_REQBUFS, &capture->req))
|
900 | {
|
901 | if (EINVAL == errno)
|
902 | {
|
903 | fprintf (stderr, "%s does not support memory mapping\n", deviceName);
|
904 | } else {
|
905 | perror ("VIDIOC_REQBUFS");
|
906 | }
|
907 |
|
908 | icvCloseCAM_V4L (capture);
|
909 | return -1;
|
910 | }
|
911 |
|
912 | if (capture->req.count < buffer_number)
|
913 | {
|
914 | if (buffer_number == 1)
|
915 | {
|
916 | fprintf (stderr, "Insufficient buffer memory on %s\n", deviceName);
|
917 |
|
918 |
|
919 | icvCloseCAM_V4L (capture);
|
920 | return -1;
|
921 | } else {
|
922 | buffer_number--;
|
923 | fprintf (stderr, "Insufficient buffer memory on %s -- decreasing buffers\n", deviceName);
|
924 |
|
925 | goto try_again;
|
926 | }
|
927 | }
|
928 |
|
929 | for (n_buffers = 0; n_buffers < capture->req.count; ++n_buffers)
|
930 | {
|
931 | struct v4l2_buffer buf;
|
932 |
|
933 | CLEAR (buf);
|
934 |
|
935 | buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
936 | buf.memory = V4L2_MEMORY_MMAP;
|
937 | buf.index = n_buffers;
|
938 |
|
939 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_QUERYBUF, &buf)) {
|
940 | perror ("VIDIOC_QUERYBUF");
|
941 |
|
942 |
|
943 | icvCloseCAM_V4L (capture);
|
944 | return -1;
|
945 | }
|
946 |
|
947 | capture->buffers[n_buffers].length = buf.length;
|
948 | capture->buffers[n_buffers].start =
|
949 | mmap (NULL ,
|
950 | buf.length,
|
951 | PROT_READ | PROT_WRITE ,
|
952 | MAP_SHARED ,
|
953 | capture->deviceHandle, buf.m.offset);
|
954 |
|
955 | if (MAP_FAILED == capture->buffers[n_buffers].start) {
|
956 | perror ("mmap");
|
957 |
|
958 |
|
959 | icvCloseCAM_V4L (capture);
|
960 | return -1;
|
961 | }
|
962 |
|
963 | if (n_buffers == 0) {
|
964 | capture->buffers[MAX_V4L_BUFFERS].start = malloc( buf.length );
|
965 | capture->buffers[MAX_V4L_BUFFERS].length = buf.length;
|
966 | }
|
967 | }
|
968 |
|
969 |
|
970 | cvInitImageHeader( &capture->frame,
|
971 | cvSize( capture->form.fmt.pix.width,
|
972 | capture->form.fmt.pix.height ),
|
973 | IPL_DEPTH_8U, 3, IPL_ORIGIN_TL, 4 );
|
974 |
|
975 | capture->frame.imageData = (char *)cvAlloc(capture->frame.imageSize);
|
976 |
|
977 | return 1;
|
978 | };
|
979 |
|
980 | #endif
|
981 |
|
982 | #ifdef HAVE_CAMV4L
|
983 |
|
984 | static int _capture_V4L (CvCaptureCAM_V4L *capture, char *deviceName)
|
985 | {
|
986 | int detect_v4l = 0;
|
987 |
|
988 | detect_v4l = try_init_v4l(capture, deviceName);
|
989 |
|
990 | if ((detect_v4l == -1)
|
991 | )
|
992 | {
|
993 | fprintf (stderr, "HIGHGUI ERROR: V4L"
|
994 | ": device %s: Unable to open for READ ONLY\n", deviceName);
|
995 |
|
996 | return -1;
|
997 | }
|
998 |
|
999 | if ((detect_v4l <= 0)
|
1000 | )
|
1001 | {
|
1002 | fprintf (stderr, "HIGHGUI ERROR: V4L"
|
1003 | ": device %s: Unable to query number of channels\n", deviceName);
|
1004 |
|
1005 | return -1;
|
1006 | }
|
1007 |
|
1008 | {
|
1009 | if ((capture->capability.type & VID_TYPE_CAPTURE) == 0) {
|
1010 |
|
1011 | fprintf( stderr, "HIGHGUI ERROR: V4L: "
|
1012 | "device %s is unable to capture video memory.\n",deviceName);
|
1013 | icvCloseCAM_V4L(capture);
|
1014 | return -1;
|
1015 | }
|
1016 |
|
1017 | }
|
1018 |
|
1019 |
|
1020 | /* The following code sets the CHANNEL_NUMBER of the video input.  Some video sources
1021 | have sub "Channel Numbers". For a typical V4L TV capture card, this is usually 1.
|
1022 | I myself am using a simple NTSC video input capture card that uses the value of 1.
|
1023 | If you are not in North America or have a different video standard, you WILL have to change
|
1024 | the following settings and recompile/reinstall. This set of settings is based on
|
1025 | the most commonly encountered input video source types (like my bttv card) */
|
1026 |
|
1027 | {
|
1028 |
|
1029 | if(capture->capability.channels>0) {
|
1030 |
|
1031 | struct video_channel selectedChannel;
|
1032 | memset(&selectedChannel, 0, sizeof(selectedChannel));
|
1033 |
|
1034 | selectedChannel.channel=CHANNEL_NUMBER;
|
1035 | if (ioctl(capture->deviceHandle, VIDIOCGCHAN , &selectedChannel) != -1) {
|
1036 |
|
1037 |
|
1038 | if (ioctl(capture->deviceHandle, VIDIOCSCHAN , &selectedChannel) == -1) {
|
1039 |
|
1040 |
|
1041 | }
|
1042 | }
|
1043 | }
|
1044 |
|
1045 | }
|
1046 |
|
1047 | {
|
1048 |
|
1049 | if(ioctl(capture->deviceHandle, VIDIOCGWIN, &capture->captureWindow) == -1) {
|
1050 | fprintf( stderr, "HIGHGUI ERROR: V4L: "
|
1051 | "Could not obtain specifics of capture window.\n\n");
|
1052 | icvCloseCAM_V4L(capture);
|
1053 | return -1;
|
1054 | }
|
1055 |
|
1056 | }
|
1057 |
|
1058 | {
|
1059 |
|
1060 | if (autosetup_capture_mode_v4l(capture) == -1)
|
1061 | return -1;
|
1062 |
|
1063 | }
|
1064 |
|
1065 | {
|
1066 |
|
1067 | ioctl(capture->deviceHandle, VIDIOCGMBUF, &capture->memoryBuffer);
|
1068 | capture->memoryMap = (char *)mmap(0,
|
1069 | capture->memoryBuffer.size,
|
1070 | PROT_READ | PROT_WRITE,
|
1071 | MAP_SHARED,
|
1072 | capture->deviceHandle,
|
1073 | 0);
|
1074 | if (capture->memoryMap == MAP_FAILED) {
|
1075 | fprintf( stderr, "HIGHGUI ERROR: V4L: Mapping memory from video source error: %s\n", strerror(errno));
|
1076 | icvCloseCAM_V4L(capture);
|
1077 | }
|
1078 |
|
1079 | /* Set up the video_mmap structure pointing to this memory mapped area so each image may be
1080 | retrieved from an index value */
|
1081 | capture->mmaps = (struct video_mmap *)
|
1082 | (malloc(capture->memoryBuffer.frames * sizeof(struct video_mmap)));
|
1083 | if (!capture->mmaps) {
|
1084 | fprintf( stderr, "HIGHGUI ERROR: V4L: Could not memory map video frames.\n");
|
1085 | icvCloseCAM_V4L(capture);
|
1086 | return -1;
|
1087 | }
|
1088 |
|
1089 | }
|
1090 |
|
1091 |
|
1092 | cvInitImageHeader( &capture->frame,
|
1093 | cvSize( capture->captureWindow.width,
|
1094 | capture->captureWindow.height ),
|
1095 | IPL_DEPTH_8U, 3, IPL_ORIGIN_TL, 4 );
|
1096 |
|
1097 | capture->frame.imageData = (char *)cvAlloc(capture->frame.imageSize);
|
1098 |
|
1099 | return 1;
|
1100 | };
|
1101 |
|
1102 | #endif
|
1103 |
|
1104 | static CvCaptureCAM_V4L * icvCaptureFromCAM_V4L (int index)
|
1105 | {
|
1106 | static int autoindex;
|
1107 | autoindex = 0;
|
1108 |
|
1109 | char deviceName[MAX_DEVICE_DRIVER_NAME];
|
1110 |
|
1111 | if (!numCameras)
|
1112 | icvInitCapture_V4L();
|
1113 | if (!numCameras)
|
1114 | return NULL;
|
1115 |
|
1116 |
|
1117 | if ( (index>-1) && ! ((1 << index) & indexList) )
|
1118 | {
|
1119 | fprintf( stderr, "HIGHGUI ERROR: V4L: index %d is not correct!\n",index);
|
1120 | return NULL;
|
1121 | }
|
1122 | /* Allocate memory for this humongous structure that contains ALL
1123 | the handles for V4L processing */
|
1124 | CvCaptureCAM_V4L * capture = (CvCaptureCAM_V4L*)cvAlloc(sizeof(CvCaptureCAM_V4L));
|
1125 | if (!capture) {
|
1126 | fprintf( stderr, "HIGHGUI ERROR: V4L: Could not allocate memory for capture process.\n");
|
1127 | return NULL;
|
1128 | }
|
1129 |
|
1130 | if (index<0) {
|
1131 | for (; autoindex<MAX_CAMERAS;autoindex++)
|
1132 | if (indexList & (1<<autoindex))
|
1133 | break;
|
1134 | if (autoindex==MAX_CAMERAS)
|
1135 | return NULL;
|
1136 | index=autoindex;
|
1137 | autoindex++;
|
1138 | }
|
1139 |
|
1140 | sprintf(deviceName, "/dev/video%1d", index);
|
1141 |
|
1142 |
|
1143 | memset(capture,0,sizeof(CvCaptureCAM_V4L));
|
1144 | /* Present the routines needed for V4L functionality.  They are inserted as part of
1145 | the standard set of cv calls promoting transparency. "Vector Table" insertion. */
|
1146 | capture->FirstCapture = 1;
|
1147 |
|
1148 | #ifdef HAVE_CAMV4L2
|
1149 | if (_capture_V4L2 (capture, deviceName) == -1) {
|
1150 | icvCloseCAM_V4L(capture);
|
1151 | V4L2_SUPPORT = 0;
|
1152 | #endif
|
1153 | #ifdef HAVE_CAMV4L
|
1154 | if (_capture_V4L (capture, deviceName) == -1) {
|
1155 | icvCloseCAM_V4L(capture);
|
1156 | return NULL;
|
1157 | }
|
1158 | #endif
|
1159 | #ifdef HAVE_CAMV4L2
|
1160 | } else {
|
1161 | V4L2_SUPPORT = 1;
|
1162 | }
|
1163 | #endif
|
1164 |
|
1165 | return capture;
|
1166 | };
|
1167 |
|
1168 | #ifdef HAVE_CAMV4L2
|
1169 |
|
1170 | static int read_frame_v4l2(CvCaptureCAM_V4L* capture) {
|
1171 | struct v4l2_buffer buf;
|
1172 |
|
1173 | CLEAR (buf);
|
1174 |
|
1175 | buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
1176 | buf.memory = V4L2_MEMORY_MMAP;
|
1177 |
|
1178 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_DQBUF, &buf)) {
|
1179 | switch (errno) {
|
1180 | case EAGAIN:
|
1181 | return 0;
|
1182 |
|
1183 | case EIO:
|
1184 | if (!(buf.flags & (V4L2_BUF_FLAG_QUEUED | V4L2_BUF_FLAG_DONE)))
|
1185 | {
|
1186 | if (ioctl(capture->deviceHandle, VIDIOC_QBUF, &buf) == -1)
|
1187 | {
|
1188 | return 0;
|
1189 | }
|
1190 | }
|
1191 | return 0;
|
1192 |
|
1193 | default:
|
1194 |
|
1195 | perror ("VIDIOC_DQBUF");
|
1196 | return 1;
|
1197 | }
|
1198 | }
|
1199 |
|
1200 | assert(buf.index < capture->req.count);
|
1201 |
|
1202 | memcpy(capture->buffers[MAX_V4L_BUFFERS].start,
|
1203 | capture->buffers[buf.index].start,
|
1204 | capture->buffers[MAX_V4L_BUFFERS].length );
|
1205 | capture->bufferIndex = MAX_V4L_BUFFERS;
|
1206 |
|
1207 |
|
1208 |
|
1209 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_QBUF, &buf))
|
1210 | perror ("VIDIOC_QBUF");
|
1211 |
|
1212 |
|
1213 | capture->timestamp = buf.timestamp;
|
1214 |
|
1215 | return 1;
|
1216 | }
|
1217 |
|
1218 | static void mainloop_v4l2(CvCaptureCAM_V4L* capture) {
|
1219 | unsigned int count;
|
1220 |
|
1221 | count = 1;
|
1222 |
|
1223 | while (count-- > 0) {
|
1224 | for (;;) {
|
1225 | fd_set fds;
|
1226 | struct timeval tv;
|
1227 | int r;
|
1228 |
|
1229 | FD_ZERO (&fds);
|
1230 | FD_SET (capture->deviceHandle, &fds);
|
1231 |
|
1232 |
|
1233 | tv.tv_sec = 10;
|
1234 | tv.tv_usec = 0;
|
1235 |
|
1236 | r = select (capture->deviceHandle+1, &fds, NULL, NULL, &tv);
|
1237 |
|
1238 | if (-1 == r) {
|
1239 | if (EINTR == errno)
|
1240 | continue;
|
1241 |
|
1242 | perror ("select");
|
1243 | }
|
1244 |
|
1245 | if (0 == r) {
|
1246 | fprintf (stderr, "select timeout\n");
|
1247 |
|
1248 |
|
1249 | break;
|
1250 | }
|
1251 |
|
1252 | if (read_frame_v4l2 (capture))
|
1253 | break;
|
1254 | }
|
1255 | }
|
1256 | }
|
1257 |
|
1258 | #endif
|
1259 |
|
1260 | static int icvGrabFrameCAM_V4L(CvCaptureCAM_V4L* capture) {
|
1261 |
|
1262 | if (capture->FirstCapture) {
|
1263 |
|
1264 |
|
1265 | /* This is just a technicality, but all buffers must be filled up before any
1266 | staggered SYNC is applied. SO, filler up. (see V4L HowTo) */
|
1267 |
|
1268 | #ifdef HAVE_CAMV4L2
|
1269 |
|
1270 | #ifdef HAVE_CAMV4L
|
1271 | if (V4L2_SUPPORT == 1)
|
1272 | #endif
|
1273 | {
|
1274 |
|
1275 | for (capture->bufferIndex = 0;
|
1276 | capture->bufferIndex < ((int)capture->req.count);
|
1277 | ++capture->bufferIndex)
|
1278 | {
|
1279 |
|
1280 | struct v4l2_buffer buf;
|
1281 |
|
1282 | CLEAR (buf);
|
1283 |
|
1284 | buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
1285 | buf.memory = V4L2_MEMORY_MMAP;
|
1286 | buf.index = (unsigned long)capture->bufferIndex;
|
1287 |
|
1288 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_QBUF, &buf)) {
|
1289 | perror ("VIDIOC_QBUF");
|
1290 | return 0;
|
1291 | }
|
1292 | }
|
1293 |
|
1294 |
|
1295 | capture->type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
1296 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_STREAMON,
|
1297 | &capture->type)) {
|
1298 |
|
1299 | perror ("VIDIOC_STREAMON");
|
1300 | return 0;
|
1301 | }
|
1302 | }
|
1303 | #endif
|
1304 | #if defined(HAVE_CAMV4L) && defined(HAVE_CAMV4L2)
|
1305 | else
|
1306 | #endif
|
1307 | #ifdef HAVE_CAMV4L
|
1308 | {
|
1309 |
|
1310 | for (capture->bufferIndex = 0;
|
1311 | capture->bufferIndex < (capture->memoryBuffer.frames-1);
|
1312 | ++capture->bufferIndex) {
|
1313 |
|
1314 | capture->mmaps[capture->bufferIndex].frame = capture->bufferIndex;
|
1315 | capture->mmaps[capture->bufferIndex].width = capture->captureWindow.width;
|
1316 | capture->mmaps[capture->bufferIndex].height = capture->captureWindow.height;
|
1317 | capture->mmaps[capture->bufferIndex].format = capture->imageProperties.palette;
|
1318 |
|
1319 | if (ioctl(capture->deviceHandle, VIDIOCMCAPTURE, &capture->mmaps[capture->bufferIndex]) == -1) {
|
1320 | fprintf( stderr, "HIGHGUI ERROR: V4L: Initial Capture Error: Unable to load initial memory buffers.\n");
|
1321 | return 0;
|
1322 | }
|
1323 | }
|
1324 |
|
1325 | }
|
1326 | #endif
|
1327 |
|
1328 | #if defined(V4L_ABORT_BADJPEG) && defined(HAVE_CAMV4L2)
|
1329 | if (V4L2_SUPPORT == 1)
|
1330 | {
|
1331 |
|
1332 |
|
1333 | mainloop_v4l2(capture);
|
1334 | }
|
1335 | #endif
|
1336 |
|
1337 |
|
1338 | capture->FirstCapture = 0;
|
1339 | }
|
1340 |
|
1341 | #ifdef HAVE_CAMV4L2
|
1342 |
|
1343 | if (V4L2_SUPPORT == 1)
|
1344 | {
|
1345 |
|
1346 | mainloop_v4l2(capture);
|
1347 |
|
1348 | }
|
1349 | #endif
|
1350 | #if defined(HAVE_CAMV4L) && defined(HAVE_CAMV4L2)
|
1351 | else
|
1352 | #endif
|
1353 | #ifdef HAVE_CAMV4L
|
1354 | {
|
1355 |
|
1356 | capture->mmaps[capture->bufferIndex].frame = capture->bufferIndex;
|
1357 | capture->mmaps[capture->bufferIndex].width = capture->captureWindow.width;
|
1358 | capture->mmaps[capture->bufferIndex].height = capture->captureWindow.height;
|
1359 | capture->mmaps[capture->bufferIndex].format = capture->imageProperties.palette;
|
1360 |
|
1361 | if (ioctl (capture->deviceHandle, VIDIOCMCAPTURE,
|
1362 | &capture->mmaps[capture->bufferIndex]) == -1) {
|
1363 |
|
1364 | return 1;
|
1365 | }
|
1366 |
|
1367 | ++capture->bufferIndex;
|
1368 | if (capture->bufferIndex == capture->memoryBuffer.frames) {
|
1369 | capture->bufferIndex = 0;
|
1370 | }
|
1371 |
|
1372 | }
|
1373 | #endif
|
1374 |
|
1375 | return(1);
|
1376 | }
|
1377 |
|
1378 | /*
1379 | * Turn a YUV4:2:0 block into an RGB block
|
1380 | *
|
1381 | * Video4Linux seems to use the blue, green, red channel
|
1382 | * order convention-- rgb[0] is blue, rgb[1] is green, rgb[2] is red.
|
1383 | *
|
1384 | * Color space conversion coefficients taken from the excellent
|
1385 | * http://www.inforamp.net/~poynton/ColorFAQ.html
|
1386 | * In his terminology, this is a CCIR 601.1 YCbCr -> RGB.
|
1387 | * Y values are given for all 4 pixels, but the U (Pb)
|
1388 | * and V (Pr) are assumed constant over the 2x2 block.
|
1389 | *
|
1390 | * To avoid floating point arithmetic, the color conversion
|
1391 | * coefficients are scaled into 16.16 fixed-point integers.
|
1392 | * They were determined as follows:
|
1393 | *
|
1394 | * double brightness = 1.0; (0->black; 1->full scale)
|
1395 | * double saturation = 1.0; (0->greyscale; 1->full color)
|
1396 | * double fixScale = brightness * 256 * 256;
|
1397 | * int rvScale = (int)(1.402 * saturation * fixScale);
|
1398 | * int guScale = (int)(-0.344136 * saturation * fixScale);
|
1399 | * int gvScale = (int)(-0.714136 * saturation * fixScale);
|
1400 | * int buScale = (int)(1.772 * saturation * fixScale);
|
1401 | * int yScale = (int)(fixScale);
|
1402 | */
|
1403 |
|
1404 |
|
1405 | #define LIMIT(x) ((x)>0xffffff?0xff: ((x)<=0xffff?0:((x)>>16)))
|
1406 |
|
1407 | static inline void
|
1408 | move_420_block(int yTL, int yTR, int yBL, int yBR, int u, int v,
|
1409 | int rowPixels, unsigned char * rgb)
|
1410 | {
|
1411 | const int rvScale = 91881;
|
1412 | const int guScale = -22553;
|
1413 | const int gvScale = -46801;
|
1414 | const int buScale = 116129;
|
1415 | const int yScale = 65536;
|
1416 | int r, g, b;
|
1417 |
|
1418 | g = guScale * u + gvScale * v;
|
1419 |
|
1420 |
|
1421 |
|
1422 |
|
1423 | r = rvScale * v;
|
1424 | b = buScale * u;
|
1425 |
|
1426 |
|
1427 | yTL *= yScale; yTR *= yScale;
|
1428 | yBL *= yScale; yBR *= yScale;
|
1429 |
|
1430 |
|
1431 | rgb[0] = LIMIT(b+yTL); rgb[1] = LIMIT(g+yTL);
|
1432 | rgb[2] = LIMIT(r+yTL);
|
1433 |
|
1434 | rgb[3] = LIMIT(b+yTR); rgb[4] = LIMIT(g+yTR);
|
1435 | rgb[5] = LIMIT(r+yTR);
|
1436 |
|
1437 |
|
1438 | rgb += 3 * rowPixels;
|
1439 | rgb[0] = LIMIT(b+yBL); rgb[1] = LIMIT(g+yBL);
|
1440 | rgb[2] = LIMIT(r+yBL);
|
1441 |
|
1442 | rgb[3] = LIMIT(b+yBR); rgb[4] = LIMIT(g+yBR);
|
1443 | rgb[5] = LIMIT(r+yBR);
|
1444 | }
|
1445 |
|
1446 | static inline void
|
1447 | move_411_block(int yTL, int yTR, int yBL, int yBR, int u, int v,
|
1448 | int , unsigned char * rgb)
|
1449 | {
|
1450 | const int rvScale = 91881;
|
1451 | const int guScale = -22553;
|
1452 | const int gvScale = -46801;
|
1453 | const int buScale = 116129;
|
1454 | const int yScale = 65536;
|
1455 | int r, g, b;
|
1456 |
|
1457 | g = guScale * u + gvScale * v;
|
1458 |
|
1459 |
|
1460 |
|
1461 |
|
1462 | r = rvScale * v;
|
1463 | b = buScale * u;
|
1464 |
|
1465 |
|
1466 | yTL *= yScale; yTR *= yScale;
|
1467 | yBL *= yScale; yBR *= yScale;
|
1468 |
|
1469 |
|
1470 | rgb[0] = LIMIT(b+yTL); rgb[1] = LIMIT(g+yTL);
|
1471 | rgb[2] = LIMIT(r+yTL);
|
1472 |
|
1473 | rgb[3] = LIMIT(b+yTR); rgb[4] = LIMIT(g+yTR);
|
1474 | rgb[5] = LIMIT(r+yTR);
|
1475 |
|
1476 |
|
1477 | rgb += 6;
|
1478 | rgb[0] = LIMIT(b+yBL); rgb[1] = LIMIT(g+yBL);
|
1479 | rgb[2] = LIMIT(r+yBL);
|
1480 |
|
1481 | rgb[3] = LIMIT(b+yBR); rgb[4] = LIMIT(g+yBR);
|
1482 | rgb[5] = LIMIT(r+yBR);
|
1483 | }
|
1484 |
|
1485 |
|
1486 |
|
1487 |
|
1488 |
|
1489 |
|
1490 |
|
1491 |
|
1492 |
|
1493 |
|
1494 |
|
1495 |
|
1496 |
|
1497 | static void
|
1498 | yuv420p_to_rgb24(int width, int height,
|
1499 | unsigned char *pIn0, unsigned char *pOut0)
|
1500 | {
|
1501 | const int numpix = width * height;
|
1502 | const int bytes = 24 >> 3;
|
1503 | int i, j, y00, y01, y10, y11, u, v;
|
1504 | unsigned char *pY = pIn0;
|
1505 | unsigned char *pU = pY + numpix;
|
1506 | unsigned char *pV = pU + numpix / 4;
|
1507 | unsigned char *pOut = pOut0;
|
1508 |
|
1509 | for (j = 0; j <= height - 2; j += 2) {
|
1510 | for (i = 0; i <= width - 2; i += 2) {
|
1511 | y00 = *pY;
|
1512 | y01 = *(pY + 1);
|
1513 | y10 = *(pY + width);
|
1514 | y11 = *(pY + width + 1);
|
1515 | u = (*pU++) - 128;
|
1516 | v = (*pV++) - 128;
|
1517 |
|
1518 | move_420_block(y00, y01, y10, y11, u, v,
|
1519 | width, pOut);
|
1520 |
|
1521 | pY += 2;
|
1522 | pOut += 2 * bytes;
|
1523 |
|
1524 | }
|
1525 | pY += width;
|
1526 | pOut += width * bytes;
|
1527 | }
|
1528 | }
|
1529 |
|
1530 |
|
1531 |
|
1532 |
|
1533 |
|
1534 |
|
1535 |
|
1536 |
|
1537 |
|
1538 |
|
1539 |
|
1540 | #ifdef HAVE_CAMV4L
|
1541 | static void
|
1542 | yuv420_to_rgb24(int width, int height,
|
1543 | unsigned char *pIn0, unsigned char *pOut0)
|
1544 | {
|
1545 | const int bytes = 24 >> 3;
|
1546 | int i, j, y00, y01, y10, y11, u, v;
|
1547 | unsigned char *pY = pIn0;
|
1548 | unsigned char *pU = pY + 4;
|
1549 | unsigned char *pV = pU + width;
|
1550 | unsigned char *pOut = pOut0;
|
1551 |
|
1552 | for (j = 0; j <= height - 2; j += 2) {
|
1553 | for (i = 0; i <= width - 4; i += 4) {
|
1554 | y00 = *pY;
|
1555 | y01 = *(pY + 1);
|
1556 | y10 = *(pY + width);
|
1557 | y11 = *(pY + width + 1);
|
1558 | u = (*pU++) - 128;
|
1559 | v = (*pV++) - 128;
|
1560 |
|
1561 | move_420_block(y00, y01, y10, y11, u, v,
|
1562 | width, pOut);
|
1563 |
|
1564 | pY += 2;
|
1565 | pOut += 2 * bytes;
|
1566 |
|
1567 | y00 = *pY;
|
1568 | y01 = *(pY + 1);
|
1569 | y10 = *(pY + width);
|
1570 | y11 = *(pY + width + 1);
|
1571 | u = (*pU++) - 128;
|
1572 | v = (*pV++) - 128;
|
1573 |
|
1574 | move_420_block(y00, y01, y10, y11, u, v,
|
1575 | width, pOut);
|
1576 |
|
1577 | pY += 4;
|
1578 | pOut += 2 * bytes;
|
1579 |
|
1580 | }
|
1581 | pY += width;
|
1582 | pOut += width * bytes;
|
1583 | }
|
1584 | }
|
1585 | #endif
|
1586 |
|
1587 |
|
1588 |
|
1589 |
|
1590 |
|
1591 |
|
1592 |
|
1593 |
|
1594 |
|
1595 |
|
1596 |
|
1597 |
|
1598 |
|
1599 |
|
1600 |
|
1601 |
|
1602 | static void
|
1603 | yuv411p_to_rgb24(int width, int height,
|
1604 | unsigned char *pIn0, unsigned char *pOut0)
|
1605 | {
|
1606 | const int numpix = width * height;
|
1607 | const int bytes = 24 >> 3;
|
1608 | int i, j, y00, y01, y10, y11, u, v;
|
1609 | unsigned char *pY = pIn0;
|
1610 | unsigned char *pU = pY + numpix;
|
1611 | unsigned char *pV = pU + numpix / 4;
|
1612 | unsigned char *pOut = pOut0;
|
1613 |
|
1614 | for (j = 0; j < height; j++) {   /* '<' not '<=': one iteration per row, otherwise the output buffer is overrun by one row */
|
1615 | for (i = 0; i <= width - 4; i += 4) {
|
1616 | y00 = *pY;
|
1617 | y01 = *(pY + 1);
|
1618 | y10 = *(pY + 2);
|
1619 | y11 = *(pY + 3);
|
1620 | u = (*pU++) - 128;
|
1621 | v = (*pV++) - 128;
|
1622 |
|
1623 | move_411_block(y00, y01, y10, y11, u, v,
|
1624 | width, pOut);
|
1625 |
|
1626 | pY += 4;
|
1627 | pOut += 4 * bytes;
|
1628 |
|
1629 | }
|
1630 | }
|
1631 | }
|
1632 |
|
1633 |
|
1634 |
|
1635 | #define SAT(c) \
|
1636 | if (c & (~255)) { if (c < 0) c = 0; else c = 255; }
|
1637 |
|
1638 | #ifdef HAVE_CAMV4L2
|
1639 | static void
|
1640 | yuyv_to_rgb24 (int width, int height, unsigned char *src, unsigned char *dst)
|
1641 | {
|
1642 | unsigned char *s;
|
1643 | unsigned char *d;
|
1644 | int l, c;
|
1645 | int r, g, b, cr, cg, cb, y1, y2;
|
1646 |
|
1647 | l = height;
|
1648 | s = src;
|
1649 | d = dst;
|
1650 | while (l--) {
|
1651 | c = width >> 1;
|
1652 | while (c--) {
|
1653 | y1 = *s++;
|
1654 | cb = ((*s - 128) * 454) >> 8;
|
1655 | cg = (*s++ - 128) * 88;
|
1656 | y2 = *s++;
|
1657 | cr = ((*s - 128) * 359) >> 8;
|
1658 | cg = (cg + (*s++ - 128) * 183) >> 8;
|
1659 |
|
1660 | r = y1 + cr;
|
1661 | b = y1 + cb;
|
1662 | g = y1 - cg;
|
1663 | SAT(r);
|
1664 | SAT(g);
|
1665 | SAT(b);
|
1666 |
|
1667 | *d++ = b;
|
1668 | *d++ = g;
|
1669 | *d++ = r;
|
1670 |
|
1671 | r = y2 + cr;
|
1672 | b = y2 + cb;
|
1673 | g = y2 - cg;
|
1674 | SAT(r);
|
1675 | SAT(g);
|
1676 | SAT(b);
|
1677 |
|
1678 | *d++ = b;
|
1679 | *d++ = g;
|
1680 | *d++ = r;
|
1681 | }
|
1682 | }
|
1683 | }
|
1684 |
|
1685 | static void
|
1686 | uyvy_to_rgb24 (int width, int height, unsigned char *src, unsigned char *dst)
|
1687 | {
|
1688 | unsigned char *s;
|
1689 | unsigned char *d;
|
1690 | int l, c;
|
1691 | int r, g, b, cr, cg, cb, y1, y2;
|
1692 |
|
1693 | l = height;
|
1694 | s = src;
|
1695 | d = dst;
|
1696 | while (l--) {
|
1697 | c = width >> 1;
|
1698 | while (c--) {
|
1699 | cb = ((*s - 128) * 454) >> 8;
|
1700 | cg = (*s++ - 128) * 88;
|
1701 | y1 = *s++;
|
1702 | cr = ((*s - 128) * 359) >> 8;
|
1703 | cg = (cg + (*s++ - 128) * 183) >> 8;
|
1704 | y2 = *s++;
|
1705 |
|
1706 | r = y1 + cr;
|
1707 | b = y1 + cb;
|
1708 | g = y1 - cg;
|
1709 | SAT(r);
|
1710 | SAT(g);
|
1711 | SAT(b);
|
1712 |
|
1713 | *d++ = b;
|
1714 | *d++ = g;
|
1715 | *d++ = r;
|
1716 |
|
1717 | r = y2 + cr;
|
1718 | b = y2 + cb;
|
1719 | g = y2 - cg;
|
1720 | SAT(r);
|
1721 | SAT(g);
|
1722 | SAT(b);
|
1723 |
|
1724 | *d++ = b;
|
1725 | *d++ = g;
|
1726 | *d++ = r;
|
1727 | }
|
1728 | }
|
1729 | }
|
1730 | #endif
|
1731 |
|
1732 | #ifdef HAVE_JPEG
|
1733 |
|
1734 |
|
1735 | static bool
|
1736 | mjpeg_to_rgb24 (int width, int height,
|
1737 | unsigned char *src, int length,
|
1738 | unsigned char *dst)
|
1739 | {
|
1740 | cv::Mat temp=cv::imdecode(cv::Mat(std::vector<uchar>(src, src + length)), 1);
|
1741 | if( !temp.data || temp.cols != width || temp.rows != height )
|
1742 | return false;
|
1743 | memcpy(dst, temp.data, width*height*3);
|
1744 | return true;
|
1745 | }
|
1746 |
|
1747 | #endif
|
1748 |
|
1749 | /*
1750 | * BAYER2RGB24 ROUTINE TAKEN FROM:
|
1751 | *
|
1752 | * Sonix SN9C10x based webcam basic I/F routines
|
1753 | * Takafumi Mizuno <[email protected]>
|
1754 | *
|
1755 | */
|
1756 |
|
1757 | #ifdef HAVE_CAMV4L2
|
1758 | static void bayer2rgb24(long int WIDTH, long int HEIGHT, unsigned char *src, unsigned char *dst)
|
1759 | {
|
1760 | long int i;
|
1761 | unsigned char *rawpt, *scanpt;
|
1762 | long int size;
|
1763 |
|
1764 | rawpt = src;
|
1765 | scanpt = dst;
|
1766 | size = WIDTH*HEIGHT;
|
1767 |
|
1768 | for ( i = 0; i < size; i++ ) {
|
1769 | if ( (i/WIDTH) % 2 == 0 ) {
|
1770 | if ( (i % 2) == 0 ) {
|
1771 |
|
1772 | if ( (i > WIDTH) && ((i % WIDTH) > 0) ) {
|
1773 | *scanpt++ = (*(rawpt-WIDTH-1)+*(rawpt-WIDTH+1)+
|
1774 | *(rawpt+WIDTH-1)+*(rawpt+WIDTH+1))/4;
|
1775 | *scanpt++ = (*(rawpt-1)+*(rawpt+1)+
|
1776 | *(rawpt+WIDTH)+*(rawpt-WIDTH))/4;
|
1777 | *scanpt++ = *rawpt;
|
1778 | } else {
|
1779 |
|
1780 | *scanpt++ = *(rawpt+WIDTH+1);
|
1781 | *scanpt++ = (*(rawpt+1)+*(rawpt+WIDTH))/2;
|
1782 | *scanpt++ = *rawpt;
|
1783 | }
|
1784 | } else {
|
1785 |
|
1786 | if ( (i > WIDTH) && ((i % WIDTH) < (WIDTH-1)) ) {
|
1787 | *scanpt++ = (*(rawpt+WIDTH)+*(rawpt-WIDTH))/2;
|
1788 | *scanpt++ = *rawpt;
|
1789 | *scanpt++ = (*(rawpt-1)+*(rawpt+1))/2;
|
1790 | } else {
|
1791 |
|
1792 | *scanpt++ = *(rawpt+WIDTH);
|
1793 | *scanpt++ = *rawpt;
|
1794 | *scanpt++ = *(rawpt-1);
|
1795 | }
|
1796 | }
|
1797 | } else {
|
1798 | if ( (i % 2) == 0 ) {
|
1799 |
|
1800 | if ( (i < (WIDTH*(HEIGHT-1))) && ((i % WIDTH) > 0) ) {
|
1801 | *scanpt++ = (*(rawpt-1)+*(rawpt+1))/2;
|
1802 | *scanpt++ = *rawpt;
|
1803 | *scanpt++ = (*(rawpt+WIDTH)+*(rawpt-WIDTH))/2;
|
1804 | } else {
|
1805 |
|
1806 | *scanpt++ = *(rawpt+1);
|
1807 | *scanpt++ = *rawpt;
|
1808 | *scanpt++ = *(rawpt-WIDTH);
|
1809 | }
|
1810 | } else {
|
1811 |
|
1812 | if ( i < (WIDTH*(HEIGHT-1)) && ((i % WIDTH) < (WIDTH-1)) ) {
|
1813 | *scanpt++ = *rawpt;
|
1814 | *scanpt++ = (*(rawpt-1)+*(rawpt+1)+
|
1815 | *(rawpt-WIDTH)+*(rawpt+WIDTH))/4;
|
1816 | *scanpt++ = (*(rawpt-WIDTH-1)+*(rawpt-WIDTH+1)+
|
1817 | *(rawpt+WIDTH-1)+*(rawpt+WIDTH+1))/4;
|
1818 | } else {
|
1819 |
|
1820 | *scanpt++ = *rawpt;
|
1821 | *scanpt++ = (*(rawpt-1)+*(rawpt-WIDTH))/2;
|
1822 | *scanpt++ = *(rawpt-WIDTH-1);
|
1823 | }
|
1824 | }
|
1825 | }
|
1826 | rawpt++;
|
1827 | }
|
1828 |
|
1829 | }
|
1830 |
|
1831 |
|
1832 |
|
1833 |
|
1834 |
|
1835 |
|
1836 | static void sgbrg2rgb24(long int WIDTH, long int HEIGHT, unsigned char *src, unsigned char *dst)
|
1837 | {
|
1838 | long int i;
|
1839 | unsigned char *rawpt, *scanpt;
|
1840 | long int size;
|
1841 |
|
1842 | rawpt = src;
|
1843 | scanpt = dst;
|
1844 | size = WIDTH*HEIGHT;
|
1845 |
|
1846 | for ( i = 0; i < size; i++ )
|
1847 | {
|
1848 | if ( (i/WIDTH) % 2 == 0 )
|
1849 | {
|
1850 | if ( (i % 2) == 0 )
|
1851 | {
|
1852 | if ( (i > WIDTH) && ((i % WIDTH) > 0) )
|
1853 | {
|
1854 | *scanpt++ = (*(rawpt-1)+*(rawpt+1))/2;
|
1855 | *scanpt++ = *(rawpt);
|
1856 | *scanpt++ = (*(rawpt-WIDTH) + *(rawpt+WIDTH))/2;
|
1857 | } else
|
1858 | {
|
1859 |
|
1860 |
|
1861 | *scanpt++ = *(rawpt+1);
|
1862 | *scanpt++ = *(rawpt);
|
1863 | *scanpt++ = *(rawpt+WIDTH);
|
1864 | }
|
1865 | } else
|
1866 | {
|
1867 | if ( (i > WIDTH) && ((i % WIDTH) < (WIDTH-1)) )
|
1868 | {
|
1869 | *scanpt++ = *(rawpt);
|
1870 | *scanpt++ = (*(rawpt-1)+*(rawpt+1)+*(rawpt-WIDTH)+*(rawpt+WIDTH))/4;
|
1871 | *scanpt++ = (*(rawpt-WIDTH-1) + *(rawpt-WIDTH+1) + *(rawpt+WIDTH-1) + *(rawpt+WIDTH+1))/4;
|
1872 | } else
|
1873 | {
|
1874 |
|
1875 |
|
1876 | *scanpt++ = *(rawpt);
|
1877 | *scanpt++ = (*(rawpt-1)+*(rawpt+WIDTH))/2;
|
1878 | *scanpt++ = *(rawpt+WIDTH-1);
|
1879 | }
|
1880 | }
|
1881 | } else
|
1882 | {
|
1883 | if ( (i % 2) == 0 )
|
1884 | {
|
1885 | if ( (i < (WIDTH*(HEIGHT-1))) && ((i % WIDTH) > 0) )
|
1886 | {
|
1887 | *scanpt++ = (*(rawpt-WIDTH-1)+*(rawpt-WIDTH+1)+*(rawpt+WIDTH-1)+*(rawpt+WIDTH+1))/4;
|
1888 | *scanpt++ = (*(rawpt-1)+*(rawpt+1)+*(rawpt-WIDTH)+*(rawpt+WIDTH))/4;
|
1889 | *scanpt++ = *(rawpt);
|
1890 | } else
|
1891 | {
|
1892 |
|
1893 |
|
1894 | *scanpt++ = *(rawpt-WIDTH+1);
|
1895 | *scanpt++ = (*(rawpt+1)+*(rawpt-WIDTH))/2;
|
1896 | *scanpt++ = *(rawpt);
|
1897 | }
|
1898 | } else
|
1899 | {
|
1900 | if ( i < (WIDTH*(HEIGHT-1)) && ((i % WIDTH) < (WIDTH-1)) )
|
1901 | {
|
1902 | *scanpt++ = (*(rawpt-WIDTH)+*(rawpt+WIDTH))/2;
|
1903 | *scanpt++ = *(rawpt);
|
1904 | *scanpt++ = (*(rawpt-1)+*(rawpt+1))/2;
|
1905 | } else
|
1906 | {
|
1907 |
|
1908 |
|
1909 | *scanpt++ = (*(rawpt-WIDTH));
|
1910 | *scanpt++ = *(rawpt);
|
1911 | *scanpt++ = (*(rawpt-1));
|
1912 | }
|
1913 | }
|
1914 | }
|
1915 | rawpt++;
|
1916 | }
|
1917 | }
|
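/* sgbrg2rgb24() below the bayer2rgb24() routine applies the same neighbour-averaging scheme
 * to the GBRG variant of the mosaic (used below for PALETTE_SGBRG frames); the output is
 * again three bytes per pixel, so dst must provide WIDTH*HEIGHT*3 bytes. */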
1918 |
|
1919 | #define CLAMP(x) ((x)<0?0:((x)>255)?255:(x))
|
1920 |
|
1921 | typedef struct {
|
1922 | int is_abs;
|
1923 | int len;
|
1924 | int val;
|
1925 | } code_table_t;
|
1926 |
|
1927 |
|
1928 |
|
1929 | static code_table_t table[256];
|
1930 | static int init_done = 0;
|
1931 |
|
1932 |
|
1933 | /*
1934 | sonix_decompress_init
|
1935 | =====================
|
1936 | pre-calculates a locally stored table for efficient Huffman decoding.
|
1937 |
|
1938 | Each entry at index x in the table represents the codeword
|
1939 | present at the MSB of byte x.
|
1940 |
|
1941 | */
|
1942 | static void sonix_decompress_init(void)
|
1943 | {
|
1944 | int i;
|
1945 | int is_abs, val, len;
|
1946 |
|
1947 | for (i = 0; i < 256; i++) {
|
1948 | is_abs = 0;
|
1949 | val = 0;
|
1950 | len = 0;
|
1951 | if ((i & 0x80) == 0) {
|
1952 | /* code 0 */
|
1953 | val = 0;
|
1954 | len = 1;
|
1955 | }
|
1956 | else if ((i & 0xE0) == 0x80) {
|
1957 | /* code 100 */
|
1958 | val = +4;
|
1959 | len = 3;
|
1960 | }
|
1961 | else if ((i & 0xE0) == 0xA0) {
|
1962 | /* code 101 */
|
1963 | val = -4;
|
1964 | len = 3;
|
1965 | }
|
1966 | else if ((i & 0xF0) == 0xD0) {
|
1967 | /* code 1101 */
|
1968 | val = +11;
|
1969 | len = 4;
|
1970 | }
|
1971 | else if ((i & 0xF0) == 0xF0) {
|
1972 | /* code 1111 */
|
1973 | val = -11;
|
1974 | len = 4;
|
1975 | }
|
1976 | else if ((i & 0xF8) == 0xC8) {
|
1977 | /* code 11001 */
|
1978 | val = +20;
|
1979 | len = 5;
|
1980 | }
|
1981 | else if ((i & 0xFC) == 0xC0) {
|
1982 | /* code 110000 */
|
1983 | val = -20;
|
1984 | len = 6;
|
1985 | }
|
1986 | else if ((i & 0xFC) == 0xC4) {
|
1987 | /* code 110001xx */
|
1988 | val = 0;
|
1989 | len = 8;
|
1990 | }
|
1991 | else if ((i & 0xF0) == 0xE0) {
|
1992 | /* code 1110xxxx: absolute value in the low nibble */
|
1993 | is_abs = 1;
|
1994 | val = (i & 0x0F) << 4;
|
1995 | len = 8;
|
1996 | }
|
1997 | table[i].is_abs = is_abs;
|
1998 | table[i].val = val;
|
1999 | table[i].len = len;
|
2000 | }
|
2001 |
|
2002 | init_done = 1;
|
2003 | }
|
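/* Example of how the table above is consumed: a byte of the form 0xxxxxxx decodes to the
 * 1-bit codeword "0" (delta 0), 100xxxxx to the 3-bit codeword "100" (delta +4), and
 * 1110vvvv to an 8-bit absolute codeword whose value is the low nibble shifted left by 4.
 * sonix_decompress() below indexes the table with the next 8 input bits and then advances
 * the bit position by only table[code].len bits. */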
2004 |
|
2005 |
|
2006 | /*
2007 | sonix_decompress
|
2008 | ================
|
2009 | decompresses an image encoded by an SN9C101 camera controller chip.
|
2010 |
|
2011 | IN width
|
2012 | height
|
2013 | inp pointer to compressed frame (with header already stripped)
|
2014 | OUT outp pointer to decompressed frame
|
2015 |
|
2016 | Returns 0 if the operation was successful.
|
2017 | Returns <0 if the operation failed.
|
2018 |
|
2019 | */
|
2020 | static int sonix_decompress(int width, int height, unsigned char *inp, unsigned char *outp)
|
2021 | {
|
2022 | int row, col;
|
2023 | int val;
|
2024 | int bitpos;
|
2025 | unsigned char code;
|
2026 | unsigned char *addr;
|
2027 |
|
2028 | if (!init_done) {
|
2029 | /* sonix_decompress_init() must be called first */
|
2030 | return -1;
|
2031 | }
|
2032 |
|
2033 | bitpos = 0;
|
2034 | for (row = 0; row < height; row++) {
|
2035 |
|
2036 | col = 0;
|
2037 |
|
2038 |
|
2039 |
|
2040 | /* the first two pixels of the first two rows are stored as raw 8-bit values */
|
2041 | if (row < 2) {
|
2042 | addr = inp + (bitpos >> 3);
|
2043 | code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));
|
2044 | bitpos += 8;
|
2045 | *outp++ = code;
|
2046 |
|
2047 | addr = inp + (bitpos >> 3);
|
2048 | code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));
|
2049 | bitpos += 8;
|
2050 | *outp++ = code;
|
2051 |
|
2052 | col += 2;
|
2053 | }
|
2054 |
|
2055 | while (col < width) {
|
2056 |
|
2057 | addr = inp + (bitpos >> 3);
|
2058 | code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));
|
2059 |
|
2060 |
|
2061 | bitpos += table[code].len;
|
2062 |
|
2063 |
|
2064 | val = table[code].val;
|
2065 | if (!table[code].is_abs) {
|
2066 | /* relative code: predict from already-decoded pixels of the same Bayer colour */
|
2067 | if (col < 2) {
|
2068 | /* leftmost two columns: predict from the pixel two rows up */
|
2069 | val += outp[-2*width];
|
2070 | }
|
2071 | else if (row < 2) {
|
2072 | /* top two rows: predict from the pixel two columns to the left */
|
2073 | val += outp[-2];
|
2074 | }
|
2075 | else {
|
2076 | /* elsewhere: average of the two-left and two-up pixels */
|
2077 | val += (outp[-2] + outp[-2*width]) / 2;
|
2078 | }
|
2079 | }
|
2080 |
|
2081 |
|
2082 | *outp++ = CLAMP(val);
|
2083 | col++;
|
2084 | }
|
2085 | }
|
2086 |
|
2087 | return 0;
|
2088 | }
|
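/* Usage inside this file: for PALETTE_SN9C10X frames, icvRetrieveFrameCAM_V4L() below first
 * calls sonix_decompress_init(), then decompresses the captured data into the next capture
 * buffer in the ring, and finally runs bayer2rgb24() on that intermediate Bayer image. The
 * caller is expected to have stripped the SN9C10x frame header before passing inp. */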
2089 | #endif
|
2090 |
|
2091 | static IplImage* icvRetrieveFrameCAM_V4L( CvCaptureCAM_V4L* capture, int) {
|
2092 |
|
2093 | #ifdef HAVE_CAMV4L2
|
2094 | if (V4L2_SUPPORT == 0)
|
2095 | #endif
|
2096 | #ifdef HAVE_CAMV4L
|
2097 | {
|
2098 |
|
2099 |
|
2100 | if (ioctl(capture->deviceHandle, VIDIOCSYNC, &capture->mmaps[capture->bufferIndex].frame) == -1) {
|
2101 | fprintf( stderr, "HIGHGUI ERROR: V4L: Could not SYNC to video stream. %s\n", strerror(errno));
|
2102 | }
|
2103 |
|
2104 | }
|
2105 | #endif
|
2106 |
|
2107 |
|
2108 |
|
2109 |
|
2110 |
|
2111 | #ifdef HAVE_CAMV4L2
|
2112 |
|
2113 | if (V4L2_SUPPORT == 1)
|
2114 | {
|
2115 |
|
2116 | if(((unsigned long)capture->frame.width != capture->form.fmt.pix.width)
|
2117 | || ((unsigned long)capture->frame.height != capture->form.fmt.pix.height)) {
|
2118 | cvFree(&capture->frame.imageData);
|
2119 | cvInitImageHeader( &capture->frame,
|
2120 | cvSize( capture->form.fmt.pix.width,
|
2121 | capture->form.fmt.pix.height ),
|
2122 | IPL_DEPTH_8U, 3, IPL_ORIGIN_TL, 4 );
|
2123 | capture->frame.imageData = (char *)cvAlloc(capture->frame.imageSize);
|
2124 | }
|
2125 |
|
2126 | }
|
2127 | #endif
|
2128 | #if defined(HAVE_CAMV4L) && defined(HAVE_CAMV4L2)
|
2129 | else
|
2130 | #endif
|
2131 | #ifdef HAVE_CAMV4L
|
2132 | {
|
2133 |
|
2134 | if((capture->frame.width != capture->mmaps[capture->bufferIndex].width)
|
2135 | || (capture->frame.height != capture->mmaps[capture->bufferIndex].height)) {
|
2136 | cvFree(&capture->frame.imageData);
|
2137 | cvInitImageHeader( &capture->frame,
|
2138 | cvSize( capture->captureWindow.width,
|
2139 | capture->captureWindow.height ),
|
2140 | IPL_DEPTH_8U, 3, IPL_ORIGIN_TL, 4 );
|
2141 | capture->frame.imageData = (char *)cvAlloc(capture->frame.imageSize);
|
2142 | }
|
2143 |
|
2144 | }
|
2145 | #endif
|
2146 |
|
2147 | #ifdef HAVE_CAMV4L2
|
2148 |
|
2149 | if (V4L2_SUPPORT == 1)
|
2150 | {
|
2151 | switch (capture->palette)
|
2152 | {
|
2153 | case PALETTE_BGR24:
|
2154 | memcpy((char *)capture->frame.imageData,
|
2155 | (char *)capture->buffers[capture->bufferIndex].start,
|
2156 | capture->frame.imageSize);
|
2157 | break;
|
2158 |
|
2159 | case PALETTE_YVU420:
|
2160 | yuv420p_to_rgb24(capture->form.fmt.pix.width,
|
2161 | capture->form.fmt.pix.height,
|
2162 | (unsigned char*)(capture->buffers[capture->bufferIndex].start),
|
2163 | (unsigned char*)capture->frame.imageData);
|
2164 | break;
|
2165 |
|
2166 | case PALETTE_YUV411P:
|
2167 | yuv411p_to_rgb24(capture->form.fmt.pix.width,
|
2168 | capture->form.fmt.pix.height,
|
2169 | (unsigned char*)(capture->buffers[capture->bufferIndex].start),
|
2170 | (unsigned char*)capture->frame.imageData);
|
2171 | break;
|
2172 | #ifdef HAVE_JPEG
|
2173 | case PALETTE_MJPEG:
|
2174 | if (!mjpeg_to_rgb24(capture->form.fmt.pix.width,
|
2175 | capture->form.fmt.pix.height,
|
2176 | (unsigned char*)(capture->buffers[capture->bufferIndex]
|
2177 | .start),
|
2178 | capture->buffers[capture->bufferIndex].length,
|
2179 | (unsigned char*)capture->frame.imageData))
|
2180 | return 0;
|
2181 | break;
|
2182 | #endif
|
2183 |
|
2184 | case PALETTE_YUYV:
|
2185 | yuyv_to_rgb24(capture->form.fmt.pix.width,
|
2186 | capture->form.fmt.pix.height,
|
2187 | (unsigned char*)(capture->buffers[capture->bufferIndex].start),
|
2188 | (unsigned char*)capture->frame.imageData);
|
2189 | break;
|
2190 | case PALETTE_UYVY:
|
2191 | uyvy_to_rgb24(capture->form.fmt.pix.width,
|
2192 | capture->form.fmt.pix.height,
|
2193 | (unsigned char*)(capture->buffers[capture->bufferIndex].start),
|
2194 | (unsigned char*)capture->frame.imageData);
|
2195 | break;
|
2196 | case PALETTE_SBGGR8:
|
2197 | bayer2rgb24(capture->form.fmt.pix.width,
|
2198 | capture->form.fmt.pix.height,
|
2199 | (unsigned char*)capture->buffers[capture->bufferIndex].start,
|
2200 | (unsigned char*)capture->frame.imageData);
|
2201 | break;
|
2202 |
|
2203 | case PALETTE_SN9C10X:
|
2204 | sonix_decompress_init();
|
2205 | sonix_decompress(capture->form.fmt.pix.width,
|
2206 | capture->form.fmt.pix.height,
|
2207 | (unsigned char*)capture->buffers[capture->bufferIndex].start,
|
2208 | (unsigned char*)capture->buffers[(capture->bufferIndex+1) % capture->req.count].start);
|
2209 |
|
2210 | bayer2rgb24(capture->form.fmt.pix.width,
|
2211 | capture->form.fmt.pix.height,
|
2212 | (unsigned char*)capture->buffers[(capture->bufferIndex+1) % capture->req.count].start,
|
2213 | (unsigned char*)capture->frame.imageData);
|
2214 | break;
|
2215 |
|
2216 | case PALETTE_SGBRG:
|
2217 | sgbrg2rgb24(capture->form.fmt.pix.width,
|
2218 | capture->form.fmt.pix.height,
|
2219 | (unsigned char*)capture->buffers[(capture->bufferIndex+1) % capture->req.count].start,
|
2220 | (unsigned char*)capture->frame.imageData);
|
2221 | break;
|
2222 | }
|
2223 | }
|
2224 | #endif
|
2225 | #if defined(HAVE_CAMV4L) && defined(HAVE_CAMV4L2)
|
2226 | else
|
2227 | #endif
|
2228 | #ifdef HAVE_CAMV4L
|
2229 | {
|
2230 |
|
2231 | switch(capture->imageProperties.palette)
|
2232 | {
|
2233 | case VIDEO_PALETTE_RGB24:
|
2234 | memcpy((char *)capture->frame.imageData,
|
2235 | (char *)(capture->memoryMap + capture->memoryBuffer.offsets[capture->bufferIndex]),
|
2236 | capture->frame.imageSize);
|
2237 | break;
|
2238 | case VIDEO_PALETTE_YUV420P:
|
2239 | yuv420p_to_rgb24(capture->captureWindow.width,
|
2240 | capture->captureWindow.height,
|
2241 | (unsigned char*)(capture->memoryMap + capture->memoryBuffer.offsets[capture->bufferIndex]),
|
2242 | (unsigned char*)capture->frame.imageData);
|
2243 | break;
|
2244 | case VIDEO_PALETTE_YUV420:
|
2245 | yuv420_to_rgb24(capture->captureWindow.width,
|
2246 | capture->captureWindow.height,
|
2247 | (unsigned char*)(capture->memoryMap + capture->memoryBuffer.offsets[capture->bufferIndex]),
|
2248 | (unsigned char*)capture->frame.imageData);
|
2249 | break;
|
2250 | case VIDEO_PALETTE_YUV411P:
|
2251 | yuv411p_to_rgb24(capture->captureWindow.width,
|
2252 | capture->captureWindow.height,
|
2253 | (unsigned char*)(capture->memoryMap + capture->memoryBuffer.offsets[capture->bufferIndex]),
|
2254 | (unsigned char*)capture->frame.imageData);
|
2255 | break;
|
2256 | default:
|
2257 | fprintf( stderr,
|
2258 | "HIGHGUI ERROR: V4L: Cannot convert from palette %d to RGB\n",
|
2259 | capture->imageProperties.palette);
|
2260 |
|
2261 | return 0;
|
2262 | }
|
2263 |
|
2264 | }
|
2265 | #endif
|
2266 |
|
2267 | return(&capture->frame);
|
2268 | }
|
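/* The IplImage returned above is a header owned by the capture structure: its pixel buffer
 * is reallocated whenever the driver reports a new frame size and is released in
 * icvCloseCAM_V4L(), so callers must copy the data if they need it beyond the next grab and
 * must never free the returned pointer themselves. */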
2269 |
|
2270 | static double icvGetPropertyCAM_V4L (CvCaptureCAM_V4L* capture,
|
2271 | int property_id ) {
|
2272 |
|
2273 | #ifdef HAVE_CAMV4L2
|
2274 |
|
2275 | #ifdef HAVE_CAMV4L
|
2276 | if (V4L2_SUPPORT == 1)
|
2277 | #endif
|
2278 | {
|
2279 |
|
2280 |
|
2281 | int v4l2_min = 0;
|
2282 | int v4l2_max = 255;
|
2283 |
|
2284 | CLEAR (capture->form);
|
2285 | capture->form.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
2286 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_G_FMT, &capture->form)) {
|
2287 |
|
2288 | perror ("VIDIOC_G_FMT");
|
2289 | return -1;
|
2290 | }
|
2291 |
|
2292 | switch (property_id) {
|
2293 | case CV_CAP_PROP_FRAME_WIDTH:
|
2294 | return capture->form.fmt.pix.width;
|
2295 | case CV_CAP_PROP_FRAME_HEIGHT:
|
2296 | return capture->form.fmt.pix.height;
|
2297 | }
|
2298 |
|
2299 |
|
2300 |
|
2301 | switch (property_id) {
|
2302 | case CV_CAP_PROP_POS_MSEC:
|
2303 | if (capture->FirstCapture) {
|
2304 | return 0;
|
2305 | } else {
|
2306 | return 1000 * capture->timestamp.tv_sec + ((double) capture->timestamp.tv_usec) / 1000;
|
2307 | }
|
2308 | break;
|
2309 | case CV_CAP_PROP_BRIGHTNESS:
|
2310 | capture->control.id = V4L2_CID_BRIGHTNESS;
|
2311 | break;
|
2312 | case CV_CAP_PROP_CONTRAST:
|
2313 | capture->control.id = V4L2_CID_CONTRAST;
|
2314 | break;
|
2315 | case CV_CAP_PROP_SATURATION:
|
2316 | capture->control.id = V4L2_CID_SATURATION;
|
2317 | break;
|
2318 | case CV_CAP_PROP_HUE:
|
2319 | capture->control.id = V4L2_CID_HUE;
|
2320 | break;
|
2321 | case CV_CAP_PROP_GAIN:
|
2322 | capture->control.id = V4L2_CID_GAIN;
|
2323 | break;
|
2324 | case CV_CAP_PROP_EXPOSURE:
|
2325 | capture->control.id = V4L2_CID_EXPOSURE;
|
2326 | break;
|
2327 | default:
|
2328 | fprintf(stderr,
|
2329 | "HIGHGUI ERROR: V4L2: getting property #%d is not supported\n",
|
2330 | property_id);
|
2331 | return -1;
|
2332 | }
|
2333 |
|
2334 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_G_CTRL,
|
2335 | &capture->control)) {
|
2336 |
|
2337 | fprintf( stderr, "HIGHGUI ERROR: V4L2: ");
|
2338 | switch (property_id) {
|
2339 | case CV_CAP_PROP_BRIGHTNESS:
|
2340 | fprintf (stderr, "Brightness");
|
2341 | break;
|
2342 | case CV_CAP_PROP_CONTRAST:
|
2343 | fprintf (stderr, "Contrast");
|
2344 | break;
|
2345 | case CV_CAP_PROP_SATURATION:
|
2346 | fprintf (stderr, "Saturation");
|
2347 | break;
|
2348 | case CV_CAP_PROP_HUE:
|
2349 | fprintf (stderr, "Hue");
|
2350 | break;
|
2351 | case CV_CAP_PROP_GAIN:
|
2352 | fprintf (stderr, "Gain");
|
2353 | break;
|
2354 | case CV_CAP_PROP_EXPOSURE:
|
2355 | fprintf (stderr, "Exposure");
|
2356 | break;
|
2357 | }
|
2358 | fprintf (stderr, " is not supported by your device\n");
|
2359 |
|
2360 | return -1;
|
2361 | }
|
2362 |
|
2363 |
|
2364 | switch (property_id) {
|
2365 |
|
2366 | case CV_CAP_PROP_BRIGHTNESS:
|
2367 | v4l2_min = capture->v4l2_brightness_min;
|
2368 | v4l2_max = capture->v4l2_brightness_max;
|
2369 | break;
|
2370 | case CV_CAP_PROP_CONTRAST:
|
2371 | v4l2_min = capture->v4l2_contrast_min;
|
2372 | v4l2_max = capture->v4l2_contrast_max;
|
2373 | break;
|
2374 | case CV_CAP_PROP_SATURATION:
|
2375 | v4l2_min = capture->v4l2_saturation_min;
|
2376 | v4l2_max = capture->v4l2_saturation_max;
|
2377 | break;
|
2378 | case CV_CAP_PROP_HUE:
|
2379 | v4l2_min = capture->v4l2_hue_min;
|
2380 | v4l2_max = capture->v4l2_hue_max;
|
2381 | break;
|
2382 | case CV_CAP_PROP_GAIN:
|
2383 | v4l2_min = capture->v4l2_gain_min;
|
2384 | v4l2_max = capture->v4l2_gain_max;
|
2385 | break;
|
2386 | case CV_CAP_PROP_EXPOSURE:
|
2387 | v4l2_min = capture->v4l2_exposure_min;
|
2388 | v4l2_max = capture->v4l2_exposure_max;
|
2389 | break;
|
2390 | }
|
2391 |
|
2392 |
|
2393 | return ((float)capture->control.value - v4l2_min + 1) / (v4l2_max - v4l2_min);
|
2394 |
|
2395 | }
|
2396 | #endif
|
2397 | #if defined(HAVE_CAMV4L) && defined(HAVE_CAMV4L2)
|
2398 | else
|
2399 | #endif
|
2400 | #ifdef HAVE_CAMV4L
|
2401 | {
|
2402 |
|
2403 | int retval = -1;
|
2404 |
|
2405 | if (ioctl (capture->deviceHandle,
|
2406 | VIDIOCGWIN, &capture->captureWindow) < 0) {
|
2407 | fprintf (stderr,
|
2408 | "HIGHGUI ERROR: V4L: "
|
2409 | "Unable to determine size of incoming image\n");
|
2410 | icvCloseCAM_V4L(capture);
|
2411 | return -1;
|
2412 | }
|
2413 |
|
2414 | switch (property_id) {
|
2415 | case CV_CAP_PROP_FRAME_WIDTH:
|
2416 | retval = capture->captureWindow.width;
|
2417 | break;
|
2418 | case CV_CAP_PROP_FRAME_HEIGHT:
|
2419 | retval = capture->captureWindow.height;
|
2420 | break;
|
2421 | case CV_CAP_PROP_BRIGHTNESS:
|
2422 | retval = capture->imageProperties.brightness;
|
2423 | break;
|
2424 | case CV_CAP_PROP_CONTRAST:
|
2425 | retval = capture->imageProperties.contrast;
|
2426 | break;
|
2427 | case CV_CAP_PROP_SATURATION:
|
2428 | retval = capture->imageProperties.colour;
|
2429 | break;
|
2430 | case CV_CAP_PROP_HUE:
|
2431 | retval = capture->imageProperties.hue;
|
2432 | break;
|
2433 | case CV_CAP_PROP_GAIN:
|
2434 | fprintf(stderr,
|
2435 | "HIGHGUI ERROR: V4L: Gain control in V4L is not supported\n");
|
2436 | return -1;
|
2437 | break;
|
2438 | case CV_CAP_PROP_EXPOSURE:
|
2439 | fprintf(stderr,
|
2440 | "HIGHGUI ERROR: V4L: Exposure control in V4L is not supported\n");
|
2441 | return -1;
|
2442 | break;
|
2443 | default:
|
2444 | fprintf(stderr,
|
2445 | "HIGHGUI ERROR: V4L: getting property #%d is not supported\n",
|
2446 | property_id);
|
2447 | }
|
2448 |
|
2449 | if (retval == -1) {
|
2450 |
|
2451 | return -1;
|
2452 | }
|
2453 |
|
2454 |
|
2455 | return float (retval) / 0xFFFF;
|
2456 |
|
2457 | }
|
2458 | #endif
|
2459 |
|
2460 | };
|
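/* Property scaling used above: on the V4L2 path, frame width and height are returned in
 * pixels, while brightness/contrast/saturation/hue/gain/exposure are mapped into [0,1]
 * using the control minima/maxima cached in the capture structure. The V4L1 path returns
 * every value scaled by 1/0xFFFF, matching the 16-bit ranges used by VIDIOCSPICT. */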
2461 |
|
2462 | static int icvSetVideoSize( CvCaptureCAM_V4L* capture, int w, int h) {
|
2463 |
|
2464 | #ifdef HAVE_CAMV4L2
|
2465 |
|
2466 | if (V4L2_SUPPORT == 1)
|
2467 | {
|
2468 |
|
2469 | CLEAR (capture->cropcap);
|
2470 | capture->cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
2471 |
|
2472 | if (ioctl (capture->deviceHandle, VIDIOC_CROPCAP, &capture->cropcap) < 0) {
|
2473 | fprintf(stderr, "HIGHGUI ERROR: V4L/V4L2: VIDIOC_CROPCAP\n");
|
2474 | } else {
|
2475 |
|
2476 | CLEAR (capture->crop);
|
2477 | capture->crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
2478 | capture->crop.c= capture->cropcap.defrect;
|
2479 |
|
2480 |
|
2481 | if (ioctl (capture->deviceHandle, VIDIOC_S_CROP, &capture->crop) < 0) {
|
2482 | fprintf(stderr, "HIGHGUI ERROR: V4L/V4L2: VIDIOC_S_CROP\n");
|
2483 | }
|
2484 | }
|
2485 |
|
2486 | CLEAR (capture->form);
|
2487 | capture->form.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
2488 |
|
2489 |
|
2490 | ioctl (capture->deviceHandle, VIDIOC_G_FMT, &capture->form);
|
2491 |
|
2492 |
|
2493 | capture->form.fmt.pix.width = w;
|
2494 | capture->form.fmt.pix.height = h;
|
2495 | capture->form.fmt.win.chromakey = 0;
|
2496 | capture->form.fmt.win.field = V4L2_FIELD_ANY;
|
2497 | capture->form.fmt.win.clips = 0;
|
2498 | capture->form.fmt.win.clipcount = 0;
|
2499 | capture->form.fmt.pix.field = V4L2_FIELD_ANY;
|
2500 |
|
2501 | /* ask the device to change the size
2502 | * don't test if the set of the size is ok, because some devices
|
2503 | * don't allow changing the size, and we will get the real size
|
2504 | * later */
|
2505 | ioctl (capture->deviceHandle, VIDIOC_S_FMT, &capture->form);
|
2506 |
|
2507 |
|
2508 | struct v4l2_streamparm setfps;
|
2509 | memset (&setfps, 0, sizeof(struct v4l2_streamparm));
|
2510 | setfps.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
2511 | setfps.parm.capture.timeperframe.numerator = 1;
|
2512 | setfps.parm.capture.timeperframe.denominator = 30;
|
2513 | ioctl (capture->deviceHandle, VIDIOC_S_PARM, &setfps);
|
2514 |
|
2515 | /* we need to re-initialize some things, like buffers, because the size has
2516 | * changed */
|
2517 | capture->FirstCapture = 1;
|
2518 |
|
2519 |
|
2520 | if (-1 == ioctl (capture->deviceHandle, VIDIOC_G_FMT, &capture->form))
|
2521 | {
|
2522 | fprintf(stderr, "HIGHGUI ERROR: V4L/V4L2: Could not obtain specifics of capture window.\n\n");
|
2523 |
|
2524 | icvCloseCAM_V4L(capture);
|
2525 |
|
2526 | return 0;
|
2527 | }
|
2528 |
|
2529 | return 0;
|
2530 |
|
2531 | }
|
2532 | #endif
|
2533 | #if defined(HAVE_CAMV4L) && defined(HAVE_CAMV4L2)
|
2534 | else
|
2535 | #endif
|
2536 | #ifdef HAVE_CAMV4L
|
2537 | {
|
2538 |
|
2539 | if (capture==0) return 0;
|
2540 | if (w>capture->capability.maxwidth) {
|
2541 | w=capture->capability.maxwidth;
|
2542 | }
|
2543 | if (h>capture->capability.maxheight) {
|
2544 | h=capture->capability.maxheight;
|
2545 | }
|
2546 |
|
2547 | capture->captureWindow.width=w;
|
2548 | capture->captureWindow.height=h;
|
2549 |
|
2550 | if (ioctl(capture->deviceHandle, VIDIOCSWIN, &capture->captureWindow) < 0) {
|
2551 | icvCloseCAM_V4L(capture);
|
2552 | return 0;
|
2553 | }
|
2554 |
|
2555 | if (ioctl(capture->deviceHandle, VIDIOCGWIN, &capture->captureWindow) < 0) {
|
2556 | icvCloseCAM_V4L(capture);
|
2557 | return 0;
|
2558 | }
|
2559 |
|
2560 | capture->FirstCapture = 1;
|
2561 |
|
2562 | }
|
2563 | #endif
|
2564 |
|
2565 | return 0;
|
2566 |
|
2567 | }
|
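/* A sketch of the size negotiation above, assuming a typical V4L2 webcam: the requested
 * width/height are only a hint; the driver may round them, and the real geometry is read
 * back with VIDIOC_G_FMT. Setting FirstCapture = 1 forces the buffers to be re-initialised
 * on the next grab so that the new size takes effect. */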
2568 |
|
2569 | static int icvSetControl (CvCaptureCAM_V4L* capture,
|
2570 | int property_id, double value) {
|
2571 |
|
2572 |
|
2573 | if (value < 0.0) {
|
2574 | value = 0.0;
|
2575 | } else if (value > 1.0) {
|
2576 | value = 1.0;
|
2577 | }
|
2578 |
|
2579 | #ifdef HAVE_CAMV4L2
|
2580 |
|
2581 | if (V4L2_SUPPORT == 1)
|
2582 | {
|
2583 |
|
2584 |
|
2585 | int v4l2_min = 0;
|
2586 | int v4l2_max = 255;
|
2587 |
|
2588 |
|
2589 | CLEAR (capture->control);
|
2590 |
|
2591 |
|
2592 | switch (property_id) {
|
2593 |
|
2594 | case CV_CAP_PROP_BRIGHTNESS:
|
2595 | capture->control.id = V4L2_CID_BRIGHTNESS;
|
2596 | break;
|
2597 | case CV_CAP_PROP_CONTRAST:
|
2598 | capture->control.id = V4L2_CID_CONTRAST;
|
2599 | break;
|
2600 | case CV_CAP_PROP_SATURATION:
|
2601 | capture->control.id = V4L2_CID_SATURATION;
|
2602 | break;
|
2603 | case CV_CAP_PROP_HUE:
|
2604 | capture->control.id = V4L2_CID_HUE;
|
2605 | break;
|
2606 | case CV_CAP_PROP_GAIN:
|
2607 | capture->control.id = V4L2_CID_GAIN;
|
2608 | break;
|
2609 | case CV_CAP_PROP_EXPOSURE:
|
2610 | capture->control.id = V4L2_CID_EXPOSURE;
|
2611 | break;
|
2612 | default:
|
2613 | fprintf(stderr,
|
2614 | "HIGHGUI ERROR: V4L2: setting property #%d is not supported\n",
|
2615 | property_id);
|
2616 | return -1;
|
2617 | }
|
2618 |
|
2619 |
|
2620 | if (-1 == ioctl (capture->deviceHandle,
|
2621 | VIDIOC_G_CTRL, &capture->control)) {
|
2622 |
|
2623 | return -1;
|
2624 | }
|
2625 |
|
2626 |
|
2627 | switch (property_id) {
|
2628 |
|
2629 | case CV_CAP_PROP_BRIGHTNESS:
|
2630 | v4l2_min = capture->v4l2_brightness_min;
|
2631 | v4l2_max = capture->v4l2_brightness_max;
|
2632 | break;
|
2633 | case CV_CAP_PROP_CONTRAST:
|
2634 | v4l2_min = capture->v4l2_contrast_min;
|
2635 | v4l2_max = capture->v4l2_contrast_max;
|
2636 | break;
|
2637 | case CV_CAP_PROP_SATURATION:
|
2638 | v4l2_min = capture->v4l2_saturation_min;
|
2639 | v4l2_max = capture->v4l2_saturation_max;
|
2640 | break;
|
2641 | case CV_CAP_PROP_HUE:
|
2642 | v4l2_min = capture->v4l2_hue_min;
|
2643 | v4l2_max = capture->v4l2_hue_max;
|
2644 | break;
|
2645 | case CV_CAP_PROP_GAIN:
|
2646 | v4l2_min = capture->v4l2_gain_min;
|
2647 | v4l2_max = capture->v4l2_gain_max;
|
2648 | break;
|
2649 | case CV_CAP_PROP_EXPOSURE:
|
2650 | v4l2_min = capture->v4l2_exposure_min;
|
2651 | v4l2_max = capture->v4l2_exposure_max;
|
2652 | break;
|
2653 | }
|
2654 |
|
2655 |
|
2656 | CLEAR (capture->control);
|
2657 |
|
2658 |
|
2659 | switch (property_id) {
|
2660 |
|
2661 | case CV_CAP_PROP_BRIGHTNESS:
|
2662 | capture->control.id = V4L2_CID_BRIGHTNESS;
|
2663 | break;
|
2664 | case CV_CAP_PROP_CONTRAST:
|
2665 | capture->control.id = V4L2_CID_CONTRAST;
|
2666 | break;
|
2667 | case CV_CAP_PROP_SATURATION:
|
2668 | capture->control.id = V4L2_CID_SATURATION;
|
2669 | break;
|
2670 | case CV_CAP_PROP_HUE:
|
2671 | capture->control.id = V4L2_CID_HUE;
|
2672 | break;
|
2673 | case CV_CAP_PROP_GAIN:
|
2674 | capture->control.id = V4L2_CID_GAIN;
|
2675 | break;
|
2676 | case CV_CAP_PROP_EXPOSURE:
|
2677 | capture->control.id = V4L2_CID_EXPOSURE;
|
2678 | break;
|
2679 | default:
|
2680 | fprintf(stderr,
|
2681 | "HIGHGUI ERROR: V4L2: setting property #%d is not supported\n",
|
2682 | property_id);
|
2683 | return -1;
|
2684 | }
|
2685 |
|
2686 |
|
2687 | capture->control.value = (int)(value * (v4l2_max - v4l2_min) + v4l2_min);
|
2688 |
|
2689 |
|
2690 | if (-1 == ioctl (capture->deviceHandle,
|
2691 | VIDIOC_S_CTRL, &capture->control) && errno != ERANGE) {
|
2692 | perror ("VIDIOC_S_CTRL");
|
2693 | return -1;
|
2694 | }
|
2695 | }
|
2696 | #endif
|
2697 | #if defined(HAVE_CAMV4L) && defined(HAVE_CAMV4L2)
|
2698 | else
|
2699 | #endif
|
2700 | #ifdef HAVE_CAMV4L
|
2701 | {
|
2702 |
|
2703 | int v4l_value;
|
2704 |
|
2705 |
|
2706 | v4l_value = (int)(0xFFFF * value);
|
2707 |
|
2708 | switch (property_id) {
|
2709 | case CV_CAP_PROP_BRIGHTNESS:
|
2710 | capture->imageProperties.brightness = v4l_value;
|
2711 | break;
|
2712 | case CV_CAP_PROP_CONTRAST:
|
2713 | capture->imageProperties.contrast = v4l_value;
|
2714 | break;
|
2715 | case CV_CAP_PROP_SATURATION:
|
2716 | capture->imageProperties.colour = v4l_value;
|
2717 | break;
|
2718 | case CV_CAP_PROP_HUE:
|
2719 | capture->imageProperties.hue = v4l_value;
|
2720 | break;
|
2721 | case CV_CAP_PROP_GAIN:
|
2722 | fprintf(stderr,
|
2723 | "HIGHGUI ERROR: V4L: Gain control in V4L is not supported\n");
|
2724 | return -1;
|
2725 | case CV_CAP_PROP_EXPOSURE:
|
2726 | fprintf(stderr,
|
2727 | "HIGHGUI ERROR: V4L: Exposure control in V4L is not supported\n");
|
2728 | return -1;
|
2729 | default:
|
2730 | fprintf(stderr,
|
2731 | "HIGHGUI ERROR: V4L: property #%d is not supported\n",
|
2732 | property_id);
|
2733 | return -1;
|
2734 | }
|
2735 |
|
2736 | if (ioctl(capture->deviceHandle, VIDIOCSPICT, &capture->imageProperties)
|
2737 | < 0)
|
2738 | {
|
2739 | fprintf(stderr,
|
2740 | "HIGHGUI ERROR: V4L: Unable to set video information\n");
|
2741 | icvCloseCAM_V4L(capture);
|
2742 | return -1;
|
2743 | }
|
2744 | }
|
2745 | #endif
|
2746 |
|
2747 |
|
2748 | return 0;
|
2749 |
|
2750 | }
|
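/* icvSetControl() clamps value to [0,1] and then maps it linearly onto the
 * [v4l2_min, v4l2_max] range reported for the control on the V4L2 path, or onto 0..0xFFFF
 * before VIDIOCSPICT is issued on the V4L1 path. */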
2751 |
|
2752 | static int icvSetPropertyCAM_V4L( CvCaptureCAM_V4L* capture,
|
2753 | int property_id, double value ){
|
2754 | static int width = 0, height = 0;
|
2755 | int retval;
|
2756 |
|
2757 |
|
2758 | retval = 0;
|
2759 |
|
2760 | /* two subsequent calls setting WIDTH and HEIGHT will change
2761 | the video size */
|
2762 |
|
2763 |
|
2764 | switch (property_id) {
|
2765 | case CV_CAP_PROP_FRAME_WIDTH:
|
2766 | width = cvRound(value);
|
2767 | if(width !=0 && height != 0) {
|
2768 | retval = icvSetVideoSize( capture, width, height);
|
2769 | width = height = 0;
|
2770 | }
|
2771 | break;
|
2772 | case CV_CAP_PROP_FRAME_HEIGHT:
|
2773 | height = cvRound(value);
|
2774 | if(width !=0 && height != 0) {
|
2775 | retval = icvSetVideoSize( capture, width, height);
|
2776 | width = height = 0;
|
2777 | }
|
2778 | break;
|
2779 | case CV_CAP_PROP_BRIGHTNESS:
|
2780 | case CV_CAP_PROP_CONTRAST:
|
2781 | case CV_CAP_PROP_SATURATION:
|
2782 | case CV_CAP_PROP_HUE:
|
2783 | case CV_CAP_PROP_GAIN:
|
2784 | case CV_CAP_PROP_EXPOSURE:
|
2785 | retval = icvSetControl(capture, property_id, value);
|
2786 | break;
|
2787 | default:
|
2788 | fprintf(stderr,
|
2789 | "HIGHGUI ERROR: V4L: setting property #%d is not supported\n",
|
2790 | property_id);
|
2791 | }
|
2792 |
|
2793 |
|
2794 | return retval;
|
2795 | }
|
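/* Width and height are latched in the two static variables above and only pushed to the
 * driver once both have been supplied, e.g. (sketch, via the public highgui wrappers):
 *
 *     cvSetCaptureProperty(cap, CV_CAP_PROP_FRAME_WIDTH,  640);   // remembered only
 *     cvSetCaptureProperty(cap, CV_CAP_PROP_FRAME_HEIGHT, 480);   // triggers icvSetVideoSize()
 *
 * All other supported properties are forwarded directly to icvSetControl(). */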
2796 |
|
2797 | static void icvCloseCAM_V4L( CvCaptureCAM_V4L* capture ){
|
2798 |
|
2799 |
|
2800 | if (capture)
|
2801 | {
|
2802 |
|
2803 | #ifdef HAVE_CAMV4L2
|
2804 | if (V4L2_SUPPORT == 0)
|
2805 | #endif
|
2806 | #ifdef HAVE_CAMV4L
|
2807 | {
|
2808 |
|
2809 | if (capture->mmaps)
|
2810 | free(capture->mmaps);
|
2811 | if (capture->memoryMap)
|
2812 | munmap(capture->memoryMap, capture->memoryBuffer.size);
|
2813 |
|
2814 | }
|
2815 | #endif
|
2816 | #if defined(HAVE_CAMV4L) && defined(HAVE_CAMV4L2)
|
2817 | else
|
2818 | #endif
|
2819 | #ifdef HAVE_CAMV4L2
|
2820 | {
|
2821 | capture->type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
|
2822 | if (-1 == ioctl(capture->deviceHandle, VIDIOC_STREAMOFF, &capture->type)) {
|
2823 | perror ("Unable to stop the stream.");
|
2824 | }
|
2825 |
|
2826 | for (unsigned int n_buffers_ = 0; n_buffers_ < capture->req.count; ++n_buffers_)
|
2827 | {
|
2828 | if (-1 == munmap (capture->buffers[n_buffers_].start, capture->buffers[n_buffers_].length)) {
|
2829 | perror ("munmap");
|
2830 | }
|
2831 | }
|
2832 |
|
2833 | if (capture->buffers[MAX_V4L_BUFFERS].start)
|
2834 | {
|
2835 | free(capture->buffers[MAX_V4L_BUFFERS].start);
|
2836 | capture->buffers[MAX_V4L_BUFFERS].start = 0;
|
2837 | }
|
2838 | }
|
2839 | #endif
|
2840 |
|
2841 | if (capture->deviceHandle != -1)
|
2842 | close(capture->deviceHandle);
|
2843 |
|
2844 | if (capture->frame.imageData) cvFree(&capture->frame.imageData);
|
2845 |
|
2846 | }
|
2847 | };
|
2848 |
|
2849 |
|
2850 | class CvCaptureCAM_V4L_CPP : CvCapture
|
2851 | {
|
2852 | public:
|
2853 | CvCaptureCAM_V4L_CPP() { captureV4L = 0; }
|
2854 | virtual ~CvCaptureCAM_V4L_CPP() { close(); }
|
2855 |
|
2856 | virtual bool open( int index );
|
2857 | virtual void close();
|
2858 |
|
2859 | virtual double getProperty(int);
|
2860 | virtual bool setProperty(int, double);
|
2861 | virtual bool grabFrame();
|
2862 | virtual IplImage* retrieveFrame(int);
|
2863 | protected:
|
2864 |
|
2865 | CvCaptureCAM_V4L* captureV4L;
|
2866 | };
|
2867 |
|
2868 | bool CvCaptureCAM_V4L_CPP::open( int index )
|
2869 | {
|
2870 | close();
|
2871 | captureV4L = icvCaptureFromCAM_V4L(index);
|
2872 | return captureV4L != 0;
|
2873 | }
|
2874 |
|
2875 | void CvCaptureCAM_V4L_CPP::close()
|
2876 | {
|
2877 | if( captureV4L )
|
2878 | {
|
2879 | icvCloseCAM_V4L( captureV4L );
|
2880 | cvFree( &captureV4L );
|
2881 | }
|
2882 | }
|
2883 |
|
2884 | bool CvCaptureCAM_V4L_CPP::grabFrame()
|
2885 | {
|
2886 | return captureV4L ? icvGrabFrameCAM_V4L( captureV4L ) != 0 : false;
|
2887 | }
|
2888 |
|
2889 | IplImage* CvCaptureCAM_V4L_CPP::retrieveFrame(int)
|
2890 | {
|
2891 | return captureV4L ? icvRetrieveFrameCAM_V4L( captureV4L, 0 ) : 0;
|
2892 | }
|
2893 |
|
2894 | double CvCaptureCAM_V4L_CPP::getProperty( int propId )
|
2895 | {
|
2896 | return captureV4L ? icvGetPropertyCAM_V4L( captureV4L, propId ) : 0.0;
|
2897 | }
|
2898 |
|
2899 | bool CvCaptureCAM_V4L_CPP::setProperty( int propId, double value )
|
2900 | {
|
2901 | return captureV4L ? icvSetPropertyCAM_V4L( captureV4L, propId, value ) != 0 : false;
|
2902 | }
|
2903 |
|
2904 | CvCapture* cvCreateCameraCapture_V4L( int index )
|
2905 | {
|
2906 | CvCaptureCAM_V4L_CPP* capture = new CvCaptureCAM_V4L_CPP;
|
2907 |
|
2908 | if( capture->open( index ))
|
2909 | return (CvCapture*)capture;
|
2910 |
|
2911 | delete capture;
|
2912 | return 0;
|
2913 | }
|
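/* cvCreateCameraCapture_V4L() is the factory entry point for this backend: it returns a
 * CvCapture* on success (to be released with cvReleaseCapture) or 0 if the device could not
 * be opened, e.g. (sketch):
 *
 *     CvCapture* cap = cvCreateCameraCapture_V4L(0);   // index 0 -> /dev/video0
 *     if (cap) { IplImage* frame = cvQueryFrame(cap); cvReleaseCapture(&cap); }
 */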
2914 |
|
2915 | #endif
|