OclCascadeClassifier does not seem to use GPU(haarcascade_eye) (Bug #4237)


Added by Seunghwa Song almost 11 years ago. Updated almost 10 years ago.


Status: Cancelled    Start date: 2015-03-12
Priority: Normal     Due date:
Assignee: -          % Done: 100%
Category: -
Target version: 2.4.11
Affected version: pre 2.4 (deprecated)    Operating System: Windows
Difficulty:          HW Platform: x64
Pull request:

Description

I used OclCascadeClassifier to test face-detection and eye-detection sample code.

I used the "haarcascade_frontalface_default.xml" file as the cascade for face detection.
This worked properly, using GPU resources.

However, when I use "haarcascade_eye.xml", it does not seem to use the GPU, only the CPU.
The classifier's performance was also very poor.

I took a screenshot; please refer to the attached file.

Is this a bug, or did I do something wrong in my code?

My sample code is similar to the "ocl-example-facedetect" sample included in the OpenCV repository.
https://github.com/sshtel/opencv_practice
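For reference, the relevant calls in the 2.4 ocl module look roughly like this. This is a minimal sketch, not the exact code in the linked repository; the cascade path, input image, and detection parameters are placeholders:

```cpp
// Minimal sketch of eye detection with the OpenCV 2.4 ocl module.
// Cascade path, input image, and parameter values are illustrative only.
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>
#include <opencv2/ocl/ocl.hpp>
#include <iostream>
#include <vector>

int main()
{
    cv::ocl::OclCascadeClassifier cascade;
    if (!cascade.load("haarcascade_eye.xml"))   // same XML as the CPU path
    {
        std::cerr << "Failed to load cascade" << std::endl;
        return 1;
    }

    cv::Mat frame = cv::imread("face.jpg");     // placeholder input image
    cv::Mat gray;
    cv::cvtColor(frame, gray, CV_BGR2GRAY);
    cv::equalizeHist(gray, gray);

    cv::ocl::oclMat d_gray(gray);               // upload to the OpenCL device
    std::vector<cv::Rect> eyes;
    cascade.detectMultiScale(d_gray, eyes, 1.1, 3, 0, cv::Size(20, 20));

    std::cout << "Detected " << eyes.size() << " eye(s)" << std::endl;
    return 0;
}
```

Apart from loading a different XML file, the face and eye paths are identical, which is why the difference in GPU usage between the two cascades was surprising.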

Test environment:
OpenCV version: 2.4.10
OS: Windows 7
CPU: Intel Core i5-2550 @ 3.30 GHz
GPU: AMD Radeon HD 6450


issue.jpg (448 kB) Seunghwa Song, 2015-03-11 04:42 pm


History

Updated by Seunghwa Song almost 10 years ago

The affected version is 2.4.10, not "pre 2.4 (deprecated)".

  • Status changed from New to Open

Updated by Seunghwa Song almost 10 years ago

This was not a bug.

  • Status changed from Open to Cancelled
  • Target version set to 2.4.11
  • % Done changed from 0 to 100

Updated by Seunghwa Song almost 10 years ago

Automatic device selection is ambiguous: sometimes the work runs on the GPU, other times on the CPU.
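One way to avoid the ambiguity is to pick a device explicitly before creating any oclMat. A sketch against the 2.4 ocl API (`cv::ocl::getOpenCLDevices` / `cv::ocl::setDevice`), with minimal error handling:

```cpp
// Pin OpenCL execution to the first GPU device instead of relying on
// automatic selection. Sketch against the OpenCV 2.4 ocl API.
#include <opencv2/ocl/ocl.hpp>
#include <iostream>

int main()
{
    cv::ocl::DevicesInfo devices;
    // Query GPU devices only; CVCL_DEVICE_TYPE_CPU would list CPU drivers.
    int n = cv::ocl::getOpenCLDevices(devices, cv::ocl::CVCL_DEVICE_TYPE_GPU);
    if (n == 0)
    {
        std::cerr << "No OpenCL GPU device found" << std::endl;
        return 1;
    }
    for (int i = 0; i < n; ++i)
        std::cout << i << ": " << devices[i]->deviceName << std::endl;

    cv::ocl::setDevice(devices[0]);  // force all ocl work onto this GPU
    // ... create oclMat / OclCascadeClassifier only after this point ...
    return 0;
}
```

Calling `setDevice` once at startup, before any ocl objects are constructed, should make the device choice deterministic across runs.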
