What is the normalization OpenCV performs when using Haar classifiers?


I am trying to implement Haar cascade classifiers on an FPGA using the XML files acquired from OpenCV. I am first writing a test program in Python to check that my logic is completely correct.

I understand the entire file format and the concept of integral images, but one thing that is unclear, and that I can't find in the documentation, is how OpenCV goes from the large rect sums to a feature threshold of, for example, 0.5.

Normalizing by dividing the rectangle sum by the sum of the entire window gives values that appear far too small, and my classifier keeps failing at the same stage.
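For reference, this is a minimal NumPy sketch of the integral-image and rectangle-sum computation described above (the function names `integral_image` and `rect_sum` are mine, not OpenCV's); the question is what to divide `rect_sum` by so that it becomes comparable to the XML thresholds:

```python
import numpy as np

def integral_image(img):
    # Summed-area table with a zero row/column prepended,
    # so rectangle sums need no boundary checks.
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, x, y, w, h):
    # Sum of pixels in the w x h rectangle with top-left corner (x, y),
    # computed in four lookups from the integral image.
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

img = np.arange(16, dtype=np.int64).reshape(4, 4)
ii = integral_image(img)
print(rect_sum(ii, 0, 0, 4, 4))  # sum of the whole 4x4 image: 120
print(rect_sum(ii, 1, 1, 2, 2))  # sum of the inner 2x2 block: 30
```

Dividing `rect_sum` by the whole-window sum, as described above, is the normalization that appears to be too small.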
