Use CIColorCube for histogram normalization
As I understand it, CIColorCube specifies a general mapping from color values to color values. It seems I could use CIColorCube to adjust contrast by normalizing the V value in HSV space.
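The remap I have in mind is the standard linear normalization of V, sketched here in plain C (function name and the clamping are mine; the min/max would come from the image):

```c
#include <assert.h>

/* Linearly remap v from [minV, maxV] to [0, 1], clamping the result.
 * In the class below, minV/maxV come from CIAreaMinimum/CIAreaMaximum. */
static float normalizeV(float v, float minV, float maxV) {
    float t = (v - minV) / (maxV - minV);
    if (t < 0) t = 0;
    if (t > 1) t = 1;
    return t;
}
```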
Below is a HistogramNormalizer class whose colorCube property returns a pre-populated CIColorCube filter object, which I then apply to a CIImage.
    const NSInteger kColorCubeSize = 64;

    @implementation HistogramNormalizer

    - (id)initWithImage:(CIImage *)image {
        self = [super init];
        if (self) {
            CIVector *extent = [CIVector vectorWithCGRect:image.extent];
            CIFilter *maxFilter = [CIFilter filterWithName:@"CIAreaMaximum"
                                             keysAndValues:@"inputImage", image,
                                                           @"inputExtent", extent, nil];
            self.maxValue = [self getValueFromImage:[maxFilter valueForKey:@"outputImage"]];
            CIFilter *minFilter = [CIFilter filterWithName:@"CIAreaMinimum"
                                             keysAndValues:@"inputImage", image,
                                                           @"inputExtent", extent, nil];
            self.minValue = [self getValueFromImage:[minFilter valueForKey:@"outputImage"]];
            _scale = 1.0 / (self.maxValue - self.minValue);
            _offset = -self.minValue * self.scale;
            [self updateColorCube];
        }
        return self;
    }

    - (float)getValueFromImage:(CIImage *)image {
        NSBitmapImageRep *bitmap = [[NSBitmapImageRep alloc] initWithCIImage:image];
        // Since the underlying image is gray scale, all components have
        // identical values, so just use the red component here.
        NSColor *color = [bitmap colorAtX:0 y:0];
        return color.redComponent;
    }

    - (void)updateColorCube {
        long length = kColorCubeSize * kColorCubeSize * kColorCubeSize * 4;
        float *cubeData = (float *)malloc(length * sizeof(float));
        float *c = cubeData;
        for (int b = 0; b < kColorCubeSize; ++b) {
            for (int g = 0; g < kColorCubeSize; ++g) {
                for (int r = 0; r < kColorCubeSize; ++r) {
                    RGB rgb;
                    rgb.r = (float)r / (kColorCubeSize - 1);
                    rgb.g = (float)g / (kColorCubeSize - 1);
                    rgb.b = (float)b / (kColorCubeSize - 1);
                    HSV hsv = rgb2hsv(rgb);  // This function comes from a Stack Overflow post.
                    hsv.v = [self normalizeValue:hsv.v];
                    rgb = hsv2rgb(hsv);
                    c[0] = rgb.r;
                    c[1] = rgb.g;
                    c[2] = rgb.b;
                    c[3] = 1;
                    c += 4;
                }
            }
        }
        NSData *data = [NSData dataWithBytesNoCopy:cubeData
                                            length:length * sizeof(float)
                                      freeWhenDone:YES];
        _colorCube = [CIFilter filterWithName:@"CIColorCube"
                                keysAndValues:@"inputCubeDimension", [NSNumber numberWithInteger:kColorCubeSize],
                                              @"inputCubeData", data, nil];
    }

    - (float)normalizeValue:(float)value {
        float tmp = (value - self.minValue) * self.scale;
        float result = MIN(1, MAX(0, tmp));
        return tmp;
    }

    @end
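For reference, the loop order above (b outermost, r innermost) means red varies fastest in the cube data, which is the layout CIColorCube expects. The float offset of grid point (r, g, b) can be sketched as follows; the helper function is mine, not part of the class:

```c
enum { kCubeSize = 64 };

/* Offset (in floats) of the RGBA entry for grid point (r, g, b),
 * matching a b-outer / g-middle / r-inner fill order. */
static long cubeOffset(int r, int g, int b) {
    return (((long)b * kCubeSize + g) * kCubeSize + r) * 4;
}
```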
However, when I apply the color cube filter to the following image, the result isn't what I expected.
With the above image, I have:
    self.minValue = 0.41
    self.maxValue = 0.82
    self.scale    = 2.43
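As a sanity check on those numbers: this remap sends V = 0.41 to 0 and V = 0.82 to 1, but cube grid points whose V falls outside [0.41, 0.82] map outside [0, 1], so the clamping step matters. A standalone C sketch with the reported values hard-coded:

```c
/* The values reported above, hard-coded for illustration. */
static const float kMinV = 0.41f;
static const float kMaxV = 0.82f;

/* The same linear remap normalizeValue computes, before any clamping. */
static float remapV(float v) {
    float scale = 1.0f / (kMaxV - kMinV);  /* = 2.43..., as reported */
    return (v - kMinV) * scale;
}
```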
But the filtered image looks like this:
Just to make sure I have hooked up the filter correctly, I changed the color mapping to halve the RGB values:
    for (int b = 0; b < kColorCubeSize; ++b) {
        for (int g = 0; g < kColorCubeSize; ++g) {
            for (int r = 0; r < kColorCubeSize; ++r) {
                c[0] = (float)r / (kColorCubeSize - 1) / 2;
                c[1] = (float)g / (kColorCubeSize - 1) / 2;
                c[2] = (float)b / (kColorCubeSize - 1) / 2;
                c[3] = 1;
                c += 4;
            }
        }
    }
This produces what I expected, a uniformly darkened image:
I wonder if the idea of using a color cube to normalize the histogram is fundamentally wrong somehow.
osx core-image