I am trying to achieve the same Gaussian blur that Photoshop produces, but it seems that Core Image's Gaussian blur filter is heavily affected by the background of the image. I am not talking about the blurred edges; please look closely at the blue intensity between the white and dark backgrounds in the Core Image version versus the Photoshop version.
Input images:
Core Image result:
Photoshop result:
You can clearly see CIGaussianBlur outputs a very different image.
This is the code used to generate the Core Image result on iOS:
import CoreImage
import CoreImage.CIFilterBuiltins

let imageURL = Bundle.main.url(forResource: "1", withExtension: "png")!
let ciImage = CIImage(contentsOf: imageURL)!
let extent = CGRect(origin: .zero, size: ciImage.extent.size)

let blurFilter = CIFilter.gaussianBlur()
blurFilter.inputImage = ciImage
blurFilter.radius = 64

// Output color space for the JPEG
let colorSpace = CGColorSpace(name: CGColorSpace.sRGB)!

let outputURL = try! FileService.shared.getNewFileURL(type: "jpeg")
let context = CIContext()
try! context.writeJPEGRepresentation(of: blurFilter.outputImage!.cropped(to: extent), to: outputURL, colorSpace: colorSpace)
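
In case it helps narrow things down, here is a minimal sketch of an experiment I would try, assuming (and this is only an assumption) that the gap comes from color management: by default Core Image applies filters in a linear working color space, whereas Photoshop's Gaussian Blur reportedly averages gamma-encoded values unless gamma-1.0 blending is enabled. Forcing the CIContext working space to plain sRGB should make the blur average gamma-encoded pixels instead; the radius and file names below are just placeholders.

import CoreImage
import CoreImage.CIFilterBuiltins

// Hypothetical experiment: render the same blur through a CIContext whose
// working color space is non-linear sRGB instead of the default linear space.
let imageURL = Bundle.main.url(forResource: "1", withExtension: "png")!
let ciImage = CIImage(contentsOf: imageURL)!

let blur = CIFilter.gaussianBlur()
blur.inputImage = ciImage
blur.radius = 64

let srgb = CGColorSpace(name: CGColorSpace.sRGB)!
// workingColorSpace controls the space in which the filter math is performed.
let context = CIContext(options: [.workingColorSpace: srgb])

let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("blurred-srgb.jpg")
try! context.writeJPEGRepresentation(of: blur.outputImage!.cropped(to: ciImage.extent),
                                     to: outputURL,
                                     colorSpace: srgb)

If that render lands closer to the Photoshop output, the difference would be down to the working color space rather than the blur kernel itself.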