UIView has functions to convert points, rects, etc. between coordinate spaces. I need to do the same thing for UIBezierPaths. By way of illustration, consider two views, viewA and viewB. We have rectA : CGRect in the coordinate space of viewA. We can calculate:
let rectB = viewB.convert(rectA, from:viewA)
Now suppose we create a path from the rect:
let pathA = UIBezierPath(rect: rectA)
The question is how to calculate `pathB`. We could simply do:
let pathB = UIBezierPath(rect: rectB)
But that works only if we have `rectB`. What if we only have `pathA`?
I figure that to do this we need a transform `aTob`. Then we can do:
// UIBezierPath is a class, so copy before applying the transform
let pathB = pathA.copy() as! UIBezierPath
pathB.apply(aTob)
So the question boils down to how to calculate `aTob`. For simplicity, assume there is no rotation.
Given a correct `aTob`, drawing `pathA` in `viewA` should produce the same shape in the same on-screen location as drawing `pathB` in `viewB`.
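For concreteness, with no rotation (or scaling) involved, `aTob` reduces to a pure translation; in the hypothetical snippet below, `dx` and `dy` stand in for the as-yet-unknown offset between the two coordinate spaces:

```swift
import UIKit

// Placeholder offsets; the real values depend on where viewA and viewB sit.
let dx: CGFloat = 0
let dy: CGFloat = 0
let aTob = CGAffineTransform(translationX: dx, y: dy)
```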
One solution would be to call `apply()` on the bezier path using a translation transform. You can calculate the needed x and y of the transform by converting the origin of `viewB` to the coordinate space of `viewA`.

Here's some sample code you can put in a Swift Playground:
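The answer's original playground code isn't preserved in this extract, so the following is a minimal sketch of the idea. The container and view frames, `rectA`, and the names `offset` and `aToB` are illustrative assumptions; the sketch computes the offset as viewA's origin expressed in viewB's space, which is the direction that matches the rect-conversion check (see the note below about flipping it if your result comes out reversed).

```swift
import UIKit

// Two sibling views inside a common container; neither has a transform.
let container = UIView(frame: CGRect(x: 0, y: 0, width: 400, height: 400))
let viewA = UIView(frame: CGRect(x: 100, y: 100, width: 200, height: 200))
let viewB = UIView(frame: CGRect(x: 50, y: 50, width: 300, height: 300))
container.addSubview(viewA)
container.addSubview(viewB)

// A rect and a path defined in viewA's coordinate space.
let rectA = CGRect(x: 10, y: 10, width: 20, height: 20)
let pathA = UIBezierPath(rect: rectA)

// The translation from viewA's space to viewB's space is viewA's origin
// expressed in viewB's coordinates.
let offset = viewA.convert(CGPoint.zero, to: viewB)
let aToB = CGAffineTransform(translationX: offset.x, y: offset.y)

// Copy before applying, since UIBezierPath is a reference type.
let pathB = pathA.copy() as! UIBezierPath
pathB.apply(aToB)

// Sanity check against UIView's built-in rect conversion.
let rectB = viewB.convert(rectA, from: viewA)
print(rectB)
print(pathB.bounds)
```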
Output:
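With the hypothetical frames used in the sketch above, both prints show the same rectangle:

```
(60.0, 60.0, 20.0, 20.0)
(60.0, 60.0, 20.0, 20.0)
```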
I believe this is what you are looking for. The calculation of `offset` might be backwards. If the result of `pathB` is in the wrong direction, change the `offset` line to:
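(The answer's exact replacement line isn't shown in this extract; in terms of the sketch above, flipping the direction would look like the line below, i.e. viewB's origin expressed in viewA's space.)

```swift
let offset = viewB.convert(CGPoint.zero, to: viewA)
```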
Here's an extension to `UIView` for converting a `UIBezierPath` from one view to/from another (a sketch of such an extension follows below). This answer assumes that neither view has a non-identity transform applied to it. Based on the OP's own answer, apparently one or both of the views may have a non-identity transform.
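The extension's original code isn't preserved in this extract; here is a minimal sketch of what it might look like under the no-transform assumption. The method names mirror UIView's existing `convert(_:from:)`/`convert(_:to:)` pairs, but these `UIBezierPath` overloads themselves are assumptions, not part of UIKit.

```swift
import UIKit

extension UIView {
    /// Returns a copy of `path` (defined in `view`'s coordinate space)
    /// translated into the receiver's coordinate space.
    /// Assumes neither view has a non-identity transform.
    func convert(_ path: UIBezierPath, from view: UIView) -> UIBezierPath {
        let offset = view.convert(CGPoint.zero, to: self)
        let result = path.copy() as! UIBezierPath
        result.apply(CGAffineTransform(translationX: offset.x, y: offset.y))
        return result
    }

    /// Returns a copy of `path` (defined in the receiver's coordinate space)
    /// translated into `view`'s coordinate space.
    func convert(_ path: UIBezierPath, to view: UIView) -> UIBezierPath {
        let offset = self.convert(CGPoint.zero, to: view)
        let result = path.copy() as! UIBezierPath
        result.apply(CGAffineTransform(translationX: offset.x, y: offset.y))
        return result
    }
}

// Usage, given viewA and viewB in a common hierarchy:
// let pathB = viewB.convert(pathA, from: viewA)
```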