Is There a Function f Whose Graph in Cartesian is the Same as its Graph in Polar?

The Problem

Based on a question originally posted to math.stackexchange, I set out to find a function whose Cartesian and polar graphs overlap. Try it in the Desmos interactive to the right. (Note the restricted-domain notation {0 <= x <= pi/2}. You can read more about restricting the domain of Desmos plots here.)

One close solution I found is (tan(x))^1.3.

Getting Started

Suppose we had some sort of function f whose graph G was the same in polar and Cartesian coordinates. Because our graph G is the graph of f in Cartesian coordinates:

(x = a, y = b) is on G iff b = f(a).
Because our graph G is the graph of f in polar coordinates:
(θ = a, r = b) is on G iff b = f(a).
So, combining the two, for any a and b:
(x = a, y = b) is on G iff (θ = a, r = b) is on G.
And in fact, this is an exact characterization of the graph we're looking for. Any function graph that satisfies this equivalence is the same in polar and Cartesian.

Translating the right-hand side of our characteristic equivalence to Cartesian coordinates, we get:

(x = a, y = b) is on G iff (x = b cos(a), y = b sin(a)) is on G.
I'll call this point the lowering of (a,b), since it is always closer to the x-axis.

Assuming 0 <= a <= pi/2 (I haven't been able to work out a solution that isn't restricted to this range), we can run this equivalence in reverse, replacing a in the above equation by arctan(b/a) and b by sqrt(a^2+b^2):

(x = arctan(b/a), y = sqrt(a^2+b^2)) is on G iff (x = a, y = b) is on G.
I'll call the point on the left the raising of (a,b), since it is always higher.
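To make the raising and lowering maps concrete, here is a minimal sketch in Python (the names lower and raise_pt are mine; the article itself works in Desmos). It just encodes the two formulas above and checks that they undo each other on the restricted range:

    import math

    def lower(a, b):
        """Read (a, b) as polar (theta = a, r = b) and return the Cartesian point."""
        return (b * math.cos(a), b * math.sin(a))

    def raise_pt(a, b):
        """Read (a, b) as Cartesian (x = a, y = b) and return (angle, radius)."""
        return (math.atan2(b, a), math.hypot(a, b))

    # The two maps undo each other for 0 < a < pi/2 and b > 0:
    p = (0.8, 1.1)
    print(lower(*raise_pt(*p)))    # approximately (0.8, 1.1)
    print(raise_pt(*lower(*p)))    # approximately (0.8, 1.1)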

But that's not all: given a point on the graph, its raising is on the graph, so the raising of the raising is on the graph too! Similarly, the lowering of the lowering is on the graph. (The raising of the lowering and the lowering of the raising are just the original point.) This gives us an infinite sequence of points on the graph, extending in both directions.
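Continuing the sketch above, a small helper (the name iterates is mine) can collect this two-sided sequence for a starting point:

    def iterates(a, b, n_up=5, n_down=5):
        """The point (a, b) together with its first few raisings and lowerings."""
        ups, p = [], (a, b)
        for _ in range(n_up):
            p = raise_pt(*p)
            ups.append(p)
        downs, p = [], (a, b)
        for _ in range(n_down):
            p = lower(*p)
            downs.append(p)
        # ordered from the deepest lowering up to the deepest raising
        return list(reversed(downs)) + [(a, b)] + ups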

Sequence of Raisings/Lowerings
Raising depth:
Lowering depth:

Play around with these points and see what patterns you can find!

A Disappointing Solution

Note that the set of points you get by iterating raising and lowering a point infinitely satisfies our condition: a point is in the set iff its lowering is. If that set is the graph of a function (no two points share an x-coordinate), it's the graph of a function whose graph is the same in Cartesian and polar!

This is a little bit disappointing: playing around with the points, it's clear that there's an ideal smooth curve, and we want that!

Minimizing Angles

A good start is to define some measure of "badness" and try to minimize it. We'd like to minimize the zig-zagging behavior of our graph, so let's try to minimize how far every consecutive triple is from being collinear. We'll square each term and sum it up to particularly penalize large values:

badness(a,b) = sum from i = -8 to 30 of (pi - angle(raise^(i-2)(a,b), raise^(i-1)(a,b), raise^i(a,b)))^2
Here raise^i with a negative i means lowering |i| times, and angle(P,Q,R) is the angle at the middle point Q.
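Here's how that measure could be computed in a rough Python sketch, reusing raise_pt and lower from above (iterate and angle are my own helper names):

    def iterate(a, b, i):
        """raise^i(a, b): raise i times if i > 0, lower |i| times if i < 0."""
        p = (a, b)
        for _ in range(abs(i)):
            p = raise_pt(*p) if i > 0 else lower(*p)
        return p

    def angle(p, q, r):
        """Angle at q between the segments q->p and q->r."""
        v = (p[0] - q[0], p[1] - q[1])
        w = (r[0] - q[0], r[1] - q[1])
        c = (v[0] * w[0] + v[1] * w[1]) / (math.hypot(*v) * math.hypot(*w))
        return math.acos(max(-1.0, min(1.0, c)))  # clamp against rounding error

    def badness(a, b):
        return sum((math.pi - angle(iterate(a, b, i - 2),
                                    iterate(a, b, i - 1),
                                    iterate(a, b, i))) ** 2
                   for i in range(-8, 31))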
Shown below is the badness value for the currently selected (a,b):

We clearly can't get this measure to be 0, since the graph is going to curve, but we can minimize it. There are better ways of doing this, but a really simple way (sketched in code after this list) is to:

  1. Add a random vector to (a,b).
  2. If the badness of the resulting vector is less than the badness of our original (a,b), set (a,b) to this new value and start over.
  3. If not, leave (a,b) unchanged and try again.
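Here is one way that accept-if-better loop might look in Python, using the badness sketch above; the bounds check is my own addition to keep the point in the working range 0 < a < pi/2, b > 0:

    import random

    def optimize_point(a, b, step=0.01, iters=10000):
        """Random local search: keep a perturbation only when it lowers badness."""
        best = badness(a, b)
        for _ in range(iters):
            na = a + random.uniform(-step, step)
            nb = b + random.uniform(-step, step)
            if not (0 < na < math.pi / 2 and nb > 0):   # stay in the working range
                continue
            trial = badness(na, nb)
            if trial < best:
                a, b, best = na, nb, trial
        return a, b, best

Shrinking step plays the same role as the step-size control below: smaller moves, finer tuning.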
Optimizing (a,b) Press this button to start/stop an optimization process:
Reducing the step size will slow the optimization, but is better for fine-tuning the point:
Step Size:

If there's a nice formula for our nice curve, we should be able to find it with regression. You can find the data below, or download it as a .csv.

Regression

You can also do some regression in this interactive directly: use the input widget below to create a number of parameters, which you can use in your definition of f. Press the "start" button to start randomly wiggling around these parameters. If the new values of the parameters improve how well f fits the data, the interactive will keep the changes. Note that due to the way the Desmos side of this interactive works, it will show the current guess for the parameters, not the best values of the parameters found so far. Stop the optimizer to show the best values of the parameters. As before, you can reduce the step size to fine-tune your solution.
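The same accept-if-better idea works for the parameter wiggling described here. A rough sketch (fit_error, wiggle_parameters, and the example model are my own illustrative names, not what the interactive literally runs):

    def fit_error(params, data, model):
        """Sum of squared residuals of model(x, params) against the (x, y) data."""
        return sum((model(x, params) - y) ** 2 for x, y in data)

    def wiggle_parameters(params, data, model, step=0.01, iters=50000):
        """Randomly nudge one parameter at a time, keeping only improvements."""
        params = list(params)
        best = fit_error(params, data, model)
        for _ in range(iters):
            trial = params[:]
            trial[random.randrange(len(trial))] += random.uniform(-step, step)
            err = fit_error(trial, data, model)
            if err < best:
                params, best = trial, err
        return params, best

    # e.g. fit p0*x + p1*x^2 to a list of (x, y) pairs called data:
    # wiggle_parameters([0.0, 1.0], data, lambda x, p: p[0] * x + p[1] * x ** 2)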

Parameter Setup Number of Parameters:
Built-in templates:
  • Numerator Degree: Denominator Degree: Pass through origin
Optimize Parameters to Fit Data Points
Step Size:

Best Badness:
Best Solution:

Digression: Direct Regression

We could also directly try to solve for a function that minimizes the difference between its Cartesian graph and its polar graph. The interactive below allows you to do this.
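One way such a direct objective might be set up (this is my reading, not necessarily what the interactive computes): sample angles t, convert the polar point (f(t) cos t, f(t) sin t) to Cartesian, and measure how far it sits vertically from the Cartesian graph y = f(x).

    def cart_polar_mismatch(f, samples=200):
        """Squared vertical distance from the polar graph of f to its Cartesian graph."""
        total = 0.0
        for k in range(1, samples):
            t = (math.pi / 2) * k / samples
            x, y = f(t) * math.cos(t), f(t) * math.sin(t)   # polar point, in Cartesian
            if 0 < x < math.pi / 2:        # only where the Cartesian graph is defined
                total += (f(x) - y) ** 2
        return total

    # e.g. cart_polar_mismatch(lambda x: math.tan(x) ** 1.3)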

Optimize Parameters Directly
Step Size:

Best Badness:

Refining our Solution

Going back to our point-based regression, it might be nice to have a set of starting points and consider all of their raisings and lowerings together to fill out the curve. The widget below will allow you to create copies of our base point and show their raisings and lowerings as well.

Increase Number of Base Points Number of Base Points:

Play around with these extra points. With enough points, it should become clear that there's a problem: there are additional smooth solutions to this question! In fact, as long as one creates a smooth curve with the extra points that connects up correctly and doesn't curve backwards in its iterates, the resulting graph will be a smooth solution!

Let's look closer at the conditions on these arbitrary smooth curves. First, all of our points need to be in a reasonable range: their raisings must be to their right and their lowerings to their left. These give us the inequalities:

arctan(b/a) > a (the raising of (a,b) lies to its right)
b cos(a) < a (the lowering of (a,b) lies to its left)

What's more, the successive raisings and lowerings of our point also need to satisfy our inequalities. E.g.: the raising of the raising of our point needs to be to the right of the raising of our point, etc., and the lowering of the lowering of our point needs to be to the left of the lowering of our point.
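As a sanity check, a small Python helper (in_safe_zone is my name, building on the earlier sketches) can test whether a point and its first few iterates respect these ordering conditions:

    def in_safe_zone(a, b, depth=5):
        """Check that the first `depth` raisings of (a, b) each move right
        and the first `depth` lowerings each move left."""
        p = (a, b)
        for _ in range(depth):
            q = raise_pt(*p)
            if q[0] <= p[0]:
                return False
            p = q
        p = (a, b)
        for _ in range(depth):
            q = lower(*p)
            if q[0] >= p[0]:
                return False
            p = q
        return True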

We can plot the regions where some of these inequalities do not hold in Desmos. The remaining region is where it is safe for our points to be.

Notice that the remaining region has some width to it, and the later inequalities seem to stop carving away at its center, so adding more inequalities will likely not shrink the middle of the safe zone. If our points are near the center of the region, the curve will be smoother; towards the edge, it will be more wobbly.

These aren't the only conditions on our smooth curve: there are also conditions on the slope that keep iterates of the function from sloping backwards. One such condition can be obtained as follows. The points (t,f(t)) trace out the curve from left to right and bottom to top. As such, their raisings must also trace out the curve from left to right and bottom to top. Those raisings are given by:

(arctan(f(t)/t), sqrt(f(t)^2 + t^2))

Requiring these points to go from left to right as t increases means that arctan(f(t)/t) must be increasing in t. In other words, its derivative with respect to t must be positive:

(f'(t)*t - f(t))/(t^2 + f(t)^2) > 0

The denominator is always positive, so:

f'(t)*t - f(t) > 0
f'(t)*t > f(t)
f'(t) > f(t)/t

So having too small a slope can result in the raisings being out of order from left to right. See if you can replicate this phenomenon by moving the points around in the interactive.
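If you'd rather test this numerically than symbolically, here is a rough check of the condition f'(t) > f(t)/t on a grid, using a central-difference derivative (the helper name and tolerances are mine):

    def slope_condition_ok(f, samples=200, eps=1e-6):
        """Numerically check f'(t) > f(t)/t at sample points inside (0, pi/2)."""
        for k in range(1, samples):
            t = (math.pi / 2) * k / (samples + 1)
            deriv = (f(t + eps) - f(t - eps)) / (2 * eps)   # central difference
            if deriv <= f(t) / t:
                return False
        return True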

The Smoothest Solution

It still would be nice to have a dataset for the ideal smooth solution to our problem (assuming that even makes sense) so we can do some regression on it. The interactive below will wiggle the base points around to try to minimize the total distance squared between points in the iteration.
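One way to read "total distance squared" (my interpretation, not necessarily exactly what the interactive minimizes) is the sum of squared gaps between consecutive points, ordered by x, over all the raisings and lowerings of the base points, reusing iterates from earlier:

    def spread_objective(base_points, depth=5):
        """Sum of squared gaps between consecutive points (ordered by x)
        of all raisings/lowerings of the base points."""
        pts = []
        for a, b in base_points:
            pts.extend(iterates(a, b, n_up=depth, n_down=depth))
        pts.sort()
        return sum((x2 - x1) ** 2 + (y2 - y1) ** 2
                   for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

The base points can then be wiggled with the same accept-if-better loop used earlier.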

Optimizing Base Points Press this button to start/stop an optimization process:
Reducing the step size will slow the optimization, but is better for fine-tuning the point:
Step Size:

As before, you can find the data below, or download it as a .csv file, to perform regression on the data in your favorite data analysis program.


Or, the widget below will allow you to wiggle parameters p_i to try to fit f(x) to the data.

Optimize Parameters to Fit Data Points
Step Size:

Best Badness:
Best Solution:

This was used to find the highly accurate (but still quite simple!) approximation: f(x) = 0.1489x + x^2/((pi/2)^3 - x^3).