I have a function f(x) = a/x and a set of data containing values for f(x) +- df(x) and x +- dx. How do I tell gnuplot to do a weighted fit for a with these data?
I know that fit accepts a using specification, and this works for df(x), but it does not work for dx. It seems gnuplot treats the error I give for x as the error of the whole RHS of my function (a/x +- dx).
How do I do a weighted fit of f(x) +- df(x) = a/(x +- dx) to find the optimal a?
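For reference, the y-error-only weighting I currently use looks something like this (the file name data.dat and the column order x, f(x), df(x) are just placeholders):

    f(x) = a/x
    # weight each point by 1/df(x)^2; column 3 holds df(x)
    # (in gnuplot before 5.0, a bare "using 1:2:3" already takes column 3 as the y-error)
    fit f(x) 'data.dat' using 1:2:3 yerror via a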
Since version 5.0, gnuplot has an explicit provision for taking uncertainty in the independent variable into account, using "Orear's effective variance method".
(The above command expects data in the form x y dx dy.)