Antiderivatives

A function $F(x)$ is defined to be an antiderivative of a function $f(x)$ when $F'(x) = f(x)$.

For a given function, we might have many antiderivatives. Consider $f(x)=3x^2 + \cos x$. We can easily see that the derivatives of $$\begin{array}{ccl} F_1(x) &=& x^3 +\sin x\\\\ F_2(x) &=& x^3 +\sin x + 7\\\\ F_3(x) &=& x^3 +\sin x - \sqrt{5} \end{array}$$ all agree with $f(x)$; consequently, $F_1$, $F_2$, and $F_3$ are all antiderivatives of $f(x)$.
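
This claim is easy to check by hand, but it can also be automated. Here is a minimal sketch, assuming the SymPy library is available, that differentiates each candidate and compares the result to $f(x)$.

```python
# Minimal sketch (assumes SymPy is installed): verify that F1, F2, F3
# all differentiate back to f(x) = 3x^2 + cos(x).
import sympy as sp

x = sp.symbols('x')
f = 3*x**2 + sp.cos(x)

candidates = [
    x**3 + sp.sin(x),               # F1
    x**3 + sp.sin(x) + 7,           # F2
    x**3 + sp.sin(x) - sp.sqrt(5),  # F3
]

for F in candidates:
    # the difference F'(x) - f(x) should simplify to zero in each case
    assert sp.simplify(sp.diff(F, x) - f) == 0

print("All three candidates are antiderivatives of f(x).")
```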

The examples above suggest that once we have one antiderivative of a given function, we can find additional ones by simply adding a constant. One might wonder whether two functions that differ by more than a constant can share the same derivative. To answer this question, consider two functions $F_1(x)$ and $F_2(x)$ that are both antiderivatives of some function $f(x)$. More specifically, consider the derivative of their difference: $$\frac{d}{dx}[F_1(x) - F_2(x)] = F_1'(x) - F_2'(x) = f(x) - f(x) = 0$$ So the derivative of this difference is zero. What kind of functions have a derivative of zero? Certainly constant functions behave in this way. Could a non-constant function have a zero derivative?
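
Before answering, it is worth seeing how non-obvious a "constant difference" can look. The pair below is a deliberately chosen illustration (again a sketch assuming SymPy): two antiderivatives of $\sin 2x$ that appear quite different, yet differ by the constant $1$.

```python
# Sketch (assumes SymPy): two dissimilar-looking antiderivatives of sin(2x).
import sympy as sp

x = sp.symbols('x')
F1 = sp.sin(x)**2
F2 = -sp.cos(x)**2

# both differentiate to sin(2x) = 2 sin(x) cos(x)
assert sp.simplify(sp.diff(F1, x) - sp.sin(2*x)) == 0
assert sp.simplify(sp.diff(F2, x) - sp.sin(2*x)) == 0

# their difference has derivative zero and is the constant 1
print(sp.diff(F1 - F2, x))    # 0
print(sp.simplify(F1 - F2))   # 1  (by the Pythagorean identity)
```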

Interestingly, the Mean Value Theorem quickly resolves this question. For simplicity's sake, suppose $g'(x)=0$ for all real values of $x$; the argument is easily modified for cases where we only care about the behavior of the function on some open interval. Now consider any interval $[x_1,x_2]$ with $x_1 < x_2$. By the Mean Value Theorem, there must be some $c$ in $(x_1,x_2)$ where $$g'(c) = \frac{g(x_2)-g(x_1)}{x_2-x_1}$$ But the derivative of $g(x)$ is zero everywhere, including at $x=c$, so $g'(c)=0$. It follows that $g(x_2)-g(x_1)=0$, and consequently $g(x_2)=g(x_1)$. Since our choice of $x_1$ and $x_2$ was arbitrary, $g(x)$ must be a constant function.
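
For readers who want to see the Mean Value Theorem in action, the following sketch (a hypothetical example, again assuming SymPy) solves for the point $c$ promised by the theorem for $g(x) = x^3$ on $[1, 3]$.

```python
# Sketch (assumes SymPy): find the point c in (1, 3) guaranteed by the
# Mean Value Theorem for g(x) = x^3, where g'(c) matches the average
# rate of change of g over [1, 3].
import sympy as sp

x, c = sp.symbols('x c')
g = x**3
x1, x2 = 1, 3

avg_rate = (g.subs(x, x2) - g.subs(x, x1)) / (x2 - x1)       # (27 - 1)/2 = 13
solutions = sp.solve(sp.Eq(sp.diff(g, x).subs(x, c), avg_rate), c)

# keep only the solution(s) lying inside the open interval (1, 3)
print([s for s in solutions if x1 < s < x2])                 # [sqrt(39)/3]
```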

Thus, if $g'(x) = 0$ for all $x$, then it must be the case that $g(x) = c$ for some constant $c$.

Returning to our discussion of $F_1(x)$ and $F_2(x)$: we observed that $$\frac{d}{dx}[F_1(x) - F_2(x)] = 0$$ Hence, for some constant $c$, $$F_1(x) - F_2(x) = c$$ or, equivalently, $$F_2(x) = F_1(x) + c$$ So if we can get our hands on a single antiderivative, $F(x)$, of a given function $f(x)$, we can describe the whole set of antiderivatives of $f(x)$. This set contains exactly the functions of the form $$F(x)+c$$ where $c$ is a constant.
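
The symbolic check below (again a sketch assuming SymPy) confirms the last claim from the other direction: with $c$ left as an unspecified symbol, $F(x) + c$ still differentiates to $f(x)$.

```python
# Sketch (assumes SymPy): F(x) + c is an antiderivative of f(x) for
# *any* constant c, kept here as an unspecified symbol.
import sympy as sp

x, c = sp.symbols('x c')
f = 3*x**2 + sp.cos(x)
F = x**3 + sp.sin(x)

# the symbolic constant c vanishes under differentiation
print(sp.simplify(sp.diff(F + c, x) - f))  # 0
```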

For reasons that become clear once the Fundamental Theorem of Calculus is understood, we will denote the set of all antiderivatives of $f(x)$ by $$\int f(x) dx$$ Given the above analysis, once we know a single antiderivative $F(x)$ of the function $f(x)$, we can write $$\int f(x) dx = F(x) + C$$ where $C$ represents an arbitrary constant.
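
As an aside for those experimenting on a computer: SymPy's `integrate` returns a single antiderivative and deliberately omits the "$+ C$", so the arbitrary constant must be remembered by hand.

```python
# SymPy's integrate returns one antiderivative; the "+ C" is left to us.
import sympy as sp

x = sp.symbols('x')
print(sp.integrate(3*x**2 + sp.cos(x), x))  # x**3 + sin(x)
```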


It is not hard to see that many of the basic differentiation rules can be "worked backwards" to produce some basic antidifferentiation rules. In particular, one might notice the following (each of which is spot-checked in the sketch after this list):

  • $\displaystyle{\int dx = x + C}$

  • $\displaystyle{\int k f(x) dx = k \int f(x) dx}$

  • $\displaystyle{\int [f(x) \pm g(x)] dx = \int f(x) dx \pm \int g(x) dx}$

  • $\displaystyle{\int x^n dx = \frac{x^{n+1}}{n+1} + C}$, when $n \neq -1$

  • $\displaystyle{\int \frac{1}{x} dx = \ln |x| + C}$

(Note: the absolute value in the last rule arises because the domain of $1/x$ includes values of $x$, namely negative ones, that are not in the domain of $\ln x$.)
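
The sketch below (assuming SymPy, and restricting to $x > 0$ to sidestep the absolute value) integrates $x^n$ for a symbolic exponent; note how SymPy itself singles out the $n = -1$ case.

```python
# Sketch (assumes SymPy): spot-check the power rule and the 1/x rule.
import sympy as sp

x = sp.symbols('x', positive=True)  # x > 0, so ln|x| = ln(x) here
n = sp.symbols('n')

# power rule for a symbolic exponent; SymPy separates out n = -1
print(sp.integrate(x**n, x))
# Piecewise((x**(n + 1)/(n + 1), Ne(n, -1)), (log(x), True))

# the n = -1 case yields the logarithm (ln|x| for general real x)
print(sp.integrate(1/x, x))  # log(x)
```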