
This is a collection of a few ideas. Let me see where it takes us. It is probably too verbose, since I am writing it as I think through the question. Maybe later I can shorten it and correct errors. If you are in a hurry, it is safe to scroll down to the last part, which contains what I think is an answer to the OP's question.

I want to concentrate on differentiation and integration as symbolic operations.

For differentiation we can consider a class $E$ of function-symbols that contains the constants (complex constants, say) and $x$, and is closed under the arithmetic operations and composition. We can throw in other function-symbols like $e^x$, $\ln(x)$ together with $x^{-1}$, or many others. But notice that for every new function-symbol we throw into $E$ we assume we know how to compute its derivative (we have a symbol for it). The minimal assumption of having the constants and $x$ gives us $E$ equal to the polynomials. A larger option would be the elementary functions.

If differentiation is considered as an operation on the symbols in $E$, then it is, by the definition of $E$, an algorithmic operation. Given a function-symbol from $E$ (which, by this act, is assumed to be formed from a few symbols whose derivatives we know, combined by arithmetic operations and composition), we can compute its derivative because the rules of differentiation cover the operations generating $E$. What might be hard, in principle, is the question of whether a function belongs to $E$.
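As an illustration of this structural recursion, here is a minimal sketch for the minimal $E$ (constants, $x$, sums, products); the tuple encoding of expressions is an assumption made for the example, not part of the setup above:

```python
# Minimal sketch: symbolic differentiation over the minimal E
# (constants, x, sums, products), encoded as nested tuples.

def diff(e):
    """Differentiate an expression tree with respect to x.
    Expressions: ('const', c), ('x',), ('add', f, g), ('mul', f, g)."""
    tag = e[0]
    if tag == 'const':
        return ('const', 0)
    if tag == 'x':
        return ('const', 1)
    if tag == 'add':                      # (f+g)' = f' + g'
        return ('add', diff(e[1]), diff(e[2]))
    if tag == 'mul':                      # (fg)' = f'g + fg'
        return ('add', ('mul', diff(e[1]), e[2]),
                       ('mul', e[1], diff(e[2])))
    raise ValueError(f"unknown symbol {tag}")

def evaluate(e, x):
    """Evaluate an expression tree at the point x."""
    tag = e[0]
    if tag == 'const':
        return e[1]
    if tag == 'x':
        return x
    if tag == 'add':
        return evaluate(e[1], x) + evaluate(e[2], x)
    if tag == 'mul':
        return evaluate(e[1], x) * evaluate(e[2], x)
    raise ValueError(f"unknown symbol {tag}")

# d/dx (x*x + 3) = 2x; check at x = 5
p = ('add', ('mul', ('x',), ('x',)), ('const', 3))
print(evaluate(diff(p), 5))   # 10
```

The recursion terminates because every rule reduces to strictly smaller subtrees, which is exactly why differentiation is algorithmic on $E$.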

Claim: Integration is at least as hard as differentiation (maybe harder).

This is clear for the case of polynomials, which are always contained in any reasonable minimal $E$.

Observation: The tentative claim that integration is harder than differentiation is necessarily going to depend on $E$, since for $E$ being the polynomials both are simple operations.
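Indeed, for polynomials both operations reduce to term-by-term coefficient manipulation; a sketch (the coefficient-list encoding $[c_0, c_1, \dots]$ for $c_0 + c_1x + \cdots$ is chosen just for the example):

```python
# Over E = polynomials, differentiation and integration are both
# simple, term-by-term operations on coefficient lists.

def poly_diff(p):
    """Derivative of c0 + c1 x + c2 x^2 + ... as a coefficient list."""
    return [k * c for k, c in enumerate(p)][1:] or [0]

def poly_int(p):
    """A primitive (with constant term 0) as a coefficient list."""
    return [0] + [c / (k + 1) for k, c in enumerate(p)]

print(poly_diff([3, 0, 1]))  # d/dx (3 + x^2) = 2x      -> [0, 2]
print(poly_int([0, 2]))      # a primitive of 2x is x^2 -> [0, 0.0, 1.0]
```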

---

Let us now construct a domain adapted to the operation of integration, as we did for differentiation. Consider $I$ to be a collection of function-symbols that contains the constants, $x$, and possibly others ($e^x$, $x^{-1}$, ...) for which we assume we know their integrals. Assume that $I$ is closed under the following operations:

If $f$ and $g$ are in $I$, then

1. $af+bg\in I$ for any constants $a$ and $b$,
2. $f\oplus g := fg'+f'g \in I$,
3. $f\otimes g := (f\circ g)\cdot g' \in I$.

An $I$ like this is a reasonable minimal domain in which to define integration. It is clear that in such an $I$, integration is algorithmic for a function given written in terms of these operations.
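To make this concrete: a primitive of $af+bg$ is $aF+bG$, a primitive of $f\oplus g=(fg)'$ is $fg$, and a primitive of $f\otimes g=(f\circ g)g'$ is $F\circ g$, where $F$, $G$ are primitives of $f$, $g$. The following sketch reads a primitive directly off the structure of an expression in $I$ (the tuple encoding, the base-primitive table, and the output formula strings are all assumptions made for the illustration):

```python
# Sketch: each expression in I is a nested tuple; integrate() emits a
# formula (string) for a primitive, by structural recursion only.
BASE_PRIM = {'x': 'x^2/2', 'e^x': 'e^x', '1': 'x'}  # assumed base table

def show(e):
    """Render the expression itself as a formula string."""
    tag = e[0]
    if tag == 'base':
        return e[1]
    if tag == 'lin':
        _, a, f, b, g = e
        return f"{a}*({show(f)}) + {b}*({show(g)})"
    if tag == 'oplus':   # f(+)g = (f*g)'
        _, f, g = e
        return f"d/dx[({show(f)})*({show(g)})]"
    if tag == 'otimes':  # f(x)g = (F o g)', F a primitive of f
        _, f, g = e
        return f"d/dx[({integrate(f)}) with x:=({show(g)})]"
    raise ValueError(tag)

def integrate(e):
    """Return a formula string for a primitive of e."""
    tag = e[0]
    if tag == 'base':
        return BASE_PRIM[e[1]]
    if tag == 'lin':     # a primitive of af+bg is aF+bG
        _, a, f, b, g = e
        return f"{a}*({integrate(f)}) + {b}*({integrate(g)})"
    if tag == 'oplus':   # a primitive of fg'+f'g is f*g
        _, f, g = e
        return f"({show(f)})*({show(g)})"
    if tag == 'otimes':  # a primitive of (f o g)*g' is F o g
        _, f, g = e
        return f"({integrate(f)}) with x:=({show(g)})"
    raise ValueError(tag)

# x(+)e^x = x*e^x + e^x, whose primitive is x*e^x:
print(integrate(('oplus', ('base', 'x'), ('base', 'e^x'))))
```

Note that the derivative claim below also shows up here: `show(('oplus', f, ('base', '1')))` renders $f'$ as $(f\cdot 1)'$, so one operation of $I$ already encodes differentiation.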

Claim: In $I$, differentiation is simple if we assume $I$ contains the constants and is closed under either operation 2 or operation 3.

In fact, for a given $f$ in $I$, its derivative is $f' = f\oplus 1 = 1\otimes f$, since $f\oplus 1 = f\cdot 1' + f'\cdot 1 = f'$ and $1\otimes f = (1\circ f)\cdot f' = f'$.

This means that just one basic operation in $I$ allows us to compute derivatives.

---

To translate the OP's question into another question:

Given an $E$ we already have a way to define linear combinations with constant coefficients, $\oplus$, and $\otimes$, since these are defined using the operations allowed in $E$. So, for an $E$ to be an $I$, or to carve an $I$ out of it, we would need an algorithm that checks whether an element of $E$ belongs to such an $I$ (i.e., can be written using function-symbols whose integrals are again function-symbols, linear combinations with constant coefficients, $\oplus$, and $\otimes$).

The existence of such an algorithm depends on $E$, on the function-symbols available in it. For $E$ being the polynomials in $x$, it is clear that we have such an algorithm and that it is simple.

We also have that for some $E$ the problem is undecidable. From Richardson's theorem we know that integration in $E$ is undecidable if $E$:

1. contains $\ln(2)$, $\pi$, $e^x$, $\sin(x)$,
2. contains $|x|$, and
3. contains a function with no primitive in $E$.

Condition $3$ is satisfied for the $E$-closure of the elementary functions together with $|x|$, since we can take $e^{x^2}$ to verify $3$.

The theorem holds because one can construct an elementary function $M(n,x)$ (also using $|x|$) which, for each natural number $n$, is identically equal to either $0$ or $1$, but for which it is undecidable, for a given natural number $n$, whether it is identically $0$ or identically $1$. Given such a function, if we could decide integrability in $E$, then we could decide, for each natural number $n$, whether $f_n(x):=e^{x^2}M(n,x)$ has a primitive in $E$ or not. But this would tell us whether $M(n,x)$ is zero or one, since $f_n(x)$ has a primitive in $E$ when $M(n,x)=0$ and has none in $E$ when $M(n,x)=1$.

So, for certain classes $E$, while differentiation is algorithmic (after having shown that the function belongs to $E$), integration is undecidable. This already shows that integration is harder than differentiation (the statement depending, of course, on the class of functions we want to integrate).

Observation: The undecidability of integration for the $E$ above is, of course, deeply related to having function-symbols in $E$ without a primitive function-symbol in $E$. This trivially disappears if we close $E$ by throwing in a symbol for each primitive. The inconvenience, on the other hand, is that then $E$ is no longer generated by finitely many symbols, which makes the problem of detecting when a function is represented by a symbol in $E$ even more complex. So, the reason why for this large $E$ we can compute the integral of a function which we know is in $E$ is that we are essentially assuming we can, by assuming that the input is in $E$.

There remains, then, the question:

Question: How small can $E$ be so that integration is harder than differentiation?