Postgres functions are declared with a volatility classification: VOLATILE, STABLE, or IMMUTABLE. The project is known to be very strict with these labels for built-in functions, and with good reason. A prominent example: expression indexes only allow IMMUTABLE functions, and those have to be truly immutable to avoid incorrect results.

User-defined functions are still free to be declared as the owner chooses. The manual advises:

For best optimization results, you should label your functions with the strictest volatility category that is valid for them.

... and adds an extensive list of things that can go wrong with an incorrect volatility label.

Still, there are cases where faking immutability makes sense, mostly when you know the function is, in fact, immutable within your scope. Example:
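A hypothetical illustration of such a case (the function and table names here are made up for the example): `to_char(timestamp, text)` is only STABLE, because its output can depend on locale settings like `lc_time`. If you know the relevant settings never change in your database, you can wrap it in a function declared IMMUTABLE so it becomes usable in an expression index:

```sql
-- to_char() is STABLE, so it cannot be used in an index expression directly.
-- This wrapper fakes immutability, which is only safe as long as the
-- locale-dependent behavior never changes in this database.
CREATE FUNCTION f_to_char_immutable(ts timestamp, fmt text)
  RETURNS text
  LANGUAGE sql IMMUTABLE AS
$$SELECT to_char(ts, fmt)$$;

-- Now an expression index is possible, which plain to_char() would not allow:
CREATE INDEX tbl_created_month_idx
  ON tbl (f_to_char_immutable(created, 'YYYY-MM'));
```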

All possible implications for data integrity aside, what is the effect on performance? One might assume that declaring a function IMMUTABLE can only benefit performance. Is that so?

Can declaring function volatility IMMUTABLE harm performance?

Let's assume current Postgres 10 to narrow it down, but answers for all recent versions are of interest.