The Universal Approximation Theorem for Neural Networks

regular

A finite Borel measure $\mu$ is regular if it satisfies all of the following: $\mu(K) < \infty$ for every compact set $K$; $\mu(E) = \inf\{\mu(U) : E \subseteq U,\ U \text{ open}\}$ (outer regularity); and $\mu(E) = \sup\{\mu(K) : K \subseteq E,\ K \text{ compact}\}$ (inner regularity).

The set $\mathcal{N}$ of functions computed by feedforward neural networks is dense in $C(I_n)$, the space of continuous functions on the unit cube $I_n = [0,1]^n$ equipped with the supremum norm. Equivalently: for every continuous function $f \in C(I_n)$, there exists a sequence of neural networks $(n_j)$ in $\mathcal{N}$ converging to $f$, i.e. $\lim_{j \to \infty} \|n_j - f\| = 0$. Equivalently again: for every $f \in C(I_n)$ and every $\varepsilon > 0$ there exists a neural network $g \in \mathcal{N}$ such that $\|g - f\| < \varepsilon$.
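The density statement can be illustrated numerically. The sketch below (an illustration added here, not part of the source) fits the output weights of a one-hidden-layer logistic-sigmoid network to $\sin(2\pi x)$ on $[0,1]$ by least squares, with the inner weights and biases drawn at random; the grid, unit count, and sampling ranges are arbitrary choices.

```python
import numpy as np

def sigmoid(t):
    # logistic function, a continuous sigmoidal function
    return 1.0 / (1.0 + np.exp(-t))

rng = np.random.default_rng(0)
N = 100                       # number of hidden units (arbitrary choice)
x = np.linspace(0.0, 1.0, 200)
f = np.sin(2 * np.pi * x)     # target continuous function on I_1 = [0, 1]

# Draw inner weights w_j and biases theta_j at random, then solve
# min_alpha ||Phi @ alpha - f|| for the output weights alpha_j.
w = rng.uniform(-20, 20, size=N)
theta = rng.uniform(-20, 20, size=N)
Phi = sigmoid(np.outer(x, w) + theta)            # (200, N) hidden activations
alpha, *_ = np.linalg.lstsq(Phi, f, rcond=None)  # output weights

g = Phi @ alpha                # network output G(x) on the grid
err = np.max(np.abs(g - f))    # sup-norm error on the grid
print(err)
```

With enough hidden units the grid error becomes small, which is exactly what density in the supremum norm predicts; a larger $N$ or a different target $f$ behaves the same way.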

sigmoidal

A function $\sigma : \mathbb{R} \to \mathbb{R}$ is sigmoidal if $\sigma(t) \to 1$ as $t \to +\infty$ and $\sigma(t) \to 0$ as $t \to -\infty$.
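A quick numerical check (illustrative, added here) that the logistic function $1/(1 + e^{-t})$ satisfies the two limits required of a sigmoidal function:

```python
import numpy as np

def sigmoid(t):
    # logistic function 1 / (1 + e^{-t})
    return 1.0 / (1.0 + np.exp(-t))

# sigma(t) -> 1 as t -> +infinity, sigma(t) -> 0 as t -> -infinity
print(sigmoid(50.0))    # very close to 1
print(sigmoid(-50.0))   # very close to 0
```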

weights

The vectors $w_j \in \mathbb{R}^n$ applied to the input $x$ inside each hidden unit of a network of the form $G(x) = \sum_{j=1}^{N} \alpha_j\, \sigma(w_j^{T} x + \theta_j)$.

network weights

The output coefficients $\alpha_j \in \mathbb{R}$ that linearly combine the hidden-unit activations.

bias

The offsets $\theta_j \in \mathbb{R}$ added to $w_j^{T} x$ before $\sigma$ is applied.


discriminatory

A function $\sigma$ is discriminatory if the only finite signed regular Borel measure $\mu$ on $I_n$ satisfying $\int_{I_n} \sigma(w^{T} x + \theta)\, d\mu(x) = 0$ for all $w \in \mathbb{R}^n$ and $\theta \in \mathbb{R}$ is $\mu = 0$.

Theorem 1

Let $\sigma$ be any continuous discriminatory function. Then finite sums of the form $G(x) = \sum_{j=1}^{N} \alpha_j\, \sigma(w_j^{T} x + \theta_j)$ are dense in $C(I_n)$.

Hahn-Banach Theorem

The consequence used here: if $S$ is a subspace of a normed space $X$ whose closure is not all of $X$, then there exists a nonzero bounded linear functional $L$ on $X$ that vanishes on $S$.

Riesz Representation Theorem

Every bounded linear functional $L$ on $C(I_n)$ has the form $L(f) = \int_{I_n} f(x)\, d\mu(x)$ for some finite signed regular Borel measure $\mu$ on $I_n$.
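How the two theorems above combine in the proof of Theorem 1 can be sketched as follows (the usual contradiction argument; a sketch, not a full proof):

```latex
% Suppose, for contradiction, that the subspace
%   S = span{ sigma(w^T x + theta) : w in R^n, theta in R }
% is not dense in C(I_n). Then:
\begin{align*}
&\text{Hahn--Banach: } \exists\, L \neq 0 \text{ on } C(I_n)
    \text{ with } L(s) = 0 \ \forall\, s \in S, \\
&\text{Riesz: } L(f) = \int_{I_n} f(x)\, d\mu(x)
    \text{ for some finite signed regular Borel } \mu \neq 0, \\
&\text{hence } \int_{I_n} \sigma(w^{T} x + \theta)\, d\mu(x) = 0
    \quad \forall\, w \in \mathbb{R}^n,\ \theta \in \mathbb{R}, \\
&\text{contradicting that } \sigma \text{ is discriminatory (which forces } \mu = 0\text{)}.
\end{align*}
```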

Lemma

Any bounded, measurable sigmoidal function is discriminatory; in particular, any continuous sigmoidal function is discriminatory.