In §5.3 of Kontsevich's Formality Conjecture he writes:

This (...) gives a remarkable vector field on the space of bi-vector fields on $\mathbf{R}^d$. The evolution with respect to the time $t$ is described by the following non-linear partial differential equation: $$\frac{{\rm d}\alpha}{{\rm d}t} = \sum_{i,j,k,l,m,k',l',m'} \frac{\partial^3 \alpha_{ij}}{\partial x_k \partial x_l \partial x_m} \frac{\partial \alpha_{kk'}}{\partial x_{l'}} \frac{\partial \alpha_{ll'}}{\partial x_{m'}} \frac{\partial \alpha_{mm'}}{\partial x_{k'}} \left( \frac{\partial}{\partial x_i} \wedge \frac{\partial}{\partial x_j} \right),$$ where $\alpha = \sum_{i,j} \alpha_{ij}(x) \frac{\partial}{\partial x_i} \wedge \frac{\partial}{\partial x_j}$ is a bi-vector field on $\mathbf{R}^d$.

What exactly does ${\rm d}\alpha/{\rm d}t$ mean here? We presume the equation above is to be read component-wise (modulo the antisymmetry of the wedge), giving a system of $d(d-1)/2 = {d \choose 2}$ partial differential equations, and that a solution is a flow $t \mapsto \alpha(t)$ in which each $\alpha(t)$ is a bi-vector field.

Next he writes:

Also, in dimension $d = 2$ the direct calculation shows that the evolution operator gives a conjugation of bi-vector field $\alpha$ by a vector field whose coefficients are differential polynomials in coefficients of $\alpha$.

In this case we have one equation:

$$\frac{{\rm d}\alpha_{12}}{{\rm d}t} = \sum_{k,l,m,k',l',m'} \frac{\partial^3 \alpha_{12}}{\partial x_k \partial x_l \partial x_m} \frac{\partial \alpha_{kk'}}{\partial x_{l'}} \frac{\partial \alpha_{ll'}}{\partial x_{m'}} \frac{\partial \alpha_{mm'}}{\partial x_{k'}}, $$ where, by antisymmetry, the only nonzero coefficients are $\alpha_{12} = -\alpha_{21}$, so every nonvanishing term involves $\pm \alpha_{12}$. Explicitly, writing $u = \alpha_{12}$, $x_1 = x$, $x_2 = y$, we get $$u_t = u_{xxx}(u_y)^3 - u_{yyy}(u_x)^3 - 3u_{xxy}u_x(u_y)^2 + 3u_{xyy}(u_x)^2u_y.$$
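As a sanity check (not part of the argument), the reduction from the general sum to this explicit PDE can be verified with a short sympy computation; the names `u`, `a`, `S` below are ours, matching $u = \alpha_{12}$ and the six-fold sum above.

```python
# Verify that Kontsevich's general sum, specialized to d = 2 with
# alpha_{12} = u and alpha_{21} = -u, gives the explicit PDE for u_t.
import sympy as sp

x, y = sp.symbols('x y')
u = sp.Function('u')(x, y)
coords = {1: x, 2: y}

# Antisymmetric coefficient matrix of the bivector field.
a = {(1, 1): sp.Integer(0), (1, 2): u, (2, 1): -u, (2, 2): sp.Integer(0)}

# RHS of d(alpha_12)/dt: sum over k, l, m, k', l', m' in {1, 2}.
S = sp.Integer(0)
for k in (1, 2):
    for l in (1, 2):
        for m in (1, 2):
            for kp in (1, 2):
                for lp in (1, 2):
                    for mp in (1, 2):
                        S += (sp.diff(u, coords[k], coords[l], coords[m])
                              * sp.diff(a[(k, kp)], coords[lp])
                              * sp.diff(a[(l, lp)], coords[mp])
                              * sp.diff(a[(m, mp)], coords[kp]))

ux, uy = sp.diff(u, x), sp.diff(u, y)
expected = (sp.diff(u, x, 3) * uy**3 - sp.diff(u, y, 3) * ux**3
            - 3 * sp.diff(u, x, x, y) * ux * uy**2
            + 3 * sp.diff(u, x, y, y) * ux**2 * uy)
print(sp.simplify(S - expected))  # 0
```

Only 8 of the 64 terms survive (those with $k' = 3-k$, $l' = 3-l$, $m' = 3-m$), and collecting them reproduces the four monomials above with the stated signs.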

How do we proceed to show that the solution is a conjugation by a vector field?

Edit (17/06/15): Or is it the right-hand side of the equation that is a conjugation?

The Schouten bracket of a bivector $\alpha = \alpha^{12}(x) \partial_1 \wedge \partial_2$ and a vector $X = X^1\partial_1 + X^2\partial_2$ is a bivector: $$\begin{align*}[\![ \alpha, X ]\!] &= [\![\alpha, X^1\partial_1]\!] + [\![\alpha, X^2\partial_2]\!]\\ &=[\alpha^{12}(x)\partial_1, X^1\partial_1] \wedge \partial_2 - [\partial_2, X^1\partial_1] \wedge \alpha^{12}(x)\partial_1 \\ & \quad + [\alpha^{12}(x)\partial_1, X^2\partial_2] \wedge \partial_2 - [\partial_2, X^2\partial_2] \wedge \alpha^{12}(x)\partial_1\\ &=(\alpha^{12}(x)\partial_1X^1 - X^1\partial_1\alpha^{12}(x) - X^2\partial_2 \alpha^{12}(x) + \alpha^{12}(x)\partial_2X^2)\partial_1 \wedge \partial_2.\end{align*}$$

Putting $u = \alpha^{12}(x), f = X^1, g = X^2,$ this is: $$uf_x - fu_x - gu_y + ug_y = u(f_x + g_y) - fu_x - gu_y.$$
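This coefficient can be double-checked independently of the wedge-expansion above, using the standard coordinate formula for the Lie derivative of a bivector, $(\mathcal{L}_X \alpha)^{ij} = X^k \partial_k \alpha^{ij} - \alpha^{kj} \partial_k X^i - \alpha^{ik} \partial_k X^j$, together with the sign convention $[\![\alpha, X]\!] = -\mathcal{L}_X \alpha$ (which is an assumption on conventions, chosen to match the expansion above). A sympy sketch:

```python
# Recover the (1,2)-coefficient of [[alpha, X]] from the coordinate
# formula for the Lie derivative of a bivector on R^2, and compare it
# with u(f_x + g_y) - f u_x - g u_y.
import sympy as sp

x, y = sp.symbols('x y')
u = sp.Function('u')(x, y)
f = sp.Function('f')(x, y)
g = sp.Function('g')(x, y)
X = {1: f, 2: g}                 # vector field components X^1, X^2
coords = {1: x, 2: y}
alpha = {(1, 1): sp.Integer(0), (1, 2): u, (2, 1): -u, (2, 2): sp.Integer(0)}

# (L_X alpha)^{12} = X^k d_k alpha^{12} - alpha^{k2} d_k X^1 - alpha^{1k} d_k X^2.
lie12 = sum(X[k] * sp.diff(alpha[(1, 2)], coords[k])
            - alpha[(k, 2)] * sp.diff(X[1], coords[k])
            - alpha[(1, k)] * sp.diff(X[2], coords[k])
            for k in (1, 2))

bracket12 = -lie12               # sign convention [[alpha, X]] = -L_X alpha
expected = (u * (sp.diff(f, x) + sp.diff(g, y))
            - f * sp.diff(u, x) - g * sp.diff(u, y))
print(sp.simplify(bracket12 - expected))  # 0
```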

But this involves $u$'s without derivatives, whereas every term on the RHS above contains only derivatives of $u$, so we can't match the two directly.

The Jacobi identity is of no use either: every bivector field on $\mathbf{R}^2$ is Poisson, since $[\![\alpha, \alpha]\!]$ is a trivector field and every trivector field on $\mathbf{R}^2$ vanishes.

(Strikethrough on 02/07/15)