Change events streamed from a database by Debezium are (in developer parlance) strongly typed. This means that event consumers should be aware of the types of data conveyed in the events. This problem of passing along message type data can be solved in multiple ways:

- the message structure is passed out-of-band to the consumer, which is able to process the data stored in it
- the message contains metadata (the schema) that is embedded within the message
- the message contains a reference to a registry which contains the associated metadata

An example of the first case is Apache Kafka’s well-known JsonConverter. It can operate in two modes: with and without schemas. When configured to work without schemas, it generates a plain JSON message where the consumer either needs to know the types of each field beforehand, or needs to apply heuristic rules to "guess" and map values to datatypes. While this approach is quite flexible, it can fail for more advanced cases, e.g. temporal or other semantic types encoded as strings. Also, constraints associated with the types are usually lost.
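In terms of configuration, schema-less operation corresponds to disabling schemas on the converter. A minimal sketch of the relevant Kafka Connect worker properties (whether you set these per worker or per connector depends on your deployment):

```properties
# Use the plain JSON converter for keys and values,
# with the embedded schema turned off
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```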

Here’s an example of such a message:

{
  "before": null,
  "after": {
    "id": 1001,
    "first_name": "Sally",
    "last_name": "Thomas",
    "email": "sally.thomas@acme.com"
  },
  "source": {
    "version": "1.1.0.Final",
    "connector": "mysql",
    "name": "dbserver1",
    "ts_ms": 0,
    "snapshot": "true",
    "db": "inventory",
    "table": "customers",
    "server_id": 0,
    "gtid": null,
    "file": "mysql-bin.000003",
    "pos": 154,
    "row": 0,
    "thread": null,
    "query": null
  },
  "op": "c",
  "ts_ms": 1586331101491,
  "transaction": null
}

Note how no type information beyond JSON’s basic type system is present. For example, a consumer cannot conclude from the event itself which length the numeric id field has.
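This loss of type information is easy to demonstrate by deserializing such a schema-less event; here is an illustrative sketch in Python (not part of Debezium itself):

```python
import json

# A fragment of the change event from above, received without a schema
event = json.loads('{"after": {"id": 1001, "first_name": "Sally"}}')

id_value = event["after"]["id"]

# All the consumer can recover is JSON's generic number type; the original
# column definition (e.g. INT vs. BIGINT vs. NUMERIC) is not in the message.
print(type(id_value).__name__)  # prints "int"
```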

An example of the second case is again JsonConverter. By means of its schemas.enable option, the JSON message will consist of two parts: schema and payload. The payload part is exactly the same as in the previous case; the schema part contains a description of the message, its fields, field types and associated type constraints. This enables the consumer to process the message in a type-safe way. The drawback of this approach is that the message size increases significantly, as the schema is quite a large object. As schemas tend to change rarely (how often do you change the definitions of the columns of your database tables?), adding the schema to each and every event poses a significant overhead.
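Switching to this mode is a matter of flipping the same converter option (true is in fact the converter's default, so simply omitting the property has the same effect):

```properties
value.converter=org.apache.kafka.connect.json.JsonConverter
# Embed the schema alongside the payload in every message
value.converter.schemas.enable=true
```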

The following example of a message with a schema clearly shows that the schema itself can be significantly larger than the payload and is not very economical to use: