Nowadays, JSON is used pretty much everywhere. It is not just web developers who like JSON: it is also used for configuration, data transfer, and a lot more. Luckily, PostgreSQL is pretty good at JSON. Recently I discovered a module called wal2json, which even allows the transformation of the transaction log (xlog) to JSON. The module can be found here: https://github.com/eulerto/wal2json
The module can be compiled just like any other contrib module. Once it has been installed, postgresql.conf can be adapted to allow replication slots:
wal_level = logical
max_replication_slots = 10
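Whether the new values are really in effect can be double-checked from SQL at any time. This is just a quick sanity check, not part of the original walkthrough:

test=# SHOW wal_level;
test=# SHOW max_replication_slots;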
After a database restart, a replication slot can be created:
test=# SELECT * FROM pg_create_logical_replication_slot('hans_slot', 'wal2json');
 slot_name | xlog_position
-----------+---------------
 hans_slot | 0/18DD268
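To confirm that the slot is really there, pg_replication_slots can be consulted (again just a quick check, not part of the original example):

test=# SELECT slot_name, plugin, slot_type, active FROM pg_replication_slots;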
Some data can be inserted to demonstrate how the replication slot works:
test=# CREATE TABLE t_data (id int, name text, payload int[]);
CREATE TABLE
test=# INSERT INTO t_data VALUES (1, 'hans', '{10, 20, 30}');
INSERT 0 1
test=# INSERT INTO t_data VALUES (2, 'paul', '{23, 49, 87}');
INSERT 0 1
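Before the changes are consumed, they can also be inspected without removing them from the slot. pg_logical_slot_peek_changes works just like the "get" variant used below, but leaves the changes in place (a small detour, not part of the original example):

test=# SELECT * FROM pg_logical_slot_peek_changes('hans_slot', NULL, NULL);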
When the data is dequeued, a perfect stream of JSON documents can be seen:
test=# SELECT * FROM pg_logical_slot_get_changes('hans_slot', NULL, NULL);
 location  | xid |                           data
-----------+-----+----------------------------------------------------------
 0/18DD2F0 | 993 | {                                                       +
           |     |     "xid": 993,                                         +
           |     |     "change": [
 0/18F9678 | 993 |     ]                                                   +
           |     | }
 0/18F96B0 | 994 | {                                                       +
           |     |     "xid": 994,                                         +
           |     |     "change": [
 0/18F96B0 | 994 |         {                                               +
           |     |             "kind": "insert",                           +
           |     |             "schema": "public",                         +
           |     |             "table": "t_data",                          +
           |     |             "columnnames": ["id", "name", "payload"],   +
           |     |             "columntypes": ["int4", "text", "_int4"],   +
           |     |             "columnvalues": [1, "hans", "{10,20,30}"]   +
           |     |         }
 0/18F9748 | 994 |     ]                                                   +
           |     | }
 0/18F9780 | 995 | {                                                       +
           |     |     "xid": 995,                                         +
           |     |     "change": [
 0/18F9780 | 995 |         {                                               +
           |     |             "kind": "insert",                           +
           |     |             "schema": "public",                         +
           |     |             "table": "t_data",                          +
           |     |             "columnnames": ["id", "name", "payload"],   +
           |     |             "columntypes": ["int4", "text", "_int4"],   +
           |     |             "columnvalues": [2, "paul", "{23,49,87}"]   +
           |     |         }
 0/18F9818 | 995 |     ]                                                   +
           |     | }
(8 rows)
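A couple of details are worth pointing out: xid 993 carries an empty "change" array because it belongs to the CREATE TABLE statement, and DDL is not decoded as row changes. xid 994 and 995 each contain exactly one change because every INSERT above ran in its own transaction. Several statements wrapped in an explicit transaction would end up in a single JSON document under one xid. A small sketch (not executed for the output shown above):

test=# BEGIN;
test=# INSERT INTO t_data VALUES (3, 'joe', '{1, 2, 3}');
test=# INSERT INTO t_data VALUES (4, 'anna', '{4, 5, 6}');
test=# COMMIT;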
The real beauty here is that all PostgreSQL JSON functions can be used to process the JSON stream. It is pretty easy to do ex-post analysis on the changes fetched from the xlog.
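As a sketch of what that could look like: the query below glues the per-row fragments shown above back together per transaction and then uses the jsonb machinery to list which tables were touched and how. It assumes that the fragments of one xid concatenate to a single valid JSON document and that the output column is still called "location", as in the example above:

WITH tx AS (
    -- reassemble the JSON fragments of each transaction
    SELECT xid, string_agg(data, '' ORDER BY location) AS doc
    FROM   pg_logical_slot_peek_changes('hans_slot', NULL, NULL)
    GROUP  BY xid
)
SELECT t.xid,
       c ->> 'kind'  AS kind,        -- insert / update / delete
       c ->> 'table' AS table_name
FROM   tx AS t,
       jsonb_array_elements(t.doc::jsonb -> 'change') AS c;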