Channel: TimescaleDB and PostgreSQL - Timescale Community Forum

Data corruption after using drop_chunks()


Within our monitoring stack we collect data with Telegraf and store it in a PostgreSQL database with TimescaleDB hypertables.

As the database had grown enormously, I wanted to delete all data from before 2024-01-01. A plain DELETE took too long, so I wanted to solve this efficiently with the drop_chunks() function. I found a suitable statement:

SELECT drop_chunks(
	format('%I.%I', hypertable_schema, hypertable_name)::regclass,
	older_than => date_trunc('year', CURRENT_DATE) - INTERVAL '1 year'
)
FROM timescaledb_information.hypertables;
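(In hindsight, the effect of such a statement can be previewed non-destructively with show_chunks(), which accepts the same older_than argument but only lists the matching chunks instead of dropping them. A sketch with the same argument as above:

SELECT show_chunks(
	format('%I.%I', hypertable_schema, hypertable_name)::regclass,
	older_than => date_trunc('year', CURRENT_DATE) - INTERVAL '1 year'
)
FROM timescaledb_information.hypertables;

Running this first would have shown exactly which chunks were about to go.)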

So far, so good. After that, I deleted unnecessary tags from the corresponding tag tables, for example:

DELETE FROM ase_blocker_tag AS t
WHERE NOT EXISTS (
	SELECT 1
	FROM ase_blocker
	WHERE tag_id = t.tag_id
);
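(A cautious variant of this cleanup is to count the orphaned tags first, using the same anti-join pattern with SELECT instead of DELETE:

SELECT count(*)
FROM ase_blocker_tag AS t
WHERE NOT EXISTS (
	SELECT 1
	FROM ase_blocker
	WHERE tag_id = t.tag_id
);

If the count looks plausible, the DELETE can follow.)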

Once that was done, I tried to VACUUM the database:

VACUUM FULL;

Damn!:

ERROR: pg_attribute catalog is missing 8 attribute(s) for relation OID 1380158
SQL state: XX000

The statement …

SELECT DISTINCT relname
FROM pg_class
WHERE NOT EXISTS (
	SELECT 1
	FROM pg_attribute
	WHERE attrelid = pg_class.oid
);

… returned over 5.8k rows, including many hypertables and chunks.
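(To see how many of the affected relations are actually TimescaleDB chunks, the catalog check can be joined against the timescaledb_information.chunks view. A sketch, assuming the view's current chunk_schema/chunk_name columns; untested against this broken catalog:

SELECT c.hypertable_schema, c.hypertable_name, count(*) AS broken_chunks
FROM pg_class AS pc
JOIN timescaledb_information.chunks AS c
	ON c.chunk_name = pc.relname
	AND c.chunk_schema = pc.relnamespace::regnamespace::text
WHERE NOT EXISTS (
	SELECT 1
	FROM pg_attribute AS a
	WHERE a.attrelid = pc.oid
)
GROUP BY 1, 2
ORDER BY 3 DESC;

This at least narrows the damage report down per hypertable.)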

What have I done wrong?

There is no backup, as this was not previously considered necessary for monitoring databases. I know: No backup, no mercy! We will learn from this.

But is it still possible to salvage something here and restore the missing relationships in another way? How can I at least recreate the hypertables?

Thanks a lot!

6 posts - 2 participants



