This is one of those things that I always wind up writing a script to do, but I always wonder how a DBA would do it.
So you have a table with three fields: event_id, recorded, and occurrences.
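For concreteness, here is roughly what I mean – a minimal, hypothetical definition (the column types and the index are my assumptions; only the column names come from the description above):

CREATE TABLE event_log (
    event_id    INT UNSIGNED NOT NULL,            -- which counter this row belongs to
    recorded    DATETIME     NOT NULL,            -- when the 5-second sample was taken
    occurrences INT UNSIGNED NOT NULL DEFAULT 0,  -- count observed in that interval
    KEY idx_recorded (recorded)                   -- everything below filters on time
);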
These entries are recorded at 5-second intervals, but after 30 days you don’t need that level of granularity, so you want to aggregate the older rows down to one-hour granularity. An example – albeit one I know is terrible – might explain:
LOCK TABLES event_log WRITE;

-- Capture the cutoff once so the aggregate and the delete agree on it;
-- calling NOW() twice would let rows slip between the two statements.
SET @cutoff = DATE_SUB(NOW(), INTERVAL 30 DAY);

CREATE TEMPORARY TABLE t_ec_archive
SELECT event_id,
       DATE_FORMAT(recorded, '%Y-%m-%d %H:00') AS recorded,
       SUM(occurrences) AS occurrences
FROM event_log
WHERE recorded < @cutoff
GROUP BY 1, 2;  -- per event AND per hour, or the hours collapse together

DELETE FROM event_log WHERE recorded < @cutoff;

INSERT INTO event_log SELECT * FROM t_ec_archive;

UNLOCK TABLES;

DROP TABLE t_ec_archive;
There are several obvious things wrong with this approach – the long table lock on a big table being the worst – although it does the job.
I usually write a script rather than doing it all in SQL, accepting some burned CPU elsewhere in exchange for avoiding long table locks.
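For the record, a minimal sketch of that script approach, assuming MySQL with the mysql-connector-python driver – the connection details, the one-bucket-at-a-time batching, and the helper query for finding un-aggregated hours are all my own choices, not gospel:

# Hypothetical batched roll-up: process one hour bucket per transaction,
# so each statement touches few rows and locks are held only briefly.
# Assumes MySQL via mysql-connector-python; table and column names follow
# the example above, connection details are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="events"
)
cur = conn.cursor()

# Find hour buckets older than 30 days still holding raw 5-second rows:
# more than one row per event per hour means the bucket isn't rolled up yet.
cur.execute(
    """
    SELECT DATE_FORMAT(recorded, '%Y-%m-%d %H:00:00') AS bucket
    FROM event_log
    WHERE recorded < DATE_SUB(NOW(), INTERVAL 30 DAY)
    GROUP BY bucket
    HAVING COUNT(*) > COUNT(DISTINCT event_id)
    """
)
buckets = [row[0] for row in cur.fetchall()]

for bucket in buckets:
    # Aggregate this one hour of rows in memory...
    cur.execute(
        """
        SELECT event_id, SUM(occurrences)
        FROM event_log
        WHERE recorded >= %s AND recorded < %s + INTERVAL 1 HOUR
        GROUP BY event_id
        """,
        (bucket, bucket),
    )
    totals = cur.fetchall()

    # ...then swap the raw rows for the hourly rows in one short transaction.
    cur.execute(
        "DELETE FROM event_log "
        "WHERE recorded >= %s AND recorded < %s + INTERVAL 1 HOUR",
        (bucket, bucket),
    )
    cur.executemany(
        "INSERT INTO event_log (event_id, recorded, occurrences) "
        "VALUES (%s, %s, %s)",
        [(event_id, bucket, total) for event_id, total in totals],
    )
    conn.commit()  # release locks before moving on to the next bucket

cur.close()
conn.close()

Unlike the LOCK TABLES version, each bucket’s delete-and-insert runs in its own short transaction, so a crash partway through rolls back cleanly and nothing holds a lock on event_log for more than a moment.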