The Stationery Department, Government of Kerala, uses Postgres as the back-end of our e-Governance application. The application mainly handles stock accounts and file movements. We have faced two specific problems related to data storage.
1. Since it handles stock accounts, every change made in the database should be stored for future reference (a historical record of each row).
2. Notes written in Government files are very sensitive and must not be edited. This is addressed at the application level, but the question of direct database edits remains: what if someone persuades the DB admin to change a database entry?
We have solved these problems to an extent with some crude ideas.
We manage the audit footprints with triggers and local backup tables. Every sensitive table has a backup table that is structurally identical to the original, but has no indexes or keys and carries an additional operation column, an enum of 'u' (update) and 'd' (delete). A trigger fires on every update and delete, and the old row is inserted into the table's corresponding backup table. From the application we send the performing user's id with each update.
In the case of a delete, we first perform an update that stamps the row with the performing user's id and operation 'd', so even deleted records leave a complete footprint.
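The trigger semantics can be sketched as an in-memory simulation in Python (the table name `stock`, the column names `operation` and `changed_by`, and the two helper functions are illustrative assumptions, not our actual PL/pgSQL):

```python
import copy

def audited_update(table, backup, row_id, new_values, user_id):
    """Simulate the UPDATE trigger: copy the old row into the backup
    table, stamped with operation 'u' and the acting user's id,
    before applying the change to the live table."""
    old = copy.deepcopy(table[row_id])
    old.update({"operation": "u", "changed_by": user_id})
    backup.append(old)                  # old version preserved
    table[row_id].update(new_values)    # change applied to live table

def audited_delete(table, backup, row_id, user_id):
    """Simulate a delete: snapshot the row with operation 'd' and the
    user's id first, so even removed rows leave a complete footprint."""
    old = copy.deepcopy(table[row_id])
    old.update({"operation": "d", "changed_by": user_id})
    backup.append(old)
    del table[row_id]

# Usage: a tiny stock table with one row.
stock = {1: {"item": "A4 paper", "qty": 100}}
stock_backup = []

audited_update(stock, stock_backup, 1, {"qty": 80}, user_id="clerk7")
audited_delete(stock, stock_backup, 1, user_id="admin2")

print(stock)          # live table is now empty
print(stock_backup)   # full history: the 'u' and 'd' snapshots
```

In the real system the copy into the backup table happens inside a `BEFORE UPDATE OR DELETE` trigger, so the history is captured even if the application layer is bypassed.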
For the second issue we implemented a hash-verification method. The first record is hashed and the hash is stored in the row itself. The second row's hash is computed from the concatenation of the second row's data and the previous row's hash, and so on. Each row also stores the id of the next row, so the table is essentially a linked list secured with hash values.
When retrieving rows sequentially we recompute and check each hash; any mismatch triggers an alert.
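The chaining and verification can be sketched in Python with `hashlib` (a minimal sketch under simplifying assumptions: rows are kept in insertion order rather than followed via next-row ids, and the string serialization of a row is illustrative):

```python
import hashlib

def row_hash(row_data: str, prev_hash: str) -> str:
    # Hash the row's data concatenated with the previous row's hash,
    # so each hash commits to the entire chain before it.
    return hashlib.sha256((row_data + prev_hash).encode()).hexdigest()

def append_row(chain, row_data: str):
    prev = chain[-1]["hash"] if chain else ""   # first row has no predecessor
    chain.append({"data": row_data, "hash": row_hash(row_data, prev)})

def verify_chain(chain):
    """Recompute every hash sequentially; return the index of the first
    inconsistent row, or None if the chain is intact."""
    prev = ""
    for i, row in enumerate(chain):
        if row["hash"] != row_hash(row["data"], prev):
            return i          # the real system raises an alert here
        prev = row["hash"]
    return None

notes = []
append_row(notes, "Note 1: sanctioned 500 reams")
append_row(notes, "Note 2: issued to press")
print(verify_chain(notes))    # None: chain intact

notes[0]["data"] = "Note 1: sanctioned 900 reams"   # simulated tampering
print(verify_chain(notes))    # 0: alert on the edited row
```

Because each hash covers the previous hash, editing any row invalidates every hash from that row onward; an attacker who changes one note would have to recompute the rest of the chain to hide it.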
I want to present this as a case study and open a discussion to find the best practices for these problems.