
Problem

After I import the data into PostgreSQL, I get the following error:

Could not execute JDBC batch update



Reason

We use the Hibernate ORM library to work with different databases in a uniform way. One of Hibernate's specifics on PostgreSQL is the way it generates IDs for new rows: it uses the database sequence "hibernate_sequence" to generate IDs for new rows in any table. When you move your JIRA backup to another instance, "hibernate_sequence" is left behind, since it is not part of JIRA and therefore not included in the backup. When you start using TGE on the new PostgreSQL instance, it creates a fresh "hibernate_sequence". The new sequence knows nothing about the last ID generated by "hibernate_sequence" on the old instance and starts counting from the beginning. Soon the generated IDs collide with IDs already used in some TGE tables, and that causes the error.
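As a rough illustration of the collision (the table and IDs below are hypothetical, not taken from a real instance), Hibernate obtains each new ID with a call equivalent to:

```sql
-- A freshly created sequence starts counting from 1:
create sequence hibernate_sequence;

-- Hibernate fetches the next ID roughly like this:
select nextval('hibernate_sequence');  -- 1, then 2, 3, ...

-- But the restored TGE table already contains rows with
-- IDs 1, 2, 3, ... so an insert using a freshly generated ID
-- violates the primary key constraint, which surfaces as
-- "Could not execute JDBC batch update".
```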

Workaround

All you have to do to fix this problem is reset "hibernate_sequence" to a value above the highest existing ID across all TGE tables.

First, find the biggest ID across all of the TGE tables you are using. Look up the "gd.tablename" property in your grid configuration; by default it is set to "actions". The real table name created by TGE for data storage will look like "actions_cXXXXX", where XXXXX is an arbitrary ID of the custom field context. Once you have found the table in the database, run the following query:

select max(id) from actions_c14381; 
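If you are not sure which TGE tables exist in your database, you can list the candidates from the catalog. This is a sketch assuming the default "actions" prefix; substitute your own "gd.tablename" value if you changed it:

```sql
-- List all tables whose names start with "actions_c"
-- (adjust the prefix to match your gd.tablename setting).
select table_name
from information_schema.tables
where table_name like 'actions_c%';
```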

Repeat this operation for all of the TGE tables in your system, then take the biggest ID among the query results. Let's say it is 470. Now reset "hibernate_sequence" to the number following 470 with this query:

alter sequence hibernate_sequence restart with 471;
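The two steps can also be combined into a single statement using PostgreSQL's setval() function, so you do not have to compute the "max + 1" value by hand. The table name actions_c14381 here is just the example from above; with several TGE tables you would still take the maximum across all of them:

```sql
-- Set the sequence so the next nextval() returns max(id) + 1.
-- The third argument 'false' means the stored value has not
-- been consumed yet, i.e. it will be returned as-is next time.
select setval('hibernate_sequence',
              (select max(id) from actions_c14381) + 1,
              false);
```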