psycopg2.extensions.TransactionRollbackError: could not serialize access due to concurrent update
I have an issue with TransactionRollbackError: could not serialize access due to concurrent update while updating a table. I'm using Postgres with Odoo v13, and the error occurs while updating the pool with a specific record set and context using the write() method. I'm migrating this code from Odoo v7 to v13, and the same code works in v7 with no issues. I see no syntax errors, yet I still get this error. I just want to understand: is this a bug in the version, or is it related to concurrent access to the data?
The following line of code is part of one of the functions:

    self.env.get('pk.status').browse(pk_id).with_context(audit_log=True).write(update_vals)

I have a model named pk.status with a write(self, update_vals) method; based on conditions, it has to run x1(update_vals), shown below:
    def x1(self, update_vals):
        product_pool = self.env.get('pk.product')
        if update_vals:
            if isinstance(update_vals, int):
                update_vals = [update_vals]
            for bs_obj in self.browse(update_vals).read(['End_Date']):
                product_ids = product_pool.search([('id_pk_status', '=', bs_obj['id']),
                                                   ('is_active', '=', 'Y')])
                if product_ids:
                    end_date = bs_obj['End_Date'] or date.today()
                    force_update = self._context.get('force_update', False)
                    product_ids.with_context(audit_log=True, force_update=force_update).write(
                        {'is_active': 'N', 'end_date': end_date})
The product_ids record set belongs to the 'pk.product' model, which has its own write(self, vals) method. As part of that write() and its conditions, it executes x2():
    def x2(self, vals, condition=None):
        try:
            status_pool = self.env.get('pk.status')
            product_pool = self.env.get('pk.product')
            result = False
            status_obj = status_pool.browse(vals['id_pk_status']).read()[0]
            product_obj = product_pool.browse(vals['id_pk_product']).read()[0]
            if not product_obj['end_date']:
                product_obj['end_date'] = date.today()
            extra_check = True
            if condition:
                statuses = (status_obj['Is_Active'], product_obj['is_active'])
                extra_check = statuses in condition
            if extra_check:
                result = True
                if isinstance(vals['start_date'], str):
                    vals['start_date'] = datetime.strptime(vals['start_date'], '%Y-%m-%d').date()
                if not (result and vals['start_date'] >= status_obj['Start_Date']):
                    result = False
        except Exception as e:
            traceback.print_exc()
        return result
The error occurs while executing the line:

    status_obj = status_pool.browse(vals['id_pk_status']).read()[0]
Complete Error:
    2020-08-09 15:39:11,303 4224 ERROR ek_openerp_dev odoo.sql_db: bad query: UPDATE "pk_status" SET "Is_Active"='N',"write_uid"=1,"write_date"=(now() at time zone 'UTC') WHERE id IN (283150)
    ERROR: could not serialize access due to concurrent update
    Traceback (most recent call last):
      File "/current/addons/models/pk_product.py", line 141, in x2()
        status_obj = status_pool.browse(vals['id_pk_status']).read()[0]
      File "/current/core/addons/nest_migration_utils/helpers/old_cr.py", line 51, in old_cursor
        result = method(*args, **kwargs)
      File "/current/odoo/odoo/models.py", line 2893, in read
        self._read(stored_fields)
      File "/current/odoo/odoo/models.py", line 2953, in _read
        self.flush(fields, self)
      File "/current/odoo/odoo/models.py", line 5419, in flush
        process(self.env[model_name], id_vals)
      File "/current/odoo/odoo/models.py", line 5374, in process
        recs._write(vals)
      File "/current/odoo/odoo/models.py", line 3619, in _write
        cr.execute(query, params + [sub_ids])
      File "/current/odoo/odoo/sql_db.py", line 163, in wrapper
        return f(self, *args, **kwargs)
      File "/current/odoo/odoo/sql_db.py", line 240, in execute
        res = self._obj.execute(query, params)
    psycopg2.extensions.TransactionRollbackError: could not serialize access due to concurrent update
I assume the "concurrent" in the error means I'm doing two write operations in a single thread, but I'm not sure about that. Hope this helps.
Solution 1:[1]
Each subprocess needs its own global connection to the database. If you are using a Pool, you can define a function that creates a global connection and cursor, and pass it to the initializer parameter. If you are instead using a Process object, I'd recommend you create a single connection and pass the data via queues or pipes.
Like Klaver said, it would be better if you were to provide code so as to get a more accurate answer.
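A minimal sketch of the Pool/initializer pattern described above. The stdlib's sqlite3 stands in for psycopg2 here so the sketch runs without a Postgres server; the point is the per-process connection, not the driver, and the connection string is an assumption you would replace with your own DSN:

```python
import multiprocessing
import sqlite3

# Per-process connection, created once by the Pool initializer and kept
# in a module-level global so every task in that worker reuses it.
_conn = None

def init_worker():
    global _conn
    # sqlite3 is a stand-in; with psycopg2 this line would be something
    # like psycopg2.connect(dsn), where dsn is an assumed connection string.
    _conn = sqlite3.connect(":memory:")

def square_via_sql(n):
    # Reuses this worker's own connection instead of sharing one
    # connection object across processes, which is not safe.
    return _conn.execute("SELECT ? * ?", (n, n)).fetchone()[0]

if __name__ == "__main__":
    with multiprocessing.Pool(processes=2, initializer=init_worker) as pool:
        print(pool.map(square_via_sql, [1, 2, 3]))  # [1, 4, 9]
```

Because the connection is created inside each worker (not inherited from the parent), no two processes ever share the same socket or cursor state.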
Solution 2:[2]
This happens when the transaction isolation level is set to SERIALIZABLE and two processes are trying to update the same rows.
If you choose to stay at the serializable isolation level, then you have to roll back and retry the transaction when this error occurs.
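A sketch of that rollback-and-retry loop. With psycopg2 you would catch psycopg2.extensions.TransactionRollbackError and call conn.rollback() before retrying; the stand-in exception below is an assumption so the sketch runs without a database:

```python
import time

class SerializationError(Exception):
    """Stand-in for psycopg2.extensions.TransactionRollbackError."""

def run_with_retry(txn, retries=5, backoff=0.01):
    # Re-run the whole transaction after a serialization failure.
    # txn() must be safe to repeat from scratch (with a real driver,
    # roll the connection back before retrying).
    for attempt in range(1, retries + 1):
        try:
            return txn()
        except SerializationError:
            if attempt == retries:
                raise  # give up after the last attempt
            time.sleep(backoff * attempt)  # simple linear backoff

# Demo: a "transaction" that is aborted twice before it succeeds.
attempts = []

def flaky_txn():
    attempts.append(1)
    if len(attempts) < 3:
        raise SerializationError("could not serialize access")
    return "committed"

print(run_with_retry(flaky_txn))  # committed
```

The key point is that the retry must restart the entire transaction, not just the failing statement: after a serialization failure the whole transaction's work is discarded by the database.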
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Sebastian Liendo |
| Solution 2 | Durja |