'Lost Connection to MySQL server at '<server name>:<port>', system error: 10053 An established connection was aborted

Here is the part of my Python code that inserts data retrieved from SharePoint into a MySQL database; it runs every 10 minutes.

cnx = mysql.connector.connect(user='user', password='pwd', host='CP-MSQL03',port='3306',database='datacenter')
cnx.autocommit=True
MySQL_cursor = cnx.cursor()


r = requests.get(url, headers=headers, params=params)
delete_data = []
for item in r.json()['d']['results']:
    id = item['Id']
    lot_No = item['LOT_No']
    start = datetime.strptime(item['START'], '%Y-%m-%dT%H:%M:%SZ')
    lbs = item['LBS']
    line_NAV = item['Line_No']['NAV_x0020_HS_x0020_LINE']
    rpo_No = item['RPO_No']['RPO_No']
    item_NO = item['RPO_No']['Item_No']
    mrt_No = item['RPO_No']['MRT_No']
    SQL_Insert = (
            "INSERT INTO datacenter.mrt_consumption_archive (Line_No, RPO_No, Item_No, MRT_No, LOT_No, Start_Time, LBS) "
            "VALUES('%s', '%s', '%s', '%s', '%s', '%s', %s);" % (
                line_NAV, rpo_No, item_NO, mrt_No, lot_No, start, lbs))
    MySQL_cursor.execute(SQL_Insert)
    delete_data.append(id)

And the error quoted at the top is what I got after the script had run successfully for a few hours.

My question is: why do I get this error? Is it a firewall issue? A timeout setting issue? How can I troubleshoot it? And why do I keep getting the same error on every retry after it fails the first time?



Solution 1:[1]

Connections drop; it happens. A firewall, NAT-enabled router, or similar device may be making it happen more often than it should, but either way it's not something you want your program to crash over.

So, in general, before running your query, your program should test the connection and catch any connection exceptions; when one is caught, re-establish the connection. Luckily this is a familiar concept called connection pooling, and it's already available in the connector.

Excerpted from https://dev.mysql.com/doc/connector-python/en/connector-python-connection-pooling.html :

To create a connection pool implicitly: Open a connection and specify one or more pool-related arguments (pool_name, pool_size). For example:

dbconfig = {
  "database": "test",
  "user":     "joe"
}

cnx = mysql.connector.connect(pool_name = "mypool",
                              pool_size = 3,
                              **dbconfig)
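The catch-and-reconnect idea above can be sketched as a small retry helper. This is a minimal sketch: `execute_with_retry`, `run_query`, and `reconnect` are illustrative names you supply, not library API. With mysql-connector-python you would catch `mysql.connector.errors.OperationalError` instead of the stand-in `ConnectionError`, and `reconnect` could be `lambda: cnx.ping(reconnect=True)`.

```python
import time

def execute_with_retry(run_query, reconnect, attempts=3, delay=1):
    """Call run_query(); on a connection error, call reconnect() and retry.

    run_query and reconnect are caller-supplied callables (hypothetical
    names). ConnectionError here stands in for the connector's
    OperationalError, so the sketch runs without a database.
    """
    for attempt in range(attempts):
        try:
            return run_query()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries: let the caller see the failure
            reconnect()           # e.g. cnx.ping(reconnect=True)
            time.sleep(delay)     # brief pause before retrying
```

Wrapping each scheduled run this way means a single dropped connection costs you one reconnect instead of a crash, and the subsequent retries use a fresh connection rather than the dead one (which is why the original script kept failing on every retry).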

If you just want to keep the connections open, you could also set a short keepalive interval, which may be all the workaround you need if the problem is an unreliable network device purging your connections from its state tables. But if you can get the network problem fixed, that is a better route than customizing connection settings.

Solution 2:[2]

If your script ran successfully for a few hours, the cause could be a parameter of your database configuration. Review your configuration with this statement:

SHOW GLOBAL VARIABLES WHERE variable_name LIKE '%time%';

It may be too many connections, or a session timeout expiring, but I'm pretty sure the problem is on your MySQL server.
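To spot which timeout might be cutting the connection, you can scan those rows for suspiciously short values. A minimal sketch, assuming the rows come back as `(name, value)` string tuples from `cursor.fetchall()`; `short_timeouts` is an illustrative helper, not a library function:

```python
def short_timeouts(rows, threshold=600):
    """Given (variable_name, value) rows from the SHOW GLOBAL VARIABLES
    query above, return the *_timeout variables (values in seconds)
    below threshold. For comparison, wait_timeout defaults to 28800
    (8 hours), so a much smaller value is worth investigating."""
    return {
        name: int(value)
        for name, value in rows
        if name.endswith("timeout") and value.isdigit() and int(value) < threshold
    }
```

For example, `short_timeouts(cursor.fetchall())` right after executing the statement above would surface a `wait_timeout` or `interactive_timeout` short enough to kill an idle 10-minute-interval connection.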

Solution 3:[3]

I got the same ConnectionAbortedError and the resulting OperationalError while attempting to write a dataframe to a MySQL table.

df.to_sql('table_name', db_conn, if_exists='append', index=False)

Technically, the above operation is similar to yours:

SQL_Insert = (
        "INSERT INTO datacenter.mrt_consumption_archive (Line_No, RPO_No, Item_No, MRT_No, LOT_No, Start_Time, LBS) "
        "VALUES('%s', '%s', '%s', '%s', '%s', '%s', %s);" % (
            line_NAV, rpo_No, item_NO, mrt_No, lot_No, start, lbs))
MySQL_cursor.execute(SQL_Insert)

The MySQL documentation suggests that increasing connect_timeout might help:

Increasing the connect_timeout value might help if clients frequently encounter errors of the form Lost connection to MySQL server at 'XXX', system error: errno.

That did not work for me, though. I resolved the error by specifying the chunksize parameter:

df.to_sql('table_name', db_conn, if_exists='append', index=False, chunksize=2000)
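chunksize makes to_sql send the frame in batches of 2000 rows instead of one huge statement, so no single round trip is long enough to be cut off. Roughly the same batching can be done by hand with a DB-API cursor; a minimal sketch, where the `chunked` helper is illustrative:

```python
def chunked(rows, size):
    """Yield successive slices of at most `size` rows, mirroring the
    batching that to_sql's chunksize parameter requests."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# Illustrative use with a DB-API cursor (table and columns hypothetical):
# for batch in chunked(all_rows, 2000):
#     cursor.executemany(
#         "INSERT INTO t (a, b) VALUES (%s, %s)", batch)
```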

Try:

  • increasing connect_timeout
  • specifying and increasing pool_size in your database connection.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1:
Solution 2: Charly
Solution 3: Banty