Flask - Resolve SQLAlchemy "Lost connection to server during query"


Let's say something like this is being returned.

sqlalchemy.exc.InterfaceError: (mariadb.InterfaceError) Lost connection to server during query

 

It may simply be the case that SQLAlchemy is opening a connection to the database but never closing it. The connection then sits idle until the MariaDB timeout is reached, which causes "Lost connection to server during query" to be returned. For example, the following function is missing db.session.close().

def example():
  data = mytable.query.filter(mytable.email.ilike(email)).one()
  data.timestamp = func.now()
  db.session.commit()
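
A corrected version might look something like this (a minimal sketch, assuming mytable, email, db, and func come from the surrounding application).

def example():
  data = mytable.query.filter(mytable.email.ilike(email)).one()
  data.timestamp = func.now()
  db.session.commit()
  # close the session so the connection is returned to the pool
  # instead of sitting open until the MariaDB timeout is reached
  db.session.close()

Wrapping the body in try/finally and calling db.session.close() in the finally block would also guarantee the session is closed if the query raises an exception.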

 

If issues persist, I almost always start by setting SQLALCHEMY_ECHO and SQLALCHEMY_RECORD_QUERIES to True so that I get verbose SQL output in the console.

app.config["SQLALCHEMY_ECHO"] = True
app.config["SQLALCHEMY_RECORD_QUERIES"] = True

 

Something like this could be returned.

2023-08-12 05:17:47,178 INFO sqlalchemy.engine.Engine BEGIN (implicit)
2023-08-12 05:17:47,186 INFO sqlalchemy.engine.Engine SELECT users.id AS users_id, users.email AS users_email FROM users WHERE lower(users.email) LIKE lower(?) LIMIT ?
2023-08-12 05:17:47,188 INFO sqlalchemy.engine.Engine [generated in 0.00224s] ('john.doe@example.com', 1)
2023-08-12 05:17:47,278 INFO sqlalchemy.engine.Engine ROLLBACK

 

And the "Lost connection to server during query" sequence may look something like this.

2023-08-12 05:18:48,484 INFO sqlalchemy.engine.Engine BEGIN (implicit)
2023-08-12 05:18:48,486 INFO sqlalchemy.engine.Engine SELECT users.id AS users_id, users.email AS users_email FROM users WHERE lower(users.email) LIKE lower(?) LIMIT ?
2023-08-12 05:18:48,491 INFO sqlalchemy.engine.Engine [cached since 61.31s ago] ('john.doe@example.com', 1)
2023-08-12 05:18:48,499 INFO sqlalchemy.pool.impl.QueuePool Invalidate connection <mariadb.connection connected to 'ec2-107-22-51-98.compute-1.amazonaws.com' at 0000023778900830> (reason: InterfaceError:Lost connection to server during query)
Traceback (most recent call last):
mariadb.InterfaceError: Lost connection to server during query

 

Notice in this example that after the SELECT statement, the following line shows the query was cached since 61.31 seconds ago, meaning the connection had sat idle for over a minute before being reused.

2023-08-12 05:18:48,491 INFO sqlalchemy.engine.Engine [cached since 61.31s ago] ('john.doe@example.com', 1)

 

If you have access to MariaDB, you can use the SHOW PROCESSLIST command to list the active connections.

~]$ mysql -e "show processlist \G"
*************************** 4. row ***************************
      Id: 11735
    User: john.doe
    Host: 10.13.59.27:38162
      db: mydb
 Command: Sleep
    Time: 20
   State:
    Info: NULL
Progress: 0.000

 

According to https://flask-sqlalchemy.palletsprojects.com/en/2.x/config/:

Certain database backends may impose different inactive connection timeouts, which interferes with Flask-SQLAlchemy’s connection pooling.

By default, MariaDB is configured to have a 600 second timeout. This often surfaces hard to debug, production environment only exceptions like 2013: Lost connection to MySQL server during query.

If you are using a backend (or a pre-configured database-as-a-service) with a lower connection timeout, it is recommended that you set SQLALCHEMY_POOL_RECYCLE to a value less than your backend’s timeout.

 

If you are using a MariaDB or MySQL database, the following commands can be used to return the session and global timeout variables.

  • Session variables only affect the current session.
  • Global variables only affect new sessions, i.e. if you are already connected, you will need to disconnect and reconnect for the changes to take effect in your session.

mysql -e "SHOW SESSION VARIABLES LIKE '%timeout%'"
mysql -e "SHOW GLOBAL VARIABLES LIKE '%timeout%'"

 

Something like this should be returned. A value of 0 (zero) means no timeout.

~]$ mysql -e "SHOW SESSION VARIABLES LIKE '%timeout%'"
Variable_name   Value
connect_timeout                       10
deadlock_timeout_long                 50000000
deadlock_timeout_short                10000
delayed_insert_timeout                300
idle_readonly_transaction_timeout     0
idle_transaction_timeout              0
idle_write_transaction_timeout        0
innodb_flush_log_at_timeout           1
innodb_lock_wait_timeout              50
innodb_rollback_on_timeout            OFF
interactive_timeout                   28800
lock_wait_timeout                     86400
net_read_timeout                      30
net_write_timeout                     60
rpl_semi_sync_master_timeout          10000
rpl_semi_sync_slave_kill_conn_timeout 5
slave_net_timeout                     60
thread_pool_idle_timeout              60
wait_timeout                          28800

 

I ran into this issue because thread_pool_idle_timeout was set to 60 (seconds). I was running MariaDB in Docker, so I added the following directive to my 50-server.cnf file and then restarted my MariaDB Docker container.

thread_pool_idle_timeout = 900
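
In a default MariaDB configuration, the directive typically goes under the [mysqld] section of the file, something like this.

[mysqld]
# raise the thread pool idle timeout from the 60 second default
thread_pool_idle_timeout = 900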

 

You can also try setting the following connection pool options. You may need to adjust the values for your environment.

app.config['SQLALCHEMY_POOL_SIZE'] = 100
app.config['SQLALCHEMY_POOL_RECYCLE'] = 30
app.config['SQLALCHEMY_POOL_TIMEOUT'] = 30
app.config['SQLALCHEMY_POOL_PRE_PING'] = True
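
Note that in Flask-SQLAlchemy 2.4 and later, the individual SQLALCHEMY_POOL_* keys are deprecated in favor of SQLALCHEMY_ENGINE_OPTIONS, which passes the options through to SQLAlchemy's create_engine. An equivalent configuration would look something like this.

app.config['SQLALCHEMY_ENGINE_OPTIONS'] = {
    'pool_size': 100,       # maximum number of persistent connections in the pool
    'pool_recycle': 30,     # recycle connections after 30 seconds, before the server-side timeout
    'pool_timeout': 30,     # seconds to wait for a connection from the pool
    'pool_pre_ping': True,  # test each connection for liveness on checkout
}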

 

If you are using create_engine directly, you can try setting isolation_level to READ UNCOMMITTED.

engine = create_engine(uri, isolation_level="READ UNCOMMITTED")
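
The pool options can also be passed directly to create_engine, for example (the URI is an assumption for illustration).

from sqlalchemy import create_engine

uri = "mariadb+mariadbconnector://john.doe:itsasecret@localhost/mydb"  # assumed URI
engine = create_engine(
    uri,
    isolation_level="READ UNCOMMITTED",
    pool_recycle=30,     # recycle connections before the server-side timeout
    pool_pre_ping=True,  # verify the connection is alive before each checkout
)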

 

By default, the MySQL / MariaDB log_warnings level should be 2.

~]# mysql -e "select @@log_warnings"
@@log_warnings
2

 

At log_warnings level 2, when "Lost connection to server during query" occurs, the MySQL / MariaDB error log could contain something like this.

2023-08-14 10:18:26 5690 [Warning] Aborted connection 5690 to db: 'mytable' user: 'john.doe' host: '172.31.19.227' (Got an error reading communication packets)

 

You can try setting log_warnings to 9 to see if anything additional gets logged.

~]# mysql -e "SET GLOBAL log_warnings=9"

 



