I ran newer versions of hivemind and hafbe, and hivemind died at the start of massive sync while catching up.
I've left things as-is on shed14 for diagnosis.
Here's the log:
[Entrypoint] 2023-11-29 23:26:49,596+00:00 INFO [global] (main) Hivemind arguments: --database-url=postgresql://hivemind@haf/haf_block_log sync
[Entrypoint] 2023-11-29 23:26:49,597+00:00 INFO [global] (main) Hivemind arguments: sync
[Entrypoint] 2023-11-29 23:26:49,597+00:00 INFO [global] (main) Using PostgreSQL instance: postgresql://hivemind@haf/haf_block_log
[Entrypoint] 2023-11-29 23:26:49,598+00:00 INFO [run_hive] (main) Starting Hivemind...
INFO - hive.conf:212 - The database instance is created...
INFO - hive.db.adapter:48 - A database offers maximum connections: 40. Required 15 connections.
INFO - hive.db.adapter:90 - Closing database connection: 'root'
INFO - hive.db.adapter:108 - Disposing SQL engine
INFO - hive.conf:261 - The database is disconnected...
INFO - hive.conf:212 - The database instance is created...
INFO - hive.indexer.sync:52 - Entering HAF mode synchronization
INFO - hive.db.db_state:44 - [MASSIVE] Welcome to hive!
INFO - hive.db.db_state:55 - [MASSIVE] Continue with massive sync...
INFO - hive.indexer.sync:67 - hivemind_version : 2.0.0dev1
INFO - hive.indexer.sync:68 - hivemind_git_rev : a084c16ac2d70c7220b0c77075d4dad5a46c87ee
INFO - hive.indexer.sync:69 - hivemind_git_date : 2023-11-29 06:17:50
INFO - hive.indexer.sync:71 - database_schema_version : 34
INFO - hive.indexer.sync:72 - database_patch_date : 2023-11-09 15:34:25.328418
INFO - hive.indexer.sync:73 - database_patched_to_revision : 9d2cc15bea71a39139abdf49569e0eac6dd0b970
INFO - hive.indexer.sync:75 - last_block_from_view : 80254073
INFO - hive.indexer.sync:76 - last_imported_block : 80254075
INFO - hive.indexer.sync:77 - last_completed_block : 80254075
INFO - hive.indexer.hive_db.haf_functions:37 - Trying to attach app context with block number: 80254075
INFO - hive.indexer.hive_db.haf_functions:39 - App context attaching done.
INFO - hive.indexer.sync:89 - Using HAF database as block data provider, pointed by url: 'postgresql://hivemind@haf/haf_block_log'
INFO - hive.indexer.sync:99 - Last imported block is: 80254075
INFO - hive.indexer.sync:186 - Querying for next block for app context...
INFO - hive.indexer.sync:188 - Next block range from hive.app_next_block is: <None:None>
INFO - hive.indexer.sync:99 - Last imported block is: 80254075
INFO - hive.indexer.sync:186 - Querying for next block for app context...
INFO - hive.indexer.sync:188 - Next block range from hive.app_next_block is: <80254076:80598732>
INFO - hive.indexer.sync:117 - target_head_block: 80598732
INFO - hive.indexer.sync:118 - test_max_block: None
INFO - hive.indexer.sync:122 - [MASSIVE] *** MASSIVE blocks processing ***
INFO - hive.db.db_state:278 - Dropping foreign keys
WARNING - hive.db.adapter:275 - [SQL-ERR] OperationalError in query ALTER TABLE hivemind_app.hive_mentions DROP CONSTRAINT IF EXISTS hive_mentions_fk2 ({})
INFO - hive.indexer.sync:71 - Exiting HAF mode synchronization
INFO - hive.db.adapter:90 - Closing database connection: 'root'
INFO - hive.db.adapter:108 - Disposing SQL engine
INFO - hive.conf:261 - The database is disconnected...
Traceback (most recent call last):
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1910, in _execute_context
    self.dialect.do_execute(
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
    cursor.execute(statement, parameters)
psycopg2.errors.DeadlockDetected: deadlock detected
DETAIL: Process 766 waits for ShareLock on transaction 2448471; blocked by process 1135.
Process 1135 waits for ShareLock on transaction 2447964; blocked by process 766.
HINT: See server log for query details.
CONTEXT: while deleting tuple (20,42) in relation "pg_class"
SQL statement "DROP TABLE hive.shadow_hivemind_app_hive_mentions"
PL/pgSQL function hive.on_edit_registered_tables() line 53 at EXECUTE
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/cli.py", line 87, in launch_mode
    sync.run()
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/indexer/sync.py", line 134, in run
    DbState.before_massive_sync(self._lbound, self._ubound)
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/db_state.py", line 281, in before_massive_sync
    drop_fk(cls.db())
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/schema.py", line 518, in drop_fk
    db.query_no_return(sql)
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/adapter.py", line 166, in query_no_return
    self._query(sql, **kwargs)
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/adapter.py", line 276, in _query
    raise e
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/adapter.py", line 269, in _query
    result = self._basic_connection.execution_options(autocommit=False).execute(query, **kwargs)
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1385, in execute
    return meth(self, multiparams, params, _EMPTY_EXECUTION_OPTS)
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 334, in _execute_on_connection
    return connection._execute_clauseelement(
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1577, in _execute_clauseelement
    ret = self._execute_context(
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1953, in _execute_context
    self._handle_dbapi_exception(
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 2134, in _handle_dbapi_exception
    util.raise_(
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 211, in raise_
    raise exception
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1910, in _execute_context
    self.dialect.do_execute(
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (psycopg2.errors.DeadlockDetected) deadlock detected
DETAIL: Process 766 waits for ShareLock on transaction 2448471; blocked by process 1135.
Process 1135 waits for ShareLock on transaction 2447964; blocked by process 766.
HINT: See server log for query details.
CONTEXT: while deleting tuple (20,42) in relation "pg_class"
SQL statement "DROP TABLE hive.shadow_hivemind_app_hive_mentions"
PL/pgSQL function hive.on_edit_registered_tables() line 53 at EXECUTE
[SQL: ALTER TABLE hivemind_app.hive_mentions DROP CONSTRAINT IF EXISTS hive_mentions_fk2]
(Background on this error at: https://sqlalche.me/e/14/e3q8)
During handling of the above exception, another exception occurred:
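One observation while this sits on shed14: `DeadlockDetected` is transient by nature. Postgres aborts one of the two waiting transactions, and the statement would likely succeed if reissued once the other transaction (process 1135, apparently dropping the shadow table from `hive.on_edit_registered_tables()`) has finished. The real fix is presumably to keep that writer from running concurrently with the FK drop, but a retry around the failing `drop_fk` step might at least let the sync proceed. A minimal sketch, assuming nothing about hivemind's internals (the wrapper and its names are hypothetical, not existing hivemind code):

```python
import time

# Postgres reports deadlocks with SQLSTATE 40P01; psycopg2 exposes it as the
# exception's .pgcode attribute, and SQLAlchemy wraps the psycopg2 error in an
# OperationalError whose .orig is the original DBAPI exception.
DEADLOCK_SQLSTATE = "40P01"

def run_with_deadlock_retry(fn, attempts=3, delay=1.0):
    """Call fn(); on a deadlock, back off and retry up to `attempts` times."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:
            pgcode = getattr(getattr(exc, "orig", exc), "pgcode", None)
            if pgcode != DEADLOCK_SQLSTATE or attempt == attempts:
                raise  # not a deadlock, or out of retries
            time.sleep(delay * attempt)  # linear backoff before retrying
```

Something like `run_with_deadlock_retry(lambda: drop_fk(cls.db()))` at the `before_massive_sync` call site would cover this particular failure without touching the SQL itself.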