hive issues
https://gitlab.syncad.com/groups/hive/-/issues (2024-03-08T12:27:35Z)

https://gitlab.syncad.com/hive/block_explorer_ui/-/issues/218
Remove "," from block number inputs (2024-03-08T12:27:35Z, Jakub Lachór, assignee: Piotr Berezka)

All inputs that accept numbers should be resistant to localization (e.g. locale-specific thousands separators).

https://gitlab.syncad.com/hive/haf/-/issues/215
Examine ideas for APIs and frontend libraries to improve caching performance (2024-03-13T23:01:55Z, Dan Notestein)

Current APIs, as used, aren't great for caching. For example, asking for the last 1000 operations in the blockchain wouldn't match a similar request made 3s earlier by another client. And subsequent requests to fetch older data will also not generate cache hits between the clients (i.e. if we ask for the next 1000 operations that occurred before the first request).
But at the cost of two calls for the first fetch, plus an assumption about a reasonable amount of data to fetch per call, we could develop libraries that make calls to nodes in a way that is efficiently cacheable.
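As an illustrative sketch of the boundary-aligned fetching described here (the function name and page size are hypothetical, not part of any existing library):

```python
def cache_aligned_ranges(head, page_size=100):
    """Yield (upper, lower) operation ranges for paged fetches.

    Only the first range depends on the exact head seen by this client;
    every later range falls on page_size-aligned boundaries, so requests
    from different clients hit the same cache entries.
    """
    upper = head
    lower = (head // page_size) * page_size
    while lower > 0:
        yield (upper, lower)
        upper = lower
        lower = upper - page_size
    yield (upper, 0)

# A client whose node reports 1,000,132 operations first fetches the
# unaligned tail, then continues on shared 100-aligned boundaries:
ranges = list(cache_aligned_ranges(1_000_132))
# ranges[0] == (1000132, 1000100); ranges[1] == (1000100, 1000000)
```

Only the first element of `ranges` is client-specific; all subsequent requests are identical across clients and therefore cacheable.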
For example, assume we had a chain that currently had 1,000,132 operations at the moment of the call, and we think it is reasonable to fetch 100 items at a time. We could make a library function that asks for the most recent operations back to the nearest multiple of 100. This call would return items from 1,000,132 to 1,000,100. This first call wouldn't generate a hit with a call made 3s later from another client, of course (the 2nd client might, for example, end up fetching 1,000,150 to 1,000,100). But any further calls made by either client to get more data would generate cache hits between each other. In other words, both would fetch 1,000,100 to 1,000,000 next. With many clients using a library that fetches in this manner, cache hits would increase dramatically.

https://gitlab.syncad.com/hive/hive-renderer/-/issues/17
HtmlDomParser incorrectly builds image and link state (2024-03-06T23:14:12Z, Bartłomiej Górnicki)

After parsing the text, HtmlDomParser builds state that tracks images and links. It incorrectly uses a hardcoded YouTube regex and ignores other embeds.

https://gitlab.syncad.com/hive/haf_block_explorer/-/issues/70
CI rewrite (2024-03-12T08:21:32Z, Konrad Botor)

CI needs to be rewritten.
- [ ] Test jobs' code needs to be more unified - especially *after_script* sections (ideally one for all jobs).
- [ ] Fix the Compose project name (should be haf_be, but is btracker).
- [ ] Possible updates after hive#648 (although in current form it should *just work* after *copy_data.sh* execution is removed from the entrypoint).
- [ ] Move *psql* Docker image to hive/common-ci-configuration>. Rewrite Composefiles. See also balance_tracker#16.
- [ ] Check if HAF replay and block_log.artifacts are generated in every test job. If so, make sure they aren't.
- [ ] Turn off logging `docker pull`.
- [ ] Make the *Running tests...* section uncollapsed by default.
- [ ] Investigate the lack of logs if HAF startup fails.
- [ ] Move larger embedded Bash scripts to separate files.
- [ ] Make a template that can be reused in other projects.
- [ ] Implement #72
See also changes in fddb083f75aa3f6a2dd6607589a4ddd90e8d34a3.

https://gitlab.syncad.com/hive/hive-renderer/-/issues/16
Update RendererOptions interface to simplify configuration (2024-03-13T14:42:13Z, Bartłomiej Górnicki)

To make the renderer API simpler and easier to use, we should make most of the options optional, with default values used when they are not provided.

https://gitlab.syncad.com/hive/clive/-/issues/163
Refactor savings screen (2024-03-06T07:17:53Z, Jakub Ziebinski)

Some elements on the savings screen can be refactored with custom clive widgets. The two most important:
1. `CliveCheckerboardTable` -> we can use this widget to create a table of pending transfers from savings.
2. `CliveDataTable` -> this widget can be used to display savings balances.

Milestone: MVP - Minimum Viable Product

https://gitlab.syncad.com/hive/hive-renderer/-/issues/15
Link attributes `rel` and `target` incorrectly depend on `isLinkSafeFn` (2024-03-06T21:57:42Z, Bartłomiej Górnicki)

Those attributes should be assigned to all external links, not only "phishy" ones.

https://gitlab.syncad.com/hive/hive-renderer/-/issues/14
Default localization options should depend on baseUrl (2024-03-05T21:59:14Z, Bartłomiej Górnicki)

The default localization message `externalLink` uses `example.com`; it should use the `baseUrl` provided when initializing the renderer.
Please improve the documentation on how to provide custom localization to the renderer for other languages.

https://gitlab.syncad.com/hive/hive-renderer/-/issues/13
Add `dir` attribute to every paragraph to support right-to-left languages (2024-03-05T15:39:53Z, Bartłomiej Górnicki)

https://gitlab.syncad.com/hive/imagehoster/-/issues/12
Change default format to webp (2024-03-04T23:08:09Z, Damian Janus)

https://gitlab.syncad.com/hive/haf/-/issues/214
Support haf-app based lite accounts (2024-03-28T17:15:54Z, Dan Notestein)

This is probably best done as a HAF app, but creating the issue here for now.
Purpose: allow people to use Hive (or other Hive-based apps, like some games) without the need to create a Hive account.
Initial design goals/functionality for further discussion:
1. App code reuse. It should be possible for 3rd-party apps to embed this app's code.
Reuse should be done by explicit integration of the Lite-Account app into the parent application's code, exposing its functionality together with the parent app's other services (e.g. provided via REST).
1. When configuring the app, a regular Hive account should be designated as the bridge to the Hive blockchain, enabling interaction with Hive.
1. Lite account identifiers should be unique; to make this possible, we need to introduce namespaces.
1. The bridge account name should be used as the namespace holding account identifiers.
1. Only the account creation operation is limited to the namespace matching the bridge account. All other operations (e.g. property modification) are accepted regardless of whether the account namespace matches the bridge account, and are confirmed only by matching the public key associated with the LA.
1. A Lite-Account should have an associated public key, used to authorize actions performed by the LA (e.g. changing its properties).
1. Lite-Account properties can be created/modified/deleted by any bridge account when the LA public key matches.
1. All actions specific to Lite-Accounts should be performed using a custom operation (custom_binary/custom/custom_json). The internal action embedded in the custom operation should be signed with the LA private key to prove that the given account scheduled the given action. The whole Hive transaction containing the custom operation should be signed by the bridge account, to satisfy Hive protocol requirements.
1. If a Lite-Account wants to perform regular Hive activity (e.g. vote or write a post), this could be done by preparing (inside a single transaction) a custom operation containing the required Hive action signed with the LA private key, followed by the regular Hive operation, with the whole transaction finally signed by the bridge account.
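The double-signing scheme in the last two points might look roughly like this. Everything here is a hypothetical placeholder (payload fields, the `lite_accounts` custom_json id, the stub signer); real signing would use ECDSA keys per the Hive protocol:

```python
import hashlib
import json

def build_lite_account_op(bridge_account, lite_name, action, la_sign):
    """Sketch: wrap an LA action, signed with the lite account's key,
    inside a custom_json operation that the bridge account will sign
    at the Hive-transaction level."""
    inner = {
        # namespaced identifier: the bridge account name acts as the namespace
        "account": f"{bridge_account}/{lite_name}",
        "action": action,
    }
    digest = hashlib.sha256(json.dumps(inner, sort_keys=True).encode()).hexdigest()
    return [
        "custom_json",
        {
            "required_auths": [],
            "required_posting_auths": [bridge_account],  # bridge signs the tx
            "id": "lite_accounts",  # placeholder app id
            "json": json.dumps({"inner": inner, "la_signature": la_sign(digest)}),
        },
    ]

# stub signer standing in for signing with the LA private key
op = build_lite_account_op("bridge", "alice", {"type": "set_property", "k": "bio"},
                           la_sign=lambda d: "sig(" + d[:8] + ")")
```

A verifier would recompute the digest of `inner` and check `la_signature` against the public key stored for `bridge/alice`, independently of which bridge account carried the transaction.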
The above solution will allow distinguishing an LA's direct Hive activity from bridge account operations.

Milestone: Post-1.27.5

https://gitlab.syncad.com/hive/hive/-/issues/665
Class `fc::ofstream` should use only one type of file mode; currently two types are used: `std::ios_base::openmode` and `fc::ofstream::mode` (2024-03-25T07:56:46Z, Mariusz Trela)

https://gitlab.syncad.com/hive/haf/-/issues/213
Create a prototype for a super simple HAF that manages forks just using nested transactions (2024-03-27T19:59:52Z, Dan Notestein)

Today, I was thinking about whether we could create a super simple version of HAF that might be more suitable for a blockchain with OBI support, and this is the idea I came up with:
During live sync, each block would start a new nested transaction, and all "reversible" apps would run in the same transaction that adds the block data. The simplest mechanism would be to fill a table with calls to all the app "process_block" procedures that should be run by the transaction.
Whenever blocks are orphaned, the nested transactions (and hence the state of all reversible apps) would be rolled back to remove the impacts of all orphaned blocks on the reversible apps. One issue I need to look up is whether nested transactions allow committing the impacts of the earlier parts of the nested transaction (i.e. can we commit the early parts of the transaction).
Irreversible apps could continue to run in separate threads/transactions as they would only be working on committed blocks and don't need rollbacks.
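The per-block nesting and fork rollback described above can be sketched with savepoints (shown with Python's stdlib SQLite driver for brevity; PostgreSQL subtransactions are likewise savepoint-based, and the table and savepoint names here are illustrative, not actual HAF schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE app_state (block_num INTEGER, data TEXT)")

conn.execute("BEGIN")
for block_num in (101, 102, 103):
    # each reversible block opens a savepoint; every registered app's
    # process_block procedure would run inside it
    conn.execute(f"SAVEPOINT block_{block_num}")
    conn.execute("INSERT INTO app_state VALUES (?, ?)", (block_num, f"b{block_num}"))

# a fork orphans blocks 102 and 103: rolling back to the savepoint taken
# before block 102 undoes its effects and those of every later block
conn.execute("ROLLBACK TO block_102")
conn.execute("COMMIT")

surviving = [row[0] for row in conn.execute("SELECT block_num FROM app_state")]
# surviving == [101]
```

On the commit question raised above: in PostgreSQL, RELEASE SAVEPOINT merges a subtransaction's changes into its parent, but nothing becomes durable until the top-level transaction commits, so early blocks could only be truly committed by ending the outer transaction and starting a new one.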
This super simple methodology assumes for now that reversible apps would be replayed at the same time as HAF itself, but that isn't strictly necessary: transaction nesting would only need to start once the last irreversible block was reached.

Milestone: Post-1.27.5

https://gitlab.syncad.com/hive/hivemind/-/issues/232
Hivemind crashed on startup on "clean" system (2024-03-01T22:52:02Z, Dan Notestein)

Reported by mahdiyari:
```
What I did:
clone haf_api_node
create zfs datasets
edit .env to --replay-blockchain
copied over block_log and artifacts to blockchain folder
docker compose up -d
haf is replaying just fine
```
```
docker-ps-a.txt:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
840e41b152f3 registry.hive.blog/hivemind/instance:v1.27.5rc7 "/home/hivemind/dock…" 13 minutes ago Created haf-world-hivemind-server-1
64153c123f64 registry.gitlab.syncad.com/hive/haf_api_node/postgrest:latest "/bin/postgrest" 13 minutes ago Up 13 minutes (healthy) haf-world-hafah-postgrest-1
82cf6cdeee0e registry.hive.blog/jussi:latest "python -m jussi.ser…" 13 minutes ago Exited (0) 11 minutes ago haf-world-jussi-1
8d37f7ae33b2 registry.hive.blog/hivemind/instance:v1.27.5rc7 "/home/hivemind/dock…" 13 minutes ago Exited (1) 11 minutes ago haf-world-hivemind-block-processing-1
734728a5e946 ankane/pghero:v3.3.3 "/bin/sh -c 'puma -C…" 13 minutes ago Up 13 minutes 8080/tcp haf-world-pghero-1
19124fe9acfd registry.hive.blog/hafah/setup:v1.27.5rc7 "/hafah/scripts/setu…" 13 minutes ago Exited (0) 11 minutes ago haf-world-hafah-install-1
da6309a1e8db varnish:7.3.0-alpine "/usr/local/bin/dock…" 13 minutes ago Up 13 minutes (healthy) 80/tcp, 8443/tcp haf-world-varnish-1
26d0e42e8ff9 redis:7.2-alpine "docker-entrypoint.s…" 13 minutes ago Up 13 minutes (healthy) 6379/tcp haf-world-redis-1
e02b34cae4ad registry.hive.blog/haf/minimal-instance:v1.27.5rc7 "/home/haf_admin/doc…" 13 minutes ago Up 13 minutes (healthy) 8091/tcp haf-world-haf-1
20fd71daf9ac registry.hive.blog/haf_api_node/version-display:latest "docker-entrypoint.s…" 13 minutes ago Up 13 minutes haf-world-version-display-1
eece23cdb395 registry.hive.blog/haf_api_node/caddy:latest "caddy run --config …" 13 minutes ago Up 13 minutes (healthy) 0.0.0.0:80->80/tcp, :::80->80/tcp, 0.0.0.0:443->443/tcp, :::443->443/tcp, 0.0.0.0:443->443/udp, :::443->443/udp, 2019/tcp haf-world-caddy-1
1cfbe3c82a0f dpage/pgadmin4:8.3 "/entrypoint.sh" 13 minutes ago Up 13 minutes 80/tcp, 443/tcp haf-world-pgadmin-1
69320ef8c6ed registry.hive.blog/haf_api_node/haproxy-healthchecks:latest "/docker_entrypoint.…" 13 minutes ago Up 13 minutes haf-world-haproxy-healthchecks-1
6b2bb05d2786 haproxy:2.9.3-alpine "docker-entrypoint.s…" 13 minutes ago Up 13 minutes (healthy)
```
This is hivemind log:
```
[Entrypoint] 2024-03-01 21:43:35,099+00:00 INFO [global] (main) Parameters passed directly to Hivemind docker entrypoint: sync --database-url=postgresql://hivemind@haf/haf_block_log --database-admin-url=postgresql://haf_admin@haf/haf_block_log --install-app
[Entrypoint] 2024-03-01 21:43:35,100+00:00 INFO [global] (main) Collected Hivemind arguments: sync
[Entrypoint] 2024-03-01 21:43:35,100+00:00 INFO [global] (main) Using PostgreSQL instance: postgresql://hivemind@haf/haf_block_log
[Entrypoint] 2024-03-01 21:43:35,101+00:00 INFO [global] (main) Using PostgreSQL Admin URL: postgresql://haf_admin@haf/haf_block_log
[Entrypoint] 2024-03-01 21:43:35,101+00:00 INFO [global] (main) Running install_app step because it was requested via the --install-app argument
[Entrypoint] 2024-03-01 21:43:35,102+00:00 INFO [setup] (main) Setting up the database...
./setup_postgres.sh parameters: --postgres-url=postgresql://haf_admin@haf/haf_block_log
/home/hivemind/app/../haf/scripts/create_haf_app_role.sh parameters: --postgres-url=postgresql://haf_admin@haf/haf_block_log --haf-app-account=hivemind
postgresql://haf_admin@haf/haf_block_log
perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
LANGUAGE = (unset),
LC_ALL = (unset),
LANG = "en_US.UTF-8"
are supported and installed on your system.
perl: warning: Falling back to the standard locale ("C").
DO $$
BEGIN
BEGIN
CREATE ROLE hivemind WITH LOGIN INHERIT IN ROLE hive_applications_owner_group;
EXCEPTION WHEN DUPLICATE_OBJECT THEN
RAISE NOTICE 'hivemind role already exists';
END;
END
$$;
DO
Attempting to supplement definition of hivemind builtin roles...
perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
LANGUAGE = (unset),
LC_ALL = (unset),
LANG = "en_US.UTF-8"
are supported and installed on your system.
perl: warning: Falling back to the standard locale ("C").
GRANT ROLE
./install_app.sh parameters: --postgres-url=postgresql://haf_admin@haf/haf_block_log
perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
LANGUAGE = (unset),
LC_ALL = (unset),
LANG = "en_US.UTF-8"
are supported and installed on your system.
perl: warning: Falling back to the standard locale ("C").
CREATE EXTENSION
psql:/home/hivemind/app/install_app.sql:19: NOTICE: Disabling a JIT optimization on the current database level...
DO
[Entrypoint] 2024-03-01 21:43:35,221+00:00 INFO [run_hive] (main) Starting Hivemind...
INFO - hive.conf:212 - The database instance is created...
INFO - hive.db.adapter:49 - A database offers maximum connections: 100. Required 15 connections.
INFO - hive.db.adapter:98 - Closing database connection: 'root'
INFO - hive.db.adapter:109 - Disposing SQL engine
INFO - hive.conf:261 - The database is disconnected...
INFO - hive.conf:212 - The database instance is created...
INFO - hive.db.db_state:48 - Welcome to hive!
INFO - hive.db.db_state:54 - Create db schema...
INFO - hive.indexer.hive_db.haf_functions:10 - Looking for 'hivemind_app' context.
INFO - hive.indexer.hive_db.haf_functions:13 - No application context present. Attempting to create a 'hivemind_app' context...
INFO - hive.indexer.hive_db.haf_functions:15 - Application context creation done.
INFO - hive.indexer.hive_db.haf_functions:31 - Trying to detach app context...
INFO - hive.indexer.hive_db.haf_functions:33 - App context detaching done.
WARNING - hive.db.adapter:276 - [SQL-ERR] IntegrityError in query CALL hive.appproc_context_attach('hivemind_app') ({})
INFO - hive.db.adapter:98 - Closing database connection: 'root'
INFO - hive.db.adapter:109 - Disposing SQL engine
INFO - hive.conf:261 - The database is disconnected...
Traceback (most recent call last):
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1910, in _execute_context
self.dialect.do_execute(
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.NotNullViolation: null value in column "fork_id" of relation "contexts" violates not-null constraint
CONTEXT: SQL statement "UPDATE hive.contexts
SET fork_id = __fork_id
, irreversible_block = COALESCE( __head_of_irreversible_block, 0 )
, events_id = 0 -- during app_next_block correct event will be found
, last_active_at = NOW()
WHERE name =ANY( _contexts )"
PL/pgSQL function hive.app_context_attach(hive.contexts_group) line 29 at SQL statement
SQL statement "SELECT hive.app_context_attach( _contexts )"
PL/pgSQL function hive.appproc_context_attach(hive.contexts_group) line 3 at PERFORM
SQL statement "CALL hive.appproc_context_attach( ARRAY[ _context ] )"
PL/pgSQL function hive.appproc_context_attach(hive.context_name) line 3 at CALL
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/hivemind/.hivemind-venv/bin/hive", line 8, in <module>
sys.exit(run())
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/cli.py", line 65, in run
launch_mode(mode, conf)
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/cli.py", line 83, in launch_mode
with SyncHiveDb(conf=conf, enter_sync = False) as schema_builder:
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/indexer/sync.py", line 61, in __enter__
DbState.initialize(self._enter_sync)
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/db_state.py", line 57, in initialize
setup(admin_db=db_setup_admin, db=db_setup_owner)
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/schema.py", line 629, in setup
context_attach(db=db)
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/indexer/hive_db/haf_functions.py", line 46, in context_attach
db.query_no_return(f"CALL hive.appproc_context_attach('{SCHEMA_NAME}')")
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/adapter.py", line 167, in query_no_return
self._query(sql, **kwargs)
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/adapter.py", line 277, in _query
raise e
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/adapter.py", line 270, in _query
result = self._basic_connection.execution_options(autocommit=False).execute(query, **kwargs)
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1385, in execute
return meth(self, multiparams, params, _EMPTY_EXECUTION_OPTS)
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 334, in _execute_on_connection
return connection._execute_clauseelement(
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1577, in _execute_clauseelement
ret = self._execute_context(
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1953, in _execute_context
self._handle_dbapi_exception(
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 2134, in _handle_dbapi_exception
util.raise_(
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 211, in raise_
raise exception
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1910, in _execute_context
self.dialect.do_execute(
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (psycopg2.errors.NotNullViolation) null value in column "fork_id" of relation "contexts" violates not-null constraint
CONTEXT: SQL statement "UPDATE hive.contexts
SET fork_id = __fork_id
, irreversible_block = COALESCE( __head_of_irreversible_block, 0 )
, events_id = 0 -- during app_next_block correct event will be found
, last_active_at = NOW()
WHERE name =ANY( _contexts )"
PL/pgSQL function hive.app_context_attach(hive.contexts_group) line 29 at SQL statement
SQL statement "SELECT hive.app_context_attach( _contexts )"
PL/pgSQL function hive.appproc_context_attach(hive.contexts_group) line 3 at PERFORM
SQL statement "CALL hive.appproc_context_attach( ARRAY[ _context ] )"
PL/pgSQL function hive.appproc_context_attach(hive.context_name) line 3 at CALL
[SQL: CALL hive.appproc_context_attach('hivemind_app')]
(Background on this error at: https://sqlalche.me/e/14/gkpj)
[Entrypoint] 2024-03-01 21:45:34,413+00:00 INFO [global] (main) Parameters passed directly to Hivemind docker entrypoint: sync --database-url=postgresql://hivemind@haf/haf_block_log --database-admin-url=postgresql://haf_admin@haf/haf_block_log --install-app
[Entrypoint] 2024-03-01 21:45:34,414+00:00 INFO [global] (main) Collected Hivemind arguments: sync
[Entrypoint] 2024-03-01 21:45:34,414+00:00 INFO [global] (main) Using PostgreSQL instance: postgresql://hivemind@haf/haf_block_log
[Entrypoint] 2024-03-01 21:45:34,415+00:00 INFO [global] (main) Using PostgreSQL Admin URL: postgresql://haf_admin@haf/haf_block_log
[Entrypoint] 2024-03-01 21:45:34,415+00:00 INFO [global] (main) Running install_app step because it was requested via the --install-app argument
[Entrypoint] 2024-03-01 21:45:34,416+00:00 INFO [setup] (main) Setting up the database...
./setup_postgres.sh parameters: --postgres-url=postgresql://haf_admin@haf/haf_block_log
/home/hivemind/app/../haf/scripts/create_haf_app_role.sh parameters: --postgres-url=postgresql://haf_admin@haf/haf_block_log --haf-app-account=hivemind
postgresql://haf_admin@haf/haf_block_log
perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
LANGUAGE = (unset),
LC_ALL = (unset),
LANG = "en_US.UTF-8"
are supported and installed on your system.
perl: warning: Falling back to the standard locale ("C").
DO $$
Attempting to supplement definition of hivemind builtin roles...
perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
LANGUAGE = (unset),
LC_ALL = (unset),
LANG = "en_US.UTF-8"
are supported and installed on your system.
perl: warning: Falling back to the standard locale ("C").
NOTICE: role "haf_admin" is already a member of role "hivemind"
GRANT ROLE
BEGIN
BEGIN
CREATE ROLE hivemind WITH LOGIN INHERIT IN ROLE hive_applications_owner_group;
EXCEPTION WHEN DUPLICATE_OBJECT THEN
RAISE NOTICE 'hivemind role already exists';
END;
END
$$;
psql:<stdin>:10: NOTICE: hivemind role already exists
DO
./install_app.sh parameters: --postgres-url=postgresql://haf_admin@haf/haf_block_log
perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
LANGUAGE = (unset),
LC_ALL = (unset),
LANG = "en_US.UTF-8"
are supported and installed on your system.
perl: warning: Falling back to the standard locale ("C").
psql:/home/hivemind/app/install_app.sql:2: NOTICE: extension "intarray" already exists, skipping
CREATE EXTENSION
psql:/home/hivemind/app/install_app.sql:19: NOTICE: Disabling a JIT optimization on the current database level...
DO
[Entrypoint] 2024-03-01 21:45:36,355+00:00 INFO [run_hive] (main) Starting Hivemind...
INFO - hive.conf:212 - The database instance is created...
INFO - hive.db.adapter:49 - A database offers maximum connections: 100. Required 15 connections.
INFO - hive.db.adapter:98 - Closing database connection: 'root'
INFO - hive.db.adapter:109 - Disposing SQL engine
INFO - hive.conf:261 - The database is disconnected...
INFO - hive.conf:212 - The database instance is created...
INFO - hive.db.db_state:48 - Welcome to hive!
INFO - hive.db.adapter:98 - Closing database connection: 'setup_owner'
INFO - hive.indexer.sync:67 - hivemind_version : 2.0.0dev1
INFO - hive.indexer.sync:68 - hivemind_git_rev : b50bd3dfe0a12b07f1fcfe8b2817217d83e38bae
INFO - hive.indexer.sync:69 - hivemind_git_date : 2024-02-27 21:44:25
INFO - hive.indexer.sync:71 - database_schema_version : 34
INFO - hive.indexer.sync:72 - database_patch_date : 2024-03-01 21:45:47.019338
INFO - hive.indexer.sync:73 - database_patched_to_revision : 9d2cc15bea71a39139abdf49569e0eac6dd0b970
INFO - hive.indexer.sync:75 - last_block_from_view : 0
INFO - hive.indexer.sync:76 - last_imported_block : 1
INFO - hive.indexer.sync:77 - last_completed_block : 1
INFO - hive.indexer.sync:92 - Attempting to build Hivemind database schema if needed
INFO - hive.db.adapter:98 - Closing database connection: 'root'
INFO - hive.db.adapter:109 - Disposing SQL engine
INFO - hive.conf:261 - The database is disconnected...
[Entrypoint] 2024-03-01 21:45:47,108+00:00 INFO [global] (main) Done running install_app, now running the block processor
[Entrypoint] 2024-03-01 21:45:47,109+00:00 INFO [run_hive] (main) Starting Hivemind...
INFO - hive.conf:212 - The database instance is created...
INFO - hive.db.adapter:49 - A database offers maximum connections: 100. Required 15 connections.
INFO - hive.db.adapter:98 - Closing database connection: 'root'
INFO - hive.db.adapter:109 - Disposing SQL engine
INFO - hive.conf:261 - The database is disconnected...
INFO - hive.conf:212 - The database instance is created...
INFO - hive.indexer.sync:54 - Entering HAF mode synchronization
INFO - hive.db.db_state:48 - Welcome to hive!
INFO - hive.db.adapter:98 - Closing database connection: 'setup_owner'
INFO - hive.db.db_state:67 - [MASSIVE] Continue with massive sync...
INFO - hive.indexer.sync:67 - hivemind_version : 2.0.0dev1
INFO - hive.indexer.sync:68 - hivemind_git_rev : b50bd3dfe0a12b07f1fcfe8b2817217d83e38bae
INFO - hive.indexer.sync:69 - hivemind_git_date : 2024-02-27 21:44:25
INFO - hive.indexer.sync:71 - database_schema_version : 34
INFO - hive.indexer.sync:72 - database_patch_date : 2024-03-01 21:45:47.019338
INFO - hive.indexer.sync:73 - database_patched_to_revision : 9d2cc15bea71a39139abdf49569e0eac6dd0b970
INFO - hive.indexer.sync:75 - last_block_from_view : 0
INFO - hive.indexer.sync:76 - last_imported_block : 1
INFO - hive.indexer.sync:77 - last_completed_block : 1
WARNING - hive.db.adapter:276 - [SQL-ERR] IntegrityError in query CALL hive.appproc_context_attach('hivemind_app') ({})
INFO - hive.db.adapter:98 - Closing database connection: 'root'
INFO - hive.db.adapter:109 - Disposing SQL engine
INFO - hive.conf:261 - The database is disconnected...
Traceback (most recent call last):
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1910, in _execute_context
self.dialect.do_execute(
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.NotNullViolation: null value in column "fork_id" of relation "contexts" violates not-null constraint
CONTEXT: SQL statement "UPDATE hive.contexts
SET fork_id = __fork_id
, irreversible_block = COALESCE( __head_of_irreversible_block, 0 )
, events_id = 0 -- during app_next_block correct event will be found
, last_active_at = NOW()
WHERE name =ANY( _contexts )"
PL/pgSQL function hive.app_context_attach(hive.contexts_group) line 29 at SQL statement
SQL statement "SELECT hive.app_context_attach( _contexts )"
PL/pgSQL function hive.appproc_context_attach(hive.contexts_group) line 3 at PERFORM
SQL statement "CALL hive.appproc_context_attach( ARRAY[ _context ] )"
PL/pgSQL function hive.appproc_context_attach(hive.context_name) line 3 at CALL
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/hivemind/.hivemind-venv/bin/hive", line 8, in <module>
sys.exit(run())
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/cli.py", line 65, in run
launch_mode(mode, conf)
File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/cli.py", line 94, in launch_mode
    with SyncHiveDb(conf=conf, enter_sync = True) as sync:
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/indexer/sync.py", line 68, in __enter__
    context_attach(db=self._db)
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/indexer/hive_db/haf_functions.py", line 46, in context_attach
    db.query_no_return(f"CALL hive.appproc_context_attach('{SCHEMA_NAME}')")
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/adapter.py", line 167, in query_no_return
    self._query(sql, **kwargs)
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/adapter.py", line 277, in _query
    raise e
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/hive/db/adapter.py", line 270, in _query
    result = self._basic_connection.execution_options(autocommit=False).execute(query, **kwargs)
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1385, in execute
    return meth(self, multiparams, params, _EMPTY_EXECUTION_OPTS)
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 334, in _execute_on_connection
    return connection._execute_clauseelement(
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1577, in _execute_clauseelement
    ret = self._execute_context(
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1953, in _execute_context
    self._handle_dbapi_exception(
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 2134, in _handle_dbapi_exception
    util.raise_(
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 211, in raise_
    raise exception
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1910, in _execute_context
    self.dialect.do_execute(
  File "/home/hivemind/.hivemind-venv/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (psycopg2.errors.NotNullViolation) null value in column "fork_id" of relation "contexts" violates not-null constraint
CONTEXT: SQL statement "UPDATE hive.contexts
SET fork_id = __fork_id
, irreversible_block = COALESCE( __head_of_irreversible_block, 0 )
, events_id = 0 -- during app_next_block correct event will be found
, last_active_at = NOW()
WHERE name =ANY( _contexts )"
PL/pgSQL function hive.app_context_attach(hive.contexts_group) line 29 at SQL statement
SQL statement "SELECT hive.app_context_attach( _contexts )"
PL/pgSQL function hive.appproc_context_attach(hive.contexts_group) line 3 at PERFORM
SQL statement "CALL hive.appproc_context_attach( ARRAY[ _context ] )"
PL/pgSQL function hive.appproc_context_attach(hive.context_name) line 3 at CALL
[SQL: CALL hive.appproc_context_attach('hivemind_app')]
(Background on this error at: https://sqlalche.me/e/14/gkpj)
```

https://gitlab.syncad.com/hive/haf/-/issues/212 "Increase shared-file-size in the generated config.ini" (2024-03-01T20:49:33Z, Mahdi Yari)
The default config.ini generated with all the plugins sets `shared-file-size` to 24G, but the same config's comments say to use 28G when more plugins are enabled, and the generated file itself enables all of them.

https://gitlab.syncad.com/hive/block_explorer_ui/-/issues/206 "Use cards from Shad in our UI" (2024-03-01T13:39:19Z, Jakub Lachór, assignee: Piotr Berezka)
Shad UI provides some cards; we should use them more in our app. For now, replace it everywhere except operations.

https://gitlab.syncad.com/hive/hive/-/issues/664 "Signature generation cleanup" (2024-02-29T21:45:34Z, Bartek Wrona, milestone: Post 1.27.5)
At the moment both bip0062 and canonical signatures are used in many places. They should be unified to bip0062.

https://gitlab.syncad.com/hive/haf/-/issues/211 "Get back to idea to start autodetach time counting relatively to hived start time" (2024-03-27T03:50:55Z, Bartek Wrona)
Already done work was reverted to eliminate risk for the upcoming release due to lack of testing time:
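The NotNullViolation in the hivemind traceback above is an ordinary NOT NULL constraint failure: hive.app_context_attach updates hive.contexts.fork_id with a value that resolves to NULL. As a minimal sketch of this error class (using the stdlib sqlite3 module in place of PostgreSQL; the table definition and values here are illustrative, not HAF's actual schema):

```python
import sqlite3

# A stand-in for hive.contexts: fork_id is declared NOT NULL, as in HAF.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contexts (name TEXT PRIMARY KEY, fork_id INTEGER NOT NULL)")
conn.execute("INSERT INTO contexts VALUES ('hivemind_app', 1)")

try:
    # Binding None here corresponds to __fork_id resolving to NULL inside
    # hive.app_context_attach's UPDATE statement.
    conn.execute("UPDATE contexts SET fork_id = ? WHERE name = ?",
                 (None, "hivemind_app"))
except sqlite3.IntegrityError as exc:
    # -> NOT NULL constraint failed: contexts.fork_id
    print("update rejected:", exc)
```

The example only demonstrates the failure mode; the actual fix lies in making hive.app_context_attach resolve a valid __fork_id before the UPDATE runs.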
https://gitlab.syncad.com/hive/haf/-/merge_requests/446
(milestone: Post-1.27.5)

https://gitlab.syncad.com/hive/haf/-/issues/210 "Irreversible (non-forkable) contexts should not create rowid indexes in the applications table" (2024-03-27T03:50:40Z, Bartek Wrona, milestone: Post-1.27.5)
It allows another space optimization, which at the moment could give back up to 100GB of space for the whole HAF stack, plus slightly faster writes.

https://gitlab.syncad.com/hive/haf/-/issues/209 "Further research related to table partitions" (2024-02-29T21:32:00Z, Bartek Wrona)
Some initial research has been done here: https://gitlab.syncad.com/hive/haf/-/merge_requests/360
Probably the selected approach was wrong and too many partitions were created. Still, the idea of using partitions can be useful to remove the explicit reversible tables and store HAF data as single logical tables, split as follows:
- reversible part: determined by some column value (fork_id, which has a nonzero value in this case), best stored as part of the block_num value
- hot irreversible part: determined by fork_id = 0 and block_num, storing the freshest N blocks (i.e. the last 5M, which covers circa half a year)
- archive irreversible part: for all older block_nums.
Benefits:
- simplification of the complex definition of the data access views (which right now must use UNION ALL statements)
- speedup of data writes, mostly in LIVE sync. In the tests performed, multiple archive partitions also shortened overall replay time, although too high a partition count can lead to API query degradation (the threshold selected in the test was 1M, which was too small)
- makes it possible to split data across separate storages having different speed (and also cost)
(milestone: Post-1.27.5)
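The three-part split described above can be sketched as a simple routing rule. This is a hypothetical illustration of the proposed scheme, not HAF code; the function name and the HOT_WINDOW value are assumptions (5M blocks, per the issue's own example):

```python
# Hypothetical sketch of the partition routing described in the issue:
# rows split into a reversible part (fork_id != 0), a hot irreversible
# part (the freshest N blocks), and an archive irreversible part.
HOT_WINDOW = 5_000_000  # last ~5M blocks, roughly half a year of Hive blocks

def partition_for(fork_id: int, block_num: int, head_block: int) -> str:
    """Return the logical partition a row with (fork_id, block_num) belongs to."""
    if fork_id != 0:
        # Reversible rows carry a nonzero fork_id.
        return "reversible"
    if block_num > head_block - HOT_WINDOW:
        # Irreversible but recent: kept hot for API queries.
        return "hot_irreversible"
    # Everything older goes to the archive partition(s).
    return "archive_irreversible"
```

With such a rule, reversible rows live in a small churn-heavy partition, recent irreversible rows stay on fast storage for API queries, and the bulk of history can sit on cheaper, slower storage.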