- Mar 28, 2025
-
-
- Mar 27, 2025
-
-
Bartek Wrona authored
Removed Dockerfile.jmeter (the shared base job definition .jmeter_benchmark_job and the image stored in the ccc repo are now used for testing)
-
Bartek Wrona authored
block_api_tests job refactored into 2 jobs: one performing a replay to produce data, the second performing a benchmark on a started HAF instance
-
- Mar 26, 2025
-
-
Bartek Wrona authored
.prepare_haf_data_5m base job definition can have a parametrized directory where it should generate data
-
Bartek Wrona authored
-
Bartek Wrona authored
-
Bartek Wrona authored
haf-local-tools Python deps update: pandas = "2.2.3", psycopg2-binary = "2.9.10", sqlalchemy = "2.0.39"
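The pinned versions above can be sanity-checked against a local environment. A minimal sketch, assuming the packages are installed under these distribution names (Python 3.8+ for importlib.metadata):

```python
# Verify that locally installed packages match the versions pinned in
# haf-local-tools (sketch only; the pins come from the entry above).
from importlib.metadata import PackageNotFoundError, version

expected = {
    "pandas": "2.2.3",
    "psycopg2-binary": "2.9.10",
    "sqlalchemy": "2.0.39",
}

for package, pinned in expected.items():
    try:
        installed = version(package)
    except PackageNotFoundError:
        print(f"{package}: not installed (expected {pinned})")
        continue
    status = "OK" if installed == pinned else f"MISMATCH (installed {installed})"
    print(f"{package} == {pinned}: {status}")
```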
-
Bartek Wrona authored
-
-
-
-
- Mar 25, 2025
-
-
Marcin authored
-
- Mar 18, 2025
-
- Mar 17, 2025
-
- Mar 14, 2025
- Mar 11, 2025
-
-
Dan Notestein authored
Handle cases where the current block being processed is only in the blocks_reversible table (avoid returning 0 for a block).
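The issue comes from the irreversible/reversible split: a block that has not yet become irreversible exists only in blocks_reversible, so a lookup that consults only the irreversible table would wrongly report 0. A minimal sketch of the fallback idea, using placeholder table and column names rather than HAF's actual schema:

```python
# Illustrative fallback: consult the reversible table when the block has not
# yet been promoted to the irreversible one, instead of reporting it missing.
# Table and column names are placeholders, not HAF's real schema.
import psycopg2

def fetch_block_num(conn, block_num: int) -> int:
    """Return block_num if the block exists in either table, else 0."""
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT num FROM blocks WHERE num = %(num)s
            UNION ALL
            SELECT num FROM blocks_reversible WHERE num = %(num)s
            LIMIT 1
            """,
            {"num": block_num},
        )
        row = cur.fetchone()
        return row[0] if row is not None else 0
```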
-
- Mar 04, 2025
-
-
Dan Notestein authored
-
- Mar 03, 2025
-
-
Dan Notestein authored
- hive: develop (4cd3fc3d23f4074be91a11f00a7a6035405faaf8)
-
- Feb 21, 2025
-
-
Konrad Botor authored
-
Marek Kochanowicz authored
-
- Feb 10, 2025
-
-
Konrad Botor authored
-
- Feb 06, 2025
-
-
Dan Notestein authored
-
- Jan 30, 2025
-
-
Dan Notestein authored
Autovacuum settings that can only be set in postgres.conf; reduce the default work_mem to 64MB, plus small tweaks
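The entry covers settings that only take effect from the server configuration file, together with a lower work_mem default. A quick way to confirm the effective values on a running instance (the connection string and the exact list of settings are placeholders):

```python
# Print the effective values of the tuned settings on a running server.
# The DSN below is a placeholder; point it at your own HAF database.
import psycopg2

settings = ["work_mem", "autovacuum", "autovacuum_max_workers", "autovacuum_naptime"]

with psycopg2.connect("dbname=haf_block_log user=haf_admin") as conn:
    with conn.cursor() as cur:
        for name in settings:
            cur.execute("SHOW " + name)  # SHOW takes no bind parameters
            print(f"{name} = {cur.fetchone()[0]}")
```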
-
- Jan 28, 2025
-
-
Konrad Botor authored
-
- Jan 24, 2025
-
-
Konrad Botor authored
-
- Jan 17, 2025
-
-
Marcin authored
-
- Jan 15, 2025
-
-
Dan Notestein authored
- hive: develop (74eb54442330ace71c37a43b464aee6b1bd4dae2)
-
- Jan 14, 2025
-
-
Marcin authored
Previously only state providers' shadow tables were excluded, which caused problems when HAF with an installed context with registered tables was updated.
-
- Jan 09, 2025
-
-
Marcin authored
-
Marcin authored
-
Marcin authored
-
Marcin authored
-
Marcin authored
Currently state providers create tables in the hafd schema. All tables in the hafd schema are taken into the db hash computation, but hashes for state providers are computed differently and should not affect the hafd schema hash.
-
Marcin authored
-
Marcin authored
After many changes, the hash computed on the database could only be used to check whether hfm can be updated to a given new version. The hash cannot be used to check whether the database schema for a given hfm version was modified, because it does not take all HAF elements into the computation. This means there is no need to store the database hash, as it has no use; moreover, it is misleading and could mask that some parts of the schema were modified. Warning: the change modifies the hafd schema, which means that old hfm versions cannot be updated to it.
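The compatibility check described here boils down to hashing a chosen subset of schema elements and comparing the result with what the new hfm version expects. A generic illustration of the idea, assuming a hash over column definitions taken from information_schema; this is not HAF's actual hashing code:

```python
# Generic illustration: digest the column layout of one schema so that two
# databases can be compared for upgrade compatibility. The selection of
# hashed elements is simplified and does not mirror HAF's implementation.
import hashlib
import psycopg2

def schema_digest(conn, schema: str = "hafd") -> str:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT table_name, column_name, data_type
            FROM information_schema.columns
            WHERE table_schema = %s
            ORDER BY table_name, ordinal_position
            """,
            (schema,),
        )
        digest = hashlib.sha256()
        for table_name, column_name, data_type in cur.fetchall():
            digest.update(f"{table_name}|{column_name}|{data_type}".encode())
        return digest.hexdigest()
```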
-
Marcin authored
-