Commits on Source 17
-
Konrad Botor authored
-
Marcin authored
There is no point in passing a schema parameter to the database hash computation: in the end there is a fixed list of tables to include in the computation, and all of them belong to the hafd schema.
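A minimal PostgreSQL sketch of the idea, assuming a hypothetical function name and an illustrative table list (neither is HAF's actual API): because every hashed table lives in hafd, the schema never needs to be a parameter.

```sql
-- Hypothetical sketch: hash a fixed list of tables, all in hafd.
CREATE OR REPLACE FUNCTION calculate_db_hash()  -- name assumed, not HAF's API
RETURNS TEXT AS $$
DECLARE
    __table TEXT;
    __agg TEXT := '';
BEGIN
    -- Illustrative table list; the real list lives in HAF's sources.
    FOREACH __table IN ARRAY ARRAY['blocks', 'transactions', 'operations']
    LOOP
        -- Hash each table's column layout; md5 over column metadata
        -- stands in for the real per-table hashing.
        __agg := __agg || COALESCE((
            SELECT md5(string_agg(column_name::TEXT || ':' || data_type::TEXT,
                                  ',' ORDER BY ordinal_position))
            FROM information_schema.columns
            WHERE table_schema = 'hafd' AND table_name = __table
        ), '');
    END LOOP;
    RETURN md5(__agg);
END;
$$ LANGUAGE plpgsql STABLE;
```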
-
Marcin authored
Now there is no need to extend the list of hashed tables each time a new table is added to the hafd schema. WARNING: previously people forgot to extend the list, so a few tables were never hashed. Because of this, updating from a previous version of HAF is impossible.
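A sketch of how such automatic enumeration could look, querying the standard information_schema catalog instead of a hand-maintained list (a plausible approach, not necessarily HAF's exact implementation); any table added to hafd is then picked up automatically.

```sql
-- Hypothetical sketch: hash every base table currently in hafd.
SELECT md5(string_agg(
         t.table_name::TEXT || '(' ||
         (SELECT string_agg(c.column_name::TEXT || ' ' || c.data_type::TEXT,
                            ', ' ORDER BY c.ordinal_position)
          FROM information_schema.columns c
          WHERE c.table_schema = t.table_schema
            AND c.table_name = t.table_name)
         || ')',
         ';' ORDER BY t.table_name)) AS hafd_schema_hash
FROM information_schema.tables t
WHERE t.table_schema = 'hafd'
  AND t.table_type = 'BASE TABLE';
```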
-
Marcin authored
The newer hash computation method is injected into databases running an older hfm version, so both a freshly created HAF database and an old one use the same algorithm to compute the hash.
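An illustrative sketch with an assumed function name: the updater (re)creates the hash function with the new algorithm inside the old database before comparing, so old and fresh databases produce hashes computed the same way.

```sql
-- Hypothetical injection step: replace the old hash function with the
-- new algorithm inside the database being updated.
CREATE OR REPLACE FUNCTION hafd_schema_hash()  -- name assumed, not HAF's API
RETURNS TEXT AS $$
    SELECT md5(string_agg(table_name::TEXT, ',' ORDER BY table_name))
    FROM information_schema.tables
    WHERE table_schema = 'hafd' AND table_type = 'BASE TABLE';
$$ LANGUAGE sql STABLE;

-- Both the old database and a fresh install now answer with a hash
-- produced by the same algorithm, so the two values are comparable.
SELECT hafd_schema_hash();
```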
-
Marcin authored
-
Marcin authored
-
Marcin authored
-
Marcin authored
-
Marcin authored
After many changes, the hash computed on the database can only be used to check whether hfm can be updated to a given new version. The hash cannot be used to check whether the database schema for a given hfm version was modified, because it does not take all HAF elements into the computation. This means there is no need to store the database hash: it has no use, and moreover it is misleading and could mask the fact that some parts of the schema were modified. Warning: the change modifies the hafd schema, which means old hfm versions cannot be updated to it.
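A sketch of the resulting compatibility gate, reusing the assumed hafd_schema_hash() from the sketch above and a placeholder expected value: the hash is recomputed on the fly at update time and never persisted.

```sql
-- Hypothetical update gate: recompute the hash live and abort on mismatch.
DO $$
DECLARE
    __current  TEXT := hafd_schema_hash();                 -- assumed function
    __expected TEXT := 'hash-shipped-with-the-new-hfm-version';  -- placeholder
BEGIN
    IF __current IS DISTINCT FROM __expected THEN
        RAISE EXCEPTION 'cannot update: live hafd hash % does not match expected %',
              __current, __expected;
    END IF;
    -- Nothing is stored: the hash exists only for this check.
END
$$;
```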
-
Marcin authored
-
Marcin authored
Currently state providers create tables in the hafd schema. All tables in the hafd schema are included in the database hash computation, but hashes for state providers are computed differently and should not affect the hafd schema hash.
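One way to express the exclusion, assuming a hypothetical registry table hafd.state_provider_tables that records which hafd tables belong to state providers (the real tracking mechanism may differ):

```sql
-- Hypothetical sketch: leave state-provider tables out of the hafd hash.
SELECT md5(string_agg(t.table_name::TEXT, ',' ORDER BY t.table_name))
       AS hafd_schema_hash
FROM information_schema.tables t
WHERE t.table_schema = 'hafd'
  AND t.table_type = 'BASE TABLE'
  AND t.table_name NOT IN (SELECT table_name
                           FROM hafd.state_provider_tables);  -- registry assumed
```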
-
Marcin authored
-
Marcin authored
-
Marcin authored
-
Marcin authored
-
Marcin authored
Previously only the state providers' shadow tables were excluded, which caused problems when a HAF instance with an installed context that had registered tables was updated.
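A sketch of the widened exclusion; both the shadow-table name prefix and the registry table are assumptions for illustration, not HAF's actual naming:

```sql
-- Hypothetical sketch: exclude shadow tables AND all context-registered
-- tables from the hash, not just the shadow tables.
SELECT md5(string_agg(t.table_name::TEXT, ',' ORDER BY t.table_name))
       AS hafd_schema_hash
FROM information_schema.tables t
WHERE t.table_schema = 'hafd'
  AND t.table_type = 'BASE TABLE'
  AND t.table_name NOT LIKE 'shadow\_%'              -- shadow prefix assumed
  AND t.table_name NOT IN (SELECT table_name
                           FROM hafd.registered_tables);  -- registry assumed
```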
-
Dan Notestein authored
- hive: develop (74eb54442330ace71c37a43b464aee6b1bd4dae2)