## 1. Preparing Postgres database for HAF usage

Install the Postgres extension by cloning this [repo](https://gitlab.syncad.com/hive/psql_tools) and following its [README](https://gitlab.syncad.com/hive/psql_tools/-/blob/master/src/hive_fork_manager/Readme.md).

⚠️ Installing the extension requires superuser privileges ⚠️

Once you have created a new database with the `hive_fork_manager` extension, you can proceed to the next step.

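The database-side setup above can be sketched as follows. The database name `haf_block_log` is an illustrative assumption, not mandated by HAF; the generated SQL must be applied by a Postgres superuser (for example with `sudo -u postgres psql -f create_haf_db.sql`):

```shell
# Generate the SQL for a HAF-ready database (dry run; the database name
# haf_block_log is an illustrative assumption -- substitute your own).
cat > create_haf_db.sql <<'SQL'
CREATE DATABASE haf_block_log;
\c haf_block_log
CREATE EXTENSION hive_fork_manager CASCADE;
SQL
```

`CASCADE` pulls in any extensions that `hive_fork_manager` itself depends on.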
## 2. Filling the database

### Build hived (blockchain node software)

```
git clone git@gitlab.syncad.com:hive/hive.git
pushd hive
...
pushd programs/hived
vim data/config.ini
```

Add the following lines to hived's config file, then save and exit:

```
plugin = sql_serializer
psql-url = dbname=<db name> user=<user> password=<password> hostaddr=127.0.0.1 port=5432
```
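As an alternative to editing the file by hand, the same two lines can be appended from a script. The database name, user, and password below are illustrative assumptions:

```shell
# Append the sql_serializer settings to hived's config file.
# DB_NAME/DB_USER/DB_PASS are illustrative assumptions -- substitute your own.
DB_NAME=haf_block_log
DB_USER=haf_admin
DB_PASS=secret
mkdir -p data   # hived's data directory, as used in the steps above
{
  echo "plugin = sql_serializer"
  echo "psql-url = dbname=${DB_NAME} user=${DB_USER} password=${DB_PASS} hostaddr=127.0.0.1 port=5432"
} >> data/config.ini
```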

Next you need to sync your hived node, or alternatively you can add an existing block_log in `data/blockchain` if you have one.

If you initialized your hived from a block_log file, you will need to run a replay to generate blockchain state data:

```
./hived -d data --replay-blockchain --stop-replay-at-block 5000000 --exit-before-sync
```

hived should exit by itself after completing the above command.


To follow these instructions, execute the following commands (moving to the proper directory):

```
...
make -j$(nproc)
popd
```


## 4. Running HAF-based Account History API server

```
git clone git@gitlab.syncad.com:hive/HAfAH.git -b develop