hive / haf · Issue #58

Closed

Created Jun 07, 2022 by Bartek Wrona (@bwrona), Owner

Create regression test scenario when filtered data collection is enabled after resuming replay process

To provide a way to have a "pruned" HAF instance, which stores a minimal amount of data up to some block height and then starts collecting the actual data, we should create a test where:

  • Initially, the HAF replay is started with the option psql-enable-account-operations-dump=false. This generates a minimal amount of data.
  • Next, the replay process stops (e.g. because the --stop-replay-at-block=num option was used).
  • Then the replay resumes with psql-enable-account-operations-dump=true and some additional filter defined, e.g. a regexp-based one, to collect only specific operations.
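The three phases above could be sketched roughly as follows. Only psql-enable-account-operations-dump and --stop-replay-at-block come from this issue; the hived invocation form, the block number, and the filter option name are placeholders for illustration and will differ in the actual test harness.

```shell
#!/bin/sh
# Hypothetical sketch of the regression scenario; not a working test.

DATA_DIR=./haf-data

# Phase 1: replay with account-operations dump disabled (minimal data),
# stopping at an arbitrary example block height.
hived --data-dir="$DATA_DIR" --replay-blockchain \
      --psql-enable-account-operations-dump=false \
      --stop-replay-at-block=1000000

# Phase 2: the process exits at the stop block (handled by the flag above).

# Phase 3: resume replay with the dump enabled and an additional
# regexp-based operations filter (option name is hypothetical).
hived --data-dir="$DATA_DIR" --replay-blockchain \
      --psql-enable-account-operations-dump=true \
      --psql-operations-filter='transfer.*'
```

The test should then verify that tables contain only the minimal data below the stop block and the filtered, full data above it.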