2nd update of 2023: Moving to Ubuntu 22, binary storage for hive operations in HAF


Below are a few highlights of the Hive-related programming issues worked on by the BlockTrades team since my last report.

Hived (blockchain node software)

One of the most notable points for developers and users of hived is that we're moving to Ubuntu 22 as the officially supported platform for the next release of hived. There are two main reasons for this: 1) we want to use a new version of rocksdb that requires a faster file IO library that was added in Ubuntu 22, and 2) it will allow us to deprecate support for Postgres 12 in favor of Postgres 14 (these two often behave differently in terms of query performance, so dropping one of them will simplify releasing HAF apps that perform consistently well).

In order to move to Ubuntu 22, we also needed to solve some issues with the OpenSSL library used by hived: RIPEMD160 support is no longer available by default in newer OpenSSL versions, and hived uses this algorithm in several places.
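As a rough illustration of the kind of change this requires (a generic sketch, not the actual hived fix): on OpenSSL 3.x builds where RIPEMD160 is only supplied by the "legacy" provider, that provider has to be loaded explicitly before the digest can be fetched.

```cpp
// Illustrative sketch (not the hived code): computing a RIPEMD160 digest on
// OpenSSL 3.x, where the algorithm may only be available from the "legacy"
// provider and is unusable until that provider is loaded explicitly.
#include <openssl/evp.h>
#include <openssl/provider.h>
#include <cstdio>

int main() {
  // Load both providers: "legacy" for RIPEMD160, "default" for everything else.
  OSSL_PROVIDER* legacy = OSSL_PROVIDER_load(nullptr, "legacy");
  OSSL_PROVIDER* deflt  = OSSL_PROVIDER_load(nullptr, "default");
  if (!legacy || !deflt) { std::fprintf(stderr, "provider load failed\n"); return 1; }

  EVP_MD* md = EVP_MD_fetch(nullptr, "RIPEMD160", nullptr);
  if (!md) { std::fprintf(stderr, "RIPEMD160 not available\n"); return 1; }

  const unsigned char data[] = "hello";
  unsigned char digest[EVP_MAX_MD_SIZE];
  unsigned int len = 0;
  EVP_Digest(data, sizeof(data) - 1, digest, &len, md, nullptr);

  for (unsigned int i = 0; i < len; ++i) std::printf("%02x", digest[i]);
  std::printf("\n");

  EVP_MD_free(md);
  OSSL_PROVIDER_unload(deflt);
  OSSL_PROVIDER_unload(legacy);
  return 0;
}
```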

As part of the upgrade to the new version of rocksdb, we also moved it out of the core hived repo, and rocksdb is now a submodule referenced by the main repo. This will make it easier and cleaner to upgrade to later versions of rocksdb in the future.

Other changes to hived include:

  • Switched from the fc class fc::uint128 to the native C++ 128-bit integer type (see the sketch just after this list): https://gitlab.syncad.com/hive/hive/-/merge_requests/804
  • Added tests for massive recurrent transfers to verify them against the relaxed rules coming in HF28.
  • Fixed a problem with silent truncation of the fixed_string type (Hive account names that were too long were silently cut off).
  • Improved the debug_node_plugin API (added a function to simplify setting the VESTS/HIVE price in tests).
  • Fixed stability issues in the old cli_wallet (previously it could randomly crash during shutdown in automated test runs, sometimes requiring tests to be rerun).
  • Extended the unit test suite for account history API calls.
  • Added prerequisites for the changes required by the new binary HAF storage implementation.
  • Simplified (and sped up) preparation of the 5-million-block hived data instance used by automated tests.
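For readers curious about the fc::uint128 change above, here is a minimal, purely illustrative sketch of what moving from a hand-rolled 128-bit wrapper to the compiler's native 128-bit integer looks like (the types and names below are hypothetical, not hived code):

```cpp
// Illustrative sketch only: replacing a hand-written 128-bit wrapper (in the
// spirit of fc::uint128) with the compiler's native 128-bit integer, which is
// available as unsigned __int128 on GCC/Clang.
#include <cstdint>
#include <iostream>

// Before (conceptually): a struct of two 64-bit halves that needs hand-written
// arithmetic and comparison operators.
struct legacy_uint128 {
  uint64_t hi = 0;
  uint64_t lo = 0;
};

// After: arithmetic, comparisons, and shifts come straight from the compiler.
using u128 = unsigned __int128;

int main() {
  // The product exceeds 64 bits, but is exact at 128 bits thanks to the cast.
  u128 supply = (u128)1'000'000'000'000ull * 1'000'000'000ull;
  u128 half   = supply / 2;

  // __int128 has no iostream support, so print the low 64 bits as a demo.
  std::cout << "low 64 bits of half: " << (uint64_t)half << "\n";
  return 0;
}
```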

Hive Application Framework (HAF) and related apps

A lot of our recent work has focused on using binary storage for Hive operations inside HAF databases, replacing the JSON format that was previously used. This change lowered the size of a fully populated HAF database by 700GB. Once we had proven the basic approach was feasible, we moved on to measuring and optimizing the performance of the new storage format.
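To give a feel for where savings like that come from (this toy example is not HAF's actual encoding), a packed binary form drops the repeated field names and stores numbers as fixed-width integers instead of decimal text:

```cpp
// Toy illustration (not HAF's real format): why a packed binary encoding of an
// operation is much smaller than its JSON text form.
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

struct transfer_op {
  std::string from   = "alice";
  std::string to     = "bob";
  int64_t     amount = 1000;      // hypothetical amount in milli-units
  std::string memo   = "thanks!";
};

// JSON-ish text encoding: every field name is repeated for every stored operation.
std::string to_json(const transfer_op& op) {
  return "{\"type\":\"transfer\",\"from\":\"" + op.from + "\",\"to\":\"" + op.to +
         "\",\"amount\":\"" + std::to_string(op.amount) + "\",\"memo\":\"" + op.memo + "\"}";
}

// Minimal binary encoding: 1-byte op tag, length-prefixed strings, raw int64.
std::vector<char> to_binary(const transfer_op& op) {
  std::vector<char> out;
  auto put_str = [&](const std::string& s) {
    out.push_back((char)s.size());               // assumes short strings
    out.insert(out.end(), s.begin(), s.end());
  };
  out.push_back(0x02);                           // hypothetical "transfer" tag
  put_str(op.from);
  put_str(op.to);
  const char* p = reinterpret_cast<const char*>(&op.amount);
  out.insert(out.end(), p, p + sizeof(op.amount));
  put_str(op.memo);
  return out;
}

int main() {
  transfer_op op;
  std::cout << "json bytes:   " << to_json(op).size() << "\n";
  std::cout << "binary bytes: " << to_binary(op).size() << "\n";
}
```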

We also wrote some new regression tests to verify the HAF database update procedure (the way an existing HAF database gets upgraded when its schema or core code needs to change).

Generic devops tasks

We updated the base docker images used by our docker containers (these are used both for CI testing and for production deployment) in various ways, including moving to Ubuntu 22.

We also created common CI jobs useful for publishing official docker images for various code repos (e.g. hived, HAF, hivemind, hafah, and other HAF apps).

Some upcoming tasks

  • Re-use docker images in CI tests when possible to speed up testing time (in progress)
  • Continue work on HAF-based block explorer
  • Collect benchmarks for a hafah app operating in “irreversible block mode” and compare to a hafah app operating in “normal” mode (low priority)
  • Finalize plans for the changes to hived and HAF that the BlockTrades team will work on in the coming year.
  • Work out a schedule for creating various HAF-based applications (the most important being a HAF-based smart contract processing engine).