3rd HAF Projects Development Report for 2022



Plug & Play V2 is now live and uses the new PostgreSQL-heavy setup. Since the last update, I've worked on Version 2 of Plug & Play, running multiple tests to ensure stability. The code has been pushed to the master branch of the repo.

I'm still ironing out a few issues with Version 2 of the Global Notification System (GNS). I will post an update on it soon.




Plug & Play Highlights:

  • Moved the Plug & Play project to the new organization structure
  • Moved all sync-related code to PostgreSQL
  • Implemented fail-safe and health-check SQL code for plugs
  • Updated the sync status report endpoint to include block_num
  • Updated the README with new installation and setup instructions for Production and Development environments
  • Published new documentation on:
    • Defining a custom defs.json for your plug
    • Writing your plug’s functions
    • Designing your plug’s tables

New GitHub repository location and new Podping server

The Plug & Play repository is now hosted here: https://github.com/FreeBeings-io/haf-plug-play.

I have also deployed a new Podping Plug & Play server.

https://podping.hpp.freebeings.io

Notice the new domain. It belongs to the new umbrella organization I mentioned earlier this year. More information about the organization and the projects we have planned for Hive will be posted in the near future. I’m in the process of moving all Hive projects to this organization.

With the new Podping deployment, I implemented:

  • The latest Plug & Play V2 code
  • A feature request from @brianoflondon to:
    • Use iri instead of url
    • Add iri and time_since_last_update to the /history/latest/ endpoint
    • Add id, reason, and medium fields to the response (a sample request is sketched below)
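
For illustration, here is a minimal sketch of how a client might read those fields from the new endpoint. The base URL and the /history/latest/ path come from this post; the request method, parameters, and exact response layout are assumptions, so treat it as a rough example rather than official API usage.

```python
# Rough sketch: fetch the latest Podping updates from the Plug & Play server.
# Assumes a plain GET returning a JSON list; the real request/response shape
# may differ, so check the repo's documentation for the actual API.
import requests

BASE_URL = "https://podping.hpp.freebeings.io"

resp = requests.get(f"{BASE_URL}/history/latest/", timeout=10)
resp.raise_for_status()

for entry in resp.json():
    # Fields mentioned in this post: iri, time_since_last_update, id, reason, medium
    print(entry.get("iri"), entry.get("time_since_last_update"),
          entry.get("id"), entry.get("reason"), entry.get("medium"))
```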

Failsafe and health checks in SQL

Each plug now has a set of functions dedicated to:

  • tracking when the last check-in was made by the main sync function
  • periodically checking if the sync process is still running
  • checking whether the last check-in was made more than a minute ago; if so, the sync is considered "hanging"
  • terminating and restarting the sync process (a rough sketch of this logic follows below)
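
The actual fail-safe logic lives in SQL inside each plug; the snippet below is only a Python analogue of the idea, with hypothetical names, to show how the one-minute "hanging" rule works.

```python
# Illustrative Python analogue of the SQL fail-safe described above.
# Names are hypothetical; the real implementation is SQL code in each plug.
from datetime import datetime, timedelta, timezone

HANG_THRESHOLD = timedelta(minutes=1)  # no check-in for over a minute => "hanging"

def sync_is_hanging(last_checkin: datetime) -> bool:
    """True if the main sync function hasn't checked in for over a minute."""
    return datetime.now(timezone.utc) - last_checkin > HANG_THRESHOLD

def health_check(last_checkin: datetime, restart_sync) -> str:
    """Periodic check: terminate and restart the sync process if it is hanging."""
    if sync_is_hanging(last_checkin):
        restart_sync()
        return "restarted"
    return "ok"
```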

Updated System Status endpoint

The system status endpoint /api now returns:

  • a list of all enabled plugs
  • each plug's last check-in time for sync
  • the latest_block_num processed by each plug
  • an overall health field, GOOD | BAD, for sync (essential for setting up monitoring and downtime notifications; a small monitoring sketch follows below)
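
As a rough example of how that health field could feed a downtime monitor, a small polling script might look like the sketch below. The GOOD | BAD health value is from this post; the rest of the response shape is an assumption.

```python
# Minimal monitoring sketch, assuming /api returns JSON containing a "health"
# field set to "GOOD" or "BAD"; the exact response shape is an assumption.
import requests

PLUG_PLAY_URL = "https://podping.hpp.freebeings.io"  # or your own instance

def check_health() -> bool:
    status = requests.get(f"{PLUG_PLAY_URL}/api", timeout=10).json()
    healthy = status.get("health") == "GOOD"
    if not healthy:
        # Hook your alerting here (email, Discord webhook, pager, etc.)
        print("Plug & Play sync is unhealthy:", status)
    return healthy

if __name__ == "__main__":
    check_health()
```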

Updated README

The README has been updated with:

  • easier installation instructions (most steps are automated by the Docker method of installing HAF)
  • separate installation and setup guidelines for production and development environments

Additional documentation: Guidelines for custom plugs

For those who want to create their own plugs to retrieve and process data from HAF, I've written some additional documentation.

  • plug-definitions.md: how to write a valid defs.json for your plug (a hypothetical example is sketched below)

  • plug-schema.md: how to design your plug's tables and views schema

I'm working on more documentation on how to write valid functions to allow your plug to sync and trigger table updates.
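
To give a feel for what a plug definition covers, here is a purely hypothetical defs.json-style example, written out as a Python dict. None of these keys are guaranteed to match the real format; plug-definitions.md in the repo is the authoritative reference.

```python
# Purely hypothetical sketch of the kind of information a plug's defs.json
# might declare; see plug-definitions.md in the repo for the actual format.
import json

example_defs = {
    "name": "my_plug",                 # hypothetical: plug identifier
    "start_block": 50_000_000,         # hypothetical: block to start syncing from
    "ops": ["custom_json_operation"],  # hypothetical: operation types to capture
    "endpoints": ["get_latest"],       # hypothetical: API endpoints the plug exposes
}

print(json.dumps(example_defs, indent=2))
```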

What’s Next?

Docker support

Status: Local testing

I am adding Docker support to make deployment easier. With this, developers can deploy instances of Plug & Play connected to either a local HAF database or a remote one. They will also be able to configure how they want to use Plug & Play, i.e., which plugs to enable.

Zero-coding plugs

Status: Local testing

I am working on "zero-coding" plugs that developers can use to extract specific types of operations from HAF by simply configuring a plug for the operation type and providing a filter for which operations to save.

An example use case: extracting all HIVE/HBD transfers ever made to the account @imwatsi:

- Choose the transfer_operation zero plug and enable it
- Use filter: {"to": "imwatsi"}

With Docker support, this can be as easy as running a single Docker command.

Hive Engine NFT Plug

I successfully created a plug that parses and stores one action and provides access to it via an endpoint. The aim was to get this one action working and verify that the plug's design scales before implementing the rest of the actions.

Going forward, I’ll be writing code to:

  • expand the range of NFT actions supported
  • implement verification techniques (where needed)

Thanks for reading. Feel free to leave feedback.



To support my witness node: