25th update of 2021 on BlockTrades work on Hive software

in HiveDevs • 3 years ago

Below is a list of Hive-related programming issues worked on by BlockTrades team during the past two weeks:

Hived work (blockchain node software)

sql_serializer plugin (writes blockchain data to a HAF database)

Our primary hived work this week was focused on testing, benchmarking, and making improvements to the SQL serializer plugin:

We modified fc’s json serializer to use a string with preallocated memory instead of a stream in order to speed up replays with the sql_serializer plugin (we saw speedups of as much as 40% on some systems). We expect this optimization will also result in significant speedups in hived's API response time, but we haven’t done any benchmarks for that yet.
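
To illustrate the general idea (the actual change lives in the C++ fc library, not in Python), here is a rough analogy: writing serialized output into a buffer whose memory is reserved up front avoids the repeated reallocation and copying that a naively growing output target pays for.

```python
# Illustrative analogy only -- the real optimization is in the C++ fc library.
# The idea: fill a buffer sized up front instead of growing the output
# incrementally, one write at a time.
import json

RECORD = {"op": "vote", "voter": "alice", "author": "bob", "weight": 10000}

def serialize_growing(n: int) -> bytes:
    """Grow the output incrementally (each += reallocates and copies)."""
    out = b""
    for _ in range(n):
        out += json.dumps(RECORD).encode()
    return out

def serialize_preallocated(n: int) -> bytes:
    """Reserve the full output buffer once, then fill it in place."""
    line = json.dumps(RECORD).encode()
    out = bytearray(len(line) * n)            # single up-front allocation
    for i in range(n):
        out[i * len(line):(i + 1) * len(line)] = line
    return bytes(out)

if __name__ == "__main__":
    # Both produce identical output; only the allocation pattern differs.
    assert serialize_growing(1000) == serialize_preallocated(1000)
```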

Miscellaneous code cleanup

Test tool improvements (new Python library for testing hived)

  • Added support for connecting wallet to remote node
  • Enabled passing command line arguments to wallet
  • Fixed random fail on CI related to implementation of empty directory removing
  • Updated existing tests to use new TestTools interface
  • Made current directory available to user (via context.get_current_directory())
  • Removed main net singleton
    https://gitlab.syncad.com/hive/hive/-/merge_requests/270

We're also incorporating a library into TestTools called faketime that will enable us to make our fork testing repeatable. It will also allow us to speed up hived tests by using small, pre-generated block_logs as testing fixtures. Instead of requiring us to execute a sequence of transactions to set up a test case, the test will just replay hived using the pre-generated block log to create the desired testing environment.
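
As a rough illustration of the pattern we're aiming for (this is not the actual TestTools API; the fixture path, the libfaketime library path, and the exit flag below are assumptions), a test can copy a small pre-generated block_log into a fresh data directory, pin the clock with libfaketime, and replay hived from it:

```python
# Sketch of the intended testing pattern, not the real TestTools API.
# Assumes libfaketime is installed; BLOCK_LOG_FIXTURE and LIBFAKETIME are
# hypothetical paths for this example.
import os
import shutil
import subprocess
import tempfile

BLOCK_LOG_FIXTURE = "tests/fixtures/5000_blocks/block_log"                 # hypothetical
LIBFAKETIME = "/usr/lib/x86_64-linux-gnu/faketime/libfaketime.so.1"        # distro-dependent

def test_replay_from_fixture():
    data_dir = tempfile.mkdtemp()
    os.makedirs(os.path.join(data_dir, "blockchain"), exist_ok=True)
    shutil.copy(BLOCK_LOG_FIXTURE, os.path.join(data_dir, "blockchain", "block_log"))

    env = dict(os.environ,
               LD_PRELOAD=LIBFAKETIME,             # intercept time() calls
               FAKETIME="@2021-10-01 00:00:00")    # pin "now" so the replay is repeatable

    # Replay the node from the copied block_log instead of executing a long
    # sequence of setup transactions. The exit flag name is an assumption;
    # check `hived --help` for the exact option on your build.
    subprocess.run(
        ["hived", "--data-dir", data_dir, "--replay-blockchain", "--exit-after-replay"],
        env=env, check=True, timeout=300)
```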

Command-line wallet for Hived

Hivemind (2nd layer applications + social media middleware)

As mentioned previously, we’re planning to migrate to Ubuntu 20 as the recommended deployment environment for hived and hivemind. As part of this change, we’re planning to move to postgres 12 for hivemind, because this is the default version of postgres shipped with Ubuntu 20.

We’ve modified the continuous integration (CI) system to test the develop branch against postgres 12 and we’ll probably release the final version of hivemind for postgres 10 in a day or so (with support for upgrading existing hivemind installations to the latest database schema changes).

Hive Application Framework: framework for building robust and scalable Hive apps

Fixing/Optimizing HAF-based account history app (Hafah)

We’re currently optimizing and testing our first HAF-based app (code-named Hafah) that emulates the functionality of hived’s account history plugin (and ultimately will replace it). A lot of our recent benchmarks focused on testing on different systems, with different numbers of threads allocated to the sql_serializer and the Hafah plugin.

We’ve also created a fully Python-based implementation of Hafah (the indexer for the original Hafah implementation is C++-based) to better showcase how a typical HAF-based app will look.
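
To give a feel for the shape of such an app (this is not Hafah's actual code; the context name, view name, and connection string are illustrative placeholders, and the HAF function signature should be checked against the HAF docs), a minimal HAF-style indexer loop in Python might look roughly like this:

```python
# Rough sketch of the general shape of a HAF-based indexer; not Hafah's code.
import psycopg2

APP_CONTEXT = "my_app"        # hypothetical application context name

def sync_one_batch(db) -> bool:
    with db.cursor() as cur:
        # Ask HAF which block range this app may safely process next
        # (conceptually via hive.app_next_block; verify the exact signature
        # in the HAF documentation).
        cur.execute("SELECT * FROM hive.app_next_block(%s)", (APP_CONTEXT,))
        first_block, last_block = cur.fetchone()
        if first_block is None:
            return False      # nothing new to process yet

        # Fold the operations for that range into the app's own tables
        # (here: a trivial per-block operation counter).
        cur.execute(
            """
            INSERT INTO my_app.op_counts (block_num, op_count)
            SELECT block_num, count(*)
            FROM hive.my_app_operations_view      -- hypothetical context view name
            WHERE block_num BETWEEN %s AND %s
            GROUP BY block_num
            """,
            (first_block, last_block))
    db.commit()
    return True

if __name__ == "__main__":
    db = psycopg2.connect("dbname=haf_block_log")   # hypothetical DSN
    while sync_one_batch(db):
        pass
```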

In preliminary benchmarks, the Python-based application was about 20% slower at indexing 5M blocks than the one with the C++-based indexer, but we still need to do full benchmark testing to see how the two implementations compare over a more realistic data set (e.g. 50M+ blocks). In terms of API performance, both should perform the same, as the Python API code is shared between the two implementations.

Multi-threading the jsonrpc server used by HAF

We’ve completed a preliminary implementation of a multi-threaded jsonrpc server for use by HAF applications (and traditional hivemind). As mentioned in my previous report, we discovered that this becomes a bottleneck for API traffic at high loads when the API calls themselves are fast. In the next week we’ll begin testing, benchmarking, and optimization.
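
For readers unfamiliar with the general idea, here's a bare-bones sketch of a multi-threaded JSON-RPC endpoint using only the Python standard library. It illustrates the threading model only, not the actual HAF jsonrpc server (which also needs request batching, a SQL connection pool, and proper JSON-RPC error handling).

```python
# Bare-bones multi-threaded JSON-RPC endpoint (standard library only).
import json
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Method table: maps JSON-RPC method names to Python callables.
METHODS = {
    "ping": lambda params: "pong",
}

class JsonRpcHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length))
        result = METHODS[request["method"]](request.get("params"))
        response = json.dumps(
            {"jsonrpc": "2.0", "id": request.get("id"), "result": result}
        ).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(response)))
        self.end_headers()
        self.wfile.write(response)

if __name__ == "__main__":
    # ThreadingHTTPServer handles each connection in its own thread, so slow
    # requests no longer serialize the whole API behind a single worker.
    ThreadingHTTPServer(("0.0.0.0", 8095), JsonRpcHandler).serve_forever()
```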

Conversion of hivemind to HAF-based app

We didn’t have a chance to work on HAF-based hivemind during the previous week because we were tied up with HAF and the HAF account history app (and one of the key devs for this work was sick). We’ll be resuming work on it during the upcoming week, and preliminary analysis leads us to believe we may be able to complete the work in just a week.

Condenser (source code for hive.blog and a number of other Hive frontend web sites)

I worked with @quochuy and jsalyers on a number of condenser-related issues:

  • Support for TikTok videos
  • Fixes for YouTube videos and Instagram
  • Fix for muted, blacklisted, and followed pages
  • Major code cleanup
  • Investigated issues related to Let's Encrypt certificate delisting

Upcoming work for next week

  • Release a final official version of hivemind with postgres 10 support, then update hivemind CI to start testing using postgres 12 instead of 10. We also want to run a full sync of it to the head block against the current develop version of hived.
  • Finish testing fixes and optimizations to HAF base-level components (sql_serializer and forkmanager).
  • For Hafah, we’ll be 1) testing/optimizing the new multithreaded jsonrpc server, 2) further benchmarking of API performance, 3) verifying results against a hived account history node, and 4) continuing set up of continuous integration testing for Hafah.
  • Resume work on HAF-based hivemind. Once we’re further along with HAF-based hivemind, we’ll test it using the fork-inducing tool.
  • Continue work on speedup of TestTools-based tests.
  • Investigate possible changes required to increase haircut limit for Hive-backed dollars (HBD) in hardfork 26. Public discussion is still needed as to the amount to increase the limit (personally I'm inclined to either 20% or 30%, with my preference for 30% to allow for a robust expansion of HBD supply).

Going to need to get a massive expansion of the HBD float to really tighten the peg. It seems the lack of liquidity is at least one factor in creating too wide a trading range. Having more HBD on the market might help to mitigate that especially since most of the HBD appears to be located on one exchange.

As long as there is protection against massive Hive printing through the conversion mechanism if the price of Hive collapses, then 30% might do the trick to reduce the volatility that still exists in HBD.

I think that increasing supply will likely lead to more trading activity and liquidity, especially on external markets. And that in turn should lead to a better peg to USD.

Even more importantly, whenever we get close to the haircut ratio, it creates a loss of confidence in HBD that actively detracts from the appeal of HBD as an investment vehicle.

As to the potential risks, even a 30% haircut rule puts a strong limit on just how much Hive can be created via the conversion mechanism, regardless of how far Hive price might fall in a severe crypto winter scenario. After giving it a lot of thought, I think we could easily go to even a 50% ratio without significant risk to long term Hive pricing, but in the end I decided it would be best to demonstrate that a 30% haircut ratio poses no real threat instead of proposing an immediate move to a more aggressive ratio.

With an increased haircut ratio, I think we should also consider adding more leeway between when the print rate decreases and when the haircut ratio hits. Ending HBD printing further away from the haircut ratio should encourage confidence in the peg of HBD (because it decreases the chance of the haircut ratio being hit). And the dramatic reduction in HBD inflation at that point is a supply rate change that works in favor of a rise in HBD price, which can have virtuous effects on the price of Hive itself due to the Hive->HBD conversion mechanism.

For example, with a 30% haircut ratio, we could set the print rate to stop around 20%. In this way, we should only expect to see the haircut ratio hit under one of two circumstances: 1) hive price drops significantly on its own after the print rate hits 0 or 2) HBD price is above $1 and therefore encourages conversions of Hive to HBD.

And in this latter case, where large amounts of HBD are created by decreasing the supply of Hive, the rules of the conversion mechanism will make Hive more valuable and place upward pressure on the price of Hive (and therefore decrease the risk of ever hitting the haircut ratio despite the increased supply of HBD).

Interesting :)
I'm for increasing the debt ratio as well.
Might reduce the conversion time as well, from 3 days to 1. Three days is a bit too much; that's why we see people don't want to risk it and convert when HBD is around 0.95. Faster conversions would work better for the peg.

I understand the risks of price manipulation, but 24h is still a lot of time.

I'm not in favor of reducing the conversion time, as I don't think it particularly offers much benefit in exchange for the increased risk of price manipulation. I don't really agree that people who don't want to risk 3.5 days for a 5% return would suddenly want to risk 1 day for a 5% return. Sure it could move a few on the margin, but mostly not. Either of these returns is more than ample.

The main obstacle IMO is not the 3.5 days or the 1 day or even the 7 days it was before (IMO 3.5 days has worked about the same as 7 days; why?). It is just the small size of the market and limited overall market maturity (e.g. few if any derivatives) so there is limited capital people are willing to invest to get involved. Are any of the big crypto trading and market making firms involved? No. But the reason they are not is the size of the market, and not the conversion time.

I don't think anything really bad would happen with 1 day, but I think it is a pointless distraction to look there.


More use cases, bigger market, better stability, yes ... the thing is, HBD is still struggling to prove itself to Hivers as well, who are willing to use it and support it even at bigger risk than some other options. The native community is still a decent market, and I bet if there is more confidence in HBD, we would see 20M to 50M in HBD in savings, unlike the 1 to 2M now ....

Have you seen Terra.... the concept is very similar to Hive-HBD, just there it's LUNA-UST .... instant conversions, with some additional infrastructure etc...

reduce the conversion time as well from 3 days to 1.

I'd also consider that. The 3 days conversion time doesn't seem to deter pumps and dumps, but a shorter window might entice regular people to use the conversion mechanism more often and closer to the peg.

This makes sense to me and I would support increasing it to 30%, since it is at least a further attempt at getting our HBD more stable (it will probably require more adjustments, of course).

Any ideas on an AMM for the Hive internal market?

Changing the haircut rule and print rate should be simple, but creating an AMM is a more complicated and risky thing to do, so I don't think it should be considered for the upcoming hardfork.

please add @telosnetwork TLOS and add some hive engine tokens to blocktrades!!!!

If we increase the haircut ratio I'm out, btw. That would be too easy to game to print an infinite amount of Hive.

If we want a stable HBD, simply add an on-chain mechanic worth $1.

Wallet creation: pay with Hive or HBD. HBD is cheaper than $1, so you get a discount.

It would be an indirect peg even with the haircut in place. With more mechanics like that, perfect.

Another thing could be an on-chain pool that allows high-liquidity trades and has a micro burn fee, like 0.1%. With high volume, it would decrease the HBD supply too and would allow rewarding liquidity providers.

An additional lock-up mechanic to print HBD would, IMO, be better than increasing the limit.

If the price falls under $1, you can pay back the lock-ups more cheaply. That would also be a mechanic that benefits HBD.

I suppose that you will make a post soon re-explaining the full mechanism and the fact that going to 30% will not make the task more difficult for the stabilization mechanisms in place.

Increasing the haircut ratio can only improve the peg of HBD. The haircut ratio was established to limit the risk to Hive price, not to help stabilize the price of HBD. In fact, I think it is fair to say that nowadays the primary risk to the pegging of HBD to $1 is the relatively low haircut ratio we have now.

Please correct me if I'm wrong, but the change to a 30% haircut ratio will have an impact on when the HBD part of the reward is replaced by HIVE. So this will print more HBDs that will end up in the market, which will make it more difficult to stabilize when the price is below $1, no? Ultimately, it's a question of which one we want to focus on, HIVE or HBD.

So this will print more HBDs that will end up in the market, which will make it more difficult to stabilize when the price is below $1, no?

No, the amount of HBD on the market doesn't impact the price of HBD too much, as long as the haircut doesn't threaten the convertible value of the HBD.

Ultimately, it's a question of which one we want to focus on, HIVE or HBD.

No, I completely disagree with this idea. IMO increasing the utility of HBD increases the utility of Hive as well. And we've already seen dramatic benefits to Hive price from a more stable HBD.

Thanks for your answer 👍

My needs for some of my projects include a stable HBD price, so I hope you're right, even if I remain skeptical about the method (I think that you don't take enough into consideration the effects external to the ecosystem that may result).

I am not saying that an action on one has no effect (good or bad) on the other, I just think that depending on the choice, the degree of effect is different and therefore the time needed to reach our objectives is longer or shorter.

We don't always agree on the methods to be used, but regardless, our goal and care for HIVE is the same, that's why you have my trust, so go for the 30% and let's see what happens.


As long as the haircut limit is not reached, the internal conversions will keep the peg in place. Regardless of the market volume.

I'm in favor of increasing the haircut ratio (and print rate cutoff). I'm neutral on where to raise it to.

I added some thoughts about where we should potentially set the print rate in my comment to taskmaster below.

All makes sense to me.

Congratulations @blocktrades! You have completed the following achievement on the Hive blockchain and have been rewarded with new badge(s) :

You received more than 1355000 HP as payout for your posts and comments.
Your next payout target is 1360000 HP.
The unit is Hive Power equivalent because your rewards can be split into HP and HBD

You can view your badges on your board and compare yourself to others in the Ranking
If you no longer want to receive notifications, reply to this comment with the word STOP

To support your work, I also upvoted your post!

Check out the last post from @hivebuzz:

Feedback from the October 1st Hive Power Up Day
Hive Power Up Month Challenge - Winners List

Investigate possible changes required to increase haircut limit for Hive-backed dollars (HBD) in hardfork 26. Public discussion is still needed as to the amount to increase the limit (personally I'm inclined to either 20% or 30%, with my preference for 30% to allow for a robust expansion of HBD supply).

I hope I understand you correctly: are you referring to increasing the APR to 30% for HBD in savings?

No, I'm referring to the haircut ratio, it's not related to the APR%. Currently, whenever the supply of HBD (excluding HBD stored in @hive.fund) exceeds 10% of the virtual supply of Hive, the haircut rule takes effect and breaks the pegging mechanism between Hive and HBD that allows 1 HBD to be exchanged for one USD worth of Hive. Most stakeholders want to increase the allowed supply of HBD before the peg is broken, which would further stabilize the peg of HBD and allow for an increased supply, so we'll be testing to see if the required coding changes are as trivial as they sound.
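
In code, the rule is roughly this (a simplified sketch, not the actual hived implementation, which works with the median price feed and internal precision; the 0.30 below is the proposed limit, not the current 0.10):

```python
# Simplified sketch of the haircut rule; the real hived calculation uses the
# median price feed and satoshi-level precision. limit=0.30 is the proposed
# value discussed here, not the current consensus value (0.10).
def haircut_active(hbd_supply, virtual_hive_supply, hive_price_usd, limit=0.30):
    """hbd_supply excludes HBD held by @hive.fund; hive_price_usd is the feed price."""
    debt_ratio = hbd_supply / (virtual_hive_supply * hive_price_usd)
    return debt_ratio > limit

# Example: 30M HBD against 400M HIVE at $0.75 -> debt ratio 0.10, no haircut at a 30% limit.
print(haircut_active(30_000_000, 400_000_000, 0.75))   # False
```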

ok, thanks for the explanation!

The APR rate for HBD is set by consensus voting of the top 20 witnesses, no code change is required. Currently most are voting for 10% rate.

Investigate possible changes required to increase haircut limit for Hive-backed dollars (HBD) in hardfork 26. Public discussion is still needed as to the amount to increase the limit (personally I'm inclined to either 20% or 30%, with my preference for 30% to allow for a robust expansion of HBD supply).

I would say 30% is a very reasonable step increase for the debt ratio limit.

An increase from 10% to 30% is a rather sharp change. We could go in steps; for a starting point, somewhere between 15% and 29% seems reasonable.

This is because HIVE has many other use cases and utility beyond just printing a stablecoin. A higher debt ratio can compromise long-term economic models when prices fall and a bear market arrives. Anyway, the way the HIVE price is going up, we are seeing a decent amount of HBD being printed.

Once we enable smart contract functionality and layer 2, we may see more user adoption and stability.

Nice work. 👏

I'm in favor of setting the debt ratio to 30%. We've seen during the last few weeks that the current level does limit our capabilities to keep the HBD peg.

Can we get archive.org embeds for videos?

That sounds like a frontend issue, so pinging @quochuy.


What issue, @blocktrades?
@antisocialist, archive.org video embeds already work; I can see the video you embedded in the comment.


Hmmm, it's not showing for me.
It must be a peakd thing.
I'll ask @asgarth then.

Thank you for allowing those, quochuy.

Yea it’s probably not implemented on Peakd.

An interesting read.


Hello @blocktrades… I have chosen your post about "25th update of 2021 on BlockTrades work on Hive software" for my daily initiative to re-blog, vote and comment…
Let's keep working and supporting each other to grow at Hive!...