29th update of 2021 on BlockTrades work on Hive software

Below is a list of some of the Hive-related programming issues worked on by BlockTrades team during the past work period:

Hived work (blockchain node software)

Updates to resource credit (RC) plugin

We’ve implemented the changes to include costs for verifying cryptographic signatures during resource credit (RC) cost calculations for transactions.
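
To give a feel for the idea, here’s a minimal sketch (the real calculation lives in hived’s C++ RC plugin; the constant and function names below are made-up placeholders, not real chain parameters):

```python
# Illustrative sketch only: a transaction's RC cost now includes a term
# proportional to the number of signatures it carries, since each signature
# requires a relatively expensive cryptographic verification.

SIGNATURE_VERIFICATION_COST = 200_000  # hypothetical RC units per signature


def transaction_rc_cost(size_cost: int, execution_cost: int, num_signatures: int) -> int:
    """Total RC cost: the pre-existing components plus a per-signature term."""
    return size_cost + execution_cost + num_signatures * SIGNATURE_VERIFICATION_COST


# A multisig transaction (3 signatures) now costs measurably more RC than a
# single-signature transaction of the same size:
print(transaction_rc_cost(size_cost=50_000, execution_cost=10_000, num_signatures=3))
```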

Now we’ve begun analyzing whether the costs computed by the resource pools are reasonable.

We also created more tests for RC delegations and reported some errors to @howo based on the test results. We’ll resume creating more tests after those issues are fixed.

Miscellaneous work on hived

We fixed a spurious error message that was sometimes displayed in connection with the --exit-before-sync command-line option.

Also, yesterday I began analyzing how transactions are treated when an account doesn’t have enough RC to get them included in the blockchain. While analyzing a high-traffic period on the p2p network, I had seen evidence that such transactions linger until the account either regenerates enough RC or the transaction expires, which in my opinion is undesirable from a performance standpoint.

I’m currently testing an experimental version of hived that should reject such transactions and avoid propagating them to the rest of the p2p network.
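
Roughly speaking, the behavior being tested looks like this (a Python sketch, not the actual C++ code; all names and numbers are illustrative):

```python
# The idea: instead of letting an RC-starved transaction sit in the pending
# queue until the account regenerates RC or the transaction expires, reject
# it immediately and never propagate it to other p2p nodes.

from dataclasses import dataclass


class InsufficientRC(Exception):
    """Raised when an account lacks the RC to pay for a transaction."""


@dataclass
class Account:
    name: str
    rc_mana: int  # current resource credits


def admit_transaction(account: Account, rc_cost: int, pending: list) -> None:
    if account.rc_mana < rc_cost:
        # Reject up front: a rejected transaction is never queued and never
        # broadcast to peers, so it cannot linger on the network.
        raise InsufficientRC(f"{account.name} needs {rc_cost} RC, has {account.rc_mana}")
    account.rc_mana -= rc_cost
    pending.append((account.name, rc_cost))  # would also propagate to peers here


# Example: an account with too little RC gets an immediate rejection.
queue: list = []
try:
    admit_transaction(Account("alice", rc_mana=100), rc_cost=500, pending=queue)
except InsufficientRC as err:
    print(err)  # alice needs 500 RC, has 100
```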

Hive Application Framework: framework for building robust and scalable Hive apps

Most of our work lately continues to be HAF-related:

Experimenting with sql_serializer implementation and operation filtering

We’re modifying the sql_serializer to directly write some additional tables to the database to indicate what accounts are affected by which blockchain operations.

The hope is that this will result in a speedup versus the current method, where each HAF app has to re-compute this data on the fly.
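
As a rough illustration of what this buys a HAF app (the table, column names, and connection string below are hypothetical placeholders, not the final HAF schema):

```python
# Hypothetical sketch of the extra table sql_serializer would populate and how
# a HAF app could use it. Assumes psycopg2 and a reachable HAF database.

import psycopg2

conn = psycopg2.connect("dbname=haf_block_log")  # placeholder DSN
with conn, conn.cursor() as cur:
    # One row per (operation, affected account) pair, written during sync:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS account_operations_sketch (
            account_name TEXT    NOT NULL,
            operation_id BIGINT  NOT NULL,
            block_num    INTEGER NOT NULL
        )
    """)
    # An app can now look up everything affecting one account with a simple
    # query, instead of decoding every raw operation to recompute this itself:
    cur.execute("""
        SELECT operation_id, block_num
          FROM account_operations_sketch
         WHERE account_name = %s
         ORDER BY block_num
    """, ("some-account",))
    rows = cur.fetchall()
```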

We have a first pass at this done now, but the sql_serializer phase takes twice as long as it did before. Since it doesn’t seem like it should take that much longer, we’re going to see if there is anything that can be done to optimize it.

We’re also planning to add an option to sql_serializer to allow filtering of what operations get saved to the database. This should be useful for lowering the storage requirements and improving performance of HAF servers that are dedicated to supporting a specific HAF app.
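
Conceptually, the filtering looks something like this (the real feature will live inside the C++ sql_serializer, and its option name and syntax weren’t settled at the time of writing; the operation names below are just examples):

```python
# Conceptual sketch: a HAF server dedicated to one app only persists the
# operation types that app actually consumes. The whitelist is hypothetical.

OPERATIONS_TO_KEEP = {"vote_operation", "effective_comment_vote_operation"}


def should_store(op_type: str) -> bool:
    """Return True if this operation type should be written to the database."""
    return op_type in OPERATIONS_TO_KEEP


# Everything else is skipped during sync, shrinking disk usage and speeding
# up both the initial sync and later queries.
ops_in_block = ["vote_operation", "transfer_operation", "custom_json_operation"]
stored = [op for op in ops_in_block if should_store(op)]
print(stored)  # ['vote_operation']
```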

New example HAF app: balance_tracker

We created a new HAF application called balance_tracker that maintains a history of how each account’s coin balances change over time (e.g. you can plot a graph of how your HIVE and HBD balances change over time). It’s not completely finished yet, but it already works well as a prototype and has excellent performance (it processed 58M blocks in 2.5 hours), so it’s a great example of how fast HAF apps can be.

It’s also not fully optimized for efficiency yet (account names are stored as strings in all the records), but even so its disk usage isn’t too bad (~22GB to store asset histories for every account on the blockchain).
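
For a feel of what the app stores, here’s a hypothetical sketch of the kind of schema and query involved (the actual balance_tracker schema may differ); note the account name stored as TEXT in every row, which is exactly the unoptimized detail mentioned above:

```python
# Hypothetical sketch of a balance-history table and the query behind a
# balance-over-time graph. Assumes psycopg2 and a Postgres database; all
# names and the DSN are illustrative, not balance_tracker's real schema.

import psycopg2

conn = psycopg2.connect("dbname=haf_block_log")  # placeholder DSN
with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS balance_history_sketch (
            account   TEXT    NOT NULL,  -- unoptimized: full string per row
            symbol    TEXT    NOT NULL,  -- 'HIVE' or 'HBD'
            balance   BIGINT  NOT NULL,  -- amount in the asset's base units
            block_num INTEGER NOT NULL   -- block where the balance changed
        )
    """)
    # Everything needed to plot one account's HBD balance over time:
    cur.execute("""
        SELECT block_num, balance
          FROM balance_history_sketch
         WHERE account = %s AND symbol = 'HBD'
         ORDER BY block_num
    """, ("some-account",))
    points = cur.fetchall()  # (block_num, balance) pairs, ready to graph
```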

The primary reason for creating this app was to serve as an example for new programmers starting out with HAF, but I think it is useful enough that we will improve it further and add a web UI for it (though we still need to add an API interface to the data before that).

Each time we create one of these small apps, we learn a little bit more about the most efficient way to build them, so we hope these examples will serve as guidelines for future app development.

Optimizing HAF-based account history app (Hafah)

We completed benchmarking for Hafah and achieved a new performance record for syncing the data to headblock with the latest version (4.9 hours to sync 58M blocks, about a 40% speedup over the pre-optimized version).

Some progress on optimized multi-threaded json-rpc server for HAF apps

We finished an initial implementation of the multi-threaded json-rpc server for HAF apps, but unfortunately, benchmarking it as an API server for Hafah showed its performance was worse than that of the previous async-io based json-rpc server.

We’ve identified the likely problems (one of which was continual creation of new connections to the SQL server), so we’ll update the server this week and re-run the benchmark after we’ve improved the implementation. I hope we can complete this task in the coming week, but if the upcoming optimizations aren’t sufficient, we may need to look at another json-rpc server implementation such as Sanic (Sanic is used by the Jussi proxy server, which is also implemented in Python).
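
As an illustration of the connection-reuse fix (a sketch, not the server’s actual code; the DSN and pool sizes are placeholders):

```python
# Reuse SQL connections via a pool instead of opening a new one per json-rpc
# request, which was one of the identified performance problems. This uses
# psycopg2's built-in ThreadedConnectionPool.

from psycopg2.pool import ThreadedConnectionPool

pool = ThreadedConnectionPool(
    minconn=4,                   # keep a few connections warm
    maxconn=32,                  # cap concurrent connections to Postgres
    dsn="dbname=haf_block_log",  # placeholder DSN
)


def handle_request(sql: str, params: tuple):
    """Serve one json-rpc call using a pooled connection."""
    conn = pool.getconn()        # cheap: borrows an existing connection
    try:
        with conn.cursor() as cur:
            cur.execute(sql, params)
            return cur.fetchall()
    finally:
        pool.putconn(conn)       # return it for the next request to reuse
```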

We are also creating a CI test for Hafah; it should be done in the next week.

Hivemind (social media middleware app)

We’ve tagged the final version of hivemind that supports postgres 10 for production deployment. We’re currently creating a database dump file to ease upgrading by API server nodes.

All new versions of hivemind after the one just released will require postgres 12. We’re planning to convert hivemind to a HAF-based app before we tag another production version, and because of the magnitude of the change to hivemind’s sync algorithm, that new version will need to undergo rigorous testing and benchmarking before we tag it for production use.

Condenser (code for hive.blog)

We deployed a new version of hive.blog with @quochuy’s change to display the amount of resource credits (RC) available as a circle around each account’s profile icon (it is also displayed at some other new places such as "Account stats" below a post being created).

Work in progress and upcoming work

  • Continue experimenting with generating the “impacted accounts” table directly from sql_serializer to see if it is faster than our current method, where Hafah generates this data on demand as it needs it.
  • The above task will also require creating a simplified form of Hafah.
  • Fix the algorithm used by sql_serializer to determine when it should drop out of massive sync mode into normal live sync mode (currently it drops out of massive sync too early).
  • Finish up work on multi-threaded jsonrpc server.
  • Run tests to compare results between account history plugin and HAF-based account history apps.
  • Finish conversion of hivemind to HAF-based app. Once we’re further along with HAF-based hivemind, we’ll test it using the fork-inducing tool.
  • Continue RC related work.

I don’t understand a single thing from your post but by god it’s good to have you keep developing and pushing Hive forward 🙌🏻

Yeah, I think my last couple of posts have been increasingly technical in nature, and have primarily been aimed at developers who may want to try some HAF in their applications.

Part of the issue is the weekly nature of the reporting: the goals achieved in that time are often small steps toward a larger goal, and those small steps are very technically oriented; only the larger goal is likely to be meaningful to regular users of the platform.

In this case the larger goal is making it easier and cheaper to create new Hive-based applications.

100% appreciate that. I’m pretty much the anomaly in your viewership (so to speak) who follows these updates but isn’t technical in any way. But I love to see the progress and the clear love and skill behind what you do 💪🏻
All power to you and your team for making great shit happen!

Hivemind (social media middleware app)
We’ve tagged the final version of hivemind that supports postgres 10 for production deployment. We’re currently creating a database dump file to ease upgrading by API server nodes.

My node is currently running hivemind v1.25.3 that was fully synced from scratch, and dump is available at https://gtg.openhive.network/get/hivemind

And do you see a difference in some way? I have no idea about it, so I ask :P

Of course, especially performance wise. If only I had more time for writing I would do Hive Pressure episode featuring Hivemind :-)

that would be really cool! :)

We need a system where people who don’t have staked HIVE can pay for their transfers with their balance. A good starting point would be 1 cent per transaction, which would be burned. This would increase the usage and revenue of the Hive blockchain and add deflationary pressure on HIVE.

I see your point, but it seems like anyone can just stake a small amount of Hive, so why would they want to pay transaction fees? And even if they have 0 HP, they can still do a small number of operations.

I see your point, but it seems like anyone can just stake a small amount of Hive, so why would they want to pay transaction fees?

For that, new users first need to learn about staking. And some dapps, such as games, require frequent transactions, a volume that a small stake may not cover.

And even if they have 0 HP, they can still do a small number of operations.

Yes, but what if that new user is using a Hive dapp and has no idea how Hive works as a blockchain? He just wants to play his favorite game. Since most other blockchains have a fee structure, fees would be helpful for onboarding existing crypto users to Hive. It’s basically a UX thing.

This issue could be fixed by making it possible for users to buy resources and make as many transactions as they want. It’s a win-win for both Hive and the novice users, who make up over 75% of the crypto community. People want things simple and easy. I know people who have thousands of dollars in crypto but don’t know how to make an on-chain transaction; they just use Binance.

Congratulations @blocktrades! You have completed the following achievement on the Hive blockchain and have been rewarded with new badge(s):

You received more than 1420000 HP as payout for your posts and comments.
Your next payout target is 1425000 HP.
The unit is Hive Power equivalent because your rewards can be split into HP and HBD

You can view your badges on your board and compare yourself to others in the Ranking
If you no longer want to receive notifications, reply to this comment with the word STOP

To support your work, I also upvoted your post!

Thank you for all your hard work. We appreciate you <3


Embedded tweet: https://twitter.com/Hivebull/status/1461257411185061892
The rewards earned on this comment will go directly to the person sharing the post on Twitter as long as they are registered with @poshtoken. Sign up at https://hiveposh.com.

Very useful for Hive users.
👍

Really great job. I think it’s great news for every Hive user!

@blocktrades, you are doing excellent work, keep it up!

We deployed a new version of hive.blog with @quochuy’s change to display the amount of resource credits (RC) available as a circle around each account’s profile icon (it is also displayed at some other new places such as "Account stats" below a post being created).

While this is a good idea, I feel it would be more useful to show voting power, as most users’ RC is going to stay at ~100%.

When I saw the new feature, I had the same thought that it would be nice to also show VP there, maybe with an additional circle, for example.

Also, yesterday I began analyzing how transactions are treated when an account doesn’t have enough RC to get them included in the blockchain. While analyzing a high-traffic period on the p2p network, I had seen evidence that such transactions linger until the account either regenerates enough RC or the transaction expires, which in my opinion is undesirable from a performance standpoint.

Did I instigate this little discovery with my dumbness of not understanding why my node was lagging and assuming it was the p2p plugin? =p

Yes, it was while looking into your report about high traffic that I saw this strange behavior in the RC code.

I've never looked into this code before, but my initial feeling is that the code isn't behaving exactly as the original coder intended.

Hello @blocktrades… I have chosen your post about “29th update of 2021 on BlockTrades work on Hive software” for my daily initiative to reblog, vote, and comment…
Let's keep working and supporting each other to grow at Hive!...

Congratulations @blocktrades! Your post has been a top performer on the Hive blockchain and you have been rewarded with the following badge:

Post with the highest payout of the day.

You can view your badges on your board and compare yourself to others in the Ranking
If you no longer want to receive notifications, reply to this comment with the word STOP

Thanks for the detailed update. Also, thanks for being the authority for my account, lol.

This is likely off-topic, but is there a primary reason that HBD has never been effectively pegged to $1? After all these years across both Steem and Hive, and with my limited technical understanding, it just doesn’t add up. So many other chains seem to have this stabilized.

There's a reason it wasn't pegged well in the past: it was only pegged in one direction. That is no longer true, and from my perspective, the peg works really well now.

The farthest it has moved from the peg since the last peg improvement was about 1.20, which is not too bad in my opinion (we’ve seen fluctuations that high between the Euro and USD).

Also, even that 1.20 fluctuation could easily have been squashed by stakeholders, but it was so profitable to stakeholders and the DHF that no stakeholder really wanted to. It was probably a once-in-a-lifetime chance to make some bank for the DHF, and most of the big traders involved understood the situation well enough to keep their profits at a reasonable level in order to keep the profit train going. I think none of us were too concerned about keeping the peg that tight at the time, because there just weren’t many commercial uses of HBD at that point that would be disrupted by the slight rise in its price.

But I don't think we're likely to see that situation repeat again (although it would be nice if it did, assuming commerce use hasn't picked up by then) as I suspect that whoever tried to push the price up learned an important lesson about how strong the new pegging mechanism is, even though it is decentralized in nature.

We have an extremely forceful mechanism for driving the price down to 1.05 via the HBD conversion operation. For tighter pegging beyond that point, the DHF can do the rest as it grows in size and the HBD stabilizer gains trading power. I believe that eventually, sentiment combined with hard lessons for traders who try to break the peg will keep it pretty much at $1 on the upside (well, probably a little more than $1 if more external traders become aware of the interest they can receive for holding it, in which case witnesses should be able to influence the price somewhat through updates to the interest rate).

If you look at the low side of the peg, it doesn’t tend to go below 0.90 and usually hangs around 0.95 to 1.00. Again, that slight variation is just a source of profit for stakeholders and the DHF, who can buy it up below the $1 price, then convert it to $1 worth of Hive. This cycle acts as a decent incentive for stakeholders to support the price and profit from their virtuous behavior :-)
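
To make the arithmetic of that cycle concrete (illustrative numbers only; this ignores exchange fees and the delay built into the conversion operation):

```python
# The downside cycle described above: buy HBD below the peg, then use the
# conversion operation to turn each HBD into $1.00 worth of HIVE.

hbd_bought = 10_000           # HBD purchased on the open market
market_price = 0.95           # USD paid per HBD (below the $1 peg)
cost = hbd_bought * market_price            # $9,500 spent
value_after_conversion = hbd_bought * 1.00  # conversion yields $10,000 of HIVE

profit = value_after_conversion - cost
print(f"profit: ${profit:,.0f} ({profit / cost:.1%})")  # profit: $500 (5.3%)
```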

As the trading volume for HBD increases, I expect the trading range around the peg to narrow, because traders will be able to profit more from smaller spreads between the trading price and the conversion price when they can make more trades (less profit per trade but they can make it up in volume, so there will be more price competition among traders).

One final factor that will improve the lower side of the peg is the change we're making in this upcoming hardfork to allow the HBD supply to increase. This will reduce the chance for a haircut, which is probably the main threat to sustaining HBD at the value of $1 right now (it's not a high risk even now, but the more headroom the better).

So, basically, I think the peg rules are “good enough”. We could tweak them more, but I think it is better to wait and see if the peg continues to hold well under the current rules.

I personally think the slight variation around the peg is a strong favorable point for HBD, since it makes it hard to attack via the regulatory rules that seem likely to be aimed at most stablecoins soon.

Thanks for taking the time to thoroughly explain this.

Hello, I just wanted to let you know that I followed your account. Hopefully we can be friends. If it’s not a hassle, please visit my account; who knows, you might like my posts.

Will the power-down period be shortened from 13 weeks to, let’s say, 4 weeks or something like that?

I think the most likely thing will be that we allow an insta-powerdown for a fee, as suggested by @theycallmedan in a post he wrote a while back (see his post for more details on the idea). But I'm not sure we have time to code it up in this hardfork.

Yeah, I heard about it during the Hive Fest.

Excellent publication, esteemed friend. It’s a fitting occasion to thank you for the visits you make to my publications on yoga. I wish you every success in your publishing work and in curating our works. Congratulations. @omarrojas

Why are you downvoting my comments with other people?

I guess you should contact spamminator to find out

Why? I thought this was a permissionless, trustless blockchain. Why should I let the ego of other people influence my experience of a blockchain?

This is good software for managing clinics online, named Asyster.

Top witness, no. 1 in consensus, $10M in the account, yet bitter enough to downvote a "comment" between me and someone else? Huh. Power corrupts absolutely.

Posted via inji.com