18th update of 2021 on BlockTrades work on Hive software

in HiveDevs


Below is a list of Hive-related programming issues worked on by BlockTrades team since my last report.

Hive network upgraded to Equilibrium

The most significant accomplishment was the successful upgrade of Hive via hardfork 25 (the Equilibrium release). A lot of time was spent monitoring the upgrade and supporting Hive apps as needed during the transition.

The Equilibrium upgrade did have one unanticipated side effect: under the new curation rules, votes cast after the hardfork carried much more curation weight than votes cast before it, so votes cast in the days just before the hardfork generally earned very little in curation rewards. This was a temporary effect and has since resolved itself, because all posts now being actively voted on were created after the hardfork. Still, hopefully we’ll have enough traffic on the next iteration of our testnet to detect issues like this ahead of time.
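To see why the effect was so pronounced, here's a minimal pro-rata illustration (the weights below are invented; the actual HF25 curation formulas aren't reproduced here). Curation rewards on a post are split in proportion to recorded vote weights, so when post-hardfork votes carry much larger weights than pre-hardfork ones, the pre-hardfork voters' share of the pool shrinks toward zero:

```python
# Illustration only: weights are invented, not the real HF25 curation math.
# Curation rewards on a post are shared pro-rata by recorded vote weight,
# so small pre-hardfork weights are swamped by large post-hardfork ones.
pre_hf_weights = [10, 15, 5]          # hypothetical weights under the old rules
post_hf_weights = [4000, 2500, 3500]  # hypothetical weights under the new rules

total_weight = sum(pre_hf_weights) + sum(post_hf_weights)
curation_pool = 100.0  # units of reward split among this post's curators

for w in pre_hf_weights:
    print(f"pre-HF vote, weight {w:>5}: {curation_pool * w / total_weight:.3f} of the pool")
for w in post_hf_weights:
    print(f"post-HF vote, weight {w:>5}: {curation_pool * w / total_weight:.3f} of the pool")
```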

Hived work (blockchain node software)

Improvements to Testtools and tests used to verify hived functionality: https://gitlab.syncad.com/hive/hive/-/merge_requests/272

Added a new --exit-before-sync flag to hived’s command-line interface (useful for dumping a snapshot without then syncing; see https://gitlab.syncad.com/hive/hive/-/issues/66 for more details on why this option was added):
https://gitlab.syncad.com/hive/hive/-/merge_requests/232
https://gitlab.syncad.com/hive/hive/-/merge_requests/273
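
As a rough sketch of the intended workflow (only --exit-before-sync is confirmed by the merge requests above; the state_snapshot plugin name and --dump-snapshot flag are my assumptions about hived's snapshot support, so check `hived --help` for your build):

```python
# A sketch, not a verified invocation: start hived so that it writes a state
# snapshot and then exits before entering sync, rather than needing a manual kill.
import subprocess

subprocess.run(
    [
        "hived",
        "--data-dir", "/var/lib/hived",    # assumed data directory
        "--plugin", "state_snapshot",      # assumed snapshot plugin name
        "--dump-snapshot", "pre_upgrade",  # assumed flag to write a named snapshot
        "--exit-before-sync",              # new flag: exit after dumping, skip sync
    ],
    check=True,  # raise CalledProcessError if hived exits with a non-zero status
)
```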

We fixed the previously reported bug that required hived to be restarted after loading a snapshot:
https://gitlab.syncad.com/hive/hive/-/merge_requests/274

We have been analyzing the performance of our new “blockchain converter” tool for quickly creating testnets that mirror the mainnet, and we’ve identified some code associated with nonces as a potential bottleneck.

Hivemind (social media middleware)

We’ve added a new programmer to our hivemind team and he’ll initially be working on testing and minor bug fixes as a means of learning the code base. His first merge request is here:
https://gitlab.syncad.com/hive/hivemind/-/merge_requests/518

We added code for checking the consistency of the hive_blocks table (this is part of the previously mentioned plan to ensure robust operation in cases where hivemind’s postgres process shuts down suddenly or a rollback fails):
https://gitlab.syncad.com/hive/hivemind/-/merge_requests/516
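
As an illustration, a minimal sketch of one such check: verifying that block numbers in hive_blocks form a contiguous sequence (the num column name and connection string are assumptions; the actual checks in the merge request may differ):

```python
# A sketch of a gap check over hive_blocks, assuming a `num` column holding
# block numbers; a gap would indicate an interrupted write or a failed rollback.
import psycopg2  # assumed Postgres driver

conn = psycopg2.connect("dbname=hivemind")  # assumed connection string
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT b.num + 1 AS gap_start
        FROM hive_blocks b
        WHERE b.num + 1 < (SELECT max(num) FROM hive_blocks)
          AND NOT EXISTS (SELECT 1 FROM hive_blocks g WHERE g.num = b.num + 1)
        ORDER BY gap_start
        """
    )
    gaps = [row[0] for row in cur.fetchall()]
    if gaps:
        raise RuntimeError(f"hive_blocks is inconsistent; gaps start at {gaps}")
```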

We’re continuing work on improving the performance of the update_rshares function that runs immediately after a massive sync of a hivemind instance. We’re trying two alternatives: 1) changing massive sync so that it updates rshares for paid-out posts on the fly, reducing the work of update_rshares to updating rshares only for unpaid posts (this approach requires introducing a new hived virtual_operation), and 2) adding an index (at least temporarily, just after massive sync) to speed up update_rshares (see the sketch below). Both approaches are currently undergoing testing.
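
As a sketch of alternative (2): a temporary partial index created right after massive sync and dropped once the one-time update finishes. The table and column names (hive_posts, is_paidout) and the update_rshares call signature are assumptions for illustration; the real schema may differ:

```python
# A sketch of alternative (2): a temporary partial index so update_rshares
# only has to touch unpaid posts. Schema names are illustrative assumptions.
import psycopg2

conn = psycopg2.connect("dbname=hivemind")  # assumed connection string
conn.autocommit = True  # CREATE INDEX CONCURRENTLY cannot run in a transaction
with conn.cursor() as cur:
    cur.execute(
        """
        CREATE INDEX CONCURRENTLY IF NOT EXISTS hive_posts_unpaid_idx
        ON hive_posts (id)
        WHERE is_paidout = FALSE
        """
    )
    cur.execute("SELECT update_rshares()")  # assumed call signature
    cur.execute("DROP INDEX CONCURRENTLY IF EXISTS hive_posts_unpaid_idx")
```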

We also resumed research into why some hivemind nodes consume more memory than others. It has been suggested that the differences may stem from variations in the python or python-library installations on the different systems, and I’m inclined to believe that at this point, since we’re no longer seeing unexpected memory consumption on any of our production nodes running the latest official hivemind version. So if we’re unable to replicate the issue in our forthcoming tests, we’ll likely drop it soon, after merging in our diagnostic changes that better identify sources of memory usage.

Hive Application Framework (HAF)

Our primary dev for this work is currently on vacation.

I had hoped we would still be able to make progress on the psql_serializer plugin (which feeds data from hived to hivemind under the HAF system) in the meantime, but the dev tasked with it was tied up with other issues (e.g. the fix for the snapshot problem). A new dev has been assigned to work on psql_serializer this week (the previously tasked one is going on vacation for two weeks).

Condenser and wallet (open-source code base for https://hive.blog and other similar frontends)

We reviewed and merged in a number of community-contributed upgrades to condenser and its wallet.

From @quochuy: https://gitlab.syncad.com/hive/wallet/-/merge_requests/102

From @guiltyparties:
https://gitlab.syncad.com/hive/wallet/-/merge_requests/101
https://gitlab.syncad.com/hive/condenser/-/merge_requests/268

From @eonwarped: https://gitlab.syncad.com/hive/condenser/-/merge_requests/269

What’s next?

Several of our Hive devs are either on vacation now or going on vacation this coming week (they had been delaying their vacations to be available for the hardfork and any potential problems that might arise afterwards). So we’ll only have 8 BlockTrades devs working on Hive for the next two weeks, and our progress will inevitably slow somewhat during this time.

After all our Hive devs return from vacation, we’ll take a couple of weeks to begin planning what work to schedule for the next hardfork (HF26). I have some preliminary ideas for improvements that our team will work on, but we’ll make a full list of proposed changes, then begin to prioritize what we want to fit on the roadmap for HF26. My plan at this time is to stick to our existing “upgrade Hive protocol every six months” schedule if possible.

Also, as with previous hardforks, our roadmap isn’t set in stone, so we may consider making other proposed changes even after the initial roadmap is published, assuming the changes aren’t too big.

Note that the above process doesn’t mean we don’t have clear development goals prior to the completion of the HF26 roadmap. For one thing, we will be making performance upgrades to hived that don’t require an actual hardfork, and these changes will generally be released as they are completed.

Even more importantly, at this point our highest priority tasks revolve around the creation of the HAF framework and HAF-based applications, and this is all layer 2 work that doesn’t require any Hive protocol changes that would necessitate a hardfork. In other words, we can also release HAF and associated apps to the dev community as soon as they are ready, without the labor and scheduling issues involved in getting nodes to upgrade as part of a hardfork.


Threw a bit of spare change into HBD savings because why not. I'm not sure if it was mentioned or if there ever was any indication but for some reason I was under the impression I'd be able to convert to HIVE directly from savings. Takes roughly the same amount of time to convert as it would to remove from savings. Am I missing something? No option available to convert from savings using PeakD for instance. Never bothered me enough to dig further.

I don't think that operation exists in the blockchain. It's not a bad idea though.

Unless someone can build a strong case against it, I'd say throw it in next round.

A use case for savings is to keep funds under delayed access in case your credentials are somehow obtained by other parties. Is there another use case?

That 7% just sitting there for the taking might be interesting to some.

The feature would make savings dollars a little more liquid: without it, converting HBD held in savings takes roughly six days (about three to withdraw from savings, then the conversion period) rather than roughly three.
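
For concreteness, a sketch of the two operations involved today (field names follow my understanding of the Hive protocol's transfer_from_savings and convert operations; treat the details as illustrative):

```python
# Illustrative operation payloads for the current two-step path; field names
# reflect my understanding of the Hive protocol and may not be exact.

# Step 1: move HBD out of savings (takes ~3 days to arrive).
withdraw_op = ("transfer_from_savings", {
    "from": "alice",
    "request_id": 1,
    "to": "alice",
    "amount": "100.000 HBD",
    "memo": "",
})

# Step 2: roughly 3 days later, once the funds are liquid, start the
# HBD -> HIVE conversion (which itself takes ~3.5 days to settle).
convert_op = ("convert", {
    "owner": "alice",
    "requestid": 1,
    "amount": "100.000 HBD",
})

print(f"current path: ~{3 + 3.5} days; a direct savings convert: ~3.5 days")
```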

Very good idea!!!!!

It'll come in handy.

No, it's not possible right now. I'll add it to the list of possible changes.

Sounds good. But for now, give yourselves a pat on the back and enjoy that well-deserved vacation.

Impressively good hardfork, clearly it was well tested.

In addition to going well from a technical perspective the changes were well explained and well received.

One of the smoothest hardforks in forever, hope the devs enjoy their vacations!

indeed, amazing job with the HF! vacation well deserved :)

I am looking forward to the ideas for HF26!

Excited about HAF!
Any updates about RCs? I know @howo was working mainly on that, but is there any news?

We'll definitely be reviewing the RC system for possible improvements (e.g. with respect to rationalization of costs).

Can we get a trading bot à la dswap.trade for the internal market?

Also, if the 30-day payout window could make a comeback, with linear rewards(?) from day 7 to day 30, that might help retain more persistent content producers.
Or, to add some gaming to it, perhaps a curve that slightly favors days 12 to 18 for voting on old posts?

What are your thoughts on expanding consensus to 29 + 2 witnesses?
With a corresponding change to the overall inflation split:
Newb attraction pool: 50%
Staking HBD/HIVE rewards (3-day withdraw): 12%
Staking rewards (10-week withdraw): 20%
Witnesses: 15%
DAO: 3%

How much longer do you think the code will require this level of development?

Sorry for the delay responding, but that was a lot of suggestions at once, and this week has been even more busy for me than usual.

I don't think it should be too difficult to add an AMM between Hive/HBD if it looks like it will help liquidity. But before our team builds one, I'd need to review how these things are performing in the real world; I've mostly stayed away from them because I feel many AMMs are marketed with an overly rosy view.

30 day window could be possible, but I don't know what the performance tradeoffs would be yet.

Expanding consensus to 29+2 has one known disadvantage: it increases the time to transaction finality. It's not a huge increase though. Overall, I don't have a strong opinion on this issue as I haven't seen any real problems coming from the current number of witnesses. As far as the other changes go, I'm not eager to tinker with rewards ratios, as such changes are always controversial.
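
As a rough back-of-the-envelope on the finality point (assuming irreversibility requires confirmation from about three quarters of the active witness set, each producing one 3-second block; the exact threshold here is an assumption):

```python
# Back-of-the-envelope finality estimate; the 75% threshold is an assumption.
import math

BLOCK_INTERVAL_S = 3  # Hive produces one block every 3 seconds
THRESHOLD = 0.75      # assumed fraction of witnesses needed for irreversibility

for active_witnesses in (21, 31):  # current 20+1 vs the proposed 29+2
    blocks = math.ceil(active_witnesses * THRESHOLD)
    print(f"{active_witnesses} witnesses: ~{blocks * BLOCK_INTERVAL_S}s to finality")
```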

How much longer do you think the code will require this level of development?

It's hard to say. As far as hived goes, we've done many of the things I would characterize as critical to scalability of the chain. But there are still several important improvements I would like to make. My guess is we can complete them in the next 6 months.


RC delegation? My understanding is that doesn't require a hard fork.

Is that on the agenda anytime soon?

It will need a hardfork for anything that affects price/pay (like transaction cost).

I believe @howo is planning to work on this.

I am literally working on it as we speak 😄

Thanks for your work guys

After all our Hive devs return from vacation, we’ll take a couple of weeks to begin planning what work to schedule for the next hardfork (HF26). I have some preliminary ideas for improvements that our team will work on, but we’ll make a full list of proposed changes, then begin to prioritize what we want to fit on the roadmap for HF26.

Do any of you have anything that would help the majority of the users, instead of the whales? Something that would motivate people to keep using the platform. I have more than 1200 followers, but most of them are inactive nowadays; they left the platform months or years ago. I put hours of work and effort into my posts, and I comment a lot, but still almost no one cares about my posts. And the majority of the users are in the same shoes. This platform is quite disappointing, both in the monetary and in the social aspect, for most people.

There are some talented developers all over this chain but not one of them will be able to inject enough code into a human so they go around liking every post.

This is a tough gig anywhere you go. And because the majority who sign on in the social setting seemingly hope to be paid content creators, while very little focus is placed on attracting consumers for that content, the struggles some face here are only natural.

Check out the dlux.io frontend. You can do just about anything there like post videos, applications, VR/AR.

Do you have any specific ideas to improve the platform?

Here is mine: it's an upgrade for the dhive library. The patch is at the end of this post.

Feel free to open a Merge Request with your changes here https://gitlab.syncad.com/hive/dhive

I tried signing up, but the sign-up process didn't work. If you can program for dhive, then you definitely know how to use the Linux patch utility (which is way easier). You can look a gift horse in the mouth, but I'm not going to beg them or you guys to accept it; you're free to ignore it or to use it. Are you a developer on the dhive team? I will be abandoning my local fork once you guys get the functionality I added; I don't want to maintain my own fork.

I see you have a 'Producer Reward'. Are you a witness?

I suppose you can just update your dhive instance because new transactions are already added to the serializer (https://gitlab.syncad.com/hive/dhive/-/merge_requests/29)

Yes, I'm a witness.

Stay awesome.

I have just read your profile information. I think I've encountered dblog before while browsing. It's really positive to have things decentralized among many different domain names. The patch is hereby released under an MIT license. Sorry if the whitespace amounts are wrong.

Very good job with the last hard fork. Besides the juicy curation rewards for me, everything works fine. Super good.

The curation system works really well now, IMO: there's no reason to vote fast, and also no reason to use bots that much.

The next big thing, IMO, is the RC delegation pool. With this feature I can give away all the RCs I don't need to dapps that benefit Hive. If there will be a market, I can also imagine cool things there :)

Any date for it?


@howo is working on direct RC delegations, I personally hope that will be sufficient.

This sounds really good. Does it mean that if I delegate RCs to a dapp, the dapp can delegate them again to its users?

Or is this the pool version of it, and direct is like HP delegations?




I have an addendum:

First off, you have done a most excellent job here under difficult circumstances; please don't think my criticisms in any way besmirch that.
Even if this right now is the peak, you have helped tens of thousands of people to better enjoy their lives.


We forked out the vote selling bots.
When do we address the vote selling through leasing?

Personally, I prefer personal restraint.
The leasing concerns should die on the vine, imo.
The crowd enforcing its will is a better option than coding, for me.

If you can explain to me how leasing one's vote differs from selling one's vote singularly, perhaps I could come to terms with this seeming contradiction?

I don't view either vote bots or leased delegation to be inherently wrong. Both of these are ways to temporarily magnify voting power as opposed to buying the stake directly. It can be viewed as a "rent or buy" approach to hive power.

But the problem, from my point of view, was that most of the votes cast by vote bots went to poor content and were typically just self-votes by people looking to profit directly from an imbalance between what they received in author rewards and what they paid to the vote bots. The blockchain's economics before the EIP changes were terrible and incentivized self-voting by posters who had no long-term tie to the ecosystem's performance.

We forked out the vote selling bots.
When do we address the vote selling through leasing?

The vote selling bots weren't forked out per se, it was just users deciding to downvote them (i.e. "The crowd enforcing its will").

The EIP changes just made it more economical to do that (the change in the author/curator rewards ratio disincentivized vote bot usage somewhat, and free downvotes did the rest). These same changes potentially impacted votes cast with leased hive power as well, which raises the question: why hasn't it?

My guess is that people aren't as bothered by the way delegated hive power is being used to vote on posts in most cases nowadays. If someone did lease hive power and started using it abusively, I suspect it would draw downvotes. Just the same way it could happen even with staked hive power being used abusively.