Core development proposal year 7

Posted in #core

Hello everyone!

I have had the pleasure of working on Hive for six incredible years, and I am grateful for the support you have all given me. I would like to carry on working as a core developer and contributing to Hive for a seventh year.

Who am I


I have been on this chain for more than 9 years now.
I went from regular blogging, to contributing to open source on Hive, to building dapps and running a top-20 witness (@steempress, now deprecated). Today I sit at rank 21 with my renewed witness (@howo). I became much more involved in core development when we forked away from Steem, rising to the occasion as we needed every hand we could get to birth the chain. I contributed to the soft fork that locked Justin Sun's funds and later to the very first hard fork that created Hive. Ever since, I've been working as a core developer on Hive and Hivemind, implementing sensible requests from the community and hosting the monthly core dev meetings, where core contributors across the ecosystem get together to collaborate on ideas and share what they are working on.

If you're interested in my full journey, I made a throwback post retracing most of my activities on Steem and then Hive here a while ago: https://peakd.com/hive/@howo/my-hive-story

My work

Over the years I've shipped a number of features you may be familiar with: recurrent transfers, RC delegations, the Mesh API (coinbase integration), to name a few. Rather than revisiting those at length, I'd like to focus on what I've been building more recently.

Hard Fork 28

HF28 shipped a few months ago and included a bunch of my features:

Multiple recurrent transfers for the same sender/receiver pair: Previously, you could only have a single active recurrent transfer between any two accounts. This was a meaningful limitation for builders: imagine wanting to run both a subscription payment and a separate payment stream to the same account. HF28 lifts that restriction entirely, allowing any number of concurrent recurrent transfers. This is what peakd leverages for its open market feature.

Removing DHF HBD from inflation calculations: This one is directly relevant to the current inflation debate (more on that later). Before this change, there was a risk of an uncontrolled inflationary spiral: if the HIVE price dropped far enough, the HBD sitting in the DHF would represent a growing share of market cap, which would increase inflation, which would push the price down further, which would increase inflation again, and so on. The only defense was a proposal to burn DHF funds, but the protocol caps burns at 1% of the fund per day, which is far too slow to react to a fast-moving market event (and would also force us to burn most of the DHF). This change eliminates that risk entirely. A side effect is that inflation via the reward pool and witness rewards dropped significantly post hard fork.
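To make the spiral concrete, here is a toy model of the debt ratio. The numbers and the formula are deliberately simplified placeholders (the real chain uses median prices and virtual supply), but they show how counting DHF HBD makes the ratio climb much faster as the HIVE price falls:

```python
def debt_ratio(hbd_supply: float, hive_supply: float, hive_price: float) -> float:
    """Simplified debt ratio: dollar value of HBD as a share of HIVE market cap.
    The real chain uses median prices and virtual supply; this is a sketch."""
    return hbd_supply / (hive_supply * hive_price)

# Illustrative figures only, not real chain numbers.
circulating_hbd = 10_000_000   # HBD outside the DHF
dhf_hbd = 20_000_000           # HBD sitting in the DHF
hive_supply = 400_000_000

for price in (0.50, 0.25):
    with_dhf = debt_ratio(circulating_hbd + dhf_hbd, hive_supply, price)
    without_dhf = debt_ratio(circulating_hbd, hive_supply, price)
    # Both ratios double when the price halves, but the one counting DHF
    # HBD starts three times higher and crosses any threshold much sooner.
```

With DHF HBD excluded, a price drop no longer feeds back into the ratio through funds that were never circulating in the first place.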

Community Features

This year has been focused on a lot of in-depth reworks. Thanks to big advancements in agentic coding (AI), many tasks that I've been putting off for years because they would take too long are now possible and being tackled.

For instance, I've migrated most of the community code from Python to PgSQL, doubling the speed at which we process those transactions: the logic now lives in optimized SQL functions that run directly in the database instead of round-tripping through Python.

Another big rework completely changed the way notifications are processed for communities, enabling a lot of UX improvements for users and moderators. You now know when your post gets muted (and why!). When your role or title changes, you get notified. Users can now flag a post for moderators to review if they think it broke the rules.

There's also a bunch of features that are ready to ship but haven't been included in the latest release yet:

In line with the better UX for communities, I've added a complete moderation API so community owners can manage their communities properly. You'll be able to pull up any user and see their full history: how many times they've been flagged or muted, what actions were taken and by whom, with full date-based pagination so you can browse through everything. There's also a moderation stats endpoint that gives an overview of your team's and users' activity. The idea is to give community owners real tools to make informed decisions instead of guessing whether someone deserves a permanent ban or just a warning.
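The date-based pagination pattern can be sketched as a cursor loop. Everything below (the field names, the in-memory log standing in for the database) is illustrative, not the actual API:

```python
def page_moderation_log(events, before=None, limit=50):
    """Date-based pagination over a moderation log, newest first.

    `events` is a list of (timestamp, action) tuples, where timestamps are
    anything sortable (epoch seconds, datetimes). `before` is the cursor
    returned by the previous call; pass it back to fetch the next page.
    Field names and shapes are illustrative, not the real API's.
    """
    newest_first = sorted(events, key=lambda e: e[0], reverse=True)
    if before is not None:
        newest_first = [e for e in newest_first if e[0] < before]
    page = newest_first[:limit]
    cursor = page[-1][0] if page else None  # None means no more pages
    return page, cursor

log = [(1, "flag"), (2, "mute"), (3, "unmute"), (4, "flag"), (5, "ban")]
first, cur = page_moderation_log(log, limit=2)          # two newest entries
second, cur = page_moderation_log(log, before=cur, limit=2)  # next two
```

A date cursor stays stable even as new moderation events are appended, which is why it beats offset-based paging for an append-heavy log.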

On the notification side, several users have asked for the ability to "subscribe" to a post and get notified of every reply. Right now, if you make a post and a conversation starts in the comments, you have no way to follow it without manually checking back. With this feature, you'll be able to subscribe to any post or comment and receive a notification for every new reply in that thread. It works as a simple custom_json operation on-chain, so any frontend can integrate it easily. There's a cap of 16 subscriptions per user to keep things reasonable, and unsubscribing is just as straightforward.
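As a sketch of what a frontend integration could look like, here is one way to build the subscribe operation. The op id ("notify") and the payload schema are my own assumptions for illustration, not the finalized on-chain format:

```python
import json

def build_subscribe_op(account: str, author: str, permlink: str,
                       subscribe: bool = True) -> dict:
    """Build a custom_json operation (un)subscribing `account` to a post.

    The id ("notify") and payload shape here are illustrative assumptions;
    check the release notes for the actual schema before integrating.
    """
    action = "subscribe_post" if subscribe else "unsubscribe_post"
    payload = [action, {"author": author, "permlink": permlink}]
    return {
        "required_auths": [],                 # posting authority is enough
        "required_posting_auths": [account],
        "id": "notify",
        "json": json.dumps(payload),
    }

op = build_subscribe_op("alice", "howo", "my-hive-story")
```

Because it only needs posting authority, any frontend can broadcast this on a user's behalf without touching active keys.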

For You Feed

Finally, my most recent work tackles one of Hive's biggest usability gaps: content discovery. Right now, finding content you actually care about means manually jumping between communities, trending pages, curator feeds, and reblogs. That was fine in the Myspace era, but it's not how people expect social media to work in 2026.

I'm building a "for you" feed - a single unified feed that pulls posts from all your sources (communities, follows, trending, tags) and ranks them based on your actual behavior. If you start commenting on gardening posts, more gardening shows up. If you interact with a specific author daily, their new posts surface first. If your close mutual connections are all engaging with a post from someone you don't follow, it gets surfaced as a discovery candidate.

Under the hood, every post gets scored on multiple signals: how much you interact with the author, comment velocity, payout quality, freshness, social proof from mutual connections, and community/tag affinity. There are also diversity rules to prevent any single author from dominating and to ensure a healthy mix of followed content and new discoveries.
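A minimal sketch of that scoring-plus-diversity idea: the signal names come from the paragraph above, but the weights and the per-author cap are made-up placeholders, not the real ranker's values:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    author_affinity: float   # 0..1, how often the viewer interacts with this author
    comment_velocity: float  # 0..1, normalized comments per hour
    freshness: float         # 0..1, decays with post age
    social_proof: float      # 0..1, share of mutual connections engaging

# Hypothetical weights; the real ranker tunes these from user behavior.
WEIGHTS = {"author_affinity": 0.4, "comment_velocity": 0.2,
           "freshness": 0.2, "social_proof": 0.2}

def score(p: Post) -> float:
    return (WEIGHTS["author_affinity"] * p.author_affinity
            + WEIGHTS["comment_velocity"] * p.comment_velocity
            + WEIGHTS["freshness"] * p.freshness
            + WEIGHTS["social_proof"] * p.social_proof)

def rank(posts, max_per_author=2):
    """Rank by score, then apply a simple diversity rule:
    no single author may occupy more than max_per_author slots."""
    taken = {}
    feed = []
    for p in sorted(posts, key=score, reverse=True):
        if taken.get(p.author, 0) < max_per_author:
            feed.append(p)
            taken[p.author] = taken.get(p.author, 0) + 1
    return feed
```

The diversity pass is what keeps a prolific favorite author from crowding out discovery candidates, even when their raw scores dominate.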

This is still a work in progress. It's computationally heavy, and because we want to stay as decentralized as possible, we can't just throw more hardware at it. But the foundation is there and I'm building it up piece by piece. I'm also exploring extending this to post-level recommendations: think "if you liked this post, check out this one."

If you're interested in the full technical breakdown, I wrote about it here: https://peakd.com/hive/@howo/content-discovery-feeds-and-decentralized-social-media

The small tasks™

Then there's all the smaller stuff that doesn't always make it into changelogs but matters just as much: fixing bugs that users and Dapp developers report, speeding up slow queries, plugging gaps in input validation, and keeping the devportal docs accurate and up to date. It also covers squashing UI annoyances on Block Explorer and Condenser (dark mode glitches, broken pickers, missing translations), cleaning up old code nobody needs anymore (dead MySQL references, deprecated tables, leftover files), tweaking APIs when Dapp teams ask for something reasonable (extra filter params, cleaner response formats, consistent errors), and making tests less flaky. None of it is glamorous, but it's what keeps things running smoothly and saves everyone else headaches down the line.

A word on the current spending debate

There has been a lot of discussion lately about DHF spending, inflation and price, and I want to address it directly rather than pretend it isn't happening.

I am reducing my proposal by approximately 14%, going from 350 to 300 HBD/day. I think some belt-tightening is healthy for the ecosystem, and I want to lead by example rather than just talk about it.

That said, I think it's misguided to focus all the pressure on reducing spending. We are living through a moment where AI is changing the economics of software development.

That can be used in one of two ways: we can go at the same pace for a third of the cost, or we can go ten times faster for the same cost. Every serious competitor in this space is choosing the second option. Beyond the promise of better inflation numbers, I don't see a concrete plan for what Hive does with the savings. I'd much rather see us continue to use all the resources available, move fast, and take a real shot at growing and succeeding than limp along into irrelevance while our developers quietly migrate to other ecosystems.

What's next

In the short term, the priority is getting the features that are already done into production: the moderation API and post subscriptions. These are built, tested, and waiting for a release.

The for-you feed is the big ongoing project and will likely last for a while as I fine-tune it with feedback and monitor performance impacts. The foundation is solid and I'm actively working on it. I'll also be looking into expanding the algorithm to post-level suggestions ("if you liked this, check this out") to improve content virality, keep users on front ends longer, and make our chain more valuable.

I'll also be spending time on the reputation system. It's been broken for a while, and there are ways to make it a relevant metric again. Relatedly, I'll be tackling issues around certain users spamming the chain and improving the experience for normal users, so they can effectively mute spammers into nonexistence. This has already started with some of my API changes, but there is more work to do.

There is also a lot of optimization work waiting to be done, especially now that making large overhauls is so much cheaper. Very much in line with what I've done with communities, there are plenty of opportunities to speed things up significantly.

On top of that, AI has gotten incredibly good at finding vulnerabilities, as we've seen in recent developments. This is both a good and a bad thing — it's now a race to leverage those tools to find and fix vulnerabilities before they are exploited. The Hive ecosystem is no different, and I've already started this work with input validation hardening and XSS fixes on Condenser and HiveD. Expect more of this going forward.

Beyond that, I'll keep doing what I've always done: fixing bugs, responding to feedback from users and Dapp teams, keeping the devportal and docs up to date, and hosting the monthly core dev meetings. The work is rarely glamorous but it's what keeps the chain healthy and the developer experience good.

If you've followed my work over the years, you know what you're voting for. More of the same, but faster. Note that this list isn't exhaustive. I'll definitely be building more, but priorities shift quickly, especially now with AI, and we adjust month to month with the rest of the core team. For instance, once the light accounts specification is finalized, I'll likely lend a hand on the implementation side.

Voting

Here is an easy link to vote on the proposal:

https://peakd.com/proposals/371
https://ecency.com/proposals/371

You can view all proposals on:

https://wallet.hive.blog/proposals
https://ecency.com/proposals
https://peakd.com/proposals

(Make sure to vote on the upcoming proposal and not the old one!)

Closing words

If you have any questions, please feel free to ask them in the comments!

@howo


You got it! Happy to help you help us.
Much appreciation for your efforts @howo

Thank you!

There is currently a lot of debate about the risks of quantum computing for Bitcoin and cryptocurrencies. Is Hive addressing this issue?

Hi! There's a lot of debate about those risks, but realistically they are nonexistent for now. The paper is theoretical, not about actual hardware that could exist in the near future.

I'm not saying we don't need to tackle it, but it's a very low priority at the moment. We will watch what is done in the space research-wise, and if the threat becomes credible we will shift priorities.


You are reducing the proposal by approximately 17%, but coding costs have dropped far more by now. Given the severe situation Hive is currently in, proposals need to leverage AI's potential far more than that, let alone break down the spend and provide more transparency. How can a proposal be so vague?
Specifically, what portion is personal compensation vs. other expenses?
If other expenses, which ones and how much?

@howo, the DHF has over 23M HBD. How much of it is going to be burnt?

Glad you are here, howo.

Can we get bitchute.com and xcancel.com links to embed?

All the utub links get broken by my anonymizer so that utub doesn't get a list of what I have seen, so if the crowd had an alternative to utub, that would be great.

Xcancel is a nitter instance, it anonymizes x.com traffic, so that x.com doesn't get a list of all the things I see.

These things would allow a broader, more non-corporate conversation on the chain.