
RE: Using Hive For Its Most Basic Value

in LeoFinance · 2 years ago (edited)

As a programmer, it does take some rethinking to make blockchain access as easy as a traditional backend, only because there are so many existing tools that make programming almost a reflex. Web 2 has become the same techniques applied over and over to different scenarios.

That is what HAF (Hive Application Framework) is for. Hardfork 26 will be very interesting for developers/node operators.


I am very excited about HAF. But I seem to be stuck choosing between:

  • Install locally
    Burning up 4TB+ of storage to host hived + HAF... per platform: Dev, Staging, Test, Live.
  • Use a hosted HAF
    For a project like an ERP, CRM, or KPI dashboard, hitting a WAN for scenarios that involve thousands of requests per second kind of limits what you can offer at scale, just considering network latency.

I'm thinking business apps here. And I do realize tooling doesn't just become enterprise-capable all of a sudden.

But, nonetheless, that's my conundrum at the moment. IMHO, core business adoption would really up the user count and the value of the chain, and I'm anxious to see that happen.

Don't get me wrong, I am thoroughly impressed with HAF and love that it handles all the intricate, detailed scenarios that exist on chain, replicated into Postgres so you can query it with SQL. That, in and of itself, is amazing and I love it.
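
Just to show why that excites me, here is roughly the kind of thing you can run straight against the HAF database. It's only a sketch: the hive.operations / hive.operation_types / hive.blocks names and the columns I use are my reading of HAF's default schema, and the operation type name follows its naming convention, so verify both against the schema your HAF version actually creates.

```sql
-- Sketch: daily vote_operation counts, straight from SQL.
-- Assumes HAF's default "hive" schema (hive.operations, hive.operation_types,
-- hive.blocks) with the columns below; verify against your own HAF instance.
SELECT
    date_trunc('day', b.created_at) AS day,
    count(*)                        AS vote_ops
FROM hive.operations o
JOIN hive.operation_types t ON t.id  = o.op_type_id
JOIN hive.blocks          b ON b.num = o.block_num
WHERE t.name = 'hive::protocol::vote_operation'  -- list real names with:
                                                 -- SELECT id, name FROM hive.operation_types;
GROUP BY 1
ORDER BY 1;
```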

Feedback like this is important.

I understand your dilemma. It's either storage or speed; you have to give up something. For a solution that gives access to the full blockchain without doing blockchain programming, 4TB+ per platform isn't very high. The tradeoff was either disk space or RAM, and we know which is cheaper and more scalable.

That is exactly right. Fully agree, I'd rather scale storage than memory. Tradeoffs abound when planning out a project and application offering.

I'm considering middle layers, almost like a cache, that read only a subset of data: smaller and faster for specific duties. As one example, account analysis (user growth, activity, etc.), where post content is not a concern. There might be things like that out there already, and I still wonder if HAfAH is intended to be that.
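
Roughly what I'm picturing, as a sketch: a narrow materialized view that keeps only per-account, per-day counts and drops post bodies entirely. The account_activity_daily name is made up, and the hive.* tables and columns are my assumption about HAF's default schema, so treat the details as illustrative.

```sql
-- Sketch of a narrow "cache" for account analysis: per-account, per-day
-- operation counts only, no post bodies. hive.* names assume HAF's default
-- schema; check your HAF version before relying on them.
CREATE MATERIALIZED VIEW IF NOT EXISTS account_activity_daily AS
SELECT
    a.name                          AS account,
    date_trunc('day', b.created_at) AS day,
    count(*)                        AS op_count
FROM hive.account_operations ao
JOIN hive.accounts   a ON a.id  = ao.account_id
JOIN hive.operations o ON o.id  = ao.operation_id
JOIN hive.blocks     b ON b.num = o.block_num
GROUP BY 1, 2;
```

Refreshing it on a schedule (`REFRESH MATERIALIZED VIEW account_activity_daily;`), or rebuilding it incrementally from an app's own sync loop, would keep the dashboard queries off the big operations tables.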

I know filtering based on a subset of operations is possible, so if you are interested in specific content, you can scale down the storage requirements significantly.
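
As I understand it, that sync-time filtering is configured on the hived/sql_serializer side, so check the HAF docs for the exact options in your version. Once only the operations you care about land in Postgres, working with that subset looks roughly like this; a sketch only, where the type name, the example block range, and the body column are illustrative and depend on your HAF version.

```sql
-- Sketch: pull only the operation types an app cares about (here custom_json,
-- which many Hive apps key on). The body column's name/type varies by HAF
-- version; list real type names with:
--   SELECT id, name FROM hive.operation_types ORDER BY id;
WITH wanted AS (
    SELECT id
    FROM hive.operation_types
    WHERE name = 'hive::protocol::custom_json_operation'
)
SELECT o.block_num, o.body
FROM hive.operations o
WHERE o.op_type_id IN (SELECT id FROM wanted)
  AND o.block_num BETWEEN 70000000 AND 70001000;  -- example range only
```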

That sounds perfect! I will definitely keep diving deeper into HAF to understand it better. Thank you so much for the guidance.