• Breaking News

    Saturday, July 25, 2020

    BTC ASERTI3 vs Grasberg in condensed, integer-only python3: a visual comparison

    ASERTI3 vs Grasberg in condensed, integer-only python3: a visual comparison

    Posted: 24 Jul 2020 11:45 AM PDT

    Thoughts on Grasberg DAA

    Posted: 24 Jul 2020 09:11 AM PDT

Preface: I'd given up hope on BCH ever getting a new DAA, and I'm pleasantly surprised that this seems to be happening, so I've decided to do a review. Sorry if this repeats points made by other people, and my apologies that I haven't had the energy to track all the discussion and development on this topic. Also, I originally intended to post this as a reply on another post, but it got too long.


Grasberg is a new difficulty algorithm designed by Amaury Sechet, based on the exponential difficulty algorithm family under recent discussion but with a modification that changes the block interval. The main practical consequences of BCH adopting it would be:

    • For the next 6-7 years, only ~128 blocks will be mined per day (i.e., 1 per ~11 minutes), instead of the usual 144.
    • Miners who are dedicated to BCH ('steady miners') should be much less penalized, compared to current DAA.
    • It is expected that this will in turn incentivise more miners to mine BCH steadily, which means a reduction in block time variance and thus better user experience. (Perhaps there will also be fewer switch miners, but my understanding is that this does not have a significant impact.)
    • Unlike a regular upgrade, all SPV wallet users will need to upgrade too.

At a high level, I judge the algorithm to be safe. HOWEVER:

I find the detailed algorithm unnecessarily complex for what it accomplishes; the same thing could be done much more simply. Complexity is a special concern for this consensus rule because not only do full nodes have to implement the algorithm perfectly, so do SPV wallets. People will have to reproduce this in Java, Python, etc., perfectly matching all the intricacies of each arithmetic operation. I feel like just a few more hours spent on clean KISS engineering here would save a lot of headache for everyone else down the road.

    On a related note, I also strongly urge to adopt additional timestamp control rules, regardless of the DAA. (see bottom)

    Since there is no spec, I am basing my review on this specific version of the source code; I assume this is the definition of Grasberg. I may have made some mistakes in understanding the code, and if so, this means that an actual specification and better code comments are sorely needed. I am going to focus on the underlying algorithm and ideas which other implementors must exactly reproduce, not the details of ABC implementation or C++. I will start out with a basic description:

    High-level description of algorithm

    The algorithm bases the difficulty of each block on the previous block's difficulty, increased or decreased according to an exponential function of the timestamp difference between the previous and previous-previous blocks. As explained in a document I wrote a while ago, simple algorithms of this kind allow a recursive understanding of the difficulty in terms of any past block (the ASERT idea), which in turn means we know that timestamp manipulations will always cancel out in the end. This is an important security property, though of course not the only important property.
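To make the recursive idea concrete, here is a minimal integer-only sketch of an absolutely-scheduled exponential (ASERT-style) retarget. The constants and the crude linear fractional approximation are illustrative only, not Grasberg's actual parameters:

```python
def asert_next_target(ref_target, time_delta, height_delta,
                      ideal_block_time=600, halflife=172_800):
    # Sketch of the ASERT idea: the next target depends only on a single
    # reference block, so intermediate timestamp manipulations cancel out.
    # 'exponent' is log2(change factor) in 16.16 fixed point.
    exponent = ((time_delta - ideal_block_time * height_delta) << 16) // halflife
    int_part, frac = divmod(exponent, 1 << 16)  # floor divmod keeps frac in [0, 2**16)
    # Approximate 2**(frac / 2**16) linearly, just to keep the sketch short;
    # a real implementation would use a polynomial approximation.
    target = (ref_target * ((1 << 16) + frac)) >> 16
    return target << int_part if int_part >= 0 else target >> -int_part
```

Note that because only the reference block matters, any sequence of intermediate timestamps yields the same end state, which is the security property described above.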

    Grasberg is not just this, rather it is tweaked slightly to vary the targeted block interval - why would this be? There is a philosophical question of whether in the long run, blockchains should aim for an exact average block time, or, whether an accumulated or historical discrepancy is fine. Discrepancies can happen due to bad design or they may be a 'natural' and intentional part of design from the fact that hashrate is ever-rising, as in the case of the Satoshi difficulty algorithm (ASERT too). The Grasberg modification is apparently based on the viewpoint that this discrepancy is wrong, and blocks really ought to have been mined at exactly 10 minutes per block all along. So it modifies the exponential to choose a slightly longer block time until this overmining injustice has been 'repaid', if you will.

    I'm not surprised at this idea, as I've thought about precisely this modification in the past, and my conclusion was (and is) that I can't see any serious technical issues with it. It does break the recursive property of the simple exponential, and so the strict security property I mentioned is now invalid. But practically, it is clipped to +/-13% so my intuition is that any attacks would be bounded to a 13% profitability, and it would be tricky to exploit so it is a minor concern. But this deserves a disclaimer: DAA algorithm repercussions are sometimes very nonintuitive, and any small tweak that 'shouldn't really hurt' can sometimes end up being a vulnerability, only realized much later on. Every tweak is playing with fire.

Personally, I don't have strong feelings either way about the philosophy of doing blockchain reparations to maintain an exact long-term schedule, so I think the high-level idea of this algorithm is fine. However, I can imagine that blockchain reparations are not widely desired, and I would personally not have pushed to include them unless there was clear demand.

    Comments on algorithm details

    As alluded to above, I think the algorithm has way more complexity and computation than is needed.

deterministicExp2 is an approximation to 2^x − 1, for 0 <= x < 1, in fixed-point math. It is a compound curve made up of 16 quadratic segments with discontinuities. It has ~1 ppm accuracy (if I have translated it correctly). I find this algorithm is much more complicated than is needed for its accuracy level, but also I don't know why this accuracy level was selected in the first place:

    • It uses truncated Taylor series, which are OK but not great approximations (Taylor series are not intended to be optimal when truncated). For the given segments, local curve fits of the same polynomial order would be significantly better: not only would fits be more accurate, they could also remove the discontinuities (if so desired). Conversely, the same accuracy could be obtained with far fewer segments.
    • The implementation requires local recentering of the coordinate which is just more steps. These are polynomials and there are already coefficient tables, can the coefficients not simply absorb the shifts?
    • At the same time, I do not know why a 1 ppm accuracy level was chosen. For the purposes of the DAA in question, a 1000 ppm accuracy is likely entirely sufficient. This can be achieved even with just two quadratic segments (fitted). A question that should be easy to answer if 1000 ppm is insufficient: why is 1 ppm sufficient instead of making this function have "perfect" accuracy (0.2 ppb for the 32-bit precision)? See also my remark on GetCompact inaccuracy, below.
    • There are many simpler alternatives; they are not hard to come up with. A third-order polynomial over the full range (as I've posted before) is nice since it allows 16-bit inputs, needs only a few 64-bit internal operations, and gives decent accuracy. That is one example of simplicity; there are many other simple approaches (I can work on this and provide more, if anyone cares).
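One concrete example of the simpler route described in the last bullet: a single fitted cubic over the full input range. The coefficients below are, to my recollection, the ones from the aserti3-2d proposal; treat them as illustrative. It approximates (2^(x/2^16) − 1)·2^16 in one shot, with no segments and no recentering:

```python
def exp2m1_16(x):
    # Approximate (2**(x / 2**16) - 1) * 2**16 for 0 <= x < 2**16 with a
    # single fitted cubic: three multiply-adds, then a rounding shift.
    assert 0 <= x < (1 << 16)
    return (195_766_423_245_049 * x
            + 971_821_376 * x * x
            + 5_127 * x * x * x
            + (1 << 47)) >> 48
```

The maximum error is on the order of 0.01%, i.e., roughly the 1000 ppm ballpark argued for above.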

computeTargetBlockTime computes the drift-adjusted target block time: an exponential, with clipped upper and lower values. I do not understand why an exponential curve was chosen here, and I can't think of any underlying reason or theory why it must be this curve rather than some other. As can be seen in the plot, however, it is actually nearly piecewise linear, curving very slightly upward. Why does this need to curve slightly up, instead of, say, slightly down, or just being straight? I have no idea, but I do know that a simple line is the cleanest design unless the curvature has a reason. Just use a line.

    As for the slope used for computeTargetBlockTime (600 seconds per two weeks), my feeling is this is a bit fast: when I was mulling over this same concept a long time ago, I was imagining a ~10x longer timescale than that. As a general rule a gentler slope is not going to cause a problem, but a too-fast slope certainly might, and there is no actual need to 'rush' towards correcting block times. The fact that the clipping points are placed at +/-2.5 days drift also makes me a bit nervous since this is comparable to the main algorithm's timescale (~2 days). Right now I can't imagine a specific interaction, but it just feels weird.

    (as for the clipping, the chosen value of 13% is probably a good level. I would not go any higher than 15%, and 5% or under wouldn't really be worth the trouble.)
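A hedged sketch of what the linear version argued for above could look like, using the numbers already on the table (600 s base interval, 13% clip reached at 2.5 days of accumulated drift); the function name and interface are mine:

```python
def linear_target_block_time(drift):
    # drift: accumulated schedule drift in seconds (positive = blocks were
    # mined ahead of schedule, so the chain should slow down).
    # A straight line through the origin that hits the +/-13% clip (78 s)
    # at +/-2.5 days (216000 s) of drift, matching the clip points above.
    adjustment = max(-78, min(78, drift * 78 // 216_000))
    return 600 + adjustment
```

This is integer-only, trivially auditable, and has no curvature to justify.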

ComputeNextWork and GetNextGrasbergWorkRequired require unnecessarily many 256-bit arithmetic steps. In fact, the involvement of 256-bit arithmetic can be avoided entirely! Remember, the whole idea and goal is to compute the next DiffBits based on the previous DiffBits, and all the intermediate steps (256-bit target or difficulty) have no independent meaning. These DiffBits are floating-point numbers of low precision, and the difficulty algorithm is exponential; therefore it is easier, simpler, and faster to do the whole thing without touching 256-bit arithmetic, with just a few easy computations on the mantissa and exponent of DiffBits. It would be equally accurate. Actually, it would be more accurate, since it would not use the poor arith_uint256::GetCompact algorithm, which does not even round properly. (This is by far the worst inaccuracy in the whole algorithm, by the way, and even with proper rounding there will always be intrinsic imprecision in the final DiffBits result.) I thought these were pretty obvious advantages of the exponential algorithms, and I'm surprised this route was not taken.
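To illustrate the rounding point: a compact-bits codec that rounds to nearest is no harder to write than a truncating one. This is a hedged, integer-only sketch (the function names are mine; the sign-bit handling mirrors Bitcoin's compact-format convention, while the round-to-nearest step is the suggested improvement over the truncating arith_uint256::GetCompact):

```python
def decode_bits(bits):
    # Compact 'DiffBits' -> integer target (negative targets ignored here).
    size = bits >> 24
    mant = bits & 0x007fffff
    return mant << (8 * (size - 3)) if size > 3 else mant >> (8 * (3 - size))

def encode_bits(target):
    # Integer target -> compact bits, rounding to nearest instead of truncating.
    size = (target.bit_length() + 7) // 8
    if size <= 3:
        mant = target << (8 * (3 - size))
    else:
        shift = 8 * (size - 3)
        mant = (target + (1 << (shift - 1))) >> shift  # round half up
        if mant >> 24:           # rounding carried into a fourth byte
            mant >>= 8
            size += 1
    if mant & 0x00800000:        # keep the sign bit clear, as Bitcoin does
        mant >>= 8
        size += 1
    return (size << 24) | mant
```

With the next-DiffBits computation done directly on (mantissa, exponent), this encode step is the only place precision is lost, and rounding makes that loss as small as the format allows.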

ComputeNextWork includes clipping of the difficulty change at extreme values (a factor of 2^32 increase or decrease in difficulty). Note this issue would be irrelevant if doing DiffBits arithmetic directly (see above), but even if 256-bit arithmetic is insisted upon, these limits seem to be due to a misunderstanding. For right-shifting, there is no overflow. There is a concern that a divide by zero in ComputeTargetFromWork might occur, but this is directly prevented by replacing a 0 difficulty with 1 (it gets converted to powLimit in the end, anyway). As for left-shifting, I see the concern but it is artificial. For xi >= 0, it is perfectly acceptable to compute (lastBlockWork + (offsetWork32 >> 32)) << xi; this is still highly accurate since lastBlockWork >= 2^32 (as already assumed by this function), and it's easy to examine .bits() before left-shifting to see if it is going to overflow. By the way, if overflow is really a concern, then there technically ought to also be checks on whether the offsetWork32 computation or the (lastBlockWork + (offsetWork32 >> 32)) computation could overflow, which could happen regardless of the value of xi. If these overflows are believed to be impossible, then comments to that effect should be included.

    (Note that if xi > 32 and the difficulty increases by 4 billion times, or whatever, then something seriously wrong has happened because this means a very backwards timestamp appeared, and now the blockchain is probably stuck in some way. It's worth reminding that this kind of situation would not happen in the first place with timestamp controls, as described at end of post.)

The testnet difficulty adjustment seems weird to me (if I understand it right) because when min-difficulty blocks are being mined, they do not cause the 'real difficulty' to decrease naturally over time like with the previous DAAs. Instead the real difficulty stays high and constant until a real-difficulty block is actually mined; then the difficulty will suddenly drop by an unnecessarily large factor if the chain was "stuck" at high difficulty, disregarding the fact that there may have been many min-difficulty blocks in between. A natural dropping could easily be incorporated by using the recursive property of ASERT, i.e., using the last real-difficulty block as the 'reference block'. This doesn't have to add any complexity to the code besides computing a block height difference. Alternatively, it could be a fun experiment (and appropriate, IMO) to use a real-time ASERT on testnet, instead of having the min-difficulty reset. This would of course be more inconvenient after someone pumps up the difficulty, unless the relaxation time is very short. The main downside is that it would be more complex.

Finally, a minor point: the algorithm involves both the concept of relaxation time (e^x) and half-life (2^x). It would be simpler to just use half-lives everywhere; they are the natural variable since the math is binary. I.e., just define the half-life as 120000 seconds. (A second, very minor nit on how to think about this: in exponential algos the crucial point is that the half-life really is a time, not a number of blocks, so it's best to define it and think of it explicitly as X time rather than as N block intervals.)
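The equivalence is simple arithmetic: a half-life H corresponds to an e-folding relaxation time of H/ln 2, so defining H = 120000 s directly pins down the same exponential (constants here just work that conversion through):

```python
import math

HALF_LIFE = 120_000              # seconds, defined directly as suggested
TAU = HALF_LIFE / math.log(2)    # equivalent e-folding relaxation time, ~173123 s

# Same curve, two parameterizations: 2**(t/HALF_LIFE) == e**(t/TAU).
t = 86_400                       # one day of schedule offset, as an example
assert abs(2 ** (t / HALF_LIFE) - math.exp(t / TAU)) < 1e-9
```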

    Importance of improved timestamp controls

    (This is not part of difficulty algorithm, but it is related.)

    Currently, BCH only applies the median-time-past rule for block timestamps. This allows non-monotonic block time sequences. As documented by Zawy, this is something which in general has been a subtle cause of very serious grief for difficulty algorithms. In the worst cases, a bad DAA allows miners to manipulate timestamps to allow rapidly mining blocks without raising the difficulty. This means an economic disaster for the blockchain since it causes rapid inflation by block subsidies. Various difficulty algorithms employed in the cryptocurrency space are vulnerable to this, even right now.

    The exponential algorithms are fundamentally immune to this, though small modifications (like Grasberg's) can actually compromise this immunity. In the case of Grasberg's gradually adapting target block time, this will of course not matter for the next 6 years since it is pinned. After that, however, I would not be surprised if theoretically, miners could use weird nonmonotonic timestamp sequences to have blocks coming out 13% faster than ought to happen. Even if true, though, I expect this attack to be very tricky, probably also risky, and given that I expect a bounded finite benefit I guess it would be impractical.

    There is however a totally separate and more immediate issue relating to non-monotonic timestamps. Exponential algorithms allow large upward jumps in difficulty if the timestamp change is very negative. This means that if there is a large span of time over which few blocks are mined (as happens if the hashrate falls, because difficulty is too high), and finally some blocks are mined that lower difficulty, it means that disruptive miners can mine blocks with highly backwards timestamps, which immediately shoots the difficulty back up again to its previous level; the drought continues. Of course, the difficulty algorithm must respond in this exact way (or else it can be exploited for inflation, which would be far far worse), however we should not be allowing such a situation in the first place. I want to emphasize that this kind of disruptive back-mining has worse consequences for the exponential DAA (compared to the averaging DAAs). That said, it only becomes serious in extreme situations.

I can't see any reason to allow non-monotonic timestamps. Speaking for myself, every counterargument to this idea that I've seen has failed to hold up on further thought. (There is far too much on this topic to go into detail here.)

    It would be sufficient to have even just an added restriction that timestamps can't go backwards by more than 1 hour, if people are scared of monotonicity. (but, they have no good reason to be)
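For illustration, the relaxed rule suggested here is a one-line check on top of the existing median-time-past validation (function and parameter names are hypothetical):

```python
def timestamp_acceptable(new_time, prev_time, median_time_past,
                         max_backstep=3600):
    # Sketch of the suggested rule: keep the existing median-time-past check,
    # and additionally forbid going more than max_backstep seconds behind the
    # previous block's timestamp. Setting max_backstep=0 gives strict
    # monotonicity.
    return new_time > median_time_past and new_time >= prev_time - max_backstep
```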

    submitted by /u/markblundeberg
    [link] [comments]

    Is there a "Receive Payments" API for BCH?

    Posted: 24 Jul 2020 09:32 PM PDT

    I'm looking for something just like this except for Bitcoin Cash instead of Bitcoin:

    https://www.blockchain.com/api/api_receive

    To be specific, I'd like to be able to programmatically create a unique address for each user (so I know whose subscription to add time onto), and provide a callback url to be hit when a transaction occurs (so I know I need to add time to a subscription).

    Do you guys know of an API for BCH that can do those two things?

    submitted by /u/Thanah85
    [link] [comments]

    First time using bitcoin

    Posted: 24 Jul 2020 08:38 PM PDT

Hi there ✌️ My name is Victor and I'm a graphic designer. I recently accepted some freelance work and was paid in Bitcoin. I've never used it, or even thought about using it; I don't know anything about it haha. Now I have $50 in BTC and was wondering what I can do with it. Tips?

    submitted by /u/hugostttoso
    [link] [comments]

    Mike Dudas: “After 7 years of 100% BTC... I sold 15% of my BTC for ETH today.”

    Posted: 24 Jul 2020 06:50 PM PDT

I'm really sad to discover that Sideshift.ai banned the Venezuelans. This is something I already criticized CryptoPhyl for, and I didn't think it would be adopted by this platform as well. I hope that CoinEx.com will not be the next to adopt this policy.

    Posted: 24 Jul 2020 04:01 PM PDT

    WhaleAlertBCH: “326,413 #BitcoinCash #BCH (77,033,468 USD) transferred”

    Posted: 24 Jul 2020 05:30 PM PDT

    It's Fusion Friday!

    Posted: 24 Jul 2020 03:14 PM PDT

    CashFusion brings exceptional levels of privacy to Bitcoin Cash, comparable to (or maybe even superior to) other privacy coins.

    CashFusion is currently in "Alpha", but it's already been used to fuse 92,307 BCH, and has reduced the size of the UTXO set by 9,481 outputs.

    Early adopters, make sure to spin up Electron Cash 4.1.0 today and get that liquidity flowing! (The more people using CashFusion at once, the more Fusions happen, and the larger your "anonymity set".)

    New to CashFusion? No problem. Help us reach critical mass by becoming a CashFusion tester:

    How to become a CashFusion tester

    And here's the CashFusion Telegram group (where you can find the links to download the CashFusion alpha build of Electron Cash)

    submitted by /u/AD1AD
    [link] [comments]

    Serious issues with Bitcoin ABC dev team

    Posted: 24 Jul 2020 09:49 PM PDT

Guys, this latest drama with Amaury bulldozing through his Trashberg DAA is too much. IFP was bad enough; now Amaury is steamrolling over all logical debate AND f*cking around with the BCH block interval. Changing the block interval is a SERIOUS change with broad repercussions for smart contracts, and more importantly, it changes the rate of coin issuance and hence the BCH inflation rate!

    We simply CANNOT have this insane tyrant running the reference implementation for BCH. The BCH Node team seems to be solid but it's extra work for them to maintain that codebase and it can't succeed unless more miners switch to it.

    IMHO the pressure of the frequent hard fork schedule is causing this drama to happen every few months. Perhaps the BCHN team should just maintain a stable older version of ABC and miners can band together and resist any further hard forks at all until ABC is either rational or Amaury is gone? Thinking out loud here.

    submitted by /u/wtfCraigwtf
    [link] [comments]

    Get your nails done for Bitcoin Cash - Bitcoin Cash City

    Posted: 24 Jul 2020 10:34 AM PDT

    Spotted in Mexico City! I'll check it out tomorrow

    Posted: 24 Jul 2020 08:58 PM PDT

If it weren't for the uncertainty created by ABC each and every time BCH upgrades, it would have regained strength on the market, where it is judged by its merit, not only by its die-hard supporters.

    Posted: 24 Jul 2020 07:17 AM PDT

    BCH is pro free market and pro competition. But the lack of market understanding worries me.

    submitted by /u/gr8ful4
    [link] [comments]

    Coinbase Processed $500,000 in Bitcoin Payments That Helped Ex-Nissan Chairman Flee Japan

    Posted: 25 Jul 2020 02:34 AM PDT

    ESPN Launching Bitcoin Payment Enabled Online Gaming Platform

    Posted: 25 Jul 2020 02:33 AM PDT

    Crypto Scammers Steal $381 Million in 2020, While Twitter Hackers Direct Funds to Mixers

    Posted: 25 Jul 2020 02:33 AM PDT

    pNetwork Launches Yield Farming Programme, Steroids

    Posted: 25 Jul 2020 02:32 AM PDT

    NBA Point Guard Spencer Dinwiddie's Tokenized Contract Raises $1.3 Million

    Posted: 25 Jul 2020 02:31 AM PDT

Is there a way to see the precise time of when a transaction was sent out?

    Posted: 24 Jul 2020 10:14 PM PDT

    I'm a developer. I'm not really knowledgeable in bitcoin development yet.

Say I'm using the Electrum wallet. I share an address with someone. The thing is that I need that someone to pay a certain amount of BTC within a certain time window, say 15, 30, or 60 minutes, because I have products that are priced in fiat money.

That person will send me BTC. I may open my wallet not right away but in 12-30 hours to check incoming payments. Or there may be delays in the BTC network, right?

Is the precise date and time of *when* a payment was *sent* attached to a transaction, and can it be retrieved? How, and where?

    Update1:

(1) Goal: to prevent a customer from taking advantage of BTC/fiat price fluctuations.

If there's no way to find out precisely when (date and time) a person sent money, how would I still achieve my goal? I don't run a full node. I could, but as of now I don't want to complicate things.

(2) Why Bitcoin and not another coin? Because most of my customers, I guess, would rather have Bitcoin than any other coin. Another coin might put them off proceeding to buy.

    P.S.

    I don't consider handling my payments via 3rd-party crypto-coin payment providers or their APIs.

    submitted by /u/dariji_alhored
    [link] [comments]

    Final Trades: XHB, DHI, TSE & BTC

    Posted: 25 Jul 2020 12:44 AM PDT

    ABC ferociously pushing Bitcoin ABC into the future! One upgrade at a time! So much leadership, so much collaboration. YES!

    Posted: 24 Jul 2020 02:36 PM PDT

ABC does not have enough funds and/or time/resources, yet they have the ability to code up a DRIFT solution. DRIFT was something NOT on the community roadmap.

    Again, ABC does not have enough funds and/or time/resources yet they have the ability to code up 2 different IFP proposals, both of which were dramatically rejected by the community and hash power, yet one remains in the code. The IFP is another thing that is NOT part of the community roadmap.

    Does anyone else see something wrong here?!?!

If they want more funds, can they AT LEAST stick to the community roadmap? Can they AT LEAST in good faith review some community proposals and efficiently incorporate them when appropriate (DAA)?

Finally, could ABC AT LEAST submit their own proposals in good faith EARLY, so that the community has some type of opportunity to review and understand them?

    Am I the only one that feels this way?!

    submitted by /u/Terrible-Chipmunk
    [link] [comments]

    DAA adventure

    Posted: 24 Jul 2020 05:46 AM PDT

Let me get this straight. The original DAA was thrown together haphazardly due to time constraints. Now we have a proposed new DAA, also thrown together due to time constraints... yet somehow this one also solves DRIFT, something that was NOT on the roadmap???

I don't have the technical skills to assess a DAA in this timeframe, however SOMETHING seems WEIRD.........

If the first DAA was rushed, why are we making the SAME mistake here again? Can't we take some extra time to get this one right? At least consider the OTHER proposals that CLEARLY exist, despite ABC's best efforts to pretend that other proposals DO NOT EXIST ON EARTH. So much fun!!!

    submitted by /u/Terrible-Chipmunk
    [link] [comments]

    Did you know Bitcoin.com has a Spotify podcast? Listen to the latest BCH developments during your commute, work-out, or family visits.

    Posted: 24 Jul 2020 06:53 AM PDT

    P2P Bitcoin volume hitting all-time highs

    Posted: 24 Jul 2020 02:29 PM PDT
