Conversation
(force-pushed d7fb314 to 53c3229)
I think we should not add the complexity of incremental reserve increase, rather just increase the allowable size for the same existing reserve cost. I think 1 reserve for 2048 bytes is probably ok. An alternative simpler fee model would be to nominate your account for "large hook data", whereupon all your hook states cost more to store but you have access to the larger size. I really dislike the idea of a variable length field claiming different reserves depending on its current size.
Can you explain "large hook data" in more detail?
Sure, so you set an account flag which is the equivalent of specifying large pages in your OS. All your hook states cost more, but they're allowed to be bigger. It makes the computation easier. So let's say in large mode you get 2048 bytes per hook state, but in computing reserve requirements for your hook state we take your HookStateCount and multiply it by 8. Once the flag is enabled, it can only be disabled if HookStateCount is 0.
That's interesting. If we use the flag, whenever we need to increase the protocol's maximum allowed size, we end up having to add a new flag. |
I think that's reasonable. We could say HookStateScale or something 0,1 ... |
```cpp
uint32_t const oldOwnerCount = sle->getFieldU32(sfOwnerCount);

uint32_t const newOwnerCount =
    oldOwnerCount - (oldScale * stateCount) + (scale * stateCount);
```
sanity check newOwnerCount > oldOwnerCount
src/ripple/protocol/impl/TER.cpp
```cpp
MAKE_ERROR(tecINSUF_RESERVE_SELLER, "The seller of an object has insufficient reserves, and thus cannot complete the sale."),
MAKE_ERROR(tecIMMUTABLE, "The remark is marked immutable on the object, and therefore cannot be updated."),
MAKE_ERROR(tecTOO_MANY_REMARKS, "The number of remarks on the object would exceed the limit of 32."),
MAKE_ERROR(tecHAS_HOOK_STATE, "The account has hook state. Delete all existing state first."),
```
Change the comment to something like: "Delete all hook state before reducing scale."
Since the number of TER codes is limited, we used a generic error message. That said, we will change it as you pointed out; it can be revised again later if this code is reused for other purposes.
```cpp
uint16_t scale = tx.getFieldU16(sfHookStateScale);
if (scale == 0 ||
    scale > 16) // Min: 1, Max: 16 (256 * 16 = 4096 bytes)
```
Extract a constant for this?
```cpp
    // should not happen, but just in case
    return 256U;
}
return 256U * hookStateScale;
```
we could add an extra defensive check here with a constant, I guess
like this?

```cpp
inline uint32_t
maxHookStateDataSize(uint16_t hookStateScale)
{
    if (hookStateScale == 0)
    {
        // should not happen, but just in case
        return 256U;
    }
    if (hookStateScale > maxHookStateScale())
    {
        // should not happen, but just in case
        return 256 * maxHookStateScale();
    }
    return 256U * hookStateScale;
}
```
I think assert/throw to "fail fast", because a violated invariant indicates something went awry?
I think it's too much to add that kind of processing here.
Well, we can just do `return 256U * hookStateScale;` if it's a simple calculation, but if you handle the strange cases (0 or > max scale) this way, you're explicitly papering over bad state. If it should not happen, assert that it doesn't happen. This is a weird middle ground, no?
```cpp
BEAST_EXPECT(env.le(alice)->getFieldU32(sfHookStateCount) == 10);
BEAST_EXPECT(env.le(alice)->getFieldU32(sfOwnerCount) == 110);
}
```
- Decrease WITHOUT state (the escape hatch):

```cpp
applyCount(5, 0, 50);  // stateCount = 0 (all state deleted)
jt[sfHookStateScale.fieldName] = 2;
env(jt);  // should succeed: this is how you escape a high scale
BEAST_EXPECT(env.le(alice)->getFieldU16(sfHookStateScale) == 2);
```

This is important because it tests the only way to reduce scale: by deleting all state first. The spec calls this the "Scale Commitment Trap" escape route.
- Insufficient reserves on increase:

```cpp
// Alice has exactly enough for current reserves, but not for a 16x increase
applyCount(1, 100, 200);  // 100 entries at scale = 1
// Manually set balance to just cover current reserves
jt[sfHookStateScale.fieldName] = 16;  // needs 1600 reserve units (100 * 16)
env(jt, ter(tecINSUFFICIENT_RESERVE));  // should fail: can't afford it
```