
Stripe - Data too long exception #3350

Open
@derXear

Description

Describe the bug
Using the Stripe plugin with the remix storefront leads to the following error message:

[server] warn 2/6/25, 11:26 AM - [Vendure Server] Job data for "update-search-index" is too long to store with the mysql driver (145kb).
[server] The following keys were truncated:  
[server] /home/georg/Develop/bigrep/repo/shop/vendure-shop/node_modules/typeorm/src/driver/mysql/MysqlQueryRunner.ts:246
[server]                                 new QueryFailedError(query, parameters, err),
[server]                                 ^
...
[server]   sqlMessage: "Data too long for column 'data' at row 1",
[server]   sql: 'INSERT INTO `job_record`(`createdAt`, `updatedAt`, `queueName`, `data`, `state`, `progress`, `result`, `error`, `startedAt`, `settledAt`, `isSettled`, `retries`, `attempts`, `id`) VALUES (DEFAULT, DEFAULT, \'update-search-index\', \'{\\"type\\":\\"update-variants\\" ...
,\\"body\\":{\\"0\\":123,\\"1\\":10,\\"2\\":32,\\"3\\":32,\\"4\\":34,\\"5\\":105,\\"6\\":100,
...

The payment itself is processed successfully in Stripe and Vendure, but there appears to be a side-effect in the SearchIndexService.updateVariants function, probably triggered by the stock level change.
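The numeric-keyed body in the truncated SQL above (`"0":123,"1":10,"2":32,…` — the byte values of `{`, `\n`, two spaces, …) looks like the raw request Buffer being serialized byte-by-byte into the job data. A minimal sketch reproducing that pattern (the payload string here is hypothetical, standing in for the actual Stripe webhook body):

```typescript
// A Node.js Buffer is a Uint8Array, so object-spreading it (or embedding it
// in an object that later gets JSON.stringify'd as job data) exposes one
// numeric key per byte - the same pattern as in the job_record INSERT above.
const rawBody = Buffer.from('{\n  "id"'); // hypothetical start of a webhook payload

const serialized = JSON.stringify({ ...rawBody });
console.log(serialized);
// → {"0":123,"1":10,"2":32,"3":32,"4":34,"5":105,"6":100,"7":34}
```

A 145kb rawBody therefore balloons further once every byte becomes a `"n":value` pair, which would explain overflowing the 64kb limit of the MySQL `data` column.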

To Reproduce
Steps to reproduce the behavior:

  1. Start Vendure v3.1.2 with the payments-plugin & DefaultJobQueuePlugin, with MySQL and Stripe configured
  2. Start the remix-storefront
  3. Perform a Stripe payment
  4. See error

In the StripeController class, with a breakpoint at the constructEventFromPayload function call:

    @Post('stripe')
    async webhook(
        @Headers('stripe-signature') signature: string | undefined,
        @Req() request: RequestWithRawBody,
        @Res() response: Response,
    ): Promise<void> {
        ...

        const outerCtx = await this.createContext(channelToken, languageCode, request);

        await this.connection.withTransaction(outerCtx, async (ctx: RequestContext) => {
            const order = await this.orderService.findOneByCode(ctx, orderCode);

            if (!order) {
                throw new Error(
                    `Unable to find order ${orderCode}, unable to settle payment ${paymentIntent.id}!`,
                );
            }

            try {
                // Throws an error if the signature is invalid
                const event = await this.stripeService.constructEventFromPayload(ctx, order, request.rawBody, signature);

                // TODO Clearing the buffer here fixes the issue
                request.rawBody = Buffer.alloc(0);
                request.body = undefined;

In my case the request.rawBody is very large, and it is part of the context used by the SearchIndexService to create a job to reindex the changed variants. This stores the rawBody as part of the job.data in the database (when using the DefaultJobQueuePlugin).

Expected behavior
In my opinion the rawBody / body from the request should not be stored in the job_record table. I see two possible options to achieve this:

Option A:

Immediately clear the rawBody in the Stripe plugin after it has been validated, as shown above:

// Throws an error if the signature is invalid
const event = await this.stripeService.constructEventFromPayload(ctx, order, request.rawBody, signature);

// TODO Clearing the buffer here fixes the issue
request.rawBody = Buffer.alloc(0);
request.body = undefined;

Option B:

Improve the SqlJobQueueStrategy.constrainDataSize function by clearing any unnecessary data, not only checking string lengths:

    private constrainDataSize<Data extends JobData<Data> = object>(job: Job<Data>): Data | undefined {
        const type = this.rawConnection?.options.type;
        if (type === 'mysql' || type === 'mariadb') {
            const stringified = JSON.stringify(job.data);
            if (64 * 1024 <= stringified.length) {
                const truncatedKeys: Array<{ key: string; size: number }> = [];
                const reduced = JSON.parse(stringified, (key, value) => {
                    if (typeof value === 'string' && 2048 < value.length) {
                        truncatedKeys.push({ key, size: value.length });
                        return `[truncated - originally ${value.length} bytes]`;
                    // ADDED ->
                    } else if (typeof value === 'object' && value !== null) {
                        // TODO to be implemented - check serialized object size
                        const objectSize = getObjectSize(value);
                        if (objectSize > 2048) {
                            truncatedKeys.push({ key, size: objectSize });
                            return `[truncated - originally ${objectSize} bytes]`;
                        }
                    }
                    // ADDED <-
                    return value;
                });
                ...
                return reduced;
            }
        }
    }

This might be the better approach in general, so that the database is not polluted with unnecessary data.

Environment (please complete the following information):

  • @vendure/core: v3.1.2 (same for the payments-plugin)
  • Node.js version: v20
  • Database: MySQL (mysql2 client v3.12.0)
