
Add ClusterFuzzLite for continuous fuzz testing#1282

Merged
bact merged 17 commits into dev from copilot/setup-clusterfuzzlite-testing
Mar 6, 2026

Conversation

Contributor

Copilot AI commented Feb 5, 2026

What do these changes do?

Adds ClusterFuzzLite infrastructure to continuously fuzz test PyThaiNLP functions for crashes, edge cases, and security vulnerabilities, along with safe corpus management practices.

What was wrong?

PyThaiNLP lacked automated fuzz testing to discover memory safety issues, Unicode handling bugs, and unexpected crashes in text processing functions. The fuzzing setup also lacked guidance on corpus handling and had no protection against committing large/duplicate corpus files, sensitive data, or crash-triggering inputs to version control.

Several CI build failures and warnings were encountered and resolved during setup:

  1. The initial configuration placed Dockerfile and build.sh in fuzz/ with a custom dockerfile-path parameter, but ClusterFuzzLite hardcodes the build context to .clusterfuzzlite/Dockerfile regardless of that parameter, causing lstat .clusterfuzzlite: no such file or directory.

  2. After fixing the directory structure, the build failed with No fuzz targets found in out dir because the build script used cp + chmod to place Python files in $OUT. ClusterFuzzLite's bad-build-check requires proper libFuzzer executables; a plain Python script without an executable wrapper fails this check.

  3. After switching to compile_python_fuzzer, the executables crashed at startup with ModuleNotFoundError: No module named 'pythainlp'. PyInstaller (used internally by compile_python_fuzzer) cannot follow the path hooks created by an editable install (pip install -e .), so pythainlp was missing from the bundled executables.

  4. Without wrapping the target module imports in atheris.instrument_imports(), the Python bytecode was not instrumented for coverage, causing libFuzzer to print WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? and preventing coverage-guided mutation.

Additionally, TestOneInput did not follow Python naming conventions, and TypeError was being caught in the except clause after being explicitly raised for output validation, silencing those checks. The except clauses also redundantly listed UnicodeDecodeError alongside ValueError, since UnicodeDecodeError is a subclass of UnicodeError which is a subclass of ValueError.
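The redundancy follows directly from Python's built-in exception hierarchy, which can be confirmed in plain Python:

```python
# UnicodeDecodeError -> UnicodeError -> ValueError, so an
# `except ValueError:` clause already catches UnicodeDecodeError.
assert issubclass(UnicodeDecodeError, UnicodeError)
assert issubclass(UnicodeError, ValueError)

try:
    b"\xff".decode("utf-8")  # 0xFF is never valid in UTF-8
except ValueError as exc:
    print(type(exc).__name__)  # prints: UnicodeDecodeError
```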

How this fixes it

GitHub Actions Workflow (.github/workflows/clusterfuzzlite.yml)

  • Runs on PRs to dev (code-change mode), push to dev (batch mode), and daily at 06:00 UTC (batch mode)
  • Uses AddressSanitizer for memory safety checks
  • Fuzzes for 300 seconds (5 minutes) per harness
  • Uploads crash artifacts on failure
  • Permissions include contents: write for storing corpus and coverage data to gh-pages
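The bullets above can be sketched as a workflow file. This is a hedged reconstruction, not the repository's actual file: the action input names (`language`, `sanitizer`, `fuzz-seconds`, `mode`) follow the google/clusterfuzzlite actions, but exact values and steps in `.github/workflows/clusterfuzzlite.yml` may differ.

```yaml
name: ClusterFuzzLite
on:
  pull_request:
    branches: [dev]
  push:
    branches: [dev]
  schedule:
    - cron: "0 6 * * *"  # daily at 06:00 UTC
permissions:
  contents: write  # lets the action store corpus/coverage on gh-pages
jobs:
  fuzzing:
    runs-on: ubuntu-latest
    steps:
      - uses: google/clusterfuzzlite/actions/build_fuzzers@v1
        with:
          language: python
          sanitizer: address
      - uses: google/clusterfuzzlite/actions/run_fuzzers@v1
        with:
          language: python
          sanitizer: address
          fuzz-seconds: 300
          # code-change mode for PRs, batch mode otherwise
          mode: ${{ github.event_name == 'pull_request' && 'code-change' || 'batch' }}
```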

ClusterFuzzLite Configuration (.clusterfuzzlite/)

  • Dockerfile — required location for ClusterFuzzLite's fuzzing Docker image (extends gcr.io/oss-fuzz-base/base-builder-python); installs pythainlp with pip install . (non-editable) so PyInstaller can discover and bundle it
  • build.sh — uses compile_python_fuzzer to create proper libFuzzer executable wrappers for each fuzz/fuzz_*.py harness; uses pinned atheris==2.3.0 for supply-chain security
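A minimal sketch of what such a `build.sh` might look like, assuming the layout described above; `compile_python_fuzzer` exists only inside the `gcr.io/oss-fuzz-base/base-builder-python` image, so this fragment is not runnable outside it:

```sh
#!/bin/bash -eu
# Hedged sketch of .clusterfuzzlite/build.sh, not the repo's actual file.
pip3 install atheris==2.3.0    # pinned for supply-chain security
pip3 install "$SRC/pythainlp"  # non-editable, so PyInstaller can bundle it

for fuzzer in "$SRC"/pythainlp/fuzz/fuzz_*.py; do
  # Wraps each harness in a PyInstaller-built libFuzzer executable,
  # which is what ClusterFuzzLite's bad-build-check expects in $OUT.
  compile_python_fuzzer "$fuzzer"
done
```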

Fuzzing Harnesses (fuzz/)

  • fuzz_tokenize.py — tests word_tokenize() with random Unicode input
  • fuzz_util_normalize.py — tests normalize() with random Unicode input
  • Both harnesses wrap target module imports in with atheris.instrument_imports(): to enable coverage-guided fuzzing; use test_one_input (Python snake_case convention); only catch ValueError (which already covers UnicodeDecodeError via the exception hierarchy); TypeError raised by output validation propagates as a fuzzer finding
  • README.md — local testing guide, directory structure documentation, corpus best practices, expansion roadmap, and known first-run warnings (one-time gh-pages branch creation message, coverage instrumentation requirement)

Corpus Storage Best Practices

  • .gitignore updated to exclude local fuzzing artifacts (fuzz/corpus/, fuzz/crashes/, crash-*, leak-*, timeout-*, oom-*, etc.)
  • fuzz/README.md documents minimize/de-duplicate procedures, data sanitization requirements, dedicated storage approach (gh-pages branch), crash handling workflow, and security considerations
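As one illustration of the minimize/de-duplicate step, libFuzzer-style binaries support corpus merging with `-merge=1`, which keeps only inputs that add coverage. The paths below are hypothetical and this requires a built fuzzer executable:

```sh
# Keep only coverage-adding inputs before storing the corpus anywhere.
mkdir -p corpus_minimized
./fuzz_tokenize -merge=1 corpus_minimized fuzz/corpus/fuzz_tokenize
```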

Your checklist for this pull request

  • Passed code styles and structures
  • Passed code linting checks and unit tests
Original prompt

Setup ClusterFuzzLite for Fuzz Testing

Context

Issue #1070 requests setting up ClusterFuzzLite for continuous fuzz testing of PyThaiNLP. This will help discover edge cases, crashes, and potential security issues in text processing functions.

Objectives

  1. Add ClusterFuzzLite GitHub Actions workflow that integrates with existing CI/CD
  2. Create fuzzing harnesses for high-priority modules (tokenization and utilities)
  3. Add Docker configuration for ClusterFuzzLite build environment
  4. Provide documentation for maintaining and expanding fuzz tests

Requirements

1. GitHub Actions Workflow (.github/workflows/clusterfuzzlite.yml)

Create a new workflow that:

  • Runs on pull requests to dev branch (matching existing workflow patterns)
  • Runs on push to dev branch
  • Runs on schedule (daily at 06:00 UTC)
  • Uses AddressSanitizer for memory safety checks
  • Fuzzes for 5 minutes per harness (300 seconds)
  • Uses mode: 'code-change' to focus on PR changes
  • Follows the concurrency pattern used in other workflows (cancel-in-progress for same branch)
  • Uploads crash artifacts if failures occur
  • Has proper permissions (contents: read, issues: write)

2. Docker Configuration (Dockerfile)

Create a Dockerfile at the repository root that:

  • Extends gcr.io/oss-fuzz-base/base-builder-python
  • Installs Python development dependencies
  • Copies the repository to $SRC/pythainlp
  • Installs pythainlp in development mode with minimal dependencies

3. Fuzzing Harnesses (fuzz/ directory)

Create the following fuzz test files:

fuzz/fuzz_tokenize.py

  • Test pythainlp.tokenize.word_tokenize() with random Unicode input
  • Use atheris.FuzzedDataProvider to generate test data
  • Validate output is a list of strings
  • Catch expected exceptions (ValueError, TypeError, UnicodeDecodeError)
  • Re-raise unexpected exceptions for investigation

fuzz/fuzz_util_normalize.py

  • Test pythainlp.util.normalize() with random Unicode input
  • Use atheris.FuzzedDataProvider to generate test data
  • Validate output is a string
  • Catch expected exceptions
  • Re-raise unexpected exceptions

fuzz/build.sh

  • Install atheris for Python fuzzing
  • Compile each fuzzer (all fuzz/fuzz_*.py files)
  • Make fuzzers executable
  • Copy to $OUT/ directory
  • Print build success message

fuzz/README.md

  • Explain the fuzzing setup and directory structure
  • Provide instructions for local testing
  • Document how to add new fuzzers
  • List expansion plan for future fuzzing targets:
    • spell/ (spelling correction)
    • corpus/ (data loading)
    • soundex/ (phonetic encoding)
    • transliterate/ (romanization)
    • tag/ (part-of-speech tagging)

4. Code Style and Standards

  • Follow existing PyThaiNLP code style (check .github/workflows/lint.yml)
  • Add SPDX headers to new files where appropriate:
    # SPDX-FileCopyrightText: 2026 PyThaiNLP Project
    # SPDX-License-Identifier: Apache-2.0
    # SPDX-FileType: SOURCE
  • Use proper error handling and logging
  • Include descriptive docstrings

5. Integration Testing

The solution should:

  • Not break existing CI/CD workflows
  • Use minimal dependencies (only atheris for fuzzing)
  • Be compatible with Python 3.9+ (matching requires-python in pyproject.toml)
  • Follow the path-ignore patterns used in other workflows

Success Criteria

  • GitHub Actions workflow runs successfully on PRs
  • Fuzzing harnesses execute without immediate crashes
  • Docker container builds successfully
  • Documentation is clear and complete
  • No conflicts with existing workflows
  • Follows PyThaiNLP's existing CI/CD patterns

References

Notes

  • Start with minimal fuzzing (2 harnesses) to validate the setup
  • Can expand to more modules after initial validation
  • Fuzzing time (300s) can be adjusted based on CI/CD resource constraints
  • The workflow uses path-ignore patterns similar to existing workflows to avoid unnecessary runs

This pull request was created from Copilot chat.



Copilot AI assigned Copilot and bact Feb 5, 2026
@bact bact linked an issue Feb 5, 2026 that may be closed by this pull request
@bact bact added infrastructure build process, configuration, GitHub Actions tests Unit test, code coverage, test case labels Feb 5, 2026
@bact bact added this to PyThaiNLP Feb 5, 2026
@bact bact moved this to In progress in PyThaiNLP Feb 5, 2026
@bact bact added this to the Future milestone Feb 5, 2026
@github-actions

github-actions Bot commented Feb 5, 2026

Hello @Copilot, thank you for submitting a PR! We will respond as soon as possible.


Copilot AI and others added 2 commits February 5, 2026 00:48
Co-authored-by: bact <128572+bact@users.noreply.github.com>
Co-authored-by: bact <128572+bact@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Add ClusterFuzzLite setup for continuous fuzz testing" to "Add ClusterFuzzLite for continuous fuzz testing" Feb 5, 2026
Copilot AI requested a review from bact February 5, 2026 00:51
@bact
Member

bact commented Feb 5, 2026

@copilot People will be confused by the Dockerfile* files. We should name them properly, indicating which one is for normal use and which one is for fuzzing. Alternatively, put the Dockerfile for fuzzing in the fuzz/ directory.

What is the best pattern that is commonly used?

Co-authored-by: bact <128572+bact@users.noreply.github.com>
Contributor Author

Copilot AI commented Feb 5, 2026

@copilot people will confuse about Dockerfile* files. we should name it properly, telling which one is for normal use, which one is for fuzzing. Alternatively, put the Dockerfile for fuzzing in the fu...

Moved the fuzzing Dockerfile to fuzz/Dockerfile and restored the original Dockerfile at root for the main application. This follows the common pattern of keeping fuzzing infrastructure in the fuzz/ directory. The workflow now uses dockerfile-path: fuzz/Dockerfile to point to the fuzzing-specific Docker image. (604792e)

Contributor

Copilot AI left a comment


Pull request overview

This pull request adds ClusterFuzzLite infrastructure to enable continuous fuzz testing of PyThaiNLP functions, helping discover edge cases, crashes, and security vulnerabilities in text processing operations.

Changes:

  • Added GitHub Actions workflow for automated fuzzing on PRs, pushes, and daily schedule
  • Created fuzzing Docker environment with OSS-Fuzz base builder
  • Implemented two initial fuzzing harnesses for word_tokenize() and normalize() functions
  • Added build script and comprehensive documentation for fuzzing infrastructure

Reviewed changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated 18 comments.

Show a summary per file
File Description
.github/workflows/clusterfuzzlite.yml GitHub Actions workflow orchestrating ClusterFuzzLite fuzzing with AddressSanitizer
fuzz/Dockerfile Docker configuration extending OSS-Fuzz base builder for Python projects
fuzz/build.sh Build script for compiling fuzzing harnesses with atheris
fuzz/fuzz_tokenize.py Fuzzing harness testing word_tokenize() with random Unicode input
fuzz/fuzz_util_normalize.py Fuzzing harness testing normalize() with random Unicode input
fuzz/README.md Documentation covering setup, local testing, CI/CD integration, and expansion roadmap

Comment thread .github/workflows/clusterfuzzlite.yml Outdated
Comment thread fuzz/fuzz_util_normalize.py Outdated
Comment thread fuzz/build.sh Outdated
Comment thread fuzz/fuzz_util_normalize.py
Comment thread fuzz/build.sh Outdated
Comment thread fuzz/fuzz_tokenize.py Outdated
Comment thread fuzz/fuzz_tokenize.py
Comment thread fuzz/fuzz_tokenize.py Outdated
Comment thread fuzz/README.md Outdated
Comment thread .github/workflows/clusterfuzzlite.yml Outdated
@bact
Member

bact commented Feb 5, 2026

@copilot apply suggestions in this thread #1282 (review)

Co-authored-by: bact <128572+bact@users.noreply.github.com>
Contributor Author

Copilot AI commented Feb 5, 2026

@copilot apply suggestions in this thread #1282 (review)

Applied all suggestions from the code review (499966b):

Workflow changes:

  • Removed **.yml from paths-ignore to allow workflow updates to trigger
  • Changed permissions to contents: write for storing corpus/coverage data
  • Made fuzzing mode conditional: code-change for PRs, batch for scheduled/push events

Fuzzer files:

  • Added :rtype: None to docstrings for complete type annotations
  • Replaced assert statements with explicit TypeError raises
  • Removed unnecessary except Exception: blocks

Build script:

  • Fixed to copy fuzzer files directly (instrumentation happens at runtime)
  • Added version constraint: atheris>=2.3.0
  • Removed unused fuzzer_package variable

Dockerfile:

  • Reordered SPDX headers to match project convention
  • Added comment about build script location for OSS-Fuzz/ClusterFuzzLite

@bact bact requested a review from Copilot February 5, 2026 02:58
Copilot finished work on behalf of bact March 6, 2026 06:06
@bact
Member

bact commented Mar 6, 2026

@copilot

Two issues in fuzz/fuzz_util_normalize.py

  1. At line 17, the TestOneInput function name does not follow Python convention. Use test_one_input if possible.
  2. At line 36, the exception may be already caught from the superclass. Recheck if this is necessary.

The fuzz testing job can be started now but still failed.
See the run at https://github.com/PyThaiNLP/pythainlp/actions/runs/22751418729/job/65987217516?pr=1282:
Here is the error message:

+ echo 'Build completed successfully!'
2026-03-06 06:15:53,478 - root - INFO - Removing unaffected fuzz targets.
2026-03-06 06:15:53,636 - root - INFO - Diffing against dev.
2026-03-06 06:15:53,640 - root - INFO - Files changed in PR: ['.clusterfuzzlite/Dockerfile', '.clusterfuzzlite/build.sh', '.github/workflows/clusterfuzzlite.yml', '.gitignore', 'fuzz/README.md', 'fuzz/fuzz_tokenize.py', 'fuzz/fuzz_util_normalize.py']
2026-03-06 06:15:53,643 - root - ERROR - No fuzz targets found in out dir.
2026-03-06 06:15:53,765 - root - ERROR - Build check failed.
2026-03-06 06:15:53,765 - root - ERROR - Error building fuzzers for (commit: 04039396d4bd7a88cd63b669677f264ab06e5e28, pr_ref: refs/pull/1282/merge).
Build check: stdout: ERROR: No fuzz targets found.

Fix all the issues.

- Rename TestOneInput to test_one_input (Python snake_case convention)
- Remove TypeError from except clause in both fuzzers: since we raise
  TypeError ourselves for validation failures, catching it was silently
  swallowing those checks instead of reporting them as fuzzer findings
- Use compile_python_fuzzer instead of cp+chmod in build.sh: bare cp
  produces a Python file without a proper executable wrapper, which
  fails ClusterFuzzLite's bad-build-check and causes 'No fuzz targets
  found in out dir' (CI job 65987217516)
- Update README example to match new naming and exception handling

Co-authored-by: bact <128572+bact@users.noreply.github.com>
Contributor Author

Copilot AI commented Mar 6, 2026

@copilot

Two issues in fuzz/fuzz_util_normalize.py
...

Fixed all three issues (06a94ea):

1. Renamed TestOneInput to test_one_input in both harnesses and the README example, following the Python snake_case convention.

2. Removed TypeError from except clauses in both harnesses. We raise TypeError ourselves for validation failures (lines 33–34), so catching it was silently swallowing those checks instead of letting them propagate as fuzzer findings. Only ValueError and UnicodeDecodeError remain — exceptions that the target functions themselves may raise legitimately.

3. Fixed "No fuzz targets found" (CI job 65987217516): The root cause was using cp + chmod, which places a raw Python file in $OUT. ClusterFuzzLite's bad-build-check requires proper libFuzzer executables — it tries to run them with -runs=0. A bare Python script without a shebang fails this check, so all targets get removed. Fixed by using compile_python_fuzzer (provided by the gcr.io/oss-fuzz-base/base-builder-python image), which creates a proper executable wrapper around each .py fuzzer.

Copilot finished work on behalf of bact March 6, 2026 06:27
@bact
Member

bact commented Mar 6, 2026

@copilot There are issues with the fuzz build and fuzz targets. Fix them.

See the run log at https://github.com/PyThaiNLP/pythainlp/actions/runs/22751954112/job/65988614803?pr=1282

Run google/clusterfuzzlite/actions/build_fuzzers@v1
/usr/bin/docker run --name gcrioossfuzzbaseclusterfuzzlitebuildfuzzersv1_e9a019 --label f87f13 --workdir /github/workspace --rm -e "INPUT_SANITIZER" -e "INPUT_LANGUAGE" -e "INPUT_DRY-RUN" -e "INPUT_ALLOWED-BROKEN-TARGETS-PERCENTAGE" -e "INPUT_PROJECT-SRC-PATH" -e "INPUT_BAD-BUILD-CHECK" -e "INPUT_KEEP-UNAFFECTED-FUZZ-TARGETS" -e "INPUT_STORAGE-REPO" -e "INPUT_STORAGE-REPO-BRANCH" -e "INPUT_STORAGE-REPO-BRANCH-COVERAGE" -e "INPUT_UPLOAD-BUILD" -e "INPUT_GITHUB-TOKEN" -e "ALLOWED_BROKEN_TARGETS_PERCENTAGE" -e "BAD_BUILD_CHECK" -e "UPLOAD_BUILD" -e "LANGUAGE" -e "DRY_RUN" -e "SANITIZER" -e "PROJECT_SRC_PATH" -e "GITHUB_TOKEN" -e "GIT_STORE_REPO" -e "GIT_STORE_BRANCH" -e "GIT_STORE_BRANCH_COVERAGE" -e "CFL_PLATFORM" -e "LOW_DISK_SPACE" -e "KEEP_UNAFFECTED_FUZZ_TARGETS" -e "HOME" -e "GITHUB_JOB" -e "GITHUB_REF" -e "GITHUB_SHA" -e "GITHUB_REPOSITORY" -e "GITHUB_REPOSITORY_OWNER" -e "GITHUB_REPOSITORY_OWNER_ID" -e "GITHUB_RUN_ID" -e "GITHUB_RUN_NUMBER" -e "GITHUB_RETENTION_DAYS" -e "GITHUB_RUN_ATTEMPT" -e "GITHUB_ACTOR_ID" -e "GITHUB_ACTOR" -e "GITHUB_WORKFLOW" -e "GITHUB_HEAD_REF" -e "GITHUB_BASE_REF" -e "GITHUB_EVENT_NAME" -e "GITHUB_SERVER_URL" -e "GITHUB_API_URL" -e "GITHUB_GRAPHQL_URL" -e "GITHUB_REF_NAME" -e "GITHUB_REF_PROTECTED" -e "GITHUB_REF_TYPE" -e "GITHUB_WORKFLOW_REF" -e "GITHUB_WORKFLOW_SHA" -e "GITHUB_REPOSITORY_ID" -e "GITHUB_TRIGGERING_ACTOR" -e "GITHUB_WORKSPACE" -e "GITHUB_ACTION" -e "GITHUB_EVENT_PATH" -e "GITHUB_ACTION_REPOSITORY" -e "GITHUB_ACTION_REF" -e "GITHUB_PATH" -e "GITHUB_ENV" -e "GITHUB_STEP_SUMMARY" -e "GITHUB_STATE" -e "GITHUB_OUTPUT" -e "RUNNER_OS" -e "RUNNER_ARCH" -e "RUNNER_NAME" -e "RUNNER_ENVIRONMENT" -e "RUNNER_TOOL_CACHE" -e "RUNNER_TEMP" -e "RUNNER_WORKSPACE" -e "ACTIONS_RUNTIME_URL" -e "ACTIONS_RUNTIME_TOKEN" -e "ACTIONS_CACHE_URL" -e "ACTIONS_RESULTS_URL" -e "ACTIONS_ORCHESTRATION_ID" -e GITHUB_ACTIONS=true -e CI=true -v "/var/run/docker.sock":"/var/run/docker.sock" -v "/home/runner/work/_temp":"/github/runner_temp" -v 
"/home/runner/work/_temp/_github_home":"/github/home" -v "/home/runner/work/_temp/_github_workflow":"/github/workflow" -v "/home/runner/work/_temp/_runner_file_commands":"/github/file_commands" -v "/home/runner/work/pythainlp/pythainlp":"/github/workspace" gcr.io/oss-fuzz-base/clusterfuzzlite-build-fuzzers:v1
2026-03-06 06:32:42,859 - root - INFO - ci_system: <continuous_integration.ExternalGithub object at 0x7f46c66450d0>.
2026-03-06 06:32:42,860 - root - INFO - ClusterFuzzDeployment: <clusterfuzz_deployment.ClusterFuzzLite object at 0x7f46c645be10>.
2026-03-06 06:32:42,860 - root - INFO - ExternalGithub: preparing for fuzzer build.
2026-03-06 06:32:46,465 - common_utils - INFO - Running: docker build -t external-cfl-project-443fff9cdf524eb4ab07b2498a891b17 -f /github/workspace/storage/pythainlp/.clusterfuzzlite/Dockerfile /github/workspace/storage/pythainlp.
#0 building with "default" instance using docker driver

#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 974B done
#1 DONE 0.0s

#2 [internal] load metadata for gcr.io/oss-fuzz-base/base-builder-python:latest
#2 DONE 0.8s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [1/6] FROM gcr.io/oss-fuzz-base/base-builder-python:latest@sha256:b7df26f0af28e4db390c7a71ad0effd95b32feafd9733149292f1f72ce9f6075

#5 [internal] load build context
#5 transferring context: 140.82MB 0.5s done
#5 DONE 0.5s
#4 extracting sha256:d4a1d91c108341131be61f9136b5b49c5c16cf3324130bc55ef090e86c0927e8 0.0s done
#4 extracting sha256:aa98f9a1bc279c6e09688c2d265977570899c735d7211517ddfdca0016ad5f7c 0.1s done
#4 extracting sha256:1484730110d905d23ea946ffe5c731476427ff4c9949577359a7a148b9f887d6 0.0s done
#4 extracting sha256:612d5e142a3e174ede8fc127254ed0d3e265d3ad83c4ebc186e18158af26bab6 done
#4 extracting sha256:77889d288af8247852357e588c6a0b646d7da205f49bdce438199c388006b4e8 0.1s
#4 extracting sha256:77889d288af8247852357e588c6a0b646d7da205f49bdce438199c388006b4e8 0.3s done
#4 extracting sha256:8eee3e26dce8a300a552c4fdb3b29bbb5eddf7ccf856504b9cc3637fa630e3e4
#4 extracting sha256:8eee3e26dce8a300a552c4fdb3b29bbb5eddf7ccf856504b9cc3637fa630e3e4 done
#4 extracting sha256:69733bba3950edb7ec067458aec4ed38bf8c2c8178d53eb091b5923adf6e7b33 0.1s done
#4 extracting sha256:703aa672aab943db59034d28eaccd7c78ed9ab2aa0c1f16b5fe9424331329c06
#4 extracting sha256:703aa672aab943db59034d28eaccd7c78ed9ab2aa0c1f16b5fe9424331329c06 0.0s done
#4 extracting sha256:7b7f5b1ab8c93aa5fd7ca2ba964ce5acdb3d62d84c4474bbe9fdf8c63440e0ba done
#4 extracting sha256:dc8faeec9efd50f42e6e82bdb51def5852733de8ba7895aaa9447dde2bcd41ef 0.1s
#4 extracting sha256:dc8faeec9efd50f42e6e82bdb51def5852733de8ba7895aaa9447dde2bcd41ef 0.3s done
#4 extracting sha256:d6f9a35520c637701d992665e5951c3efd4e34f7b0ff27e69e26b989124144b2
#4 extracting sha256:d6f9a35520c637701d992665e5951c3efd4e34f7b0ff27e69e26b989124144b2 0.1s done
#4 extracting sha256:0e1d47f5d1b20c7d1340841b51bad800e6c74b31d3c45421500a6a2fa24639f3
#4 extracting sha256:0e1d47f5d1b20c7d1340841b51bad800e6c74b31d3c45421500a6a2fa24639f3 done
#4 extracting sha256:409fffdb170ad335ebebd9bc598634c86e43addbdff1c65075160354647e7f97 0.1s
#4 extracting sha256:409fffdb170ad335ebebd9bc598634c86e43addbdff1c65075160354647e7f97 0.1s done
#4 extracting sha256:d3ba3e1a606b82df21566c6d10d59f1b522e78b2e06d78c3c424d6b6dfa93078 done
#4 extracting sha256:5bcf34ca81971172c727b7faa196347eb57cf02a68a9dd9949639e201c1c5f43 done
#4 extracting sha256:44c936e67ebf0b725393ff6d46dbe7ccd66d455eec346bff03a79dab2f455246 0.0s done
#4 extracting sha256:a06bc94dec2cb4e263d13b40f74cdb1f9498cc7431d53cbbafee8efa89bfd2db 0.0s done
#4 extracting sha256:f752c1656a63aab8e00a0f9bba810b067d6aae9d4e03d935dab132063c246f90 0.0s done
#4 extracting sha256:7630681a69a4acda801747c8546703474d5ef63f1fd500ae393c6eaee669f56c 0.0s done
#4 extracting sha256:e9c42f209884cd2a06a9dd3061f2c6d8c7aa62a4f2ecb47c8a2ec061beecd979 0.1s done
#4 extracting sha256:4f893604dacc0360cd1bfea8203e88af94730a846470b7f9f975d8bf00b16e08 done
#4 extracting sha256:ba522f78426367eb7319808eef2198f5269cd6f11daa53ca1acc4fad40c78b4b done
#4 extracting sha256:c3df89808b34138bfb379ecdfe321fcac5bbbbad3c28b0047b3d8943e4629eec done
#4 extracting sha256:2b2a5d795cfda238d6651d384e7613a71f25f07a5393e73c1521244cb13de743 done
#4 extracting sha256:1bec63e211537144ddf6848526d036ab036a3c883ae780d3b6d0331714602121 0.0s done
#4 extracting sha256:895e54f99fa976a24be506df302cc9a3b13530fb4c9bf17cca0eba00aaf887da
#4 extracting sha256:895e54f99fa976a24be506df302cc9a3b13530fb4c9bf17cca0eba00aaf887da done
#4 extracting sha256:48907a1598e54328b05b415d900ed1030cfd312432fb23abbd82fdeeb8cc18d3 0.1s
#4 extracting sha256:48907a1598e54328b05b415d900ed1030cfd312432fb23abbd82fdeeb8cc18d3 0.3s done
#4 extracting sha256:bed048b4f935a011118a00e7f32a9c074e3144b1421f925a6c8330b38c8e9cc5
#4 extracting sha256:bed048b4f935a011118a00e7f32a9c074e3144b1421f925a6c8330b38c8e9cc5 0.3s done
#4 extracting sha256:2500495383a0bcbbaca8c1d1f9129c9dfc740de078800b6aacaa4e1524d00291
#4 extracting sha256:2500495383a0bcbbaca8c1d1f9129c9dfc740de078800b6aacaa4e1524d00291 1.6s done
#4 DONE 26.5s

#6 [2/6] RUN apt-get update &&     apt-get install -y --no-install-recommends     build-essential     libicu-dev     pkg-config &&     rm -rf /var/lib/apt/lists/*
#6 0.396 Hit:1 http://security.ubuntu.com/ubuntu focal-security InRelease
#6 0.491 Hit:2 http://archive.ubuntu.com/ubuntu focal InRelease
#6 0.588 Hit:3 http://archive.ubuntu.com/ubuntu focal-updates InRelease
#6 0.677 Hit:4 http://archive.ubuntu.com/ubuntu focal-backports InRelease
#6 0.769 Reading package lists...
#6 1.909 Reading package lists...
#6 2.989 Building dependency tree...
#6 3.210 Reading state information...
#6 3.350 build-essential is already the newest version (12.8ubuntu1.1).
#6 3.350 The following packages were automatically installed and are no longer required:
#6 3.350   autotools-dev libsigsegv2 m4
#6 3.351 Use 'apt autoremove' to remove them.
#6 3.351 The following additional packages will be installed:
#6 3.351   icu-devtools libglib2.0-0 libicu66
#6 3.352 Suggested packages:
#6 3.352   icu-doc
#6 3.352 Recommended packages:
#6 3.352   libglib2.0-data shared-mime-info xdg-user-dirs
#6 3.387 The following NEW packages will be installed:
#6 3.387   icu-devtools libglib2.0-0 libicu-dev libicu66 pkg-config
#6 3.454 0 upgraded, 5 newly installed, 0 to remove and 0 not upgraded.
#6 3.454 Need to get 19.5 MB of archives.
#6 3.454 After this operation, 83.2 MB of additional disk space will be used.
#6 3.454 Get:1 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libglib2.0-0 amd64 2.64.6-1~ubuntu20.04.9 [1290 kB]
#6 3.649 Get:2 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libicu66 amd64 66.1-2ubuntu2.1 [8515 kB]
#6 3.743 Get:3 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 icu-devtools amd64 66.1-2ubuntu2.1 [189 kB]
#6 3.745 Get:4 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 libicu-dev amd64 66.1-2ubuntu2.1 [9451 kB]
#6 3.820 Get:5 http://archive.ubuntu.com/ubuntu focal/main amd64 pkg-config amd64 0.29.1-0ubuntu4 [45.5 kB]
#6 3.931 debconf: delaying package configuration, since apt-utils is not installed
#6 3.959 Fetched 19.5 MB in 0s (45.5 MB/s)
#6 3.978 Selecting previously unselected package libglib2.0-0:amd64.
#6 3.978 (Reading database ... 
(Reading database ... 17483 files and directories currently installed.)
#6 3.993 Preparing to unpack .../libglib2.0-0_2.64.6-1~ubuntu20.04.9_amd64.deb ...
#6 3.995 Unpacking libglib2.0-0:amd64 (2.64.6-1~ubuntu20.04.9) ...
#6 4.090 Selecting previously unselected package libicu66:amd64.
#6 4.092 Preparing to unpack .../libicu66_66.1-2ubuntu2.1_amd64.deb ...
#6 4.093 Unpacking libicu66:amd64 (66.1-2ubuntu2.1) ...
#6 4.695 Selecting previously unselected package icu-devtools.
#6 4.697 Preparing to unpack .../icu-devtools_66.1-2ubuntu2.1_amd64.deb ...
#6 4.698 Unpacking icu-devtools (66.1-2ubuntu2.1) ...
#6 4.725 Selecting previously unselected package libicu-dev:amd64.
#6 4.727 Preparing to unpack .../libicu-dev_66.1-2ubuntu2.1_amd64.deb ...
#6 4.728 Unpacking libicu-dev:amd64 (66.1-2ubuntu2.1) ...
#6 5.405 Selecting previously unselected package pkg-config.
#6 5.408 Preparing to unpack .../pkg-config_0.29.1-0ubuntu4_amd64.deb ...
#6 5.409 Unpacking pkg-config (0.29.1-0ubuntu4) ...
#6 5.433 Setting up libglib2.0-0:amd64 (2.64.6-1~ubuntu20.04.9) ...
#6 5.439 No schema files found: doing nothing.
#6 5.442 Setting up libicu66:amd64 (66.1-2ubuntu2.1) ...
#6 5.445 Setting up icu-devtools (66.1-2ubuntu2.1) ...
#6 5.448 Setting up pkg-config (0.29.1-0ubuntu4) ...
#6 5.485 Setting up libicu-dev:amd64 (66.1-2ubuntu2.1) ...
#6 5.487 Processing triggers for libc-bin (2.31-0ubuntu9.18) ...
#6 DONE 5.6s

#7 [3/6] COPY . /src/pythainlp
#7 DONE 0.3s

#8 [4/6] WORKDIR /src/pythainlp
#8 DONE 0.0s

#9 [5/6] RUN pip install --no-cache-dir -e .
#9 0.388 Obtaining file:///src/pythainlp
#9 0.392   Installing build dependencies: started
#9 1.011   Installing build dependencies: finished with status 'done'
#9 1.012   Checking if build backend supports build_editable: started
#9 1.284   Checking if build backend supports build_editable: finished with status 'done'
#9 1.285   Getting requirements to build editable: started
#9 1.639   Getting requirements to build editable: finished with status 'done'
#9 1.640   Preparing editable metadata (pyproject.toml): started
#9 1.872   Preparing editable metadata (pyproject.toml): finished with status 'done'
#9 1.909 Building wheels for collected packages: pythainlp
#9 1.910   Building editable for pythainlp (pyproject.toml): started
#9 2.228   Building editable for pythainlp (pyproject.toml): finished with status 'done'
#9 2.229   Created wheel for pythainlp: filename=pythainlp-5.2.0-0.editable-py3-none-any.whl size=13445 sha256=3e49529703337b7dd7493de0b8680704bc7747fe59a6eb08cadd3a5af9594fd8
#9 2.229   Stored in directory: /tmp/pip-ephem-wheel-cache-0l0u3irg/wheels/5b/8d/99/76fb73da426627a9f1ace799b52cfc55ea3c0294fb71bcd6be
#9 2.234 Successfully built pythainlp
#9 2.244 Installing collected packages: pythainlp
#9 2.256 Successfully installed pythainlp-5.2.0
#9 2.256 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager, possibly rendering your system unusable. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv. Use the --root-user-action option if you know what you are doing and want to suppress this warning.
#9 DONE 2.4s

#10 [6/6] COPY .clusterfuzzlite/build.sh /src/
#10 DONE 0.0s

#11 exporting to image
#11 exporting layers
#11 exporting layers 2.9s done
#11 writing image sha256:76e2fb07ad95ef5c20cedd4dfdb5e20d33202859d53b911fac1fa7ae42f51a05 done
#11 naming to docker.io/library/external-cfl-project-443fff9cdf524eb4ab07b2498a891b17 done
#11 DONE 2.9s
2026-03-06 06:33:25,166 - root - INFO - repo_dir: /github/workspace/storage/pythainlp.
2026-03-06 06:33:25,172 - root - INFO - Docker container: a79c90c7006b.
2026-03-06 06:33:25,172 - root - INFO - Building with address sanitizer.
2026-03-06 06:33:25,172 - helper - INFO - Running: docker run --privileged --shm-size=2g --platform linux/amd64 --rm -e FUZZING_ENGINE=libfuzzer -e CIFUZZ=True -e SANITIZER=address -e ARCHITECTURE=x86_64 -e FUZZING_LANGUAGE=python -e OUT=/github/workspace/build-out --volumes-from a79c90c7006b external-cfl-project-443fff9cdf524eb4ab07b2498a891b17 /bin/bash -c compile.
---------------------------------------------------------------
vm.mmap_rnd_bits = 28
Compiling libFuzzer to /usr/lib/libFuzzingEngine.a...  done.
---------------------------------------------------------------
CC=clang
CXX=clang++
CFLAGS=-O1   -fno-omit-frame-pointer   -gline-tables-only   -Wno-error=incompatible-function-pointer-types   -Wno-error=int-conversion   -Wno-error=deprecated-declarations   -Wno-error=implicit-function-declaration   -Wno-error=implicit-int   -Wno-error=unknown-warning-option   -Wno-error=vla-cxx-extension   -DFUZZING_BUILD_MODE_UNSAFE_FOR_PRODUCTION -fsanitize=address -fsanitize-address-use-after-scope -fsanitize=fuzzer-no-link -fno-sanitize=function,leak,vptr,
CXXFLAGS=-O1   -fno-omit-frame-pointer   -gline-tables-only   -Wno-error=incompatible-function-pointer-types   -Wno-error=int-conversion   -Wno-error=deprecated-declarations   -Wno-error=implicit-function-declaration   -Wno-error=implicit-int   -Wno-error=unknown-warning-option   -Wno-error=vla-cxx-extension   -DFUZZING_BUILD_MODE_UNSAFE_FOR_PRODUCTION -fsanitize=address -fsanitize-address-use-after-scope -fsanitize=fuzzer-no-link -stdlib=libc++ -fno-sanitize=function,leak,vptr
RUSTFLAGS=--cfg fuzzing -Zsanitizer=address -Cdebuginfo=1 -Cforce-frame-pointers
---------------------------------------------------------------
Building PyThaiNLP fuzz targets...
+ echo 'Building PyThaiNLP fuzz targets...'
+ pip install atheris==2.3.0
Collecting atheris==2.3.0
  Downloading atheris-2.3.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (17 kB)
Downloading atheris-2.3.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (31.2 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 31.2/31.2 MB 158.3 MB/s  0:00:00
Installing collected packages: atheris
  Attempting uninstall: atheris
    Found existing installation: atheris 3.0.0
    Uninstalling atheris-3.0.0:
      Successfully uninstalled atheris-3.0.0
Successfully installed atheris-2.3.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager, possibly rendering your system unusable. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv. Use the --root-user-action option if you know what you are doing and want to suppress this warning.
+ for fuzzer in "${SRC}/pythainlp/fuzz"/fuzz_*.py
+ [[ -e /src/pythainlp/fuzz/fuzz_tokenize.py ]]
++ basename /src/pythainlp/fuzz/fuzz_tokenize.py
+ echo 'Compiling fuzz_tokenize.py...'
+ compile_python_fuzzer /src/pythainlp/fuzz/fuzz_tokenize.py
Compiling fuzz_tokenize.py...
+ fuzzer_path=/src/pythainlp/fuzz/fuzz_tokenize.py
+ shift 1
++ basename -s .py /src/pythainlp/fuzz/fuzz_tokenize.py
+ fuzzer_basename=fuzz_tokenize
+ fuzzer_package=fuzz_tokenize.pkg
+ PYFUZZ_WORKPATH=/src/pyfuzzworkdir/
+ FUZZ_WORKPATH=/src/pyfuzzworkdir//fuzz_tokenize
+ [[ address = *introspector* ]]
+ [[ address = *coverage* ]]
+ [[ 0 != \0 ]]
+ rm -rf /src/pyfuzzworkdir/
+ mkdir /src/pyfuzzworkdir/ /src/pyfuzzworkdir//fuzz_tokenize
+ pyinstaller --distpath /github/workspace/build-out --workpath=/src/pyfuzzworkdir//fuzz_tokenize --onefile --name fuzz_tokenize.pkg /src/pythainlp/fuzz/fuzz_tokenize.py
47 INFO: PyInstaller: 6.10.0, contrib hooks: 2026.2
47 INFO: Python: 3.11.13
48 INFO: Platform: Linux-6.14.0-1017-azure-x86_64-with-glibc2.31
48 INFO: Python environment: /usr/local
49 INFO: wrote /src/pythainlp/fuzz_tokenize.pkg.spec
51 INFO: Module search paths (PYTHONPATH):
['/usr/local/lib/python311.zip',
 '/usr/local/lib/python3.11',
 '/usr/local/lib/python3.11/lib-dynload',
 '/usr/local/lib/python3.11/site-packages',
 '__editable__.pythainlp-5.2.0.finder.__path_hook__',
 '/src/pythainlp/fuzz']
133 INFO: checking Analysis
133 INFO: Building Analysis because Analysis-00.toc is non existent
133 INFO: Running Analysis Analysis-00.toc
133 INFO: Target bytecode optimization level: 0
134 INFO: Initializing module dependency graph...
134 INFO: Caching module graph hooks...
143 INFO: Analyzing base_library.zip ...
795 INFO: Processing standard module hook 'hook-heapq.py' from '/usr/local/lib/python3.11/site-packages/PyInstaller/hooks'
901 INFO: Processing standard module hook 'hook-encodings.py' from '/usr/local/lib/python3.11/site-packages/PyInstaller/hooks'
2215 INFO: Processing standard module hook 'hook-pickle.py' from '/usr/local/lib/python3.11/site-packages/PyInstaller/hooks'
3749 INFO: Caching module dependency graph...
3817 INFO: Looking for Python shared library...
3825 INFO: Using Python shared library: /usr/local/lib/libpython3.11.so.1.0
3825 INFO: Analyzing /src/pythainlp/fuzz/fuzz_tokenize.py
3828 INFO: Processing standard module hook 'hook-atheris.py' from '/usr/local/lib/python3.11/site-packages/atheris'
3886 INFO: Processing module hooks (post-graph stage)...
3892 INFO: Performing binary vs. data reclassification (2 entries)
3896 INFO: Looking for ctypes DLLs
3906 INFO: Analyzing run-time hooks ...
3907 INFO: Including run-time hook 'pyi_rth_inspect.py' from '/usr/local/lib/python3.11/site-packages/PyInstaller/hooks/rthooks'
3911 INFO: Looking for dynamic libraries
4279 INFO: Warnings written to /src/pyfuzzworkdir/fuzz_tokenize/fuzz_tokenize.pkg/warn-fuzz_tokenize.pkg.txt
4288 INFO: Graph cross-reference written to /src/pyfuzzworkdir/fuzz_tokenize/fuzz_tokenize.pkg/xref-fuzz_tokenize.pkg.html
4297 INFO: checking PYZ
4297 INFO: Building PYZ because PYZ-00.toc is non existent
4297 INFO: Building PYZ (ZlibArchive) /src/pyfuzzworkdir/fuzz_tokenize/fuzz_tokenize.pkg/PYZ-00.pyz
4485 INFO: Building PYZ (ZlibArchive) /src/pyfuzzworkdir/fuzz_tokenize/fuzz_tokenize.pkg/PYZ-00.pyz completed successfully.
4494 INFO: checking PKG
4495 INFO: Building PKG because PKG-00.toc is non existent
4495 INFO: Building PKG (CArchive) fuzz_tokenize.pkg.pkg
15811 INFO: Building PKG (CArchive) fuzz_tokenize.pkg.pkg completed successfully.
15813 INFO: Bootloader /usr/local/lib/python3.11/site-packages/PyInstaller/bootloader/Linux-64bit-intel/run
15813 INFO: checking EXE
15813 INFO: Building EXE because EXE-00.toc is non existent
15813 INFO: Building EXE from EXE-00.toc
15813 INFO: Copying bootloader EXE to /github/workspace/build-out/fuzz_tokenize.pkg
15814 INFO: Appending PKG archive to custom ELF section in EXE
15855 INFO: Building EXE from EXE-00.toc completed successfully.
+ chmod -x /github/workspace/build-out/fuzz_tokenize.pkg
+ [[ address = *coverage* ]]
+ echo '#!/bin/sh
# LLVMFuzzerTestOneInput for fuzzer detection.
this_dir=$(dirname "$0")
chmod +x $this_dir/fuzz_tokenize.pkg
LD_PRELOAD=$this_dir/sanitizer_with_fuzzer.so ASAN_OPTIONS=$ASAN_OPTIONS:symbolize=1:external_symbolizer_path=$this_dir/llvm-symbolizer:detect_leaks=0 $this_dir/fuzz_tokenize.pkg $@'
+ chmod +x /github/workspace/build-out/fuzz_tokenize
+ for fuzzer in "${SRC}/pythainlp/fuzz"/fuzz_*.py
+ [[ -e /src/pythainlp/fuzz/fuzz_util_normalize.py ]]
++ basename /src/pythainlp/fuzz/fuzz_util_normalize.py
Compiling fuzz_util_normalize.py...
+ echo 'Compiling fuzz_util_normalize.py...'
+ compile_python_fuzzer /src/pythainlp/fuzz/fuzz_util_normalize.py
+ fuzzer_path=/src/pythainlp/fuzz/fuzz_util_normalize.py
+ shift 1
++ basename -s .py /src/pythainlp/fuzz/fuzz_util_normalize.py
+ fuzzer_basename=fuzz_util_normalize
+ fuzzer_package=fuzz_util_normalize.pkg
+ PYFUZZ_WORKPATH=/src/pyfuzzworkdir/
+ FUZZ_WORKPATH=/src/pyfuzzworkdir//fuzz_util_normalize
+ [[ address = *introspector* ]]
+ [[ address = *coverage* ]]
+ [[ 0 != \0 ]]
+ rm -rf /src/pyfuzzworkdir/
+ mkdir /src/pyfuzzworkdir/ /src/pyfuzzworkdir//fuzz_util_normalize
+ pyinstaller --distpath /github/workspace/build-out --workpath=/src/pyfuzzworkdir//fuzz_util_normalize --onefile --name fuzz_util_normalize.pkg /src/pythainlp/fuzz/fuzz_util_normalize.py
47 INFO: PyInstaller: 6.10.0, contrib hooks: 2026.2
47 INFO: Python: 3.11.13
48 INFO: Platform: Linux-6.14.0-1017-azure-x86_64-with-glibc2.31
48 INFO: Python environment: /usr/local
49 INFO: wrote /src/pythainlp/fuzz_util_normalize.pkg.spec
50 INFO: Module search paths (PYTHONPATH):
['/usr/local/lib/python311.zip',
 '/usr/local/lib/python3.11',
 '/usr/local/lib/python3.11/lib-dynload',
 '/usr/local/lib/python3.11/site-packages',
 '__editable__.pythainlp-5.2.0.finder.__path_hook__',
 '/src/pythainlp/fuzz']
134 INFO: checking Analysis
134 INFO: Building Analysis because Analysis-00.toc is non existent
134 INFO: Running Analysis Analysis-00.toc
135 INFO: Target bytecode optimization level: 0
135 INFO: Initializing module dependency graph...
135 INFO: Caching module graph hooks...
144 INFO: Analyzing base_library.zip ...
789 INFO: Processing standard module hook 'hook-heapq.py' from '/usr/local/lib/python3.11/site-packages/PyInstaller/hooks'
862 INFO: Processing standard module hook 'hook-encodings.py' from '/usr/local/lib/python3.11/site-packages/PyInstaller/hooks'
2114 INFO: Processing standard module hook 'hook-pickle.py' from '/usr/local/lib/python3.11/site-packages/PyInstaller/hooks'
3475 INFO: Caching module dependency graph...
3555 INFO: Looking for Python shared library...
3563 INFO: Using Python shared library: /usr/local/lib/libpython3.11.so.1.0
3563 INFO: Analyzing /src/pythainlp/fuzz/fuzz_util_normalize.py
3565 INFO: Processing standard module hook 'hook-atheris.py' from '/usr/local/lib/python3.11/site-packages/atheris'
3624 INFO: Processing module hooks (post-graph stage)...
3630 INFO: Performing binary vs. data reclassification (2 entries)
3633 INFO: Looking for ctypes DLLs
3643 INFO: Analyzing run-time hooks ...
3645 INFO: Including run-time hook 'pyi_rth_inspect.py' from '/usr/local/lib/python3.11/site-packages/PyInstaller/hooks/rthooks'
3649 INFO: Looking for dynamic libraries
4012 INFO: Warnings written to /src/pyfuzzworkdir/fuzz_util_normalize/fuzz_util_normalize.pkg/warn-fuzz_util_normalize.pkg.txt
4021 INFO: Graph cross-reference written to /src/pyfuzzworkdir/fuzz_util_normalize/fuzz_util_normalize.pkg/xref-fuzz_util_normalize.pkg.html
4030 INFO: checking PYZ
4030 INFO: Building PYZ because PYZ-00.toc is non existent
4030 INFO: Building PYZ (ZlibArchive) /src/pyfuzzworkdir/fuzz_util_normalize/fuzz_util_normalize.pkg/PYZ-00.pyz
4212 INFO: Building PYZ (ZlibArchive) /src/pyfuzzworkdir/fuzz_util_normalize/fuzz_util_normalize.pkg/PYZ-00.pyz completed successfully.
4222 INFO: checking PKG
4223 INFO: Building PKG because PKG-00.toc is non existent
4223 INFO: Building PKG (CArchive) fuzz_util_normalize.pkg.pkg
15534 INFO: Building PKG (CArchive) fuzz_util_normalize.pkg.pkg completed successfully.
15535 INFO: Bootloader /usr/local/lib/python3.11/site-packages/PyInstaller/bootloader/Linux-64bit-intel/run
15535 INFO: checking EXE
15536 INFO: Building EXE because EXE-00.toc is non existent
15536 INFO: Building EXE from EXE-00.toc
15536 INFO: Copying bootloader EXE to /github/workspace/build-out/fuzz_util_normalize.pkg
15536 INFO: Appending PKG archive to custom ELF section in EXE
15577 INFO: Building EXE from EXE-00.toc completed successfully.
+ chmod -x /github/workspace/build-out/fuzz_util_normalize.pkg
+ [[ address = *coverage* ]]
+ echo '#!/bin/sh
# LLVMFuzzerTestOneInput for fuzzer detection.
this_dir=$(dirname "$0")
chmod +x $this_dir/fuzz_util_normalize.pkg
LD_PRELOAD=$this_dir/sanitizer_with_fuzzer.so ASAN_OPTIONS=$ASAN_OPTIONS:symbolize=1:external_symbolizer_path=$this_dir/llvm-symbolizer:detect_leaks=0 $this_dir/fuzz_util_normalize.pkg $@'
+ chmod +x /github/workspace/build-out/fuzz_util_normalize
Build completed successfully!
2026-03-06 06:33:58,611 - root - INFO - Removing unaffected fuzz targets.
+ echo 'Build completed successfully!'
2026-03-06 06:33:58,779 - root - INFO - Diffing against dev.
2026-03-06 06:33:58,783 - root - INFO - Files changed in PR: ['.clusterfuzzlite/Dockerfile', '.clusterfuzzlite/build.sh', '.github/workflows/clusterfuzzlite.yml', '.gitignore', 'fuzz/README.md', 'fuzz/fuzz_tokenize.py', 'fuzz/fuzz_util_normalize.py']
2026-03-06 06:33:58,849 - root - ERROR - Request to https://api.github.com/repos/PyThaiNLP/pythainlp/actions/artifacts?per_page=100&page=1 failed. Code: 401. Response: {'message': 'Bad credentials', 'documentation_url': 'https://docs.github.com/rest', 'status': '401'}
2026-03-06 06:33:58,849 - root - ERROR - Could not get coverage: Github API request failed..
2026-03-06 06:33:58,850 - root - ERROR - Could not find latest coverage report.
2026-03-06 06:34:02,188 - root - ERROR - Build check failed.
2026-03-06 06:34:02,188 - root - ERROR - Error building fuzzers for (commit: f03c0f3281946c7fe6ff7aefc9b2b1fbc931b18a, pr_ref: refs/pull/1282/merge).
Build check: stdout: INFO: performing bad build checks for /tmp/not-out/tmpb9w97vx8/fuzz_util_normalize
INFO: performing bad build checks for /tmp/not-out/tmpb9w97vx8/fuzz_tokenize
Retrying failed fuzz targets sequentially 2
INFO: performing bad build checks for /tmp/not-out/tmpb9w97vx8/fuzz_util_normalize
INFO: performing bad build checks for /tmp/not-out/tmpb9w97vx8/fuzz_tokenize
Broken fuzz targets 2
('/tmp/not-out/tmpb9w97vx8/fuzz_util_normalize', CompletedProcess(args=['bad_build_check', '/tmp/not-out/tmpb9w97vx8/fuzz_util_normalize'], returncode=1, stdout=b'BAD BUILD: /tmp/not-out/tmpb9w97vx8/fuzz_util_normalize seems to have either startup crash or exit:\nsysctl: setting key "vm.mmap_rnd_bits", ignoring: Read-only file system\n/tmp/not-out/tmpb9w97vx8/fuzz_util_normalize -- -rss_limit_mb=2560 -timeout=25 -seed=1337 -runs=4 < /dev/null\nTraceback (most recent call last):\n  File "fuzz_util_normalize.py", line 14, in <module>\nModuleNotFoundError: No module named \'pythainlp\'\n[PYI-179:ERROR] Failed to execute script \'fuzz_util_normalize\' due to unhandled exception!\n', stderr=b''))
BAD BUILD: /tmp/not-out/tmpb9w97vx8/fuzz_util_normalize seems to have either startup crash or exit:
sysctl: setting key "vm.mmap_rnd_bits", ignoring: Read-only file system
/tmp/not-out/tmpb9w97vx8/fuzz_util_normalize -- -rss_limit_mb=2560 -timeout=25 -seed=1337 -runs=4 < /dev/null
Traceback (most recent call last):
  File "fuzz_util_normalize.py", line 14, in <module>
ModuleNotFoundError: No module named 'pythainlp'
[PYI-215:ERROR] Failed to execute script 'fuzz_util_normalize' due to unhandled exception!

('/tmp/not-out/tmpb9w97vx8/fuzz_tokenize', CompletedProcess(args=['bad_build_check', '/tmp/not-out/tmpb9w97vx8/fuzz_tokenize'], returncode=1, stdout=b'BAD BUILD: /tmp/not-out/tmpb9w97vx8/fuzz_tokenize seems to have either startup crash or exit:\nsysctl: setting key "vm.mmap_rnd_bits", ignoring: Read-only file system\n/tmp/not-out/tmpb9w97vx8/fuzz_tokenize -- -rss_limit_mb=2560 -timeout=25 -seed=1337 -runs=4 < /dev/null\nTraceback (most recent call last):\n  File "fuzz_tokenize.py", line 14, in <module>\nModuleNotFoundError: No module named \'pythainlp\'\n[PYI-180:ERROR] Failed to execute script \'fuzz_tokenize\' due to unhandled exception!\n', stderr=b''))
BAD BUILD: /tmp/not-out/tmpb9w97vx8/fuzz_tokenize seems to have either startup crash or exit:
sysctl: setting key "vm.mmap_rnd_bits", ignoring: Read-only file system
/tmp/not-out/tmpb9w97vx8/fuzz_tokenize -- -rss_limit_mb=2560 -timeout=25 -seed=1337 -runs=4 < /dev/null
Traceback (most recent call last):
  File "fuzz_tokenize.py", line 14, in <module>
ModuleNotFoundError: No module named 'pythainlp'
[PYI-244:ERROR] Failed to execute script 'fuzz_tokenize' due to unhandled exception!

ERROR: 100.0% of fuzz targets seem to be broken. See the list above for a detailed information.

stderr: 

…install

PyInstaller (used by compile_python_fuzzer) cannot follow the editable
install path hooks created by 'pip install -e .'. When bundling the
fuzzer executables, pythainlp is not found in the bundle, causing:

  ModuleNotFoundError: No module named 'pythainlp'

Fix: use 'pip install .' so pythainlp lands in site-packages where
PyInstaller's dependency scanner can discover and bundle it.

Seen in CI job 65988614803.

Co-authored-by: bact <128572+bact@users.noreply.github.com>
Contributor Author

Copilot AI commented Mar 6, 2026

@copilot There are issues with the fuzz build and fuzz targets. Fix them.

See the run log at https://github.com/PyThaiNLP/pythainlp/actions/runs/22751954112/job/65988614803?pr=12...

Fixed in d3d661f. The full error from job 65988614803:

ModuleNotFoundError: No module named 'pythainlp'
[PYI-179:ERROR] Failed to execute script 'fuzz_util_normalize' due to unhandled exception!

compile_python_fuzzer uses PyInstaller to bundle the fuzzer as a standalone executable. PyInstaller's static dependency scanner cannot follow the path hooks created by an editable install (pip install -e .). As a result, pythainlp was missing from the bundle.

Fix: changed the Dockerfile from pip install -e . to pip install . so pythainlp lands in site-packages where PyInstaller can discover and bundle it.
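The change can be illustrated with a minimal sketch (the exact Dockerfile lines in the repo may differ):

```shell
# Before: editable install. pip creates an __editable__...finder path hook,
# which PyInstaller's static dependency scanner cannot follow.
pip install --no-cache-dir -e .

# After: regular install. pythainlp is copied into site-packages,
# where PyInstaller's dependency analysis can discover and bundle it.
pip install --no-cache-dir .
```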

Copilot finished work on behalf of bact March 6, 2026 07:10
@bact bact marked this pull request as ready for review March 6, 2026 07:19
@bact
Member

bact commented Mar 6, 2026

@copilot In fuzz/fuzz_tokenize.py and fuzz/fuzz_util_normalize.py, check if ValueError and UnicodeDecodeError are redundantly captured and fix them if necessary.

The workflow can finish successfully now. Check the run log at
https://github.com/PyThaiNLP/pythainlp/actions/runs/22753084476/job/65991658653?pr=1282 to see if there's anything that doesn't look right.

UnicodeDecodeError is a subclass of UnicodeError which is a subclass of
ValueError (UnicodeDecodeError -> UnicodeError -> ValueError). Listing
both in the except clause is therefore redundant; ValueError alone is
sufficient to catch UnicodeDecodeError as well.

The CI run (job 65991658653) completed successfully: both fuzz targets
built and ran for 300 seconds with no crashes found. The only notable
entry in the logs is a 401 on the coverage API which is a known
ClusterFuzzLite limitation when no prior coverage artifact exists.

Co-authored-by: bact <128572+bact@users.noreply.github.com>
Contributor Author

Copilot AI commented Mar 6, 2026

@copilot In fuzz/fuzz_tokenize.py and fuzz/fuzz_util_normalize.py, check if ValueError and UnicodeDecodeError are redundantly captured and fix them if necessary.

The workflow can f...

Fixed redundant exception handling in 88a86f3. UnicodeDecodeError → UnicodeError → ValueError, so listing both was redundant. Simplified to just except ValueError: in both harnesses.

The workflow run (job 65991658653) looks good overall:

  • Both fuzz targets built and ran for 300 seconds each with no crashes found
  • The only non-ideal entry is a 401 on the GitHub API coverage fetch: Could not get coverage: Github API request failed. This is a known ClusterFuzzLite limitation on first runs — the GITHUB_TOKEN can't read prior run artifacts for coverage diffing. It's non-fatal and will resolve once a coverage artifact exists from a batch/scheduled run.
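The class hierarchy can be verified directly in Python (a minimal illustration, not code from the harnesses):

```python
# UnicodeDecodeError -> UnicodeError -> ValueError:
# a single `except ValueError` clause therefore catches all three.
assert issubclass(UnicodeDecodeError, UnicodeError)
assert issubclass(UnicodeError, ValueError)

def classify(data: bytes) -> str:
    """Return the name of the exception raised while decoding, if any."""
    try:
        data.decode("utf-8")
    except ValueError as exc:  # also catches UnicodeDecodeError
        return type(exc).__name__
    return "ok"

print(classify(b"\xff"))    # invalid UTF-8 start byte -> UnicodeDecodeError
print(classify(b"hello"))   # valid ASCII -> ok
```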

Copilot finished work on behalf of bact March 6, 2026 07:29
@bact
Member

bact commented Mar 6, 2026

@copilot
There are two kinds of warnings during Run Fuzzers step in this run log: https://github.com/PyThaiNLP/pythainlp/actions/runs/22753626606/job/65993244836?pr=1282

WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
This may also happen if the target rejected all inputs we tried so far
fatal: 'origin/gh-pages' is not a commit and a branch 'gh-pages' cannot be created from it
Switched to a new branch 'gh-pages'

Can we fix that?

…rts()

Without atheris.instrument_imports(), the Python bytecode is not
instrumented for coverage, so libFuzzer receives no feedback on which
code paths are hit and cannot guide mutation. This causes the warning:

  WARNING: no interesting inputs were found so far.
  Is the code instrumented for coverage?

Fix: wrap the pythainlp imports in both harnesses inside a
'with atheris.instrument_imports():' block so atheris instruments the
module bytecode at import time.

Also update README:
- Fix the example code to use atheris.instrument_imports()
- Fix example except clause: ValueError alone covers UnicodeDecodeError
  (UnicodeDecodeError -> UnicodeError -> ValueError)
- Add "Known warnings on first run" section documenting:
  - gh-pages branch bootstrap message (one-time, benign)
  - coverage instrumentation warning (fixed by this commit)

Co-authored-by: bact <128572+bact@users.noreply.github.com>
Contributor Author

Copilot AI commented Mar 6, 2026

@copilot
There are two kinds of warnings during Run Fuzzers step in this run log: https://github.com/PyThaiNLP/pythainlp/actions/runs/22753626606/job/65993244836?pr=1282

...

Fixed in 9b853f7.

Coverage instrumentation warning — The root cause was that pythainlp was imported outside of atheris.instrument_imports(). Without that context manager, atheris never instruments the Python bytecode and libFuzzer receives no coverage signal, causing the warning. Fixed by wrapping the imports in both harnesses:

with atheris.instrument_imports():
    import pythainlp.tokenize   # or pythainlp.util

gh-pages branch warning — This is a one-time bootstrap message. On the first run the remote branch doesn't exist yet, so ClusterFuzzLite creates it locally (Switched to a new branch 'gh-pages') and pushes an empty commit. Subsequent runs will not show this message. Added a "Known warnings on first run" section to fuzz/README.md documenting both warnings.

Copilot finished work on behalf of bact March 6, 2026 07:43
bact added 2 commits March 6, 2026 14:54
Updated headings to follow consistent casing and formatting. Adjusted sections related to directory structure, fuzzing targets, local testing, CI/CD integration, adding new fuzzers, and best practices for corpus storage.
Update README to clarify error message during workflow run.
Contributor Author

Copilot AI commented Mar 6, 2026

@bact Unfortunately I hit an unexpected error while processing your comment. I've automatically reported this to GitHub.

You can ask me to try again later by mentioning me in a new comment.

If you want to contact GitHub about this error, please mention the following identifier so they can better serve you: 84f32013-802b-495e-b570-07b656deda9f

Sorry for the inconvenience!

@bact bact merged commit e9ba05f into dev Mar 6, 2026
10 of 11 checks passed
@github-project-automation github-project-automation Bot moved this from In progress to Done in PyThaiNLP Mar 6, 2026
@sonarqubecloud

sonarqubecloud Bot commented Mar 6, 2026

Quality Gate failed

Failed conditions
4 Security Hotspots

See analysis details on SonarQube Cloud

@bact bact deleted the copilot/setup-clusterfuzzlite-testing branch March 6, 2026 08:06

Labels

infrastructure: build process, configuration, GitHub Actions
tests: unit test, code coverage, test case

Projects

Status: Done

Development

Successfully merging this pull request may close these issues.

Setup ClusterFuzzLite for fuzz testing

3 participants