@@ -15,6 +15,7 @@
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:usesCleartextTraffic="true"
android:largeHeap="true"

💡 Verification agent

🧩 Analysis chain

Adding android:largeHeap="true" is a band-aid; fix the blob-loading OOM at the source.

largeHeap is not guaranteed to yield extra memory on every device and can mask underlying inefficiencies. Given that the OOM arises when returning image/blob data via JSON, prioritize:

  • Avoid base64/JSON for large binaries; return a file URI via FileProvider or a persisted content URI instead, and stream on the JS side.
  • If decoding images, downsample aggressively (inSampleSize, inPreferredConfig RGB_565), and stream with ContentResolver rather than loading whole arrays in memory.
  • If you must pass bytes, chunk/stream and avoid one-shot arrays (see the sketch below).
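
A minimal sketch of the chunked fallback, shown as a separate helper plugin; BlobChunksPlugin, readBlobChunk, resolveBlobFile and the 256 KiB default are all hypothetical names for illustration, not part of NbStorePlugin:

import android.util.Base64
import com.getcapacitor.JSObject
import com.getcapacitor.Plugin
import com.getcapacitor.PluginCall
import com.getcapacitor.PluginMethod
import com.getcapacitor.annotation.CapacitorPlugin
import java.io.File
import java.io.RandomAccessFile

// Hypothetical plugin that serves one chunk per call, so no single call
// materializes the whole blob in memory.
@CapacitorPlugin(name = "BlobChunks")
class BlobChunksPlugin : Plugin() {

  // Hypothetical mapping from a blob key to the file written at store time.
  private fun resolveBlobFile(key: String): File = File(context.cacheDir, "$key.bin")

  @PluginMethod
  fun readBlobChunk(call: PluginCall) {
    val key = call.getString("key") ?: return call.reject("missing key")
    val offset = (call.getDouble("offset") ?: 0.0).toLong()
    val length = call.getInt("length") ?: 256 * 1024  // illustrative default chunk size

    RandomAccessFile(resolveBlobFile(key), "r").use { raf ->
      raf.seek(offset)
      val remaining = (raf.length() - offset).coerceAtLeast(0)
      val buf = ByteArray(minOf(length.toLong(), remaining).toInt())
      raf.readFully(buf)
      call.resolve(JSObject().apply {
        put("data", Base64.encodeToString(buf, Base64.NO_WRAP))  // one small chunk only
        put("read", buf.size)
        put("eof", offset + buf.size >= raf.length())
      })
    }
  }
}

The JS caller would loop, advancing offset by read until eof is true, so each round trip carries one small Base64 chunk instead of the full blob.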

I recommend keeping this attribute only as a temporary mitigation while landing a plugin refactor.

Refactor sketch (Kotlin) for the plugin to return a URI instead of base64:

// Inside NbStorePlugin.getBlob(...)
val file = File(context.cacheDir, "${blob.key}.bin")
file.outputStream().use { it.write(blob.data) }
// Return a content URI via FileProvider declared in your manifest
val uri = FileProvider.getUriForFile(context, "${context.packageName}.fileprovider", file)
JSObject().apply {
  put("key", blob.key)
  put("uri", uri.toString())
  put("mimeType", blob.mimeType)
  put("size", blob.size)
  put("createdAt", blob.createdAt)
}.also { call.resolve(it) }
// On JS side, fetch via the URI/stream instead of reading a base64 field

Optional: log available heap to validate any effect of largeHeap and guide downsampling:

// In Application.onCreate
val am = getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
Log.i("MemClass", "memoryClass=${am.memoryClass}, largeMemoryClass=${am.largeMemoryClass}")

To locate the risky code paths now and confirm base64/JSON usage, run:


🏁 Script executed:

#!/bin/bash
# Find NbStorePlugin.getBlob and base64/JSON payload construction
rg -n --glob '!**/node_modules/**' $'class NbStorePlugin|fun getBlob\\(|Base64|encodeToString\\(|JSONObject\\(|put\\(\\s*"data"' -A 3 -B 3

Length of output: 32804


🏁 Script executed:

#!/bin/bash
rg -n 'getBlob' -A 3 -B 3 packages/frontend/apps/android/src/plugins/nbstore/index.ts

Length of output: 606


Refactor native blob handling instead of relying on largeHeap

Adding android:largeHeap="true" in AndroidManifest only masks the OOM when returning large base64 blobs via the Capacitor plugin. To eliminate the root cause, update your NbStorePlugin.getBlob implementation to stream files instead of inlining giant Base64 strings:

• In packages/frontend/apps/android/App/app/src/main/java/app/affine/pro/plugin/NbStorePlugin.kt (lines ~241–254), you currently do:

JSObject().apply {
  put("key", it.key)
  put("data", it.data)           // full Base64 payload
  put("mime", it.mime)
  put("size", it.size)
  put("createdAt", it.createdAt)
}

• Replace with a FileProvider-backed URI:

val file = File(context.cacheDir, "${blob.key}.bin")
file.outputStream().use { it.write(blob.data) }
val uri = FileProvider.getUriForFile(
  context,
  "${context.packageName}.fileprovider",
  file
)
JSObject().apply {
  put("key", blob.key)
  put("uri", uri.toString())    // stream on JS side
  put("mimeType", blob.mimeType)
  put("size", blob.size)
  put("createdAt", blob.createdAt)
}.also { call.resolve(it) }

• On the JS side, fetch via the URI and process streams instead of decoding one-shot Base64.

• For image blobs, also downsample aggressively with BitmapFactory.Options (e.g. inSampleSize, inPreferredConfig=RGB_565) and load via ContentResolver.openInputStream; a sketch follows.
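
A minimal downsampling sketch for that case, assuming the blob is reachable through the content URI returned above; decodeDownsampled and the 1024 px cap are illustrative names, not existing code:

import android.content.Context
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import android.net.Uri

// Decode a content URI into a downsampled RGB_565 bitmap.
// targetEdge caps the longer edge; tune it for the actual UI.
fun decodeDownsampled(context: Context, uri: Uri, targetEdge: Int = 1024): Bitmap? {
  val resolver = context.contentResolver

  // Pass 1: read only the image bounds, no pixel allocation.
  val bounds = BitmapFactory.Options().apply { inJustDecodeBounds = true }
  resolver.openInputStream(uri)?.use { BitmapFactory.decodeStream(it, null, bounds) }
  if (bounds.outWidth <= 0 || bounds.outHeight <= 0) return null

  // Largest power-of-two inSampleSize that keeps the longer edge >= targetEdge.
  var sample = 1
  while (maxOf(bounds.outWidth, bounds.outHeight) / (sample * 2) >= targetEdge) sample *= 2

  // Pass 2: decode for real with the sample size and a 2-byte-per-pixel config.
  val opts = BitmapFactory.Options().apply {
    inSampleSize = sample
    inPreferredConfig = Bitmap.Config.RGB_565
  }
  return resolver.openInputStream(uri)?.use { BitmapFactory.decodeStream(it, null, opts) }
}

RGB_565 halves per-pixel memory compared with ARGB_8888 but drops the alpha channel, so keep ARGB_8888 for blobs that need transparency.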

Optional heap logging to guide tuning:

val am = getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
Log.i("MemClass", "memoryClass=${am.memoryClass}, largeMemoryClass=${am.largeMemoryClass}")

To verify all Base64 paths in the Android plugin implementation:

rg -n -F 'put("data"' -g 'NbStorePlugin.kt' -A3 -B3
🤖 Prompt for AI Agents
In packages/frontend/apps/android/App/app/src/main/AndroidManifest.xml at line
18, remove the android:largeHeap="true" attribute as it only masks out-of-memory
issues caused by large Base64 blobs. Instead, update the NbStorePlugin.getBlob
method in
packages/frontend/apps/android/App/app/src/main/java/app/affine/pro/plugin/NbStorePlugin.kt
(around lines 241–254) to write the blob data to a file in the cache directory,
obtain a FileProvider URI for that file, and return this URI in the JSObject
instead of the full Base64 string. On the JavaScript side, modify the code to
fetch and process the blob data via the URI stream rather than decoding Base64
in one shot. For image blobs, implement downsampling using BitmapFactory.Options
and load images through ContentResolver.openInputStream to reduce memory usage.
Optionally, add heap logging to monitor memory class for tuning purposes.

android:theme="@style/AppTheme">

<activity