uni-app Error: APP-SERVICE-SDK:setStorageSync:fail exceed storage item max length
During development there was no problem; both WeChat DevTools and real-device testing were fine. After building the release, it still worked in WeChat DevTools, but the live version's pages froze. Opening real-device debugging in DevTools on the released build then surfaced this error, and it fires over and over again.
I have no clue at the moment. I commented out every uni.setStorageSync call in my code, but the error still keeps firing nonstop.
I tried switching back to a much earlier version of my code, since this problem never used to occur, but even that old code now throws the same error [definitely both in production and in real-device debugging].
My current suspicion is either that HBuilderX is doing something wrong when building the release, or that the WeChat base library has changed.
After testing several different base library versions, the error persists.
My HBuilderX version is 3.4.7.20220422.
WeChat base library versions tried: 2.13.2, 2.24.4, and 2.14.1.
I'm at a loss.
Has anyone run into the same problem?
I can't pin it down at the moment.
My current workaround is to publish directly from run mode.
Reply: Storage is probably full. Click the Storage tab (it may be sluggish) and see whether it can be cleared.
It's being triggered in an infinite loop; it just keeps looping.
The problem now is that I can't trace it and don't know what triggers it.
Without any code changes, the release build has become unusable.
Reply: Has the WeChat client been updated? If nothing else changed, the variable may be the amount of stored data: once it accumulates past a threshold, the problem surfaces.
Problem fixed: a navigation URL was carrying a rather large parameter, which triggered the error. Switching to a different way of passing the parameter solved it.
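The fix described above can be sketched as follows: instead of serializing a large object into the navigation URL, park the payload elsewhere and pass only a tiny flag. This is a sketch with illustrative names; `globalData` below is a plain object standing in for `getApp().globalData` in a real uni-app page, and `uni.navigateTo` would be called with the returned URL.

```javascript
// Pass a small URL and keep the large payload out of the query string.
// `globalData` stands in for getApp().globalData in a real uni-app project.
const globalData = {};

function buildNavUrl(basePath, payload) {
  globalData.navPayload = payload;      // park the big object off the URL
  return `${basePath}?hasPayload=1`;    // the URL itself stays tiny
}

function readNavPayload() {
  const payload = globalData.navPayload;
  delete globalData.navPayload;         // one-shot handoff, then clean up
  return payload;
}

const url = buildNavUrl('/pages/detail/detail', { big: 'x'.repeat(5000) });
console.log(url.length < 100);          // true: URL no longer carries the payload
console.log(readNavPayload().big.length); // 5000
```

In the target page's `onLoad`, the payload would be read back (and cleared) instead of being parsed out of the query string.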
The error `APP-SERVICE-SDK:setStorageSync:fail exceed storage item max length` in uni-app indicates that the data you are trying to store with `uni.setStorageSync` exceeds the maximum allowed length for a single storage item.
Explanation:
In uni-app, `uni.setStorageSync` is used to synchronously store data in local storage. However, there is a limit to the size of data that can be stored under a single key. The exact limit varies by platform (e.g., H5, WeChat Mini Program), but it is generally around 1MB per key.
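One way to avoid the error proactively is to check the serialized size before writing. A minimal sketch, assuming the commonly cited ~1MB-per-key WeChat Mini Program limit (the exact figure may differ per platform; `canStoreSync` is an illustrative helper, not a uni-app API):

```javascript
// Rough pre-write guard: storage serializes values, so measure the
// UTF-8 byte length of the serialized form before calling setStorageSync.
const MAX_ITEM_BYTES = 1024 * 1024; // ~1MB per key (platform-dependent)

function canStoreSync(value) {
  const serialized = typeof value === 'string' ? value : JSON.stringify(value);
  return new TextEncoder().encode(serialized).length <= MAX_ITEM_BYTES;
}

console.log(canStoreSync({ a: 1 }));                    // true: small object fits
console.log(canStoreSync('x'.repeat(2 * 1024 * 1024))); // false: exceeds the limit
```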
Solution:
To resolve this issue, you can try the following approaches:
- Split the Data:
  - If the data you are trying to store is large, consider splitting it into smaller chunks and storing them under different keys.
  - For example, instead of storing a large JSON object under a single key, serialize it to a string first and store it in several parts:

```javascript
const largeData = { /* your large data object */ };
const serialized = JSON.stringify(largeData); // objects must be serialized before slicing
const chunkSize = 500 * 1024; // 500KB per chunk

for (let i = 0, index = 0; i < serialized.length; i += chunkSize, index++) {
  uni.setStorageSync(`data_chunk_${index}`, serialized.slice(i, i + chunkSize));
}
```
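Reading the value back requires reassembling the chunks in order. A minimal sketch of the round trip, using a plain object in place of uni storage so the logic is self-contained; in the app you would swap the object accesses for `uni.setStorageSync` / `uni.getStorageSync`:

```javascript
// Split a serialized value into fixed-size chunks and reassemble them.
// `storage` stands in for uni storage here.
function saveChunks(storage, prefix, serialized, chunkSize) {
  let count = 0;
  for (let i = 0; i < serialized.length; i += chunkSize, count++) {
    storage[`${prefix}_${count}`] = serialized.slice(i, i + chunkSize);
  }
  storage[`${prefix}_count`] = count; // remember how many chunks exist
}

function loadChunks(storage, prefix) {
  const count = storage[`${prefix}_count`];
  let out = '';
  for (let i = 0; i < count; i++) out += storage[`${prefix}_${i}`];
  return out;
}

const storage = {};
const original = JSON.stringify({ items: Array.from({ length: 10 }, (_, i) => i) });
saveChunks(storage, 'data_chunk', original, 16);
console.log(loadChunks(storage, 'data_chunk') === original); // true
```

Storing the chunk count alongside the chunks avoids guessing how many keys to read back.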
- Compress the Data:
  - If the data is compressible (e.g., JSON, text), you can compress it before storing and decompress it when retrieving it.
  - Libraries like `pako` can be used for compression.

```javascript
import pako from 'pako';

const data = { /* your large data object */ };

// pako.deflate returns a Uint8Array; converting it to a plain array
// keeps it JSON-serializable for storage.
const compressedData = Array.from(pako.deflate(JSON.stringify(data)));
uni.setStorageSync('compressed_data', compressedData);

// To retrieve and decompress
const retrievedData = uni.getStorageSync('compressed_data');
const decompressedData = pako.inflate(new Uint8Array(retrievedData), { to: 'string' });
const originalData = JSON.parse(decompressedData);
```
- Use IndexedDB (for H5):
  - If you are working on the H5 platform, consider using IndexedDB for storing large amounts of data. IndexedDB has a much higher storage limit than `localStorage`.

```javascript
const largeData = { /* your large data object */ };

const request = indexedDB.open('MyDatabase', 1);

request.onupgradeneeded = function (event) {
  const db = event.target.result;
  db.createObjectStore('MyStore', { keyPath: 'id' });
};

request.onsuccess = function (event) {
  const db = event.target.result;
  const transaction = db.transaction('MyStore', 'readwrite');
  transaction.objectStore('MyStore').put({ id: 1, data: largeData });
};
```