---
title: Large Datasets
pagination_prev: demos/extensions/index
pagination_next: demos/engines/index
---

import current from '/version.js';
import CodeBlock from '@theme/CodeBlock';
For maximal compatibility, the library reads entire files into memory and
generates output files in memory. Browsers and other JS engines enforce tight
memory limits, and very large workbooks can exceed them. For these cases, the
library offers strategies to optimize for memory or space by using
platform-specific APIs.
## Dense Mode

`read`, `readFile` and `aoa_to_sheet` accept the `dense` option. When enabled,
the methods create worksheet objects that store cells in arrays of arrays:

```js
var dense_wb = XLSX.read(ab, {dense: true});

var dense_sheet = XLSX.utils.aoa_to_sheet(aoa, {dense: true});
```
<details>
  <summary><b>Historical Note</b> (click to show)</summary>

The earliest versions of the library aimed for IE6+ compatibility. In early
testing, both in Chrome 26 and in IE6, the most efficient worksheet storage for
small sheets was a large object whose keys were cell addresses.

Over time, V8 (the engine behind Chrome and NodeJS) evolved in a way that made
the array of arrays approach more efficient but reduced the performance of the
large object approach.

In the interest of preserving backwards compatibility, the library makes the
array of arrays approach available behind the special `dense` option.

</details>
The various API functions will seamlessly handle dense and sparse worksheets.
## Streaming Write

The streaming write functions are available in the `XLSX.stream` object. They
take the same arguments as the normal write functions:

- `XLSX.stream.to_csv` is the streaming version of `XLSX.utils.sheet_to_csv`.
- `XLSX.stream.to_html` is the streaming version of `XLSX.utils.sheet_to_html`.
- `XLSX.stream.to_json` is the streaming version of `XLSX.utils.sheet_to_json`.
"Stream" refers to the NodeJS push streams API.
<details>
  <summary><b>Historical Note</b> (click to show)</summary>

NodeJS push streams were introduced in 2012. The text streaming methods
`to_csv` and `to_html` are supported in NodeJS v0.10 and later while the object
streaming method `to_json` is supported in NodeJS v0.12 and later.

The first streaming write function, `to_csv`, was introduced in April 2017. It
used and still uses the same NodeJS streaming API.

Years later, browser vendors are settling on a different stream API. For
maximal compatibility, the library uses NodeJS push streams.

</details>
### NodeJS

In a CommonJS context, NodeJS Streams and `fs` immediately work with SheetJS:

```js
const XLSX = require("xlsx"); // "just works"
```
:::warning ECMAScript Module Machinations

In NodeJS ESM, the dependency must be loaded manually:

```js
import * as XLSX from 'xlsx';
import { Readable } from 'stream';

XLSX.stream.set_readable(Readable); // manually load stream helpers
```

Additionally, for file-related operations in NodeJS ESM, `fs` must be loaded:

```js
import * as XLSX from 'xlsx';
import * as fs from 'fs';

XLSX.set_fs(fs); // manually load fs helpers
```

It is strongly encouraged to use CommonJS in NodeJS whenever possible.

:::
#### `XLSX.stream.to_csv`

This example reads the file passed as an argument to the script, pulls the
first worksheet, converts it to CSV and writes to `SheetJSNodeJStream.csv`:

```js
var XLSX = require("xlsx"), fs = require("fs");

var wb = XLSX.readFile(process.argv[2]);
var ws = wb.Sheets[wb.SheetNames[0]];
var ostream = fs.createWriteStream("SheetJSNodeJStream.csv");
// highlight-next-line
XLSX.stream.to_csv(ws).pipe(ostream);
```
#### `XLSX.stream.to_json`

`XLSX.stream.to_json` uses Object-mode streams. A `Transform` stream can be
used to generate a normal text stream for streaming to a file or the screen:

```js
var XLSX = require("xlsx"), Transform = require("stream").Transform;

var wb = XLSX.readFile(process.argv[2], {dense: true});
var ws = wb.Sheets[wb.SheetNames[0]];

/* this Transform stream converts JS objects to text */
var conv = new Transform({writableObjectMode:true});
conv._transform = function(obj, e, cb){ cb(null, JSON.stringify(obj) + "\n"); };

/* pipe `to_json` -> transformer -> standard output */
// highlight-next-line
XLSX.stream.to_json(ws, {raw: true}).pipe(conv).pipe(process.stdout);
```
#### Demo
:::note
This demo was last tested in the following deployments:
| Node Version | Date       | Node Status when tested |
|:-------------|:-----------|:------------------------|
| `0.12.18`    | 2023-09-02 | End-of-Life             |
| `4.9.1`      | 2023-09-02 | End-of-Life             |
| `6.17.1`     | 2023-09-02 | End-of-Life             |
| `8.17.0`     | 2023-09-02 | End-of-Life             |
| `10.24.1`    | 2023-09-02 | End-of-Life             |
| `12.22.12`   | 2023-09-02 | End-of-Life             |
| `14.21.3`    | 2023-09-02 | End-of-Life             |
| `16.20.0`    | 2023-09-02 | Maintenance LTS         |
| `18.17.1`    | 2023-09-02 | Active LTS              |
| `20.5.1`    | 2023-09-02 | Current                 |
While streaming methods work in End-of-Life versions of NodeJS, production deployments should upgrade to a Current or LTS version of NodeJS.
:::
- Install the NodeJS module:

<CodeBlock language="bash">{`\
npm i --save https://cdn.sheetjs.com/xlsx-${current}/xlsx-${current}.tgz`}
</CodeBlock>

- Download [`SheetJSNodeJStream.js`](https://docs.sheetjs.com/stream/SheetJSNodeJStream.js):

```bash
curl -LO https://docs.sheetjs.com/stream/SheetJSNodeJStream.js
```

- Download the test file:

```bash
curl -LO https://sheetjs.com/pres.xlsx
```

- Run the script:

```bash
node SheetJSNodeJStream.js pres.xlsx
```
<details>
  <summary><b>Expected Output</b> (click to show)</summary>

The console will display a list of objects:

```
{"Name":"Bill Clinton","Index":42}
{"Name":"GeorgeW Bush","Index":43}
{"Name":"Barack Obama","Index":44}
{"Name":"Donald Trump","Index":45}
{"Name":"Joseph Biden","Index":46}
```

The script will also generate `SheetJSNodeJStream.csv`:

```csv
Name,Index
Bill Clinton,42
GeorgeW Bush,43
Barack Obama,44
Donald Trump,45
Joseph Biden,46
```

</details>
### Browser
:::note
The live demo was last tested on 2023-09-02 in Chromium 116.
:::
NodeJS streaming APIs are not available in the browser. The following function
supplies a pseudo stream object compatible with the `to_csv` function:

```js
function sheet_to_csv_cb(ws, cb, opts, batch = 1000) {
  XLSX.stream.set_readable(() => ({
    __done: false,
    // this function will be assigned by the SheetJS stream methods
    _read: function() { this.__done = true; },
    // this function is called by the stream methods
    push: function(d) { if(!this.__done) cb(d); if(d == null) this.__done = true; },
    resume: function pump() { for(var i = 0; i < batch && !this.__done; ++i) this._read(); if(!this.__done) setTimeout(pump.bind(this), 0); }
  }));
  return XLSX.stream.to_csv(ws, opts);
}

// assuming `workbook` is a workbook, stream the first sheet
const ws = workbook.Sheets[workbook.SheetNames[0]];
const strm = sheet_to_csv_cb(ws, (csv)=>{ if(csv != null) console.log(csv); });
strm.resume();
```
#### Web Workers
For processing large files in the browser, it is strongly encouraged to use Web Workers. The Worker demo includes examples using the File System Access API.
<details>
  <summary><b>Web Worker Details</b> (click to show)</summary>

Typically, the file and stream processing occurs in the Web Worker. CSV rows
can be sent back to the main thread in the callback:

<CodeBlock language="js">{`\
/* load standalone script from CDN */
importScripts("https://cdn.sheetjs.com/xlsx-${current}/package/dist/xlsx.full.min.js");
\n\
function sheet_to_csv_cb(ws, cb, opts, batch = 1000) {
  XLSX.stream.set_readable(() => ({
    __done: false,
    // this function will be assigned by the SheetJS stream methods
    _read: function() { this.__done = true; },
    // this function is called by the stream methods
    push: function(d) { if(!this.__done) cb(d); if(d == null) this.__done = true; },
    resume: function pump() { for(var i = 0; i < batch && !this.__done; ++i) this._read(); if(!this.__done) setTimeout(pump.bind(this), 0); }
  }));
  return XLSX.stream.to_csv(ws, opts);
}
\n\
/* this callback will run once the main context sends a message */
self.addEventListener('message', async(e) => {
  try {
    postMessage({state: "fetching " + e.data.url});
    /* Fetch file */
    const res = await fetch(e.data.url);
    const ab = await res.arrayBuffer();
\n\
    /* Parse file */
    postMessage({state: "parsing"});
    const wb = XLSX.read(ab, {dense: true});
    const ws = wb.Sheets[wb.SheetNames[0]];
\n\
    /* Generate CSV rows */
    postMessage({state: "csv"});
    const strm = sheet_to_csv_cb(ws, (csv) => {
      if(csv != null) postMessage({csv});
      else postMessage({state: "done"});
    });
    strm.resume();
  } catch(e) {
    /* Pass the error message back */
    postMessage({error: String(e.message || e) });
  }
}, false);`}
</CodeBlock>

The main thread will receive messages with CSV rows for further processing:

```js
worker.onmessage = function(e) {
  if(e.data.error) { console.error(e.data.error); /* show an error message */ }
  else if(e.data.state) { console.info(e.data.state); /* current state */ }
  else {
    /* e.data.csv is the row generated by the stream */
    console.log(e.data.csv);
  }
};
```

</details>
#### Live Demo
The following live demo fetches and parses a file in a Web Worker. The `to_csv`
streaming function generates CSV rows, which are passed back to the main thread
for further processing.
:::note pass
For Chromium browsers, the File System Access API provides a modern worker-only approach. The Web Workers demo includes a live example of CSV streaming write.
:::
The demo has a URL input box. Feel free to change the URL. For example:

- `https://raw.githubusercontent.com/SheetJS/test_files/master/large_strings.xls`
  is an XLS file over 50 MB
- `https://raw.githubusercontent.com/SheetJS/libreoffice_test-files/master/calc/xlsx-import/perf/8-by-300000-cells.xlsx`
  is an XLSX file with 300000 rows (approximately 20 MB)
<CodeBlock language="jsx" live>{`\
function SheetJSFetchCSVStreamWorker() {
  const [__html, setHTML] = React.useState("");
  const [state, setState] = React.useState("");
  const [cnt, setCnt] = React.useState(0);
  const [url, setUrl] = React.useState("https://docs.sheetjs.com/test_files/large_strings.xlsx");
\n\
  return ( <>
    <b>URL: </b><input type="text" value={url} onChange={(e) => setUrl(e.target.value)} size="80"/>
    <button onClick={() => {
      /* this mantra embeds the worker source in the function */
      const worker = new Worker(URL.createObjectURL(new Blob([\`\
/* load standalone script from CDN */
importScripts("https://cdn.sheetjs.com/xlsx-${current}/package/dist/xlsx.full.min.js");
\n\
function sheet_to_csv_cb(ws, cb, opts, batch = 1000) {
  XLSX.stream.set_readable(() => ({
    __done: false,
    // this function will be assigned by the SheetJS stream methods
    _read: function() { this.__done = true; },
    // this function is called by the stream methods
    push: function(d) { if(!this.__done) cb(d); if(d == null) this.__done = true; },
    resume: function pump() { for(var i = 0; i < batch && !this.__done; ++i) this._read(); if(!this.__done) setTimeout(pump.bind(this), 0); }
  }));
  return XLSX.stream.to_csv(ws, opts);
}
\n\
/* this callback will run once the main context sends a message */
self.addEventListener('message', async(e) => {
  try {
    postMessage({state: "fetching " + e.data.url});
    /* Fetch file */
    const res = await fetch(e.data.url);
    const ab = await res.arrayBuffer();
\n\
    /* Parse file */
    let len = ab.byteLength;
    if(len < 1024) len += " bytes"; else { len /= 1024;
      if(len < 1024) len += " KB"; else { len /= 1024; len += " MB"; }
    }
    postMessage({state: "parsing " + len});
    const wb = XLSX.read(ab, {dense: true});
    const ws = wb.Sheets[wb.SheetNames[0]];
\n\
    /* Generate CSV rows */
    postMessage({state: "csv"});
    const strm = sheet_to_csv_cb(ws, (csv) => {
      if(csv != null) postMessage({csv});
      else postMessage({state: "done"});
    });
    strm.resume();
  } catch(e) {
    /* Pass the error message back */
    postMessage({error: String(e.message || e) });
  }
}, false);
\`])));
      /* when the worker sends back data, add it to the DOM */
      worker.onmessage = function(e) {
        if(e.data.error) return setHTML(e.data.error);
        else if(e.data.state) return setState(e.data.state);
        setHTML(e.data.csv);
        setCnt(cnt => cnt+1);
      };
      setCnt(0); setState("");
      /* post a message to the worker with the URL to fetch */
      worker.postMessage({url});
    }}>Click to Start</button>
    <br/><b>State: </b>{state}<br/>
    <b>Number of rows: </b>{cnt}
    <pre dangerouslySetInnerHTML={{ __html }}/>
  </> );
}`}
</CodeBlock>
### Deno
Deno does not support NodeJS streams in normal execution, so a wrapper is used:
<CodeBlock language="ts">{`\
// @deno-types="https://cdn.sheetjs.com/xlsx-${current}/package/types/index.d.ts"
import { stream } from 'https://cdn.sheetjs.com/xlsx-${current}/package/xlsx.mjs';
\n\
/* Callback invoked on each row (string) and at the end (null) */
const csv_cb = (d:string|null) => {
  if(d == null) return;
  /* The strings include line endings, so raw write ops should be used */
  Deno.stdout.write(new TextEncoder().encode(d));
};
\n\
/* Prepare \`Readable\` function */
const Readable = () => ({
  __done: false,
  // this function will be assigned by the SheetJS stream methods
  _read: function() { this.__done = true; },
  // this function is called by the stream methods
  push: function(d: any) {
    if(!this.__done) csv_cb(d);
    if(d == null) this.__done = true;
  },
  resume: function pump() {
    for(var i = 0; i < 1000 && !this.__done; ++i) this._read();
    if(!this.__done) setTimeout(pump.bind(this), 0);
  }
});
\n\
/* Wire up */
stream.set_readable(Readable);
\n\
/* assuming \`workbook\` is a workbook, stream the first sheet */
const ws = workbook.Sheets[workbook.SheetNames[0]];
stream.to_csv(ws).resume();`}
</CodeBlock>
:::note
This demo was last tested on 2023-09-02 against Deno 1.36.4
:::
[`SheetJSDenoStream.ts`](https://docs.sheetjs.com/stream/SheetJSDenoStream.ts)
is a small example script that downloads https://sheetjs.com/pres.numbers and
prints CSV row objects.

- Run:

```bash
deno run -A https://docs.sheetjs.com/stream/SheetJSDenoStream.ts
```