Adding Cloudflare to Demos section #6
## Cloudflare R2

Due to R2's S3 compatibility, the main NodeJS module for Cloudflare R2 is
actually the AWS S3 SDK `@aws-sdk/client-s3`[^8].

The [SheetJS NodeJS module](/docs/getting-started/installation/nodejs) can be
required in NodeJS scripts.
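
For example, a script in this demo could load the SheetJS module alongside the
R2 client as follows (a minimal sketch using ESM imports; CommonJS `require`
works as well, and the `xlsx` specifier assumes the module was installed from
the SheetJS CDN tarball):

```ts
import * as XLSX from "xlsx";
import { S3Client, GetObjectCommand, PutObjectCommand } from "@aws-sdk/client-s3";
```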
:::note pass

At the time of writing, Cloudflare R2 was not 100% compatible with the S3 API.
Refer to the ["S3 API compatibility" section of the official Cloudflare R2 docs](https://developers.cloudflare.com/r2/api/s3/api/)
to see which S3 features are and are not supported.

:::

### Connecting to R2

The `@aws-sdk/client-s3` module exports a class `S3Client` that performs the
connection. The constructor expects an options object that includes a region,
credentials and, in the case of R2, an endpoint. The resulting client is used
to interact with R2.

```ts
import { S3Client } from "@aws-sdk/client-s3";

/* credentials */
const accessKeyId = "...", secretAccessKey = "...";

/* file location */
const Bucket = "...", Key = "pres.numbers";

/* connect to the R2 account */
const s3 = new S3Client({
  /* Cloudflare R2 uses the "auto" region */
  region: "auto",
  credentials: {
    accessKeyId,
    secretAccessKey
  },
  /* the Cloudflare dashboard may show an endpoint that includes the bucket name; */
  /* the endpoint here must end with `.r2.cloudflarestorage.com`, not `.r2.cloudflarestorage.com/bucket-name` */
  endpoint: "R2-ENDPOINT"
});
```
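
The R2 endpoint is derived from the Cloudflare account ID. The following sketch
assumes the account ID is supplied through a hypothetical `R2_ACCOUNT_ID`
environment variable (not part of the original example):

```ts
/* hypothetical helper: build the R2 endpoint from an account ID */
const accountId = process.env.R2_ACCOUNT_ID; // assumed environment variable
const endpoint = `https://${accountId}.r2.cloudflarestorage.com`;
```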
### Downloading Data

#### Fetching Files from R2

Once the client is instantiated, commands are passed to its `send` method to
interact with R2. `GetObjectCommand` fetches an object from the bucket. The
response body can be transformed to a byte array, which the SheetJS `read`
method can parse into a workbook object:

```ts
import { GetObjectCommand, S3Client } from "@aws-sdk/client-s3";
import * as XLSX from "xlsx";

/* ... client instantiation code from the previous section ... */

/* create the command (GetObjectCommand fetches an item from R2) */
const command = new GetObjectCommand({
  Bucket: 'bucket-name',
  Key: 'key'
});

/* send the command */
const data = await s3.send(command);

/* transform the response body to a byte array */
const bytes = await data.Body?.transformToByteArray();

/* parse with SheetJS and generate CSV from the first worksheet */
const wb = XLSX.read(bytes);
const first_ws = wb.Sheets[wb.SheetNames[0]];
const csv = XLSX.utils.sheet_to_csv(first_ws);
```
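
If the requested object does not exist, `send` rejects with an error (the S3
API reports `NoSuchKey`; R2 is assumed to behave the same way). A guarded
version of the download might look like this sketch:

```ts
try {
  const data = await s3.send(new GetObjectCommand({ Bucket, Key }));
  const bytes = await data.Body?.transformToByteArray();
  const wb = XLSX.read(bytes);
  console.log(XLSX.utils.sheet_to_csv(wb.Sheets[wb.SheetNames[0]]));
} catch (e) {
  /* the object was missing or the request failed */
  console.error("Download failed:", (e as Error).name);
}
```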
### Uploading Data

The SheetJS `write` method[^9] with the option `type: "buffer"` will generate
NodeJS Buffers. R2 directly accepts these Buffer objects.

This example creates a sample workbook object, generates XLSX file data in a
NodeJS Buffer, and uploads the data to R2. Here `PutObjectCommand` carries the
data generated by `write` into the R2 bucket:

```ts
import { PutObjectCommand, S3Client } from "@aws-sdk/client-s3";
import * as XLSX from "xlsx";

/* ... client instantiation code from the previous section ... */

/* generate sample workbook */
const wb = XLSX.read("S,h,e,e,t,J,S\n5,4,3,3,7,9,5", {type: "binary"});

/* write to XLSX file in a NodeJS Buffer */
const Body = XLSX.write(wb, {type: "buffer", bookType: "xlsx"});

/* upload buffer */
const command = new PutObjectCommand({
  Bucket: 'bucket-name',
  Key: 'key',
  Body
});
await s3.send(command);
```
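
To confirm that the upload succeeded, the bucket contents can be listed with
`ListObjectsV2Command`. This verification step is a sketch and is not part of
the original flow:

```ts
import { ListObjectsV2Command } from "@aws-sdk/client-s3";

/* list object keys and sizes in the bucket */
const listing = await s3.send(new ListObjectsV2Command({ Bucket: 'bucket-name' }));
for (const obj of listing.Contents ?? []) console.log(obj.Key, obj.Size);
```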

### R2 Demo

:::note pass

At the time of writing, the Cloudflare Free Tier included 10GB of Cloudflare R2
storage with 1 million Class A operations and 10 million Class B operations
per month.

Please visit the [official R2 pricing page](https://developers.cloudflare.com/r2/pricing/)
for more info.

:::

This sample fetches a buffer from R2 and parses the workbook.

0) If you do not have an account, create a new Cloudflare free tier account[^7].

#### Create R2 Bucket

The following steps outline the setup in the Cloudflare dashboard. The exact
labels may change as the dashboard evolves.

1) Sign into the [Cloudflare dashboard](https://dash.cloudflare.com/) and open
the "R2" section.

2) Click "Create bucket", type a memorable "Bucket Name" ("sheetjsbouquet" in
this example) and click "Create bucket" to create the bucket.

#### Generate R2 API Token

3) From the R2 overview, open the R2 API token management page ("Manage R2 API
Tokens") and create a new API token. Give the token a permission that allows
reading and writing objects ("Object Read & Write").

4) Note the generated credentials and the account endpoint:

- "Access Key ID" (`accessKeyId` in the S3 API)
- "Secret Access Key" (`secretAccessKey` in the S3 API)
- the S3 endpoint, which has the form `https://<ACCOUNT_ID>.r2.cloudflarestorage.com`

The secret access key is typically displayed only once. Store it in a safe place.

#### Set up Project

5) Create a new NodeJS project:

```bash
mkdir SheetJSR2
cd SheetJSR2
npm init -y
```

6) Install dependencies:

<CodeBlock language="bash">{`\
mkdir -p node_modules
npm i --save https://cdn.sheetjs.com/xlsx-${current}/xlsx-${current}.tgz @aws-sdk/client-s3`}
</CodeBlock>
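
Before writing the test scripts, the token and endpoint can be sanity-checked.
The following sketch (not part of the original demo) uses `HeadBucketCommand`,
which rejects if the bucket is unreachable with the given credentials; replace
the placeholder values first:

```ts
import { S3Client, HeadBucketCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({
  region: "auto",
  credentials: { accessKeyId: "<ACCESS KEY ID>", secretAccessKey: "<SECRET ACCESS KEY>" },
  endpoint: "<R2 ENDPOINT>"
});

/* rejects if the bucket cannot be reached or the credentials are refused */
await s3.send(new HeadBucketCommand({ Bucket: "<BUCKET NAME>" }));
console.log("bucket is reachable");
```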
#### Write Test

:::note pass

This sample creates a simple workbook, generates a NodeJS buffer, and uploads
the buffer to R2.

```
   | A | B | C | D | E | F | G |
---+---|---|---|---|---|---|---|
 1 | S | h | e | e | t | J | S |
 2 | 5 | 4 | 3 | 3 | 7 | 9 | 5 |
```

:::

7) Save the following script to `SheetJSWriteToR2.js`:

```js title="SheetJSWriteToR2.js"
var XLSX = require("xlsx");
var { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");

/* replace these constants */
// highlight-start
var accessKeyId = "<REPLACE WITH ACCESS KEY ID>";
var secretAccessKey = "<REPLACE WITH SECRET ACCESS KEY>";
var endpoint = "<REPLACE WITH R2 ENDPOINT>";
var Bucket = "<REPLACE WITH BUCKET NAME>";
// highlight-end

var Key = "test.xlsx";

/* Create a simple workbook and write XLSX to buffer */
var ws = XLSX.utils.aoa_to_sheet(["SheetJS".split(""), [5,4,3,3,7,9,5]]);
var wb = XLSX.utils.book_new(); XLSX.utils.book_append_sheet(wb, ws, "Sheet1");
var Body = XLSX.write(wb, {type: "buffer", bookType: "xlsx"});

/* upload buffer */
(async function() {
  var s3 = new S3Client({
    region: "auto",
    credentials: { accessKeyId: accessKeyId, secretAccessKey: secretAccessKey },
    endpoint: endpoint
  });
  await s3.send(new PutObjectCommand({ Bucket: Bucket, Key: Key, Body: Body }));
  console.log("Uploaded " + Key + " to bucket " + Bucket);
})();
```

8) Edit `SheetJSWriteToR2.js` and replace the highlighted lines:

- `accessKeyId`: access key ID for the R2 API token
- `secretAccessKey`: secret access key for the R2 API token
- `endpoint`: S3 endpoint for the account
- `Bucket`: name of the bucket

The credentials and endpoint are found in step 4. The Bucket is the name from step 2.

9) Run the script:

```bash
node SheetJSWriteToR2.js
```

This file will be stored with the object name `test.xlsx`. It can be manually
downloaded from the R2 section of the Cloudflare dashboard.
#### Read Test

This sample will download and process the test file from "Write Test".

10) Save the following script to `SheetJSReadFromR2.js`:

```js title="SheetJSReadFromR2.js"
var XLSX = require("xlsx");
var { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

/* replace these constants */
// highlight-start
var accessKeyId = "<REPLACE WITH ACCESS KEY ID>";
var secretAccessKey = "<REPLACE WITH SECRET ACCESS KEY>";
var endpoint = "<REPLACE WITH R2 ENDPOINT>";
var Bucket = "<REPLACE WITH BUCKET NAME>";
// highlight-end

var Key = "test.xlsx";

(async function() {
  /* connect to R2 and fetch the object */
  var s3 = new S3Client({
    region: "auto",
    credentials: { accessKeyId: accessKeyId, secretAccessKey: secretAccessKey },
    endpoint: endpoint
  });
  var data = await s3.send(new GetObjectCommand({ Bucket: Bucket, Key: Key }));

  /* collect data into a byte array and parse */
  var bytes = await data.Body.transformToByteArray();
  var wb = XLSX.read(bytes);
  console.log(XLSX.utils.sheet_to_csv(wb.Sheets[wb.SheetNames[0]]));
})();
```

11) Edit `SheetJSReadFromR2.js` and replace the highlighted lines:

- `accessKeyId`: access key ID for the R2 API token
- `secretAccessKey`: secret access key for the R2 API token
- `endpoint`: S3 endpoint for the account
- `Bucket`: name of the bucket

The credentials and endpoint are found in step 4. The Bucket is the name from step 2.

12) Run the script:

```bash
node SheetJSReadFromR2.js
```

The program will display the data in CSV format.

```
S,h,e,e,t,J,S
5,4,3,3,7,9,5
```
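
The same worksheet can also be converted to an array of rows with the SheetJS
`sheet_to_json` utility. A brief sketch, intended to run inside the read
script's async function where `wb` is defined:

```ts
/* convert the first worksheet to an array of arrays */
const rows = XLSX.utils.sheet_to_json(wb.Sheets[wb.SheetNames[0]], { header: 1 });
console.log(rows); // e.g. [ [ "S", "h", "e", "e", "t", "J", "S" ], [ 5, 4, 3, 3, 7, 9, 5 ] ]
```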

[^1]: See ["Node.js compatibility"](https://developers.cloudflare.com/workers/runtime-apis/nodejs/) in the Cloudflare documentation
[^2]: See ["Get started guide"](https://developers.cloudflare.com/workers/get-started/guide/#1-create-a-new-worker-project)
[^3]: [Wrangler documentation](https://developers.cloudflare.com/workers/wrangler/)
[^5]: See ["Workbook Object" in "SheetJS Data Model"](/docs/csf/book) for more details.
[^6]: See [`sheet_to_csv` in "CSV and Text"](/docs/api/utilities/csv#delimiter-separated-output)
[^7]: Registering for a free account [on the Cloudflare Free Tier](https://dash.cloudflare.com/sign-up).
[^8]: The `@aws-sdk/client-s3` module is distributed [on the public NPM registry](https://www.npmjs.com/package/@aws-sdk/client-s3)
[^9]: See [`write` in "Writing Files"](/docs/api/write-options)