fix: 修复配额说明重复和undefined问题
- 在editStorageForm中初始化oss_storage_quota_value和oss_quota_unit - 删除重复的旧配额说明块,保留新的当前配额设置显示 Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
This commit is contained in:
201
backend/node_modules/@smithy/util-stream/LICENSE
generated
vendored
Normal file
201
backend/node_modules/@smithy/util-stream/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,201 @@
|
||||
Apache License
|
||||
Version 2.0, January 2004
|
||||
http://www.apache.org/licenses/
|
||||
|
||||
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
|
||||
|
||||
1. Definitions.
|
||||
|
||||
"License" shall mean the terms and conditions for use, reproduction,
|
||||
and distribution as defined by Sections 1 through 9 of this document.
|
||||
|
||||
"Licensor" shall mean the copyright owner or entity authorized by
|
||||
the copyright owner that is granting the License.
|
||||
|
||||
"Legal Entity" shall mean the union of the acting entity and all
|
||||
other entities that control, are controlled by, or are under common
|
||||
control with that entity. For the purposes of this definition,
|
||||
"control" means (i) the power, direct or indirect, to cause the
|
||||
direction or management of such entity, whether by contract or
|
||||
otherwise, or (ii) ownership of fifty percent (50%) or more of the
|
||||
outstanding shares, or (iii) beneficial ownership of such entity.
|
||||
|
||||
"You" (or "Your") shall mean an individual or Legal Entity
|
||||
exercising permissions granted by this License.
|
||||
|
||||
"Source" form shall mean the preferred form for making modifications,
|
||||
including but not limited to software source code, documentation
|
||||
source, and configuration files.
|
||||
|
||||
"Object" form shall mean any form resulting from mechanical
|
||||
transformation or translation of a Source form, including but
|
||||
not limited to compiled object code, generated documentation,
|
||||
and conversions to other media types.
|
||||
|
||||
"Work" shall mean the work of authorship, whether in Source or
|
||||
Object form, made available under the License, as indicated by a
|
||||
copyright notice that is included in or attached to the work
|
||||
(an example is provided in the Appendix below).
|
||||
|
||||
"Derivative Works" shall mean any work, whether in Source or Object
|
||||
form, that is based on (or derived from) the Work and for which the
|
||||
editorial revisions, annotations, elaborations, or other modifications
|
||||
represent, as a whole, an original work of authorship. For the purposes
|
||||
of this License, Derivative Works shall not include works that remain
|
||||
separable from, or merely link (or bind by name) to the interfaces of,
|
||||
the Work and Derivative Works thereof.
|
||||
|
||||
"Contribution" shall mean any work of authorship, including
|
||||
the original version of the Work and any modifications or additions
|
||||
to that Work or Derivative Works thereof, that is intentionally
|
||||
submitted to Licensor for inclusion in the Work by the copyright owner
|
||||
or by an individual or Legal Entity authorized to submit on behalf of
|
||||
the copyright owner. For the purposes of this definition, "submitted"
|
||||
means any form of electronic, verbal, or written communication sent
|
||||
to the Licensor or its representatives, including but not limited to
|
||||
communication on electronic mailing lists, source code control systems,
|
||||
and issue tracking systems that are managed by, or on behalf of, the
|
||||
Licensor for the purpose of discussing and improving the Work, but
|
||||
excluding communication that is conspicuously marked or otherwise
|
||||
designated in writing by the copyright owner as "Not a Contribution."
|
||||
|
||||
"Contributor" shall mean Licensor and any individual or Legal Entity
|
||||
on behalf of whom a Contribution has been received by Licensor and
|
||||
subsequently incorporated within the Work.
|
||||
|
||||
2. Grant of Copyright License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
copyright license to reproduce, prepare Derivative Works of,
|
||||
publicly display, publicly perform, sublicense, and distribute the
|
||||
Work and such Derivative Works in Source or Object form.
|
||||
|
||||
3. Grant of Patent License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
(except as stated in this section) patent license to make, have made,
|
||||
use, offer to sell, sell, import, and otherwise transfer the Work,
|
||||
where such license applies only to those patent claims licensable
|
||||
by such Contributor that are necessarily infringed by their
|
||||
Contribution(s) alone or by combination of their Contribution(s)
|
||||
with the Work to which such Contribution(s) was submitted. If You
|
||||
institute patent litigation against any entity (including a
|
||||
cross-claim or counterclaim in a lawsuit) alleging that the Work
|
||||
or a Contribution incorporated within the Work constitutes direct
|
||||
or contributory patent infringement, then any patent licenses
|
||||
granted to You under this License for that Work shall terminate
|
||||
as of the date such litigation is filed.
|
||||
|
||||
4. Redistribution. You may reproduce and distribute copies of the
|
||||
Work or Derivative Works thereof in any medium, with or without
|
||||
modifications, and in Source or Object form, provided that You
|
||||
meet the following conditions:
|
||||
|
||||
(a) You must give any other recipients of the Work or
|
||||
Derivative Works a copy of this License; and
|
||||
|
||||
(b) You must cause any modified files to carry prominent notices
|
||||
stating that You changed the files; and
|
||||
|
||||
(c) You must retain, in the Source form of any Derivative Works
|
||||
that You distribute, all copyright, patent, trademark, and
|
||||
attribution notices from the Source form of the Work,
|
||||
excluding those notices that do not pertain to any part of
|
||||
the Derivative Works; and
|
||||
|
||||
(d) If the Work includes a "NOTICE" text file as part of its
|
||||
distribution, then any Derivative Works that You distribute must
|
||||
include a readable copy of the attribution notices contained
|
||||
within such NOTICE file, excluding those notices that do not
|
||||
pertain to any part of the Derivative Works, in at least one
|
||||
of the following places: within a NOTICE text file distributed
|
||||
as part of the Derivative Works; within the Source form or
|
||||
documentation, if provided along with the Derivative Works; or,
|
||||
within a display generated by the Derivative Works, if and
|
||||
wherever such third-party notices normally appear. The contents
|
||||
of the NOTICE file are for informational purposes only and
|
||||
do not modify the License. You may add Your own attribution
|
||||
notices within Derivative Works that You distribute, alongside
|
||||
or as an addendum to the NOTICE text from the Work, provided
|
||||
that such additional attribution notices cannot be construed
|
||||
as modifying the License.
|
||||
|
||||
You may add Your own copyright statement to Your modifications and
|
||||
may provide additional or different license terms and conditions
|
||||
for use, reproduction, or distribution of Your modifications, or
|
||||
for any such Derivative Works as a whole, provided Your use,
|
||||
reproduction, and distribution of the Work otherwise complies with
|
||||
the conditions stated in this License.
|
||||
|
||||
5. Submission of Contributions. Unless You explicitly state otherwise,
|
||||
any Contribution intentionally submitted for inclusion in the Work
|
||||
by You to the Licensor shall be under the terms and conditions of
|
||||
this License, without any additional terms or conditions.
|
||||
Notwithstanding the above, nothing herein shall supersede or modify
|
||||
the terms of any separate license agreement you may have executed
|
||||
with Licensor regarding such Contributions.
|
||||
|
||||
6. Trademarks. This License does not grant permission to use the trade
|
||||
names, trademarks, service marks, or product names of the Licensor,
|
||||
except as required for reasonable and customary use in describing the
|
||||
origin of the Work and reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||
agreed to in writing, Licensor provides the Work (and each
|
||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
implied, including, without limitation, any warranties or conditions
|
||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
APPENDIX: How to apply the Apache License to your work.
|
||||
|
||||
To apply the Apache License to your work, attach the following
|
||||
boilerplate notice, with the fields enclosed by brackets "{}"
|
||||
replaced with your own identifying information. (Don't include
|
||||
the brackets!) The text should be enclosed in the appropriate
|
||||
comment syntax for the file format. We also recommend that a
|
||||
file or class name and description of purpose be included on the
|
||||
same "printed page" as the copyright notice for easier
|
||||
identification within third-party archives.
|
||||
|
||||
Copyright 2018-2020 Amazon.com, Inc. or its affiliates. All Rights Reserved.
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
||||
6
backend/node_modules/@smithy/util-stream/README.md
generated
vendored
Normal file
6
backend/node_modules/@smithy/util-stream/README.md
generated
vendored
Normal file
@@ -0,0 +1,6 @@
|
||||
# @smithy/util-stream
|
||||
|
||||
[](https://www.npmjs.com/package/@smithy/util-stream)
|
||||
[](https://www.npmjs.com/package/@smithy/util-stream)
|
||||
|
||||
Package with utilities to operate on streams.
|
||||
36
backend/node_modules/@smithy/util-stream/dist-cjs/ByteArrayCollector.js
generated
vendored
Normal file
36
backend/node_modules/@smithy/util-stream/dist-cjs/ByteArrayCollector.js
generated
vendored
Normal file
@@ -0,0 +1,36 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.ByteArrayCollector = void 0;
|
||||
class ByteArrayCollector {
|
||||
allocByteArray;
|
||||
byteLength = 0;
|
||||
byteArrays = [];
|
||||
constructor(allocByteArray) {
|
||||
this.allocByteArray = allocByteArray;
|
||||
}
|
||||
push(byteArray) {
|
||||
this.byteArrays.push(byteArray);
|
||||
this.byteLength += byteArray.byteLength;
|
||||
}
|
||||
flush() {
|
||||
if (this.byteArrays.length === 1) {
|
||||
const bytes = this.byteArrays[0];
|
||||
this.reset();
|
||||
return bytes;
|
||||
}
|
||||
const aggregation = this.allocByteArray(this.byteLength);
|
||||
let cursor = 0;
|
||||
for (let i = 0; i < this.byteArrays.length; ++i) {
|
||||
const bytes = this.byteArrays[i];
|
||||
aggregation.set(bytes, cursor);
|
||||
cursor += bytes.byteLength;
|
||||
}
|
||||
this.reset();
|
||||
return aggregation;
|
||||
}
|
||||
reset() {
|
||||
this.byteArrays = [];
|
||||
this.byteLength = 0;
|
||||
}
|
||||
}
|
||||
exports.ByteArrayCollector = ByteArrayCollector;
|
||||
7
backend/node_modules/@smithy/util-stream/dist-cjs/checksum/ChecksumStream.browser.js
generated
vendored
Normal file
7
backend/node_modules/@smithy/util-stream/dist-cjs/checksum/ChecksumStream.browser.js
generated
vendored
Normal file
@@ -0,0 +1,7 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.ChecksumStream = void 0;
|
||||
const ReadableStreamRef = typeof ReadableStream === "function" ? ReadableStream : function () { };
|
||||
class ChecksumStream extends ReadableStreamRef {
|
||||
}
|
||||
exports.ChecksumStream = ChecksumStream;
|
||||
53
backend/node_modules/@smithy/util-stream/dist-cjs/checksum/ChecksumStream.js
generated
vendored
Normal file
53
backend/node_modules/@smithy/util-stream/dist-cjs/checksum/ChecksumStream.js
generated
vendored
Normal file
@@ -0,0 +1,53 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.ChecksumStream = void 0;
|
||||
const util_base64_1 = require("@smithy/util-base64");
|
||||
const stream_1 = require("stream");
|
||||
class ChecksumStream extends stream_1.Duplex {
|
||||
expectedChecksum;
|
||||
checksumSourceLocation;
|
||||
checksum;
|
||||
source;
|
||||
base64Encoder;
|
||||
constructor({ expectedChecksum, checksum, source, checksumSourceLocation, base64Encoder, }) {
|
||||
super();
|
||||
if (typeof source.pipe === "function") {
|
||||
this.source = source;
|
||||
}
|
||||
else {
|
||||
throw new Error(`@smithy/util-stream: unsupported source type ${source?.constructor?.name ?? source} in ChecksumStream.`);
|
||||
}
|
||||
this.base64Encoder = base64Encoder ?? util_base64_1.toBase64;
|
||||
this.expectedChecksum = expectedChecksum;
|
||||
this.checksum = checksum;
|
||||
this.checksumSourceLocation = checksumSourceLocation;
|
||||
this.source.pipe(this);
|
||||
}
|
||||
_read(size) { }
|
||||
_write(chunk, encoding, callback) {
|
||||
try {
|
||||
this.checksum.update(chunk);
|
||||
this.push(chunk);
|
||||
}
|
||||
catch (e) {
|
||||
return callback(e);
|
||||
}
|
||||
return callback();
|
||||
}
|
||||
async _final(callback) {
|
||||
try {
|
||||
const digest = await this.checksum.digest();
|
||||
const received = this.base64Encoder(digest);
|
||||
if (this.expectedChecksum !== received) {
|
||||
return callback(new Error(`Checksum mismatch: expected "${this.expectedChecksum}" but received "${received}"` +
|
||||
` in response header "${this.checksumSourceLocation}".`));
|
||||
}
|
||||
}
|
||||
catch (e) {
|
||||
return callback(e);
|
||||
}
|
||||
this.push(null);
|
||||
return callback();
|
||||
}
|
||||
}
|
||||
exports.ChecksumStream = ChecksumStream;
|
||||
39
backend/node_modules/@smithy/util-stream/dist-cjs/checksum/createChecksumStream.browser.js
generated
vendored
Normal file
39
backend/node_modules/@smithy/util-stream/dist-cjs/checksum/createChecksumStream.browser.js
generated
vendored
Normal file
@@ -0,0 +1,39 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.createChecksumStream = void 0;
|
||||
const util_base64_1 = require("@smithy/util-base64");
|
||||
const stream_type_check_1 = require("../stream-type-check");
|
||||
const ChecksumStream_browser_1 = require("./ChecksumStream.browser");
|
||||
const createChecksumStream = ({ expectedChecksum, checksum, source, checksumSourceLocation, base64Encoder, }) => {
|
||||
if (!(0, stream_type_check_1.isReadableStream)(source)) {
|
||||
throw new Error(`@smithy/util-stream: unsupported source type ${source?.constructor?.name ?? source} in ChecksumStream.`);
|
||||
}
|
||||
const encoder = base64Encoder ?? util_base64_1.toBase64;
|
||||
if (typeof TransformStream !== "function") {
|
||||
throw new Error("@smithy/util-stream: unable to instantiate ChecksumStream because API unavailable: ReadableStream/TransformStream.");
|
||||
}
|
||||
const transform = new TransformStream({
|
||||
start() { },
|
||||
async transform(chunk, controller) {
|
||||
checksum.update(chunk);
|
||||
controller.enqueue(chunk);
|
||||
},
|
||||
async flush(controller) {
|
||||
const digest = await checksum.digest();
|
||||
const received = encoder(digest);
|
||||
if (expectedChecksum !== received) {
|
||||
const error = new Error(`Checksum mismatch: expected "${expectedChecksum}" but received "${received}"` +
|
||||
` in response header "${checksumSourceLocation}".`);
|
||||
controller.error(error);
|
||||
}
|
||||
else {
|
||||
controller.terminate();
|
||||
}
|
||||
},
|
||||
});
|
||||
source.pipeThrough(transform);
|
||||
const readable = transform.readable;
|
||||
Object.setPrototypeOf(readable, ChecksumStream_browser_1.ChecksumStream.prototype);
|
||||
return readable;
|
||||
};
|
||||
exports.createChecksumStream = createChecksumStream;
|
||||
12
backend/node_modules/@smithy/util-stream/dist-cjs/checksum/createChecksumStream.js
generated
vendored
Normal file
12
backend/node_modules/@smithy/util-stream/dist-cjs/checksum/createChecksumStream.js
generated
vendored
Normal file
@@ -0,0 +1,12 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.createChecksumStream = createChecksumStream;
|
||||
const stream_type_check_1 = require("../stream-type-check");
|
||||
const ChecksumStream_1 = require("./ChecksumStream");
|
||||
const createChecksumStream_browser_1 = require("./createChecksumStream.browser");
|
||||
function createChecksumStream(init) {
|
||||
if (typeof ReadableStream === "function" && (0, stream_type_check_1.isReadableStream)(init.source)) {
|
||||
return (0, createChecksumStream_browser_1.createChecksumStream)(init);
|
||||
}
|
||||
return new ChecksumStream_1.ChecksumStream(init);
|
||||
}
|
||||
60
backend/node_modules/@smithy/util-stream/dist-cjs/createBufferedReadable.js
generated
vendored
Normal file
60
backend/node_modules/@smithy/util-stream/dist-cjs/createBufferedReadable.js
generated
vendored
Normal file
@@ -0,0 +1,60 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.createBufferedReadable = createBufferedReadable;
|
||||
const node_stream_1 = require("node:stream");
|
||||
const ByteArrayCollector_1 = require("./ByteArrayCollector");
|
||||
const createBufferedReadableStream_1 = require("./createBufferedReadableStream");
|
||||
const stream_type_check_1 = require("./stream-type-check");
|
||||
function createBufferedReadable(upstream, size, logger) {
|
||||
if ((0, stream_type_check_1.isReadableStream)(upstream)) {
|
||||
return (0, createBufferedReadableStream_1.createBufferedReadableStream)(upstream, size, logger);
|
||||
}
|
||||
const downstream = new node_stream_1.Readable({ read() { } });
|
||||
let streamBufferingLoggedWarning = false;
|
||||
let bytesSeen = 0;
|
||||
const buffers = [
|
||||
"",
|
||||
new ByteArrayCollector_1.ByteArrayCollector((size) => new Uint8Array(size)),
|
||||
new ByteArrayCollector_1.ByteArrayCollector((size) => Buffer.from(new Uint8Array(size))),
|
||||
];
|
||||
let mode = -1;
|
||||
upstream.on("data", (chunk) => {
|
||||
const chunkMode = (0, createBufferedReadableStream_1.modeOf)(chunk, true);
|
||||
if (mode !== chunkMode) {
|
||||
if (mode >= 0) {
|
||||
downstream.push((0, createBufferedReadableStream_1.flush)(buffers, mode));
|
||||
}
|
||||
mode = chunkMode;
|
||||
}
|
||||
if (mode === -1) {
|
||||
downstream.push(chunk);
|
||||
return;
|
||||
}
|
||||
const chunkSize = (0, createBufferedReadableStream_1.sizeOf)(chunk);
|
||||
bytesSeen += chunkSize;
|
||||
const bufferSize = (0, createBufferedReadableStream_1.sizeOf)(buffers[mode]);
|
||||
if (chunkSize >= size && bufferSize === 0) {
|
||||
downstream.push(chunk);
|
||||
}
|
||||
else {
|
||||
const newSize = (0, createBufferedReadableStream_1.merge)(buffers, mode, chunk);
|
||||
if (!streamBufferingLoggedWarning && bytesSeen > size * 2) {
|
||||
streamBufferingLoggedWarning = true;
|
||||
logger?.warn(`@smithy/util-stream - stream chunk size ${chunkSize} is below threshold of ${size}, automatically buffering.`);
|
||||
}
|
||||
if (newSize >= size) {
|
||||
downstream.push((0, createBufferedReadableStream_1.flush)(buffers, mode));
|
||||
}
|
||||
}
|
||||
});
|
||||
upstream.on("end", () => {
|
||||
if (mode !== -1) {
|
||||
const remainder = (0, createBufferedReadableStream_1.flush)(buffers, mode);
|
||||
if ((0, createBufferedReadableStream_1.sizeOf)(remainder) > 0) {
|
||||
downstream.push(remainder);
|
||||
}
|
||||
}
|
||||
downstream.push(null);
|
||||
});
|
||||
return downstream;
|
||||
}
|
||||
103
backend/node_modules/@smithy/util-stream/dist-cjs/createBufferedReadableStream.js
generated
vendored
Normal file
103
backend/node_modules/@smithy/util-stream/dist-cjs/createBufferedReadableStream.js
generated
vendored
Normal file
@@ -0,0 +1,103 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.createBufferedReadable = void 0;
|
||||
exports.createBufferedReadableStream = createBufferedReadableStream;
|
||||
exports.merge = merge;
|
||||
exports.flush = flush;
|
||||
exports.sizeOf = sizeOf;
|
||||
exports.modeOf = modeOf;
|
||||
const ByteArrayCollector_1 = require("./ByteArrayCollector");
|
||||
function createBufferedReadableStream(upstream, size, logger) {
|
||||
const reader = upstream.getReader();
|
||||
let streamBufferingLoggedWarning = false;
|
||||
let bytesSeen = 0;
|
||||
const buffers = ["", new ByteArrayCollector_1.ByteArrayCollector((size) => new Uint8Array(size))];
|
||||
let mode = -1;
|
||||
const pull = async (controller) => {
|
||||
const { value, done } = await reader.read();
|
||||
const chunk = value;
|
||||
if (done) {
|
||||
if (mode !== -1) {
|
||||
const remainder = flush(buffers, mode);
|
||||
if (sizeOf(remainder) > 0) {
|
||||
controller.enqueue(remainder);
|
||||
}
|
||||
}
|
||||
controller.close();
|
||||
}
|
||||
else {
|
||||
const chunkMode = modeOf(chunk, false);
|
||||
if (mode !== chunkMode) {
|
||||
if (mode >= 0) {
|
||||
controller.enqueue(flush(buffers, mode));
|
||||
}
|
||||
mode = chunkMode;
|
||||
}
|
||||
if (mode === -1) {
|
||||
controller.enqueue(chunk);
|
||||
return;
|
||||
}
|
||||
const chunkSize = sizeOf(chunk);
|
||||
bytesSeen += chunkSize;
|
||||
const bufferSize = sizeOf(buffers[mode]);
|
||||
if (chunkSize >= size && bufferSize === 0) {
|
||||
controller.enqueue(chunk);
|
||||
}
|
||||
else {
|
||||
const newSize = merge(buffers, mode, chunk);
|
||||
if (!streamBufferingLoggedWarning && bytesSeen > size * 2) {
|
||||
streamBufferingLoggedWarning = true;
|
||||
logger?.warn(`@smithy/util-stream - stream chunk size ${chunkSize} is below threshold of ${size}, automatically buffering.`);
|
||||
}
|
||||
if (newSize >= size) {
|
||||
controller.enqueue(flush(buffers, mode));
|
||||
}
|
||||
else {
|
||||
await pull(controller);
|
||||
}
|
||||
}
|
||||
}
|
||||
};
|
||||
return new ReadableStream({
|
||||
pull,
|
||||
});
|
||||
}
|
||||
exports.createBufferedReadable = createBufferedReadableStream;
|
||||
function merge(buffers, mode, chunk) {
|
||||
switch (mode) {
|
||||
case 0:
|
||||
buffers[0] += chunk;
|
||||
return sizeOf(buffers[0]);
|
||||
case 1:
|
||||
case 2:
|
||||
buffers[mode].push(chunk);
|
||||
return sizeOf(buffers[mode]);
|
||||
}
|
||||
}
|
||||
function flush(buffers, mode) {
|
||||
switch (mode) {
|
||||
case 0:
|
||||
const s = buffers[0];
|
||||
buffers[0] = "";
|
||||
return s;
|
||||
case 1:
|
||||
case 2:
|
||||
return buffers[mode].flush();
|
||||
}
|
||||
throw new Error(`@smithy/util-stream - invalid index ${mode} given to flush()`);
|
||||
}
|
||||
function sizeOf(chunk) {
|
||||
return chunk?.byteLength ?? chunk?.length ?? 0;
|
||||
}
|
||||
function modeOf(chunk, allowBuffer = true) {
|
||||
if (allowBuffer && typeof Buffer !== "undefined" && chunk instanceof Buffer) {
|
||||
return 2;
|
||||
}
|
||||
if (chunk instanceof Uint8Array) {
|
||||
return 1;
|
||||
}
|
||||
if (typeof chunk === "string") {
|
||||
return 0;
|
||||
}
|
||||
return -1;
|
||||
}
|
||||
31
backend/node_modules/@smithy/util-stream/dist-cjs/getAwsChunkedEncodingStream.browser.js
generated
vendored
Normal file
31
backend/node_modules/@smithy/util-stream/dist-cjs/getAwsChunkedEncodingStream.browser.js
generated
vendored
Normal file
@@ -0,0 +1,31 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.getAwsChunkedEncodingStream = void 0;
|
||||
const getAwsChunkedEncodingStream = (readableStream, options) => {
|
||||
const { base64Encoder, bodyLengthChecker, checksumAlgorithmFn, checksumLocationName, streamHasher } = options;
|
||||
const checksumRequired = base64Encoder !== undefined &&
|
||||
bodyLengthChecker !== undefined &&
|
||||
checksumAlgorithmFn !== undefined &&
|
||||
checksumLocationName !== undefined &&
|
||||
streamHasher !== undefined;
|
||||
const digest = checksumRequired ? streamHasher(checksumAlgorithmFn, readableStream) : undefined;
|
||||
const reader = readableStream.getReader();
|
||||
return new ReadableStream({
|
||||
async pull(controller) {
|
||||
const { value, done } = await reader.read();
|
||||
if (done) {
|
||||
controller.enqueue(`0\r\n`);
|
||||
if (checksumRequired) {
|
||||
const checksum = base64Encoder(await digest);
|
||||
controller.enqueue(`${checksumLocationName}:${checksum}\r\n`);
|
||||
controller.enqueue(`\r\n`);
|
||||
}
|
||||
controller.close();
|
||||
}
|
||||
else {
|
||||
controller.enqueue(`${(bodyLengthChecker(value) || 0).toString(16)}\r\n${value}\r\n`);
|
||||
}
|
||||
},
|
||||
});
|
||||
};
|
||||
exports.getAwsChunkedEncodingStream = getAwsChunkedEncodingStream;
|
||||
41
backend/node_modules/@smithy/util-stream/dist-cjs/getAwsChunkedEncodingStream.js
generated
vendored
Normal file
41
backend/node_modules/@smithy/util-stream/dist-cjs/getAwsChunkedEncodingStream.js
generated
vendored
Normal file
@@ -0,0 +1,41 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.getAwsChunkedEncodingStream = getAwsChunkedEncodingStream;
|
||||
const node_stream_1 = require("node:stream");
|
||||
const getAwsChunkedEncodingStream_browser_1 = require("./getAwsChunkedEncodingStream.browser");
|
||||
const stream_type_check_1 = require("./stream-type-check");
|
||||
function getAwsChunkedEncodingStream(stream, options) {
|
||||
const readable = stream;
|
||||
const readableStream = stream;
|
||||
if ((0, stream_type_check_1.isReadableStream)(readableStream)) {
|
||||
return (0, getAwsChunkedEncodingStream_browser_1.getAwsChunkedEncodingStream)(readableStream, options);
|
||||
}
|
||||
const { base64Encoder, bodyLengthChecker, checksumAlgorithmFn, checksumLocationName, streamHasher } = options;
|
||||
const checksumRequired = base64Encoder !== undefined &&
|
||||
checksumAlgorithmFn !== undefined &&
|
||||
checksumLocationName !== undefined &&
|
||||
streamHasher !== undefined;
|
||||
const digest = checksumRequired ? streamHasher(checksumAlgorithmFn, readable) : undefined;
|
||||
const awsChunkedEncodingStream = new node_stream_1.Readable({
|
||||
read: () => { },
|
||||
});
|
||||
readable.on("data", (data) => {
|
||||
const length = bodyLengthChecker(data) || 0;
|
||||
if (length === 0) {
|
||||
return;
|
||||
}
|
||||
awsChunkedEncodingStream.push(`${length.toString(16)}\r\n`);
|
||||
awsChunkedEncodingStream.push(data);
|
||||
awsChunkedEncodingStream.push("\r\n");
|
||||
});
|
||||
readable.on("end", async () => {
|
||||
awsChunkedEncodingStream.push(`0\r\n`);
|
||||
if (checksumRequired) {
|
||||
const checksum = base64Encoder(await digest);
|
||||
awsChunkedEncodingStream.push(`${checksumLocationName}:${checksum}\r\n`);
|
||||
awsChunkedEncodingStream.push(`\r\n`);
|
||||
}
|
||||
awsChunkedEncodingStream.push(null);
|
||||
});
|
||||
return awsChunkedEncodingStream;
|
||||
}
|
||||
34
backend/node_modules/@smithy/util-stream/dist-cjs/headStream.browser.js
generated
vendored
Normal file
34
backend/node_modules/@smithy/util-stream/dist-cjs/headStream.browser.js
generated
vendored
Normal file
@@ -0,0 +1,34 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.headStream = headStream;
|
||||
async function headStream(stream, bytes) {
|
||||
let byteLengthCounter = 0;
|
||||
const chunks = [];
|
||||
const reader = stream.getReader();
|
||||
let isDone = false;
|
||||
while (!isDone) {
|
||||
const { done, value } = await reader.read();
|
||||
if (value) {
|
||||
chunks.push(value);
|
||||
byteLengthCounter += value?.byteLength ?? 0;
|
||||
}
|
||||
if (byteLengthCounter >= bytes) {
|
||||
break;
|
||||
}
|
||||
isDone = done;
|
||||
}
|
||||
reader.releaseLock();
|
||||
const collected = new Uint8Array(Math.min(bytes, byteLengthCounter));
|
||||
let offset = 0;
|
||||
for (const chunk of chunks) {
|
||||
if (chunk.byteLength > collected.byteLength - offset) {
|
||||
collected.set(chunk.subarray(0, collected.byteLength - offset), offset);
|
||||
break;
|
||||
}
|
||||
else {
|
||||
collected.set(chunk, offset);
|
||||
}
|
||||
offset += chunk.length;
|
||||
}
|
||||
return collected;
|
||||
}
|
||||
42
backend/node_modules/@smithy/util-stream/dist-cjs/headStream.js
generated
vendored
Normal file
42
backend/node_modules/@smithy/util-stream/dist-cjs/headStream.js
generated
vendored
Normal file
@@ -0,0 +1,42 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.headStream = void 0;
|
||||
const stream_1 = require("stream");
|
||||
const headStream_browser_1 = require("./headStream.browser");
|
||||
const stream_type_check_1 = require("./stream-type-check");
|
||||
const headStream = (stream, bytes) => {
|
||||
if ((0, stream_type_check_1.isReadableStream)(stream)) {
|
||||
return (0, headStream_browser_1.headStream)(stream, bytes);
|
||||
}
|
||||
return new Promise((resolve, reject) => {
|
||||
const collector = new Collector();
|
||||
collector.limit = bytes;
|
||||
stream.pipe(collector);
|
||||
stream.on("error", (err) => {
|
||||
collector.end();
|
||||
reject(err);
|
||||
});
|
||||
collector.on("error", reject);
|
||||
collector.on("finish", function () {
|
||||
const bytes = new Uint8Array(Buffer.concat(this.buffers));
|
||||
resolve(bytes);
|
||||
});
|
||||
});
|
||||
};
|
||||
exports.headStream = headStream;
|
||||
class Collector extends stream_1.Writable {
|
||||
buffers = [];
|
||||
limit = Infinity;
|
||||
bytesBuffered = 0;
|
||||
_write(chunk, encoding, callback) {
|
||||
this.buffers.push(chunk);
|
||||
this.bytesBuffered += chunk.byteLength ?? 0;
|
||||
if (this.bytesBuffered >= this.limit) {
|
||||
const excess = this.bytesBuffered - this.limit;
|
||||
const tailBuffer = this.buffers[this.buffers.length - 1];
|
||||
this.buffers[this.buffers.length - 1] = tailBuffer.subarray(0, tailBuffer.byteLength - excess);
|
||||
this.emit("finish");
|
||||
}
|
||||
callback();
|
||||
}
|
||||
}
|
||||
86
backend/node_modules/@smithy/util-stream/dist-cjs/index.js
generated
vendored
Normal file
86
backend/node_modules/@smithy/util-stream/dist-cjs/index.js
generated
vendored
Normal file
@@ -0,0 +1,86 @@
|
||||
'use strict';
|
||||
|
||||
var utilBase64 = require('@smithy/util-base64');
|
||||
var utilUtf8 = require('@smithy/util-utf8');
|
||||
var ChecksumStream = require('./checksum/ChecksumStream');
|
||||
var createChecksumStream = require('./checksum/createChecksumStream');
|
||||
var createBufferedReadable = require('./createBufferedReadable');
|
||||
var getAwsChunkedEncodingStream = require('./getAwsChunkedEncodingStream');
|
||||
var headStream = require('./headStream');
|
||||
var sdkStreamMixin = require('./sdk-stream-mixin');
|
||||
var splitStream = require('./splitStream');
|
||||
var streamTypeCheck = require('./stream-type-check');
|
||||
|
||||
class Uint8ArrayBlobAdapter extends Uint8Array {
|
||||
static fromString(source, encoding = "utf-8") {
|
||||
if (typeof source === "string") {
|
||||
if (encoding === "base64") {
|
||||
return Uint8ArrayBlobAdapter.mutate(utilBase64.fromBase64(source));
|
||||
}
|
||||
return Uint8ArrayBlobAdapter.mutate(utilUtf8.fromUtf8(source));
|
||||
}
|
||||
throw new Error(`Unsupported conversion from ${typeof source} to Uint8ArrayBlobAdapter.`);
|
||||
}
|
||||
static mutate(source) {
|
||||
Object.setPrototypeOf(source, Uint8ArrayBlobAdapter.prototype);
|
||||
return source;
|
||||
}
|
||||
transformToString(encoding = "utf-8") {
|
||||
if (encoding === "base64") {
|
||||
return utilBase64.toBase64(this);
|
||||
}
|
||||
return utilUtf8.toUtf8(this);
|
||||
}
|
||||
}
|
||||
|
||||
Object.defineProperty(exports, "isBlob", {
|
||||
enumerable: true,
|
||||
get: function () { return streamTypeCheck.isBlob; }
|
||||
});
|
||||
Object.defineProperty(exports, "isReadableStream", {
|
||||
enumerable: true,
|
||||
get: function () { return streamTypeCheck.isReadableStream; }
|
||||
});
|
||||
exports.Uint8ArrayBlobAdapter = Uint8ArrayBlobAdapter;
|
||||
Object.keys(ChecksumStream).forEach(function (k) {
|
||||
if (k !== 'default' && !Object.prototype.hasOwnProperty.call(exports, k)) Object.defineProperty(exports, k, {
|
||||
enumerable: true,
|
||||
get: function () { return ChecksumStream[k]; }
|
||||
});
|
||||
});
|
||||
Object.keys(createChecksumStream).forEach(function (k) {
|
||||
if (k !== 'default' && !Object.prototype.hasOwnProperty.call(exports, k)) Object.defineProperty(exports, k, {
|
||||
enumerable: true,
|
||||
get: function () { return createChecksumStream[k]; }
|
||||
});
|
||||
});
|
||||
Object.keys(createBufferedReadable).forEach(function (k) {
|
||||
if (k !== 'default' && !Object.prototype.hasOwnProperty.call(exports, k)) Object.defineProperty(exports, k, {
|
||||
enumerable: true,
|
||||
get: function () { return createBufferedReadable[k]; }
|
||||
});
|
||||
});
|
||||
Object.keys(getAwsChunkedEncodingStream).forEach(function (k) {
|
||||
if (k !== 'default' && !Object.prototype.hasOwnProperty.call(exports, k)) Object.defineProperty(exports, k, {
|
||||
enumerable: true,
|
||||
get: function () { return getAwsChunkedEncodingStream[k]; }
|
||||
});
|
||||
});
|
||||
Object.keys(headStream).forEach(function (k) {
|
||||
if (k !== 'default' && !Object.prototype.hasOwnProperty.call(exports, k)) Object.defineProperty(exports, k, {
|
||||
enumerable: true,
|
||||
get: function () { return headStream[k]; }
|
||||
});
|
||||
});
|
||||
Object.keys(sdkStreamMixin).forEach(function (k) {
|
||||
if (k !== 'default' && !Object.prototype.hasOwnProperty.call(exports, k)) Object.defineProperty(exports, k, {
|
||||
enumerable: true,
|
||||
get: function () { return sdkStreamMixin[k]; }
|
||||
});
|
||||
});
|
||||
Object.keys(splitStream).forEach(function (k) {
|
||||
if (k !== 'default' && !Object.prototype.hasOwnProperty.call(exports, k)) Object.defineProperty(exports, k, {
|
||||
enumerable: true,
|
||||
get: function () { return splitStream[k]; }
|
||||
});
|
||||
});
|
||||
68
backend/node_modules/@smithy/util-stream/dist-cjs/sdk-stream-mixin.browser.js
generated
vendored
Normal file
68
backend/node_modules/@smithy/util-stream/dist-cjs/sdk-stream-mixin.browser.js
generated
vendored
Normal file
@@ -0,0 +1,68 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.sdkStreamMixin = void 0;
|
||||
const fetch_http_handler_1 = require("@smithy/fetch-http-handler");
|
||||
const util_base64_1 = require("@smithy/util-base64");
|
||||
const util_hex_encoding_1 = require("@smithy/util-hex-encoding");
|
||||
const util_utf8_1 = require("@smithy/util-utf8");
|
||||
const stream_type_check_1 = require("./stream-type-check");
|
||||
const ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED = "The stream has already been transformed.";
|
||||
const sdkStreamMixin = (stream) => {
|
||||
if (!isBlobInstance(stream) && !(0, stream_type_check_1.isReadableStream)(stream)) {
|
||||
const name = stream?.__proto__?.constructor?.name || stream;
|
||||
throw new Error(`Unexpected stream implementation, expect Blob or ReadableStream, got ${name}`);
|
||||
}
|
||||
let transformed = false;
|
||||
const transformToByteArray = async () => {
|
||||
if (transformed) {
|
||||
throw new Error(ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED);
|
||||
}
|
||||
transformed = true;
|
||||
return await (0, fetch_http_handler_1.streamCollector)(stream);
|
||||
};
|
||||
const blobToWebStream = (blob) => {
|
||||
if (typeof blob.stream !== "function") {
|
||||
throw new Error("Cannot transform payload Blob to web stream. Please make sure the Blob.stream() is polyfilled.\n" +
|
||||
"If you are using React Native, this API is not yet supported, see: https://react-native.canny.io/feature-requests/p/fetch-streaming-body");
|
||||
}
|
||||
return blob.stream();
|
||||
};
|
||||
return Object.assign(stream, {
|
||||
transformToByteArray: transformToByteArray,
|
||||
transformToString: async (encoding) => {
|
||||
const buf = await transformToByteArray();
|
||||
if (encoding === "base64") {
|
||||
return (0, util_base64_1.toBase64)(buf);
|
||||
}
|
||||
else if (encoding === "hex") {
|
||||
return (0, util_hex_encoding_1.toHex)(buf);
|
||||
}
|
||||
else if (encoding === undefined || encoding === "utf8" || encoding === "utf-8") {
|
||||
return (0, util_utf8_1.toUtf8)(buf);
|
||||
}
|
||||
else if (typeof TextDecoder === "function") {
|
||||
return new TextDecoder(encoding).decode(buf);
|
||||
}
|
||||
else {
|
||||
throw new Error("TextDecoder is not available, please make sure polyfill is provided.");
|
||||
}
|
||||
},
|
||||
transformToWebStream: () => {
|
||||
if (transformed) {
|
||||
throw new Error(ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED);
|
||||
}
|
||||
transformed = true;
|
||||
if (isBlobInstance(stream)) {
|
||||
return blobToWebStream(stream);
|
||||
}
|
||||
else if ((0, stream_type_check_1.isReadableStream)(stream)) {
|
||||
return stream;
|
||||
}
|
||||
else {
|
||||
throw new Error(`Cannot transform payload to web stream, got ${stream}`);
|
||||
}
|
||||
},
|
||||
});
|
||||
};
|
||||
exports.sdkStreamMixin = sdkStreamMixin;
|
||||
const isBlobInstance = (stream) => typeof Blob === "function" && stream instanceof Blob;
|
||||
54
backend/node_modules/@smithy/util-stream/dist-cjs/sdk-stream-mixin.js
generated
vendored
Normal file
54
backend/node_modules/@smithy/util-stream/dist-cjs/sdk-stream-mixin.js
generated
vendored
Normal file
@@ -0,0 +1,54 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.sdkStreamMixin = void 0;
|
||||
const node_http_handler_1 = require("@smithy/node-http-handler");
|
||||
const util_buffer_from_1 = require("@smithy/util-buffer-from");
|
||||
const stream_1 = require("stream");
|
||||
const sdk_stream_mixin_browser_1 = require("./sdk-stream-mixin.browser");
|
||||
const ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED = "The stream has already been transformed.";
|
||||
const sdkStreamMixin = (stream) => {
|
||||
if (!(stream instanceof stream_1.Readable)) {
|
||||
try {
|
||||
return (0, sdk_stream_mixin_browser_1.sdkStreamMixin)(stream);
|
||||
}
|
||||
catch (e) {
|
||||
const name = stream?.__proto__?.constructor?.name || stream;
|
||||
throw new Error(`Unexpected stream implementation, expect Stream.Readable instance, got ${name}`);
|
||||
}
|
||||
}
|
||||
let transformed = false;
|
||||
const transformToByteArray = async () => {
|
||||
if (transformed) {
|
||||
throw new Error(ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED);
|
||||
}
|
||||
transformed = true;
|
||||
return await (0, node_http_handler_1.streamCollector)(stream);
|
||||
};
|
||||
return Object.assign(stream, {
|
||||
transformToByteArray,
|
||||
transformToString: async (encoding) => {
|
||||
const buf = await transformToByteArray();
|
||||
if (encoding === undefined || Buffer.isEncoding(encoding)) {
|
||||
return (0, util_buffer_from_1.fromArrayBuffer)(buf.buffer, buf.byteOffset, buf.byteLength).toString(encoding);
|
||||
}
|
||||
else {
|
||||
const decoder = new TextDecoder(encoding);
|
||||
return decoder.decode(buf);
|
||||
}
|
||||
},
|
||||
transformToWebStream: () => {
|
||||
if (transformed) {
|
||||
throw new Error(ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED);
|
||||
}
|
||||
if (stream.readableFlowing !== null) {
|
||||
throw new Error("The stream has been consumed by other callbacks.");
|
||||
}
|
||||
if (typeof stream_1.Readable.toWeb !== "function") {
|
||||
throw new Error("Readable.toWeb() is not supported. Please ensure a polyfill is available.");
|
||||
}
|
||||
transformed = true;
|
||||
return stream_1.Readable.toWeb(stream);
|
||||
},
|
||||
});
|
||||
};
|
||||
exports.sdkStreamMixin = sdkStreamMixin;
|
||||
10
backend/node_modules/@smithy/util-stream/dist-cjs/splitStream.browser.js
generated
vendored
Normal file
10
backend/node_modules/@smithy/util-stream/dist-cjs/splitStream.browser.js
generated
vendored
Normal file
@@ -0,0 +1,10 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.splitStream = splitStream;
|
||||
async function splitStream(stream) {
|
||||
if (typeof stream.stream === "function") {
|
||||
stream = stream.stream();
|
||||
}
|
||||
const readableStream = stream;
|
||||
return readableStream.tee();
|
||||
}
|
||||
16
backend/node_modules/@smithy/util-stream/dist-cjs/splitStream.js
generated
vendored
Normal file
16
backend/node_modules/@smithy/util-stream/dist-cjs/splitStream.js
generated
vendored
Normal file
@@ -0,0 +1,16 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.splitStream = splitStream;
|
||||
const stream_1 = require("stream");
|
||||
const splitStream_browser_1 = require("./splitStream.browser");
|
||||
const stream_type_check_1 = require("./stream-type-check");
|
||||
async function splitStream(stream) {
|
||||
if ((0, stream_type_check_1.isReadableStream)(stream) || (0, stream_type_check_1.isBlob)(stream)) {
|
||||
return (0, splitStream_browser_1.splitStream)(stream);
|
||||
}
|
||||
const stream1 = new stream_1.PassThrough();
|
||||
const stream2 = new stream_1.PassThrough();
|
||||
stream.pipe(stream1);
|
||||
stream.pipe(stream2);
|
||||
return [stream1, stream2];
|
||||
}
|
||||
10
backend/node_modules/@smithy/util-stream/dist-cjs/stream-type-check.js
generated
vendored
Normal file
10
backend/node_modules/@smithy/util-stream/dist-cjs/stream-type-check.js
generated
vendored
Normal file
@@ -0,0 +1,10 @@
|
||||
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.isBlob = exports.isReadableStream = void 0;
|
||||
const isReadableStream = (stream) => typeof ReadableStream === "function" &&
|
||||
(stream?.constructor?.name === ReadableStream.name || stream instanceof ReadableStream);
|
||||
exports.isReadableStream = isReadableStream;
|
||||
const isBlob = (blob) => {
|
||||
return typeof Blob === "function" && (blob?.constructor?.name === Blob.name || blob instanceof Blob);
|
||||
};
|
||||
exports.isBlob = isBlob;
|
||||
32
backend/node_modules/@smithy/util-stream/dist-es/ByteArrayCollector.js
generated
vendored
Normal file
32
backend/node_modules/@smithy/util-stream/dist-es/ByteArrayCollector.js
generated
vendored
Normal file
@@ -0,0 +1,32 @@
|
||||
export class ByteArrayCollector {
|
||||
allocByteArray;
|
||||
byteLength = 0;
|
||||
byteArrays = [];
|
||||
constructor(allocByteArray) {
|
||||
this.allocByteArray = allocByteArray;
|
||||
}
|
||||
push(byteArray) {
|
||||
this.byteArrays.push(byteArray);
|
||||
this.byteLength += byteArray.byteLength;
|
||||
}
|
||||
flush() {
|
||||
if (this.byteArrays.length === 1) {
|
||||
const bytes = this.byteArrays[0];
|
||||
this.reset();
|
||||
return bytes;
|
||||
}
|
||||
const aggregation = this.allocByteArray(this.byteLength);
|
||||
let cursor = 0;
|
||||
for (let i = 0; i < this.byteArrays.length; ++i) {
|
||||
const bytes = this.byteArrays[i];
|
||||
aggregation.set(bytes, cursor);
|
||||
cursor += bytes.byteLength;
|
||||
}
|
||||
this.reset();
|
||||
return aggregation;
|
||||
}
|
||||
reset() {
|
||||
this.byteArrays = [];
|
||||
this.byteLength = 0;
|
||||
}
|
||||
}
|
||||
23
backend/node_modules/@smithy/util-stream/dist-es/blob/Uint8ArrayBlobAdapter.js
generated
vendored
Normal file
23
backend/node_modules/@smithy/util-stream/dist-es/blob/Uint8ArrayBlobAdapter.js
generated
vendored
Normal file
@@ -0,0 +1,23 @@
|
||||
import { fromBase64, toBase64 } from "@smithy/util-base64";
|
||||
import { fromUtf8, toUtf8 } from "@smithy/util-utf8";
|
||||
export class Uint8ArrayBlobAdapter extends Uint8Array {
|
||||
static fromString(source, encoding = "utf-8") {
|
||||
if (typeof source === "string") {
|
||||
if (encoding === "base64") {
|
||||
return Uint8ArrayBlobAdapter.mutate(fromBase64(source));
|
||||
}
|
||||
return Uint8ArrayBlobAdapter.mutate(fromUtf8(source));
|
||||
}
|
||||
throw new Error(`Unsupported conversion from ${typeof source} to Uint8ArrayBlobAdapter.`);
|
||||
}
|
||||
static mutate(source) {
|
||||
Object.setPrototypeOf(source, Uint8ArrayBlobAdapter.prototype);
|
||||
return source;
|
||||
}
|
||||
transformToString(encoding = "utf-8") {
|
||||
if (encoding === "base64") {
|
||||
return toBase64(this);
|
||||
}
|
||||
return toUtf8(this);
|
||||
}
|
||||
}
|
||||
3
backend/node_modules/@smithy/util-stream/dist-es/checksum/ChecksumStream.browser.js
generated
vendored
Normal file
3
backend/node_modules/@smithy/util-stream/dist-es/checksum/ChecksumStream.browser.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
|
||||
const ReadableStreamRef = typeof ReadableStream === "function" ? ReadableStream : function () { };
|
||||
export class ChecksumStream extends ReadableStreamRef {
|
||||
}
|
||||
49
backend/node_modules/@smithy/util-stream/dist-es/checksum/ChecksumStream.js
generated
vendored
Normal file
49
backend/node_modules/@smithy/util-stream/dist-es/checksum/ChecksumStream.js
generated
vendored
Normal file
@@ -0,0 +1,49 @@
|
||||
import { toBase64 } from "@smithy/util-base64";
|
||||
import { Duplex } from "stream";
|
||||
export class ChecksumStream extends Duplex {
|
||||
expectedChecksum;
|
||||
checksumSourceLocation;
|
||||
checksum;
|
||||
source;
|
||||
base64Encoder;
|
||||
constructor({ expectedChecksum, checksum, source, checksumSourceLocation, base64Encoder, }) {
|
||||
super();
|
||||
if (typeof source.pipe === "function") {
|
||||
this.source = source;
|
||||
}
|
||||
else {
|
||||
throw new Error(`@smithy/util-stream: unsupported source type ${source?.constructor?.name ?? source} in ChecksumStream.`);
|
||||
}
|
||||
this.base64Encoder = base64Encoder ?? toBase64;
|
||||
this.expectedChecksum = expectedChecksum;
|
||||
this.checksum = checksum;
|
||||
this.checksumSourceLocation = checksumSourceLocation;
|
||||
this.source.pipe(this);
|
||||
}
|
||||
_read(size) { }
|
||||
_write(chunk, encoding, callback) {
|
||||
try {
|
||||
this.checksum.update(chunk);
|
||||
this.push(chunk);
|
||||
}
|
||||
catch (e) {
|
||||
return callback(e);
|
||||
}
|
||||
return callback();
|
||||
}
|
||||
async _final(callback) {
|
||||
try {
|
||||
const digest = await this.checksum.digest();
|
||||
const received = this.base64Encoder(digest);
|
||||
if (this.expectedChecksum !== received) {
|
||||
return callback(new Error(`Checksum mismatch: expected "${this.expectedChecksum}" but received "${received}"` +
|
||||
` in response header "${this.checksumSourceLocation}".`));
|
||||
}
|
||||
}
|
||||
catch (e) {
|
||||
return callback(e);
|
||||
}
|
||||
this.push(null);
|
||||
return callback();
|
||||
}
|
||||
}
|
||||
35
backend/node_modules/@smithy/util-stream/dist-es/checksum/createChecksumStream.browser.js
generated
vendored
Normal file
35
backend/node_modules/@smithy/util-stream/dist-es/checksum/createChecksumStream.browser.js
generated
vendored
Normal file
@@ -0,0 +1,35 @@
|
||||
import { toBase64 } from "@smithy/util-base64";
|
||||
import { isReadableStream } from "../stream-type-check";
|
||||
import { ChecksumStream } from "./ChecksumStream.browser";
|
||||
export const createChecksumStream = ({ expectedChecksum, checksum, source, checksumSourceLocation, base64Encoder, }) => {
|
||||
if (!isReadableStream(source)) {
|
||||
throw new Error(`@smithy/util-stream: unsupported source type ${source?.constructor?.name ?? source} in ChecksumStream.`);
|
||||
}
|
||||
const encoder = base64Encoder ?? toBase64;
|
||||
if (typeof TransformStream !== "function") {
|
||||
throw new Error("@smithy/util-stream: unable to instantiate ChecksumStream because API unavailable: ReadableStream/TransformStream.");
|
||||
}
|
||||
const transform = new TransformStream({
|
||||
start() { },
|
||||
async transform(chunk, controller) {
|
||||
checksum.update(chunk);
|
||||
controller.enqueue(chunk);
|
||||
},
|
||||
async flush(controller) {
|
||||
const digest = await checksum.digest();
|
||||
const received = encoder(digest);
|
||||
if (expectedChecksum !== received) {
|
||||
const error = new Error(`Checksum mismatch: expected "${expectedChecksum}" but received "${received}"` +
|
||||
` in response header "${checksumSourceLocation}".`);
|
||||
controller.error(error);
|
||||
}
|
||||
else {
|
||||
controller.terminate();
|
||||
}
|
||||
},
|
||||
});
|
||||
source.pipeThrough(transform);
|
||||
const readable = transform.readable;
|
||||
Object.setPrototypeOf(readable, ChecksumStream.prototype);
|
||||
return readable;
|
||||
};
|
||||
9
backend/node_modules/@smithy/util-stream/dist-es/checksum/createChecksumStream.js
generated
vendored
Normal file
9
backend/node_modules/@smithy/util-stream/dist-es/checksum/createChecksumStream.js
generated
vendored
Normal file
@@ -0,0 +1,9 @@
|
||||
import { isReadableStream } from "../stream-type-check";
|
||||
import { ChecksumStream } from "./ChecksumStream";
|
||||
import { createChecksumStream as createChecksumStreamWeb } from "./createChecksumStream.browser";
|
||||
export function createChecksumStream(init) {
|
||||
if (typeof ReadableStream === "function" && isReadableStream(init.source)) {
|
||||
return createChecksumStreamWeb(init);
|
||||
}
|
||||
return new ChecksumStream(init);
|
||||
}
|
||||
57
backend/node_modules/@smithy/util-stream/dist-es/createBufferedReadable.js
generated
vendored
Normal file
57
backend/node_modules/@smithy/util-stream/dist-es/createBufferedReadable.js
generated
vendored
Normal file
@@ -0,0 +1,57 @@
|
||||
import { Readable } from "node:stream";
|
||||
import { ByteArrayCollector } from "./ByteArrayCollector";
|
||||
import { createBufferedReadableStream, flush, merge, modeOf, sizeOf } from "./createBufferedReadableStream";
|
||||
import { isReadableStream } from "./stream-type-check";
|
||||
export function createBufferedReadable(upstream, size, logger) {
|
||||
if (isReadableStream(upstream)) {
|
||||
return createBufferedReadableStream(upstream, size, logger);
|
||||
}
|
||||
const downstream = new Readable({ read() { } });
|
||||
let streamBufferingLoggedWarning = false;
|
||||
let bytesSeen = 0;
|
||||
const buffers = [
|
||||
"",
|
||||
new ByteArrayCollector((size) => new Uint8Array(size)),
|
||||
new ByteArrayCollector((size) => Buffer.from(new Uint8Array(size))),
|
||||
];
|
||||
let mode = -1;
|
||||
upstream.on("data", (chunk) => {
|
||||
const chunkMode = modeOf(chunk, true);
|
||||
if (mode !== chunkMode) {
|
||||
if (mode >= 0) {
|
||||
downstream.push(flush(buffers, mode));
|
||||
}
|
||||
mode = chunkMode;
|
||||
}
|
||||
if (mode === -1) {
|
||||
downstream.push(chunk);
|
||||
return;
|
||||
}
|
||||
const chunkSize = sizeOf(chunk);
|
||||
bytesSeen += chunkSize;
|
||||
const bufferSize = sizeOf(buffers[mode]);
|
||||
if (chunkSize >= size && bufferSize === 0) {
|
||||
downstream.push(chunk);
|
||||
}
|
||||
else {
|
||||
const newSize = merge(buffers, mode, chunk);
|
||||
if (!streamBufferingLoggedWarning && bytesSeen > size * 2) {
|
||||
streamBufferingLoggedWarning = true;
|
||||
logger?.warn(`@smithy/util-stream - stream chunk size ${chunkSize} is below threshold of ${size}, automatically buffering.`);
|
||||
}
|
||||
if (newSize >= size) {
|
||||
downstream.push(flush(buffers, mode));
|
||||
}
|
||||
}
|
||||
});
|
||||
upstream.on("end", () => {
|
||||
if (mode !== -1) {
|
||||
const remainder = flush(buffers, mode);
|
||||
if (sizeOf(remainder) > 0) {
|
||||
downstream.push(remainder);
|
||||
}
|
||||
}
|
||||
downstream.push(null);
|
||||
});
|
||||
return downstream;
|
||||
}
|
||||

95
backend/node_modules/@smithy/util-stream/dist-es/createBufferedReadableStream.js
generated
vendored
Normal file
@@ -0,0 +1,95 @@
import { ByteArrayCollector } from "./ByteArrayCollector";
export function createBufferedReadableStream(upstream, size, logger) {
    const reader = upstream.getReader();
    let streamBufferingLoggedWarning = false;
    let bytesSeen = 0;
    const buffers = ["", new ByteArrayCollector((size) => new Uint8Array(size))];
    let mode = -1;
    const pull = async (controller) => {
        const { value, done } = await reader.read();
        const chunk = value;
        if (done) {
            if (mode !== -1) {
                const remainder = flush(buffers, mode);
                if (sizeOf(remainder) > 0) {
                    controller.enqueue(remainder);
                }
            }
            controller.close();
        }
        else {
            const chunkMode = modeOf(chunk, false);
            if (mode !== chunkMode) {
                if (mode >= 0) {
                    controller.enqueue(flush(buffers, mode));
                }
                mode = chunkMode;
            }
            if (mode === -1) {
                controller.enqueue(chunk);
                return;
            }
            const chunkSize = sizeOf(chunk);
            bytesSeen += chunkSize;
            const bufferSize = sizeOf(buffers[mode]);
            if (chunkSize >= size && bufferSize === 0) {
                controller.enqueue(chunk);
            }
            else {
                const newSize = merge(buffers, mode, chunk);
                if (!streamBufferingLoggedWarning && bytesSeen > size * 2) {
                    streamBufferingLoggedWarning = true;
                    logger?.warn(`@smithy/util-stream - stream chunk size ${chunkSize} is below threshold of ${size}, automatically buffering.`);
                }
                if (newSize >= size) {
                    controller.enqueue(flush(buffers, mode));
                }
                else {
                    await pull(controller);
                }
            }
        }
    };
    return new ReadableStream({
        pull,
    });
}
export const createBufferedReadable = createBufferedReadableStream;
export function merge(buffers, mode, chunk) {
    switch (mode) {
        case 0:
            buffers[0] += chunk;
            return sizeOf(buffers[0]);
        case 1:
        case 2:
            buffers[mode].push(chunk);
            return sizeOf(buffers[mode]);
    }
}
export function flush(buffers, mode) {
    switch (mode) {
        case 0:
            const s = buffers[0];
            buffers[0] = "";
            return s;
        case 1:
        case 2:
            return buffers[mode].flush();
    }
    throw new Error(`@smithy/util-stream - invalid index ${mode} given to flush()`);
}
export function sizeOf(chunk) {
    return chunk?.byteLength ?? chunk?.length ?? 0;
}
export function modeOf(chunk, allowBuffer = true) {
    if (allowBuffer && typeof Buffer !== "undefined" && chunk instanceof Buffer) {
        return 2;
    }
    if (chunk instanceof Uint8Array) {
        return 1;
    }
    if (typeof chunk === "string") {
        return 0;
    }
    return -1;
}
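
A minimal usage sketch (not part of the vendored file above; it assumes the package-level `createBufferedReadable` re-export shown later in this diff, and the threshold and sample chunks are illustrative): small chunks are coalesced until the byte minimum is reached.
import { createBufferedReadable } from "@smithy/util-stream";

// A web ReadableStream that emits two tiny chunks (hypothetical sample data).
const source = new ReadableStream({
    start(controller) {
        controller.enqueue(new Uint8Array(10));
        controller.enqueue(new Uint8Array(20));
        controller.close();
    },
});

// Chunks below the 65536-byte threshold are merged before being re-emitted;
// the optional logger (console here) warns once when buffering kicks in.
const buffered = createBufferedReadable(source, 65536, console);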

27
backend/node_modules/@smithy/util-stream/dist-es/getAwsChunkedEncodingStream.browser.js
generated
vendored
Normal file
@@ -0,0 +1,27 @@
export const getAwsChunkedEncodingStream = (readableStream, options) => {
    const { base64Encoder, bodyLengthChecker, checksumAlgorithmFn, checksumLocationName, streamHasher } = options;
    const checksumRequired = base64Encoder !== undefined &&
        bodyLengthChecker !== undefined &&
        checksumAlgorithmFn !== undefined &&
        checksumLocationName !== undefined &&
        streamHasher !== undefined;
    const digest = checksumRequired ? streamHasher(checksumAlgorithmFn, readableStream) : undefined;
    const reader = readableStream.getReader();
    return new ReadableStream({
        async pull(controller) {
            const { value, done } = await reader.read();
            if (done) {
                controller.enqueue(`0\r\n`);
                if (checksumRequired) {
                    const checksum = base64Encoder(await digest);
                    controller.enqueue(`${checksumLocationName}:${checksum}\r\n`);
                    controller.enqueue(`\r\n`);
                }
                controller.close();
            }
            else {
                controller.enqueue(`${(bodyLengthChecker(value) || 0).toString(16)}\r\n${value}\r\n`);
            }
        },
    });
};

38
backend/node_modules/@smithy/util-stream/dist-es/getAwsChunkedEncodingStream.js
generated
vendored
Normal file
@@ -0,0 +1,38 @@
import { Readable } from "node:stream";
import { getAwsChunkedEncodingStream as getAwsChunkedEncodingStreamBrowser } from "./getAwsChunkedEncodingStream.browser";
import { isReadableStream } from "./stream-type-check";
export function getAwsChunkedEncodingStream(stream, options) {
    const readable = stream;
    const readableStream = stream;
    if (isReadableStream(readableStream)) {
        return getAwsChunkedEncodingStreamBrowser(readableStream, options);
    }
    const { base64Encoder, bodyLengthChecker, checksumAlgorithmFn, checksumLocationName, streamHasher } = options;
    const checksumRequired = base64Encoder !== undefined &&
        checksumAlgorithmFn !== undefined &&
        checksumLocationName !== undefined &&
        streamHasher !== undefined;
    const digest = checksumRequired ? streamHasher(checksumAlgorithmFn, readable) : undefined;
    const awsChunkedEncodingStream = new Readable({
        read: () => { },
    });
    readable.on("data", (data) => {
        const length = bodyLengthChecker(data) || 0;
        if (length === 0) {
            return;
        }
        awsChunkedEncodingStream.push(`${length.toString(16)}\r\n`);
        awsChunkedEncodingStream.push(data);
        awsChunkedEncodingStream.push("\r\n");
    });
    readable.on("end", async () => {
        awsChunkedEncodingStream.push(`0\r\n`);
        if (checksumRequired) {
            const checksum = base64Encoder(await digest);
            awsChunkedEncodingStream.push(`${checksumLocationName}:${checksum}\r\n`);
            awsChunkedEncodingStream.push(`\r\n`);
        }
        awsChunkedEncodingStream.push(null);
    });
    return awsChunkedEncodingStream;
}

31
backend/node_modules/@smithy/util-stream/dist-es/headStream.browser.js
generated
vendored
Normal file
@@ -0,0 +1,31 @@
export async function headStream(stream, bytes) {
    let byteLengthCounter = 0;
    const chunks = [];
    const reader = stream.getReader();
    let isDone = false;
    while (!isDone) {
        const { done, value } = await reader.read();
        if (value) {
            chunks.push(value);
            byteLengthCounter += value?.byteLength ?? 0;
        }
        if (byteLengthCounter >= bytes) {
            break;
        }
        isDone = done;
    }
    reader.releaseLock();
    const collected = new Uint8Array(Math.min(bytes, byteLengthCounter));
    let offset = 0;
    for (const chunk of chunks) {
        if (chunk.byteLength > collected.byteLength - offset) {
            collected.set(chunk.subarray(0, collected.byteLength - offset), offset);
            break;
        }
        else {
            collected.set(chunk, offset);
        }
        offset += chunk.length;
    }
    return collected;
}

38
backend/node_modules/@smithy/util-stream/dist-es/headStream.js
generated
vendored
Normal file
@@ -0,0 +1,38 @@
import { Writable } from "stream";
import { headStream as headWebStream } from "./headStream.browser";
import { isReadableStream } from "./stream-type-check";
export const headStream = (stream, bytes) => {
    if (isReadableStream(stream)) {
        return headWebStream(stream, bytes);
    }
    return new Promise((resolve, reject) => {
        const collector = new Collector();
        collector.limit = bytes;
        stream.pipe(collector);
        stream.on("error", (err) => {
            collector.end();
            reject(err);
        });
        collector.on("error", reject);
        collector.on("finish", function () {
            const bytes = new Uint8Array(Buffer.concat(this.buffers));
            resolve(bytes);
        });
    });
};
class Collector extends Writable {
    buffers = [];
    limit = Infinity;
    bytesBuffered = 0;
    _write(chunk, encoding, callback) {
        this.buffers.push(chunk);
        this.bytesBuffered += chunk.byteLength ?? 0;
        if (this.bytesBuffered >= this.limit) {
            const excess = this.bytesBuffered - this.limit;
            const tailBuffer = this.buffers[this.buffers.length - 1];
            this.buffers[this.buffers.length - 1] = tailBuffer.subarray(0, tailBuffer.byteLength - excess);
            this.emit("finish");
        }
        callback();
    }
}
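
A minimal usage sketch (not part of the vendored file above; it assumes the package-level `headStream` re-export and an illustrative sample payload): previewing the first bytes of a Node.js Readable; per the declaration file's caution below, the caller destroys the source itself.
import { Readable } from "stream";
import { headStream } from "@smithy/util-stream";

const body = Readable.from([Buffer.from("hello "), Buffer.from("world")]);
// Resolves with a Uint8Array containing at most the first 8 bytes.
const head = await headStream(body, 8);
body.destroy(); // the input stream must be destroyed separately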

9
backend/node_modules/@smithy/util-stream/dist-es/index.js
generated
vendored
Normal file
@@ -0,0 +1,9 @@
export * from "./blob/Uint8ArrayBlobAdapter";
export * from "./checksum/ChecksumStream";
export * from "./checksum/createChecksumStream";
export * from "./createBufferedReadable";
export * from "./getAwsChunkedEncodingStream";
export * from "./headStream";
export * from "./sdk-stream-mixin";
export * from "./splitStream";
export { isReadableStream, isBlob } from "./stream-type-check";

64
backend/node_modules/@smithy/util-stream/dist-es/sdk-stream-mixin.browser.js
generated
vendored
Normal file
@@ -0,0 +1,64 @@
import { streamCollector } from "@smithy/fetch-http-handler";
import { toBase64 } from "@smithy/util-base64";
import { toHex } from "@smithy/util-hex-encoding";
import { toUtf8 } from "@smithy/util-utf8";
import { isReadableStream } from "./stream-type-check";
const ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED = "The stream has already been transformed.";
export const sdkStreamMixin = (stream) => {
    if (!isBlobInstance(stream) && !isReadableStream(stream)) {
        const name = stream?.__proto__?.constructor?.name || stream;
        throw new Error(`Unexpected stream implementation, expect Blob or ReadableStream, got ${name}`);
    }
    let transformed = false;
    const transformToByteArray = async () => {
        if (transformed) {
            throw new Error(ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED);
        }
        transformed = true;
        return await streamCollector(stream);
    };
    const blobToWebStream = (blob) => {
        if (typeof blob.stream !== "function") {
            throw new Error("Cannot transform payload Blob to web stream. Please make sure the Blob.stream() is polyfilled.\n" +
                "If you are using React Native, this API is not yet supported, see: https://react-native.canny.io/feature-requests/p/fetch-streaming-body");
        }
        return blob.stream();
    };
    return Object.assign(stream, {
        transformToByteArray: transformToByteArray,
        transformToString: async (encoding) => {
            const buf = await transformToByteArray();
            if (encoding === "base64") {
                return toBase64(buf);
            }
            else if (encoding === "hex") {
                return toHex(buf);
            }
            else if (encoding === undefined || encoding === "utf8" || encoding === "utf-8") {
                return toUtf8(buf);
            }
            else if (typeof TextDecoder === "function") {
                return new TextDecoder(encoding).decode(buf);
            }
            else {
                throw new Error("TextDecoder is not available, please make sure polyfill is provided.");
            }
        },
        transformToWebStream: () => {
            if (transformed) {
                throw new Error(ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED);
            }
            transformed = true;
            if (isBlobInstance(stream)) {
                return blobToWebStream(stream);
            }
            else if (isReadableStream(stream)) {
                return stream;
            }
            else {
                throw new Error(`Cannot transform payload to web stream, got ${stream}`);
            }
        },
    });
};
const isBlobInstance = (stream) => typeof Blob === "function" && stream instanceof Blob;

50
backend/node_modules/@smithy/util-stream/dist-es/sdk-stream-mixin.js
generated
vendored
Normal file
@@ -0,0 +1,50 @@
import { streamCollector } from "@smithy/node-http-handler";
import { fromArrayBuffer } from "@smithy/util-buffer-from";
import { Readable } from "stream";
import { sdkStreamMixin as sdkStreamMixinReadableStream } from "./sdk-stream-mixin.browser";
const ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED = "The stream has already been transformed.";
export const sdkStreamMixin = (stream) => {
    if (!(stream instanceof Readable)) {
        try {
            return sdkStreamMixinReadableStream(stream);
        }
        catch (e) {
            const name = stream?.__proto__?.constructor?.name || stream;
            throw new Error(`Unexpected stream implementation, expect Stream.Readable instance, got ${name}`);
        }
    }
    let transformed = false;
    const transformToByteArray = async () => {
        if (transformed) {
            throw new Error(ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED);
        }
        transformed = true;
        return await streamCollector(stream);
    };
    return Object.assign(stream, {
        transformToByteArray,
        transformToString: async (encoding) => {
            const buf = await transformToByteArray();
            if (encoding === undefined || Buffer.isEncoding(encoding)) {
                return fromArrayBuffer(buf.buffer, buf.byteOffset, buf.byteLength).toString(encoding);
            }
            else {
                const decoder = new TextDecoder(encoding);
                return decoder.decode(buf);
            }
        },
        transformToWebStream: () => {
            if (transformed) {
                throw new Error(ERR_MSG_STREAM_HAS_BEEN_TRANSFORMED);
            }
            if (stream.readableFlowing !== null) {
                throw new Error("The stream has been consumed by other callbacks.");
            }
            if (typeof Readable.toWeb !== "function") {
                throw new Error("Readable.toWeb() is not supported. Please ensure a polyfill is available.");
            }
            transformed = true;
            return Readable.toWeb(stream);
        },
    });
};

7
backend/node_modules/@smithy/util-stream/dist-es/splitStream.browser.js
generated
vendored
Normal file
@@ -0,0 +1,7 @@
export async function splitStream(stream) {
    if (typeof stream.stream === "function") {
        stream = stream.stream();
    }
    const readableStream = stream;
    return readableStream.tee();
}

13
backend/node_modules/@smithy/util-stream/dist-es/splitStream.js
generated
vendored
Normal file
@@ -0,0 +1,13 @@
import { PassThrough } from "stream";
import { splitStream as splitWebStream } from "./splitStream.browser";
import { isBlob, isReadableStream } from "./stream-type-check";
export async function splitStream(stream) {
    if (isReadableStream(stream) || isBlob(stream)) {
        return splitWebStream(stream);
    }
    const stream1 = new PassThrough();
    const stream2 = new PassThrough();
    stream.pipe(stream1);
    stream.pipe(stream2);
    return [stream1, stream2];
}
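
A minimal usage sketch (not part of the vendored file above; it assumes the package-level `splitStream` re-export and an illustrative payload): duplicating one Node.js payload stream into two independent copies.
import { Readable } from "stream";
import { splitStream } from "@smithy/util-stream";

const payload = Readable.from([Buffer.from("payload bytes")]);
// Both copies receive the same data; on Node.js they are PassThrough streams.
const [forHashing, forUpload] = await splitStream(payload);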

5
backend/node_modules/@smithy/util-stream/dist-es/stream-type-check.js
generated
vendored
Normal file
@@ -0,0 +1,5 @@
export const isReadableStream = (stream) => typeof ReadableStream === "function" &&
    (stream?.constructor?.name === ReadableStream.name || stream instanceof ReadableStream);
export const isBlob = (blob) => {
    return typeof Blob === "function" && (blob?.constructor?.name === Blob.name || blob instanceof Blob);
};

13
backend/node_modules/@smithy/util-stream/dist-types/ByteArrayCollector.d.ts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
/**
 * Aggregates byteArrays on demand.
 * @internal
 */
export declare class ByteArrayCollector {
    readonly allocByteArray: (size: number) => Uint8Array;
    byteLength: number;
    private byteArrays;
    constructor(allocByteArray: (size: number) => Uint8Array);
    push(byteArray: Uint8Array): void;
    flush(): Uint8Array;
    private reset;
}

22
backend/node_modules/@smithy/util-stream/dist-types/blob/Uint8ArrayBlobAdapter.d.ts
generated
vendored
Normal file
@@ -0,0 +1,22 @@
/**
 * Adapter for conversions of the native Uint8Array type.
 * @public
 */
export declare class Uint8ArrayBlobAdapter extends Uint8Array {
    /**
     * @param source - such as a string or Stream.
     * @param encoding - utf-8 or base64.
     * @returns a new Uint8ArrayBlobAdapter extending Uint8Array.
     */
    static fromString(source: string, encoding?: string): Uint8ArrayBlobAdapter;
    /**
     * @param source - Uint8Array to be mutated.
     * @returns the same Uint8Array but with prototype switched to Uint8ArrayBlobAdapter.
     */
    static mutate(source: Uint8Array): Uint8ArrayBlobAdapter;
    /**
     * @param encoding - default 'utf-8'.
     * @returns the blob as string.
     */
    transformToString(encoding?: string): string;
}

37
backend/node_modules/@smithy/util-stream/dist-types/checksum/ChecksumStream.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,37 @@
import type { Checksum, Encoder } from "@smithy/types";
/**
 * @internal
 */
export interface ChecksumStreamInit {
    /**
     * Base64 value of the expected checksum.
     */
    expectedChecksum: string;
    /**
     * For error messaging, the location from which the checksum value was read.
     */
    checksumSourceLocation: string;
    /**
     * The checksum calculator.
     */
    checksum: Checksum;
    /**
     * The stream to be checked.
     */
    source: ReadableStream;
    /**
     * Optional base 64 encoder if calling from a request context.
     */
    base64Encoder?: Encoder;
}
declare const ChecksumStream_base: any;
/**
 * This stub exists so that the readable returned by createChecksumStream
 * identifies as "ChecksumStream" in alignment with the Node.js
 * implementation.
 *
 * @extends ReadableStream
 */
export declare class ChecksumStream extends ChecksumStream_base {
}
export {};

60
backend/node_modules/@smithy/util-stream/dist-types/checksum/ChecksumStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,60 @@
import type { Checksum, Encoder } from "@smithy/types";
import type { Readable } from "stream";
import { Duplex } from "stream";
/**
 * @internal
 */
export interface ChecksumStreamInit<T extends Readable | ReadableStream> {
    /**
     * Base64 value of the expected checksum.
     */
    expectedChecksum: string;
    /**
     * For error messaging, the location from which the checksum value was read.
     */
    checksumSourceLocation: string;
    /**
     * The checksum calculator.
     */
    checksum: Checksum;
    /**
     * The stream to be checked.
     */
    source: T;
    /**
     * Optional base 64 encoder if calling from a request context.
     */
    base64Encoder?: Encoder;
}
/**
 * Wrapper for throwing checksum errors for streams without
 * buffering the stream.
 *
 * @internal
 */
export declare class ChecksumStream extends Duplex {
    private expectedChecksum;
    private checksumSourceLocation;
    private checksum;
    private source?;
    private base64Encoder;
    constructor({ expectedChecksum, checksum, source, checksumSourceLocation, base64Encoder, }: ChecksumStreamInit<Readable>);
    /**
     * Do not call this directly.
     * @internal
     */
    _read(size: number): void;
    /**
     * When the upstream source flows data to this stream,
     * calculate a step update of the checksum.
     * Do not call this directly.
     * @internal
     */
    _write(chunk: Buffer, encoding: string, callback: (err?: Error) => void): void;
    /**
     * When the upstream source finishes, perform the checksum comparison.
     * Do not call this directly.
     * @internal
     */
    _final(callback: (err?: Error) => void): Promise<void>;
}

14
backend/node_modules/@smithy/util-stream/dist-types/checksum/createChecksumStream.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,14 @@
import type { ChecksumStreamInit } from "./ChecksumStream.browser";
/**
 * Alias prevents compiler from turning
 * ReadableStream into ReadableStream<any>, which is incompatible
 * with the NodeJS.ReadableStream global type.
 * @internal
 */
export type ReadableStreamType = ReadableStream;
/**
 * Creates a stream adapter for throwing checksum errors for streams without
 * buffering the stream.
 * @internal
 */
export declare const createChecksumStream: ({ expectedChecksum, checksum, source, checksumSourceLocation, base64Encoder, }: ChecksumStreamInit) => ReadableStreamType;

13
backend/node_modules/@smithy/util-stream/dist-types/checksum/createChecksumStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
import type { Readable } from "stream";
import type { ChecksumStreamInit } from "./ChecksumStream";
import type { ReadableStreamType } from "./createChecksumStream.browser";
/**
 * Creates a stream mirroring the input stream's interface, but
 * performs checksumming when reading to the end of the stream.
 * @internal
 */
export declare function createChecksumStream(init: ChecksumStreamInit<ReadableStreamType>): ReadableStreamType;
/**
 * @internal
 */
export declare function createChecksumStream(init: ChecksumStreamInit<Readable>): Readable;

15
backend/node_modules/@smithy/util-stream/dist-types/createBufferedReadable.d.ts
generated
vendored
Normal file
@@ -0,0 +1,15 @@
import type { Logger } from "@smithy/types";
import { Readable } from "node:stream";
/**
 * @internal
 * @param upstream - any Readable or ReadableStream.
 * @param size - byte or character length minimum. Buffering occurs when a chunk fails to meet this value.
 * @param logger - for emitting warnings when buffering occurs.
 * @returns another stream of the same data and stream class, but buffers chunks until
 * the minimum size is met, except for the last chunk.
 */
export declare function createBufferedReadable(upstream: Readable, size: number, logger?: Logger): Readable;
/**
 * @internal
 */
export declare function createBufferedReadable(upstream: ReadableStream, size: number, logger?: Logger): ReadableStream;

50
backend/node_modules/@smithy/util-stream/dist-types/createBufferedReadableStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,50 @@
import type { Logger } from "@smithy/types";
import { ByteArrayCollector } from "./ByteArrayCollector";
export type BufferStore = [string, ByteArrayCollector, ByteArrayCollector?];
export type BufferUnion = string | Uint8Array;
export type Modes = 0 | 1 | 2;
/**
 * @internal
 * @param upstream - any ReadableStream.
 * @param size - byte or character length minimum. Buffering occurs when a chunk fails to meet this value.
 * @param logger - for emitting warnings when buffering occurs.
 * @returns another stream of the same data, but buffers chunks until
 * the minimum size is met, except for the last chunk.
 */
export declare function createBufferedReadableStream(upstream: ReadableStream, size: number, logger?: Logger): ReadableStream;
/**
 * Replaces R/RS polymorphic implementation in environments with only ReadableStream.
 * @internal
 */
export declare const createBufferedReadable: typeof createBufferedReadableStream;
/**
 * @internal
 * @param buffers
 * @param mode
 * @param chunk
 * @returns the new buffer size after merging the chunk with its appropriate buffer.
 */
export declare function merge(buffers: BufferStore, mode: Modes, chunk: string | Uint8Array): number;
/**
 * @internal
 * @param buffers
 * @param mode
 * @returns the buffer matching the mode.
 */
export declare function flush(buffers: BufferStore, mode: Modes | -1): BufferUnion;
/**
 * @internal
 * @param chunk
 * @returns size of the chunk in bytes or characters.
 */
export declare function sizeOf(chunk?: {
    byteLength?: number;
    length?: number;
}): number;
/**
 * @internal
 * @param chunk - from upstream Readable.
 * @param allowBuffer - allow mode 2 (Buffer), otherwise Buffer will return mode 1.
 * @returns type index of the chunk.
 */
export declare function modeOf(chunk: BufferUnion, allowBuffer?: boolean): Modes | -1;

5
backend/node_modules/@smithy/util-stream/dist-types/getAwsChunkedEncodingStream.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,5 @@
import type { GetAwsChunkedEncodingStream } from "@smithy/types";
/**
 * @internal
 */
export declare const getAwsChunkedEncodingStream: GetAwsChunkedEncodingStream<ReadableStream>;

10
backend/node_modules/@smithy/util-stream/dist-types/getAwsChunkedEncodingStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,10 @@
import type { GetAwsChunkedEncodingStreamOptions } from "@smithy/types";
import { Readable } from "node:stream";
/**
 * @internal
 */
export declare function getAwsChunkedEncodingStream(stream: Readable, options: GetAwsChunkedEncodingStreamOptions): Readable;
/**
 * @internal
 */
export declare function getAwsChunkedEncodingStream(stream: ReadableStream, options: GetAwsChunkedEncodingStreamOptions): ReadableStream;

7
backend/node_modules/@smithy/util-stream/dist-types/headStream.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,7 @@
/**
 * Caution: the input stream must be destroyed separately, this function does not do so.
 * @internal
 * @param stream
 * @param bytes - read head bytes from the stream and discard the rest of it.
 */
export declare function headStream(stream: ReadableStream, bytes: number): Promise<Uint8Array>;

9
backend/node_modules/@smithy/util-stream/dist-types/headStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,9 @@
import type { Readable } from "stream";
/**
 * Caution: the input stream must be destroyed separately, this function does not do so.
 *
 * @internal
 * @param stream - to be read.
 * @param bytes - read head bytes from the stream and discard the rest of it.
 */
export declare const headStream: (stream: Readable | ReadableStream, bytes: number) => Promise<Uint8Array>;

12
backend/node_modules/@smithy/util-stream/dist-types/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,12 @@
export * from "./blob/Uint8ArrayBlobAdapter";
export * from "./checksum/ChecksumStream";
export * from "./checksum/createChecksumStream";
export * from "./createBufferedReadable";
export * from "./getAwsChunkedEncodingStream";
export * from "./headStream";
export * from "./sdk-stream-mixin";
export * from "./splitStream";
/**
 * @internal
 */
export { isReadableStream, isBlob } from "./stream-type-check";

7
backend/node_modules/@smithy/util-stream/dist-types/sdk-stream-mixin.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,7 @@
import type { SdkStream } from "@smithy/types";
/**
 * The stream handling utility functions for browsers and React Native
 *
 * @internal
 */
export declare const sdkStreamMixin: (stream: unknown) => SdkStream<ReadableStream | Blob>;

8
backend/node_modules/@smithy/util-stream/dist-types/sdk-stream-mixin.d.ts
generated
vendored
Normal file
@@ -0,0 +1,8 @@
import type { SdkStream } from "@smithy/types";
import { Readable } from "stream";
/**
 * The function that mixes in the utility functions to help consuming runtime-specific payload stream.
 *
 * @internal
 */
export declare const sdkStreamMixin: (stream: unknown) => SdkStream<ReadableStream | Blob> | SdkStream<Readable>;

5
backend/node_modules/@smithy/util-stream/dist-types/splitStream.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,5 @@
/**
 * @param stream
 * @returns stream split into two identical streams.
 */
export declare function splitStream(stream: ReadableStream | Blob): Promise<[ReadableStream, ReadableStream]>;

11
backend/node_modules/@smithy/util-stream/dist-types/splitStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,11 @@
import type { Readable } from "stream";
/**
 * @internal
 * @param stream - to be split.
 * @returns stream split into two identical streams.
 */
export declare function splitStream(stream: Readable): Promise<[Readable, Readable]>;
/**
 * @internal
 */
export declare function splitStream(stream: ReadableStream): Promise<[ReadableStream, ReadableStream]>;

17
backend/node_modules/@smithy/util-stream/dist-types/stream-type-check.d.ts
generated
vendored
Normal file
@@ -0,0 +1,17 @@
/**
 * Alias prevents compiler from turning
 * ReadableStream into ReadableStream<any>, which is incompatible
 * with the NodeJS.ReadableStream global type.
 *
 * @internal
 */
type ReadableStreamType = ReadableStream;
/**
 * @internal
 */
export declare const isReadableStream: (stream: unknown) => stream is ReadableStreamType;
/**
 * @internal
 */
export declare const isBlob: (blob: unknown) => blob is Blob;
export {};

13
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/ByteArrayCollector.d.ts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
/**
 * Aggregates byteArrays on demand.
 * @internal
 */
export declare class ByteArrayCollector {
    readonly allocByteArray: (size: number) => Uint8Array;
    byteLength: number;
    private byteArrays;
    constructor(allocByteArray: (size: number) => Uint8Array);
    push(byteArray: Uint8Array): void;
    flush(): Uint8Array;
    private reset;
}

22
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/blob/Uint8ArrayBlobAdapter.d.ts
generated
vendored
Normal file
@@ -0,0 +1,22 @@
/**
 * Adapter for conversions of the native Uint8Array type.
 * @public
 */
export declare class Uint8ArrayBlobAdapter extends Uint8Array {
    /**
     * @param source - such as a string or Stream.
     * @param encoding - utf-8 or base64.
     * @returns a new Uint8ArrayBlobAdapter extending Uint8Array.
     */
    static fromString(source: string, encoding?: string): Uint8ArrayBlobAdapter;
    /**
     * @param source - Uint8Array to be mutated.
     * @returns the same Uint8Array but with prototype switched to Uint8ArrayBlobAdapter.
     */
    static mutate(source: Uint8Array): Uint8ArrayBlobAdapter;
    /**
     * @param encoding - default 'utf-8'.
     * @returns the blob as string.
     */
    transformToString(encoding?: string): string;
}

37
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/checksum/ChecksumStream.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,37 @@
import { Checksum, Encoder } from "@smithy/types";
/**
 * @internal
 */
export interface ChecksumStreamInit {
    /**
     * Base64 value of the expected checksum.
     */
    expectedChecksum: string;
    /**
     * For error messaging, the location from which the checksum value was read.
     */
    checksumSourceLocation: string;
    /**
     * The checksum calculator.
     */
    checksum: Checksum;
    /**
     * The stream to be checked.
     */
    source: ReadableStream;
    /**
     * Optional base 64 encoder if calling from a request context.
     */
    base64Encoder?: Encoder;
}
declare const ChecksumStream_base: any;
/**
 * This stub exists so that the readable returned by createChecksumStream
 * identifies as "ChecksumStream" in alignment with the Node.js
 * implementation.
 *
 * @extends ReadableStream
 */
export declare class ChecksumStream extends ChecksumStream_base {
}
export {};

60
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/checksum/ChecksumStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,60 @@
import { Checksum, Encoder } from "@smithy/types";
import { Readable } from "stream";
import { Duplex } from "stream";
/**
 * @internal
 */
export interface ChecksumStreamInit<T extends Readable | ReadableStream> {
    /**
     * Base64 value of the expected checksum.
     */
    expectedChecksum: string;
    /**
     * For error messaging, the location from which the checksum value was read.
     */
    checksumSourceLocation: string;
    /**
     * The checksum calculator.
     */
    checksum: Checksum;
    /**
     * The stream to be checked.
     */
    source: T;
    /**
     * Optional base 64 encoder if calling from a request context.
     */
    base64Encoder?: Encoder;
}
/**
 * Wrapper for throwing checksum errors for streams without
 * buffering the stream.
 *
 * @internal
 */
export declare class ChecksumStream extends Duplex {
    private expectedChecksum;
    private checksumSourceLocation;
    private checksum;
    private source?;
    private base64Encoder;
    constructor({ expectedChecksum, checksum, source, checksumSourceLocation, base64Encoder, }: ChecksumStreamInit<Readable>);
    /**
     * Do not call this directly.
     * @internal
     */
    _read(size: number): void;
    /**
     * When the upstream source flows data to this stream,
     * calculate a step update of the checksum.
     * Do not call this directly.
     * @internal
     */
    _write(chunk: Buffer, encoding: string, callback: (err?: Error) => void): void;
    /**
     * When the upstream source finishes, perform the checksum comparison.
     * Do not call this directly.
     * @internal
     */
    _final(callback: (err?: Error) => void): Promise<void>;
}

14
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/checksum/createChecksumStream.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,14 @@
import { ChecksumStreamInit } from "./ChecksumStream.browser";
/**
 * Alias prevents compiler from turning
 * ReadableStream into ReadableStream<any>, which is incompatible
 * with the NodeJS.ReadableStream global type.
 * @internal
 */
export type ReadableStreamType = ReadableStream;
/**
 * Creates a stream adapter for throwing checksum errors for streams without
 * buffering the stream.
 * @internal
 */
export declare const createChecksumStream: ({ expectedChecksum, checksum, source, checksumSourceLocation, base64Encoder, }: ChecksumStreamInit) => ReadableStreamType;

13
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/checksum/createChecksumStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
import { Readable } from "stream";
import { ChecksumStreamInit } from "./ChecksumStream";
import { ReadableStreamType } from "./createChecksumStream.browser";
/**
 * Creates a stream mirroring the input stream's interface, but
 * performs checksumming when reading to the end of the stream.
 * @internal
 */
export declare function createChecksumStream(init: ChecksumStreamInit<ReadableStreamType>): ReadableStreamType;
/**
 * @internal
 */
export declare function createChecksumStream(init: ChecksumStreamInit<Readable>): Readable;

15
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/createBufferedReadable.d.ts
generated
vendored
Normal file
@@ -0,0 +1,15 @@
import { Logger } from "@smithy/types";
import { Readable } from "node:stream";
/**
 * @internal
 * @param upstream - any Readable or ReadableStream.
 * @param size - byte or character length minimum. Buffering occurs when a chunk fails to meet this value.
 * @param logger - for emitting warnings when buffering occurs.
 * @returns another stream of the same data and stream class, but buffers chunks until
 * the minimum size is met, except for the last chunk.
 */
export declare function createBufferedReadable(upstream: Readable, size: number, logger?: Logger): Readable;
/**
 * @internal
 */
export declare function createBufferedReadable(upstream: ReadableStream, size: number, logger?: Logger): ReadableStream;

54
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/createBufferedReadableStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,54 @@
import { Logger } from "@smithy/types";
import { ByteArrayCollector } from "./ByteArrayCollector";
export type BufferStore = [
    string,
    ByteArrayCollector,
    ByteArrayCollector?
];
export type BufferUnion = string | Uint8Array;
export type Modes = 0 | 1 | 2;
/**
 * @internal
 * @param upstream - any ReadableStream.
 * @param size - byte or character length minimum. Buffering occurs when a chunk fails to meet this value.
 * @param logger - for emitting warnings when buffering occurs.
 * @returns another stream of the same data, but buffers chunks until
 * the minimum size is met, except for the last chunk.
 */
export declare function createBufferedReadableStream(upstream: ReadableStream, size: number, logger?: Logger): ReadableStream;
/**
 * Replaces R/RS polymorphic implementation in environments with only ReadableStream.
 * @internal
 */
export declare const createBufferedReadable: typeof createBufferedReadableStream;
/**
 * @internal
 * @param buffers
 * @param mode
 * @param chunk
 * @returns the new buffer size after merging the chunk with its appropriate buffer.
 */
export declare function merge(buffers: BufferStore, mode: Modes, chunk: string | Uint8Array): number;
/**
 * @internal
 * @param buffers
 * @param mode
 * @returns the buffer matching the mode.
 */
export declare function flush(buffers: BufferStore, mode: Modes | -1): BufferUnion;
/**
 * @internal
 * @param chunk
 * @returns size of the chunk in bytes or characters.
 */
export declare function sizeOf(chunk?: {
    byteLength?: number;
    length?: number;
}): number;
/**
 * @internal
 * @param chunk - from upstream Readable.
 * @param allowBuffer - allow mode 2 (Buffer), otherwise Buffer will return mode 1.
 * @returns type index of the chunk.
 */
export declare function modeOf(chunk: BufferUnion, allowBuffer?: boolean): Modes | -1;

5
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/getAwsChunkedEncodingStream.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,5 @@
import { GetAwsChunkedEncodingStream } from "@smithy/types";
/**
 * @internal
 */
export declare const getAwsChunkedEncodingStream: GetAwsChunkedEncodingStream<ReadableStream>;

10
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/getAwsChunkedEncodingStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,10 @@
import { GetAwsChunkedEncodingStreamOptions } from "@smithy/types";
import { Readable } from "node:stream";
/**
 * @internal
 */
export declare function getAwsChunkedEncodingStream(stream: Readable, options: GetAwsChunkedEncodingStreamOptions): Readable;
/**
 * @internal
 */
export declare function getAwsChunkedEncodingStream(stream: ReadableStream, options: GetAwsChunkedEncodingStreamOptions): ReadableStream;

7
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/headStream.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,7 @@
/**
 * Caution: the input stream must be destroyed separately, this function does not do so.
 * @internal
 * @param stream
 * @param bytes - read head bytes from the stream and discard the rest of it.
 */
export declare function headStream(stream: ReadableStream, bytes: number): Promise<Uint8Array>;

9
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/headStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,9 @@
import { Readable } from "stream";
/**
 * Caution: the input stream must be destroyed separately, this function does not do so.
 *
 * @internal
 * @param stream - to be read.
 * @param bytes - read head bytes from the stream and discard the rest of it.
 */
export declare const headStream: (stream: Readable | ReadableStream, bytes: number) => Promise<Uint8Array>;

12
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,12 @@
export * from "./blob/Uint8ArrayBlobAdapter";
export * from "./checksum/ChecksumStream";
export * from "./checksum/createChecksumStream";
export * from "./createBufferedReadable";
export * from "./getAwsChunkedEncodingStream";
export * from "./headStream";
export * from "./sdk-stream-mixin";
export * from "./splitStream";
/**
 * @internal
 */
export { isReadableStream, isBlob } from "./stream-type-check";

7
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/sdk-stream-mixin.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,7 @@
import { SdkStream } from "@smithy/types";
/**
 * The stream handling utility functions for browsers and React Native
 *
 * @internal
 */
export declare const sdkStreamMixin: (stream: unknown) => SdkStream<ReadableStream | Blob>;

8
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/sdk-stream-mixin.d.ts
generated
vendored
Normal file
@@ -0,0 +1,8 @@
import { SdkStream } from "@smithy/types";
import { Readable } from "stream";
/**
 * The function that mixes in the utility functions to help consuming runtime-specific payload stream.
 *
 * @internal
 */
export declare const sdkStreamMixin: (stream: unknown) => SdkStream<ReadableStream | Blob> | SdkStream<Readable>;

8
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/splitStream.browser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,8 @@
/**
 * @param stream
 * @returns stream split into two identical streams.
 */
export declare function splitStream(stream: ReadableStream | Blob): Promise<[
    ReadableStream,
    ReadableStream
]>;

17
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/splitStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,17 @@
import { Readable } from "stream";
/**
 * @internal
 * @param stream - to be split.
 * @returns stream split into two identical streams.
 */
export declare function splitStream(stream: Readable): Promise<[
    Readable,
    Readable
]>;
/**
 * @internal
 */
export declare function splitStream(stream: ReadableStream): Promise<[
    ReadableStream,
    ReadableStream
]>;

17
backend/node_modules/@smithy/util-stream/dist-types/ts3.4/stream-type-check.d.ts
generated
vendored
Normal file
@@ -0,0 +1,17 @@
/**
 * Alias prevents compiler from turning
 * ReadableStream into ReadableStream<any>, which is incompatible
 * with the NodeJS.ReadableStream global type.
 *
 * @internal
 */
type ReadableStreamType = ReadableStream;
/**
 * @internal
 */
export declare const isReadableStream: (stream: unknown) => stream is ReadableStreamType;
/**
 * @internal
 */
export declare const isBlob: (blob: unknown) => blob is Blob;
export {};

99
backend/node_modules/@smithy/util-stream/package.json
generated
vendored
Normal file
@@ -0,0 +1,99 @@
{
  "name": "@smithy/util-stream",
  "version": "4.5.10",
  "scripts": {
    "build": "concurrently 'yarn:build:cjs' 'yarn:build:es' 'yarn:build:types && yarn build:types:downlevel'",
    "build:cjs": "node ../../scripts/inline util-stream",
    "build:es": "yarn g:tsc -p tsconfig.es.json",
    "build:types": "yarn g:tsc -p tsconfig.types.json",
    "build:types:downlevel": "rimraf dist-types/ts3.4 && downlevel-dts dist-types dist-types/ts3.4",
    "stage-release": "rimraf ./.release && yarn pack && mkdir ./.release && tar zxvf ./package.tgz --directory ./.release && rm ./package.tgz",
    "clean": "rimraf ./dist-* && rimraf *.tsbuildinfo || exit 0",
    "lint": "eslint -c ../../.eslintrc.js \"src/**/*.ts\"",
    "format": "prettier --config ../../prettier.config.js --ignore-path ../../.prettierignore --write \"**/*.{ts,md,json}\"",
    "extract:docs": "api-extractor run --local",
    "test": "yarn g:vitest run && yarn test:browser",
    "test:integration": "yarn g:vitest run -c vitest.config.integ.mts",
    "test:watch": "yarn g:vitest watch",
    "test:integration:watch": "yarn g:vitest watch -c vitest.config.integ.mts",
    "test:browser": "yarn g:vitest run -c vitest.config.browser.mts",
    "test:browser:watch": "yarn g:vitest watch -c vitest.config.browser.mts"
  },
  "main": "./dist-cjs/index.js",
  "module": "./dist-es/index.js",
  "types": "./dist-types/index.d.ts",
  "author": {
    "name": "AWS SDK for JavaScript Team",
    "url": "https://aws.amazon.com/javascript/"
  },
  "license": "Apache-2.0",
  "sideEffects": false,
  "dependencies": {
    "@smithy/fetch-http-handler": "^5.3.9",
    "@smithy/node-http-handler": "^4.4.8",
    "@smithy/types": "^4.12.0",
    "@smithy/util-base64": "^4.3.0",
    "@smithy/util-buffer-from": "^4.2.0",
    "@smithy/util-hex-encoding": "^4.2.0",
    "@smithy/util-utf8": "^4.2.0",
    "tslib": "^2.6.2"
  },
  "devDependencies": {
    "@smithy/util-test": "^0.2.8",
    "@types/node": "^18.11.9",
    "concurrently": "7.0.0",
    "downlevel-dts": "0.10.1",
    "rimraf": "5.0.10",
    "typedoc": "0.23.23"
  },
  "engines": {
    "node": ">=18.0.0"
  },
  "typesVersions": {
    "<4.0": {
      "dist-types/*": [
        "dist-types/ts3.4/*"
      ]
    }
  },
  "files": [
    "dist-*/**"
  ],
  "browser": {
    "./dist-es/checksum/ChecksumStream": "./dist-es/checksum/ChecksumStream.browser",
    "./dist-es/checksum/createChecksumStream": "./dist-es/checksum/createChecksumStream.browser",
    "./dist-es/createBufferedReadable": "./dist-es/createBufferedReadableStream",
    "./dist-es/getAwsChunkedEncodingStream": "./dist-es/getAwsChunkedEncodingStream.browser",
    "./dist-es/headStream": "./dist-es/headStream.browser",
    "./dist-es/sdk-stream-mixin": "./dist-es/sdk-stream-mixin.browser",
    "./dist-es/splitStream": "./dist-es/splitStream.browser"
  },
  "react-native": {
    "./dist-es/checksum/createChecksumStream": "./dist-es/checksum/createChecksumStream.browser",
    "./dist-es/checksum/ChecksumStream": "./dist-es/checksum/ChecksumStream.browser",
    "./dist-es/getAwsChunkedEncodingStream": "./dist-es/getAwsChunkedEncodingStream.browser",
    "./dist-es/sdk-stream-mixin": "./dist-es/sdk-stream-mixin.browser",
    "./dist-es/headStream": "./dist-es/headStream.browser",
    "./dist-es/splitStream": "./dist-es/splitStream.browser",
    "./dist-es/createBufferedReadable": "./dist-es/createBufferedReadableStream",
    "./dist-cjs/checksum/createChecksumStream": "./dist-cjs/checksum/createChecksumStream.browser",
    "./dist-cjs/checksum/ChecksumStream": "./dist-cjs/checksum/ChecksumStream.browser",
    "./dist-cjs/getAwsChunkedEncodingStream": "./dist-cjs/getAwsChunkedEncodingStream.browser",
    "./dist-cjs/sdk-stream-mixin": "./dist-cjs/sdk-stream-mixin.browser",
    "./dist-cjs/headStream": "./dist-cjs/headStream.browser",
    "./dist-cjs/splitStream": "./dist-cjs/splitStream.browser",
    "./dist-cjs/createBufferedReadable": "./dist-cjs/createBufferedReadableStream"
  },
  "homepage": "https://github.com/smithy-lang/smithy-typescript/tree/main/packages/util-stream",
  "repository": {
    "type": "git",
    "url": "https://github.com/smithy-lang/smithy-typescript.git",
    "directory": "packages/util-stream"
  },
  "typedoc": {
    "entryPoint": "src/index.ts"
  },
  "publishConfig": {
    "directory": ".release/package"
  }
}