Connection with the API

This commit is contained in:
janmaroto 2022-02-09 18:30:03 +01:00
commit b12369cb47
48513 changed files with 7391639 additions and 7 deletions

6
node_modules/streamroller/.travis.yml generated vendored Executable file

@@ -0,0 +1,6 @@
language: node_js
sudo: false
node_js:
- "12"
- "10"
- "8"

62
node_modules/streamroller/CHANGELOG.md generated vendored Executable file

@@ -0,0 +1,62 @@
# Streamroller Changelog
## 2.2.4
- [Fix for incorrect filename matching](https://github.com/log4js-node/streamroller/pull/61) - thanks [@rnd-debug](https://github.com/rnd-debug)
## 2.2.3
- [Fix for unhandled promise rejection during cleanup](https://github.com/log4js-node/streamroller/pull/56)
## 2.2.2
- [Fix for overwriting current file when using date rotation](https://github.com/log4js-node/streamroller/pull/54)
## 2.2.1
- Fix for num to keep not working when date pattern is all digits (forgot to do a PR for this one)
## 2.2.0
- [Fallback to copy and truncate when file is busy](https://github.com/log4js-node/streamroller/pull/53)
## 2.1.0
- [Improve Windows support (closing streams)](https://github.com/log4js-node/streamroller/pull/52)
## 2.0.0
- [Remove support for node v6](https://github.com/log4js-node/streamroller/pull/44)
- [Replace lodash with native alternatives](https://github.com/log4js-node/streamroller/pull/45) - thanks [@devoto13](https://github.com/devoto13)
- [Simplify filename formatting and parsing](https://github.com/log4js-node/streamroller/pull/46)
- [Removed async lib from main code](https://github.com/log4js-node/streamroller/pull/47)
- [Fix timezone issues in tests](https://github.com/log4js-node/streamroller/pull/48) - thanks [@devoto13](https://github.com/devoto13)
- [Fix for flag values that need existing file size](https://github.com/log4js-node/streamroller/pull/49)
- [Refactor for better readability](https://github.com/log4js-node/streamroller/pull/50)
- [Removed async lib from test code](https://github.com/log4js-node/streamroller/pull/51)
## 1.0.6
- [Fix for overwriting old backup files](https://github.com/log4js-node/streamroller/pull/43)
- Updated lodash to 4.17.14
## 1.0.5
- [Updated dependencies](https://github.com/log4js-node/streamroller/pull/38)
- [Fix for initial file date when appending to existing file](https://github.com/log4js-node/streamroller/pull/40)
## 1.0.4
- [Fix for initial size when appending to existing file](https://github.com/log4js-node/streamroller/pull/35)
## 1.0.3
- [Fix for crash when pattern is all digits](https://github.com/log4js-node/streamroller/pull/33)
## 1.0.2
- is exactly the same as 1.0.1, due to me being an idiot and not pulling before I pushed
## Previous versions
Previous release details are available by browsing the milestones in github.

20
node_modules/streamroller/LICENSE generated vendored Executable file

@@ -0,0 +1,20 @@
The MIT License (MIT)
Copyright (c) 2013 Gareth Jones
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

57
node_modules/streamroller/README.md generated vendored Executable file

@@ -0,0 +1,57 @@
streamroller
============
node.js file streams that roll over when they reach a maximum size, or a date/time.
npm install streamroller
## usage
var rollers = require('streamroller');
var stream = new rollers.RollingFileStream('myfile', 1024, 3);
stream.write("stuff");
stream.end();
The streams behave the same as standard node.js streams, except that when certain conditions are met they will rename the current file to a backup and start writing to a new file.
### new RollingFileStream(filename [, maxSize, numBackups, options])
* `filename` (String)
* `maxSize` - the size in bytes to trigger a rollover; if not provided, this defaults to MAX_SAFE_INTEGER and the stream will not roll.
* `numBackups` - the number of old files to keep
* `options` - Object
* `encoding` - defaults to 'utf8'
* `mode` - defaults to 0644
* `flags` - defaults to 'a' (see [fs.open](https://nodejs.org/dist/latest-v8.x/docs/api/fs.html#fs_fs_open_path_flags_mode_callback) for more details)
* `compress` - (boolean) defaults to `false` - compress the backup files using gzip (files will have `.gz` extension).
* `keepFileExt` - (boolean) defaults to `false` - keep the file original extension. e.g.: `abc.log -> abc.1.log`.
This returns a `WritableStream`. When the current file being written to (given by `filename`) gets up to or larger than `maxSize`, then the current file will be renamed to `filename.1` and a new file will start being written to. Up to `numBackups` of old files are maintained, so if `numBackups` is 3 then there will be 4 files:
<pre>
filename
filename.1
filename.2
filename.3
</pre>
When filename size >= maxSize then:
<pre>
filename -> filename.1
filename.1 -> filename.2
filename.2 -> filename.3
filename.3 gets overwritten
filename is a new file
</pre>
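The rename cascade above can be expressed as a small pure function (an illustrative sketch only; `rolloverPlan` is not part of streamroller's API):

```javascript
// Illustrative sketch of the backup-shift cascade described above.
// Not streamroller code; the name rolloverPlan is hypothetical.
function rolloverPlan(filename, numBackups) {
  const ops = [];
  // Shift existing backups up, oldest first; the oldest is overwritten.
  for (let i = numBackups - 1; i >= 1; i--) {
    ops.push([`${filename}.${i}`, `${filename}.${i + 1}`]);
  }
  // Finally the current file becomes backup number 1.
  ops.push([filename, `${filename}.1`]);
  return ops;
}

console.log(rolloverPlan('myfile', 3));
// [ [ 'myfile.2', 'myfile.3' ],
//   [ 'myfile.1', 'myfile.2' ],
//   [ 'myfile', 'myfile.1' ] ]
```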
### new DateRollingFileStream(filename, pattern, options)
* `filename` (String)
* `pattern` (String) - the date pattern to trigger rolling (see below)
* `options` - Object
* `encoding` - defaults to 'utf8'
* `mode` defaults to 0644
* `flags` defaults to 'a' (see [fs.open](https://nodejs.org/dist/latest-v8.x/docs/api/fs.html#fs_fs_open_path_flags_mode_callback) for more details)
* `compress` - (boolean) compress the backup files, defaults to false
* `keepFileExt` - (boolean) defaults to `false` - keep the file original extension. e.g.: `abc.log -> abc.2013-08-30.log`.
* `alwaysIncludePattern` - (boolean) extend the initial file with the pattern, defaults to false
* `daysToKeep` - (integer) if this is greater than 0, then files older than `daysToKeep` days will be deleted during file rolling.
This returns a `WritableStream`. When the current time, formatted as `pattern`, changes then the current file will be renamed to `filename.formattedDate` where `formattedDate` is the result of processing the date through the pattern, and a new file will begin to be written. Streamroller uses [date-format](http://github.com/nomiddlename/date-format) to format dates, and the `pattern` should use the date-format format. e.g. with a `pattern` of `".yyyy-MM-dd"`, and assuming today is August 29, 2013 then writing to the stream today will just write to `filename`. At midnight (or more precisely, at the next file write after midnight), `filename` will be renamed to `filename.2013-08-29` and a new `filename` will be created. If `options.alwaysIncludePattern` is true, then the initial file will be `filename.2013-08-29` and no renaming will occur at midnight, but a new file will be written to with the name `filename.2013-08-30`.
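As a sketch of the naming rule described above (illustrative only; `formatDate` and `rolledName` are hypothetical helpers, not streamroller code):

```javascript
// Formats a date as yyyy-MM-dd, the default pattern (hypothetical helper).
function formatDate(d) {
  const pad = n => String(n).padStart(2, '0');
  return `${d.getFullYear()}-${pad(d.getMonth() + 1)}-${pad(d.getDate())}`;
}

// The name the current file is renamed to when the date rolls over.
function rolledName(filename, lastWriteDate) {
  return `${filename}.${formatDate(lastWriteDate)}`;
}

console.log(rolledName('myfile', new Date(2013, 7, 29)));
// myfile.2013-08-29
```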

33
node_modules/streamroller/lib/DateRollingFileStream.js generated vendored Executable file

@@ -0,0 +1,33 @@
const RollingFileWriteStream = require('./RollingFileWriteStream');
// just to adapt the previous version
class DateRollingFileStream extends RollingFileWriteStream {
constructor(filename, pattern, options) {
if (pattern && typeof(pattern) === 'object') {
options = pattern;
pattern = null;
}
if (!options) {
options = {};
}
if (!pattern) {
pattern = 'yyyy-MM-dd';
}
if (options.daysToKeep) {
options.numToKeep = options.daysToKeep;
}
if (pattern.startsWith('.')) {
pattern = pattern.substring(1);
}
options.pattern = pattern;
super(filename, options);
this.mode = this.options.mode;
}
get theStream() {
return this.currentFileStream;
}
}
module.exports = DateRollingFileStream;

27
node_modules/streamroller/lib/RollingFileStream.js generated vendored Executable file

@@ -0,0 +1,27 @@
const RollingFileWriteStream = require('./RollingFileWriteStream');
// just to adapt the previous version
class RollingFileStream extends RollingFileWriteStream {
constructor(filename, size, backups, options) {
if (!options) {
options = {};
}
if (size) {
options.maxSize = size;
}
if (!backups) {
backups = 1;
}
options.numToKeep = backups;
super(filename, options);
this.backups = this.options.numToKeep;
this.size = this.options.maxSize;
}
get theStream() {
return this.currentFileStream;
}
}
module.exports = RollingFileStream;

267
node_modules/streamroller/lib/RollingFileWriteStream.js generated vendored Executable file

@@ -0,0 +1,267 @@
const debug = require("debug")("streamroller:RollingFileWriteStream");
const fs = require("fs-extra");
const path = require("path");
const newNow = require("./now");
const format = require("date-format");
const { Writable } = require("stream");
const fileNameFormatter = require("./fileNameFormatter");
const fileNameParser = require("./fileNameParser");
const moveAndMaybeCompressFile = require("./moveAndMaybeCompressFile");
/**
* RollingFileWriteStream is mainly used when writing to a file rolling by date or size.
* RollingFileWriteStream inherits from stream.Writable
*/
class RollingFileWriteStream extends Writable {
/**
* Create a RollingFileWriteStream
* @constructor
* @param {string} filePath - The file path to write.
* @param {object} options - The extra options
* @param {number} options.numToKeep - The max numbers of files to keep.
* @param {number} options.maxSize - The maxSize one file can reach. Unit is Byte.
* This should be more than 1024. The default is Number.MAX_SAFE_INTEGER.
* @param {string} options.mode - The mode of the files. The default is '0644'. Refer to stream.writable for more.
* @param {string} options.flags - The default is 'a'. Refer to stream.flags for more.
* @param {boolean} options.compress - Whether to compress backup files.
* @param {boolean} options.keepFileExt - Whether to keep the file extension.
* @param {string} options.pattern - The date string pattern in the file name.
* @param {boolean} options.alwaysIncludePattern - Whether to add date to the name of the first file.
*/
constructor(filePath, options) {
debug(`constructor: creating RollingFileWriteStream. path=${filePath}`);
super(options);
this.options = this._parseOption(options);
this.fileObject = path.parse(filePath);
if (this.fileObject.dir === "") {
this.fileObject = path.parse(path.join(process.cwd(), filePath));
}
this.fileFormatter = fileNameFormatter({
file: this.fileObject,
alwaysIncludeDate: this.options.alwaysIncludePattern,
needsIndex: this.options.maxSize < Number.MAX_SAFE_INTEGER,
compress: this.options.compress,
keepFileExt: this.options.keepFileExt
});
this.fileNameParser = fileNameParser({
file: this.fileObject,
keepFileExt: this.options.keepFileExt,
pattern: this.options.pattern
});
this.state = {
currentSize: 0
};
if (this.options.pattern) {
this.state.currentDate = format(this.options.pattern, newNow());
}
this.filename = this.fileFormatter({
index: 0,
date: this.state.currentDate
});
if (["a", "a+", "as", "as+"].includes(this.options.flags)) {
this._setExistingSizeAndDate();
}
debug(
`constructor: create new file ${this.filename}, state=${JSON.stringify(
this.state
)}`
);
this._renewWriteStream();
}
_setExistingSizeAndDate() {
try {
const stats = fs.statSync(this.filename);
this.state.currentSize = stats.size;
if (this.options.pattern) {
this.state.currentDate = format(this.options.pattern, stats.mtime);
}
} catch (e) {
//file does not exist, that's fine - move along
return;
}
}
_parseOption(rawOptions) {
const defaultOptions = {
maxSize: Number.MAX_SAFE_INTEGER,
numToKeep: Number.MAX_SAFE_INTEGER,
encoding: "utf8",
mode: parseInt("0644", 8),
flags: "a",
compress: false,
keepFileExt: false,
alwaysIncludePattern: false
};
const options = Object.assign({}, defaultOptions, rawOptions);
if (options.maxSize <= 0) {
throw new Error(`options.maxSize (${options.maxSize}) should be > 0`);
}
if (options.numToKeep <= 0) {
throw new Error(`options.numToKeep (${options.numToKeep}) should be > 0`);
}
debug(
`_parseOption: creating stream with option=${JSON.stringify(options)}`
);
return options;
}
_final(callback) {
this.currentFileStream.end("", this.options.encoding, callback);
}
_write(chunk, encoding, callback) {
this._shouldRoll().then(() => {
debug(
`_write: writing chunk. ` +
`file=${this.currentFileStream.path} ` +
`state=${JSON.stringify(this.state)} ` +
`chunk=${chunk}`
);
this.currentFileStream.write(chunk, encoding, e => {
this.state.currentSize += chunk.length;
callback(e);
});
});
}
async _shouldRoll() {
if (this._dateChanged() || this._tooBig()) {
debug(
`_shouldRoll: rolling because dateChanged? ${this._dateChanged()} or tooBig? ${this._tooBig()}`
);
await this._roll();
}
}
_dateChanged() {
return (
this.state.currentDate &&
this.state.currentDate !== format(this.options.pattern, newNow())
);
}
_tooBig() {
return this.state.currentSize >= this.options.maxSize;
}
_roll() {
debug(`_roll: closing the current stream`);
return new Promise((resolve, reject) => {
this.currentFileStream.end("", this.options.encoding, () => {
this._moveOldFiles()
.then(resolve)
.catch(reject);
});
});
}
async _moveOldFiles() {
const files = await this._getExistingFiles();
const todaysFiles = this.state.currentDate
? files.filter(f => f.date === this.state.currentDate)
: files;
for (let i = todaysFiles.length; i >= 0; i--) {
debug(`_moveOldFiles: i = ${i}`);
const sourceFilePath = this.fileFormatter({
date: this.state.currentDate,
index: i
});
const targetFilePath = this.fileFormatter({
date: this.state.currentDate,
index: i + 1
});
await moveAndMaybeCompressFile(
sourceFilePath,
targetFilePath,
this.options.compress && i === 0
);
}
this.state.currentSize = 0;
this.state.currentDate = this.state.currentDate
? format(this.options.pattern, newNow())
: null;
debug(
`_moveOldFiles: finished rolling files. state=${JSON.stringify(
this.state
)}`
);
this._renewWriteStream();
// wait for the file to be open before cleaning up old ones,
// otherwise the daysToKeep calculations can be off
await new Promise((resolve, reject) => {
this.currentFileStream.write("", "utf8", () => {
this._clean()
.then(resolve)
.catch(reject);
});
});
}
// Sorted from the oldest to the latest
async _getExistingFiles() {
const files = await fs.readdir(this.fileObject.dir).catch(() => []);
debug(`_getExistingFiles: files=${files}`);
const existingFileDetails = files
.map(n => this.fileNameParser(n))
.filter(n => n);
const getKey = n =>
(n.timestamp ? n.timestamp : newNow().getTime()) - n.index;
existingFileDetails.sort((a, b) => getKey(a) - getKey(b));
return existingFileDetails;
}
_renewWriteStream() {
fs.ensureDirSync(this.fileObject.dir);
const filePath = this.fileFormatter({
date: this.state.currentDate,
index: 0
});
const ops = {
flags: this.options.flags,
encoding: this.options.encoding,
mode: this.options.mode
};
this.currentFileStream = fs.createWriteStream(filePath, ops);
this.currentFileStream.on("error", e => {
this.emit("error", e);
});
}
async _clean() {
const existingFileDetails = await this._getExistingFiles();
debug(
`_clean: numToKeep = ${this.options.numToKeep}, existingFiles = ${existingFileDetails.length}`
);
debug("_clean: existing files are: ", existingFileDetails);
if (this._tooManyFiles(existingFileDetails.length)) {
const fileNamesToRemove = existingFileDetails
.slice(0, existingFileDetails.length - this.options.numToKeep - 1)
.map(f => path.format({ dir: this.fileObject.dir, base: f.filename }));
await deleteFiles(fileNamesToRemove);
}
}
_tooManyFiles(numFiles) {
return this.options.numToKeep > 0 && numFiles > this.options.numToKeep;
}
}
const deleteFiles = fileNames => {
debug(`deleteFiles: files to delete: ${fileNames}`);
return Promise.all(fileNames.map(f => fs.unlink(f).catch((e) => {
debug(`deleteFiles: error when unlinking ${f}, ignoring. Error was ${e}`);
})));
};
module.exports = RollingFileWriteStream;
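The rolling decision spread across `_shouldRoll`, `_dateChanged` and `_tooBig` above reduces to a single predicate; a minimal restatement for clarity (a sketch, not the module's exported API):

```javascript
// Restates the roll condition from RollingFileWriteStream (illustrative).
// state and options mirror the fields used by the class above.
function shouldRoll(state, options, formattedNow) {
  const dateChanged =
    Boolean(state.currentDate) && state.currentDate !== formattedNow;
  const tooBig = state.currentSize >= options.maxSize;
  return dateChanged || tooBig;
}

console.log(shouldRoll(
  { currentDate: '2013-08-29', currentSize: 10 },
  { maxSize: 1024 },
  '2013-08-30'
)); // true - the date changed, so the file rolls
```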

37
node_modules/streamroller/lib/fileNameFormatter.js generated vendored Executable file

@@ -0,0 +1,37 @@
const debug = require("debug")("streamroller:fileNameFormatter");
const path = require("path");
const FILENAME_SEP = ".";
const ZIP_EXT = ".gz";
module.exports = ({
file,
keepFileExt,
needsIndex,
alwaysIncludeDate,
compress
}) => {
const dirAndName = path.join(file.dir, file.name);
const ext = f => f + file.ext;
const index = (f, i, d) =>
(needsIndex || !d) && i ? f + FILENAME_SEP + i : f;
const date = (f, i, d) => {
return (i > 0 || alwaysIncludeDate) && d ? f + FILENAME_SEP + d : f;
};
const gzip = (f, i) => (i && compress ? f + ZIP_EXT : f);
const parts = keepFileExt
? [date, index, ext, gzip]
: [ext, date, index, gzip];
return ({ date, index }) => {
debug(`_formatFileName: date=${date}, index=${index}`);
return parts.reduce(
(filename, part) => part(filename, index, date),
dirAndName
);
};
};
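The reduce-over-parts pipeline above is easiest to see in miniature. The sketch below reimplements just the extension and index parts (a simplified illustration; `makeFormatter` is hypothetical, and the real formatter also handles dates and compression):

```javascript
// Miniature version of the fileNameFormatter pipeline (illustrative only).
const FILENAME_SEP = '.';

function makeFormatter({ name, ext, keepFileExt }) {
  const addExt = f => f + ext;
  const addIndex = (f, i) => (i ? f + FILENAME_SEP + i : f);
  // keepFileExt changes the order the parts run in, as in the real module.
  const parts = keepFileExt ? [addIndex, addExt] : [addExt, addIndex];
  return index => parts.reduce((f, part) => part(f, index), name);
}

const plain = makeFormatter({ name: 'abc', ext: '.log', keepFileExt: false });
const keep = makeFormatter({ name: 'abc', ext: '.log', keepFileExt: true });
console.log(plain(1)); // abc.log.1
console.log(keep(1));  // abc.1.log
```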

95
node_modules/streamroller/lib/fileNameParser.js generated vendored Executable file

@@ -0,0 +1,95 @@
const debug = require("debug")("streamroller:fileNameParser");
const FILENAME_SEP = ".";
const ZIP_EXT = ".gz";
const format = require("date-format");
module.exports = ({ file, keepFileExt, pattern }) => {
// All these functions take two arguments: f, the filename, and p, the result placeholder
// They return the filename with any matching parts removed.
// The "zip" function, for instance, removes the ".gz" part of the filename (if present)
const zip = (f, p) => {
if (f.endsWith(ZIP_EXT)) {
debug("it is gzipped");
p.isCompressed = true;
return f.slice(0, -1 * ZIP_EXT.length);
}
return f;
};
const __NOT_MATCHING__ = "__NOT_MATCHING__";
const extAtEnd = f => {
if (f.startsWith(file.name) && f.endsWith(file.ext)) {
debug("it starts and ends with the right things");
return f.slice(file.name.length + 1, -1 * file.ext.length);
}
return __NOT_MATCHING__;
};
const extInMiddle = f => {
if (f.startsWith(file.base)) {
debug("it starts with the right things");
return f.slice(file.base.length + 1);
}
return __NOT_MATCHING__;
};
const dateAndIndex = (f, p) => {
const items = f.split(FILENAME_SEP);
let indexStr = items[items.length - 1];
debug("items: ", items, ", indexStr: ", indexStr);
let dateStr = f;
if (indexStr !== undefined && indexStr.match(/^\d+$/)) {
dateStr = f.slice(0, -1 * (indexStr.length + 1));
debug(`dateStr is ${dateStr}`);
if (pattern && !dateStr) {
dateStr = indexStr;
indexStr = "0";
}
} else {
indexStr = "0";
}
try {
// Two arguments for new Date() are intentional. This will set other date
// components to minimal values in the current timezone instead of UTC,
// as new Date(0) will do.
const date = format.parse(pattern, dateStr, new Date(0, 0));
if (format.asString(pattern, date) !== dateStr) return f;
p.index = parseInt(indexStr, 10);
p.date = dateStr;
p.timestamp = date.getTime();
return "";
} catch (e) {
//not a valid date, don't panic.
debug(`Problem parsing ${dateStr} as ${pattern}, error was: `, e);
return f;
}
};
const index = (f, p) => {
if (f.match(/^\d+$/)) {
debug("it has an index");
p.index = parseInt(f, 10);
return "";
}
return f;
};
let parts = [
zip,
keepFileExt ? extAtEnd : extInMiddle,
pattern ? dateAndIndex : index
];
return filename => {
let result = { filename, index: 0, isCompressed: false };
// pass the filename through each of the file part parsers
let whatsLeftOver = parts.reduce(
(remains, part) => part(remains, result),
filename
);
// if there's anything left after parsing, then it wasn't a valid filename
return whatsLeftOver ? null : result;
};
};
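The parser above threads the filename through a chain of strip-and-record stages; a simplified, self-contained sketch of the same idea (`parseBackupName` is hypothetical and handles only the index and `.gz` parts, not date patterns):

```javascript
// Simplified sketch of the fileNameParser pipeline (illustrative only).
const ZIP_EXT = '.gz';

function parseBackupName(base, ext, filename) {
  const result = { filename, index: 0, isCompressed: false };
  // Stage 1: strip a trailing .gz and record compression.
  const zip = f => {
    if (f.endsWith(ZIP_EXT)) {
      result.isCompressed = true;
      return f.slice(0, -ZIP_EXT.length);
    }
    return f;
  };
  // Stage 2: strip the "base.ext" prefix, or reject the name.
  const stripBase = f =>
    f.startsWith(base + ext) ? f.slice((base + ext).length + 1) : null;
  // Stage 3: consume a numeric index.
  const index = f => {
    if (/^\d+$/.test(f)) {
      result.index = parseInt(f, 10);
      return '';
    }
    return f;
  };
  let rest = zip(filename);
  rest = stripBase(rest);
  if (rest === null) return null; // not one of our backup files
  rest = index(rest);
  return rest ? null : result;    // any leftovers mean no match
}

console.log(parseBackupName('abc', '.log', 'abc.log.2.gz'));
// { filename: 'abc.log.2.gz', index: 2, isCompressed: true }
```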

5
node_modules/streamroller/lib/index.js generated vendored Executable file

@@ -0,0 +1,5 @@
module.exports = {
RollingFileWriteStream: require('./RollingFileWriteStream'),
RollingFileStream: require('./RollingFileStream'),
DateRollingFileStream: require('./DateRollingFileStream')
};

58
node_modules/streamroller/lib/moveAndMaybeCompressFile.js generated vendored Executable file

@@ -0,0 +1,58 @@
const debug = require('debug')('streamroller:moveAndMaybeCompressFile');
const fs = require('fs-extra');
const zlib = require('zlib');
const moveAndMaybeCompressFile = async (
sourceFilePath,
targetFilePath,
needCompress
) => {
if (sourceFilePath === targetFilePath) {
debug(
`moveAndMaybeCompressFile: source and target are the same, not doing anything`
);
return;
}
if (await fs.pathExists(sourceFilePath)) {
debug(
`moveAndMaybeCompressFile: moving file from ${sourceFilePath} to ${targetFilePath} ${
needCompress ? "with" : "without"
} compress`
);
if (needCompress) {
await new Promise((resolve, reject) => {
fs.createReadStream(sourceFilePath)
.pipe(zlib.createGzip())
.pipe(fs.createWriteStream(targetFilePath))
.on("finish", () => {
debug(
`moveAndMaybeCompressFile: finished compressing ${targetFilePath}, deleting ${sourceFilePath}`
);
fs.unlink(sourceFilePath)
.then(resolve)
.catch(() => {
debug(`Deleting ${sourceFilePath} failed, truncating instead`);
fs.truncate(sourceFilePath).then(resolve).catch(reject)
});
});
});
} else {
debug(
`moveAndMaybeCompressFile: deleting file=${targetFilePath}, renaming ${sourceFilePath} to ${targetFilePath}`
);
try {
await fs.move(sourceFilePath, targetFilePath, { overwrite: true });
} catch (e) {
debug(
`moveAndMaybeCompressFile: error moving ${sourceFilePath} to ${targetFilePath}`, e
);
debug(`Trying copy+truncate instead`);
await fs.copy(sourceFilePath, targetFilePath, { overwrite: true });
await fs.truncate(sourceFilePath);
}
}
}
};
module.exports = moveAndMaybeCompressFile;

2
node_modules/streamroller/lib/now.js generated vendored Executable file

@@ -0,0 +1,2 @@
// allows us to inject a mock date in tests
module.exports = () => new Date();
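Keeping `now()` in its own tiny module lets tests substitute a fixed clock. A hypothetical consumer written in the same injection style (illustrative only, not streamroller code):

```javascript
// A hypothetical consumer that takes its clock as a parameter, so tests
// can pass a fixed "now" instead of the real one.
function makeClockedLogger(now) {
  return msg => `${now().toISOString()} ${msg}`;
}

// A test injects a deterministic clock:
const fixedNow = () => new Date(Date.UTC(2013, 7, 29, 12, 0, 0));
const log = makeClockedLogger(fixedNow);
console.log(log('hello')); // 2013-08-29T12:00:00.000Z hello
```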

12
node_modules/streamroller/node_modules/date-format/.eslintrc generated vendored Executable file

@@ -0,0 +1,12 @@
{
"extends": [
"eslint:recommended"
],
"env": {
"node": true,
"mocha": true
},
"plugins": [
"mocha"
]
}


@@ -0,0 +1,6 @@
language: node_js
sudo: false
node_js:
- "10"
- "8"
- "6"

20
node_modules/streamroller/node_modules/date-format/LICENSE generated vendored Executable file

@@ -0,0 +1,20 @@
The MIT License (MIT)
Copyright (c) 2013 Gareth Jones
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

58
node_modules/streamroller/node_modules/date-format/README.md generated vendored Executable file

@@ -0,0 +1,58 @@
date-format
===========
node.js formatting of Date objects as strings. Probably exactly the same as some other library out there.
```sh
npm install date-format
```
usage
=====
Formatting dates as strings
----
```javascript
var format = require('date-format');
format.asString(); //defaults to ISO8601 format and current date.
format.asString(new Date()); //defaults to ISO8601 format
format.asString('hh:mm:ss.SSS', new Date()); //just the time
```
or
```javascript
var format = require('date-format');
format(); //defaults to ISO8601 format and current date.
format(new Date());
format('hh:mm:ss.SSS', new Date());
```
Format string can be anything, but the following letters will be replaced (and leading zeroes added if necessary):
* dd - `date.getDate()`
* MM - `date.getMonth() + 1`
* yy - `date.getFullYear().toString().substring(2, 4)`
* yyyy - `date.getFullYear()`
* hh - `date.getHours()`
* mm - `date.getMinutes()`
* ss - `date.getSeconds()`
* SSS - `date.getMilliseconds()`
* O - timezone offset in +hm format (note that time will be in UTC if displaying offset)
Built-in formats:
* `format.ISO8601_FORMAT` - `2017-03-14T14:10:20.391` (local time used)
* `format.ISO8601_WITH_TZ_OFFSET_FORMAT` - `2017-03-14T03:10:20.391+1100` (UTC + TZ used)
* `format.DATETIME_FORMAT` - `14 03 2017 14:10:20.391` (local time used)
* `format.ABSOLUTETIME_FORMAT` - `14:10:20.391` (local time used)
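The replacement table above can be illustrated with a stripped-down formatter covering a few of the letters (a sketch only, not date-format's implementation, which also pads years and handles `yy`, `ss`, `SSS`, and `O`):

```javascript
// Miniature illustration of the letter-replacement table above.
// Hypothetical helper, not the real date-format code.
function miniFormat(pattern, date) {
  const pad = n => String(n).padStart(2, '0');
  return pattern
    .replace(/yyyy/g, date.getFullYear())
    .replace(/MM/g, pad(date.getMonth() + 1))
    .replace(/dd/g, pad(date.getDate()))
    .replace(/hh/g, pad(date.getHours()))
    .replace(/mm/g, pad(date.getMinutes()));
}

console.log(miniFormat('yyyy-MM-dd hh:mm', new Date(2017, 2, 14, 14, 10)));
// 2017-03-14 14:10
```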
Parsing strings as dates
----
The date format library has limited ability to parse strings into dates. It can convert strings created using date format patterns (as above), but if you're looking for anything more sophisticated than that you should probably look for a better library ([momentjs](https://momentjs.com) does pretty much everything).
```javascript
var format = require('date-format');
// pass in the format of the string as first argument
format.parse(format.ISO8601_FORMAT, '2017-03-14T14:10:20.391');
// returns Date
```


@@ -0,0 +1,195 @@
"use strict";
function padWithZeros(vNumber, width) {
var numAsString = vNumber.toString();
while (numAsString.length < width) {
numAsString = "0" + numAsString;
}
return numAsString;
}
function addZero(vNumber) {
return padWithZeros(vNumber, 2);
}
/**
* Formats the TimeOffset
* Thanks to http://www.svendtofte.com/code/date_format/
* @private
*/
function offset(timezoneOffset) {
var os = Math.abs(timezoneOffset);
var h = String(Math.floor(os / 60));
var m = String(os % 60);
if (h.length === 1) {
h = "0" + h;
}
if (m.length === 1) {
m = "0" + m;
}
return timezoneOffset < 0 ? "+" + h + m : "-" + h + m;
}
function datePart(date, displayUTC, part) {
return displayUTC ? date["getUTC" + part]() : date["get" + part]();
}
function asString(format, date) {
if (typeof format !== "string") {
date = format;
format = module.exports.ISO8601_FORMAT;
}
if (!date) {
date = module.exports.now();
}
var displayUTC = format.indexOf("O") > -1;
var vDay = addZero(datePart(date, displayUTC, "Date"));
var vMonth = addZero(datePart(date, displayUTC, "Month") + 1);
var vYearLong = addZero(datePart(date, displayUTC, "FullYear"));
var vYearShort = addZero(vYearLong.substring(2, 4));
var vYear = format.indexOf("yyyy") > -1 ? vYearLong : vYearShort;
var vHour = addZero(datePart(date, displayUTC, "Hours"));
var vMinute = addZero(datePart(date, displayUTC, "Minutes"));
var vSecond = addZero(datePart(date, displayUTC, "Seconds"));
var vMillisecond = padWithZeros(
datePart(date, displayUTC, "Milliseconds"),
3
);
var vTimeZone = offset(date.getTimezoneOffset());
var formatted = format
.replace(/dd/g, vDay)
.replace(/MM/g, vMonth)
.replace(/y{1,4}/g, vYear)
.replace(/hh/g, vHour)
.replace(/mm/g, vMinute)
.replace(/ss/g, vSecond)
.replace(/SSS/g, vMillisecond)
.replace(/O/g, vTimeZone);
return formatted;
}
function extractDateParts(pattern, str, missingValuesDate) {
var matchers = [
{
pattern: /y{1,4}/,
regexp: "\\d{1,4}",
fn: function(date, value) {
date.setFullYear(value);
}
},
{
pattern: /MM/,
regexp: "\\d{1,2}",
fn: function(date, value) {
date.setMonth(value - 1);
}
},
{
pattern: /dd/,
regexp: "\\d{1,2}",
fn: function(date, value) {
date.setDate(value);
}
},
{
pattern: /hh/,
regexp: "\\d{1,2}",
fn: function(date, value) {
date.setHours(value);
}
},
{
pattern: /mm/,
regexp: "\\d\\d",
fn: function(date, value) {
date.setMinutes(value);
}
},
{
pattern: /ss/,
regexp: "\\d\\d",
fn: function(date, value) {
date.setSeconds(value);
}
},
{
pattern: /SSS/,
regexp: "\\d\\d\\d",
fn: function(date, value) {
date.setMilliseconds(value);
}
},
{
pattern: /O/,
regexp: "[+-]\\d{3,4}|Z",
fn: function(date, value) {
if (value === "Z") {
value = 0;
}
var offset = Math.abs(value);
var minutes = (offset % 100) + Math.floor(offset / 100) * 60;
date.setMinutes(date.getMinutes() + (value > 0 ? minutes : -minutes));
}
}
];
var parsedPattern = matchers.reduce(
function(p, m) {
if (m.pattern.test(p.regexp)) {
m.index = p.regexp.match(m.pattern).index;
p.regexp = p.regexp.replace(m.pattern, "(" + m.regexp + ")");
} else {
m.index = -1;
}
return p;
},
{ regexp: pattern, index: [] }
);
var dateFns = matchers.filter(function(m) {
return m.index > -1;
});
dateFns.sort(function(a, b) {
return a.index - b.index;
});
var matcher = new RegExp(parsedPattern.regexp);
var matches = matcher.exec(str);
if (matches) {
var date = missingValuesDate || module.exports.now();
dateFns.forEach(function(f, i) {
f.fn(date, matches[i + 1]);
});
return date;
}
throw new Error(
"String '" + str + "' could not be parsed as '" + pattern + "'"
);
}
function parse(pattern, str, missingValuesDate) {
if (!pattern) {
throw new Error("pattern must be supplied");
}
return extractDateParts(pattern, str, missingValuesDate);
}
/**
* Used for testing - replace this function with a fixed date.
*/
function now() {
return new Date();
}
module.exports = asString;
module.exports.asString = asString;
module.exports.parse = parse;
module.exports.now = now;
module.exports.ISO8601_FORMAT = "yyyy-MM-ddThh:mm:ss.SSS";
module.exports.ISO8601_WITH_TZ_OFFSET_FORMAT = "yyyy-MM-ddThh:mm:ss.SSSO";
module.exports.DATETIME_FORMAT = "dd MM yyyy hh:mm:ss.SSS";
module.exports.ABSOLUTETIME_FORMAT = "hh:mm:ss.SSS";


@@ -0,0 +1,64 @@
{
"_from": "date-format@^2.1.0",
"_id": "date-format@2.1.0",
"_inBundle": false,
"_integrity": "sha512-bYQuGLeFxhkxNOF3rcMtiZxvCBAquGzZm6oWA1oZ0g2THUzivaRhv8uOhdr19LmoobSOLoIAxeUK2RdbM8IFTA==",
"_location": "/streamroller/date-format",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "date-format@^2.1.0",
"name": "date-format",
"escapedName": "date-format",
"rawSpec": "^2.1.0",
"saveSpec": null,
"fetchSpec": "^2.1.0"
},
"_requiredBy": [
"/streamroller"
],
"_resolved": "https://registry.npmjs.org/date-format/-/date-format-2.1.0.tgz",
"_shasum": "31d5b5ea211cf5fd764cd38baf9d033df7e125cf",
"_spec": "date-format@^2.1.0",
"_where": "/home/jack/Documents/JDA/m14/projecte_janmaroto/node_modules/streamroller",
"author": {
"name": "Gareth Jones",
"email": "gareth.nomiddlename@gmail.com"
},
"bugs": {
"url": "https://github.com/nomiddlename/date-format/issues"
},
"bundleDependencies": false,
"deprecated": false,
"description": "Formatting Date objects as strings since 2013",
"devDependencies": {
"eslint": "^5.16.0",
"eslint-plugin-mocha": "^5.3.0",
"mocha": "^5.2.0",
"should": "^13.2.3"
},
"engines": {
"node": ">=4.0"
},
"gitHead": "bf59015ab6c9e86454b179374f29debbdb403522",
"homepage": "https://github.com/nomiddlename/date-format#readme",
"keywords": [
"date",
"format",
"string"
],
"license": "MIT",
"main": "lib/index.js",
"name": "date-format",
"repository": {
"type": "git",
"url": "git+https://github.com/nomiddlename/date-format.git"
},
"scripts": {
"lint": "eslint lib/* test/*",
"pretest": "eslint lib/* test/*",
"test": "mocha"
},
"version": "2.1.0"
}


@@ -0,0 +1,64 @@
'use strict';
require('should');
var dateFormat = require('../lib');
function createFixedDate() {
return new Date(2010, 0, 11, 14, 31, 30, 5);
}
describe('date_format', function() {
var date = createFixedDate();
it('should default to now when a date is not provided', function() {
dateFormat.asString(dateFormat.DATETIME_FORMAT).should.not.be.empty();
});
it('should be usable directly without calling asString', function() {
dateFormat(dateFormat.DATETIME_FORMAT, date).should.eql('11 01 2010 14:31:30.005');
});
it('should format a date as string using a pattern', function() {
dateFormat.asString(dateFormat.DATETIME_FORMAT, date).should.eql('11 01 2010 14:31:30.005');
});
it('should default to the ISO8601 format', function() {
dateFormat.asString(date).should.eql('2010-01-11T14:31:30.005');
});
it('should provide a ISO8601 with timezone offset format', function() {
var tzDate = createFixedDate();
tzDate.setMinutes(tzDate.getMinutes() - tzDate.getTimezoneOffset() - 660);
tzDate.getTimezoneOffset = function () {
return -660;
};
// when tz offset is in the pattern, the date should be in UTC
dateFormat.asString(dateFormat.ISO8601_WITH_TZ_OFFSET_FORMAT, tzDate)
.should.eql('2010-01-11T03:31:30.005+1100');
tzDate = createFixedDate();
tzDate.setMinutes((tzDate.getMinutes() - tzDate.getTimezoneOffset()) + 120);
tzDate.getTimezoneOffset = function () {
return 120;
};
dateFormat.asString(dateFormat.ISO8601_WITH_TZ_OFFSET_FORMAT, tzDate)
.should.eql('2010-01-11T16:31:30.005-0200');
});
it('should provide a just-the-time format', function() {
dateFormat.asString(dateFormat.ABSOLUTETIME_FORMAT, date).should.eql('14:31:30.005');
});
it('should provide a custom format', function() {
var customDate = createFixedDate();
customDate.setMinutes((customDate.getMinutes() - customDate.getTimezoneOffset()) + 120);
customDate.getTimezoneOffset = function () {
return 120;
};
dateFormat.asString('O.SSS.ss.mm.hh.dd.MM.yy', customDate).should.eql('-0200.005.30.31.16.11.01.10');
});
});

"use strict";
require("should");
var dateFormat = require("../lib");
describe("dateFormat.parse", function() {
it("should require a pattern", function() {
(function() {
dateFormat.parse();
}.should.throw(/pattern must be supplied/));
(function() {
dateFormat.parse(null);
}.should.throw(/pattern must be supplied/));
(function() {
dateFormat.parse("");
}.should.throw(/pattern must be supplied/));
});
describe("with a pattern that has no replacements", function() {
it("should return a new date when the string matches", function() {
dateFormat.parse("cheese", "cheese").should.be.a.Date();
});
it("should throw if the string does not match", function() {
(function() {
dateFormat.parse("cheese", "biscuits");
}.should.throw(/String 'biscuits' could not be parsed as 'cheese'/));
});
});
describe("with a full pattern", function() {
var pattern = "yyyy-MM-dd hh:mm:ss.SSSO";
it("should return the correct date if the string matches", function() {
var testDate = new Date();
testDate.setFullYear(2018);
testDate.setMonth(8);
testDate.setDate(13);
testDate.setHours(18);
testDate.setMinutes(10);
testDate.setSeconds(12);
testDate.setMilliseconds(392);
testDate.getTimezoneOffset = function() {
return 600;
};
dateFormat
.parse(pattern, "2018-09-13 08:10:12.392+1000")
.getTime()
.should.eql(testDate.getTime());
});
it("should throw if the string does not match", function() {
(function() {
dateFormat.parse(pattern, "biscuits");
}.should.throw(
/String 'biscuits' could not be parsed as 'yyyy-MM-dd hh:mm:ss.SSSO'/
));
});
});
describe("with a partial pattern", function() {
var testDate = new Date();
dateFormat.now = function() {
return testDate;
};
function verifyDate(actual, expected) {
actual.getFullYear().should.eql(expected.year || testDate.getFullYear());
actual.getMonth().should.eql(expected.month || testDate.getMonth());
actual.getDate().should.eql(expected.day || testDate.getDate());
actual.getHours().should.eql(expected.hours || testDate.getHours());
actual.getMinutes().should.eql(expected.minutes || testDate.getMinutes());
actual.getSeconds().should.eql(expected.seconds || testDate.getSeconds());
actual
.getMilliseconds()
.should.eql(expected.milliseconds || testDate.getMilliseconds());
}
it("should return a date with missing values defaulting to current time", function() {
var date = dateFormat.parse("yyyy-MM", "2015-09");
verifyDate(date, { year: 2015, month: 8 });
});
it("should use a passed in date for missing values", function() {
var missingValueDate = new Date(2010, 1, 11, 10, 30, 12, 100);
var date = dateFormat.parse("yyyy-MM", "2015-09", missingValueDate);
verifyDate(date, {
year: 2015,
month: 8,
day: 11,
hours: 10,
minutes: 30,
seconds: 12,
milliseconds: 100
});
});
it("should handle variations on the same pattern", function() {
var date = dateFormat.parse("MM-yyyy", "09-2015");
verifyDate(date, { year: 2015, month: 8 });
date = dateFormat.parse("yyyy MM", "2015 09");
verifyDate(date, { year: 2015, month: 8 });
date = dateFormat.parse("MM, yyyy.", "09, 2015.");
verifyDate(date, { year: 2015, month: 8 });
});
it("should match all the date parts", function() {
var date = dateFormat.parse("dd", "21");
verifyDate(date, { day: 21 });
date = dateFormat.parse("hh", "12");
verifyDate(date, { hours: 12 });
date = dateFormat.parse("mm", "34");
verifyDate(date, { minutes: 34 });
date = dateFormat.parse("ss", "59");
verifyDate(date, { seconds: 59 });
date = dateFormat.parse("ss.SSS", "23.452");
verifyDate(date, { seconds: 23, milliseconds: 452 });
date = dateFormat.parse("hh:mm O", "05:23 +1000");
verifyDate(date, { hours: 15, minutes: 23 });
date = dateFormat.parse("hh:mm O", "05:23 -200");
verifyDate(date, { hours: 3, minutes: 23 });
date = dateFormat.parse("hh:mm O", "05:23 +0930");
verifyDate(date, { hours: 14, minutes: 53 });
});
});
describe("with a date formatted by this library", function() {
var testDate = new Date();
testDate.setUTCFullYear(2018);
testDate.setUTCMonth(8);
testDate.setUTCDate(13);
testDate.setUTCHours(18);
testDate.setUTCMinutes(10);
testDate.setUTCSeconds(12);
testDate.setUTCMilliseconds(392);
it("should format and then parse back to the same date", function() {
dateFormat
.parse(
dateFormat.ISO8601_WITH_TZ_OFFSET_FORMAT,
dateFormat(dateFormat.ISO8601_WITH_TZ_OFFSET_FORMAT, testDate)
)
.should.eql(testDate);
dateFormat
.parse(
dateFormat.ISO8601_FORMAT,
dateFormat(dateFormat.ISO8601_FORMAT, testDate)
)
.should.eql(testDate);
dateFormat
.parse(
dateFormat.DATETIME_FORMAT,
dateFormat(dateFormat.DATETIME_FORMAT, testDate)
)
.should.eql(testDate);
dateFormat
.parse(
dateFormat.ABSOLUTETIME_FORMAT,
dateFormat(dateFormat.ABSOLUTETIME_FORMAT, testDate)
)
.should.eql(testDate);
});
});
});

node_modules/streamroller/package.json
{
"_from": "streamroller@^2.2.4",
"_id": "streamroller@2.2.4",
"_inBundle": false,
"_integrity": "sha512-OG79qm3AujAM9ImoqgWEY1xG4HX+Lw+yY6qZj9R1K2mhF5bEmQ849wvrb+4vt4jLMLzwXttJlQbOdPOQVRv7DQ==",
"_location": "/streamroller",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "streamroller@^2.2.4",
"name": "streamroller",
"escapedName": "streamroller",
"rawSpec": "^2.2.4",
"saveSpec": null,
"fetchSpec": "^2.2.4"
},
"_requiredBy": [
"/log4js"
],
"_resolved": "https://registry.npmjs.org/streamroller/-/streamroller-2.2.4.tgz",
"_shasum": "c198ced42db94086a6193608187ce80a5f2b0e53",
"_spec": "streamroller@^2.2.4",
"_where": "/home/jack/Documents/JDA/m14/projecte_janmaroto/node_modules/log4js",
"author": {
"name": "Gareth Jones",
"email": "gareth.nomiddlename@gmail.com"
},
"bugs": {
"url": "https://github.com/nomiddlename/streamroller/issues"
},
"bundleDependencies": false,
"commitlint": {
"extends": [
"@commitlint/config-conventional"
]
},
"dependencies": {
"date-format": "^2.1.0",
"debug": "^4.1.1",
"fs-extra": "^8.1.0"
},
"deprecated": false,
"description": "file streams that roll over when size limits, or dates are reached",
"devDependencies": {
"@commitlint/cli": "^8.1.0",
"@commitlint/config-conventional": "^8.1.0",
"eslint": "^6.0.1",
"husky": "^3.0.0",
"mocha": "^6.1.4",
"nyc": "^14.1.1",
"proxyquire": "^2.1.1",
"should": "^13.2.3"
},
"directories": {
"test": "test"
},
"engines": {
"node": ">=8.0"
},
"eslintConfig": {
"env": {
"browser": false,
"node": true,
"es6": true,
"mocha": true
},
"parserOptions": {
"ecmaVersion": 8
},
"extends": "eslint:recommended",
"rules": {
"no-console": "off"
}
},
"gitHead": "ece35d7d86c87c04ff09e8604accae81cf36a0ce",
"homepage": "https://github.com/nomiddlename/streamroller#readme",
"husky": {
"hooks": {
"commit-msg": "commitlint -e $HUSKY_GIT_PARAMS"
}
},
"keywords": [
"stream",
"rolling"
],
"license": "MIT",
"main": "lib/index.js",
"name": "streamroller",
"repository": {
"type": "git",
"url": "git+https://github.com/nomiddlename/streamroller.git"
},
"scripts": {
"clean": "rm -rf node_modules/",
"codecheck": "eslint \"lib/*.js\" \"test/*.js\"",
"html-report": "nyc report --reporter=html",
"prepublishOnly": "npm test",
"pretest": "npm run codecheck",
"test": "nyc --check-coverage --lines 100 --branches 100 --functions 100 mocha"
},
"version": "2.2.4"
}

node_modules/streamroller/test/DateRollingFileStream-test.js
require("should");
const fs = require("fs-extra"),
path = require("path"),
zlib = require("zlib"),
proxyquire = require("proxyquire").noPreserveCache(),
util = require("util"),
streams = require("stream");
let fakeNow = new Date(2012, 8, 12, 10, 37, 11);
const mockNow = () => fakeNow;
const RollingFileWriteStream = proxyquire("../lib/RollingFileWriteStream", {
"./now": mockNow
});
const DateRollingFileStream = proxyquire("../lib/DateRollingFileStream", {
"./RollingFileWriteStream": RollingFileWriteStream
});
const gunzip = util.promisify(zlib.gunzip);
const gzip = util.promisify(zlib.gzip);
const remove = filename => fs.unlink(filename).catch(() => {});
const close = async (stream) => new Promise(
(resolve, reject) => stream.end(e => e ? reject(e) : resolve())
);
describe("DateRollingFileStream", function() {
describe("arguments", function() {
let stream;
before(function() {
stream = new DateRollingFileStream(
path.join(__dirname, "test-date-rolling-file-stream-1"),
"yyyy-MM-dd.hh"
);
});
after(async function() {
await close(stream);
await remove(path.join(__dirname, "test-date-rolling-file-stream-1"));
});
it("should take a filename and a pattern and return a WritableStream", function() {
stream.filename.should.eql(
path.join(__dirname, "test-date-rolling-file-stream-1")
);
stream.options.pattern.should.eql("yyyy-MM-dd.hh");
stream.should.be.instanceOf(streams.Writable);
});
it("with default settings for the underlying stream", function() {
stream.currentFileStream.mode.should.eql(420);
stream.currentFileStream.flags.should.eql("a");
});
});
describe("default arguments", function() {
var stream;
before(function() {
stream = new DateRollingFileStream(
path.join(__dirname, "test-date-rolling-file-stream-2")
);
});
after(async function() {
await close(stream);
await remove(path.join(__dirname, "test-date-rolling-file-stream-2"));
});
it("should have pattern of .yyyy-MM-dd", function() {
stream.options.pattern.should.eql("yyyy-MM-dd");
});
});
describe("with stream arguments", function() {
var stream;
before(function() {
stream = new DateRollingFileStream(
path.join(__dirname, "test-date-rolling-file-stream-3"),
"yyyy-MM-dd",
{ mode: parseInt("0666", 8) }
);
});
after(async function() {
await close(stream);
await remove(path.join(__dirname, "test-date-rolling-file-stream-3"));
});
it("should pass them to the underlying stream", function() {
stream.theStream.mode.should.eql(parseInt("0666", 8));
});
});
describe("with stream arguments but no pattern", function() {
var stream;
before(function() {
stream = new DateRollingFileStream(
path.join(__dirname, "test-date-rolling-file-stream-4"),
{ mode: parseInt("0666", 8) }
);
});
after(async function() {
await close(stream);
await remove(path.join(__dirname, "test-date-rolling-file-stream-4"));
});
it("should pass them to the underlying stream", function() {
stream.theStream.mode.should.eql(parseInt("0666", 8));
});
it("should use default pattern", function() {
stream.options.pattern.should.eql("yyyy-MM-dd");
});
});
describe("with a pattern of .yyyy-MM-dd", function() {
var stream;
before(function(done) {
stream = new DateRollingFileStream(
path.join(__dirname, "test-date-rolling-file-stream-5"),
".yyyy-MM-dd",
null
);
stream.write("First message\n", "utf8", done);
});
after(async function() {
await close(stream);
await remove(path.join(__dirname, "test-date-rolling-file-stream-5"));
});
it("should create a file with the base name", async function() {
const contents = await fs.readFile(
path.join(__dirname, "test-date-rolling-file-stream-5"),
"utf8"
);
contents.should.eql("First message\n");
});
describe("when the day changes", function() {
before(function(done) {
fakeNow = new Date(2012, 8, 13, 0, 10, 12);
stream.write("Second message\n", "utf8", done);
});
after(async function() {
await remove(
path.join(__dirname, "test-date-rolling-file-stream-5.2012-09-12")
);
});
describe("the number of files", function() {
it("should be two", async function() {
const files = await fs.readdir(__dirname);
files
.filter(
file => file.indexOf("test-date-rolling-file-stream-5") > -1
)
.should.have.length(2);
});
});
describe("the file without a date", function() {
it("should contain the second message", async function() {
const contents = await fs.readFile(
path.join(__dirname, "test-date-rolling-file-stream-5"),
"utf8"
);
contents.should.eql("Second message\n");
});
});
describe("the file with the date", function() {
it("should contain the first message", async function() {
const contents = await fs.readFile(
path.join(__dirname, "test-date-rolling-file-stream-5.2012-09-12"),
"utf8"
);
contents.should.eql("First message\n");
});
});
});
});
describe("with alwaysIncludePattern", function() {
var stream;
before(async function() {
fakeNow = new Date(2012, 8, 12, 11, 10, 12);
await remove(
path.join(
__dirname,
"test-date-rolling-file-stream-pattern.2012-09-12-11.log"
)
);
stream = new DateRollingFileStream(
path.join(__dirname, "test-date-rolling-file-stream-pattern"),
".yyyy-MM-dd-hh.log",
{ alwaysIncludePattern: true }
);
await new Promise(resolve => {
setTimeout(function() {
stream.write("First message\n", "utf8", () => resolve());
}, 50);
});
});
after(async function() {
await close(stream);
await remove(
path.join(
__dirname,
"test-date-rolling-file-stream-pattern.2012-09-12-11.log"
)
);
});
it("should create a file with the pattern set", async function() {
const contents = await fs.readFile(
path.join(
__dirname,
"test-date-rolling-file-stream-pattern.2012-09-12-11.log"
),
"utf8"
);
contents.should.eql("First message\n");
});
describe("when the day changes", function() {
before(function(done) {
fakeNow = new Date(2012, 8, 12, 12, 10, 12);
stream.write("Second message\n", "utf8", done);
});
after(async function() {
await remove(
path.join(
__dirname,
"test-date-rolling-file-stream-pattern.2012-09-12-12.log"
)
);
});
describe("the number of files", function() {
it("should be two", async function() {
const files = await fs.readdir(__dirname);
files
.filter(
file => file.indexOf("test-date-rolling-file-stream-pattern") > -1
)
.should.have.length(2);
});
});
describe("the file with the later date", function() {
it("should contain the second message", async function() {
const contents = await fs.readFile(
path.join(
__dirname,
"test-date-rolling-file-stream-pattern.2012-09-12-12.log"
),
"utf8"
);
contents.should.eql("Second message\n");
});
});
describe("the file with the date", function() {
it("should contain the first message", async function() {
const contents = await fs.readFile(
path.join(
__dirname,
"test-date-rolling-file-stream-pattern.2012-09-12-11.log"
),
"utf8"
);
contents.should.eql("First message\n");
});
});
});
});
describe("with a pattern that evaluates to digits", function() {
let stream;
before(done => {
fakeNow = new Date(2012, 8, 12, 0, 10, 12);
stream = new DateRollingFileStream(
path.join(__dirname, "digits.log"),
".yyyyMMdd"
);
stream.write("First message\n", "utf8", done);
});
describe("when the day changes", function() {
before(function(done) {
fakeNow = new Date(2012, 8, 13, 0, 10, 12);
stream.write("Second message\n", "utf8", done);
});
it("should be two files (it should not get confused by indexes)", async function() {
const files = await fs.readdir(__dirname);
var logFiles = files.filter(file => file.indexOf("digits.log") > -1);
logFiles.should.have.length(2);
const contents = await fs.readFile(
path.join(__dirname, "digits.log.20120912"),
"utf8"
);
contents.should.eql("First message\n");
const c = await fs.readFile(path.join(__dirname, "digits.log"), "utf8");
c.should.eql("Second message\n");
});
});
after(async function() {
await close(stream);
await remove(path.join(__dirname, "digits.log"));
await remove(path.join(__dirname, "digits.log.20120912"));
});
});
describe("with compress option", function() {
var stream;
before(function(done) {
fakeNow = new Date(2012, 8, 12, 0, 10, 12);
stream = new DateRollingFileStream(
path.join(__dirname, "compressed.log"),
".yyyy-MM-dd",
{ compress: true }
);
stream.write("First message\n", "utf8", done);
});
describe("when the day changes", function() {
before(function(done) {
fakeNow = new Date(2012, 8, 13, 0, 10, 12);
stream.write("Second message\n", "utf8", done);
});
it("should be two files, one compressed", async function() {
const files = await fs.readdir(__dirname);
var logFiles = files.filter(
file => file.indexOf("compressed.log") > -1
);
logFiles.should.have.length(2);
const gzipped = await fs.readFile(
path.join(__dirname, "compressed.log.2012-09-12.gz")
);
const contents = await gunzip(gzipped);
contents.toString("utf8").should.eql("First message\n");
(await fs.readFile(
path.join(__dirname, "compressed.log"),
"utf8"
)).should.eql("Second message\n");
});
});
after(async function() {
await close(stream);
await remove(path.join(__dirname, "compressed.log"));
await remove(path.join(__dirname, "compressed.log.2012-09-12.gz"));
});
});
describe("with keepFileExt option", function() {
var stream;
before(function(done) {
fakeNow = new Date(2012, 8, 12, 0, 10, 12);
stream = new DateRollingFileStream(
path.join(__dirname, "keepFileExt.log"),
".yyyy-MM-dd",
{ keepFileExt: true }
);
stream.write("First message\n", "utf8", done);
});
describe("when the day changes", function() {
before(function(done) {
fakeNow = new Date(2012, 8, 13, 0, 10, 12);
stream.write("Second message\n", "utf8", done);
});
it("should be two files", async function() {
const files = await fs.readdir(__dirname);
var logFiles = files.filter(file => file.indexOf("keepFileExt") > -1);
logFiles.should.have.length(2);
(await fs.readFile(
path.join(__dirname, "keepFileExt.2012-09-12.log"),
"utf8"
)).should.eql("First message\n");
(await fs.readFile(
path.join(__dirname, "keepFileExt.log"),
"utf8"
)).should.eql("Second message\n");
});
});
after(async function() {
await close(stream);
await remove(path.join(__dirname, "keepFileExt.log"));
await remove(path.join(__dirname, "keepFileExt.2012-09-12.log"));
});
});
describe("with compress option and keepFileExt option", function() {
var stream;
before(function(done) {
fakeNow = new Date(2012, 8, 12, 0, 10, 12);
stream = new DateRollingFileStream(
path.join(__dirname, "compressedAndKeepExt.log"),
".yyyy-MM-dd",
{ compress: true, keepFileExt: true }
);
stream.write("First message\n", "utf8", done);
});
describe("when the day changes", function() {
before(function(done) {
fakeNow = new Date(2012, 8, 13, 0, 10, 12);
stream.write("Second message\n", "utf8", done);
});
it("should be two files, one compressed", async function() {
const files = await fs.readdir(__dirname);
var logFiles = files.filter(
file => file.indexOf("compressedAndKeepExt") > -1
);
logFiles.should.have.length(2);
const gzipped = await fs.readFile(
path.join(__dirname, "compressedAndKeepExt.2012-09-12.log.gz")
);
const contents = await gunzip(gzipped);
contents.toString("utf8").should.eql("First message\n");
(await fs.readFile(
path.join(__dirname, "compressedAndKeepExt.log"),
"utf8"
)).should.eql("Second message\n");
});
});
after(async function() {
await close(stream);
await remove(path.join(__dirname, "compressedAndKeepExt.log"));
await remove(
path.join(__dirname, "compressedAndKeepExt.2012-09-12.log.gz")
);
});
});
describe("with daysToKeep option", function() {
let stream;
var daysToKeep = 4;
var numOriginalLogs = 10;
before(async function() {
for (let i = 0; i < numOriginalLogs; i += 1) {
await fs.writeFile(
path.join(__dirname, `daysToKeep.log.2012-09-${20-i}`),
`Message on day ${i}\n`,
{ encoding: "utf-8" }
);
}
stream = new DateRollingFileStream(
path.join(__dirname, "daysToKeep.log"),
".yyyy-MM-dd",
{
alwaysIncludePattern: true,
daysToKeep: daysToKeep
}
);
});
describe("when the day changes", function() {
before(function(done) {
fakeNow = new Date(2012, 8, 21, 0, 10, 12);
stream.write("Second message\n", "utf8", done);
});
it("should be daysToKeep + 1 files left from numOriginalLogs", async function() {
const files = await fs.readdir(__dirname);
var logFiles = files.filter(
file => file.indexOf("daysToKeep.log") > -1
);
logFiles.should.have.length(daysToKeep + 1);
});
});
after(async function() {
await close(stream);
const files = await fs.readdir(__dirname);
const logFiles = files
.filter(file => file.indexOf("daysToKeep.log") > -1)
.map(f => remove(path.join(__dirname, f)));
await Promise.all(logFiles);
});
});
describe("with daysToKeep and compress options", function() {
let stream;
const daysToKeep = 4;
const numOriginalLogs = 10;
before(async function() {
for (let i = numOriginalLogs; i >= 0; i -= 1) {
fakeNow = new Date(2012, 8, 20 - i, 0, 10, 12);
const contents = await gzip(`Message on day ${i}\n`);
await fs.writeFile(
path.join(__dirname, `compressedDaysToKeep.log.2012-09-${20-i}.gz`),
contents
);
}
stream = new DateRollingFileStream(
path.join(__dirname, "compressedDaysToKeep.log"),
".yyyy-MM-dd",
{
alwaysIncludePattern: true,
compress: true,
daysToKeep: daysToKeep
}
);
});
describe("when the day changes", function() {
before(function(done) {
fakeNow = new Date(2012, 8, 21, 0, 10, 12);
stream.write("New file message\n", "utf8", done);
});
    it("should be daysToKeep + 1 files left from numOriginalLogs", async function() {
const files = await fs.readdir(__dirname);
var logFiles = files.filter(
file => file.indexOf("compressedDaysToKeep.log") > -1
);
logFiles.should.have.length(daysToKeep + 1);
});
});
after(async function() {
await close(stream);
const files = await fs.readdir(__dirname);
const logFiles = files
.filter(file => file.indexOf("compressedDaysToKeep.log") > -1)
.map(f => remove(path.join(__dirname, f)));
await Promise.all(logFiles);
});
});
});

node_modules/streamroller/test/RollingFileStream-test.js
require("should");
const fs = require("fs-extra"),
path = require("path"),
util = require("util"),
zlib = require("zlib"),
streams = require("stream"),
RollingFileStream = require("../lib").RollingFileStream;
const gunzip = util.promisify(zlib.gunzip);
const fullPath = f => path.join(__dirname, f);
const remove = filename => fs.unlink(fullPath(filename)).catch(() => {});
const create = filename => fs.writeFile(fullPath(filename), "test file");
const write = (stream, data) => {
return new Promise((resolve, reject) => {
stream.write(data, "utf8", e => {
if (e) {
reject(e);
} else {
resolve();
}
});
});
};
const writeInSequence = async (stream, messages) => {
for (let i = 0; i < messages.length; i += 1) {
await write(stream, messages[i] + "\n");
}
return new Promise(resolve => {
stream.end(resolve);
});
};
const close = async (stream) => new Promise(
(resolve, reject) => stream.end(e => e ? reject(e) : resolve())
);
describe("RollingFileStream", function() {
describe("arguments", function() {
let stream;
before(async function() {
await remove("test-rolling-file-stream");
stream = new RollingFileStream(
path.join(__dirname, "test-rolling-file-stream"),
1024,
5
);
});
after(async function() {
await close(stream);
await remove("test-rolling-file-stream");
});
it("should take a filename, file size (bytes), no. backups, return Writable", function() {
stream.should.be.an.instanceOf(streams.Writable);
stream.filename.should.eql(
path.join(__dirname, "test-rolling-file-stream")
);
stream.size.should.eql(1024);
stream.backups.should.eql(5);
});
it("should apply default settings to the underlying stream", function() {
stream.theStream.mode.should.eql(420);
stream.theStream.flags.should.eql("a");
});
});
describe("with stream arguments", function() {
let stream;
it("should pass them to the underlying stream", function() {
stream = new RollingFileStream(
path.join(__dirname, "test-rolling-file-stream"),
1024,
5,
{ mode: parseInt("0666", 8) }
);
stream.theStream.mode.should.eql(parseInt("0666", 8));
});
after(async function() {
await close(stream);
await remove("test-rolling-file-stream");
});
});
describe("without size", function() {
let stream;
it("should default to max int size", function() {
stream = new RollingFileStream(
path.join(__dirname, "test-rolling-file-stream")
);
stream.size.should.eql(Number.MAX_SAFE_INTEGER);
});
after(async function() {
await close(stream);
await remove("test-rolling-file-stream");
});
});
describe("without number of backups", function() {
let stream;
it("should default to 1 backup", function() {
stream = new RollingFileStream(
path.join(__dirname, "test-rolling-file-stream"),
1024
);
stream.backups.should.eql(1);
});
after(async function() {
await close(stream);
await remove("test-rolling-file-stream");
});
});
describe("writing less than the file size", function() {
before(async function() {
await remove("test-rolling-file-stream-write-less");
const stream = new RollingFileStream(
path.join(__dirname, "test-rolling-file-stream-write-less"),
100
);
await writeInSequence(stream, ["cheese"]);
});
after(async function() {
await remove("test-rolling-file-stream-write-less");
});
it("should write to the file", async function() {
const contents = await fs.readFile(
path.join(__dirname, "test-rolling-file-stream-write-less"),
"utf8"
);
contents.should.eql("cheese\n");
});
it("should write one file", async function() {
const files = await fs.readdir(__dirname);
files
.filter(
file => file.indexOf("test-rolling-file-stream-write-less") > -1
)
.should.have.length(1);
});
});
describe("writing more than the file size", function() {
before(async function() {
await remove("test-rolling-file-stream-write-more");
await remove("test-rolling-file-stream-write-more.1");
const stream = new RollingFileStream(
path.join(__dirname, "test-rolling-file-stream-write-more"),
45
);
await writeInSequence(
stream,
[0, 1, 2, 3, 4, 5, 6].map(i => i + ".cheese")
);
});
after(async function() {
await remove("test-rolling-file-stream-write-more");
await remove("test-rolling-file-stream-write-more.1");
});
it("should write two files", async function() {
const files = await fs.readdir(__dirname);
files
.filter(
file => file.indexOf("test-rolling-file-stream-write-more") > -1
)
.should.have.length(2);
});
it("should write the last two log messages to the first file", async function() {
const contents = await fs.readFile(
path.join(__dirname, "test-rolling-file-stream-write-more"),
"utf8"
);
contents.should.eql("5.cheese\n6.cheese\n");
});
it("should write the first five log messages to the second file", async function() {
const contents = await fs.readFile(
path.join(__dirname, "test-rolling-file-stream-write-more.1"),
"utf8"
);
contents.should.eql("0.cheese\n1.cheese\n2.cheese\n3.cheese\n4.cheese\n");
});
});
describe("with options.compress = true", function() {
before(async function() {
const stream = new RollingFileStream(
path.join(__dirname, "compressed-backups.log"),
30, //30 bytes max size
2, //two backup files to keep
{ compress: true }
);
const messages = [
"This is the first log message.",
"This is the second log message.",
"This is the third log message.",
"This is the fourth log message."
];
await writeInSequence(stream, messages);
});
it("should produce three files, with the backups compressed", async function() {
const files = await fs.readdir(__dirname);
const testFiles = files
.filter(f => f.indexOf("compressed-backups.log") > -1)
.sort();
testFiles.length.should.eql(3);
testFiles.should.eql([
"compressed-backups.log",
"compressed-backups.log.1.gz",
"compressed-backups.log.2.gz"
]);
let contents = await fs.readFile(
path.join(__dirname, testFiles[0]),
"utf8"
);
contents.should.eql("This is the fourth log message.\n");
let gzipped = await fs.readFile(path.join(__dirname, testFiles[1]));
contents = await gunzip(gzipped);
contents.toString("utf8").should.eql("This is the third log message.\n");
gzipped = await fs.readFile(path.join(__dirname, testFiles[2]));
contents = await gunzip(gzipped);
contents.toString("utf8").should.eql("This is the second log message.\n");
});
after(function() {
return Promise.all([
remove("compressed-backups.log"),
remove("compressed-backups.log.1.gz"),
remove("compressed-backups.log.2.gz")
]);
});
});
describe("with options.keepFileExt = true", function() {
before(async function() {
const stream = new RollingFileStream(
path.join(__dirname, "extKept-backups.log"),
30, //30 bytes max size
2, //two backup files to keep
{ keepFileExt: true }
);
const messages = [
"This is the first log message.",
"This is the second log message.",
"This is the third log message.",
"This is the fourth log message."
];
await writeInSequence(stream, messages);
});
it("should produce three files, with the file-extension kept", async function() {
const files = await fs.readdir(__dirname);
const testFiles = files
.filter(f => f.indexOf("extKept-backups") > -1)
.sort();
testFiles.length.should.eql(3);
testFiles.should.eql([
"extKept-backups.1.log",
"extKept-backups.2.log",
"extKept-backups.log"
]);
let contents = await fs.readFile(
path.join(__dirname, testFiles[0]),
"utf8"
);
contents.should.eql("This is the third log message.\n");
contents = await fs.readFile(path.join(__dirname, testFiles[1]), "utf8");
contents.toString("utf8").should.eql("This is the second log message.\n");
contents = await fs.readFile(path.join(__dirname, testFiles[2]), "utf8");
contents.toString("utf8").should.eql("This is the fourth log message.\n");
});
after(function() {
return Promise.all([
remove("extKept-backups.log"),
remove("extKept-backups.1.log"),
remove("extKept-backups.2.log")
]);
});
});
describe("with options.compress = true and keepFileExt = true", function() {
before(async function() {
const stream = new RollingFileStream(
path.join(__dirname, "compressed-backups.log"),
30, //30 bytes max size
2, //two backup files to keep
{ compress: true, keepFileExt: true }
);
const messages = [
"This is the first log message.",
"This is the second log message.",
"This is the third log message.",
"This is the fourth log message."
];
await writeInSequence(stream, messages);
});
it("should produce three files, with the backups compressed", async function() {
const files = await fs.readdir(__dirname);
const testFiles = files
.filter(f => f.indexOf("compressed-backups") > -1)
.sort();
testFiles.length.should.eql(3);
testFiles.should.eql([
"compressed-backups.1.log.gz",
"compressed-backups.2.log.gz",
"compressed-backups.log"
]);
let contents = await fs.readFile(
path.join(__dirname, testFiles[2]),
"utf8"
);
contents.should.eql("This is the fourth log message.\n");
let gzipped = await fs.readFile(path.join(__dirname, testFiles[1]));
contents = await gunzip(gzipped);
contents.toString("utf8").should.eql("This is the second log message.\n");
gzipped = await fs.readFile(path.join(__dirname, testFiles[0]));
contents = await gunzip(gzipped);
contents.toString("utf8").should.eql("This is the third log message.\n");
});
after(function() {
return Promise.all([
remove("compressed-backups.log"),
remove("compressed-backups.1.log.gz"),
remove("compressed-backups.2.log.gz")
]);
});
});
describe("when many files already exist", function() {
before(async function() {
await Promise.all([
remove("test-rolling-stream-with-existing-files.11"),
remove("test-rolling-stream-with-existing-files.20"),
remove("test-rolling-stream-with-existing-files.-1"),
remove("test-rolling-stream-with-existing-files.1.1"),
remove("test-rolling-stream-with-existing-files.1")
]);
await Promise.all([
create("test-rolling-stream-with-existing-files.11"),
create("test-rolling-stream-with-existing-files.20"),
create("test-rolling-stream-with-existing-files.-1"),
create("test-rolling-stream-with-existing-files.1.1"),
create("test-rolling-stream-with-existing-files.1")
]);
const stream = new RollingFileStream(
path.join(__dirname, "test-rolling-stream-with-existing-files"),
      18, // max file size in bytes
      5 // number of backup files to keep
);
await writeInSequence(
stream,
[0, 1, 2, 3, 4, 5, 6].map(i => i + ".cheese")
);
});
after(function() {
return Promise.all(
[
"test-rolling-stream-with-existing-files.-1",
"test-rolling-stream-with-existing-files",
"test-rolling-stream-with-existing-files.1.1",
"test-rolling-stream-with-existing-files.0",
"test-rolling-stream-with-existing-files.1",
"test-rolling-stream-with-existing-files.2",
"test-rolling-stream-with-existing-files.3",
"test-rolling-stream-with-existing-files.4",
"test-rolling-stream-with-existing-files.5",
"test-rolling-stream-with-existing-files.6",
"test-rolling-stream-with-existing-files.11",
"test-rolling-stream-with-existing-files.20"
].map(remove)
);
});
it("should roll the files, removing the highest indices", async function() {
const files = await fs.readdir(__dirname);
files.should.containEql("test-rolling-stream-with-existing-files");
files.should.containEql("test-rolling-stream-with-existing-files.1");
files.should.containEql("test-rolling-stream-with-existing-files.2");
files.should.containEql("test-rolling-stream-with-existing-files.3");
files.should.containEql("test-rolling-stream-with-existing-files.4");
});
});
  // on Windows, a directory cannot be deleted while there is an open handle
  // to a file inside it, so skip this suite there
if (process.platform !== "win32") {
describe("when the directory gets deleted", function() {
var stream;
before(function(done) {
stream = new RollingFileStream(
path.join("subdir", "test-rolling-file-stream"),
        5, // max file size in bytes
        5 // number of backup files to keep
);
stream.write("initial", "utf8", done);
});
after(async () => {
await fs.unlink(path.join("subdir", "test-rolling-file-stream"));
await fs.rmdir("subdir");
});
it("handles directory deletion gracefully", async function() {
stream.theStream.on("error", e => {
throw e;
});
await fs.unlink(path.join("subdir", "test-rolling-file-stream"));
await fs.rmdir("subdir");
await new Promise(resolve => stream.write("rollover", "utf8", resolve));
await close(stream);
(await fs.readFile(
path.join("subdir", "test-rolling-file-stream"),
"utf8"
)).should.eql("rollover");
});
});
}
});

1441
node_modules/streamroller/test/RollingFileWriteStream-test.js generated vendored Executable file

File diff suppressed because it is too large

508
node_modules/streamroller/test/fileNameFormatter-test.js generated vendored Executable file

@@ -0,0 +1,508 @@
require("should");
const { normalize } = require("path");
describe("fileNameFormatter", () => {
describe("without a date", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
}
});
it("should take an index and return a filename", () => {
fileNameFormatter({
index: 0
}).should.eql(normalize("/path/to/file/thefile.log"));
fileNameFormatter({ index: 1, date: "" }).should.eql(
normalize("/path/to/file/thefile.log.1")
);
fileNameFormatter({ index: 15, date: undefined }).should.eql(
normalize("/path/to/file/thefile.log.15")
);
fileNameFormatter({ index: 15 }).should.eql(
normalize("/path/to/file/thefile.log.15")
);
});
});
describe("with a date", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
}
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({ index: 0, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.log")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-16"));
});
});
describe("with the alwaysIncludeDate option", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
alwaysIncludeDate: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-15"));
fileNameFormatter({ index: 0, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.log.2019-07-15")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-16"));
});
});
describe("with the keepFileExt option", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
keepFileExt: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log"));
fileNameFormatter({ index: 1 }).should.eql(normalize("/path/to/file/thefile.1.log"));
fileNameFormatter({ index: 2 }).should.eql(normalize("/path/to/file/thefile.2.log"));
fileNameFormatter({ index: 15 }).should.eql(
normalize("/path/to/file/thefile.15.log")
);
});
});
describe("with the keepFileExt option and a date", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
keepFileExt: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log"));
fileNameFormatter({ index: 1, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.2019-07-15.log")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.2019-07-16.log"));
});
});
describe("with the keepFileExt, alwaysIncludeDate options", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
keepFileExt: true,
alwaysIncludeDate: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.2019-07-15.log"));
fileNameFormatter({ index: 1, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.2019-07-15.log")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.2019-07-16.log"));
});
});
describe("with the compress option", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
compress: true
});
it("should take an index and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log"));
fileNameFormatter({ index: 1 }).should.eql(
normalize("/path/to/file/thefile.log.1.gz")
);
fileNameFormatter({
index: 2
}).should.eql(normalize("/path/to/file/thefile.log.2.gz"));
});
});
describe("with the compress option and a date", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
compress: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log"));
fileNameFormatter({ index: 1, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.log.2019-07-15.gz")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-16.gz"));
});
});
describe("with the compress, alwaysIncludeDate option and a date", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
compress: true,
alwaysIncludeDate: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-15"));
fileNameFormatter({ index: 1, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.log.2019-07-15.gz")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-16.gz"));
});
});
describe("with the compress, alwaysIncludeDate, keepFileExt option and a date", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
compress: true,
alwaysIncludeDate: true,
keepFileExt: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.2019-07-15.log"));
fileNameFormatter({ index: 1, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.2019-07-15.log.gz")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.2019-07-16.log.gz"));
});
});
describe("with the needsIndex option", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
compress: true,
needsIndex: true,
alwaysIncludeDate: true,
keepFileExt: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.2019-07-15.log"));
fileNameFormatter({ index: 1, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.2019-07-15.1.log.gz")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.2019-07-16.2.log.gz"));
});
});
describe("with a date and needsIndex", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
needsIndex: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({ index: 0, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.log")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-16.2"));
});
});
describe("with the alwaysIncludeDate, needsIndex option", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
needsIndex: true,
alwaysIncludeDate: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-15"));
fileNameFormatter({ index: 0, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.log.2019-07-15")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-16.2"));
});
});
describe("with the keepFileExt, needsIndex option", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
needsIndex: true,
keepFileExt: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log"));
fileNameFormatter({ index: 1 }).should.eql(normalize("/path/to/file/thefile.1.log"));
fileNameFormatter({ index: 2 }).should.eql(normalize("/path/to/file/thefile.2.log"));
fileNameFormatter({ index: 15 }).should.eql(
normalize("/path/to/file/thefile.15.log")
);
});
});
describe("with the keepFileExt, needsIndex option and a date", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
needsIndex: true,
keepFileExt: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log"));
fileNameFormatter({ index: 1, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.2019-07-15.1.log")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.2019-07-16.2.log"));
});
});
describe("with the keepFileExt, needsIndex, alwaysIncludeDate options", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
needsIndex: true,
keepFileExt: true,
alwaysIncludeDate: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.2019-07-15.log"));
fileNameFormatter({ index: 1, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.2019-07-15.1.log")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.2019-07-16.2.log"));
});
});
describe("with the compress, needsIndex option", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
needsIndex: true,
compress: true
});
it("should take an index and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log"));
fileNameFormatter({ index: 1 }).should.eql(
normalize("/path/to/file/thefile.log.1.gz")
);
fileNameFormatter({
index: 2
}).should.eql(normalize("/path/to/file/thefile.log.2.gz"));
});
});
describe("with the compress, needsIndex option and a date", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
needsIndex: true,
compress: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log"));
fileNameFormatter({ index: 1, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.log.2019-07-15.1.gz")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-16.2.gz"));
});
});
describe("with the compress, alwaysIncludeDate, needsIndex option and a date", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
needsIndex: true,
compress: true,
alwaysIncludeDate: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-15"));
fileNameFormatter({ index: 1, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.log.2019-07-15.1.gz")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.log.2019-07-16.2.gz"));
});
});
describe("with the compress, alwaysIncludeDate, keepFileExt, needsIndex option and a date", () => {
const fileNameFormatter = require("../lib/fileNameFormatter")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
needsIndex: true,
compress: true,
alwaysIncludeDate: true,
keepFileExt: true
});
it("should take an index, date and return a filename", () => {
fileNameFormatter({
index: 0,
date: "2019-07-15"
}).should.eql(normalize("/path/to/file/thefile.2019-07-15.log"));
fileNameFormatter({ index: 1, date: "2019-07-15" }).should.eql(
normalize("/path/to/file/thefile.2019-07-15.1.log.gz")
);
fileNameFormatter({
index: 2,
date: "2019-07-16"
}).should.eql(normalize("/path/to/file/thefile.2019-07-16.2.log.gz"));
});
});
});

180
node_modules/streamroller/test/fileNameParser-test.js generated vendored Executable file

@@ -0,0 +1,180 @@
const should = require("should");
describe("fileNameParser", () => {
describe("with default options", () => {
const parser = require("../lib/fileNameParser")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
}
});
it("should return null for filenames that do not match", () => {
should(parser("cheese.txt")).not.be.ok();
should(parser("thefile.log.biscuits")).not.be.ok();
});
it("should take a filename and return the index", () => {
parser("thefile.log.2").should.eql({
filename: "thefile.log.2",
index: 2,
isCompressed: false
});
parser("thefile.log.2.gz").should.eql({
filename: "thefile.log.2.gz",
index: 2,
isCompressed: true
});
});
});
describe("with pattern option", () => {
const parser = require("../lib/fileNameParser")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
pattern: "yyyy-MM-dd"
});
it("should return null for files that do not match", () => {
should(parser("thefile.log.biscuits")).not.be.ok();
should(parser("thefile.log.2019")).not.be.ok();
should(parser("thefile.log.3.2")).not.be.ok();
should(parser("thefile.log.04-18")).not.be.ok();
should(parser("anotherfile.log.2020-04-18")).not.be.ok();
should(parser("2020-05-18")).not.be.ok();
});
it("should take a filename and return the date", () => {
parser("thefile.log.2019-07-17").should.eql({
filename: "thefile.log.2019-07-17",
index: 0,
date: "2019-07-17",
timestamp: new Date(2019, 6, 17).getTime(),
isCompressed: false
});
parser("thefile.log.gz").should.eql({
filename: "thefile.log.gz",
index: 0,
isCompressed: true
});
});
it("should take a filename and return both date and index", () => {
parser("thefile.log.2019-07-17.2").should.eql({
filename: "thefile.log.2019-07-17.2",
index: 2,
date: "2019-07-17",
timestamp: new Date(2019, 6, 17).getTime(),
isCompressed: false
});
parser("thefile.log.2019-07-17.2.gz").should.eql({
filename: "thefile.log.2019-07-17.2.gz",
index: 2,
date: "2019-07-17",
timestamp: new Date(2019, 6, 17).getTime(),
isCompressed: true
});
});
});
describe("with keepFileExt option", () => {
const parser = require("../lib/fileNameParser")({
file: {
dir: "/path/to/file",
base: "thefile.log",
ext: ".log",
name: "thefile"
},
keepFileExt: true
});
it("should take a filename and return the index", () => {
should(parser("thefile.log.2")).not.be.ok();
should(parser("thefile.log.2.gz")).not.be.ok();
parser("thefile.2.log").should.eql({
filename: "thefile.2.log",
index: 2,
isCompressed: false
});
parser("thefile.2.log.gz").should.eql({
filename: "thefile.2.log.gz",
index: 2,
isCompressed: true
});
});
});
describe("with a two-digit date pattern", () => {
const parser = require("../lib/fileNameParser")({
file: {
dir: "/path/to/file",
base: "thing.log",
ext: ".log",
name: "thing"
},
pattern: "mm"
});
it("should take a filename and return the date", () => {
const expectedTimestamp = new Date(0, 0);
expectedTimestamp.setMinutes(34);
parser("thing.log.34").should.eql({
filename: "thing.log.34",
date: "34",
isCompressed: false,
index: 0,
timestamp: expectedTimestamp.getTime()
});
});
  });
describe("with a four-digit date pattern", () => {
const parser = require("../lib/fileNameParser")({
file: {
dir: "/path/to/file",
base: "stuff.log",
ext: ".log",
name: "stuff"
},
pattern: "mm-ss"
});
it("should return null for files that do not match", () => {
should(parser("stuff.log.2020-04-18")).not.be.ok();
should(parser("09-18")).not.be.ok();
});
it("should take a filename and return the date", () => {
const expectedTimestamp = new Date(0, 0);
expectedTimestamp.setMinutes(34);
expectedTimestamp.setSeconds(59);
parser("stuff.log.34-59").should.eql({
filename: "stuff.log.34-59",
date: "34-59",
isCompressed: false,
index: 0,
timestamp: expectedTimestamp.getTime()
});
});
it("should take a filename and return both date and index", () => {
const expectedTimestamp_1 = new Date(0, 0);
expectedTimestamp_1.setMinutes(7);
expectedTimestamp_1.setSeconds(17);
parser("stuff.log.07-17.2").should.eql({
filename: "stuff.log.07-17.2",
index: 2,
date: "07-17",
timestamp: expectedTimestamp_1.getTime(),
isCompressed: false
});
const expectedTimestamp_2 = new Date(0, 0);
expectedTimestamp_2.setMinutes(17);
expectedTimestamp_2.setSeconds(30);
parser("stuff.log.17-30.3.gz").should.eql({
filename: "stuff.log.17-30.3.gz",
index: 3,
date: "17-30",
timestamp: expectedTimestamp_2.getTime(),
isCompressed: true
});
});
  });
});
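
For the default-options case above, the parsing can be sketched with a small hand-rolled parser (a simplified assumption, not the real `lib/fileNameParser`, which also resolves date patterns like `yyyy-MM-dd` into timestamps):

```javascript
// Simplified sketch of default-options parsing: strip an optional ".gz",
// then require the remainder to be "<base>.<digits>".
function makeParser(base) {
  return filename => {
    const isCompressed = filename.endsWith(".gz");
    const core = isCompressed ? filename.slice(0, -3) : filename;
    if (!core.startsWith(base + ".")) return null; // different file, no match
    const indexStr = core.slice(base.length + 1);
    if (!/^\d+$/.test(indexStr)) return null; // suffix is not an index
    return { filename, index: parseInt(indexStr, 10), isCompressed };
  };
}
```
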


@@ -0,0 +1,118 @@
require("should");
const fs = require('fs-extra');
const path = require('path');
const zlib = require('zlib');
const proxyquire = require('proxyquire').noPreserveCache();
const moveAndMaybeCompressFile = require('../lib/moveAndMaybeCompressFile');
const TEST_DIR = path.normalize(`/tmp/moveAndMaybeCompressFile_${Math.floor(Math.random()*10000)}`);
describe('moveAndMaybeCompressFile', () => {
beforeEach(async () => {
await fs.emptyDir(TEST_DIR);
});
after(async () => {
await fs.remove(TEST_DIR);
});
it('should move the source file to a new destination', async () => {
const source = path.join(TEST_DIR, 'test.log');
const destination = path.join(TEST_DIR, 'moved-test.log');
await fs.outputFile(source, 'This is the test file.');
await moveAndMaybeCompressFile(source, destination);
const contents = await fs.readFile(destination, 'utf8');
contents.should.equal('This is the test file.');
const exists = await fs.pathExists(source);
exists.should.be.false();
});
it('should compress the source file at the new destination', async () => {
const source = path.join(TEST_DIR, 'test.log');
const destination = path.join(TEST_DIR, 'moved-test.log.gz');
await fs.outputFile(source, 'This is the test file.');
await moveAndMaybeCompressFile(source, destination, true);
const zippedContents = await fs.readFile(destination);
const contents = await new Promise(resolve => {
zlib.gunzip(zippedContents, (e, data) => {
resolve(data.toString());
});
});
contents.should.equal('This is the test file.');
const exists = await fs.pathExists(source);
exists.should.be.false();
});
it('should do nothing if the source file and destination are the same', async () => {
const source = path.join(TEST_DIR, 'pants.log');
const destination = path.join(TEST_DIR, 'pants.log');
await fs.outputFile(source, 'This is the test file.');
await moveAndMaybeCompressFile(source, destination);
(await fs.readFile(source, 'utf8')).should.equal('This is the test file.');
});
it('should do nothing if the source file does not exist', async () => {
const source = path.join(TEST_DIR, 'pants.log');
const destination = path.join(TEST_DIR, 'moved-pants.log');
await moveAndMaybeCompressFile(source, destination);
(await fs.pathExists(destination)).should.be.false();
});
it('should use copy+truncate if source file is locked (windows)', async () => {
const moveWithMock = proxyquire('../lib/moveAndMaybeCompressFile', {
"fs-extra": {
exists: () => Promise.resolve(true),
move: () => Promise.reject({ code: 'EBUSY', message: 'all gone wrong'}),
copy: (fs.copy.bind(fs)),
truncate: (fs.truncate.bind(fs))
}
});
const source = path.join(TEST_DIR, 'test.log');
const destination = path.join(TEST_DIR, 'moved-test.log');
await fs.outputFile(source, 'This is the test file.');
await moveWithMock(source, destination);
const contents = await fs.readFile(destination, 'utf8');
contents.should.equal('This is the test file.');
// won't delete the source, but it will be empty
    (await fs.readFile(source, 'utf8')).should.be.empty();
});
it('should truncate file if remove fails when compressed (windows)', async () => {
const moveWithMock = proxyquire('../lib/moveAndMaybeCompressFile', {
"fs-extra": {
exists: () => Promise.resolve(true),
unlink: () => Promise.reject({ code: 'EBUSY', message: 'all gone wrong'}),
createReadStream: fs.createReadStream.bind(fs),
truncate: fs.truncate.bind(fs)
}
});
const source = path.join(TEST_DIR, 'test.log');
const destination = path.join(TEST_DIR, 'moved-test.log.gz');
await fs.outputFile(source, 'This is the test file.');
await moveWithMock(source, destination, true);
const zippedContents = await fs.readFile(destination);
const contents = await new Promise(resolve => {
zlib.gunzip(zippedContents, (e, data) => {
resolve(data.toString());
});
});
contents.should.equal('This is the test file.');
// won't delete the source, but it will be empty
    (await fs.readFile(source, 'utf8')).should.be.empty();
});
});