Commit 7c493cd

Author: evilebottnawi
Message: Chore: prepare 5.0.0 release.
Parent: d8a726d

10 files changed, 9157 insertions(+), 9398 deletions(-)

CHANGELOG.md (+31 −21)
@@ -4,38 +4,48 @@ All notable changes to this project will be documented in this file.
 
 This project adheres to [Semantic Versioning](http://semver.org).
 
+## 5.0.0 - 2017-11-15
+
+* Changed: use [cosmiconfig](https://github.com/davidtheclark/cosmiconfig) for
+  loading configuration.
+* Feature: in CLI if the parent directory does not exist when you write
+  `robots.txt`, it's created.
+
 ## 4.0.4 - 2017-10-09
 
-- Chore: update deps.
+* Chore: update deps.
 
 ## 4.0.3 - 2017-03-13
 
-- Fixed: `is-absolute-url` package semver.
+* Fixed: `is-absolute-url` package semver.
 
 ## 4.0.2 - 2016-12-30
 
-- Fixed: `host` options is now processed based `URL`.
-- Fixed: thrown error if the `host` option being IP address.
-- Fixed: clarified error message on multiple and not string the `userAgent` option.
-- Fixed: `Host` directive is now not containing `80` port.
-- Fixed: thrown error if the `cleanParam` not string or array and if string not more than 500 characters.
-- Fixed: supported unicode characters in a `Allow` and a `Disallow` directives.
-- Fixed: thrown error if the `sitemap` option not an array or a string and not an absolute URL.
+* Fixed: `host` options is now processed based `URL`.
+* Fixed: thrown error if the `host` option being IP address.
+* Fixed: clarified error message on multiple and not string the `userAgent`
+  option.
+* Fixed: `Host` directive is now not containing `80` port.
+* Fixed: thrown error if the `cleanParam` not string or array and if string not
+  more than 500 characters.
+* Fixed: supported unicode characters in a `Allow` and a `Disallow` directives.
+* Fixed: thrown error if the `sitemap` option not an array or a string and not
+  an absolute URL.
 
 ## 4.0.1 - 2016-10-27
 
-- Chore: added CI test on `node.js` version `7`.
-- Documentation: improve `README.md` and fix typos.
+* Chore: added CI test on `node.js` version `7`.
+* Documentation: improve `README.md` and fix typos.
 
 ## 4.0.0
 
-- Added: `crawlDelay` to each `police` item.
-- Added: `cleanParam` to each `police` item (used only Yandex bot).
-- Chore: used `remark-preset-lint-itgalaxy` preset.
-- Chore: updated `devDependencies`.
-- Chore: updated copyright year in `LICENSE`.
-- Chore: improved tests.
-- Fixed: strict order directives for each `User-agent`.
-- Fixed: added newline after each `User-agent`.
-- Removed: `crawlDelay` from `options`.
-- Removed: `cleanParam` from `options`.
+* Added: `crawlDelay` to each `police` item.
+* Added: `cleanParam` to each `police` item (used only Yandex bot).
+* Chore: used `remark-preset-lint-itgalaxy` preset.
+* Chore: updated `devDependencies`.
+* Chore: updated copyright year in `LICENSE`.
+* Chore: improved tests.
+* Fixed: strict order directives for each `User-agent`.
+* Fixed: added newline after each `User-agent`.
+* Removed: `crawlDelay` from `options`.
+* Removed: `cleanParam` from `options`.
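
The headline change in 5.0.0 is that configuration is now loaded with cosmiconfig. A minimal sketch of what that lookup behaves like, assuming the module name "robots-txt" (inferred from the robots-txt.config.js file shown in the README below) and the current cosmiconfig API rather than the exact version this release shipped with:

```js
// Sketch under assumptions: cosmiconfig v7+ named export, module name
// "robots-txt" inferred from robots-txt.config.js in the README below.
const { cosmiconfig } = require("cosmiconfig");

const explorer = cosmiconfig("robots-txt");

// search() walks up from the current directory, checking the default
// places such as robots-txt.config.js or a "robots-txt" key in package.json.
explorer.search().then((result) => {
  if (result) {
    console.log(result.filepath); // where the configuration was found
    console.log(result.config); // the loaded configuration object
  }
});
```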

README.md (+53 −46)
@@ -17,59 +17,66 @@ npm install --save-dev generate-robotstxt
 ## Usage
 
 ```js
-const robotstxt = require('generate-robotstxt').default;
+import robotstxt from "generate-robotstxt";
 
 robotstxt({
-  policy: [
-    {
-      userAgent: 'Googlebot',
-      allow: '/',
-      disallow: '/search',
-      crawlDelay: 2
-    },
-    {
-      userAgent: '*',
-      allow: '/',
-      disallow: '/search',
-      crawlDelay: 10,
-      cleanParam: 'ref /articles/'
-    }
-  ],
-  sitemap: 'sitemap.xml',
-  host: 'http://example.com'
-})
-  .then((content) => {
-    console.log(content);
-  });
+  policy: [
+    {
+      userAgent: "Googlebot",
+      allow: "/",
+      disallow: "/search",
+      crawlDelay: 2
+    },
+    {
+      userAgent: "*",
+      allow: "/",
+      disallow: "/search",
+      crawlDelay: 10,
+      cleanParam: "ref /articles/"
+    }
+  ],
+  sitemap: "http://example.com/sitemap.xml",
+  host: "http://example.com"
+}).then(content => {
+  console.log(content);
+});
 ```
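
The promise above resolves with the rendered robots.txt text. For the configuration shown it should come out roughly like the following; this is inferred from the option names and the 4.0.2 notes on `Host` normalization, not captured from a real run:

```
User-agent: Googlebot
Allow: /
Disallow: /search
Crawl-delay: 2

User-agent: *
Allow: /
Disallow: /search
Crawl-delay: 10
Clean-param: ref /articles/

Sitemap: http://example.com/sitemap.xml
Host: example.com
```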
 
-Or
+## File based configuration
+
+**robots-txt.config.js**
 
 ```js
-import robotstxt from 'generate-robotstxt';
+module.exports = {
+  policy: [
+    {
+      userAgent: "Googlebot",
+      allow: "/",
+      disallow: "/search",
+      crawlDelay: 2
+    },
+    {
+      userAgent: "*",
+      allow: "/",
+      disallow: "/search",
+      crawlDelay: 10,
+      cleanParam: "ref /articles/"
+    }
+  ],
+  sitemap: "http://example.com/sitemap.xml",
+  host: "http://example.com"
+};
+```
 
-robotstxt({
-  policy: [
-    {
-      userAgent: 'Googlebot',
-      allow: '/',
-      disallow: '/search',
-      crawlDelay: 2
-    },
-    {
-      userAgent: '*',
-      allow: '/',
-      disallow: '/search',
-      crawlDelay: 10,
-      cleanParam: 'ref /articles/'
-    }
-  ],
-  sitemap: 'http://example.com/sitemap.xml',
-  host: 'http://example.com'
-})
-  .then((content) => {
-    console.log(content);
-  });
+## CLI
+
+```shell
+Awesome generator robots.txt
+
+Usage generate-robotstxt [options] <dest>
+
+Options:
+  --config  Path to a specific configuration file.
 ```
 
 ## Contribution
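
Read together with the changelog, the new CLI section implies invocations along these lines; the destination path dist/robots.txt is made up for illustration:

```shell
# Configuration is discovered via cosmiconfig (e.g. robots-txt.config.js);
# per the 5.0.0 notes, a missing parent directory for <dest> is created.
generate-robotstxt dist/robots.txt

# Or point at an explicit configuration file with the --config option above.
generate-robotstxt --config robots-txt.config.js dist/robots.txt
```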

lint-staged.config.js (+10 −2)
@@ -1,6 +1,14 @@
 "use strict";
 
 module.exports = {
-  "*.{js,jsx}": ["prettier --list-different", "eslint", "git add"],
-  "*.{md,markdown,mdown,mkdn,mkd,mdwn,mkdown,ron}": ["remark -f -q", "git add"]
+  "*.{js,jsx}": [
+    "prettier --list-different",
+    "eslint --report-unused-disable-directives",
+    "git add"
+  ],
+  "*.{md,markdown,mdown,mkdn,mkd,mdwn,mkdown,ron}": [
+    "prettier --list-different",
+    "remark -f -q",
+    "git add"
+  ]
 };
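
Each glob in this config maps to the commands lint-staged runs, in order, against the staged files matching it; the trailing `git add` re-stages whatever the formatters rewrote. The config only takes effect when something invokes lint-staged on commit, typically a husky precommit script in this era; the wiring below is an assumption, not part of this commit:

```json
{
  "scripts": {
    "precommit": "lint-staged"
  }
}
```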
