Commit 4201201 (1 parent: b379e67)

added image extension use cases in doc following #4

File tree: 2 files changed (+73, -17 lines)

Dockerfile (+2, -2)

@@ -91,8 +91,8 @@ ADD ./11-nginx.conf /etc/logstash/conf.d/11-nginx.conf
 ADD ./30-lumberjack-output.conf /etc/logstash/conf.d/30-lumberjack-output.conf
 
 # patterns
-ADD ./nginx.pattern /opt/logstash/patterns/nginx
-RUN chown logstash:logstash /opt/logstash/patterns/*
+ADD ./nginx.pattern ${LOGSTASH_HOME}/patterns/nginx
+RUN chown -R logstash:logstash ${LOGSTASH_HOME}/patterns
 
 
 ###############################################################################
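
For context, `${LOGSTASH_HOME}` needs to be defined earlier in the `Dockerfile` for these instructions to resolve; a minimal sketch of such a definition, assuming the `/opt/logstash` location documented in the README, is:

    ENV LOGSTASH_HOME /opt/logstash

Referencing the variable keeps the pattern path consistent with the rest of the Dockerfile if the Logstash installation directory ever changes.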

README.md (+71, -15)

@@ -6,12 +6,15 @@ This Docker image provides a convenient centralised log server and log managemen
 
 - [Installation](#installation)
 - [Usage](#usage)
-- [Running the image using Docker Compose](#running-with-docker-compose)
+- [Running the container using Docker Compose](#running-with-docker-compose)
 - [Creating a dummy log entry](#creating-dummy-log-entry)
 - [Forwarding logs](#forwarding-logs)
 - [Linking a Docker container to the ELK container](#linking-containers)
 - [Building the image](#building-image)
 - [Extending the image](#extending-image)
+- [Installing Elasticsearch plugins](#installing-elasticsearch-plugins)
+- [Installing Logstash plugins](#installing-logstash-plugins)
+- [Starting Logstash's web interface](#starting-logstash-web)
 - [Making log data persistent](#persistent-log-data)
 - [Security considerations](#security-considerations)
 - [References](#references)

@@ -21,11 +24,11 @@ This Docker image provides a convenient centralised log server and log managemen
 
 Install [Docker](https://docker.com/), either using a native package (Linux) or wrapped in a virtual machine (Windows, OS X – e.g. using [Boot2Docker](http://boot2docker.io/) or [Vagrant](https://www.vagrantup.com/)).
 
-To pull this image from the Docker registry, open a shell prompt and enter:
+To pull this image from the [Docker registry](https://registry.hub.docker.com/u/sebp/elk/), open a shell prompt and enter:
 
     $ sudo docker pull sebp/elk
 
-**Note** – This image has been built automatically from the source files in the source Git repository. If you want to build the image yourself, see the [Building the image](#building-image) section below.
+**Note** – This image has been built automatically from the source files in the [source Git repository on GitHub](https://github.com/spujadas/elk-docker). If you want to build the image yourself, see the [Building the image](#building-image) section below.
 
 **Note** – The size of the virtual image (as reported by `docker images`) is 1,076 MB.

@@ -39,11 +42,11 @@ This command publishes the following ports, which are needed for proper operatio
 
 - 5601 (Kibana web interface).
 - 9200 (Elasticsearch JSON interface).
-- 5000 (Logstash server, receives logs from logstash forwarders – see the [Forwarding logs](#forwarding-logs) section below).
+- 5000 (Logstash server, receives logs from logstash forwarders – see the *[Forwarding logs](#forwarding-logs)* section below).
 
 **Note** – The image also exposes Elasticsearch's transport interface on port 9300. Use the `-p 5300:5300` option with the `docker` command above to publish it.
 
-**Note** – Logstash includes a web interface, but it is not started in this Docker image.
+**Note** – Logstash includes a web interface, but it is not started in this Docker image. See the *[Starting Logstash's web interface](#starting-logstash-web)* section below for guidance on how to extend the base image to start it.
 
 The figure below shows how the pieces fit together.

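As a sketch, a `docker run` command that publishes the three ports listed above might look like the following, assuming the `sebp/elk` image and the container name used elsewhere in the README:

    $ sudo docker run -p 5601:5601 -p 9200:9200 -p 5000:5000 -it --name elk sebp/elk
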
@@ -70,7 +73,7 @@ You can stop the container with `^C`, and start it again with `sudo docker start
 
 As from Kibana version 4.0.0, you won't be able to see anything (not even an empty dashboard) until something has been logged (see the *[Creating a dummy log entry](#creating-dummy-log-entry)* sub-section below on how to test your set-up, and the *[Forwarding logs](#forwarding-logs)* section on how to forward logs from regular applications).
 
-### Running the image using Docker Compose <a name="running-with-docker-compose"></a>
+### Running the container using Docker Compose <a name="running-with-docker-compose"></a>
 
 If you're using [Docker Compose](https://docs.docker.com/compose/) (formerly known as [Fig](http://fig.sh)) to manage your Docker services (and if not you really should as it will make your life much easier!), then you can create an entry for the ELK Docker image by adding the following lines to your `docker-compose.yml` file:

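A minimal `docker-compose.yml` entry along those lines, assuming the `sebp/elk` image and the ports listed in the *Usage* section, could look like this sketch:

    elk:
      image: sebp/elk
      ports:
        - "5601:5601"
        - "9200:9200"
        - "5000:5000"

You can then bring the service up with `sudo docker-compose up`.
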
@@ -108,17 +111,17 @@ Open a shell prompt in the container and type (replacing `<container-name>` with
 
 - At the container's shell prompt, type `start.sh&` to start Elasticsearch, Logstash and Kibana in the background, and wait for everything to be up and running (wait for `{"@timestamp":... ,"message":"Listening on 0.0.0.0:5601",...}`)
 
-Now enter:
+Wait for Logstash to start (as indicated by the message `Logstash startup completed`), then enter:
 
     # /opt/logstash/bin/logstash -e 'input { stdin { } } output { elasticsearch { host => localhost } }'
 
-And then type some dummy text followed by Enter to create a log entry:
+Type some dummy text followed by Enter to create a log entry:
 
     this is a dummy entry
 
 **Note** - You can create as many entries as you want. Use `^C` to go back to the bash prompt.
 
-After a few seconds if you browse to *http://<your-host>:9200/_search?pretty* (e.g. [http://localhost:9200/_search?pretty](http://localhost:9200/_search?pretty) for a local native instance of Docker) you'll see that Elasticsearch has indexed the entry:
+If you browse to *http://<your-host>:9200/_search?pretty* (e.g. [http://localhost:9200/_search?pretty](http://localhost:9200/_search?pretty) for a local native instance of Docker) you'll see that Elasticsearch has indexed the entry:
 
     {
     ...
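
The same check can be run from the command line, for instance with `curl` (assuming it is installed on the host):

    $ curl 'http://localhost:9200/_search?pretty'
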
@@ -135,7 +138,7 @@ After a few seconds if you browse to *http://<your-host>:9200/_search?pretty* (e
 
 You can now browse to Kibana's web interface at *http://<your-host>:5601* (e.g. [http://localhost:5601](http://localhost:5601) for a local native instance of Docker).
 
-From the drop-down "Time-field name" field, select `@timestamp`, then click on "Create", and you're good to go.
+Make sure that the drop-down "Time-field name" field is pre-populated with the value `@timestamp`, then click on "Create", and you're good to go.
 
 ## Forwarding logs <a name="forwarding-logs"></a>

@@ -217,7 +220,7 @@ With Compose here's what example entries for a (locally built log-generating) co
 
 To build the Docker image from the source files, first clone the [Git repository](https://github.com/spujadas/elk-docker), go to the root of the cloned directory (i.e. the directory that contains `Dockerfile`), and:
 
-- If you're using the vanilla `docker` command then run `sudo docker build . -t <repository-name>`, where `<repository-name>` is the repository name to be applied to the image, which you can then use to run the image with the `docker run` command.
+- If you're using the vanilla `docker` command then run `sudo docker build -t <repository-name> .`, where `<repository-name>` is the repository name to be applied to the image, which you can then use to run the image with the `docker run` command.
 
 - If you're using Compose then run `sudo docker-compose build elk`, which uses the `docker-compose.yml` file from the source repository to build the image. You can then run the built image with `sudo docker-compose up`.

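Putting these steps together, a typical build-and-run sequence with the vanilla `docker` command might look like the following sketch, using a hypothetical repository name `mine/elk`:

    $ git clone https://github.com/spujadas/elk-docker.git
    $ cd elk-docker
    $ sudo docker build -t mine/elk .
    $ sudo docker run -p 5601:5601 -p 9200:9200 -p 5000:5000 -it --name elk mine/elk
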
@@ -232,17 +235,70 @@ To create a new image based on this base image, you want your `Dockerfile` to in
 
 followed by instructions to extend the image (see Docker's [Dockerfile Reference page](https://docs.docker.com/reference/builder/) for more information).
 
+The next few subsections present some typical use cases.
+
+### Installing Elasticsearch plugins <a name="installing-elasticsearch-plugins"></a>
+
+Elasticsearch's home directory in the image is `/usr/share/elasticsearch`, its [plugin management script](https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-plugins.html) (`plugin`) resides in the `bin` subdirectory, and plugins are installed in `plugins`.
+
+A `Dockerfile` like the following will extend the base image and install Elastic HQ, a management and monitoring plugin for Elasticsearch, using `plugin`.
+
+    FROM sebp/elk
+
+    ENV ES_HOME /usr/share/elasticsearch
+    WORKDIR ${ES_HOME}
+
+    RUN bin/plugin -i royrusso/elasticsearch-HQ
+
+You can now build the new image (see the *[Building the image](#building-image)* section above) and run the container in the same way as you did with the base image. The Elastic HQ interface will be accessible at *http://<your-host>:9200/_plugin/HQ/* (e.g. [http://localhost:9200/_plugin/HQ/](http://localhost:9200/_plugin/HQ/) for a local native instance of Docker).
+
+### Installing Logstash plugins <a name="installing-logstash-plugins"></a>
+
+The name of Logstash's home directory in the image is stored in the `LOGSTASH_HOME` environment variable (which is set to `/opt/logstash` in the base image). Logstash's plugin management script (`plugin`) is located in the `bin` subdirectory.
+
+The following `Dockerfile` can be used to extend the base image and install the [RSS input plugin](https://www.elastic.co/guide/en/logstash/current/plugins-inputs-rss.html):
+
+    FROM sebp/elk
+
+    WORKDIR ${LOGSTASH_HOME}
+    RUN bin/plugin install logstash-input-rss
+
+See the *[Building the image](#building-image)* section above for instructions on building the new image. You can then run a container based on this image using the same command line as the one in the *[Usage](#usage)* section.
+
+### Starting Logstash's web interface <a name="starting-logstash-web"></a>
+
+Starting Logstash's web interface requires overriding the `start.sh` script from the base `sebp/elk` image to start the `logstash-web` service.
+
+To do that:
+
+1. Download the [`start.sh` script from the image's source](https://raw.githubusercontent.com/spujadas/elk-docker/master/start.sh), and add this line in it before the `tail -f /var/log/elasticsearch/elasticsearch.log` line:
+
+        service logstash-web start
+
+2. Create the following `Dockerfile` next to this updated `start.sh` script:
+
+        FROM sebp/elk
+
+        ADD ./start.sh /usr/local/bin/start.sh
+        EXPOSE 9292
+
+
+3. Build the image as usual (see the *[Building the image](#building-image)* section above).
+
+4. Start the image with port 9292 published (e.g. `docker run ... -p 9292:9292 ...`).
+
+
 ## Making log data persistent <a name="persistent-log-data"></a>
 
 If you want your ELK stack to keep your log data across container restarts, you need to create a Docker data volume inside the ELK container at `/var/lib/elasticsearch`, which is the directory that Elasticsearch stores its data in.
 
 One way to do this with the `docker` command-line tool is to first create a named container called `elk_data` with a bound Docker volume by using the `-v` option:
 
-    $ sudo docker run -p 5601:5601 -p 9200:9200 -5000:5000 -v /var/lib/elasticsearch -it --name elk_data sebp/elk
+    $ sudo docker run -p 5601:5601 -p 9200:9200 -5000:5000 -v /var/lib/elasticsearch --name elk_data sebp/elk
 
 You can now reuse the persistent volume from that container using the `--volumes-from` option:
 
-    $ sudo docker run -p 5601:5601 -p 9200:9200 -p 5000:5000 --volumes-from elk_data -it --name elk sebp/elk
+    $ sudo docker run -p 5601:5601 -p 9200:9200 -p 5000:5000 --volumes-from elk_data --name elk sebp/elk
 
 Alternatively, if you're using Compose, then simply add the two following lines to your `docker-compose.yml` file, under the `elk:` entry:

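Those two lines would typically declare the data volume, for instance as in this sketch (assuming the Elasticsearch data directory mentioned above):

    volumes:
      - /var/lib/elasticsearch
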
@@ -261,16 +317,16 @@ As it stands this image is meant for local test use, and as such hasn't been sec
 
 To harden this image, at the very least you would want to:
 
-- Restrict the access to the ELK services to authorised hosts/networks only, as described in e.g. [Elasticsearch Scripting and Security](http://www.elasticsearch.org/blog/scripting-security/) and [Elastic Security: Deploying Logstash, ElasticSearch, Kibana "securely" on the Internet ](http://blog.eslimasec.com/2014/05/elastic-security-deploying-logstash.html).
+- Restrict the access to the ELK services to authorised hosts/networks only, as described in e.g. [Elasticsearch Scripting and Security](http://www.elasticsearch.org/blog/scripting-security/) and [Elastic Security: Deploying Logstash, ElasticSearch, Kibana "securely" on the Internet](http://blog.eslimasec.com/2014/05/elastic-security-deploying-logstash.html).
 - Password-protect the access to Kibana and Elasticsearch (see [SSL And Password Protection for Kibana](http://technosophos.com/2014/03/19/ssl-password-protection-for-kibana.html)).
 - Generate a new self-signed authentication certificate for the Logstash server (`cd /etc/pki/tls; sudo openssl req -x509 -batch -nodes -days 3650 -newkey rsa:2048 -keyout private/logstash-forwarder.key -out certs/logstash-forwarder.crt` for a 10-year certificate) or (better) get a proper certificate from a commercial provider (known as a certificate authority), and keep the private key private.
 
 ## References <a name="references"></a>
 
 - [How To Install Elasticsearch, Logstash, and Kibana 4 on Ubuntu 14.04](https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-4-on-ubuntu-14-04)
-- [Elasticsearch, Fluentd, and Kibana: Open Source Log Search and Visualization](https://www.digitalocean.com/community/tutorials/elasticsearch-fluentd-and-kibana-open-source-log-search-and-visualization)
 - [The Docker Book](http://www.dockerbook.com/)
 - [The Logstash Book](http://www.logstashbook.com/)
+- [Elastic's reference documentation](https://www.elastic.co/guide/index.html)
 
 ## About <a name="about"></a>
