Merge pull request #1 from hartfordfive/0.1.0
Release of first version 0.1.0
hartfordfive committed May 2, 2017
2 parents 88a6ed8 + ecaf86b commit 4e8641f
Showing 4,889 changed files with 1,232,373 additions and 0 deletions.
8 changes: 8 additions & 0 deletions .gitignore
@@ -0,0 +1,8 @@
/.idea
/build

.DS_Store
/protologbeat
/protologbeat.test
protologbeat.local.yml
*.pyc
43 changes: 43 additions & 0 deletions .travis.yml
@@ -0,0 +1,43 @@
sudo: required
dist: trusty
services:
  - docker

language: go

go:
  - 1.6

os:
  - linux
  - osx

env:
  matrix:
    - TARGETS="check"
    - TARGETS="testsuite"
  global:
    # Cross-compile for amd64 only to speed up testing.
    - GOX_FLAGS="-arch amd64"

addons:
  apt:
    packages:
      - python-virtualenv

before_install:
  # Redo the travis setup but with the elastic/libbeat path. This is needed so the package path is correct.
  - mkdir -p $HOME/gopath/src/github.com/hartfordfive/protologbeat/
  - rsync -az ${TRAVIS_BUILD_DIR}/ $HOME/gopath/src/github.com/hartfordfive/protologbeat/
  - export TRAVIS_BUILD_DIR=$HOME/gopath/src/github.com/hartfordfive/protologbeat/
  - cd $HOME/gopath/src/github.com/hartfordfive/protologbeat/

install:
  - true

script:
  - make $TARGETS

after_success:
  # Copy full.cov to coverage.txt because codecov.io requires this file
Empty file added CONTRIBUTING.md
Empty file.
46 changes: 46 additions & 0 deletions Makefile
@@ -0,0 +1,46 @@
BEATNAME=protologbeat
BEAT_DIR=github.com/hartfordfive/protologbeat
SYSTEM_TESTS=false
TEST_ENVIRONMENT=false
ES_BEATS?=./vendor/github.com/elastic/beats
GOPACKAGES=$(shell glide novendor)
PREFIX?=.
NOTICE_FILE=NOTICE

# Path to the libbeat Makefile
-include $(ES_BEATS)/libbeat/scripts/Makefile

# Initial beat setup
.PHONY: setup
setup: copy-vendor
	make update

# Copy beats into vendor directory
.PHONY: copy-vendor
copy-vendor:
	mkdir -p vendor/github.com/elastic/
	cp -R ${GOPATH}/src/github.com/elastic/beats vendor/github.com/elastic/
	rm -rf vendor/github.com/elastic/beats/.git

.PHONY: git-init
git-init:
	git init
	git add README.md CONTRIBUTING.md
	git commit -m "Initial commit"
	git add LICENSE
	git commit -m "Add the LICENSE"
	git add .gitignore
	git commit -m "Add git settings"
	git add .
	git reset -- .travis.yml
	git commit -m "Add protologbeat"
	git add .travis.yml
	git commit -m "Add Travis CI"

# This is called by the beats packer before building starts
.PHONY: before-build
before-build:

# Collects all dependencies and then calls update
.PHONY: collect
collect:
159 changes: 159 additions & 0 deletions README.md
@@ -0,0 +1,159 @@
# Protologbeat

## Description

This application is intended as a replacement for [udplogbeat](https://github.com/hartfordfive/udplogbeat). Although quite similar, it includes some improvements and allows you to listen over either UDP or TCP. It can accept plain-text or JSON logs and can also act as a syslog destination replacement.

Ensure that this folder is at the following location:
`${GOPATH}/src/github.com/hartfordfive`

## Getting Started with Protologbeat

### Configuration Options

- `protologbeat.port` : The UDP or TCP port on which the process will listen (Default = 5000)
- `protologbeat.max_message_size` : The maximum accepted message size (Default = 4096)
- `protologbeat.json_mode` : Enable logging of only JSON formatted messages (Default = false)
- `protologbeat.merge_fields_to_root` : When **json_mode** is enabled, whether to merge parsed fields into the root level (Default = false)
- `protologbeat.default_es_log_type` : Elasticsearch type to assign to an event if one isn't specified (Default = protologbeat)
- `protologbeat.enable_syslog_format_only` : Boolean value indicating if only syslog messages should be accepted (Default = false)
- `protologbeat.enable_json_validation` : Boolean value indicating if JSON schema validation should be applied to `json` format messages (Default = false)
- `protologbeat.validate_all_json_types` : When **json_mode** is enabled, indicates if ALL types must have a schema specified; log entries with types that have no schema will not be published (Default = false)
- `protologbeat.json_document_type_schema` : A hash with the Elasticsearch type as the key and the absolute local schema file path as the value

### Configuration Example

The following are examples of configuration blocks for the `protologbeat` section.

1. [Configuration](_samples/config1.yml) block for plain-text logging
2. [Configuration](_samples/config2.yml) block that enforces a JSON schema only for the indicated Elasticsearch types
3. [Configuration](_samples/config4.yml) block that enforces a JSON schema for all Elasticsearch types
4. [Configuration](_samples/config3.yml) block for a syslog replacement, with a custom ES type of 'myapp'

JSON schemas can be automatically generated from an object here: http://jsonschema.net/. You can also view the [email_contact](_samples/email_contact.json) and [stock_item](_samples/stock_item.json) schemas as examples.
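For illustration, the effect of this kind of validation can be sketched with a toy checker in Python. Note that this is not the validator protologbeat actually uses (the beat relies on a full JSON Schema implementation); the function below only mimics the `additionalProperties: false` and per-property type rules visible in the `email_contact` schema:

```python
# Toy sketch of the checks a JSON Schema validator performs for
# "additionalProperties": false plus per-property "type" constraints.
def matches_schema(doc, properties, additional_properties=False):
    type_map = {"string": str, "object": dict}
    for key, value in doc.items():
        if key not in properties:
            if not additional_properties:
                return False  # unknown field rejected, as in email_contact.json
            continue
        if not isinstance(value, type_map[properties[key]["type"]]):
            return False  # wrong JSON type for a declared property
    return True

# Property declarations mirroring _samples/email_contact.json
props = {
    "email": {"type": "string"},
    "name": {"type": "object"},
    "type": {"type": "string"},
}

ok = matches_schema(
    {"type": "email_contact", "email": "jane@example.com", "name": {"first": "Jane"}},
    props,
)
bad = matches_schema({"type": "email_contact", "unexpected": 1}, props)
print(ok, bad)  # True False
```

A message that fails validation against its type's schema is not published, which is the behaviour the `enable_json_validation` and `validate_all_json_types` options control.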

#### Considerations

- If you intend to use this as a drop-in replacement for logging with Rsyslog, note that this method will not persist your data to a file on disk.
- If protologbeat is down for any reason, messages sent to the configured UDP port will never be processed or sent to your ELK cluster.
- If you need a 100% guarantee that each message will be delivered at least once, this may not be the best solution for you.
- If some potential loss of log events is acceptable to you, then this may be a reasonable solution.
- This application is intended for scenarios where your application can log to protologbeat running on the same physical host. Using it for cross-server/cross-region/cross-datacenter logging is discouraged.
- The current date/time is automatically added to each log entry once it is received by protologbeat.
- Considering this can log data with any type of fields, it's suggested that you add your necessary field names and types to the [protologbeat.template-es2x.json](protologbeat.template-es2x.json) (*ES 2.x*) or [protologbeat.template.json](protologbeat.template.json) (*ES 5.x*) index templates.
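As a sketch, declaring a custom field in the ES 5.x index template could look like the fragment below. The `myapp` type and the field names are hypothetical examples, not part of the shipped templates:

```json
{
  "mappings": {
    "myapp": {
      "properties": {
        "log_level": { "type": "keyword" },
        "message":   { "type": "text" }
      }
    }
  }
}
```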

### Sample Clients

Please see the `_samples/` directory for examples of clients in various languages.
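A minimal client is easy to write in any language. The sketch below shows one in Python; the host, port, and JSON field names are assumptions matching the sample configs, not a published API:

```python
import json
import socket

def build_payload(message, fmt="plain"):
    """Encode a message as bytes, JSON-encoding it when fmt == 'json'."""
    return (json.dumps(message) if fmt == "json" else message).encode("utf-8")

def send_log(message, host="127.0.0.1", port=6000, proto="udp", fmt="plain"):
    """Send one log entry to a protologbeat listener (port matches the sample configs)."""
    payload = build_payload(message, fmt)
    if proto == "tcp":
        with socket.create_connection((host, port)) as sock:
            sock.sendall(payload + b"\n")
    else:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.sendto(payload, (host, port))
        sock.close()

# Plain-text message over UDP (UDP sends succeed even if no listener is running)
send_log("This is a sample message sent from the Python logger.")

# JSON message over UDP, for a json_mode configuration
send_log({"type": "myapp", "message": "hello", "log_level": "INFO"}, fmt="json")

# TCP requires a running listener, e.g.:
# send_log("hello over tcp", proto="tcp")
```

As with the bundled Lua sample, UDP delivery is fire-and-forget; use TCP if you want the client to notice a down listener.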


### Requirements

* [Golang](https://golang.org/dl/) 1.7

### Init Project
To get running with Protologbeat and also install the
dependencies, run the following command:

```
make setup
```

It will create a clean git history for each major step. Note that you can always rewrite the history if you wish before pushing your changes.

To push Protologbeat to the git repository, run the following commands:

```
git remote set-url origin https://github.com/hartfordfive/protologbeat
git push origin master
```

For further development, check out the [beat developer guide](https://www.elastic.co/guide/en/beats/libbeat/current/new-beat.html).

### Build

To build the binary for Protologbeat run the command below. This will generate a binary
in the same directory with the name protologbeat.

```
make
```


### Run

To run Protologbeat with debugging output enabled, run:

```
./protologbeat -c protologbeat.yml -e -d "*"
```


### Test

To test Protologbeat, run the following command:

```
make testsuite
```

alternatively:
```
make unit-tests
make system-tests
make integration-tests
make coverage-report
```

The test coverage report is written to the folder `./build/coverage/`.

### Update

Each beat has a template for the Elasticsearch mapping and documentation for its fields, both generated automatically from `etc/fields.yml`.
To generate `etc/protologbeat.template.json` and `etc/protologbeat.asciidoc`, run:

```
make update
```


### Cleanup

To clean Protologbeat source code, run the following commands:

```
make fmt
make simplify
```

To clean up the build directory and generated artifacts, run:

```
make clean
```


### Clone

To clone Protologbeat from the git repository, run the following commands:

```
mkdir -p ${GOPATH}/src/github.com/hartfordfive
cd ${GOPATH}/src/github.com/hartfordfive
git clone https://github.com/hartfordfive/protologbeat
```


For further development, check out the [beat developer guide](https://www.elastic.co/guide/en/beats/libbeat/current/new-beat.html).


## Packaging

The beat framework provides tools to cross-compile and package your beat for different platforms. This requires [docker](https://www.docker.com/) and the vendoring described above. To build packages of your beat, run the following command:

```
make package
```

This will fetch and create all images required for the build process. The whole process can take several minutes to finish.
7 changes: 7 additions & 0 deletions _meta/beat.yml
@@ -0,0 +1,7 @@
################### Protologbeat Configuration Example #########################

############################# Protologbeat ######################################

protologbeat:
  # Defines how often an event is sent to the output
  period: 1s
9 changes: 9 additions & 0 deletions _meta/fields.yml
@@ -0,0 +1,9 @@
- key: protologbeat
  title: protologbeat
  description:
  fields:
    - name: counter
      type: long
      required: true
      description: >
        PLEASE UPDATE DOCUMENTATION
6 changes: 6 additions & 0 deletions _meta/kibana/index-pattern/protologbeat.json
@@ -0,0 +1,6 @@
{
  "fields": "[{\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"beat.name\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"string\", \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"beat.hostname\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"string\", \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"beat.version\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"string\", \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"@timestamp\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"date\", \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"tags\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"string\", \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"fields\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"meta.cloud.provider\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"string\", \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"meta.cloud.instance_id\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"string\", \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"meta.cloud.machine_type\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"string\", \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"meta.cloud.availability_zone\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"string\", \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"meta.cloud.project_id\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"string\", \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"meta.cloud.region\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"string\", \"scripted\": false}, {\"count\": 0, \"analyzed\": false, \"aggregatable\": true, \"name\": \"counter\", \"searchable\": true, \"indexed\": true, \"doc_values\": true, \"type\": \"number\", \"scripted\": false}]",
  "fieldFormatMap": "{\"@timestamp\": {\"id\": \"date\"}}",
  "timeFieldName": "@timestamp",
  "title": "protologbeat-*"
}
3 changes: 3 additions & 0 deletions _samples/config1.yml
@@ -0,0 +1,3 @@
protologbeat:
  port: 6000
  max_message_size: 4096
9 changes: 9 additions & 0 deletions _samples/config2.yml
@@ -0,0 +1,9 @@
protologbeat:
  port: 6000
  max_message_size: 2048
  default_es_log_type: protologbeat
  merge_fields_to_root: true
  enable_json_validation: true
  json_schema:
    email_contact: "/etc/protologbeat/app1_schema.json"
    stock_item: "/etc/protologbeat/app2_schema.json"
5 changes: 5 additions & 0 deletions _samples/config3.yml
@@ -0,0 +1,5 @@
protologbeat:
  port: 6000
  max_message_size: 2048
  default_es_log_type: myapp
  enable_syslog_format_only: true
10 changes: 10 additions & 0 deletions _samples/config4.yml
@@ -0,0 +1,10 @@
protologbeat:
  port: 6000
  max_message_size: 2048
  default_es_log_type: protologbeat
  merge_fields_to_root: true
  enable_json_validation: true
  validate_all_json_types: true
  json_schema:
    email_contact: "/etc/protologbeat/app1_schema.json"
    stock_item: "/etc/protologbeat/app2_schema.json"
24 changes: 24 additions & 0 deletions _samples/email_contact.json
@@ -0,0 +1,24 @@
{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "type": "object",
  "properties": {
    "email": {
      "type": "string"
    },
    "name": {
      "type": "object",
      "properties": {
        "first": {
          "type": "string"
        },
        "last": {
          "type": "string"
        }
      }
    },
    "type": {
      "type": "string"
    }
  },
  "additionalProperties": false
}
42 changes: 42 additions & 0 deletions _samples/logger.lua
@@ -0,0 +1,42 @@
Logger = {}
Logger.__index = Logger

function Logger.init(host, port, proto, format)
  local sock = require("socket")
  local lgr = {}            -- our new object
  setmetatable(lgr, Logger) -- delegate method lookups to Logger

  lgr.proto = proto
  if proto == "tcp" then
    lgr.socket = sock.tcp()
    lgr.socket:connect(host, port)
  else
    lgr.socket = sock.udp()
  end
  lgr.socket:settimeout(0)
  lgr.host = host
  lgr.port = port
  if format == 'json' then
    lgr.json = require("cjson")
    lgr.format = 'json'
  else
    lgr.format = 'plain'
  end
  return lgr
end

function Logger:sendMsg(msg)
  local payload
  if self.format == 'json' then
    payload = self.json.encode(msg)
  else
    payload = msg
  end
  -- sendto() only exists on UDP sockets; TCP uses send() on the connected socket
  if self.proto == "tcp" then
    self.socket:send(payload .. "\n")
  else
    self.socket:sendto(payload, self.host, self.port)
  end
end

-- Start a logger client to send a plain-text formatted message to protologbeat listening on a UDP host/port
logger = Logger.init('127.0.0.1', 6000, "udp", "plain")
logger:sendMsg('This is a sample message sent from the Lua logger.')

-- Start a logger client to send a JSON formatted message to protologbeat listening on a TCP host/port
--logger = Logger.init('127.0.0.1', 6000, "tcp", "json")
--logger:sendMsg({type = 'lua_app_json', message = 'This is a sample message sent from the Lua logger.', log_level = 'INFO'})
