Merge feature/modules into master (elastic#7284)
* My changes (elastic#7218)

* First upstream PR commit (elastic#7172)

No tests yet. Just for code review for now.

* move all inner classes to their own folder + client and importer

* Fixes and tests (elastic#7228)

Add tests for the `LogStash::Modules:CLIParser` class in `cli_parser.rb`
Fix a typo in `cli_parser.rb` (`uparsed` vs `unparsed`)
Fix a bad variable name found by testing in `cli_parser.rb` and update the error message accordingly in `en.yml`

* Remove fb_modules (elastic#7280)

* fixes to import index-pattern & var updates & savedsearch capability (elastic#7283)

* fixes to import index-pattern & var updates & savedsearch capability

fixes to import index-pattern & var updates

add savedsearch capability

* minimise merge conflicts with the "End-to-End test with filebeat apache2" PR

* End-to-End test with filebeat apache2 (elastic#7279)

This is a first run, but data flows from Filebeat through Logstash to Elasticsearch.

Template uploads from `$LS_HOME/modules/MODULENAME/configuration/elasticsearch/MODULENAME.json`

Specify `--modules filebeat` on the command line, with `-M "filebeat.var.elasticsearch.output.host=localhost:9200"`

Some of the saved searches don't get uploaded. @guyboertje is on this already.

The Logstash configuration needs tweaking to allow receiving both access logs _and_ error logs. The dashboards and visualizations all seem to expect the presence of both.

Set default to `localhost` in `elasticsearch_client.rb`

Changed command-line variable parsing to allow for a variable with only `modulename.key.subkey=value`, and updated the error message accordingly.

First draft of the filebeat module, as extracted from filebeat 5.4.0

* Add documentation for Modules

This is specific to the master branch. Multiple modules will not be supported in 5.5.

* Add READMEs and prune post-code comments

* Add comment regarding the variable name `modul`

Also, fix the default username for the Elasticsearch output in Logstash. The default X-Pack credentials are `elastic:changeme` rather than `elasticsearch:changeme`.

* add cef module files (elastic#7292)

* fixes from reviews of PR elastic#7284
untergeek authored and Guy Boertje committed Jun 5, 2017
1 parent ab3546a commit f443dae
Showing 145 changed files with 3,801 additions and 9 deletions.
20 changes: 20 additions & 0 deletions config/logstash.yml
@@ -85,6 +85,25 @@
#
# config.debug: false
#
# ------------ Module Settings ---------------
# Define modules here. Module definitions must be defined as an array.
# The simplest way to see this is to prepend each `name` with a `-`, and keep
# all associated variables under the `name` they are associated with, and
# above the next `name`, like this:
#
# modules:
#   - name: MODULE_NAME
#     var.PLUGINTYPE1.PLUGINNAME1.KEY1: VALUE
#     var.PLUGINTYPE1.PLUGINNAME1.KEY2: VALUE
#     var.PLUGINTYPE2.PLUGINNAME1.KEY1: VALUE
#     var.PLUGINTYPE3.PLUGINNAME3.KEY1: VALUE
#
# Module variable names must be in the format of
#
# var.PLUGIN_TYPE.PLUGIN_NAME.KEY
#
# modules:
#
# ------------ Queuing Settings --------------
#
# Internal queuing model, "memory" for legacy in-memory based queuing and
@@ -169,3 +188,4 @@
#
# Where to find custom plugins
# path.plugins: []

18 changes: 17 additions & 1 deletion docs/static/running-logstash-command-line.asciidoc
@@ -50,7 +50,7 @@ is the same as `-f bar`).
+
You can specify wildcards (<<glob-support,globs>>) and any matched files will
be loaded in the order described above. For example, you can use the wildcard feature to
load specific files by name:
+
[source,shell]
---------------------------------------------
@@ -68,6 +68,22 @@ With this command, Logstash concatenates three config files, `/tmp/one`, `/tmp/t
If you wish to use both defaults, please use the empty string for the `-e` flag.
The default is nil.

*`--modules`*::
Launch the named module. Works in conjunction with the `-M` option to assign values to
default variables for the specified module. If `--modules` is used on the command line,
any modules defined in `logstash.yml` will be ignored, as will any module settings defined
there. This flag is mutually exclusive with the `-f` and `-e` flags: only one of `-f`, `-e`,
or `--modules` may be specified. Multiple modules can be specified by separating them with
a comma, or by invoking the `--modules` flag multiple times. A combined example with `-M` is
shown below.

*`-M, --modules.variable`*::
Assign a value to a configurable option for a module. The format for assigning variables is
`-M "MODULE_NAME.var.PLUGIN_TYPE.PLUGIN_NAME.KEY_NAME=value"` for Logstash variables. For other
settings, it is `-M "MODULE_NAME.KEY_NAME.SUB_KEYNAME=value"`. The `-M` flag can be used as many
times as necessary. If no `-M` options are specified, the default value for that setting is used.
The `-M` flag is only used in conjunction with the `--modules` flag; it is ignored if the
`--modules` flag is absent.
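+
As an illustrative sketch (the module name and variable below are taken from the filebeat
example used while developing this feature and may differ in your setup), the two flags are
typically combined like this:
+
[source,shell]
---------------------------------------------
bin/logstash --modules filebeat -M "filebeat.var.elasticsearch.output.host=localhost:9200"
---------------------------------------------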

*`-w, --pipeline.workers COUNT`*::
Sets the number of pipeline workers to run. This option sets the number of workers that will,
in parallel, execute the filter and output stages of the pipeline. If you find that events are
26 changes: 24 additions & 2 deletions docs/static/settings-file.asciidoc
@@ -45,6 +45,24 @@ path:
Note that the `${VAR_NAME:default_value}` notation is supported, setting a default batch delay
of `5` and a default `path.queue` of `/tmp/queue` in the above example.

Modules may also be specified in the `logstash.yml` file. The module definitions will have
this format:

[source,yaml]
-------------------------------------------------------------------------------------
modules:
  - name: MODULE_NAME1
    var.PLUGIN_TYPE1.PLUGIN_NAME1.KEY1: VALUE
    var.PLUGIN_TYPE1.PLUGIN_NAME1.KEY2: VALUE
    var.PLUGIN_TYPE2.PLUGIN_NAME2.KEY1: VALUE
    var.PLUGIN_TYPE3.PLUGIN_NAME3.KEY1: VALUE
  - name: MODULE_NAME2
    var.PLUGIN_TYPE1.PLUGIN_NAME1.KEY1: VALUE
    var.PLUGIN_TYPE1.PLUGIN_NAME1.KEY2: VALUE
-------------------------------------------------------------------------------------

IMPORTANT: If the <<command-line-flags,command-line flag>> `--modules` is used, any modules defined in the `logstash.yml` file will be ignored.
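
As a concrete, hypothetical illustration using the filebeat module from this change, an entry
might look like the following; the variable name mirrors the command-line example used during
development and is not an exhaustive list of that module's settings:

[source,yaml]
-------------------------------------------------------------------------------------
modules:
  - name: filebeat
    var.elasticsearch.output.host: "localhost:9200"
-------------------------------------------------------------------------------------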

The `logstash.yml` file includes the following settings:

[options="header"]
@@ -120,6 +138,10 @@ The `logstash.yml` file includes the following settings:
in plaintext passwords appearing in your logs!
| `false`

| `modules`
| When configured, `modules` must be in the nested YAML structure described above this table.
| None

| `queue.type`
| The internal queuing model to use for event buffering. Specify `memory` for legacy in-memory based queuing, or `persisted` for disk-based ACKed queueing (<<persistent-queues,persistent queues>>).
| `memory`
@@ -137,7 +159,7 @@ The `logstash.yml` file includes the following settings:
| 0 (unlimited)

| `queue.max_bytes`
| The total capacity of the queue in number of bytes. Make sure the capacity of your disk drive is greater than the value you specify here. If both `queue.max_events` and `queue.max_bytes` are specified, Logstash uses whichever criteria is reached first.
| 1024mb (1g)

| `queue.checkpoint.acks`
@@ -172,7 +194,7 @@ The `logstash.yml` file includes the following settings:
a|
The log level. Valid options are:

* `fatal`
* `error`
* `warn`
* `info`
82 changes: 78 additions & 4 deletions logstash-core/lib/logstash/bootstrap_check/default_config.rb
@@ -1,13 +1,87 @@
# encoding: utf-8
require "logstash/errors"
require "logstash/logging"

module LogStash module BootstrapCheck
class DefaultConfig
def self.check(settings)
# currently none of the checks applies if there are multiple pipelines
if settings.get("config.reload.automatic") && settings.get_setting("config.string").set?
raise LogStash::BootstrapCheckError, I18n.t("logstash.runner.reload-with-config-string")
include LogStash::Util::Loggable

def initialize(settings)
@settings = settings
end

def config_reload?
@settings.get("config.reload.automatic")
end

def config_string?
@settings.get("config.string")
end

def path_config?
@settings.get("path.config")
end

def config_modules?
# We want it to report true if not empty
!@settings.get("modules").empty?
end

def cli_modules?
# We want it to report true if not empty
!@settings.get("modules.cli").empty?
end

def both_config_flags?
config_string? && path_config?
end

def both_module_configs?
cli_modules? && config_modules?
end

def config_defined?
config_string? || path_config?
end

def modules_defined?
cli_modules? || config_modules?
end

def any_config?
config_defined? || modules_defined?
end

def check
# Check if both -f and -e are present
if both_config_flags?
raise LogStash::BootstrapCheckError, I18n.t("logstash.runner.config-string-path-exclusive")
end

# Note that if modules are configured in both the CLI and logstash.yml, the CLI module
# settings will be used and the logstash.yml module settings will be ignored
if both_module_configs?
logger.info(I18n.t("logstash.runner.cli-module-override"))
end

# Check if both config (-f or -e) and modules are configured
if config_defined? && modules_defined?
raise LogStash::BootstrapCheckError, I18n.t("logstash.runner.config-module-exclusive")
end

# Check for absence of any configuration
if !any_config?
raise LogStash::BootstrapCheckError, I18n.t("logstash.runner.missing-configuration")
end

# Check to ensure that if configuration auto-reload is used that -f is specified
if config_reload? && !path_config?
raise LogStash::BootstrapCheckError, I18n.t("logstash.runner.reload-without-config-path")
end
end

def self.check(settings)
DefaultConfig.new(settings).check
end
end
end end
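
A rough, hypothetical spec sketch for the reworked check (not part of this commit); it assumes logstash-core's spec environment, with `LogStash::SETTINGS`, `Settings#clone`, and `Settings#set` behaving as on master:

# encoding: utf-8
# Rough spec sketch (not part of this commit) for the reworked bootstrap check.
# Assumes the settings registered in environment.rb are available via LogStash::SETTINGS.
require "logstash/environment"
require "logstash/bootstrap_check/default_config"

describe LogStash::BootstrapCheck::DefaultConfig do
  let(:settings) { LogStash::SETTINGS.clone }

  it "raises when both a config string (-e) and a config path (-f) are given" do
    settings.set("config.string", "input { generator {} } output { null {} }")
    settings.set("path.config", "/tmp/pipeline.conf")
    expect { described_class.check(settings) }.to raise_error(LogStash::BootstrapCheckError)
  end

  it "raises when a config (-e or -f) and a module are both given" do
    settings.set("config.string", "input { generator {} } output { null {} }")
    settings.set("modules", [{ "name" => "filebeat" }])
    expect { described_class.check(settings) }.to raise_error(LogStash::BootstrapCheckError)
  end
end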
52 changes: 52 additions & 0 deletions logstash-core/lib/logstash/config/source/modules.rb
@@ -0,0 +1,52 @@
# encoding: utf-8
require "logstash/config/source/base"
require "logstash/config/pipeline_config"
require "logstash/util/loggable"
require "logstash/elasticsearch_client"
require "logstash/modules/importer"
require "logstash/errors"

module LogStash module Config module Source
class Modules < Base
include LogStash::Util::Loggable
def pipeline_configs
pipelines = []
plugin_modules = LogStash::PLUGIN_REGISTRY.plugins_with_type(:modules)

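# Modules given on the command line (--modules, with -M overrides) take precedence:
# when modules.cli is non-empty it is used and any modules entries from logstash.yml
# are ignored.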
modules_array = @settings.get("modules.cli").empty? ? @settings.get("modules") : @settings.get("modules.cli")
logger.debug("Configured modules", :modules_array => modules_array.to_s)
module_names = []
module_names = modules_array.collect {|module_hash| module_hash["name"]}
if module_names.length > module_names.uniq.length
duplicate_modules = module_names.group_by(&:to_s).select { |_,v| v.size > 1 }.keys
raise LogStash::ConfigLoadingError, I18n.t("logstash.modules.configuration.modules-must-be-unique", :duplicate_modules => duplicate_modules)
end
### Here is where we can force the modules_array to use only [0] for 5.5, and leave
### a warning/error message to that effect.
modules_array.each do |module_hash|
begin
import_engine = LogStash::Modules::Importer.new(LogStash::ElasticsearchClient.build(module_hash))

current_module = plugin_modules.find { |allmodules| allmodules.module_name == module_hash["name"] }
alt_name = "module-#{module_hash["name"]}"
pipeline_id = alt_name

current_module.with_settings(module_hash)
current_module.import(import_engine)
config_string = current_module.config_string

config_part = org.logstash.common.SourceWithMetadata.new("module", alt_name, config_string)
pipelines << PipelineConfig.new(self, pipeline_id.to_sym, config_part, @settings)
rescue => e
raise LogStash::ConfigLoadingError, I18n.t("logstash.modules.configuration.parse-failed", :error => e.message)
end
end
pipelines
end

def match?
# will fill this later
true
end
end
end end end
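
For reference, a standalone sketch of the duplicate-module-name check used in `pipeline_configs` above; the module names here are illustrative only:

# Standalone sketch of the duplicate-module check above (illustrative names).
module_names = ["filebeat", "cef", "filebeat"]
duplicate_modules = module_names.group_by(&:to_s).select { |_, v| v.size > 1 }.keys
puts duplicate_modules.inspect  # => ["filebeat"]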
1 change: 1 addition & 0 deletions logstash-core/lib/logstash/config/source_loader.rb
@@ -1,5 +1,6 @@
# encoding: utf-8
require "logstash/config/source/local"
require "logstash/config/source/modules"
require "logstash/config/source/multi_local"
require "logstash/errors"
require "thread"
97 changes: 97 additions & 0 deletions logstash-core/lib/logstash/elasticsearch_client.rb
@@ -0,0 +1,97 @@
# encoding: utf-8
require "logstash/namespace"
require "logstash/logging"
require "elasticsearch"
require "elasticsearch/transport/transport/http/manticore"

module LogStash class ElasticsearchClient
include LogStash::Util::Loggable

class Response
# duplicated here from Elasticsearch::Transport::Transport::Response
# to create a normalised response across different client implementations
attr_reader :status, :body, :headers
def initialize(status, body, headers={})
@status, @body, @headers = status, body, headers
@body = body.force_encoding('UTF-8') if body.respond_to?(:force_encoding)
end
end

def self.build(settings)
new(RubyClient.new(settings, logger))
end

class RubyClient
def initialize(settings, logger)
@settings = settings
@logger = logger
@client = Elasticsearch::Client.new(client_args)
end

def delete(path)
begin
normalize_response(@client.perform_request('DELETE', path, {}, nil))
rescue Exception => e
if e.class.to_s =~ /NotFound/ || e.message =~ /Not\s*Found|404/i
Response.new(404, "", {})
else
raise e
end
end
end

def put(path, content)
normalize_response(@client.perform_request('PUT', path, {}, content))
end

def head(path)
begin
normalize_response(@client.perform_request('HEAD', path, {}, nil))
rescue Exception => e
if is_404_error?(e)
Response.new(404, "", {})
else
raise e
end
end
end

private

def is_404_error?(error)
error.class.to_s =~ /NotFound/ || error.message =~ /Not\s*Found|404/i
end

def normalize_response(response)
Response.new(response.status, response.body, response.headers)
end

def client_args
{
:transport_class => Elasticsearch::Transport::Transport::HTTP::Manticore,
:hosts => [*unpack_hosts],
:logger => @logger,
}
end

def unpack_hosts
@settings.fetch("var.output.elasticsearch.hosts", "localhost:9200").split(',').map(&:strip)
end
end

def initialize(client)
@client = client
end

def delete(path)
@client.delete(path)
end

def put(path, content)
@client.put(path, content)
end

def head(path)
@client.head(path)
end
end end # class LogStash::ElasticsearchClient
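
A hypothetical usage sketch (not part of this commit): build a client from module settings and upload an index template, roughly as the importer does for `modules/MODULENAME/configuration/elasticsearch/MODULENAME.json`. The settings key mirrors `unpack_hosts` above; the file path and template name are illustrative only:

# Hypothetical usage sketch; the settings key mirrors unpack_hosts above and
# the file/template paths are illustrative only.
require "logstash/elasticsearch_client"

settings = { "var.output.elasticsearch.hosts" => "localhost:9200" }
client   = LogStash::ElasticsearchClient.build(settings)

template = File.read("modules/filebeat/configuration/elasticsearch/filebeat.json")
response = client.put("_template/filebeat", template)
puts response.status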
2 changes: 2 additions & 0 deletions logstash-core/lib/logstash/environment.rb
@@ -20,6 +20,8 @@ module Environment
Setting::NullableString.new("path.config", nil, false),
Setting::WritableDirectory.new("path.data", ::File.join(LogStash::Environment::LOGSTASH_HOME, "data")),
Setting::NullableString.new("config.string", nil, false),
Setting.new("modules.cli", Array, []),
Setting.new("modules", Array, []),
Setting::Boolean.new("config.test_and_exit", false),
Setting::Boolean.new("config.reload.automatic", false),
Setting::Numeric.new("config.reload.interval", 3), # in seconds