[DOCS] http -> https, remove outdated plugin docs (elastic#60380)
Plugin discovery documentation contained information about installing
Elasticsearch 2.0 and installing an Oracle JDK, both of which are no
longer valid.

While noticing that the instructions used cleartext HTTP to install
packages, this commit also replaces HTTP links with HTTPS where possible.

In addition, a few community links have been removed, as they no longer
seem to exist.
spinscale authored Jul 31, 2020
1 parent 9471867 commit c7ac9e7
Showing 80 changed files with 188 additions and 242 deletions.
3 changes: 1 addition & 2 deletions docs/Versions.asciidoc
@@ -16,8 +16,7 @@ include::{docs-root}/shared/versions/stack/{source_branch}.asciidoc[]
Javadoc roots used to generate links from Painless's API reference
///////
:java11-javadoc: https://docs.oracle.com/en/java/javase/11/docs/api
:joda-time-javadoc: http://www.joda.org/joda-time/apidocs
:lucene-core-javadoc: http://lucene.apache.org/core/{lucene_version_path}/core
:lucene-core-javadoc: https://lucene.apache.org/core/{lucene_version_path}/core

ifeval::["{release-state}"=="unreleased"]
:elasticsearch-javadoc: https://snapshots.elastic.co/javadoc/org/elasticsearch/elasticsearch/{version}-SNAPSHOT
13 changes: 5 additions & 8 deletions docs/community-clients/index.asciidoc
@@ -53,7 +53,7 @@ a number of clients that have been contributed by the community for various lang
* https://github.com/mpenet/spandex[Spandex]:
Clojure client, based on the new official low level rest-client.

* http://github.com/clojurewerkz/elastisch[Elastisch]:
* https://github.com/clojurewerkz/elastisch[Elastisch]:
Clojure client.

[[coldfusion]]
@@ -65,12 +65,12 @@ a number of clients that have been contributed by the community for various lang
[[erlang]]
== Erlang

* http://github.com/tsloughter/erlastic_search[erlastic_search]:
* https://github.com/tsloughter/erlastic_search[erlastic_search]:
Erlang client using HTTP.

* https://github.com/datahogs/tirexs[Tirexs]:
An https://github.com/elixir-lang/elixir[Elixir] based API/DSL, inspired by
http://github.com/karmi/tire[Tire]. Ready to use in pure Erlang
https://github.com/karmi/tire[Tire]. Ready to use in pure Erlang
environment.

* https://github.com/sashman/elasticsearch_elixir_bulk_processor[Elixir Bulk Processor]:
@@ -145,10 +145,10 @@ Also see the {client}/perl-api/current/index.html[official Elasticsearch Perl cl

Also see the {client}/php-api/current/index.html[official Elasticsearch PHP client].

* http://github.com/ruflin/Elastica[Elastica]:
* https://github.com/ruflin/Elastica[Elastica]:
PHP client.

* http://github.com/nervetattoo/elasticsearch[elasticsearch] PHP client.
* https://github.com/nervetattoo/elasticsearch[elasticsearch] PHP client.

* https://github.com/madewithlove/elasticsearcher[elasticsearcher] Agnostic lightweight package on top of the Elasticsearch PHP client. Its main goal is to allow for easier structuring of queries and indices in your application. It does not want to hide or replace functionality of the Elasticsearch PHP client.

@@ -218,9 +218,6 @@ Also see the {client}/rust-api/current/index.html[official Elasticsearch Rust cl
* https://github.com/newapplesho/elasticsearch-smalltalk[elasticsearch-smalltalk] -
Pharo Smalltalk client for Elasticsearch

* http://ss3.gemstone.com/ss/Elasticsearch.html[Elasticsearch] -
Smalltalk client for Elasticsearch

[[vertx]]
== Vert.x

2 changes: 1 addition & 1 deletion docs/java-rest/high-level/getting-started.asciidoc
@@ -40,7 +40,7 @@ The javadoc for the REST high level client can be found at {rest-high-level-clie
=== Maven Repository

The high-level Java REST client is hosted on
http://search.maven.org/#search%7Cga%7C1%7Cg%3A%22org.elasticsearch.client%22[Maven
https://search.maven.org/search?q=g:org.elasticsearch.client[Maven
Central]. The minimum Java version required is `1.8`.

The High Level REST Client is subject to the same release cycle as
8 changes: 4 additions & 4 deletions docs/java-rest/low-level/configuration.asciidoc
@@ -140,7 +140,7 @@ openssl pkcs12 -export -in client.crt -inkey private_key.pem \
-name "client" -out client.p12
```

If no explicit configuration is provided, the http://docs.oracle.com/javase/7/docs/technotes/guides/security/jsse/JSSERefGuide.html#CustomizingStores[system default configuration]
If no explicit configuration is provided, the https://docs.oracle.com/javase/7/docs/technotes/guides/security/jsse/JSSERefGuide.html#CustomizingStores[system default configuration]
will be used.

=== Others
@@ -154,11 +154,11 @@ indefinitely and negative hostname resolutions for ten seconds. If the resolved
addresses of the hosts to which you are connecting the client to vary with time
then you might want to modify the default JVM behavior. These can be modified by
adding
http://docs.oracle.com/javase/8/docs/technotes/guides/net/properties.html[`networkaddress.cache.ttl=<timeout>`]
https://docs.oracle.com/javase/8/docs/technotes/guides/net/properties.html[`networkaddress.cache.ttl=<timeout>`]
and
http://docs.oracle.com/javase/8/docs/technotes/guides/net/properties.html[`networkaddress.cache.negative.ttl=<timeout>`]
https://docs.oracle.com/javase/8/docs/technotes/guides/net/properties.html[`networkaddress.cache.negative.ttl=<timeout>`]
to your
http://docs.oracle.com/javase/8/docs/technotes/guides/security/PolicyFiles.html[Java
https://docs.oracle.com/javase/8/docs/technotes/guides/security/PolicyFiles.html[Java
security policy].
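
As a rough sketch of the same knobs: `networkaddress.cache.ttl` and `networkaddress.cache.negative.ttl` are Java security properties, so as an alternative to editing the security policy file they can also be set programmatically, provided this runs before the JVM performs its first hostname lookup (illustrative only, not part of the documented setup):

[source,java]
----
import java.security.Security;

public class DnsCacheSettings {
    public static void main(String[] args) {
        // cache successful hostname resolutions for 60 seconds instead of indefinitely
        Security.setProperty("networkaddress.cache.ttl", "60");
        // keep negative (failed) resolutions for 10 seconds
        Security.setProperty("networkaddress.cache.negative.ttl", "10");
        // ... create the RestClient only after these properties are set
    }
}
----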

=== Node selector
8 changes: 4 additions & 4 deletions docs/java-rest/low-level/usage.asciidoc
@@ -13,7 +13,7 @@ The javadoc for the low level REST client can be found at {rest-client-javadoc}/
=== Maven Repository

The low-level Java REST client is hosted on
http://search.maven.org/#search%7Cga%7C1%7Cg%3A%22org.elasticsearch.client%22[Maven
https://search.maven.org/#search%7Cga%7C1%7Cg%3A%22org.elasticsearch.client%22[Maven
Central]. The minimum Java version required is `1.8`.

The low-level REST client is subject to the same release cycle as
@@ -57,7 +57,7 @@ dependencies {
=== Dependencies

The low-level Java REST client internally uses the
http://hc.apache.org/httpcomponents-asyncclient-dev/[Apache Http Async Client]
https://hc.apache.org/httpcomponents-asyncclient-dev/[Apache Http Async Client]
to send http requests. It depends on the following artifacts, namely the async
http client and its own transitive dependencies:

@@ -212,7 +212,7 @@ include-tagged::{doc-tests}/RestClientDocumentation.java[rest-client-init-client
--------------------------------------------------
<1> Set a callback that allows to modify the http client configuration
(e.g. encrypted communication over ssl, or anything that the
http://hc.apache.org/httpcomponents-asyncclient-dev/httpasyncclient/apidocs/org/apache/http/impl/nio/client/HttpAsyncClientBuilder.html[`org.apache.http.impl.nio.client.HttpAsyncClientBuilder`]
https://hc.apache.org/httpcomponents-asyncclient-dev/httpasyncclient/apidocs/org/apache/http/impl/nio/client/HttpAsyncClientBuilder.html[`org.apache.http.impl.nio.client.HttpAsyncClientBuilder`]
allows to set)
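
For context, a minimal sketch of such a callback with the low-level client, here setting an `SSLContext` on the `HttpAsyncClientBuilder`; the host name, port, and the default `SSLContext` are placeholders, not the documented configuration:

[source,java]
----
import javax.net.ssl.SSLContext;
import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestClientBuilder;

public class RestClientSslSketch {
    public static RestClient build() throws Exception {
        SSLContext sslContext = SSLContext.getDefault(); // or one built from a custom keystore
        RestClientBuilder builder = RestClient.builder(
                new HttpHost("localhost", 9200, "https")); // placeholder host and port
        // the callback receives the HttpAsyncClientBuilder and returns it after customization
        builder.setHttpClientConfigCallback(
                httpClientBuilder -> httpClientBuilder.setSSLContext(sslContext));
        return builder.build();
    }
}
----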


@@ -401,7 +401,7 @@ https://hc.apache.org/httpcomponents-core-ga/httpcore/apidocs/org/apache/http/Ht
`HttpEntity#getContent` method comes handy which returns an `InputStream`
reading from the previously buffered response body. As an alternative, it is
possible to provide a custom
http://hc.apache.org/httpcomponents-core-ga/httpcore-nio/apidocs/org/apache/http/nio/protocol/HttpAsyncResponseConsumer.html[`org.apache.http.nio.protocol.HttpAsyncResponseConsumer`]
https://hc.apache.org/httpcomponents-core-ga/httpcore-nio/apidocs/org/apache/http/nio/protocol/HttpAsyncResponseConsumer.html[`org.apache.http.nio.protocol.HttpAsyncResponseConsumer`]
that controls how bytes are read and buffered.
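
As an illustrative sketch (the endpoint and the surrounding client setup are assumed), streaming a buffered response body through `HttpEntity#getContent` looks roughly like this:

[source,java]
----
import java.io.InputStream;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class ResponseStreamingSketch {
    public static void readBody(RestClient restClient) throws Exception {
        Response response = restClient.performRequest(new Request("GET", "/_cluster/health"));
        // getContent() returns an InputStream over the already buffered response body
        try (InputStream body = response.getEntity().getContent()) {
            byte[] chunk = new byte[8192];
            int read;
            while ((read = body.read(chunk)) != -1) {
                // feed the bytes to a streaming parser instead of materializing a String
            }
        }
    }
}
----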

[[java-rest-low-usage-logging]]
4 changes: 2 additions & 2 deletions docs/painless/painless-guide/painless-walkthrough.asciidoc
@@ -219,7 +219,7 @@ Painless's native support for regular expressions has syntax constructs:

* `/pattern/`: Pattern literals create patterns. This is the only way to create
a pattern in painless. The pattern inside the ++/++'s are just
http://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html[Java regular expressions].
https://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html[Java regular expressions].
See <<pattern-flags>> for more.
* `=~`: The find operator return a `boolean`, `true` if a subsequence of the
text matches, `false` otherwise.
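
Since pattern literals are compiled down to `java.util.regex`, the behaviour of the find operator and of the `replaceAll` call described further below can be sketched directly in Java (the sample input and pattern are made up for illustration):

[source,java]
----
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RegexSketch {
    public static void main(String[] args) {
        // equivalent to the Painless pattern literal /(\d+)/
        Pattern pattern = Pattern.compile("(\\d+)");
        Matcher matcher = pattern.matcher("player 99");
        boolean found = matcher.find();                // what the =~ find operator reports: true
        String renumbered = matcher.replaceAll("#$1"); // "$1" is the first capture group -> "player #99"
        System.out.println(found + " " + renumbered);
    }
}
----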
@@ -281,7 +281,7 @@ POST hockey/_update_by_query
----------------------------------------------------------------

`Matcher.replaceAll` is just a call to Java's `Matcher`'s
http://docs.oracle.com/javase/8/docs/api/java/util/regex/Matcher.html#replaceAll-java.lang.String-[replaceAll]
https://docs.oracle.com/javase/8/docs/api/java/util/regex/Matcher.html#replaceAll-java.lang.String-[replaceAll]
method so it supports `$1` and `\1` for replacements:

[source,console]
6 changes: 3 additions & 3 deletions docs/painless/painless-lang-spec.asciidoc
@@ -11,10 +11,10 @@ refer to the corresponding topics in the
https://docs.oracle.com/javase/specs/jls/se8/html/index.html[Java Language
Specification].

Painless scripts are parsed and compiled using the http://www.antlr.org/[ANTLR4]
and http://asm.ow2.org/[ASM] libraries. Scripts are compiled directly
Painless scripts are parsed and compiled using the https://www.antlr.org/[ANTLR4]
and https://asm.ow2.org/[ASM] libraries. Scripts are compiled directly
into Java Virtual Machine (JVM) byte code and executed against a standard JVM.
This specification uses ANTLR4 grammar notation to describe the allowed syntax.
However, the actual Painless grammar is more compact than what is shown here.

include::painless-lang-spec/index.asciidoc[]
include::painless-lang-spec/index.asciidoc[]
12 changes: 6 additions & 6 deletions docs/plugins/analysis-icu.asciidoc
@@ -57,7 +57,7 @@ convert `nfc` to `nfd` or `nfkc` to `nfkd` respectively:

Which letters are normalized can be controlled by specifying the
`unicode_set_filter` parameter, which accepts a
http://icu-project.org/apiref/icu4j/com/ibm/icu/text/UnicodeSet.html[UnicodeSet].
https://icu-project.org/apiref/icu4j/com/ibm/icu/text/UnicodeSet.html[UnicodeSet].

Here are two examples, the default usage and a customised character filter:

@@ -103,7 +103,7 @@ PUT icu_sample
==== ICU Tokenizer

Tokenizes text into words on word boundaries, as defined in
http://www.unicode.org/reports/tr29/[UAX #29: Unicode Text Segmentation].
https://www.unicode.org/reports/tr29/[UAX #29: Unicode Text Segmentation].
It behaves much like the {ref}/analysis-standard-tokenizer.html[`standard` tokenizer],
but adds better support for some Asian languages by using a dictionary-based
approach to identify words in Thai, Lao, Chinese, Japanese, and Korean, and
@@ -137,7 +137,7 @@ for a more detailed explanation.

To add icu tokenizer rules, set the `rule_files` settings, which should contain a comma-separated list of
`code:rulefile` pairs in the following format:
http://unicode.org/iso15924/iso15924-codes.html[four-letter ISO 15924 script code],
https://unicode.org/iso15924/iso15924-codes.html[four-letter ISO 15924 script code],
followed by a colon, then a rule file name. Rule files are placed `ES_HOME/config` directory.

As a demonstration of how the rule files can be used, save the following user file to `$ES_HOME/config/KeywordTokenizer.rbbi`:
@@ -210,7 +210,7 @@ with the `name` parameter, which accepts `nfc`, `nfkc`, and `nfkc_cf`

Which letters are normalized can be controlled by specifying the
`unicode_set_filter` parameter, which accepts a
http://icu-project.org/apiref/icu4j/com/ibm/icu/text/UnicodeSet.html[UnicodeSet].
https://icu-project.org/apiref/icu4j/com/ibm/icu/text/UnicodeSet.html[UnicodeSet].

You should probably prefer the <<analysis-icu-normalization-charfilter,Normalization character filter>>.

@@ -287,7 +287,7 @@ no need to use Normalize character or token filter as well.

Which letters are folded can be controlled by specifying the
`unicode_set_filter` parameter, which accepts a
http://icu-project.org/apiref/icu4j/com/ibm/icu/text/UnicodeSet.html[UnicodeSet].
https://icu-project.org/apiref/icu4j/com/ibm/icu/text/UnicodeSet.html[UnicodeSet].

The following example exempts Swedish characters from folding. It is important
to note that both upper and lowercase forms should be specified, and that
@@ -433,7 +433,7 @@ The following parameters are accepted by `icu_collation_keyword` fields:
The strength property determines the minimum level of difference considered
significant during comparison. Possible values are : `primary`, `secondary`,
`tertiary`, `quaternary` or `identical`. See the
http://icu-project.org/apiref/icu4j/com/ibm/icu/text/Collator.html[ICU Collation documentation]
https://icu-project.org/apiref/icu4j/com/ibm/icu/text/Collator.html[ICU Collation documentation]
for a more detailed explanation for each value. Defaults to `tertiary`
unless otherwise specified in the collation.
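
For intuition about what the strength levels mean, a small sketch using the ICU4J `Collator` that the plugin builds on (assuming the `com.ibm.icu:icu4j` dependency is available; the locale and sample strings are made up):

[source,java]
----
import com.ibm.icu.text.Collator;
import com.ibm.icu.util.ULocale;

public class CollationStrengthSketch {
    public static void main(String[] args) {
        Collator collator = Collator.getInstance(new ULocale("en"));

        // PRIMARY: only base letters are significant, so case differences are ignored
        collator.setStrength(Collator.PRIMARY);
        System.out.println(collator.compare("elastic", "ELASTIC")); // 0 (considered equal)

        // TERTIARY (the default): case becomes significant
        collator.setStrength(Collator.TERTIARY);
        System.out.println(collator.compare("elastic", "ELASTIC")); // non-zero
    }
}
----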

3 changes: 0 additions & 3 deletions docs/plugins/analysis-stempel.asciidoc
@@ -4,9 +4,6 @@
The Stempel Analysis plugin integrates Lucene's Stempel analysis
module for Polish into elasticsearch.

It provides high quality stemming for Polish, based on the
http://www.egothor.org/[Egothor project].

:plugin_name: analysis-stempel
include::install_remove.asciidoc[]

2 changes: 1 addition & 1 deletion docs/plugins/analysis-ukrainian.asciidoc
@@ -3,7 +3,7 @@

The Ukrainian Analysis plugin integrates Lucene's UkrainianMorfologikAnalyzer into elasticsearch.

It provides stemming for Ukrainian using the http://github.com/morfologik/morfologik-stemming[Morfologik project].
It provides stemming for Ukrainian using the https://github.com/morfologik/morfologik-stemming[Morfologik project].

:plugin_name: analysis-ukrainian
include::install_remove.asciidoc[]
2 changes: 1 addition & 1 deletion docs/plugins/analysis.asciidoc
@@ -18,7 +18,7 @@ transliteration.

<<analysis-kuromoji,Kuromoji>>::

Advanced analysis of Japanese using the http://www.atilika.org/[Kuromoji analyzer].
Advanced analysis of Japanese using the https://www.atilika.org/[Kuromoji analyzer].

<<analysis-nori,Nori>>::

4 changes: 2 additions & 2 deletions docs/plugins/api.asciidoc
@@ -9,7 +9,7 @@ API extension plugins add new functionality to Elasticsearch by adding new APIs
A number of plugins have been contributed by our community:

* https://github.com/carrot2/elasticsearch-carrot2[carrot2 Plugin]:
Results clustering with http://project.carrot2.org/[carrot2] (by Dawid Weiss)
Results clustering with https://github.com/carrot2/carrot2[carrot2] (by Dawid Weiss)

* https://github.com/wikimedia/search-extra[Elasticsearch Trigram Accelerated Regular Expression Filter]:
(by Wikimedia Foundation/Nik Everett)
@@ -18,7 +18,7 @@ A number of plugins have been contributed by our community:
(by Wikimedia Foundation/Nik Everett)

* https://github.com/YannBrrd/elasticsearch-entity-resolution[Entity Resolution Plugin]:
Uses http://github.com/larsga/Duke[Duke] for duplication detection (by Yann Barraud)
Uses https://github.com/larsga/Duke[Duke] for duplication detection (by Yann Barraud)

* https://github.com/zentity-io/zentity[Entity Resolution Plugin] (https://zentity.io[zentity]):
Real-time entity resolution with pure Elasticsearch (by Dave Moore)
2 changes: 1 addition & 1 deletion docs/plugins/authors.asciidoc
@@ -116,5 +116,5 @@ AccessController.doPrivileged(
);
--------------------------------------------------

See http://www.oracle.com/technetwork/java/seccodeguide-139067.html[Secure Coding Guidelines for Java SE]
See https://www.oracle.com/technetwork/java/seccodeguide-139067.html[Secure Coding Guidelines for Java SE]
for more information.
24 changes: 2 additions & 22 deletions docs/plugins/discovery-azure-classic.asciidoc
@@ -139,7 +139,7 @@ about your nodes.

Before starting, you need to have:

* A http://www.windowsazure.com/[Windows Azure account]
* A https://azure.microsoft.com/en-us/[Windows Azure account]
* OpenSSL that isn't from MacPorts, specifically `OpenSSL 1.0.1f 6 Jan
2014` doesn't seem to create a valid keypair for ssh. FWIW,
`OpenSSL 1.0.1c 10 May 2012` on Ubuntu 14.04 LTS is known to work.
@@ -331,27 +331,7 @@ scp /tmp/azurekeystore.pkcs12 azure-elasticsearch-cluster.cloudapp.net:/home/ela
ssh azure-elasticsearch-cluster.cloudapp.net
----

Once connected, install Elasticsearch:

["source","sh",subs="attributes,callouts"]
----
# Install Latest Java version
# Read http://www.webupd8.org/2012/09/install-oracle-java-8-in-ubuntu-via-ppa.html for details
sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java8-installer
# If you want to install OpenJDK instead
# sudo apt-get update
# sudo apt-get install openjdk-8-jre-headless
# Download Elasticsearch
curl -s https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-{version}.deb -o elasticsearch-{version}.deb
# Prepare Elasticsearch installation
sudo dpkg -i elasticsearch-{version}.deb
----
// NOTCONSOLE
Once connected, {stack-gs}/get-started-elastic-stack.html#install-elasticsearch[install {es}]:

Check that Elasticsearch is running:

10 changes: 5 additions & 5 deletions docs/plugins/discovery-ec2.asciidoc
@@ -29,7 +29,7 @@ will work correctly even if it finds master-ineligible nodes, but master
elections will be more efficient if this can be avoided.

The interaction with the AWS API can be authenticated using the
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html[instance
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html[instance
role], or else custom credentials can be supplied.

===== Enabling EC2 discovery
@@ -76,7 +76,7 @@ The available settings for the EC2 discovery plugin are as follows.
`discovery.ec2.endpoint`::

The EC2 service endpoint to which to connect. See
http://docs.aws.amazon.com/general/latest/gr/rande.html#ec2_region to find
https://docs.aws.amazon.com/general/latest/gr/rande.html#ec2_region to find
the appropriate endpoint for the region. This setting defaults to
`ec2.us-east-1.amazonaws.com` which is appropriate for clusters running in
the `us-east-1` region.
@@ -152,7 +152,7 @@ For example if you tag some EC2 instances with a tag named
`elasticsearch-host-name` and set `host_type: tag:elasticsearch-host-name` then
the `discovery-ec2` plugin will read each instance's host name from the value
of the `elasticsearch-host-name` tag.
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/Using_Tags.html[Read more
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/Using_Tags.html[Read more
about EC2 Tags].

--
@@ -293,7 +293,7 @@ available on AWS-based infrastructure from https://www.elastic.co/cloud.
EC2 instances offer a number of different kinds of storage. Please be aware of
the following when selecting the storage for your cluster:

* http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/InstanceStorage.html[Instance
* https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/InstanceStorage.html[Instance
Store] is recommended for {es} clusters as it offers excellent performance and
is cheaper than EBS-based storage. {es} is designed to work well with this kind
of ephemeral storage because it replicates each shard across multiple nodes. If
@@ -327,7 +327,7 @@ https://aws.amazon.com/ec2/instance-types/[instance types] with networking
labelled as `Moderate` or `Low`.

* It is a good idea to distribute your nodes across multiple
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html[availability
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html[availability
zones] and use {ref}/modules-cluster.html#shard-allocation-awareness[shard
allocation awareness] to ensure that each shard has copies in more than one
availability zone.
24 changes: 1 addition & 23 deletions docs/plugins/discovery-gce.asciidoc
@@ -182,29 +182,7 @@ Failing to set this will result in unauthorized messages when starting Elasticse
See <<discovery-gce-usage-tips-permissions>>.
==============================================


Once connected, install Elasticsearch:

[source,sh]
--------------------------------------------------
sudo apt-get update
# Download Elasticsearch
wget https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-2.0.0.deb
# Prepare Java installation (Oracle)
sudo echo "deb http://ppa.launchpad.net/webupd8team/java/ubuntu trusty main" | sudo tee /etc/apt/sources.list.d/webupd8team-java.list
sudo echo "deb-src http://ppa.launchpad.net/webupd8team/java/ubuntu trusty main" | sudo tee -a /etc/apt/sources.list.d/webupd8team-java.list
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys EEA14886
sudo apt-get update
sudo apt-get install oracle-java8-installer
# Prepare Java installation (or OpenJDK)
# sudo apt-get install java8-runtime-headless
# Prepare Elasticsearch installation
sudo dpkg -i elasticsearch-2.0.0.deb
--------------------------------------------------
Once connected, {stack-gs}/get-started-elastic-stack.html#install-elasticsearch[install {es}]:

[[discovery-gce-usage-long-install-plugin]]
===== Install Elasticsearch discovery gce plugin
2 changes: 1 addition & 1 deletion docs/plugins/discovery.asciidoc
@@ -30,7 +30,7 @@ addresses of seed hosts.

The following discovery plugins have been contributed by our community:

* https://github.com/fabric8io/elasticsearch-cloud-kubernetes[Kubernetes Discovery Plugin] (by Jimmi Dyson, http://fabric8.io[fabric8])
* https://github.com/fabric8io/elasticsearch-cloud-kubernetes[Kubernetes Discovery Plugin] (by Jimmi Dyson, https://fabric8.io[fabric8])

include::discovery-ec2.asciidoc[]
