.. _install-docker:

:orphan:

=================================================
Install with a Containerized Cluster Using Docker
=================================================

.. default-domain:: mongodb

.. facet::
   :name: genre
   :values: tutorial

.. contents:: On this page
   :local:
   :backlinks: none
   :depth: 1
   :class: singlecol

You can install {+rel-mig+} with Kafka in a containerized environment
using `Docker <https://docs.docker.com/engine/>`__. This method configures Kafka
to store data locally in Docker containers. Use this method if you want to learn
how to configure your own multi-server Kafka environment.

.. warning::

   This deployment method is ideal for quick evaluation. It is not recommended
   for production workloads because it may not provide a resilient environment.

About this Task
---------------

This deployment method uses a ``docker-compose`` file to set up a Kafka node,
a Kafka Connect node, and a {+rel-mig+} node.

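For orientation only, a simplified sketch of what such a three-service layout
can look like follows. The service names, image names, and settings are
illustrative placeholders, not the contents of the downloaded file; always use
the compose file from the Download Center.

.. code-block::

   # Hypothetical layout of a three-service compose file (illustrative only).
   services:
     kafka:             # Kafka broker that stores migration data locally
       image: example/kafka
     kafka-connect:     # Kafka Connect node that moves data to MongoDB
       image: example/kafka-connect
       depends_on:
         - kafka
     migrator:          # {+rel-mig+} application node
       image: example/relational-migrator
       depends_on:
         - kafka-connect
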
Before you Begin
----------------

You must have Docker installed on your computer. For more information, see
`Install Docker Engine <https://docs.docker.com/engine/install/>`__.

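To confirm that Docker and Docker Compose are available before you continue,
you can check their versions from a terminal:

.. code-block::

   # Both commands should print a version string if the tools are installed.
   docker --version
   docker-compose --version
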
Steps
-----

.. procedure::
   :style: normal

   .. step:: Download the docker-compose file

      In the `Download Center <https://www.mongodb.com/try/download/relational-migrator>`__,
      select :guilabel:`Docker` as the platform. Then select the :guilabel:`Kafka
      reference implementation` file.

   .. step:: Configure environment variables

      a. Configure ``MIGRATOR_DATA_PATH``

         In your ``docker-compose`` file, configure the ``MIGRATOR_DATA_PATH`` variable to a
         path where {+rel-mig+} saves data for persistence.

      b. (Optional) If your source database is MySQL or Oracle, configure
         ``MIGRATOR_PATH_DRIVER``

         {+rel-mig+} uses the JDBC driver of the respective source database to read
         the database schema. It bundles the SQL Server and PostgreSQL JDBC drivers.
         For MySQL and Oracle, you must add their drivers.

         In your ``docker-compose`` file, configure the ``MIGRATOR_PATH_DRIVER`` variable to
         the location of the ``.jar`` file for the additional JDBC drivers, as shown in
         the sketch below.

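      The exact mechanism depends on the downloaded compose file. If it reads
      these variables from the shell environment, a minimal sketch of setting
      them before launch might look like this (the paths and driver file name
      are placeholders):

      .. code-block::

         # Hypothetical values; substitute paths that exist on your host.
         export MIGRATOR_DATA_PATH=/home/user/migrator-data

         # Only needed for MySQL or Oracle sources; example driver file name.
         export MIGRATOR_PATH_DRIVER=/home/user/drivers/mysql-connector-j.jar
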
   .. step:: Download the Docker images

      Run the following command to download the Docker images for your setup:

      .. code-block::

         docker-compose -f docker-compose-migrator-kafka.yml pull

   .. step:: Launch {+rel-mig+} with Docker

      Run the following command to start {+rel-mig+} with Docker:

      .. code-block::

         docker-compose -f docker-compose-migrator-kafka.yml up

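      After the containers start, you can verify them from another terminal.
      ``docker-compose ps`` is a standard Docker Compose command; the file name
      below matches the compose file used in the previous commands:

      .. code-block::

         # Lists the containers defined in the compose file and their status.
         docker-compose -f docker-compose-migrator-kafka.yml ps
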
Next Steps
----------

- :ref:`rm-projects`
- :ref:`rm-create-jobs`

Learn More
----------

- :ref:`kafka-intro`
- :ref:`install-confluent-kafka`
- :ref:`install-kafka-cluster`
- :ref:`advanced-settings`
- `Docker Engine Overview <https://docs.docker.com/engine/>`__