
Commit

readme
gmrqs committed Mar 2, 2023
1 parent 74caa58 commit 9ae73c6
Showing 4 changed files with 16 additions and 21 deletions.
2 changes: 1 addition & 1 deletion .gitignore
@@ -1,2 +1,2 @@
 volumes/
-mount/
+work/*/
2 changes: 2 additions & 0 deletions README.md
@@ -1,2 +1,4 @@
 # pyspark-dev-env
 Interactive development environment for PySpark using Docker Compose
+
+![alt text](analytics-lab.drawio.png "Title")
Binary file added analytics-lab.drawio.png
33 changes: 13 additions & 20 deletions work/1-minio-read-write-test.ipynb
@@ -30,33 +30,26 @@
"metadata": {},
"source": [
"## Criando a [Spark Session](http://127.0.0.1:4040/)\n",
"\n",
"### [spark-master](http://127.0.0.1:5050)\n",
"\n",
"#### • [spark-worker-a](http://127.0.0.1:5051)\n",
"\n",
"#### • [spark-worker-b](http://127.0.0.1:5052)\n",
"\n",
"\n",
"spark\n",
"Configurações padrão utilizadas no _spark-defaults.conf_ em $SPARK_HOME/conf/ "
]
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 7,
"id": "316c4753",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Setting default log level to \"WARN\".\n",
"To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"23/02/24 20:32:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\n",
"23/02/24 20:32:02 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties\n"
]
},
{
"data": {
"text/html": [
@@ -67,7 +60,7 @@
" <div>\n",
" <p><b>SparkContext</b></p>\n",
"\n",
" <p><a href=\"http://d37b6a177fcc:4040\">Spark UI</a></p>\n",
" <p><a href=\"http://e0e1013d92db:4040\">Spark UI</a></p>\n",
"\n",
" <dl>\n",
" <dt>Version</dt>\n",
@@ -83,10 +76,10 @@
" "
],
"text/plain": [
"<pyspark.sql.session.SparkSession at 0x7ff7b0fb5840>"
"<pyspark.sql.session.SparkSession at 0x7f218c3966e0>"
]
},
"execution_count": 2,
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
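The notebook touched above, work/1-minio-read-write-test.ipynb, builds a Spark Session against the standalone cluster (spark-master with workers spark-worker-a and spark-worker-b) and runs a read/write smoke test against MinIO over s3a. As a rough, non-authoritative sketch of what that amounts to: the master URL, MinIO endpoint, credentials, and bucket name below are assumptions for illustration; in this repository those settings are expected to come from spark-defaults.conf in $SPARK_HOME/conf/ rather than being set inline.

```python
# Minimal sketch only; values marked "assumed" are placeholders, not the repo's real settings.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("minio-read-write-test")
    .master("spark://spark-master:7077")  # assumed standalone master URL
    # The s3a settings below would normally live in spark-defaults.conf:
    .config("spark.hadoop.fs.s3a.endpoint", "http://minio:9000")  # assumed MinIO endpoint
    .config("spark.hadoop.fs.s3a.access.key", "minioadmin")       # placeholder credentials
    .config("spark.hadoop.fs.s3a.secret.key", "minioadmin")
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .config("spark.hadoop.fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
    .getOrCreate()
)

# Round trip: write a small DataFrame to MinIO and read it back.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.write.mode("overwrite").parquet("s3a://test-bucket/smoke-test")  # assumed bucket name
spark.read.parquet("s3a://test-bucket/smoke-test").show()
```

Keeping the cluster and s3a settings in spark-defaults.conf lets the notebook itself stay close to a bare SparkSession.builder.getOrCreate(), which is consistent with the diff above, where the visible changes are mostly re-executed outputs (execution counts, container hostname, object address).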
