Commit

ch eats too much mem
jangorecki committed Jun 17, 2020
1 parent 0b8e971 commit 919c1a7
Showing 1 changed file with 5 additions and 2 deletions.
clickhouse/exec.sh: 7 changes (5 additions, 2 deletions)
@@ -17,7 +17,7 @@ ch_active || echo "clickhouse-server should be already running, investigate" >&2
 ch_active || exit 1
 
 # load data
-CH_MEM=107374182400 # 100GB ## old value 128849018880 # 120GB ## now set to 100GB due to #132
+CH_MEM=103079215104 # 96 GB ## old value 107374182400 # 100GB ## old value 128849018880 # 120GB ## now set to 96GB after cache=1 to in-memory temp tables because there was not enough mem for R to parse timings
 clickhouse-client --query="TRUNCATE TABLE $2"
 clickhouse-client --max_memory_usage=$CH_MEM --query="INSERT INTO $2 FORMAT CSVWithNames" < "data/$2.csv"
 # confirm all data loaded yandex/ClickHouse#4463
@@ -32,7 +32,10 @@ rm -f clickhouse/log/$1_$2_q*.csv
 
 # execute sql script on clickhouse
 clickhouse-client --query="TRUNCATE TABLE system.query_log"
-cat "clickhouse/$1-clickhouse.sql" | clickhouse-client -mn --max_memory_usage=$CH_MEM --format=Pretty --output_format_pretty_max_rows 1 || echo "# clickhouse/exec.sh: benchmark sql script for $2 terminated with error" >&2
+cat "clickhouse/$1-clickhouse.sql" | clickhouse-client -mn --max_memory_usage=$CH_MEM --receive_timeout=10800 --format=Pretty --output_format_pretty_max_rows 1 || echo "# clickhouse/exec.sh: benchmark sql script for $2 terminated with error" >&2
+
+# need to wait in case if server crashed to release memory
+sleep 120
 
 # parse timings from clickhouse/log/[task]_[data_name]_q[i]_r[j].csv
 Rscript clickhouse/clickhouse-parse-log.R "$1" "$2"
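For reference (not part of the committed script): the CH_MEM values in the diff are whole GiB multiples expressed in bytes, and the new --receive_timeout is three hours. A quick shell sanity check of the arithmetic:

    echo $((96  * 1024**3))   # 103079215104 -> new CH_MEM (96 GB)
    echo $((100 * 1024**3))   # 107374182400 -> previous CH_MEM (100 GB)
    echo $((120 * 1024**3))   # 128849018880 -> original CH_MEM (120 GB)
    echo $((3 * 60 * 60))     # 10800 -> --receive_timeout in seconds (3 hours)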
