fix No such file or directory error when copying test file (fluid-cloudnative#492)

* fix No such file or directory error when copying test file
Signed-off-by: mahao <[email protected]>
allenhaozi authored Dec 20, 2020
1 parent d0d3d80 commit 9ac883c
Showing 1 changed file with 7 additions and 7 deletions.
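
The commit above changes the documented test-file path inside the demo pod from /data/spark/spark-3.0.1-bin-without-hadoop.tgz to /data/spark/spark-3.0.1/spark-3.0.1-bin-without-hadoop.tgz, matching where the tarball actually sits in the mount. A minimal sketch of how to confirm the corrected path before running the timed copy (assuming the demo-app pod and the spark dataset mount from the get_started guide are already running):

```shell
# Sketch: verify the tarball lives in the spark-3.0.1/ subdirectory of the mount.
# Assumes the demo-app pod from the guide is running and the dataset is
# mounted at /data/spark, as in the updated documentation below.
kubectl exec -it demo-app -- ls -lh /data/spark/spark-3.0.1/

# Copying the old, flat path would fail with "No such file or directory",
# which is the error this commit fixes in the docs.
```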
14 changes: 7 additions & 7 deletions docs/zh/userguide/get_started.md
@@ -135,9 +135,9 @@ Fluid提供了云原生的数据加速和管理能力,并抽象出了`数据
4. Log into the application container and access the data; the first access will take longer.
```shell
$ kubectl exec -it demo-app -- bash
-$ du -sh /data/spark/spark-3.0.1-bin-without-hadoop.tgz
-150M /data/spark/spark-3.0.1-bin-without-hadoop.tgz
-$ time cp /data/spark/spark-3.0.1-bin-without-hadoop.tgz /dev/null
+$ du -sh /data/spark/spark-3.0.1/spark-3.0.1-bin-without-hadoop.tgz
+150M /data/spark/spark-3.0.1/spark-3.0.1-bin-without-hadoop.tgz
+$ time cp /data/spark/spark-3.0.1/spark-3.0.1-bin-without-hadoop.tgz /dev/null
real 0m13.171s
user 0m0.002s
sys 0m0.028s
@@ -147,10 +147,10 @@ Fluid提供了云原生的数据加速和管理能力,并抽象出了`数据
```shell
$ kubectl delete -f app.yaml && kubectl create -f app.yaml
$ kubectl exec -it demo-app -- bash
-$ time cp /data/spark/spark-3.0.1-bin-without-hadoop.tgz /dev/null
-real 0m0.344s
-user 0m0.002s
-sys 0m0.020s
+$ time cp /data/spark/spark-3.0.1/spark-3.0.1-bin-without-hadoop.tgz /dev/null
+real 0m0.034s
+user 0m0.001s
+sys 0m0.032s
```
At this point, we have created a simple dataset and achieved abstracted management and acceleration of that dataset. For more detailed information about Fluid, please refer to the following example documents:
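
For reference, the cold-versus-warm comparison that the updated guide walks through can be condensed into one sequence using the corrected path. This is a sketch, assuming the app.yaml and the spark dataset from the get_started guide are in place; the timings printed will differ from the illustrative values shown in the diff:

```shell
# First (cold) read: data is pulled through the underlying storage.
kubectl exec demo-app -- bash -c \
  'time cp /data/spark/spark-3.0.1/spark-3.0.1-bin-without-hadoop.tgz /dev/null'

# Recreate the application pod, then read again; the second copy is served
# from the cache warmed by the first access and should be much faster.
kubectl delete -f app.yaml && kubectl create -f app.yaml
kubectl wait --for=condition=Ready pod/demo-app --timeout=120s
kubectl exec demo-app -- bash -c \
  'time cp /data/spark/spark-3.0.1/spark-3.0.1-bin-without-hadoop.tgz /dev/null'
```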