Because the world needs yet another way to talk to HDFS from Python.
This library provides a Python client for WebHDFS. NameNode HA is supported if you pass in both NameNodes. Any failed operation will raise some subclass of HdfsException.
from pyhdfs import HdfsClient
client = HdfsClient(hosts='namenode1.example.com:50070,namenode2.example.com:50070')
print(client.list_status('/'))
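For example, an operation on a missing path surfaces as an exception rather than a silent failure. This is a minimal sketch; the path is a placeholder, and the exact HdfsException subclass raised depends on the error reported by the server:

from pyhdfs import HdfsClient, HdfsException

client = HdfsClient(hosts='namenode1.example.com:50070,namenode2.example.com:50070')
try:
    # Listing a path that does not exist fails with an HdfsException subclass
    client.list_status('/no/such/path')
except HdfsException as e:
    print('HDFS operation failed:', e)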
You can also pass the hostname as part of the URI:
from pyhdfs import HdfsClient
client = HdfsClient()
print(client.list_status('//namenode1.example.com:50070;namenode2.example.com:50070/'))
The methods and return values generally map directly to WebHDFS endpoints. The client also provides convenience methods that mimic Python os methods and HDFS CLI commands (e.g. walk and copy_to_local), as sketched below.
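For instance, walking a directory tree and then pulling a file down to the local filesystem might look like the following. This is a sketch under assumptions: the paths are placeholders, walk is assumed to yield (dirpath, dirnames, filenames) tuples like os.walk, and copy_to_local is assumed to take an HDFS source path and a local destination:

from pyhdfs import HdfsClient

client = HdfsClient(hosts='namenode1.example.com:50070,namenode2.example.com:50070')

# Recursively list a directory, mirroring os.walk
for dirpath, dirnames, filenames in client.walk('/tmp'):
    print(dirpath, dirnames, filenames)

# Copy an HDFS file to the local filesystem, mirroring the copyToLocal CLI command
# ('/tmp/example.txt' and 'example.txt' are hypothetical paths)
client.copy_to_local('/tmp/example.txt', 'example.txt')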
pyhdfs logs all HDFS actions at the INFO level, so turning on INFO-level logging will give you a debug record for your application.
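For example, enabling INFO-level logging with the standard library's logging module (a minimal sketch):

import logging

# Emit INFO-level records (including pyhdfs' per-action log lines) to stderr
logging.basicConfig(level=logging.INFO)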
For more information, see the full API docs.
pip install pyhdfs
You'll need Python 2.7 or Python 3.
First get an environment with HDFS. The Cloudera QuickStart VM works fine for this. (Note that the VM only comes with Python 2.6, so you might want to use your host and forward port 50070.)
WARNING: The tests create and delete hdfs://localhost/tmp/pyhdfs_test.
Python 3:
virtualenv3 --no-site-packages env3
source env3/bin/activate
pip3 install -e .
pip3 install -r dev_requirements.txt
py.test
And again for Python 2 (after deactivate):
virtualenv2 --no-site-packages env2
source env2/bin/activate
pip2 install -e .
pip2 install -r dev_requirements.txt
py.test