LinkedDataHub (LDH) is open source software you can use to manage data, create visualizations and build apps on RDF Knowledge Graphs.
LDH features a completely data-driven application architecture: generic server and client components process declarative RDF/OWL, SPARQL, and XSLT instructions. A default application structure and user interface are provided, making LDH a standalone product, yet they can be completely overridden and customized, which also makes LDH a low-code application platform. Unless custom processing is required, no imperative code such as Java or JavaScript needs to be involved at all.
- Install Docker
- Install Docker Compose, if it is not already included in the Docker installation
- Fork this repository and clone the fork into a folder
- In the folder, create an `.env` file and fill out the missing values (you can use `.env_sample` as a template). For example:
  ```
  COMPOSE_CONVERT_WINDOWS_PATHS=1
  COMPOSE_PROJECT_NAME=linkeddatahub
  PROTOCOL=https
  HTTP_PORT=81
  HTTPS_PORT=4443
  HOST=localhost
  ABS_PATH=/
  OWNER_MBOX=[email protected]
  OWNER_GIVEN_NAME=John
  OWNER_FAMILY_NAME=Doe
  OWNER_ORG_UNIT=My unit
  OWNER_ORGANIZATION=My org
  OWNER_LOCALITY=Copenhagen
  OWNER_STATE_OR_PROVINCE=Denmark
  OWNER_COUNTRY_NAME=DK
  OWNER_KEY_PASSWORD=changeit
  ```
- Run this from the command line:

  ```bash
  docker-compose up
  ```
- LinkedDataHub will start and create the following sub-folders:
  - `certs` where your WebID certificates are stored
  - `data` where the triplestore(s) will persist RDF data
  - `uploads` where LDH stores content-hashed file uploads
- Install `certs/owner.p12` into a web browser of your choice (the password is the `OWNER_KEY_PASSWORD` value)
  - Google Chrome: Settings > Advanced > Manage Certificates > Import...
  - Mozilla Firefox: Options > Privacy > Security > View Certificates... > Import...
  - Apple Safari: The file is installed directly into the operating system. Open the file and import it using the Keychain Access tool.
  - Microsoft Edge: Does not support certificate management; you need to install the file into Windows instead. Read more here.
- Open https://localhost:4443/ in that web browser
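Put together, a typical first run looks roughly like this (a sketch only: the fork URL is a placeholder for your own fork, and `.env` must be filled out as shown above before starting):

```bash
git clone https://github.com/<your-account>/LinkedDataHub.git   # clone your fork
cd LinkedDataHub
cp .env_sample .env    # then fill out the missing values
docker-compose up
```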
After a successful startup, the last line of the Docker log should read:
```
linkeddatahub_1 | 02-Feb-2020 02:02:20.200 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in 3420 ms
```
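You can also follow the log from another terminal; a minimal sketch, assuming the service name `linkeddatahub` seen in the log prefix above:

```bash
docker-compose logs -f linkeddatahub
```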
- You will likely get a browser warning such as `Your connection is not private` in Chrome or `Warning: Potential Security Risk Ahead` in Firefox due to the self-signed server certificate. Ignore it: click `Advanced` and `Proceed` or `Accept the risk` to proceed.
  - If this option does not appear in Chrome (as observed on some macOS systems), you can open `chrome://flags/#allow-insecure-localhost`, switch `Allow invalid certificates for resources loaded from localhost` to `Enabled`, and restart Chrome
- `.env_sample` and `.env` files might be invisible in macOS Finder, which hides filenames starting with a dot. You should be able to create them using the Terminal, however (see the sketch after this list).
- On Linux your user may need to be a member of the `docker` group. Add it using `sudo usermod -aG docker ${USER}` and re-login with your user. An alternative, but not recommended, is to run `sudo docker-compose up`
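A minimal Terminal sketch for the macOS note above (run from the cloned folder; the editor choice is illustrative):

```bash
ls -a          # dot-files are listed even though Finder hides them
touch .env     # create the file if it does not exist yet
open -e .env   # edit it in TextEdit, or use any editor you prefer
```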
Besides the owner WebID configuration, the most common customization is changing the base URI from the default `https://localhost:4443/` to your own. Let's use `https://ec2-54-235-229-141.compute-1.amazonaws.com/linkeddatahub/` as an example. We need to split the URI into components and set them in the `.env` file using the following parameters:
```
PROTOCOL=https
HTTP_PORT=80
HTTPS_PORT=443
HOST=ec2-54-235-229-141.compute-1.amazonaws.com
ABS_PATH=/linkeddatahub/
```
`ABS_PATH` is required, even if it's just `/`.
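As a quick sanity check, the components should recombine into the base URI you intend to serve; a hypothetical sketch using the values above (default ports 80/443 do not appear in the URI, while non-default ones such as 4443 do, as in the default `https://localhost:4443/`):

```bash
PROTOCOL=https
HOST=ec2-54-235-229-141.compute-1.amazonaws.com
ABS_PATH=/linkeddatahub/

echo "${PROTOCOL}://${HOST}${ABS_PATH}"
# https://ec2-54-235-229-141.compute-1.amazonaws.com/linkeddatahub/
```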
Dataspaces are configured in `config/system.trig`. Their base URIs need to be relative to the base URI configured in the `.env` file.
Reusing `https://ec2-54-235-229-141.compute-1.amazonaws.com/linkeddatahub/` as the new base URI, the easiest way is to simply replace the default `https://localhost:4443/` value with it. This can be done using the following shell command:
```bash
sed -i 's/https:\/\/localhost:4443\//https:\/\/ec2-54-235-229-141.compute-1.amazonaws.com\/linkeddatahub\//g' config/system.trig
```
Note that `sed` requires forward slashes `/` to be escaped with backslashes `\`.
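If the escaping gets unwieldy, `sed` also accepts another delimiter character, which keeps the URIs readable; an equivalent sketch (note that on macOS/BSD `sed`, `-i` additionally needs an explicit, possibly empty, backup suffix such as `-i ''`):

```bash
sed -i 's|https://localhost:4443/|https://ec2-54-235-229-141.compute-1.amazonaws.com/linkeddatahub/|g' config/system.trig
```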
If you need to start fresh and wipe the existing setup (e.g. after configuring a new base URI), you can do that using:

```bash
sudo rm -rf certs data && docker-compose down -v
```
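After wiping, a plain restart should recreate the sub-folders, including a fresh owner certificate, which means `certs/owner.p12` will likely need to be imported into the browser again:

```bash
docker-compose up
```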
LinkedDataHub CLI wraps the HTTP API into a set of shell scripts with convenient parameters. The scripts can be used for testing, automation, scheduled execution and such. It is usually much quicker to perform actions using CLI rather than the user interface, as well as easier to reproduce.
The scripts can be found in the `scripts` subfolder.
An environment variable `JENA_HOME` is used by all the Jena command line tools to configure the class path automatically for you. You can set this up as follows:
On Linux / Mac:

```bash
export JENA_HOME=the directory you downloaded Jena to
export PATH=$PATH:$JENA_HOME/bin
```
On Windows:

```
SET JENA_HOME=the directory you downloaded Jena to
SET PATH=%PATH%;%JENA_HOME%\bat
```
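To verify that the Jena tools are now on the path, you can ask one of them for its version; a minimal check on Linux / Mac (the Windows distribution ships equivalent `.bat` wrappers):

```bash
riot --version
```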
- Get the source code
- Set the `SCRIPT_ROOT` environment variable to the `scripts` subfolder of your LinkedDataHub fork or clone. For example:

  ```bash
  export SCRIPT_ROOT="/c/Users/namedgraph/WebRoot/AtomGraph/LinkedDataHub/scripts"
  ```
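With `SCRIPT_ROOT` set, you can browse what is available; exact script names and parameters vary between versions, so treat this as a sketch and check each script before use:

```bash
ls "$SCRIPT_ROOT"
```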
LinkedDataHub includes a basic HTTP test suite.
Please report issues if you've encountered a bug or have a feature request.
Commercial consulting, development, and support are available from AtomGraph.
- [email protected] (mailing list)
- @atomgraphhq
- W3C Declarative Linked Data Apps Community Group