Database Lab aims to boost software development and testing processes by enabling ultra-fast provisioning of multi-terabyte databases. In this tutorial, we will:
- prepare a machine with ZFS,
- generate a PostgreSQL database for testing purposes,
- prepare at least one snapshot to be used for cloning,
- configure and launch the Database Lab server,
- set up the client CLI,
- start using the API and client CLI for fast cloning of the Postgres database,
- and, finally, review the benefits and power of thin clones.
If you want to use a cloud platform (such as AWS or GCP), or run your Database Lab on VMware or on bare metal, only the first step will differ slightly. The overall procedure remains the same.
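To give a taste of where the tutorial is heading, here is a sketch of what requesting a thin clone through the Database Lab API can look like. The port (2345), the /clone endpoint, the Verification-Token header, and the payload fields are assumptions based on typical Database Lab Engine setups; check the API documentation for your version. The request is only constructed here, not sent:

```python
import json
import urllib.request

# Assumed defaults; adjust to your Database Lab Engine configuration.
DLE_URL = "http://localhost:2345"
VERIFICATION_TOKEN = "secret_token"

def build_clone_request(clone_id: str, db_user: str, db_password: str) -> urllib.request.Request:
    """Build (but do not send) a request asking the server to create a thin clone."""
    payload = json.dumps({
        "id": clone_id,
        "protected": False,
        "db": {"username": db_user, "password": db_password},
    }).encode()
    return urllib.request.Request(
        f"{DLE_URL}/clone",
        data=payload,
        method="POST",
        headers={
            "Verification-Token": VERIFICATION_TOKEN,
            "Content-Type": "application/json",
        },
    )

req = build_clone_request("demo-clone", "dblab_user", "dblab_password")
# Sending it would be: urllib.request.urlopen(req)
print(req.get_method(), req.full_url)  # → POST http://localhost:2345/clone
```

Once the server is running, the same operation is a single client CLI call, so day-to-day usage does not require writing any code.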
Once you have completed all the steps: that's it, congratulations! To go further:
- Sign in to https://postgres.ai/console/ to explore the SaaS part (GUI, access control, history, and more)
- Check out the documentation: https://postgres.ai/docs/
- Join our community chats:
- Slack (English): https://database-lab-team-slack-invite.herokuapp.com/
- Telegram (Russian): https://t.me/databaselabru
Tutorial: Database Lab Engine
Step 1. Prepare a Linux machine with Docker and ZFS
Install the prerequisites, including the ZFS tools and the PostgreSQL client. Note that Docker itself is installed separately; follow the official guide at https://docs.docker.com/engine/install/.

sudo apt-get update && sudo apt-get install -y \
  apt-transport-https \
  ca-certificates \
  curl \
  gnupg-agent \
  postgresql-client \
  software-properties-common \
  zfsutils-linux
Create a regular file to mock a ZFS disk (good enough for a demo; in production you would use a real disk or volume):
truncate --size 10GB disk
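The file created by truncate is sparse: it reports a size of 10 GB but occupies almost no space on disk until data is actually written. You can verify this yourself (the file name demo_disk is arbitrary):

```shell
# Create a sparse 10 GB file, just like the "disk" file above.
truncate --size 10GB demo_disk

# Apparent size: 10 GB.
ls -lh demo_disk

# Actual disk usage: close to zero, because no blocks are allocated yet.
du -h demo_disk

# Clean up the demo file.
rm demo_disk
```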
Now it is time to create a ZFS pool on top of that file. Note recordsize=8k, which matches PostgreSQL's 8 KB page size.

zpool create -f \
  -O compression=on \
  -O atime=off \
  -O recordsize=8k \
  -O logbias=throughput \
  -m /var/lib/dblab/data \
  dblab_pool \
  "$(pwd)/disk"
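Instant cloning works because ZFS is copy-on-write: a clone initially shares every block with its snapshot and copies a block only when it is modified. The following toy Python model (not ZFS code, just an illustration of the principle) shows why creating a clone takes constant time regardless of database size:

```python
class Snapshot:
    """An immutable set of blocks, keyed by block number."""
    def __init__(self, blocks):
        self.blocks = dict(blocks)

class Clone:
    """A writable view over a snapshot. Creating it copies nothing:
    reads fall through to the snapshot until a block is overwritten."""
    def __init__(self, snapshot):
        self.snapshot = snapshot
        self.delta = {}          # only blocks modified after cloning

    def read(self, n):
        return self.delta.get(n, self.snapshot.blocks.get(n))

    def write(self, n, data):
        self.delta[n] = data     # copy-on-write: the snapshot stays intact

# A "large database": many blocks in one snapshot.
snap = Snapshot({n: f"block-{n}" for n in range(1_000_000)})

# Cloning is O(1): no block data is copied.
clone = Clone(snap)

clone.write(42, "changed")
print(clone.read(42))     # → changed    (the clone sees its own write)
print(snap.blocks[42])    # → block-42   (the snapshot is untouched)
```

This sharing is also why dozens of clones of a multi-terabyte database fit on a single machine: each clone stores only the blocks it has changed.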
And check the result with two separate commands (you should see dblab_pool mounted at /var/lib/dblab/data with filesystem type zfs):

zfs list
df -hT