There are a number of ways to add data to PostgreSQL. If you've completed the other scenarios in this course, you've already used the INSERT command to add one or more rows to a database table. In this scenario, we'll take that a step further by trying these methods of loading data in Postgres:
- Executing a SQL script via the command line using psql
- Using the \copy meta-command to bulk import from a data file
This scenario will not cover all of the details of scripting or bulk imports; nor will we be able to tackle all the possible ways to load data in Postgres. That said, towards the end we'll take a brief look at the Import tool in pgAdmin 4 as well as other utilities.
Since we'll be doing work in this scenario mainly with the psql utility, it may help to complete our Intro to psql scenario first if you haven't already done so.
We've looked at a few different ways of loading data into a PostgreSQL database. Data can be added in bulk via a script or the COPY interface: the \copy meta-command in psql, the SQL COPY command, driver-level COPY, or a client like pgAdmin.
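For instance, a bulk import via the \copy meta-command might look like the following sketch (the table name, file path, and CSV layout here are hypothetical):

```sql
-- Run inside psql: reads the file on the client side and bulk-inserts it.
-- Table name, path, and options are assumed for illustration.
\copy employee FROM '/data/employee.csv' WITH (FORMAT csv, HEADER)
```

The SQL COPY command takes a similar form, but the file is read by the server rather than by the psql client.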
These are certainly not the only methods of loading data into Postgres - there are many others that might be suitable depending on your requirements. For example, foreign data wrappers (such as postgres_fdw) allow access to external databases, from which you could then import into your own. There are many types of foreign data wrappers that give you access to other kinds of DBMSs such as MySQL, SQL Server, and MongoDB.
pgloader is an example of a data migration tool that supports database migrations to PostgreSQL from other DBMSs like MySQL, and it can also load from file formats beyond flat files like CSV.
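As a rough sketch (both connection strings below are placeholders, not real servers), a pgloader migration is invoked by pointing it at a source and a target:

```shell
# Migrate schema and data from a MySQL database into Postgres.
# Both connection URIs are hypothetical, for illustration only.
pgloader mysql://user:pass@mysql-host/sourcedb \
         postgresql://groot@localhost/workshop
```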
Loading Data into PostgreSQL
Load data with a script
Looking back at inserts
An INSERT statement in Postgres is transactional, meaning that each one is treated as a single unit of work. Running a few inserts is fine if you just want some rows of data for testing, but once you need more than that you'll see the impact on:
- Performance - each write operation has to be completed before the server can proceed to the next.
- Your productivity - it's not a great use of your time to write so many INSERT statements, especially if you or someone else needs to carry out this work more than once.
In terms of productivity, you might already be thinking: "Well, can't I write a script to help me out?" Absolutely! If you know basic SQL or you've done the other scenarios in this course, you should be able to put together at least a basic script. Additionally, you could try a utility such as pg_dump, which can extract (i.e. back up) a database into a "dump" script.
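For example, pg_dump can write such a plain-text script for you (the output file name below is arbitrary, and the connection details follow this scenario's setup):

```shell
# Dump the workshop database as a plain-text SQL script,
# which can later be replayed with psql -f.
pg_dump -U groot -h localhost workshop > workshop_dump.sql
```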
In this section we'll take a simple script and see how easy it is to run it with the psql tool. This script also takes advantage of multi-row inserts, which are more performant than running multiple individual inserts. There are plenty of options to speed up data loads into Postgres.
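For comparison, a multi-row INSERT bundles several rows into a single statement, saving round trips to the server (the column name here is hypothetical):

```sql
-- One statement and one unit of work, instead of three separate INSERTs.
INSERT INTO department (dept_name) VALUES
  ('Engineering'),
  ('Sales'),
  ('Marketing');
```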
Load data from the command line
First, let's log in to Postgres as user groot, and look at what's currently in the workshop database (you'll be prompted for the password):
psql -U groot -h localhost -d workshop -c '\dt'
So far there are two tables... Nothing that looks related to employees.
Let's take a look at the contents of the /data/employees.sql script.
(Press the spacebar to scroll to the next page, or q to return to the command prompt.)
We're creating four new tables and adding just a handful of data rows to each.
Now, let's use the -f flag in psql to run the script and add these new objects to the workshop database:
psql -U groot -h localhost -d workshop -f /data/employees.sql
The -f flag takes a filename or path as its argument, and psql will read the commands contained in that file. You should see which commands were completed, although running psql with this flag does not log you in to Postgres, so you're still at the shell prompt.
Let's actually log back in and take a look at the workshop database:
psql -U groot -h localhost -d workshop
You should now see the four additional tables. And if we run:
SELECT * FROM department;
We see the entire contents of this table.
If you're in the psql shell, you can also execute a script like so:
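Assuming the same script path used earlier in this scenario, the \i meta-command reads and executes the file from within the psql shell:

```sql
-- Inside psql: execute the commands in the given script file.
\i /data/employees.sql
```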
Try an advanced example with our pg_dump and pg_restore class
The example we tried above was a basic demonstration of feeding a script into the psql tool to get some data into Postgres.
You may encounter situations in which you need to export and import a database--for example, to restore a backup for data recovery. As mentioned above,
pg_dump can generate plaintext script files which can also be used with
psql. pg_dump can export the database in other file formats as well, for which
you'll need pg_restore.
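As a sketch using this scenario's names (the target database is a placeholder), a custom-format dump and restore might look like:

```shell
# Export in pg_dump's custom format (-Fc), then restore it with pg_restore.
# workshop_restored is a hypothetical target database.
pg_dump -U groot -h localhost -Fc workshop > workshop.dump
pg_restore -U groot -h localhost -d workshop_restored workshop.dump
```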
To learn more about pg_dump and pg_restore, check out the intro course available in our learning portal.