Load Large Data into MySQL
Overview
In this guide, we will see how to load large data into a MySQL database using the LOAD DATA INFILE mechanism.
If your data is not very large, it's best to simply import it over a regular MySQL client connection - either from a program or with the mysql client inside the app. This guide is only worth the trouble if you want to load several GB of data.
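For a small import, a one-line client-side load from the app's Web Terminal is usually enough. A minimal sketch, assuming the standard Cloudron MySQL addon variables (CLOUDRON_MYSQL_HOST, CLOUDRON_MYSQL_USERNAME, CLOUDRON_MYSQL_PASSWORD, CLOUDRON_MYSQL_DATABASE) and a hypothetical dump file named dump.sql:
mysql -h${CLOUDRON_MYSQL_HOST} -u${CLOUDRON_MYSQL_USERNAME} -p${CLOUDRON_MYSQL_PASSWORD} ${CLOUDRON_MYSQL_DATABASE} < dump.sql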
SSH access required
To follow this guide, SSH access to the server and basic docker knowledge are required.
Copy data
The LOAD DATA INFILE command imports a file that is located on the same file system as the MySQL server. On Cloudron, this means that the data file must be located under /run/mysql-files/ of the mysql container.
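This restriction usually comes from MySQL's secure_file_priv setting, which limits the directory the server is allowed to read files from. If an import later fails with a secure_file_priv error, you can check the configured directory from the MySQL prompt (opened in the Load data section below):
mysql> SHOW VARIABLES LIKE 'secure_file_priv';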
- First, copy the data file, say data.csv, into some location on the server.
- Next, copy the data file into the mysql container.
root@my:~# docker cp data.csv mysql:/run/mysql-files/data.csv
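Optionally, verify that the file landed in the expected location:
root@my:~# docker exec mysql ls -l /run/mysql-files/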
Load data
To load the data, you have to execute LOAD DATA INFILE from inside the mysql container.
root@my:~# docker exec -ti mysql /bin/bash
root@mysql:/# mysql -uroot -p${CLOUDRON_MYSQL_ROOT_PASSWORD}
mysql: [Warning] Using a password on the command line interface can be insecure.
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 17
Server version: 8.0.31-0ubuntu0.22.04.1 (Ubuntu)
mysql>
At the MySQL prompt, you can load the data into your app's database. To identify the database of your app, check the CLOUDRON_MYSQL_DATABASE environment variable in your app's Web Terminal.
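For example, running the following in the app's Web Terminal prints the database name used below:
env | grep CLOUDRON_MYSQL_DATABASE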
mysql> show databases;
+--------------------+
| Database |
+--------------------+
| 405ae49cca94474c |
...
mysql> use 405ae49cca94474c;
mysql> LOAD DATA INFILE '/run/mysql-files/data.csv' INTO TABLE employees FIELDS TERMINATED BY ',';
Query OK, 5 rows affected (0.00 sec)
Records: 5 Deleted: 0 Skipped: 0 Warnings: 0
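If your CSV has a header row or quoted fields, LOAD DATA INFILE accepts further clauses. A variant of the command above, assuming a hypothetical file with double-quoted values and one header line:
mysql> LOAD DATA INFILE '/run/mysql-files/data.csv' INTO TABLE employees FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n' IGNORE 1 LINES;
Once the import is done, you can remove the data file from the container:
root@my:~# docker exec mysql rm /run/mysql-files/data.csv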