Bulk .csv import into localhost MySQL using a shell script.
Searching for a way to bulk import .csv files? Well, you are in the right place. I have a shell script for you that imports .csv files in bulk, so there is no need to import them one by one. OK, let's go straight to the script.
#!/usr/bin/env bash

# Uncomment if you want to see debug mode.
# set -x

# Define database connectivity
_db="your_database_name"
_db_user="username"
_db_password="password"

# Directory location of the CSV files
_csv_directory="location/to/your/csv_files"

# Go into the directory
cd "$_csv_directory"

# Loop through the CSV files
for _csv_file in *.csv
do
    # Define the table name (the file name minus the ".csv" extension)
    _table_name=$(echo "$_csv_file" | rev | cut -c5- | rev)

    # Get the header columns from the CSV file
    _header_columns_string=$(head -1 "$_csv_file" | sed 's/ /_/g' | sed 's/"//g')

    echo "Importing $_csv_file"

    # Use this if plain "mysql -u ..." doesn't work:
    # /usr/local/mysql/bin/mysql --local-infile=1 -u $_db_user --password=$_db_password $_db -e "
    mysql -u "$_db_user" --password="$_db_password" "$_db" -e "
        # disable foreign key constraint checks
        SET FOREIGN_KEY_CHECKS=0;
        LOAD DATA LOCAL INFILE '$_csv_file'
        IGNORE INTO TABLE $_table_name
        FIELDS TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\n'
        IGNORE 1 LINES
        ($_header_columns_string);" 2> /dev/null

    echo "✅"
done

exit
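One assumption worth calling out: LOAD DATA does not create tables, so each .csv file must be named after a table that already exists in your database. Something like this (the file names here are just illustrations):

csv_files/
  users.csv     → goes into table users
  orders.csv    → goes into table orders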
Save the above script with a .sh extension, for example shell.sh.
If you just want to copy, paste, and run the code, execute the following commands in your terminal:
cd location/to/shell/file/created
bash shell.sh
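Or, if you prefer, make the script executable once and run it directly:

chmod +x shell.sh
./shell.sh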
Once you execute the shell script, 💥 boom, done 👍. But how? Well, if you want to understand it in detail, please continue reading my blog 👽.
1) Database setup
# Define database connectivity
_db="your_database_name"
_db_user="username"
_db_password="password"
I hope this part of the script is easy to understand: you are just filling in the information needed to connect to the database. Let me know if you are confused 😉.
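A quick sanity check before running the whole script: try connecting manually with the same values (these are still the placeholder credentials from above):

mysql -u username --password=password your_database_name -e "SELECT 1;"

If this prints a result instead of an access error, the variables are set correctly.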
2) Fetching table name
_table_name=$(echo "$_csv_file" | rev | cut -c5- | rev)
This line takes the .csv file name and strips the .csv extension, so the file name becomes the table name.
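For example, with a hypothetical file users.csv (just an example name), the pipeline reverses the name, drops the first four characters ("vsc." — the reversed ".csv"), and reverses it back:

echo "users.csv" | rev | cut -c5- | rev    # prints: users

Plain bash parameter expansion ("${_csv_file%.csv}") would do the same job, but the rev/cut trick works fine here.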
3) Fetching column name
_header_columns_string=$(head -1 "$_csv_file" | sed 's/ /_/g' | sed 's/"//g')
Here we grab the column names: take the first line of the CSV (the header row), replace spaces with underscores, and strip the double quotes, leaving a clean comma-separated column list.
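To see what this produces, take a hypothetical CSV whose first line is "user id","name","email" (an example header, not from the script):

head -1 users.csv | sed 's/ /_/g' | sed 's/"//g'
# prints: user_id,name,email

That output is exactly the column list MySQL expects between the parentheses later on.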
4) Main part
mysql -u "$_db_user" --password="$_db_password" "$_db" -e "
    # disable foreign key constraint checks
    SET FOREIGN_KEY_CHECKS=0;
    LOAD DATA LOCAL INFILE '$_csv_file'
    IGNORE INTO TABLE $_table_name
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    ($_header_columns_string);" 2> /dev/null
- mysql -u "$_db_user" --password="$_db_password" "$_db" -e "
- This connects to the database using the credentials stored in the variables.
- SET FOREIGN_KEY_CHECKS=0;
- This disables foreign key constraint checks, which is important when importing data from CSV files.
- While importing the CSV files, there is no guarantee which table gets loaded first, so a referenced table might still be empty and the import would fail on a foreign key constraint.
- So simply turn the check off and it will work 🧈 smoothly.
- IGNORE 1 LINES
- This simply skips the first row of the CSV file (the header).
- Remove it if your CSV files don't include a header row.
- ($_header_columns_string);"
- This maps the CSV columns onto the table columns, using the header string extracted from each CSV file (see the fully expanded example after this list).
- 2> /dev/null
- This suppresses all warning and error messages.
- Use it only once your shell script is working 100%, because it also hides real errors.
- Its main purpose here is to silence the "password on the command line can be insecure" warning, which doesn't matter much on localhost anyway.
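Putting it all together: for the hypothetical users.csv from earlier, after variable expansion the command the script actually runs would look roughly like this (still using the placeholder credentials):

mysql -u username --password=password your_database_name -e "
    SET FOREIGN_KEY_CHECKS=0;
    LOAD DATA LOCAL INFILE 'users.csv'
    IGNORE INTO TABLE users
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (user_id,name,email);" 2> /dev/null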
And once again 💥 boom, you now understand the code in detail. Well done 👍.
Or no?
Are you getting this error?
Loading local data is disabled; this must be enabled on both the client and server sides
OK, I've got you covered 🙂. Follow these steps.
> mysql -u db_username -p database_name_goes_here
(-p with no value attached will prompt you for the password)
mysql> SET GLOBAL local_infile=1;
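That takes care of the server side. The error message also mentions the client side, and that's exactly what the commented-out line in the script is for: run the mysql client with the --local-infile=1 flag. You can verify the server setting like this:

mysql> SHOW GLOBAL VARIABLES LIKE 'local_infile';

If it shows ON, the server is ready; if the script still fails, add --local-infile=1 to the mysql call in the script.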
Now try running the shell script again. 💥💥💥 You should be able to run it now, and everything should go smoothly.
OK, that's all for today. I hope you liked it. Let me know if you run into any other issues, and please share this post.
Thanks