Sqlpro load csv file to database
12/14/2023

My requirements are fairly simple: the file formats are CSV, they're delimited with commas, and they're text-qualified with double quotes. We need to import data from this CSV file into the review table in the database (MySQL), which has the following structure: CREATE TABLE `some_schema`.

I will share with you two different ways of inserting data read from a CSV file into a database, according to two types of CSV format: simple and complex. A naive import runs into two problems:

1. Since the comma (,) is the field terminator, the 2nd record will be considered as 4 fields.
2. The qualifier will be included in the table (Ex: 'Insane).

So I have to think of something else, since I don't want to preprocess the CSV file.

Is there anything in my table DDL that would slow down an import? I looked at the DDL for what I had been testing with vs. the tags table and assumed it must be the key slowing things down. The reason I suspected the key is that I tried the same tags.csv dataset that the other commenter used, and for that table (and for the same DDL the commenter used) there was no speed difference between DataGrip and Sequel Pro. But when I tried to import the data without the key, it still took about 2m40s via DataGrip. Sequel Pro does import the data in 30 seconds, even with the key; I have reached out to their support to try to understand what command Sequel Pro is using to load data.

16:03:39 finished - execution time: 2 m 50 s 437 ms, fetching time: 1 ms, total update count: 10000
INSERT INTO import_perf_test_tags_datagrip (user_id, email) VALUES (?, ?)
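One plausible explanation for the log above is the lack of batching: a separate parameterized INSERT per row means 10,000 round trips. Below is a minimal Python sketch of how a client could batch the same rows into multi-row INSERT statements. The table and column names are taken from the log; the helper name and the batch size of 1,000 are my own assumptions, and this is not DataGrip's or Sequel Pro's actual implementation:

```python
# Batch rows into multi-row INSERT statements instead of sending a
# separate "INSERT ... VALUES (?, ?)" for each of the 10,000 rows.
# Table/column names come from the DataGrip log; batch size is assumed.
def batched_insert_statements(rows, table="import_perf_test_tags_datagrip",
                              cols=("user_id", "email"), batch_size=1000):
    """Yield (sql, params) pairs, one multi-row INSERT per batch."""
    head = f"INSERT INTO {table} ({', '.join(cols)}) VALUES "
    row_placeholder = "(" + ", ".join("%s" for _ in cols) + ")"
    for i in range(0, len(rows), batch_size):
        batch = rows[i:i + batch_size]
        sql = head + ", ".join(row_placeholder for _ in batch)
        params = [value for row in batch for value in row]
        yield sql, params

rows = [(n, f"user{n}@example.com") for n in range(10_000)]
statements = list(batched_insert_statements(rows))
print(len(statements))  # 10 round trips instead of 10,000
```

Fewer, larger statements mean fewer network round trips, which matters especially over an SSH tunnel. MySQL's LOAD DATA LOCAL INFILE is another common fast path for CSV imports and would sidestep per-row statements entirely.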
Go to File > Import CSV (or press ⌘-I) and select a CSV file you'd like to import. Alternatively, you can select Import from the context menu by right-clicking a table.

I am only experiencing performance issues with the import. I have another MySQL client on my Mac (Sequel Pro) that will import a 10,000-row file with two columns (IDs and email addresses) in about 30 seconds, whereas DataGrip takes almost 3 minutes. I did do some testing on a table without any keys specified, and there was no performance discrepancy there. Server ping isn't very straightforward to measure, since I'm connecting via an SSH tunnel.

Is it possible to see the exact command that DataGrip is using to load data, so that I can cross-reference it against what my other SQL client is doing? Again, this is all I see in the DataGrip logs:

16:00:48 finished - execution time: 171 ms
16:00:49 finished - execution time: 87 ms, fetching time: 106 ms, total result sets count: 1
SELECT t.* FROM import_perf_test_tags_datagrip t
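Going back to the delimiter/qualifier problem mentioned earlier (an embedded comma splitting the 2nd record into 4 fields, with the qualifier ending up in the table): a CSV-aware parser avoids both issues without preprocessing the file. A small Python sketch with made-up sample data:

```python
import csv
import io

# Hypothetical two-record file matching the format described earlier:
# comma-delimited, text-qualified with double quotes, with an embedded
# comma in the second record's last field.
raw = ('id,email,tags\r\n'
       '1,a@example.com,plain\r\n'
       '2,b@example.com,"red,green"\r\n')

# Splitting on the field terminator alone breaks the quoted record
# into 4 fields and leaves the qualifier in the data:
naive = raw.splitlines()[2].split(',')
print(naive)  # ['2', 'b@example.com', '"red', 'green"']

# A CSV-aware parser honors the double-quote qualifier, so the
# record keeps its 3 fields and the quotes are stripped:
rows = list(csv.reader(io.StringIO(raw)))
print(rows[2])  # ['2', 'b@example.com', 'red,green']
```

The same behavior is what MySQL's FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' options provide on the server side.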