In my previous article I wrote about the latest PHP frameworks in 2014. Today I am going to share another real-world experience of mine, one I faced a couple of days back.
Recently I came across a task where I needed to import a large CSV file into a MySQL database. It was quite tough, because phpMyAdmin will not let you import such a large file. The CSV contained around 10 million records. You may wonder why I didn't use MySQL's LOAD DATA INFILE. Due to some restrictions I had to avoid it, so I chose my next option: splitting the CSV. First I tried to open the file in Excel, but it showed "File not loaded completely". You get this error because older versions of Excel can only handle 65,536 rows per sheet, so the remaining records are simply not loaded. I also tried importing the file directly via phpMyAdmin, but the maximum execution time was exceeded; I then increased PHP's max execution time, but it still took far too long to process the file.
So you can solve this problem simply by splitting the large CSV file into smaller fragments. That way each piece can be imported within the script's time limit.
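If you prefer to script the split yourself rather than use a GUI tool, the idea can be sketched in a few lines of Python. This is a minimal sketch, not the tool described below; the function and file names (`split_csv`, the `chunk_` prefix) are my own assumptions:

```python
import csv

def split_csv(path, lines_per_file, out_prefix="chunk"):
    """Split a large CSV into smaller files, repeating the header in each chunk."""
    with open(path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)  # keep the header so every chunk imports cleanly
        part, count, out, writer = 0, 0, None, None
        for row in reader:
            # start a new chunk file every `lines_per_file` data rows
            if count % lines_per_file == 0:
                if out:
                    out.close()
                part += 1
                out = open(f"{out_prefix}_{part}.csv", "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)
            writer.writerow(row)
            count += 1
        if out:
            out.close()
    return part  # number of chunk files written
```

Because the file is read row by row, memory use stays flat even for millions of records, and each resulting chunk can be imported in its own request without hitting the time limit.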
You can simply download FXFisherman's CSV Splitter and run it on your Windows system. No installation is required.
Once you run it, you will see the screen below.
Now browse to the CSV file you want to split and specify how many lines you want per file; alternatively, you can specify the maximum number of pieces. Then simply click "Split Now!".
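If you go with the "maximum number of pieces" option instead of a fixed line count, the chunk size is just a ceiling division of the total row count by the piece count. A quick sketch of that arithmetic (the function name is my own):

```python
import math

def lines_per_file(total_lines, max_pieces):
    """Smallest chunk size that keeps the file count at or below max_pieces."""
    return math.ceil(total_lines / max_pieces)

# e.g. 10,000,000 records in at most 20 pieces -> 500,000 lines per file
```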
It will start splitting your CSV into pieces. When the process is complete, you should see a screen like the one below.
You will now find a folder in the same directory containing the CSV chunks. Thank God! Saved my day.
I was able to import 3 million records by the end of the day.
Do you have a better or faster way to accomplish this job?
I hope this article is useful; if so, please don't forget to give it a like or share it with your friends. Keep visiting for new posts, and if you find my articles useful, you can join us simply by following us.