This article explains how to optimize exporting very large datasets from PHP. It should be a useful reference for anyone facing the same problem.
When the export is very large, generating an Excel file requires an enormous amount of memory and the server cannot cope. In that situation, consider generating CSV instead: CSV read/write performance is far better than Excel's.
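The reason CSV keeps memory flat is that each row can be streamed through `fputcsv()` as soon as it is fetched, instead of building a whole workbook in memory. A minimal sketch (writing to an in-memory stream here purely for illustration; the real script below writes to `php://output`):

```php
<?php
// Stream rows one at a time with fputcsv(): memory use stays constant
// no matter how many rows are written, because nothing is accumulated.
$fp = fopen('php://memory', 'w+'); // stand-in for php://output

fputcsv($fp, ['id', 'StuNo', 'StuName', 'StuAge']); // header row
$rows = [
    ['1', 'A001', 'XiaoMing', '22'],
    ['2', 'A005', 'XiaoLi',   '23'],
];
foreach ($rows as $row) {
    fputcsv($fp, $row); // each row goes straight to the stream
}

rewind($fp);
$csv = stream_get_contents($fp);
fclose($fp);
echo $csv;
```

With `php://output` and regular `flush()` calls, the same loop streams the file to the browser while the query results are still being read.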
Test table `student` data (you can script-insert 3 million+ test rows; only a small sample is shown here):
SET NAMES utf8mb4;
SET FOREIGN_KEY_CHECKS = 0;

-- ----------------------------
-- Table structure for student
-- ----------------------------
DROP TABLE IF EXISTS `student`;
CREATE TABLE `student` (
  `ID` int(11) NOT NULL AUTO_INCREMENT,
  `StuNo` varchar(32) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL,
  `StuName` varchar(10) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL,
  `StuAge` int(11) NULL DEFAULT NULL,
  PRIMARY KEY (`ID`) USING BTREE
) ENGINE = InnoDB AUTO_INCREMENT = 12 CHARACTER SET = utf8 COLLATE = utf8_general_ci ROW_FORMAT = Compact;

-- ----------------------------
-- Records of student
-- ----------------------------
INSERT INTO `student` VALUES (1, 'A001', '小明', 22);
INSERT INTO `student` VALUES (2, 'A005', '小李', 23);
INSERT INTO `student` VALUES (3, 'A007', '小红', 24);
INSERT INTO `student` VALUES (4, 'A003', '小明', 22);
INSERT INTO `student` VALUES (5, 'A002', '小李', 23);
INSERT INTO `student` VALUES (6, 'A004', '小红', 24);
INSERT INTO `student` VALUES (7, 'A006', '小王', 25);
INSERT INTO `student` VALUES (8, 'A008', '乔峰', 27);
INSERT INTO `student` VALUES (9, 'A009', '欧阳克', 22);
INSERT INTO `student` VALUES (10, 'A010', '老顽童', 34);
INSERT INTO `student` VALUES (11, 'A011', '黄老邪', 33);

SET FOREIGN_KEY_CHECKS = 1;
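To generate the 3 million+ test rows mentioned above, batched multi-row INSERTs are far faster than inserting one row at a time. A sketch of one way to do it (the `buildInsertBatch()` helper, the generated values, and the batch size of 1000 are my own illustration, not from the original article):

```php
<?php
// Build one multi-row INSERT statement covering rows [$start, $start + $count).
// Batching (e.g. 1000 rows per statement) cuts round-trips and parse overhead
// compared with 3 million individual INSERT statements.
function buildInsertBatch(int $start, int $count): string
{
    $values = [];
    for ($i = $start; $i < $start + $count; $i++) {
        // Synthetic student number, name and age for test data only
        $values[] = sprintf("('S%07d', 'stu%d', %d)", $i, $i, 18 + $i % 20);
    }
    return 'INSERT INTO `student` (`StuNo`, `StuName`, `StuAge`) VALUES '
        . implode(', ', $values) . ';';
}

// Usage sketch (uncomment and adjust credentials to actually populate the table):
// $con = mysqli_connect('127.0.0.1', 'root', 'root', 'test');
// for ($batch = 0; $batch < 3000; $batch++) {       // 3000 batches x 1000 rows = 3M
//     mysqli_query($con, buildInsertBatch($batch * 1000 + 1, 1000));
// }

$sql = buildInsertBatch(1, 2);
echo $sql, "\n";
```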
Export script export.php:
<?php
set_time_limit(0);
ini_set('memory_limit', '128M');

$fileName = date('YmdHis', time());
header('Content-Encoding: UTF-8');
header('Content-type: application/vnd.ms-excel; charset=UTF-8');
header('Content-Disposition: attachment; filename="' . $fileName . '.csv"');
// Note: with very large exports (hundreds of thousands to millions of rows)
// you may hit a 504 Gateway Time-out; raise max_execution_time in php.ini.

// Open PHP's standard output stream in append mode
$fp = fopen('php://output', 'a');

// Connect to the database
$dbhost = '127.0.0.1';
$dbuser = 'root';
$dbpwd  = 'root';
$con = mysqli_connect($dbhost, $dbuser, $dbpwd);
if (mysqli_connect_errno()) die('connect error');
$database = 'test'; // select the database
mysqli_select_db($con, $database);
mysqli_query($con, "set names UTF8"); // set the connection charset if needed

// Export 1,000,000 rows with fputcsv: fetch 10,000 rows at a time, in 100 steps.
// You can also lower $nums and raise $step accordingly.
$step = 100;
$nums = 10000;
$where = "where 1=1"; // filter condition, extend as needed

// Set the header row ('id', 'No.', 'Name', 'Age' in Chinese)
$title = array('id', '编号', '姓名', '年龄');
// Note the lowercase "id": if the first two characters of a CSV or XLS file are
// the uppercase letters "I", "D", Excel complains 'Excel has detected that
// "xxx.csv" is a SYLK file, but cannot load it.'
foreach ($title as $key => $item) {
    $title[$key] = iconv('UTF-8', 'GB2312//IGNORE', $item);
}
fputcsv($fp, $title);

for ($s = 1; $s <= $step; $s++) {
    $start = ($s - 1) * $nums;
    $result = mysqli_query($con, "SELECT ID,StuNo,StuName,StuAge FROM `student` "
        . $where . " ORDER BY `ID` LIMIT {$start},{$nums}");
    if ($result) {
        while ($row = mysqli_fetch_assoc($result)) {
            foreach ($row as $key => $item) {
                // Convert the encoding here, or Excel will show mojibake
                $row[$key] = iconv('UTF-8', 'GBK', $item);
            }
            fputcsv($fp, $row);
        }
        mysqli_free_result($result); // free the result set
        ob_flush(); // flush the output buffer after every 10,000 rows
        flush();
    }
}
mysqli_close($con); // close the connection
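One caveat about `LIMIT {$start},{$nums}`: as the offset grows into the millions, MySQL must still scan and discard every skipped row, so later pages get progressively slower. Keyset (seek) pagination avoids this by remembering the last exported `ID` and continuing from there, which uses the primary-key index for every page. A sketch of this alternative (the `buildKeysetQuery()` helper name is my own, not from the article):

```php
<?php
// Keyset pagination: instead of OFFSET, continue from the last seen ID.
// "WHERE ID > ?" seeks directly via the primary-key index, so every
// page costs roughly the same regardless of how deep the export is.
function buildKeysetQuery(int $lastId, int $pageSize): string
{
    return "SELECT ID, StuNo, StuName, StuAge FROM `student` "
        . "WHERE ID > {$lastId} ORDER BY ID LIMIT {$pageSize}";
}

// Export loop sketch (DB calls commented out so the helper stands alone):
// $lastId = 0;
// do {
//     $result = mysqli_query($con, buildKeysetQuery($lastId, 10000));
//     $count  = 0;
//     while ($row = mysqli_fetch_assoc($result)) {
//         $lastId = (int)$row['ID']; // remember where this page ended
//         $count++;
//         fputcsv($fp, $row);
//     }
//     mysqli_free_result($result);
// } while ($count === 10000); // a short page means we reached the end

$q = buildKeysetQuery(0, 10000);
echo $q, "\n";
```

This also removes the need to know the total row count up front: the loop simply stops when a page comes back short.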
Export result:
That covers how to optimize exporting massive amounts of data from PHP.