Optimizing the insertion of 100,000 rows in PHP

The first method uses a foreach loop, issuing one INSERT per row

$arr = [    
    [
        'name' => 'testname1',
        'age' => 18,
    ],
    [
        'name' => 'testname2',
        'age' => 19,
    ],
    [
        'name' => 'testname3',
        'age' => 18,
    ],
];

$servername = "localhost";
$port = 3306;
$username = "username";
$password = "password";
$dbname = "mytestdb";

// Create connection
$conn = new mysqli($servername, $username, $password, $dbname, $port);

// Check the connection
if ($conn->connect_error) {
    die("connect failed: " . $conn->connect_error);
}

$costBegin = microtime(true);

foreach ($arr as $item) {
    $sql = sprintf("INSERT INTO user_info (name, age) VALUES ('%s', %d);", $item['name'], (int)$item['age']);
    if ($conn->query($sql) === TRUE) {
        echo "insert success";
    } else {
        echo "Error: " . $sql . "<br>" . $conn->error;
    }
}

$costEnd = microtime(true);
$cost = round($costEnd - $costBegin, 3);
var_dump($cost);

$conn->close();

Comparing small and large data sets, the foreach-loop insert above took the following times:

Rows      Time (seconds)
10        0.011
1000      0.585
10000     5.733
100000    60.587

The second method uses a multi-row INSERT

In MySQL, a single INSERT statement can insert multiple rows at once. For example,
INSERT INTO user_info (name, age) VALUES ('name1', 18), ('name2', 19); inserts two rows in one statement.

$arr = [    
    [
        'name' => 'testname1',
        'age' => 18,
    ],
    [
        'name' => 'testname2',
        'age' => 19,
    ],
    [
        'name' => 'testname3',
        'age' => 18,
    ],
    // ... more rows omitted ...
];

$servername = "localhost";
$port = 3306;
$username = "username";
$password = "password";
$dbname = "mytestdb";

// Create connection
$conn = new mysqli($servername, $username, $password, $dbname, $port);

// Check the connection
if ($conn->connect_error) {
    die("connect failed: " . $conn->connect_error);
}

$costBegin = microtime(true);

if (!empty($arr)) {
    $sql = "INSERT INTO user_info (name, age) VALUES ";

    foreach ($arr as $item) {
        $sql .= sprintf("('%s', %d),", $item['name'], (int)$item['age']);
    }

    // Remove the trailing comma and add the closing semicolon
    $sql = rtrim($sql, ',');
    $sql .= ';';

    if ($conn->query($sql) !== TRUE) {
        echo "Error: " . $sql . "<br>" . $conn->error;
    }
}

$costEnd = microtime(true);
$cost = round($costEnd - $costBegin, 3);
var_dump($cost);

$conn->close();

Comparing the timings for small and large row counts, the multi-row INSERT is clearly much faster overall than the foreach-loop insert:

Rows      Time (seconds)
10        0.006
1000      0.025
10000     0.131
100000    1.23
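Dividing the two sets of timings measured above gives the speedup at each row count; a quick sketch:

```php
<?php
// Speedup of the multi-row INSERT over the per-row foreach loop,
// computed from the timings measured above (row count => seconds).
$loopTimes  = [10 => 0.011, 1000 => 0.585, 10000 => 5.733, 100000 => 60.587];
$mergeTimes = [10 => 0.006, 1000 => 0.025, 10000 => 0.131, 100000 => 1.23];

$speedup = [];
foreach ($loopTimes as $rows => $seconds) {
    $speedup[$rows] = $seconds / $mergeTimes[$rows];
}

foreach ($speedup as $rows => $factor) {
    printf("%6d rows: %.1fx faster\n", $rows, $factor);
}
```

The gap widens with size: roughly 1.8x at 10 rows, but about 49x at 100,000 rows, since the per-row loop pays a full query round-trip for every record.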

If the array is too large and you want to reduce the risk of SQL errors (an over-long statement can exceed MySQL's max_allowed_packet), you can use array_chunk to split the array into chunks of a fixed size and run one merged insert per chunk.
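A minimal sketch of that chunked approach, building one multi-row INSERT per chunk. The sample rows mirror the ones above; the statements are only echoed here, but in real code each would be passed to $conn->query(), with string values escaped first (e.g. via $conn->real_escape_string()).

```php
<?php
// Split the rows into chunks and build one multi-row INSERT per chunk.
// Note: values are interpolated without escaping for brevity only.
function buildChunkedInserts(array $rows, int $chunkSize): array
{
    $sqls = [];
    foreach (array_chunk($rows, $chunkSize) as $chunk) {
        $values = [];
        foreach ($chunk as $item) {
            $values[] = sprintf("('%s', %d)", $item['name'], (int)$item['age']);
        }
        $sqls[] = 'INSERT INTO user_info (name, age) VALUES ' . implode(', ', $values) . ';';
    }
    return $sqls;
}

$arr = [
    ['name' => 'testname1', 'age' => 18],
    ['name' => 'testname2', 'age' => 19],
    ['name' => 'testname3', 'age' => 18],
];

// With a chunk size of 2, three rows become two INSERT statements.
foreach (buildChunkedInserts($arr, 2) as $sql) {
    echo $sql, "\n";   // in real code: $conn->query($sql);
}
```

A chunk size of a few hundred to a few thousand rows usually keeps each statement well under the packet limit while retaining most of the speed benefit of merged inserts.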

Keywords: SQL MySQL

Added by Adam W on Fri, 11 Oct 2019 19:49:38 +0300