Bulk Insert Every Two Minutes

I have a job that inserts about 200k documents every two minutes, but there is no bulk insert option in .NET SDK 3.1. Is there any other solution for this kind of problem?
Thank you,
Mehmet Firat

Hello @mehmet.firat.komurcu, there is no bulk insert in SDK 3.1; however, it's easy to implement with the Task Parallel Library, as all the methods are async.


Hello @AV25242 , thank you for your response.

Personally, I recommend against using Parallel.ForEach, as it is primarily designed for CPU-bound rather than IO-bound workloads. It also doesn’t play very nicely with asynchronous tasks. I recommend parallelizing with the asynchronous tasks themselves. However, to get the best results I also recommend controlling the degree of parallelism.
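As an aside, one common way to cap the degree of parallelism with plain async tasks is a SemaphoreSlim. This is just a minimal sketch; `InsertAsync` here is a hypothetical stand-in for whatever async insert call your SDK exposes:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

static class ThrottledInsert
{
    // Runs one async task per document, but only maxParallel at a time.
    public static async Task InsertAllAsync(IEnumerable<string> docs, int maxParallel)
    {
        using var throttle = new SemaphoreSlim(maxParallel);

        var tasks = docs.Select(async doc =>
        {
            await throttle.WaitAsync();   // blocks (asynchronously) once maxParallel tasks are in flight
            try
            {
                await InsertAsync(doc);   // hypothetical stand-in for your SDK's async insert
            }
            finally
            {
                throttle.Release();
            }
        }).ToList();

        await Task.WhenAll(tasks);
    }

    // Placeholder so the sketch compiles; replace with the real SDK call.
    private static Task InsertAsync(string doc) => Task.Delay(1);
}
```

The advantage over batching is that the parallelism cap is exact and independent of how the documents happen to be split up; the batching approach below avoids the semaphore overhead entirely.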

So far, the best approach I’ve found is to break the work up into parallel batches. The optimal number of batches may vary; somewhere between 20 and 100 has worked best in my experience. Note that this is also much more effective with the .NET 3.1.1 SDK (which will hopefully release today). There has been a lot of work done in the last few weeks to improve parallel performance.

IEnumerable<IEnumerable<Doc>> BatchWork(IEnumerable<Doc> documents, int numBatches)
{
    // BatchWork is a routine you write that breaks up the work in a logical way for your use case.
    // This simple example distributes documents in rotation across the batches.
    // There are other options based on your use case that may be more performant in some cases.

    var batches = new List<List<Doc>>(numBatches);
    for (var i = 0; i < numBatches; i++) {
        batches.Add(new List<Doc>());
    }

    var batchNum = 0;
    foreach (var doc in documents) {
        batches[batchNum % numBatches].Add(doc);
        batchNum++;
    }

    return batches;
}

var batches = BatchWork(documents, 100);

List<Task> tasks = batches.Select(batch => Task.Run(async () => {
    foreach (var doc in batch) {
        // Do your work here for each doc, be sure to await the result
    }
})).ToList<Task>();

await Task.WhenAll(tasks);