Bulk upsert data into Couchbase

I want to insert 10,000 documents into Couchbase, but only part of the data is inserted successfully. I have tried several times, and the number of documents inserted successfully is different every time.

How can I insert all the data into couchbase?

bucket, err := myCluster.OpenBucket("test", "")
if err != nil {
    panic(err)
}

var batch []gocb.BulkOp
for i := 0; i < 10000; i++ {
    s := strconv.Itoa(i)
    batch = append(batch, &gocb.UpsertOp{Key: s, Value: "test"})
}

err = bucket.Do(batch)
bucket.Close()

Here is a picture of the bucket.

Bucket after operation:

Hi,

This is a known issue in the Go SDK (https://issues.couchbase.com/projects/GOCBC/issues/GOCBC-231): too many requests are being queued and the dispatcher queue is becoming full. You can confirm this is happening by adding something like this after Do, before closing the bucket:

for _, op := range batch {
    upsertOp := op.(*gocb.UpsertOp)
    if upsertOp.Err != nil {
        fmt.Println(upsertOp.Err)
    }
}

You will see that the error is "queue overflowed"; you can check for it explicitly by comparing against gocb.ErrOverload. To work around it, you could use more, smaller batches, or collect the failed ops into a new batch and re-submit it repeatedly until none have failed.
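The "smaller batches" approach boils down to splitting the op slice into fixed-size chunks and calling Do on each chunk. A minimal sketch of the chunking logic, self-contained so it can run without a cluster (a plain string stands in for *gocb.UpsertOp, and the batch size of 1000 is an arbitrary assumption you would tune for your environment):

```go
package main

import "fmt"

// chunk splits ops into consecutive slices of at most size elements.
// In real code ops would be a []gocb.BulkOp, and each returned chunk
// would be passed to bucket.Do with per-op Err checks afterwards.
func chunk(ops []string, size int) [][]string {
	var batches [][]string
	for size < len(ops) {
		batches = append(batches, ops[:size])
		ops = ops[size:]
	}
	if len(ops) > 0 {
		batches = append(batches, ops)
	}
	return batches
}

func main() {
	ops := make([]string, 10000)
	for i := range ops {
		ops[i] = fmt.Sprintf("key-%d", i)
	}
	batches := chunk(ops, 1000)
	fmt.Println(len(batches)) // 10
}
```

Smaller batches keep the dispatcher queue from filling up in the first place, at the cost of a few more round trips.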


Thank you for your reply.
I have seen this error message, and I will try another way to do this insert operation.

Hi! I faced this problem as well. I was getting around 2400 out of 5200 documents loaded. I decided to tackle it the way @chvck suggested. This is my code:

	// Bulk insert: individual ops may still fail with gocb.ErrOverload
	// even when Do itself returns nil, so per-op Err must be checked.
	err = h.env.Bucket.Do(itemsToUpsert)
	if err != nil {
		return err
	}

	for {
		// Collect the ops that failed because the dispatch queue overflowed.
		// The retried ops are the same pointers held by itemsToUpsert, so
		// re-scanning itemsToUpsert picks up their updated Err fields.
		var retryItems []gocb.BulkOp
		for _, op := range itemsToUpsert {
			upsertOp := op.(*gocb.UpsertOp)
			if upsertOp.Err == gocb.ErrOverload {
				retryItems = append(retryItems, upsertOp)
			}
		}

		if len(retryItems) == 0 {
			break
		}

		// Re-submit only the failed ops until none are left.
		err = h.env.Bucket.Do(retryItems)
		if err != nil {
			return err
		}
	}