How to transfer documents from one bucket to another

Hi All,

I hope you can help me out and point me in the right direction.
The scenario is that I have two Couchbase buckets, 'source' and 'destination', and I would like to transfer some documents to the destination bucket. The condition is to select and transfer only those documents containing institutionId:60.

*NOTE: There will be several documents of different types that have the institutionId field.

I have already looked at some forums and blogs and tried a SELECT query using N1QL, but I do not know how to transfer documents given the condition above.

You can use INSERT-SELECT examples at

I looked into this and I would like to do a bulk copy of the documents. Is that possible?

Okay, let me check this one first:

Inserting values using SELECT is the best option.

During a Couchbase Day event we learned about a clever way to use XDCR (Cross Datacenter Replication) to transfer documents from one bucket to another. It is also possible to specify a filter condition. It is perhaps a bit of an abuse of the feature, but it lets you move documents between buckets even within the same data center or the same cluster.

Is there any document that I can use to further explore this option? Thanks!

No, I just have the link I posted and maybe this one for 5.0

It was basically setting up replication, but using your current cluster and your target bucket name as the target of the replication.
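As a rough sketch, that setup can also be scripted with couchbase-cli; the hostnames, credentials, and bucket names below are placeholders, and the flags may vary by server version, so check the CLI docs for your release:

```shell
# Register the "remote" cluster, which here is actually the local cluster itself
couchbase-cli xdcr-setup -c localhost:8091 -u Administrator -p password \
  --create --xdcr-cluster-name self \
  --xdcr-hostname localhost:8091 \
  --xdcr-username Administrator --xdcr-password password

# Start replicating from the source bucket to the destination bucket
couchbase-cli xdcr-replicate -c localhost:8091 -u Administrator -p password \
  --create --xdcr-cluster-name self \
  --xdcr-from-bucket source --xdcr-to-bucket destination
```

On recent server versions xdcr-replicate also accepts a filter expression, which is how the filtering mentioned above would be applied.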

I think you can do the same without much code by using Sync Gateway to Sync Gateway replication. For this, assign each document to a channel based on its institutionId, and then replicate only that channel between the two Sync Gateways.
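A minimal Sync Gateway sync function for that channel routing might look like this (the channel name prefix is just an assumption; channel() is the built-in routing call available inside the sync function):

```javascript
function (doc, oldDoc) {
  // Route each document into a channel derived from its institutionId,
  // e.g. a document with institutionId 60 goes into "institution-60"
  if (doc.institutionId !== undefined) {
    channel("institution-" + doc.institutionId);
  }
}
```

The replication between the two Sync Gateways would then be configured to pull only the channel for the institution you want to transfer.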

You have several options:

  1. Typical INSERT-SELECT queries as @vsr1 has said. You can do bulk copies of the data.

  2. The CLI tool cbtransfer. You can also use it to move data from one bucket to another in the same cluster.

  3. Use XDCR as @synesty has said.

If you need to filter by the key of the document, you can use any of the above options. However, if you want to filter using a field of the document, you have to use the first one.
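For option 2, a rough example of a cbtransfer invocation copying between buckets on the same cluster (host, credentials, and bucket names are placeholders; check the tool's help output for your server version):

```shell
cbtransfer http://localhost:8091 http://localhost:8091 \
  -b source -B destination \
  -u Administrator -p password
```

cbtransfer also accepts a -k regex option to filter by document key, which is why key-based filtering works with all three options while field-based filtering needs the N1QL approach.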

Can you help provide a sample query based on the scenario that I posted? I tried some queries but I cannot transfer the documents where institutionId=60.

I tried this and it seems to work:

INSERT INTO destination(KEY _k, VALUE _v)
SELECT META().id _k, _v
FROM source _v
WHERE institutionId=60;


With the query below you will copy all the documents (without changing the document ID or the JSON content) from the beer-sample bucket, where the field brewery_id is 21st_amendment_brewery_cafe, to a new bucket beer-sample-replica (you have to create the second bucket first).

INSERT INTO `beer-sample-replica` (KEY _k, VALUE _v)
SELECT META().id _k, _v
FROM `beer-sample` _v
WHERE _v.brewery_id = '21st_amendment_brewery_cafe';

I hope this helps you :slight_smile:

At the end of the section you will find some INSERT-INTO-SELECT examples


Yes, this approach works! :slight_smile:

Hey @Teix, what is the query to copy all documents to another bucket while changing the ID of the document?

You can use UUID() or generate a custom expression for the key.
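For example, assuming the same source and destination buckets as earlier in this thread, something like this should give each copied document a new random key (an untested sketch):

```sql
INSERT INTO destination (KEY UUID(), VALUE _v)
SELECT _v
FROM source _v
WHERE _v.institutionId = 60;
```

Replacing UUID() with any expression, such as "copy::" || META(_v).id, would give you a custom key scheme instead.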


You can also use Eventing; see Bulk copy of existing documents.

@Kratos - a trivial example of using Eventing to copy all documents from one bucket to another can be found in this post Mirror Couchbase bucket with bulk documents - #2 by jon.strabala

Note that Eventing uses DCP (Database Change Protocol) and can accomplish this replication extremely fast.
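For reference, a minimal Eventing function that mirrors documents into a destination bucket could look like the sketch below. It assumes a bucket binding aliased as tgt with read/write access, and the institutionId filter is just an illustration based on the scenario in this thread:

```javascript
function OnUpdate(doc, meta) {
    // Optionally filter on a document field before copying
    if (doc.institutionId === 60) {
        // Write the document to the destination bucket via the "tgt" alias,
        // keeping the same document ID
        tgt[meta.id] = doc;
    }
}

function OnDelete(meta) {
    // Keep the destination in sync when a source document is removed
    delete tgt[meta.id];
}
```

Deploying the function with a "from beginning" feed boundary makes it process all existing documents, which is what gives you the bulk copy.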