Sync Gateway not returning results when performing a GET request for a document

So I’ve successfully upgraded my production server to the latest Couchbase Server (community-5.0.1) and Sync Gateway (1.5.1-community).

But when I take a backup and restore it locally, I have trouble issuing GET requests to retrieve a document. When I run the same request against my production Sync Gateway, it returns the document. Here’s an example:

curl http://localhost:4984/db/

{"error":"Bad Request","reason":"Invalid JSON: \"invalid character '\\x00' looking for beginning of value\""}

In the Sync Gateway logs I can see:

sync_1        | 2018-01-13T02:22:52.059Z HTTP:  #001: GET /db/
sync_1        | 2018-01-13T02:22:52.060Z CRUD+: No xattr content found for, xattrKey=_sync: sub-document path does not exist
sync_1        | 2018-01-13T02:22:52.060Z HTTP: #001:     --> 400 Invalid JSON: "invalid character '\x00' looking for beginning of value"  (3.7 ms)

How can I update my documents to contain the xattr content, and why do I get this error when restoring the production DB locally?

(My dev environment is also running the latest Sync Gateway and Couchbase Server, the same versions as production.)

Here’s my Sync Gateway config file, used for both production and dev:

{
  "interface": ":4984",
  "profileInterface": "80",
  "adminInterface": ":4985",
  "MaxFileDescriptors": 25000,
  "log": [ ... ],
  "databases": {
    "litehq": {
      "username": "user",
      "password": "password",
      "bucket": "litehq",
      "server": "http://db:8091",
      "revs_limit": 20,
      "enable_shared_bucket_access": true,
      "sync": `
        function(doc, oldDoc) {
          if (doc.type === "food") {
            // ...
          } else {
            // ...
          }
        }
      `,
      "users": {
        "GUEST": { "disabled": false, "admin_channels": ["*"] }
      }
    }
  }
}


Do you have an existing data set, backed up from SG 1.4.x and CB 4.x?
Did you upgrade to SG 1.5.x and CB 5.x with that old data set?
Did you do the import (migration) from non-xattr to xattr, i.e. import_docs=true? Docs Link

NOTE: set import_docs=true on only one of your Sync Gateways.
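For reference, this is roughly where import_docs would sit in the database config, next to enable_shared_bucket_access (a sketch with illustrative values; as I understand it, "continuous" keeps importing SDK writes while the plain true form imports existing docs):

```json
"databases": {
  "litehq": {
    "bucket": "litehq",
    "enable_shared_bucket_access": true,
    "import_docs": "continuous"
  }
}
```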

@househippo I didn’t set import_docs when upgrading. I have tried setting it when restoring the backup in dev, but that doesn’t do anything. Should I set it on my production server and restart Sync Gateway?

So when you set import_docs=true in SG 1.5.x in dev, are you still seeing “_sync” in the documents in Couchbase Server 5.x?

"_sync":{ .... }


When I look at my production or dev DB I can’t see any documents with the “_sync” key. Prior to the upgrade they all had it.
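One way I spot-check a single document is the admin REST API’s _raw endpoint, which should return the body together with its _sync metadata whether that lives in the document or in an xattr (a sketch; some-doc-id is a placeholder for one of your document IDs):

```shell
# Admin port (4985); _raw includes the _sync metadata for the doc.
# "some-doc-id" is a placeholder for one of your document IDs.
curl http://localhost:4985/db/_raw/some-doc-id
```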

OK … looks like Sync Gateway is working and both dev and prod are using xattrs.

Sync Gateway is complaining about parsing the JSON that’s in Couchbase. Can you find the document containing this ‘\x00’ and remove that string?
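For what it’s worth, the parser’s complaint is easy to reproduce locally; this sketch uses python3 as a stand-in for Sync Gateway’s Go JSON decoder:

```shell
# A body with a leading NUL byte (the '\x00' from the SG log) is not
# valid JSON, so any strict JSON parser rejects it.
printf '\x00{"type":"food"}' \
  | python3 -c 'import json, sys; json.loads(sys.stdin.buffer.read().decode())' \
  || echo "not valid JSON"
```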


Why is your revs limit set so low? When I’ve seen revs this low, people often get conflicts on both SG and CBL.

Thanks @househippo, any idea how I can alter the document? I can’t view it from the Couchbase UI either; I get the error message “Warning: Editing of binary document is not allowed”.

As for revs_limit, we have an issue where initial replication is really slow (it takes 5 minutes), so I was experimenting with different revs limits, although that didn’t really solve the problem.

In CB, data is stored as one of:

  1. Int (100)
  2. String ("bob smith")
  3. JSON ({"data": "yup"})
  4. Base64 binary string ("4nsou9…")

It looks like that data is not “true” JSON and is stored in CB as a base64 binary string.

I think whatever is updating the doc, either an SDK or SG, is storing it as a binary object instead of JSON, and SG is complaining about it.

I’ve seen this happen before with very large JSON strings (16 MB), where a Java app’s JSON encoder/decoder didn’t translate the string correctly and stored it as a base64 string. The funny thing was that even though the Java app stored it as a base64 string, it still read the object back as JSON, so the encoder/decoder was imperfect both ways. In dev the object was stored as JSON, while in UAT the same data object was a base64 string in CB, meaning the encoding/decoding differed between the two environments. Crazy.

That makes sense. Do you know how I could update the document?


If you have the JSON in hand, you could DELETE the document in the CB Admin GUI and then re-create it with the JSON you want.

Look at the Sync Gateway logs to make sure the JSON was parsed correctly.
You should see SG process the data under the CRUD log tag.
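If you go that route, re-creating the document through Sync Gateway’s REST API (rather than writing it back with an SDK) means SG parses it and writes fresh sync metadata itself. A sketch, with a placeholder doc ID and body:

```shell
# "my-doc-id" and the body are placeholders; writes to the admin
# port (4985) don't require user authentication.
curl -X PUT http://localhost:4985/db/my-doc-id \
     -H "Content-Type: application/json" \
     -d '{"type": "food"}'
```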

@househippo right I’ll try that.

So what would happen if I enabled "import_docs": "continuous" on my production server now? Would I then be able to take another backup from production and restore it without issues?

BTW, your suggestion to delete the doc and recreate it worked.