[MB-12043] cbq crash after trying to delete a key Created: 21/Aug/14  Updated: 21/Aug/14

Status: Open
Project: Couchbase Server
Component/s: query
Affects Version/s: cbq-DP4
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Blocker
Reporter: Iryna Mironava Assignee: Gerald Sangudi
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
cbq> delete from my_bucket KEYS ['query-testa7480c4-0'];
PANIC: Expected plan.Operator instead of <nil>..




[MB-12204] New doc-system does not have anchors Created: 17/Sep/14  Updated: 17/Sep/14

Status: Open
Project: Couchbase Server
Component/s: doc-system
Affects Version/s: 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Blocker
Reporter: Patrick Varley Assignee: Amy Kurtzman
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
The support team uses anchors all the time to link customers directly to the section that has the information they require.

I know that we have broken a number of sections out into their own pages, but there are still some long pages, for example:

http://draft.docs.couchbase.com/prebuilt/couchbase-manual-3.0/Misc/security-client-ssl.html


It would be good if we could link the customer directly to: "Configuring the PHP client for SSL"

I have marked this as a blocker as it will affect the way the support team works today.




[MB-11923] Document settings needed to achieve 2.5.1 levels of performance for single-bucket append workloads Created: 11/Aug/14  Updated: 11/Aug/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Critical
Reporter: Dave Rigby Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Yes

 Description   
As agreed in the outcome of MB-11675, we need to document how *not* to have a performance regression for append-heavy single-bucket workloads.

Please liaise with Chiyoung for the wording / details.




[MB-10524] need working and official instructions or make target to correctly clean working repository from all build products Created: 20/Mar/14  Updated: 31/Jul/14

Status: Open
Project: Couchbase Server
Component/s: build
Affects Version/s: 3.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Critical
Reporter: Aleksey Kondratenko Assignee: Chris Hillery
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Relates to
Triage: Untriaged
Is this a Regression?: Unknown

 Description   
SUBJ.

So that people can work.

The following was tried and did not work:

# repo forall -c 'git clean -dfx'
# rm cmake/CMakeCache.txt

As part of investigating that, we found that the build automagically, without being asked to, tried to build against libcurl in /opt/couchbase. Removing /opt/couchbase moved things forward but did not resolve the problem.


 Comments   
Comment by Trond Norbye [ 20/Mar/14 ]
This is what I've been running for the last n months:

gmake clean-xfd
repo forall -c 'git clean -dfx'

Ideally we should focus on completing the transition, keeping the transition period as short as possible.
Comment by Aleksey Kondratenko [ 20/Mar/14 ]
Here's what I've added to .repo/Makefile.extra


superclean:
	rm -rf install
	repo forall -c 'git clean -dfx'
	rm -fr cmake/CMakeCache.txt cmake/CMakeFiles dependencies/
	cd cmake/ && (ruby -e 'puts Dir["*"].select {|n| File.file?(n)}' | xargs rm)
	cp -f tlm/CMakeLists.txt cmake/
	cp -f tlm/Makefile.top ./Makefile

It appears to work, and after superclean I'm seeing no extra stuff left behind.
Comment by Aleksey Kondratenko [ 20/Mar/14 ]
It appears that the trouble with the instructions above was caused by repo's inability to replace cmake/CMakeLists.txt with a fresh copy. The old top-level makefile had a special rule to refresh the top makefile; in fact it still has that, but not for CMakeLists.txt.





[MB-11946] performance.eperf.EVPerfClient.test_minimal from simple-test does not work standalone Created: 13/Aug/14  Updated: 02/Sep/14

Status: Open
Project: Couchbase Server
Component/s: performance, test-execution
Affects Version/s: 3.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Critical
Reporter: Trond Norbye Assignee: Thomas Anderson
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
I'm getting a test failure when running "make simple-test" for the test mentioned above. If I remove all of the other tests in conf/simple.conf, I just get a ton of messages like:

2014-08-13 14:08:28 | ERROR | MainProcess | test_thread | [rest_client._http_request] http://127.0.0.1:9000/pools/default error 404 reason: unknown "unknown pool"




 Comments   
Comment by Volker Mische [ 13/Aug/14 ]
It also happens when you run it as single test from CLI as:

./testrunner -i b/resources/dev-4-nodes-xdcr.ini makefile=True -t performance.eperf.EVPerfClient.test_minimal,stats=0,items=1000,hot_init_items=1000
Comment by Volker Mische [ 13/Aug/14 ]
Assigning to Wayne as I fear that no one will take a look at unassigned issues.




[MB-12118] XDCR Replication lag over WAN, 2.5.1 vs 3.0.0-1205 147% regression Created: 03/Sep/14  Updated: 04/Sep/14

Status: Open
Project: Couchbase Server
Component/s: None
Affects Version/s: 3.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Critical
Reporter: Thomas Anderson Assignee: Unassigned
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: atlas 5x5 two cluster configuration.

Triage: Untriaged
Operating System: Centos 64-bit
Link to Log File, atop/blg, CBCollectInfo, Core dump: http://ci/sc/couchbase.com/job/xdcr-5x5/483/artifact/172.23.100.17.zip
 http://ci/sc/couchbase.com/job/xdcr-5x5/483/artifact/172.23.100.18.zip
 http://ci/sc/couchbase.com/job/xdcr-5x5/483/artifact/172.23.100.19.zip
 http://ci/sc/couchbase.com/job/xdcr-5x5/483/artifact/172.23.100.20.zip
 http://ci/sc/couchbase.com/job/xdcr-5x5/483/artifact/172.23.100.21.zip
 http://ci/sc/couchbase.com/job/xdcr-5x5/483/artifact/172.23.100.22.zip
 http://ci/sc/couchbase.com/job/xdcr-5x5/483/artifact/172.23.100.23.zip
 http://ci/sc/couchbase.com/job/xdcr-5x5/483/artifact/172.23.100.24.zip
 http://ci/sc/couchbase.com/job/xdcr-5x5/483/artifact/172.23.100.25.zip
 http://ci/sc/couchbase.com/job/xdcr-5x5/483/artifact/172.23.100.26.zip
Is this a Regression?: Yes

 Description   
Replication lag increased over the simulated WAN to the XDCR cluster.


 Comments   
Comment by Cihan Biyikoglu [ 04/Sep/14 ]
I suggest we close this as won't fix, if you can test and validate that changing the replicator count fixes the issue.
thanks
-cihan




[MB-12157] Intrareplication falls behind OPs causing data loss situation Created: 09/Sep/14  Updated: 09/Sep/14

Status: Open
Project: Couchbase Server
Component/s: ns_server
Affects Version/s: 3.0.1, 3.0, 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Critical
Reporter: Thomas Anderson Assignee: Thomas Anderson
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: 4 node cluster; 4 core nodes; beer-sample application run at 60Kops (50/50 ratio), nodes provisioned on RightScale EC2 x1.large images

Triage: Untriaged
Operating System: Centos 64-bit
Is this a Regression?: Yes

 Description   
The intra-replication queue grows to unacceptable limits, exposing a data-loss window of multiple seconds of queued replication.
The problem is more pronounced on the RightScale-provisioned cluster, but it can be seen on local physical clusters with a long enough test run (>20 min). Recovery requires stopping the input request queue.
Initial measurements of the Erlang process suggest that minor retries on scheduled network I/O eventually build up into a limit on the push of replication data. scheduler_wait appears to be the consuming element; the epoll_wait counter increases per measurement, as does the mean wait time, suggesting thrashing in the Erlang event scheduler. There are various papers/presentations suggesting that Erlang is sensitive to the balance of tasks (a mix of long and short events can cause throughput issues).

cbcollectinfo logs will be attached shortly

 Comments   
Comment by Aleksey Kondratenko [ 09/Sep/14 ]
Still don't have any evidence. Cannot own this ticket until evidence is provided.




[MB-12193] Docs should explicitly state that we don't support online downgrades in the installation guide Created: 15/Sep/14  Updated: 15/Sep/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 3.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Critical
Reporter: Gokul Krishnan Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
In the installation guide, we should call out the fact that online downgrades (from 3.0 to 2.5.1) aren't supported and that downgrades will require servers to be taken offline.

 Comments   
Comment by Ruth Harris [ 15/Sep/14 ]
In the 3.0 documentation:

Upgrading >
<note type="important">Online downgrades from 3.0 to 2.5.1 are not supported. Downgrades require that servers be taken offline.</note>

Should this be in the release notes too?
Comment by Matt Ingenthron [ 15/Sep/14 ]
"online" or "any"?




[MB-11936] "IP address seems to have changed. Unable to listen on 'xxx.xxx.xxx.xxx'." Error message misleading Created: 12/Aug/14  Updated: 12/Aug/14

Status: Open
Project: Couchbase Server
Component/s: ns_server
Affects Version/s: 2.0, 2.0.1, 2.1.0, 2.2.0, 2.1.1, 2.5.0, 2.5.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Mark Woosey Assignee: Aleksey Kondratenko
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: ALL

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
"IP address seems to have changed. Unable to listen" message appears for a variety of errors, more often than not ones internal and unrelated to external facing IP that is listed within the message.

More accurate/precise error messages would allow for quicker diagnoses by support and (potentially) less confusion by users.




[MB-11950] When a view fails to index it prevents other views in the same design document indexing Created: 13/Aug/14  Updated: 13/Aug/14

Status: Open
Project: Couchbase Server
Component/s: view-engine
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Ian McCloy Assignee: Sriram Melkote
Resolution: Unresolved Votes: 0
Labels: customer
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Dependency
Triage: Untriaged
Is this a Regression?: Unknown

 Description   
A customer reported that they had a view which was missing documents.
The view was incredibly simple and didn't have any errors reported in the logs, so it didn't make sense. There were errors for other views, so I suspected that having a failed view (emits too much data, key too large, etc.) in a design document may cause the other views in the same design document not to be processed for a specific document.


I've set up a basic test that confirms what I suspected: the view engine will not attempt to index further views in a design document if a view has failed.

I created a design document with 2 views,

View A (longkey) emits a key which is too long (one of the errors you are seeing)

function (doc, meta) {
  // Build an oversized key by appending the numbers 0..9999 to the doc ID.
  function addnumbers(str) {
    for (var k = 0; k < 10000; k += 1) {
      str += k;
    }
    return str;
  }
  var foo = addnumbers(meta.id);
  emit(foo, null);
}
View B (list_docs) lists all documents by id

function (doc, meta) {
  emit(meta.id, null);
}

When I run this design document, the view engine is failing to index the document in the first view and doesn't attempt to index the 2nd view. View B (list_docs) gives an empty result. When I remove the erroring View A (longkey) the index is regenerated and View B gives the desired result.

Can we either log the fact that View B (list_docs) has been skipped, or attempt to run View B (list_docs) even though View A (longkey) failed?




[MB-11617] error '404 Object Not Found - {"error":"not_found","reason":"missing"' appears for queries in cluster Created: 02/Jul/14  Updated: 13/Aug/14

Status: Open
Project: Couchbase Server
Component/s: query
Affects Version/s: 2.5.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Iryna Mironava Assignee: Manik Taneja
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Operating System: Centos 64-bit
Is this a Regression?: Unknown

 Description   
-t tuqquery.tuq_cluster_ops.QueriesOpsTests.test_incr_rebalance_out,GROUP=REBALANCE;P1,nodes_out=3,nodes_init=4

Rebalanced in from 1 to 4 nodes.
Rebalanced out 1 node; the rebalance finishes, and queries during the rebalance are successful.
Ran the query (I used q=SELECT+name%2C+VMs+FROM+default+AS+employee+WHERE+ANY+vm+IN+employee.VMs+SATISFIES+vm.RAM+%3E+5+AND+vm.os+%3D+%22ubuntu%22+END+ORDER+BY+name%2C+VMs%5B0%5D.RAM, which decodes to SELECT name, VMs FROM default AS employee WHERE ANY vm IN employee.VMs SATISFIES vm.RAM > 5 AND vm.os = "ubuntu" END ORDER BY name, VMs[0].RAM).

The following error appears:
{u'code': 5000, u'message': u'Unable to access view', u'caller': u'view_util:82', u'cause': u'error executing view req at http://10.1.2.7:8092/default/_all_docs?limit=1001&startkey=%22query-test16233f7-4%22&startkey_docid=query-test16233f7-4: 404 Object Not Found - {"error":"not_found","reason":"missing"}\n', u'key': u'Internal Error'}




[MB-11618] for cluster after failover Bucket standard_bucket0 not found Created: 02/Jul/14  Updated: 13/Aug/14

Status: Open
Project: Couchbase Server
Component/s: query
Affects Version/s: 2.5.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Iryna Mironava Assignee: Manik Taneja
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Operating System: Centos 64-bit
Is this a Regression?: Unknown

 Description   
 tuqquery.tuq_cluster_ops.QueriesOpsTests.test_failover,GROUP=FAILOVER;P1,nodes_out=2,nodes_init=4,replicas=2

4-node cluster.
Fail over two nodes; the rebalance has just started.
The following error appears for the query
q=SELECT+tasks_points.task1+AS+task+from+standard_bucket0+WHERE+join_mo%3E7+GROUP+BY+tasks_points.task1+HAVING+COUNT%28tasks_points.task1%29+%3E+0+AND+%28MIN%28join_day%29%3D1+OR+MAX%28join_yr%3D2011%29%29+ORDER+BY+tasks_points.task1
(which decodes to SELECT tasks_points.task1 AS task from standard_bucket0 WHERE join_mo>7 GROUP BY tasks_points.task1 HAVING COUNT(tasks_points.task1) > 0 AND (MIN(join_day)=1 OR MAX(join_yr=2011)) ORDER BY tasks_points.task1)

{u'code': 5000, u'message': u'Bucket standard_bucket0 not found.', u'caller': u'view_index:200', u'key': u'Internal Error'}

console log shows:
ERROR: Get /pools/default/buckets/standard_bucket0?bucket_uuid=8c1c43f11ecf197a2dd8784ae1cb9c1c: unsupported protocol scheme "" -- couchbase.(*viewIndex).ScanRange() at view_index.go:192




[MB-6923] Please rename lib/couchbase module so that it doesn't conflict with public sdk Created: 15/Oct/12  Updated: 13/Aug/14

Status: Open
Project: Couchbase Server
Component/s: test-execution
Affects Version/s: None
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Tommie McAfee Assignee: Deepkaran Salooja
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified


 Description   
Hey Deep, I believe you have many tests relying on the couchbase.py module. The problem is that whenever I have the public couchbase SDK installed and try to use anything from testrunner that relies on rest_client, I get the following error:

  File "../lib/membase/api/rest_client.py", line 8, in <module>
    from couchbase.document import DesignDocument, View
ImportError: No module named document


In the past I've worked around this by modifying sys.path to import the local couchbase module first, but now I have a complicated situation where this no longer works.
It would be great to find another name for this: 'lib/couchbaseinternal' or 'lib/rest_couchbase'?



 Comments   
Comment by Thuan Nguyen [ 16/Oct/12 ]
Integrated in single-node-2.0.x-windows7-64-view #18 (See [http://qa.hq.northscale.net/job/single-node-2.0.x-windows7-64-view/18/])
    MB-6923: workaround for couchbase module confilct (Revision a81142c9617483e325e4eab98e8fc92ecae68b5a)

     Result = UNSTABLE
tmcafee :
Files :
* lib/membase/api/rest_client.py
Comment by Thuan Nguyen [ 16/Oct/12 ]
Integrated in multi-nodes-2.0.x-windows-64-backup-cli #18 (See [http://qa.hq.northscale.net/job/multi-nodes-2.0.x-windows-64-backup-cli/18/])
    MB-6923: workaround for couchbase module confilct (Revision a81142c9617483e325e4eab98e8fc92ecae68b5a)

     Result = UNSTABLE
tmcafee :
Files :
* lib/membase/api/rest_client.py
Comment by Thuan Nguyen [ 16/Oct/12 ]
Integrated in single-node-windows-64-install #372 (See [http://qa.hq.northscale.net/job/single-node-windows-64-install/372/])
    MB-6923: workaround for couchbase module confilct (Revision a81142c9617483e325e4eab98e8fc92ecae68b5a)

     Result = SUCCESS
tmcafee :
Files :
* lib/membase/api/rest_client.py
Comment by Thuan Nguyen [ 17/Oct/12 ]
Integrated in multi-nodes-windows-64-viewtest #20 (See [http://qa.hq.northscale.net/job/multi-nodes-windows-64-viewtest/20/])
    MB-6923: workaround for couchbase module confilct (Revision a81142c9617483e325e4eab98e8fc92ecae68b5a)

     Result = SUCCESS
tmcafee :
Files :
* lib/membase/api/rest_client.py
Comment by Thuan Nguyen [ 17/Oct/12 ]
Integrated in multi-nodes-2.0.x-windows-64-install #17 (See [http://qa.hq.northscale.net/job/multi-nodes-2.0.x-windows-64-install/17/])
    MB-6923: workaround for couchbase module confilct (Revision a81142c9617483e325e4eab98e8fc92ecae68b5a)

     Result = SUCCESS
tmcafee :
Files :
* lib/membase/api/rest_client.py
Comment by Maria McDuff (Inactive) [ 15/Apr/14 ]
Deep, Tommie,

is this still a valid request? or we can close this now?
Comment by Deepkaran Salooja [ 22/Apr/14 ]
Tommie,

Do we need this?
Comment by Tommie McAfee [ 23/Jun/14 ]
I'd still like it.

The workaround is to remove lib from sys.path before the import:
https://github.com/couchbase/testrunner/blob/master/pysystests/app/sdk_client_tasks.py#L38
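
For reference, a minimal sketch of that workaround (an illustration of the approach, assuming the conflicting module lives in testrunner's local "lib" directory; not the exact code behind the link above):

{code}
import sys

# Drop testrunner's local "lib" directory from the import path so that
# "import couchbase" resolves to the installed public SDK rather than
# the bundled lib/couchbase module.
sys.path = [p for p in sys.path if not p.rstrip("/").endswith("lib")]

import couchbase  # now picks up the public Couchbase Python SDK
{code}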




[MB-10266] Testrunner does not mark tests failed if imports fail Created: 20/Feb/14  Updated: 13/Aug/14

Status: Open
Project: Couchbase Server
Component/s: test-execution
Affects Version/s: 3.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Volker Mische Assignee: Ketaki Gangal
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged

 Description   
The testrunner test run [1] was marked as a success although the tests weren't actually run. It failed with:

Traceback (most recent call last):
  File "./testrunner", line 326, in <module>
    suite = unittest.TestLoader().loadTestsFromName(before_suite_name)
  File "/usr/lib/python2.7/unittest/loader.py", line 91, in loadTestsFromName
    module = __import__('.'.join(parts_copy))
ImportError: Import by filename is not supported.

This is probably due to not finding the configuration file (it was moved into another directory). This should make the build fail, not pass.

[1] http://factory.couchbase.com/job/couchdb-gerrit-views-upr/52/consoleFull
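
For illustration, a minimal sketch of the kind of fix this implies (hypothetical, not testrunner's actual code): treat a failed import of the test suite as a failed run by exiting non-zero.

{code}
import sys
import unittest

suite_name = "some.missing.module"  # hypothetical; the real name comes from the job's config

try:
    suite = unittest.TestLoader().loadTestsFromName(suite_name)
except ImportError as exc:
    # An import failure means the tests never ran; fail the build
    # instead of letting the job report success.
    print("Failed to load test suite %r: %s" % (suite_name, exc))
    sys.exit(1)
{code}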

 Comments   
Comment by Wayne Siu [ 09/Jun/14 ]
Ketaki,
a. Can you review whether testrunner still marks the job as succeeded if it does not get run?
b. Do we still run this job? I just checked Jenkins, and the last time this job was run was in March.
Comment by Volker Mische [ 11/Jun/14 ]
b: we don't run this job anymore. It was for the UPR work on the view-engine which has now been merged. The job can be removed (and also all others that work with the couchdb upr branch). Though "a" might still be there.




[MB-11956] Document XDCR SSL protocols and ciphers. Created: 13/Aug/14  Updated: 13/Aug/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Ian McCloy Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: customer
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Dependency
Triage: Untriaged
Is this a Regression?: Unknown

 Description   
A customer asked for details about which protocols and ciphers are used in encrypted XDCR; this isn't currently documented, so it needs adding.

In Couchbase 2.5.1, the supported SSL/TLS versions are SSL 3.0 and TLS 1.0. XDCR uses the rc4-128 cipher suite by default and falls back to aes128 if rc4-128 isn't available, but you can force XDCR to only ever attempt rc4-128 by setting the COUCHBASE_WANT_ARCFOUR environment variable. SSL XDCR is only supported with self-signed certificates; it does not support importing your own certificate files, nor does it support certificates signed by a Certificate Authority (CA). OpenSSL is not used for the TLS/SSL handshake logic; instead, the TLS/SSL logic is implemented in Erlang (see http://blog.couchbase.com/heartbleed-bug-and-couchbase-server). If you require specific ciphers/protocols/certificates, an alternative option is to connect the clusters over an encrypted VPN connection.




[MB-12006] Weird Server state after vm wakeup with 3.0 beta Created: 19/Aug/14  Updated: 20/Aug/14

Status: Open
Project: Couchbase Server
Component/s: ns_server
Affects Version/s: 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Michael Nitschinger Assignee: Michael Nitschinger
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Attachments: Zip Archive cbcollect.zip     PNG File Screen Shot 2014-08-19 at 11.30.29.png    
Triage: Untriaged
Operating System: Ubuntu 32-bit
Is this a Regression?: Unknown

 Description   
I was experiencing odd behavior on VM wakeup. I had 2 VirtualBox VMs running that I always use, a 2-node cluster. I woke up just 1 VM and expected to see the other one down (which was the case). I wanted to fail it over and continue working on a single node, but the failover box is missing and the node indicates that it is rebalancing, even though it isn't.

I'll attach a screenshot and a cbcollectinfo

 Comments   
Comment by Michael Nitschinger [ 19/Aug/14 ]
Screenshot from the UI
Comment by Michael Nitschinger [ 19/Aug/14 ]
More notes:

- I just restarted the service, but nothing changed, so I suspect it is stuck in a bad state somehow?
- Then I started the 103 machine, it got picked up and all is green, but it is asking for a rebalance.
- The rebalance went through successfully.

Maybe you can pick some info out of the logs. I'm fairly certain, though, that I didn't shut down the machines during a rebalance. If it was during a rebalance, is this expected to happen?
Comment by Aleksey Kondratenko [ 19/Aug/14 ]
Have you by any chance attempted graceful failover ?
Comment by Aleksey Kondratenko [ 19/Aug/14 ]
And without logs it's going to be hard for me to investigate anything.
Comment by Michael Nitschinger [ 20/Aug/14 ]
You mean before shutting down the VMs? No, I didn't do any failover.

I've attached a cbcollectinfo, what else do you want me to collect?
Comment by Aleksey Kondratenko [ 20/Aug/14 ]
Ah. I misunderstood your description as saying that you did failover but saw rebalance instead. Will take a look at cbcollectinfo which should hopefully be enough to diagnose it.
Comment by Aleksey Kondratenko [ 20/Aug/14 ]
Sorry, but the cbcollectinfo you've provided is useless. It needs to be gathered by root or at least the couchbase user.

The best way is to use our UI (Logs section, Collect Logs tab).
Comment by Michael Nitschinger [ 20/Aug/14 ]
Woops, sorry about that. I had to reset the box, maybe I can reproduce it again.




[MB-12039] Document the maximum length of a bucket name Created: 21/Aug/14  Updated: 21/Aug/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 2.2.0, 2.5.0, 2.5.1, 3.0, 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Major
Reporter: Ian McCloy Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified


 Description   
As per MB-5641, since Couchbase Server 2.2 the bucket name has been limited to 100 characters in the UI to prevent problems.

Can you ensure the following documentation pages include this information:
"The bucket name can only contain characters in range A-Z, a-z, 0-9 as well as underscore, period, dash and percent symbols and can only be a maximum of 100 characters in length."

http://docs.couchbase.com/couchbase-manual-2.5/cb-admin/#couchbase-admin-web-console-data-buckets-createedit
http://docs.couchbase.com/prebuilt/couchbase-manual-3.0/UI/ui-data-buckets.html
http://docs.couchbase.com/prebuilt/couchbase-manual-3.0/Misc/limits.html
http://docs.couchbase.com/couchbase-manual-2.5/cb-rest-api/#creating-and-editing-buckets
http://docs.couchbase.com/prebuilt/couchbase-manual-3.0/REST/rest-bucket-create.html
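
For illustration, a small check that encodes the rule quoted above (a sketch of the documented constraint only, not the server's actual validation code):

{code}
import re

# Allowed characters: A-Z, a-z, 0-9, underscore, period, dash and percent,
# with a maximum length of 100 characters.
BUCKET_NAME_RE = re.compile(r"^[A-Za-z0-9_.%-]{1,100}$")

def is_valid_bucket_name(name):
    return bool(BUCKET_NAME_RE.match(name))

print(is_valid_bucket_name("my_bucket-01"))  # True
print(is_valid_bucket_name("x" * 101))       # False: longer than 100 characters
{code}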






[MB-12044] Enable on/off from the Couchbase UI where it pre-loads a random document while editing a view Created: 21/Aug/14  Updated: 21/Aug/14

Status: Open
Project: Couchbase Server
Component/s: UI
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Major
Reporter: Larry Liu Assignee: Anil Kumar
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Dependency

 Description   
When I see a random document displayed initially at the top of the page used for editing a view, I would think it might be a sampled document FROM THAT VIEW. But it’s not. It’s random and may have nothing to do with that view. Can that “initial load” of a random document be turned off if you don’t want to see it?

To be clear, I DO want that window to be visible after I click Show Results for the view and click on one of the resulting document links. Those documents are returned from the view, so it makes sense there, but it’s the immediate load of an unrelated document that doesn’t make sense.




[MB-12060] Should date_diff_str show a difference of 1 even if the difference is less? Created: 25/Aug/14  Updated: 25/Aug/14

Status: Open
Project: Couchbase Server
Component/s: query
Affects Version/s: cbq-DP4
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Iryna Mironava Assignee: Gerald Sangudi
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
select date_diff_str("2014-08-24T01:33:59", "2014-08-24T07:33:59", "day") ;
{
    "results": [
        {
            "$1": -1
        }
    ],
    "metrics": {
        elapsedTime: 2.925ms,
        executionTime: 2.444ms,
        resultCount: 1
    }
}

The difference is only 6 hours, but it shows 1 day. Is that OK? I would expect 0 days.
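
For illustration, the reporter's expectation in code (a sketch of the expected semantics, not of the date_diff_str implementation): truncating the 6-hour difference to whole days gives 0.

{code}
from datetime import datetime

a = datetime(2014, 8, 24, 1, 33, 59)
b = datetime(2014, 8, 24, 7, 33, 59)

# Truncate toward zero: 6 hours is less than a full day, so 0 days.
diff_days = int((a - b).total_seconds() / 86400)
print(diff_days)  # 0
{code}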




[MB-12059] date_add_str fn doesn't show correct result Created: 25/Aug/14  Updated: 25/Aug/14

Status: Open
Project: Couchbase Server
Component/s: query
Affects Version/s: cbq-DP4
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Iryna Mironava Assignee: Gerald Sangudi
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
select date_add_str(clock_str(), 1, 'hour');
{
    "results": [
        {
            "$1": "240826-08-2426T08:246:20.7248-07:00"
        }
    ],
    "metrics": {
        elapsedTime: 2.899ms,
        executionTime: 2.535ms,
        resultCount: 1
    }
}
It shows the date 240826-08-2426, which is impossible; the 246 minutes value doesn't seem right either.




[MB-12070] Append command should return the size of the document Created: 26/Aug/14  Updated: 26/Aug/14

Status: Open
Project: Couchbase Server
Component/s: memcached
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Major
Reporter: Patrick Varley Assignee: Trond Norbye
Resolution: Unresolved Votes: 0
Labels: customer
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Dependency
Relates to
relates to MB-9817 Meta.size property Open

 Description   
It would be useful if the append command returned the current size of the object, so a user knows how big it is without needing to get it.




[MB-12076] Internal moxi misconfiguration Created: 22/Aug/14  Updated: 26/Aug/14

Status: Open
Project: Couchbase Server
Component/s: moxi
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: kay Assignee: Sergey Avseyev
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: centos 6.5

Attachments: Text File normal.log     Text File problem.log    
Triage: Untriaged
Operating System: Centos 64-bit
Is this a Regression?: Unknown

 Description   
I have a 4-server cluster with four buckets. One of them is the default couchbase bucket with replica=1.

On one server the moxi behavior is very strange; the third server's moxi lives its own life.
I telnetted to moxi's port 11211 and tried to set test keys. The key appeared only on that server, not on the whole cluster. Also, the Couchbase monitoring tool doesn't show any activity on the cluster.

I've noticed that the problem moxi process listens on only three TCP ports:
{code}
netstat -nlpt | grep 30070
tcp 0 0 0.0.0.0:11211 0.0.0.0:* LISTEN 30070/moxi
tcp 0 0 :::11211 :::* LISTEN 30070/moxi
tcp 0 0 :::6696 :::* LISTEN 30070/moxi
{code}

Other servers' moxies have four listen ports:
{code}
netstat -nltp | grep 2577
tcp 0 0 0.0.0.0:11211 0.0.0.0:* LISTEN 2577/moxi
tcp 0 0 0.0.0.0:60593 0.0.0.0:* LISTEN 2577/moxi
tcp 0 0 :::11211 :::* LISTEN 2577/moxi
tcp 0 0 :::18347 :::* LISTEN 2577/moxi

netstat -nlpt | grep 23001
tcp 0 0 0.0.0.0:11211 0.0.0.0:* LISTEN 23001/moxi
tcp 0 0 0.0.0.0:11339 0.0.0.0:* LISTEN 23001/moxi
tcp 0 0 :::11211 :::* LISTEN 23001/moxi
tcp 0 0 :::5191 :::* LISTEN 23001/moxi

netstat -nlpt | grep 31535
tcp 0 0 0.0.0.0:11211 0.0.0.0:* LISTEN 31535/moxi
tcp 0 0 0.0.0.0:33578 0.0.0.0:* LISTEN 31535/moxi
tcp 0 0 :::11211 :::* LISTEN 31535/moxi
tcp 0 0 :::53475 :::* LISTEN 31535/moxi
{code}

So it seems that moxi on the problem server was not able to listen on one TCP port.

I've attached debug logs for two servers: the problem server and a normal server.

The problem process is still running. Please let me know which logs you need for further investigation.

 Comments   
Comment by kay [ 22/Aug/14 ]
I use couchbase-server-2.5.1-1083.x86_64
Comment by kay [ 22/Aug/14 ]
Please change the subproject to moxi for this issue.




[MB-11865] Docs: Correctly specify the port to telnet to test the server Created: 01/Aug/14  Updated: 01/Aug/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 2.5.1, 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Dave Rigby Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
In the install guide [1] we tell people how to test their system using telnet. However, the telnet command omits the memcached (moxi) port number to connect to, and so these instructions don't actually work.

The correct command should be:

    telnet localhost 11211


We should also probably highlight that this is connecting to the legacy memcached protocol (via moxi).

[1]: http://docs.couchbase.com/couchbase-manual-2.5/cb-install/index.html#testing-with-telnet
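
For completeness, an equivalent check without telnet (a sketch assuming moxi speaks the standard memcached text protocol on port 11211):

{code}
import socket

# Connect to moxi's legacy memcached port and ask for its version,
# which is a cheap way to confirm the server is answering.
sock = socket.create_connection(("localhost", 11211), timeout=5)
sock.sendall(b"version\r\n")
print(sock.recv(1024).decode())  # e.g. "VERSION ..."
sock.close()
{code}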

 Comments   
Comment by Dave Rigby [ 01/Aug/14 ]
See this stack overflow question for confusion along these lines: http://stackoverflow.com/questions/25073498/couchbase-test-running-failed




[MB-11978] Add undocumented "detailed memcached stats" to the deprecation list. Created: 16/Aug/14  Updated: 18/Aug/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Cihan Biyikoglu Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified


 Description   
Add the following under the deprecated list for 2.5.1 and 3.0 as the docs become available at
http://docs.couchbase.com/couchbase-manual-2.5/#deprecated-items

- The undocumented facility for enabling legacy memcached detailed stats through "stats detail on" message is deprecated.

 Comments   
Comment by Cihan Biyikoglu [ 16/Aug/14 ]
Trond, could you ensure my description is accurate for what we are taking out? Also, is there any REST, CLI, or other tools impact (even unsupported tools we ship) that we need to add?
thanks
Comment by Trond Norbye [ 18/Aug/14 ]
Perhaps reword to:

- The undocumented facility for enabling legacy memcached detailed stats through "stats detail on" and "stats detail dump" is deprecated.
Comment by Trond Norbye [ 18/Aug/14 ]
In addition the following should be added to the release notes about features _removed_ from our server as of 3.0:

The following command-line options are removed from memcached:
 * -h, -i, -p, -l, -U, -B, -r, -s, -a, -u, -n, -f, -M, -m, -I

The following command-line options will be deprecated as of 3.0:
 * -d, -L, -P, -k
Comment by Cihan Biyikoglu [ 18/Aug/14 ]
Great, let's go with this for now; since the details are not documented, I think we are all good.
Ruth, could we add the following line to the deprecation list at the top level? No CLI or REST impact specifically. Thanks!

- The undocumented facility for enabling legacy memcached detailed stats through "stats detail on" and "stats detail dump" is deprecated.




[MB-12008] XDCR@next release - Secondary index refactoring roll out to mainline: Merge and Resolve conflicts Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 40h
Time Spent: Not Specified
Original Estimate: 40h

Epic Link: XDCR next release




[MB-12009] XDCR@next release - Secondary index refactoring roll out: Testing Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 40h
Time Spent: Not Specified
Original Estimate: 40h

Epic Link: XDCR next release




[MB-12011] XDCR@next release - Parts: CAPI nozzle Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 40h
Time Spent: Not Specified
Original Estimate: 40h

Epic Link: XDCR next release




[MB-12012] XDCR@next release - Parts : Queue Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 16h
Time Spent: Not Specified
Original Estimate: 16h

Epic Link: XDCR next release




[MB-12013] XDCR@next release - Pipeline Runtime : Data Item Tracker Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 16h
Time Spent: Not Specified
Original Estimate: 16h

Epic Link: XDCR next release




[MB-12014] XDCR@next release - Pipeline Runtime: Checkpoint Manager Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 40h
Time Spent: Not Specified
Original Estimate: 40h

Epic Link: XDCR next release




[MB-12016] XDCR@next release - Pipeline Runtime: Statistics Manager Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 40h
Time Spent: Not Specified
Original Estimate: 40h

Epic Link: XDCR next release




[MB-12021] XDCR@next release - Erlang XDCR REST interface Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 56h
Time Spent: Not Specified
Original Estimate: 56h

Epic Link: XDCR next release




[MB-12022] XDCR@next release - Hookup statistics with ns_server stats_collector Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 24h
Time Spent: Not Specified
Original Estimate: 24h

Epic Link: XDCR next release




[MB-12025] XDCR@next release - Integration Testing and Initial Performance Tuning Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 80h
Time Spent: Not Specified
Original Estimate: 80h

Epic Link: XDCR next release




[MB-12023] XDCR@next release - Hookup XDCR REST server with CLI Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 32h
Time Spent: Not Specified
Original Estimate: 32h

Epic Link: XDCR next release




[MB-12024] XDCR@next release - Logging and diagnostics Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 40h
Time Spent: Not Specified
Original Estimate: 40h

Epic Link: XDCR next release




[MB-12017] XDCR@next release - Pipeline Runtime Environment: Pipeline Supervisor (Error Handler) Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: sprint1_xdcr
Remaining Estimate: 24h
Time Spent: Not Specified
Original Estimate: 24h

Epic Link: XDCR next release




[MB-12027] XDCR@next release - Replication Manager: second phase Created: 19/Aug/14  Updated: 19/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: 32h
Time Spent: Not Specified
Original Estimate: 32h

Epic Link: XDCR next release




[MB-12034] run of get query to *:8093 causes cbq crash Created: 20/Aug/14  Updated: 20/Aug/14

Status: Open
Project: Couchbase Server
Component/s: query
Affects Version/s: cbq-DP4
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Iryna Mironava Assignee: Gerald Sangudi
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
curl 'http://10.1.3.176:8093' -v
* About to connect() to 10.1.3.176 port 8093 (#0)
* Trying 10.1.3.176... connected
* Connected to 10.1.3.176 (10.1.3.176) port 8093 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.21.4 (x86_64-unknown-linux-gnu) libcurl/7.21.4 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: 10.1.3.176:8093
> Accept: */*
>
< HTTP/1.1 500 Internal Server Error
< Content-Type: text/plain; charset=utf-8
< Date: Wed, 20 Aug 2014 16:17:28 GMT
< Transfer-Encoding: chunked
<
* Connection #0 to host 10.1.3.176 left intact
* Closing connection #0
Either command or prepared must be provided.


09:13:47.207162 cbq-engine started...
09:13:47.207313 version: 0.7.0
09:13:47.207325 datastore: dir:/tmp/data/
2014/08/20 09:13:53 http: multiple response.WriteHeader calls
2014/08/20 09:13:53 http: multiple response.WriteHeader calls
panic: runtime error: invalid memory address or nil pointer dereference [recovered]
panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xb code=0x1 addr=0x20 pc=0x57034c]

goroutine 5 [running]:
net/http.(*switchWriter).Write(0xc200152da0, 0xc2005a3800, 0x54, 0x800, 0x335bf38a88, ...)
/usr/local/go/src/pkg/net/http/chunked.go:0 +0x5c
bufio.(*Writer).Flush(0xc20015b440, 0xc20017f770, 0x2b855bf38b30)
/usr/local/go/src/pkg/bufio/bufio.go:465 +0xb9
net/http.(*response).Flush(0xc20017f770)
/usr/local/go/src/pkg/net/http/server.go:952 +0x4a
github.com/couchbaselabs/query/server/http.(*httpRequest).Flush(0xc2005b3b00)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/http/http_response.go:216 +0x50
github.com/couchbaselabs/query/server/http.(*httpRequest).writeString(0xc2005b3b00, 0xc200628a00, 0x48, 0xc200628a50)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/http/http_response.go:114 +0x85
github.com/couchbaselabs/query/server/http.(*httpRequest).Fail(0xc2005b3b00, 0xc200153870, 0xc200628a50)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/http/http_response.go:33 +0xa6
github.com/couchbaselabs/query/server.func·003()
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:88 +0xf9
net/http.(*switchWriter).Write(0xc200152da0, 0xc2005a3800, 0xc, 0x800, 0x335bf38d90, ...)
/usr/local/go/src/pkg/net/http/chunked.go:0 +0x5c
bufio.(*Writer).Flush(0xc20015b440, 0xc20017f770, 0x2b855bf38e38)
/usr/local/go/src/pkg/bufio/bufio.go:465 +0xb9
net/http.(*response).Flush(0xc20017f770)
/usr/local/go/src/pkg/net/http/server.go:952 +0x4a
github.com/couchbaselabs/query/server/http.(*httpRequest).Flush(0xc2005b3b00)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/http/http_response.go:216 +0x50
github.com/couchbaselabs/query/server/http.(*httpRequest).writeString(0xc2005b3b00, 0xc200626480, 0xc, 0xc200628960)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/http/http_response.go:114 +0x85
github.com/couchbaselabs/query/server/http.(*httpRequest).Fail(0xc2005b3b00, 0xc200153870, 0xc200628960)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/http/http_response.go:33 +0xa6
github.com/couchbaselabs/query/server.(*Server).serviceRequest(0xc20016c1e0, 0xc20013bbe0, 0xc2005b3b00)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:101 +0x18f
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:80 +0x6c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 1 [IO wait]:
net.runtime_pollWait(0x2b855b536f00, 0x72, 0x0)
/usr/local/go/src/pkg/runtime/znetpoll_linux_amd64.c:118 +0x82
net.(*pollDesc).WaitRead(0xc200180230, 0xb, 0xc200136f90)
/usr/local/go/src/pkg/net/fd_poll_runtime.go:75 +0x31
net.(*netFD).accept(0xc2001801b0, 0x9bc7c8, 0x0, 0xc200136f90, 0xb, ...)
/usr/local/go/src/pkg/net/fd_unix.go:385 +0x2c1
net.(*TCPListener).AcceptTCP(0xc2000002d0, 0x562186, 0x2b855a8c4d00, 0x562186)
/usr/local/go/src/pkg/net/tcpsock_posix.go:229 +0x45
net.(*TCPListener).Accept(0xc2000002d0, 0xc20016c300, 0xc200000040, 0xc2001802d0, 0x0, ...)
/usr/local/go/src/pkg/net/tcpsock_posix.go:239 +0x25
net/http.(*Server).Serve(0xc20016c2b0, 0xc20015be40, 0xc2000002d0, 0x0, 0x0, ...)
/usr/local/go/src/pkg/net/http/server.go:1542 +0x85
net/http.(*Server).ListenAndServe(0xc20016c2b0, 0x0, 0xc200136360)
/usr/local/go/src/pkg/net/http/server.go:1532 +0x9e
github.com/couchbaselabs/query/server/http.(*HttpEndpoint).ListenAndServe(0xc20016c2a0, 0x80b6c0, 0xcfebd8)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/http/http_endpoint.go:36 +0x2a
main.main()
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/main/main.go:61 +0x547

goroutine 2 [syscall]:

goroutine 261 [semacquire]:
sync.runtime_Semacquire(0xc200000328)
/usr/local/go/src/pkg/runtime/zsema_linux_amd64.c:165 +0x2e
sync.(*Cond).Wait(0xc20058f4b0)
/usr/local/go/src/pkg/sync/cond.go:74 +0x95
io.(*pipe).read(0xc20058f480, 0xc2005b6000, 0x1000, 0x1000, 0x0, ...)
/usr/local/go/src/pkg/io/pipe.go:52 +0x1f2
io.(*PipeReader).Read(0xc200000310, 0xc2005b6000, 0x1000, 0x1000, 0x9bc988, ...)
/usr/local/go/src/pkg/io/pipe.go:130 +0x5d
net/http.(*liveSwitchReader).Read(0xc2001802f8, 0xc2005b6000, 0x1000, 0x1000, 0x0, ...)
/usr/local/go/src/pkg/net/http/server.go:205 +0x91
io.(*LimitedReader).Read(0xc20015afa0, 0xc2005b6000, 0x1000, 0x1000, 0x2, ...)
/usr/local/go/src/pkg/io/io.go:394 +0xc0
net/http.(*switchReader).Read(0xc200152cb0, 0xc2005b6000, 0x1000, 0x1000, 0x0, ...)
/usr/local/go/src/pkg/net/http/chunked.go:0 +0x62
bufio.(*Reader).fill(0xc20016c360)
/usr/local/go/src/pkg/bufio/bufio.go:79 +0x10c
bufio.(*Reader).ReadSlice(0xc20016c360, 0x40af0a, 0x0, 0x0, 0x0, ...)
/usr/local/go/src/pkg/bufio/bufio.go:262 +0x202
bufio.(*Reader).ReadLine(0xc20016c360, 0x0, 0x0, 0x0, 0x2b855a8d1c00, ...)
/usr/local/go/src/pkg/bufio/bufio.go:293 +0x61
net/textproto.(*Reader).readLineSlice(0xc2005b51b0, 0xe00558a2b, 0x41f1cf, 0xc20015ca90, 0x899020, ...)
/usr/local/go/src/pkg/net/textproto/reader.go:55 +0x51
net/textproto.(*Reader).ReadLine(0xc2005b51b0, 0xc20015ca90, 0x1000, 0x5, 0x0, ...)
/usr/local/go/src/pkg/net/textproto/reader.go:36 +0x25
net/http.ReadRequest(0xc20016c360, 0xc20015ca90, 0x0, 0x0)
/usr/local/go/src/pkg/net/http/request.go:510 +0x86
net/http.(*conn).readRequest(0xc2001802d0, 0x0, 0x0, 0x0)
/usr/local/go/src/pkg/net/http/server.go:547 +0x1bc
net/http.(*conn).serve(0xc2001802d0)
/usr/local/go/src/pkg/net/http/server.go:1052 +0x398
created by net/http.(*Server).Serve
/usr/local/go/src/pkg/net/http/server.go:1564 +0x266

goroutine 6 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 7 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 8 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 9 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 10 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 11 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 12 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 13 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 14 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 15 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 16 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 17 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 18 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 19 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 20 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

[goroutines 21 through 222 omitted: each is in the same [chan receive] state with a stack identical to goroutine 20 above, blocked in (*Server).doServe at /root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c and created by github.com/couchbaselabs/query/server.func·002 at server.go:73 +0x4a]

goroutine 223 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 224 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 225 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 226 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 227 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 228 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 229 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 230 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 231 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 232 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 233 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 234 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 235 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 236 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 237 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 238 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 239 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 240 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 241 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 242 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 243 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 244 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 245 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 246 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 247 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 248 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 249 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 250 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 251 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 252 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 253 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 254 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 255 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 256 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 257 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 258 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 259 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 260 [chan receive]:
github.com/couchbaselabs/query/server.(*Server).doServe(0xc20016c1e0)
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:79 +0x3c
created by github.com/couchbaselabs/query/server.func·002
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/server.go:73 +0x4a

goroutine 262 [IO wait]:
net.runtime_pollWait(0x2b855b536e60, 0x72, 0x0)
/usr/local/go/src/pkg/runtime/znetpoll_linux_amd64.c:118 +0x82
net.(*pollDesc).WaitRead(0xc2001802c0, 0xb, 0xc200136f90)
/usr/local/go/src/pkg/net/fd_poll_runtime.go:75 +0x31
net.(*netFD).Read(0xc200180240, 0xc200603000, 0x8000, 0x8000, 0x0, ...)
/usr/local/go/src/pkg/net/fd_unix.go:195 +0x2b3
net.(*conn).Read(0xc200000040, 0xc200603000, 0x8000, 0x8000, 0x8000, ...)
/usr/local/go/src/pkg/net/net.go:123 +0xc3
io.Copy(0xc2005b5510, 0xc200000318, 0xc2005b5090, 0xc200000040, 0x0, ...)
/usr/local/go/src/pkg/io/io.go:348 +0x1c6
net/http.func·004()
/usr/local/go/src/pkg/net/http/server.go:162 +0x66
created by net/http.(*conn).closeNotify
/usr/local/go/src/pkg/net/http/server.go:168 +0x1c6

goroutine 263 [chan receive]:
github.com/couchbaselabs/query/server/http.func·001()
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/http/http_request.go:93 +0x39
created by github.com/couchbaselabs/query/server/http.newHttpRequest
/root/tuq/gocode/src/github.com/couchbaselabs/query/server/http/http_request.go:95 +0x587




[MB-12033] if query has both union and order by nil error appears Created: 20/Aug/14  Updated: 20/Aug/14

Status: Open
Project: Couchbase Server
Component/s: query
Affects Version/s: cbq-DP4
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Iryna Mironava Assignee: Gerald Sangudi
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
cbq> select join_day from my_bucket union select join_mo from my_bucket ORDER BY join_mo;
PANIC: runtime error: invalid memory address or nil pointer dereference.




[MB-12004] XDCR@next release - Secondary index refactor testing Created: 19/Aug/14  Updated: 20/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Yu Sui
Resolution: Unresolved Votes: 0
Labels: sprint1_xdcr
Remaining Estimate: 40h
Time Spent: Not Specified
Original Estimate: 40h

Epic Link: XDCR next release




[MB-11926] defined and reliable behavior needed when connections are exhausted Created: 11/Aug/14  Updated: 15/Aug/14

Status: Open
Project: Couchbase Server
Component/s: memcached
Affects Version/s: 2.5.1, 3.0, 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Major
Reporter: Matt Ingenthron Assignee: Trond Norbye
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified


 Description   
In the scenario where a user scales their client libraries or other connections beyond the maximum number of allowed connections, the various services (K-V, ns_server, views) should signal the exhaustion of resources in a reliable way, and we should document and test that behavior.

For K-V, I believe the current situation is that after we hit max connections, new connections will be accepted by the process but immediately dropped. I know memcached has a few different options here, like dropping an older connection.

I believe that for K-V we should consider accept()ing the connection and then perhaps introducing a new error code that indicates connection exhaustion, followed by shutdown() on the connection. This way the client library and other parts of the system can at least log the error appropriately. I'm open to other suggestions from the component owners on how this should be fixed/documented, though.

I believe I filed this before, but can't find it, and it came up in discussion of MB-11066.

For other services, I believe MB-8211 covers the need since right now there is no maximum.

P.S.: I don't know that I got the right components here; please adjust as required.

 Comments   
Comment by Dave Rigby [ 11/Aug/14 ]
This is core memcached so assigning to Trond as I *think* he's the right owner.
Comment by Dave Rigby [ 11/Aug/14 ]
@Matt: So one deviation from the norm here is I believe binary protocol is currently command-response based (at least when you initially connect), so I don't believe there's any precedent for a client receiving a "response" before it has made a request - therefore if we want to add such a feature we'd need to see how all existing clients would handle such a situation.

Note that I don't think it's feasible for memcached to wait for the first client command and *then* send a EMAXCONNS error or similar as the response (and then close the connection) - as clients could effectively wait forever after connecting before they send a request - and so holding an arbitrary number of "initial" client connections open for an indeterminate duration essentially defeats the point of a connection limit.
Comment by Matt Ingenthron [ 11/Aug/14 ]
@dave I was not suggesting sending a response before a request there actually, though I wasn't very complete. I was suggesting that on the first command, perhaps that's when we'd send back this new EIAMOUTOFCONNECTIONSRIGHTNOW error. This would typically happen during authentication, and we can then enhance the clients to fail in a reliable way.
Comment by Trond Norbye [ 12/Aug/14 ]
@matt: leaving the connection open until you receive the first command may lead to a DoS failure (and even worse problems) on Linux and Unix (Windows doesn't seem to be affected) due to the fact that you may run out of file descriptors. With the current design in couchstore we're constantly opening and closing the underlying database files, so we might end up in a situation where the clients eat up all the file descriptors so that we cannot open the database files... We currently do have a problem that the user may increase the -c setting so high that the process may start seeing EMFILE errors on Linux and Unix, and then bad things may happen...
Comment by Trond Norbye [ 12/Aug/14 ]
Changed to "Improvement" (It's been like this since epoc) and "Major" (there are no crashes, dataloss etc caused by this)
Comment by Matt Ingenthron [ 12/Aug/14 ]
What you refer to there would, I think, be a different kind of exploit. We have the DoS situation now: if all connections are used by an attacker and no clients (legitimate or illegitimate) can connect, then services are denied.

However, I think there's a way out here.

How about the following:
- Give the user a setting for max connections
- Ensure the number of fds is max connections + a buffer + fds for other work
- Allow connections to proceed normally until you hit max
- Between max and some level above that (call it 100 or 1000) allow connections, but send back an error on the first request indicating connections are exhausted. We can call this CONN_MAX_HEADROOM
- At that level, accept connections, wait a second, and drop them.

This approach (sketched below) gives us a reliable failure in the case of misconfiguration and the same protection in the case of malicious attacks. What do you think?

Most sophisticated users will want to protect against DOS at several other network levels, like TCP SYN flood cookies and number of TCP connections per expected client.
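
A minimal Go sketch of the decision logic proposed above, purely illustrative: maxConns, connMaxHeadroom and the error text are placeholders, and memcached itself is C, so this only shows the accept/reject/drop bands, not an implementation.
{code}
package main

import (
	"io"
	"net"
	"sync/atomic"
	"time"
)

// Hypothetical limits; the names are illustrative, not real memcached settings.
const (
	maxConns        = 4 // user-configured maximum (tiny, for demonstration)
	connMaxHeadroom = 2 // extra connections that only receive an error
)

var open int64 // currently open connections

func release(c net.Conn) {
	c.Close()
	atomic.AddInt64(&open, -1)
}

// serve stands in for normal request handling (here it just echoes bytes).
func serve(c net.Conn) {
	defer release(c)
	io.Copy(c, c)
}

// rejectAfterFirstRequest waits for the client's first bytes, answers with a
// placeholder error line, then closes - the behaviour proposed for the
// "headroom" band between max and max+CONN_MAX_HEADROOM.
func rejectAfterFirstRequest(c net.Conn) {
	defer release(c)
	buf := make([]byte, 1)
	if _, err := c.Read(buf); err == nil {
		c.Write([]byte("SERVER_ERROR connections exhausted\r\n")) // placeholder status
	}
}

func acceptLoop(ln net.Listener) {
	for {
		c, err := ln.Accept()
		if err != nil {
			return
		}
		switch n := atomic.AddInt64(&open, 1); {
		case n <= maxConns:
			go serve(c)
		case n <= maxConns+connMaxHeadroom:
			go rejectAfterFirstRequest(c)
		default:
			// Beyond max + headroom: hold the socket briefly, then drop it.
			go func() { defer release(c); time.Sleep(time.Second) }()
		}
	}
}

func main() {
	ln, err := net.Listen("tcp", "127.0.0.1:0")
	if err != nil {
		panic(err)
	}
	defer ln.Close()
	acceptLoop(ln)
}
{code}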




[MB-11856] Change default max_doc_size and other parameters to sensible default limit values Created: 30/Jul/14  Updated: 11/Aug/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 3.0
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Sarath Lakshman Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Dependency
depends on MB-11831 a doc gets added but not indexed Resolved

 Comments   
Comment by Harsha Havanur [ 04/Aug/14 ]
Review in progress http://review.couchbase.org/#/c/40175/
Comment by Sriram Melkote [ 04/Aug/14 ]
Ruth, this is a behavior change from 2.5 to 3.0 and hence we should clearly highlight these limits and also note the change in release notes.

In 2.5 - documents larger than 1MB were not indexed by default.
In 3.0 - documents up to 20MB (max allowed by KV) are indexed.

However, passing such large documents to view engine will have performance impact. The customer can control this behavior by changing indexer_max_doc_size variable.

Commit message:
The view engine enforced a limit of 1 MB on documents that can be indexed. This limit is now increased to 20 MB to ensure every document gets indexed and is not silently dropped by the view engine if the size of the document exceeds the previously enforced limit.
Comment by Sarath Lakshman [ 04/Aug/14 ]
There are two more parameters that need to be documented (2.5 and 3.0):

1. max_kv_size_per_doc
Default value: 1MB

The maximum byte size allowed to be emitted for a single document and per view. This is the
sum of the sizes of all emitted keys and values.

    To avoid consuming too much memory while running a single map function,
    enforce a maximum limit for the total amount of data that can be emitted
    per document and per view. This limit is configurable via the couch_config
    option "max_kv_size_per_doc", section "mapreduce", and corresponds to the
    sum of the byte size of all emitted keys and values.

    A value of 0 for this new setting disables the limit (meaning unlimited,
    as before this change).

    So far, before any memory blowup could happen, the function execution was
    aborted by the timeout being triggered, as in the following example case:

    function(doc, meta) {
        for (var i = 0; i < 1000000; i++) {
            for (var j = 0; j < 1000000; j++) {
                emit(doc.value, null);
            }
        }
    }

2. function_timeout
Default value: 10000 ms
Maximum duration, in milliseconds, for the execution time of all the map/reduce functions in a design document against a single document (map function) or against a list of map values/reductions (reduce/rereduce function).
Comment by Ruth Harris [ 05/Aug/14 ]
Got this from Todd. This stuff is for the REST API area, but it should also be mentioned in the Admin Views section.

Related MB: http://www.couchbase.com/issues/browse/MB-9713

# default 1048576 bytes
$ curl -X POST http://Administrator:password@localhost:8091/diag/eval -d \
        'rpc:eval_everywhere(erlang, apply, [fun() -> couch_config:set("set_views", "indexer_max_doc_size", "2048576") end, []]).'

# default 1048576 bytes
$ curl -X POST http://Administrator:password@localhost:8091/diag/eval -d \
        'rpc:eval_everywhere(erlang, apply, [fun() -> couch_config:set("mapreduce", "max_kv_size_per_doc", "524288") end, []]).'

BTW, I won't be able to get to this right now. In a couple of weeks probably.
Please add any additional background information in this bug.

Thanks, Ruth
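
If function_timeout ends up documented the same way, it can presumably be set through the same /diag/eval call shown above; here is a Go sketch of that request, with the assumption (to be verified) that the key lives in the "mapreduce" couch_config section, and with placeholder credentials and an example value of 30000 ms.
{code}
package main

import (
	"fmt"
	"net/http"
	"strings"
)

func main() {
	// Assumption: function_timeout is a couch_config key in the "mapreduce"
	// section, settable through the same /diag/eval mechanism as above.
	payload := `rpc:eval_everywhere(erlang, apply, [fun() -> couch_config:set("mapreduce", "function_timeout", "30000") end, []]).`

	req, err := http.NewRequest("POST", "http://localhost:8091/diag/eval", strings.NewReader(payload))
	if err != nil {
		panic(err)
	}
	req.SetBasicAuth("Administrator", "password") // placeholder credentials

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
{code}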




[MB-12088] Memcached should return an uninitiated error code Created: 28/Aug/14  Updated: 28/Aug/14

Status: Open
Project: Couchbase Server
Component/s: memcached
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Major
Reporter: Patrick Varley Assignee: Trond Norbye
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified


 Description   
This is a blow out from MB-11875 and CBSE-1370.

There is no way for CCCP to tell whether a node is uninitiated versus an auth error.

Currently LCB does not retry on auth error.

This can be painful when a user is doing maintenance, as it means they have to keep updating the bootstrap list when they have removed a node from the cluster that they plan to add back in.

If memcached returned an uninitiated error code, LCB would know to try the other nodes in the bootstrap list.
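
For illustration only, a Go sketch of the bootstrap behavior this would enable; errUninitialized and the error strings are hypothetical, since the distinct status code requested here does not exist yet.
{code}
package main

import (
	"errors"
	"fmt"
)

// Hypothetical error classes for illustration; the distinct "uninitiated"
// status requested by this ticket does not exist yet.
var (
	errAuth          = errors.New("auth error")
	errUninitialized = errors.New("node uninitiated")
)

// connect stands in for a single bootstrap attempt against one node.
func connect(node string) error {
	// A real client would open a connection and authenticate here.
	return errUninitialized
}

// bootstrap walks the bootstrap list. With a distinct "uninitiated" status the
// client can skip such nodes and keep going; with a plain auth error it cannot
// tell the difference and (today) gives up.
func bootstrap(nodes []string) error {
	for _, n := range nodes {
		err := connect(n)
		switch {
		case err == nil:
			fmt.Println("bootstrapped from", n)
			return nil
		case errors.Is(err, errUninitialized):
			fmt.Println(n, "is uninitiated, trying the next node")
		default:
			return err // e.g. a genuine auth failure
		}
	}
	return errors.New("no usable bootstrap node")
}

func main() {
	if err := bootstrap([]string{"10.0.0.1:11210", "10.0.0.2:11210"}); err != nil {
		fmt.Println(err)
	}
}
{code}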

 Comments   
Comment by Patrick Varley [ 28/Aug/14 ]
What I mean by "uninitiated" is that the node has been removed from the cluster and is at the setup wizard.




[MB-12105] cosine is not calculated correctly Created: 01/Sep/14  Updated: 01/Sep/14

Status: Open
Project: Couchbase Server
Component/s: query
Affects Version/s: cbq-DP4
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Iryna Mironava Assignee: Gerald Sangudi
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
cos(90 degrees) is 0, so I would expect select cos(radians(90)) to also return 0, but instead I see:
select cos(radians(90));
{
    "results": [
        {
            "$1": 6.123233995736757e-17
        }
    ],
    "metrics": {
        elapsedTime: 1.522ms,
        executionTime: 1.162ms,
        resultCount: 1
    }
}

I know that is a very small number, but still, is it OK?
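
For what it's worth, this looks like standard IEEE 754 double-precision behavior: radians(90) can only be the float64 nearest to pi/2, never pi/2 itself, so its cosine comes out as a tiny non-zero number. A Go sketch (Go being the language the query engine is written in) showing the same effect; the 1e-15 tolerance is just an illustrative choice.
{code}
package main

import (
	"fmt"
	"math"
)

func main() {
	// radians(90) can only be the float64 closest to pi/2, never pi/2 itself,
	// so its cosine is a tiny non-zero number instead of exactly 0.
	x := 90 * math.Pi / 180
	c := math.Cos(x)
	fmt.Println(c)                   // on the order of 1e-17, not 0
	fmt.Println(math.Abs(c) < 1e-15) // true: zero to within double precision

	// The sin^2 + cos^2 identity shows the same last-bit rounding.
	s, k := math.Sin(math.Pi/45), math.Cos(math.Pi/45) // 4 degrees
	fmt.Println(s*s + k*k)                             // very close to, but not always exactly, 1
}
{code}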

 Comments   
Comment by Iryna Mironava [ 01/Sep/14 ]
Also, sin(x)^2 + cos(x)^2 should be 1, but it is not. Is that OK?

select power(sin(radians(4)), 2) + power(cos(radians(4)), 2);
{
    "results": [
        {
            "$1": 0.9999999999999999
        }
    ],
    "metrics": {
        elapsedTime: 2.104ms,
        executionTime: 1.737ms,
        resultCount: 1
    }
}
cbq> select power(sin(radians(74)), 2) + power(cos(radians(74)), 2);
{
    "results": [
        {
            "$1": 0.9999999999999998
        }
    ],
    "metrics": {
        elapsedTime: 2.183ms,
        executionTime: 1.805ms,
        resultCount: 1
    }
}
cbq> select power(sin(radians(60)), 2) + power(cos(radians(60)), 2);
{
    "results": [
        {
            "$1": 1
        }
    ],
    "metrics": {
        elapsedTime: 2.295ms,
        executionTime: 1.914ms,
        resultCount: 1
    }
}




[MB-12107] calculated sum of float number is not more, less neither equal to another float Created: 01/Sep/14  Updated: 01/Sep/14

Status: Open
Project: Couchbase Server
Component/s: query
Affects Version/s: cbq-DP4
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Iryna Mironava Assignee: Gerald Sangudi
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
The shell shows that sum(test_rate) > 13915.44 -> false
                     sum(test_rate) < 13915.44 -> false
                     sum(test_rate) = 13915.44 -> false
I suppose that at least one of the queries should return true.
cbq> select sum(test_rate) = 13915.44 from my_bucket;
{
    "results": [
        {
            "$1": false
        }
    ],
    "metrics": {
        elapsedTime: 199.511ms,
        executionTime: 199.449ms,
        resultCount: 1
    }
}
cbq> select sum(test_rate) < 13915.44 from my_bucket;
{
    "results": [
        {
            "$1": false
        }
    ],
    "metrics": {
        elapsedTime: 198.899ms,
        executionTime: 198.805ms,
        resultCount: 1
    }
}
cbq> select sum(test_rate) > 13915.44 from my_bucket;
{
    "results": [
        {
            "$1": false
        }
    ],
    "metrics": {
        elapsedTime: 171.867ms,
        executionTime: 171.858ms,
        resultCount: 1
    }
}


I have a float attribute test_rate:
select distinct test_rate, count(test_rate) from my_bucket group by test_rate
{
    "results": [
        {
            "$1": 168,
            "test_rate": 10.1
        },
        {
            "$1": 168,
            "test_rate": 3.3
        },
        {
            "$1": 168,
            "test_rate": 5.5
        },
        {
            "$1": 168,
            "test_rate": 7.7
        },
        {
            "$1": 168,
            "test_rate": 9.9
        },
        {
            "$1": 168,
            "test_rate": 12.12
        },
        {
            "$1": 168,
            "test_rate": 6.6
        },
        {
            "$1": 168,
            "test_rate": 4.4
        },
        {
            "$1": 168,
            "test_rate": 8.8
        },
        {
            "$1": 168,
            "test_rate": 2.2
        },
        {
            "$1": 168,
            "test_rate": 1.1
        },
        {
            "$1": 168,
            "test_rate": 11.11
        }
    ],
    "metrics": {
        elapsedTime: 265.256ms,
        executionTime: 265.196ms,
        resultCount: 12
    }
}

I expect the sum to be 13915.44 (no more), but the sum has some small extra value:
select sum(test_rate) from my_bucket;
{
    "results": [
        {
            "$1": 13915.440000000108
        }
    ],
    "metrics": {
        elapsedTime: 197.398ms,
        executionTime: 197.019ms,
        resultCount: 1
    }
}
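
The trailing ...000000000108 looks like ordinary floating-point accumulation: neither 13915.44 nor most of the intermediate sums are exactly representable as float64, so the running total drifts by a few ULPs depending on the order of addition. Below is a Go sketch using the values from the GROUP BY output above (the engine's actual accumulation order is unknown, so the exact error may differ). Note that for a plain non-NaN float64 one of <, =, > would still have to be true, so all three comparisons returning false points at a separate problem in the comparison path.
{code}
package main

import "fmt"

func main() {
	// Distinct test_rate values from the GROUP BY result, each appearing 168 times.
	rates := []float64{1.1, 2.2, 3.3, 4.4, 5.5, 6.6, 7.7, 8.8, 9.9, 10.1, 11.11, 12.12}

	var sum float64
	for i := 0; i < 168; i++ {
		for _, r := range rates {
			sum += r // every addition rounds to the nearest float64
		}
	}

	const expected = 13915.44 // the decimal 13915.44 is itself not exactly representable
	fmt.Printf("sum      = %.12f\n", sum)
	fmt.Printf("expected = %.12f\n", float64(expected))
	fmt.Println("sum == expected:", sum == expected)
	fmt.Println("sum <  expected:", sum < expected)
	fmt.Println("sum >  expected:", sum > expected)
}
{code}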

 Comments   
Comment by Gerald Sangudi [ 01/Sep/14 ]
Hi Iryna,

Please do the following:
- git pull
- rebuild

select sum, sum < 13915.44 lt, sum = 13915.44 eq, sum > 13915.44 gt
from my_bucket
let sum = sum(test_rate)

And post the results here.

Thanks.
Comment by Iryna Mironava [ 01/Sep/14 ]
cbq> select sum, sum < 13915.44 lt, sum = 13915.44 eq, sum > 13915.44 gt
   > from my_bucket
   > let sum = sum(test_rate)
   > ;
{
    "signature": {
        "eq": "boolean",
        "gt": "boolean",
        "lt": "boolean",
        "sum": "json"
    },
    "results": [
    ],
    "errors": [
        {
            "caller": "let:54",
            "cause": "Error evaluating aggregate: interface conversion: interface is nil, not map[algebra.Aggregate]value.Value.",
            "code": 5000,
            "key": "Internal Error",
            "message": "Error evaluating LET."
        }
    ],
    "metrics": {
        elapsedTime: 149.299ms,
        executionTime: 149.173ms,
        resultCount: 0,
        errorCount: 1
    }
}
Comment by Gerald Sangudi [ 01/Sep/14 ]
You found another problem, which I filed as a separate ticket.

https://www.couchbase.com/issues/browse/MB-12108

Let me fix that and then we can address this one.

Thanks.




[MB-12068] XDCR - only minority of items get replicated Created: 26/Aug/14  Updated: 26/Aug/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: 2.2.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Marek Obuchowicz Assignee: Aleksey Kondratenko
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: Cluster "US" - 3x AWS m3.medium instances
Cluster "EU" - 1x AWS m3.medium instance
All running amzn-ami-pv-2014.03.1.x86_64-ebs
Couchbase server (latest community) installed from couchbase-server-2.2.0-837.x86_64.rpm

Triage: Untriaged
Operating System: Centos 64-bit
Link to Log File, atop/blg, CBCollectInfo, Core dump: https://s3.amazonaws.com/couchbase_debug/eu.zip
https://s3.amazonaws.com/couchbase_debug/us.zip
Is this a Regression?: Unknown

 Description   
I have two clusters:
 - "US" with three nodes - there is one bucket, profiles (with 2 replicas)
 - "EU" with one node - there is one bucket, profiles_debug (without replicase)

I have set up XDCR from the "US" cluster to the "EU" cluster. The number of documents in the "US" cluster is 2.35m (+ 4.7m replicas), but the number of documents replicated to the "EU" cluster is only 780k - exactly 1/3 of all documents. When I create new buckets in the destination cluster and set up XDCR again, the situation doesn't change - after a while the new bucket gets 1/3 of the documents and no more.

In web console -> XDCR Replication on US cluster, status shows "Replicating - Last 10 errors". Last 10 errors state:
{code}
2014-08-26 09:39:14 [Vb Rep] Error replicating vbucket 765. Please see logs for details.
2014-08-26 09:39:13 [Vb Rep] Error replicating vbucket 649. Please see logs for details.
2014-08-26 09:39:13 [Vb Rep] Error replicating vbucket 593. Please see logs for details.
2014-08-26 09:39:13 [Vb Rep] Error replicating vbucket 567. Please see logs for details.
2014-08-26 09:39:13 [Vb Rep] Error replicating vbucket 552. Please see logs for details.
2014-08-26 09:39:13 [Vb Rep] Error replicating vbucket 521. Please see logs for details.
2014-08-26 09:39:13 [Vb Rep] Error replicating vbucket 502. Please see logs for details.
2014-08-26 09:39:13 [Vb Rep] Error replicating vbucket 454. Please see logs for details.
2014-08-26 09:39:13 [Vb Rep] Error replicating vbucket 439. Please see logs for details.
2014-08-26 09:39:13 [Vb Rep] Error replicating vbucket 417. Please see logs for details.
{code}

I have been using XDCR (bi-directional) in the past, when both the US and EU clusters had only one node (no replicas within the clusters). At that time it was working fine. Problems started when I added two nodes to the US cluster, set up a new bucket with 2 replicas, and tried to use XDCR on this bucket.

List of steps we did, in chronological order:
 - create cluster US, 1 node
 - create cluster EU, 1 node
 - setup bucket A, no replicas, on both clusters
 - setup bi-directional XDCR A(US) <-> A(EU), default settings
 - add two nodes to US cluster
 - setup bucket B, 2 replicas, on cluster US
 - terminate XDCR A(US) <-> A(EU)
 - setup XDCR of bucket A(US) to bucket B(US), terminate it when all documents are copied.
 - delete bucket A(EU)
 - create bucket B(EU), 0 replicas
 - setup bi-directional XDCR B(US) <-> B(EU)
 - only 1/3 of documents from B(US) got replicated to B(EU)

Attached cbcollectInfo output from both clusters. As suggested by pfehre, created Jira issue. Please get in touch by email to get direct access to servers if it helps with investigation.

 Comments   
Comment by Aleksey Kondratenko [ 26/Aug/14 ]
I'll need logs from all nodes.




[MB-11965] dev guide incorrect about how GETL works Created: 14/Aug/14  Updated: 15/Aug/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Matt Ingenthron Assignee: Amy Kurtzman
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
From a user, in the dev guide, there is a section that says:
The documentation says: "While a key is locked, other clients are not able to update the key, nor are they able to retrieve it."

Then a few paragraphs later we have a Java code snippet where we do a get on a key while it is locked, and it seems to work.

I'm 95% certain this is incorrect. It’s less about the client and more about the operation type. I usually describe the GETL operation as a cooperative lock in that if all clients are using that path, then the item isn’t accessible. Other operation paths are still available.


 Comments   
Comment by Ruth Harris [ 15/Aug/14 ]
Re-assigning to Amy.




[MB-12114] Expose more cluster health information in REST Created: 02/Sep/14  Updated: 02/Sep/14

Status: Open
Project: Couchbase Server
Component/s: tools
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Major
Reporter: Larry Liu Assignee: Bin Cui
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Relates to

 Description   
Currently, we don't have a REST way to get the cluster health information.

One option is to use cbhealthchecker. cbhealthchecker can provide very comprehensive information about the cluster. Can we expose the cbhealthchecker report via REST selectively?




[MB-12113] Enable REST for failover alert Created: 02/Sep/14  Updated: 02/Sep/14

Status: Open
Project: Couchbase Server
Component/s: ns_server
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Major
Reporter: Larry Liu Assignee: Aleksey Kondratenko
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Dependency

 Description   
Currently we send an alert email when auto-failover happens. Can we monitor it via REST or some other method?

 Comments   
Comment by Aleksey Kondratenko [ 02/Sep/14 ]
/alerts and /logs can be used for that today.
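
A rough Go sketch of polling those endpoints instead of relying on alert email; host and credentials are placeholders, and the /alerts path is taken from the comment above.
{code}
package main

import (
	"fmt"
	"io"
	"net/http"
)

// fetch issues an authenticated GET against the cluster manager. Auto-failover
// events show up in the cluster log, so a monitoring script can poll /logs (or
// /alerts) and look for new entries.
func fetch(url string) (string, error) {
	req, err := http.NewRequest("GET", url, nil)
	if err != nil {
		return "", err
	}
	req.SetBasicAuth("Administrator", "password") // placeholder credentials
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	return string(body), err
}

func main() {
	for _, path := range []string{"/logs", "/alerts"} {
		body, err := fetch("http://localhost:8091" + path)
		if err != nil {
			fmt.Println(path, "error:", err)
			continue
		}
		fmt.Println(path, "->", len(body), "bytes")
	}
}
{code}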




[MB-12116] Improve efficiency of clusters with 100 nodes Created: 03/Sep/14  Updated: 03/Sep/14

Status: Open
Project: Couchbase Server
Component/s: None
Affects Version/s: 3.0
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Major
Reporter: Cihan Biyikoglu Assignee: Unassigned
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: Today Couchbase clusters with 100 nodes have chatty communication among the nodes. This item tracks the work needed to improve Couchbase Server clusters of over 100 nodes.





[MB-12133] GETL metric in statistics missing Created: 05/Sep/14  Updated: 05/Sep/14

Status: Open
Project: Couchbase Server
Component/s: None
Affects Version/s: 2.2.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Alexander Petrossian (PAF) Assignee: Unassigned
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: cmd_get does not count GETL operations.

Triage: Untriaged
Operating System: Centos 64-bit
Is this a Regression?: Unknown

 Description   
cmd_get -- have
cmd_getl -- don't have



 Comments   
Comment by Alexander Petrossian (PAF) [ 05/Sep/14 ]
https://tracker.teligent.ru/issues/15305
Comment by Alexander Petrossian (PAF) [ 05/Sep/14 ]
Since "ops per second" is defined as the total amount of operations per second to this bucket (measured from cmd_get + cmd_set + incr_misses + incr_hits + decr_misses + decr_hits + delete_misses + delete_hits),

getl operations are missing from "ops per second" too.

In our solution we use getl+cas most of the time, so "ops per second" shows approximately 1/2 of all operations.

(we have extensive monitoring on client side too, and we see those getl-s there)




[MB-12134] moxi does not check cluster-map connection, which may be dropped Created: 05/Sep/14  Updated: 05/Sep/14

Status: Open
Project: Couchbase Server
Component/s: None
Affects Version/s: 2.2.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Alexander Petrossian (PAF) Assignee: Unassigned
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: our solution is deployed in an environment where TCP connections are treated as resources and are dropped after 1 hour of inactivity.

we can't change this.

Triage: Untriaged
Operating System: Centos 64-bit
Is this a Regression?: Unknown

 Description   
1. moxi initiates a connection to Couchbase to get the cluster map and its updates.
2. Since there are normally no updates, this connection gets silently dropped by a network element that treats connections as valuable resources.
3. A new bucket is created or a rebalance happens.
4. The Couchbase cluster map changes.
5. Couchbase tries to push the cluster-map update down to moxi.
6. The TCP packet gets silently dropped by the network element in the middle.

We can't change the behavior of this network element.

We want moxi to regularly check that the vital TCP connection carrying cluster-map updates is still alive and, if not, reconnect automatically.
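
moxi is C, so the following is only a Go illustration of the kind of liveness check being requested: enable TCP keepalive (with a period well under the 1-hour idle cutoff) on the cluster-map connection, and reconnect when it fails. The address and timings are placeholders.
{code}
package main

import (
	"fmt"
	"net"
	"time"
)

// dialClusterMap opens the (hypothetical) cluster-map streaming connection and
// turns on TCP keepalive so that an idle connection silently dropped by a
// middlebox is eventually noticed instead of hanging forever.
func dialClusterMap(addr string) (net.Conn, error) {
	conn, err := net.DialTimeout("tcp", addr, 10*time.Second)
	if err != nil {
		return nil, err
	}
	if tcp, ok := conn.(*net.TCPConn); ok {
		tcp.SetKeepAlive(true)
		tcp.SetKeepAlivePeriod(5 * time.Minute) // well under the 1-hour idle cutoff
	}
	return conn, nil
}

func main() {
	// Placeholder address; the real stream comes from the cluster manager.
	conn, err := dialClusterMap("127.0.0.1:8091")
	if err != nil {
		fmt.Println("dial failed, would retry/reconnect:", err)
		return
	}
	defer conn.Close()
	fmt.Println("connected with keepalive; on read error, reconnect and re-fetch the cluster map")
}
{code}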

 Comments   
Comment by Alexander Petrossian (PAF) [ 05/Sep/14 ]
http://jira.teligent.ru/browse/....-7028




[MB-12140] Meaningful error should be given to the user Created: 05/Sep/14  Updated: 05/Sep/14

Status: Reopened
Project: Couchbase Server
Component/s: UI
Affects Version/s: 3.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Raju Suravarjjala Assignee: Anil Kumar
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: Windows build 3.0.1_1261
Environment: Windows 7 64 bit

Attachments: PNG File Screen Shot 2014-09-05 at 5.09.59 PM.png    
Triage: Untriaged
Is this a Regression?: No

 Description   
Login to the Couchbase console
http://10.2.2.52:8091/ (Administrator/Password)
Click on Server Nodes
Try to Add a server
Give the Server IP address (10.3.2.43)
In the Security give a read only user name and password
You will see the error as seen in the screenshot
Expected behavior: Attention - Authentication failed as Readonly username and password are not allowed.

 Comments   
Comment by Aleksey Kondratenko [ 05/Sep/14 ]
It is against security recommendations (including PCI DSS) to reveal security-sensitive details such as "such user exists".
Comment by Raju Suravarjjala [ 05/Sep/14 ]
Anil: I have logged this bug as per your suggestion. Please advise.




[MB-12144] REST API doesn't always return 60 stat entries Created: 08/Sep/14  Updated: 08/Sep/14

Status: Open
Project: Couchbase Server
Component/s: RESTful-APIs
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Ian McCloy Assignee: Aleksey Kondratenko
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Dependency
Triage: Untriaged
Is this a Regression?: Unknown

 Description   
The output of the REST API stats doesn't always return 60 entries for each stat; sometimes 59 or 61 are returned.

for i in {1..1000}; do /opt/couchbase/bin/curl localhost:8091/pools/default/buckets/default/stats 2>/dev/null| cut -d":" -f4 | cut -d \" -f1 | tr , "\n" | grep -v '^$' | wc -l | grep -v 60 ; done
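
A rough Go equivalent of the shell loop, assuming the response keeps the per-stat samples under op.samples (credentials are placeholders); it prints any stat whose sample count is not 60.
{code}
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

type statsResponse struct {
	Op struct {
		Samples map[string][]float64 `json:"samples"`
	} `json:"op"`
}

func main() {
	// Same endpoint as the shell loop above; placeholder credentials/host.
	req, err := http.NewRequest("GET", "http://localhost:8091/pools/default/buckets/default/stats", nil)
	if err != nil {
		panic(err)
	}
	req.SetBasicAuth("Administrator", "password")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var sr statsResponse
	if err := json.NewDecoder(resp.Body).Decode(&sr); err != nil {
		panic(err)
	}
	for name, samples := range sr.Op.Samples {
		if len(samples) != 60 {
			fmt.Printf("%s: %d entries (expected 60)\n", name, len(samples))
		}
	}
}
{code}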




[MB-12150] [Windows] Cleanup unnecessary files that are part of the windows installer Created: 08/Sep/14  Updated: 08/Sep/14

Status: Open
Project: Couchbase Server
Component/s: installer
Affects Version/s: 3.0.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Raju Suravarjjala Assignee: Bin Cui
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: Windows 7
Build 3.0.1-1261

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
Install Windows build 3.0.1-1261.
As part of the installation you will see two files, couchbase_console.html and membase_console.html. You do not need membase_console.html; please remove it.




[MB-12155] View query and index compaction failing on 1 node with error view_undefined Created: 09/Sep/14  Updated: 09/Sep/14

Status: Open
Project: Couchbase Server
Component/s: view-engine
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Ian McCloy Assignee: Harsha Havanur
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Dependency
Triage: Untriaged
Operating System: Windows 64-bit
Is this a Regression?: Unknown

 Description   
Customer upgraded their 6 node cluster from 2.2 to 2.5.1 running on Microsoft Windows Server 2008 R2 Enterprise and one of their views stopped working.

It appears the indexing and index compaction stopped working on 1 node out of the 6. This appeared to only affect 1 design document.

snips from problem node -->>

[couchdb:error,2014-09-08T17:20:31.840,ns_1@HOST:<0.23288.321>:couch_log:error:42]Uncaught error in HTTP request: {throw,view_undefined}

Stacktrace: [{couch_set_view,get_group,3},
             {couch_set_view,get_map_view,4},
             {couch_view_merger,get_set_view,5},
             {couch_view_merger,simple_set_view_query,3},
             {couch_httpd,handle_request,6},
             {mochiweb_http,headers,5},
             {proc_lib,init_p_do_apply,3}]
[couchdb:info,2014-09-08T17:20:31.840,ns_1@HOST:<0.23288.321>:couch_log:info:39]10.7.43.229 - - POST /_view_merge/?stale=false 500

=====

[ns_server:warn,2014-09-08T17:25:10.506,ns_1@HOST:<0.14357.327>:compaction_daemon:do_chain_compactors:725]Compactor for view `Bucket/_design/DDOC/main` (pid [{type,view},
                                                {important,true},
                                                {name,
                                                  <<"Bucket/_design/DDoc/main">>},
                                                {fa,
                                                  {#Fun<compaction_daemon.16.22390493>,
                                                  [<<"Bucket">>,
                                                    <<"_design/DDoc">>,main,
                                                    {config,
                                                    {30,18446744073709551616},
                                                    {30,18446744073709551616},
                                                    undefined,false,false,
                                                    {daemon_config,30,
                                                      131072}},
                                                    false,
                                                    {[{type,bucket}]}]}}]) terminated unexpectedly: {error,
                                                                                                    view_undefined}
[ns_server:warn,2014-09-08T17:25:10.506,ns_1@HOST:<0.14267.327>:compaction_daemon:do_chain_compactors:730]Compactor for view `Bucket/_design/DDoc` (pid [{type,view},
                                            {name,<<"Bucket/_design/DDoc">>},
                                            {important,false},
                                            {fa,
                                            {#Fun<compaction_daemon.20.107749383>,
                                              [<<"Bucket">>,<<"_design/DDoc">>,
                                              {config,
                                                {30,18446744073709551616},
                                                {30,18446744073709551616},
                                                undefined,false,false,
                                                {daemon_config,30,131072}},
                                              false,
                                              {[{type,bucket}]}]}}]) terminated unexpectedly (ignoring this): {error,
                                                                                                                view_undefined}
[ns_server:debug,2014-09-08T17:25:10.506,ns_1@HOST:compaction_daemon<0.480.0>:compaction_daemon:handle_info:505]Finished compaction iteration.




[MB-12166] Linux: Warnings on install are poorly formatted and unlikely to be read by a user. Created: 10/Sep/14  Updated: 10/Sep/14

Status: Open
Project: Couchbase Server
Component/s: installer
Affects Version/s: 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Dave Rigby Assignee: Bin Cui
Resolution: Unresolved Votes: 0
Labels: supportability
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: Centos 6

Attachments: PNG File Screen Shot 2014-09-10 at 15.21.55.png    
Triage: Untriaged
Operating System: Centos 64-bit
Is this a Regression?: Unknown

 Description   
When installing the 3.0 RPM, we check for various OS settings and print warnings if they don't meet our recommendations.

This is a great idea in principle, but the actual output isn't very well presented, meaning users are (IMHO) likely to not spot the issues which are being raised.

I've attached a screenshot to show this exactly as displayed in the console, but the verbatim text is:

---cut ---
$ sudo rpm -Uvh couchbase-server-enterprise_centos6_x86_64_3.0.0-1209-rel.rpm
Preparing... ########################################### [100%]
Warning: Transparent hugepages may be used. To disable the usage
of transparent hugepages, set the kernel settings at runtime with
echo never > /sys/kernel/mm/transparent_hugepage/enabled
Warning: Transparent hugepages may be used. To disable the usage
of transparent hugepages, set the kernel settings at runtime with
echo never > /sys/kernel/mm/redhat_transparent_hugepage/enabled
Warning: Swappiness is not 0.
You can set the swappiness at runtime with
sysctl vm.swappiness=0
Minimum RAM required : 4 GB
System RAM configured : 0.97 GB

Minimum number of processors required : 4 cores
Number of processors on the system : 1 cores

   1:couchbase-server ########################################### [100%]
Starting couchbase-server[ OK ]

You have successfully installed Couchbase Server.
Please browse to http://localhost.localdomain:8091/ to configure your server.
Please refer to http://couchbase.com for additional resources.

Please note that you have to update your firewall configuration to
allow connections to the following ports: 11211, 11210, 11209, 4369,
8091, 8092, 18091, 18092, 11214, 11215 and from 21100 to 21299.

By using this software you agree to the End User License Agreement.
See /opt/couchbase/LICENSE.txt.
$
---cut ---

A couple of observations:

1) Everything is run together, including informational things (Preparing, Installation successful) and things the user should act on (Warning: Swappiness, THP, Firewall information).

2) It's not very clear how serious some of these messages are - is the fact I'm running with 1/4 of the minimum RAM just a minor thing, or a showstopper? Similarly with THP - Support have seen on many occasions that this can cause false-positive fail overs, but we just casually say here:

"Warning: Transparent hugepages may be used. To disable the usage of transparent hugepages, set the kernel settings at runtime with echo never > /sys/kernel/mm/transparent_hugepage/enabled"


Suggestions:

1) Make the Warnings more pronounced - e.g. prefix them with "[WARNING]" and add some blank lines between things.

2) Make it clearer why these things are listed - linking back to more detailed information in our install guide if necessary. For example: "THP may cause slowdown of the cluster manager and false positive fail overs. Couchbase recommend disabling it. See http://docs.couchbase.com/THP for more details."

3) For things like THP which we can actually fix, ask the user if they want them fixed - after all, we are already root if we are installing - e.g. "THP bad.... Would you like the system THP setting to be changed to the recommended value (madvise)? (y/n)"

4) For things we can't fix (low memory, low CPUs) make the user confirm their decision to continue - e.g. "CPUs below minimum. Couchbase recommends at least XXX for production systems. Please type "test system" to continue installation.



 Comments   
Comment by David Haikney [ 10/Sep/14 ]
+1 from me - we can clearly improve the presentation here. I expect making the install interactive ("should I fix THP?") could be difficult. Are there existing precedents we can refer to here to help consistency?
Comment by Dave Rigby [ 10/Sep/14 ]
@DaveH: Admittedly I don't think they use RPM, but VMware guest tools springs to mind - they present the user a number of questions when installing - "do you want to automatically update kernel modules?", "do you want to use printer sharing", etc.

Admittedly they don't have a secondary config stage unlike us with our GUI, *but* if we are going to fix things like THP, swappiness, then we need to be root to do so (and so install-time is the only option).




[MB-12177] document SDK usage of CA and self-signed certs Created: 12/Sep/14  Updated: 12/Sep/14

Status: Open
Project: Couchbase Server
Component/s: None
Affects Version/s: 3.0
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Major
Reporter: Matt Ingenthron Assignee: Unassigned
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Gantt: finish-start
has to be done after MB-12173 SSL certificate should allow importin... Open

 Description   
To be done after Couchbase Server supports this.




[MB-12174] Clarification on SSL communication documentation for 3.0 Created: 12/Sep/14  Updated: 12/Sep/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 3.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Cihan Biyikoglu Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: customer
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown




[MB-11989] XDCR next release Created: 18/Aug/14  Updated: 12/Sep/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: feature-backlog
Fix Version/s: None
Security Level: Public

Type: Epic Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Epic Name: XDCR next release
Epic Status: To Do




[MB-12182] XDCR@next release - unit test "asynchronize" mode of XmemNozzle Created: 12/Sep/14  Updated: 12/Sep/14

Status: Open
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: feature-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: sprint1_xdcr
Remaining Estimate: 16h
Time Spent: Not Specified
Original Estimate: 16h

Epic Link: XDCR next release




[MB-12020] XDCR@next release - REST Server Created: 19/Aug/14  Updated: 12/Sep/14

Status: In Progress
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Yu Sui
Resolution: Unresolved Votes: 0
Labels: sprint1_xdcr
Remaining Estimate: 32h
Time Spent: Not Specified
Original Estimate: 32h

Epic Link: XDCR next release

 Description   
Build on top of the admin port:
1. request/response message formats defined in protobuf
2. handlers for requests




[MB-12019] XDCR@next release - Replication Manager #1: barebone Created: 19/Aug/14  Updated: 12/Sep/14

Status: In Progress
Project: Couchbase Server
Component/s: cross-datacenter-replication
Affects Version/s: techdebt-backlog
Fix Version/s: None
Security Level: Public

Type: Task Priority: Major
Reporter: Xiaomei Zhang Assignee: Xiaomei Zhang
Resolution: Unresolved Votes: 0
Labels: sprint1_xdcr
Remaining Estimate: 32h
Time Spent: Not Specified
Original Estimate: 32h

Epic Link: XDCR next release

 Description   
Build on top of the generic FeedManager with XDCR specifics:
1. interface with the Distributed Metadata Service
2. interface with NS-server




[MB-12194] [Windows] When you try to uninstall CB server it comes up with Installer wizard instead of uninstall Created: 15/Sep/14  Updated: 15/Sep/14

Status: Open
Project: Couchbase Server
Component/s: installer
Affects Version/s: 3.0.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Raju Suravarjjala Assignee: Bin Cui
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: Windows 7
Build: 3.0.1_1299

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
Install the Windows 3.0.1_1299 build.
Try to uninstall the CB server.
You will see the CB InstallShield Installation Wizard, which then prompts to remove the selected application and all of its features.

Expected result: It would be nicer for an Uninstall Wizard to appear instead of the confusing Installation Wizard.




[MB-12196] [Windows] When I run cbworkloadgen.exe, I see a Warning message Created: 15/Sep/14  Updated: 15/Sep/14

Status: Open
Project: Couchbase Server
Component/s: installer
Affects Version/s: 3.0.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Raju Suravarjjala Assignee: Bin Cui
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: Windows 7
Build 1299

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
Install the 3.0.1_1299 build.
Go to the bin directory in the installation directory and run cbworkloadgen.exe.
You will see the following warning:
WARNING:root:could not import snappy module. Compress/uncompress function will be skipped.

Expected behavior: The above warning should not appear





[MB-12199] curl -H arguments need to use double quotes Created: 16/Sep/14  Updated: 16/Sep/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 2.5.0, 2.5.1, 3.0.1, 3.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Matt Ingenthron Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
Current documentation states:

Indicates that an HTTP PUT operation is requested.
-H 'Content-Type: application/json'

And that will fail, seemingly owing to the single quotes. See also:
https://twitter.com/RamSharp/status/511739806528077824


 Comments   
Comment by Ruth Harris [ 16/Sep/14 ]
TASK for TECHNICAL WRITER
Fix in 3.0 == FIXED: Added single quotes or removed quotes from around the http string in appropriate examples.
Design Doc rest file - added single quotes, Compaction rest file ok, Trbl design doc file ok

FIX in 2.5: TBD

-----------------------

CONCLUSION:
At least with PUT, both single and double quotes work around: Content-Type: application/json. Didn't check GET or DELETE.
With PUT and DELETE, no quotes and single quotes around the http string work. Note: Some of the examples are missing a single quote around the http string. Meaning, one quote is present, but either the ending or beginning quote is missing. Didn't check GET.

Perhaps a missing single quote around the http string was the problem?
Or perhaps formatting tags associated with ZlatRam's byauth.ddoc code were causing the problem?

----------------------

TEST ONE:
1. create a ddoc and view from the UI = testview and testddoc
2. retrieve the ddoc using GET
3. use single quotes around Content-Type: application/json and around the http string. Note: Some of the examples are missing single quotes around the http string.
code: curl -X GET -H 'Content-Type: application/json' 'http://Administrator:password@10.5.2.54:8092/test/_design/dev_testddoc'
results: {
    "views": {
        "testview": {
            "map": "function (doc, meta) {\n emit(meta.id, null);\n}"
        }
    }
}

TEST TWO:
1. delete testddoc
2. use single quotes around Content-Type: application/json and around the http string
code: curl -X DELETE -H 'Content-Type: application/json' 'http://Administrator:password@10.5.2.54:8092/test/_design/dev_testddoc'
results: {"ok":true,"id":"_design/dev_testddoc"}
visual check via UI: Yep, it's gone


TEST THREE:
1. create a myauth.ddoc text file using the code in the Couchbase design doc documentation page.
2. Use PUT to create a dev_myauth design doc
3. use single quotes around Content-Type: application/json and around the http string. Note: I used "| python -m json.tool" to get pretty print output

myauth.ddoc contents: {"views":{"byloc":{"map":"function (doc, meta) {\n if (meta.type == \"json\") {\n emit(doc.city, doc.sales);\n } else {\n emit([\"blob\"]);\n }\n}"}}}
code: curl -X PUT -H 'Content-Type: application/json' 'http://Administrator:password@10.5.2.54:8092/test/_design/dev_myauth' -d @myauth.ddoc | python -m json.tool
results: {
    "id": "_design/dev_myauth",
    "ok": true
}
visual check via UI: Yep, it's there.

TEST FOUR:
1. copy myauth.ddoc to zlat.ddoc
2. Use PUT to create a dev_zlat design doc
3. use double quotes around Content-Type: application/json and single quotes around the http string.

zlat.ddoc contents: {"views":{"byloc":{"map":"function (doc, meta) {\n if (meta.type == \"json\") {\n emit(doc.city, doc.sales);\n } else {\n emit([\"blob\"]);\n }\n}"}}}
code: curl -X PUT -H "Content-Type: application/json" 'http://Administrator:password@10.5.2.54:8092/test/_design/dev_zlat' -d @zlat.ddoc | python -m json.tool
results: {
    "id": "_design/dev_zlat",
    "ok": true
}
visual check via UI: Yep, it's there.


TEST FIVE:
1. create a ddoc text file using ZlatRam's ddoc code
2. flattened the formatting so it reflected the code in the Couchbase example (used above)
3. Use PUT and single quotes.

zlatram contents: {"views":{"byauth":{"map":"function (doc, username) {\n if (doc.type == \"session\" && doc.user == username && Date.Parse(doc.expires) > Date.Parse(Date.Now()) ) {\n emit(doc.token, null);\n }\n}"}}}
code: curl -X PUT -H 'Content-Type: application/json' 'http://Administrator:password@10.5.2.54:8092/test/_design/dev_zlatram' -d @zlatram.ddoc | python -m json.tool
results: {
    "id": "_design/dev_zlatram",
    "ok": true
}
visual check via UI: Yep, it's there.

TEST SIX:
1. delete zlatram ddoc but without quotes around the http string: curl -X DELETE -H 'Content-Type: application/json' http://Administrator:password@10.5.2.54:8092/test/_design/dev_zlatram
2. results: {
    "id": "_design/dev_zlatram",
    "ok": true
}
3. verify via UI: Yep, it's gone
4. add zlatram but without quotes around the http string: curl -X PUT -H 'Content-Type: application/json' http://Administrator:password@10.5.2.54:8092/test/_design/dev_zlatram
5. results: {
    "id": "_design/dev_zlatram",
    "ok": true
}
6. verify via UI: Yep, it's back.




[MB-11938]  N1QL developer preview does not work with couchbase 3.0 beta. Created: 12/Aug/14  Updated: 17/Sep/14

Status: Open
Project: Couchbase Server
Component/s: query
Affects Version/s: 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Patrick Varley Assignee: Gerald Sangudi
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
This came in on IRC, user dropped offline before I could point them at Jira. I have created this defect on their behalf:

N1QL makes use of _all_docs, which we have removed in 3.0.

The error from the query engine:

couchbase-query_dev_preview3_x86_64_mac ► ./cbq-engine -couchbase http://127.0.0.1:8091/
19:13:38.355197 Info line disabled false
19:13:38.367261 tuqtng started...
19:13:38.367282 version: v0.7.2
19:13:38.367287 site: http://127.0.0.1:8091/
19:14:24.179252 ERROR: Unable to access view - cause: error executing view req at http://127.0.0.1:8092/free/_all_docs?limit=1001: 400 Bad Request - {"error":"bad_request","reason":"_all_docs is no longer supported"}
 -- couchbase.(*viewIndex).ScanRange() at view_index.go:186
19:14:24.179272 Checking bucket URI: /pools/default/buckets/free?bucket_uuid=660ff64e9d1fdfee0c41017e89a4fe72
19:14:24.179315 ERROR: Get /pools/default/buckets/free?bucket_uuid=660ff64e9d1fdfee0c41017e89a4fe72: unsupported protocol scheme "" -- couchbase.(*viewIndex).ScanRange() at view_index.go:192

 Comments   
Comment by Gerald Sangudi [ 12/Aug/14 ]
Please use

CREATE PRIMARY INDEX

before issuing queries against 3.0.
Comment by Brett Lawson [ 17/Sep/14 ]
Hey Gerald,
I assume this is just a temporary workaround?
Cheers, Brett
Comment by Gerald Sangudi [ 17/Sep/14 ]
Hi Brett,

It may not be temporary. User would need to issue

CREATE PRIMARY INDEX

once per bucket. After that, they can query the bucket as often as needed. Subsequent calls to CREATE PRIMARY INDEX will notice the existing index and return immediately.

Maintaining the primary index is not cost-free, so we may not want to automatically create it for every bucket (e.g. a very large KV bucket with no N1QL or view usage).

Thanks,
Gerald




[MB-12195] Update notifications does not seem to be working Created: 15/Sep/14  Updated: 17/Sep/14

Status: Open
Project: Couchbase Server
Component/s: UI
Affects Version/s: 2.5.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Raju Suravarjjala Assignee: Ian McCloy
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: Centos 5.8
2.5.0

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
I have installed the 2.5.0 build and enabled Update Notifications.
Even though I enabled "Enable software Update Notifications", I keep getting "No Updates available".
I thought I would be notified in the UI that 2.5.1 is available.

I consulted Tony to see if I had done something wrong, but he also confirmed that this seems to be an issue and is a bug.

 Comments   
Comment by Aleksey Kondratenko [ 15/Sep/14 ]
Based on dev tools, we're getting "no new version" from phone-home requests. So it's not a UI bug.
Comment by Ian McCloy [ 17/Sep/14 ]
Added the missing available upgrade paths to the database,

2.5.0-1059-rel-enterprise -> 2.5.1-1083-rel-enterprise
2.2.0-837-rel-enterprise -> 2.5.1-1083-rel-enterprise
2.1.0-718-rel-enterprise -> 2.2.0-837-rel-enterprise

but it looks like the code that parses http://ph.couchbase.net/v2?callback=jQueryxxx isn't checking the database.
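
For reference, the phone-home response can be inspected manually by fetching the same endpoint (the callback value is just the placeholder quoted above):

curl 'http://ph.couchbase.net/v2?callback=jQueryxxx'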




[MB-12189] (misunderstanding) XDCR REST API "max-concurrency" only works for 1 of 3 documented end-points. Created: 15/Sep/14  Updated: 17/Sep/14

Status: Reopened
Project: Couchbase Server
Component/s: ns_server, RESTful-APIs
Affects Version/s: 2.5.1, 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Major
Reporter: Jim Walker Assignee: Aleksey Kondratenko
Resolution: Unresolved Votes: 0
Labels: supportability, xdcr
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: Couchbase Server 2.5.1
RHEL 6.4
VM (VirtualBox)
1 node "cluster"

Triage: Untriaged
Operating System: Centos 64-bit
Is this a Regression?: Unknown

 Description   
This defect relates to the following REST APIs:

* xdcrMaxConcurrentReps (default 32) http://localhost:8091/internalSettings/
* maxConcurrentReps (default 32) http://localhost:8091/settings/replications/
* maxConcurrentReps (default 32) http://localhost:8091/settings/replications/<replication_id>

The documentation suggests these all do the same thing, but with the scope of change being different.

<docs>
/settings/replications/ — global settings applied to all replications for a cluster
settings/replications/<replication_id> — settings for specific replication for a bucket
/internalSettings - settings applied to all replications for a cluster. Endpoint exists in Couchbase 2.0 and onward.
</docs>

This defect is because only "settings/replications/<replication_id>" has any effect. The other REST endpoints have no effect.

Out of these APIs I can confirm that changing "/settings/replications/<replication_id>" has an effect. The XDCR code shows that the concurrent reps setting feeds into the concurrency throttle as the number of available tokens. I used the xdcr log files, where we print the concurrency throttle token data, to observe that the setting has an effect.

For example, a cluster in the default configuration has a total of 32 tokens. We can grep to see this:

[root@localhost logs]# grep "is done normally, total tokens:" xdcr.*
2014-09-15T13:09:03.886,ns_1@127.0.0.1:<0.32370.0>:concurrency_throttle:clean_concurr_throttle_state:275]rep <0.33.1> to node "192.168.69.102:8092" is done normally, total tokens: 32, available tokens: 32,(active reps: 0, waiting reps: 0)

After changing the setting to 42, the log file shows the change take effect.

curl -u Administrator:password http://localhost:8091/settings/replications/01d38792865ba2d624edb4b2ad2bf07f%2fdefault%2fdefault -d maxConcurrentReps=42

[root@localhost logs]# grep "is done normally, total tokens:" xdcr.*
dcr.1:[xdcr:debug,2014-09-15T13:17:41.112,ns_1@127.0.0.1:<0.32370.0>:concurrency_throttle:clean_concurr_throttle_state:275]rep <0.2321.1> to node "192.168.69.102:8092" is done normally, total tokens: 42, available tokens: 42,(active reps: 0, waiting reps: 0)

Since this defect is that both of the other two REST endpoints don't appear to have any effect, here's an example changing "/settings/replications/". This example was on a clean cluster, i.e. no other settings had been changed; only bucket and replication creation plus client writes had been performed.

root@localhost logs]# curl -u Administrator:password http://localhost:8091/settings/replications/ -d maxConcurrentReps=48
{"maxConcurrentReps":48,"checkpointInterval":1800,"docBatchSizeKb":2048,"failureRestartInterval":30,"workerBatchSize":500,"connectionTimeout":180,"workerProcesses":4,"httpConnections":20,"retriesPerRequest":2,"optimisticReplicationThreshold":256,"socketOptions":{"keepalive":true,"nodelay":false},"supervisorMaxR":25,"supervisorMaxT":5,"traceDumpInvprob":1000}

The above shows that the JSON response acknowledged the value of 48, but the log files show no change. After much waiting and re-checking, grep shows no evidence of it:

[root@localhost logs]# grep "is done normally, total tokens:" xdcr.* | grep "total tokens: 48" | wc -l
0
[root@localhost logs]# grep "is done normally, total tokens:" xdcr.* | grep "total tokens: 32" | wc -l
7713

The same was observed for /internalSettings/

Found on both 2.5.1 and 3.0.

 Comments   
Comment by Aleksey Kondratenko [ 15/Sep/14 ]
This is because global settings affect new replications or replications without per-replication settings defined. The UI always defines all per-replication settings.
Comment by Jim Walker [ 16/Sep/14 ]
Have you pushed a documentation update for this?
Comment by Aleksey Kondratenko [ 16/Sep/14 ]
No. I don't own docs.
Comment by Jim Walker [ 17/Sep/14 ]
Then this issue is not resolved.

Closing/resolving this defect with breadcrumbs to the opening of an issue on a different project would suffice as a satisfactory resolution.

You can also very easily put a pull request into docs on github with the correct behaviour.

Can you please perform *one* of those tasks so that the REST API here is correctly documented with the behaviours you are aware of and this matter can be closed.
Comment by Jim Walker [ 17/Sep/14 ]
Resolution requires either:

* Corrected documentation pushed to documentation repository.
* Enough accurate API information placed into a documentation defect so docs-team can correct.





[MB-11891] cbrestore: override the TTL Created: 06/Aug/14  Updated: 06/Aug/14

Status: Open
Project: Couchbase Server
Component/s: tools
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Minor
Reporter: Patrick Varley Assignee: Bin Cui
Resolution: Unresolved Votes: 0
Labels: cbback
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified


 Description   
There are times a user will wish to restore data that has already expired since the backup was taken.

We should have an option on cbrestore to override the TTL or increase it by a fixed amount.




[MB-11898] language field missing on view definition Created: 07/Aug/14  Updated: 07/Aug/14

Status: Open
Project: Couchbase Server
Component/s: tools
Affects Version/s: 2.5.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Minor
Reporter: Michael Nitschinger Assignee: Bin Cui
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
Hi folks,

I know this field doesn't really have a meaning right now, but it might in the future. While working on design doc management for the Java SDK I noticed something odd.

All the design documents normally have the "language" field set to default. But when you go to the UI, click "Create new Development View", give it a new design doc and a random view name, click save, and do nothing more, the design doc is there and all is good - but the language field is missing!

For example (I created a brewery design doc):

http://localhost:8092/beer-sample/_design/dev_brewery

{
    "views": {
        "brew": {
            "map": "function (doc, meta) { emit(meta.id, null); }"
        }
    }
}

See that the language field is missing.

 Comments   
Comment by Volker Mische [ 07/Aug/14 ]
I've changed the component to "tools" (and the assignee to automatic) as it is related to the beer sample. What Michael refers to as "normal" is the beer sample. The view engine just sets what it gets.
Comment by Michael Nitschinger [ 07/Aug/14 ]
Actually, let me clarify: when you import the beer-sample it _has_ the language set. If you create a new design doc it hasn't. I don't know which of those behaviors is the correct one, but at least it should be consistent (some have it set, others do not).
Comment by Volker Mische [ 07/Aug/14 ]
I think it's only our samples (beer and gamesim) that have those set.
Comment by Sriram Melkote [ 07/Aug/14 ]
Yes. I think Filipe decided to remove unwanted parameters to keep things clean (which I agree with).

Please see change 18a46a64c20176f11a22b2c92ece0eb7816c3d84 on couchdb.




[MB-11927] Pump system doesn't close backup database files Created: 11/Aug/14  Updated: 11/Aug/14

Status: Open
Project: Couchbase Server
Component/s: tools
Affects Version/s: 2.2.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Minor
Reporter: Bryce Jasmer Assignee: Bin Cui
Resolution: Unresolved Votes: 0
Labels: cbback
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Operating System: Centos 64-bit
Is this a Regression?: Unknown

 Description   
Instead of calling out to cbbackup from our backup wrapper, I'd like to call pump_transfer.Backup().main() myself from within a python script. However, if I do that, I notice that the data-0000.cbb file never gets closed. Since I sleep until the next day and start a new backup, I end up with lots of open file handles to files that I've rotated away (deleted), so my backup drive eventually fills up.

It looks like cbbackup doesn't experience this simply because it exits when it is done with a single backup and that will release the open file handle.

I realize my use case is a little unorthodox, so I've marked this only as "Minor". But it would be nice if everything was cleaned up properly when components are done with it.




[MB-12050] Set LANG=C on cbcollect_info. Created: 22/Aug/14  Updated: 22/Aug/14

Status: Open
Project: Couchbase Server
Component/s: ns_server
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Minor
Reporter: Patrick Varley Assignee: Aleksey Kondratenko
Resolution: Unresolved Votes: 0
Labels: supportability
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified


 Description   
There have been a number of cases where we have received the couchbase.log in another language; unfortunately, most of the support team only speaks English.

I believe that if we set LANG=C, the output should be in English.
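
As a rough illustration of the intent, forcing the locale when invoking the collector should make the gathered command output English (the install path is the usual Linux location; the output filename is just an example):

LANG=C LC_ALL=C /opt/couchbase/bin/cbcollect_info /tmp/couchbase-collect.zip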





[MB-11885] XDCR createReplication REST API arguments are not all documented Created: 05/Aug/14  Updated: 05/Aug/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Minor
Reporter: Don Stacy Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: No

 Description   
The section http://docs.couchbase.com/couchbase-manual-2.5/cb-rest-api/#creating-xdcr-replications shows some of the parameters available to the REST API call, but not all of the parameters. A user asked how to set the Protocol value via the REST API. The option is there (via the "type" attribute), but it's not in the documentation. Ideally the doc should include every available setting and the allowed values for each (e.g., type only allows xmem or capi as values).

This ticket is specific to one REST API method, but the requirement to document all values really applies to all REST calls.
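
For illustration, this is the kind of call the user was after - setting the replication protocol at creation time via the "type" parameter (endpoint and parameter names reflect my understanding of the 2.5 REST API; host, credentials, bucket and cluster names are placeholders):

curl -u Administrator:password http://localhost:8091/controller/createReplication \
  -d fromBucket=default \
  -d toCluster=remote-cluster \
  -d toBucket=default \
  -d replicationType=continuous \
  -d type=xmem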

 Comments   
Comment by Ruth Harris [ 05/Aug/14 ]
Agreed. REST needs to be overhauled. However, it's not in the bandwidth for the 3.0 release.




[MB-11894] Documentation does not feel smooth between pages with a scroll bar and pages without. Created: 06/Aug/14  Updated: 26/Aug/14

Status: Open
Project: Couchbase Server
Component/s: doc-system
Affects Version/s: 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Minor
Reporter: Patrick Varley Assignee: Amy Kurtzman
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Operating System: MacOSX 64-bit
Is this a Regression?: Unknown

 Description   
This might be me nitpicking. When you flick between these two links, it does not feel smooth:

http://docs.couchbase.com/prebuilt/couchbase-manual-3.0/Features/features.html
http://docs.couchbase.com/prebuilt/couchbase-manual-3.0/Features/bcp.html

Which are sequence sections on the left hand bar; "Couchbase 3.0 Features" to "Database Change Protocol"

 Comments   
Comment by Amy Kurtzman [ 26/Aug/14 ]
Yes, it might be nitpicking. :-)

Can you clarify what the problem is? I'm not really sure what you mean. Are you commenting on the physical function of the page or the textual content of the page?

The last sentence is unclear:
"Which are sequence sections on the left hand bar; "Couchbase 3.0 Features" to "Database Change Protocol"
Is that a question?

Also please keep in mind that this section will be revised for the GA release.
Comment by Patrick Varley [ 26/Aug/14 ]
"""
The last sentence is unclear:
"Which are sequence sections on the left hand bar; "Couchbase 3.0 Features" to "Database Change Protocol"
Is that a question?
"""
It was more an indication that it might be noticed easily, as the chapters are beside each other.

 It looks like it only happens on my bigger display because the "features.html" page is longer and, as a result, Chrome puts a scroll bar in, which then causes everything to shift over to the left. I will record it when I'm in the office tomorrow.




[MB-12079] Cannot edit documents with textual or numeric data. Created: 27/Aug/14  Updated: 03/Sep/14

Status: Reopened
Project: Couchbase Server
Component/s: UI
Affects Version/s: 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Task Priority: Minor
Reporter: Brett Lawson Assignee: Aleksey Kondratenko
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified


 Description   
When attempting to create or modify a document in the Web UI whose value is a string or a number, errors occur that prevent you from saving the document.

 Comments   
Comment by Matt Ingenthron [ 27/Aug/14 ]
Related is MB-12078. Looks like the validator we're using in the console is wrong.
Comment by Aleksey Kondratenko [ 27/Aug/14 ]
It is not "validator is wrong". It's deliberate choice to refuse editing such values.
Comment by Aleksey Kondratenko [ 27/Aug/14 ]
See MB-9208
Comment by Matt Ingenthron [ 27/Aug/14 ]
It looks like things have changed for the better and the console is a bit out of sync. Now views do work with numbers such as "1" which may be incr/decr'd too. So, what is said in MB-9208 isn't valid any more. Brett verified this.

At this stage you can insert "1234", incr it, but you can't insert it through the console because it says "JSON should represent an object": http://puu.sh/b9H8i.png

Do you see a downside to syncing up the console to what views/view-engine actually do these days?

Comment by Aleksey Kondratenko [ 27/Aug/14 ]
Have you tried arrays ?
Comment by Matt Ingenthron [ 28/Aug/14 ]
I have not, but you certainly could. The use case we were looking at was the number with incr/decr. Strings should also be handled. From code inspection, it looks like the current JSON parser will handle these things correctly.
Comment by Aleksey Kondratenko [ 28/Aug/14 ]
Parsers ? Or parser?
Comment by Matt Ingenthron [ 28/Aug/14 ]
To my knowledge, as of 3.0 the memcached process does the "isJson()" identification and then in the view engine, that'll control whether meta.type is document or base64. So, I was referring to that parser in 3.0. Sorry for the lack of clarity. There could be other issues at the view-engine level or at the console level, but at least for numbers it seems to do what users want it to do at the view-engine, memcached, SDK level. The console doesn't allow insert/edit even though 1234 is technically JSON.
Comment by Aleksey Kondratenko [ 28/Aug/14 ]
3.0's ep-engine to my surprise indeed does full json detection (unlike 2.5 which AFAIK only allowed objects).

But AFAIK 3.0.0's view engine _does not_ use datatype inferred by memcached. Same is true for some obscure CAPI XDCR paths which need json-ness. For xdcr _I know_ that detection happens via couchdb routines. I'm sure there's no other way for views to do json detection.

BTW, this is one of the areas where some higher-level entity could finally decide something. And let me say that I'm not eager to let my team work on this until there's a clear resolution on json detection (from tag or from ep-engine folks or from anybody else).

Comment by Matt Ingenthron [ 28/Aug/14 ]
Siri, can you clarify? From one of our previous conversations I was under the impression that over DCP, the view engine is trusting the JSON detection done in the memcached process. Is that accurate as of 3.0?
Comment by Aleksey Kondratenko [ 28/Aug/14 ]
Haven't we made a decision to disable datatype in 3.0?
Comment by Matt Ingenthron [ 28/Aug/14 ]
Alk: agreed this should be a higher level thing. We ended up having to disable datatype at memcached HELLO because if it had gone in and a mixed version cluster had failed over, logical corruption would have occurred. That's one of the areas that probably should have had better review.

That said, I'm just in the "what do we do given the current state" kind of mode. The other changes are in and good-- I don't see any reason we'd avoid making the Web UI changes in a future release.
Comment by Sriram Melkote [ 01/Sep/14 ]
View engine switched to using DCP datatype in MB-11044.

There has been a lot of discussions, changes to turn off HELLO etc, but none of them have resulted in a direction to view engine to stop using datatype sent by DCP.

So it still uses datatype as transmitted by DCP.
Comment by Matt Ingenthron [ 02/Sep/14 ]
Correct, we evaluated the JSON detection in memcached from a performance perspective and it seemed okay, so the disabling of HELLO was determined to be good enough. This means the memcached process's "isJson()" is canonical at this stage.

Based on this thread, and based on the "what do we do now", I think the best thing is to add editing text and numbers to the console. Your thoughts?
Comment by Aleksey Kondratenko [ 02/Sep/14 ]
My thought (not sure if you referred to me or somebody else): I want to wait until this entire json-detection/datatype/common-flags story is final and settled.
Comment by Matt Ingenthron [ 02/Sep/14 ]
Yes, it was to you Alk, since it's assigned to you. I think it *is* settled for 3.0.x so I'd propose fixing the console in 3.0.1. Our users will be pretty happy with the improvement.

That's not to say we won't have changes in a new version though. There's a phrase you may have heard of before: "the only constant is change". :)
Comment by Aleksey Kondratenko [ 02/Sep/14 ]
I'm not sure that's the case. Do I have an API other than DCP to see the inferred datatype?
Comment by Aleksey Kondratenko [ 02/Sep/14 ]
Actually, if it is settled, then we may have a forward-compat issue. I.e. if we want to change datatype completely in the future, how do we handle mixed clusters?
Comment by Matt Ingenthron [ 03/Sep/14 ]
I think any "in future" kinds of questions will depend on what the change is from and to. There could be mixed cluster concerns that I've not thought about (I didn't make the change), but that's orthogonal to the console's ability to edit "1234", correct?




[MB-12161] per-server UI does not refresh properly when adding a node Created: 09/Sep/14  Updated: 09/Sep/14

Status: Open
Project: Couchbase Server
Component/s: UI
Affects Version/s: 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Minor
Reporter: Perry Krug Assignee: Aleksey Kondratenko
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
Admittedly quite minor, but a little annoying.

When you're looking at a single stat across all nodes of a cluster (e.g. active vbuckets):

-Add a new node to the cluster from another tab open to the UI
-Note that the currently open stats screen stops displaying graphs for the existing nodes and does not show that a new node has joined until you refresh the screen




[MB-12168] Documentation: Clarification around server RAM quota best practice Created: 10/Sep/14  Updated: 10/Sep/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Minor
Reporter: Brian Shumate Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified


 Description   
The sizing[1] and RAM quota[2] documentation should be clearer about the specific best practice for the general server RAM quota: no greater than 80% of physical RAM per node on nodes with 16GB or more, and no greater than 60% on nodes with less than 16GB.

Emphasizing that the 20% or 40% remainder of RAM is required for the operating system, file system caches, and so on would be helpful as well.

Additionally, the RAM quota sub-section of the Memory quota section[3] reads as if it is abruptly cut off or otherwise incomplete:

--------
RAM quota

You will not be able to allocate all your machine RAM to the per_node_ram_quota as there may be other programs running on your machine.
--------

1. http://docs.couchbase.com/couchbase-manual-2.5/cb-admin/#couchbase-bestpractice-sizing
2. http://docs.couchbase.com/couchbase-manual-2.5/cb-admin/#ram-quotas
3. http://docs.couchbase.com/couchbase-manual-2.5/cb-admin/#memory-quota






[MB-12171] Typo missing space on point 4 couchbase data files Created: 11/Sep/14  Updated: 11/Sep/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 1.8.0, 2.0.1, 2.1.0, 2.2.0
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Minor
Reporter: Patrick Varley Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: documentation
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: http://docs.couchbase.com/couchbase-manual-2.2/#couchbase-data-files
http://docs.couchbase.com/couchbase-manual-2.1/#couchbase-data-files
http://docs.couchbase.com/couchbase-manual-2.0/#couchbase-data-files
http://docs.couchbase.com/couchbase-manual-1.8/#couchbase-data-files

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
Point 4 needs a space between "and" and "monitor".

Start the service again andmonitor the “warmup” of the data.

 Comments   
Comment by Ruth Harris [ 11/Sep/14 ]
Fixed in 2.5. N/A in 3.0




[MB-12184] Enable logging to a remote server Created: 12/Sep/14  Updated: 12/Sep/14

Status: Open
Project: Couchbase Server
Component/s: None
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Improvement Priority: Minor
Reporter: James Mauss Assignee: Cihan Biyikoglu
Resolution: Unresolved Votes: 0
Labels: customer
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified


 Description   
It would be nice to be able to configure Couchbase Server to log events to a remote syslog-ng or similar server.




[MB-12190] Typo in the output of couchbase-cli bucket-flush Created: 15/Sep/14  Updated: 15/Sep/14

Status: Open
Project: Couchbase Server
Component/s: tools
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Minor
Reporter: Patrick Varley Assignee: Bin Cui
Resolution: Unresolved Votes: 0
Labels: cli
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
There should be a space between the full stop and Do.

[patrick:~] 2 $ couchbase-cli bucket-flush -b Test -c localhost
Running this command will totally PURGE database data from disk.Do you really want to do it? (Yes/No)

Another typo appears when the command times out:

Running this command will totally PURGE database data from disk.Do you really want to do it? (Yes/No)TIMED OUT: command: bucket-flush: localhost:8091, most likely bucket is not flushed





[MB-12203] Available-stats table formatted incorrectly Created: 17/Sep/14  Updated: 17/Sep/14

Status: Open
Project: Couchbase Server
Component/s: documentation
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Task Priority: Minor
Reporter: Patrick Varley Assignee: Ruth Harris
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: http://docs.couchbase.com/couchbase-manual-2.5/cb-cli/#available-stats


 Description   
See the pending_ops cell in the link below.

http://docs.couchbase.com/couchbase-manual-2.5/cb-cli/#available-stats

I believe "client connections blocked for operations in pending vbuckets" should all be in one cell.




[MB-12207] Related links could be clearer. Created: 17/Sep/14  Updated: 17/Sep/14

Status: Open
Project: Couchbase Server
Component/s: doc-system
Affects Version/s: 3.0-Beta
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Minor
Reporter: Patrick Varley Assignee: Amy Kurtzman
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Triage: Untriaged
Is this a Regression?: Unknown

 Description   
I think it would be better if the "Related links" section at the bottom of the page was laid out a little differently, and we added the ability to navigate (MB-12205) from the bottom of a page (think long pages).

Maybe something like this could work:

Links

Parent Topic:
    Installation and upgrade
Previous Topic:
    Welcome to couchbase
Next Topic:
    uninstalling couchbase
Related Topics:
    Initial server setup
    Testing Couchbase Server
    Upgrading




[MB-12202] UI shows a cbrestore as XDCR ops Created: 17/Sep/14  Updated: 17/Sep/14

Status: Open
Project: Couchbase Server
Component/s: ns_server
Affects Version/s: 2.5.1
Fix Version/s: None
Security Level: Public

Type: Bug Priority: Minor
Reporter: Ian McCloy Assignee: Aleksey Kondratenko
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment: [info] OS Name : Linux 3.13.0-30-generic
[info] OS Version : Ubuntu 14.04 LTS
[info] CB Version : 2.5.1-1083-rel-enterprise

Attachments: PNG File cbrestoreXDCRops.png    
Triage: Untriaged
Is this a Regression?: Unknown

 Description   
I noticed while doing a cbrestore of a backup on a cluster that doesn't have any XDCR configured that the stats in the UI showed ongoing ops for XDCR. (screenshot attached)

The stats code at
http://src.couchbase.org/source/xref/2.5.1/ns_server/src/stats_collector.erl#334 is counting all set-with-meta operations as XDCR ops.

 Comments   
Comment by Aleksey Kondratenko [ 17/Sep/14 ]
That's the way it is. We have no way to distinguish sources of set-with-metas.




Generated at Thu Sep 18 10:53:13 CDT 2014 using JIRA 5.2.4#845-sha1:c9f4cc41abe72fb236945343a1f485c2c844dac9.