I would have loved to post this to the GitHub or JIRA bug tracker, but unfortunately the first is closed, and the second doesn’t offer a straightforward way to get an account … anyway.
I made my first steps with Couchbase today and tested inserting data chunks using upsert_multi. This quickly triggered a timeout. So far nothing special, but what I noticed is that once a timeout had been triggered, every subsequent upsert_multi ran more than an order of magnitude slower. This is the code to reproduce it:
import sys
import random
import gevent
from time import time

import couchbase
from gcouchbase.bucket import Bucket

print("python " + sys.version.replace('\n', ' '))
print("gevent " + gevent.__version__)
print("couchbase " + couchbase.__version__)
print("libcouchbase {} {}".format(*Bucket.lcb_version()))

c = Bucket('http://localhost/default')
d1 = dict(("a_" + str(x).rjust(12, '0'), dict(type="zzzz", val=random.random()))
          for x in range(200000))

for timeout in [10, 2, 100]:  # the 2 s run should hit the timeout
    print("Running with {} s timeout".format(timeout))
    c.timeout = timeout
    try:
        t = time(); c.upsert_multi(d1); print(time() - t)
    except Exception as e:
        print(e)
Which gives:
python 2.7.10 (default, Oct 14 2015, 16:09:02) [GCC 5.2.1 20151010]
gevent 1.0.2
couchbase 2.0.6.dev5+gfc1ddc8
libcouchbase 2.5.4 132356
Running with 10 s timeout
9.90698695183
Running with 2.5 s timeout
<Key=u'a_000000134899', RC=0x17[Client-Side timeout exceeded for operation. Inspect network conditions or increase the timeout], Operational Error, Results=300000, C Source=(src/multiresult.c,309)>
Running with 100 s timeout
<Key=u'a_000000134899', RC=0x17[Client-Side timeout exceeded for operation. Inspect network conditions or increase the timeout], Operational Error, Results=300000, C Source=(src/multiresult.c,309)>
As seen here, the statement that initially ran in under 10 s now even hits the 100 s timeout once the first timeout has occurred. I’m not sure whether it’s a deadlock or just horribly slow. I think I had one case where it eventually finished, after half an eternity and with less data.
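As a workaround sketch until this is sorted out, one could split the big dict into smaller batches so that each upsert_multi call stays well under the client timeout. The `chunked` helper below is my own illustration (not part of the SDK), and the actual Couchbase calls are commented out so the sketch stays self-contained:

```python
from itertools import islice

def chunked(mapping, size):
    """Yield successive dicts of at most `size` items from `mapping`."""
    it = iter(mapping.items())
    while True:
        batch = dict(islice(it, size))
        if not batch:
            break
        yield batch

# Hypothetical usage against the bucket from the report above
# (assumes the same `c` and `d1` objects):
# for batch in chunked(d1, 10000):
#     c.upsert_multi(batch)
```

Whether this avoids the post-timeout slowdown I can’t say; it merely keeps each individual operation small enough that the timeout should not fire in the first place.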