This is on a Couchbase 6.0.0 test system. We are trying to test performance with a subset of data.
We had 680 million items in the bucket with the ejection method set to Full. For performance we wanted all of the keys in memory, so we changed the bucket to Value-only ejection. I realized this might cause problems because the test system has limited RAM. When I made the change, Couchbase took the bucket down and brought it back up, which took a few hours.

When things came back up, there were only 280 million items, with memory at 38.5GB/41GB and 0% resident. I am assuming that the keys and metadata of those 280 million items filled the available memory: 38.5GB / 280 million items ≈ 148 bytes per item.

Are the other 400 million items gone? Or are they still there, just not counted because their keys could not fit into memory?
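For reference, here is the arithmetic behind my ~148 bytes/item figure, assuming the "38.5GB" reported by the UI is actually 38.5 GiB (binary gigabytes):

```python
# Sanity check: 38.5 GiB of memory spread over 280 million items
# comes out to roughly 148 bytes of key + metadata per item.
mem_used_bytes = 38.5 * 2**30   # 38.5 GiB
items = 280_000_000

bytes_per_item = mem_used_bytes / items
print(round(bytes_per_item))    # -> 148
```

If I understand the docs correctly, with value-only ejection the key plus a fixed per-item metadata overhead must stay resident, so ~148 bytes/item would imply keys of several dozen bytes on top of that overhead, which seems plausible for our data.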