Hi, we are looking at Couchbase as a replacement for an application built on top of MySQL. Our application is a voice broadcasting application where users have phonebooks, and each phonebook contains a lot of phone numbers. When a campaign is started, the contacts from that campaign's phonebook are copied into a queue system (another MySQL table we call callrequests), and a back-end process selects those numbers and dials them. Once a number is dialed, its status is set in the callrequests table, and so on.
As our data grows, it is becoming harder to manage MySQL performance, especially when we need custom stats from the callrequests table, CDRs, etc. We are talking about millions of callrequests and CDRs per day.
We tried the cbtransfer tool to import a CSV file of contacts into a phonebook. It works fine, but we can't see how to do something like MySQL's INSERT ... ON DUPLICATE KEY UPDATE. Basically we want to bulk import data into the buckets. Speed is important because each file contains around half a million contacts, one per line. A CAS round trip per record doesn't look like a good fit here; it would kill performance. In MySQL we do something like LOAD DATA LOCAL INFILE, which is fast.
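For context, the MySQL fast path we use today looks roughly like this (the file path, table, and column names are illustrative, not our real schema):

```sql
-- Illustrative only: bulk-load one contact per line into a phonebook table.
LOAD DATA LOCAL INFILE '/tmp/contacts.csv'
INTO TABLE phonebook_contacts
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(phone_number);
```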
My questions are:
i) How do we manage bulk imports? We use PHP and Lua for the backend.
ii) While doing bulk imports, is it possible to update only the records that already exist?
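To make question (ii) concrete: what we are after is the memcached-style distinction between add (insert-only), replace (update-only), and set (unconditional write), applied in bulk. The sketch below is only an illustration of those semantics with a plain dict standing in for a bucket; the actual method names in the Couchbase PHP/Lua SDKs may differ by version.

```python
# Semantics of the three key-value write modes, mapped to MySQL patterns:
#   add     -> insert only, fails if the key exists   (INSERT)
#   replace -> update only, fails if the key is new   (UPDATE)
#   set_    -> always writes                          (INSERT ... ON DUPLICATE KEY UPDATE)
# A dict stands in for the bucket; this is NOT the real SDK API.

def add(bucket, key, value):
    """Insert-only: succeed only when the key does not exist yet."""
    if key in bucket:
        return False
    bucket[key] = value
    return True

def replace(bucket, key, value):
    """Update-only: succeed only when the key already exists."""
    if key not in bucket:
        return False
    bucket[key] = value
    return True

def set_(bucket, key, value):
    """Unconditional upsert: always writes, regardless of existence."""
    bucket[key] = value
    return True

bucket = {}
add(bucket, "contact::100", {"number": "5551234"})      # inserted
replace(bucket, "contact::100", {"number": "5555678"})  # updated, key exists
replace(bucket, "contact::200", {"number": "5550000"})  # no-op, key missing
```

If the SDK exposes an update-only operation like replace, a bulk import that touches only existing records would loop over the file calling it and simply skip the failures, with no CAS needed.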