
SF_Replicate failure on Contact table with batchsize 100000: Exceeded max size limit of 20000

This error first occurred on DBAmp version 3.5.8 and persists after upgrading to 3.7.2. Our local Contact table currently contains around 500,000 records.

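For reference, the command we run is essentially the following (a reconstruction based on the Parameters line in the log below; SALESFORCE is our DBAmp linked server and SalesforceMSSQL is the local database):

USE SalesforceMSSQL
GO
-- Same object and batch size as the failing run shown in the log
EXEC SF_Replicate 'SALESFORCE', 'Contact', 'pkchunk,batchsize(100000)'
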
Full log output:

Msg 50000, Level 16, State 1, Server NARCISSUS\SALESFORCE, Procedure SF_Replicate, Line 370
--- Ending SF_Replicate. Operation FAILED.
--- Starting SF_Replicate for Contact V3.6.5
20:24:06: Parameters: SALESFORCE Contact pkchunk,batchsize(100000)
Version: V3.6.5
20:24:06: Drop Contact_Previous if it exists.
20:24:07: Create Contact_Previous with new structure.
20:24:08: DBAmpNet2 3.7.2.0 (c) Copyright 2015-2017 forceAmp.com LLC
20:24:08: Batch size reset to 100000 rows per batch.
20:24:08: Parameters: replicate Contact_Previous NARCISSUS\SALESFORCE SalesforceMSSQL SALESFORCE pkchunk,batchsize(100000)
20:24:12: Job 7502I000007GqYZQA0 created on salesforce.
20:24:13: Using the bulkapi with polling every 60 seconds
20:24:13: Job still running.
20:24:29: Job Complete.
20:24:29: Error: Batch 7512I00000AWWA7QAP failed.
20:24:29: Error: ClientInputError : Failed to read query. Exceeded max size limit of 20000 with response size 20001
20:24:29: Error: Could not get batch result Ids from Salesforce.

20:24:29: DBAmpNet2 Operation FAILED.
20:24:31: Error: Replicate program was unsuccessful.
20:24:31: Error: Command string is C:\"Program Files"\DBAmp\DBAmpNet2.exe Export "Replicate:pkchunk,batchsize(100000)" "Contact_Previous" "SERVER\SALESFORCE" "SalesforceMSSQL" "SALESFORCE"
--- Ending SF_Replicate. Operation FAILED.
1>