
OPERATION_TOO_LARGE during Refresh and Replicate

When running our daily refresh of a particular table, we receive an error indicating that we have exceeded 100,000 distinct IDs:

exec SF_Refresh 'SFDC', 'ProcessInstanceStep', 'yes', 'no'

--- Starting SF_Refresh for ProcessInstanceStep
13:56:15: Using Schema Error Action of yes
13:56:19: Using last run time of 2014-04-28 02:13:00
13:56:23: Identified 6754 updated/inserted rows.
13:56:23: Using alternate method to determine deleted records.
OLE DB provider "DBAmp.DBAmp" for linked server "SFDC" returned message "Error 1 : OPERATION_TOO_LARGE: exceeded 100000 distinct ids".
Msg 7320, Level 16, State 2, Line 1
Cannot execute the query "SELECT "Tbl1010"."Id" "Col1016" FROM "ProcessInstanceStep" "Tbl1010"" against OLE DB provider "DBAmp.DBAmp" for linked server "SFDC".
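
For reference, the statement in the error message amounts to a full scan of every Id in the table, which DBAmp issues as part of the "alternate method" deleted-record check. If it helps with diagnosis, the equivalent query can be run directly through the linked server; this is only a sketch, assuming the usual OPENQUERY pass-through syntax for DBAmp linked servers:

SELECT * FROM OPENQUERY(SFDC, 'SELECT Id FROM ProcessInstanceStep')

Presumably this would hit the same OPERATION_TOO_LARGE limit, since it is the full Id scan, not the 6754-row incremental pull, that Salesforce is rejecting.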

Following up on this, we attempted to replicate the table instead, but it failed with the same error:

exec SF_Replicate 'SFDC', 'ProcessInstanceStep'

--- Starting SF_Replicate for ProcessInstanceStep
14:05:27: Drop ProcessInstanceStep_Previous if it exists.
14:05:27: Create ProcessInstanceStep_Previous with new structure.
14:05:28: Run the DBAmp.exe program.
14:05:28: DBAmp Bulk Operations. V2.17.3 (c) Copyright 2006-2013 forceAmp.com LLC
14:05:28: Populating local table ProcessInstanceStep_Previous , SQLCHSADB / SFDC_Archive .
14:05:29: DBAmp is using the SQL Native Client.
14:05:29: Opening SQL Server rowset
14:05:31: Error: RunQuery failed with com_error.
14:05:31: OPERATION_TOO_LARGE: exceeded 100000 distinct ids
14:05:31: Error: DBAmp.exe was unsuccessful.
14:05:31: Error: Command string is C:\"Program Files"\DBAmp\DBAmp.exe Export "ProcessInstanceStep_Previous" "SQLCHSADB" "SFDC_Archive" "SFDC"
--- Ending SF_Replicate. Operation FAILED.

(0 row(s) affected)

(1 row(s) affected)
Msg 50000, Level 16, State 1, Procedure SF_Replicate, Line 244
--- Ending SF_Replicate. Operation FAILED.
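
To gauge how far over the 100,000-id limit this table actually is, a count through the linked server might help; this is only a sketch, assuming DBAmp passes the SOQL COUNT() through OPENQUERY, and the count itself may run long or hit limits on a very large org:

SELECT * FROM OPENQUERY(SFDC, 'SELECT COUNT() FROM ProcessInstanceStep')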

Is there a way to determine what is causing this error, and what steps can we take to work around it so the table can be refreshed as needed?
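
For instance, would pulling the table in date-bounded slices be a viable interim approach? Something along these lines, purely as a sketch (the field list and date boundaries here are placeholders, not a tested fix):

SELECT * FROM OPENQUERY(SFDC,
    'SELECT Id, SystemModstamp FROM ProcessInstanceStep
     WHERE CreatedDate >= 2014-01-01T00:00:00Z AND CreatedDate < 2014-04-01T00:00:00Z')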

Thanks in advance for any suggestions/solutions you can provide.