
SF_BulkOps Bulk API Error column

I used SF_BulkOps with the Bulk API switch:

exec SF_BulkOps 'Upsert:BulkAPI,batchsize(10000),Parallel','SALESFORCE_SANDBOX','customer_login__C_INSERT','Customer_ID_c__c'

The Error column in the load table is being populated with an error message even for records that were successfully inserted into Salesforce.

Can you please advise?
  • Please post the complete message output of the command.

    Also, what values are you seeing in the Error column for those records?
  • This is the structure of my load table. I am seeing error messages for records that were successfully inserted into Salesforce. Also, the ID column gets populated for records that were supposed to fail, and those IDs point to records that were successfully inserted into Salesforce.

    CREATE TABLE [dbo].[customer_login__C_INSERT](
    [ID] [nchar](18) NULL,
    [Error] [nvarchar](255) NULL,
    [Customer_ID_c__c] [int] NULL,
    [Name] [varchar](200) NULL,
    [Primary_Contact__c] [varchar](18) NULL,
    [sort] [nchar](18) NULL
    ) ON [PRIMARY]

    --- Starting SF_BulkOps for customer_login__C_INSERT
    21:12:53: Run the DBAmp.exe program.
    21:12:53: DBAmp Bulk Operations. V2.20.5 (c) Copyright 2006-2015 forceAmp.com LLC
    21:12:53: Upserting Salesforce using customer_login__C_INSERT (LAIW2K8DB266D / Salesforce_SIP) .
    21:12:54: DBAmp is using the SQL Native Client.
    21:12:54: Batch size reset to 10000 rows per batch.
    21:12:55: Sort column will be used to order input rows.
    21:12:55: SF_Bulkops will poll every 60 seconds for up to 3600 seconds.
    21:12:56: Job 750Z0000000xIvwIAE created.
    21:12:57: Batch 751Z0000001RRudIAG created with 10000 rows.
    21:12:57: Batch 751Z0000001RRuiIAG created with 10000 rows.
    21:12:58: Batch 751Z0000001RRunIAG created with 10000 rows.
    21:12:59: Batch 751Z0000001RRusIAG created with 10000 rows.
    21:12:59: Batch 751Z0000001RRuxIAG created with 10000 rows.
    21:13:00: Batch 751Z0000001RRv2IAG created with 10000 rows.
    21:13:01: Batch 751Z0000001RRv7IAG created with 10000 rows.
    21:13:01: Batch 751Z0000001RRvCIAW created with 10000 rows.
    21:13:02: Batch 751Z0000001RRvHIAW created with 10000 rows.
    21:13:03: Batch 751Z0000001RRvMIAW created with 10000 rows.
    21:13:03: Batch 751Z0000001RRvRIAW created with 10000 rows.
    21:13:04: Batch 751Z0000001RRvWIAW created with 10000 rows.
    21:13:05: Batch 751Z0000001RRvbIAG created with 10000 rows.
    21:13:06: Batch 751Z0000001RRvgIAG created with 10000 rows.
    21:13:06: Batch 751Z0000001RRuyIAG created with 10000 rows.
    21:13:07: Batch 751Z0000001RRvlIAG created with 10000 rows.
    21:13:08: Batch 751Z0000001RRvIIAW created with 10000 rows.
    21:13:08: Batch 751Z0000001RRvqIAG created with 10000 rows.
    21:13:09: Batch 751Z0000001RRvvIAG created with 10000 rows.
    21:13:10: Batch 751Z0000001RRuoIAG created with 10000 rows.
    21:13:11: Batch 751Z0000001RRw0IAG created with 10000 rows.
    21:13:12: Batch 751Z0000001RRw5IAG created with 10000 rows.
    21:13:12: Batch 751Z0000001RRv3IAG created with 10000 rows.
    21:13:13: Batch 751Z0000001RRwAIAW created with 10000 rows.
    21:13:14: Batch 751Z0000001RRwFIAW created with 10000 rows.
    21:13:15: Batch 751Z0000001RRwKIAW created with 10000 rows.
    21:13:18: Batch 751Z0000001RRwPIAW created with 10000 rows.
    21:13:19: Batch 751Z0000001RRwUIAW created with 10000 rows.
    21:13:20: Batch 751Z0000001RRwZIAW created with 10000 rows.
    21:13:20: Batch 751Z0000001RRweIAG created with 10000 rows.
    21:13:21: Batch 751Z0000001RRuzIAG created with 10000 rows.
    21:13:22: Batch 751Z0000001RRwjIAG created with 10000 rows.
    21:13:23: Batch 751Z0000001RRueIAG created with 10000 rows.
    21:13:25: Batch 751Z0000001RRwoIAG created with 10000 rows.
    21:13:25: Batch 751Z0000001RRwtIAG created with 10000 rows.
    21:13:26: Batch 751Z0000001RRwyIAG created with 10000 rows.
    21:13:27: Batch 751Z0000001RRx3IAG created with 10000 rows.
    21:13:27: Batch 751Z0000001RRx8IAG created with 10000 rows.
    21:13:28: Batch 751Z0000001RRv4IAG created with 10000 rows.
    21:13:29: Batch 751Z0000001RRufIAG created with 10000 rows.
    21:13:29: Batch 751Z0000001RRwpIAG created with 10000 rows.
    21:13:30: Batch 751Z0000001RRxDIAW created with 10000 rows.
    21:13:31: Batch 751Z0000001RRxIIAW created with 10000 rows.
    21:13:32: Batch 751Z0000001RRxNIAW created with 10000 rows.
    21:13:32: Batch 751Z0000001RRx4IAG created with 10000 rows.
    21:13:33: Batch 751Z0000001RRxSIAW created with 10000 rows.
    21:13:34: Batch 751Z0000001RRxXIAW created with 10000 rows.
    21:13:35: Batch 751Z0000001RRvmIAG created with 10000 rows.
    21:13:35: Batch 751Z0000001RRxcIAG created with 10000 rows.
    21:13:36: Batch 751Z0000001RRxJIAW created with 10000 rows.
    21:13:36: Batch 751Z0000001RRxOIAW created with 10000 rows.
    21:13:37: Batch 751Z0000001RRxhIAG created with 10000 rows.
    21:13:38: Batch 751Z0000001RRugIAG created with 10000 rows.
    21:13:40: Batch 751Z0000001RRxKIAW created with 10000 rows.
    21:13:41: Batch 751Z0000001RRxmIAG created with 10000 rows.
    21:13:42: Batch 751Z0000001RRxrIAG created with 10000 rows.
    21:13:42: Batch 751Z0000001RRxwIAG created with 10000 rows.
    21:13:43: Batch 751Z0000001RRy1IAG created with 10000 rows.
    21:13:44: Batch 751Z0000001RRvSIAW created with 10000 rows.
    21:13:46: Batch 751Z0000001RRy6IAG created with 10000 rows.
    21:13:47: Batch 751Z0000001RRyBIAW created with 10000 rows.
    21:13:47: Batch 751Z0000001RRyGIAW created with 10000 rows.
    21:13:48: Batch 751Z0000001RRyLIAW created with 10000 rows.
    21:13:49: Batch 751Z0000001RRxdIAG created with 10000 rows.
    21:13:49: Batch 751Z0000001RRyQIAW created with 10000 rows.
    21:13:50: Batch 751Z0000001RRyVIAW created with 10000 rows.
    21:13:51: Batch 751Z0000001RRyaIAG created with 10000 rows.
    21:13:52: Batch 751Z0000001RRyfIAG created with 10000 rows.
    21:13:52: Batch 751Z0000001RRwaIAG created with 10000 rows.
    21:13:53: Batch 751Z0000001RRvhIAG created with 10000 rows.
    21:13:54: Batch 751Z0000001RRw6IAG created with 10000 rows.
    21:13:54: Batch 751Z0000001RRykIAG created with 10000 rows.
    21:13:55: Batch 751Z0000001RRv8IAG created with 10000 rows.
    21:13:56: Batch 751Z0000001RRypIAG created with 10000 rows.
    21:13:57: Batch 751Z0000001RRy2IAG created with 10000 rows.
    21:13:58: Batch 751Z0000001RRyuIAG created with 10000 rows.
    21:13:58: Batch 751Z0000001RRyzIAG created with 10000 rows.
    21:13:59: Batch 751Z0000001RRwVIAW created with 10000 rows.
    21:14:00: Batch 751Z0000001RRz4IAG created with 10000 rows.
    21:14:01: Batch 751Z0000001RRxTIAW created with 10000 rows.
    21:14:01: Batch 751Z0000001RRz9IAG created with 10000 rows.
    21:14:02: Batch 751Z0000001RRv9IAG created with 10000 rows.
    21:14:03: Batch 751Z0000001RRy3IAG created with 10000 rows.
    21:14:04: Batch 751Z0000001RRzEIAW created with 10000 rows.
    21:14:05: Batch 751Z0000001RRzJIAW created with 10000 rows.
    21:14:05: Batch 751Z0000001RRzOIAW created with 10000 rows.
    21:14:06: Batch 751Z0000001RRzTIAW created with 10000 rows.
    21:14:07: Batch 751Z0000001RRwfIAG created with 10000 rows.
    21:14:08: Batch 751Z0000001RRzYIAW created with 10000 rows.
    21:14:08: Batch 751Z0000001RRzdIAG created with 10000 rows.
    21:14:09: Batch 751Z0000001RRuhIAG created with 10000 rows.
    21:14:10: Batch 751Z0000001RRyWIAW created with 10000 rows.
    21:14:10: Batch 751Z0000001RRziIAG created with 10000 rows.
    21:14:11: Batch 751Z0000001RRznIAG created with 10000 rows.
    21:14:11: Batch 751Z0000001RRzsIAG created with 10000 rows.
    21:14:14: Batch 751Z0000001RRzxIAG created with 10000 rows.
    21:14:15: Batch 751Z0000001RS02IAG created with 10000 rows.
    21:14:16: Batch 751Z0000001RS07IAG created with 10000 rows.
    21:14:17: Batch 751Z0000001RS0CIAW created with 10000 rows.
    21:14:17: Batch 751Z0000001RRwuIAG created with 10000 rows.
    21:14:18: Batch 751Z0000001RS0HIAW created with 10000 rows.
    21:14:20: Batch 751Z0000001RS0MIAW created with 10000 rows.
    21:14:21: Batch 751Z0000001RRxiIAG created with 10000 rows.
    21:14:22: Batch 751Z0000001RRyHIAW created with 10000 rows.
    21:14:22: Batch 751Z0000001RS0RIAW created with 10000 rows.
    21:14:23: Batch 751Z0000001RS0WIAW created with 10000 rows.
    21:14:24: Batch 751Z0000001RS0DIAW created with 10000 rows.
    21:14:24: Batch 751Z0000001RS0bIAG created with 10000 rows.
    21:14:25: Batch 751Z0000001RS0gIAG created with 10000 rows.
    21:14:26: Batch 751Z0000001RS0lIAG created with 10000 rows.
    21:14:27: Batch 751Z0000001RS0qIAG created with 10000 rows.
    21:14:28: Batch 751Z0000001RRztIAG created with 10000 rows.
    21:14:29: Batch 751Z0000001RS03IAG created with 10000 rows.
    21:14:30: Batch 751Z0000001RS0vIAG created with 10000 rows.
    21:14:31: Batch 751Z0000001RS10IAG created with 10000 rows.
    21:14:31: Batch 751Z0000001RS15IAG created with 10000 rows.
    21:14:32: Batch 751Z0000001RS1AIAW created with 10000 rows.
    21:14:33: Batch 751Z0000001RS1FIAW created with 10000 rows.
    21:14:34: Batch 751Z0000001RS0mIAG created with 10000 rows.
    21:14:34: Batch 751Z0000001RS0rIAG created with 10000 rows.
    21:14:35: Batch 751Z0000001RS1KIAW created with 10000 rows.
    21:14:36: Batch 751Z0000001RRxjIAG created with 10000 rows.
    21:14:36: Batch 751Z0000001RS04IAG created with 10000 rows.
    21:14:37: Batch 751Z0000001RS0wIAG created with 10000 rows.
    21:14:38: Batch 751Z0000001RS11IAG created with 10000 rows.
    21:14:39: Batch 751Z0000001RS1PIAW created with 10000 rows.
    21:14:40: Batch 751Z0000001RRyvIAG created with 10000 rows.
    21:14:40: Batch 751Z0000001RRxnIAG created with 10000 rows.
    21:14:41: Batch 751Z0000001RS1UIAW created with 10000 rows.
    21:14:42: Batch 751Z0000001RS1ZIAW created with 10000 rows.
    21:14:42: Batch 751Z0000001RS1eIAG created with 10000 rows.
    21:14:43: Batch 751Z0000001RS1jIAG created with 10000 rows.
    21:14:43: Batch 751Z0000001RS0XIAW created with 10000 rows.
    21:14:44: Batch 751Z0000001RS0EIAW created with 10000 rows.
    21:14:45: Batch 751Z0000001RS1BIAW created with 10000 rows.
    21:14:46: Batch 751Z0000001RS1oIAG created with 10000 rows.
    21:14:47: Batch 751Z0000001RS0nIAG created with 10000 rows.
    21:14:47: Batch 751Z0000001RRvXIAW created with 10000 rows.
    21:14:48: Batch 751Z0000001RS1tIAG created with 10000 rows.
    21:14:49: Batch 751Z0000001RRxkIAG created with 10000 rows.
    21:14:49: Batch 751Z0000001RS08IAG created with 10000 rows.
    21:14:50: Batch 751Z0000001RS1yIAG created with 10000 rows.
    21:14:51: Batch 751Z0000001RS16IAG created with 10000 rows.
    21:14:53: Batch 751Z0000001RRy4IAG created with 10000 rows.
    21:14:54: Batch 751Z0000001RRwQIAW created with 10000 rows.
    21:14:54: Batch 751Z0000001RRzPIAW created with 10000 rows.
    21:14:55: Batch 751Z0000001RS23IAG created with 10000 rows.
    21:14:56: Batch 751Z0000001RS05IAG created with 10000 rows.
    21:14:57: Batch 751Z0000001RS1kIAG created with 10000 rows.
    21:14:57: Batch 751Z0000001RS1zIAG created with 10000 rows.
    21:14:58: Batch 751Z0000001RS28IAG created with 10000 rows.
    21:14:59: Batch 751Z0000001RS2DIAW created with 10000 rows.
    21:14:59: Batch 751Z0000001RS2IIAW created with 10000 rows.
    21:15:01: Batch 751Z0000001RS2NIAW created with 10000 rows.
    21:15:02: Batch 751Z0000001RS2SIAW created with 10000 rows.
    21:15:02: Batch 751Z0000001RS2XIAW created with 10000 rows.
    21:15:03: Batch 751Z0000001RRyRIAW created with 10000 rows.
    21:15:04: Batch 751Z0000001RS2cIAG created with 10000 rows.
    21:15:05: Batch 751Z0000001RS2hIAG created with 10000 rows.
    21:15:05: Batch 751Z0000001RS2mIAG created with 10000 rows.
    21:15:06: Batch 751Z0000001RRzyIAG created with 10000 rows.
    21:15:07: Batch 751Z0000001RS2rIAG created with 10000 rows.
    21:15:08: Batch 751Z0000001RS2wIAG created with 10000 rows.
    21:15:09: Batch 751Z0000001RS31IAG created with 10000 rows.
    21:15:10: Batch 751Z0000001RS36IAG created with 10000 rows.
    21:15:10: Batch 751Z0000001RRzjIAG created with 10000 rows.
    21:15:11: Batch 751Z0000001RRwRIAW created with 10000 rows.
    21:15:12: Batch 751Z0000001RS3BIAW created with 10000 rows.
    21:15:13: Batch 751Z0000001RRzzIAG created with 10000 rows.
    21:15:14: Batch 751Z0000001RS3GIAW created with 10000 rows.
    21:15:15: Batch 751Z0000001RS3LIAW created with 10000 rows.
    21:15:16: Batch 751Z0000001RS29IAG created with 10000 rows.
    21:15:17: Batch 751Z0000001RS3QIAW created with 10000 rows.
    21:15:18: Batch 751Z0000001RS3VIAW created with 10000 rows.
    21:15:19: Batch 751Z0000001RS3aIAG created with 10000 rows.
    21:15:20: Batch 751Z0000001RS3fIAG created with 10000 rows.
    21:15:21: Batch 751Z0000001RS3kIAG created with 10000 rows.
    21:15:22: Batch 751Z0000001RS3pIAG created with 10000 rows.
    21:15:22: Batch 751Z0000001RRy5IAG created with 10000 rows.
    21:15:23: Batch 751Z0000001RS2iIAG created with 10000 rows.
    21:15:27: Batch 751Z0000001RS2nIAG created with 10000 rows.
    21:15:28: Batch 751Z0000001RS3uIAG created with 10000 rows.
    21:15:29: Batch 751Z0000001RS2jIAG created with 10000 rows.
    21:15:30: Batch 751Z0000001RS2oIAG created with 10000 rows.
    21:15:31: Batch 751Z0000001RS0SIAW created with 10000 rows.
    21:15:32: Batch 751Z0000001RS3zIAG created with 10000 rows.
    21:15:33: Batch 751Z0000001RS44IAG created with 10000 rows.
    21:15:39: Batch 751Z0000001RRwSIAW created with 10000 rows.
    21:15:39: Batch 751Z0000001RS49IAG created with 10000 rows.
    21:15:42: Batch 751Z0000001RS4EIAW created with 10000 rows.
    21:15:45: Batch 751Z0000001RS4JIAW created with 10000 rows.
    21:15:46: Batch 751Z0000001RS4OIAW created with 10000 rows.
    21:15:47: Batch 751Z0000001RS4TIAW created with 10000 rows.
    21:15:48: Batch 751Z0000001RS4YIAW created with 10000 rows.
    21:15:48: Batch 751Z0000001RS4dIAG created with 10000 rows.
    21:15:49: Batch 751Z0000001RS20IAG created with 10000 rows.
    21:15:50: Batch 751Z0000001RS40IAG created with 10000 rows.
    21:15:51: Batch 751Z0000001RS4iIAG created with 10000 rows.
    21:15:52: Batch 751Z0000001RS4nIAG created with 10000 rows.
    21:15:53: Batch 751Z0000001RS4UIAW created with 10000 rows.
    21:15:53: Job submitted.
    21:15:53: 2000000 rows read from SQL Table.
    21:15:53: Job still running.
    21:16:08: Job Complete.
    21:16:09: DBAmp is using the SQL Native Client.
    21:17:18: 1999797 rows successfully processed.
    21:17:18: 203 rows failed.
    21:17:18: Errors occurred. See Error column of row and above messages for more information.
    21:17:18: Error: DBAmp.exe was unsuccessful.
    21:17:18: Error: Command string is C:\"Program Files"\DBAmp\DBAmp.exe upsert:bulkapi,batchsize(10000),parallel customer_login__C_INSERT "LAIW2K8DB266D" "Salesforce_SIP" "SALESFORCE_SANDBOX" "Customer_ID_c__c" " "
    --- Ending SF_BulkOps. Operation FAILED.
    Msg 50000, Level 16, State 1, Procedure SF_BulkOps, Line 135
    SF_BulkOps Error: 21:12:53: DBAmp Bulk Operations. V2.20.5 (c) Copyright 2006-2015 forceAmp.com LLC [... the raised error repeats the batch-creation log shown above; truncated here ...]
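A quick way to quantify the mismatch described in this post is to cross-check the Error column against the ID column: a row whose Error text reports a failure should not have a Salesforce ID. Here is a minimal sketch of that check, using Python's sqlite3 as a stand-in for the SQL Server load table (the column names follow the post; the sample rows are invented):

```python
import sqlite3

# In-memory stand-in for the customer_login__C_INSERT load table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer_login__C_INSERT (
        ID TEXT, Error TEXT, Customer_ID_c__c INTEGER, Name TEXT
    )
""")
conn.executemany(
    "INSERT INTO customer_login__C_INSERT VALUES (?, ?, ?, ?)",
    [
        # Consistent row: succeeded and has an ID.
        ("0013B000001aaaa", "BulkAPI:upsert:750:751:1:Operation Successful.", 1, "a@example.com"),
        # Suspect row: error text, but an ID was written anyway.
        ("0013B000001bbbb", "BulkAPI:upsert:750:751:2:Error - STRING_TOO_LONG", 2, "b@example.com"),
        # Consistent row: failed and has no ID.
        (None, "BulkAPI:upsert:750:751:3:Error - STRING_TOO_LONG", 3, "c@example.com"),
    ],
)

# Rows where the Error text and the ID column disagree.
suspect = conn.execute("""
    SELECT Customer_ID_c__c, Error
    FROM customer_login__C_INSERT
    WHERE Error NOT LIKE '%Operation Successful%'
      AND ID IS NOT NULL
""").fetchall()
print(len(suspect))  # number of inconsistent rows
```

Running the equivalent query against the real load table would list the rows whose results appear to have been written back against the wrong records.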
  • For example, for the record below, the Error column is populated with the error message shown, even though the record was inserted into Salesforce.

    Customer_ID_c__c Name
    973779 billyegraham@mac.com

    BulkAPI:upsert:750Z0000000xIxxIAE:751Z0000001RSmlIAG:7961:Error - STRING_TOO_LONG:Customer Login/Email: data value too large: gudi@ussaacademy.com, gudi@ussaacademy.com, gudi@ussaacademy.com, gudi@ussaacademy.com, gudi@ussaaca (max length=80):Name --

    This is a record that was supposed to fail, but I see the ID column populated with a Salesforce ID and a success message in the Error column:
    Customer_ID_c__c Name
    1774182 gudi@ussaacademy.com, gudi@ussaacademy.com, gudi@ussaacademy.com, gudi@ussaacademy.com, gudi@ussaaca

    Error
    BulkAPI:upsert:750Z0000000xIxxIAE:751Z0000001RSo8IAG:3671:Operation Successful.
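The Error values quoted above appear to embed the job ID, batch ID, and row number in fixed positions (`BulkAPI:<operation>:<job>:<batch>:<row>:<message>`); this format is inferred from the examples in this thread, not from documentation. Parsing it out makes it possible to see which batch and row a message actually belongs to, which helps when results look misaligned:

```python
import re

# Inferred message layout: BulkAPI:<operation>:<job id>:<batch id>:<row number>:<message>
pattern = re.compile(
    r"^BulkAPI:(?P<op>[^:]+):(?P<job>[^:]+):(?P<batch>[^:]+):(?P<row>\d+):(?P<msg>.*)$"
)

err = ("BulkAPI:upsert:750Z0000000xIxxIAE:751Z0000001RSo8IAG:3671:"
       "Operation Successful.")
m = pattern.match(err)
print(m.group("batch"), m.group("row"))  # which batch/row the message came from
```

Comparing the batch ID and row number embedded in each Error value against the load-table row it landed on would show whether the results were written back against the wrong rows.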
  • Hi,

    It happens only when I add a Sort column to the load table.
  • I am placing [Primary_Contact__c] in the Sort column, and it can be nullable. Not all records in customer_login__C_INSERT have a value for Primary_Contact__c.

    I even tried making the Sort column an identity column, IDENTITY(1,1), but I still see the same issue.
  • We are trying to reproduce this issue without much success.

    Do you see the issue occurring with smaller table sizes?
  • Yes, you are right: it doesn't happen with smaller tables of, say, 10,000 records.

    It happens only with larger tables, like 100,000 records.
  • I can walk you through the issue. Please let me know if you would like to set up a meeting.
  • We have tried some large datasets but have been unable to reproduce the issue.
  • Hello Bill, we are seeing the same issue described above.
    We used a parallel bulk load with a batch size of 10K. There is no master-detail relationship on the object; however, there is a unique external ID field, and we have a Sort column in the load table.

    We got about 4.5K failures out of 500K records. All of the failed records were actually loaded into SF but came back with a duplicate external ID error.

    I wonder whether this is caused by the Sort column or is an SF limitation?

    Thanks for your help.

    Rick
  • Couple of questions:

    1. What version of DBAmp are you currently running?
    2. What is the exact command you are using?
    3. What are the exact error messages in the Error column of the rows that failed?
    4. Are all of the failed records failing with the same error message in the error column?
  • Also, one more: what is the column definition of the Sort column?
  • 1. Version 3.1.7
    2. EXEC SF_BulkOps 'Insert:bulkapi,batchsize(10000),parallel','sf','table'
    3. BulkAPI:insert:7503B0000005zYJQAY:7513B000000MMJ9QAO:2362:Error - DUPLICATE_VALUE:duplicate value found: EXT_Source_System_Customer_ID__c duplicates value on record with id: a1C3B0000001WTE:--
    4. yes
    5. int IDENTITY (1,1)

    Note that the errored-out records were actually loaded into SF; there is no duplicate value in that field.

    Thanks, Justin!
  • On the rows that are failing, Salesforce is saying that you cannot insert the row because a row already exists in Salesforce with the same value in the EXT_Source_System_Customer_ID__c column. Salesforce will not allow you to insert the row because of this duplication.

    If you want to update the existing row instead, then consider using an Upsert instead of an Insert.
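The Insert-versus-Upsert distinction can be sketched with a toy model keyed on an external ID (illustrative Python only; the field names and record values are invented, and this is not DBAmp or Salesforce code):

```python
# Toy model of Insert vs. Upsert when an external-ID field carries a
# uniqueness constraint.
org = {"a1C3B0000001WTE": {"EXT_ID": "C-100", "Name": "old name"}}

def insert(org, rec):
    """Insert fails if any existing record shares the external ID."""
    if any(r["EXT_ID"] == rec["EXT_ID"] for r in org.values()):
        return "Error - DUPLICATE_VALUE"
    org[f"new-{len(org)}"] = rec
    return "Operation Successful"

def upsert(org, rec):
    """Upsert updates the matching record instead of failing."""
    for rid, r in org.items():
        if r["EXT_ID"] == rec["EXT_ID"]:
            org[rid] = rec
            return "Operation Successful (updated)"
    org[f"new-{len(org)}"] = rec
    return "Operation Successful (inserted)"

rec = {"EXT_ID": "C-100", "Name": "new name"}
print(insert(dict(org), rec))  # Error - DUPLICATE_VALUE
print(upsert(dict(org), rec))  # Operation Successful (updated)
```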
  • Yes, that's what I am trying to understand: there is no duplicate value in the EXT_Source_System_Customer_ID__c field in my load table, so why is Salesforce reporting this duplicate error? Is it because I am using parallel mode? Is it because of the Sort column? Or is it a known issue in DBAmp that requires upgrading to the newest version?
  • Salesforce is not saying there is a duplicate record within your load table. It is saying that a record already exists in Salesforce with the same value as your load table record. That existing record is: a1C3B0000001WTE

    We could take a look at this via web meeting this afternoon at 1pm CST. Send an email to support at forceamp.com to confirm.
  • Did anyone figure out a resolution for this? I am attempting to use SF_TableLoader with 3.6.2:

    W (39): 11:18:16: 979241 rows successfully processed.
    W (40): 11:18:52: Error: Unable to update Error column in Account_UATRefresh_UPDATE_Result
    W (41): 11:18:52: System.Data.SqlClient.SqlException (0x80131904): Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding. ---> System.ComponentModel.Win32Exception (0x80004005): The wait operat
    W (42): ion timed out
    W (43): at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
    W (44): at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
    W (45): at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)
    W (46): at System.Data.SqlClient.TdsParserStateObject.ReadSniSyncOverAsync()
    W (47): at System.Data.SqlClient.TdsParserStateObject.TryReadNetworkPacket()
    W (48): at System.Data.SqlClient.TdsParserStateObject.TryPrepareBuffer()
    W (49): at System.Data.SqlClient.TdsParserStateObject.TryReadByte(Byte& value)
    W (50): at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
    W (51): at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)
    W (52): at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParamet
    W (53): erEncryptionRequest)
    W (54): at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolea
    W (55): n inRetry)
    W (56): at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(TaskCompletionSource`1 completion, String methodName, Boolean sendToPipe, Int32 timeout, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)
    W (57): at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
    W (58): at DBAmpNet2.BulkOpsJobSubmitProcessor.GetSuccessfulResults()
    W (59): ClientConnectionId:b3e56a88-66e4-4d85-96c6-77a9d4263583
    W (60): Error Number:-2,State:0,Class:11
    W (61): 11:18:52: Error: System.InvalidOperationException: Invalid operation. The connection is closed.
    W (62): at System.Data.ProviderBase.DbConnectionClosed.GetSchema(DbConnectionFactory factory, DbConnectionPoolGroup poolGroup, DbConnection outerConnection, String collectionName, String[] restrictions)
    W (63): at System.Data.SqlClient.SqlConnection.GetSchema(String collectionName, String[] restrictionValues)
    W (64): at DBAmpNet2.BulkOpsJobSubmitProcessor.GetFailedResults()
    W (65): at DBAmpNet2.Program.HandleBulkOpsBulk(String currentOperation, Options currentOptions, DBAmpRegistry currentDBAmpRegistry)
    W (66): at DBAmpNet2.Program.Main(String[] args)
    W (67): 11:18:52: DBAmpNet2 Operation FAILED.
    W (68): 11:18:53: Allowed Failure Percent = 20.
    W (69): 11:19:02: Percent Failed = 100.000.
    W (70): 11:19:02: Error: DBAmpNet2.exe was unsuccessful.
  • Could you send a CREATE TABLE script for the Account_UATRefresh_UPDATE_Result table?

    Also, is it possible that this table was in use or locked at the time of the timeout because the same command was running in another job?
  • This happened with ALL of the SF_TableLoader commands I ran. We are refreshing our sandbox by masking PII data.

    Here is the script that creates and alters the table, followed by the execution of SF_TableLoader (updated to use drive E: instead of C:):

    SELECT
    ID
    ,CAST(NULL AS nvarchar(1024)) AS Error
    ,CASE WHEN PersonEmail IS NULL THEN NULL ELSE 'noemail@domain.com.uat' END AS PersonEmail
    ,CASE WHEN Phone IS NULL THEN NULL ELSE '407 123 1234' END AS Phone
    ,CASE WHEN AddressStreet__pc IS NULL THEN NULL ELSE '123 Main St' END AS AddressStreet__pc
    ,CASE WHEN WorkflowEmailAlert__c IS NULL THEN NULL ELSE 'noemail@domain.com.uat' END AS WorkflowEmailAlert__c
    INTO Account_UATRefresh_UPDATE
    FROM
    SalesForceDataMart..Account
    WHERE 0=0
    go
    --Add Sort column
    ALTER TABLE Account_UATRefresh_UPDATE
    Add [Sort] int identity (1,1)
    go

    --New SP; ignore errors up to 20%
    exec SF_TableLoader 'UPDATE:IgnoreFailures(20)','SALESFORCE_FULL','Account_UATRefresh_UPDATE'
    go

    Otherwise, SF_TableLoader did perform MUCH faster, as advertised; I just could not get the errors returned for the failures.

    I was also updating 13 million leads and had to do a million at a time due to "Uploaded batch file size greater than max size. Org max upload file size is 150000000 bytes".
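Splitting the 13 million rows into fixed-size chunks can be sketched as follows (a hypothetical helper, assuming an integer IDENTITY Sort column numbered from 1; this is a workaround I am describing, not a DBAmp feature):

```python
# Sketch of splitting a large load into <= 1,000,000-row chunks to stay
# under the 150,000,000-byte org upload limit quoted in the error message.
def chunk_ranges(total_rows, chunk_size=1_000_000):
    """Yield (first_sort, last_sort) ranges, 1-based and inclusive."""
    for start in range(1, total_rows + 1, chunk_size):
        yield start, min(start + chunk_size - 1, total_rows)

ranges = list(chunk_ranges(13_000_000))
print(len(ranges))           # 13 chunks for 13 million rows
print(ranges[0], ranges[-1]) # (1, 1000000) (12000001, 13000000)
```

Each (first, last) pair could drive a `WHERE Sort BETWEEN @first AND @last` filter into a per-chunk load table before each SF_TableLoader call.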

  • Try removing the Error column from the SELECT that creates your load table. SF_TableLoader will create an Error column automatically.

    Also, the 150 MB limit is a Salesforce restriction. You can file a case with them and ask them to increase the Bulk API 2.0 limit to 2 GB.
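    Applying that first suggestion to the earlier script, the load table would be built without the Error column, since SF_TableLoader adds one to the Result table itself. A sketch using the same masking logic as above:

    ```sql
    -- Same masking SELECT as the earlier script, but with the
    -- CAST(NULL ...) AS Error line removed; SF_TableLoader adds
    -- an Error column to Account_UATRefresh_UPDATE_Result itself.
    SELECT
        ID
        ,CASE WHEN PersonEmail IS NULL THEN NULL ELSE 'noemail@domain.com.uat' END AS PersonEmail
        ,CASE WHEN Phone IS NULL THEN NULL ELSE '407 123 1234' END AS Phone
        ,CASE WHEN AddressStreet__pc IS NULL THEN NULL ELSE '123 Main St' END AS AddressStreet__pc
        ,CASE WHEN WorkflowEmailAlert__c IS NULL THEN NULL ELSE 'noemail@domain.com.uat' END AS WorkflowEmailAlert__c
    INTO Account_UATRefresh_UPDATE
    FROM SalesForceDataMart..Account
    GO

    --Add Sort column
    ALTER TABLE Account_UATRefresh_UPDATE
    ADD [Sort] int identity (1,1)
    GO
    ```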

  • Tried removing the Error column; the same errors occurred:
    Command was executed successfully

    Warnings: --->
    W (1): --- Starting SF_TableLoader for Account_UATRefresh_UPDATE V3.6.2
    W (2): 19:17:03: Run the DBAmpNet2.exe program.
    W (3): 19:17:07: DBAmpNet2 3.6.2.0 (c) Copyright 2015-2017 forceAmp.com LLC
    W (4): 19:17:08: Parameters: update Account_UATRefresh_UPDATE OOCDDB501V\DBAMPDEV DBAMPUpgrade SALESFORCE_FULL
    W (5): 19:17:08: Using the Salesforce bulkapi2 API.
    W (6): 19:17:20: Drop Account_UATRefresh_UPDATE_Result if it exists.
    W (7): 19:17:21: Create Account_UATRefresh_UPDATE_Result with new structure.
    W (8): 19:17:22: Drop ID column.
    W (9): 19:17:25: Add ID column.
    W (10): 19:17:27: Sort column will be used to order input rows.
    W (11): 19:17:34: Add Error column in Account_UATRefresh_UPDATE_Result.
    W (12): 19:17:34: Warning: Column Sort ignored because it does not exist in the account object.
    W (13): 19:26:40: 980250 rows read from SQL Table.
    W (14): 19:26:41: Job 7500m000001CW7CAAW created on salesforce.
    W (15): 19:26:47: Using the bulkapi with polling every 60 seconds
    W (16): 19:27:02: Job still running.
    W (17): 19:27:17: Job still running.
    W (18): 19:28:17: Job still running.
    W (19): 19:29:17: Job still running.
    W (20): 19:30:17: Job still running.
    W (21): 19:31:18: Job still running.
    W (22): 19:32:18: Job still running.
    W (23): 19:33:18: Job still running.
    W (24): 19:34:18: Job still running.
    W (25): 19:35:18: Job still running.
    W (26): 19:36:18: Job still running.
    W (27): 19:37:18: Job still running.
    W (28): 19:38:18: Job still running.
    W (29): 19:39:18: Job still running.
    W (30): 19:40:19: Job still running.
    W (31): 19:41:19: Job still running.
    W (32): 19:42:19: Job still running.
    W (33): 19:43:19: Job Complete.
    W (34): 19:44:10: Error: Unable to update Success SQL table.
    W (35): 19:44:10: System.Data.SqlClient.SqlException (0x80131904): Could not allocate space for object 'dbo.Account_UATRefresh_UPDATE_Result' in database 'DBAMPUpgrade' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, droppin
    W (36): g objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.
    W (37): at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
    W (38): at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
    W (39): at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
    W (40): at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
    W (41): at System.Data.SqlClient.SqlBulkCopy.RunParser(BulkCopySimpleResultSet bulkCopyHandler)
    W (42): at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinuedOnSuccess(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
    W (43): at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinued(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
    W (44): at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsync(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
    W (45): at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
    W (46): at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestAsync(CancellationToken cts, TaskCompletionSource`1 source)
    W (47): at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalAsync(CancellationToken ctoken)
    W (48): at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
    W (49): at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader)
    W (50): at DBAmpNet2.BulkOpsJobSubmitProcessor.GetSuccessfulResults()
    W (51): ClientConnectionId:3af1203b-59ad-4b22-9934-e0af0af51121
    W (52): Error Number:1105,State:2,Class:17
    W (53): 19:44:11: Error: Unable to update Failed SQL table.
    W (54): 19:44:11: System.InvalidOperationException: The given ColumnMapping does not match up with any column in the source or destination.
    W (55): at System.Data.SqlClient.SqlBulkCopy.AnalyzeTargetAndCreateUpdateBulkCommand(BulkCopySimpleResultSet internalResults)
    W (56): at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
    W (57): at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestAsync(CancellationToken cts, TaskCompletionSource`1 source)
    W (58): at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalAsync(CancellationToken ctoken)
    W (59): at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
    W (60): at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader)
    W (61): at DBAmpNet2.BulkOpsJobSubmitProcessor.GetFailedResults()
    W (62): 19:44:11: 0 rows unprocessed.
    W (63): 19:44:11: Error: Integrity check. The number of records processed on salesforce does not match the number received from Salesforce.
    W (64): 19:44:11: Check the Salesforce Application Setup / Monitoring / Bulk Data Load Jobs for final job disposition.
    W (65): 19:44:11: DBAmpNet2 Operation FAILED.
    W (66): 19:44:14: Allowed Failure Percent = 20.
    W (67): --- Ending SF_TableLoader. Operation successful.
    <---
    [Executed: 5/6/2018 7:16:59 PM] [Execution: 27m 15s]
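    Separately, the "PRIMARY filegroup is full" failure in W (35) is a SQL Server capacity problem rather than a DBAmp one, and the remedies the message itself lists (free disk space, add files, enable autogrowth) apply. For example, one hedged sketch of the autogrowth option (the logical file name here is a placeholder; check yours with sys.database_files first):

    ```sql
    -- Find the actual logical name of the PRIMARY data file.
    SELECT name, size, growth FROM DBAMPUpgrade.sys.database_files;

    -- Then enable autogrowth on it; 'DBAMPUpgrade' below is a
    -- placeholder logical file name, not necessarily yours.
    ALTER DATABASE DBAMPUpgrade
    MODIFY FILE (NAME = DBAMPUpgrade, FILEGROWTH = 256MB);
    ```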

  • We did find an existing issue where we were only waiting 30 seconds for SQL to complete an operation. This was the cause of your first timeout. We have corrected this issue in the upcoming v3.6.3 release.

  • Great news! When do you think the next release will be available?
