
API requests calculation on a bulk upsert costing 6 calls


I am using the simple-salesforce Python package to upsert records into our Salesforce instance. I am finding that each bulk API call counts as 6 API requests. Is this correct? Is there a way to reduce this whilst performing this operation?



The function I am using to perform the upsert:



def bulk_upsert(self, object_name, data_set, identity_field):
    upsert_response = SFBulkType(object_name, self.bulk_con.connect.bulk_url,
                                 self.bulk_con.connect.headers,
                                 self.bulk_con.connect.session).upsert(data_set, identity_field)
    return upsert_response


The object name is the custom object created in Salesforce for our instance, the data set is a list of dictionaries containing 200 records, and the identity field is the external ID for the Salesforce object.



I establish the Salesforce connection (sf) and then loop through request_chunks, which is the data to upsert broken into chunks of 200 records each.



for request in request_chunks:
    upsert_response = sf.bulk_upsert(sf_object, request, sf_identity)


I am finding that each pass through this loop costs 6 API requests. I have compared this with the integration recommended by simple-salesforce, and that also costs 6 calls per bulk upsert. Any thoughts?
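
As an aside on verifying the count: the REST limits resource reports DailyApiRequests usage, so sampling it before and after a single call shows roughly how many requests that call consumed. This is only an illustrative sketch; the instance URL, token, and API version are placeholders, the sampling calls themselves count toward the limit, and the reported numbers can lag slightly.

import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "<access token>"                      # placeholder

def remaining_api_requests(api_version="47.0"):
    # Read the DailyApiRequests counter from the REST limits resource.
    resp = requests.get(f"{INSTANCE}/services/data/v{api_version}/limits",
                        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    resp.raise_for_status()
    return resp.json()["DailyApiRequests"]["Remaining"]

before = remaining_api_requests()
# One bulk upsert of a single 200-record chunk, as in the loop above.
upsert_response = sf.bulk_upsert(sf_object, request_chunks[0], sf_identity)
after = remaining_api_requests()
print("Approximate requests consumed (includes sampling overhead):", before - after)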










Tags: bulk-api, httprequest, upsert






asked Apr 15 at 14:33 by James Wellington























2 Answers






Does your Python library use the Bulk API or a simple record insert/update?
If it uses the Salesforce Bulk API, it needs one request to create the bulk job, one request to upload the records, and one request to close the job.
It then has to poll to track the job status until the records are processed.

– pklochkov, answered Apr 15 at 14:36
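
For illustration, here is a rough sketch of the request sequence a Bulk API (1.0) client typically issues for one upsert job, written against the raw REST endpoints with the requests library. The API version, paths, headers, object name, and external ID field are assumptions for illustration and may not match exactly what simple-salesforce sends, but they show where the per-job overhead comes from.

import time
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder
SESSION_ID = "<session id>"                          # placeholder
API = "47.0"                                         # assumed API version

base = f"{INSTANCE}/services/async/{API}"
headers = {"X-SFDC-Session": SESSION_ID, "Content-Type": "application/json"}

# 1) Create the job (request #1).
job = requests.post(f"{base}/job", headers=headers, json={
    "operation": "upsert",
    "object": "Custom_Object__c",             # placeholder object
    "externalIdFieldName": "External_Id__c",  # placeholder external ID field
    "contentType": "JSON",
}).json()

# 2) Upload the records as one batch (request #2).
records = [{"External_Id__c": "A-1", "Name": "Example"}]  # placeholder data
batch = requests.post(f"{base}/job/{job['id']}/batch",
                      headers=headers, json=records).json()

# 3) Close the job so Salesforce starts processing it (request #3).
requests.post(f"{base}/job/{job['id']}", headers=headers, json={"state": "Closed"})

# 4+) Poll the batch state until processing finishes (one request per poll).
while True:
    state = requests.get(f"{base}/job/{job['id']}/batch/{batch['id']}",
                         headers=headers).json()["state"]
    if state in ("Completed", "Failed", "Not Processed"):
        break
    time.sleep(5)

# Final request: fetch the per-record results.
results = requests.get(f"{base}/job/{job['id']}/batch/{batch['id']}/result",
                       headers=headers).json()

Counting these: one request to create the job, one to upload the batch, one to close it, one or more status polls, and one to fetch results, which lines up with the roughly six requests observed per loop iteration.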






















The Bulk API is designed for bulk uploads of data. If you're importing fewer than 1,000 records, you're definitely wasting API calls. Six calls sounds about right; it requires at least three just to initialize, upload, and start the Bulk API job, plus more calls to wait for completion. Use the normal upsert API call if you want to upload smaller batches of data (e.g. 200 records at a time).

– sfdcfox, answered Apr 15 at 14:38
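
The "normal" upsert here is the synchronous REST upsert; its sObject Collections variant accepts up to 200 records in a single request, so each 200-record chunk would cost one API call instead of a whole Bulk API job. Below is a minimal sketch of that call, assuming the composite/sobjects collections endpoint (available in recent API versions); the instance URL, token, API version, object, and field names are placeholders, and this is not what simple-salesforce's bulk handler does under the hood.

import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "<access token>"                      # placeholder

def collections_upsert(object_name, ext_id_field, records, api_version="47.0"):
    # Upsert up to 200 records in one synchronous REST call.
    url = (f"{INSTANCE}/services/data/v{api_version}"
           f"/composite/sobjects/{object_name}/{ext_id_field}")
    body = {
        "allOrNone": False,
        # Each record needs an 'attributes' entry naming its object type.
        "records": [{"attributes": {"type": object_name}, **r} for r in records],
    }
    resp = requests.patch(url, json=body,
                          headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    resp.raise_for_status()
    return resp.json()  # one result entry per record

results = collections_upsert("Custom_Object__c", "External_Id__c",
                             [{"External_Id__c": "A-1", "Name": "Example"}])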































• Will this cost one API call per record? So a batch of 400 records, which previously was split into two Bulk API requests, would then be 400 separate upsert calls?
  – James Wellington, Apr 15 at 15:42











• I can increase the batch size from 200 to circa 2,000 without running into problems with total characters. I went with 200 because I assumed it wouldn't change the number of API calls. The documentation states that the API splits batches into 200 records behind the scenes.
  – James Wellington, Apr 15 at 15:49











• @JamesWellington The synchronous upsert call can do 200 records per API call. The Bulk API can process up to 10,000 records per file, and millions of records per day. The Bulk API will indeed split the file into batches of 200. If you use the Bulk API, there's really no reason to send fewer than 10,000 records per file. You'll want to read the documentation for more information.
  – sfdcfox, Apr 15 at 16:05
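
If the Bulk API is kept, the per-job overhead is fixed, so the practical fix suggested above is to send far larger files per job. A minimal sketch, reusing the names from the question; the chunk_records helper and the all_records list (the full data set before chunking) are hypothetical.

def chunk_records(records, size=10_000):
    # Split the full record list into files of up to 10,000 records,
    # per the comment above, instead of 200-record chunks.
    for start in range(0, len(records), size):
        yield records[start:start + size]

# Each job still costs the fixed handful of overhead requests,
# but far fewer jobs are needed for the same data set.
for request in chunk_records(all_records):
    upsert_response = sf.bulk_upsert(sf_object, request, sf_identity)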












