Following the successful generation of batch commits described previously, over the past week I turned to the primary reason the seed generator was created in the first place: testing the pages and an entity combo box used inside a dropdown button. The goal was to verify that the dropdown button rendered correctly and displayed accurate data. As Mr. Peter had predicted, the button that should have shown the full list of entities did not display as expected.
To address this, I began debugging. My digging eventually led me to the conclusion that the problem lay in the query's page-size limit when fetching the list of entities. Not knowing how to configure an unlimited page size, I initially hard-coded the page size to 100,000 entries. Mr. Peter thankfully provided guidance, recommending "-1" to indicate an unlimited page size. With that change, the entity returned its complete list without issue.
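A minimal sketch of the page-size convention described above, assuming (as is common in query APIs) that a positive limit returns one page while -1 disables paging entirely; the function and entity names here are hypothetical, not the project's actual API:

```python
def fetch_entities(records, page_size):
    """Return all records when page_size is -1, else only the first page."""
    if page_size == -1:
        return list(records)
    return list(records[:page_size])

# With a fixed page size, the dropdown would only ever see the first page;
# -1 returns the full list.
entities = [f"entity-{i}" for i in range(250)]
print(len(fetch_entities(entities, 100)))  # → 100
print(len(fetch_entities(entities, -1)))   # → 250
```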
Next, I discovered another entity within the seed data generator that did not yet have 5,000 records. It also surfaced a new error that appeared only when generating more than 100 records on the fly, halting at 100. My first thought was that there was an issue with the entity mapping, since I could not edit any existing data for this entity. After solving the mapping issue, I attempted to create 5,000 entries again. Regrettably, this attempt also failed, with an error message referencing time parsing. The issue was ultimately resolved by adjusting the date format inside a method that processes a list of objects in batches, generating and accumulating SQL INSERT statements to persist the object data to the database via batch operations.
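To illustrate the kind of method involved, here is a hedged sketch of a batch-INSERT builder; the table, column names, and batch size are assumptions for illustration only. The key point from the fix above is that each timestamp must be rendered in a format the database's parser accepts (ISO-8601 style here), otherwise execution fails with a time-parsing error:

```python
from datetime import datetime

def build_batch_inserts(objects, batch_size=100):
    """Accumulate INSERT statements for `objects`, grouped into batches."""
    batches, current = [], []
    for obj in objects:
        # Render the timestamp in a parseable format; a wrong pattern here
        # is exactly the kind of thing that triggers a time-parsing error.
        created = obj["created"].strftime("%Y-%m-%d %H:%M:%S")
        current.append(
            f"INSERT INTO entity (name, created) "
            f"VALUES ('{obj['name']}', '{created}')"
        )
        if len(current) == batch_size:
            batches.append(current)
            current = []
    if current:
        batches.append(current)
    return batches

rows = [
    {"name": f"row{i}", "created": datetime(2023, 1, 1, 12, 0, i % 60)}
    for i in range(250)
]
print(len(build_batch_inserts(rows)))  # → 3 (two full batches plus a partial)
```

In real code the values would of course go through parameterized statements rather than string interpolation; the string form is kept here only to mirror the "accumulating SQL INSERT statements" description.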
My next attempt was met with yet another setback, this time with an error message indicating a 'batch too large' problem. Searching Stack Overflow for developers who had faced similar issues, I learned that packing many inserts (more than about 10) into a single batch can have a negative impact on performance, so simply increasing the batch limit was not an option. After some experimentation, I discovered that only by setting the batch maximum to 15 was I able to generate data that exceeded the 100-entry threshold.
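The workaround above can be sketched as follows: rather than sending all rows in one oversized batch, the statements are flushed in small chunks of 15 (the limit that worked in practice). This uses Python's `sqlite3` purely as a stand-in database; the actual project presumably uses a different driver, and the table schema is invented for the example:

```python
import sqlite3

MAX_BATCH = 15  # the batch maximum that avoided the 'batch too large' error

def insert_in_batches(conn, rows, batch_size=MAX_BATCH):
    """Insert rows in small chunks so no single batch exceeds the limit."""
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        chunk = rows[start:start + batch_size]
        cur.executemany("INSERT INTO entity (name) VALUES (?)", chunk)
        conn.commit()  # flush each small batch separately

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entity (name TEXT)")
insert_in_batches(conn, [(f"row{i}",) for i in range(5000)])
print(conn.execute("SELECT COUNT(*) FROM entity").fetchone()[0])  # → 5000
```

The design trade-off mirrors the Stack Overflow advice: many small batches cost extra round trips, but each one stays safely under the driver's size limit, which is what allowed generation past the 100-entry threshold.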
