
Large CSV process as batch very slow


    I have an interface which reads a CSV file and writes the data to a database.
    I'm trying to process a 14 MB CSV file (about 50,000 rows) as a batch, and it's taking a long time because it processes the file row after row.

    Is there any way to speed up the process?
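    The slowdown described above typically comes from issuing one INSERT (and one commit) per row. This isn't Mirth-specific, but the difference between per-row and batched writes can be sketched in plain Python, with sqlite3 standing in for the real database (the table name and sample data here are made up for illustration):

    ```python
    import csv
    import io
    import sqlite3

    # Hypothetical stand-in for the 14 MB file: a small in-memory CSV.
    csv_text = "\n".join("id%d,value%d" % (i, i) for i in range(1000))
    rows = list(csv.reader(io.StringIO(csv_text)))

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE data (id TEXT, value TEXT)")

    # Slow pattern (roughly what a per-row writer does):
    # one INSERT and one commit per row.
    # for r in rows:
    #     conn.execute("INSERT INTO data VALUES (?, ?)", r)
    #     conn.commit()

    # Faster pattern: send all rows as one batch and commit once.
    conn.executemany("INSERT INTO data VALUES (?, ?)", rows)
    conn.commit()

    count = conn.execute("SELECT COUNT(*) FROM data").fetchone()[0]
    print(count)  # 1000
    ```

    The same idea applies to most JDBC drivers via batched prepared statements: fewer round trips and fewer commits per row usually dominates any other tuning.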


  • #2
    Please post the channel here so we can troubleshoot better. First, you could turn on the source queue. Then, does the order in which specific rows are processed by the Database Writer matter? If not, you can turn on the destination queue and crank up the queue threads. After that, to increase throughput further: if you have any source or destination filtering/transformation, you can move all of that logic into the destination filter/transformer, and then enable "Include Filter/Transformer" in the advanced queue settings.
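    The "crank up the queue threads" suggestion amounts to draining the destination queue with several workers in parallel instead of one. A minimal sketch of that pattern in plain Python (the queued items, worker count, and handler here are made-up illustrations, not Mirth APIs):

    ```python
    import queue
    import threading

    # Hypothetical stand-in for queued destination messages.
    messages = list(range(100))
    q = queue.Queue()
    for m in messages:
        q.put(m)

    processed = []
    lock = threading.Lock()

    def worker():
        # Each worker pulls messages until the queue is empty.
        while True:
            try:
                m = q.get_nowait()
            except queue.Empty:
                return
            # Stand-in for the per-message database write.
            with lock:
                processed.append(m)
            q.task_done()

    # Four workers, analogous to setting queue threads > 1.
    threads = [threading.Thread(target=worker) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print(len(processed))  # 100
    ```

    Note the trade-off mentioned above: with multiple workers, messages complete in whatever order the threads finish, which is why this only applies when row ordering doesn't matter.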
    Step 1: JAVA CACHE...DID YOU CLEAR ...wait, ding dong the witch is dead?

    Nicholas Rupley
    Work: 949-237-6069
    Always include what Mirth Connect version you're working with. Also include (if applicable) the code you're using and full stacktraces for errors (use CODE tags). Posting your entire channel is helpful as well; make sure to scrub any PHI/passwords first.

    - How do I foo?
    - You just bar.