
  • Still don't know how to solve Java Heap Space Problem?

    Hi guys!
    I want to convert a CSV file to JSON.
    A 16 KB file goes through Mirth Connect fine, but a 50 MB file fails to process.
    Even after changing the mcserver.vmoptions and mcservice.vmoptions files to -Xmx2048m, I still have the same problem.
    I still don't know where the problem is.

    Hope someone can help me. I'm a newbie.

    Code:
    [2018-06-05 19:57:30,456]  ERROR (com.mirth.connect.connectors.file.FileReceiver:435): Error processing file in channel: 26bdb1b3-0f79-49bb-a243-4efce165f065
    com.mirth.connect.connectors.file.FileConnectorException: Error reading file /home/theo/ctmr-large.csvJava heap space
    	at com.mirth.connect.connectors.file.FileReceiver.processFile(FileReceiver.java:371)
    	at com.mirth.connect.connectors.file.FileReceiver.processFiles(FileReceiver.java:247)
    	at com.mirth.connect.connectors.file.FileReceiver.poll(FileReceiver.java:203)
    	at com.mirth.connect.donkey.server.channel.PollConnectorJob.execute(PollConnectorJob.java:49)
    	at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
    	at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
    Caused by: java.lang.OutOfMemoryError: Java heap space
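
    For reference, the change I made in both .vmoptions files is this single line (the .vmoptions format is one JVM flag per line):

    Code:
    -Xmx2048m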

  • #2
    Are you able to share any of your code or data structures? Maybe there are some optimizations that could be applied to reduce memory usage.

    • #3
      Originally posted by agermano:
      Are you able to share any of your code or data structures? Maybe there are some optimizations that could be applied to reduce memory usage.
      Actually, I just use a File Reader and a File Writer to output the file to another folder on the same server, so I didn't write any code for this. That's what I'm confused about.

      • #4
        Perhaps you can use the batch processor. Can you provide some more detail as to what you are doing?
        Best,

        Kirby

        Mirth Certified|Epic Bridges Certified|Cloverleaf Level 2 Certified

        Appliance Version 3.11.4
        Mirth Connect Version 3.8.0
        Java Version 1.6.0_45-b06
        Java (64 bit) Version 1.6.0_45-b06
        Java 7 (64 bit) Version 1.7.0_151-b15
        Java 8 (64 bit) Version 1.8.0_181-b13
        PostgreSQL Version 9.6.8

        • #5
          CSV won't convert itself to JSON, so there has to be something happening in a transformer... batch processing would likely help.
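
          As a rough sketch, the transformer step doing the conversion might look something like this (the field names here are made up; the real element names depend on your Delimited data type settings):

          Code:
          // Hypothetical transformer step: map one inbound delimited record to JSON.
          // 'column1' and 'column2' are placeholder names; adjust to match the
          // column names configured in your Delimited data type.
          var record = {
              id: msg['column1'].toString(),
              name: msg['column2'].toString()
          };
          // Stash the JSON string for a destination to pick up.
          channelMap.put('jsonRecord', JSON.stringify(record));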

          • #6
            Does the file reader still load the entire file into memory for processing? It may be better to create a JS Reader to open the file and read one record at a time. That may use less memory.
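
            Something along these lines, as a rough sketch (error handling omitted, and I believe a JavaScript Reader can return a list of RawMessage objects):

            Code:
            // Rough JavaScript Reader sketch: read the file line by line so the
            // whole file is never held in memory as one string.
            var reader = new java.io.BufferedReader(new java.io.FileReader('/home/theo/ctmr-large.csv'));
            var messages = new java.util.ArrayList();
            try {
                var line;
                while ((line = reader.readLine()) != null) {
                    messages.add(new RawMessage(line)); // one message per CSV record
                }
            } finally {
                reader.close();
            }
            return messages;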
            Thanks.

            Jon

            • #7
              Hi Agermano,

              Thank you so much, it solves my queries. It is very helpful. Keep it up.

              • #8
                Originally posted by Jon Blanchard:
                Does the file reader still load the entire file into memory for processing? It may be better to create a JS Reader to open the file and read one record at a time. That may use less memory.
                A File Reader will load the entire file into memory unless the source inbound data type is delimited and you have batch mode enabled to split by record. Then it will stream the file line by line. I'm pretty sure it frees the previous file from memory before loading the next one when there are multiple files returned, even when not in batch mode.
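
                If splitting by record isn't flexible enough, there is also a batch script option; as I understand it, Mirth hands the script a 'reader' variable (a Java BufferedReader) and calls it repeatedly, with each call returning the next message, or an empty string when the file is exhausted. A minimal sketch:

                Code:
                // Sketch of a "split batch by JavaScript" script. Each invocation
                // returns one record; an empty return ends the batch.
                var line = reader.readLine();
                if (line == null) {
                    return ''; // no more records in this file
                }
                return line;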

                In order to return multiple messages from a single JavaScript Reader poll, you still need to create all of your messages for that poll and keep them in memory before returning. That can theoretically use less RAM than the File Reader if you don't need to load the whole file at once to parse through it, the messages you are creating are smaller than the original data, and you are only reading one file at a time.
