Using an API GET from file store returns out of memory error - Mirth Community

#1 | 07-05-2018, 10:31 AM
llong (Mirth Newb; Join Date: Jul 2018; Location: Indianapolis, IN; Posts: 11)
Using an API GET from file store returns out of memory error

My goal is to use an API to sync files from a separate site to an internal location.

These are very large files, so when I execute my GET request, Mirth returns an out-of-memory error.

Is there functionality in Mirth that can handle such a request? I tried writing the job routine in another language outside of Mirth to see what would happen and got the same result.

I came across this older thread, but the link is broken and it doesn't give much information to go on:

http://www.mirthproject.org/communit...ad.php?t=15056
#2 | 07-05-2018, 02:38 PM
agermano (Mirth Guru; Join Date: Apr 2017; Location: Indiana, USA; Posts: 967)

Attachments will reduce your overall memory usage, but still require loading the entire file into memory.

Calling out to a command line utility like curl could be an option.

Another option could be to do it entirely in a JavaScript Reader and work with Java input and output streams, so that you can write as you read in small chunks instead of waiting to read the whole thing.

Your message then could just be a summary of the results.
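To illustrate the chunked-copy idea, here is a minimal sketch in plain Java (the same java.io classes can be called from a Mirth JavaScript Reader via Rhino; the class name, helper name, and 8 KB buffer size are illustrative choices, not a fixed recipe):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {
    // Copy from in to out in fixed-size chunks, so only one small buffer
    // (not the whole payload) is ever held in memory at once.
    static long copy(InputStream in, OutputStream out, int bufSize) throws IOException {
        byte[] buf = new byte[bufSize];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n); // write each chunk as soon as it is read
            total += n;
        }
        out.flush();
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = new byte[1_000_000]; // stand-in for a large download
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(payload), sink, 8192);
        System.out.println("copied " + copied + " bytes");
    }
}
```

In a real channel, `in` would be the HTTP response stream and `out` a FileOutputStream, so memory use stays constant no matter how large the file is.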
#3 | 07-09-2018, 04:53 AM
llong (Mirth Newb; Join Date: Jul 2018; Location: Indianapolis, IN; Posts: 11)

I tried the curl method (though I didn't initiate that call from Mirth). It worked all right for compressing the files, but I still get a timeout on large folders, or on the entire folder structure, which is what I'd ultimately like to sync.

My current setup uses a JavaScript Reader with Unirest to make a GET request to their API endpoint ending in "/Download".

Like you mentioned in relation to attachments, it seems to be trying to load the whole thing into memory before deciding what to do with it. I'm still looking into the stream functionality; I tried a stream function in Python, which didn't work.

What I don't understand is how I can write at the same time it's reading, even with I/O streams... to me it looks like it has to establish the connection and receive all the files before it can do anything with them. I could force a curl command to wait longer before timing out, but I don't know if that would do anything besides make me wait longer before eventually timing out.


Here is my current call to their API:

var response = Unirest.get($g('ExternalVendor_URL') + "/Items(id)/Download")
    .header("Authorization", "Bearer " + accessToken)
    .asString(); // asString() buffers the entire response body in memory

Then below I would execute the write functionality, but it fails at the above line because of timeouts or out-of-memory errors in Mirth.
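For comparison, here is a hedged sketch of the same GET done with a streaming connection instead of Unirest's buffered asString(). It is plain Java (callable from a Mirth JavaScript Reader via Rhino); the url, token, and destPath parameters stand in for the post's $g('ExternalVendor_URL') and accessToken placeholders, and the timeout values are illustrative:

```java
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;
import java.net.URLConnection;

public class StreamingDownload {
    // Stream the response straight to disk: each chunk is written as soon
    // as it is read, so memory use stays at one buffer regardless of size.
    static long download(String url, String token, String destPath) throws Exception {
        URLConnection conn = new URL(url).openConnection();
        conn.setRequestProperty("Authorization", "Bearer " + token);
        conn.setConnectTimeout(30_000); // fail fast on connect...
        conn.setReadTimeout(0);         // ...but allow long reads (0 = no limit)
        try (InputStream in = conn.getInputStream();
             OutputStream out = new FileOutputStream(destPath)) {
            byte[] buf = new byte[8192];
            long total = 0;
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
                total += n;
            }
            return total;
        }
    }
}
```

The message the channel processes can then just be a summary (path, byte count) rather than the file content itself, as suggested above.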

Last edited by llong; 07-09-2018 at 04:59 AM.
#4 | 07-09-2018, 12:54 PM
agermano (Mirth Guru; Join Date: Apr 2017; Location: Indiana, USA; Posts: 967)

It appears that Unirest does indeed download the entire file before you can do anything with it.
#5 | 07-12-2018, 04:57 AM
llong (Mirth Newb; Join Date: Jul 2018; Location: Indianapolis, IN; Posts: 11)

Gotcha. The good news is that I found out I don't need to download the full file, so that helps, but it doesn't solve the Mirth issue.

I tried breaking the files down into smaller sections, where a folder may be 1 GB, but I still get an out-of-memory error. I tried increasing the heap size to 2 GB, but Mirth wouldn't start; I then moved it to 1 GB and it started, but I still got an out-of-memory error. I also manually set the Unirest timeout to (0, 0), but then it would hang indefinitely.
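For anyone following along: in Mirth Connect 3.x the server heap is set in mcserver.vmoptions (or mcservice.vmoptions when run as a Windows service), and a 2 GB setting that prevents startup is often a sign of a 32-bit JVM, which cannot allocate that much contiguous heap. Assuming a 64-bit JVM, the entry would look like:

```
-Xmx2048m
```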

I wrote the program as a Python script instead, which is able to handle it.
#6 | 07-24-2018, 05:43 AM
llong (Mirth Newb; Join Date: Jul 2018; Location: Indianapolis, IN; Posts: 11)
End Solution

I wanted to post a reply with what my actual solution ended up being (in case someone else runs into this problem), since Python eventually ran into the same issue with large file sizes.

Instead of downloading the files as a whole to sync between the cloud and our local storage, I created a recursive function in Mirth that makes API calls to each folder in the cloud file storage to find any deltas; when it finds one, it updates that single file in local storage.
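The per-folder delta check could be sketched roughly like this (a hypothetical illustration, not the vendor's actual API: it assumes each folder listing can be reduced to name/last-modified pairs; only the flat comparison is shown, with recursion into subfolders happening via further API calls):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class DeltaSync {
    // Compare one folder's remote listing (name -> last-modified millis)
    // against the local listing; return files that are new or newer remotely.
    // Only those deltas are downloaded, so no file content is ever loaded
    // just to perform the comparison.
    static List<String> findDeltas(Map<String, Long> remote, Map<String, Long> local) {
        List<String> deltas = new ArrayList<>();
        for (Map.Entry<String, Long> e : remote.entrySet()) {
            Long localModified = local.get(e.getKey());
            if (localModified == null || e.getValue() > localModified) {
                deltas.add(e.getKey()); // missing locally, or stale
            }
        }
        return deltas;
    }
}
```

This keeps each API response small (a folder listing rather than file content), which is why it sidesteps the memory problem entirely.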
Tags: api, download, files
