#aws
2019-12-17
thomas559 20:12:53

Is there a recommended way to call large numbers (more than 64) of lambdas with aws-api? Initially I ran into the limit from Cognitect's HttpClient, so I tried creating more AWS lambda clients (suggested in this issue: https://github.com/cognitect-labs/aws-api/issues/98). However, we've had servers falling over in production because the file descriptor limit is exceeded. Testing at the REPL, it does look like when a client is created for AWS lambda, file descriptors are not released. Should I be doing something to close the client? (Doesn't look like there's a .close method on the aws client.) Might be related to this issue: https://github.com/cognitect-labs/aws-api/issues/109

kulminaator 20:12:07

i ran into the same issue 🙂

kulminaator 20:12:34

tried to run 128 lambdas at once and the results were failures to execute them 😞

kulminaator 21:12:48

for the file handle leaks, doesn't (aws/stop s3) release them?

kulminaator 21:12:56

rather unconventional ...

kulminaator 21:12:19

seems to do it for me

kulminaator 21:12:45

a small test i ran

kulminaator 21:12:03

@ i think this should at least help you put out the fire 🙂

thomas559 22:12:16

Looks like aws/stop does the trick. Thanks @.

thomas559 22:12:00

Right there in the README too, I just missed it. "Invoke cognitect.aws.client.api/stop on the client if you want it to shut down any resources it and its http-client are using."
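
For reference, a minimal sketch of the pattern discussed above, using the real `cognitect.aws.client.api` namespace from aws-api (the `:op` shown is illustrative; it assumes valid AWS credentials are configured in the environment):

```clojure
;; Sketch: create a Lambda client, use it, then release its resources.
(require '[cognitect.aws.client.api :as aws])

;; Each client holds an underlying http-client, which in turn holds
;; file descriptors that are not released by GC alone.
(def lambda (aws/client {:api :lambda}))

;; ... use the client, e.g. (aws/invoke lambda {:op :ListFunctions}) ...

;; Per the README: invoke stop on the client to shut down any
;; resources it and its http-client are using.
(aws/stop lambda)
```
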

ghadi 21:12:45

I can't type now but we'll systematically fix this