#aws
2020-11-19
zackteo05:11:55

I understand one usually types their bash script in User Data, but I can't seem to find a Clojure library that allows me to do this? Not sure if this is how to automate the spinning up of an EC2 cluster

zackteo05:11:33

Am not allowed to use managed services such as EMR, RDS, Kubernetes (it is a school project)

zackteo06:11:29

I understand this exists https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/user-data.html but how do I do this automatically instead of having to go through the process via the website?

valtteri06:11:47

@zackteo did you check this recipe from Spark docs already? https://spark.apache.org/docs/1.6.2/ec2-scripts.html

zackteo07:11:24

@valtteri I actually don't understand how this works. Is this a preconfigured OS image?

zackteo07:11:28

ohhh I'm not sure if i can expect spark to be locally installed - if this is what it requires

valtteri07:11:10

It's a script that you run locally with your AWS credentials and it will spin up the EC2 cluster for you.

zackteo07:11:51

Where is Spark's ec2 directory supposed to be located?

valtteri07:11:22

Probably inside the Spark distribution

valtteri07:11:37

I've never used it myself but I'd guess it does all the steps you need. 🙂

zackteo07:11:21

Thanks @valtteri 🙂 Let me look into it - I'm not sure if I am allowed to do it this way

valtteri07:11:42

Ahh, if the goal is to learn how to set it up from scratch manually then that might be too automated. 😄 However, you can check the script for clues about what kind of things are involved and try to reproduce them using Amazonica or whatever

valtteri07:11:27

I guess there are many steps involved (firing up the instances, installing the libs, setting up networking, security groups, autoscaling etc..)

zackteo07:11:35

I guess my issue is that I understand how I might want to use Amazonica to initialise the instance. But how do I access that instance in an automated way to run scripts to install dependencies etc

valtteri07:11:08

Yeah, Amazonica just delegates stuff to the AWS Java SDK afaik.. So you need to check how that's done on the Java side and then figure out how to tell Amazonica to do that. You can provide the 'user data script' with the runInstances request if I remember correctly.
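Something along these lines, I think (untested sketch; all the ids are placeholders):

(require '[amazonica.aws.ec2 :as ec2])

;; Untested sketch: launch one instance and pass it a user-data script.
;; The AMI, key pair, subnet and security group ids are placeholders.
(def user-data-script
  "#!/bin/bash\nyum install -y java-1.8.0-openjdk\n")

(ec2/run-instances
  {:image-id           "ami-abcd1234"
   :instance-type      "m3.medium"
   :min-count          1
   :max-count          1
   :key-name           "my-key-pair"
   :subnet-id          "subnet-abcd1234"
   :security-group-ids ["sg-abcd1234"]
   ;; the EC2 API expects user data base64-encoded
   :user-data          (.encodeToString (java.util.Base64/getEncoder)
                                        (.getBytes user-data-script "UTF-8"))})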

valtteri07:11:33

It's been a while since I've been bootstrapping EC2-instances..

valtteri07:11:56

Perhaps you can shortcut by setting up one of the instances manually and then saving it as a machine image (AMI) and then just telling Amazonica to launch an instance that uses that AMI
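Something like this, maybe (untested; the instance id and name are made up):

(require '[amazonica.aws.ec2 :as ec2])

;; Untested sketch: snapshot a manually configured instance as an AMI.
;; The returned :image-id can then be used as the :image-id for run-instances.
(ec2/create-image {:instance-id "i-0123456789abcdef0"
                   :name        "my-preconfigured-node"})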

zackteo07:11:28

unfortunately we need to start off from a base ami :x

valtteri07:11:36

aws ec2 run-instances --image-id ami-abcd1234 --count 1 --instance-type m3.medium \
--key-name my-key-pair --subnet-id subnet-abcd1234 --security-group-ids sg-abcd1234 \
--user-data file://my_script.txt

valtteri07:11:50

user-data script will be run when the instance launches

zackteo07:11:14

Think they expect us to use boto3 from python but that's also just the aws sdk

zackteo07:11:19

That's bash right?

zackteo07:11:54

Was trying to find that in the AWS SDK but maybe I'll just have to use bash

valtteri07:11:30

☝️ there's how to do it with boto

zackteo07:11:35

ooo! Thanks 😄

👌 3
zackteo08:11:12

I tried to find the equivalent on amazonica but to no avail - also considered using babashka with https://github.com/tzzh/pod-tzzh-aws but think it isn't worth the trouble ><. I'm just gonna approach this with boto3 after all 😅 makes more sense to revisit the project and rewrite this part in Clojure if I am interested

viesti10:11:55

I'm thinking that it might be easier to do the setup with Terraform, for example; it has great support for the AWS APIs and makes it easier to destroy resources after creation. There is some learning curve, but for a school project one could use a local state file store, for example.

viesti10:11:06

that said, I'm too biased, since I like Terraform too much 🙂

borkdude11:11:28

@zackteo Shelling out to the AWS cli is pretty common with babashka
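For example, a rough sketch (untested; the AMI id and user-data file are placeholders):

(require '[clojure.java.shell :refer [sh]]
         '[cheshire.core :as json])

;; Untested sketch: shell out to the AWS CLI and parse its JSON output.
(let [{:keys [exit out err]}
      (sh "aws" "ec2" "run-instances"
          "--image-id" "ami-abcd1234"
          "--count" "1"
          "--instance-type" "m3.medium"
          "--user-data" "file://user-data.sh")]
  (if (zero? exit)
    (json/parse-string out true)
    (throw (ex-info "aws cli failed" {:err err}))))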

zackteo11:11:06

@viesti unfortunately am only allowed to use CloudFormation at max ... would like to learn Terraform at some point!

zackteo11:11:00

@borkdude ohhh. Are there any AWS and/or Spark examples? Though I think I'll likely end up sticking to bash given my use case of mainly installing dependencies and doing Hadoop and Spark setup

zackteo12:11:25

Btw, could someone explain how CloudFormation might fit into all this? I will also need to create a web app that links to MongoDB and MySQL, so probably separate servers for each. I understand that CloudFormation just makes things more declarative?

valtteri12:11:02

Yes, CloudFormation is a JSON declaration of the AWS resources that you want to setup. It's pretty hard to grasp in the beginning, but I've learned to like it a lot over the years.
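From Clojure you could then create the stack with something like this (untested; the stack name and template path are made up):

(require '[amazonica.aws.cloudformation :as cfn])

;; Untested sketch: create a stack from a template file on disk.
(cfn/create-stack {:stack-name    "school-project"
                   :template-body (slurp "template.json")})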

zackteo13:11:27

If I set it up with CloudFormation, how do I provide the instance with the scripts I want executed :o Is there also a user-data part in the declaration?

zackteo13:11:44

Or do I need to access the instances in another way?

valtteri14:11:09

You embed the script into the cloudformation template JSON. Yes it's ugly. 😄

valtteri14:11:04

"UserData": { "Fn::Base64": { "Fn::Join": [ "", 
  "#!/bin/bash\n",
  "echo 'doing stuff'\n"
] } }

lukasz15:11:11

@borkdude yes, that's my current approach.

lukasz15:11:58

FWIW I'd recommend Terraform over CloudFormation as it covers 99% of the use cases with much nicer syntax and better primitives. And yes, you can specify user-data scripts there. Or use Packer to pre-bake machine images

markbastian18:11:34

When calling :GetExport with the cognitect aws api for apigateway I get the following response:

{:logref "4b35cdac-1783-4c1d-bdc2-ef84bc1fa68e",
 :message "Unable to build exporter: null",
 :cognitect.anomalies/category :cognitect.anomalies/incorrect}
Here is my invocation:
(aws-api/invoke
  client
  {:op      :GetExport
   :request {:restApiId  api-id
             :stageName  stage-name
             :exportType "swagger"}})
When I use the exact same arguments with boto3 it works, so I think the arguments are right. Does anything seem wrong, or am I missing something?

dchelimsky17:11:09

@U0JUR9FPH this will work:

(aws-api/invoke
 client
 {:op      :GetExport
  :request {:restApiId  api-id
            :stageName  stage-name
            :exportType "swagger"
            :accepts    "application/json"}})
I posted an issue https://github.com/cognitect-labs/aws-api/issues/158. Please keep the convo going there if you’re interested.