#aws
2019-05-28
caleb.macdonaldblack01:05:26

What do people use to manage environment variables, parameters and secrets? I have a full-stack app using Terraform, so I'm dealing with:
- Building a shadow-cljs project, injecting env vars for auth, the backend, and Google API keys
- Running a backend service that needs API keys and database credentials
- Managing CI, which needs access to parameters and secrets for builds and deployment
- Managing Terraform, which requires input variables for my Terraform configuration and also outputs that are passed into other services (for example the frontend and backend)
- Developing locally, where I need access to those variables, parameters and secrets in some capacity

Right now I've hacked together a combination of:
- Terraform passing vars, parameters and secrets into services such as the backend
- Defining env vars in my CI using their environment variable store
- Pulling env vars from Terraform outputs when needed for building and deploying

Ideally I'd like to manage all this centrally somehow. What have people used in the past for consolidating all this?

ghadi03:05:40

SSM Parameter store

👌 8
👍 8
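A minimal sketch of reading a value from SSM Parameter Store with the cognitect aws-api — the parameter path is illustrative; the idea is that a service can fetch its secrets this way at startup instead of having them injected:

```clojure
(require '[cognitect.aws.client.api :as aws])

;; The SSM client picks up region/credentials from the default provider chain.
(def ssm (aws/client {:api :ssm}))

;; "/myapp/dev/db-password" is a made-up parameter name; :WithDecryption is
;; needed for SecureString parameters, which are KMS-encrypted at rest.
(-> (aws/invoke ssm {:op :GetParameter
                     :request {:Name "/myapp/dev/db-password"
                               :WithDecryption true}})
    (get-in [:Parameter :Value]))
```
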
orestis05:05:49

How do you do local (offline?) development when your stack depends on AWS stuff?

orestis05:05:35

I can see some things (e.g. SQS) can be reasonably abstracted out via an interface that has a local naive implementation.

valtteri06:05:30

I usually have dev account(s) on AWS and I use real services when developing. In simple cases I have used DynamoDB local and local Postgres. Mocking all the services seems too much hassle to me.
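
For reference, pointing a cognitect aws-api client at DynamoDB Local is done via :endpoint-override; the port and the dummy credentials below are assumptions for a default local setup:

```clojure
(require '[cognitect.aws.client.api :as aws]
         '[cognitect.aws.credentials :as creds])

;; DynamoDB Local listens on :8000 by default and accepts any credentials,
;; but the client still needs a region and a credentials provider.
(def ddb (aws/client {:api :dynamodb
                      :region "eu-west-1"
                      :credentials-provider (creds/basic-credentials-provider
                                              {:access-key-id     "local"
                                               :secret-access-key "local"})
                      :endpoint-override {:protocol :http
                                          :hostname "localhost"
                                          :port     8000}}))

(aws/invoke ddb {:op :ListTables :request {}})
```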

orestis06:05:11

How is the experience of doing that? In terms of network issues, delays, latency etc.

valtteri08:05:02

Network hasn’t been an issue for me, but at my last job we had some trouble with dangling left-over resources from developers who didn’t clean up after themselves. 🙂 Nowadays AWS offers nice ways to monitor this kind of issue. Creating dev “sub-accounts” that can be safely pruned is a good option.

valtteri08:05:17

I’ve used dynamodb local mostly for unit tests and such.

valtteri08:05:27

But my emphasis is usually more on integration tests and there you want to have the real thing. Or as real as it gets.

viesti08:05:56

CI running in the cloud has low latency too

viesti08:05:21

one probably doesn't want to move a lot of data back and forth between one's own laptop and the cloud; in cases like that, running a job in CI (or on some node in the cloud) makes sense

caleb.macdonaldblack06:05:05

I use a dev account for stuff I cannot easily replicate locally. My webapp, backend and db are local.

caleb.macdonaldblack06:05:57

When we have another dev helping we step on each other's toes though. Ideally we would each have our own account and environment

viesti07:05:09

haven't gotten to it, but namespacing resources via Terraform would be neat in a shared dev account.

viesti07:05:44

I guess it depends on the case/project. If there isn't any costly infrastructure, then separate accounts for each dev give maximum isolation, but in a bigger system one might want to share data sources, which means cross-account policies

viesti07:05:09

separate account per stage is good for hygiene anyway

👍 4
viesti08:05:30

in a previous project, we used Kinesis from https://github.com/localstack/localstack in tests, but it was a bit of a hassle to avoid timing-related brittleness
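
One way to tame that kind of brittleness is to poll until the stream is actually usable before the test proceeds. A sketch against localstack with the cognitect aws-api — the port depends on your localstack version/setup, and wait-for-active is a made-up helper:

```clojure
(require '[cognitect.aws.client.api :as aws])

;; Kinesis client pointed at localstack's edge endpoint (port is an assumption).
(def kinesis (aws/client {:api :kinesis
                          :endpoint-override {:protocol :http
                                              :hostname "localhost"
                                              :port     4566}}))

(defn wait-for-active
  "Polls DescribeStream until the stream reports ACTIVE, since localstack
  (like real Kinesis) creates streams asynchronously."
  [client stream-name]
  (loop [attempt 0]
    (let [status (-> (aws/invoke client {:op :DescribeStream
                                         :request {:StreamName stream-name}})
                     (get-in [:StreamDescription :StreamStatus]))]
      (cond
        (= "ACTIVE" status) :ok
        (< 20 attempt)      (throw (ex-info "stream never became ACTIVE"
                                            {:stream stream-name}))
        :else               (do (Thread/sleep 500)
                                (recur (inc attempt)))))))
```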

viesti08:05:42

but we also used resources in the dev account for running tests, since running against real infrastructure made sense in a data pipeline (Redshift + S3)

viesti08:05:12

in that project, we used Ansible Vault for keeping secrets in version control (a tad bit legacy, since we initially started with Ansible), passed secrets to Terraform in a runner script via -var-file + process substitution (http://tldp.org/LDP/abs/html/process-sub.html) and stored state files in S3 with KMS encryption. We had utility libraries to read parameters from SSM, Terraform state and local per-env config files. Config files were merged so that overrides could be defined (in TF, passing first the prod config var file, then env specific var file. Same done in libraries).
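
The merged-config idea is simple to reproduce. A sketch assuming per-env EDN files (the file names are hypothetical), where the env-specific map overrides keys in the base map:

```clojure
(require '[clojure.edn :as edn])

;; Loads config/base.edn first, then merges config/<env>.edn on top so that
;; env-specific values override the defaults - the same idea as passing the
;; base var-file before the env-specific var-file to Terraform.
(defn load-config [env]
  (merge (edn/read-string (slurp "config/base.edn"))
         (edn/read-string (slurp (str "config/" env ".edn")))))

;; (load-config "dev")
```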

dangercoder12:05:28

Does anyone know if you can use AWS cognito with your own front-end forms (login/sign-up)?

valtteri13:05:25

You can use the JavaScript SDK to chat with Cognito directly from the browser. There are quite a few moving parts, so it’s probably a good idea to try to find a reference implementation somewhere.

dangercoder19:05:05

@valtteri is this JavaScript that runs in the front end, or is it Node.js? 😄 I want my users to go to my controller and then have my application call AWS for sign-up

dangercoder19:05:37

for sign up etc.

valtteri06:05:58

Most operations are possible both client and server-side. You just need to configure your user-pool accordingly. There used to be a separate SDK for browser JavaScript but I think they’ve combined Node/Browser stuff under one JavaScript SDK nowadays.

valtteri06:05:32

It’s possible to implement many kinds of scenarios using Cognito. You can handle signup/login/forgot-password/managing user-data directly in the browser if you wish or you can choose to do all that on the backend side (or mix & match).

dangercoder10:05:08

Yeah, I'm communicating with Cognito using the cognitect aws-api now. Got the sign-up working really nicely 🙂

👍 4
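
For anyone finding this later, a server-side sign-up with the cognitect aws-api looks roughly like the sketch below; the client id, username and attributes are placeholders, and the confirmation step assumes code-based verification is enabled on the user pool:

```clojure
(require '[cognitect.aws.client.api :as aws])

(def cognito (aws/client {:api :cognito-idp}))

;; SignUp against the user pool's app client; your own form posts the
;; username/password to your controller, which makes this call.
(aws/invoke cognito {:op :SignUp
                     :request {:ClientId "your-app-client-id"
                               :Username "alice@example.com"
                               :Password "correct-horse-battery-staple"
                               :UserAttributes [{:Name  "email"
                                                 :Value "alice@example.com"}]}})

;; If the pool requires confirmation, the user receives a code which you
;; then confirm with:
(aws/invoke cognito {:op :ConfirmSignUp
                     :request {:ClientId "your-app-client-id"
                               :Username "alice@example.com"
                               :ConfirmationCode "123456"}})
```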