# sst
a
Hey guys, I’m trying to implement AWS_IAM auth for some of my endpoints. Found that I can do that, but how should I CONSUME these resources? For example, my API One is defined with IAM auth and then must be consumed by Service Two.. how should I set up Service Two to send the proper authentication? I’m lost on this, can’t find examples of it.
s
Morning @Adrián Mouly, you need to add the SigV4 signature to the outgoing HTTP headers of the requests going from Service Two to Service One. https://www.npmjs.com/package/@aws-sdk/signature-v4 can generate these for you in Node. Service Two also needs IAM permissions to invoke the API Gateway methods of Service One.
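A minimal sketch of that, assuming the consumer runs on Node 18+ under an IAM role that already has execute-api:Invoke permission on API One; the region, hostname, and path below are placeholders:

```ts
import { SignatureV4 } from "@aws-sdk/signature-v4";
import { Sha256 } from "@aws-crypto/sha256-js";
import { defaultProvider } from "@aws-sdk/credential-provider-node";
import { HttpRequest } from "@aws-sdk/protocol-http";

// Signer scoped to API Gateway ("execute-api") in API One's region.
const signer = new SignatureV4({
  service: "execute-api",
  region: "us-east-1",
  credentials: defaultProvider(), // picks up the Lambda/instance role of Service Two
  sha256: Sha256,
});

// Placeholder endpoint for API One.
const hostname = "abc123.execute-api.us-east-1.amazonaws.com";

const request = new HttpRequest({
  method: "GET",
  protocol: "https:",
  hostname,
  path: "/prod/projects",
  headers: { host: hostname }, // host header must be present for signing
});

// sign() adds Authorization, X-Amz-Date (and X-Amz-Security-Token for temp creds).
const signed = await signer.sign(request);

const res = await fetch(`https://${hostname}${request.path}`, {
  method: signed.method,
  headers: signed.headers as Record<string, string>,
});
console.log(res.status, await res.text());
```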
a
@Adrián Mouly I believe you should handle cross-service interactions using EventBridge / Topic / SNS / SQS. These remove the need to use HTTP for almost all service to service calls, are more robust and reduce communication latency drastically.
s
Some really good points about choreography, certainly makes sense if service two is a backend service that your own organisation owns 👍 @Adrián Mouly, could you let us know what service two is (frontend, backend, 3rd party API)?
a
@Simon Reilly even if the backend service were a 3rd-party API, we could use an EventBridge integration to trigger it. The API Destinations support works like a charm.
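For reference, a minimal CDK sketch of that pattern (SST lets you drop down to raw CDK constructs); the secret name and endpoint are made-up placeholders:

```ts
import { Stack, StackProps, SecretValue, aws_events as events, aws_events_targets as targets } from "aws-cdk-lib";
import { Construct } from "constructs";

export class ApiDestinationStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // The Connection holds whatever auth the third-party API expects (API key here; OAuth also works).
    const connection = new events.Connection(this, "ThirdPartyConnection", {
      authorization: events.Authorization.apiKey(
        "x-api-key",
        SecretValue.secretsManager("third-party-api-key") // placeholder secret name
      ),
    });

    const destination = new events.ApiDestination(this, "ThirdPartyDestination", {
      connection,
      endpoint: "https://api.example.com/v1/ingest", // placeholder endpoint
    });

    // Forward matching events from the bus straight to the external HTTP endpoint.
    new events.Rule(this, "ForwardToThirdParty", {
      eventPattern: { source: ["my.service"] },
      targets: [new targets.ApiDestination(destination)],
    });
  }
}
```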
s
That is super cool 👍
f
@Adrián Mouly I’m not sure the HTTP proxy routes send the signature info along.. I want to say they don’t, but you’d want to check. Otherwise you’d need to sign it like @Simon Reilly suggested.
I know you gave your reason for this design in another thread… but give what @Ashishkumar Pandey suggested a thought 😜
a
Yeah.
I’m using event-bridge for communication too.
But well, I have some really specific internal APIs, that are abstractions or wrappers created around third-party services.
For example, we have a bunch of AI providers, like OpenAI and Google NLP… so I have an internal API called “AI service”, which encapsulates some logic around those third parties… My initial architecture was to build this REST/HTTP API, which can be used across the organization.
This integration pattern, again, is called API-led architecture; it’s what I’ve been trying to replicate here, but apparently there isn’t an easy way to do it.
I don’t need user authentication for those internal services; what I really need is service authentication, similar to what the Service Mesh concept offers with Service Identity.
I thought AWS_IAM could help me with that, since this is an internal service that is going to be called by some Lambdas or other internal legacy systems… it’s not going to be exposed to customers/users on the internet.
The thing with EventBridge is it works for ASYNC calls.
But what if I want a SYNC call?
For example, I want to call OpenAI “GET PROJECTS”. I can’t use EventBridge to do that.
f
Yeah, you are right, EventBridge is async. All of our micro-services communicate with each other asynchronously, on purpose, so each micro-service does the minimum and doesn’t depend on the others.
As for third party wrappers, can those be made into private npm modules that are shared across the organization?
Sorry if I’m asking the wrong questions.. obviously I’m not aware of all the requirements 🥶
a
Yes, actually today all my wrappers are NPM packages.. but this doesn’t allow me to share them across the organization. Other sites are in PHP, for example… Having an HTTP API lets me share an agnostic interface that many systems can just use. I can’t find a way to do this, and it looks like I’m forcing the architecture in the wrong way 😞
t
^ I've given this same thing a lot of thought. Event-based architectures are entirely focused on pushing async events to a bus. This is great because it maximally decouples parts of your system: publishers don't have to know about consumers. But you're right that not all things can be modeled this way; sometimes various domains need to know about each other and call each other up and ask for information. This should be done first through code sharing, by implementing libraries that have static contracts. Of course this gets harder if you have multiple languages, at which point you need a req/reply protocol that works over the wire. Although HTTP is the immediate thing that pops up, I'd actually consider using SQS. You can implement a simple RPC protocol over it (I'd be surprised if there aren't already libraries that do this) and use that to query various services. In the past I'd have used RabbitMQ, but now that I'm all serverless I'd try to use SQS. Technically you can build this on EventBridge as well.
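A rough sketch of the caller side of such an SQS request/reply setup, assuming the request and reply queues already exist and the responding service copies the correlationId message attribute onto its reply (all names here are made up):

```ts
import {
  SQSClient,
  SendMessageCommand,
  ReceiveMessageCommand,
  DeleteMessageCommand,
} from "@aws-sdk/client-sqs";
import { randomUUID } from "node:crypto";

const sqs = new SQSClient({});

export async function rpcCall(requestQueueUrl: string, replyQueueUrl: string, payload: unknown) {
  const correlationId = randomUUID();

  // Send the request, telling the service where to reply and how to correlate it.
  await sqs.send(new SendMessageCommand({
    QueueUrl: requestQueueUrl,
    MessageBody: JSON.stringify(payload),
    MessageAttributes: {
      correlationId: { DataType: "String", StringValue: correlationId },
      replyQueueUrl: { DataType: "String", StringValue: replyQueueUrl },
    },
  }));

  // Long-poll the reply queue until our answer shows up (give up after ~1 minute).
  for (let attempt = 0; attempt < 6; attempt++) {
    const { Messages } = await sqs.send(new ReceiveMessageCommand({
      QueueUrl: replyQueueUrl,
      WaitTimeSeconds: 10,
      MessageAttributeNames: ["correlationId"],
    }));
    for (const msg of Messages ?? []) {
      if (msg.MessageAttributes?.correlationId?.StringValue === correlationId) {
        await sqs.send(new DeleteMessageCommand({
          QueueUrl: replyQueueUrl,
          ReceiptHandle: msg.ReceiptHandle!,
        }));
        return JSON.parse(msg.Body!);
      }
      // Not ours: leave it; the visibility timeout will re-expose it to the right caller.
    }
  }
  throw new Error("RPC reply timed out");
}
```
In practice you'd want a reply queue per caller (or per request) so callers don't steal each other's replies, which is exactly the kind of detail an existing library would handle for you.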
Another option would be to wrap the functionality in a lambda and let others directly invoke the lambda
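A minimal sketch of that direct-invoke option, assuming a made-up function name and that the caller's role has lambda:InvokeFunction on it:

```ts
import { LambdaClient, InvokeCommand } from "@aws-sdk/client-lambda";

const lambda = new LambdaClient({});

// Synchronous request/response call into another service's Lambda.
const response = await lambda.send(new InvokeCommand({
  FunctionName: "ai-service-getProjects", // placeholder function name
  Payload: Buffer.from(JSON.stringify({ provider: "openai" })),
}));

// The payload comes back as bytes; decode and parse it.
const result = JSON.parse(Buffer.from(response.Payload!).toString());
console.log(result);
```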
a
I don’t like the idea of building packages. An HTTP API is a standard interface that every system can understand.
I mean everybody, as every language/technology can use it, consume it, and produce for it.
It’s the common “language” that every system can understand.
a
Are all your services in AWS and under the same account?
a
At the moment yes.
a
So, what I think you could do is use the AWS STS API to assume a role, where that role is the one that already has access to that particular service. Technically, you could create a role 1 that has access to services A & D, and another role 2 that has access to services A, C & D, while service B could be a public service that uses JWT auth or anything else. Now, if you need to call service C from service B, you would use the AWS STS API to assume role 2 and then generate the signature to call service C. I think this could work; it’s tedious and convoluted, but it can work. Here’s a reference to the AWS STS API - https://docs.aws.amazon.com/STS/latest/APIReference/welcome.html
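A minimal sketch of that assume-role-then-sign flow, with a placeholder role ARN and endpoint, assuming service B is allowed to sts:AssumeRole into role 2:

```ts
import { STSClient, AssumeRoleCommand } from "@aws-sdk/client-sts";
import { SignatureV4 } from "@aws-sdk/signature-v4";
import { Sha256 } from "@aws-crypto/sha256-js";
import { HttpRequest } from "@aws-sdk/protocol-http";

// 1. Assume the role that is allowed to call service C (placeholder ARN).
const sts = new STSClient({});
const { Credentials } = await sts.send(new AssumeRoleCommand({
  RoleArn: "arn:aws:iam::123456789012:role/role-2",
  RoleSessionName: "service-b",
}));

// 2. Sign the request with the temporary credentials instead of service B's own role.
const signer = new SignatureV4({
  service: "execute-api",
  region: "us-east-1",
  sha256: Sha256,
  credentials: {
    accessKeyId: Credentials!.AccessKeyId!,
    secretAccessKey: Credentials!.SecretAccessKey!,
    sessionToken: Credentials!.SessionToken,
  },
});

const hostname = "service-c.execute-api.us-east-1.amazonaws.com"; // placeholder

const signed = await signer.sign(new HttpRequest({
  method: "GET",
  protocol: "https:",
  hostname,
  path: "/projects",
  headers: { host: hostname },
}));

// 3. Send the request to service C with the signed headers attached.
const res = await fetch(`https://${hostname}/projects`, {
  method: "GET",
  headers: signed.headers as Record<string, string>,
});
console.log(res.status);
```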