# sst
l
Question on integration tests: I ran `npx sst test` on an sst/cdk test that I wrote in the spirit of the `MyStack` test that's included with a new sst project. It passed just fine, but it did so without deploying anything. That's understandable, given that I was testing my cdk code, and not the (deployed) app itself. How do we run Jest integration tests AFTER deployment? Is the following good thinking (whether running tests manually against my own work, or for continuous integration on Travis, Seed, etc.)?
• run `npx sst deploy` to deploy to dev
• run `npx sst test`, which runs the Jest tests against the published APIs (but how will the tests know what the just-published API url is? Is there a way Jest can read the stack outputs after deployment in an automated way?)
• run `npx sst remove` to tear down
But then I don't understand when my `@aws/cdk-assert` tests are supposed to run (which my gut says should be BEFORE deploy), and when my `curl -X POST blah, blah, blah` integration tests are supposed to run, which need to run AFTER deployment. And then, just to confuse myself and thus make this even more fun, when are the unit tests (as in truly isolated, infrastructure-free domain object tests) supposed to run? My guess is that they can run at around the same time as the cdk tests, since they're independent of deployment. So, that would leave how to get those #@$% integration tests to run!.... šŸ¤”šŸ˜–
Some other thoughts to add:
• My inclination and habit around integration tests is to start clean every time. (Should we have a `test` environment for this approach? Or am I thinking wrong, and should I just test against `dev` with updates to the stacks on each deploy?)
• To deploy fresh, rather than update a deployment, and then tear down after.
• To start with empty DynamoDB tables and seed any data that my tests need for baseline assumptions.
• Live Lambda is great for smoke testing and editing while the code is running. But it's not my assumption that that's how integration tests would run. A full deploy seems right for that...
And, if I figure out something meaningful here, I'd be happy to write a solid section of documentation on a testing approach, broken down by the 3 types of testing, for serverless-stack.com. << Dangles šŸ„• in front of @Frank... >>
t
I wouldn't consider the type of test you're doing to be a test of your cdk stack. It looks like you're doing normal integration tests of your application. You are right that these tests might depend on output values from the cdk deploy. I think the `sst-env` package should be extended to help with this situation.
f
@Luke Wyman, `sst test` is for running unit tests, at least at the moment (given you just made me think more about integration tests lol). So you are right, there are 3 types of tests:
1. unit tests for SST code => run before deploy via `sst test`
2. unit tests for Lambda code => run before deploy via `sst test`
3. integration tests => run after deploy against deployed resources
For #3, you can use the `--outputs-file` option to dump the stack outputs to a JSON file, ie. `sst deploy --outputs-file outputs.json`; parse the values; and use them in the integration tests.
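As a sketch of that parse step: the helper below reads the outputs file and looks up one value. The stack name `dev-my-app-my-stack` and output key `ApiEndpoint` are placeholders, not anything sst fixes; the file shape shown matches CDK's `--outputs-file` JSON of stack name → output key → value.

```typescript
import * as fs from "fs";

// Shape of the JSON written by `sst deploy --outputs-file outputs.json`:
// { "<stack name>": { "<output key>": "<output value>", ... }, ... }
type StackOutputs = Record<string, Record<string, string>>;

// Look up one output value, failing loudly if it's missing so a
// misconfigured pipeline doesn't surface as confusing test failures.
export function getStackOutput(
  file: string,
  stackName: string,
  outputKey: string
): string {
  const outputs: StackOutputs = JSON.parse(fs.readFileSync(file, "utf8"));
  const value = outputs[stackName]?.[outputKey];
  if (value === undefined) {
    throw new Error(`No output "${outputKey}" on stack "${stackName}" in ${file}`);
  }
  return value;
}

// Hypothetical Jest usage:
// const apiUrl = getStackOutput("outputs.json", "dev-my-app-my-stack", "ApiEndpoint");
// const res = await fetch(`${apiUrl}/notes`);
```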
I think having an `sst integration-test` command, or having `sst-env` work with an existing integration test tool like @thdxr suggested, makes a lot of sense.
t
Ooh, didn't know about `--outputs-file`, can definitely take advantage of this.
l
Okay, here's the flow for the CI part of my pipeline (assuming the Jest path flag works with `npx sst test`). Under this scenario, make the `RemovalPolicy` on my DynamoDB tables dependent on stage: if `--stage test`, then `RemovalPolicy.DESTROY`, else `RemovalPolicy.RETAIN`, so that the DynamoDB tables are empty at the start of each test run, and there aren't any infrastructure artifacts hanging around in the test environment between runs. Deploying to dev updates the stack, so the Outputs remain the same and the DynamoDB tables are retained. I really like the blank slate for any "stateful" infrastructure on each test-environment run - I can seed any test data I need in my Cognito User Pool and DynamoDB tables. Very clean and deterministic. The flow would look something like this:
1. `npx sst test --testPathPattern=tests/sst`
2. `npx sst test --testPathPattern=tests/unit`
3. `npx sst deploy --stage test --outputs-file outputs.json`
4. `npx sst test --testPathPattern=tests/integration`
5. `npx sst remove --stage test`
6. `npx sst deploy --stage dev`
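The stage-dependent `RemovalPolicy` described above can be sketched as a tiny helper. The `removalPolicyFor` name is illustrative; it's modeled with string literals so the logic stands alone without `aws-cdk-lib`, and in real stack code you would map the result onto `cdk.RemovalPolicy`.

```typescript
// Decide how DynamoDB tables are treated on `sst remove`, based on
// the deploy stage. "DESTROY"/"RETAIN" stand in for the members of
// cdk.RemovalPolicy so this sketch runs without aws-cdk-lib.
type Policy = "DESTROY" | "RETAIN";

export function removalPolicyFor(stage: string): Policy {
  // Ephemeral `test` stacks start empty and are torn down fully;
  // every other stage (dev, prod, ...) keeps its tables.
  return stage === "test" ? "DESTROY" : "RETAIN";
}

// Hypothetical usage inside a stack definition:
// table.applyRemovalPolicy(
//   removalPolicyFor(stage) === "DESTROY"
//     ? cdk.RemovalPolicy.DESTROY
//     : cdk.RemovalPolicy.RETAIN
// );
```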
f
Hey @Luke Wyman, hmm.. are you running the same integration tests in steps 1 and 4?
l
No, that's called a typo, haha. I've edited to say sst tests.
f
that flow actually looks pretty neatšŸ‘
t
Don't know if this is too far but I actually wipe my dynamo tables between each test and reset to seed data
l
Okay, so you're doing a wipe "afterEach" on the dynamo tables. What command are you using to wipe?
t
```typescript
// Scan everything, then delete each row that has an `entity` attribute.
// (Using `map`, not `forEach`, so Promise.all actually waits on the deletes.)
const rows = await Dynamo.Table.scan()
await Promise.all(
  rows.Items
    .filter((item: any) => item.entity)
    .map((item: any) => Dynamo.Table.delete(item.entity, item))
)
```
I'm using dynamodb-toolbox
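A generalized sketch of that wipe, with the table abstracted behind a minimal interface (`TableLike` and `wipeTable` are illustrative names, not dynamodb-toolbox APIs) so the filter-and-delete logic can be exercised without AWS:

```typescript
// Minimal surface of the table the wipe needs: a full scan and a
// per-item delete, matching the shape of the dynamodb-toolbox calls.
interface TableLike {
  scan(): Promise<{ Items: any[] }>;
  delete(entity: string, item: any): Promise<void>;
}

// Delete every row that carries an `entity` attribute; returns how
// many rows were removed.
export async function wipeTable(table: TableLike): Promise<number> {
  const { Items } = await table.scan();
  const entityRows = Items.filter((item) => item.entity);
  await Promise.all(entityRows.map((item) => table.delete(item.entity, item)));
  return entityRows.length;
}

// Hypothetical Jest usage, wiping and reseeding between tests:
// beforeEach(async () => {
//   await wipeTable(Dynamo.Table);
//   await seedBaselineData();
// });
```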
l
cool, thanks! šŸ™‚
t
but ultimately just scanning the table and deleting each entry
l
yup