# docker
p
Hi everyone! Looking for ideas on how to set up a testing environment locally, in parallel with the dev environment. I’m using
docker/sdk
ofc. At the moment, the issue is that the same resources (DB, Redis, queues) are used by the testing environment, so if we have a test that needs a fresh DB state, we have a problem. In Testify’s Environment.php, APPLICATION_ENV is set to
devtest
. There’s even a configuration file for this environment, but how do you set it up? Basically there has to be a separate database (with a dedicated user), a separate RMQ vhost, and likely a separate Redis DB as well. That could be achieved by manually creating those resources, but is there any way to automate this? The only idea that comes to my mind is to have a separate
deploy.yml
file, but this has a big downside: every time you want to run tests, the environment would have to be rebuilt… Any ideas welcome in the thread. Thanks ❤️
l
Never tried that myself, but if it’s about your local env + testing, I would go and check out using a separate “region” for the testing environment. OOTB you have 2 regions, EU and US; they have separate resources (incl. DB). Then you pick a store from this new region and use it as your testing store:
```yaml
testing:
    store: ...
```
then you can introduce and maintain the import data you need for testing purposes
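Put together, the region approach might look roughly like this in `deploy.dev.yml`. This is only a sketch: the region, store, namespace, and credential names below are placeholders, and the exact keys can differ between docker/sdk versions, so check the deploy file reference for your release:

```yaml
regions:
    EU:
        # ... your existing dev region stays as-is ...
    TEST:                         # hypothetical extra region just for tests
        services:
            databases:
                test-db:
                    database: test_db       # separate schema
                    username: test_user     # dedicated user
                    password: changeme
        stores:
            TESTSTORE:
                services:
                    broker:
                        namespace: test-queue   # separate RMQ vhost
                    key_value_store:
                        namespace: 2            # separate Redis DB index

image:
    testing:
        store: TESTSTORE    # tells Testify which store to run tests against
```

The point of the sketch is that everything the tests touch (DB, broker vhost, Redis namespace) lives under the extra region, so the dev region’s data is never shared with test runs.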
But from another perspective - I would question this
> we have a test that needs a fresh state of DB
Why would you rely on some data being present/missing in the DB before a test starts? To make sure your test is isolated, you should prepare the data you need yourself, using TransactionHelper and DataBuilders.
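For the isolated-tests route, those helpers are typically enabled per suite in the `codeception.yml` of the module under test. A sketch only; the suite name and actor are made up for illustration, and the exact helper class paths can vary between Spryker versions:

```yaml
suites:
    Business:
        actor: FooBusinessTester   # hypothetical actor name
        modules:
            enabled:
                # rolls each test's DB work back in a transaction
                - \SprykerTest\Shared\Propel\Helper\TransactionHelper
                # cleans up entities registered for cleanup after each test
                - \SprykerTest\Shared\Testify\Helper\DataCleanupHelper
```

With the transaction helper active, data created via DataBuilders never hits the shared DB permanently, which is exactly the isolation being argued for here.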
On the second part of your question (the devtest env and its configuration): normally, the OOTB config is enough to run your tests. If you follow one of the approaches above (either a region with a separate DB, or isolated tests), you won’t need to automate or configure (almost) anything; you only need the right config in deploy.dev.yml
p
@little-umbrella-40933, thanks for your reply! We do prepare the data, but the problem arises in integration tests (e.g. the middleware data import process), during which the transaction gets committed and all the prepared data ends up in the database. Data cleanup fails too, because new entries are added during the data import.
👍🏻 1
on a subsequent test run, a failure happens because of duplicated SKUs
l
yes, acceptance/integration tests would have the issue you described, I agree. Unless you’re able to randomise the input in the acceptance test and still verify/compare the result on the frontend
Then go ahead and try out a new region setup for testing 😉
a
I remember there was a setup that would restore a database backup before each test suite. There should be a helper class for that
🤔 couldn't find anything. Perhaps my mind is playing tricks on me 🤪
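The “restore a database backup before each test suite” idea mentioned above can be sketched in a few lines. This uses SQLite purely so the example is self-contained; in a real Spryker setup the equivalent would be restoring a MySQL/PostgreSQL dump from a suite-level hook, and all names below are hypothetical:

```python
import shutil
import sqlite3
import tempfile
from pathlib import Path

def create_baseline(path: Path) -> None:
    """Build the known-good initial state once (e.g. after a data import)."""
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE product (sku TEXT PRIMARY KEY)")
    con.execute("INSERT INTO product VALUES ('SKU-001')")
    con.commit()
    con.close()

def restore_baseline(baseline: Path, live: Path) -> None:
    """Throw away whatever the last suite committed and start fresh."""
    shutil.copyfile(baseline, live)

workdir = Path(tempfile.mkdtemp())
baseline, live = workdir / "baseline.db", workdir / "test.db"
create_baseline(baseline)
restore_baseline(baseline, live)

# Suite 1 commits extra rows (simulating a committed data-import test)...
con = sqlite3.connect(live)
con.execute("INSERT INTO product VALUES ('SKU-002')")
con.commit()
con.close()

# ...but the next suite starts from the baseline again: no duplicate SKUs.
restore_baseline(baseline, live)
con = sqlite3.connect(live)
count = con.execute("SELECT COUNT(*) FROM product").fetchone()[0]
con.close()
print(count)  # prints 1
```

The trade-off is speed: restoring a full dump before every suite is slow for large databases, which is likely why the thread leans towards transaction-based isolation instead.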
l
@powerful-oyster-48938 Maybe there was something like that in the past… certainly before the docker/sdk era.
g
i’d just like to add my 2 cents here, as right now we have the same issue: imo the tests should indeed have their own database. if the same database is used, you never know what changes were made to the database structure or data before the tests run. it’s then impossible for a test to reliably prepare structure and data before execution (short of dropping and recreating the whole database, which would be slow and could also break your development setup). you need to be able to rely on some fixed initial state of the database. sure, some tests still need to prepare data (like adding a customer if something customer-related has to be integration-tested).
👍🏻 1
👍 2