terion
07/13/2018, 12:07 PM
migrations table (in fact replicating the local migration history). Maybe not a very robust solution, but it should work. One can locally build a migration process using temp tables and other tricks to alter/move/add structure and data, then "freeze" the process in the migration history, and the target server will replicate this process
B) Opposite direction: developers use a versioned migration approach, as in other frameworks, and after each migration the server generates and outputs the final datamodel as a helper, so developers can see the actual data state in one place
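As a rough sketch of what the server-generated output in approach B might look like (the type, field names, and directives here are illustrative assumptions in an SDL style, not a defined format):

```graphql
# Hypothetical datamodel emitted by the server after migrations run,
# summarizing the resulting state of the "Article" model in one place.
type Article {
  id: ID! @unique
  title: String!
  slug: String! @unique   # result of an addField migration with a unique index
  new_name: String        # result of a rename + nullable change
}
```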
The second variant I see as more solid and controllable, but it fundamentally changes the approach to data modeling and requires developing a special datamodel-builder syntax and/or a special cluster API for this. An example of a GraphQL API for migrations could look something like this:
mutation {
  model(
    name: "Article",
    addField: [{ name: "slug", default: "", index: INDEX_UNIQUE }],
    changeField: [{ name: "old_name", rename: "new_name", nullable: true }],
    removeField: [{ name: "old_name" }]
  ) {
    status
  }
}
So a migration would be a file with mutations like this. This approach also simplifies adding basic data: not seeds with test data, but system-critical data, like an initial admin account or initial settings, etc.
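A minimal sketch of such a migration file, mirroring the mutation syntax above (the `model` arguments and the `createUser` seed mutation are hypothetical names for illustration; the real cluster API would have to define them):

```graphql
# migrations/001_create_article.graphql (hypothetical file layout)

# Structural change: create/extend the "Article" model.
mutation CreateArticleModel {
  model(
    name: "Article",
    addField: [{ name: "slug", default: "", index: INDEX_UNIQUE }]
  ) {
    status
  }
}

# Seed system-critical data in the same migration,
# e.g. the initial admin account mentioned above.
mutation SeedInitialAdmin {
  createUser(data: { email: "admin@example.com", role: ADMIN }) {
    id
  }
}
```

Running migration files in order would then reproduce both the schema and the critical baseline data on any target server.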