# help
k
hey guys, using dynamodb and a lambda via the gateway api. What do you guys do when you're posting an update that creates a row in your database, but you want to make sure it's unique? For example, if I were to run

```shell
curl -X POST -H 'Content-Type: application/json' -d '{"emailAddress":"Example@123.com"}' https://endpoint.amazonaws.com/subscribe
```

the first time it would add the entry into DynamoDB, but the second time it would reject because it already exists. Currently when I add, it just adds a second item into my database. My table creation is as follows:

```js
this.EmailSubscribersTable = new sst.Table(this, "Email_Subscribers", {
  fields: {
    subscribeId: sst.TableFieldType.STRING,
    emailAddress: sst.TableFieldType.STRING,
    confirmed: sst.TableFieldType.BOOL,
    timestamp: sst.TableFieldType.STRING,
  },
  primaryIndex: { partitionKey: "subscribeId", sortKey: "emailAddress" },
});
```
t
DynamoDB is actually a lot more of an advanced tool than it initially seems; it takes a bit of work to learn how to model things properly. To directly answer your question, I believe you can use a conditional expression with your put to reject if the primary key is already there.

Sorry, didn't see that you wanted to keep email unique. I don't think you can do this with your current data model; you'd likely need a secondary index, but there are better ways to model this: https://aws.amazon.com/blogs/database/simulating-amazon-dynamodb-unique-constraints-using-transactions/

You should probably look into single table design to learn those patterns - having a table per concept usually isn't the right approach
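A rough sketch of the pattern from that blog post: write the real item plus a "marker" item that reserves the email, in one transaction, with both puts conditioned on the key not existing. The key shapes and names here are illustrative, not taken from your table.

```javascript
// Sketch of the transaction-based unique constraint. The whole
// transaction is cancelled if either conditional put fails, so the
// email can never be claimed twice. Key names are illustrative.
function buildUniqueSubscribeTx(tableName, subscribeId, emailAddress) {
  return {
    TransactItems: [
      {
        Put: {
          TableName: tableName,
          // The real item
          Item: {
            pk: `subscribe#${subscribeId}`,
            sk: `subscribe#${subscribeId}`,
            emailAddress,
          },
          ConditionExpression: "attribute_not_exists(pk)",
        },
      },
      {
        Put: {
          TableName: tableName,
          // Marker item whose only job is to reserve the email address
          Item: {
            pk: `email#${emailAddress}`,
            sk: `email#${emailAddress}`,
          },
          ConditionExpression: "attribute_not_exists(pk)",
        },
      },
    ],
  };
}

// Pass the result to DocumentClient.transactWrite(params).promise()
```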
r
Get yourself The DynamoDB Book by Alex DeBrie, it's a very readable masterclass in designing data models for DynamoDB
k
nice
is this link a hacky way of doing it though?
pk using email#bobby.tables@gmail.com
t
All of dynamo feels like a hacky way of doing things haha but it's not
It's just different from traditional DBs
My general advice is dynamo is an excellent tool to pair with serverless, but it's not something you can incrementally learn. There's a fairly high upfront cost to learning the patterns and re-using knowledge from relational DBs won't work. If you're new to serverless, it might be a lot to learn serverless and also learn Dynamo at the same time. Would suggest using RDS serverless (which we have a new construct for) or planetscale
k
I dont mind learning both 😄 - I have to skill up somehow
so on insert you add your row data:

```js
fields: {
  subscribeId: sst.TableFieldType.STRING,
  emailAddress: sst.TableFieldType.STRING,
  confirmed: sst.TableFieldType.BOOL,
  timestamp: sst.TableFieldType.STRING,
},
```

then you add another:

```js
fields: {
  subscribeId: sst.TableFieldType.STRING,
},
```

where subscribeId = emailAddress#Example@123.com ?
t
yeah in that case I recommend that book Ross mentioned but also this video:

https://www.youtube.com/watch?v=Xn12QSNa4RE&t=1382s

yes...but I wouldn't have the table structure you have to begin with since I'd opt for single table design
And I usually use a library like this: https://github.com/sensedeep/dynamodb-onetable Or this one which I haven't tried yet: https://github.com/tywalch/electrodb
k
ah okay
I need to go away and hit DynamoDB and learn the onetable pattern 🙂
thanks a lot
lol...
Preface My DynamoDB story begins with an abundance of unfounded confidence, slowly beaten out of me by the club of experience.
t
hahah
don't worry, some of us made it all the way to production on unfounded confidence
k
😄
To my readers born after 1995, a "phone book" was a real-life physical book containing the names, addresses, and phone numbers of everyone in a particular location. The phone book was in alphabetical order by last name. Imagine that! This was before fancy things like Google and Facebook rendered phone books obsolete for anything other than kindling.
d
I came late to this, but I see no reason why a `ConditionExpression` wouldn't work given the data model he has. Both PK and SK (the email) will be checked, and it will fail to create if not unique.
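As a sketch of what that could look like against the table definition at the top of the thread (field names follow that definition; the rest is illustrative, not tested against that stack):

```javascript
// Sketch: a plain put that fails with ConditionalCheckFailedException
// if an item with this exact subscribeId + emailAddress pair already
// exists. Note it only guards the full key pair, not the email alone.
function buildConditionalPut(tableName, subscribeId, emailAddress) {
  return {
    TableName: tableName,
    Item: {
      subscribeId,
      emailAddress,
      confirmed: false,
      timestamp: new Date().toISOString(),
    },
    ConditionExpression:
      "attribute_not_exists(subscribeId) AND attribute_not_exists(emailAddress)",
  };
}

// await client.put(buildConditionalPut(...)).promise()
```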
k
mangling my brain trying to comprehend lol
@Derek Kershner it would solve my problem, but understanding how to leverage DynamoDB correctly was the more valuable lesson in this thread, specifically the single table design principle, which comes with learning to create ERD diagrams and understanding access patterns to design an application with an efficient model
it's quite cool because initially I want to capture someone's email address and that's all I will know about them, but later on if they register in my application I want to link that email address to the person who registered. So I'm starting my single table with only an email address, but then once they register I have more data about them; once they complete their profile I again have more info; and once they go through a shopping cart I again have more information. At the same time they might never register, or they might register but never sign up to the newsletter lol
and then there's the fact that people can have multiple email addresses 😄
I could use a complex type for the multiple email addresses actually
instead of email -> example@123.com I could do email -> [example@123.com, mysecondemail@123.com]
d
You can't use a list in a PK or SK, though
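One common alternative (a sketch, with illustrative key names) is to store one item per email address under the same user partition key, so each email stays individually addressable and queryable:

```javascript
// Sketch: model multiple emails as multiple items sharing a partition
// key, instead of a list attribute (lists can't be part of a key).
function buildEmailItems(userId, emails) {
  return emails.map((email) => ({
    pk: `user#${userId}`,
    sk: `email#${email}`,
  }));
}
```

A query on `pk = user#<id>` with `begins_with(sk, "email#")` then returns all of a user's addresses.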
k
true. Also struggling to figure out whether to use Cognito and post-hook that data to DynamoDB... at least the email and id or something so Cognito identities are linked to a customer identity.
r
That's how we do it, we have a user signup and record the necessary information in DDB in the postConfirmation lambdaTrigger
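A minimal sketch of that idea: the postConfirmation event carries the confirmed user's attributes, so the trigger can build the DDB item from them. The DDB field names here are illustrative, not Ross's actual schema:

```javascript
// Sketch: derive the DynamoDB item for a newly confirmed Cognito user
// from the postConfirmation trigger event.
function buildUserItem(event) {
  const attrs = event.request.userAttributes;
  return {
    pk: `user#${attrs.sub}`,
    sk: `user#${attrs.sub}`,
    email: attrs.email,
    createdAt: new Date().toISOString(),
  };
}

// In the real trigger you'd do something like:
//   await client.put({ TableName, Item: buildUserItem(event) }).promise();
//   return event; // Cognito expects the event back to continue the flow
```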
k
thanks Ross.
do you use hooks to keep in sync with cognito too? say if someone changes something in cognito? or do you stick to immutable data?
r
We don't provide the facility for them to change anything (other than password) in Cognito so it doesn't really apply
k
I mean what I'm struggling with right now is creating my ERD but what's the "start" point. I know I need to store emails before someone creates a cognito identity. For me I will allow things like password changes and MFA and federated identities - all stuff that cognito does well, but I probably want to link an identity to my dynamodb item via the cognito id for the identity. One thing that worries me is if you change anything in the cognito bit it can wipe out your user data
r
We use the sub field as the PK in DDB. Email could be an SK, so you could have a secondary index that flips them and allows you to retrieve by email address should you need to. There's no reason a change should wipe your data though, if you write your updates correctly
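A sketch of the lookup side of that design, assuming a GSI that flips the keys; `byEmail` is an illustrative index name, not something from Ross's stack:

```javascript
// Sketch: with sub as PK and email as SK, a GSI with the keys flipped
// lets you retrieve a user by email address.
function buildQueryByEmail(tableName, email) {
  return {
    TableName: tableName,
    IndexName: "byEmail", // hypothetical GSI: partition key = email
    KeyConditionExpression: "email = :e",
    ExpressionAttributeValues: { ":e": email },
  };
}

// await client.query(buildQueryByEmail(...)).promise()
```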
k
ah okay thanks
d
We don't copy, only supplement, Cognito data, so Cognito is the source of truth for the data it holds. Syncing it would be tough, I don't know of a hook that fires on info change.
k
using https://github.com/sensedeep/dynamodb-onetable how do you get sst to deploy it? 🙂
r
We just deploy our table with generic names for PK and SK. Any GSIs we have to mirror in the schema. The schema definition in onetable is more a runtime thing in terms of how it handles look ups
k
something like this @Ross Coundon?

```js
this.table = new sst.Table(this, "SingleTable_V1", {
  fields: {
    userId: sst.TableFieldType.STRING,
    email: sst.TableFieldType.STRING,
  },
  primaryIndex: { partitionKey: "userId", sortKey: "email" },
});
```
what I don't quite understand yet is whether you need to create the whole schema in advance in the table or if you can just add to the single table
like if I deployed a table with id, tmp, can I add to the table with something not defined? like create into table id, tmp, beta
where id = id
r
In one table design you're generally going to have a partition key called PK and sort key called SK. That's all DynamoDB needs to know about (unless you have a TTL field too). Your schema in onetable needs to define what each model has in terms of fields and how you overload your PK and SK
k
thanks @Ross Coundon
do you have default fields set too?
in the initial bit
Error: No fields defined for the "SingleTable_V2" Table
```js
this.table = new sst.Table(this, "SingleTable_V2", {
  fields: {
    //   userId: sst.TableFieldType.STRING,
    //   email: sst.TableFieldType.STRING,
  },
  primaryIndex: { partitionKey: "PK", sortKey: "SK" },
});
```

would it be

```js
this.table = new sst.Table(this, "SingleTable_V2", {
  fields: {
    PK: sst.TableFieldType.STRING,
    SK: sst.TableFieldType.STRING,
  },
  primaryIndex: { partitionKey: "PK", sortKey: "SK" },
});
```

???
oooh think i got it!
so I did do

```js
this.table = new sst.Table(this, "SingleTable_V2", {
  fields: {
    pk: sst.TableFieldType.STRING,
    sk: sst.TableFieldType.STRING,
  },
  primaryIndex: { partitionKey: "pk", sortKey: "sk" },
});
```
then had some jiggling about
created a route
```js
routes: {
  "POST /test": {
    function: {
      srcPath: "services/test/",
      handler: "create.main",
      environment: { TABLE_NAME: props.table.tableName },
      permissions: [props.table],
    },
  },
},
```
then i created the js
```js
import handler from "../../util/handler";
import { Table } from "dynamodb-onetable";
import AWS from "aws-sdk";

const client = new AWS.DynamoDB.DocumentClient();

export const main = handler(async (event) => {
  const MySchema = {
    format: "onetable:1.1.0",
    version: "0.0.1",
    indexes: {
      primary: { hash: "pk", sort: "sk" },
      gs1: { hash: "gs1pk", sort: "gs1sk", follow: true },
      ls1: { sort: "id", local: true },
    },
    models: {
      Account: {
        pk: { type: String, value: "account:${name}" },
        sk: { type: String, value: "account:" },
        //id: { type: String, generate: "ulid", validate: /^[0-9A-F]{32}$/i },
        id: { type: String, generate: "ulid" },
        name: { type: String, required: true },
        status: { type: String, default: "active" },
        zip: { type: String },
      },
      User: {
        pk: { type: String, value: "account:${accountName}" },
        //sk: { type: String, value: "user:${email}", validate: EmailRegExp },
        sk: { type: String, value: "user:${email}" },
        id: { type: String, required: true },
        accountName: { type: String, required: true },
        email: { type: String, required: true },
        firstName: { type: String, required: true },
        lastName: { type: String, required: true },
        username: { type: String, required: true },
        role: { type: String, enum: ["user", "admin"], required: true, default: "user" },
        balance: { type: Number, default: 0 },
        gs1pk: { type: String, value: "user-email:${email}" },
        gs1sk: { type: String, value: "user:" },
      },
    },
    params: {
      isoDates: true,
      timestamps: true,
    },
  };

  const table = new Table({
    client: client,
    name: "kldev-OxyDjinn-SingleTable_V2",
    schema: MySchema,
  });

  // CREATE
  const Account = table.getModel("Account");
  let account = await Account.create({
    id: "2ab73092-82bc-47e6-8271-481457eca6e3",
    name: "Acme Airplanes",
  });
  return account;
});
```
(working on crud testing)
and that did work nicely!
(ignore the post -d, I haven't wired that bit up yet)
```shell
curl -X POST -H 'Content-Type: application/json' -d '{"emailAddress":"Example@123.com"}' https://blah.execute-api.us-east-1.amazonaws.com/test
{"id":"2ab73092-82bc-47e6-8271-481457eca6e3","name":"Acme Airplanes","status":"active","created":"2022-02-12T22:06:39.254Z","updated":"2022-02-12T22:06:39.254Z"}
```
woot! the main gotcha for me was a bunch of permission errors because of the table name `kldev-OxyDjinn-SingleTable_V2`, but once I figured that out it started working
r
Great, re permissions, in your stack you can do `permissions: [myTable]` in the function definition
k
I just did

```js
const table = new Table({
  client: client,
  name: process.env.TABLE_NAME,
  schema: MySchema,
});
```

worked a treat
thanks so much
what would be the best practice for this

```js
// Scan is not normally advised -- scans entire table
let users = await User.scan({});
console.log("FOUND users", users);
for (let user of users) {
  await User.remove({ id: user.id });
}
```
say if i was an admin on an admin console and i wanted to get a list of users for some reason
r
You would typically have an SK that starts with USER, where you specify a known PK and SK begins_with USER
This is why you need to define your data access patterns up front so you can design the data structures and key overloading to get what you need
Sometimes this is via a GSI where you do things like reversing PK and SK. There are lots of good examples in The DynamoDB Book
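With the onetable schema earlier in this thread (pk `account:${accountName}`, sk `user:${email}`), the raw DynamoDB query that replaces the scan would look roughly like this sketch:

```javascript
// Sketch: list all users in an account with a Query instead of a Scan.
// Key values follow the onetable schema shown earlier in the thread.
function buildListUsersQuery(tableName, accountName) {
  return {
    TableName: tableName,
    KeyConditionExpression: "pk = :pk AND begins_with(sk, :prefix)",
    ExpressionAttributeValues: {
      ":pk": `account:${accountName}`,
      ":prefix": "user:",
    },
  };
}

// await client.query(buildListUsersQuery(...)).promise()
```

Unlike a scan, this only reads the one partition, so cost doesn't grow with the size of the whole table.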
k
nice thanks. I did go through the book, but as I'm starting to do things in practice too I can now understand the whys. Sometimes you need to hit a problem in the real world to get the best learning experience
r
Absolutely, building ToDo apps can only get you so far. 😁
k
😄
have you ever used onetable and needed to get the cancellation reason from a transact? I'm struggling to get the reason and google isn't helping me one bit 😞
```js
let user = await table.transact('write', transaction).catch((error) => {
  let errorMessage = 'Could not create user';
  console.log(error);
  console.log(transaction.TransactItems);
  return {
    error: errorMessage,
  };
});
```
```
OneTableError: OneTable execute failed "transactWrite" for "_Generic". Transaction cancelled, please refer cancellation reasons for specific reasons [ConditionalCheckFailed, None]
```

I can't seem to get the cancellation reasons
r
You could try using sst start in debug mode and step into the onetable code to see what’s going on
k
i can do that?!
lol thanks Ross i will give that a go
r
Arthur C. Clarke's 3rd Law: "Any sufficiently advanced technology is indistinguishable from magic"
k
I couldn't solve how to pull the exact errors, so had to hack around it 😞
```js
let user = await table.transact('write', transaction).catch((error) => {
  let errorMessage = 'Could not create user';
  if (error.context.err.code === "TransactionCanceledException") {
    // we have to split the err.message to get the transactional failures;
    // a None is OK but anything else is not. Trim each piece because the
    // message has a space after the comma.
    var errors = error.context.err.message
      .split('[')[1].split(']')[0].split(',')
      .map((s) => s.trim());
    if (errors[0] === 'ConditionalCheckFailed') {
      errorMessage = 'User with this username already exists.';
    } else if (errors[1] === 'ConditionalCheckFailed') {
      errorMessage = 'User with this email already exists.';
    }
    console.log(error.context.err.message);
  }
  return {
    error: errorMessage,
  };
});
```
r
Good sleuthing but there must be a better way! I've not done much with transactions in onetable so I don't have any knowledge of what that might be though. As an aside, I'd get out of the habit of using var, the scoping is too lax, let and const are safer
k
I updated to version 3 of the AWS SDK and it allowed me to get rid of that split with:

```js
var errors = error.context.err.CancellationReasons
if (errors[0].Code === 'ConditionalCheckFailed') {
  errorMessage = 'User with this username already exists.'
} else if (errors[1].Code === 'ConditionalCheckFailed') {
  errorMessage = 'User with this email already exists.'
}
```
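The same idea could be factored into a small helper that maps each cancellation reason to a message by its position in the transaction (a sketch; the messages assume the username condition comes first, as in the code above):

```javascript
// Sketch: translate a TransactWriteItems CancellationReasons array into
// a user-facing message, matching reasons to transaction items by index.
function explainCancellation(reasons) {
  const messages = [
    "User with this username already exists.",
    "User with this email already exists.",
  ];
  for (let i = 0; i < reasons.length; i++) {
    if (reasons[i].Code === "ConditionalCheckFailed") {
      return messages[i];
    }
  }
  return "Could not create user";
}
```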