# help
f
Hi all! I am just trying to use Supabase for the first time. Is someone able to help teach me some of the basics? I'm not sure if this should be in the "JavaScript" channel or here, but I am building a small app with JavaScript to fetch data from an API and store it in a table. I have little experience with backend work, so I am hoping to get some help, as I am having some trouble wrapping my head around the documentation!
t
Hi, welcome! I'm also new to Supabase, though I have backend experience with a very similar stack to the one Supabase seems to use.
f
Thanks! Nice to meet you.
t
My frontend isn't very strong, though I did the Svelte tutorials about 6 months ago. So I used the Getting Started with Svelte guide, copied it as closely as possible, and had a positive experience.
What JS framework do you intend to use, if any?
f
I am just working with vanilla JavaScript at the moment.
Essentially, I am fetching data from an API, looping through it and then I would like to store some of the data into a table in Supabase for use on the front-end.
I am able to fetch the data, loop through it, and pull the values I want into three variables that I'd like to store in Supabase. For simplicity, let's call the variables
```
name, price, volume
```
t
So you will need to create a table for that.
f
I want to store this data for later use... Now, the tricky part for me to understand is that I have this running on a cron job, so it will be doing this every hour. I don't want to create a new row each time the cron runs; instead I want to check if the name already exists in the DB and, if it does, append the price and volume to that row's columns, so I think I would need those columns to be JSON objects?
(hopefully that makes some sense in what I'm trying to accomplish)
t
The backend of Supabase is Postgres. What you describe is an upsert. Below is a snippet from the Svelte guide:
```js
async function updateProfile() {
  try {
    loading = true
    const user = supabase.auth.user()

    const updates = {
      id: user.id,
      username,
      website,
      avatar_url,
      lat,
      lon,
      location_interior_pnt,
      location_the_geom,
      updated_at: new Date(),
    }

    let { error } = await supabase.from('profiles').upsert(updates, {
      returning: 'minimal', // Don't return the value after inserting
    })

    if (error) throw error
  } catch (error) {
    alert(error.message)
  } finally {
    loading = false
  }
}
```
The guide is: https://supabase.io/docs/guides/with-svelte It's simple, and it looks like mostly vanilla JS to me.
f
The goal is that on the front end I can eventually fetch data by name and create charts based on all of the stored prices and volumes. I was thinking it should look something like this:
```
Name: FreakDJ
    Price: { 50.5, 55.7, 60, 58.2 }
    Volume: { 302, 393.2, 309.2 }
```
So I have my Table created so that name is text, price is json and volume is json.
t
sounds good.
f
My code looks something like this:
```js
data.results.forEach((x) => {
  const Name = x.name;
  const currentPrice = x.Price.toFixed(2);
  const allTimeVolume = x.txVolume.toFixed(2);

  if (Name === "exampleName") {
    supabase.from("TestDB").insert([
      {
        Name: Name,
        Price: currentPrice,
        Volume: allTimeVolume,
      },
    ]);
  }
});
```
Since it's an empty table right now, I was just trying to get it to insert something before adjusting it to be an 'upsert'.
t
So the table is called "TestDB"
f
Yeah, just testing for now
sorry the code is a bit of a mess - I was just going for quick functionality and refactor later 😅
t
Are you getting an error?
f
I am not seeing one, but I wasn't sure how to output one either
t
I've got to head back to work, though I'm happy to help out over the weekend.
f
Have a nice time at work. Hopefully it's a nice day for you. Thanks for taking a look! I will keep trying some things out and try to get some output working properly.
g
If you are testing make sure you turn off RLS on any table and get it working, then learn policies. RLS can fail silently in many cases.
f
RLS is not enabled on my table currently - is that what you mean by turning it off?
g
Yes.
f
Good, that should be disabled for my testing then! I will look into policies once I can get this working.
g
Just glancing at your code I don't see an await before the supabase call or a .then() after
f
Trying to add await before gives me a SyntaxError: Unexpected reserved word
g
So supabase calls return promises and are async. You can do .then((response) => { console.log(response) }) if you're not in an async function (the only place you can use await).
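For example, a rough sketch using the table and column names from your snippet:
```js
// Inside an async function you can await the call directly:
async function insertRow(row) {
  const { data, error } = await supabase.from("TestDB").insert([row]);
  if (error) console.error(error);
  return data;
}

// In plain, non-async code, chain .then() instead; the query only actually
// runs once .then() (or await) is attached to it:
supabase
  .from("TestDB")
  .insert([{ Name: "exampleName", Price: "50.50", Volume: "302.00" }])
  .then((response) => console.log(response));
```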
f
Ah that response allows me to see an error message
g
It is also required to actually run the supabase call. Otherwise you are just setting it up.
f
The error message is: 'column "Volume" of relation "TestDB" does not exist', but I can see it in my table?
Ah, never mind - there was an accidental space after the word Volume in my column name.
I was able to successfully insert the data once! Thank you. Now I need to learn how to do 'upsert' to append the price and volume each time the cron job runs...
g
Just remember, if you are writing plain JavaScript you need a .then() instead of await on any supabase call, unless you are already in an async function.
All the guides use await, I believe, so it can be a bit confusing.
f
Thank you, I will note that!
I suppose my code might be a bit awkward - probably because I am still trying to grasp async/await.
g
Also, you can look at the network tab - you should see the GET/POST requests with response data, to double-check.
f
But that's the point of me trying to build these projects - so I can learn more about how they work!
g
At this point you need to learn that as everything is going async.
Have fun.
f
So my understanding is upsert will insert the data if it doesn't exist, and update it if it does; however, I don't want to overwrite what I have but rather append the new data to the price and volume columns. Is this possible with upsert?
I also tried changing `insert` to `upsert` and that still created a new row - did I do something wrong?
t
you need a unique key for upsert
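Roughly something like this (untested sketch - I believe the client takes an onConflict option, and it assumes you've added a unique constraint on Name):
```js
// Upsert: inserts the row, or updates the existing one when a row with the
// same Name is already there (needs a unique constraint/index on Name).
// This would go inside your forEach, where currentPrice and allTimeVolume exist.
supabase
  .from("TestDB")
  .upsert(
    { Name: Name, Price: currentPrice, Volume: allTimeVolume },
    { onConflict: "Name" }
  )
  .then((response) => console.log(response));
```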
f
the name should be unique
t
yes.
will you consider having a new row per name, price, volume combination? How big is it likely to get?
f
I am hoping to have one row per name - and then in that row just append price and volume to a json object each hour
t
You could append to the price and volume arrays in an update trigger, though the DB will need to read/update that row every time.
f
So I don't want upsert but rather read and update the row?
t
I understand that now. Though a custom update trigger can do what you want. You would be able to do new.price = old.price || new.price -- use || as it's the Postgres array concatenation operator, as per https://www.postgresql.org/docs/13/functions-array.html
I'd better get going again, though for the update trigger it would be something like https://discord.com/channels/839993398554656828/869405720934744086/908830212538646608
f
Hmm, I am not sure I am understanding... I think I get the concept in English, but I'm not sure how to translate it to code.
```js
// Goal: if a row with 'name' exists, append price to its json array and volume
// to its json array; else insert a new row

// how can I check if a row with this name exists already before trying to update?
if ( name exists ) {
  supabase.from('TestDB').update({ /* also not sure how to just append it to the array here */ }).eq('name', name).then((response) => { console.log(response) });
} else {
  // insert the data for the first time
}
```
g
So you have 3 ways to approach this:
1) Just write a new row with your 3 values every hour, then deal with putting the data together with reads.
2) Read and then update an array or json column. Arrays of ints/strings are supported as a column type.
3) Write an rpc function that does the array appends in SQL, which is supported.
The knowledge of Postgres/Supabase required goes up with each, but the last is probably the most performant, depending on what you are really doing. I'm also not sure what your plan is for the max items in an array, as far as reading and writing huge amounts of data at some point.
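For (2), a rough sketch with the js client (untested; it assumes your TestDB column names and that Price and Volume are json columns holding arrays):
```js
// Option 2 sketch: read the existing row for this name, then either append to
// its Price/Volume arrays or insert a fresh row.
async function recordReading(name, price, volume) {
  const { data, error } = await supabase
    .from("TestDB")
    .select("Name, Price, Volume")
    .eq("Name", name);
  if (error) throw error;

  if (data && data.length > 0) {
    // Row exists: send back the arrays with the new values appended.
    const row = data[0];
    const { error: updateError } = await supabase
      .from("TestDB")
      .update({ Price: [...row.Price, price], Volume: [...row.Volume, volume] })
      .eq("Name", name);
    if (updateError) throw updateError;
  } else {
    // First reading for this name: start each array with a single value.
    const { error: insertError } = await supabase
      .from("TestDB")
      .insert([{ Name: name, Price: [price], Volume: [volume] }]);
    if (insertError) throw insertError;
  }
}
```
Keep in mind that both arrays travel over the wire on every read and update with this approach.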
f
I am thinking I'll have around 100-200 unique "names" that will fetch the price and volume hourly… and I will need the data set up in the database so that I can create a chart for price vs time and volume vs time… Of the three ways you mention, which might be the preferred way? Also, I hadn't considered data cleanup to be honest, so I might end up filling the db quickly unless I remove older data at some point, but I'd want to keep as much as possible I think…
g
I'm thinking your approach of just storing data values might be flawed if you are dealing with something like stocks. What if the API is down, the database goes down, there are internet issues, etc.? Now you have gaps of missing data, but no timestamp to reconcile what's missing.
f
What do you propose as a solution? By the way, I am using this more to learn than to make a product that I will sell and/or that needs to be perfect, so if something went down for a brief time I think it would be OK.
I suppose I can probably use a created_at column and just insert many rows for each "name".
My DB would have many rows, and I'd have to read all rows with a specific name and sort by created_at date when displaying the data…
Just not too sure how quickly my db would fill up that way, creating many more rows than I thought and having to read them all when building the front end.
t
I encourage you to try the many-rows approach you describe; it's the way relational databases like Postgres are designed to be used. Having many lean rows would not be much larger than a few bloated rows.
g
I agree with @User that you should do a row per reading, especially if you're just experimenting, as it is the "right" way. The idea of reading and then updating an ever-growing array is not going to work well long-term, as all the data is transferred twice for every update. There certainly are other approaches, like "daily" rows with 24-entry arrays, that are compromises between number of rows and reasonable size, but Postgres is easily able to handle millions of small rows.
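If it helps, a rough sketch of that row-per-reading version (untested; it assumes Price and Volume become plain numeric columns and that the table has a created_at timestamp column defaulting to now()):
```js
// Row per reading: insert name, price and volume each hour and let a
// created_at column (default now()) record when it happened.
function recordReading(name, price, volume) {
  return supabase
    .from("TestDB")
    .insert([{ Name: name, Price: price, Volume: volume }])
    .then((response) => {
      if (response.error) console.error(response.error);
      return response;
    });
}

// Reading back for a chart: every row for one name, oldest first.
function loadHistory(name) {
  return supabase
    .from("TestDB")
    .select("Price, Volume, created_at")
    .eq("Name", name)
    .order("created_at", { ascending: true })
    .then((response) => response.data);
}
```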