ola berglund
05/22/2023, 6:29 PM
https://cdn.discordapp.com/attachments/1110273262816075786/1110273263294222356/image.png
https://cdn.discordapp.com/attachments/1110273262816075786/1110273263659143319/image.png
taylor
05/22/2023, 6:59 PM
Hugos
05/22/2023, 7:00 PM
limitlessloop
05/22/2023, 8:14 PM
```js
const { data, error } = await supabase
  .storage
  .from('snippet_thumbnails')
  .list('private')
```
but from what I can tell, it doesn't return a URL or content. What am I missing?
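For reference: in supabase-js v2, list() only returns file metadata (names, ids, sizes); getting a URL or the file contents is a separate call per object. A rough sketch, assuming the bucket stays private (createSignedUrl) rather than public (getPublicUrl), and reusing the client from the snippet above:
```ts
// Sketch: list() gives metadata only; request a signed URL (or download) per file.
const { data: files, error } = await supabase.storage
  .from('snippet_thumbnails')
  .list('private')

if (!error && files) {
  for (const file of files) {
    // Signed URL valid for 60 seconds; for a public bucket use getPublicUrl instead.
    const { data: signed } = await supabase.storage
      .from('snippet_thumbnails')
      .createSignedUrl(`private/${file.name}`, 60)

    console.log(file.name, signed?.signedUrl)
  }
}
```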
LittlePinkCookie
05/22/2023, 8:41 PM
```js
export const listenUsers = (onChange: (payload: RealtimePostgresChangesPayload<User>) => void) => {
  supabaseClient.channel('users')
    .on('postgres_changes', { event: '*', schema: 'auth', table: 'users' }, onChange)
    .subscribe()
}

listenUsers(async (change) => {
  // New user has been created
  if (change.eventType === 'INSERT') {
    console.log("~ file: events.ts:17 ~ listenUsers ~ change:", change)
    // some actions
  }
})
```
But the output of the console log is
~ file: events.ts:17 ~ listenUsers ~ change: {
schema: 'auth',
table: 'users',
commit_timestamp: null,
eventType: 'INSERT',
new: {},
old: {},
errors: [ 'Error 401: Unauthorized' ]
}
How can I achieve that?
Thanks in advance
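The 401 in the payload is consistent with the auth schema not being available to Realtime subscriptions; the usual workaround is to mirror signups into a table in the public schema (e.g. a database trigger that copies new auth.users rows into a profiles table) and subscribe to that instead. A rough TypeScript sketch, where public.profiles is an assumed mirror table:
```ts
// Sketch: listen on an assumed public.profiles table that a trigger fills on signup,
// instead of auth.users, which Realtime won't deliver changes for here.
supabaseClient
  .channel('profile-inserts')
  .on(
    'postgres_changes',
    { event: 'INSERT', schema: 'public', table: 'profiles' },
    (payload) => {
      console.log('new user profile:', payload.new)
    }
  )
  .subscribe()
```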
osamita
05/22/2023, 9:02 PM
Mr.Furious
05/22/2023, 9:08 PM
Lyingcap
05/22/2023, 9:13 PM
https://cdn.discordapp.com/attachments/1110314634474422325/1110314634583478302/image.png
formigueiro
05/22/2023, 9:45 PM
Failed to delete user: update or delete on table "users" violates foreign key constraint "objects_owner_fkey" on table "objects"
https://cdn.discordapp.com/attachments/1110322447615525036/1110322448408260628/image.png
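That constraint is storage.objects.owner referencing the user, so the user's uploaded objects have to go before the user row can be deleted. A hedged sketch using a service-role client (adminClient, the 'avatars' bucket name, and the one-folder-per-user layout are all assumptions):
```ts
// Sketch: remove the user's storage objects first, then delete the user.
// 'avatars' and the userId-named folder are assumptions about the bucket layout.
const { data: files } = await adminClient.storage.from('avatars').list(userId)

if (files?.length) {
  await adminClient.storage
    .from('avatars')
    .remove(files.map((f) => `${userId}/${f.name}`))
}

const { error } = await adminClient.auth.admin.deleteUser(userId)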
pshushereba
05/23/2023, 12:53 AM
mephiles
05/23/2023, 2:25 AM
Failed to create function: failed to create pg.functions: column "id" does not exist
Maybe it's because the Id field is a primary key somehow? I don't know. :/
Any hints about passing the arguments?
In the advanced settings, I've selected:
Language = sql
Behavior = stable
https://cdn.discordapp.com/attachments/1110392997578805248/1110392997725618206/Screenshot_2023-05-23_100948.png
Nixelite
05/23/2023, 2:39 AM
```powershell
$apiKey = "MY-SECRET-API-KEY"
$baseUrl = "https://<MY-SECRET-SUPABASE-PROJECT>.supabase.co"
$endpoint = "MY-SECRET-ENDPOINT"
$headers = @{
"apikey" = $apiKey
"Authorization" = "Bearer $apiKey"
}
$response = Invoke-RestMethod -Uri "$baseUrl/rest/v1/$endpoint?select=key" -Method GET -Headers $headers
$response | ConvertTo-Json
```
I'm using this, and it returns: Invoke-RestMethod : The remote server returned an error: (404) Not Found.
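A 404 on /rest/v1/... usually means the last path segment isn't an existing table or view in the exposed schema (or the project URL is wrong), rather than an auth problem. For comparison, a minimal TypeScript fetch against the same endpoint shape, with placeholder project ref and table name:
```ts
// Sketch: the segment after /rest/v1/ must be a table or view in the exposed schema.
const apiKey = process.env.SUPABASE_ANON_KEY!
const baseUrl = 'https://<project-ref>.supabase.co'

const res = await fetch(`${baseUrl}/rest/v1/my_table?select=key`, {
  headers: {
    apikey: apiKey,
    Authorization: `Bearer ${apiKey}`,
  },
})
console.log(res.status, await res.json())
```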
confidential
05/23/2023, 3:10 AM
mdc405
05/23/2023, 3:57 AM
error trying to connect: tcp connect error: Cannot assign requested address (os error 99)
I've started my supabase stack and the edge function is issuing a simple await supabase.from('foo').select('id').eq('id', 'a')
The REST query that's generated is http://localhost:54321/rest/v1/foo?select=id&id=a
and this route is accessible from a browser but my edge function is failing to make this call.
The full error is:
TypeError: error sending request for url (http://localhost:54321/rest/v1/foo?select=id&id=a): error trying to connect: tcp connect error: Cannot assign requested address (os error 99)
I'm invoking the function via supabase functions serve foo-function --env-file ./supabase/.env.local
And my .env.local contains SB_URL=http://localhost:54321
and I'm initializing the supabase client in my function via:
const supabase = createClient<Database>(supabaseUrl, supabaseAnonKey)
Been trying to figure this out but not sure where to start. Help!
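os error 99 here is typically the function's container failing to reach localhost, which inside the container is not the host machine. When serving locally, supabase functions serve injects SUPABASE_URL and SUPABASE_ANON_KEY that point at an address the container can reach, so a common fix is to build the client from those instead of a hand-set localhost URL. A sketch, assuming those standard injected env vars:
```ts
// Sketch: build the client from the env vars `supabase functions serve` injects,
// which resolve to an address reachable from inside the functions container.
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2'

const supabase = createClient(
  Deno.env.get('SUPABASE_URL')!,
  Deno.env.get('SUPABASE_ANON_KEY')!
)

const { data, error } = await supabase.from('foo').select('id').eq('id', 'a')
```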
TrashPanda
05/23/2023, 5:52 AM
I have a trigger that adds a row to my user_profile table whenever a new user is signed up.
The triggered function is
BEGIN
INSERT INTO public.user_profile(id) VALUES(new.id);
RETURN new;
END;
I have tried disabling RLS on user_profile, and this is the error I'm getting:
permission denied for table user_profile
Not sure what I might be doing wrong here?
Fatih G.
05/23/2023, 8:46 AM
TheRien
05/23/2023, 9:04 AM
await this.supabaseService.supabase.functions.invoke('create-user')
I would love to have a typed response. I type my edge functions explicitly so that's not the problem. Can I somehow generate these types or otherwise write them manually? I have the database.types.ts
but that doesn't cover it.
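One option (not verified against this particular setup): functions.invoke accepts a type parameter for the response body, so a hand-written interface can type the data field even without generated types. Sketch, with CreateUserResponse as a made-up shape standing in for the function's real return value:
```ts
// Sketch: a hand-written response type passed to invoke's generic parameter.
interface CreateUserResponse {
  id: string
  email: string
}

const { data, error } =
  await supabase.functions.invoke<CreateUserResponse>('create-user')

if (!error && data) {
  console.log(data.id, data.email) // data is typed as CreateUserResponse
}
```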
YANN
05/23/2023, 9:04 AM
```ts
export function FloatingWindow({ inventoryIndex }: { inventoryIndex: keyof typeof proxies }) {
  console.log("rendering");
  const query = use(supabase.from("item").select("*"));
  if (!query.error) {
    for (const data of query.data as { type: number }[]) {
      proxies[inventoryIndex].set(`bags-${data.parent_slot}`, { slot: data.parent_slot });
    }
  }
  return <div></div>;
}
```
The rendering log of this component is only called 2 times (2 because of React strict mode), but you can clearly see that more than 17 REST endpoint calls are being made, and my item table contains only 1 item. Am I missing something obvious?
https://cdn.discordapp.com/attachments/1110493535649660938/1110493536027160646/image.png
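One thing worth ruling out (a guess, not a confirmed diagnosis): use() suspends and re-renders, and supabase.from("item").select("*") builds a brand-new request every time it runs during render, so each retry can fire another REST call. Keeping a single stable promise and passing that same one to use() avoids it. Sketch:
```ts
// Sketch: kick the query off once, outside render, so `use` always receives
// the same promise instead of a new request per render.
const itemQuery = supabase.from("item").select("*").then((result) => result);

export function FloatingWindow({ inventoryIndex }: { inventoryIndex: keyof typeof proxies }) {
  const query = use(itemQuery);
  // ...same handling of query.data as above
  return <div></div>;
}
```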
mephiles
05/23/2023, 9:06 AM
avalanche
05/23/2023, 9:41 AM
```typescript
const { data, error } = await supabase.rpc('search_context', {
  database_id: databaseId,
  embedding: `[${embedding.join(',')}]`,
  match_count: 10,
  match_threshold: 0.5,
  min_content_length: 1,
}).explain()
```
I'm hoping to get an analyze result to investigate performance issues with my RPC function, but instead I get the error: 'None of these media types are available: application/vnd.pgrst.plan+text; for="undefined"'
The query works as expected if I remove the .explain() call.
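That media-type error is what PostgREST returns when execution plans haven't been enabled for the API role, so the fix is server-side configuration rather than the client call; once enabled, explain() also takes options. A sketch (the ALTER ROLE step is shown only as a comment and is best kept off production projects, since plans can leak schema details):
```ts
// Sketch: explain() needs the plan media type enabled for the API role first, e.g.:
//   alter role authenticator set pgrst.db_plan_enabled to 'true';
//   notify pgrst, 'reload config';
// With that in place, an analyzed text plan can be requested:
const { data, error } = await supabase
  .rpc('search_context', {
    database_id: databaseId,
    embedding: `[${embedding.join(',')}]`,
    match_count: 10,
    match_threshold: 0.5,
    min_content_length: 1,
  })
  .explain({ analyze: true, format: 'text' })
```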
Cokaps016
05/23/2023, 12:13 PM
```powershell
TypeError: getSession is not a function
at load (D:/Hobby/thuchanh/src/routes/+layout.server.ts:5:20)
at Module.load_server_data (D:/Hobby/thuchanh/node_modules/.pnpm/@sveltejs+kit@1.18.0_svelte@3.59.1_vite@4.3.8/node_modules/@sveltejs/kit/src/runtime/server/page/load_data.js:57:41)
at eval (D:/Hobby/thuchanh/node_modules/.pnpm/@sveltejs+kit@1.18.0_svelte@3.59.1_vite@4.3.8/node_modules/@sveltejs/kit/src/runtime/server/page/index.js:150:41)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
```
When I get to the "Send session to client" step, the whole thing breaks (500 - internal error) and I get the error above.
Even after finishing the "Generate types from your database" step, the issue isn't fixed.
Sorry if this sounds noob; I'm a self-taught dev and I've only been learning about Supabase for a week.
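"getSession is not a function" thrown from +layout.server.ts usually means event.locals.getSession was never defined in hooks.server.ts, which is an earlier step of that same guide. A rough sketch of the hooks file as the guide describes it (names follow the Supabase SvelteKit auth-helpers guide of this period and may differ between versions, so treat it as the shape to check rather than exact code):
```ts
// src/hooks.server.ts — sketch; Locals typings for `supabase` and `getSession`
// come from app.d.ts as described in the guide.
import { createSupabaseServerClient } from '@supabase/auth-helpers-sveltekit'
import { PUBLIC_SUPABASE_URL, PUBLIC_SUPABASE_ANON_KEY } from '$env/static/public'
import type { Handle } from '@sveltejs/kit'

export const handle: Handle = async ({ event, resolve }) => {
  event.locals.supabase = createSupabaseServerClient({
    supabaseUrl: PUBLIC_SUPABASE_URL,
    supabaseKey: PUBLIC_SUPABASE_ANON_KEY,
    event,
  })

  // This is what +layout.server.ts later calls as locals.getSession().
  event.locals.getSession = async () => {
    const {
      data: { session },
    } = await event.locals.supabase.auth.getSession()
    return session
  }

  return resolve(event, {
    filterSerializedResponseHeaders: (name) => name === 'content-range',
  })
}
```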
Geoff
05/23/2023, 12:19 PM
I'm storing a tenant_id for each user in their raw_user_meta_data. I'm then using the following function for row-level security:
DROP FUNCTION IF EXISTS current_tenant_id();
CREATE OR REPLACE FUNCTION current_tenant_id() RETURNS integer
LANGUAGE "plpgsql" SECURITY DEFINER SET search_path = public
AS $$
DECLARE retval jsonb;
BEGIN
select coalesce(raw_user_meta_data->'tenant_id', null) from auth.users into retval where id = auth.uid();
return retval::integer;
END;
$$;
The policy I'm using is as follows:
DROP POLICY IF EXISTS "Enable tenant-based access" ON test;
CREATE POLICY "Enable tenant-based access" ON test USING (tenant_id = current_tenant_id());
ALTER TABLE test ENABLE ROW LEVEL SECURITY;
When subscribing to all changes, the only events that fire are those for delete (which bypasses RLS). If I temporarily disable RLS from the dashboard I get events for insert and update too, but only until I re-enable RLS.
This is driving me nuts as I really need both real-time updates and RLS. Any help gratefully received.
Thanks,
Geoff
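One thing that has tripped people up with Realtime plus RLS (offered as a guess, not a confirmed diagnosis): insert and update events are only delivered when the Realtime connection carries a JWT that satisfies the policy, so the client may need the user's access token set on the realtime connection before subscribing. Sketch:
```ts
// Sketch: authenticate the realtime connection as the logged-in user so
// RLS policies can be evaluated for insert/update events.
const {
  data: { session },
} = await supabase.auth.getSession()

if (session) {
  supabase.realtime.setAuth(session.access_token)
}

supabase
  .channel('test-changes')
  .on('postgres_changes', { event: '*', schema: 'public', table: 'test' }, (payload) => {
    console.log(payload)
  })
  .subscribe()
```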
Razoth
05/23/2023, 12:33 PM
formigueiro
05/23/2023, 12:58 PM
```sql
select subscriptions->'FREE'
from config
order by created_at desc
limit 1;
```
When I run this query it fetches data from the database, but if I put the result into whatever_variable I'm getting an empty row.
bdz
05/23/2023, 1:49 PM
{
"code": 504,
"errors": [],
"message": "Backend query timeout! Optimizing your query will help. Some tips:\n\n- `select` fewer columns. Only columns in the `select` statement are scanned.\n- Narrow the date range - e.g `where timestamp > timestamp_sub(current_timestamp, interval 1 hour)`.\n- Aggregate data. Analytics databases are designed to perform well when using aggregate functions.\n- Run the query again. This error could be intermittent.\n\nIf you continue to see this error please contact support.\n",
"status": "TIMEOUT"
}
I have the default "last hour" checked; however, when I limited it to just an hour, I get the same error.
Thiago
05/23/2023, 2:37 PM
https://cdn.discordapp.com/attachments/1110577344239910983/1110577344374120538/image.png
https://cdn.discordapp.com/attachments/1110577344239910983/1110577344629968906/image.png
https://cdn.discordapp.com/attachments/1110577344239910983/1110577344869056542/image.png
AstroBear
05/23/2023, 3:12 PM
```ts
supabaseServiceClient
  .channel('postgres_changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'ChatMessage',
    },
    (payload) => {
      console.log(payload);
    }
  )
  .subscribe();
```
But when an event occurs, the payload contains the following:
```ts
{
  schema: 'public',
  table: 'ChatMessage',
  commit_timestamp: null,
  eventType: 'INSERT',
  new: {},
  old: {},
  errors: [ 'Error 401: Unauthorized' ]
}
```
mdc405
05/23/2023, 4:08 PM
mdc405
05/23/2023, 4:15 PM
const supabase = createClient<Database>(supabaseUrl, supabaseAnonKey, {
  global: {
    headers: { Authorization: req.headers.get('Authorization')! },
  },
})
This function works fine if I don't make this supabase query... but that query is important.
Help!
kevlust
05/23/2023, 4:47 PM
I have a new_users table set up with 4 columns: 2 bigints, 1 timestamp, and 1 uuid. From my understanding, these should be pretty lightweight, right? But according to Supabase, each row is taking 24kb. I was thinking more like 48 bytes total for all this, so I'm intrigued as to how we arrive at the 24kb figure, which is about 500 times larger than my original estimate.
While testing, my app has been storing some pretty basic data, and I've seen that my database space usage has already hit about 100mb. If I carry on at this pace, I'll reach 10gb with just 100 average users like me. Now when I look at the Pro plan, with its 8gb database and room for 100,000 monthly active users, I'm left scratching my head a bit. With the way things are going, how would it be possible to house 100,000 average users within that 8gb limit?
Looking forward to your insight on this. If there's a way I can be more efficient with my data storage, I'm all ears! Appreciate any advice you guys can share.
Cheers!
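Rough per-row arithmetic, in case it helps frame the question: two bigints (8 bytes each) plus a timestamp (8 bytes) plus a uuid (16 bytes) is 40 bytes of column data, and with roughly a 24-byte tuple header and index entries that lands on the order of 100 bytes per row, not 24 kB. The dashboard's database-size figure also includes system catalogs, indexes, and the baseline size of an empty project, so dividing total size by a small row count will wildly overstate per-row cost; this estimate is back-of-the-envelope, not a measured number for this table.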