ven
05/09/2023, 11:41 AM
aar2dee2
05/09/2023, 11:43 AM
I created a f1omo_staff role:
```sql
-- new role for staff
CREATE ROLE "f1omo_staff" WITH LOGIN INHERIT;
GRANT "f1omo_staff" TO authenticator;
GRANT anon, authenticated to "f1omo_staff";
GRANT USAGE ON SCHEMA "public" TO "f1omo_staff";
ALTER DEFAULT PRIVILEGES FOR ROLE "postgres" IN SCHEMA "public" GRANT ALL ON SEQUENCES TO "f1omo_staff";
ALTER DEFAULT PRIVILEGES FOR ROLE "postgres" IN SCHEMA "public" GRANT ALL ON FUNCTIONS TO "f1omo_staff";
ALTER DEFAULT PRIVILEGES FOR ROLE "postgres" IN SCHEMA "public" GRANT ALL ON TABLES TO "f1omo_staff";
CREATE POLICY "Allow all for f1omo_staff users" ON "public"."feed_css_selector"
FOR ALL TO "f1omo_staff"
USING (true);
CREATE POLICY "Allow read access for f1omo_staff" ON "public"."feed"
FOR SELECT TO "f1omo_staff"
USING (true);
CREATE POLICY "Allow all for f1omo_staff" ON "public"."digest"
FOR ALL TO "f1omo_staff"
USING (true);
```
The policies show up in the Supabase dashboard, and I can see the logged-in user's role set to f1omo_staff after logging out and logging in again. But when I try to read the feed table using the Supabase client, I get data: [] and error: null. Here's the client-side code:
```ts
export const getAllFeeds = async () => {
  const { data, error } = await supabase
    .from("feed")
    .select("*")
    .order("newsblur_feed_id", { ascending: false });
  console.log("data", data);
  console.log("error", error);
  if (error) {
    console.log(error);
    return [];
  }
  return data;
};

export async function getStaticProps() {
  const feeds = await getAllFeeds();
  return {
    props: {
      feeds: feeds,
    },
    revalidate: 600,
  };
}
```
https://cdn.discordapp.com/attachments/1105459979386892409/1105461400781991977/Screenshot_2023-05-09_at_5.16.13_PM.png
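An empty data array together with a null error usually means the query ran but every row was filtered out by table privileges or RLS, so one quick check is to confirm which role the access token actually carries. A minimal debugging sketch, assuming supabase-js v2 in a browser (nothing below comes from the original message):

```ts
// Debugging sketch (assumptions: supabase-js v2 and a browser where atob exists).
// It decodes the current access token locally to show which Postgres role
// PostgREST will switch to for requests from this client.
import type { SupabaseClient } from "@supabase/supabase-js";

declare const supabase: SupabaseClient; // the app's existing client

async function logJwtRole() {
  const { data: { session } } = await supabase.auth.getSession();
  if (!session) return;
  const payloadPart = session.access_token.split(".")[1];
  // JWTs use base64url, so map the URL-safe characters before decoding.
  const json = atob(payloadPart.replace(/-/g, "+").replace(/_/g, "/"));
  const claims = JSON.parse(json);
  console.log("JWT role claim:", claims.role); // expected: "f1omo_staff"
}
```

If the claim does read f1omo_staff, the usual remaining suspect is plain table privileges rather than the policies: ALTER DEFAULT PRIVILEGES only affects tables created after it runs, so pre-existing tables like feed still need explicit grants for the new role.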
TheRien
05/09/2023, 1:08 PM
supabase link --project-ref $HELP
Merovingian
05/09/2023, 1:27 PM
```ts
const openGallery = async () => {
  const options: ImageLibraryOptions = {
    mediaType: 'photo',
    includeBase64: false,
    maxHeight: 200,
    maxWidth: 200,
  };
  launchImageLibrary(options, async result => {
    if (result.didCancel) {
      console.log('User cancelled image picker');
    } else {
      const localUri = result?.assets?.[0]?.uri;
      const userId = user?.id; // user's id.
      setAvatarUrl(localUri);
      if (localUri && userId) {
        await uploadAvatarToSupabase(localUri, userId);
      }
    }
  });
};

const uploadAvatarToSupabase = async (localUri, userId) => {
  const response = await fetch(localUri);
  console.log('response', response);
  const blob = await response.blob();
  console.log('blob', blob);
  const fileExtension = localUri.split('.').pop();
  console.log('fileExtension', fileExtension);
  const fileName = `${userId}.${fileExtension}`;
  console.log('fileName', fileName);
  const {data, error} = await supabase.storage
    .from('profilePhotos')
    .upload(fileName, decode('includeBase64'), {
      cacheControl: '3600',
      upsert: true,
      contentType: `image/${fileExtension}`,
    });
  console.log('data', data);
  if (error) {
    console.error('Upload Failed:', error.message);
  } else {
    console.log('Upload Successful');
  }
};
```
And there is a response: 👇
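A note on the upload call above (a sketch under assumptions, not a confirmed fix): if decode here comes from base64-arraybuffer, then decode('includeBase64') decodes the literal string "includeBase64" rather than the image data, so the stored object won't contain the picked photo. One common pattern is to request base64 output from the picker and decode that instead:

```ts
// Minimal sketch, assuming react-native-image-picker and base64-arraybuffer;
// 'profilePhotos' is the bucket name taken from the message above.
import { launchImageLibrary, ImageLibraryOptions } from 'react-native-image-picker';
import { decode } from 'base64-arraybuffer';
import type { SupabaseClient } from '@supabase/supabase-js';

declare const supabase: SupabaseClient; // the app's existing client

const pickAndUploadAvatar = async (userId: string) => {
  const options: ImageLibraryOptions = {
    mediaType: 'photo',
    includeBase64: true, // ask the picker for the image's base64 data
    maxHeight: 200,
    maxWidth: 200,
  };

  const result = await launchImageLibrary(options);
  const asset = result.assets?.[0];
  if (result.didCancel || !asset?.base64 || !asset.uri) return;

  const fileExtension = asset.uri.split('.').pop() ?? 'jpg';
  const { error } = await supabase.storage
    .from('profilePhotos')
    .upload(`${userId}.${fileExtension}`, decode(asset.base64), {
      cacheControl: '3600',
      upsert: true,
      contentType: `image/${fileExtension}`,
    });
  if (error) console.error('Upload failed:', error.message);
};
```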
Epailes
05/09/2023, 1:46 PM
I'm getting this error when trying to refresh the access_token for a user using the refresh_token after the access_token has expired:
AuthApiError: Invalid Refresh Token: Already Used
at E:\Programming\Projects\ArtPage-SvelteKit\node_modules\@supabase\gotrue-js\dist\main\lib\fetch.js:41:20
at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
__isAuthError: true,
status: 400
}
I'm using this command: const {data, error} = await userClient.auth.refreshSession({refresh_token});
If I run that before the access_token expires, it works correctly, so I know the issue isn't something in my application errantly using the refresh_token and thereby marking it as used.
I'm running supabase on my local machine, and I've set jwt_expiry = 60
just to make testing this easier.
Is this the intended behaviour? I thought refresh_tokens didn't expire, as they're meant to be long-lived...
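An aside on this question, assuming the local GoTrue instance has refresh token rotation enabled (the default): a refresh_token is single-use once rotation is on, and every successful refreshSession call hands back a new one, so reusing the previous value on a later refresh produces exactly this "Already Used" error. A minimal pattern that keeps the stored token in sync:

```ts
// Minimal sketch, assuming supabase-js/gotrue-js v2 with default refresh token
// rotation: persist the refresh_token returned by each successful refresh and
// use that (not the original one) for the next call.
import type { SupabaseClient } from '@supabase/supabase-js';

declare const userClient: SupabaseClient;           // the client from the message above
let storedRefreshToken = '<initial refresh_token>'; // hypothetical storage location

async function refreshAccessToken(): Promise<string> {
  const { data, error } = await userClient.auth.refreshSession({
    refresh_token: storedRefreshToken,
  });
  if (error || !data.session) throw error ?? new Error('No session returned');
  storedRefreshToken = data.session.refresh_token; // keep the rotated token
  return data.session.access_token;
}
```

The jwt_expiry setting only shortens the access_token lifetime; the refresh_token itself stays valid until it is used (or revoked), which is why refreshing before expiry also works.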
SkyNightSnow
05/09/2023, 2:40 PM
https://cdn.discordapp.com/attachments/1105504609100902481/1105504609746813030/image.png
Matt Freeman
05/09/2023, 3:26 PM
MartiN
05/09/2023, 3:38 PM
I ran npx supabase db dump --db-url "$OLD_DB_URL" -f roles.sql --role-only and got:
Dumping roles from remote database...
Error: Error running pg_dump on remote database: Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
Try rerunning the command with --debug to troubleshoot the error.
I tried rerunning with --debug but got the same output.
I have Docker and Postgres on my Mac. I restarted Docker and made sure it is running, but nothing changed.
At some point it asked me to run supabase link, so I created a local Supabase project using supabase init, logged in with supabase login, and ran supabase link --project-ref <my_project_ref>, but again got the same error.
dave
05/09/2023, 3:47 PM
```sql
-- Create a function to get the encrypted_password for a specific user by email
CREATE OR REPLACE FUNCTION get_encrypted_password(p_email TEXT)
RETURNS TEXT
LANGUAGE plpgsql
SECURITY DEFINER
AS $$
BEGIN
  RETURN (
    SELECT auth.users.encrypted_password
    FROM auth.users
    WHERE auth.users.email = p_email
  );
END;
$$;
```
I can't do bcrypt easily in my application, so doing it in PostgreSQL would be ideal.
For context, I'm trying to work around the rate limiting on the auth endpoints for our server-side application.
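If the verification does end up living in Postgres (pgcrypto's crypt() can check a bcrypt hash by re-hashing the candidate password with the stored hash as the salt), the server-side app would typically reach it through an RPC. A sketch only; the function name and signature below are placeholders, not an existing API:

```ts
// Hypothetical client-side call: it assumes a SECURITY DEFINER Postgres function
// named verify_user_password(p_email text, p_password text) RETURNS boolean has
// been created separately (e.g. comparing crypt(p_password, encrypted_password)
// with the stored hash). Nothing here comes from the original message.
import type { SupabaseClient } from '@supabase/supabase-js';

declare const supabase: SupabaseClient; // server-side client

async function checkPassword(email: string, password: string): Promise<boolean> {
  const { data, error } = await supabase.rpc('verify_user_password', {
    p_email: email,
    p_password: password,
  });
  if (error) throw error;
  return data === true;
}
```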
mikel
05/09/2023, 3:48 PM
If I display the image with <img src={supabase_post.image}/>, will it count towards the storage egress every time someone views the post, because of the image public URL? (Again, I'm not calling the public URL from the bucket every time; the public URL is already stored in a database cell.)
Thanks!
nimo
05/09/2023, 3:53 PM
Yuu
05/09/2023, 4:21 PM
DignifiedSwine
05/09/2023, 4:28 PM
sili
05/09/2023, 4:33 PM
SANTlAG0
05/09/2023, 5:09 PM
```ts
const { data, error } = await supabase
  .from("profiles") // Updated with 2 type arguments
  .select(
    `
    username,
    credits,
    accounts: accounts!profile_id (
      id,
      account_name,
      account_password,
      whatsapp_number,
      subscription_start_date
    )
    `
  )
  .eq("id", user.id)
```
However, it seems accounts can be of type Account | Account[] | null, whereas I would expect an array: either empty or containing the accounts.
How can I make sure an array is always returned?
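A client-side way to cope (a sketch; the union most likely comes from the generated types not knowing whether the accounts relationship is to-one or to-many) is to normalize the field so downstream code always gets an array:

```ts
// Normalization sketch: `profiles` stands in for the `data` returned by the query
// above (an array of rows, since no .single() is used). The Account field names
// come from the select; their value types here are guesses.
type Account = {
  id: string;
  account_name: string;
  account_password: string;
  whatsapp_number: string;
  subscription_start_date: string;
};

declare const profiles:
  | { username: string; credits: number; accounts: Account | Account[] | null }[]
  | null;

// Accepts whatever shape the typings allow (object, array, or null) and always
// hands back an array.
const toArray = <T>(value: T | T[] | null | undefined): T[] =>
  value == null ? [] : Array.isArray(value) ? value : [value];

const accounts: Account[] = toArray(profiles?.[0]?.accounts);
```

At runtime a to-many embed comes back from PostgREST as an array (possibly empty), so the normalization mainly keeps the compiler happy.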
매튜
05/09/2023, 5:33 PM
https://cdn.discordapp.com/attachments/1105548161579241512/1105548161826684948/0.png
Miles
05/09/2023, 5:41 PM
Wizzel
05/09/2023, 5:54 PM
I have a column json_data of type jsonb, and allow_nullable is set to false.
When I update another column in this table and row, the stream emits data where json_data is null.
When I check the respective row, json_data is not null.
Why is the emitted data from the stream incomplete?
malphine2
05/09/2023, 6:18 PM
```dart
Future<void> listenToOrderTableInsert() async {
  supabase.channel('public:order').on(
    prefix.RealtimeListenTypes.postgresChanges,
    prefix.ChannelFilter(
        event: 'INSERT',
        schema: 'public',
        table: 'order',
        filter:
            'company_id=eq.${getProvider()!['company_id'].toString()}'),
    (payload, [ref]) {
      print('Change received: ${payload['new']['company_id']}');
      orders!.add(payload['new']);
      notifyListeners();
    },
  ).subscribe();
}

Future<void> listenToOrderTableUpdate() async {
  supabase.channel('public:order').on(
    prefix.RealtimeListenTypes.postgresChanges,
    prefix.ChannelFilter(
        event: 'UPDATE',
        schema: 'public',
        table: 'order',
        filter: 'company_id=eq.${getProvider()!['company_id'].toString()}'),
    (payload, [ref]) {
      print('Change received: ${payload['new']['company_id']}');
    },
  ).subscribe();
}
```
viktor
05/09/2023, 6:21 PM
nicollegah
05/09/2023, 7:04 PM
eusoubrunocamargo
05/09/2023, 7:09 PM
phemartin
05/09/2023, 7:19 PM
contentHtml + userId
It seems pretty slow. Is that normal? How can I speed things up?
The only reason I can think of is that the contentHtml payload is sometimes large (~0.5 MB) and may be affecting performance.
Thierry-Hackeet
05/09/2023, 7:45 PM
jj_sessa
05/09/2023, 7:48 PM
dave
05/09/2023, 8:12 PM
ven
05/09/2023, 8:59 PM
Ryan [untitled]
05/09/2023, 9:23 PM
Grantly
05/09/2023, 9:34 PM
Nin
05/09/2023, 10:00 PM