brett
12/29/2022, 2:53 PM
sks
12/29/2022, 2:54 PM
brett
12/29/2022, 2:54 PM
sks
12/29/2022, 2:56 PM
export default {
  async fetch(request, env, ctx) {
    let id = env.TestObject.idFromName("default");
    let stub = env.TestObject.get(id);
    return await stub.fetch(request);
  },
};
Is this the correct way to proxy it?
brett
12/29/2022, 2:57 PM
sks
12/29/2022, 3:00 PM
export default {
  async fetch(request, env, ctx) {
    let id = env.TestObject.idFromName("default");
    let stub = env.TestObject.get(id);
    return await stub.fetch(request);
  },
};
/* Durable Object */
export class TestObject {
  constructor(state, env) {
    if (this.count === undefined) {
      this.count = 0;
    }
    this.connections = [];
  }

  async fetch(request) {
    const upgradeHeader = request.headers.get('Upgrade');
    if (upgradeHeader && upgradeHeader.toLowerCase() === 'websocket') {
      const pair = new WebSocketPair();
      await this.handleSocket(pair[1]);
      return new Response(null, { status: 101, webSocket: pair[0] });
    } else {
      return new Response("Count HTTP: " + this.count);
    }
  }

  async handleSocket(socket) {
    socket.accept();
    this.connections.push(socket);
  }
}
This is just testing. I am only just starting my journey with Durable Objects today.
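For reference, a minimal sketch of how the counter in the snippet above could be kept in Durable Object storage (so it survives the object being evicted) rather than only in memory. This is illustrative and not code from the thread:

/* Sketch: persist the counter with the Durable Object storage API. */
export class TestObject {
  constructor(state, env) {
    this.state = state;
    this.connections = [];
  }

  async fetch(request) {
    // Read the persisted count (0 on first use), increment it, and write it back.
    let count = (await this.state.storage.get("count")) ?? 0;
    count += 1;
    await this.state.storage.put("count", count);
    return new Response("Count HTTP: " + count);
  }
}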
sks
12/29/2022, 3:00 PM
ryucode
12/29/2022, 3:02 PM
brett
12/29/2022, 3:04 PM
ryucode
12/29/2022, 3:12 PM
brett
12/29/2022, 3:14 PM
ryucode
12/29/2022, 3:17 PM
davidfm
12/29/2022, 3:27 PM
The fetch done from the DO alarm handler is performed with cacheOptions set to cacheEverything: true and a specific cacheKey. When the fetch completes, I expect it to write to the Cache with that cacheKey. I'd need another worker, running independently of the DO, to find the same resource in the cache using the same cacheKey and not hit the origin.
I somewhat expect the cache to propagate/sync globally eventually... or would it stay local to the datacenter? Will there be cache misses in each and every datacenter, i.e. one origin call per datacenter?
What's the best way, in your view, for the other worker to find the element already fetched by the DO alarm?
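A minimal sketch of the pattern being described, assuming a hypothetical Prefetcher Durable Object, a placeholder URL, and a made-up cacheKey (none of these names come from the thread):

// Durable Object whose alarm pre-warms the edge cache via fetch with cf options.
export class Prefetcher {
  constructor(state, env) {
    this.state = state;
  }

  async fetch(request) {
    // Schedule an alarm a few seconds out; the alarm handler does the pre-fetch.
    await this.state.storage.setAlarm(Date.now() + 5000);
    return new Response("alarm scheduled");
  }

  async alarm() {
    // cacheEverything plus a specific cacheKey asks Cloudflare to cache the
    // response under that key.
    await fetch("https://example.com/resource", {
      cf: { cacheEverything: true, cacheKey: "shared-prefetch-key" },
    });
  }
}

// An independent Worker issuing the same fetch with the same cacheKey, hoping
// for a cache hit instead of another origin call.
export default {
  async fetch(request, env, ctx) {
    return fetch("https://example.com/resource", {
      cf: { cacheEverything: true, cacheKey: "shared-prefetch-key" },
    });
  },
};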
brett
12/29/2022, 3:38 PM
davidfm
12/29/2022, 5:16 PM
cache.put? Or again a fetch with cf object options (to a DO, for example...)?
In the limit, we need to pre-fetch the resource and pre-warm the cache from the DO request/response, ahead of the Worker request coming in and asking for it (it should find it somewhere other than the origin). But ideally the resource should end up in the cache, whether or not it initially needs to be in KV.
Hopefully that makes sense, and thanks for the prompt responses.
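For comparison, a rough sketch of the cache.put route using the Workers Cache API (caches.default), with placeholder URLs; note that the Cache API writes to the cache of the data center the code runs in:

const ORIGIN_URL = "https://example.com/resource";
const CACHE_KEY = "https://example.com/__prewarmed/resource";

// Could run inside a DO alarm handler: fetch the resource once and store it
// in the local cache under a synthetic key URL.
async function prewarm() {
  const response = await fetch(ORIGIN_URL);
  await caches.default.put(CACHE_KEY, response);
}

// A Worker reading it back, falling through to the origin on a cache miss.
export default {
  async fetch(request, env, ctx) {
    const cached = await caches.default.match(CACHE_KEY);
    return cached ?? fetch(ORIGIN_URL);
  },
};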
ckoeninger
12/29/2022, 5:19 PM
aarhus
12/29/2022, 10:42 PM
sks
12/30/2022, 6:06 AM
sks
12/30/2022, 6:16 AM
HardAtWork
12/30/2022, 8:42 AM
sks
12/30/2022, 8:43 AM
HardAtWork
12/30/2022, 8:52 AM
sks
12/30/2022, 8:54 AM
Kai
12/30/2022, 1:20 PM
jed
12/30/2022, 3:21 PM
Kai
12/30/2022, 4:59 PM
ehesp
12/30/2022, 5:00 PM
Kai
12/30/2022, 5:01 PM
muslax
12/30/2022, 6:09 PM
ckoeninger
12/30/2022, 6:23 PM