Quick question: is granularity useful primarily for increasing read efficiency? Our data points are separated by nanoseconds, so I'm not sure whether there's a general rule of thumb for what the granularity should be. If we're only ingesting ~100 rows per second into a table, is there a particular level of granularity we'd want for that?
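
For context on the scale I'm asking about, here's a rough back-of-the-envelope sketch of how many rows would land in each partition at a few candidate granularities, assuming a steady ~100 rows per second. The rate and the granularity choices are just illustrative assumptions, not tied to any particular database's settings:

```python
# Back-of-the-envelope row counts per partition at ~100 rows/second.
# The ingest rate and the candidate granularities below are illustrative assumptions.
ROWS_PER_SECOND = 100

granularities = {
    "HOUR":  60 * 60,             # seconds per hour
    "DAY":   24 * 60 * 60,        # seconds per day
    "WEEK":  7 * 24 * 60 * 60,    # seconds per week
    "MONTH": 30 * 24 * 60 * 60,   # approx. seconds per 30-day month
}

for name, seconds in granularities.items():
    rows = ROWS_PER_SECOND * seconds
    print(f"{name:>5}: ~{rows:,} rows per partition")
```

At that rate, hourly partitions would hold roughly 360 thousand rows and daily partitions roughly 8.6 million, so that's the kind of range I'm trying to pick between.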