Conversation
```rust
} else {
    // CEIL(variants / 7)
    (nvars.len() + 6) / 7
```
This isn't right: the divisor should be based on 2^7, not 7. We want to check how many variants fit in how many varint bytes, and each varint byte holds 7 bits, i.e. 128 values.
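A sketch of the intended calculation (the helper name here is illustrative, not postcard's actual API): since each varint byte carries 7 payload bits, `n` bytes can distinguish `2^(7n)` discriminants, so the byte count comes from the bit width of the largest discriminant, not from dividing the variant count by 7.

```rust
/// Hypothetical helper: number of varint bytes needed to encode the
/// largest discriminant of an enum with `nvars` variants (discriminants
/// run from 0 to nvars - 1; each varint byte carries 7 payload bits).
fn discriminant_varint_bytes(nvars: usize) -> usize {
    let max_discriminant = nvars.saturating_sub(1);
    // Significant bits in the largest discriminant, at least 1.
    let bits = (usize::BITS - max_discriminant.leading_zeros()).max(1);
    // CEIL(bits / 7): one varint byte per 7 bits of payload.
    bits.div_ceil(7) as usize
}

fn main() {
    assert_eq!(discriminant_varint_bytes(1), 1);   // single variant
    assert_eq!(discriminant_varint_bytes(128), 1); // 0..=127 fit in one byte
    assert_eq!(discriminant_varint_bytes(129), 2); // discriminant 128 needs two
    println!("ok");
}
```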
max-heller left a comment
A couple thoughts that have been bouncing around in my head
```rust
///
/// You must not rely on this value for safety reasons, as implementations
/// could be wrong.
const MANUAL_MAX_SIZE: Option<usize> = None;
```
Instead of having this manual override and a `max_size()` function that checks it, this could default to the calculated size and be overridden as appropriate:

```rust
trait Schema {
    const MAX_SIZE: Option<usize> = max_size_nt(Self::SCHEMA);
}

// Manual max size
impl<T: Schema, const N: usize> Schema for heapless::Vec<T, N> {
    const MAX_SIZE: Option<usize> = Some(...);
}
```

This seems more intuitive to me, since there'd be only one source of truth (`T::MAX_SIZE`) instead of the separate override and calculator function.
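A minimal, compilable sketch of that pattern with simplified stand-in types (the real default would be computed from the schema, elided here so the example is self-contained): the trait supplies a default for `MAX_SIZE`, and individual impls override it where a manual bound is known.

```rust
// Simplified stand-in for the real Schema trait: the trait supplies a
// default for MAX_SIZE, and impls override it where a bound is known.
trait Schema {
    // Default: no statically known bound. (The real default would be
    // calculated from Self::SCHEMA.)
    const MAX_SIZE: Option<usize> = None;
}

struct Bounded;
struct Unbounded;

impl Schema for Bounded {
    // Override with a manually computed bound.
    const MAX_SIZE: Option<usize> = Some(16);
}

impl Schema for Unbounded {} // keeps the default

fn main() {
    assert_eq!(Bounded::MAX_SIZE, Some(16));
    assert_eq!(Unbounded::MAX_SIZE, None);
    println!("ok");
}
```

Callers then read one source of truth, `T::MAX_SIZE`, whether it was calculated or overridden.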
```rust
assert!(nty_eq(nt, Inner::SCHEMA), "Mismatched Outer/Inner types!");
let size_one = max_size::<Inner>();
```
Is this dance required to use `Inner::MANUAL_MAX_SIZE` if it's set (as opposed to using `max_size_nt()`)?
Somewhat relatedly, I'm wondering if it would make sense to embed known max sizes in the schema types somewhere in order for postcard-dyn to be able to take them into account. Currently, from my understanding, a client using postcard-dyn couldn't be told the max size of a heapless::Vec<T, N> because the only information in the serialized schema is the unbounded Seq(T::SCHEMA). If the serialized schema included the max length, then that length could be used to size buffers even without knowing the concrete type.
Concretely, this could take the form of bundling an overall max size with an `OwnedNamedType`:

```rust
struct OwnedSchema {
    ty: OwnedNamedType,
    max_size: Option<usize>,
}
```

Or it could be more tightly coupled with the schema types themselves, something like:

```rust
struct NamedType {
    name: &'static str,
    ty: &'static DataModelType,
    max_size: Option<usize>,
}

// In this case, Schema could remain as just a single associated const
trait Schema {
    const SCHEMA: &'static schema::NamedType;
}
```
Or, less flexibly, unbounded data model types (strings, seqs, maps, etc.) could have an optional max length field. Then the max size could always be determined based on the schema. Though that might not play as nicely with schema hashing—are two schemas interchangeable if they have different max lengths for a contained string?
> Or, less flexibly, unbounded data model types (strings, seqs, maps, etc.) could have an optional max length field. Then the max size could always be determined based on the schema. Though that might not play as nicely with schema hashing—are two schemas interchangeable if they have different max lengths for a contained string?
I'm starting to think this might be the best option:
- It's harder to implement in subtly wrong ways (e.g., by assuming integers take the same number of bytes as their in-memory representations, which I've seen in the wild)
- It conveys additional information (bounds on the length of collections) that could be calculated based on the max size, but not as obviously
- If max size is needed in a postcard-dyn/postcard-rpc context, it's more compact to send `N` on the wire than `N * max_size(T)`
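To illustrate the "max size could always be determined based on the schema" idea, here is a sketch using simplified stand-ins for the data model types (names and sizes are illustrative, not postcard's actual API): with optional max lengths on unbounded types, a recursive walk either produces a bound or reports that the schema is unbounded.

```rust
// Simplified stand-in for a data model with optional max lengths on
// unbounded types; names are illustrative, not postcard's actual API.
enum Ty {
    U32, // varint-encoded, at most 5 bytes on the wire
    Seq { elem: &'static Ty, max_len: Option<usize> },
}

// Returns Some(bound) if a max serialized size can be determined,
// None if the schema contains an unbounded collection.
fn max_size(ty: &Ty) -> Option<usize> {
    match ty {
        Ty::U32 => Some(5),
        Ty::Seq { elem, max_len } => {
            let n = (*max_len)?; // unbounded seq => no bound
            let elem_size = max_size(elem)?;
            // Varint length prefix for n, plus up to n elements.
            Some(varint_size(n) + n * elem_size)
        }
    }
}

// Max varint bytes needed to encode `n` (7 payload bits per byte).
fn varint_size(n: usize) -> usize {
    let bits = (usize::BITS - n.leading_zeros()).max(1);
    bits.div_ceil(7) as usize
}

fn main() {
    let bounded = Ty::Seq { elem: &Ty::U32, max_len: Some(8) };
    let unbounded = Ty::Seq { elem: &Ty::U32, max_len: None };
    assert_eq!(max_size(&bounded), Some(1 + 8 * 5)); // 1-byte prefix + 8 * 5
    assert_eq!(max_size(&unbounded), None);
    println!("ok");
}
```

The walk also surfaces the hashing concern: if `max_len` participates in the schema, two otherwise identical schemas with different bounds would hash differently.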
@jamesmunns can you think of any cases where max size could not be determined this way?