I have a postgres table with a creation_time field defined like this:
```sql
creation_time timestamp with time zone not null default now(),
```
And I have a protobuf message defined like this:
```proto
message Thing {
  // ... other fields here
  google.protobuf.Timestamp creation_time = 3;
}
```
After generating code with protoc, I end up with a Go struct like this:
```go
type Thing struct {
	CreationTime *timestamppb.Timestamp `protobuf:"bytes,3,opt,name=creation_time,json=creationTime,proto3" json:"creation_time,omitempty"`
}
```
I use sqlx to read from and write to a Postgres database. I want to read a database record and deserialize it into a Thing struct. The deserialization works for all the "trivial" fields (ints, strings), but the timestamp is giving me trouble. As you can see above, the CreationTime field ends up being of type *timestamppb.Timestamp, and sqlx doesn't know how to handle this type (it doesn't implement the Scan and Value methods).
I would like to know how people handle this situation; it must come up all the time, as it's so fundamental.
I am aware that I could use the protobuf struct only for communication with my API, use a normal Go struct for internal business logic, and write adapters to handle the conversion. I don't want to go this way; I would like to use the proto struct everywhere. I am also aware that I could scan the timestamps explicitly in multiple places in my code, but I don't like that idea either.
I was thinking about implementing my own timestamp proto type (perhaps just a wrapper around Google's timestamp type), just so I can implement the Scanner and Valuer interfaces.
How do people normally handle this?
Here are common strategies people use to handle this situation:
1. Implement the sql.Scanner and driver.Valuer interfaces for your protobuf timestamp. This is essentially what you've mentioned: creating a wrapper around timestamppb.Timestamp (or your own timestamp type) so you can directly control how it is read from and written to the database. The wrapper converts between the database's timestamp format and the protobuf timestamp; see the sketch after this list.
2. Introduce a middleware layer or use the repository pattern, so the conversion logic is centralized. In this approach, the repository functions read from the database into Go-native types (like time.Time for timestamps) and then convert to the necessary protobuf types. This keeps the conversion logic in one place and avoids scattering scan calls throughout your code.
3. Use nullable time types that sqlx can scan directly. Libraries like github.com/guregu/null (or the standard library's sql.NullTime) provide nullable time types which can be scanned straight from SQL and checked for NULL. You then convert these into your protobuf timestamp type as needed.
4. Some frameworks or libraries allow you to define hooks or custom types that integrate with the ORM's lifecycle. This means you can intercept a value after it's read from the database but before it's assigned to the struct field, or vice versa, and convert the timestamp there.
5. Create functions specifically for serializing and deserializing between your database models and your protobuf models. This is more manual but allows explicit control over how each field is handled.
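For the first option, a minimal sketch might look like the following (the wrapper type and package name are illustrative, not from any library; also note that with plain protoc-generated structs you'd still need a way to make the generated field use this wrapper type, which is one reason many people avoid this route):

```go
package pbtypes

import (
	"database/sql/driver"
	"fmt"
	"time"

	"google.golang.org/protobuf/types/known/timestamppb"
)

// Timestamp wraps *timestamppb.Timestamp so it satisfies
// sql.Scanner and driver.Valuer. (Illustrative type.)
type Timestamp struct {
	*timestamppb.Timestamp
}

// Scan implements sql.Scanner: the Postgres driver hands us a
// time.Time (or nil for NULL), which we turn into a protobuf timestamp.
func (t *Timestamp) Scan(src interface{}) error {
	switch v := src.(type) {
	case time.Time:
		t.Timestamp = timestamppb.New(v)
		return nil
	case nil:
		t.Timestamp = nil
		return nil
	default:
		return fmt.Errorf("cannot scan %T into Timestamp", src)
	}
}

// Value implements driver.Valuer: convert back into a time.Time
// that the driver knows how to serialize.
func (t Timestamp) Value() (driver.Value, error) {
	if t.Timestamp == nil {
		return nil, nil
	}
	return t.AsTime(), nil
}
```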
I prefer the third variant. Here is a sketch of how it might look (assuming github.com/guregu/null; the row struct, table name, and helper names are illustrative):
Read from the DB:
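```go
import (
	"github.com/guregu/null"
	"github.com/jmoiron/sqlx"
)

// thingRow mirrors the database table; null.Time implements
// sql.Scanner, so sqlx can populate it directly.
type thingRow struct {
	ID           int64     `db:"id"`
	CreationTime null.Time `db:"creation_time"`
}

func getThing(db *sqlx.DB, id int64) (*Thing, error) {
	var row thingRow
	if err := db.Get(&row, `SELECT id, creation_time FROM things WHERE id = $1`, id); err != nil {
		return nil, err
	}
	return &Thing{CreationTime: toTimestamp(row.CreationTime)}, nil
}
```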
Convert back to a protobuf Timestamp (plus the inverse for writes):
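```go
import "google.golang.org/protobuf/types/known/timestamppb"

// toTimestamp maps a nullable DB time onto the protobuf type,
// turning NULL into a nil pointer.
func toTimestamp(t null.Time) *timestamppb.Timestamp {
	if !t.Valid {
		return nil
	}
	return timestamppb.New(t.Time)
}

// toNullTime is the inverse, used when writing back to the DB.
func toNullTime(ts *timestamppb.Timestamp) null.Time {
	if ts == nil {
		return null.Time{}
	}
	return null.TimeFrom(ts.AsTime())
}
```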
By using null.Time, you let sqlx handle the nullable aspect of the timestamp automatically when scanning query results. This keeps your database code clean and makes NULL timestamps explicit, and converting back to the protobuf timestamp is a one-line helper call wherever you need the protobuf message.