Scan from Postgres into struct with pointers
What is the benefit of using a struct with pointer fields, like the one below, instead of a struct without pointers when scanning rows from the database?
```go
type User struct {
	ID           *UserID    `json:"id,omitempty"`
	Email        *string    `json:"email"`
	Username     *string    `json:"username"`
	PasswordHash *[]byte    `json:"password_hash"`
	CreatedAt    *time.Time `json:"created_at"`
}
```
Solution 1:[1]
As @Luke hinted, if you try to scan a row containing NULL values into non-pointer fields, `Scan` will return an error.
Best practice is to examine the schema of the DB table. If a column cannot be NULL, use a non-pointer field. If a NULL is possible, however remote the possibility, use a pointer.
If you find pointers tedious to work with (i.e. boilerplate nil checks), use, say, `sql.NullString` for the field. When a NULL is encountered during a row scan, the `NullString`'s `Valid` flag is set to false and its `String` field defaults to the empty string.
Go 1.13 also added `sql.NullTime`, `sql.NullInt32`, etc. to handle other nullable column types.
Solution 2:[2]
Using a struct with pointers when scanning and unmarshaling is necessary if you need to distinguish between zero values and null values. Scanning a NULL into a plain string/int/struct will fail (or the field is left at its zero value when unmarshaling JSON), whereas a pointer to a string/int/struct will be set to nil.
Solution 3:[3]
I suggest using pgx, as mentioned at the end of the lib/pq README file on GitHub:
For users that require new features or reliable resolution of reported bugs, we recommend using pgx which is under active development.
Then you can use the custom pgtype types instead of Go's built-in ones.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
| Solution 2 | Burak Serdar |
| Solution 3 | Big_Boulard |