Using Dynamic Structs in Go and GORM

March 20, 2021

I was working with some JSON-LD data sources that I needed to import into a database for testing. Since I was already in a Go project, I wanted to figure out how to manage the database schema dynamically. I have been using the GORM package to manage databases elsewhere, so it became a good excuse to test out the reflect package with dynamic structs of data.

Prototypically, statically dynamic

Traditionally, Go is not a very "dynamic" language, so I started off by using a sample record to figure out what it might look like. Starting from the JSON...

var sampleJSON = []byte(`{ "@type": "measurement",
  "@id": "a42fadde-ee76-4687-8f8f-303e083461e8",
  "at": "2020-10-29T23:17:21Z",
  "value": 79.3 }`)

I can then use the reflect package to define a few fields. The StructField type can first be used to list out the fields of the to-be-built struct. In this case, I know the @type must be a string, and any other field can be named Field# since the name doesn't really matter.

var dynamicFields = []reflect.StructField{
	{Name: "Table",
		Type: reflect.TypeOf(""),
		Tag:  reflect.StructTag(`json:"@type" gorm:"-"`)},
	{Name: "Field0",
		Type: reflect.TypeOf(""),
		Tag:  reflect.StructTag(`json:"@id" gorm:"column:_id;"`)},
	{Name: "Field1",
		Type: reflect.TypeOf(time.Time{}),
		Tag:  reflect.StructTag(`json:"at" gorm:"column:at"`)},
	{Name: "Field2",
		Type: reflect.TypeOf(float32(0)),
		Tag:  reflect.StructTag(`json:"value" gorm:"column:value"`)}}

Notice that the fields are defining both json tags (which will be used when reading the JSON input) as well as gorm tags (which will be used by the ORM when saving to the database).

Once the list of fields is ready, the StructOf function can be used to prepare a type. From there, I can create a new value of it and unmarshal the sample JSON into it.

var dynamicType = reflect.StructOf(dynamicFields)
var record = reflect.New(dynamicType).Interface()

err := json.Unmarshal(sampleJSON, &record)
if err != nil {
	log.Fatalf("unmarshaling sample: %s", err)
}
fmt.Printf("%#+v\n", record)
&struct {
  Table string "json:\"@type\" gorm:\"-\"";
  Field0 string "json:\"@id\" gorm:\"column:_id\"";
  Field1 time.Time "json:\"at\" gorm:\"column:at\"";
  Field2 float32 "json:\"value\" gorm:\"column:value\""
} {
  Table: "measurement",
  Field0: "a42fadde-ee76-4687-8f8f-303e083461e8",
  Field1: time.Time{wall:0x0, ext:63739610241, loc:(*time.Location)(nil)},
  Field2: 79.3
}
Because it is printing a dynamic struct, it uses the more verbose, inline format. But, all the output looks correct!

For the most part, GORM will extract the fields automatically, but we will want to pull out the Table field. Since this is a dynamic struct we cannot plainly reference record.Table; but we can use reflect again to get the string with its FieldByName function. In this case, the Elem function dereferences the pointer so that FieldByName operates on our dynamic struct value itself (and not on the pointer).

table := reflect.ValueOf(record).Elem().FieldByName("Table").String()
fmt.Printf("%s\n", table)

Next, I can use the ORM library to automatically CREATE/ALTER the table. Note that, in this situation, new fields can safely be added, but changing field types (e.g. float to boolean) is not supported.

if err := db.Table(table).AutoMigrate(record); err != nil {
	log.Fatalf("migrating table: %s", err)
}

And, finally, use the Create function to save our record in the database. Although GORM supports defining the table on the model to avoid the repetition, it was easier to use the Table function directly since this is a dynamic struct.

if err := db.Table(table).Create(record).Error; err != nil {
	log.Fatalf("creating record: %s", err)
}

Into the Dynamic

Now that we have some functional building blocks it's time to make it work from arbitrary data.

Building a Type

To start, we'll use a function that takes a generic JSON object and builds a reflect.Type from it.

func buildType(recordRaw map[string]interface{}) (reflect.Type, error) {

Within it, we prepare fields as a list of reflect.StructField values. Since a Table field is required, it gets hard-coded before ranging through the record's map.
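That hard-coded first entry might look like the following sketch, mirroring the tags from the earlier prototype (the exact initialization isn't shown in the snippets):

```go
package main

import (
	"fmt"
	"reflect"
)

// The required Table field is added up front: json reads it from
// "@type", while gorm:"-" keeps it out of the generated table schema.
var fields = []reflect.StructField{{
	Name: "Table",
	Type: reflect.TypeOf(""),
	Tag:  reflect.StructTag(`json:"@type" gorm:"-"`),
}}

func main() {
	fmt.Println(fields[0].Name, fields[0].Tag.Get("json"))
}
```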

for key, value := range recordRaw {

Inside the loop we can perform any special logic around converting keys or values for our data domain: for example, ignoring the @type key, or converting values to native types (like the time.Time type).
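The key-handling side might look like this sketch. The columnName mapping is an assumption on my part (the original dbkey logic isn't shown); it's guessed from the fact that "@id" ends up as the _id column:

```go
package main

import (
	"fmt"
	"strings"
)

// columnName guesses the column-name mapping: "@id" becomes "_id",
// while plain keys pass through unchanged. This is an assumption;
// the article's actual dbkey derivation is not shown.
func columnName(key string) string {
	return strings.ReplaceAll(key, "@", "_")
}

func main() {
	for _, key := range []string{"@type", "@id", "at", "value"} {
		if key == "@type" {
			continue // already captured by the hard-coded Table field
		}
		fmt.Printf("%s -> %s\n", key, columnName(key))
	}
}
```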

if valueT, ok := value.(string); ok && reValueRFC3339.MatchString(valueT) {
	valueTime, err := time.Parse(time.RFC3339, valueT)
	if err == nil {
		value = valueTime
	}
}
After we're done making changes, we add our generated reflect.StructField in a similar manner to the original prototype.

fields = append(fields, reflect.StructField{
	Name: fmt.Sprintf("Field%d", len(fields)),
	Type: reflect.TypeOf(value),
	Tag:  reflect.StructTag(fmt.Sprintf(`json:"%s" gorm:"column:%s"`, key, dbkey)),
})

And, once all the key/values are added, we can finally return back the generated struct.

return reflect.StructOf(fields), nil

Building a Value

Next, I add a function which takes care of building the type, creating a value, and then "remarshaling" it – marshaling back to JSON, then unmarshaling into the struct value – before returning it.

func buildRecord(recordRaw map[string]interface{}) (interface{}, error) {
	recordType, err := buildType(recordRaw)
	if err != nil {
		return nil, fmt.Errorf("building type: %s", err)
	}

	record := reflect.New(recordType).Interface()

	if err := remarshal(&record, recordRaw); err != nil {
		return nil, fmt.Errorf("updating struct: %s", err)
	}

	return record, nil
}
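The remarshal helper itself isn't shown; a minimal version, assuming it simply round-trips the raw map through JSON, might be:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// remarshal marshals src back to JSON and unmarshals the result into
// dst, moving the generic map data into the typed struct value.
func remarshal(dst interface{}, src interface{}) error {
	buf, err := json.Marshal(src)
	if err != nil {
		return err
	}
	return json.Unmarshal(buf, dst)
}

func main() {
	src := map[string]interface{}{"value": 79.3}
	var dst struct {
		Value float64 `json:"value"`
	}
	if err := remarshal(&dst, src); err != nil {
		panic(err)
	}
	fmt.Println(dst.Value)
}
```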

Adding Some Data

Finally, the main loop uses a json.Decoder to read in JSON Lines before using the functions described earlier to get the record into the database.

for {
	var recordRaw map[string]interface{}

	if err := jsonl.Decode(&recordRaw); err == io.EOF {
		break
	} else if err != nil {
		log.Fatalf("decoding input: %s", err)
	}

	record, err := buildRecord(recordRaw)
	if err != nil {
		log.Fatalf("building record: %s", err)
	}

	table := reflect.ValueOf(record).Elem().FieldByName("Table").String()

	if err := db.Table(table).AutoMigrate(record); err != nil {
		log.Fatalf("migrating table: %s", err)
	}

	if err := db.Table(table).Create(record).Error; err != nil {
		log.Fatalf("creating record: %s", err)
	}

	fmt.Printf("%#+v\n", record)
}

Running the program with the sample data piped in via STDIN, it parses the input, creates the table, and inserts the record.

go run . <<< '{ "@type": "measurement", "@id": "a42fadde-ee76-4687-8f8f-303e083461e8", "at": "2020-10-29T23:17:21Z", "value": 79.3 }'
&struct { Table string "json:\"@type\" gorm:\"-\""; Field1 string "json:\"@id\" gorm:\"column:_id\""; Field2 time.Time "json:\"at\" gorm:\"column:at\""; Field3 float64 "json:\"value\" gorm:\"column:value\"" }{Table:"measurement", Field1:"a42fadde-ee76-4687-8f8f-303e083461e8", Field2:time.Time{wall:0x0, ext:63739610241, loc:(*time.Location)(nil)}, Field3:79.3}

Looking directly at the database, we can verify the results as well.

sqlite3 main.sqlite \
  '.schema measurement' \
  'SELECT * FROM measurement'
CREATE TABLE `measurement` (`_id` text,`at` datetime,`value` real);
a42fadde-ee76-4687-8f8f-303e083461e8|2020-10-29 23:17:21+00:00|79.3


By the end, it was working well enough for testing and I learned more about the reflect package. Mission accomplished. Still, if you're working with a similar scenario, you might want to consider alternatives such as:

  • Use something other than Go – this kind of dynamic schema handling is not really an approach or use case that Go (or GORM) is designed for.
  • Use a library – the dynamic-struct package, for example, seems to provide a nicer abstraction for working with dynamic structs if you really need them.
  • Use GORM's schema management directly – a couple of subpackages seem responsible for managing table schemas and could possibly be used directly (instead of defining the schema via struct).