Python - TypeError: Object of type 'int64' is not JSON serializable (works for int, float, array, and other data types)

1. Overview


In this tutorial, you will learn how to solve the Python errors "TypeError: Object of type 'int64' is not JSON serializable" and "TypeError: (Integer) is not JSON serializable". These errors occur when working with JSON serialization. The solutions shown in this article work not only for the int or int64 type but also for float, array, and so on.

Below is the scenario where my friend faced the problem.



2. Problem 1: TypeError: Object of type 'int64' is not JSON serializable


I have a DataFrame that stores store names and daily sales counts. I am trying to insert this into Salesforce using the Python script below. I, however, get an error:

    TypeError: Object of type 'int64' is not JSON serializable

Given below is the view of the Dataframe

Storename,Count
Store A,10
Store B,12
Store C,5


I use the code below to insert it into Salesforce:

update_list = []
for i in range(len(store)):
    update_data = {
        'name': store['entity_name'].iloc[i],
        'count__c': store['count'].iloc[i]
    }
    update_list.append(update_data)

sf_data_cursor = sf_datapull.salesforce_login()
sf_data_cursor.bulk.Account.update(update_list)

The error occurs when the last line above is executed.

Error:

TypeError: Object of type 'int64' is not JSON serializable

You can try solving the problem yourself before looking at the answer.

3. Solution 1 for TypeError: Object of type 'int64' is not JSON serializable


JSON does not recognize NumPy data types. Convert the number to a plain Python int before serializing the object. Just add an int typecast to the store['count'].iloc[i] value:

'count__c': int(store['count'].iloc[i])
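Putting it together, here is a minimal, self-contained sketch of the fix. The store DataFrame is built inline with toy data; the column names entity_name and count are taken from the question, and the Salesforce upload step is omitted so the example runs anywhere:

```python
import json

import pandas as pd

# Toy stand-in for the DataFrame from the question; the counts are numpy.int64.
store = pd.DataFrame({'entity_name': ['Store A', 'Store B'], 'count': [10, 12]})

update_list = []
for i in range(len(store)):
    update_data = {
        'name': store['entity_name'].iloc[i],
        # int() converts numpy.int64 to a plain Python int that json can handle
        'count__c': int(store['count'].iloc[i]),
    }
    update_list.append(update_data)

print(json.dumps(update_list))
# → [{"name": "Store A", "count__c": 10}, {"name": "Store B", "count__c": 12}]
```

Without the int() cast, the json.dumps call raises the TypeError from the question.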

4. Solution 2 to TypeError: Object of type 'int64' is not JSON serializable


Import the numpy package and create your own encoder that handles any NumPy data type.


import json
import numpy as np

class NpEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, np.integer):
            return int(obj)
        elif isinstance(obj, np.floating):
            return float(obj)
        elif isinstance(obj, np.ndarray):
            return obj.tolist()
        else:
            return super(NpEncoder, self).default(obj)

# Your code ...
json.dumps(data, cls=NpEncoder)
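A quick, self-contained check of the encoder with a sample value of each handled type (the dict keys here are illustrative):

```python
import json

import numpy as np

class NpEncoder(json.JSONEncoder):
    # Same encoder as above: map NumPy scalars/arrays to native Python types.
    def default(self, obj):
        if isinstance(obj, np.integer):
            return int(obj)
        if isinstance(obj, np.floating):
            return float(obj)
        if isinstance(obj, np.ndarray):
            return obj.tolist()
        return super().default(obj)

data = {'count': np.int64(10), 'ratio': np.float64(0.5), 'values': np.array([1, 2, 3])}
print(json.dumps(data, cls=NpEncoder))
# → {"count": 10, "ratio": 0.5, "values": [1, 2, 3]}
```

The advantage over Solution 1 is that you fix serialization once, in one place, instead of hunting down every NumPy value in your payloads.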


5. Solution 3: typecast all values in the CSV file to String


Another option is to treat all values as strings when creating the DataFrame from the CSV file. For that, pass "dtype=str", which sets the type of every field to str, and strings are JSON serializable by default.

For example, if you loaded the store from a CSV file:

import pandas as pd
store = pd.read_csv('store.csv', dtype=str)


Then every value has type str and can be serialized to JSON.
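A small sketch of this approach, using an in-memory CSV (io.StringIO stands in for the store.csv file, with the columns shown earlier) so the example is self-contained:

```python
import io
import json

import pandas as pd

# In-memory stand-in for store.csv.
csv_text = "Storename,Count\nStore A,10\nStore B,12\nStore C,5\n"

# dtype=str makes every column a plain Python string.
store = pd.read_csv(io.StringIO(csv_text), dtype=str)

print(json.dumps(store.to_dict(orient='records')))
```

Note the trade-off: the counts arrive as strings ("10", not 10) on the receiving end, so this only suits consumers that accept numbers as text.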

6. Problem 2: preserve int64 values when parsing JSON in Go


The problem was faced in the program below.

I am processing a JSON POST in Go that contains an array of objects containing 64-bit integers. When using json.Unmarshal, these values seem to be converted to float64, which isn't very helpful.

body := []byte(`{"tags":[{"id":4418489049307132905},{"id":4418489049307132906}]}`)

var dat map[string]interface{}
if err := json.Unmarshal(body, &dat); err != nil {
    panic(err)
}

tags := dat["tags"].([]interface{})

for i, tag := range tags {
    // panics: json.Unmarshal decoded the ids as float64, not int64
    fmt.Println("tag:", i, "id:", tag.(map[string]interface{})["id"].(int64))
}

Is there any way to preserve the original int64 in the output of json.Unmarshal?

7. Solution 1: To preserve the int64 type in unmarshalling


You can use a json.Decoder with UseNumber to decode your numbers without loss.

The Number type is defined like this:

// A Number represents a JSON number literal.
type Number string

which means you can easily convert it:

package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "strconv"
)

func main() {
    body := []byte(`{"tags":[{"id":4418489049307132905},{"id":4418489049307132906}]}`)
    dat := make(map[string]interface{})
    d := json.NewDecoder(bytes.NewBuffer(body))
    d.UseNumber() // keep numbers as json.Number instead of float64
    if err := d.Decode(&dat); err != nil {
        panic(err)
    }
    tags := dat["tags"].([]interface{})
    n := tags[0].(map[string]interface{})["id"].(json.Number)
    u64, _ := strconv.ParseUint(string(n), 10, 64)
    fmt.Println(u64) // prints 4418489049307132905
}


8. Solution 2: To preserve the int64 type in unmarshalling


You can also decode into a specific structure tailored as per your needs.

package main

import (
    "encoding/json"
    "fmt"
)

type A struct {
    Tags []map[string]uint64 `json:"tags"`
}

func main() {
    body := []byte(`{"tags":[{"id":4418489049307132905},{"id":4418489049307132906}]}`)
    var a A
    if err := json.Unmarshal(body, &a); err != nil {
        panic(err)
    }
    fmt.Println(a.Tags[0]["id"]) // prints 4418489049307132905
}

Personally, I generally prefer this solution; it feels more structured and easier to maintain.

Note:

A small note if you use JSON because your application is partly in JavaScript: JavaScript has no 64-bit integer type, only one number type, the IEEE 754 double-precision float. So you would not be able to parse this JSON in JavaScript without loss using the standard parsing function.
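The loss is easy to demonstrate. Python's float is the same IEEE 754 double, so a round trip through it mangles the id from the example above:

```python
# IEEE 754 doubles carry 53 bits of mantissa; this id needs about 62 bits,
# so it cannot be represented exactly as a double.
original = 4418489049307132905
round_tripped = int(float(original))

print(round_tripped == original)  # → False
```

The same corruption happens silently when JavaScript's JSON.parse turns such an id into a Number, which is why large ids are often transported as strings.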

9. Conclusion


In this article, we have seen solutions to the serialization problems we face with the int, int64, float, and array data types. We showed the problems here for the int type, but the same solutions work for float and the remaining types as well.

Give it a try, and keep posting your questions in the comment section for answers.
