Easy way to convert JSON to Avro

Ben

October 25, 2019

We have recently started using Apache Avro, primarily for use with Apache Kafka and the Confluent Schema Registry. Previously we had been using JSON as the primary data format for the REST APIs into our services, and we continue to do so. For this we use the Jackson JSON serializer to encode and decode the data from incoming bytes into Java bean definitions. We used this approach so we could define the models as Java classes, which could then include documentation and validation rules using tools such as Hibernate Validator and Enunciate.


With the move to Avro we wanted to avoid needing both a Java class definition and an Avro schema, as this would result in two definitions that could easily get out of sync with each other. So we set ourselves the task of making the REST APIs accept JSON and convert it into the Avro-generated objects.


Generating Avro

First we need the build process to generate the Java classes from the Avro schema files; for this we can use the Avro Maven plugin. This is a simple process and only requires the plugin to be configured in the pom.xml. The generated classes are then included in the build artifact.

{code}
<plugin>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-maven-plugin</artifactId>
    <version>1.8.2</version>
    <executions>
        <execution>
            <id>schemas</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>schema</goal>
                <goal>protocol</goal>
                <goal>idl-protocol</goal>
            </goals>
            <configuration>
                <sourceDirectory>${project.basedir}/src/main/avro</sourceDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>
{code}

Type conversion

Now that we have the types generated, we need to work out how to convert them from one type to another. As mentioned, we use the Jackson library for serialising JSON and wanted to keep using it. After some searching and testing of our own, we found that converting JSON to Avro was actually quite straightforward, as you can use the normal way of decoding JSON into a Java type:


{code}
jsonMapper.readValue( bytes, clazz )
{code}


Converting the Avro type to JSON was a bit more complicated: we encountered problems with the Jackson serializer picking up properties on the Avro type that we did not want on the JSON object, such as the schema that is attached via the schema property of all Avro-generated objects.


To solve this problem we used the mixin feature of Jackson. This allows you to attach extra serialization annotations to types processed by the ObjectMapper, altering the result of the serialization (or deserialization).


We defined the mixin as follows:

{code}
import org.apache.avro.Schema;
import org.apache.avro.specific.SpecificData;

import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;

/**
 * This is used as a mixin to Jackson to allow conversion of Avro types to JSON and back.
 */
@JsonIgnoreProperties(ignoreUnknown = true)
abstract class AvroJsonMixin
{
    /**
     * Ignore the Avro schema property.
     */
    @JsonIgnore
    abstract Schema getSchema();

    /**
     * Ignore the specific data property.
     */
    @JsonIgnore
    abstract SpecificData getSpecificData();
}
{code}


This mixin tells Jackson to ignore the schema and specificData properties when serializing to JSON. It also tells Jackson to ignore any unknown properties when converting JSON to the Avro types; we do this to better support adding fields in new versions of our APIs without having to update all clients to send the new field. With the mixin defined, we need to register it with the ObjectMapper:


{code}
new ObjectMapper().addMixIn( SpecificRecordBase.class, AvroJsonMixin.class );
{code}

When we register the mixin, the first parameter is the target type, i.e. the type the mixin should be applied to. As we want this to affect all Avro types, we use SpecificRecordBase.class, as this is the base class of all Avro types generated by the Maven plugin.
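As a quick illustration, here is a sketch of the round trip with a mixin-configured mapper. Note that `UserEvent` is a hypothetical stand-in for any class generated by the Avro Maven plugin, not a class from the original code.

```java
import org.apache.avro.specific.SpecificRecordBase;

import com.fasterxml.jackson.databind.ObjectMapper;

public class AvroJsonRoundTrip
{
    public static void main( final String[] args ) throws Exception
    {
        // Register the mixin against the common Avro base class so it
        // applies to every generated type.
        final ObjectMapper mapper = new ObjectMapper()
                .addMixIn( SpecificRecordBase.class, AvroJsonMixin.class );

        // JSON -> Avro: extra fields in the payload are ignored thanks to
        // @JsonIgnoreProperties(ignoreUnknown = true) on the mixin.
        // "UserEvent" is an assumed Avro-generated class.
        final byte[] incoming = "{\"id\":\"42\",\"someNewField\":true}".getBytes();
        final UserEvent event = mapper.readValue( incoming, UserEvent.class );

        // Avro -> JSON: the schema and specificData properties are excluded
        // by the @JsonIgnore annotations on the mixin.
        final String json = mapper.writeValueAsString( event );
    }
}
```
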


Using with Jersey

Now that we are able to convert JSON to Avro and back again, we want to incorporate this into the Jersey framework that we use for our REST APIs. In Jersey we use the JacksonFeature to enable the use of JSON directly in the REST API. This feature creates and manages the ObjectMappers for us, so we need a way to configure the ObjectMapper used by the JacksonFeature. This can be done by adding a ContextResolver for the ObjectMapper type and registering it with the Jersey configuration.


{code}
import javax.ws.rs.ext.ContextResolver;
import javax.ws.rs.ext.Provider;

import org.apache.avro.specific.SpecificRecordBase;

import com.fasterxml.jackson.databind.ObjectMapper;

@Provider
public class ObjectMapperResolver implements ContextResolver<ObjectMapper>
{
    @Override
    public ObjectMapper getContext( final Class<?> type )
    {
        return new ObjectMapper().addMixIn( SpecificRecordBase.class, AvroJsonMixin.class );
    }
}
{code}
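The resolver still needs to be registered with Jersey alongside the JacksonFeature. A minimal sketch, assuming a standard Jersey 2 ResourceConfig setup (the class name `ApiApplication` is ours, not from the original code):

```java
import org.glassfish.jersey.jackson.JacksonFeature;
import org.glassfish.jersey.server.ResourceConfig;

public class ApiApplication extends ResourceConfig
{
    public ApiApplication()
    {
        // Enable Jackson-based JSON (de)serialization in the REST layer.
        register( JacksonFeature.class );

        // Make the JacksonFeature use our mixin-configured ObjectMapper.
        register( ObjectMapperResolver.class );
    }
}
```
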


Now that we have the resolver, mapper and feature configured, we can define our REST APIs using the Avro-generated types, letting Jackson manage the conversion from JSON to Avro objects.


{code}
@POST
@PermitAll
public void test( YourAvroType avroType )
{code}


Conclusion

With this approach we maintain a single definition of the model that is used in the REST APIs, and subsequently in the Kafka streams. We can also provide model definitions for all the languages that might use our REST APIs, or at least those supported by Avro.


It also means we do not have to perform any manual conversion between JSON and Avro types in the business logic, letting us concentrate on the business logic itself and reducing the code we have to test and maintain.

Ben

Experienced developer in various languages, currently a product owner of nerd.vision, leading the back-end architecture.