
PostgreSQL to SQLite: The Journey

10.21.2013

This article will be useful if you want to support both PostgreSQL and SQLite using JDBC. It will be especially useful if you:

  • Are already accessing values from your (PostgreSQL) database using the regular JDBC ResultSet interface, like:
    Date d = rs.getDate("date_field");
    BigDecimal bd = rs.getBigDecimal("bigdecimal_field");
    and doing the same for SQLite is causing trouble, but you don't want to change that code.
  • Are already retrieving autogenerated keys in PostgreSQL with a RETURNING clause, but this won't work in SQLite. You want a unified solution that works for both databases.
  • Thought foreign keys were enforced in SQLite by default (as in PostgreSQL) and hit a wall. SQLite lets you delete entries from your tables even when they are referenced in another table and you have explicitly told SQLite about the relationship with a REFERENCES table_name(field_name) clause.
  • Are having trouble with the differences between PostgreSQL and SQLite dialects (mostly concerning data types), for example, when making query filters with boolean values.
  • Had your own way to manage exceptions for PostgreSQL and it is not working for SQLite (obviously). You want SQLite to fit into the model you already have.
  • Other issues might appear as you go...
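On the foreign keys point above: SQLite has supported enforcement since version 3.6.19, but it is off by default and must be switched on for every new connection. A minimal sketch of how that might look over JDBC (the helper class and database path are my own, not part of the article's framework):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class SqliteConnections {
    // Opens an SQLite connection with foreign key enforcement turned on.
    // The PRAGMA only applies to the connection that issued it, so it has
    // to be run right after every getConnection() call.
    public static Connection open(String dbPath) throws Exception {
        Connection conn = DriverManager.getConnection("jdbc:sqlite:" + dbPath);
        try (Statement st = conn.createStatement()) {
            st.execute("PRAGMA foreign_keys = ON");
        }
        return conn;
    }
}
```

With this in place, deleting a referenced row raises an SQLException instead of silently succeeding.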

A few months ago I wanted to migrate an app to use SQLite as a data backend. In fact, I wanted it to work with both PostgreSQL and SQLite interchangeably (but not at the same time). I wanted to switch between these two databases easily without changing any code. I did it, but along the way I had to solve some problems that might be interesting to many other people.

Many solutions I found were spread across the web, but there was no single place that explained how to completely achieve what I wanted. So, the aim of this post is to condense my learning into one article that may serve others as a (semi) complete guide. It might be useful not only to those creating their own frameworks, but also to anyone who doesn't use one and is willing to try some quirks and tricks to make their app work.

THE BEGINNING

There are many cross-database incompatibilities between PostgreSQL and SQLite, most notably around data types. If you want the same code to work for both databases, you'd better use a framework that manages this for you. But here's the thing: the framework I use is one I created myself, and it didn't (completely) take these differences into account, since I mainly use PostgreSQL as the database; that's how and why my problems arose.

My framework covers many things, but here I'll focus on the data access part. It uses a JDBC driver to connect to the databases, but provides more abstract ways to do so; that's pretty much the data access part of the framework.

A basic DAO class for my framework would look like this:

public class MyDAO extends BaseDAO {
    public MyDAO() {
        super("context_alias", new DefaultDataMappingStrategy() {
            @Override
            public Object createResultObject(ResultSet rs) throws SQLException {
                MyModel model = (MyModel)ObjectsFactory.getObject("my_model_alias");
               
                model.setStringField(rs.getString("string_field"));
                model.setIntegerField(rs.getInt("integer_field"));
                model.setBigDecimalField(rs.getBigDecimal("bigdecimal_field"));
                model.setDateField(rs.getDate("date_field"));
                model.setBooleanField(rs.getBoolean("boolean_field"));
               
                return model;
            }
        });
    }

    @Override
    public String getTableName() {
        return "table_name";
    }

    @Override
    public String getKeyFields() {
        return "string_field|integer_field";
    }
   
    @Override
    protected Map getInsertionMap(Object obj) {
        Map map = new HashMap();
        MyModel model = (MyModel) obj;
        map.put("string_field", model.getStringField());
        map.put("integer_field", model.getIntegerField());
        map.put("bigdecimal_field", model.getBigDecimalField());
        map.put("date_field", model.getDateField());
        map.put("boolean_field", model.getBooleanField());
        return map;
    }
   
    @Override
    protected Map getUpdateMap(Object obj) {
        Map map = new HashMap();
        MyModel model = (MyModel) obj;
        map.put("bigdecimal_field", model.getBigDecimalField());
        map.put("date_field", model.getDateField());
        map.put("boolean_field", model.getBooleanField());
        return map;
    }

    @Override
    public String getFindAllStatement() {
        return "SELECT * FROM :@ ";
    }
}

So, when I say I wanted to switch between databases without changing code, I mean without changing my DAO classes.

For SQLite, I used the xerial sqlite-jdbc driver. I mention drivers because some behaviors are driver-specific; so when I say 'SQLite does it this way', I generally mean 'the xerial sqlite-jdbc driver does it this way'.
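Since both drivers speak JDBC, switching backends can come down to swapping the connection URL while the rest of the data access code stays untouched. A sketch of the idea (the URLs, class, and method names here are my own illustration, not the framework's actual API):

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class ConnectionFactory {
    // Picks the JDBC URL for the configured backend. This is the only
    // per-database piece; the DAOs never see which driver is in use.
    public static String jdbcUrl(boolean useSqlite) {
        return useSqlite
            ? "jdbc:sqlite:app.db"                      // xerial sqlite-jdbc
            : "jdbc:postgresql://localhost:5432/app";   // PostgreSQL JDBC
    }

    public static Connection open(boolean useSqlite, String user, String pass)
            throws Exception {
        return DriverManager.getConnection(jdbcUrl(useSqlite), user, pass);
    }
}
```

In practice the flag would come from a configuration file, so flipping databases is a one-line config change.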

Now, let's start.

WARNING: Some of the solutions I give here fit into my framework, but might not directly fit into your code. It's up to you to imagine how to adapt what I provide here.

DATA TYPES

Since there are some differences between PostgreSQL and SQLite regarding data types, and I wanted to continue to access database values through the regular ResultSet interface, I had to have some mechanism to intercept the call to, for instance, resultset.getDate("date_field"). So I created a ResultSetWrapper class that would redefine the methods I was interested in, like this:

public class ResultSetWrapper implements ResultSet {
    
    // The wrapped ResultSet
    ResultSet wrapped;

    /* I use this DateFormat to parse dates, assuming SQLite's ISO-8601
       text style. Note MM (month), not mm (minutes). Hard-coding the
       pattern here is a shortcut I shouldn't really take. */
    SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd");

    public ResultSetWrapper(ResultSet wrapped) {
        this.wrapped = wrapped;
    }

    /* Lots of ResultSet methods implementations go here, 
       but this is an example of redefining a method 
       I'm interested in changing its behavior: */

    public Date getDate(String columnLabel) throws SQLException {
        Object value = this.wrapped.getObject(columnLabel);
        return (Date)TypesInferreer.inferDate(value);
    }	
}

The getDate() method in ResultSetWrapper relies on TypesInferreer to convert the value retrieved to a Date value.

All data type conversions would be encapsulated inside TypesInferreer, which would have methods to convert between data types as needed. For instance, it would have a method like this one:

public static Object inferDate(Object value) {
    java.util.Date date = null;

    // Do conversions here (convert value and assign to date)

    return date;
}

This method tries to convert any value to a Date (I'll show the actual implementation later).
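To make the stub concrete, here is one plausible shape for inferDate. It assumes the driver may hand back an actual java.util.Date (the PostgreSQL case), a number (epoch milliseconds), or ISO-8601 text (the SQLite case); this is my own sketch, not the article's eventual implementation:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class TypesInferreer {
    // SQLite has no real DATE type, so the driver may return whatever
    // TEXT or INTEGER was stored. This covers the likely cases.
    public static Object inferDate(Object value) {
        if (value == null) return null;
        if (value instanceof Date) return value;            // already a Date
        if (value instanceof Number)                        // epoch millis
            return new Date(((Number) value).longValue());
        try {                                               // ISO-8601 text
            return new SimpleDateFormat("yyyy-MM-dd").parse(value.toString());
        } catch (ParseException e) {
            throw new IllegalArgumentException(
                "Cannot infer date from: " + value, e);
        }
    }
}
```

Note that SimpleDateFormat is not thread-safe, so in real code the formatter should not be shared across threads without care.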

Now, instead of using the original ResultSet returned by preparedStatement.executeQuery(), you use new ResultSetWrapper(preparedStatement.executeQuery()). That's what my framework does: it passes this wrapped ResultSet to the DAO objects.

Now let's see some type conversions.

Published at DZone with permission of its author, Martín Proenza.


Comments

Lukas Eder replied on Sat, 2013/11/23 - 3:00am

When looking at your journey, I wonder if you might've enjoyed considering something like jOOQ in the first place... A lot of the SQL dialect standardisation that you're describing has been taken care of, so you don't have to worry about these things.

For instance, fetching generated IDs is standardised in a way that resembles PostgreSQL's awesome INSERT .. RETURNING clause:

// Add another author, with a generated ID
Record<?> record =
create.insertInto(AUTHOR, AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME)
      .values("Charlotte", "Roche")
      .returning(AUTHOR.ID)
      .fetchOne();
