Configuration

Compile time configuration

The macro needs a connection to the database at compile time to read schema metadata. The connection is configured with the following system properties:

  • sqltyped.url
  • sqltyped.driver
  • sqltyped.username
  • sqltyped.password
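
For example, one way to make these visible to the macro is to set them in the build definition, so they are present in the JVM that runs the compiler. Below is a sketch using sbt's initialize setting; it assumes compilation is not forked, and the URL and credentials are placeholders:

    // build.sbt (sketch): set the connection properties for the compiler JVM.
    // Adjust url, driver, username and password to match your database.
    initialize ~= { _ =>
      System.setProperty("sqltyped.url", "jdbc:mysql://localhost:3306/sqltyped")
      System.setProperty("sqltyped.driver", "com.mysql.jdbc.Driver")
      System.setProperty("sqltyped.username", "root")
      System.setProperty("sqltyped.password", "")
    }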

In addition, the macro needs an instance of Configuration, which is passed as an implicit parameter.

    object Tables { trait person; trait job_history }
    object Columns { object name; object age; object salary; object count; object avg }
    implicit val c = Configuration(Tables, Columns)

In object Tables you need to create a type for each table. These types will be used when tagging PK and FK columns. See Tagging.

In object Columns you need to create a type for each column accessed with record syntax. See Working with records.

Note, future versions will create these types automatically using type macros (not yet available in Scala).

Runtime configuration

The function sql returns an instance of QueryN, where N is the input arity. Concretely, a query with two arguments, a String and an Int, returns an instance of

    trait Query2[R] {
      def apply(i1: String, i2: Int)(implicit conn: java.sql.Connection): List[R]
    }

To execute the query with the apply function, an implicit java.sql.Connection must be available.

Defining and executing SQL statements

SQL statements are defined with the function sql.

    val q = sql("select age, name from person where age > ?")

To execute the statement, a connection to the database is needed.

    implicit val conn = ...
    val rows = q(20)

A SQL select statement returns its results as records. See Working with records.

    rows map (_.show) mkString("\n")
    // { name = joe, age = 36 }
    // { name = moe, age = 14 }

In addition to the function sql, there are a couple of specialized versions.

  • sqlk

Works like sql but returns the generated keys as a result. Useful in some SQL insert cases (see the sketch after this list).

  • sqlt

Works like sql but tags input arguments too. See Tagging.
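
As a sketch of how sqlk might be used (the insertPerson name is made up, and we assume the person table has an auto-generated primary key; the exact type of the returned key depends on the schema):

    // sqlk returns the database-generated key instead of an update count;
    // like sql, executing it requires an implicit java.sql.Connection in scope
    val insertPerson = sqlk("insert into person(name, age) values (?, ?)")
    val newId = insertPerson("bill", 45)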

Connection & transaction management

Sqltyped does not provide any connection pooling or transaction management solution. Existing solutions can be used instead. The integration is easy: just expose a connection as an implicit parameter, as in the following Slick example.

    import scala.slick.session.Database
    val db = Database.forURL("jdbc:mysql://localhost:3306/sqltyped", 
                             driver = "com.mysql.jdbc.Driver", user = "root", password = "")
    implicit def conn = Database.threadLocalSession.conn

    db withTransaction {
      sql("update person set name=? where id=?").apply("danny", 1)
      sql("update person set name=? where id=?").apply("fanny", 2)
    }

    db withSession {
      sql("select name from person where id=?").apply(1)
    }
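
For completeness, the simplest possible setup, with no pooling or transaction management, is a plain JDBC connection exposed as an implicit value (a sketch; the URL and credentials are placeholders):

    // Plain JDBC connection, no pooling — fine for experiments, not for production.
    implicit val conn: java.sql.Connection =
      java.sql.DriverManager.getConnection("jdbc:mysql://localhost:3306/sqltyped", "root", "")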

Tagging

If a column is a primary or foreign key, its type is tagged. For instance, a column which references 'person.id' is typed as Long @@ person. That funny little @@ symbol is a type tag from the Shapeless project. It adds extra type information to an otherwise simple type and can be used for extra type safety in data access code.

    scala> def findName(id: Long @@ person) = sql("select name from person where id=?").apply(id)

    scala> sql("select person from job_history").apply map findName

The above code compiles because 'job_history.person' is a foreign key referencing 'person.id'. Thus, its type is Long @@ person.

Note, input parameters are not tagged (just typed). Otherwise this wouldn't compile:

    sql("select name,age from person where id=?").apply(1)

Instead, explicit tagging would have been required:

    sql("select name,age from person where id=?").apply(tag[person](1))

There's a separate function called sqlt (t as in tagged) which tags input parameters too.

    scala> val q = sqlt("select name,age from person where id=?")
    scala> q.apply(1)
    <console>:31: error: type mismatch;
     found   : Int(1)
     required: shapeless.TypeOperators.@@[Long,Tables.person]
        (which expands to)  Long with AnyRef{type Tag = Tables.person}
                  q.apply(1)

    scala> q.apply(tag[person](1))
    res2: Option[shapeless.::[(Columns.name.type, String),shapeless.::[(Columns.age.type, Int),shapeless.HNil]]] = 
      Some((Columns$name$@d6d0dbe,joe) :: (Columns$age$@72a13bd4,36) :: HNil)

Working with records

Introduction

A record is a type-safe list of key-value pairs. Scala does not have a native record system, but records can be emulated with HLists. Before going into the details of how to do that, let's recap how records differ from case classes, a native Scala construct.

Scala provides case classes, which are a very nice way to represent data.

    case class Person(name: String, age: Int)

    val p = Person("Joe", 36)

While case classes are nice, they have some limitations:

  • case classes must be named
  • we cannot easily modify the schema of a case class (add or remove fields)

Often those limitations do not matter, but a data structure which lifts them would be quite nice for representing data read from a database. After all, we do not have a name for each SELECT query, and sometimes we want to add (or remove) fields in the query results before further processing.
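
For instance, to "add" a field to a case class value we must define a whole new class and copy the data over (a small illustrative sketch; the names are made up):

    // A new class is needed just to carry one extra field.
    case class PersonWithEmail(name: String, age: Int, email: String)
    val pe = PersonWithEmail(p.name, p.age, "joe@example.com")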

The same data can be represented as an extensible record too.

    record Person = { name: String, age: Int }

    val p = Person { name = "Joe", age = 36 }

Now, of course, that's not valid Scala syntax. As said, we do not have a first-class record system available. Fortunately, emulating records with HLists turns out to work quite well (though compilation failures are sometimes pretty horrendous :).

    object name
    object age
    type Person = (name.type, String) :: (age.type, Int) :: HNil

    val p: Person = (name -> "Joe") :: (age -> 36) :: HNil

We defined types for the record keys, name and age, then defined the record type as an HList of key/value pairs. A bit verbose, but naming the record type is optional.

    object name
    object age

    val p = (name -> "Joe") :: (age -> 36) :: HNil

Using records

Basic data access and data modifications are supported.

    scala> val p = (name -> "Joe") :: (age -> 36) :: HNil

    scala> p.show
    res0: String = { name = Joe, age = 36 }
    scala> p get name
    res1: String = Joe

    scala> object lname
    scala> p get lname
    <console>:19: error: No such key lname.type
                  p get lname

    scala> val p2 = (lname -> "Doe") :: p
    scala> p2.show
    res2: String = { lname = Doe, name = Joe, age = 36 }
    scala> p2 get lname
    res3: String = Doe

    scala> val p3 = p removeKey name
    scala> p3.show
    res4: String = { age = 36 }

    scala> val p4 = p renameKey(name, lname)
    scala> p4.show
    res5: String = { lname = Joe, age = 36 }

    scala> val p5 = p.modify(name)((s: String) => s.length)
    scala> p5 get name
    res6: Int = 3

Advanced use is possible with the HList utilities provided by Shapeless, like mapping over record fields.

    object toUpper extends Poly1 {
      implicit def stringToUpper[K] = at[(K, String)] {
        case (k, v) => (k, v.toUpperCase)
      }

      implicit def otherTypes[K, V] = at[(K, V)](identity)
    }

    scala> val p6 = p map toUpper
    scala> p6.show
    res7: String = { name = JOE, age = 36 }

We have to define toUpper as a polymorphic function. If a value is of type String, we uppercase it; for other types we just use the identity function.

JSON example

One particular advantage of records over case classes is that it is possible to implement polymorphic functions without resorting to runtime reflection. As a concrete example, let's see how a record could be converted to a JSON string.

    object toJSON extends Poly1 {
      implicit def numToJSON[V <% Number] = at[V](_.toString)
      implicit def stringToJSON[V <% String] = at[V]("\"" + _.toString + "\"")
      implicit def boolToJSON[V <% Boolean] = at[V](_.toString)
      implicit def dateToJSON[V <% java.util.Date] = at[V]("\"" + _.toString + "\"")

      implicit def seqToJSON[V, L <% Seq[V]](implicit c: Case1[V]) =
        at[L](_.map(v => toJSON(v)).mkString("[", ",", "]"))

      implicit def recordToJSON[R <: HList](implicit foldMap: MapFolder[R, String, fieldToJSON.type]) = {
        val concat = (s1: String, s2: String) => if (s2 != "") s1 + "," + s2 else s1

        at[R](r => "{" + (r.foldMap("")(fieldToJSON)(concat)) + "}")
      }

      object fieldToJSON extends Poly1 {
        implicit def value[K, V](implicit c: toJSON.Case1[V]) = at[(K, V)] {
          case (k, v) => "\"" + keyAsString(k) + "\":" + toJSON(v)
        }

        implicit def option[K, V](implicit c: Case1[(K, V)]) = at[(K, Option[V])] {
          case (k, Some(v)) => fieldToJSON((k, v)).toString
          case (k, None) => ""
        }
      }
    }

Whoa! That begs an explanation.

We begin by introducing a function toJSON which can convert Scala values to JSON values. First, some primitive types are handled.

    implicit def numToJSON[V <% Number] = at[V](_.toString)

That case handles any value V <% Number, that is, any value which can be converted to a Number. The implementation renders it as a JSON value. Other primitives are rendered in the same way.

Record fields are rendered with the function fieldToJSON. Remember that a record field is simply a tuple (K, V).

    implicit def value[K, V](implicit c: toJSON.Case1[V]) = at[(K, V)] {
      case (k, v) => "\"" + keyAsString(k) + "\":" + toJSON(v)
    }

To render a field we need evidence that its value can be rendered as JSON too. The evidence is enforced by adding an implicit parameter toJSON.Case1[V].

Handling Option type is pretty straightforward too.

    implicit def option[K, V](implicit c: Case1[(K, V)]) = at[(K, Option[V])] { 
      case (k, Some(v)) => fieldToJSON((k, v)).toString
      case (k, None) => ""
    } 

To render a field (K, Option[V]) we need evidence that (K, V) can be rendered as JSON. When the value is Some, we extract the value and render it by recursively calling the fieldToJSON function. In the case of None, nothing is rendered.

Then we introduce a case which can render one record as a JSON object.

    implicit def recordToJSON[R <: HList](implicit foldMap: MapFolder[R, String, fieldToJSON.type]) = {
      val concat = (s1: String, s2: String) => if (s2 != "") s1 + "," + s2 else s1

      at[R](r => "{" + (r.foldMap("")(fieldToJSON)(concat)) + "}")
    }

Shapeless provides many functions for working with HLists. Here we utilize foldMap, which is a combination of the common fold and map functions. Our foldMap maps over each field of the record using the function fieldToJSON, then folds over the results to build a String. Scala can't infer the types here; that's why they are explicitly listed in foldMap: MapFolder[R, String, fieldToJSON.type]. If it could, the essence of the function would be clearer:

    implicit def recordToJSON[R <: HList] = {
      val concat = (s1: String, s2: String) => if (s2 != "") s1 + "," + s2 else s1

      at[R](r => "{" + (r.foldMap("")(fieldToJSON)(concat)) + "}")
    }

Now we have a function which can convert query results to JSON.

    scala> val rows = sql("select id, name as fname, age from person limit 100").apply
    scala> toJSON(rows)
    res0: String = [{"id":1,"fname":"joe","age":36},{"id":2,"fname":"moe","age":14}]

The module sqltyped-json4s contains a ready-made implementation which uses json4s for JSON rendering.

Runtime query building

To infer the types, the SQL statement must be known statically at compile time. But what if we have to build the SQL at runtime? Say, for instance, that some parts of the query depend on user input. It turns out that if we relax the inference requirements a bit, we can add some support for runtime query building.

When the SQL is fully known at compile time we can use full type inference:

sql("select age, name from person where age > ? and name = ?")

// Inferred type will be:
(Int, String) => List[{ age: Int, name: String }]

To support runtime query building we do not try to infer the types of input parameters:

sql"select age, name from person where $whereExpr"

// Inferred type will be:
Seq[Any] => List[{ age: Int, name: String }]

Note, we changed to the string interpolation syntax. It allows building the where-part of the SQL statement dynamically. Here the sqlτyped macro parses only the beginning of the statement to infer the return types. Input parameters are not inferred and therefore can't be verified at compile time. Now we can build the rest of the query at runtime:

    scala> val where = "age > ?" + " and " + "age < ?"
    scala> val orderBy = "age"
    scala> val q = sql"select name from person where $where order by $orderBy"
    scala> q.apply(Seq(5, 100))
    res0: List[String] = List("moe", "joe")

If you find string concatenation lame (which, of course, it is), you can quickly craft a small library which lets you compose the where-part of the query, as in the example below.

    sealed trait Expr {
      def sql: String = this match {
        case Predicate(e, _) => e
        case And(l, r) => "(" + l.sql + " and " + r.sql + ")"
        case Or(l, r) => "(" + l.sql + " or " + r.sql + ")"
      }

      def args: Seq[Any] = this match {
        case Predicate(_, as) => as
        case And(l, r) => l.args ++ r.args
        case Or(l, r) => l.args ++ r.args
      }

      def and(other: Expr) = And(this, other)
      def or(other: Expr)  = Or(this, other)
    }

    case class Predicate(sqlExpr: String, arguments: Seq[Any]) extends Expr
    case class And(l: Expr, r: Expr) extends Expr
    case class Or(l: Expr, r: Expr) extends Expr

    def pred(sql: String, args: Any*) = Predicate(sql, args)

Query composition is now a lot nicer:

    val p1 = pred("age > ?", 15)
    val p2 = pred("age < ?", 2)
    val p3 = pred("length(name) < ?", 6)

    val expr = (p1 or p2) and p3

    sql"select name from person where ${expr.sql}".apply(expr.args)