jonifreeman edited this page Nov 1, 2012 · 47 revisions

Configuration

Compile time configuration

The macro needs a connection to the database at compile time to read schema metadata. The connection is configured with the following system properties:

  • sqltyped.url
  • sqltyped.driver
  • sqltyped.username
  • sqltyped.password
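These properties can be set in the build before compilation. For instance, an sbt fragment along these lines (the URL and credentials below are placeholder assumptions):

```scala
// build.sbt fragment (one possible way; values are assumptions):
initialize ~= { _ =>
  System.setProperty("sqltyped.url", "jdbc:mysql://localhost:3306/sqltyped")
  System.setProperty("sqltyped.driver", "com.mysql.jdbc.Driver")
  System.setProperty("sqltyped.username", "root")
  System.setProperty("sqltyped.password", "")
}
```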

In addition, the macro needs an instance of Configuration, which is passed as an implicit parameter.

    object Tables { trait person; trait job_history }
    object Columns { object name; object age; object salary; object count; object avg }
    implicit val c = Configuration(Tables, Columns)

In object Tables you need to create a type for each table. These types will be used when tagging primary and foreign key columns. See Tagging.

In object Columns you need to create a type for each column accessed with record syntax. See Working with records.

Note: future versions will create these types automatically using type macros (not yet available in Scala).

Runtime configuration

The sql function returns an instance of a QueryN, where N is the input arity. Concretely, a query with two arguments, a String and an Int, returns an instance of

    trait Query2[R] {
      def apply(i1: String, i2: Int)(implicit conn: java.sql.Connection): List[R]
    }

To execute the query with the apply function, an implicit java.sql.Connection must be in scope.
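To make that shape concrete, here is a minimal self-contained sketch of such a query type. The names and the stand-in implementation below are assumptions for illustration, not sqltyped's actual generated code:

```scala
import java.sql.Connection

// Hypothetical sketch of the generated shape: input types come from the SQL text.
trait Query2[A1, A2, R] {
  def apply(i1: A1, i2: A2)(implicit conn: Connection): List[R]
}

// Stand-in implementation; a real query would run a PreparedStatement
// against the connection instead of building a string.
val q = new Query2[String, Int, String] {
  def apply(i1: String, i2: Int)(implicit conn: Connection): List[String] =
    List(i1 + "-" + i2.toString)
}

// The caller only needs an implicit Connection in scope (null stand-in here).
implicit val conn: Connection = null
val result = q("hello", 42)
```

The point is the shape of apply: positional, statically typed arguments plus an implicit connection provided by the surrounding context.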

Connection & transaction management

Sqltyped does not provide a connection pooling or transaction management solution; existing solutions can be used instead. Integration is easy: just expose a connection as an implicit parameter, as in the following Slick example.

    import scala.slick.session.Database
    val db = Database.forURL("jdbc:mysql://localhost:3306/sqltyped", 
                             driver = "com.mysql.jdbc.Driver", user = "root", password = "")
    implicit def conn = Database.threadLocalSession.conn

    db withTransaction {
      sql("update person set name=? where id=?").apply("danny", 1)
      sql("update person set name=? where id=?").apply("fanny", 2)
    }

    db withSession {
      sql("select name from person where id=?").apply(1)
    }

Tagging

If a column is a primary or foreign key, its type is tagged. For instance, a column which references 'person.id' is typed as Long @@ person. That funny little @@ symbol is a type tag from the Shapeless project. It adds extra type information to an otherwise simple type and can be used for extra type safety in data access code.

    scala> def findName(id: Long @@ person) = sql("select name from person where id=?").apply(id)

    scala> sql("select person from job_history").apply map findName

The above code compiles because 'job_history.person' is a foreign key referencing 'person.id'. Thus, its type is Long @@ person.
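For illustration, the tagging mechanism itself can be sketched in a few lines. This is modeled after Shapeless' TypeOperators; the encoding and names below are simplified assumptions:

```scala
// A tag is pure type-level information: no runtime wrapper is allocated.
type Tagged[U] = { type Tag = U }
type @@[T, U] = T with Tagged[U]

class Tagger[U] {
  def apply[T](t: T): T @@ U = t.asInstanceOf[T @@ U]
}
def tag[U] = new Tagger[U]

trait person

// A tagged Long is still a Long at runtime, but carries 'person' in its type.
val personId: Long @@ person = tag[person](1L)

def findName(id: Long @@ person): String = "name of person " + id

// findName(1L) would not compile: a plain Long lacks the person tag.
val found = findName(personId)
```

Because the tag exists only at the type level, tagged values incur no boxing or runtime cost.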

Note that input parameters are not tagged (just typed). Otherwise this wouldn't compile:

    sql("select name,age from person where id=?").apply(1)

Instead, explicit tagging would have been required:

    sql("select name,age from person where id=?").apply(tag[person](1))

There's a separate function called sqlt (t for tagged) which tags input parameters too.

    scala> val q = sqlt("select name,age from person where id=?")
    scala> q.apply(1)
    <console>:31: error: type mismatch;
     found   : Int(1)
     required: shapeless.TypeOperators.@@[Long,Tables.person]
        (which expands to)  Long with AnyRef{type Tag = Tables.person}
                  q.apply(1)

    scala> q.apply(tag[person](1))
    res2: Option[shapeless.::[(Columns.name.type, String),shapeless.::[(Columns.age.type, Int),shapeless.HNil]]] = 
      Some((Columns$name$@d6d0dbe,joe) :: (Columns$age$@72a13bd4,36) :: HNil)

Working with records

Introduction

A record is a type-safe list of key-value pairs. Scala does not have a native record system, but records can be emulated with HLists. Before going into the details of how to do that, let's recap how records differ from case classes, a native Scala construct.

Scala provides case classes, which are very nice for representing data.

    case class Person(name: String, age: Int)

    val p = Person("Joe", 36)

While case classes are nice, there are some limitations:

  • case classes must be named
  • we cannot easily modify the schema of a case class (add or remove fields)

Often those limitations do not matter, but a data structure which lifts them would be quite nice for representing data read from a database. After all, we do not have a name for each SELECT query, and sometimes we'd like to add (or remove) something in the query results before further processing.

The above data can be represented as an extensible record too.

    record Person = { name: String, age: Int }

    val p = Person { name = "Joe", age = 36 }

Now, of course, that's not valid Scala syntax. As said, we do not have a first-class record system available. Fortunately, emulating records with HLists turns out to work quite well.

    object name
    object age
    type Person = (name.type, String) :: (age.type, Int) :: HNil

    val p: Person = (name -> "Joe") :: (age -> 36) :: HNil

We defined types for the record keys, name and age, and then defined the record as an HList of key/value pairs. A bit verbose, but naming the record type is optional:

    object name
    object age

    val p = (name -> "Joe") :: (age -> 36) :: HNil
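As a self-contained illustration of the idea, the HList encoding can be sketched without any library at all. The following is a simplified stand-in, not Shapeless' actual encoding; note how each field keeps its precise key and value types:

```scala
// A simplified HList stand-in (not Shapeless' actual encoding):
sealed trait HList
final case class HCons[H, T <: HList](head: H, tail: T) extends HList
case object HNil extends HList

object name
object age

// The record: each field is a (key, value) pair with its precise type,
// here (name.type, String) and (age.type, Int).
val p = HCons(name -> "Joe", HCons(age -> 36, HNil))

val personName: String = p.head._2      // statically known to be a String
val personAge: Int     = p.tail.head._2 // statically known to be an Int
```

Unlike a Map[Any, Any], no casts are needed: the compiler tracks the type of every field individually.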

Using records

Basic data access and data modifications are supported.

    scala> val p = (name -> "Joe") :: (age -> 36) :: HNil

    scala> p.show
    res0: String = { name = Joe, age = 36 }
    scala> p get name
    res1: String = Joe

    scala> object lname
    scala> p get lname
    <console>:19: error: No such key lname.type
                  p get lname

    scala> val p2 = (lname -> "Doe") :: p
    scala> p2.show
    res2: String = { lname = Doe, name = Joe, age = 36 }
    scala> p2 get lname
    res3: String = Doe

    scala> val p3 = p removeKey name
    scala> p3.show
    res3: String = { age = 36 }

    scala> val p4 = p renameKey(name, lname)
    scala> p4.show
    res4: String = { lname = Joe, age = 36 }

    scala> val p5 = p.modify(name)((s: String) => s.length)
    scala> p5 get name
    res5: Int = 3

Advanced use is possible with the HList utilities provided by Shapeless, such as mapping over record fields.

    object toUpper extends Poly1 {
      implicit def stringToUpper[K] = at[(K, String)] {
        case (k, v) => (k, v.toUpperCase)
      }

      implicit def otherTypes[K, V] = at[(K, V)](identity)
    }

    scala> val p6 = p map toUpper
    scala> p6.show
    res6: String = { name = JOE, age = 36 }

We have to define toUpper as a polymorphic function: if a value is of type String, we uppercase it; for other types we just use the identity function.

JSON example

One particular advantage of records over case classes is that it is possible to implement polymorphic functions without resorting to runtime reflection. As a concrete example, let's see how a record could be converted to a JSON string.

    object JSON {
      object fieldToJSON extends Poly1 {
        implicit def numToJSON[K, V <% Number] = at[(K, V)] {
          case (k, v) => "\"" + keyAsString(k) + "\":" + v.toString
        }
        implicit def stringToJSON[K, V <% String] = at[(K, V)] {
          case (k, v) => "\"" + keyAsString(k) + "\":\"" + v + "\""
        }
        implicit def boolToJSON[K, V <% Boolean] = at[(K, V)] {
          case (k, v) => "\"" + keyAsString(k) + "\":" + v.toString
        }
        implicit def tstampToJSON[K, V <% java.sql.Timestamp] = at[(K, V)] {
          case (k, v) => "\"" + keyAsString(k) + "\":\"" + v.toString + "\""
        }
        implicit def optionToJSON[K, V](implicit c: Case1[(K, V)]) = at[(K, Option[V])] {
          case (k, Some(v)) => fieldToJSON((k, v)).toString
          case (k, None) => ""
        }
      }

      def toJSON[R <: HList](record: R)(implicit foldMap: MapFolder[R, String, fieldToJSON.type]): String = {
        val concat = (s1: String, s2: String) => if (s2 != "") s1 + "," + s2 else s1

        "{" + (record.foldMap("")(fieldToJSON)(concat)) + "}"
      }

      def toJSON[R <: HList](records: List[R])(implicit foldMap: MapFolder[R, String, fieldToJSON.type]): String =
        (records map (r => toJSON(r))).mkString("[", ",", "]")
    }

Whoa! That begs an explanation.

We begin by introducing a function fieldToJSON which converts a record field to a JSON field. Remember that a record field is simply a tuple (K, V). First, some primitive types are handled.

    implicit def numToJSON[K, V <% Number] = at[(K, V)] {
      case (k, v) => "\"" + keyAsString(k) + "\":" + v.toString
    }

That case handles a field (K, V <% Number), that is, any field whose value can be viewed as a Number. The implementation destructures the field and renders it as a JSON field.

Handling Option type is pretty straightforward too.

    implicit def optionToJSON[K, V](implicit c: Case1[(K, V)]) = at[(K, Option[V])] { 
      case (k, Some(v)) => fieldToJSON((k, v)).toString
      case (k, None) => ""
    }

To render a field (K, Option[V]) we need evidence that (K, V) can be rendered as JSON too. The evidence is enforced by adding an implicit parameter Case1[(K, V)]. When the value is Some, we extract the value and render it by recursively calling the fieldToJSON function. In the case of None, nothing is rendered.

Then we introduce a function which can render one record as a JSON object.

    def toJSON[R <: HList](record: R)(implicit foldMap: MapFolder[R, String, fieldToJSON.type]): String = {
      val concat = (s1: String, s2: String) => if (s2 != "") s1 + "," + s2 else s1

      "{" + (record.foldMap("")(fieldToJSON)(concat)) + "}"
    }

Shapeless provides many functions for working with HLists. Here we utilize foldMap, which is a combination of the common fold and map functions. Our foldMap folds over the record, mapping each field with fieldToJSON, and builds a String. Scala can't infer the types here; that's why they are explicitly listed in the implicit parameter foldMap: MapFolder[R, String, fieldToJSON.type].
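The concat logic itself can be illustrated on an ordinary List of already-rendered field strings (assuming, for illustration, a right fold with an empty seed):

```scala
// concat skips an empty right-hand side so no dangling comma is produced
val concat = (s1: String, s2: String) => if (s2 != "") s1 + "," + s2 else s1

val joined = List("\"id\":1", "\"age\":36").foldRight("")(concat)

// a field rendered as "" (a None, for example) at the end is dropped cleanly
val withEmpty = List("\"id\":1", "").foldRight("")(concat)
```

This is why concat tests s2 rather than unconditionally inserting a comma: empty renderings from None fields and the empty seed both disappear without leaving a separator behind.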

Finally, a function to render a List of records as a JSON array is introduced.

    def toJSON[R <: HList](records: List[R])(implicit foldMap: MapFolder[R, String, fieldToJSON.type]): String = 
      (records map (r => toJSON(r))).mkString("[", ",", "]")

Now we have a function which can convert query results to JSON.

    scala> val rows = sql("select id, name as fname, age from person limit 100").apply
    scala> JSON.toJSON(rows)
    res0: String = [{"id":1,"fname":"joe","age":36},{"id":2,"fname":"moe","age":14}]

Building queries dynamically

TBD
