Composing Service Layers in Scala

The Problem

We previously described the standard design issues you encounter when you start creating layers of services, DAOs and other components to implement an application. That blog/gist is here.
The goal is to think through some designs in order to develop something useful for an application.

Working through Layers

If you compose services and DAOs the normal way, you typically get imperative style objects. For example, imagine the following:
  object DomainObjects {
    type UserId = Long
    case class User(id: UserId, properties: Map[String, Any])
  }

  import DomainObjects._

  /**
   * The service layer. Most service layers define the unit of work. Many
   * times a unit of work in the service layer is the same as that implemented
   * in the DAO layer, but that's because some internet examples are too small
   * to show the nuances. Since this layer defines units of work, this is where
   * the transaction strategy is also implemented. Spring implements this with
   * annotations and proxies/byte-code enhancers. We'll use explicit coding
   * because we are not using proxies/byte-code enhancers like Spring does.
   *
   * Classic cake pattern. Any implementation must define
   * a service val (or directly create an object) to satisfy
   * this abstract type.
   */
  trait UserServiceComponent {

    val service: UserService

    trait UserService {
      def updateName(user: UserId, newName: String): Either[String, User]
      def findById(id: UserId): Option[User]
    }
  }

  /**
   * This is the usual "interface" declaration.
   *
   */
  trait UserDaoComponent {

    val userDao: UserDao

    trait UserDao {
      /**
       * Find a user by their id.
       */
      def findById(user: UserId): Option[User]

      /**
       * Update the user properties, whatever has changed.
       */
      def update(user: User): Unit
    }
  }

  /**
   * The simple application auditing component.
   */
  trait AuditDaoComponent {

    /**
     * We define a def here. If you want singleton behavior,
     * define a private member _auditDao, instantiate that
     * then return it in the def. If you want a new auditDao
     * each time, create a new one each time this def is called.
     * DAOs traditionally do not hold any state, so which you
     * choose is not critical for most applications, but the
     * option is demonstrated here by using a def instead of a val.
     */
    def auditDao: AuditDao

    trait AuditDao {
      /**
       * Send the changes to an audit log somewhere.
       */
      def auditChange(user: User, changedProperties: Seq[String]): Unit
    }
  }
The next step would be to create the implementation classes. But here's where you get stuck. In Spring, you would use @Transactional and container injection (with or without Java config) to configure your objects. In other words, you would provide specific technology choices in the form of annotations like @Transactional or @Autowired. Since Scala and the cake pattern do not use byte-code generation or proxies, you have to be a bit more explicit with the technology choices. But we do not want to bake in a specific technology at the service and DAO "interface" level.
So what we really need to do is reframe the standard service and DAO model so that it is more flexible--which in this case means more composable.
In this case, we really need to return not the actual value directly, but a function that produces the value. This way, we can then wrap the functions and compose them together with others. For example, we can call the function asynchronously or synchronously.
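As a tiny sketch of this idea (the `Conn` type and all names here are hypothetical, not the article's code): by returning a function that still needs its environment, callers can wrap and compose before anything executes.

```scala
// Hypothetical stand-in for a real session/connection type.
case class Conn(name: String)

// Instead of returning the value directly, return a function
// that still needs its environment:
def findName(id: Long): Conn => Option[String] =
  conn => Some("John")

// Because nothing has run yet, we can wrap the function before calling it:
def logged[A](f: Conn => A): Conn => A =
  conn => { println("using " + conn.name); f(conn) }

val program: Conn => Option[String] = logged(findName(1L))
val result = program(Conn("test"))  // only now does anything execute
```

The wrapper could just as easily run the inner function asynchronously or inside a transaction; the caller decides when to supply the environment.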
The most obvious approach is to use the Reader monad, say from scalaz:
  trait UserServiceComponent[T] {

    val service: UserService

    trait UserService {
      def updateName(user: UserId, newName: String): Reader[T, Either[String, User]]
      def findById(id: UserId): Reader[T, Option[User]]
    }
  }
Because we did not want to specify the actual "value" in the reader that would be "injected" into the function, we parameterized the type. We could also have the UserService take a constructor parameter that contains an environment, so we need to think this through. But there are issues with parameterization, namely that the parameter gets carried throughout the API. We do not want to lock ourselves into the Reader monad either, as that reduces flexibility. Genericizing our DAO structures with type parameters can work, but we may want to try existential types as well so we can mix in the context and let the service/DAO object consume it as it sees fit. Since existential types are a form of cake layer, we are choosing cake over type parameterization. The cake layer, at least using self-types, also helps us ensure that the dependencies are always available within the context. It therefore not only enforces a certain way of composing our services/DAOs but also helps with dependency management, which is one of our objectives.
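A small sketch of that trade-off (all names hypothetical): with a type parameter, every dependent trait must repeat it, while an abstract type member keeps the context inside the cake until assembly time.

```scala
// With a type parameter, every layer that touches the DAO repeats T:
trait UserDaoP[T] { def findById(id: Long): T => Option[String] }
trait UserServiceP[T] { val dao: UserDaoP[T] }  // T leaks upward through the API

// With an abstract type member, the context stays internal to the cake:
trait Env { type Context }
trait UserDaoM { self: Env =>
  def findById(id: Long): Context => Option[String]
}

// The concrete context is only chosen when the cake is assembled:
object MyComponents extends Env with UserDaoM {
  type Context = String
  def findById(id: Long): Context => Option[String] =
    ctx => Some(ctx + ":" + id)
}
```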
First let's take a quick look at a standard approach of using a cake layer to weave in the database needed to open a transaction/session.

Remembering Slick

Let's look at how to provide some simple Slick-specific parts to the service and DAO implementations. We saw in a previous blog that we could do something like:
  /**
   * If you only define queries, you do not need a database; you just need a profile.
   * This profile is all that is needed to mix in.
   */
  trait ProfileContext {
    /**
     * To allow automatic query lifting with a Session: {{{ import profile.Implicit._ }}}
     */
    val profile: BasicProfile
  }

  /**
   * A cake layer with abstract members that say a database object
   * and a driver profile are available. This
   * allows you to "lift" queries for execution in the driver and to create
   * transactions to wrap your queries. This conceptually corresponds to providing
   * a Spring @Transactional annotation, giving you control over units of work, and
   * a JPA EntityManager so you can define and execute queries. This is one way to
   * mix in the database so we can get access to objects that can provide us
   * transactions. We could skip this trait and use the DataComponent trait.
   *
   * If you are using this component to
   * open a transaction {{{database.withTx}}}, you are not using proxies or AOP to slice
   * in transactional behavior like Spring does, so you have to be explicit. The Spring
   * @Transactional annotation and the container wrap each method in a transaction
   * automatically.
   *
   * This component allows us, in a totally type safe way, to provide access
   * to the instance that allows us to run transactions. We could have
   * abstracted the database away and just said there is an object there that
   * provides a transaction method {{{withTx}}} using duck typing, but
   * we just stuck in the member as the database object to make it apparent where
   * it was coming from. We could also abstract the profile away. Since we include
   * that mostly to lift our queries into the driver via implicits, we could
   * just have provided some implicit def that did that as well. But again, it was
   * easier just to provide the profile. You'll need to call {{{import profile.Implicit._}}}
   * in your code so that {{{Query}}}s are lifted into the driver to call {{{.run}}}.
   */
  trait DatabaseContext extends ProfileContext {
    /**
     * This is a def because using a val (you can't use a var) forces
     * us to define the object at compile time, whereas most databases
     * are configured and opened dynamically in an application based
     * on configuration parameters (e.g. host name) you want to
     * connect to. Always use defs for flexibility unless you know
     * you can configure the object to be set at compile time. Many
     * times you can, but for databases that must be "opened" after
     * the program starts, you need to use a def.
     *
     * The path-dependent type {{{profile.simple.Database}}} ties this
     * member to the profile mixed in above.
     */
    def database: profile.simple.Database
  }
And then our service and DAO could be something like:
  /**
   * This layer adds a simple implementation. Since this layer
   * uses database specific calls, it needs the database within scope
   * in order to create a transaction.
   *
   * The implementation assumes some knowledge of how to create
   * a transaction or how to create a session context for the lower level
    * layers (the DAOs). By including a DatabaseContext we bake in knowledge of the Profile type and
   * enforce the need for having a database and profile members so that
   * the service layer can handle transactions (through the database object).
   * We could really use any "object" that is specific to the technology.
   * Strictly, the business layer should only have knowledge about creating
   * a transaction, but we want to remain flexible.
   *
   * The database is accessed merely to be able to open transactions
   * which define units of work, and these should be managed in the
   * business layer. Without bytecode generation or proxies as in spring, we cannot
   * intercept the "methods" on the class and wrap them in the transaction
   * automatically. But avoiding proxies and bytecode generation is kind of
   * the point of static typing.
   *
   * We could also convert this to a class and
   * take a constructor parameter, or set the database via a "setter".
   * We could also fiddle with implicits (which is much harder in this
   * scenario). If we dropped the type parameter, we would need to
   * ensure that the injected DAOs (through setters) had the right type.
   *
   */
  trait UserServiceImpl extends UserServiceComponent {

    self: UserDaoComponent with DatabaseContext =>

    /**
     * Override the "forced" val definition so that the type is more specific.
     * You don't have to do this if the standard UserService interface
     * is all that you need. You don't need the "override" here but we use
     * it to be clear that it's the same definition as in the supertrait.
     */
    override val service: UserService

    class UserService extends super.UserService {

      import profile.simple._

      /**
       * Update the name and return the new user object. This is considered
       * an Unit of Work for our application.
       */
      def updateName(user: UserId, newName: String): Either[String, User] = {
        database.withTx { implicit session =>
          val x = CypherQuery("""start n=node(*) return n""")
          x.run

          val newUser: Option[User] = for {
            user <- findById(user)
            changes = Map("name" -> newName)
            newUser <- Option(user.copy(properties = user.properties ++ changes))
          } yield newUser

          // Only run for the side-effect of the update
          newUser.foreach(userDao.update)
          newUser.toRight("No user " + user + " found")
        }

      }

      /**
       * This is more of a pass through.
       */
      def findById(user: UserId): Option[User] = {
        database.withTx { implicit session =>
          userDao.findById(user)
        }

      }

      /**
       * Another service method that just shows you can add methods and that the
       * abstract type member "service" is better off with the more specific
       * UserService type declared in this UserService implementation. Instances
       * accessing the "service" can see this method directly since the type
       * is specific. Again, you may or may not want this in your design.
       */
      def doSomethingTransactionalWithUser(user: UserId): Boolean =
        database.withTx { implicit session =>
          println("Doing something transactional with user: " + user)
          true
        }

    }
  }
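Assembling these layers could look roughly like the following, where a stubbed `withTx` and a fake session stand in for the real Slick pieces (the stub types are assumptions for illustration, not the article's code):

```scala
object Domain { type UserId = Long; case class User(id: UserId, properties: Map[String, Any]) }
import Domain._

trait UserDaoComponent {
  val userDao: UserDao
  trait UserDao { def findById(id: UserId): Option[User] }
}

// Stub in place of the Slick DatabaseContext: just enough to "open" a transaction.
trait DatabaseContext {
  class Db { def withTx[A](f: String => A): A = f("fake-session") }
  def database: Db
}

trait UserServiceImpl { self: UserDaoComponent with DatabaseContext =>
  class UserService {
    def findById(id: UserId): Option[User] =
      database.withTx { implicit session => userDao.findById(id) }
  }
}

// The final assembly satisfies all the self-type requirements at once.
object App extends UserServiceImpl with UserDaoComponent with DatabaseContext {
  val userDao = new UserDao { def findById(id: UserId) = Some(User(id, Map("name" -> "John"))) }
  def database = new Db
  val service = new UserService
}
```

The compiler refuses to instantiate `UserServiceImpl` unless both a `UserDaoComponent` and a `DatabaseContext` are mixed in, which is the dependency-management benefit of the self-type.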
But this only helps with ensuring required dependencies are available (in this case a Slick implementation); it does not help composability. What it does show us is that to get transactional behavior, we need a database-like object. Also, if, as in Slick, queries are to be defined in the service, we would need a "profile" as well. It's pretty easy to see that if we want to abstract away the specific technology choice, we need to introduce an abstraction at the "UserService" level, and whatever we introduce needs to be technology agnostic.
So it looks like if we want to use a constructor parameter or cake layer, it will probably need to carry the "database," the "profile," and something that defines a return type for the service/DAO methods so they can return a function instead of a direct value.

Improving Composability

To improve composability, methods need to return functions instead of just raw values. If we did not do this, the imperative nature of the DAO would not allow us to compose a sequence of DAO calls or wrap functions within functions which is a more functional way of writing code.
We know that the implementations will need a technology-specific context to execute under--an environment. So we need an abstraction for the environment and an easy way to apply it. We have a bunch of choices:
  • Put this technology-specific context at the UserDao trait level (and make UserDao a class) so that it becomes a constructor argument. But then a new UserDao must be constructed each time you need to use the DAO. This restricts your choices of how to handle the technology-specific component and could limit future design choices.
  • Provide a type parameter. This could also work, but it sometimes makes inheritance more restricted.
  • Use an abstract type member (the overall object is then known as an existential type). This also allows us to combine the context members across all components that are instantiated together, so one context can satisfy multiple components' needs. But this sounds a bit too open ended.
  • Use the Reader from scalaz or, if we wanted, say, a Kleisli object, either of which can lift a function (A => M[B]) into an environment. But using Reader is actually rather restrictive; perhaps someone wants to use their own Reader or equivalent in their functions.
So the options all look good, but we can take some lessons from the Typesafe Slick driver. The Slick drivers use a lifted-embedding design to lift the queries (which are object types separate from the driver) into the driver for execution. It uses implicit defs that, when brought into scope using import profile.simple._, lift the query object into the appropriate driver. That seems like a decent way to do this.
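A toy version of that lifting trick, with hypothetical names in place of the real Slick API: the query is a plain value, and an implicit def brought into scope converts it into something with a `run` method.

```scala
import scala.language.implicitConversions

case class Query[T](result: T)                 // stands in for a Slick Query
class Invoker[T](q: Query[T]) { def run: T = q.result }

// Stands in for profile.simple: importing its members enables the lift.
object profileSimple {
  implicit def queryToInvoker[T](q: Query[T]): Invoker[T] = new Invoker(q)
}

import profileSimple._
val answer = Query(42).run   // Query has no .run; the implicit lifts it
```

Without the import, `Query(42).run` does not compile, which is exactly how Slick scopes its driver-specific behavior.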

Playing with a Cake Layer and a Lifting Layer and a Composable Abstraction

We need an abstraction that allows us to store some state if needed, specify a return type for our service/DAO methods, allow each method to obtain a context value when called, and allow us to compose the functions together. The cue from the slick layer suggests:
  /**
   * The environment that will be mixed into objects
   * that need to make database calls. Concrete classes
   * may have requirements for the client, for example,
   * to use certain imports or provide specific  methods.
   *
   * Subclasses can carry additional state that helps
   * the components execute queries against a backing
   * repository e.g. RDBMS or a REST service. Additional
   * state used across the services/DAOs can go in the 
   * Environment class or it could be provided to functions
   * on a per function call basis using the {{{Context}}}.
   */
  trait Environment {

    /**
     * The context used for each {{{Method}}} invocation in the environment.
     * It has no type bounds restrictions.
     */
    type Context

    /**
     * The type that any function in the service/DAO
     * should return. It is NOT {{{Method[Context, T]}}}, to
     * allow implementors flexibility in their API.
     * @tparam T the return value of the method
     */
    type Method[T]

    /**
     * Subclasses can stick other lifters and helper functions in the
     * Helpers trait and import them for use under their direct scope control.
     */
    trait Helpers {
      /**
       * Lift {{{Method}}} objects into the Environment. We could just allow
       * each subclass to define its own lift approach e.g. using implicit classes
       * or companion objects but this enforces the requirement that you lift
       * the function.
       */
      implicit def createExecutor[T](m: Method[T]): Executor[T]
    }

    /**
     * Import helpers explicitly to get the lifting action.
     * Subclasses should override their own val and subclass
     * this trait's {{{Helpers}}} trait.
     */
    val helpers: Helpers

    /**
     * An executor wraps a function and contains the extensible logic for
     * executing the method. The basic executor definition could call the method
     * asynchronously. Subclasses would provide their API inside the {{{Executor}}}
     * for how they want the service/DAO method called. If there are no special
     * tricks, the default apply method should work of course.
     */
    trait Executor[T] extends (Context => T)
  }
This seems useful. It is essentially technology agnostic--that is, the Environment is technology agnostic but specifies all of the elements we itemized. The abstract type Method is used as the return type from every service/DAO method that you wish to make more composable.
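To see that the Environment really is technology agnostic, here is a minimal in-memory implementation (the Map-based context is an assumption for illustration; the Environment trait is restated so the sketch is self-contained):

```scala
import scala.language.implicitConversions

trait Environment {
  type Context
  type Method[T]
  trait Executor[T] extends (Context => T)
  trait Helpers { implicit def createExecutor[T](m: Method[T]): Executor[T] }
  val helpers: Helpers
}

// No database anywhere: the "session" is just a Map and Method is a plain function.
trait MapEnvironment extends Environment {
  type Context = Map[String, String]
  type Method[T] = Context => T

  trait Helpers extends super.Helpers {
    implicit def createExecutor[T](m: Method[T]): Executor[T] =
      new Executor[T] { def apply(c: Context): T = m(c) }
  }
  val helpers = new Helpers {}
}

object TestEnv extends MapEnvironment
import TestEnv.helpers._

val find: TestEnv.Method[Option[String]] = ctx => ctx.get("name")
val exec: TestEnv.Executor[Option[String]] = find  // lifted via the implicit
```

The same service code written against `Method[T]` would work unchanged whether the context is a Map, a JDBC session, or a REST client.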
We can now define the UserServiceComponent using this mixin:
  /**
   * An example service that uses the environment.
   * By having each method take a type parameter, you can alter
   * the context provided to each method or just use the {{{Context}}}
   * type directly.
   */
  trait UserServiceComponent2 {
    self: Environment =>

    val userService: UserService

    trait UserService {
      def updateName(user: UserId, newName: String): Method[Either[String, User]]
      def findById(name: UserId): Method[Option[User]]
      def findByName(name: String): Method[Option[User]]
    }
  }
We see that the service/DAO is still technology agnostic but does use types from our environment. We can now make a Slick specific environment:
  /**
   * An environment that uses Slick fairly simply and the scalaz Reader
   * to obtain the context for each method call, which in this case
   * is a Session object. Mixing in this environment also forces
   * the definition of the slick {{{profile}}} and a {{{database}}} instance
   * which needs to be defined using the standard slick approach.
   * 
   * If you need to mix more than one SlickEnvironment into the same
   * instance, use subtraits of this trait and then mix them in. Path dependent types
   * will ensure that objects cannot mix across the environments.
   */
  trait SlickEnvironment extends Environment {

    import scalaz._
    import Scalaz._

    val profile: slick.driver.JdbcProfile
    def database: profile.simple.Database

    import profile.simple._

    type Context = Session
    type Method[T] = Reader[Context, T]

    trait Helpers extends super.Helpers {
      override implicit def createExecutor[T](m: Method[T]): Executor[T] = {
        new Executor(m)
      }
    }

    val helpers = new Helpers {}

    /**
     * The default action just calls the reader with the session.
     */
    class Executor[T](reader: Method[T]) extends super.Executor[T] {

      /**
       * Run the function with a specific context.
       */
      def apply(context: Context): T = reader(context)

      /**
       * Get the value using an implicit session.
       */
      def get(implicit session: Session): T = reader(session)
    }
  }
This just uses the Session object from Slick as the context, and the environment ensures that a profile and database are defined and available. The profile is available through a well-formed path. We can now define the Slick-specific service component:
  /**
   * Define tables and queries for the user service. Note that this
   * section assumes a flat user model (id, name, email columns) rather
   * than the property-map {{{User}}} shown earlier.
   */
  trait UserTableComponents {
    self: SlickEnvironment =>

    import profile.simple._

    class Users(tag: Tag) extends Table[User](tag, "Users") {
      val id = column[Long]("id", O.AutoInc, O.PrimaryKey)
      val name = column[String]("name")
      val email = column[Option[String]]("email")
      def * = (id, name, email) <> (User.tupled, User.unapply)
    }

    lazy val Users = TableQuery[Users]
  }

  trait SlickUserServiceComponent2 extends UserServiceComponent2 {

    self: SlickEnvironment with UserTableComponents =>

    import helpers._

    class UserService extends super.UserService {

      import profile.simple._

      override def updateName(user: UserId, newName: String): Method[Either[String, User]] =
        Reader { implicit session =>
          findById(user).get match {
            case Some(u) =>
              val x = Users.where(_.id === user).update(u.copy(name = newName))
              findById(user).get.toRight("Could not find updated user with id: " + user)
            case _ => Left("Could not find user with user id: " + user)
          }
        }

      override def findById(id: UserId): Method[Option[User]] =
        Reader { implicit context =>
          (for (u <- Users if u.id === id) yield u).firstOption
        }

      override def findByName(name: String): Method[Option[User]] =
        Reader { implicit session =>
          Users.filter(_.name === name).firstOption
        }
    }
  }
And create a small program to test it:
object TestEnvironment {

  import slick.driver.H2Driver

  object AppConfig extends SlickUserServiceComponent2
    with SlickEnvironment with UserTableComponents {
    val profile = H2Driver.profile
    val database = profile.simple.Database.forURL("jdbc:h2:mem:test;DB_CLOSE_DELAY=-1", driver = "org.h2.Driver")
    val userService = new UserService()
  }

  def main(args: Array[String]): Unit = {
    println("testing environment with user service")

    import AppConfig._
    import profile.simple._
    import DomainObjects._

    var theId: UserId = -1
    database.withSession { implicit session =>
      Users.ddl.create
      Users ++= Seq(User(-1, "John Smith", Some("jsmith@acme.com")),
        User(-1, "Alice Jacobs", Some("ajacobs@acme.com")))
      theId = (for (u <- Users if u.name === "John Smith") yield u.id).firstOption.get
    }

    import helpers._

    // Create a query that is composed with other queries. These queries
    // are composed using the service directly and could be used with other
    // queries all at the service or DAO level. And they can be composed
    // external to the actual database session.
    def findAndChange(name: String, newName: String) = for {
      u <- userService.findByName(name)
      // ... more service calls ...
      u2 <- userService.updateName(u.get.id, newName)
    } yield u2

    val func = findAndChange("John Smith", "Johnny Smith")

    // Run the new query that performs the update.
    database.withSession { implicit session =>
      func.get
    }

    // Get the User object again to check that it was changed
    database.withTransaction { implicit session =>
      val rval = userService.findByName("Johnny Smith").get
      rval match { 
        case Some(u) => println("user found: " + u)
        case _ => println("Could not find user")
      }
    }

  }
}
That seems to meet the objectives. You could create some Executors that allow asynchronous execution using, say, a future. There are lots of choices you can make with the Executor. Since existential types are used, you can also create multiple, finely-sliced mixins and mix them together as needed.
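For example, an asynchronous Executor might be sketched like this (a hypothetical variant, not part of the SlickEnvironment above):

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._
import scala.language.implicitConversions
import ExecutionContext.Implicits.global

trait AsyncEnvironment {
  type Context
  type Method[T] = Context => T

  // Executor variant that can also run the method off the calling thread.
  class AsyncExecutor[T](m: Method[T]) {
    def apply(c: Context): T = m(c)                  // synchronous call
    def async(c: Context): Future[T] = Future(m(c))  // asynchronous call
  }
  implicit def toExecutor[T](m: Method[T]): AsyncExecutor[T] = new AsyncExecutor(m)
}

object AsyncEnv extends AsyncEnvironment { type Context = String }
import AsyncEnv._

val greet: AsyncEnv.Method[String] = ctx => "hello " + ctx
val fut = greet.async("world")  // lifted via the implicit, runs in a Future
```

Because the service methods only return `Method[T]`, switching between a synchronous and an asynchronous Executor requires no change to the service code itself.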

The above Environment is not perfect and can be improved, and there are a few issues buried in it, but the idea takes a step in the right direction. Read the entire article, including the code, on the GitHub gist: https://gist.github.com/aappddeevv/8509607


The article touches on: scala, cake pattern, spring, DAO, slick, RDBMS and service layers.
