# Extensions for PostgreSQL
In this chapter we discuss the extended support that **doobie** offers for users of PostgreSQL. To use these extensions you must add an additional dependency to your project:

```scala
libraryDependencies += "org.tpolecat" %% "doobie-postgres" % "1.0.0-RC4"
```
This library pulls in the PostgreSQL JDBC driver as a transitive dependency.

There are also extensions available for dealing with JSON via Circe; if you would like to use those, include this dependency as well:

```scala
libraryDependencies += "org.tpolecat" %% "doobie-postgres-circe" % "1.0.0-RC4"
```
Then you will be able to import the implicits for dealing with JSON:

```scala
import doobie.postgres.circe.json.implicits._
import doobie.postgres.circe.jsonb.implicits._
```
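As a quick illustration, these implicits provide `Get`/`Put` instances for circe's `Json` type, so a `jsonb` column can be read directly as a `Json` value. This is a sketch only; the `docs` table and `body` column are hypothetical.

```scala
import doobie._
import doobie.implicits._
import doobie.postgres.circe.jsonb.implicits._
import io.circe.Json

// Sketch: assumes a hypothetical table `docs` with a jsonb column `body`.
def bodyFor(id: Int): ConnectionIO[Json] =
  sql"select body from docs where id = $id".query[Json].unique
```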
## Setting Up
The following examples require a few imports.
```scala
import cats._
import cats.data._
import cats.effect._
import cats.implicits._
import doobie._
import doobie.implicits._
import doobie.util.ExecutionContexts

// This is just for testing. Consider using cats.effect.IOApp instead of calling
// unsafe methods directly.
import cats.effect.unsafe.implicits.global

// A transactor that gets connections from java.sql.DriverManager and executes blocking operations
// on our synchronous EC. See the chapter on connection handling for more info.
val xa = Transactor.fromDriverManager[IO](
  driver = "org.postgresql.Driver", // JDBC driver classname
  url = "jdbc:postgresql:world",    // Connect URL - Driver specific
  user = "postgres",                // Database user name
  password = "password",            // Database password
  logHandler = None                 // Don't set up logging for now. See Logging page for how to log events in detail
)
```
**doobie** adds support for a large number of extended types that are not supported directly by JDBC. All mappings (except PostGIS) are provided in the `pgtypes` module.

```scala
import doobie.postgres._
import doobie.postgres.implicits._
```
## Java 8 Time Types (JSR310)
An explicit import is required to bring in mappings for `java.time.OffsetDateTime` / `java.time.Instant` / `java.time.ZonedDateTime` / `java.time.LocalDateTime` / `java.time.LocalDate` / `java.time.LocalTime`:

```scala
import doobie.postgres.implicits._
```
To ensure that **doobie** converts correctly between Java 8 time types and PostgreSQL date/time types, with or without timezones, the correct combination of types should be used:

- `TIMESTAMP` maps to `java.time.LocalDateTime`
- `TIMESTAMPTZ` maps to `java.time.Instant`, `java.time.ZonedDateTime` or `java.time.OffsetDateTime`
- `DATE` maps to `java.time.LocalDate`
- `TIME` maps to `java.time.LocalTime`
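For example, with the import above in scope a `TIMESTAMPTZ` column can be read as an `Instant`. This is a sketch only; the `events` table and `occurred_at` column are hypothetical.

```scala
import java.time.Instant
import doobie._
import doobie.implicits._
import doobie.postgres.implicits._

// Sketch: assumes a hypothetical table
//   create table events (id int4, occurred_at timestamptz)
def occurredAt(id: Int): ConnectionIO[Instant] =
  sql"select occurred_at from events where id = $id".query[Instant].unique
```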
## Array Types
**doobie** supports single-dimensional arrays of the following types:

- `bit[]` maps to `Array[Boolean]`
- `int4[]` maps to `Array[Int]`
- `int8[]` maps to `Array[Long]`
- `float4[]` maps to `Array[Float]`
- `float8[]` maps to `Array[Double]`
- `varchar[]`, `char[]`, `text[]`, and `bpchar[]` all map to `Array[String]`
- `uuid[]` maps to `Array[UUID]`
In addition to `Array` you can also map to `List` and `Vector`. Note that arrays of advanced types and structs are not supported by the driver; arrays of `Byte` are represented as `bytea`; and arrays of `int2` are incorrectly mapped by the driver as `Array[Int]` rather than `Array[Short]` and are not supported in **doobie**.
See the previous chapter on SQL Arrays for usage examples.
## Enum Types
**doobie** supports mapping PostgreSQL `enum` types to Scala enumerated types, with the slight complication that Scala doesn’t really support enumerated types as a first-class notion. We will examine three ways to construct mappings for the following PostgreSQL type:

```sql
create type myenum as enum ('foo', 'bar')
```
The first option is to map `myenum` to an instance of the execrable `scala.Enumeration` class via the `pgEnum` constructor.

```scala
object MyEnum extends Enumeration {
  val foo, bar = Value
}

implicit val MyEnumMeta = pgEnum(MyEnum, "myenum")

sql"select 'foo'::myenum".query[MyEnum.Value].unique.transact(xa).unsafeRunSync()
// res0: MyEnum.Value = foo
```
It works, but `Enumeration` is terrible so it’s unlikely you will want to do this. A better option, perhaps surprisingly, is to map `myenum` to a Java `enum` via the `pgJavaEnum` constructor.

```java
// This is Java code
public enum MyJavaEnum { foo, bar; }
```

```scala
implicit val MyJavaEnumMeta = pgJavaEnum[MyJavaEnum]("myenum")
```
And the final, most general construction simply requires evidence that your target type can be translated to and from `String`.

```scala
sealed trait FooBar

object FooBar {

  case object Foo extends FooBar
  case object Bar extends FooBar

  def toEnum(e: FooBar): String =
    e match {
      case Foo => "foo"
      case Bar => "bar"
    }

  def fromEnum(s: String): Option[FooBar] =
    Option(s) collect {
      case "foo" => Foo
      case "bar" => Bar
    }

}

implicit val FoobarMeta: Meta[FooBar] =
  pgEnumStringOpt("myenum", FooBar.fromEnum, FooBar.toEnum)

sql"select 'foo'::myenum".query[FooBar].unique.transact(xa).unsafeRunSync()
// res1: FooBar = Foo
```
## Geometric Types
The following geometric types are supported, and map to driver-supplied types.

- the `box` schema type maps to `org.postgresql.geometric.PGbox`
- the `circle` schema type maps to `org.postgresql.geometric.PGcircle`
- the `lseg` schema type maps to `org.postgresql.geometric.PGlseg`
- the `path` schema type maps to `org.postgresql.geometric.PGpath`
- the `point` schema type maps to `org.postgresql.geometric.PGpoint`
- the `polygon` schema type maps to `org.postgresql.geometric.PGpolygon`
It is expected that these will be mapped to application-specific types via `imap` as described in Custom Mappings.
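As a sketch of what such a mapping might look like, the driver-supplied `Meta[PGpoint]` can be adapted to an application-specific type with `imap`. `Point2D` here is a hypothetical type, not part of doobie.

```scala
import doobie.Meta
import doobie.postgres.implicits._
import org.postgresql.geometric.PGpoint

// Hypothetical application-specific point type.
case class Point2D(x: Double, y: Double)

// Adapt the driver-supplied mapping to our own type.
implicit val point2DMeta: Meta[Point2D] =
  Meta[PGpoint].imap(p => Point2D(p.x, p.y))(p => new PGpoint(p.x, p.y))
```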
## PostGIS Types
**doobie** provides mappings for the top-level PostGIS geometric types provided by the `org.postgis` driver extension.

Mappings for PostGIS are provided in the `pgistypes` module. **doobie** expects the PostGIS dependency to be provided, so if you use this module you should add PostGIS as a dependency.

```scala
libraryDependencies += "net.postgis" % "postgis-jdbc" % "2.3.0"
```

```scala
// Not provided via doobie.postgres.imports._; you must import them explicitly.
import doobie.postgres.pgisimplicits._
```

- `PGgeometry`
- `PGbox2d`
- `PGbox3d`
In addition to the general types above, **doobie** provides mappings for the following abstract and concrete fine-grained types carried by `PGgeometry`:

- `Geometry`
- `ComposedGeom`
- `GeometryCollection`
- `MultiLineString`
- `MultiPolygon`
- `PointComposedGeom`
- `LineString`
- `MultiPoint`
- `Polygon`
- `Point`
Geographic type mappings are defined in a different object (`pgisgeographyimplicits`), to allow geometric types to use geodetic coordinates.

```scala
import doobie.postgres.pgisgeographyimplicits._

// or define the implicit conversion manually
implicit val geographyPoint: Meta[Point] =
  doobie.postgres.pgisgeographyimplicits.PointType
```

- `Point`
- `Polygon`
- `MultiPoint`
- `LineString`
- `PointComposedGeom`
- `MultiPolygon`
- `MultiLineString`
## Other Nonstandard Types
- The `uuid` schema type is supported and maps to `java.util.UUID`.
- The `inet` schema type is supported and maps to `java.net.InetAddress`.
- The `hstore` schema type is supported and maps to both `java.util.Map[String, String]` and Scala `Map[String, String]`.
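As a sketch of the `hstore` mapping (the `props` table and `attrs` column here are hypothetical), a column can be read directly into a Scala `Map[String, String]`:

```scala
import doobie._
import doobie.implicits._
import doobie.postgres._
import doobie.postgres.implicits._

// Sketch: assumes a hypothetical table `props` with an hstore column `attrs`.
def attrsFor(id: Int): ConnectionIO[Map[String, String]] =
  sql"select attrs from props where id = $id".query[Map[String, String]].unique
```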
## Extended Error Handling
A complete table of SQLSTATE values is provided in the `doobie.postgres.sqlstate` module. Recovery combinators for each of these states (`onUniqueViolation` for example) are provided in `doobie.postgres.syntax`.

```scala
val p = sql"oops".query[String].unique // this won't work
```
Some of the recovery combinators are demonstrated below.

```scala
p.attempt.transact(xa).unsafeRunSync() // attempt is provided by ApplicativeError instance
// res2: Either[Throwable, String] = Left(
//   value = org.postgresql.util.PSQLException: ERROR: syntax error at or near "oops"
//   Position: 1
// )

p.attemptSqlState.transact(xa).unsafeRunSync() // this catches only SQL exceptions
// res3: Either[SqlState, String] = Left(value = SqlState(value = "42601"))

p.attemptSomeSqlState { case SqlState("42601") => "caught!" } .transact(xa).unsafeRunSync() // catch it
// res4: Either[String, String] = Left(value = "caught!")

p.attemptSomeSqlState { case sqlstate.class42.SYNTAX_ERROR => "caught!" } .transact(xa).unsafeRunSync() // same, w/constant
// res5: Either[String, String] = Left(value = "caught!")

p.exceptSomeSqlState { case sqlstate.class42.SYNTAX_ERROR => "caught!".pure[ConnectionIO] } .transact(xa).unsafeRunSync() // recover
// res6: String = "caught!"

p.onSyntaxError("caught!".pure[ConnectionIO]).transact(xa).unsafeRunSync() // using recovery combinator
// res7: String = "caught!"
```
## Server-Side Statements
PostgreSQL supports server-side caching of prepared statements after a certain number of executions, which can have desirable performance consequences for statements that only need to be planned once. Note that this caching happens only for `PreparedStatement` instances that are re-used within a single connection lifetime. **doobie** supports programmatic configuration of the prepare threshold:

- For a given `Connection` you can set and query the prepare threshold with the `ConnectionIO` constructors `doobie.postgres.hi.connection.pgSetPrepareThreshold` and `pgGetPrepareThreshold`.
- For a specific `PreparedStatement` you can set and query the prepare threshold with the `PreparedStatementIO` constructors `doobie.postgres.hi.preparedstatement.pgSetPrepareThreshold` and `pgGetPrepareThreshold`.
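A minimal sketch using the connection-level constructors named above (the threshold value of 2 is arbitrary):

```scala
import doobie._
import doobie.postgres._

// Set the prepare threshold for the current connection, then read it back.
val threshold: ConnectionIO[Int] =
  for {
    _ <- PHC.pgSetPrepareThreshold(2)
    n <- PHC.pgGetPrepareThreshold
  } yield n
```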
See the JDBC driver documentation for more information.
## `LISTEN` and `NOTIFY`
PostgreSQL provides a simple transactional message queue that can be used to notify a connection that something interesting has happened. Such notifications can be tied to database triggers, which provides a way to notify clients that data has changed. Which is cool.
**doobie** provides `ConnectionIO` constructors for SQL `LISTEN`, `UNLISTEN`, and `NOTIFY` in the `doobie.postgres.hi.connection` module. New notifications are retrieved (synchronously, sadly, that’s all the driver provides) via `pgGetNotifications`. Note that all of the “listening” operations apply to the current connection, which must therefore be long-running and typically off to the side from normal transactional operations. Further note that you must `setAutoCommit(false)` on this connection or `commit` between each call in order to retrieve messages. The `examples` project includes a program that demonstrates how to present a channel as a `Stream[IO, PGNotification]`.
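As a minimal sketch of the listen-and-poll cycle described above (the channel name is hypothetical, and a real listener would run on a dedicated long-lived connection):

```scala
import doobie._
import doobie.implicits._
import doobie.postgres._
import org.postgresql.PGNotification

// Sketch: listen on a (hypothetical) channel, commit so the LISTEN takes
// effect, then poll once for any pending notifications.
val poll: ConnectionIO[List[PGNotification]] =
  for {
    _  <- PHC.pgListen("my_channel")
    _  <- HC.commit
    ns <- PHC.pgGetNotifications
  } yield ns
```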
## Large Objects
PostgreSQL provides a facility for storing very large objects (up to 4TB each) in a single uniform storage, identified by unique numeric ID and accessed via fast byte-block transfer. Note that “normal” large object column types such as `bytea` and `text` can store values as large as 1GB each, so the large object API is rarely used. However, there are cases where the size and/or efficiency of large objects justifies the use of this API.

**doobie** provides an algebra and free monads for the driver’s `LargeObjectManager` and `LargeObject` types in the `doobie.postgres.free` package. There is also [the beginnings of] a high-level API that includes constructors for creating large objects from files and vice-versa. The `example` project contains a brief usage example.

Please file an issue or ask questions on the Gitter channel if you need to use this API; it will evolve as use cases demand.
## Copy Manager
The PostgreSQL JDBC driver’s `CopyManager` API provides a pass-through for the SQL `COPY` statement, allowing very fast data transfer via `java.io` streams. Here we construct a program that dumps a table to `Console.out` in CSV format, with quoted values.

```scala
val q = """
  copy country (name, code, population)
  to stdout (
    encoding 'utf-8',
    force_quote *,
    format csv
  )
  """

val prog: ConnectionIO[Long] =
  PHC.pgGetCopyAPI(PFCM.copyOut(q, Console.out)) // return value is the row count
```
See the links above and sample code in the `examples/` project in the **doobie** GitHub repo for more information on this specialized API.

**doobie** also provides a specialized API for very fast batch inserts using updates of the form `COPY ... FROM STDIN` and a `Text` typeclass that defines how data types are encoded in Postgres text format (similar to `Put`; instances must be present for all fields in the data type to be inserted).
First, a temp table for our experiment.

```scala
val create: ConnectionIO[Unit] =
  sql"""
    CREATE TEMPORARY TABLE food (
      name       VARCHAR,
      vegetarian BOOLEAN,
      calories   INTEGER
    )
    """.update.run.void
```
And some values to insert. `Text` instances are provided for all the data types we are using here.

```scala
case class Food(name: String, isVegetarian: Boolean, caloriesPerServing: Int)

val foods = List(
  Food("banana", true, 110),
  Food("cheddar cheese", true, 113),
  Food("Big Mac", false, 1120)
)
```
Our insert statement must have the form `COPY ... FROM STDIN`, and we can insert any `Foldable`.

```scala
def insert[F[_]: Foldable](fa: F[Food]): ConnectionIO[Long] =
  sql"COPY food (name, vegetarian, calories) FROM STDIN".copyIn(fa)
```
We can run it thus, yielding the number of affected rows.

```scala
(create *> insert(foods)).transact(xa).unsafeRunSync()
// res8: Long = 3L
```
## Fastpath
**doobie** provides an algebra and free monad for constructing programs that use the `FastPathAPI` provided by the PostgreSQL JDBC driver; however, this API is mostly deprecated in favor of server-side statements (see above). And in any case I can’t find an example of how you would use it from Java so I don’t have an example here. But if you’re using it let me know and we can figure it out.
## EXPLAIN/EXPLAIN ANALYZE
The PostgreSQL server can provide an analysis of any query, using the `EXPLAIN` keyword. **doobie** can run `EXPLAIN` on any `Query0` or `Query` object, as long as `doobie.postgres._` and `doobie.postgres.implicits._` have been imported. Using an example from earlier in the book:

```scala
sql"select name from country"
  .query[String] // Query0[String]
  .explain
  .transact(xa)
  .unsafeRunSync()
  .foreach(println)
// Seq Scan on country  (cost=0.00..7.39 rows=239 width=11)
```
Similarly, `explainAnalyze` will analyze the query and run it, comparing the query planner’s estimates with real performance. Using the example above again:

```scala
sql"select name from country"
  .query[String] // Query0[String]
  .explainAnalyze
  .transact(xa)
  .unsafeRunSync()
  .foreach(println)
// Seq Scan on country  (cost=0.00..7.39 rows=239 width=11) (actual time=0.004..0.027 rows=239 loops=1)
// Planning Time: 0.111 ms
// Execution Time: 0.056 ms
```
`explain` and `explainAnalyze` both return a `ConnectionIO[List[String]]` result, where each member of the list is one row of the query planner’s output.