These rules are used for building Scala projects with Bazel. There are currently four rules: `scala_library`, `scala_macro_library`, `scala_binary` and `scala_test`.
In order to use `scala_library`, `scala_macro_library`, and `scala_binary`, you must have bazel 0.3.1 or later and add the following to your WORKSPACE file:
```python
git_repository(
    name = "io_bazel_rules_scala",
    remote = "https://github.com/bazelbuild/rules_scala.git",
    commit = "73743b830ae98d13a946b25ad60cad5fee58e6d3",  # update this as needed
)
load("@io_bazel_rules_scala//scala:scala.bzl", "scala_repositories")
scala_repositories()
```
To use a particular tag, put the tagged version number in the `tag` attribute and omit the `commit` attribute. Note that these plugins are still evolving quickly, as is bazel, so you may need to select the version most appropriate for you.
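For example, a tagged checkout might look like the following (the tag value below is only illustrative; substitute a real release tag):

```python
git_repository(
    name = "io_bazel_rules_scala",
    remote = "https://github.com/bazelbuild/rules_scala.git",
    tag = "0.1.0",  # hypothetical tag; use the release you actually want
)
```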
Then in your BUILD file just add the following so the rules will be available:

```python
load("@io_bazel_rules_scala//scala:scala.bzl", "scala_library", "scala_binary", "scala_test")
```
You may wish to have these rules loaded by default using bazel's prelude. You can add the above to the file `tools/build_rules/prelude_bazel` in your repo (don't forget to have a, possibly empty, BUILD file there) and then it will be automatically prepended to every BUILD file in the workspace.
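A minimal prelude, for instance, can simply contain the load line shown above (next to a possibly empty `tools/build_rules/BUILD` file):

```python
load("@io_bazel_rules_scala//scala:scala.bzl", "scala_library", "scala_binary", "scala_test")
```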
To run with a persistent worker (much faster), you need to add

```
build --strategy=Scalac=worker
test --strategy=Scalac=worker
```

to your command line, or to enable by default for building/testing add it to your .bazelrc.
```python
scala_library(name, srcs, deps, runtime_deps, exports, data, main_class, resources, resource_strip_prefix, scalacopts, jvm_flags)
scala_macro_library(name, srcs, deps, runtime_deps, exports, data, main_class, resources, resource_strip_prefix, scalacopts, jvm_flags)
```
`scala_library` generates a `.jar` file from `.scala` source files. This rule also creates an interface jar to avoid recompiling downstream targets unless their interface changes.

`scala_macro_library` generates a `.jar` file from `.scala` source files when they contain macros. For macros there are no interface jars, because the macro code is executed at compile time. For best performance, you want very granular targets until such time as the zinc incremental compiler can be supported.

In order to make a java rule use this jar file, use the `java_import` rule.
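As a rough sketch of how a library target might look (target names, paths and dependencies below are hypothetical):

```python
scala_library(
    name = "hello_lib",
    srcs = glob(["src/main/scala/**/*.scala"]),
    deps = ["//some/other:lib"],  # hypothetical dependency
    scalacopts = ["-deprecation"],
    resources = glob(["src/main/resources/**/*"]),
)
```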
| Attribute | Description |
|---|---|
| `name` | A unique name for this target |
| `srcs` | List of Scala source files used to build the target |
| `deps` | List of other libraries to be linked to this library target |
| `runtime_deps` | List of other libraries to put on the classpath only at runtime. This is rarely needed in Scala. |
| `exports` | List of targets to add to the dependencies of those that depend on this target. Similar to the `java_library` parameter of the same name. Use this sparingly as it weakens the precision of the build graph. |
| `data` | List of files needed by this rule at runtime. |
| `main_class` | Name of class with a `main()` method to use as an entry point. The value of this attribute is a class name, not a source file. The class must be available at runtime: it may be compiled by this rule (from `srcs`) or provided by a dependency (through `deps`). |
| `resources` | A list of data files to be included in the JAR. |
| `resource_strip_prefix` | The path prefix to strip from Java resources. If specified, this path prefix is stripped from every file in the `resources` attribute. It is an error for a resource file not to be under this directory. |
| `scalacopts` | Extra compiler options for this library to be passed to scalac. Subject to Make variable substitution and Bourne shell tokenization. |
| `jvm_flags` | List of JVM flags to be passed to scalac after the `scalacopts`. Subject to Make variable substitution and Bourne shell tokenization. |
```python
scala_binary(name, srcs, deps, runtime_deps, data, main_class, resources, resource_strip_prefix, scalacopts, jvm_flags)
```
`scala_binary` generates a Scala executable. It may depend on `scala_library`, `scala_macro_library` and `java_library` rules.

A `scala_binary` requires a `main_class` attribute.
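A minimal sketch (target, file and class names below are hypothetical):

```python
scala_binary(
    name = "hello",
    srcs = ["Hello.scala"],
    main_class = "example.Hello",  # required: fully qualified entry point
    deps = [":hello_lib"],
)
```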
| Attribute | Description |
|---|---|
| `name` | A unique name for this target |
| `srcs` | List of Scala source files used to build the target |
| `deps` | List of other libraries to be linked to this binary target |
| `runtime_deps` | List of other libraries to put on the classpath only at runtime. This is rarely needed in Scala. |
| `data` | List of files needed by this rule at runtime. |
| `main_class` | Name of class with a `main()` method to use as an entry point. The value of this attribute is a class name, not a source file. The class must be available at runtime: it may be compiled by this rule (from `srcs`) or provided by a dependency (through `deps`). |
| `resources` | A list of data files to be included in the JAR. |
| `resource_strip_prefix` | The path prefix to strip from Java resources. If specified, this path prefix is stripped from every file in the `resources` attribute. It is an error for a resource file not to be under this directory. |
| `scalacopts` | Extra compiler options for this binary to be passed to scalac. Subject to Make variable substitution and Bourne shell tokenization. |
| `jvm_flags` | List of JVM flags to be passed to scalac after the `scalacopts`. Subject to Make variable substitution and Bourne shell tokenization. |
```python
scala_test(name, srcs, suites, deps, data, main_class, resources, resource_strip_prefix, scalacopts, jvm_flags)
```
`scala_test` generates a Scala executable which runs unit test suites written using the scalatest library. It may depend on `scala_library`, `scala_macro_library` and `java_library` rules.

A `scala_test` by default runs all tests in a given target. For backwards compatibility it accepts a `suites` attribute, which is ignored because it was too easy to populate that field incorrectly and silently skip tests.
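A minimal sketch (target and file names below are hypothetical):

```python
scala_test(
    name = "hello_test",
    srcs = ["HelloTest.scala"],  # scalatest suites live in these sources
    deps = [":hello_lib"],
)
```

Run it with `bazel test` as usual, e.g. `bazel test :hello_test`.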
```python
scala_repl(name, deps, scalacopts, jvm_flags)
```
A scala repl allows you to add library dependencies (not currently `scala_binary` targets) to generate a runnable script which starts a REPL.
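A sketch of such a target (names are hypothetical):

```python
scala_repl(
    name = "hello_repl",
    deps = [":hello_lib"],  # libraries to put on the REPL classpath
)
```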
Since `bazel run` closes stdin, it cannot be used to start the REPL. Instead, you use `bazel build` to build the script, then run that script as normal to start a REPL session. An example in this repo:

```
bazel build test:HelloLibRepl
bazel-bin/test/HelloLibRepl
```
The scala library suite allows you to define a glob or series of targets to generate sub scala libraries for. The outer target will export all of the inner targets. This allows splitting a series of independent files in a larger target into smaller ones, which lets us cache outputs better and also build the individual targets in parallel. Downstream targets should not be aware of its presence.
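A sketch, assuming the macro is exposed as `scala_library_suite` from the same `scala.bzl` file (target name and glob below are hypothetical):

```python
# Assumed load location; adjust to wherever the suite macro is actually exported.
load("@io_bazel_rules_scala//scala:scala.bzl", "scala_library_suite")

scala_library_suite(
    name = "utils",
    srcs = glob(["utils/*.scala"]),  # one sub scala_library per source file
)
```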
The scala test suite allows you to define a glob or series of targets to generate sub scala tests for. The outer target defines a native test suite to run all the inner tests. This allows splitting a series of independent tests from one target into several, which lets us cache outputs better and also build and test the individual targets in parallel.
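A sketch, assuming the macro is exposed as `scala_test_suite` from the same `scala.bzl` file (target name and glob below are hypothetical):

```python
# Assumed load location; adjust to wherever the suite macro is actually exported.
load("@io_bazel_rules_scala//scala:scala.bzl", "scala_test_suite")

scala_test_suite(
    name = "all_tests",
    srcs = glob(["*Test.scala"]),  # one sub scala_test per source file
)
```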